Sample records for process control analysis

  1. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.

  2. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
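
    A p-chart of the kind used above plots the proportion of anesthetics with adverse events per subgroup against 3-sigma binomial limits. The following Python sketch only illustrates that calculation; the monthly counts and subgroup sizes are invented, not data from the study.

      import math

      def p_chart_limits(defectives, sample_sizes):
          """Centre line and 3-sigma limits for a p-chart."""
          # Overall proportion of adverse events across all subgroups
          p_bar = sum(defectives) / sum(sample_sizes)
          limits = []
          for n in sample_sizes:
              sigma = math.sqrt(p_bar * (1 - p_bar) / n)
              lcl = max(0.0, p_bar - 3 * sigma)
              ucl = min(1.0, p_bar + 3 * sigma)
              limits.append((lcl, p_bar, ucl))
          return limits

      # Hypothetical monthly counts: adverse events / anesthetics per month
      events = [95, 110, 102, 130]
      cases = [540, 560, 530, 555]
      for (lcl, cl, ucl), d, n in zip(p_chart_limits(events, cases), events, cases):
          flag = "in control" if lcl <= d / n <= ucl else "out of control"
          print(f"p={d/n:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  {flag}")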

  3. The application of hazard analysis and critical control points and risk management in the preparation of anti-cancer drugs.

    PubMed

    Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice

    2009-02-01

    To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.
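
    A risk index built from occurrence, severity and detectability is typically computed, FMEA-style, as the product of the three scores. A minimal sketch of that scoring logic; the 1-5 scales, step names and threshold are illustrative assumptions, not values from the article:

      from dataclasses import dataclass

      @dataclass
      class ControlPoint:
          step: str
          occurrence: int     # 1 (rare) .. 5 (frequent) -- hypothetical scale
          severity: int       # 1 (minor) .. 5 (critical)
          detectability: int  # 1 (easily detected) .. 5 (hard to detect)

          @property
          def risk_index(self) -> int:
              # Multiplicative index, as in FMEA-style scoring
              return self.occurrence * self.severity * self.detectability

      points = [
          ControlPoint("dose calculation check", 2, 5, 3),
          ControlPoint("aseptic transfer", 3, 4, 4),
          ControlPoint("final labelling", 2, 3, 2),
      ]
      # Rank critical control points; a threshold flags the high-risk subset
      THRESHOLD = 30
      for cp in sorted(points, key=lambda c: c.risk_index, reverse=True):
          tag = "HIGH" if cp.risk_index >= THRESHOLD else "low"
          print(f"{cp.step:<24} risk={cp.risk_index:>3} [{tag}]")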

  4. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  5. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. To propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.

  6. Analysis And Control System For Automated Welding

    NASA Technical Reports Server (NTRS)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  7. Process yield improvements with process control terminal for varian serial ion implanters

    NASA Astrophysics Data System (ADS)

    Higashi, Harry; Soni, Ameeta; Martinez, Larry; Week, Ken

    Implant processes in a modern wafer production fab are extremely complex. There can be several types of misprocessing, i.e., wrong dose or species, double implants, and missed implants. Process Control Terminals (PCT) for Varian 350Ds installed at Intel fabs were found to substantially reduce the number of misprocessing steps. This paper describes those misprocessing steps and their subsequent reduction with the use of PCTs. Reliable and simple process control with serial process ion implanters has been in increasing demand. A well designed process control terminal greatly increases device yield by monitoring all pertinent implanter functions and enabling process engineering personnel to set up process recipes for simple and accurate system operation. By programming user-selectable interlocks, implant errors are reduced and those that occur are logged for further analysis and prevention. A process control terminal should also be compatible with office personal computers for greater flexibility in system use and data analysis. The impact from the capability of a process control terminal is increased productivity, ergo higher device yield.

  8. Model prototype utilization in the analysis of fault tolerant control and data processing systems

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.

    2016-04-01

    The paper presents a procedure for assessing the profit of control and data processing system implementation. The reasonability of creating and analyzing a model prototype follows from implementing the approach of fault tolerance provision through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from utilizing its results and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis of a control and data processing system.

  9. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-strata scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in the manufacture of crankshafts. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement strata, where various quality control tools like the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002, the process potential capability index (Cp) improved from 1.29 to 2.02, and the process performance capability index (Cpk) improved from 0.32 to 1.45.
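
    The capability indices quoted above follow the standard definitions Cp = (USL - LSL)/(6*sigma) and Cpk = min(USL - mu, mu - LSL)/(3*sigma). A short sketch of the calculation; the bore diameters and tolerance limits below are invented for illustration:

      import statistics

      def capability_indices(data, lsl, usl):
          """Process potential (Cp) and performance (Cpk) indices."""
          mu = statistics.mean(data)
          sigma = statistics.stdev(data)
          cp = (usl - lsl) / (6 * sigma)
          cpk = min(usl - mu, mu - lsl) / (3 * sigma)
          return cp, cpk

      # Hypothetical stub-end-hole diameters (mm) and tolerance band
      bores = [42.003, 42.001, 41.999, 42.002, 42.000, 42.001, 41.998]
      cp, cpk = capability_indices(bores, lsl=41.994, usl=42.006)
      print(f"Cp={cp:.2f}  Cpk={cpk:.2f}")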

  10. OAO battery data analysis

    NASA Technical Reports Server (NTRS)

    Gaston, S.; Wertheim, M.; Orourke, J. A.

    1973-01-01

    Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.

  11. Real-time feedback control of twin-screw wet granulation based on image analysis.

    PubMed

    Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György

    2018-06-04

    The present paper reports the first dynamic image analysis-based feedback control of a continuous twin-screw wet granulation process. Granulation of a blend of lactose and starch was selected as a model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with image analysis software developed by the authors. The validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by controlling the liquid feeding rate of the pump through a PC, based on the particle size results determined in real time. After the establishment of the feedback control, the system could correct different real-life disturbances, creating a Process Analytically Controlled Technology (PACT), which guarantees the real-time monitoring and control of the quality of the granules. In the event of changes or bad tendencies in the particle size, the system can automatically compensate for the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important in the case of continuous pharmaceutical technologies. Copyright © 2018 Elsevier B.V. All rights reserved.
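
    The paper does not disclose its control law; the following proportional-feedback sketch only illustrates the idea of adjusting the liquid feed from a measured particle size. The gain, units, setpoint and actuator limits are all assumptions:

      def update_pump_rate(rate, measured_d50, target_d50,
                           kp=0.02, rate_min=5.0, rate_max=50.0):
          """One step of a proportional feedback law (illustrative only).

          If granules are too small, feed more liquid; if too large, feed less.
          kp, the limits and the units (mL/min, micrometres) are hypothetical.
          """
          error = target_d50 - measured_d50          # micrometres
          rate = rate + kp * error                   # mL/min
          return max(rate_min, min(rate_max, rate))  # actuator saturation

      rate = 20.0  # current pump rate, mL/min (hypothetical)
      rate = update_pump_rate(rate, measured_d50=310.0, target_d50=350.0)
      print(f"new pump rate: {rate:.2f} mL/min")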

  12. Physics and control of wall turbulence for drag reduction.

    PubMed

    Kim, John

    2011-04-13

    Turbulence physics responsible for high skin-friction drag in turbulent boundary layers is first reviewed. A self-sustaining process of near-wall turbulence structures is then discussed from the perspective of controlling this process for the purpose of skin-friction drag reduction. After recognizing that key parts of this self-sustaining process are linear, a linear systems approach to boundary-layer control is discussed. It is shown that singular-value decomposition analysis of the linear system allows us to examine different approaches to boundary-layer control without carrying out the expensive nonlinear simulations. Results from the linear analysis are consistent with those observed in full nonlinear simulations, thus demonstrating the validity of the linear analysis. Finally, the fundamental performance limit expected of optimal control input is discussed.
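
    The singular-value decomposition analysis mentioned above can be illustrated on a toy linear system: the SVD of the frequency-response (resolvent) operator identifies the forcing direction with maximum energy amplification. The matrices and forcing frequency below are invented, not taken from the paper:

      import numpy as np

      # Hypothetical stable linear system dx/dt = A x + B u (toy 3-state example)
      A = np.array([[-1.0, 5.0, 0.0],
                    [ 0.0, -2.0, 4.0],
                    [ 0.0,  0.0, -3.0]])
      B = np.eye(3)

      omega = 1.0  # forcing frequency of interest
      H = np.linalg.inv(1j * omega * np.eye(3) - A) @ B  # resolvent operator

      U, s, Vh = np.linalg.svd(H)
      print("largest singular value (max energy amplification):", s[0])
      # Optimal forcing direction is the leading right singular vector
      # (complex in general; phase information dropped for display)
      print("optimal forcing direction:", Vh[0].real)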

  13. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

    The whole process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems engineering task, involving the base environment, seeds and seedlings, harvesting, processing and other steps, so accurate identification of the factors in the TCM production process that may induce quality risks, as well as reasonable quality control measures, is very important. At present, the concept of quality risk is mainly discussed in terms of management and regulations; there has been no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor any summary of effective quality control schemes. A whole process quality control and management system for TCM decoction pieces based on the TCM quality tree was proposed in this study. This system effectively combines the process analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole process quality control of TCM. By providing a personalized web interface, the system can realize user-oriented information feedback, making it convenient for users to predict, evaluate and control the quality of TCM. In application, the whole process quality control and management system for TCM decoction pieces can identify related quality factors such as base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to each enterprise's own production conditions, and provide different enterprises with their own quality systems, to achieve personalized service. As a new quality management model, this system can provide a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  14. State Analysis: A Control Architecture View of Systems Engineering

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert D.

    2005-01-01

    A viewgraph presentation on the state analysis process is shown. The topics include: 1) Issues with growing complexity; 2) Limits of common practice; 3) Exploiting a control point of view; 4) A glimpse at the State Analysis process; 5) Synergy with model-based systems engineering; and 6) Bridging the systems to software gap.

  15. WE-G-BRA-06: Application of Systems and Control Theory-Based Hazard Analysis to Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlicki, T; Samost, A; Leveson, N

    Purpose: The process of delivering radiation occurs in a complex socio-technical system heavily reliant on human operators. Furthermore, both humans and software are notoriously challenging to account for in traditional hazard analysis models. High reliability industries such as aviation have approached this problem by using hazard analysis techniques grounded in systems and control theory. The purpose of this work is to apply the Systems-Theoretic Accident Model and Processes (STAMP) hazard model to radiotherapy. In particular, the System-Theoretic Process Analysis (STPA) approach is used to perform a hazard analysis of a proposed on-line adaptive cranial radiosurgery procedure that omits the CT simulation step and uses only CBCT for planning, localization, and treatment. Methods: The STPA procedure first requires the definition of high-level accidents and hazards leading to those accidents. From there, hierarchical control structures were created, followed by the identification and description of control actions for each control structure. Utilizing these control structures, unsafe states of each control action were created. Scenarios contributing to unsafe control action states were then identified and translated into system requirements to constrain process behavior within safe boundaries. Results: Ten control structures were created for this new CBCT-only process, which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Twenty-three control actions were identified that contributed to over 80 unsafe states of those control actions, resulting in over 220 failure scenarios. Conclusion: The interaction of people, hardware, and software is highlighted through the STPA approach. STPA provides a hierarchical model for understanding the role of management decisions in impacting system safety, so that a process design requirement can be traced back to the hazard and accident that it is intended to mitigate.

  16. Plan delivery quality assurance for CyberKnife: Statistical process control analysis of 350 film-based patient-specific QAs.

    PubMed

    Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C

    2017-07-01

    Robotic radiosurgery requires plan delivery quality assurance (DQA) but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation and we sought to identify suboptimal DQA using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel-correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5mm and a separate evaluation of spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5mm was highly achievable as acceptance criteria for DQA validation using a film-based protocol (Cpm>1.33). 3.4% of DQA were outside a control limit of 88% for gamma pass-rate. The analysis of the out-of-control DQA helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validations. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
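
    The capability index Cpm referred to above is, in its standard Taguchi form, Cpm = (USL - LSL)/(6*sqrt(sigma^2 + (mu - T)^2)); the study's exact one-sided formulation may differ. A sketch with hypothetical gamma pass-rate data, not values from the paper:

      import math
      import statistics

      def cpm(data, lsl, usl, target):
          """Taguchi capability index: penalizes deviation from target."""
          mu = statistics.mean(data)
          sigma = statistics.stdev(data)
          tau = math.sqrt(sigma**2 + (mu - target)**2)
          return (usl - lsl) / (6 * tau)

      # Hypothetical gamma pass rates (%) from successive DQAs
      rates = [96.1, 94.8, 97.3, 95.0, 93.9, 96.6]
      print(f"Cpm = {cpm(rates, lsl=85.0, usl=100.0, target=100.0):.2f}")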

  17. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...

  18. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...

  19. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...

  20. Chemical Sensing in Process Analysis.

    ERIC Educational Resources Information Center

    Hirschfeld, T.; And Others

    1984-01-01

    Discusses: (1) rationale for chemical sensors in process analysis; (2) existing types of process chemical sensors; (3) sensor limitations, considering lessons of chemometrics; (4) trends in process control sensors; and (5) future prospects. (JN)

  1. U.S. Coast Guard SARSAT Final Evaluation Report. Volume II. Appendices.

    DOT National Transportation Integrated Search

    1987-03-01

    Contents: Controlled Tests; Controlled Test Error Analysis, Processing of Westwind Data; Exercises and Homing Tests; Further Analysis of Controlled Tests; Sar Case Analysis Tables; Narratives of Real Distress Cases; RCC Response Scenarios; Workload A...

  2. Display analysis with the optimal control model of the human operator. [pilot-vehicle display interface and information processing

    NASA Technical Reports Server (NTRS)

    Baron, S.; Levison, W. H.

    1977-01-01

    Application of the optimal control model of the human operator to problems in display analysis is discussed. Those aspects of the model pertaining to the operator-display interface and to operator information processing are reviewed and discussed. The techniques are then applied to the analysis of advanced display/control systems for a Terminal Configured Vehicle. Model results are compared with those obtained in a large, fixed-base simulation.

  3. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    PubMed

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  4. The numerical modelling and process simulation for the fault diagnosis of rotary kiln incinerator.

    PubMed

    Roh, S D; Kim, S W; Cho, W S

    2001-10-01

    The numerical modelling and process simulation for the fault diagnosis of a rotary kiln incinerator were accomplished. In the numerical modelling, the two models applied within the kiln are the combustion chamber model, comprising the mass and energy balance equations for two combustion chambers, and a 3D thermal model. The combustion chamber model predicts temperature within the kiln, flue gas composition, flux and heat of combustion. Using the combustion chamber model and the 3D thermal model, the production rules for the process simulation can be obtained through interrelation analysis between control and operation variables. The process simulation of the kiln is operated with the production rules for automatic operation. The process simulation aims to provide fundamental solutions to the problems in the incineration process by introducing an online expert control system to provide integrity in process control and management. Knowledge-based expert control systems use symbolic logic and heuristic rules to find solutions for various types of problems. The system was implemented as a hybrid intelligent expert control system by mutually connecting with the process control systems, which have the capability of process diagnosis, analysis and control.

  5. Applying Statistical Process Control to Clinical Data: An Illustration.

    ERIC Educational Resources Information Center

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  6. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  7. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  8. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  9. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...

  10. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a constituent medicinal material of Yiqi Fumai lyophilized injection, by combining near infrared spectroscopy with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal batches in production, and 2 test batches were monitored by PC score, DModX and Hotelling T2 control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. The application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus, and could reflect changes in material properties in the production process in real time. This established process monitoring method could provide a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
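
    Multivariate statistical process control of this kind typically builds a PCA model from normal batches and monitors new observations via Hotelling T2 (distance within the model plane) and a DModX-style residual (distance to the model). A minimal numpy sketch under those assumptions; the data shapes are invented:

      import numpy as np

      def mspc_statistics(X_ref, x_new, n_components=2):
          """Hotelling T2 and a DModX-style residual for a new observation,
          based on a PCA model of reference (normal) batches."""
          mu = X_ref.mean(axis=0)
          Xc = X_ref - mu
          U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
          P = Vt[:n_components].T                            # loadings
          lam = (s[:n_components] ** 2) / (len(X_ref) - 1)   # score variances

          t = (x_new - mu) @ P          # scores of the new observation
          t2 = np.sum(t**2 / lam)       # Hotelling T2
          resid = (x_new - mu) - P @ t  # part not explained by the model
          dmodx = np.linalg.norm(resid) # residual distance (unscaled)
          return t2, dmodx

      rng = np.random.default_rng(0)
      X_ref = rng.normal(size=(20, 6))  # e.g. 20 normal spectra, 6 variables
      t2, dmodx = mspc_statistics(X_ref, X_ref[0])
      print(f"T2={t2:.2f}  DModX={dmodx:.2f}")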

  11. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.

  12. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
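
    The moving-range chart recommended above derives its limits from the average moving range, using the constant 2.66 = 3/d2 (with d2 = 1.128 for moving ranges of size 2). A sketch of an individuals (XmR) chart; the daily timing data are invented:

      def xmr_limits(x):
          """Individuals and moving-range (XmR) control limits."""
          mr = [abs(b - a) for a, b in zip(x, x[1:])]
          mr_bar = sum(mr) / len(mr)
          x_bar = sum(x) / len(x)
          # 2.66 = 3 / d2, with d2 = 1.128 for moving ranges of size 2
          return x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar

      # Hypothetical daily median door-to-provider times (minutes)
      times = [34, 41, 37, 45, 39, 36, 52, 38, 40, 43]
      lcl, cl, ucl = xmr_limits(times)
      signals = [t for t in times if t < lcl or t > ucl]
      print(f"CL={cl:.1f}  LCL={lcl:.1f}  UCL={ucl:.1f}  special-cause points: {signals}")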

  13. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    NASA Astrophysics Data System (ADS)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

    This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS) and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and re-constructed using SIMULINK MATLAB to evaluate the process response. Additionally, the process stability, capability and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE) and Integral Square Error (ISE). Also, based on the statistical analysis, DS emerges as the best tuning method, as it exhibits the highest process stability and capability.
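
    The ITAE and ISE criteria used above integrate the time-weighted absolute error and the squared error over a step response. A sketch of computing both from a sampled response; the closed-loop trajectory below is an invented toy curve, not the paper's CSTR model:

      import numpy as np

      def itae_ise(t, y, setpoint=1.0):
          """Integral Time Absolute Error and Integral Square Error
          from a sampled closed-loop step response."""
          e = setpoint - np.asarray(y)
          itae = np.trapz(t * np.abs(e), t)
          ise = np.trapz(e**2, t)
          return itae, ise

      # Hypothetical underdamped response under one tuning
      t = np.linspace(0, 20, 401)
      y = 1 - np.exp(-0.6 * t) * np.cos(1.2 * t)
      print("ITAE=%.3f  ISE=%.3f" % itae_ise(t, y))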

  14. Abhijit Dutta | NREL

    Science.gov Websites

    Areas of expertise: techno-economic analysis; process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. Related publication: "Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A..."

  15. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  16. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M.ª Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
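
    The ANOVA workflow described above, including the normality and homoscedasticity checks, can be reproduced with standard library routines. The vibration readings below are invented, and scipy is an assumed dependency:

      from scipy import stats

      # Hypothetical overall vibration displacement readings (um) for three
      # combinations of grinding process variables (wheel speed/feed settings)
      setup_a = [1.8, 2.1, 1.9, 2.2, 2.0]
      setup_b = [2.6, 2.4, 2.7, 2.5, 2.8]
      setup_c = [1.7, 1.9, 1.8, 2.0, 1.6]

      f_stat, p_value = stats.f_oneway(setup_a, setup_b, setup_c)
      print(f"one-way ANOVA: F={f_stat:.2f}, p={p_value:.4f}")

      # Pre-checks, as in the paper's workflow
      print(stats.shapiro(setup_a))                      # Shapiro-Wilk normality test
      print(stats.bartlett(setup_a, setup_b, setup_c))   # homoscedasticity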

  17. WeaselBoard:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mulder, John C.; Schwartz, Moses Daniel; Berg, Michael J.

    2013-10-01

    Critical infrastructures, such as electrical power plants and oil refineries, rely on programmable logic controllers (PLCs) to control essential processes. State of the art security cannot detect attacks on PLCs at the hardware or firmware level. This renders critical infrastructure control systems vulnerable to costly and dangerous attacks. WeaselBoard is a PLC backplane analysis system that connects directly to the PLC backplane to capture backplane communications between modules. WeaselBoard forwards inter-module traffic to an external analysis system that detects changes to process control settings, sensor values, module configuration information, firmware updates, and process control program (logic) updates. WeaselBoard provides zero-day exploit detection for PLCs by detecting changes in the PLC and the process. This approach to PLC monitoring is protected under U.S. Patent Application 13/947,887.

  18. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  19. Trends in International Persuasion: Persuasion in the Arms Control Negotiations.

    ERIC Educational Resources Information Center

    Hopmann, P. Terrence; Walcott, Charles

    An analysis of the bargaining process in international arms control negotiations is possible by developing a framework of interrelated hypotheses, by delineating and practicing interactions study called "Bargaining Process Analysis," and by formulating procedural steps that bridge the gap between laboratory studies and "real world" situations. In…

  20. Online analysis and process control in recombinant protein production (review).

    PubMed

    Palmer, Shane M; Kunji, Edmund R S

    2012-01-01

    Online analysis and control are essential for efficient and reproducible bioprocesses. A key factor in real-time control is the ability to measure critical variables rapidly. Online in situ measurements are the preferred option and minimize the potential loss of sterility. The challenge is to provide sensors that have a good lifespan, withstand harsh bioprocess conditions, remain stable for the duration of a process without the need for recalibration, and offer a suitable working range. In recent decades, many new techniques have arisen that promise to extend the possibilities of analysis and control, not only by providing new parameters for analysis, but also through the improvement of accepted, well-practiced measurements.

  1. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    NASA Astrophysics Data System (ADS)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of making dried anchovy from the receipt of raw materials to the packaging of the final product. The method of data analysis used was descriptive analysis. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of 5 initial stages and the 7 principles of HACCP. The results showed that CCPs were found in the boiling process flow, with the significant hazard of Listeria monocytogenes bacteria, and in the final sorting process, with the significant hazard of foreign material contamination in the product. Actions taken were controlling the boiling temperature at 100-105°C for 3-5 minutes and training the sorting process employees.

  2. 21 CFR 120.24 - Process controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Process controls. 120.24 Section 120.24 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process...

  3. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (Pp, Ppk, and Ppm) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).

  4. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).
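
    Of the charts named in both versions of this study, the EWMA chart is the least standard to compute by hand: the statistic z_i = lambda*x_i + (1 - lambda)*z_{i-1} is plotted against time-varying limits. A sketch with lambda and L set to typical textbook values rather than the paper's, and invented dose-deviation data (note the in-control mean and sigma are naively estimated from the data themselves):

      import numpy as np

      def ewma_chart(x, lam=0.2, L=3.0):
          """EWMA statistic with exact time-varying control limits."""
          x = np.asarray(x, dtype=float)
          mu, sigma = x.mean(), x.std(ddof=1)
          z = np.empty_like(x)
          z[0] = lam * x[0] + (1 - lam) * mu
          for i in range(1, len(x)):
              z[i] = lam * x[i] + (1 - lam) * z[i - 1]
          k = np.arange(1, len(x) + 1)
          width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
          return z, mu - width, mu + width

      # Hypothetical deviations (%) between measured and calculated doses
      devs = [0.5, -0.3, 1.1, 0.8, -0.2, 1.9, 2.3, 2.8, 3.1, 2.9]
      z, lcl, ucl = ewma_chart(devs)
      drift = [i for i, (zi, lo, hi) in enumerate(zip(z, lcl, ucl))
               if not lo <= zi <= hi]
      print("points signalling drift:", drift)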

  5. Assessment of optimal control mechanism complexity by experimental landscape Hessian analysis: fragmentation of CH2BrI

    NASA Astrophysics Data System (ADS)

    Xing, Xi; Rey-de-Castro, Roberto; Rabitz, Herschel

    2014-12-01

    Optimally shaped femtosecond laser pulses can often be effectively identified in adaptive feedback quantum control experiments, but elucidating the underlying control mechanism can be a difficult task requiring significant additional analysis. We introduce landscape Hessian analysis (LHA) as a practical experimental tool to aid in elucidating control mechanism insights. This technique is applied to the dissociative ionization of CH2BrI using shaped fs laser pulses for optimization of the absolute yields of ionic fragments as well as their ratios for the competing processes of breaking the C-Br and C-I bonds. The experimental results suggest that these nominally complex problems can be reduced to a low-dimensional control space with insights into the control mechanisms. While the optimal yield for some fragments is dominated by a non-resonant intensity-driven process, the optimal generation of other fragments may be explained by a non-resonant process coupled to few-level resonant dynamics. Theoretical analysis and modeling are consistent with the experimental observations.

  6. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
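
    A standard-deviation-based homogeneity statistic like the one described above can be computed per block of consecutive spectra. The array shapes, threshold idea and random data below are assumptions for illustration, not the published routine:

      import numpy as np

      def spectral_stddev(spectra_block):
          """Mean standard deviation across a block of consecutive NIR spectra;
          high values mean absorbances are still changing (blend not yet
          homogeneous)."""
          block = np.asarray(spectra_block)  # shape: (n_spectra, n_wavelengths)
          return float(np.std(block, axis=0, ddof=1).mean())

      # Hypothetical stopping rule: declare the blend homogeneous once the
      # statistic stays below a validated threshold for several consecutive
      # blocks of spectra.
      rng = np.random.default_rng(1)
      block = rng.normal(0.5, 0.002, size=(8, 256))  # 8 spectra x 256 wavelengths
      print(f"block statistic: {spectral_stddev(block):.5f}")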

  7. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation. Its quality affects overall process performance significantly. Assessing whether a control system needs improvement requires relevant and constructive measures. There are various methods, such as time-domain-based measures, Minimum Variance, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena. But process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that the signals originating from industrial installations have multifractal properties, and such an analysis may extend the standard approach with further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process. It helps to discover internal dependencies and human factors, which are otherwise hard to detect.

  8. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  9. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  10. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...

  11. A framework supporting the development of a Grid portal for analysis based on ROI.

    PubMed

    Ichikawa, K; Date, S; Kaishima, T; Shimojo, S

    2005-01-01

    In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is whether or not the analysis is for data in the region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to the users on a Grid environment according to the users' two different types of processing requirements. We constructed a Grid portal which integrates interactive processing and batch processing through the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources. Interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administrates a ranking of data significance. The portal ensures the turn-around time of interactive processing through the priority-based job controlling mechanism, and provides the users with quality of service (QoS) for interactive processing. The users can access the analysis results of interactive jobs in preference to the analysis results of batch jobs. The Grid portal has also achieved high-performance computation of MEG analysis with batch processing on the Grid environment. The priority-based job controlling mechanism makes it possible to freely assign computing resources according to the users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for the users to flexibly include large computational power in what they want to analyze.
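
    The priority-based job control described above can be sketched as a two-class scheduler in which interactive (ROI) jobs always dispatch before queued batch jobs. The class names and heap-based design are illustrative assumptions, not the portal's implementation:

      import heapq
      import itertools

      class JobQueue:
          """Two-priority scheduler sketch: interactive (ROI) jobs always
          dispatch ahead of queued batch jobs."""
          INTERACTIVE, BATCH = 0, 1  # lower number = higher priority

          def __init__(self):
              self._heap = []
              self._seq = itertools.count()  # FIFO tie-break within a class

          def submit(self, job, priority):
              heapq.heappush(self._heap, (priority, next(self._seq), job))

          def next_job(self):
              return heapq.heappop(self._heap)[2] if self._heap else None

      q = JobQueue()
      q.submit("batch: full MEG dataset", JobQueue.BATCH)
      q.submit("interactive: ROI segment", JobQueue.INTERACTIVE)
      print(q.next_job())  # -> the interactive job runs first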

  12. [Purifying process of gynostemma pentaphyllum saponins based on "adjoint marker" online control technology and identification of their compositions by UPLC-QTOF-MS].

    PubMed

    Fan, Dong-Dong; Kuang, Yan-Hui; Dong, Li-Hua; Ye, Xiao; Chen, Liang-Mian; Zhang, Dong; Ma, Zhen-Shan; Wang, Jin-Yu; Zhu, Jing-Jing; Wang, Zhi-Min; Wang, De-Qin; Li, Chu-Yuan

    2017-04-01

    To optimize the purification process of Gynostemma pentaphyllum saponins (GPS) based on "adjoint marker" online control technology, with GPS as the testing index; UPLC-QTOF-MS was used for qualitative analysis. The "adjoint marker" online control results showed that the end point of sample loading is reached when the UV absorbance of the effluent equals half that of the loaded sample solution, and that the absorbance is essentially stable once the end point is stable. In the UPLC-QTOF-MS qualitative analysis, 16 saponins were identified from GPS, including 13 known gynostemma saponins and 3 new saponins. The optimized method proved simple, scientific, and reasonable; it is easy to apply for online determination and real-time recording, and is well suited to mass production and automation. The qualitative analysis indicated that "adjoint marker" online control technology retains the main efficacy components of the medicinal material well and provides analysis tools for process control and quality traceability. Copyright© by the Chinese Pharmaceutical Association.

  13. Relationships between locus of control and paranormal beliefs.

    PubMed

    Newby, Robert W; Davis, Jessica Boyette

    2004-06-01

    The present study investigated the associations between scores on paranormal beliefs, locus of control, and certain psychological processes such as affect and cognitions as measured by the Linguistic Inquiry and Word Count. Analysis yielded significant correlations between scores on Locus of Control and two subscales of Tobacyk's (1988) Revised Paranormal Beliefs Scale, New Age Philosophy and Traditional Paranormal Beliefs. A stepwise multiple regression analysis indicated that Locus of Control was significantly related to New Age Philosophy. Other correlations were found between Tobacyk's subscales, Locus of Control, and three processes measured by the Linguistic Inquiry and Word Count.

  14. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Excerpt: the HACCP plan shall list, as appropriate, (i) critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment and (ii) critical control points designed to control food hazards introduced outside the processing plant environment (21 CFR 120.8, Food and Drugs).

  15. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    Excerpt: the HACCP plan shall list, as appropriate, (i) critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment and (ii) critical control points designed to control food hazards introduced outside the processing plant environment (21 CFR 120.8, Food and Drugs).

  16. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    Excerpt: the HACCP plan shall list, as appropriate, (i) critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment and (ii) critical control points designed to control food hazards introduced outside the processing plant environment (21 CFR 120.8, Food and Drugs).

  17. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    Excerpt: the HACCP plan shall list, as appropriate, (i) critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment and (ii) critical control points designed to control food hazards introduced outside the processing plant environment (21 CFR 120.8, Food and Drugs).

  18. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    Excerpt: the HACCP plan shall list, as appropriate, (i) critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment and (ii) critical control points designed to control food hazards introduced outside the processing plant environment (21 CFR 120.8, Food and Drugs).

  19. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria for determining whether engineering modeling is applicable to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  20. Nanostructuring of Aluminum Alloy Powders by Cryogenic Attrition with Hydrogen-Free Process Control Agent

    DTIC Science & Technology

    2015-02-01

    Nanostructuring of Aluminum Alloy Powders by Cryogenic Attrition with Hydrogen-Free Process Control Agent, by Frank Kellogg (Bowhead Science and Technology), Clara Hofmeister (Advanced Materials Processing and Analysis Center), Anit Giri, and Kyu Cho; only report-documentation-page metadata is available for this record.

  1. 75 FR 14361 - Notification, Documentation, and Recordkeeping Requirements for Inspected Establishments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    Excerpt: the notification, documentation, and recordkeeping requirements concern an establishment's process control plans, that is, its Hazard Analysis and Critical Control Point (HACCP) plans (establishment-developed plans for the systematic prevention of biological, chemical, and physical hazards), as well as related recall records.

  2. A novel process control method for a TT-300 E-Beam/X-Ray system

    NASA Astrophysics Data System (ADS)

    Mittendorfer, Josef; Gallnböck-Wagner, Bernhard

    2018-02-01

    This paper presents aspects of the process control method for a TT-300 E-Beam/X-Ray system at Mediscan, Austria. The novelty of the approach is the seamless integration of routine monitoring dosimetry with process data, which makes it possible to calculate a parametric dose for each production unit and consequently enables fine-grained, holistic process performance monitoring. Process performance is documented in process control charts, both for the analysis of individual runs and for historic trending of runs of specific process categories over a specified time range.
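
    The control-chart idea can be made concrete with an individuals (I-MR) chart over the parametric dose per production unit; a sketch with synthetic dosimetry values, assuming approximately independent, normally distributed observations:

        import numpy as np

        rng = np.random.default_rng(1)
        dose = rng.normal(25.0, 0.4, 50)          # synthetic parametric doses (kGy)

        moving_range = np.abs(np.diff(dose))      # estimate sigma from the moving range
        sigma_hat = moving_range.mean() / 1.128   # d2 constant for subgroups of size 2
        center = dose.mean()
        ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

        flagged = np.where((dose > ucl) | (dose < lcl))[0]
        print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  flagged units: {flagged}")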

  3. Quality transitivity and traceability system of herbal medicine products based on quality markers.

    PubMed

    Liu, Changxiao; Guo, De-An; Liu, Liang

    2018-05-15

    Because many factors affect herb quality, the existing quality management model cannot evaluate process control. The development of the "quality marker" (Q-marker) concept lays the basis for establishing an independent process quality control system for herbal products. To ensure the highest degree of safety, effectiveness, and quality process control of herbal products, the aim is to establish a system of quality transitivity and traceability covering quality and process control from raw materials to finished herbal products. Based on the key issues and challenges of quality assessment, the current status of quality and process control from raw materials to the herbal medicinal products listed in the Pharmacopoeia was analyzed, and research models were designed covering the discovery and identification of Q-markers and the analysis and quality management of risk evaluation. The authors introduce several technologies and methodologies, such as DNA barcoding, chromatographic technologies, fingerprint analysis, chemical markers, bio-responses, risk management, and solutions for quality process control. Quality and process control models for herbal medicinal products are proposed, and a transitivity and traceability system from raw materials to finished products is constructed to improve herbal quality across the entire supply and production chain. The system is based on quality markers, with particular attention to controlling the production process under Good Engineering Practices and to implementing risk management for quality and process control in herbal medicine production. Copyright © 2018 Elsevier GmbH. All rights reserved.

  4. Automated data acquisition technology development:Automated modeling and control development

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, the WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
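
    The voltage-versus-current/arc-length analysis is, at its core, a least-squares fit; a sketch with synthetic data and an assumed linear model V = a + b*I + c*L (the actual functional form used by the WMS may differ):

        import numpy as np

        rng = np.random.default_rng(0)
        current = rng.uniform(50, 200, 40)   # welding current, A (synthetic)
        arc_len = rng.uniform(2, 8, 40)      # arc length, mm (synthetic)
        voltage = 10 + 0.05 * current + 1.2 * arc_len + rng.normal(0, 0.2, 40)

        # Solve V = a + b*I + c*L in the least-squares sense.
        A = np.column_stack([np.ones_like(current), current, arc_len])
        (a, b, c), *_ = np.linalg.lstsq(A, voltage, rcond=None)
        print(f"V = {a:.2f} + {b:.4f}*I + {c:.2f}*L")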

  5. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  6. Cost-effectiveness of integrated analysis/design systems /IPAD/ An executive summary. II. [for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.; Hansen, S. D.; Redhed, D. D.; Southall, J. W.; Kawaguchi, A. S.

    1974-01-01

    Evaluation of the cost-effectiveness of integrated analysis/design systems, with particular attention to the Integrated Program for Aerospace-Vehicle Design (IPAD) project. An analysis of all the ingredients of IPAD indicates the feasibility of a significant cost and flowtime reduction in the product design process involved. It is also concluded that an IPAD-supported design process will provide a framework for configuration control, whereby the engineering costs for design, analysis, and testing can be controlled during the air vehicle development cycle.

  7. Reactor Analysis and Virtual control ENviroment (RAVEN)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENviroment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that dispatches several functionalities: it derives and actuates the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring and control in the phase space; it performs both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and it facilitates input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  8. Dispersed processing for ATC.

    DOT National Transportation Integrated Search

    1971-06-01

    An analysis has been made of the potentialities and problems involved in assigning some computer processing and control functions to the remote sites in an upgraded third generation air traffic control system. Interrogator sites offer the most fruitf...

  9. [Design of a HACCP Plan for the Gouda-type cheesemaking process in a milk processing plant].

    PubMed

    Dávila, Jacqueline; Reyes, Genara; Corzo, Otoniel

    2006-03-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive and systematic method used to identify, assess, and control hazards related to raw materials, ingredients, processing, marketing, and the intended consumer, in order to assure food safety. The aim of this study was to design a HACCP plan for implementation in a Gouda-type cheese-making process at a dairy processing plant. The methodology was based on the application of the seven HACCP principles, information from the plant on compliance with the prerequisite programs (70-80%), the experience of the HACCP team, and the sequence of stages established by COVENIN standard 3802 for implementing the HACCP system. A HACCP plan was proposed covering the scope, the selection of the HACCP team, the description of the product and its intended use, the flow diagram of the process, the hazard analysis, and the control table of the plan with the critical control points (CCP). The following CCPs were identified in the process: pasteurization, coagulation, and ripening.

  10. The Warning System in Disaster Situations: A Selective Analysis.

    DTIC Science & Technology

    Descriptors: disasters, warning systems, civil defense, social psychology, reaction (psychology), factor analysis, classification, statistical data, time, management planning and control, damage, control systems, threat evaluation, decision making, data processing, communication systems, nuclear explosions.

  11. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    NASA Technical Reports Server (NTRS)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  12. Using Simulation Module, PCLAB, for Steady State Disturbance Sensitivity Analysis in Process Control

    ERIC Educational Resources Information Center

    Ali, Emad; Idriss, Arimiyawo

    2009-01-01

    Recently, chemical engineering education has moved toward utilizing simulation software to enhance the learning process, especially in the field of process control. These training simulators provide interactive learning through visualization and practice, bridging the gap between the theoretical abstraction of textbooks and the…

  13. TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook

    DTIC Science & Technology

    1989-08-01

    This document describes the techniques used to support and guide the Special Process Action Review Committees in accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management; Statistical processes; Management planning and control; Management training; Management information systems.

  14. Ultrafast dynamics in atomic clusters: Analysis and control

    PubMed Central

    Bonačić-Koutecký, Vlasta; Mitrić, Roland; Werner, Ute; Wöste, Ludger; Berry, R. Stephen

    2006-01-01

    We present a study of dynamics and ultrafast observables in the frame of pump–probe negative-to-neutral-to-positive ion (NeNePo) spectroscopy illustrated by the examples of bimetallic trimers Ag2Au−/Ag2Au/Ag2Au+ and silver oxides Ag3O2−/Ag3O2/Ag3O2+ in the context of cluster reactivity. First principle multistate adiabatic dynamics allows us to determine time scales of different ultrafast processes and conditions under which these processes can be experimentally observed. Furthermore, we present a strategy for optimal pump–dump control in complex systems based on the ab initio Wigner distribution approach and apply it to tailor laser fields for selective control of the isomerization process in Na3F2. The shapes of pulses can be assigned to underlying processes, and therefore control can be used as a tool for analysis. PMID:16740664

  15. Ultrafast dynamics in atomic clusters: analysis and control.

    PubMed

    Bonacić-Koutecký, Vlasta; Mitrić, Roland; Werner, Ute; Wöste, Ludger; Berry, R Stephen

    2006-07-11

    We present a study of dynamics and ultrafast observables in the frame of pump-probe negative-to-neutral-to-positive ion (NeNePo) spectroscopy illustrated by the examples of bimetallic trimers Ag2Au-/Ag2Au/Ag2Au+ and silver oxides Ag3O2-/Ag3O2/Ag3O2+ in the context of cluster reactivity. First principle multistate adiabatic dynamics allows us to determine time scales of different ultrafast processes and conditions under which these processes can be experimentally observed. Furthermore, we present a strategy for optimal pump-dump control in complex systems based on the ab initio Wigner distribution approach and apply it to tailor laser fields for selective control of the isomerization process in Na3F2. The shapes of pulses can be assigned to underlying processes, and therefore control can be used as a tool for analysis.

  16. Automated plasma control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma etching and desmear processes for printed wiring board (PWB) manufacture are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight-loss coupons or post-plasma destructive testing must be used. These techniques are not real-time methods, however, and do not allow for immediate diagnosis and process correction. These tests often require scrapping some fraction of a batch to ensure the integrity of the rest; because they verify a successful cycle with post-plasma diagnostics, poor test results often mean that a batch is substandard and the resulting parts unusable. Such tests are a costly part of the overall fabrication cost. A more efficient method would allow constant monitoring of plasma conditions and process control, so that process anomalies can be detected and corrected before the parts being treated are damaged. Real-time monitoring would allow instantaneous corrections. Multiple-site monitoring would allow process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control is explored, along with applications of this technique to process control, failure analysis, and endpoint determination in PWB manufacture.
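
    Endpoint determination from an emission line often reduces to detecting a sustained intensity drop; a sketch with a synthetic trace and an arbitrary 60%-of-baseline threshold, not tied to any particular spectrometer:

        import numpy as np

        t = np.arange(0, 120, 0.5)                  # time, s
        rng = np.random.default_rng(2)
        intensity = np.where(t < 80, 1.0, 0.3) + rng.normal(0, 0.02, t.size)

        window = 10                                 # 5 s moving average
        smooth = np.convolve(intensity, np.ones(window) / window, mode="valid")
        baseline = smooth[:40].mean()

        # Declare endpoint when the smoothed signal falls below 60% of baseline.
        idx = int(np.argmax(smooth < 0.6 * baseline))
        print(f"endpoint detected at t = {t[idx]:.1f} s (approximately)")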

  17. Red and processed meat consumption and risk of glioma in adults: A systematic review and meta-analysis of observational studies

    PubMed Central

    Saneei, Parvane; Willett, Walter; Esmaillzadeh, Ahmad

    2015-01-01

    Background: Findings from several observational studies that investigated the association between red meat consumption and gliomas have been inconsistent. We conducted a systematic review and meta-analysis of observational studies to summarize the available data on the relation between meat intake and risk of glioma. Materials and Methods: A systematic literature search of relevant reports published until May 2014 in the PubMed/Medline, ISI Web of Knowledge, Excerpta Medica, Ovid, Google Scholar, and Scopus databases was conducted. From 723 articles yielded in the preliminary literature search, data from eighteen publications (14 case-control, three cohort, and one nested case-control study) on unprocessed red meat, processed meat, and/or total red meat consumption in relation to glioma in adults were included in the analysis. Quality assessment of the studies was performed. A random-effects model was used to conduct the meta-analysis. Results: We found a significant positive association between unprocessed red meat intake and risk of glioma (relative risk [RR] = 1.30; 95% confidence interval [CI]: 1.08-1.58) after excluding three studies with an uncertain type of brain cancer. This analysis included only one cohort study, which revealed no relation between unprocessed red meat intake and glioma (RR = 1.75; 95% CI: 0.35-8.77). Consumption of processed meat was related to an increased risk of glioma in population-based case-control studies (RR = 1.26; 95% CI: 1.05-1.51) and a reduced risk in hospital-based case-control studies (RR = 0.79; 95% CI: 0.65-0.97). No significant association was seen between processed red meat intake and risk of glioma in cohort studies (RR: 1.08; 95% CI: 0.84-1.37). Total red meat consumption was not associated with risk of adult glioma in case-control or cohort studies. Conclusion: In this meta-analysis of 18 observational studies, we found a modest positive association between unprocessed red meat intake and risk of glioma, based almost entirely on case-control studies. Processed red meat was overall not associated with risk of glioma in case-control or cohort studies. PMID:26600837
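
    For readers unfamiliar with how such pooled estimates arise, here is a compact DerSimonian-Laird random-effects sketch in Python; the study values are invented placeholders, not the data of this meta-analysis:

        import numpy as np

        # (RR, lower 95% CI, upper 95% CI) per study -- illustrative values only
        studies = [(1.4, 1.0, 1.9), (1.2, 0.9, 1.6), (1.5, 1.1, 2.1)]

        log_rr = np.log([s[0] for s in studies])
        se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)

        w = 1 / se**2                                  # fixed-effect weights
        fixed = np.sum(w * log_rr) / w.sum()
        q = np.sum(w * (log_rr - fixed)**2)            # Cochran's Q
        tau2 = max(0.0, (q - (len(studies) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

        w_star = 1 / (se**2 + tau2)                    # random-effects weights
        pooled = np.sum(w_star * log_rr) / w_star.sum()
        half = 1.96 / np.sqrt(w_star.sum())
        print(f"RR = {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(pooled - half):.2f}-{np.exp(pooled + half):.2f})")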

  18. Switching and optimizing control for coal flotation process based on a hybrid model

    PubMed Central

    Dong, Zhiyong; Wang, Ranfeng; Fan, Minqiang; Fu, Xiang

    2017-01-01

    Flotation is an important part of coal preparation, and the flotation column is widely applied as efficient flotation equipment. This process is complex and affected by many factors, with the froth depth and reagent dosage being two of the most important and frequently manipulated variables. This paper proposes a new method of switching and optimizing control for the coal flotation process. A hybrid model is built and evaluated using industrial data. First, wavelet analysis and principal component analysis (PCA) are applied for signal pre-processing. Second, a control model for optimizing the set point of the froth depth is constructed based on fuzzy control, and a control model is designed to optimize the reagent dosages based on expert system. Finally, the least squares-support vector machine (LS-SVM) is used to identify the operating conditions of the flotation process and to select one of the two models (froth depth or reagent dosage) for subsequent operation according to the condition parameters. The hybrid model is developed and evaluated on an industrial coal flotation column and exhibits satisfactory performance. PMID:29040305
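
    The switching logic (classify the operating condition, then hand control to the froth-depth or reagent-dosage model) can be sketched with any classifier standing in for the paper's LS-SVM; here a scikit-learn SVC trained on synthetic condition parameters, assuming scikit-learn is available:

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X = rng.normal(0, 1, (200, 3))                 # synthetic condition parameters
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # 0: depth model, 1: dosage model
        clf = SVC(kernel="rbf").fit(X, y)

        def control_step(condition):
            # Route the current operating condition to one of the two controllers.
            if clf.predict(condition.reshape(1, -1))[0] == 0:
                return "optimize froth-depth set point (fuzzy controller)"
            return "optimize reagent dosages (expert system)"

        print(control_step(rng.normal(0, 1, 3)))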

  19. Cognitive process modelling of controllers in en route air traffic control.

    PubMed

    Inoue, Satoru; Furuta, Kazuo; Nakata, Keiichi; Kanno, Taro; Aoyama, Hisae; Brown, Mark

    2012-01-01

    In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, and so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered to be a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors occur due to complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers have an understanding of the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we attempt to analyse and model interactions that take place in en route ATC systems based on distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying certain measures by which to address these problems. This research focuses on the analysis of air traffic controllers' tasks for en route ATC and on modelling controllers' cognitive processes, through an experimental study designed to gain a better understanding of those processes. We conducted ethnographic observations and then analysed the data to develop a model of controllers' cognitive processes. This analysis revealed that strategic routines are applicable to decision making.

  20. Evidence for a neural dual-process account for adverse effects of cognitive control.

    PubMed

    Zink, Nicolas; Stock, Ann-Kathrin; Colzato, Lorenza; Beste, Christian

    2018-06-09

    Advantageous effects of cognitive control are well known, but cognitive control may also have adverse effects, for example when it suppresses the implicit processing of stimulus-response (S-R) bindings that could benefit task performance. Yet the neurophysiological and functional neuroanatomical structures associated with adverse effects of cognitive control are poorly understood. We used an extreme-group approach to compare individuals who exhibit adverse effects of cognitive control to individuals who do not, combining event-related potentials (ERPs), source localization, time-frequency analysis, and network analysis methods. While neurophysiological correlates of cognitive control (i.e., N2, N450, theta power and theta-mediated neuronal network efficiency) and task-set updating (P3) both reflect control demands and implicit information processing, differences in the degree of adverse cognitive control effects are associated with two independent neural mechanisms: individuals who show adverse behavioral effects of cognitive control show reduced small-world properties, and thus reduced efficiency, in theta-modulated networks when they fail to effectively process implicit information. In contrast, individuals who do not display adverse control effects show an enhanced task-set updating mechanism when effectively processing implicit information, reflected by the P3 ERP component and associated with the temporo-parietal junction (TPJ, BA 40) and medial frontal gyrus (MFG; BA 8). These findings suggest that implicit S-R contingencies, which benefit response selection without cognitive control, are always 'picked up', but may fail to be integrated with task representations to guide response selection. This provides evidence for a neurophysiological and functional neuroanatomical "dual-process" account of adverse cognitive control effects.

  1. Analysis of the control structures for an integrated ethanol processor for proton exchange membrane fuel cell systems

    NASA Astrophysics Data System (ADS)

    Biset, S.; Nieto Deglioumini, L.; Basualdo, M.; Garcia, V. M.; Serra, M.

    The aim of this work is to investigate what would be a good preliminary plantwide control structure for the process of hydrogen production from bioethanol for use in a proton exchange membrane (PEM) fuel cell, using only steady-state information. The objective is to keep the process at its optimal operating point, that is, performing energy integration to achieve maximum efficiency. Ethanol, produced from renewable feedstocks, feeds a fuel processor investigated for steam reforming, followed by high- and low-temperature shift reactors and preferential oxidation, which are coupled to a polymeric fuel cell. Applying steady-state simulation techniques and using thermodynamic models, the performance of the complete system with two different control structures has been evaluated for the most typical perturbations. A sensitivity analysis of the key process variables, together with the rigorous operability requirements of the fuel cell, is taken into account in defining an acceptable plantwide control structure. This is the first work showing an alternative control structure applied to this kind of process.
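
    Steady-state sensitivity analysis of this kind can be approximated by central finite differences around the nominal operating point; a toy sketch with a stand-in model function (the real flowsheet would come from the process simulator):

        import numpy as np

        def steady_state_model(u):
            # Stand-in for the fuel-processor flowsheet: maps (ethanol feed,
            # steam-to-carbon ratio, reformer temperature) to an H2-yield figure.
            ethanol, s2c, temp = u
            return 0.8 * ethanol * (1 - np.exp(-temp / 900.0)) * min(s2c / 3.0, 1.0)

        u0 = np.array([1.0, 3.0, 950.0])               # nominal operating point
        h = 1e-4 * np.maximum(np.abs(u0), 1.0)

        for i, name in enumerate(["ethanol feed", "steam/carbon", "temperature"]):
            du = np.zeros(3)
            du[i] = h[i]
            sens = (steady_state_model(u0 + du) - steady_state_model(u0 - du)) / (2 * h[i])
            print(f"d(H2 yield)/d({name}) = {sens:.3e}")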

  2. The effects of computer-assisted instruction and locus of control upon preservice elementary teachers' acquisition of the integrated science process skills

    NASA Astrophysics Data System (ADS)

    Wesley, Beth Eddinger; Krockover, Gerald H.; Devito, Alfred

    The purpose of this study was to determine the effects of computer-assisted instruction (CAI) versus a text mode of programmed instruction (PI), and the cognitive style of locus of control, on preservice elementary teachers' achievement of the integrated science process skills. Eighty-one preservice elementary teachers in six sections of a science methods class were classified as internally or externally controlled. The sections were randomly assigned to receive instruction in the integrated science process skills via a microcomputer or printed text. The study used a pretest-posttest control group design. Before assessing main and interaction effects, analysis of covariance was used to adjust posttest scores using the pretest scores. Statistical analysis revealed that main effects were not significant. Additionally, no interaction effects between treatments and loci of control were demonstrated. The results suggest that printed PI and tutorial CAI are equally effective modes of instruction for teaching internally and externally oriented preservice elementary teachers the integrated science process skills.

  3. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

    An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  4. Identifying changes in the role of the infection preventionist through the 2014 practice analysis study conducted by the Certification Board of Infection Control and Epidemiology, Inc.

    PubMed

    Henman, Lita Jo; Corrigan, Robert; Carrico, Ruth; Suh, Kathryn N

    2015-07-01

    The Certification Board of Infection Control and Epidemiology, Inc (CBIC) is a voluntary autonomous multidisciplinary board that provides direction and administers the certification process for professionals who are responsible for the infection prevention and control program in a health care facility. The CBIC performs a practice analysis approximately every 4-5 years. The practice analysis is an integral part of the certification examination development process and serves as the backbone of the test content outline. In 2013, the CBIC determined that a practice analysis was required and contracted with Prometric to facilitate the process. The practice analysis was carried out in 2014 by a diverse group of subject matter experts from the United States and Canada. The practice analysis results showed a significant change in the number of tasks and associated knowledge required for the competent practice of infection prevention. As authorized by the CBIC, the test committee is currently reclassifying the bank of examination questions as required and is writing and reviewing questions based on the updated test specifications and content outline. The new content outline will be reflected in examinations that are taken beginning in July 2015. This iterative process of assessing and updating the certification examination ensures not only a valid competency tool but a true reflection of current practices. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  5. Clinical process analysis and activity-based costing at a heart center.

    PubMed

    Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans

    2002-08-01

    Cost studies; productivity, efficiency, and quality-of-care measures; and the links between resources and patient outcomes are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, ProcessGuide and CostControl, was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationships to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities, including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the costs obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.

  6. [Investigation on production process quality control of traditional Chinese medicine--Banlangen granule as an example].

    PubMed

    Tan, Manrong; Yan, Dan; Qiu, Lingling; Chen, Longhu; Yan, Yan; Jin, Cheng; Li, Hanbing; Xiao, Xiaohe

    2012-04-01

    In the quality management system for herbal medicines, intermediates, and finished products, the methodology suffers from a "short board" (weakest-link) effect. Based on the concept of process control, new strategies and methods for production process quality control were established, taking into account the actual production of traditional Chinese medicine and the characteristics of Chinese medicine. Taking Banlangen granule, an effective and widely used product, as a practical example, character identification, determination of index components, chemical fingerprinting, and biometric technology were used in sequence to assess the quality of the Banlangen herbal medicine, the intermediates (water extraction and alcohol precipitation), and the finished product. With the transfer rate of chemical information and biological potency as indicators, the effectiveness and transmission of the above assessment and control methods were investigated. Ultimately, process quality control methods for Banlangen granule, based on chemical composition analysis and biometric analysis, were established. These methods can not only resolve the current situation in which many manufacturers produce Banlangen granule of varying quality, but also ensure and enhance its clinical efficacy; furthermore, they provide a foundation for the construction of quality control of the traditional Chinese medicine production process.

  7. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995-96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude, and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that, with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.

  8. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.
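
    The grouping idea can be pictured as an ordinary class that bundles member classes with their thread-of-control ordering; a hedged Python sketch of the concept (the paper's constructs are design-level notations, not code):

        from dataclasses import dataclass, field

        @dataclass
        class LogicalCompositeClass:
            # Groups member classes with the end-to-end thread of control that
            # connects them, so the thread can be traced as a single unit.
            name: str
            members: list = field(default_factory=list)  # participating classes
            thread: list = field(default_factory=list)   # ordered control steps

            def trace(self):
                return " -> ".join(self.thread)

        track_update = LogicalCompositeClass(
            name="TrackUpdateThread",
            members=["Sensor", "Tracker", "Display"],
            thread=["Sensor.read", "Tracker.correlate", "Display.refresh"],
        )
        print(track_update.trace())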

  9. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
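
    A CUSUM person-fit statistic accumulates standardized residuals between observed item scores and model-implied success probabilities; a sketch under a Rasch model with made-up ability and difficulty parameters (not one of the report's six statistics):

        import numpy as np

        rng = np.random.default_rng(4)
        theta = 0.5                                # examinee ability (assumed)
        b = rng.normal(0, 1, 30)                   # item difficulties (assumed)
        p = 1 / (1 + np.exp(-(theta - b)))         # Rasch success probabilities
        x = (rng.random(30) < p).astype(float)     # simulated responses

        resid = (x - p) / np.sqrt(p * (1 - p))     # standardized residuals
        c_plus = c_minus = 0.0
        k, h = 0.5, 4.0                            # reference value, decision limit
        for i, r in enumerate(resid):
            c_plus = max(0.0, c_plus + r - k)
            c_minus = min(0.0, c_minus + r + k)
            if c_plus > h or c_minus < -h:
                print(f"aberrant response pattern flagged at item {i + 1}")
                break
        else:
            print("no misfit signal")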

  10. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  11. Statistical process control for residential treated wood

    Treesearch

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  12. Adaptive Data Processing Technique for Lidar-Assisted Control to Bridge the Gap between Lidar Systems and Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlipf, David; Raach, Steffen; Haizmann, Florian

    2015-12-14

    This paper presents first steps toward an adaptive lidar data processing technique crucial for lidar-assisted control in wind turbines. The prediction time and the quality of the wind preview from lidar measurements depend on several factors and are not constant. If the data processing is not continually adjusted, the benefit of lidar-assisted control cannot be fully exploited, and the result can even be harmful control action. An online analysis of the lidar and turbine data is necessary to continually reassess the prediction time and lidar data quality. In this work, a structured process to develop an analysis tool for the prediction time and a new hardware setup for lidar-assisted control are presented. The tool consists of an online estimation of the rotor-effective wind speed from lidar and turbine data and the implementation of an online cross-correlation to determine the time shift between the two signals. Further, initial results are presented from an ongoing campaign in which this system was employed to provide lidar preview for feed-forward pitch control.
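
    The cross-correlation step can be sketched directly: the estimated prediction time is the lag that maximizes the correlation between the lidar-based and turbine-based wind speed estimates. The signals below are synthetic; the real estimates would come from the lidar and the turbine observer:

        import numpy as np

        dt = 0.1                                            # sample time, s
        t = np.arange(0, 120, dt)
        rng = np.random.default_rng(5)
        wind = 8 + np.cumsum(rng.normal(0, 0.05, t.size))   # synthetic wind speed

        shift = 50                                          # true preview: 5 s
        n = t.size - shift
        lidar = wind[shift:]                                # lidar sees the wind early
        turbine = wind[:n] + rng.normal(0, 0.1, n)          # turbine sees it later, noisy

        a = lidar - lidar.mean()
        b = turbine - turbine.mean()
        xcorr = np.correlate(b, a, mode="full")
        lag = int(np.argmax(xcorr)) - (n - 1)               # samples turbine lags lidar
        print(f"estimated prediction time = {lag * dt:.1f} s (approximately)")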

  13. Adaptive automation of human-machine system information-processing functions.

    PubMed

    Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P

    2005-01-01

    The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.

  14. Neotectonic control on drainage systems: GIS-based geomorphometric and morphotectonic assessment for Crete, Greece

    NASA Astrophysics Data System (ADS)

    Argyriou, Athanasios V.; Teeuw, Richard M.; Soupios, Pantelis; Sarris, Apostolos

    2017-11-01

    Geomorphic indices can be used to examine the geomorphological and tectonic processes responsible for the development of the drainage basins. Such indices can be dependent on tectonics, erosional processes and other factors that control the morphology of the landforms. The inter-relationships between geomorphic indices can determine the influence of regional tectonic activity in the shape development of drainage basins. A Multi-Criteria Decision Analysis (MCDA) procedure has been used to perform an integrated cluster analysis that highlights information associated with the dominant regional tectonic activity. Factor Analysis (FA) and Analytical Hierarchy Process (AHP) were considered within that procedure, producing a representation of the distributed regional tectonic activity of the drainage basins studied. The study area is western Crete, located in the outer fore-arc of the Hellenic subduction zone, one of the world's most tectonically active regions. The results indicate that in the landscape evolution of the study area (especially the western basins) tectonic controls dominate over lithological controls.
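
    The AHP step derives criterion weights from the principal eigenvector of a pairwise comparison matrix; a sketch with an invented three-criterion comparison (the Saaty-scale values are placeholders, not those of the study):

        import numpy as np

        # Pairwise comparisons among three geomorphic criteria -- illustrative only.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = int(np.argmax(eigvals.real))
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                              # normalized criterion weights

        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)      # consistency index
        cr = ci / 0.58                            # random index RI = 0.58 for n = 3
        print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))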

  15. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, a usual tool in Statistical Process Control (SPC) but unusual in climate studies. Series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to conclude whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme and normal observed rainfall days. The autocorrelation among maximum precipitation values is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate attributes control charts for the annual fraction of extreme rainfall days, with which the extreme rainfall processes over the remaining years under study can then be monitored. The results of applying this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the proposed attributes control charts for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
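
    Setting aside the autocorrelation extension the authors introduce, a chart for the annual fraction of extreme rainfall days is essentially a p-chart; a sketch with synthetic annual counts:

        import numpy as np

        rng = np.random.default_rng(6)
        n_days, years = 365, 30
        extreme_days = rng.binomial(n_days, 0.02, years)  # synthetic annual counts

        p = extreme_days / n_days
        p_bar = p.mean()
        sigma = np.sqrt(p_bar * (1 - p_bar) / n_days)
        ucl = p_bar + 3 * sigma
        lcl = max(0.0, p_bar - 3 * sigma)

        signals = np.where((p > ucl) | (p < lcl))[0]
        print(f"p-bar={p_bar:.4f}  UCL={ucl:.4f}  LCL={lcl:.4f}  signal years: {signals}")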

  16. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by NAA laboratory personnel, are time-consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software set up for every one-hour counting interval, and both procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC), consisting of hardware and software, was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, designed and developed to control the ASC hardware and invoke the GammaVision software for sample measurement. The software was developed using the National Instruments LabVIEW development package.

  17. Flexible distributed architecture for semiconductor process control and experimentation

    NASA Astrophysics Data System (ADS)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD-interferometry-based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: the specific implementation of any one task does not restrict the implementation of another. The low-level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers, independent of hardware or software platform.
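
    A "predefined set of TCP/IP socket based messages" usually means framed messages over a socket; below is a minimal length-prefixed JSON exchange, for illustration only (it is unrelated to the SECS-II protocol the equipment itself speaks, and is not the MIT system's actual message set):

        import json
        import socket
        import struct
        import threading

        def send_msg(sock, payload):
            data = json.dumps(payload).encode()
            sock.sendall(struct.pack(">I", len(data)) + data)  # 4-byte length prefix

        def recv_msg(sock):
            (length,) = struct.unpack(">I", sock.recv(4))
            data = b""
            while len(data) < length:
                data += sock.recv(length - len(data))
            return json.loads(data)

        def cell_controller(server):
            # Toy stand-in for the cell controller: acknowledge one command.
            conn, _ = server.accept()
            msg = recv_msg(conn)
            send_msg(conn, {"ack": msg["cmd"], "status": "ok"})
            conn.close()

        server = socket.socket()
        server.bind(("127.0.0.1", 0))
        server.listen(1)
        threading.Thread(target=cell_controller, args=(server,)).start()

        client = socket.create_connection(server.getsockname())
        send_msg(client, {"cmd": "start_etch", "recipe": "demo"})
        print(recv_msg(client))  # {'ack': 'start_etch', 'status': 'ok'}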

  18. An information theory account of cognitive control.

    PubMed

    Fan, Jin

    2014-01-01

    Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.
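
    The quantity of information referred to is Shannon entropy, H = -sum(p * log2 p); a one-function sketch showing that four equiprobable stimulus-response alternatives carry 2 bits of uncertainty while a biased set carries less:

        import numpy as np

        def entropy_bits(p):
            # Shannon entropy of a discrete distribution, in bits.
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
        print(entropy_bits([0.7, 0.1, 0.1, 0.1]))      # about 1.36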

  19. Atmosphere Explorer control system software (version 1.0)

    NASA Technical Reports Server (NTRS)

    Villasenor, A.

    1972-01-01

    The basic design is described of the Atmosphere Explorer Control System (AECS) software used in the testing, integration, and flight control of the AE spacecraft and experiments. The software performs several vital functions, such as issuing commands to the spacecraft and experiments, receiving and processing telemetry data, and allowing for extensive data processing by experiment analysis programs. The major processing sections are: the executive control section, telemetry decommutation section, command generation section, and utility section.

  20. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, A; Samost, A; Viswanathan, A

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers, 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in brachytherapy and may prove to be an alternative to other hazard analysis techniques.

  1. Safety Analysis of Soybean Processing for Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Hentges, Dawn L.

    1999-01-01

    Soybeans (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and importance of crew health maintenance, food safety is a primary concern on long duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and use of multifunctional equipment were made in consideration of the limitations and restraints of the closed ALSSIT.

  2. Model reduction in integrated controls-structures design

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    1993-01-01

    It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.

  3. A Course in... Multivariable Control Methods.

    ERIC Educational Resources Information Center

    Deshpande, Pradeep B.

    1988-01-01

    Describes an engineering course for graduate study in process control. Lists four major topics: interaction analysis, multiloop controller design, decoupling, and multivariable control strategies. Suggests a course outline and gives information about each topic. (MVL)

  4. Interactive computer graphics system for structural sizing and analysis of aircraft structures

    NASA Technical Reports Server (NTRS)

    Bendavid, D.; Pipano, A.; Raibstein, A.; Somekh, E.

    1975-01-01

    A computerized system for preliminary sizing and analysis of aircraft wing and fuselage structures was described. The system is based upon repeated application of analytical program modules, which are interactively interfaced and sequence-controlled during the iterative design process with the aid of design-oriented graphics software modules. The entire process is initiated and controlled via low-cost interactive graphics terminals driven by a remote computer in a time-sharing mode.

  5. Plasma process control with optical emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Ward, P. P.

    Plasma processes for cleaning, etching and desmear of electronic components and printed wiring boards (PWB) are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight-loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. Both of these methods add significantly to the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control is explored. The technique is discussed as it applies to process control, failure analysis and endpoint determination. Methods for identifying process failures, progress, and the end of etch-back and desmear processes are discussed.

  6. [Dyslexia as a dysfunction in successive processing].

    PubMed

    Pérez-Alvarez, F; Timoneda-Gallart, C

    We present a study of reading and writing difficulties persisting after a year of normal instruction, to verify whether these patients show a specific pattern of PASS (Planning, Attention, Simultaneous and Successive) cognitive processing; if so, this would allow rapid diagnosis and useful cognitive remediation according to the PASS theory of intelligence. Thirty patients were selected from neuropediatric patients presenting with learning disability. They were selected according to their performance on several tests of phonological awareness and a writing test designed to detect spelling errors. Patients with verbal language problems, as in dysphasia, and patients with learning difficulty not determined by reading or writing were ruled out. A control group of 300 schoolchildren was used. The translated DN:CAS battery was administered to the study group and the control group to assess PASS cognitive processing. Statistical factor analysis of the control group was performed as a validity confirmation to discriminate the four PASS cognitive processes. Cluster analysis of the study group was performed to test its homogeneity. Differences between means were tested with Student's t-test. The four PASS cognitive processes were identified in the control group. The study group scored more than 1 SD below the mean in successive processing, while the other processes were clearly above this threshold, and the mean of the study group was lower than that of the control group (p = 0.001). A form of dyslexia may thus be defined by dysfunction in PASS successive processing.

  7. Stability and performance analysis of a jump linear control system subject to digital upsets

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Sun, Hui; Ma, Zhen-Yang

    2015-04-01

    This paper focuses on the methodology analysis for the stability and the corresponding tracking performance of a closed-loop digital jump linear control system with a stochastic switching signal. The method is applied to a flight control system. A distributed recoverable platform is implemented on the flight control system and subject to independent digital upsets. The upset processes are used to simulate electromagnetic environments. Specifically, the paper presents the scenarios in which the upset process is directly injected into the distributed flight control system, which is modeled by independent Markov upset processes and independent and identically distributed (IID) processes. A theoretical performance analysis and simulation modelling are both presented in detail for a more complete independent digital upset injection. Specific examples are proposed to verify the methodology of tracking performance analysis. General analyses for different configurations are also proposed. Comparisons among different configurations are conducted to demonstrate the availability and the characteristics of the design. Project supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 61403395), the Natural Science Foundation of Tianjin, China (Grant No. 13JCYBJC39000), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China, the Tianjin Key Laboratory of Civil Aircraft Airworthiness and Maintenance in Civil Aviation of China (Grant No. 104003020106), and the Fund for Scholars of Civil Aviation University of China (Grant No. 2012QD21x).

  8. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…
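
    As a sketch of the kind of chart the students construct, the following computes individuals-chart limits from simulated mixing data (the moving-range estimate with d2 = 1.128 is standard SPC practice; the data and values here are hypothetical, not from the experiment):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
# Hypothetical inline-mixing measurements (e.g., a dye-concentration proxy
# from spectrophotometric absorbance readings).
x = rng.normal(loc=1.00, scale=0.05, size=50)

# Individuals chart: center line and 3-sigma limits estimated from the
# average moving range (MR-bar / d2, with d2 = 1.128 for n = 2).
mr = np.abs(np.diff(x))
sigma_hat = mr.mean() / 1.128
center = x.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.where((x > ucl) | (x < lcl))[0]
print(f"CL={center:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}")
print("Out-of-control points:", out_of_control)
```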

  9. Speed isn’t everything: Complex processing speed measures mask individual differences and developmental changes in executive control

    PubMed Central

    Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko

    2012-01-01

    The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of “processing speed” may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and resulting in overestimation of processing-speed contributions to cognition. This concern may apply particularly to studies of developmental change, as even seemingly simple processing speed measures may require executive processes to keep children and older adults on task. We report two new studies and a re-analysis of a published study, testing predictions about how different processing speed measures influence conclusions about executive control across the life span. We find that the choice of processing speed measure affects the relationship observed between processing speed and executive control, in a manner that changes with age, and that choice of processing speed measure affects conclusions about development and the relationship among executive control measures. Implications for understanding processing speed, executive control, and their development are discussed. PMID:23432836

  10. Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond

    NASA Technical Reports Server (NTRS)

    Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry

    1996-01-01

    The Laboratory Virtual Instrumentation Engineering Workstation (LabVIEW) software is designed such that equipment and processes related to control systems can be operationally linked and controlled by the use of a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation, using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some device was necessary to elevate the potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.

  11. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Treesearch

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...

  12. Vision-sensing image analysis for GTAW process control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, D.D.

    1994-11-01

    Image analysis of a gas tungsten arc welding (GTAW) process was completed using video images from a charge coupled device (CCD) camera inside a specially designed coaxial (GTAW) electrode holder. Video data was obtained from filtered and unfiltered images, with and without the GTAW arc present, showing weld joint features and locations. Data Translation image processing boards, installed in an IBM PC AT 386 compatible computer, and Media Cybernetics image processing software were used to investigate edge flange weld joint geometry for image analysis.

  13. Process- and controller-adaptations determine the physiological effects of cold acclimation.

    PubMed

    Werner, Jürgen

    2008-09-01

    Experimental results on physiological effects of cold adaptation seem confusing and apparently incompatible with one another. This paper will explain that a substantial part of such a variety of results may be deduced from a common functional concept. A core/shell treatment ("model") of the thermoregulatory system is used with mean body temperature as the controlled variable. Adaptation, as a higher control level, is introduced into the system. Due to persistent stressors, either the (heat transfer) process or the controller properties (parameters) are adjusted, or both. It is convenient to call the one "process adaptation" and the other "controller adaptation". The most commonly demonstrated effect of autonomic cold acclimation is a change in the controller threshold. The analysis shows that this necessarily means a lowering of body temperature because of a lowered metabolic rate. This explains experimental results on both Europeans in the climatic chamber and Australian Aborigines in a natural environment. Exclusive autonomic process adaptation occurs in the form of better insulation. The analysis explains why the post-adaptive steady state can only be achieved if the controller system reduces metabolism, and why in spite of this the new state is inevitably characterized by a rise in body temperature. If both process and controller adaptations are simultaneously present, there may be no change of body temperature at all, as demonstrated, for example, in animal experiments. Whether this kind of adaptation delivers a decrease, an increase or no change of mean body temperature depends on the proportion of process and controller adaptation.
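
    A toy numerical sketch of the paper's core distinction, under strong simplifications (a lumped single-node heat balance with a proportional controller; all parameter values are hypothetical): lowering the controller threshold (controller adaptation) lowers steady-state body temperature, while improving insulation (process adaptation) raises it.

```python
def steady_state_temp(t_set, r_ins, t_env=20.0, m_basal=80.0, gain=50.0,
                      dt=0.01, steps=20000):
    """Toy lumped heat balance: dT/dt = M(T) - (T - t_env) / r_ins, with a
    proportional controller M(T) = m_basal + gain * max(0, t_set - T).
    All parameter values are hypothetical."""
    temp = 37.0
    for _ in range(steps):
        metab = m_basal + gain * max(0.0, t_set - temp)
        temp += dt * (metab - (temp - t_env) / r_ins)
    return temp

baseline = steady_state_temp(t_set=37.0, r_ins=0.15)
controller = steady_state_temp(t_set=36.5, r_ins=0.15)  # lowered threshold
process = steady_state_temp(t_set=37.0, r_ins=0.18)     # better insulation
print(f"baseline {baseline:.2f} C, controller adaptation {controller:.2f} C, "
      f"process adaptation {process:.2f} C")
```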

  14. Fuzzy Adaptive Control for Intelligent Autonomous Space Exploration Problems

    NASA Technical Reports Server (NTRS)

    Esogbue, Augustine O.

    1998-01-01

    The principal objective of the research reported here is the re-design, analysis and optimization of our newly developed neural network fuzzy adaptive controller model for complex processes capable of learning fuzzy control rules using process data and improving its control through on-line adaptation. The learned improvement is according to a performance objective function that provides evaluative feedback; this performance objective is broadly defined to meet long-range goals over time. Although fuzzy control had proven effective for complex, nonlinear, imprecisely defined processes for which standard models and controls are either inefficient, impractical or cannot be derived, the state of the art prior to our work showed that procedures for deriving fuzzy control were mostly ad hoc heuristics. The learning ability of neural networks was exploited to systematically derive fuzzy control, permit on-line adaptation, and in the process optimize control. The operation of neural networks integrates very naturally with fuzzy logic. The neural networks, which were designed and tested using simulation software and simulated data, followed by realistic industrial data, were reconfigured for application on several platforms as well as for the employment of improved algorithms. The statistical procedures of the learning process were investigated and evaluated with standard statistical procedures (such as ANOVA, graphical analysis of residuals, etc.). The computational advantage of dynamic programming-like methods of optimal control was used to permit on-line fuzzy adaptive control. Tests for the consistency, completeness and interaction of the control rules were applied. Comparisons to other methods and controllers were made so as to identify the major advantages of the resulting controller model. Several specific modifications and extensions were made to the original controller. Additional modifications and explorations have been proposed for further study. Some of these are in progress in our laboratory while others await additional support. All of these enhancements will improve the attractiveness of the controller as an effective tool for the on-line control of an array of complex process environments.
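
    For illustration, a minimal fuzzy controller of the kind whose rules the project learns automatically. Here the three rules and their membership functions are fixed by hand, so this is only a hand-written sketch of rule firing and weighted-average defuzzification, not the neural-network-derived controller described above.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_control(error):
    """Tiny one-input fuzzy controller: three rules, weighted-average
    defuzzification. Membership parameters and consequents are illustrative."""
    # Rule firing strengths for error = negative / zero / positive.
    w = np.array([tri(error, -2.0, -1.0, 0.0),
                  tri(error, -1.0,  0.0, 1.0),
                  tri(error,  0.0,  1.0, 2.0)])
    u_rules = np.array([-1.0, 0.0, 1.0])  # rule consequents (control action)
    if w.sum() == 0.0:
        return 0.0
    return float((w * u_rules).sum() / w.sum())

for e in (-1.5, -0.3, 0.0, 0.8):
    print(f"error={e:+.1f} -> control={fuzzy_control(e):+.2f}")
```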

  15. Composition and analysis of a model waste for a CELSS (Controlled Ecological Life Support System)

    NASA Technical Reports Server (NTRS)

    Wydeven, T. J.

    1983-01-01

    A model waste based on a modest vegetarian diet is given, including composition and elemental analysis. Its use is recommended for evaluation of candidate waste treatment processes for a Controlled Ecological Life Support System (CELSS).

  16. An information theory account of cognitive control

    PubMed Central

    Fan, Jin

    2014-01-01

    Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory. PMID:25228875
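
    The uncertainty that cognitive control is proposed to resolve can be quantified with Shannon entropy; a minimal illustration follows (the four-alternative choice probabilities are hypothetical):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-alternative choice task: the information to be processed
# is maximal under a uniform distribution and lower when one response
# dominates (a more predictable context).
uniform = [0.25, 0.25, 0.25, 0.25]
biased = [0.70, 0.10, 0.10, 0.10]
print(f"H(uniform) = {shannon_entropy(uniform):.2f} bits")  # 2.00 bits
print(f"H(biased)  = {shannon_entropy(biased):.2f} bits")   # ~1.36 bits
```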

  17. Applications of ICA and fractal dimension in sEMG signal processing for subtle movement analysis: a review.

    PubMed

    Naik, Ganesh R; Arjunan, Sridhar; Kumar, Dinesh

    2011-06-01

    Surface electromyography (sEMG) signal separation and decomposition have always been interesting research topics in the field of rehabilitation and medical research. Subtle myoelectric control is an advanced technique concerned with the detection, processing, classification, and application of myoelectric signals to control human-assisting robots or rehabilitation devices. This paper reviews recent research and development in independent component analysis (ICA) and fractal dimension analysis for sEMG pattern recognition, and presents state-of-the-art achievements in terms of their type, structure, and potential application. Directions for future research are also briefly outlined.
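
    As a sketch of one of the reviewed tools, a plain-NumPy Higuchi fractal dimension estimator (a common FD estimator for sEMG; this particular implementation and its test signals are illustrative, not taken from the review):

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension of a 1-D signal (sketch implementation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Normalized curve length for start offset m and step k.
            diffs = np.abs(np.diff(x[idx])).sum()
            lengths.append(diffs * (n - 1) / ((len(idx) - 1) * k * k))
        log_k.append(np.log(1.0 / k))
        log_l.append(np.log(np.mean(lengths)))
    # FD is the slope of log L(k) against log(1/k).
    return np.polyfit(log_k, log_l, 1)[0]

rng = np.random.default_rng(0)
print(f"white noise FD ~ {higuchi_fd(rng.standard_normal(2000)):.2f}")  # near 2
print(f"sine wave FD  ~ {higuchi_fd(np.sin(np.linspace(0, 20*np.pi, 2000))):.2f}")  # near 1
```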

  18. 300 Area treated effluent disposal facility sampling schedule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loll, C.M.

    1994-10-11

    This document is the interface between the 300 Area Liquid Effluent Process Engineering (LEPE) group and the Waste Sampling and Characterization Facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.

  19. The Controlling Function of the Agent in the Analysis of Question-Response Relationships.

    ERIC Educational Resources Information Center

    Bierschenk, Inger

    In contrast to traditional linguistic analysis, a model based on the empirical agent is presented and tested. A text is regarded as an intentionally produced cognitive process. The analysis has to take the agent (perspective) into account to facilitate an adequate processing of its objectives (viewpoints). Moreover, the model is surface-oriented…

  20. Spectroscopic analysis and control

    DOEpatents

    Tate, James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles

    2017-04-18

    Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.
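
    The patent does not specify the multivariate regression algorithm; partial least squares (PLS) is one common choice for relating spectra to concentration, sketched here with scikit-learn on synthetic data (all names and values are hypothetical):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(42)
# Hypothetical training set: 60 absorbance spectra (200 points each) with
# known analyte concentrations from a reference method. A Gaussian peak
# whose height scales with concentration is buried in noise.
spectra = rng.normal(scale=0.1, size=(60, 200))
conc = rng.uniform(0.0, 5.0, size=60)
peak = np.exp(-0.5 * ((np.arange(200) - 100) / 8.0) ** 2)
spectra += np.outer(conc, peak)

# Fit a PLS model and predict from a new spectrum.
model = PLSRegression(n_components=3).fit(spectra, conc)
new_spectrum = spectra[:1]  # stand-in for a live measurement
print("predicted concentration:", model.predict(new_spectrum).ravel())
```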

  1. Multicapillary SDS-gel electrophoresis for the analysis of fluorescently labeled mAb preparations: a high throughput quality control process for the production of QuantiPlasma and PlasmaScan mAb libraries.

    PubMed

    Székely, Andrea; Szekrényes, Akos; Kerékgyártó, Márta; Balogh, Attila; Kádas, János; Lázár, József; Guttman, András; Kurucz, István; Takács, László

    2014-08-01

    Molecular heterogeneity of mAb preparations is the result of various co- and post-translational modifications and to contaminants related to the production process. Changes in molecular composition results in alterations of functional performance, therefore quality control and validation of therapeutic or diagnostic protein products is essential. A special case is the consistent production of mAb libraries (QuantiPlasma™ and PlasmaScan™) for proteome profiling, quality control of which represents a challenge because of high number of mAbs (>1000). Here, we devise a generally applicable multicapillary SDS-gel electrophoresis process for the analysis of fluorescently labeled mAb preparations for the high throughput quality control of mAbs of the QuantiPlasma™ and PlasmaScan™ libraries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis is stated in the same methodology as that of the ISO GUM standard for calibration and testing. There is a specific distinction between how Type A and Type B uncertainty analyses are used in general and in a specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
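
    A minimal sketch of the GUM-style combination the paper describes: Type A and Type B standard uncertainties are combined in quadrature and expanded with a coverage factor k = 2 (all contributions and values below are hypothetical):

```python
import math

# Hypothetical uncertainty budget for one dimensional measurement (mm).
# Type A: standard deviation of the mean of repeated indications.
repeats_std, n = 0.0040, 10
u_type_a = repeats_std / math.sqrt(n)

# Type B: contributions from other sources, converted to standard
# uncertainties by their assumed distributions (GUM practice).
u_cal = 0.0025 / 2                    # calibration certificate, k=2 normal
u_res = 0.001 / (2 * math.sqrt(3))    # instrument resolution, rectangular
u_temp = 0.0015 / math.sqrt(3)        # thermal effects, rectangular

# Combine in quadrature, then expand with coverage factor k = 2.
u_c = math.sqrt(u_type_a**2 + u_cal**2 + u_res**2 + u_temp**2)
U = 2 * u_c
print(f"combined u_c = {u_c:.4f} mm, expanded U (k=2) = {U:.4f} mm")
```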

  3. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
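
    A sketch of the exponentially weighted moving average (EWMA) chart used in the study, applied to simulated daily output data (the smoothing constant and limit width are common textbook choices, not the study's values):

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=2.7):
    """EWMA control chart statistics and limits (sketch; lam and L are
    typical textbook choices, not the values used in the study)."""
    x = np.asarray(x, dtype=float)
    mu0, sigma = x.mean(), x.std(ddof=1)
    z = np.empty_like(x)
    z[0] = lam * x[0] + (1 - lam) * mu0
    for i in range(1, len(x)):
        z[i] = lam * x[i] + (1 - lam) * z[i - 1]
    i = np.arange(1, len(x) + 1)
    width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
    return z, mu0 + width, mu0 - width

# Hypothetical daily output measurements, in percent of baseline.
rng = np.random.default_rng(7)
daily_output = 100 + rng.normal(0, 0.8, size=60)
z, ucl, lcl = ewma_chart(daily_output)
print("signals on days:", np.where((z > ucl) | (z < lcl))[0])
```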

  4. The concept and science process skills analysis in bomb calorimeter experiment as a foundation for the development of virtual laboratory of bomb calorimeter

    NASA Astrophysics Data System (ADS)

    Kurniati, D. R.; Rohman, I.

    2018-05-01

    This study aims to analyze the concepts and science process skills in the bomb calorimeter experiment as a basis for developing a virtual laboratory of the bomb calorimeter. The study employed the research and development (R&D) method to answer the proposed problems; this paper discusses the concept and process-skill analysis. The essential concepts and process skills associated with the bomb calorimeter were analyzed by optimizing the bomb calorimeter experiment. The concept analysis found seven fundamental concepts to be considered in developing the virtual laboratory: internal energy, heat of combustion, perfect combustion, incomplete combustion, the calorimeter constant, the bomb calorimeter, and Black's principle. Because the concepts of the bomb calorimeter and of perfect and incomplete combustion describe a real situation and contain controllable variables, they are displayed in the virtual laboratory in the form of simulations; the remaining four concepts are presented as animations because they contain no variable to be controlled. The process-skill analysis identified four notable skills to be developed: observation, experimental design, interpretation, and communication.

  5. [PASS neurocognitive dysfunction in attention deficit].

    PubMed

    Pérez-Alvarez, F; Timoneda-Gallart, C

    Attention deficit disorder shows both cognitive and behavioral patterns. The aim was to determine a particular PASS (planning, attention, simultaneous and successive) pattern to enable early diagnosis and remediation according to PASS theory. Eighty patients, 55 boys and 25 girls aged 6 to 12 years, were selected from the neuropediatric service. Inclusion criteria were inattention (80 cases) and inattention with hyperactive symptoms (40 cases) according to the Diagnostic and Statistical Manual (DSM-IV). Exclusion criteria were the previously reported criteria of phonological awareness, considered useful to diagnose dyslexia. A control group of 300 individuals aged 5 to 12 years was used, with the above-mentioned criteria controlled. The DN:CAS (Das-Naglieri Cognitive Assessment System) battery, translated into the native language, was given to assess PASS cognitive processes. Results were analyzed with cluster analysis and Student's t-test. Statistical factor analysis of the control group had previously identified the four PASS processes: planning, attention, simultaneous and successive. The dendrogram of the cluster analysis discriminated three categories of attention deficit disorder: 1. the most frequent, with planning deficit; 2. without planning deficit but with deficit in other processes; and 3. a few cases without cognitive processing deficit. Cognitive deficiency in terms of mean scores was statistically significant when compared to the control group (p = 0.001). According to the PASS pattern, planning deficiency is a relevant factor. Neurological planning is not exactly the same as neurological executive function. The behavioral pattern is mainly linked to planning deficiency, but also to other PASS processing deficits and even to no processing deficit.

  6. Design and Analysis of A Multi-Backend Database System for Performance Improvement, Functionality Expansion and Capacity Growth. Part II.

    DTIC Science & Technology

    1981-08-01

    [Fragmentary abstract] The recoverable content indicates that the report describes the choice of transaction execution for access control, the basic access control mechanism for statistical security and value-dependent security, and the process of request execution with access control for insert and non-insert requests in MDBS (Section 5.5; see also Chapter 4).

  7. Digital signal processing and control and estimation theory -- Points of tangency, area of intersection, and parallel directions

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1976-01-01

    A number of current research directions in the fields of digital signal processing and modern control and estimation theory were studied. Topics such as stability theory, linear prediction and parameter identification, system analysis and implementation, two-dimensional filtering, decentralized control and estimation, image processing, and nonlinear system theory were examined in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the two disciplines. An extensive bibliography is included.

  8. Six-sigma application in tire-manufacturing company: a case study

    NASA Astrophysics Data System (ADS)

    Gupta, Vikash; Jain, Rahul; Meena, M. L.; Dangayach, G. S.

    2017-09-01

    Globalization, advancement of technologies, and increases in customer demand have changed the way companies do business. To overcome these barriers, the six-sigma define-measure-analyze-improve-control (DMAIC) method is most popular and useful. This method helps to trim down waste and generate potential ways of improvement in process as well as service industries. In the current research, the DMAIC method was used to decrease the process variation of the bead splice, which was causing wastage of material. The six-sigma DMAIC research was initiated by problem identification through voice of the customer in the define step. The subsequent step consisted of gathering the specification data of the existing tire bead. This step was followed by the analysis and improvement steps, where six-sigma quality tools such as the cause-effect diagram, statistical process control, and substantial analysis of the existing system were implemented for root-cause identification and reduction of process variation. Process control charts were used for systematic observation and control of the process. Utilizing the DMAIC methodology, the standard deviation was decreased from 2.17 to 1.69. The process capability index (Cp) value was enhanced from 1.65 to 2.95 and the process performance capability index (Cpk) value was enhanced from 0.94 to 2.66. A DMAIC methodology was established that can play a key role in reducing defects in the tire-manufacturing process in India.
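
    The capability indices reported above follow the standard definitions Cp = (USL - LSL) / 6s and Cpk = min(USL - mean, mean - LSL) / 3s; a sketch follows (the data and specification limits are hypothetical, chosen only to illustrate the computation, and do not reproduce the paper's figures):

```python
import numpy as np

def process_capability(data, lsl, usl):
    """Cp and Cpk from sample mean and standard deviation."""
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical bead-splice measurements with illustrative spec limits;
# the paper reports Cp improving 1.65 -> 2.95 and Cpk 0.94 -> 2.66.
rng = np.random.default_rng(3)
before = rng.normal(loc=52.0, scale=2.17, size=200)
after = rng.normal(loc=50.0, scale=1.69, size=200)
lsl, usl = 35.0, 65.0
print("before: Cp=%.2f Cpk=%.2f" % process_capability(before, lsl, usl))
print("after:  Cp=%.2f Cpk=%.2f" % process_capability(after, lsl, usl))
```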

  9. Providing security for automated process control systems at hydropower engineering facilities

    NASA Astrophysics Data System (ADS)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.

    2016-12-01

    This article suggests the concept of a cyberphysical system to manage the computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy for improving the cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, and a secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is its consideration of the interrelations and cyber threats arising when SCADA is integrated with the unified enterprise information system.

  10. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst's intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships. This paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.

  11. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    PubMed

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 (20-49), 82% female) had three consecutive 256-channel resting-state EEGs at one-year intervals. Results of the frequency analysis of AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
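
    Reliability over time was assessed with intraclass correlation coefficients; below is a plain-NumPy sketch of the single-measure consistency ICC(3,1) on hypothetical subject-by-session data (the study's exact ICC variant is not stated here, so this choice is an assumption):

```python
import numpy as np

def icc_3_1(ratings):
    """Two-way mixed, consistency, single-measure ICC(3,1).
    `ratings`: subjects x sessions array."""
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand = y.mean()
    ms_rows = k * ((y.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    sse = ((y - y.mean(axis=1, keepdims=True)
              - y.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical: alpha-band relative power for 34 subjects x 3 yearly EEGs.
rng = np.random.default_rng(5)
subject_effect = rng.normal(0.4, 0.08, size=(34, 1))
power = subject_effect + rng.normal(0, 0.03, size=(34, 3))
print(f"ICC(3,1) = {icc_3_1(power):.2f}")
```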

  12. An efficient visualization method for analyzing biometric data

    NASA Astrophysics Data System (ADS)

    Rahmes, Mark; McGonagle, Mike; Yates, J. Harlan; Henning, Ronda; Hackett, Jay

    2013-05-01

    We introduce a novel application for biometric data analysis. This technology can be used as part of a unique and systematic approach designed to augment existing processing chains. Our system provides image quality control and analysis capabilities. We show how analysis and efficient visualization are used as part of an automated process. The goal of this system is to provide a unified platform for the analysis of biometric images that reduce manual effort and increase the likelihood of a match being brought to an examiner's attention from either a manual or lights-out application. We discuss the functionality of FeatureSCOPE™ which provides an efficient tool for feature analysis and quality control of biometric extracted features. Biometric databases must be checked for accuracy for a large volume of data attributes. Our solution accelerates review of features by a factor of up to 100 times. Review of qualitative results and cost reduction is shown by using efficient parallel visual review for quality control. Our process automatically sorts and filters features for examination, and packs these into a condensed view. An analyst can then rapidly page through screens of features and flag and annotate outliers as necessary.

  13. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed which implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real-time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution) and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
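
    A sketch of the Pareto-analysis step: out-of-control signal counts per metric are sorted and accumulated so the analyst can see which metrics dominate the variance (the metric names mirror those listed above; the counts are hypothetical):

```python
# Hypothetical counts of out-of-control signals per QC metric,
# in the spirit of SProCoP's Pareto summary.
metrics = {"retention time": 18, "peak asymmetry": 7, "resolution": 4,
           "ion intensity": 29, "mass accuracy": 2}

# Sort descending and print count, share, and cumulative share.
items = sorted(metrics.items(), key=lambda kv: kv[1], reverse=True)
total = sum(v for _, v in items)
cum = 0
for name, count in items:
    cum += count
    print(f"{name:15s} {count:3d}  {100*count/total:5.1f}%  "
          f"cum {100*cum/total:5.1f}%")
```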

  14. [Quality control in herbal supplements].

    PubMed

    Oelker, Luisa

    2005-01-01

    The quality and safety of food and herbal supplements result from a combination of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. In addition to standard quality control, herbal supplements need a set of checks to assure the harmlessness and safety of the plants used.

  15. Crossing institutional boundaries: mapping the policy process for improved control of endemic and neglected zoonoses in sub-Saharan Africa.

    PubMed

    Okello, Anna; Welburn, Susan; Smith, James

    2015-07-01

    The recent adoption of the World Health Assembly Resolution 66.12 for neglected tropical diseases (NTDs) in May 2013 is an important turning point for advocacy regarding a number of endemic zoonotic infections, defined by the World Health Organization as the neglected zoonotic diseases (NZDs). In addition to NTD-listed zoonoses such as rabies, echinococcosis (hydatid disease), leishmaniasis, Human African trypanosomiasis (sleeping sickness) and Taenia solium cysticercosis, the NZDs also include important bacterial zoonoses such as anthrax, bovine tuberculosis and brucellosis. To date, analysis of the processes that prioritize, develop and deliver zoonoses control programmes in many low- and middle-income countries is lacking, despite its potential to highlight significant evidence gaps and institutional constraints to the intersectoral approach required for their control. Policy process analysis was conducted via a series of semi-structured interviews with key policy actors within various ministries and institutes in Uganda and Nigeria. The study concluded that despite the rhetoric around 'linear' models of health policy development promoting consultation with a wide range of national stakeholders, the decision-making process for zoonotic disease control appears instead overtly influenced by the external political economy of trending pandemic threats, often overlooking national and regional zoonoses priorities. The inclusion of political systems remains a key factor in the zoonoses analysis matrix, enhancing our understanding of the intersectoral and transdisciplinary approaches required for their control. The authors consider policy process analysis to be a fundamental first step of any attempt to holistically strengthen human and animal health systems in a development context, particularly regarding the promotion of integrated control policies for regionally important zoonoses under the growing One Health movement. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  16. Emotion and Cognition Processes in Preschool Children

    ERIC Educational Resources Information Center

    Leerkes, Esther M.; Paradise, Matthew; O'Brien, Marion; Calkins, Susan D.; Lange, Garrett

    2008-01-01

    The core processes of emotion understanding, emotion control, cognitive understanding, and cognitive control and their association with early indicators of social and academic success were examined in a sample of 141 3-year-old children. Confirmatory factor analysis supported the hypothesized four-factor model of emotion and cognition in early…

  17. 21 CFR 106.25 - In-process control.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false In-process control. 106.25 Section 106.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN..., and carbohydrates (carbohydrates either by analysis or by mathematical difference); (3) The indicator...

  18. 21 CFR 106.25 - In-process control.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false In-process control. 106.25 Section 106.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN..., and carbohydrates (carbohydrates either by analysis or by mathematical difference); (3) The indicator...

  19. 21 CFR 106.25 - In-process control.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false In-process control. 106.25 Section 106.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN..., and carbohydrates (carbohydrates either by analysis or by mathematical difference); (3) The indicator...

  20. 21 CFR 106.25 - In-process control.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false In-process control. 106.25 Section 106.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN..., and carbohydrates (carbohydrates either by analysis or by mathematical difference); (3) The indicator...

  1. Gender, the Labor Process and Dignity at Work

    ERIC Educational Resources Information Center

    Crowley, Martha

    2013-01-01

    This study brings together gender inequality and labor process research to investigate how divergent control structures generate inequality in work experiences for women and men. Content-coded data on 155 work groups are analyzed using Qualitative Comparative Analysis to identify combinations of control techniques encountered by female and male…

  2. Facultative Lagoons. Student Manual. Biological Treatment Process Control.

    ERIC Educational Resources Information Center

    Andersen, Lorri

    The textual material for a unit on facultative lagoons is presented in this student manual. Topic areas discussed include: (1) loading; (2) microbial theory; (3) structure and design; (4) process control; (5) lagoon start-up; (6) data handling and analysis; (7) lagoon maintenance (considering visual observations, pond structure, safety, odor,…

  3. 300 Area treated effluent disposal facility sampling schedule. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loll, C.M.

    1995-03-28

    This document is the interface between the 300 Area liquid effluent process engineering (LEPE) group and the waste sampling and characterization facility (WSCF), concerning process control samples. It contains a schedule for process control samples at the 300 Area TEDF which describes the parameters to be measured, the frequency of sampling and analysis, the sampling point, and the purpose for each parameter.

  4. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hours in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.
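
    A sketch of the Weibull treatment of LOS using SciPy (synthetic data; the parameters are not those of the study, and the study's conditional analysis is only approximated here by a plain maximum-likelihood fit):

```python
import numpy as np
from scipy import stats

# Hypothetical post-colectomy LOS data (days); the study reports a mean
# LOS of 10 days and a Weibull conditional LOS of 9.83 days.
rng = np.random.default_rng(11)
los = stats.weibull_min.rvs(c=1.4, scale=11.0, size=628, random_state=rng)

# Maximum-likelihood Weibull fit with the location fixed at zero.
shape, loc, scale = stats.weibull_min.fit(los, floc=0)
print(f"shape={shape:.2f}, scale={scale:.1f} days")
print(f"fitted mean LOS = {stats.weibull_min.mean(shape, loc, scale):.1f} days")
```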

  5. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    PubMed

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. Other than microbial hazards, the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.

  6. Levels of integration in cognitive control and sequence processing in the prefrontal cortex.

    PubMed

    Bahlmann, Jörg; Korb, Franziska M; Gratton, Caterina; Friederici, Angela D

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) find their neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex.

  7. Levels of Integration in Cognitive Control and Sequence Processing in the Prefrontal Cortex

    PubMed Central

    Bahlmann, Jörg; Korb, Franziska M.; Gratton, Caterina; Friederici, Angela D.

    2012-01-01

    Cognitive control is necessary to flexibly act in changing environments. Sequence processing is needed in language comprehension to build the syntactic structure in sentences. Functional imaging studies suggest that sequence processing engages the left ventrolateral prefrontal cortex (PFC). In contrast, cognitive control processes additionally recruit bilateral rostral lateral PFC regions. The present study aimed to investigate these two types of processes in one experimental paradigm. Sequence processing was manipulated using two different sequencing rules varying in complexity. Cognitive control was varied with different cue-sets that determined the choice of a sequencing rule. Univariate analyses revealed distinct PFC regions for the two types of processing (i.e. sequence processing: left ventrolateral PFC and cognitive control processing: bilateral dorsolateral and rostral PFC). Moreover, in a common brain network (including left lateral PFC and intraparietal sulcus) no interaction between sequence and cognitive control processing was observed. In contrast, a multivariate pattern analysis revealed an interaction of sequence and cognitive control processing, such that voxels in left lateral PFC and parietal cortex showed different tuning functions for tasks involving different sequencing and cognitive control demands. These results suggest that the difference between the process of rule selection (i.e. cognitive control) and the process of rule-based sequencing (i.e. sequence processing) find their neuronal underpinnings in distinct activation patterns in lateral PFC. Moreover, the combination of rule selection and rule sequencing can shape the response of neurons in lateral PFC and parietal cortex. PMID:22952762

  8. Taguchi Method Applied in Optimization of Shipley SJR 5740 Positive Resist Deposition

    NASA Technical Reports Server (NTRS)

    Hui, A.; Blosiu, J. O.; Wiberg, D. V.

    1998-01-01

    Taguchi Methods of Robust Design present a way to optimize output process performance through an organized set of experiments using orthogonal arrays. Analysis of variance and signal-to-noise ratios are used to evaluate the contribution of each of the controllable process parameters to the realization of the process optimization. In the photoresist deposition process, there are numerous controllable parameters that can affect the surface quality and thickness of the final photoresist layer.
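
    A sketch of the signal-to-noise computation used to score each orthogonal-array run; nominal-the-best is one standard Taguchi S/N form (an assumption here, since the paper's exact form is not stated), and the replicate thickness values below are hypothetical:

```python
import numpy as np

def sn_nominal_the_best(y):
    """Taguchi S/N ratio for a nominal-the-best response:
    10 * log10(mean^2 / variance)."""
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Hypothetical resist-thickness replicates (micrometers) for four runs
# of a small two-level orthogonal-array design (factors A and B).
runs = [((0, 0), [1.02, 1.05, 0.99]),
        ((0, 1), [1.10, 1.21, 0.95]),
        ((1, 0), [1.00, 1.01, 1.02]),
        ((1, 1), [0.90, 1.12, 1.05])]

for levels, reps in runs:
    print(f"A,B={levels}  S/N = {sn_nominal_the_best(reps):.1f} dB")
```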

  9. Process monitoring and control with CHEMIN, a miniaturized CCD-based instrument for simultaneous XRD/XRF analysis

    NASA Astrophysics Data System (ADS)

    Vaniman, David T.; Bish, D.; Guthrie, G.; Chipera, S.; Blake, David E.; Collins, S. Andy; Elliott, S. T.; Sarrazin, P.

    1999-10-01

    There is a large variety of mining and manufacturing operations where process monitoring and control can benefit from on-site analysis of both chemical and mineralogic constituents. CHEMIN is a CCD-based instrument capable of both X-ray fluorescence (XRF; chemical) and X-ray diffraction (XRD; mineralogic) analysis. Monitoring and control with an instrument like CHEMIN can be applied to feedstocks, intermediate materials, and final products to optimize production. Examples include control of cement feedstock, of ore for smelting, and of minerals that pose inhalation hazards in the workplace. The combined XRD/XRF capability of CHEMIN can be used wherever a desired commodity is associated with unwanted constituents that may be similar in chemistry or structure but not both (e.g., Ca in both gypsum and feldspar, where only the gypsum is desired to make wallboard). In the mining industry, CHEMIN can determine mineral abundances on the spot and enable more economical mining by providing the means to assay what is being mined, quickly and frequently, at minimal cost. In manufacturing, CHEMIN could be used to spot-check the chemical composition and crystalline makeup of a product at any stage of production. Analysis by CHEMIN can be used as feedback in manufacturing processes where rates of heating, process temperature, mixture of feedstocks, and other variables must be adjusted in real time to correct the structure and/or chemistry of the product (e.g., prevention of periclase and alkali sulfate coproduction in cement manufacture).

  10. Robustness of reduced-order multivariable state-space self-tuning controller

    NASA Technical Reports Server (NTRS)

    Yuan, Zhuzhi; Chen, Zengqiang

    1994-01-01

    In this paper, we present a quantitative analysis of the robustness of a reduced-order pole-assignment state-space self-tuning controller for a multivariable adaptive control system whose order of the real process is higher than that of the model used in the controller design. The result of stability analysis shows that, under a specific bounded modelling error, the adaptively controlled closed-loop real system via the reduced-order state-space self-tuner is BIBO stable in the presence of unmodelled dynamics.

  11. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between what is currently possible in process supervision and control of pharmaceutical production processes and what is actually applied in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptation of production reactors to the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  12. Dynamic analysis of process reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

The approach and methodology of conducting a dynamic analysis are presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas{trademark} gasification process is used to illustrate the utility of this approach. PyGas{trademark} is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Sirrine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas{trademark} gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key inputs, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  13. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    NASA Astrophysics Data System (ADS)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

With the increasing attention paid to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by the affinity propagation clustering algorithm. Each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
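
    A minimal sketch of the two statistical building blocks named above — clustering of controlled variables by affinity propagation, then input selection by canonical correlation analysis — is given below on synthetic data; the variable layout and scikit-learn usage are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

# Synthetic plant data: 500 samples, 10 candidate inputs u, 6 controlled variables y.
U = rng.normal(size=(500, 10))
Y = np.column_stack([
    U[:, 0] + 0.1 * rng.normal(size=500),             # y0..y2 driven by u0/u1
    U[:, 0] + U[:, 1] + 0.1 * rng.normal(size=500),
    U[:, 1] + 0.1 * rng.normal(size=500),
    U[:, 5] + 0.1 * rng.normal(size=500),             # y3..y5 driven by u5/u6
    U[:, 5] + U[:, 6] + 0.1 * rng.normal(size=500),
    U[:, 6] + 0.1 * rng.normal(size=500),
])

# Step 1: partition controlled variables into subsystems by clustering
# their mutual correlation pattern with affinity propagation.
corr = np.corrcoef(Y, rowvar=False)
labels = AffinityPropagation(random_state=0).fit(np.abs(corr)).labels_

# Step 2: for each subsystem, rank candidate inputs by canonical correlation
# with the subsystem's outputs and keep the strongest ones.
for k in sorted(set(labels)):
    Yk = Y[:, labels == k]
    scores = []
    for j in range(U.shape[1]):
        cca = CCA(n_components=1).fit(U[:, [j]], Yk)
        a, b = cca.transform(U[:, [j]], Yk)
        scores.append(abs(np.corrcoef(a.ravel(), b.ravel())[0, 1]))
    top = np.argsort(scores)[::-1][:2]
    print(f"subsystem {k}: outputs {np.where(labels == k)[0]}, selected inputs {top}")
```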

  14. The Prevention Program for Externalizing Problem Behavior (PEP) Improves Child Behavior by Reducing Negative Parenting: Analysis of Mediating Processes in a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Hanisch, Charlotte; Hautmann, Christopher; Plück, Julia; Eichelberger, Ilka; Döpfner, Manfred

    2014-01-01

    Background: Our indicated Prevention program for preschool children with Externalizing Problem behavior (PEP) demonstrated improved parenting and child problem behavior in a randomized controlled efficacy trial and in a study with an effectiveness design. The aim of the present analysis of data from the randomized controlled trial was to identify…

  15. Application of Sigma Metrics Analysis for the Assessment and Modification of Quality Control Program in the Clinical Chemistry Laboratory of a Tertiary Care Hospital.

    PubMed

    Iqbal, Sahar; Mustansar, Tazeen

    2017-03-01

Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy to improve the quality of a laboratory process by addressing errors after identification. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metric. For this purpose, sigma metric analysis was done for analytes using internal and external quality control as quality indicators. Results of the sigma metric analysis were used to identify gaps and the need for modification in the strategy of the laboratory quality control procedure. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered to be 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both levels of control. For the rest of the analytes the sigma metric was found to be <3. The lowest value for sigma was found for chloride (1.1) at L2. The highest value of sigma was found for creatinine (10.1) at L3. HDL was found to have the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, the application of sigma rules provided a practical solution for an improved and focused design of QC procedures.
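
    The sigma metric referred to here is commonly computed as (TEa − |bias|)/CV, with all terms in percent; the sketch below applies that formula to invented level-2 glucose figures, not the study's actual data.

```python
# Sigma metric for a laboratory analyte: sigma = (TEa - |bias|) / CV,
# with allowable total error (TEa), bias and CV all expressed in percent.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical level-2 control data (not the study's actual values):
# glucose with TEa = 10%, bias = 2.1% (from EQA), CV = 2.4% (from IQC).
sigma = sigma_metric(tea_pct=10.0, bias_pct=2.1, cv_pct=2.4)
print(f"sigma = {sigma:.1f}")   # ~3.3 -> acceptable (>= 3)

# A common Westgard-style reading of the result:
if sigma >= 6:
    print("excellent: minimal QC rules suffice (e.g. 1-3s)")
elif sigma >= 3:
    print("acceptable: moderate multirule QC")
else:
    print("poor: strict multirule QC and procedure modification needed")
```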

  16. Application of the user-centred design process according ISO 9241-210 in air traffic control.

    PubMed

    König, Christina; Hofmann, Thomas; Bruder, Ralph

    2012-01-01

Designing a usable human-machine interface for air traffic control is challenging and should follow approved methods. The ISO 9241-210 standard promises high usability of products by integrating future users and following an iterative process. This contribution describes the procedure and first results of the analysis and application of ISO 9241-210 to develop a planning tool for air traffic controllers.

  17. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.

  18. The use of isotypic control antibodies in the analysis of CD3+ and CD3+, CD4+ lymphocyte subsets by flow cytometry. Are they really necessary?

    PubMed

    Sreenan, J J; Tbakhi, A; Edinger, M G; Tubbs, R R

    1997-02-01

Isotypic control reagents are defined as irrelevant antibodies of the same immunoglobulin class as the relevant reagent antibody in a flow cytometry panel. The use of the isotypic control antibody has been advocated as a necessary quality control measure in flow cytometric analysis. The purpose of this study was to determine the necessity of an isotypic control antibody in the analysis of CD3+ and CD3+, CD4+ lymphocyte subsets. We performed a prospective study of 46 consecutive patient samples received for lymphocyte subset analysis to determine the need for the isotypic control. For each sample, a sham buffer (autocontrol) and an isotypic control reagent were stained for three-color immunofluorescence, processed, and identically analyzed with Attractors software. The Attractors software allowed independent, multiparametric, simultaneous gating; was able to identically and reproducibly process each list mode file; and yielded population data in spreadsheet form. Statistical analysis (Fisher's z test) revealed no difference between the CD3+ autocontrol and CD3+ isotypic control (correlation = 1, P < .0001) or between the CD3+, CD4+ autocontrol and the CD3+, CD4+ isotypic control (correlation = 1, P < .0001). The elimination of the isotypic control reagent resulted in a total cost savings of $3.36 per test. Additionally, the subtraction of isotypic background can artifactually depress population enumeration. The use of an isotypic control antibody is not necessary to analyze flow cytometric data that result in discrete cell populations, such as CD3+ and CD3+, CD4+ lymphocyte subsets. The elimination of this unnecessary quality control measure results in substantial cost savings.

  19. Application of systems and control theory-based hazard analysis to radiation oncology.

    PubMed

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, as well as causal scenarios that can lead to the identified unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, and a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations on system safety and generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve safety and prevent accidents and warrants further investigation.

  20. Numerical continuation and bifurcation analysis in aircraft design: an industrial perspective.

    PubMed

    Sharma, Sanjiv; Coetzee, Etienne B; Lowenberg, Mark H; Neild, Simon A; Krauskopf, Bernd

    2015-09-28

Bifurcation analysis is a powerful method for studying the steady-state nonlinear dynamics of systems. Software tools exist for the numerical continuation of steady-state solutions as parameters of the system are varied. These tools make it possible to generate 'maps of solutions' in an efficient way that provide valuable insight into the overall dynamic behaviour of a system and potentially to influence the design process. While this approach has been employed in the military aircraft control community to understand the effectiveness of controllers, the use of bifurcation analysis in the wider aircraft industry is still limited. This paper reports progress on how bifurcation analysis can play a role as part of the design process for passenger aircraft. © 2015 The Author(s).
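
    A toy illustration of numerical continuation is sketched below: natural-parameter continuation with Newton correction on a scalar steady-state equation. Industrial continuation tools are far more sophisticated (pseudo-arclength stepping, branch switching, bifurcation detection); this shows only the core idea, and the example system is invented.

```python
import numpy as np

# Toy steady-state equation f(x, mu) = mu + x - x**3 = 0 (a fold/bistable system).
def f(x, mu):
    return mu + x - x**3

def dfdx(x, mu):
    return 1.0 - 3.0 * x**2

# Natural-parameter continuation: sweep mu, correcting the previous solution
# with Newton's method so the solver tracks one branch of steady states.
def continue_branch(x0, mus, tol=1e-10, max_iter=50):
    branch, x = [], x0
    for mu in mus:
        for _ in range(max_iter):
            step = f(x, mu) / dfdx(x, mu)
            x -= step
            if abs(step) < tol:
                break
        branch.append(x)
    return np.array(branch)

# Track the upper stable branch; it exists here until the fold near mu = -0.385.
mus = np.linspace(-0.3, 0.3, 61)
upper = continue_branch(x0=1.2, mus=mus)
print(f"x at mu={mus[0]:.2f}: {upper[0]:.3f}; x at mu={mus[-1]:.2f}: {upper[-1]:.3f}")
```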

  1. [Application progress on near infrared spectroscopy in quality control and process monitoring of traditional Chinese medicine].

    PubMed

    Li, Wenlong; Qu, Haibin

    2017-01-25

The traditional Chinese medicine (TCM) industry faces problems such as quality fluctuation of raw materials and unstandardized production processes. Near infrared (NIR) spectroscopy is widely used in the quality control of TCM because it is information-rich, fast and nondestructive. The main applications include quantitative analysis of Chinese medicinal materials, intermediates and Chinese patent medicines; identification of the authenticity, species, origins and manufacturers of TCM; and monitoring and control of the extraction, alcohol precipitation, column chromatography and blending processes. This article reviews the progress of applications of NIR spectroscopy in the TCM field. In view of the problems existing in these applications, the article proposes that the standardization of NIR analysis methods should be developed according to the specific characteristics of TCM, which will promote the application of NIR technology in the TCM industry.
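
    Quantitative NIR calibration of the kind described is typically built with partial least squares (PLS) regression; the sketch below fits a PLS model to synthetic spectra, where the wavelength grid, marker band and concentrations are all invented for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in for NIR data: 80 calibration samples x 200 wavelengths,
# with absorbance loosely related to the content of one marker compound.
content = rng.uniform(0.5, 5.0, size=80)              # reference assay (mg/g)
wavelengths = np.linspace(1100, 2500, 200)            # nm
peak = np.exp(-((wavelengths - 1700) / 60.0) ** 2)    # marker absorption band
spectra = (content[:, None] * peak[None, :]
           + 0.05 * rng.normal(size=(80, 200)))       # plus measurement noise

# PLS regression is the workhorse chemometric model for NIR calibration;
# PLSRegression.score returns R^2, so cross_val_score reports predictive R^2.
pls = PLSRegression(n_components=3)
r2 = cross_val_score(pls, spectra, content, cv=5)
print(f"cross-validated R^2: {r2.mean():.3f}")

pls.fit(spectra, content)
new_batch = 2.7 * peak + 0.05 * rng.normal(size=200)  # an 'unknown' sample
print(f"predicted content: {pls.predict(new_batch.reshape(1, -1))[0, 0]:.2f} mg/g")
```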

  2. Combat Ration Advanced Manufacturing Technology Demonstration (CRAMTD). ’Generic Inspection-Statistical Process Control System for a Combat Ration Manufacturing Facility’. Short Term Project (STP) Number 3.

    DTIC Science & Technology

    1996-01-01

... failure as due to an adhesive layer between the foil and inner polypropylene layers. ... Under subcontract, NFPA provided HACCP draft manuals for the ... parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product ... played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National ...

  3. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  4. [Analysis and countermeasure for quality risk in process of traditional Chinese medicine preparations].

    PubMed

    Yang, Ming; Yang, Yuan-Zhen; Wang, Ya-Qi; Wu, Zhen-Feng; Wang, Xue-Cheng; Luo, Jing

    2017-03-01

Product quality relies not only on testing methods, but also on design and development, production control, and logistics management across all aspects of product manufacturing. Quality is built at the process control level. Therefore, it is very important to accurately identify the factors that may induce quality risk in the production process and to define corresponding quality control measures. This article systematically analyzes the sources of quality risk in all aspects of the production process of traditional Chinese medicine preparations, discusses ways and methods of quality risk identification for traditional Chinese medicine preparations, and provides references for perfecting whole-process quality management of traditional Chinese medicine preparations. Copyright© by the Chinese Pharmaceutical Association.

  5. Intelligent system of coordination and control for manufacturing

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2016-08-01

This paper aims to shape an intelligent monitoring and control system that optimizes the material and information flows of a company. The paper presents a model for an intelligent real-time tracking and control system. The production system proposed for simulation analysis provides the ability to track and control the process in real time. Using simulation models, the following can be understood: the influence of changes in system structure, the influence of commands on the general condition of the manufacturing process, and the influence of process conditions on the behavior of some system parameters. The practical character of the work consists of tracking and real-time control of the technological process. It is based on modular systems analyzed using mathematical models, graphic-analytical sizing, configuration, optimization and simulation.

  6. [Application of quality by design in granulation process for Ginkgo leaf tablet (Ⅲ): process control strategy based on design space].

    PubMed

    Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high shear wet granulation process of the ginkgo leaf tablet was established based on the design space, to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure modes and effects analysis (FMEA). A Plackett-Burman experimental design was used to screen the pCPPs, and the results demonstrated that the binder amount, the wet massing time and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high shear wet granulation process was developed within the pCPP ranges based on the Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the P-values of the lack-of-fit tests were more than 0.1, indicating that the relationship between CQAs and CPPs could be well described by the mathematical models. D50 could be controlled within 170 to 500 μm, and the bulk density could be controlled within 0.30 to 0.44 g•cm⁻³, by using any CPP combination within the scope of the design space. Besides, granules produced by process parameters within the design space region could also meet the requirement of tensile strength of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
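
    A sketch of how such a design space can be mapped is shown below: a quadratic response-surface model is evaluated over a grid of coded CPP settings and filtered against the CQA specifications quoted above. The model coefficients are invented placeholders, not the fitted Box-Behnken coefficients from the study.

```python
import numpy as np
from itertools import product

# Coded settings (-1, 0, +1) for the three CPPs named in the study:
# binder amount, wet massing time, impeller speed.
def quadratic(X, beta):
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    terms = np.column_stack([np.ones(len(X)), x1, x2, x3,
                             x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])
    return terms @ beta

# Hypothetical quadratic-model coefficients for the two CQAs.
beta_d50 = np.array([320, 90, 40, 25, 10, 5, 8, -30, -15, -10])   # D50, um
beta_da = np.array([0.37, 0.03, 0.02, 0.01, 0.005, 0.002,
                    0.003, -0.01, -0.008, -0.004])                 # Da, g/cm^3

# Design space = all CPP combinations whose predicted CQAs meet the specs
# D50 in [170, 500] um and Da in [0.30, 0.44] g/cm^3.
grid = np.array(list(product(np.linspace(-1, 1, 21), repeat=3)))
d50, da = quadratic(grid, beta_d50), quadratic(grid, beta_da)
ok = (d50 >= 170) & (d50 <= 500) & (da >= 0.30) & (da <= 0.44)
print(f"{ok.mean():.0%} of the coded grid lies inside the design space")
```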

  7. A device for automatic photoelectric control of the analytical gap for emission spectrographs

    USGS Publications Warehouse

    Dietrich, John A.; Cooley, Elmo F.; Curry, Kenneth J.

    1977-01-01

A photoelectric device has been built that automatically controls the analytical gap between electrodes during the excitation period. The control device allows for precise control of the analytical gap during the arcing process of samples, resulting in better precision of analysis.

  8. Unmanned Aerial Vehicle Flight Test Approval Process and Its Implications: A Methodological Approach to Capture and Evaluate Hidden Costs and Value in the Overall Process

    DTIC Science & Technology

    2012-03-22

world’s first powered and controlled flying machine. Numerous flight designs and tests were done by scientists, engineers, and flight enthusiasts ... conceptual flight and preliminary designs before they could control the craft with three-axis control and the correct airfoil design. These pioneers ... analysis support. Although wind tunnel testing can provide data to predict and develop control surface designs, few SUAV operators opt to utilize wind ...

  9. Parallel Wavefront Analysis for a 4D Interferometer

    NASA Technical Reports Server (NTRS)

    Rao, Shanti R.

    2011-01-01

This software provides a programming interface for automating data collection with a PhaseCam interferometer from 4D Technology, and for distributing the image-processing algorithm across a cluster of general-purpose computers. Multiple instances of 4Sight (4D Technology's proprietary software) run on a networked cluster of computers. Each connects to a single server (the controller) and waits for instructions. The controller directs the interferometer to capture several images, then assigns each image to a different computer for processing. When the image processing is finished, the server directs one of the computers to collate and combine the processed images, saving the resulting measurement in a file on disk. The available software captures approximately 100 images and analyzes them immediately. This software separates the capture and analysis processes, so that analysis can be done at a different time and faster by running the algorithm in parallel across several processors. The PhaseCam family of interferometers can measure an optical system in milliseconds, but it takes many seconds to process the data so that it is usable. In characterizing an adaptive optics system, like the next generation of astronomical observatories, thousands of measurements are required, and the processing time quickly becomes excessive. A programming interface distributes data processing for a PhaseCam interferometer across a Windows computing cluster. A scriptable controller program coordinates data acquisition from the interferometer, storage on networked hard disks, and parallel processing. Idle time of the interferometer is minimized. This architecture is implemented in Python and JavaScript, and may be altered to fit a customer's needs.

  10. Interdisciplinary study of atmospheric processes and constituents of the mid-Atlantic coastal region.. [air pollution control studies in Virginia

    NASA Technical Reports Server (NTRS)

    Kindle, E. C.; Bandy, E. C.; Copeland, G.; Blais, R.; Levy, G.; Sonenshine, D.

    1975-01-01

    Past research projects for the year 1974-1975 are listed along with future research programs in the area of air pollution control, remote sensor analysis of smoke plumes, the biosphere component, and field experiments. A detailed budget analysis is presented. Attachments are included on the following topics: mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques, and use of LARS system for the quantitative determination of smoke plume lateral diffusion coefficients from ERTS images of Virginia.

  11. Evolutionary analysis of groundwater flow: Application of multivariate statistical analysis to hydrochemical data in the Densu Basin, Ghana

    NASA Astrophysics Data System (ADS)

    Yidana, Sandow Mark; Bawoyobie, Patrick; Sakyi, Patrick; Fynn, Obed Fiifi

    2018-02-01

An evolutionary trend has been postulated through the analysis of hydrochemical data of a crystalline rock aquifer system in the Densu Basin, Southern Ghana. Hydrochemical data from 63 groundwater samples, taken from two main groundwater outlets (boreholes and hand-dug wells), were used to postulate an evolutionary theory for the basin. Sequential factor and hierarchical cluster analysis were used to decompose the data into three factors and five clusters (spatial associations). These were used to characterize the controls on groundwater hydrochemistry and its evolution in the terrain. The dissolution of soluble salts and cation exchange processes are the dominant processes controlling groundwater hydrochemistry in the terrain. The trend of evolution of this set of processes follows the pattern of groundwater flow predicted by a calibrated transient groundwater model in the area. The data suggest that anthropogenic activities represent the second most important process in the hydrochemistry. Silicate mineral weathering is the third most important set of processes. Groundwater associations resulting from Q-mode hierarchical cluster analysis indicate an evolutionary pattern consistent with the general groundwater flow pattern in the basin. These key findings are at variance with results of previous investigations and indicate that, when carefully done, groundwater hydrochemical data analysis can be very useful for conceptualizing groundwater flow in basins.
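
    A minimal sketch of the R-mode/Q-mode workflow described above — factor extraction followed by hierarchical clustering of samples — is given below on a synthetic stand-in for the 63-sample hydrochemical table; the parameter list and data are assumptions for illustration, not the Densu Basin dataset.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)

# Stand-in hydrochemical table: 63 samples x 8 parameters (e.g. Na, Ca, Mg,
# K, Cl, HCO3, SO4, NO3); real data would replace this synthetic block.
X = zscore(rng.lognormal(mean=0.0, sigma=0.5, size=(63, 8)), axis=0)

# R-mode step: extract three factors summarizing the controls on chemistry.
fa = FactorAnalysis(n_components=3, random_state=0).fit(X)
loadings = fa.components_.T          # rows = parameters, columns = factors
print("factor loadings (rows = parameters):")
print(np.round(loadings, 2))

# Q-mode step: Ward hierarchical clustering of samples into five associations.
Z = linkage(X, method="ward")
clusters = fcluster(Z, t=5, criterion="maxclust")
print("samples per cluster:", np.bincount(clusters)[1:])
```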

  12. Fuzzy control of a fluidized bed dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taprantzis, A.V.; Siettos, C.I.; Bafas, G.V.

    1997-05-01

Fluidized bed dryers are utilized in almost every area of drying applications and therefore improved control strategies are always of great interest. The nonlinear character of the process, exhibited in the mathematical model and the open loop analysis, implies that a fuzzy logic controller is appropriate because, in contrast with conventional control schemes, fuzzy control inherently compensates for process nonlinearities and exhibits more robust behavior. In this study, a fuzzy logic controller is proposed; its design is based on a heuristic approach and its performance is compared against a conventional PI controller for a variety of responses. It is shown that the fuzzy controller exhibits a remarkable dynamic behavior, equivalent to if not better than the PI controller, for a wide range of disturbances. In addition, the proposed fuzzy controller seems to be less sensitive to the nonlinearities of the process, achieves energy savings and enables MIMO control.
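
    A minimal sketch of the idea of fuzzy control is given below: triangular membership functions, a three-rule base and weighted-average defuzzification driving a toy first-order dryer temperature loop. This is a generic illustration under invented plant parameters, not the heuristic controller designed in the paper.

```python
import numpy as np

# Triangular membership function (a < b < c) on a normalized universe.
def tri(x, a, b, c):
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

# Three fuzzy sets for the normalized temperature error: NEG, ZERO, POS.
def fuzzify(e):
    return {"NEG": tri(e, -1.5, -1.0, 0.0),
            "ZERO": tri(e, -1.0, 0.0, 1.0),
            "POS": tri(e, 0.0, 1.0, 1.5)}

# Rule base mapping error sets to normalized heater-adjustment singletons,
# e.g. "if error is NEG then decrease heating".
RULES = {"NEG": -1.0, "ZERO": 0.0, "POS": 1.0}

# Sugeno-style weighted-average defuzzification.
def fuzzy_controller(error):
    mu = fuzzify(float(np.clip(error, -1.0, 1.0)))
    return sum(mu[s] * RULES[s] for s in RULES) / (sum(mu.values()) + 1e-12)

# Toy first-order dryer temperature loop under the fuzzy controller.
T, setpoint, u = 20.0, 60.0, 0.0
for _ in range(200):
    e = (setpoint - T) / 40.0            # normalize error to roughly [-1, 1]
    u += 0.1 * fuzzy_controller(e)       # incremental (PI-like) control action
    T += 0.05 * (-T + 20.0 + 50.0 * u)   # simple first-order plant update
print(f"final temperature: {T:.1f} C (setpoint {setpoint} C)")
```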

  13. HACCP-Based Programs for Preventing Disease and Injury from Premise Plumbing: A Building Consensus

    PubMed Central

    McCoy, William F.; Rosenblatt, Aaron A.

    2015-01-01

    Thousands of preventable injuries and deaths are annually caused by microbial, chemical and physical hazards from building water systems. Water is processed in buildings before use; this can degrade the quality of the water. Processing steps undertaken on-site in buildings often include conditioning, filtering, storing, heating, cooling, pressure regulation and distribution through fixtures that restrict flow and temperature. Therefore, prevention of disease and injury requires process management. A process management framework for buildings is the hazard analysis and critical control point (HACCP) adaptation of failure mode effects analysis (FMEA). It has been proven effective for building water system management. Validation is proof that hazards have been controlled under operating conditions and may include many kinds of evidence including cultures of building water samples to detect and enumerate potentially pathogenic microorganisms. However, results from culture tests are often inappropriately used because the accuracy and precision are not sufficient to support specifications for control limit or action triggers. A reliable negative screen is based on genus-level Polymerase Chain Reaction (PCR) for Legionella in building water systems; however, building water samples with positive results from this test require further analysis by culture methods. PMID:26184325

  14. HACCP-Based Programs for Preventing Disease and Injury from Premise Plumbing: A Building Consensus.

    PubMed

    McCoy, William F; Rosenblatt, Aaron A

    2015-07-09

    Thousands of preventable injuries and deaths are annually caused by microbial, chemical and physical hazards from building water systems. Water is processed in buildings before use; this can degrade the quality of the water. Processing steps undertaken on-site in buildings often include conditioning, filtering, storing, heating, cooling, pressure regulation and distribution through fixtures that restrict flow and temperature. Therefore, prevention of disease and injury requires process management. A process management framework for buildings is the hazard analysis and critical control point (HACCP) adaptation of failure mode effects analysis (FMEA). It has been proven effective for building water system management. Validation is proof that hazards have been controlled under operating conditions and may include many kinds of evidence including cultures of building water samples to detect and enumerate potentially pathogenic microorganisms. However, results from culture tests are often inappropriately used because the accuracy and precision are not sufficient to support specifications for control limit or action triggers. A reliable negative screen is based on genus-level Polymerase Chain Reaction (PCR) for Legionella in building water systems; however, building water samples with positive results from this test require further analysis by culture methods.

  15. [Internal audit in medical laboratory: what means of control for an effective audit process?].

    PubMed

    Garcia-Hejl, Carine; Chianéa, Denis; Dedome, Emmanuel; Sanmartin, Nancy; Bugier, Sarah; Linard, Cyril; Foissaud, Vincent; Vest, Philippe

    2013-01-01

To prepare for the French Accreditation Committee (COFRAC) visit for the initial accreditation of our medical laboratory, our management evaluated its quality management system (QMS) and all its technical activities. This evaluation was performed by means of an internal audit, which was outsourced. The auditors had expertise in auditing, thorough knowledge of biological standards, and were independent. Several nonconformities were identified at that time, including a lack of control of several steps of the internal audit process. Hence, the necessary corrective actions were taken in order to meet the requirements of the standards, in particular the formalization of all stages, from the audit program to the implementation, review and follow-up of the corrective actions taken, and also the implementation of the resources needed to carry out audits within a pre-established schedule. To ensure optimum control of each step, the main concepts of risk management were applied: the process approach, root cause analysis, and failure modes, effects and criticality analysis (FMECA). After a critical analysis of our practices, this methodology allowed us to define our "internal audit" process, then to formalize it and follow it up, with a complete documentary system.

  16. Vision-aided Monitoring and Control of Thermal Spray, Spray Forming, and Welding Processes

    NASA Technical Reports Server (NTRS)

    Agapakis, John E.; Bolstad, Jon

    1993-01-01

    Vision is one of the most powerful forms of non-contact sensing for monitoring and control of manufacturing processes. However, processes involving an arc plasma or flame such as welding or thermal spraying pose particularly challenging problems to conventional vision sensing and processing techniques. The arc or plasma is not typically limited to a single spectral region and thus cannot be easily filtered out optically. This paper presents an innovative vision sensing system that uses intense stroboscopic illumination to overpower the arc light and produce a video image that is free of arc light or glare and dedicated image processing and analysis schemes that can enhance the video images or extract features of interest and produce quantitative process measures which can be used for process monitoring and control. Results of two SBIR programs sponsored by NASA and DOE and focusing on the application of this innovative vision sensing and processing technology to thermal spraying and welding process monitoring and control are discussed.

  17. SARTools: A DESeq2- and EdgeR-Based R Pipeline for Comprehensive Differential Analysis of RNA-Seq Data.

    PubMed

    Varet, Hugo; Brillet-Guéguen, Loraine; Coppée, Jean-Yves; Dillies, Marie-Agnès

    2016-01-01

Several R packages exist for the detection of differentially expressed genes from RNA-Seq data. The analysis process includes three main steps, namely normalization, dispersion estimation and testing for differential expression. Quality control steps along this process are recommended but not mandatory, and failing to check the characteristics of the dataset may lead to spurious results. In addition, normalization methods and statistical models are not exchangeable across the packages without adequate transformations, of which users are often not aware. Thus, dedicated analysis pipelines are needed to include systematic quality control steps and prevent errors from misusing the proposed methods. SARTools is an R pipeline for differential analysis of RNA-Seq count data. It can handle designs involving two or more conditions of a single biological factor with or without a blocking factor (such as a batch effect or a sample pairing). It is based on DESeq2 and edgeR and is composed of an R package and two R script templates (for DESeq2 and edgeR, respectively). By tuning a small number of parameters and executing one of the R scripts, users gain access to the full results of the analysis, including lists of differentially expressed genes and an HTML report that (i) displays diagnostic plots for quality control and model hypothesis checking and (ii) keeps track of the whole analysis process, parameter values and versions of the R packages used. SARTools provides systematic quality controls of the dataset as well as diagnostic plots that help to tune the model parameters. It gives access to the main parameters of DESeq2 and edgeR and prevents untrained users from misusing some functionalities of both packages. By keeping track of all the parameters of the analysis process, it fits the requirements of reproducible research.

  18. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering, are important elements of the systematic planning and analysis that is needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industries. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  19. [Design of a HACCP plan for the industrial process of frozen sardines].

    PubMed

    Rosas, Patricia; Reyes, Genara

    2009-09-01

The Hazard Analysis and Critical Control Point (HACCP) system is used to identify, assess and control the hazards related to production, processing, distribution and consumption in order to obtain safe food. The aim of this study was to design a HACCP plan for implementation in the processing line of frozen whole sardine (Sardinella aurita). The methodology was based on an evaluation of compliance with the prerequisite programs (GMP/SSOP, in a previous study), the application of the principles of HACCP, and the sequence of stages established by the COVENIN Venezuelan standard No. 3802. Time-temperature was recorded at each processing step. Histamine was determined with VERATOX (NEOGEN). Results showed that some sardine batches arrived at the plant with high time-temperature records, with up to 5 ppm of histamine found due to temperature abuse during transportation. A HACCP plan is proposed covering the scope, the selection of the team, the description of the product and its intended use, the flow diagram of the process, hazard analysis and identification of CCPs, the monitoring system, corrective actions and records. The potential hazards were identified as pathogen growth, presence of histamine, and physical objects in the sardines. The control measures for the CCPs concern control of time-temperature during transportation and processing, monitoring of ice supplies, and sanitary conditions of the process.

  20. Simulation of process identification and controller tuning for flow control system

    NASA Astrophysics Data System (ADS)

    Chew, I. M.; Wong, F.; Bono, A.; Wong, K. I.

    2017-06-01

The PID controller is undeniably the most popular method used in controlling various industrial processes. The ability to tune its three elements has allowed the controller to deal with the specific needs of industrial processes. This paper discusses the three control actions and the improvement of controller robustness through combinations of these actions in various forms. A plant model is simulated using the Process Control Simulator in order to evaluate controller performance. At first, the open loop response of the plant is studied by applying a step input to the plant and collecting the output data. A first-order-plus-dead-time (FOPDT) model of the physical plant is then formed using both Matlab-Simulink and the process reaction curve (PRC) method. The controller settings are then calculated to find the values of Kc and τi that give satisfactory control of the closed loop system. The performance of the closed loop system is assessed by set point tracking analysis and disturbance rejection performance. To optimize the overall physical system performance, refined tuning (or detuning) of the PID is further conducted to ensure a consistent output of the closed loop system in reaction to set point changes and disturbances. As a result, PB = 100 (%) and τi = 2.0 (s) are preferred for setpoint tracking, while PB = 100 (%) and τi = 2.5 (s) are selected for rejecting the imposed disturbance. In short, the choice of tuning values likewise depends on the control objective and the required stability performance of the overall physical model.
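
    The closed-loop behaviour described can be reproduced in a few lines: the sketch below simulates a FOPDT process under an ideal PI law with PB = 100% (Kc = 1) and τi = 2.0 s, the setpoint-tracking settings quoted above; the process gain, time constant and dead time are invented for illustration, not the paper's identified model.

```python
import numpy as np

# FOPDT process: Kp * exp(-theta*s) / (tau*s + 1), simulated by Euler stepping.
Kp, tau, theta, dt = 1.0, 4.0, 1.0, 0.05
delay = int(theta / dt)

# PI controller in the proportional-band convention: Kc = 100 / PB.
PB, tau_i = 100.0, 2.0
Kc = 100.0 / PB

t_end, sp = 40.0, 1.0                       # unit setpoint step at t = 0
y, integral = 0.0, 0.0
u_hist = [0.0] * (delay + 1)                # buffer implementing the dead time
ys = []
for _ in range(int(t_end / dt)):
    e = sp - y
    integral += e * dt
    u = Kc * (e + integral / tau_i)         # ideal (non-interacting) PI law
    u_hist.append(u)
    u_delayed = u_hist[-(delay + 1)]        # input applied theta seconds ago
    y += dt * (-y + Kp * u_delayed) / tau   # first-order lag dynamics
    ys.append(y)

print(f"y at t={t_end:.0f}s: {ys[-1]:.3f} (setpoint {sp})")
```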

  1. A novel patterning control strategy based on real-time fingerprint recognition and adaptive wafer level scanner optimization

    NASA Astrophysics Data System (ADS)

    Cekli, Hakki Ergun; Nije, Jelle; Ypma, Alexander; Bastani, Vahid; Sonntag, Dag; Niesing, Henk; Zhang, Linmiao; Ullah, Zakir; Subramony, Venky; Somasundaram, Ravin; Susanto, William; Matsunobu, Masazumi; Johnson, Jeff; Tabery, Cyrus; Lin, Chenxi; Zou, Yi

    2018-03-01

In addition to lithography process and equipment induced variations, processes like etching, annealing, film deposition and planarization exhibit variations, each having their own intrinsic characteristics and leaving an effect, a 'fingerprint', on the wafers. With ever tighter requirements for CD and overlay, controlling these process induced variations is both increasingly important and increasingly challenging in advanced integrated circuit (IC) manufacturing. For example, the on-product overlay (OPO) requirement for future nodes is approaching <3 nm, requiring the allowable budget for process induced variance to become extremely small. Process variance control is seen as a bottleneck to further shrinkage, which drives the need for more sophisticated process control strategies. In this context we developed a novel 'computational process control strategy' which provides the capability of proactive control of each individual wafer with the aim of maximizing yield, without a significant impact on metrology requirements, cycle time or productivity. The complexity of the wafer process is approached by characterizing the full wafer stack and building a fingerprint library containing key patterning performance parameters like overlay, focus, etc. Historical wafer metrology is decomposed into dominant fingerprints using Principal Component Analysis. By associating observed fingerprints with their origin, e.g. process steps, tools and variables, we can give an inline assessment of the strength and origin of the fingerprints on every wafer. Once the fingerprint library is established, wafer-specific fingerprint correction recipes can be determined based on each wafer's processing history. Data science techniques are used in real time to ensure that the library is adaptive. To realize this concept, ASML TWINSCAN scanners play a vital role with their on-board full wafer detection and exposure correction capabilities. High density metrology data is created by the scanner for each wafer and on every layer during the lithography steps. This metrology data is used to obtain the process fingerprints. Also, the per-exposure and per-wafer correction potential of the scanners is utilized for improved patterning control. Additionally, the fingerprint library provides early detection of excursions for inline root cause analysis and process optimization guidance.
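
    A minimal sketch of the fingerprint-library idea — PCA over historical wafer metrology, then projection of a new wafer onto the learned fingerprints — is given below; the fingerprint shapes, process attributions and data sizes are synthetic assumptions, not ASML's implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Stand-in for historical overlay metrology: 300 wafers x 500 measurement
# points. Two synthetic process fingerprints (say, an etch-chamber radial
# term and an anneal bow) are mixed with wafer-specific strengths.
points = 500
fp_etch = np.sin(np.linspace(0, 3 * np.pi, points))
fp_anneal = np.linspace(-1, 1, points) ** 2
history = (rng.normal(size=(300, 1)) * fp_etch
           + rng.normal(size=(300, 1)) * fp_anneal
           + 0.05 * rng.normal(size=(300, points)))

# Build the fingerprint library: dominant spatial patterns via PCA.
pca = PCA(n_components=2).fit(history)
print("variance explained:", np.round(pca.explained_variance_ratio_, 3))

# Inline assessment of a new wafer: project its metrology onto the library
# to estimate how strongly each dominant pattern is present.
new_wafer = 1.8 * fp_etch - 0.4 * fp_anneal + 0.05 * rng.normal(size=points)
strengths = pca.transform(new_wafer.reshape(1, -1))[0]
print("fingerprint strengths for this wafer:", np.round(strengths, 2))
```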

  2. 78 FR 17142 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-20

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Correction AGENCY... manufacturing, packing, or holding human food (CGMPs) to modernize it and to add requirements for domestic and... ``food-production purposes (i.e., manufacturing, processing, packing, and holding) to consistently use...

  3. Laboratory cost control and financial management software.

    PubMed

    Mayer, M

    1998-02-09

Economic constraints within the health care system advocate the introduction of tighter cost control in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on this cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies the use of software to achieve optimized cost control. One commercially available cost analysis software package, LabCost, is described in some detail. In addition to providing cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection process for a new high-throughput analyzer for a large clinical chemistry service is taken as an example of decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will undoubtedly have a clear advantage over laboratories that fail to employ cost considerations to guide their actions.

  4. Neural Network Based Modeling and Analysis of LP Control Surface Allocation

    NASA Technical Reports Server (NTRS)

    Langari, Reza; Krishnakumar, Kalmanje; Gundy-Burlet, Karen

    2003-01-01

This paper presents an approach to interpretive modeling of LP-based control allocation in intelligent flight control. The emphasis is placed on a nonlinear interpretation of the LP allocation process as a static map, to support analytical study of the resulting closed loop system, albeit in approximate form. The approach makes use of a bi-layer neural network to capture the essential functioning of the LP allocation process. It is further shown via Lyapunov-based analysis that, under certain relatively mild conditions, the resulting closed loop system is stable. Some preliminary conclusions from a study at Ames are stated and directions for further research are given at the conclusion of the paper.

  5. Image data-processing system for solar astronomy

    NASA Technical Reports Server (NTRS)

    Wilson, R. M.; Teuber, D. L.; Watkins, J. R.; Thomas, D. T.; Cooper, C. M.

    1977-01-01

    The paper describes an image data processing system (IDAPS), its hardware/software configuration, and interactive and batch modes of operation for the analysis of the Skylab/Apollo Telescope Mount S056 X-Ray Telescope experiment data. Interactive IDAPS is primarily designed to provide on-line interactive user control of image processing operations for image familiarization, sequence and parameter optimization, and selective feature extraction and analysis. Batch IDAPS follows the normal conventions of card control and data input and output, and is best suited where the desired parameters and sequence of operations are known and when long image-processing times are required. Particular attention is given to the way in which this system has been used in solar astronomy and other investigations. Some recent results obtained by means of IDAPS are presented.

  6. Combining principles of Cognitive Load Theory and diagnostic error analysis for designing job aids: Effects on motivation and diagnostic performance in a process control task.

    PubMed

    Kluge, Annette; Grauel, Britta; Burkolter, Dina

    2013-03-01

    Two studies are presented in which the design of a procedural aid and the impact of an additional decision aid for process control were assessed. In Study 1, a procedural aid was developed that avoids imposing unnecessary extraneous cognitive load on novices when controlling a complex technical system. This newly designed procedural aid positively affected germane load, attention, satisfaction, motivation, knowledge acquisition and diagnostic speed for novel faults. In Study 2, the effect of a decision aid for use before the procedural aid was investigated, which was developed based on an analysis of diagnostic errors committed in Study 1. Results showed that novices were able to diagnose both novel faults and practised faults, and were even faster at diagnosing novel faults. This research contributes to the question of how to optimally support novices in dealing with technical faults in process control. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  7. EPIBLASTER-fast exhaustive two-locus epistasis detection strategy using graphical processing units

    PubMed Central

    Kam-Thong, Tony; Czamara, Darina; Tsuda, Koji; Borgwardt, Karsten; Lewis, Cathryn M; Erhardt-Lehmann, Angelika; Hemmer, Bernhard; Rieckmann, Peter; Daake, Markus; Weber, Frank; Wolf, Christiane; Ziegler, Andreas; Pütz, Benno; Holsboer, Florian; Schölkopf, Bernhard; Müller-Myhsok, Bertram

    2011-01-01

    Detection of epistatic interaction between loci has been postulated to provide a more in-depth understanding of the complex biological and biochemical pathways underlying human diseases. Studying the interaction between two loci is the natural progression following traditional and well-established single locus analysis. However, the added costs and time duration required for the computation involved have thus far deterred researchers from pursuing a genome-wide analysis of epistasis. In this paper, we propose a method allowing such analysis to be conducted very rapidly. The method, dubbed EPIBLASTER, is applicable to case–control studies and consists of a two-step process in which the difference in Pearson's correlation coefficients is computed between controls and cases across all possible SNP pairs as an indication of significant interaction warranting further analysis. For the subset of interactions deemed potentially significant, a second-stage analysis is performed using the likelihood ratio test from the logistic regression to obtain the P-value for the estimated coefficients of the individual effects and the interaction term. The algorithm is implemented using the parallel computational capability of commercially available graphical processing units to greatly reduce the computation time involved. In the current setup and example data sets (211 cases, 222 controls, 299468 SNPs; and 601 cases, 825 controls, 291095 SNPs), this coefficient evaluation stage can be completed in roughly 1 day. Our method allows for exhaustive and rapid detection of significant SNP pair interactions without imposing significant marginal effects of the single loci involved in the pair. PMID:21150885
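
    The screening stage reduces to dense linear algebra, which is what makes it GPU-friendly; the sketch below reproduces the core computation on CPU with NumPy for toy genotype matrices. The sizes, genotype coding and quantile threshold are illustrative assumptions, not the EPIBLASTER code.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy genotype matrices coded 0/1/2: rows = individuals, columns = SNPs.
n_cases, n_controls, n_snps = 200, 220, 1000
cases = rng.integers(0, 3, size=(n_cases, n_snps)).astype(float)
controls = rng.integers(0, 3, size=(n_controls, n_snps)).astype(float)

def corr_matrix(G):
    # Pearson correlations of all SNP pairs via a standardized matrix product;
    # this vectorized form is the GPU-friendly core of the screening stage.
    Z = (G - G.mean(axis=0)) / (G.std(axis=0) + 1e-12)
    return (Z.T @ Z) / len(G)

# Stage 1: difference in SNP-SNP correlation between cases and controls.
delta = corr_matrix(cases) - corr_matrix(controls)
iu = np.triu_indices(n_snps, k=1)          # consider each pair once

# Keep the most extreme pairs for the stage-2 logistic-regression LRT.
threshold = np.quantile(np.abs(delta[iu]), 0.9999)
hits = [(i, j, delta[i, j])
        for i, j in zip(*iu) if abs(delta[i, j]) >= threshold]
print(f"{len(hits)} candidate pairs pass the screening threshold")
```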

  8. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  9. Nonlinear Markov Control Processes and Games

    DTIC Science & Technology

    2012-11-15

the analysis of a new class of stochastic games, nonlinear Markov games, as they arise as a (competitive) controlled version of nonlinear Markov processes ... (competitive interests) a nonlinear Markov game that we are investigating. ... corresponding stochastic game Γ+(T, h). In a slightly different setting one can assume that changes in a competitive control process occur as a ...

  10. Scater: pre-processing, quality control, normalization and visualization of single-cell RNA-seq data in R.

    PubMed

    McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F

    2017-04-15

    Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.

  11. A neuro-data envelopment analysis approach for optimization of uncorrelated multiple response problems with smaller the better type controllable factors

    NASA Astrophysics Data System (ADS)

    Bashiri, Mahdi; Farshbaf-Geranmayeh, Amir; Mogouie, Hamed

    2013-11-01

In this paper, a new method is proposed for multi-response optimization problems based on the Taguchi method, for processes whose controllable factors are smaller-the-better (STB) variables and where the analyst desires an optimal solution with smaller amounts of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since not all possible combinations of factor levels are considered in the Taguchi method, the response values of the unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned by a central composite design (CCD) and a genetic algorithm (GA). Then data envelopment analysis (DEA) is applied to determine the efficiency of each treatment. Although the core philosophy of DEA is the maximization of outputs versus the minimization of inputs, this issue has been neglected in previous similar studies of multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified on a plastic molding process, and a sensitivity analysis is carried out with an efficiency-estimating neural network. The results show the efficiency of the proposed approach.
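
    The DEA step can be made concrete with a small linear program; the sketch below solves the input-oriented CCR envelopment model for each treatment, directly expressing the maximize-outputs/minimize-inputs philosophy the authors emphasize. The data are invented, and this is a generic CCR model rather than the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 6 treatments (DMUs), 2 inputs (controllable factor usage) and
# 2 outputs (quality responses). Values are invented for illustration.
X = np.array([[2, 3], [3, 2], [4, 4], [5, 3], [2, 5], [6, 6]], float)  # inputs
Y = np.array([[4, 3], [3, 4], [5, 5], [4, 4], [3, 3], [6, 5]], float)  # outputs
n = len(X)

def ccr_efficiency(k):
    # Input-oriented CCR envelopment form for DMU k:
    #   min theta  s.t.  sum_j lam_j * x_j <= theta * x_k,
    #                    sum_j lam_j * y_j >= y_k,   lam >= 0.
    # Decision vector v = [theta, lam_1, ..., lam_n].
    c = np.zeros(n + 1)
    c[0] = 1.0
    m_in, m_out = X.shape[1], Y.shape[1]
    A_ub = np.zeros((m_in + m_out, n + 1))
    b_ub = np.zeros(m_in + m_out)
    A_ub[:m_in, 0] = -X[k]        # X^T lam - theta * x_k <= 0
    A_ub[:m_in, 1:] = X.T
    A_ub[m_in:, 1:] = -Y.T        # -Y^T lam <= -y_k
    b_ub[m_in:] = -Y[k]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for k in range(n):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```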

  12. Utilization of a Multi-Disciplinary Approach to Building Effective Command Centers: Process and Products

    DTIC Science & Technology

    2005-06-01

cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management ...

  13. Containerless processing of undercooled melts

    NASA Technical Reports Server (NTRS)

    Perepezko, J. H.

    1993-01-01

The investigation focused on the control of microstructural evolution in Mn-Al, Fe-Ni, Ni-V, and Au-Pb-Sb alloys through the high undercooling levels provided by containerless processing, and provided fundamental new information on the control of nucleation. Solidification analysis was conducted by means of thermal analysis, x-ray diffraction, and metallographic characterization on samples processed in a laboratory scale drop tube system. The Mn-Al alloy system offers a useful model system with the capability of phase separation on an individual particle basis, thus permitting a more complete understanding of the operative kinetics and the key containerless processing variables. This system provided the opportunity of analyzing the nucleation rate as a function of processing conditions and allowed for the quantitative assessment of the relevant processing parameters. These factors are essential in the development of a containerless processing model which has a predictive capability. Similarly, Ni-V is a model system that was used to study duplex partitionless solidification, a structure possible only in high-undercooling solidification processes. Nucleation kinetics for the competing bcc and fcc phases were studied to determine how this structure can develop and the conditions under which it may occur. The Fe-Ni alloy system was studied to identify microstructural transitions with controlled variations in sample size and composition during containerless solidification. This work was directed toward developing a microstructure map which delineates regimes of structural evolution and provides a unified analysis of experimental observations. The Au-Pb-Sb system was investigated to characterize the thermodynamic properties of the undercooled liquid phase and to characterize the glass transition under a variety of processing conditions. By analyzing key containerless processing parameters in a ground based drop tube study, a carefully designed flight experiment may be planned to utilize the extended duration microgravity conditions of orbiting spacecraft.

  14. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built by design, formed in the manufacturing process, and improved during the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD are interpreted. Considering the complex nature of Chinese medicine, the "4H" model was developed and proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, which is consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from patient requirements and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive, and intelligent quality control strategies. Holistic process optimization improves product quality and process capability during product lifecycle management. The implementation of QbD helps eliminate the ecosystem contradictions in the pharmaceutical development and manufacturing of Chinese medicine products, and helps guarantee cost effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  15. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special-cause variation when data were stratified by surgeon, clinical area, or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The application of Statistical Process Control methods offers the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.

  16. Systems and methods for interactive virtual reality process control and simulation

    DOEpatents

    Daniel, Jr., William E.; Whitney, Michael A.

    2001-01-01

    A system for visualizing, controlling, and managing information includes a data analysis unit for interpreting and classifying raw data using analytical techniques. A data flow coordination unit routes data from its source to other components within the system. A data preparation unit handles the graphical preparation of the data, and a data rendering unit presents the data in a three-dimensional interactive environment where the user can observe, interact with, and interpret the data. A user can view the information on various levels, from a high overall process-level view, to a view illustrating linkage between variables, to a view of the hard data itself or of the results of an analysis of the data. The system allows a user to monitor a physical process in real time and further allows the user to manage and control the information in a manner not previously possible.

  17. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to the monitoring and control of industrial processes. In particular, it has been demonstrated that analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple-peak analysis is less than 20 ms running on a conventional PC.
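
    The electronic-temperature estimate mentioned above is conventionally obtained from a Boltzmann plot over several emission lines of one species; the sketch below illustrates that standard calculation (not the authors' LPO peak-detection algorithm), with invented line data.

    # Boltzmann-plot estimate of the plasma electronic temperature:
    # ln(I*lam / (g*A)) = C - E_k / (k_B * T), so a line fit gives the slope -1/(k_B*T).
    import numpy as np

    K_B_EV = 8.617333e-5                      # Boltzmann constant, eV/K

    lam = np.array([538.3, 544.7, 546.2])     # line wavelengths, nm (illustrative)
    E_k = np.array([4.37, 5.06, 5.61])        # upper-level energies, eV
    g_k = np.array([5.0, 7.0, 9.0])           # upper-level degeneracies
    A_k = np.array([5.6e7, 8.4e7, 9.0e7])     # transition probabilities, 1/s
    I   = np.array([1.00, 0.62, 0.31])        # measured peak intensities (a.u.)

    y = np.log(I * lam / (g_k * A_k))
    slope, _ = np.polyfit(E_k, y, 1)          # slope = -1 / (k_B * T)
    T_e = -1.0 / (K_B_EV * slope)
    print(f"Estimated electronic temperature: {T_e:.0f} K")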

  18. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, A; Rowbottom, C

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In TrueBeam release 2.0, the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian MPC as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three TrueBeam linacs and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts, and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. While this makes multivariate analysis of little use, it suggests that each test gives independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian.
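
    As a rough illustration of the charting described above, the following sketch computes Shewhart individuals-chart limits at the 95% level, with sigma estimated from the average moving range; the daily output readings are simulated, not MPC data.

    # Shewhart individuals chart with 95% control limits for a daily QA metric.
    import numpy as np

    rng = np.random.default_rng(1)
    output = 100 + 0.4 * rng.standard_normal(60)     # daily beam-output readings (%)

    center = output.mean()
    sigma = np.abs(np.diff(output)).mean() / 1.128   # moving-range estimate of sigma
    ucl, lcl = center + 1.96 * sigma, center - 1.96 * sigma   # 95% limits

    flags = np.where((output > ucl) | (output < lcl))[0]      # days to investigate
    print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  out-of-control days: {flags}")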

  19. The development of Canadian nursing: professionalization and proletarianization.

    PubMed

    Coburn, D

    1988-01-01

    In this article, the development of nursing in Canada is described in terms of three major time periods: the emergence of lay nursing, including organization and registration, 1870-1930; the move to the hospital, 1930-1950; and unionization and the routinization of health care, 1950 to the present. This development is viewed in the light of the orienting concepts of professionalization, proletarianization, and medical dominance (and gender analysis). This historical trajectory of nursing shows an increasing occupational autonomy but continuing struggles over control of the labor process. Nursing is now using theory, organizational changes in health care, and credentialism to help make nursing "separate from but equal to" medicine and to gain control over the day-to-day work of the nurse. Nursing can thus be viewed as undergoing processes of both professionalization and proletarianization. As nursing seeks to control the labor process, its occupational conflicts are joined to the class struggle of white-collar workers in general. Analysis of nursing indicates the problems involved in sorting out the meaning of concepts that are relevant to occupational or class analysis but which focus on the same empirical phenomenon.

  20. Cost Estimation and Control for Flight Systems

    NASA Technical Reports Server (NTRS)

    Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)

    2002-01-01

    Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust Technical, Schedule, Cost, Risk, and Cost Risk practices before it can incorporate adequate Cost Control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with Risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.

  1. Process control using fiber optics and Fourier transform infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Kemsley, E. K.; Wilson, Reginald H.

    1992-03-01

    A process control system has been constructed using optical fibers interfaced to a Fourier transform infrared (FT-IR) spectrometer, to achieve remote spectroscopic analysis of food samples during processing. The multichannel interface accommodates six fibers, allowing the sequential observation of up to six samples. Novel fiber-optic sampling cells have been constructed, including transmission and attenuated total reflectance (ATR) designs. Different fiber types have been evaluated; in particular, plastic clad silica (PCS) and zirconium fluoride fibers. Processes investigated have included the dilution of fruit juice concentrate, and the addition of alcohol to fruit syrup. Suitable algorithms have been written which use the results of spectroscopic measurements to control and monitor the course of each process, by actuating devices such as valves and switches.

  2. Online low-field NMR spectroscopy for process control of an industrial lithiation reaction-automated data analysis.

    PubMed

    Kern, Simon; Meyer, Klas; Guhl, Svetlana; Gräßer, Patrick; Paul, Andrea; King, Rudibert; Maiwald, Michael

    2018-05-01

    Monitoring specific chemical properties is the key to chemical process control. Today, mainly optical online methods are applied, which require time- and cost-intensive calibration effort. NMR spectroscopy, whose advantage is that it is a direct-comparison method requiring no calibration, has high potential for enabling closed-loop process control while exhibiting short set-up times. Compact NMR instruments make NMR spectroscopy accessible in industrial and rough environments for process monitoring and advanced process control strategies. We present a fully automated data analysis approach based entirely on physically motivated spectral models as first-principles information (indirect hard modeling, IHM) and apply it to a given pharmaceutical lithiation reaction in the framework of the European Union's Horizon 2020 project CONSENS. Online low-field NMR (LF NMR) data were analyzed by IHM with low calibration effort, compared to a multivariate PLS-R (partial least squares regression) approach, and both were validated using online high-field NMR (HF NMR) spectroscopy. Graphical abstract: NMR sensor module for monitoring the aromatic coupling of 1-fluoro-2-nitrobenzene (FNB) with aniline to 2-nitrodiphenylamine (NDPA) using lithium bis(trimethylsilyl)amide (Li-HMDS) in continuous operation. Online 43.5 MHz low-field NMR (LF) was compared to 500 MHz high-field NMR spectroscopy (HF) as the reference method.
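
    The PLS-R baseline mentioned above can be sketched with scikit-learn as follows; the spectra and reference concentrations are synthetic placeholders, not the CONSENS data.

    # PLS regression calibration of spectra against reference concentrations.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.standard_normal((80, 200))                # 80 spectra x 200 channels
    y = X[:, 40] - 0.5 * X[:, 120] + 0.1 * rng.standard_normal(80)  # reference conc.

    pls = PLSRegression(n_components=5)               # latent variables to retain
    r2 = cross_val_score(pls, X, y, cv=5).mean()      # cross-validated R^2
    print(f"cross-validated R^2: {r2:.3f}")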

  3. The Least Costs Hypothesis: A rational analysis approach to the voluntary symbolic control of attention.

    PubMed

    Pauszek, Joseph R; Gibson, Bradley S

    2018-04-30

    Here we propose a rational analysis account of voluntary symbolic attention control-the Least Costs Hypothesis (LCH)-that construes voluntary control as a decision between intentional cue use and unguided search. Consistent with the LCH, the present study showed that this decision is sensitive to variations in cue processing efficiency. In Experiment 1, observers demonstrated a robust preference for using "easy-to-process" arrow cues but not "hard-to-process" spatial word cues to satisfy an easy visual search goal; Experiment 2 showed that this preference persisted even when the temporal costs of cue processing were neutralized. Experiment 3 showed that observers reported this cue type preference outside the context of a speeded task, and Experiment 4 showed empirical measures of this bias to be relatively stable over the course of the task. Together with previous evidence suggesting that observers' decision between intentional cue use and unguided search is also influenced by variations in unguided search efficiency, these findings suggest that voluntary symbolic attention control is mediated by ongoing metacognitive evaluations of demand that are sensitive to perceived variations in the time, effort, and opportunity costs associated with each course of action. Thus, voluntary symbolic attention control is far more complex than previously held. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Considerations for the Systematic Analysis and Use of Single-Case Research

    ERIC Educational Resources Information Center

    Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith

    2012-01-01

    Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…

  5. An Approach to Knowledge-Directed Image Analysis,

    DTIC Science & Technology

    1977-09-01

    AN APPROACH TO KNOWLEDGE-DIRECTED IMAGE ANALYSIS. D.H. Ballard, C.M. Brown, J.A. Feldman, Computer Science Department, The University of Rochester, Rochester, New York 14627. ...semantic network model and a distributed control structure to accomplish the image analysis process. The process of "understanding an image" leads to

  6. National malaria vector control policy: an analysis of the decision to scale-up larviciding in Nigeria.

    PubMed

    Tesfazghi, Kemi; Hill, Jenny; Jones, Caroline; Ranson, Hilary; Worrall, Eve

    2016-02-01

    New vector control tools are needed to combat insecticide resistance and reduce malaria transmission. The World Health Organization (WHO) endorses larviciding as a supplementary vector control intervention using larvicides recommended by the WHO Pesticides Evaluation Scheme (WHOPES). The decision to scale-up larviciding in Nigeria provided an opportunity to investigate the factors influencing policy adoption and assess the role that actors and evidence play in the policymaking process, in order to draw lessons that help accelerate the uptake of new methods for vector control. A retrospective policy analysis was carried out using in-depth interviews with national level policy stakeholders to establish normative national vector control policy or strategy decision-making processes and compare these with the process that led to the decision to scale-up larviciding. The interviews were transcribed, then coded and analyzed using NVivo10. Data were coded according to pre-defined themes from an analytical policy framework developed a priori. Stakeholders reported that the larviciding decision-making process deviated from the normative vector control decision-making process. National malaria policy is normally strongly influenced by WHO recommendations, but the potential of larviciding to contribute to national economic development objectives through larvicide production in Nigeria was cited as a key factor shaping the decision. The larviciding decision involved a restricted range of policy actors, and notably excluded actors that usually play advisory, consultative and evidence generation roles. Powerful actors limited the access of some actors to the policy processes and content. This may have limited the influence of scientific evidence in this policy decision. This study demonstrates that national vector control policy change can be facilitated by linking malaria control objectives to wider socioeconomic considerations and through engaging powerful policy champions to drive policy change and thereby accelerate access to new vector control tools. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  7. Controllability analysis and decentralized control of a wet limestone flue gas desulfurization plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perales, A.L.V.; Ortiz, F.J.G.; Ollero, P.

    2008-12-15

    Presently, decentralized feedback control is the only control strategy used in wet limestone flue gas desulfurization (WLFGD) plants. Proper tuning of this control strategy is becoming an important issue in WLFGD plants because more stringent SO2 regulations have come into force recently. Controllability analysis is a highly valuable tool for the proper design of control systems, but it has not been applied to WLFGD plants so far. In this paper a decentralized control strategy is designed and applied to a WLFGD pilot plant taking into account the conclusions of a controllability analysis. The results reveal that good SO2 control in WLFGD plants can be achieved mainly because the main disturbance of the process is well-aligned with the plant and interactions between control loops are beneficial to SO2 control.
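
    One standard tool in this kind of controllability analysis is the relative gain array (RGA), which guides loop pairing in decentralized control; the sketch below uses an invented 2x2 steady-state gain matrix, not the pilot plant's model.

    # Relative gain array: RGA = G .* (inv(G))^T, elementwise product.
    import numpy as np

    def rga(G):
        """Relative gain array of a square steady-state gain matrix."""
        return G * np.linalg.inv(G).T

    # Illustrative gains: outputs (SO2 removal, slurry pH) vs inputs
    # (limestone feed, recycle flow) -- made-up numbers.
    G = np.array([[2.0, 0.4],
                  [0.5, 1.5]])
    print(rga(G))   # diagonal elements close to 1 suggest the diagonal pairing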

  8. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    PubMed

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.

  9. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
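
    A minimal sketch of the capability calculation behind such a study, assuming the usual Cp/Cpk definitions and the common short-term approximation that the sigma level is about three times Cpk; the tablet weights and specification limits are simulated, not from the case study.

    # Process capability indices and approximate sigma level for a tablet attribute.
    import numpy as np

    rng = np.random.default_rng(7)
    weights = rng.normal(250.0, 2.5, 200)   # tablet weights, mg (simulated)
    LSL, USL = 242.5, 257.5                 # +/- 3% specification, illustrative

    mu, sigma = weights.mean(), weights.std(ddof=1)
    cp  = (USL - LSL) / (6 * sigma)                    # potential capability
    cpk = min(USL - mu, mu - LSL) / (3 * sigma)        # capability, allowing off-center mean
    print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level ~ {3 * cpk:.1f}")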

  10. Application of mass spectrometry to process control for polymer material in autoclave curing

    NASA Technical Reports Server (NTRS)

    Smith, A. C.

    1983-01-01

    Mass spectrometer analysis of gas samples collected during a cure cycle of polymer materials can be used as a process control technique. This technique is particularly helpful in studying the various types of solvents and resin systems used in the preparation of polymer materials and characterizing the chemical composition of different resin systems and their mechanism of polymerization.

  11. Processing Speed, Inhibitory Control, and Working Memory: Three Important Factors to Account for Age-Related Cognitive Decline

    ERIC Educational Resources Information Center

    Pereiro Rozas, Arturo X.; Juncos-Rabadan, Onesimo; Gonzalez, Maria Soledad Rodriguez

    2008-01-01

    Processing speed, inhibitory control and working memory have been identified as the main possible culprits of age-related cognitive decline. This article describes a study of their interrelationships and dependence on age, including exploration of whether any of them mediates between age and the others. We carried out a LISREL analysis of the…

  12. Do Different ADHD-Related Etiological Risks Involve Specific Neuropsychological Pathways? An Analysis of Mediation Processes by Inhibitory Control and Delay Aversion

    ERIC Educational Resources Information Center

    Pauli-Pott, Ursula; Dalir, Silke; Mingebach, Tanja; Roller, Alisa; Becker, Katja

    2013-01-01

    Background: Inhibitory control (IC) has been regarded as a neuropsychological basic deficit and as an endophenotype of attention deficit/hyperactivity disorder (ADHD). Implicated here are mediation processes between etiological factors and ADHD symptoms. We thus analyze whether and to what extent executive IC and delay aversion (DA; i.e.,…

  13. Quality Management and Control of Low Pressure Cast Aluminum Alloy

    NASA Astrophysics Data System (ADS)

    Zhang, Dianxi; Zhang, Yanbo; Yang, Xiufan; Chen, Zhaosong; Jiang, Zelan

    2018-01-01

    This paper briefly reviews the history of low pressure casting, summarizes its major production processes, and introduces the quality management and control of low pressure cast aluminum alloy. The main processes include preparation of raw materials, melting, refining, physical and chemical analysis, K-mode inspection, sand cores, molds, heat treatment, and so on.

  14. A Control System and Streaming DAQ Platform with Image-Based Trigger for X-ray Imaging

    NASA Astrophysics Data System (ADS)

    Stevanovic, Uros; Caselle, Michele; Cecilia, Angelica; Chilingaryan, Suren; Farago, Tomas; Gasilov, Sergey; Herth, Armin; Kopmann, Andreas; Vogelgesang, Matthias; Balzer, Matthias; Baumbach, Tilo; Weber, Marc

    2015-06-01

    High-speed X-ray imaging applications play a crucial role in non-destructive investigations of dynamics in materials science and biology. Online data analysis is necessary for quality assurance and data-driven feedback, leading to more efficient use of beam time and increased data quality. In this article we present a smart camera platform with embedded Field Programmable Gate Array (FPGA) processing that is able to stream and process data continuously in real time. The setup consists of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, an FPGA readout card, and a readout computer. It is seamlessly integrated into a new custom experiment control system called Concert, which provides a more efficient way of operating a beamline by integrating device control, experiment process control, and data analysis. The potential of the embedded processing is demonstrated by implementing an image-based trigger, which records the temporal evolution of physical events with increased speed while maintaining the full field of view. The complete data acquisition system, with Concert and the smart camera platform, was successfully integrated and used for fast X-ray imaging experiments at KIT's synchrotron radiation facility ANKA.
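
    As a rough software analogue of the image-based trigger described above (the real implementation runs on the FPGA), a frame-differencing trigger can be sketched as follows; the thresholds and frames are invented.

    # Fire a trigger when enough pixels change significantly between frames.
    import numpy as np

    def triggered(prev, frame, pixel_thresh=25, count_thresh=500):
        """True when the inter-frame change exceeds illustrative thresholds."""
        diff = np.abs(frame.astype(np.int16) - prev.astype(np.int16))
        return (diff > pixel_thresh).sum() > count_thresh

    rng = np.random.default_rng(3)
    prev = rng.integers(0, 50, (512, 512), dtype=np.uint8)   # quiet background frame
    frame = prev.copy()
    frame[200:260, 200:260] += 100            # a sudden local event
    print(triggered(prev, frame))             # True -> record this frame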

  15. On the analysis of EEG power, frequency and asymmetry in Parkinson's disease during emotion processing.

    PubMed

    Yuvaraj, Rajamanickam; Murugappan, Murugappan; Mohamed Ibrahim, Norlinah; Iqbal, Mohd; Sundaraj, Kenneth; Mohamad, Khairiyah; Palaniappan, Ramaswamy; Mesquita, Edgar; Satiyan, Marimuthu

    2014-04-09

    While Parkinson's disease (PD) has traditionally been described as a movement disorder, there is growing evidence of disruption in emotion information processing associated with the disease. The aim of this study was to investigate whether there are specific electroencephalographic (EEG) characteristics that discriminate PD patients from normal controls during emotion information processing. EEG recordings from 14 scalp sites were collected from 20 PD patients and 30 age-matched normal controls. Multimodal (audio-visual) stimuli were presented to evoke specific targeted emotional states such as happiness, sadness, fear, anger, surprise, and disgust. Absolute and relative power, frequency, and asymmetry measures derived from spectrally analyzed EEGs were subjected to repeated-measures ANOVA for group comparisons, as well as to discriminant function analysis to examine their utility as classification indices. In addition, subjective ratings were obtained for the emotional stimuli used. Behaviorally, PD patients showed no impairments in emotion recognition as measured by subjective ratings. Compared with normal controls, PD patients showed smaller overall relative delta, theta, alpha, and beta power, and, at bilateral anterior regions, smaller absolute theta, alpha, and beta power and higher mean total spectrum frequency across different emotional states. Inter-hemispheric theta, alpha, and beta power asymmetry index differences were noted, with controls exhibiting greater right- than left-hemisphere activation, whereas intra-hemispheric alpha power asymmetry was reduced in patients bilaterally at all regions. Discriminant analysis correctly classified 95.0% of the patients and controls during emotional stimuli. These distributed spectral powers in different frequency bands might provide meaningful information about emotional processing in PD patients.

  16. Adaptation of a Control Center Development Environment for Industrial Process Control

    NASA Technical Reports Server (NTRS)

    Killough, Ronnie L.; Malik, James M.

    1994-01-01

    In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.

  17. Integrated Structural Analysis and Test Program

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2005-01-01

    An integrated structural-analysis and structure-testing computer program is being developed in order to: Automate repetitive processes in testing and analysis; Accelerate pre-test analysis; Accelerate reporting of tests; Facilitate planning of tests; Improve execution of tests; Create a vibration, acoustics, and shock test database; and Integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls, so there is minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After selecting the desired input file, the program goes to a so-called analysis data process or test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.

  18. Application of ISO22000 and Failure Mode and Effect Analysis (fmea) for Industrial Processing of Poultry Products

    NASA Astrophysics Data System (ADS)

    Varzakas, Theodoros H.; Arvanitoyannis, Ioannis S.

    A Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of poultry slaughtering and manufacturing. In this work, a comparison of ISO 22000 analysis with HACCP is carried out for poultry slaughtering, processing, and packaging. Critical control points and prerequisite programs (PrPs) have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram).

  19. Machine Learning: A Crucial Tool for Sensor Design

    PubMed Central

    Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.

    2009-01-01

    Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps: data pre-treatment, feature extraction and dimension reduction, and system modeling, this paper provides a review of the methods that are widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110

  20. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    PubMed

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management. Copyright © 2015. Published by Elsevier B.V.

  1. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on quality control process improvement for Flexible Printed Circuit Boards (FPCB), centred on model 7-Flex, using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because defective units are found only at final inspection, defective products may escape to customers. The problem stems from a quality control process that is not efficient enough to filter out defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are identified with the FMEA method. IPQC is used to detect defective products and reduce the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur a higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal processes, so engineers and operators can solve the problems in a timely manner. The improved quality control was implemented on the 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process equals 100K Baht.
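
    The FMEA ranking that identifies critical processes rests on risk priority numbers (RPN = severity x occurrence x detection); a minimal sketch follows, with invented FPCB failure modes and scores.

    # Rank candidate inspection gates by FMEA risk priority number.
    failure_modes = [
        # (process step, failure mode, severity, occurrence, detection)
        ("lamination", "misalignment", 7, 5, 6),
        ("etching",    "over-etch",    8, 3, 4),
        ("drilling",   "burr",         5, 6, 3),
        ("plating",    "thin copper",  9, 4, 7),
    ]

    ranked = sorted(failure_modes, key=lambda r: r[2] * r[3] * r[4], reverse=True)
    for step, mode, s, o, d in ranked:
        print(f"{step:10s} {mode:14s} RPN={s * o * d:4d}")   # highest RPN first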

  2. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.

  3. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units (a twin-screw high-shear granulator, a fluid bed dryer, and a product control unit) were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions over a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed, and liquid mass flow, and in the powder dosing unit mass flow were used to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T² and Q residual statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
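
    A minimal sketch of the monitoring statistics used above, computing Hotelling's T² and Q (squared prediction error) from a PCA model fitted on normal-operating-condition data; the data and the empirical 99% limits are illustrative, not the ConsiGma™-25 model.

    # Hotelling's T^2 and Q statistics from a PCA model of normal operation.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(5)
    X_noc = rng.standard_normal((300, 35))       # normal-operating-condition data

    pca = PCA(n_components=5).fit(X_noc)

    def t2_and_q(x):
        t = pca.transform(x)                     # scores in the model plane
        t2 = ((t ** 2) / pca.explained_variance_).sum(axis=1)   # Hotelling's T^2
        resid = x - pca.inverse_transform(t)     # part not captured by the model
        q = (resid ** 2).sum(axis=1)             # Q / squared prediction error
        return t2, q

    t2, q = t2_and_q(X_noc)
    t2_lim, q_lim = np.percentile(t2, 99), np.percentile(q, 99)  # empirical limits
    print(f"99% limits: T2={t2_lim:.1f}, Q={q_lim:.1f}")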

  4. Old River Control Complex Sedimentation Investigation

    DTIC Science & Technology

    2015-06-01

    ...efforts to describe the shoaling processes and sediment transport in the two-river system. The geomorphic assessment utilized... District, New Orleans. The investigation was conducted via a combination of field data collection and laboratory analysis, geomorphic assessments, and...

  5. Analysis of the use of industrial control systems in simulators: state of the art and basic guidelines.

    PubMed

    Carrasco, Juan A; Dormido, Sebastián

    2006-04-01

    The use of industrial control systems in simulators facilitates the execution of engineering activities related to the installation and optimization of control systems in real plants. "Industrial control system" is used here as a general term for all the control systems that can be installed in an industrial plant, ranging from complex distributed control systems and SCADA packages to small single control devices. This paper summarizes the current alternatives for the development of simulators of industrial plants and presents an analysis of the process of integrating an industrial control system into a simulator, with the aim of helping in the installation of real control systems in simulators.

  6. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry.

    PubMed

    Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2012-11-08

    This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a traditional herbal medicine used worldwide. In this method, samples prepared with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment of Radix Aconiti Preparata is based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. To ensure safety and effectiveness in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In this paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples at different processing times. The results showed that well-processed Radix Aconiti Preparata, unqualified processed samples, and raw Radix Aconiti could be clustered reasonably according to their constituents. The loading plot shows that the chemical markers with the most influence on the discrimination between qualified and unqualified samples were mainly monoester and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine, and 10-OH-mesaconitine. The established DART MS approach, in combination with multivariate data analysis, provides a very flexible and reliable method for the quality assessment of toxic herbal medicines. Copyright © 2012 Elsevier B.V. All rights reserved.
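
    The chemometric workflow described above (hierarchical clustering plus PCA on spectral fingerprints) can be sketched as follows; synthetic spectra stand in for the DART MS data.

    # Hierarchical clustering and PCA of spectral fingerprints.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    raw = rng.standard_normal((10, 500)) + 5.0     # 10 "raw sample" spectra
    processed = rng.standard_normal((10, 500))     # 10 "well processed" spectra
    X = np.vstack([raw, processed])

    labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
    scores = PCA(n_components=2).fit_transform(X)  # coordinates for a score plot
    print(labels)        # samples should split cleanly into two clusters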

  7. Numerical Simulation Analysis of High-precision Dispensing Needles for Solid-liquid Two-phase Grinding

    NASA Astrophysics Data System (ADS)

    Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming

    2018-03-01

    In order to investigate the effect of abrasive flow polishing on variable-diameter pipe parts, a numerical simulation of the polishing of high-precision dispensing needles was carried out, with the needles as the research object. The distributions of dynamic pressure and turbulent viscosity of the abrasive flow field in the high-precision dispensing needle were analyzed under different volume-fraction conditions. The comparative analysis demonstrated the effectiveness of abrasive-grain polishing for high-precision dispensing needles: controlling the volume fraction of silicon carbide changes the viscosity characteristics of the abrasive flow during polishing, so that the polishing quality of the abrasive grains can be controlled.

  8. Red and processed meat consumption and gastric cancer risk: a systematic review and meta-analysis

    PubMed Central

    Zhao, Zhanwei; Yin, Zifang; Zhao, Qingchuan

    2017-01-01

    The associations between red and processed meat consumption and gastric cancer risk have remained inconclusive. We performed a systematic review and meta-analysis to analyze these associations. We searched PubMed and EMBASE to identify studies published from inception through October 2016. Subtype analyses of gastric cancer (gastric cardia adenocarcinoma and gastric non-cardiac adenocarcinoma) and dose-response analyses were performed. We finally selected 42 eligible studies. The summary relative risks of highest versus lowest consumption were positive for case-control studies with 1.67 (1.36-2.05) for red meat and 1.76 (1.51-2.05) for processed meat, but negative for cohort studies with 1.14 (0.97-1.34) for red meat and 1.23 (0.98-1.55) for processed meat. Subtype analyses of cohort studies suggested null results for gastric cardia adenocarcinoma (red meat, P = 0.79; processed meat, P = 0.89) and gastric non-cardiac adenocarcinoma (red meat, P = 0.12; processed meat, P = 0.12). In conclusion, the present analysis suggested null results between red and processed meat consumption and gastric cancer risk in cohort studies, although case-control studies yielded positive associations. Further well-designed prospective studies are needed to validate these findings. PMID:28430644

  9. A Value Based Justification Process for Aerospace RDT and E Capability Investments

    DTIC Science & Technology

    2017-12-01

    ...end of MS C by funding the five-year, $350 million T&E infrastructure investment proposed in the DoD plan. This expansion of the cost-benefit "control volume" to include projected system development savings, as described in Ref. [3], proved successful in justifying... [Fig. 1: The Expanded Cost-Benefit Analysis Control Volume.] It is also worth noting that this process has the greatest potential for success when the

  10. The development of control and monitoring system on marine current renewable energy Case study: strait of Toyapakeh - Nusa Penida, Bali

    NASA Astrophysics Data System (ADS)

    Arief, I. S.; Suherman, I. H.; Wardani, A. Y.; Baidowi, A.

    2017-05-01

    A control and monitoring system is a continuous process for securing the assets of a marine current renewable energy installation. Control and monitoring are defined for each critical component, identified using the Failure Mode and Effect Analysis (FMEA) method. As a result, the process in this paper was developed through a sensor matrix: the matrix correlates the critical components with the monitoring system, which is supported by sensors to enable decision-making.

  11. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    PubMed

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limits, that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process. Step 2. Monitor and measure the variation over the period of time. Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan. Step 4. Develop an internal threshold and compare the process with it. Step 5.1. Compare the process with an internal benchmark. Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need for a more predictable process prompted the control of variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data compared to the historical average; in addition, the variation was reduced. The model is subject to limitations: for example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.
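
    As an illustration of Step 3, the sketch below screens a proportion-type PI for special-cause variation with p-chart limits before declaring it ready for benchmarking; the monthly counts are simulated, not the King Fahd Hospital data.

    # p-chart screening of a proportion-type performance indicator.
    import numpy as np

    rng = np.random.default_rng(4)
    n = np.full(24, 400)                        # monthly denominators (e.g. patient-days)
    events = rng.binomial(n, 0.02)              # monthly infection counts (simulated)
    p = events / n

    p_bar = events.sum() / n.sum()              # overall proportion (center line)
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)    # per-month standard error
    ucl = p_bar + 3 * sigma
    lcl = np.maximum(p_bar - 3 * sigma, 0)

    stable = np.all((p <= ucl) & (p >= lcl))    # no point beyond the limits
    print("ready to benchmark" if stable else "investigate special causes first")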

  12. ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores

    ERIC Educational Resources Information Center

    Allalouf, Avi

    2014-01-01

    The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…

  13. Analysis and quality control of carbohydrates in therapeutic proteins with fluorescence HPLC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Kun; Huang, Jian; Center for Informational Biology, University of Electronic Science and Technology of China, Chengdu 610054

    Conbercept is an Fc fusion protein with very complicated carbohydrate profiles, which must be carefully monitored throughout the manufacturing process. Here, we introduce an optimized fluorescence-derivatization high-performance liquid chromatographic method for glycan mapping in conbercept. Compared with the conventional glycan analysis method, this method has much better resolution and higher reproducibility, making it excellent for product quality control.

  14. Characterization of functional brain activity and connectivity using EEG and fMRI in patients with sickle cell disease.

    PubMed

    Case, Michelle; Zhang, Huishi; Mundahl, John; Datta, Yvonne; Nelson, Stephen; Gupta, Kalpna; He, Bin

    2017-01-01

    Sickle cell disease (SCD) is a red blood cell disorder that causes many complications, including life-long pain. Treatment of pain remains challenging due to a poor understanding of the mechanisms and to limitations in characterizing and quantifying pain. In the present study, we examined simultaneously recorded functional MRI (fMRI) and electroencephalography (EEG) to better understand neural connectivity as a consequence of chronic pain in SCD patients. We performed independent component analysis (ICA) and seed-based connectivity analysis on the fMRI data. Spontaneous power and microstate analyses were performed on the EEG-fMRI data. The ICA showed that patients lacked activity in the default mode network (DMN) and the executive control network compared to controls. The EEG-fMRI data revealed that the insula cortex's role in salience increases with age in patients. EEG microstate analysis showed that patients had increased activity in pain processing regions. The cerebellum in patients showed a stronger connection to the periaqueductal gray matter (involved in pain inhibition) and negative connections to pain processing areas. These results suggest that patients have reduced activity of the DMN and increased activity in pain processing regions during rest. The present findings suggest that resting-state connectivity differences between patients and controls can be used as novel biomarkers of SCD pain.

  15. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine, or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into variable-specific meanings and directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.

  16. A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones.

    PubMed

    Kwok, Veronica P Y; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T; Li, Ping; Tan, Li-Hai

    2017-01-01

    The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to the hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions, and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be caused by the limited number of studies available for comparison. Although the current study lacks evidence to argue for a lexical tone-specific activation pattern, our results provide clues and directions for future investigations on this topic, and more sophisticated methods are needed to explore this question in more depth.

  17. A Meta-Analytic Study of the Neural Systems for Auditory Processing of Lexical Tones

    PubMed Central

    Kwok, Veronica P. Y.; Dan, Guo; Yakpo, Kofi; Matthews, Stephen; Fox, Peter T.; Li, Ping; Tan, Li-Hai

    2017-01-01

    The neural systems of lexical tone processing have been studied for many years. However, previous findings have been mixed with regard to hemispheric specialization for the perception of linguistic pitch patterns in native speakers of tonal languages. In this study, we performed two activation likelihood estimation (ALE) meta-analyses, one on neuroimaging studies of auditory processing of lexical tones in tonal languages (17 studies), and the other on auditory processing of lexical information in non-tonal languages as a control analysis for comparison (15 studies). The lexical tone ALE analysis showed significant brain activations in bilateral inferior prefrontal regions, bilateral superior temporal regions and the right caudate, while the control ALE analysis showed significant cortical activity in the left inferior frontal gyrus and left temporo-parietal regions. However, we failed to obtain significant differences from the contrast analysis between the two auditory conditions, which might be caused by the limited number of studies available for comparison. Although the current study lacks evidence to argue for a lexical-tone-specific activation pattern, our results provide clues and directions for future investigations on this topic; more sophisticated methods will be needed to explore this question in greater depth. PMID:28798670

  18. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques into hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
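
    As a toy illustration of the discovery step that process mining performs on such event logs, the sketch below builds a directly-follows graph from a small log; the (case id, activity, timestamp) format and the activity names are hypothetical, not taken from the HIS described above.

        from collections import Counter

        # Hypothetical event log: (case_id, activity, timestamp)
        log = [
            (1, "admission", 1), (1, "triage", 2), (1, "treatment", 3), (1, "discharge", 4),
            (2, "admission", 1), (2, "treatment", 2), (2, "discharge", 3),
            (3, "admission", 1), (3, "triage", 2), (3, "discharge", 3),
        ]

        # Group events into per-case traces, preserving temporal order
        traces = {}
        for case, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
            traces.setdefault(case, []).append(activity)

        # Count directly-follows relations across all cases
        dfg = Counter()
        for acts in traces.values():
            for a, b in zip(acts, acts[1:]):
                dfg[(a, b)] += 1

        for (a, b), n in dfg.most_common():
            print(f"{a} -> {b}: {n}")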

  19. Analysis and control of the METC fluid bed gasifier. Quarterly progress report, January--March 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-01

    This document summarizes work performed for the period 10/1/94 to 3/31/95. In this work, three components will form the basis for the design of a control scheme for the Fluidized Bed Gasifier (FBG) at METC: (1) a control systems analysis based on simple linear models derived from process data, (2) a review of the literature on fluid bed gasifier operation and control, and (3) an understanding of present FBG operation and real-world considerations. Below we summarize the work accomplished to date in each of these areas.

  20. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks

    NASA Astrophysics Data System (ADS)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of the associated neural imprints to be time-locked relative to stimulus onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision whose measured brain responses are associated with an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analysis of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as to obtain an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes.
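
    A schematic of the asynchronous detection idea, training a classifier on time-locked windows and then sliding it over unsegmented data to flag a gradual event and estimate its onset, might look as follows; the synthetic signals, features and classifier are placeholders, not the authors' pipeline.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        fs, win = 64, 64                                  # sampling rate (Hz), 1 s window

        # Hypothetical training data: windows time-locked to sudden events vs. rest
        X_event = rng.normal(0.5, 1.0, (100, win))        # mean shift stands in for an ERP
        X_rest = rng.normal(0.0, 1.0, (100, win))
        X = np.vstack([X_event, X_rest])
        y = np.r_[np.ones(100), np.zeros(100)]
        clf = LogisticRegression(max_iter=1000).fit(X, y)

        # Continuous recording with a gradual event starting at t = 10 s
        T, onset_true = 20 * fs, 10 * fs
        eeg = rng.normal(0.0, 1.0, T)
        ramp = np.clip((np.arange(T) - onset_true) / (2 * fs), 0, 1)
        eeg += 0.5 * ramp                                 # slowly unfolding mismatch signature

        # Asynchronous detection: slide the classifier; first confident window wins
        for start in range(0, T - win, 8):
            p = clf.predict_proba(eeg[start:start + win][None, :])[0, 1]
            if p > 0.9:
                print(f"estimated onset ~ {start / fs:.2f} s (true: {onset_true / fs:.2f} s)")
                break
        else:
            print("no confident detection")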

  1. Red and Processed Meat Consumption Increases Risk for Non-Hodgkin Lymphoma

    PubMed Central

    Yang, Li; Dong, Jianming; Jiang, Shenghua; Shi, Wenyu; Xu, Xiaohong; Huang, Hongming; You, Xuefen; Liu, Hong

    2015-01-01

    The association between consumption of red and processed meat and non-Hodgkin lymphoma (NHL) remains unclear. We performed a meta-analysis of published observational studies to explore this relationship. We searched the MEDLINE and EMBASE databases to identify observational studies evaluating the association between consumption of red and processed meat and risk of NHL. The quality of included studies was evaluated using the Newcastle-Ottawa Quality Assessment Scale (NOS). Random-effects models were used to calculate the summary relative risk (SRR) and the corresponding 95% confidence interval (CI). We identified a total of 16 case–control and 4 prospective cohort studies, including 15,189 subjects with NHL. The SRRs of NHL comparing the highest with the lowest categories were 1.32 (95% CI: 1.12–1.55) for red meat and 1.17 (95% CI: 1.07–1.29) for processed meat intake. Stratified analysis indicated that a statistically significant association between consumption of red and processed meat and NHL risk was observed in case–control studies, but not in cohort studies. The SRR was 1.11 (95% CI: 1.04–1.18) per 100 g/day increment in red meat intake and 1.28 (95% CI: 1.08–1.53) per 50 g/day increment in processed meat intake. There was evidence of a nonlinear association for intake of processed meat, but not for intake of red meat. Findings from our meta-analysis indicate that consumption of red and processed meat may be related to NHL risk. More prospective epidemiological studies that control for important confounders and focus on NHL risk at different levels of meat consumption are required to clarify this association. PMID:26559248
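
    For readers unfamiliar with how such summary relative risks are pooled, here is a minimal sketch of a DerSimonian-Laird random-effects meta-analysis on log relative risks; the per-study estimates below are invented for illustration and are not the data from this meta-analysis.

        import numpy as np

        # Hypothetical per-study relative risks and 95% CIs (not the real data)
        rr = np.array([1.10, 1.45, 0.95, 1.30, 1.20])
        lo = np.array([0.85, 1.05, 0.70, 1.00, 0.95])
        hi = np.array([1.42, 2.00, 1.29, 1.69, 1.52])

        y = np.log(rr)                                # effect sizes on the log scale
        se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
        w = 1 / se**2                                 # fixed-effect weights

        # DerSimonian-Laird between-study variance tau^2
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)

        # Random-effects pooled estimate and 95% CI, back-transformed to RR scale
        w_re = 1 / (se**2 + tau2)
        y_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1 / np.sum(w_re))
        print(f"SRR = {np.exp(y_re):.2f} "
              f"(95% CI: {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f})")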

  2. 40 CFR 412.2 - General definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... also cites the approved methods of analysis. (d) Process wastewater means water directly or indirectly... facilities; direct contact swimming, washing, or spray cooling of animals; or dust control. Process... process wastewater from the production area is or may be applied. (f) New source is defined at 40 CFR 122...

  3. 40 CFR 412.2 - General definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... also cites the approved methods of analysis. (d) Process wastewater means water directly or indirectly... facilities; direct contact swimming, washing, or spray cooling of animals; or dust control. Process... process wastewater from the production area is or may be applied. (f) New source is defined at 40 CFR 122...

  4. 40 CFR 412.2 - General definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... also cites the approved methods of analysis. (d) Process wastewater means water directly or indirectly... facilities; direct contact swimming, washing, or spray cooling of animals; or dust control. Process... process wastewater from the production area is or may be applied. (f) New source is defined at 40 CFR 122...

  5. An application of computer aided requirements analysis to a real time deep space system

    NASA Technical Reports Server (NTRS)

    Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.

    1981-01-01

    The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.

  6. Beta/Gamma Oscillations and Event-Related Potentials Indicate Aberrant Multisensory Processing in Schizophrenia

    PubMed Central

    Balz, Johanna; Roa Romero, Yadira; Keil, Julian; Krebber, Martin; Niedeggen, Michael; Gallinat, Jürgen; Senkowski, Daniel

    2016-01-01

    Recent behavioral and neuroimaging studies have suggested multisensory processing deficits in patients with schizophrenia (SCZ). Thus far, the neural mechanisms underlying these deficits are not well understood. Previous studies with unisensory stimulation have shown altered neural oscillations in SCZ. As such, altered oscillations could contribute to aberrant multisensory processing in this patient group. To test this assumption, we conducted an electroencephalography (EEG) study in 15 SCZ and 15 control participants in whom we examined neural oscillations and event-related potentials (ERPs) in the sound-induced flash illusion (SIFI). In the SIFI, multiple auditory stimuli presented alongside a single visual stimulus can induce the illusory percept of multiple visual stimuli. In SCZ and control participants we compared ERPs and neural oscillations between trials that induced an illusion and trials that did not. On the behavioral level, SCZ (55.7%) and control participants (55.4%) did not significantly differ in illusion rates. The analysis of ERPs revealed diminished amplitudes and altered multisensory processing in SCZ compared to controls around 135 ms after stimulus onset. Moreover, the analysis of neural oscillations revealed altered 25–35 Hz power between 100 and 150 ms over the occipital scalp in SCZ compared to controls. Our findings extend previous observations of aberrant neural oscillations in unisensory perception paradigms. They suggest that altered ERPs and altered occipital beta/gamma band power reflect aberrant multisensory processing in SCZ. PMID:27999553

  7. State of the art in on-line techniques coupled to flow injection analysis FIA/on-line- a critical review

    PubMed Central

    Puchades, R.; Maquieira, A.; Atienza, J.; Herrero, M. A.

    1990-01-01

    Flow injection analysis (FIA) has emerged as an increasingly used laboratory tool in chemical analysis. Employment of the technique for on-line sample treatment and on-line measurement in chemical process control is a growing trend. This article reviews recent applications of FIA. Most papers refer to on-line sample treatment. Although FIA is very well suited to continuous on-line process monitoring, few examples have been found in this area; most of them have been applied to water treatment or fermentation processes. PMID:18925271

  8. Social control and coercion in addiction treatment: towards evidence-based policy and practice.

    PubMed

    Wild, T Cameron

    2006-01-01

    Social pressures are often an integral part of the process of seeking addiction treatment. However, scientists have not developed conclusive evidence on the processes, benefits and limitations of using legal, formal and informal social control tactics to inform policy makers, service providers and the public. This paper characterizes barriers to a robust interdisciplinary analysis of social control and coercion in addiction treatment and provides directions for future research. Conceptual analysis and review of key studies and trends in the area are used to describe eight implicit assumptions underlying policy, practice and scholarship on this topic. Many policies, programmes and researchers are guided by a simplistic behaviourist and health-service perspective on social controls that (a) overemphasizes the use of criminal justice systems to compel individuals into treatment and (b) fails to take into account provider, patient and public views. Policies and programmes that expand addiction treatment options deserve support. However, drawing a firm distinction between social controls (objective use of social pressure) and coercion (client perceptions and decision-making processes) supports a parallel position that rejects treatment policies, programmes, and associated practices that create client perceptions of coercion.

  9. Development of automation software for neutron activation analysis process in Malaysian nuclear agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.

    2017-01-01

    The Neutron Activation Analysis (NAA) process has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient. Hence, software to support system automation was developed to provide an effective method to replace redundant manual data entries and produce a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and connections between the sub-programs are explained. The software was developed using the National Instruments LabVIEW development package.

  10. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
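
    A toy version of the knowledge-based alarm filtering described here might suppress consequence alarms whenever a known root-cause alarm is already active; the causal rules and alarm names below are invented for illustration.

        # Hypothetical causal rules: alarm -> alarms it is known to cause downstream
        CAUSES = {
            "coolant_pump_trip": {"coolant_flow_low", "core_temp_high"},
            "coolant_flow_low": {"core_temp_high"},
        }

        def filter_alarms(active):
            """Return only root-cause alarms: drop any alarm that is explained
            by another currently active alarm upstream of it."""
            explained = set()
            for alarm in active:
                explained |= CAUSES.get(alarm, set())
            return [a for a in active if a not in explained]

        burst = ["core_temp_high", "coolant_flow_low", "coolant_pump_trip"]
        print(filter_alarms(burst))   # -> ['coolant_pump_trip']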

  11. Active chatter suppression with displacement-only measurement in turning process

    NASA Astrophysics Data System (ADS)

    Ma, Haifeng; Wu, Jianhua; Yang, Liuqing; Xiong, Zhenhua

    2017-08-01

    Regenerative chatter is a major hindrance to achieving high quality and high production rates in machining processes. Various active controllers have been proposed to mitigate chatter. However, most existing controllers were developed on the basis of multi-state feedback of the system, and state observers were usually needed. Moreover, model parameters of the machining process (mass, damping and stiffness) were required by existing active controllers. In this study, an active sliding mode controller, which employs a dynamic output feedback sliding surface for the unmatched condition and an adaptive law for disturbance estimation, is designed, analyzed, and validated for chatter suppression in the turning process. Only displacement measurement is required by this approach; other sensors and state observers are not needed. Moreover, it facilitates rapid implementation, since the designed controller is established without using model parameters of the turning process. Theoretical analysis, numerical simulations and experiments on a computer numerical control (CNC) lathe are presented. The results show that chatter can be substantially attenuated and the chatter-free region significantly expanded with the presented method.
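
    The flavor of sliding mode control with displacement-only measurement can be sketched on a single-degree-of-freedom oscillator, with velocity reconstructed by differencing the displacement signal; this is a textbook-style sketch under assumed parameters, not the output-feedback design of the paper.

        import numpy as np

        # Assumed 1-DOF tool dynamics: m*x'' + c*x' + k*x = u + d
        m, c, k = 1.0, 2.0, 1.0e4
        dt, T = 1e-4, 2.0
        lam, K, eps = 50.0, 200.0, 0.05      # surface slope, switching gain, boundary layer
        x, v, x_prev = 1e-3, 0.0, 1e-3       # initial displacement (m) and velocity
        log = []

        for i in range(int(T / dt)):
            d = 5.0 * np.sin(2 * np.pi * 60 * i * dt)   # stand-in for the chatter force
            v_hat = (x - x_prev) / dt                   # velocity from displacement only
            s = lam * x + v_hat                         # sliding surface
            u = -K * np.tanh(s / eps)                   # smoothed switching control
            a = (u + d - c * v - k * x) / m
            x_prev = x
            v += a * dt                                 # semi-implicit Euler step
            x += v * dt
            log.append(x)

        print(f"peak |x| in last 0.5 s: {max(abs(z) for z in log[-int(0.5/dt):]):.2e} m")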

  12. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimized production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. When this technology is applied to the gamma radiation process, control will be based on monitoring key parameters such as time, and on making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  13. Exploring governance for a One Health collaboration for leptospirosis prevention and control in Fiji: Stakeholder perceptions, evidence, and processes.

    PubMed

    McPherson, Anna; Hill, Peter S; Kama, Mike; Reid, Simon

    2018-03-30

    Fiji has a high burden of leptospirosis, with endemic infection and epidemic outbreaks with high mortality, often associated with flooding and cyclones. As a zoonosis, leptospirosis control requires interventions in sectors beyond the usual control of health: in Fiji, the dairy and sugar industries, and water, sanitation, and rodent control in communities. This paper presents the findings of qualitative research to inform policy around governance for a One Health multisectoral approach to leptospirosis control. Key informants from relevant government agencies and industry organizations were interviewed in late 2014, and the interviews were analyzed and triangulated with documentary analysis. The analysis identified 5 themes: perceptions of the impact of leptospirosis, governance processes, models for collaboration, leptospirosis control, and preferred leadership for leptospirosis management. Data were limited, with poor communication between ministries and limited awareness of leptospirosis outside outbreaks. Collaboration during outbreaks was positive but not sustained in endemic periods. A mechanism for enhanced collaboration was developed for endemic and outbreak situations. The findings informed a One Health governance approach to leptospirosis, framed within a National Strategic Plan, with a specific National Action Plan for Leptospirosis. The process provides a research-based One Health template for application to other zoonotic diseases. Copyright © 2018 John Wiley & Sons, Ltd.

  14. Fulfillment of GMP standard, halal standard, and applying HACCP for production process of beef floss (Case study: Ksatria enterprise)

    NASA Astrophysics Data System (ADS)

    A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan

    2018-02-01

    The increasing number of modern retail businesses in Indonesia gives small and medium enterprises (SMEs) an opportunity to sell their products through modern retailers. SMEs face several obstacles, however, one of which concerns product standards: the standards SMEs must hold are the GMP standard and the halal standard. This research was conducted to assess how well the beef floss enterprise in Jagalan fulfills the GMP and halal standards. In addition, the Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the production process. The HACCP approach used in this research was based on the seven principles in SNI (Indonesian National Standard) 01-4852-1998. The seven principles are hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation establishment, all of which must be applied in preparing a HACCP plan. Based on this case study, it is concluded that there were 5 CCPs: the boiling process, the roasting process, the frying process, the beef floss draining process, and the packaging process.
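
    To make the CCP machinery concrete, here is a minimal sketch of a CCP record with critical limits, a monitoring check, and a corrective-action trigger; the steps and limits are invented, not taken from the case study.

        from dataclasses import dataclass

        @dataclass
        class CCP:
            step: str
            parameter: str
            low: float           # critical limits
            high: float

            def check(self, measured):
                """Monitoring: compare a measurement against the critical limits
                and flag a corrective action when they are exceeded."""
                if self.low <= measured <= self.high:
                    return f"{self.step}: {measured} {self.parameter} OK"
                return f"{self.step}: {measured} {self.parameter} OUT OF LIMITS -> corrective action"

        # Hypothetical critical limits for two of the five CCPs
        ccps = [CCP("frying", "deg C", 160.0, 180.0),
                CCP("boiling", "deg C", 95.0, 100.0)]
        for ccp, reading in zip(ccps, [185.0, 98.0]):
            print(ccp.check(reading))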

  15. Intelligent Control of Micro Grid: A Big Data-Based Control Center

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng

    2018-01-01

    In this paper, the structure of a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage and load are analyzed by the control center; from the results, new trends are predicted and applied as feedback to optimize the control. Therefore, each step in the micro grid can be adjusted and organized in a form of comprehensive management. A framework of real-time data collection, data processing and data analysis is proposed by employing big data technology. Consequently, integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.

  16. LMFAO! Humor as a Response to Fear: Decomposing Fear Control within the Extended Parallel Process Model

    PubMed Central

    Abril, Eulàlia P.; Szczypka, Glen; Emery, Sherry L.

    2017-01-01

    This study seeks to analyze fear control responses to the 2012 Tips from Former Smokers campaign using the Extended Parallel Process Model (EPPM). The goal is to examine the occurrence of ancillary fear control responses, like humor. In order to explore individuals’ responses in an organic setting, we use Twitter data—tweets—collected via the Firehose. Content analysis of relevant fear control tweets (N = 14,281) validated the existence of boomerang responses within the EPPM: denial, defensive avoidance, and reactance. More importantly, results showed that humor tweets were not only a significant occurrence but constituted the majority of fear control responses. PMID:29527092

  17. Automation of experimental research of waveguide paths induction soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Petrenko, V. E.; Kukartsev, V. V.; Tynchenko, V. V.; Antamoshkin, O. A.

    2018-05-01

    The article presents an automated system for experimental studies of the waveguide path induction soldering process. The system is part of additional software for a complex of automated control of the technological process of induction soldering of thin-walled aluminum-alloy waveguide paths, expanding its capabilities. The structure of the software product, the general appearance of the controls, and the potential applications are presented. The utility of the developed application was assessed and confirmed through a series of field experiments. The experimental research system makes it possible to improve the process under consideration, providing the possibility of fine-tuning the control regulators, as well as keeping statistics of the soldering process in a form convenient for analysis.

  18. My Body Looks Like That Girl’s: Body Mass Index Modulates Brain Activity during Body Image Self-Reflection among Young Women

    PubMed Central

    Wen, Xin; She, Ying; Vinke, Petra Corianne; Chen, Hong

    2016-01-01

    Body image distress or body dissatisfaction is one of the most common consequences of obesity and overweight. We investigated the neural bases of body image processing in overweight and average weight young women to understand whether brain regions that were previously found to be involved in processing self-reflective, perspective and affective components of body image would show different activation between two groups. Thirteen overweight (O-W group, age = 20.31±1.70 years) and thirteen average weight (A-W group, age = 20.15±1.62 years) young women underwent functional magnetic resonance imaging while performing a body image self-reflection task. Among both groups, whole-brain analysis revealed activations of a brain network related to perceptive and affective components of body image processing. ROI analysis showed a main effect of group in ACC as well as a group by condition interaction within bilateral EBA, bilateral FBA, right IPL, bilateral DLPFC, left amygdala and left MPFC. For the A-W group, simple effect analysis revealed stronger activations in Thin-Control compared to Fat-Control condition within regions related to perceptive (including bilateral EBA, bilateral FBA, right IPL) and affective components of body image processing (including bilateral DLPFC, left amygdala), as well as self-reference (left MPFC). The O-W group only showed stronger activations in Fat-Control than in Thin-Control condition within regions related to the perceptive component of body image processing (including left EBA and left FBA). Path analysis showed that in the Fat-Thin contrast, body dissatisfaction completely mediated the group difference in brain response in left amygdala across the whole sample. Our data are the first to demonstrate differences in brain response to body pictures between average weight and overweight young females involved in a body image self-reflection task. These results provide insights for understanding the vulnerability to body image distress among overweight or obese young females. PMID:27764116

  19. My Body Looks Like That Girl's: Body Mass Index Modulates Brain Activity during Body Image Self-Reflection among Young Women.

    PubMed

    Gao, Xiao; Deng, Xiao; Wen, Xin; She, Ying; Vinke, Petra Corianne; Chen, Hong

    2016-01-01

    Body image distress or body dissatisfaction is one of the most common consequences of obesity and overweight. We investigated the neural bases of body image processing in overweight and average weight young women to understand whether brain regions that were previously found to be involved in processing self-reflective, perspective and affective components of body image would show different activation between two groups. Thirteen overweight (O-W group, age = 20.31±1.70 years) and thirteen average weight (A-W group, age = 20.15±1.62 years) young women underwent functional magnetic resonance imaging while performing a body image self-reflection task. Among both groups, whole-brain analysis revealed activations of a brain network related to perceptive and affective components of body image processing. ROI analysis showed a main effect of group in ACC as well as a group by condition interaction within bilateral EBA, bilateral FBA, right IPL, bilateral DLPFC, left amygdala and left MPFC. For the A-W group, simple effect analysis revealed stronger activations in Thin-Control compared to Fat-Control condition within regions related to perceptive (including bilateral EBA, bilateral FBA, right IPL) and affective components of body image processing (including bilateral DLPFC, left amygdala), as well as self-reference (left MPFC). The O-W group only showed stronger activations in Fat-Control than in Thin-Control condition within regions related to the perceptive component of body image processing (including left EBA and left FBA). Path analysis showed that in the Fat-Thin contrast, body dissatisfaction completely mediated the group difference in brain response in left amygdala across the whole sample. Our data are the first to demonstrate differences in brain response to body pictures between average weight and overweight young females involved in a body image self-reflection task. These results provide insights for understanding the vulnerability to body image distress among overweight or obese young females.

  20. Guidelines for Automatic Data Processing Physical Security and Risk Management. Federal Information Processing Standards Publication 31.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC.

    These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…

  1. Sentence Context Affects the Brain Response to Masked Words

    ERIC Educational Resources Information Center

    Coulson, Seana; Brang, David

    2010-01-01

    Historically, language researchers have assumed that lexical, or word-level processing is fast and automatic, while slower, more controlled post-lexical processes are sensitive to contextual information from higher levels of linguistic analysis. Here we demonstrate the impact of sentence context on the processing of words not available for…

  2. Computer Instructional Aids for Undergraduate Control Education.

    ERIC Educational Resources Information Center

    Volz, Richard A.; And Others

    Engineering is coming to rely more and more heavily upon the computer for computations, analyses, and graphic displays which aid the design process. A general purpose simulation system, the Time-shared Automatic Control Laboratory (TACL), and a set of computer-aided design programs, Control Oriented Interactive Graphic Analysis and Design…

  3. Baseline Description and Analysis of the Operations Related to Warehouse Controlled Documents at the Navy Publications and Forms Center, Philadelphia, Pennsylvania. Volume I. Phase I.

    DTIC Science & Technology

    1980-03-06

    performing the present NPFC tasks. Potential automation technologies may include order processing mechanization, demand printing from micrographic or...effort and documented in this volume included the following: a. Functional description of the order processing activities as they currently operate. b...covered under each analysis area. It is obvious from the exhibit that the functional description of order processing operations was to include COG I

  4. Analysis and control on changeable wheel tool system of hybrid grinding and polishing machine tool for blade finishing

    NASA Astrophysics Data System (ADS)

    He, Qiuwei; Lv, Xingming; Wang, Xin; Qu, Xingtian; Zhao, Ji

    2017-01-01

    Blades are key components in energy and power equipment such as turbines and aircraft engines, and research on processes and equipment for blade finishing is an important and difficult topic. To precisely control the tool system of a hybrid grinding and polishing machine tool developed for blade finishing, this paper analyzes the changeable-wheel tool system for belt polishing. Firstly, the belt length and the wrap angle of each wheel are analyzed for different positions of the tension wheel swing angle during the wheel-changing process. The reasonable belt length is calculated using MATLAB, and the relationships between the wrap angle of each wheel and the cylinder expansion amount of the contact wheel are obtained. Then, the control system for the changeable-wheel tool structure is developed. Lastly, the surface roughness of finished blades is verified by experiments. Theoretical analysis and experimental results show that a reasonable belt length and wheel wrap angles can be obtained by the proposed analysis method, that the changeable-wheel tool system can be controlled precisely, and that the surface roughness of blades after grinding meets the design requirements.
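
    The belt-geometry calculation mentioned above can be illustrated with the standard two-pulley open-belt formulas; the multi-wheel layout of the actual tool system is more involved, so this is only the basic building block, with assumed diameters and center distance.

        import math

        def open_belt(D, d, C):
            """Belt length and wrap angles for two pulleys of diameter D >= d whose
            centers are a distance C apart (open-belt configuration)."""
            alpha = math.asin((D - d) / (2 * C))
            length = (2 * C * math.cos(alpha)
                      + (math.pi / 2 + alpha) * D
                      + (math.pi / 2 - alpha) * d)
            return length, math.degrees(math.pi + 2 * alpha), math.degrees(math.pi - 2 * alpha)

        # Hypothetical contact wheel (200 mm), tension wheel (100 mm), 500 mm apart
        L, wrap_large, wrap_small = open_belt(0.200, 0.100, 0.500)
        print(f"belt length {L*1000:.1f} mm, wrap angles {wrap_large:.1f} / {wrap_small:.1f} deg")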

  5. Study of the modifications needed for efficient operation of NASTRAN on the Control Data Corporation STAR-100 computer

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The NASA structural analysis (NASTRAN) computer program is operational on three series of third-generation computers. The problems and difficulties involved in adapting NASTRAN to a fourth-generation computer, namely the Control Data STAR-100, are discussed. The salient features which distinguish the Control Data STAR-100 from third-generation computers are hardware vector processing capability and virtual memory. A feasible method is presented for transferring NASTRAN to the Control Data STAR-100 system while retaining much of the machine-independent code. Basic matrix operations are identified for optimization through vector processing.

  6. Levels-of-processing effects on a task of olfactory naming.

    PubMed

    Royet, Jean-Pierre; Koenig, Olivier; Paugam-Moisy, Helene; Puzenat, Didier; Chasse, Jean-Luc

    2004-02-01

    The effects of odor processing were investigated at various analytical levels, from simple sensory analysis to deep or semantic analysis, on a subsequent task of odor naming. Students (106 women, 23.6 +/- 5.5 yr. old; 65 men, 25.1 +/- 7.1 yr. old) were tested. The experimental procedure included two successive sessions: a first session to characterize a set of 30 odors with criteria that used various depths of processing, and a second session to name the odors as quickly as possible. Four processing conditions rated the odors using descriptors before naming the odor; the control condition did not rate the odors before naming. The processing conditions were based on lower-level olfactory judgments (superficial processing), higher-level olfactory-gustatory-somesthetic judgments (deep processing), and higher-level nonolfactory judgments (deep-control processing, with subjects rating odors with auditory and visual descriptors). One experimental condition successively grouped lower- and higher-level olfactory judgments (superficial-deep processing). A naming index, which depended on response accuracy, and the subjects' response times were calculated. Odor naming was modified for 18 out of 30 odorants as a function of the level of processing required. For 94.5% of significant variations, the scores for odor naming were higher following those tasks for which it was hypothesized that the necessary olfactory processing was carried out at a deeper level. Performance in the naming task improved progressively in the following order: no rating of odors, then superficial, deep-control, deep, and superficial-deep processing. These data show that deeper olfactory encoding was associated with progressively higher performance in naming.

  7. Three-Dimensional Kinematic Analysis of Prehension Movements in Young Children with Autism Spectrum Disorder: New Insights on Motor Impairment.

    PubMed

    Campione, Giovanna Cristina; Piazza, Caterina; Villa, Laura; Molteni, Massimo

    2016-06-01

    The study was aimed at better clarifying whether action execution impairment in autism depends mainly on disruptions either in feedforward mechanisms or in feedback-based control processes supporting motor execution. To this purpose, we analyzed prehension movement kinematics in 4- and 5-year-old children with autism and in peers with typical development. Statistical analysis showed that the kinematics of the grasp component was spared in autism, whereas early kinematics of the reach component was atypical. We discussed this evidence as suggesting impairment in the feedforward processes involved in action execution, whereas impairment in feedback-based control processes remained unclear. We proposed that certain motor abilities are available in autism, and children may use them differently as a function of motor context complexity.

  8. Application of the suggestion system in the improvement of the production process and product quality control

    NASA Astrophysics Data System (ADS)

    Gołaś, H.; Mazur, A.; Gruszka, J.; Szafer, P.

    2016-08-01

    This paper is a case study; the research was carried out in the company Alco-Mot Ltd., which employs 120 people. The company specializes in the production of lead poles for industrial and traction batteries using gravity casting. The elements embedded in the cast are manufactured on a machining centre, which provides stability of the process and of the product dimensions as well as a very short production time. As a result of observation and analysis, the authors have developed a concept for the implementation of a dynamic suggestion system in ALCO-MOT, including, among others, a standard for actions in the implementation of the suggestion system, as well as clear guidelines for the processing and presentation of the activities undertaken between the formulation of a suggestion and the benefits analysis after the proposed solutions have been implemented. The authors also present how suggestions proposed by ALCO-MOT staff contributed to the improvement of the production and quality control processes. Employees offered more than 30 suggestions, of which more than half are now being implemented and further ones are being prepared for implementation. The authors present the results of the improvements, for example in tool replacement time and scrap reduction, show how kaizen can improve the production and quality control processes, and compare these processes before and after the implementation of employee suggestions.

  9. The remote supervisory and controlling experiment system of traditional Chinese medicine production based on Fieldbus

    NASA Astrophysics Data System (ADS)

    Zhan, Jinliang; Lu, Pei

    2006-11-01

    Since the quality of traditional Chinese medicine products is affected by raw materials, machining, and many other factors, it is difficult for the traditional Chinese medicine production process, especially the extracting process, to ensure steady and homogeneous quality. At the same time, there exist some quality control blind spots due to the lack of on-line quality detection means. However, if infrared spectrum analysis technology is used in the production process, on the basis of off-line analysis, to detect the quality of semi-manufactured goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality can be obtained. It can be seen that on-line detection of the extracting process plays an important role in the development of the Chinese patent medicine industry. In this paper, the design and implementation of a traditional Chinese medicine extracting process monitoring experiment system based on the PROFIBUS-DP field bus, OPC, and Internet technology is introduced. The system integrates intelligent nodes for data gathering and a supervisory sub-system for graphical configuration and remote supervision; during traditional Chinese medicine production it monitors temperature, pressure, quality, and other parameters, and it can be controlled by remote nodes in a VPN (Virtual Private Network). Experiments and applications have proved that the system fully reaches the anticipated effect, with the merits of operational stability, real-time response, reliability, and convenient and simple manipulation.

  10. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cells, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include the glycosylation profile of the final product, along with process attributes such as viable cell density and level of antibody expression. These were related to process variables, raw material components of the chemically defined hybridoma media, concentrations of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
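
    A minimal sketch of the kind of multivariate analysis described here uses partial least squares to relate process variables to a quality attribute; the variable names and data below are synthetic placeholders, not the study's measurements.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(42)
        n_runs = 30

        # Synthetic process variables: [glucose feed, methionine, pH, dissolved O2]
        X = rng.normal(size=(n_runs, 4))
        # Synthetic quality attribute (e.g., a glycan fraction) driven by two of them
        y = 0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 0.2, n_runs)

        pls = PLSRegression(n_components=2).fit(X, y)
        print("R^2:", round(pls.score(X, y), 2))
        # Coefficients indicate which process variables move with the quality attribute
        for name, w in zip(["glucose", "methionine", "pH", "dO2"], pls.coef_.ravel()):
            print(f"{name:12s} coef = {w:+.2f}")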

  11. On the analysis of EEG power, frequency and asymmetry in Parkinson’s disease during emotion processing

    PubMed Central

    2014-01-01

    Objective: While Parkinson's disease (PD) has traditionally been described as a movement disorder, there is growing evidence of disruption in emotion information processing associated with the disease. The aim of this study was to investigate whether there are specific electroencephalographic (EEG) characteristics that discriminate PD patients and normal controls during emotion information processing. Method: EEG recordings from 14 scalp sites were collected from 20 PD patients and 30 age-matched normal controls. Multimodal (audio-visual) stimuli were presented to evoke specific targeted emotional states such as happiness, sadness, fear, anger, surprise and disgust. Absolute and relative power, frequency and asymmetry measures derived from spectrally analyzed EEGs were subjected to repeated-measures ANOVA for group comparisons, as well as to discriminant function analysis to examine their utility as classification indices. In addition, subjective ratings were obtained for the emotional stimuli used. Results: Behaviorally, PD patients showed no impairments in emotion recognition as measured by subjective ratings. Compared with normal controls, PD patients evidenced smaller overall relative delta, theta, alpha and beta power, and at bilateral anterior regions smaller absolute theta, alpha, and beta power and higher mean total spectrum frequency across different emotional states. Inter-hemispheric theta, alpha, and beta power asymmetry index differences were noted, with controls exhibiting greater right than left hemisphere activation, whereas intra-hemispheric alpha power asymmetry reduction was exhibited in patients bilaterally at all regions. Discriminant analysis correctly classified 95.0% of the patients and controls during emotional stimuli. Conclusion: These distributed spectral powers in different frequency bands might provide meaningful information about emotional processing in PD patients. PMID:24716619
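
    For illustration, relative band power and an inter-hemispheric alpha asymmetry index of the sort analyzed here can be computed from raw EEG traces as follows; the sampling rate, band edges and asymmetry definition are common conventions assumed for the sketch, and the signals are synthetic.

        import numpy as np
        from scipy.signal import welch

        BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

        def band_powers(x, fs):
            """Absolute and relative power per band from Welch's PSD estimate."""
            f, psd = welch(x, fs=fs, nperseg=2 * fs)
            broad = (f >= 1) & (f <= 30)
            total = np.trapz(psd[broad], f[broad])
            out = {}
            for name, (lo, hi) in BANDS.items():
                sel = (f >= lo) & (f < hi)
                absolute = np.trapz(psd[sel], f[sel])
                out[name] = (absolute, absolute / total)
            return out

        fs = 256
        rng = np.random.default_rng(7)
        left = rng.normal(size=fs * 10)    # synthetic 10 s left-hemisphere channel
        right = rng.normal(size=fs * 10)   # synthetic 10 s right-hemisphere channel
        pl, pr = band_powers(left, fs), band_powers(right, fs)
        # A common asymmetry index: ln(right) - ln(left) of absolute alpha power
        asym = np.log(pr["alpha"][0]) - np.log(pl["alpha"][0])
        print(f"alpha asymmetry index: {asym:+.3f}")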

  12. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may arise from a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command process. These models include simulation analysis and probabilistic risk assessment models.

  13. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  14. Development of a Premium Quality Plasma-derived IVIg (IQYMUNE®) Utilizing the Principles of Quality by Design-A Worked-through Case Study.

    PubMed

    Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent

    2018-01-01

    Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular approach of quality by design was implemented consisting of five consecutive steps to cover all the stages from the product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiment approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established. The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach plus elements of the quality by design enhanced approach, as illustrated, to more robustly assign material and process controls and in order to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation and a clear justification of the process and product specifications as a basis for control strategy and future comparability exercises. © PDA, Inc. 2018.
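
    The design-of-experiments robustness studies mentioned above can be illustrated with a simple full-factorial design generator; the factor names and levels are hypothetical, chosen only to show the mechanics.

        from itertools import product

        # Hypothetical critical process parameters and candidate levels
        factors = {
            "pH": [4.0, 4.5, 5.0],
            "load_g_per_L": [20, 30],
            "temperature_C": [18, 22, 26],
        }

        # Full-factorial design: every combination of levels becomes one run
        names = list(factors)
        runs = [dict(zip(names, combo)) for combo in product(*factors.values())]
        print(f"{len(runs)} runs")          # 3 * 2 * 3 = 18
        for run in runs[:3]:
            print(run)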

  15. Real-time control data wrangling for development of mathematical control models of technological processes

    NASA Astrophysics Data System (ADS)

    Vasilyeva, N. V.; Koteleva, N. I.; Fedorova, E. R.

    2018-05-01

    The relevance of this research stems from the need to stabilize the composition of the melting products of copper-nickel sulfide raw materials in the Vanyukov furnace. The goal of this research is to identify the most suitable methods for aggregating real-time data for the development of a mathematical model for control of the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Statistical methods for analyzing the historical data of the real technological object and the correlation analysis of process parameters are described. Factors that exert the greatest influence on the main output parameter (copper content in matte) and ensure the physical-chemical transformations are revealed. An approach to processing the real-time data for the development of a mathematical control model for the melting process is proposed, and the stages of processing the real-time information are considered. The adopted methodology for aggregating data suitable for the development of a control model for the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace allows the obtained results to be interpreted for further practical application.
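
    As a small illustration of the correlation-screening step described here, the snippet below ranks process parameters by their correlation with the output variable; the tag names and data are invented stand-ins for furnace historian data.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(3)
        n = 500
        # Synthetic process historian with hypothetical furnace-like tags
        df = pd.DataFrame({
            "oxygen_flow": rng.normal(100, 10, n),
            "feed_rate": rng.normal(50, 5, n),
            "bath_temperature": rng.normal(1300, 20, n),
        })
        df["copper_in_matte"] = (0.2 * df["oxygen_flow"] - 0.3 * df["feed_rate"]
                                 + rng.normal(0, 2, n))

        # Rank inputs by absolute correlation with the controlled output
        corr = df.corr()["copper_in_matte"].drop("copper_in_matte")
        print(corr.abs().sort_values(ascending=False))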

  16. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-01-01

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site, Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  17. Advanced technologies for maintenance of electrical systems and equipment at the Savannah River Site Defense Waste Processing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Husler, R.O.; Weir, T.J.

    1991-12-31

    An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site, Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system, which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.

  18. Glycan Remodeling with Processing Inhibitors and Lectin-Resistant Eukaryotic Cells.

    PubMed

    Chang, Veronica T; Spooner, Robert A; Crispin, Max; Davis, Simon J

    2015-01-01

    Some of the most important and interesting molecules in metazoan biology are glycoproteins. The importance of the carbohydrate component of these structures is often revealed by the disease phenotypes that manifest when the biosynthesis of particular glycoforms is disrupted. On the other hand, the presence of large amounts of carbohydrate can often hinder the structural and functional analysis of glycoproteins. There are often good reasons, therefore, for wanting to engineer and predefine the N-glycans present on glycoproteins, e.g., in order to characterize the functions of the glycans or facilitate their subsequent removal. Here, we describe in detail two distinct ways in which to usefully interfere with oligosaccharide processing, one involving the use of specific processing inhibitors, and the other the selection of cell lines mutated at gene loci that control oligosaccharide processing, using cytotoxic lectins. Both approaches have the capacity for controlled, radical alteration of oligosaccharide processing in eukaryotic cells used for heterologous protein expression, and have great utility in the structural analysis of glycoproteins.

  19. Six Sigma methods applied to cryogenic coolers assembly line

    NASA Astrophysics Data System (ADS)

    Ventre, Jean-Marc; Germain-Lacour, Michel; Martin, Jean-Yves; Cauquil, Jean-Marc; Benschop, Tonny; Griot, René

    2009-05-01

    Six Sigma methods have been applied to the manufacturing process of a rotary Stirling cooler, the RM2. The project was named NoVa, as the main goal of the Six Sigma approach is to reduce variability (No Variability). The project was based on the DMAIC guideline, following five stages: Define, Measure, Analyse, Improve, Control. The objective was set on the rate of coolers passing the performance test at the first attempt, with a goal value of 95%. A team was gathered involving the people and skills acting on the RM2 manufacturing line. Measurement System Analysis (MSA) was applied to the test bench, and results after an R&R gage study show that measurement is one of the root causes of variability in the RM2 process. Two more root causes were identified by the team after process mapping analysis: the regenerator filling factor and the cleaning procedure. The causes of measurement variability were identified and eradicated, as shown by new results from the R&R gage study. Experimental results show that the regenerator filling factor impacts process variability and affects yield. An improved process was established after a new calibration process for the test bench, a new filling procedure for the regenerator, and an additional cleaning stage were implemented. The objective of 95% of coolers passing the performance test at the first attempt was reached and maintained for a significant period. The RM2 manufacturing process is now managed according to Statistical Process Control based on control charts. Improvements in process capability have enabled the introduction of a sample testing procedure before delivery.
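
    A minimal sketch of the control-chart monitoring mentioned here, using a p-chart for the fraction of coolers failing the first-attempt performance test; the subgroup sizes and failure counts are invented for illustration.

        import numpy as np

        # Hypothetical weekly data: units tested and first-attempt failures
        n = np.array([40, 38, 42, 40, 41, 39, 40, 43])
        fails = np.array([3, 2, 4, 1, 9, 2, 3, 2])

        p = fails / n
        p_bar = fails.sum() / n.sum()              # centerline (pooled failure rate)
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)   # per-subgroup standard error
        ucl = p_bar + 3 * sigma
        lcl = np.clip(p_bar - 3 * sigma, 0, None)

        for week, (pi, u, l) in enumerate(zip(p, ucl, lcl), 1):
            flag = "OUT" if (pi > u or pi < l) else "ok"
            print(f"week {week}: p={pi:.3f} LCL={l:.3f} UCL={u:.3f} {flag}")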

  20. Secretome profile analysis of multidrug-resistant, monodrug-resistant and drug-susceptible Mycobacterium tuberculosis.

    PubMed

    Putim, Chanyanuch; Phaonakrop, Narumon; Jaresitthikunchai, Janthima; Gamngoen, Ratikorn; Tragoolpua, Khajornsak; Intorasoot, Sorasak; Anukool, Usanee; Tharincharoen, Chayada Sitthidet; Phunpae, Ponrut; Tayapiwatana, Chatchai; Kasinrerk, Watchara; Roytrakul, Sittiruk; Butr-Indr, Bordin

    2018-03-01

    The emergence of drug-resistant tuberculosis has generated great concern in the control of tuberculosis, and HIV/TB patients have established severe complications that are difficult to treat. Although the gold standard of drug-susceptibility testing is highly accurate and efficient, it is time-consuming. Diagnostic biomarkers are, therefore, necessary for discriminating between infection with drug-resistant and drug-susceptible strains. One strategy that aids in effectively controlling tuberculosis is understanding the function of the secreted proteins that mycobacteria use to manipulate the host cellular defenses. In this study, culture filtrate proteins from Mycobacterium tuberculosis H37Rv, isoniazid-resistant, rifampicin-resistant and multidrug-resistant strains were gathered and profiled by a shotgun-proteomics technique. Mass spectrometric analysis of the secreted proteome identified several proteins, of which 837, 892, 838 and 850 were found in M. tuberculosis H37Rv, isoniazid-resistant, rifampicin-resistant and multidrug-resistant strains, respectively. These proteins have been implicated in various cellular processes, including biological adhesion, biological regulation, developmental process, immune system process, localization, cellular process, cellular component organization or biogenesis, metabolic process, and response to stimulus. Analysis based on the STITCH database predicted the interaction of DNA topoisomerase I, 3-oxoacyl-(acyl-carrier protein) reductase, ESAT-6-like protein, putative prophage phiRv2 integrase, and 3-phosphoshikimate 1-carboxyvinyltransferase with isoniazid, rifampicin, pyrazinamide, ethambutol and streptomycin, suggesting putative roles in controlling anti-tuberculosis drug activity. However, several proteins with no interaction with any first-line anti-tuberculosis drugs might be used as markers for mycobacterial identification.

  1. SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CARRO CA

    2010-03-09

    This control decision addresses the Knock-Out Pot (KOP) Disposition KOP Processing System (KPS) conceptual design. The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 µm in size and low density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.

  2. Assessing technical performance in differential gene expression experiments with external spike-in RNA control ratio mixtures.

    PubMed

    Munro, Sarah A; Lund, Steven P; Pine, P Scott; Binder, Hans; Clevert, Djork-Arné; Conesa, Ana; Dopazo, Joaquin; Fasold, Mario; Hochreiter, Sepp; Hong, Huixiao; Jafari, Nadereh; Kreil, David P; Łabaj, Paweł P; Li, Sheng; Liao, Yang; Lin, Simon M; Meehan, Joseph; Mason, Christopher E; Santoyo-Lopez, Javier; Setterquist, Robert A; Shi, Leming; Shi, Wei; Smyth, Gordon K; Stralis-Pavese, Nancy; Su, Zhenqiang; Tong, Weida; Wang, Charles; Wang, Jian; Xu, Joshua; Ye, Zhan; Yang, Yong; Yu, Ying; Salit, Marc

    2014-09-25

    There is a critical need for standard approaches to assess, report and compare the technical performance of genome-scale differential gene expression experiments. Here we assess technical performance with a proposed standard 'dashboard' of metrics derived from analysis of external spike-in RNA control ratio mixtures. These control ratio mixtures with defined abundance ratios enable assessment of diagnostic performance of differentially expressed transcript lists, limit of detection of ratio (LODR) estimates and expression ratio variability and measurement bias. The performance metrics suite is applicable to analysis of a typical experiment, and here we also apply these metrics to evaluate technical performance among laboratories. An interlaboratory study using identical samples shared among 12 laboratories with three different measurement processes demonstrates generally consistent diagnostic power across 11 laboratories. Ratio measurement variability and bias are also comparable among laboratories for the same measurement process. We observe different biases for measurement processes using different mRNA-enrichment protocols.

  3. Independent Orbiter Assessment (IOA): Analysis of the guidance, navigation, and control subsystem

    NASA Technical Reports Server (NTRS)

    Trahan, W. H.; Odonnell, R. A.; Pietz, K. C.; Hiott, J. M.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results corresponding to the Orbiter Guidance, Navigation, and Control (GNC) Subsystem hardware are documented. The function of the GNC hardware is to respond to guidance, navigation, and control software commands to effect vehicle control and to provide sensor and controller data to GNC software. Some of the GNC hardware for which failure modes analysis was performed includes: hand controllers; Rudder Pedal Transducer Assembly (RPTA); Speed Brake Thrust Controller (SBTC); Inertial Measurement Unit (IMU); Star Tracker (ST); Crew Optical Alignment Site (COAS); Air Data Transducer Assembly (ADTA); Rate Gyro Assemblies; Accelerometer Assembly (AA); Aerosurface Servo Amplifier (ASA); and Ascent Thrust Vector Control (ATVC). The IOA analysis process utilized available GNC hardware drawings, workbooks, specifications, schematics, and systems briefs for defining hardware assemblies, components, and circuits. Each hardware item was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.

  4. Optimisation of shock absorber process parameters using failure mode and effect analysis and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Mariajayaprakash, Arokiasamy; Senthilvelan, Thiyagarajan; Vivekananthan, Krishnapillai Ponnambal

    2013-07-01

    The various process parameters affecting the quality characteristics of the shock absorber during the process were identified using the Ishikawa diagram and by failure mode and effect analysis. The identified process parameters are welding process parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing process parameters (load, hydraulic pressure, air pressure, and fixture height), washing process parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting process parameters (flowability, coating thickness, pointage, and temperature). In this paper, the painting and washing process parameters are optimized by the Taguchi method. Though the defects are reasonably minimized by the Taguchi method, in order to achieve zero defects during the processes, the genetic algorithm technique is applied to the optimized parameters obtained by the Taguchi method.

  5. Requirements for implementing real-time control functional modules on a hierarchical parallel pipelined system

    NASA Technical Reports Server (NTRS)

    Wheatley, Thomas E.; Michaloski, John L.; Lumia, Ronald

    1989-01-01

    Analysis of a robot control system leads to a broad range of processing requirements. One fundamental requirement of a robot control system is the necessity of a microcomputer system in order to provide sufficient processing capability. The use of multiple processors in a parallel architecture is beneficial for a number of reasons, including better cost performance, modular growth, increased reliability through replication, and flexibility for testing alternate control strategies via different partitioning. A survey of the progression from low level control synchronizing primitives to higher level communication tools is presented. The system communication and control mechanisms of existing robot control systems are compared to the hierarchical control model. The impact of this design methodology on the current robot control systems is explored.

  6. Structural model of control system for hydraulic stepper motor complex

    NASA Astrophysics Data System (ADS)

    Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.

    2018-03-01

    The article considers the problem of developing a structural model of the control system for a hydraulic stepper drive complex. A comparative analysis of stepper drives and an assessment of the applicability of hydraulic stepper motors (HSM) for solving problems requiring accurate displacement in space with subsequent positioning of the object are carried out. The presented structural model of the automated control system of the multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as the process of its control based on the transfer of control signals to the solenoid valves by the controller. The models and methods described in the article can be used to formalize the control process in technical systems based on the application of hydraulic stepper drives, and allow switching from mechanical control to automated control.

  7. Situation Awareness Implications of Adaptive Automation of Air Traffic Controller Information Processing Functions

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; McClernon, Christopher K.; Perry, Carlene M.; Segall, Noa

    2004-01-01

    The goal of this research was to define a measure of situation awareness (SA) in an air traffic control (ATC) task and to assess the influence of adaptive automation (AA) of various information processing functions on controller perception, comprehension and projection. The measure was also to serve as a basis for defining and developing an approach to triggering dynamic control allocations, as part of AA, based on controller SA. To achieve these objectives, an enhanced version of an ATC simulation (Multitask©) was developed for use in two human factors experiments. The simulation captured the basic functions of Terminal Radar Approach Control (TRACON) and was capable of presenting to operators four different modes of control, including information acquisition, information analysis, decision making and action implementation automation, as well as a completely manual control mode. The SA measure that was developed as part of the research was based on the Situation Awareness Global Assessment Technique (SAGAT), previous goal-directed task analyses of enroute control and TRACON, and a separate cognitive task analysis on the ATC simulation. The results of the analysis on Multitask were used as a basis for formulating SA queries as part of the SAGAT-based approach to measuring controller SA, which was used in the experiments. A total of 16 subjects were recruited for both experiments. Half the subjects were used in Experiment #1, which focused on assessing the sensitivity and reliability of the SA measurement approach in the ATC simulation. Comparisons were made of manual versus automated control. The remaining subjects were used in the second experiment, which was intended to more completely describe the SA implications of AA applied to specific controller information processing functions, and to describe how the measure could ultimately serve as a trigger of dynamic function allocations in the application of AA to ATC. Comparisons were made of the sensitivity of the SA measure to automation manipulations impacting both higher-order information processing functions, such as information analysis and decision making, versus lower-order functions, including information acquisition and action implementation. All subjects were exposed to all forms of AA of the ATC task and the manual control condition. The approach to AA used in both experiments was to match operator workload, assessed using a secondary task, to dynamic control allocations in the primary task. In total, the subjects in each experiment participated in 10 trials with each lasting between 45 minutes and 1 hour. In both experiments, ATC performance was measured in terms of aircraft cleared, conflicting, and collided. Secondary task (gauge monitoring) performance was assessed in terms of a hit-to-signal ratio. As part of the SA measure, three simulation freezes were conducted during each trial to administer queries on Level 1, 2, and 3 SA.

  8. Development of advanced Czochralski growth process to produce low cost 150 kg silicon ingots from a single crucible for technology readiness

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The design and development of an advanced Czochralski crystal grower are described. Equipment specifications for several exhaust gas analysis systems that were studied are discussed. Process control requirements were defined, and design work began on melt temperature, melt level, and continuous diameter control. Sensor development included assembly and testing of a bench prototype of a diameter scanner system.

  9. Automated drug identification system

    NASA Technical Reports Server (NTRS)

    Campen, C. F., Jr.

    1974-01-01

    System speeds up analysis of blood and urine and is capable of identifying 100 commonly abused drugs. System includes computer that controls entire analytical process by ordering various steps in specific sequences. Computer processes data output and has readout of identified drugs.

  10. Diazo techniques for remote sensor data analysis

    NASA Technical Reports Server (NTRS)

    Mount, S.; Whitebay, L. E.

    1979-01-01

    Cost and time to extract land use maps, natural-resource surveys, and other data from aerial and satellite photographs are reduced by diazo processing. Process can be controlled to enhance features such as vegetation, land boundaries, and bodies of water.

  11. Making the Hubble Space Telescope servicing mission safe

    NASA Technical Reports Server (NTRS)

    Bahr, N. J.; Depalo, S. V.

    1992-01-01

    The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.

  12. Evaluation of browning ratio in an image analysis of apple slices at different stages of instant controlled pressure drop-assisted hot-air drying (AD-DIC).

    PubMed

    Gao, Kun; Zhou, Linyan; Bi, Jinfeng; Yi, Jianyong; Wu, Xinye; Zhou, Mo; Wang, Xueyuan; Liu, Xuan

    2017-06-01

    Computer vision-based image analysis systems are widely used in food processing to evaluate quality changes, since they are able to measure the surface colour of various products objectively and quantitatively. In this study, a computer vision-based image analysis system was used to investigate the colour changes of apple slices dried by instant controlled pressure drop-assisted hot air drying (AD-DIC). The CIE L* value and polyphenol oxidase activity in apple slices decreased during the entire drying process, whereas other colour indexes, including the CIE a*, b*, ΔE and C* values, increased. The browning ratio calculated by image analysis increased during the drying process, and a sharp increment was observed for the DIC process. The changes in 5-hydroxymethylfurfural (5-HMF) and fluorescent compounds (FIC) showed the same trend as the browning ratio, due to the Maillard reaction. Moreover, the concentrations of 5-HMF and FIC both had a good quadratic correlation (R² > 0.998) with the browning ratio. The browning ratio was a reliable indicator of 5-HMF and FIC changes in apple slices during drying. The image analysis system could be used to monitor colour changes, 5-HMF and FIC in dehydrated apple slices during the AD-DIC process. © 2016 Society of Chemical Industry.
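
    The reported quadratic correlation can be reproduced with a short least-squares fit. Below is a minimal Python sketch with invented (x, y) pairs standing in for the measured browning ratios and 5-HMF concentrations.

        import numpy as np

        # Hypothetical browning ratio (x) versus 5-HMF concentration (y) pairs.
        x = np.array([0.05, 0.12, 0.20, 0.31, 0.44, 0.58])
        y = np.array([0.8, 1.9, 4.1, 8.0, 14.6, 24.3])

        coeffs = np.polyfit(x, y, 2)          # quadratic fit: y = a*x^2 + b*x + c
        y_hat = np.polyval(coeffs, x)
        r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
        print("coefficients:", coeffs, " R^2 =", round(r2, 4))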

  13. Cluster analysis of cognitive performance in elderly and demented subjects.

    PubMed

    Giaquinto, S; Nolfe, G; Calvani, M

    1985-06-01

    48 elderly normals, 14 demented subjects and 76 young controls were tested for basic cognitive functions. All the tests were quantified and could therefore be subjected to statistical analysis. The results show a difference in the speed of information processing and in memory load between the young controls and elderly normals but the age groups differed in quantitative terms only. Cluster analysis showed that the elderly and the demented formed two distinctly separate groups at the qualitative level, the basic cognitive processes being damaged in the demented group. Age thus appears to be only a risk factor for dementia and not its cause. It is concluded that batteries based on precise and measurable tasks are the most appropriate not only for the study of dementia but for rehabilitation purposes too.

  14. Stabilization and analytical tuning rule of double-loop control scheme for unstable dead-time process

    NASA Astrophysics Data System (ADS)

    Ugon, B.; Nandong, J.; Zang, Z.

    2017-06-01

    The presence of unstable dead-time systems in process plants often poses a daunting challenge for the design of standard PID controllers, which are intended not only to provide closed-loop stability but also to give good overall performance-robustness. In this paper, we conduct a stability analysis of a double-loop control scheme based on the Routh-Hurwitz stability criteria. We propose to use this double-loop control scheme, which employs two P/PID controllers, to control the first-order or second-order unstable dead-time processes typically found in process industries. Based on the Routh-Hurwitz necessary and sufficient stability criteria, we establish several stability regions which enclose the P/PID parameter values that guarantee closed-loop stability of the double-loop control scheme. A systematic tuning rule is developed for the purpose of obtaining the optimal P/PID parameter values within the established regions. The effectiveness of the proposed tuning rule is demonstrated using several numerical examples, and the results are compared with some well-established tuning methods reported in the literature.
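
    For readers who want to experiment with the stability test used here, the sketch below builds a Routh array and checks the first-column sign condition for a polynomial with a positive leading coefficient. It is a minimal Python illustration, not the authors' tool: the epsilon substitution for zero pivots is a simplification, and a dead-time term would first have to be rationalized (for example with a Padé approximation) to obtain a polynomial characteristic equation.

        def routh_hurwitz_stable(coeffs, eps=1e-9):
            """Routh-Hurwitz test: True if all roots of the polynomial (coefficients
            listed highest order first, positive leading coefficient) lie in the
            open left half-plane."""
            n = len(coeffs) - 1
            width = (len(coeffs) + 1) // 2
            rows = [list(coeffs[0::2]), list(coeffs[1::2])]
            for r in rows:
                r.extend([0.0] * (width - len(r)))
            for i in range(2, n + 1):
                above, above2 = rows[i - 1], rows[i - 2]
                pivot = above[0] if abs(above[0]) > eps else eps  # crude zero-pivot fix
                new_row = [(pivot * above2[j + 1] - above2[0] * above[j + 1]) / pivot
                           for j in range(width - 1)]
                new_row.append(0.0)
                rows.append(new_row)
            return all(row[0] > 0 for row in rows)  # first-column sign condition

        print(routh_hurwitz_stable([1, 3, 2]))   # True: roots -1 and -2
        print(routh_hurwitz_stable([1, -1, 2]))  # False: right-half-plane pair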

  15. Integrated Assessment and Improvement of the Quality Assurance System for the Cosworth Casting Process

    NASA Astrophysics Data System (ADS)

    Yousif, Dilon

    The purpose of this study was to improve the Quality Assurance (QA) System at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al Alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al Alloy(s) using IRC techniques. The impact of low melting point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).

  16. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    PubMed

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    The article considers the quality control and safety system implemented in one of the largest flight catering food production plants serving airline passengers and flight crews. The control system was based on the Hazard Analysis and Critical Control Points (HACCP) principles and on developed hygienic and anti-epidemic measures. The identification of hazard factors at each stage of the technological process is considered. Results of the analysis of monitoring data for 6 critical control points over a five-year period are presented. The quality control and safety system permits a reduction in the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated. Further ways of harmonizing and implementing HACCP principles in the plant are determined.

  17. Which Measures of Online Control Are Least Sensitive to Offline Processes?

    PubMed

    de Grosbois, John; Tremblay, Luc

    2018-02-28

    A major challenge to the measurement of online control is the contamination by offline, planning-based processes. The current study examined the sensitivity of four measures of online control to offline changes in reaching performance induced by prism adaptation and terminal feedback. These measures included the squared Z scores (Z²) of correlations of limb position at 75% movement time versus movement end, variable error, time after peak velocity, and a frequency-domain analysis (pPower). The results indicated that variable error and time after peak velocity were sensitive to the prism adaptation. Furthermore, only the Z² values were biased by the terminal feedback. Ultimately, the current study has demonstrated the sensitivity of limb kinematic measures to offline control processes and that pPower analyses may yield the most suitable measure of online control.

  18. An Analysis of the Air Force Government Operated Civil Engineering Supply Store Logistic System: How Can It Be Improved?

    DTIC Science & Technology

    1990-09-01

    Work orders and job orders to the Material Control Section will be discussed separately. Work Order Processing: Figure 2 illustrates typical WO processing in the logistics function. The JO processing is similar. Job Order Processing: Figure 3 illustrates typical JO processing in a GOCESS operation. As with WOs, this ...

  19. Advancing biopharmaceutical process development by system-level data analysis and integration of omics data.

    PubMed

    Schaub, Jochen; Clemens, Christoph; Kaufmann, Hitto; Schulz, Torsten W

    2012-01-01

    Development of efficient bioprocesses is essential for cost-effective manufacturing of recombinant therapeutic proteins. To achieve further process improvement and process rationalization, comprehensive data analysis of both process data and phenotypic cell-level data is essential. Here, we present a framework for advanced bioprocess data analysis consisting of multivariate data analysis (MVDA), metabolic flux analysis (MFA), and pathway analysis for mapping of large-scale gene expression data sets. This data analysis platform was applied in a process development project with an IgG-producing Chinese hamster ovary (CHO) cell line in which the maximal product titer could be increased from about 5 to 8 g/L. Principal component analysis (PCA), k-means clustering, and partial least-squares (PLS) models were applied to analyze the macroscopic bioprocess data. MFA and gene expression analysis revealed intracellular information on the characteristics of high-performance cell cultivations. By MVDA, for example, correlations between several essential amino acids and the product concentration were observed. Also, a grouping into rather cell-specific productivity-driven and process control-driven processes could be unraveled. By MFA, phenotypic characteristics in glycolysis, glutaminolysis, the pentose phosphate pathway, the citrate cycle, the coupling of amino acid metabolism to the citrate cycle, and the energy yield could be identified. By gene expression analysis, 247 deregulated metabolic genes were identified which are involved, inter alia, in amino acid metabolism, transport, and protein synthesis.
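
    A minimal sketch of the multivariate step, assuming scikit-learn is available and using synthetic data in place of the cultivation dataset: PCA summarizes the process variables and a two-component PLS model relates them to titer, mirroring the PCA/PLS workflow described above.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        # Synthetic stand-in for the cultivation data: rows = runs, columns =
        # process variables (e.g., amino acid levels); y = product titer (g/L).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 8))
        y = 5.0 + 1.5 * X[:, 0] - 0.8 * X[:, 3] + rng.normal(scale=0.2, size=30)

        pca = PCA(n_components=2).fit(X)
        print("explained variance ratios:", pca.explained_variance_ratio_)

        pls = PLSRegression(n_components=2).fit(X, y)
        print("PLS R^2 on training data:", round(pls.score(X, y), 3))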

  20. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to put portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to state whether it is possible to substitute the ionisation chamber with portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, coming from industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing drifts both slow and weak and strong and fast, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting the ionisation chamber measurements with those performed with the EPID, and showed that statistical process control monitoring of the data brought a security guarantee. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
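
    The capability study mentioned above reduces to comparing the process spread against tolerance limits. The sketch below computes the standard Cp and Cpk indices for hypothetical EPID-versus-planned dose deviations; the ±3% tolerance is illustrative, not the center's actual action level.

        import statistics

        def capability(data, lsl, usl):
            """Cp and Cpk process capability indices."""
            mu = statistics.mean(data)
            sigma = statistics.stdev(data)
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk

        # Hypothetical dose differences (%) between EPID and planned dose.
        deviations = [0.4, -0.2, 0.1, 0.5, -0.3, 0.0, 0.2, -0.1, 0.3, 0.1]
        cp, cpk = capability(deviations, lsl=-3.0, usl=3.0)
        print(f"Cp={cp:.2f}  Cpk={cpk:.2f}")  # values above ~1.33 are often deemed capable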

  1. No control genes required: Bayesian analysis of qRT-PCR data.

    PubMed

    Matz, Mikhail V; Wright, Rachel M; Scott, James G

    2013-01-01

    Model-based analysis of data from quantitative reverse-transcription PCR (qRT-PCR) is potentially more powerful and versatile than traditional methods. Yet existing model-based approaches cannot properly deal with the higher sampling variances associated with low-abundant targets, nor do they provide a natural way to incorporate assumptions about the stability of control genes directly into the model-fitting process. In our method, raw qPCR data are represented as molecule counts, and described using generalized linear mixed models under Poisson-lognormal error. A Markov Chain Monte Carlo (MCMC) algorithm is used to sample from the joint posterior distribution over all model parameters, thereby estimating the effects of all experimental factors on the expression of every gene. The Poisson-based model allows for the correct specification of the mean-variance relationship of the PCR amplification process, and can also glean information from instances of no amplification (zero counts). Our method is very flexible with respect to control genes: any prior knowledge about the expected degree of their stability can be directly incorporated into the model. Yet the method provides sensible answers without such assumptions, or even in the complete absence of control genes. We also present a natural Bayesian analogue of the "classic" analysis, which uses standard data pre-processing steps (logarithmic transformation and multi-gene normalization) but estimates all gene expression changes jointly within a single model. The new methods are considerably more flexible and powerful than the standard delta-delta Ct analysis based on pairwise t-tests. Our methodology expands the applicability of the relative-quantification analysis protocol all the way to the lowest-abundance targets, and provides a novel opportunity to analyze qRT-PCR data without making any assumptions concerning target stability. These procedures have been implemented as the MCMC.qpcr package in R.
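
    To make the modelling idea concrete, the sketch below fits a plain Poisson log-linear model for a single gene with a hand-rolled Metropolis sampler. It is a conceptual Python illustration only: the published method (the MCMC.qpcr package in R) additionally includes lognormal random effects for overdispersion and fits many genes jointly, and the counts here are invented.

        import math
        import random

        # Invented molecule counts: three control and three treatment samples.
        counts = [152, 170, 148, 60, 72, 55]
        condition = [0, 0, 0, 1, 1, 1]

        def log_post(baseline, effect):
            """Log-posterior: Poisson likelihood with weak normal priors."""
            lp = -0.5 * (baseline ** 2 + effect ** 2) / 100.0
            for y, x in zip(counts, condition):
                log_lam = baseline + effect * x
                lp += y * log_lam - math.exp(log_lam) - math.lgamma(y + 1)
            return lp

        random.seed(1)
        b = e = 0.0
        current = log_post(b, e)
        effect_draws = []
        for step in range(20000):
            b_prop = b + random.gauss(0.0, 0.05)
            e_prop = e + random.gauss(0.0, 0.05)
            proposed = log_post(b_prop, e_prop)
            if math.log(random.random()) < proposed - current:   # Metropolis accept
                b, e, current = b_prop, e_prop, proposed
            if step >= 5000:                                     # discard burn-in
                effect_draws.append(e)
        mean_e = sum(effect_draws) / len(effect_draws)
        print(f"posterior mean log fold-change: {mean_e:.2f} "
              f"(fold-change about {math.exp(mean_e):.2f})")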

  2. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    PubMed

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  3. Off-the-shelf Control of Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Wampler, S.

    The Gemini Project must provide convenient access to data analysis facilities to a wide user community. The international nature of this community makes the selection of data analysis software particularly interesting, with staunch advocates of systems such as ADAM and IRAF among the users. Additionally, the continuing trends towards increased use of networked systems and distributed processing impose additional complexity. To meet these needs, the Gemini Project is proposing the novel approach of using low-cost, off-the-shelf software to abstract out both the control and distribution of data analysis from the functionality of the data analysis software. For example, the orthogonal nature of control versus function means that users might select analysis routines from both ADAM and IRAF as appropriate, distributing these routines across a network of machines. It is the belief of the Gemini Project that this approach results in a system that is highly flexible, maintainable, and inexpensive to develop. The Khoros visualization system is presented as an example of control software that is currently available for providing the control and distribution within a data analysis system. The visual programming environment provided with Khoros is also discussed as a means to providing convenient access to this control.

  4. Can we (control) Engineer the degree learning process?

    NASA Astrophysics Data System (ADS)

    White, A. S.; Censlive, M.; Neilsen, D.

    2014-07-01

    This paper investigates how control theory could be applied to learning processes in engineering education. The starting point for the analysis is White's double-loop learning model of human automation control, modified for the education process, where a set of governing principles is chosen, probably by the course designer. After initial training, the student unknowingly decides on a mental map or model. After observing how the real world is behaving, a strategy to achieve the governing variables is chosen and a set of actions selected. This may not be a conscious operation; it may be completely instinctive. These actions will cause some consequences, but only after a certain time delay. The current model is compared with the work of Hollenbeck on goal setting, Nelson's model of self-regulation, and that of Abdulwahed, Nagy and Blanchard at Loughborough, who investigated control methods applied to the learning process.

  5. Application of ICH Q9 Quality Risk Management Tools for Advanced Development of Hot Melt Coated Multiparticulate Systems.

    PubMed

    Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh

    2017-01-01

    This study aimed to apply quality risk management based on the International Conference on Harmonisation guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation composed of tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard and critical control points analysis was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes for potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes the product development sustainability and supports the regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
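
    The FMECA prioritization described above boils down to ranking failure modes by Risk Priority Number, RPN = severity x occurrence x detectability. A minimal Python sketch with invented scores for hypothetical hot-melt-coating failure modes:

        # Hypothetical failure modes, each scored 1-10 for severity, occurrence,
        # and detectability.
        failure_modes = [
            ("spray rate drift", 7, 5, 4),
            ("atomization air pressure too low", 6, 3, 3),
            ("coating amount variance", 8, 4, 6),
        ]
        for name, s, o, d in sorted(failure_modes,
                                    key=lambda m: m[1] * m[2] * m[3], reverse=True):
            print(f"RPN={s * o * d:4d}  {name}")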

  6. Management of the General Process of Parenteral Nutrition Using mHealth Technologies: Evaluation and Validation Study

    PubMed Central

    2018-01-01

    Background: Any system applied to the control of parenteral nutrition (PN) ought to prove that the process meets the established requirements and include a repository of records to allow evaluation of the information about PN processes at any time. Objective: The goal of the research was to evaluate the mobile health (mHealth) app and validate its effectiveness in monitoring the management of the PN process. Methods: We studied the evaluation and validation of the general process of PN using an mHealth app. The units of analysis were the PN bags prepared and administered at the Son Espases University Hospital, Palma, Spain, from June 1 to September 6, 2016. For the evaluation of the app, we used the Poststudy System Usability Questionnaire and subsequent analysis with the Cronbach alpha coefficient. Validation was performed by checking the compliance of control for all operations on each of the stages (validation and transcription of the prescription, preparation, conservation, and administration) and by monitoring the operative control points and critical control points. Results: The results obtained from 387 bags were analyzed, with 30 interruptions of administration. The fulfillment of stages was 100%, including noncritical nonconformities in the storage control. The average deviation in the weight of the bags was less than 5%, and the infusion time did not present deviations greater than 1 hour. Conclusions: The developed app successfully passed the evaluation and validation tests and was implemented to perform the monitoring procedures for the overall PN process. A new mobile solution to manage the quality and traceability of sensitive medicines such as blood-derivative drugs and hazardous drugs derived from this project is currently being deployed. PMID:29615389

  7. Inter-noise 89 - Engineering for environmental noise control; Proceedings of the International Conference on Noise Control Engineering, Newport Beach, CA, Dec. 4-6, 1989. Vols. 1 & 2

    NASA Astrophysics Data System (ADS)

    Maling, George C., Jr.

    Recent advances in noise analysis and control theory and technology are discussed in reviews and reports. Topics addressed include noise generation; sound-wave propagation; noise control by external treatments; vibration and shock generation, transmission, isolation, and reduction; multiple sources and paths of environmental noise; noise perception and the physiological and psychological effects of noise; instrumentation, signal processing, and analysis techniques; and noise standards and legal aspects. Diagrams, drawings, graphs, photographs, and tables of numerical data are provided.

  8. Apparatus for use in rapid and accurate controlled-potential coulometric analysis

    DOEpatents

    Frazzini, Thomas L.; Holland, Michael K.; Pietri, Charles E.; Weiss, Jon R.

    1981-01-01

    An apparatus for controlled-potential coulometric analysis of a solution includes a cell to contain the solution to be analyzed and a plurality of electrodes to contact the solution in the cell. Means are provided to stir the solution and to control the atmosphere above it. A potentiostat connected to the electrodes controls potential differences among the electrodes. An electronic circuit connected to the potentiostat provides analog-to-digital conversion and displays a precise count of charge transfer during a desired chemical process. This count provides a measure of the amount of an unknown substance in the solution.

  9. Spacelab data analysis and interactive control study

    NASA Technical Reports Server (NTRS)

    Tarbell, T. D.; Drake, J. F.

    1980-01-01

    The study consisted of two main tasks, a series of interviews of Spacelab users and a survey of data processing and display equipment. Findings from the user interviews on questions of interactive control, downlink data formats, and Spacelab computer software development are presented. Equipment for quick look processing and display of scientific data in the Spacelab Payload Operations Control Center (POCC) was surveyed. Results of this survey effort are discussed in detail, along with recommendations for NASA development of several specific display systems which meet common requirements of many Spacelab experiments.

  10. Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-01-01

    After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. The outcome measures were the quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of the risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.

  11. A Taguchi approach on optimal process control parameters for HDPE pipe extrusion process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, R. Umamaheswara; Rao, P. Srinivasa

    2017-06-01

    High-density polyethylene (HDPE) pipes find versatile applicability for the transportation of water, sewage and slurry from one place to another. Hence, these pipes undergo tremendous pressure from the fluid carried. The present work entails the optimization of the withstanding pressure of HDPE pipes using the Taguchi technique. The traditional heuristic methodology stresses a trial-and-error approach and relies heavily upon the accumulated experience of the process engineers for determining the optimal process control parameters, which results in the setting of less-than-optimal values. Hence, there arose a necessity to determine optimal process control parameters for the pipe extrusion process which can ensure robust pipe quality and process reliability. In the proposed optimization strategy, designed experiments (DoE) are conducted wherein different control parameter combinations are analyzed by considering multiple setting levels of each control parameter. The concept of the signal-to-noise ratio (S/N ratio) is applied, and ultimately optimum values of the process control parameters are obtained: a pushing zone temperature of 166 °C, a dimmer speed of 08 rpm, and a die head temperature of 192 °C. A confirmation experimental run was also conducted to verify the analysis, and its results proved to be in agreement with the main experimental findings; the withstanding pressure showed a significant improvement from 0.60 to 1.004 MPa.
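
    The S/N computation behind such a Taguchi analysis is compact. Below is a minimal Python sketch of the larger-the-better ratio applied to hypothetical replicated pressure results; the run labels and numbers are illustrative, not the paper's DoE table.

        import math

        def sn_larger_is_better(values):
            """Taguchi larger-the-better S/N ratio, in dB."""
            return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

        # Hypothetical replicated withstanding-pressure results (MPa) per DoE run.
        runs = {
            "run 1 (166 C, 8 rpm, 192 C)": [0.98, 1.01, 1.00],
            "run 2 (160 C, 6 rpm, 185 C)": [0.61, 0.58, 0.64],
        }
        for label, ys in runs.items():
            print(f"{label}: S/N = {sn_larger_is_better(ys):.2f} dB")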

  12. Control analysis for autonomously oscillating biochemical networks.

    PubMed Central

    Reijenga, Karin A; Westerhoff, Hans V; Kholodenko, Boris N; Snoep, Jacky L

    2002-01-01

    It has hitherto not been possible to analyze the control of oscillatory dynamic cellular processes in other than qualitative ways. The control coefficients, used in metabolic control analyses of steady states, cannot be applied directly to dynamic systems. We here illustrate a way out of this limitation that uses Fourier transforms to convert the time domain into the stationary frequency domain, and then analyses the control of limit cycle oscillations. In addition to the already known summation theorems for frequency and amplitude, we reveal summation theorems that apply to the control of average value, waveform, and phase differences of the oscillations. The approach is made fully operational in an analysis of yeast glycolytic oscillations. It follows an experimental approach, sampling from the model output and using discrete Fourier transforms of this data set. It quantifies the control of various aspects of the oscillations by the external glucose concentration and by various internal molecular processes. We show that the control of various oscillatory properties is distributed over the system enzymes in ways that differ among those properties. The models that are described in this paper can be accessed on http://jjj.biochem.sun.ac.za. PMID:11751299
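
    The core numerical step, sampling model output and reading oscillation properties from a discrete Fourier transform, can be sketched in a few lines. The signal below is synthetic (a sine wave standing in for a glycolytic oscillation), and the numbers are illustrative; in the paper's approach, a control coefficient would then be estimated by repeating this for a perturbed parameter.

        import numpy as np

        # Synthetic stand-in for sampled model output: a 1.8-amplitude oscillation
        # at 0.575 Hz around a mean of 3.2 (chosen to fit the window exactly).
        dt, n = 0.01, 4000
        t = np.arange(n) * dt
        signal = 3.2 + 1.8 * np.sin(2 * np.pi * 0.575 * t)

        spec = np.fft.rfft(signal - signal.mean())
        freqs = np.fft.rfftfreq(n, dt)
        k = np.argmax(np.abs(spec))
        amplitude = 2.0 * np.abs(spec[k]) / n
        print(f"mean={signal.mean():.2f}  frequency={freqs[k]:.3f} Hz  "
              f"amplitude={amplitude:.2f}")
        # A control coefficient such as C_p^F could then be estimated by rerunning
        # the model with parameter p perturbed and forming (p/F) * (dF/dp).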

  13. Depressive Symptoms and Parenting Competence: An Analysis of 13 Regulatory Processes

    ERIC Educational Resources Information Center

    Dix, Theodore; Meunier, Leah N.

    2009-01-01

    Mechanisms that lead depressive symptoms to undermine parenting are poorly understood. This review examines cognitive, affective, and motivational processes thought to be responsible for the impact of depressive symptoms on parenting. We present a five-step, action-control model and review 152 studies relevant to 13 regulatory processes. Evidence…

  14. 78 FR 47701 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control designed to help...

  15. 75 FR 18211 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Importing of Fish and Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control...

  16. Interfacing sensory input with motor output: does the control architecture converge to a serial process along a single channel?

    PubMed Central

    van de Kamp, Cornelis; Gawthrop, Peter J.; Gollee, Henrik; Lakie, Martin; Loram, Ian D.

    2013-01-01

    Modular organization in control architecture may underlie the versatility of human motor control; but the nature of the interface relating sensory input through task-selection in the space of performance variables to control actions in the space of the elemental variables is currently unknown. Our central question is whether the control architecture converges to a serial process along a single channel? In discrete reaction time experiments, psychologists have firmly associated a serial single channel hypothesis with refractoriness and response selection [psychological refractory period (PRP)]. Recently, we developed a methodology and evidence identifying refractoriness in sustained control of an external single degree-of-freedom system. We hypothesize that multi-segmental whole-body control also shows refractoriness. Eight participants controlled their whole body to ensure a head marker tracked a target as fast and accurately as possible. Analysis showed enhanced delays in response to stimuli with close temporal proximity to the preceding stimulus. Consistent with our preceding work, this evidence is incompatible with control as a linear time invariant process. This evidence is consistent with a single-channel serial ballistic process within the intermittent control paradigm with an intermittent interval of around 0.5 s. A control architecture reproducing intentional human movement control must reproduce refractoriness. Intermittent control is designed to provide computational time for an online optimization process and is appropriate for flexible adaptive control. For human motor control we suggest that parallel sensory input converges to a serial, single channel process involving planning, selection, and temporal inhibition of alternative responses prior to low dimensional motor output. Such design could aid robots to reproduce the flexibility of human control. PMID:23675342

  17. Numerical natural rubber curing simulation, obtaining a controlled gradient of the state of cure in a thick-section part

    NASA Astrophysics Data System (ADS)

    El Labban, A.; Mousseau, P.; Bailleul, J. L.; Deterre, R.

    2007-04-01

    Although numerical simulation has proved to be a useful tool for predicting the rubber vulcanization process, few applications to process control have been reported. Because the end-use rubber properties depend on the distribution of the state of cure through the part's thickness, the prediction of the optimal distribution remains a challenge for the rubber industry. The analysis of the vulcanization process requires the determination of the thermal behavior of the material and of the cure kinetics. A nonisothermal vulcanization model with nonisothermal induction time is used in this numerical study. Numerical results are obtained for the curing of a natural rubber (NR) thick-section part. A controlled gradient of the state of cure through the part thickness is obtained by a curing process that consists not only of a mold heating phase, but also of a forced-convection mold cooling phase, in order to stop the vulcanization process and to control the vulcanization distribution. The mold design that allows this control is described. In the heating phase, the state of cure is mainly controlled by the chemical kinetics (the induction time), but in the cooling phase, it is the heat diffusion that controls the state of cure distribution. A comparison among different cooling conditions is shown, and good control of the state of cure gradient is obtained.
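
    A minimal sketch of the ingredients of such a simulation, under invented kinetics: a scalar cure ODE integrated after a temperature-dependent induction period, with a heating-then-forced-cooling mold temperature profile that freezes the cure. The constants and the profile are illustrative, not the paper's identified NR parameters.

        import math

        def temp(t):
            """Illustrative mold temperature profile (K): heating, then forced cooling."""
            return 433.0 if t < 600.0 else max(293.0, 433.0 - 0.2 * (t - 600.0))

        def k(T):
            """Illustrative Arrhenius cure rate constant (1/s)."""
            return 2.0e5 * math.exp(-8000.0 / T)

        def t_induction(T):
            """Illustrative induction time (s) at temperature T."""
            return 1.0e-4 * math.exp(5000.0 / T)

        dt, t, alpha, induction = 0.1, 0.0, 0.0, 0.0
        while t < 1800.0:
            if induction < 1.0:
                induction += dt / t_induction(temp(t))            # nonisothermal induction rule
            else:
                alpha += dt * k(temp(t)) * (1.0 - alpha) ** 1.5   # explicit Euler cure step
            t += dt
        print(f"final state of cure: {alpha:.3f}")  # cooling freezes the reaction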

  18. Reference Models for Structural Technology Assessment and Weight Estimation

    NASA Technical Reports Server (NTRS)

    Cerro, Jeff; Martinovic, Zoran; Eldred, Lloyd

    2005-01-01

    Previously, the Exploration Concepts Branch of NASA Langley Research Center developed techniques for automating the preliminary-design level of launch vehicle airframe structural analysis for purposes of enhancing historical regression-based mass estimating relationships. This past work was useful and greatly reduced design time; however, its application area was very narrow in terms of handling a large variety of structural and vehicle general arrangement alternatives. Implementation of the analysis approach presented herein also incorporates some newly developed computer programs. Loft is a program developed to create analysis meshes and simultaneously define structural element design regions. A simple component-defining ASCII file is read by Loft to begin the design process. HSLoad is a Visual Basic implementation of the HyperSizer Application Programming Interface, which automates the structural element design process. Details of these two programs and their use are explained in this paper. A feature which falls naturally out of the above analysis paradigm is the concept of "reference models". The flexibility of the FEA-based JAVA processing procedures and associated process control classes, coupled with the general utility of Loft and HSLoad, makes it possible to create generic program template files for analysis of components ranging from something as simple as a stiffened flat panel, to curved panels, fuselage and cryogenic tank components, flight control surfaces, and wings, through full air and space vehicle general arrangements.

  19. Fluorescence analysis of ubiquinone and its application in quality control of medical supplies

    NASA Astrophysics Data System (ADS)

    Timofeeva, Elvira O.; Gorbunova, Elena V.; Chertov, Aleksandr N.

    2017-02-01

    The presence of antioxidant issues such as redox potential imbalance in the human body is a very important question for modern clinical diagnostics. The implementation of fluorescence analysis in the optical diagnostics of ubiquinone, an antioxidant widely distributed in the human body, is one of the steps toward developing a device for the clinical diagnostics of redox potential. Recording of fluorescence was carried out with a spectrometer, using a narrow-band UV irradiation source (maxima at 287 and 330 nm) as the excitation radiation. Concentrations of ubiquinone from 0.25 to 2.5 mmol/l in the explored samples were used for the investigation. The recorded data were processed using correlation analysis and a differential analytical technique. The fourth derivative of the fluorescence spectrum provided the basis for a multicomponent analysis of the solutions. As a technique for clinical diagnostics, fluorescence analysis with a processing method that includes differential spectrophotometry is a step forward towards redox potential calculation and towards quality control in pharmacy for better health care.
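
    Fourth-derivative spectra of this kind are commonly obtained with a Savitzky-Golay filter. The sketch below, assuming numpy/scipy and a synthetic two-band spectrum rather than real fluorescence data, shows how the derivative sharpens overlapping bands so their maxima can be located; the window and threshold values are illustrative.

        import numpy as np
        from scipy.signal import find_peaks, savgol_filter

        # Synthetic fluorescence spectrum: two overlapping Gaussian bands.
        wl = np.linspace(300.0, 500.0, 801)        # wavelength grid, 0.25 nm step
        spectrum = (1.0 * np.exp(-((wl - 380.0) / 18.0) ** 2)
                    + 0.6 * np.exp(-((wl - 410.0) / 14.0) ** 2))

        # Fourth-derivative spectrum via Savitzky-Golay; overlapping band maxima
        # reappear as sharpened positive peaks.
        d4 = savgol_filter(spectrum, window_length=51, polyorder=6, deriv=4,
                           delta=wl[1] - wl[0])
        idx, _ = find_peaks(d4, height=0.3 * d4.max())
        print("estimated band maxima (nm):", wl[idx])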

  20. Investigation on sense of control parameters for joystick interface in remote operated container crane application

    NASA Astrophysics Data System (ADS)

    Abdullah, U. N. N.; Handroos, H.

    2017-09-01

    Introduction: This paper presents a study of sense-of-control parameters to address the lack of direct motion feeling through the remote operated container crane station (ROCCS) joystick interface. The investigation of these parameters is important for developing the engineering parameters related to the sense-of-control goal in the next design process. Methodology: Structured interviews and observations were conducted to obtain user experience data from thirteen remote container crane operators at two international terminals. Interview analysis, task analysis, activity analysis and timeline analysis were then conducted to compare and contrast the results from the interviews and the observations. Results: Four experience parameters were identified to support the sense-of-control goal in the later design improvement of the ROCCS joystick interface. The significance of difficulty of control, unsynchronized movements, facilitation of control, and decision making in unexpected situations as parameters of the sense-of-control goal was validated by feedback from the operators as well as by the analysis. Contribution: This study provides feedback directly from end users towards developing a sustainable control interface for ROCCS in particular and remote operated off-road vehicles in general.

  1. Spectral Quantitation Of Hydroponic Nutrients

    NASA Technical Reports Server (NTRS)

    Schlager, Kenneth J.; Kahle, Scott J.; Wilson, Monica A.; Boehlen, Michelle

    1996-01-01

    Instrument continuously monitors hydroponic solution by use of absorption and emission spectrometry to determine concentrations of principal nutrients, including nitrate, iron, potassium, calcium, magnesium, phosphorus, sodium, and others. Does not depend on extraction and processing of samples, use of such surrogate parameters as pH or electrical conductivity for control, or addition of analytical reagents to solution. Solution not chemically altered by analysis and can be returned to hydroponic process stream after analysis.

  2. Computational analysis for biodegradation of exogenously depolymerizable polymer

    NASA Astrophysics Data System (ADS)

    Watanabe, M.; Kawai, F.

    2018-03-01

    This study shows that microbial growth and decay in a biodegradation process of exogenously depolymerizable polymer are controlled by the consumption of monomer units. Experimental outcomes for residual polymer were incorporated into an inverse analysis for the degradation rate. The Gauss-Newton method was applied to an inverse problem for two parameter values associated with the microbial population. A biodegradation process of polyethylene glycol was analyzed numerically, and numerical outcomes were obtained.
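
    A minimal sketch of the Gauss-Newton step for a two-parameter inverse problem, using an invented first-order decay model and made-up residual-polymer data rather than the study's model and measurements:

        import numpy as np

        # Invented residual-polymer observations: time (days) and remaining amount.
        t = np.array([0.0, 1.0, 2.0, 4.0, 7.0])
        w = np.array([10.0, 7.8, 6.1, 3.7, 1.8])

        theta = np.array([8.0, 0.1])                # initial guess (a, k)
        for _ in range(20):
            a, k = theta
            model = a * np.exp(-k * t)
            residuals = w - model
            J = np.column_stack([np.exp(-k * t),              # d(model)/da
                                 -a * t * np.exp(-k * t)])    # d(model)/dk
            step, *_ = np.linalg.lstsq(J, residuals, rcond=None)  # Gauss-Newton step
            theta = theta + step
        print(f"a = {theta[0]:.3f}, k = {theta[1]:.3f} per day")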

  3. Analysis of Contracting Processes, Internal Controls, and Procurement Fraud Schemes

    DTIC Science & Technology

    2013-06-01

    Government Accountability Office (GAO) has kept DoD Contract Management on its High Risk Series list since 1992 (GAO, 2009). In addition, the GAO also...adopted from the Committee of Sponsoring Organizations of the Treadway Commission (COSO) (1992). There are five components: control environment, risk... (2001). 2. Risk Assessment Risk assessment is the identification, analysis, and management of risk faced by an organization. It entails identifying

  4. Codimension-Two Bifurcation Analysis in DC Microgrids Under Droop Control

    NASA Astrophysics Data System (ADS)

    Lenz, Eduardo; Pagano, Daniel J.; Tahim, André P. N.

    This paper addresses local and global bifurcations that may appear in electrical power systems such as DC microgrids, which have recently attracted interest from the electrical engineering community. Most sources in these networks are voltage-type and operate in parallel. In such a configuration, the basic technique for stabilizing the bus voltage is the so-called droop control. The main contribution of this work is a codimension-two bifurcation analysis of a small DC microgrid, considering the droop control gain and the power processed by the load as bifurcation parameters. The codimension-two bifurcation set leads to practical rules for achieving a robust droop control design. Moreover, the bifurcation analysis also offers a better understanding of the dynamics involved in the problem and of how to avoid possible instabilities. Simulation results are presented to illustrate the bifurcation analysis.
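
    The two-parameter idea can be made concrete with a textbook averaged model: one droop-controlled source feeding a constant-power load through an LC filter. The sketch below sweeps the same two parameters used in the paper, droop gain and load power, and classifies each pair by the sign of the leading Jacobian eigenvalue; the model and all numbers are generic illustrations, not the paper's microgrid.

```python
# Minimal sketch: stability screening over (droop gain k, load power P).
# Averaged model:  L di/dt = V0 - k*i - v,   C dv/dt = i - P/v  (CPL load).
import numpy as np

V0, L, C = 48.0, 1e-3, 1e-3       # setpoint (V), filter inductance (H), cap (F)

def leading_real_eig(k, P):
    # Equilibrium: V0 - k*i = v with i = P/v  =>  v^2 - V0*v + k*P = 0.
    disc = V0**2 - 4.0 * k * P
    if disc <= 0.0:
        return np.inf              # no equilibrium (fold boundary crossed)
    v = (V0 + np.sqrt(disc)) / 2.0
    J = np.array([[-k / L, -1.0 / L],
                  [1.0 / C, P / (C * v**2)]])      # CPL term destabilizes
    return np.linalg.eigvals(J).real.max()

for k in (0.1, 0.5, 1.0, 2.0):                     # droop gains (ohm)
    for P in (50.0, 200.0, 500.0):                 # load powers (W)
        verdict = "stable" if leading_real_eig(k, P) < 0.0 else "unstable"
        print(f"k={k:3.1f} ohm  P={P:5.0f} W  ->  {verdict}")
```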

  5. Development of control system of coating of rod hydraulic cylinders

    NASA Astrophysics Data System (ADS)

    Aizhambaeva, S. Zh; Maximova, A. V.

    2018-01-01

    This article considers the requirements on materials of hydraulic cylinder rods and methods of eliminating the main factors affecting the quality of coatings applied to them. The chromium plating process is one way of increasing the anti-friction properties of rod coatings and their resistance to wear and corrosion. The article describes how the high-speed chromium plating process differs from other types of chromium plating, leading to the conclusion that it cuts the plating time. An analysis of the technological equipment suggested modernizing the high-speed chromium plating process through automation and mechanization. A control system was developed by designing a schematic block diagram of the modernized high-speed chromium plating process.

  6. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, their frequency, and their detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process at our blood center. The data analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; the technical skills of operators and the manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force, and implementation of a cryopreservation management module was started. Copyright 2010 Elsevier Inc. All rights reserved.
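
    The RPN arithmetic described above is compact enough to sketch. The failure modes and 1-10 scores below are hypothetical placeholders, not the article's data; only the ranking mechanics are shown.

```python
# Minimal sketch: rank failure modes by Risk Priority Number, RPN = O*S*D.
failure_modes = [
    # (description, occurrence O, severity S, detectability D), all 1-10;
    # a higher D means the failure is harder to detect.
    ("loss of dose during transfer",  4, 9, 6),
    ("manual transcription error",    6, 7, 7),
    ("label mismatch at cryostorage", 2, 9, 3),
    ("operator skill gap",            5, 6, 8),
]

for desc, o, s, d in sorted(failure_modes,
                            key=lambda m: m[1] * m[2] * m[3], reverse=True):
    print(f"RPN {o * s * d:4d}  {desc}")
```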

  7. One Way of Testing a Distributed Processor

    NASA Technical Reports Server (NTRS)

    Edstrom, R.; Kleckner, D.

    1982-01-01

    Launch processing for Space Shuttle is checked out, controlled, and monitored with new system. Entire system can be exercised by two computer programs--one in master console and other in each of operations consoles. Control program in each operations console detects change in status and begins task initiation. All of front-end processors are exercised from consoles through common data buffer, and all data are logged to processed-data recorder for posttest analysis.

  8. Emerging non-invasive Raman methods in process control and forensic applications.

    PubMed

    Macleod, Neil A; Matousek, Pavel

    2008-10-01

    This article reviews emerging Raman techniques (Spatially Offset and Transmission Raman Spectroscopy) for non-invasive, sub-surface probing in process control and forensic applications. New capabilities offered by these methods are discussed and several application examples are given including the non-invasive detection of counterfeit drugs through blister packs and opaque plastic bottles and the rapid quantitative analysis of the bulk content of pharmaceutical tablets and capsules without sub-sampling.

  9. Depth Cue Integration in an Active Control Paradigm

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Sweet, Barbara T.; Shafto, Meredith; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    Numerous models of depth cue integration have been proposed. Of particular interest is how the visual system processes discrepant cues, as might arise when viewing synthetic displays. A powerful paradigm for examining this integration process can be adapted from manual control research. This methodology introduces independent disturbances into the candidate cues, then performs spectral analysis of subjects' resulting motoric responses (e.g., depth matching). We will describe this technique and present initial findings.

  10. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 2

    NASA Technical Reports Server (NTRS)

    Krishen, Kumar (Compiler)

    1994-01-01

    This is the second volume of papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies.

  11. Quality-assurance plan for the analysis of fluvial sediment by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory

    USGS Publications Warehouse

    Shreve, Elizabeth A.; Downs, Aimee C.

    2005-01-01

    This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory, from receipt of the sediment sample, through the analytical process, to compiling the results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and establishing the validity of the sediment-analysis results also are described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.

  12. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.

    Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions as well as the best standard image processing methods available, but with better control over spurious fragments in the image.
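
    The core quantifier is simple to sketch. Below, a per-patch recurrence rate (the fraction of pixel pairs whose gray levels differ by less than a threshold) is computed over a synthetic image containing one rougher block; the threshold, patch size and test image are invented, and the paper's quantifiers go well beyond this single measure.

```python
# Minimal sketch: spatial recurrence rate per image patch.
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(0.5, 0.05, (128, 128))
img[48:80, 48:80] += rng.normal(0.0, 0.20, (32, 32))   # "lesion": rougher texture

def recurrence_rate(patch, eps=0.05):
    x = patch.ravel()
    d = np.abs(x[:, None] - x[None, :])   # pairwise gray-level distances
    return float((d < eps).mean())        # fraction of recurrent pairs

w = 16
rr = np.array([[recurrence_rate(img[i:i + w, j:j + w])
                for j in range(0, img.shape[1], w)]
               for i in range(0, img.shape[0], w)])
print(np.round(rr, 2))   # the rough central block shows a lower recurrence rate
```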

  13. Modeling of feed-forward control using the partial least squares regression method in the tablet compression process.

    PubMed

    Hattori, Yusuke; Otsuka, Makoto

    2017-05-30

    In the pharmaceutical industry, the implementation of continuous manufacturing has been widely promoted in lieu of the traditional batch manufacturing approach. More specifically, in recent years, the innovative concept of feed-forward control has been introduced in relation to process analytical technology. In the present study, we successfully developed a feed-forward control model for the tablet compression process by integrating data obtained from near-infrared (NIR) spectra and the physical properties of granules. In the pharmaceutical industry, batch manufacturing routinely allows for the preparation of granules with the desired properties through the manual control of process parameters. Continuous manufacturing, on the other hand, demands the automatic determination of these process parameters. Here, we proposed the development of a control model using the partial least squares regression (PLSR) method. The most significant feature of this method is the use of a dataset integrating both the NIR spectra and the physical properties of the granules. Using our model, we determined that the properties of products, such as tablet weight and thickness, need to be included as independent variables in the PLSR analysis in order to predict unknown process parameters. Copyright © 2017 Elsevier B.V. All rights reserved.
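
    The modeling step lends itself to a brief sketch: a partial least squares regression fitted on a predictor block that concatenates NIR spectra, granule properties and target tablet attributes, with a process parameter as the response. All data and variable names below are synthetic placeholders, not the authors' dataset.

```python
# Minimal sketch: PLSR on an integrated block of spectra + granule + tablet data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n = 60
nir     = rng.normal(size=(n, 100))   # NIR spectra (100 wavelengths)
granule = rng.normal(size=(n, 3))     # e.g. particle size, density, moisture
tablet  = rng.normal(size=(n, 2))     # target tablet weight and thickness
X = np.hstack([nir, granule, tablet]) # integrated predictor block

# Response: a process parameter (e.g. compression force), with a known
# linear ground truth so the fit is verifiable.
y = 2.0 * granule[:, 0] - 1.0 * tablet[:, 1] + rng.normal(0, 0.1, n)

pls = PLSRegression(n_components=5).fit(X, y)
print("training R^2:", round(pls.score(X, y), 3))
```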

  14. Optimization Control of the Color-Coating Production Process for Model Uncertainty

    PubMed Central

    He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong

    2016-01-01

    Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results. PMID:27247563

  16. Food safety and nutritional quality for the prevention of non communicable diseases: the Nutrient, hazard Analysis and Critical Control Point process (NACCP).

    PubMed

    Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino

    2015-04-23

    The important role of food and nutrition in public health is being increasingly recognized as crucial for its potential impact on health-related quality of life and the economy, both at the societal and individual levels. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with Directive 43/93/CEE, later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach for food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP is based on four general principles: i) guarantee of health maintenance; ii) evaluation and assurance of the nutritional quality of food and TQM; iii) correct information to the consumers; iv) an ethical profit. There are three stages for the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points, which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate levels of nutrients; 4) establishment and implementation of effective monitoring procedures for the critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claim Regulation EU 1924/2006; 10) start of a training program. We calculate the risk assessment as follows: Risk (R) = probability (P) × damage (D). The NACCP process considers the entire food supply chain "from farm to consumer"; at each point of the chain it is necessary to implement tight monitoring in order to guarantee optimal nutritional quality.

  17. Face and body perception in schizophrenia: a configural processing deficit?

    PubMed

    Soria Bauser, Denise; Thoma, Patrizia; Aizenberg, Victoria; Brüne, Martin; Juckel, Georg; Daum, Irene

    2012-01-30

    Face and body perception rely on common processing mechanisms and activate similar but not identical brain networks. Patients with schizophrenia show impaired face perception, and the present study addressed for the first time body perception in this group. Seventeen patients diagnosed with schizophrenia or schizoaffective disorder were compared to 17 healthy controls on standardized tests assessing basic face perception skills (identity discrimination, memory for faces, recognition of facial affect). A matching-to-sample task including emotional and neutral faces, bodies and cars either in an upright or in an inverted position was administered to assess potential category-specific performance deficits and impairments of configural processing. Relative to healthy controls, schizophrenia patients showed poorer performance on the tasks assessing face perception skills. In the matching-to-sample task, they also responded more slowly and less accurately than controls, regardless of the stimulus category. Accuracy analysis showed significant inversion effects for faces and bodies across groups, reflecting configural processing mechanisms; however reaction time analysis indicated evidence of reduced inversion effects regardless of category in schizophrenia patients. The magnitude of the inversion effects was not related to clinical symptoms. Overall, the data point towards reduced configural processing, not only for faces but also for bodies and cars in individuals with schizophrenia. © 2011 Elsevier Ltd. All rights reserved.

  18. The interaction between practice and performance pressure on the planning and control of fast target directed movement.

    PubMed

    Allsop, Jonathan E; Lawrence, Gavin P; Gray, Robert; Khan, Michael A

    2017-09-01

    Pressure to perform often results in decrements to both outcome accuracy and the kinematics of motor skills. Furthermore, this pressure-performance relationship is moderated by the amount of accumulated practice or the experience of the performer. However, the interactive effects of performance pressure and practice on the underlying processes of motor skills are far from clear. Movement execution involves both an offline pre-planning process and an online control process. The present experiment aimed to investigate the interaction between pressure and practice on these two motor control processes. Two groups of participants (control and pressure; N = 12 and 12, respectively) practiced a video aiming amplitude task and were transferred to either a non-pressure (control group) or a pressure condition (pressure group) both early and late in practice. Results revealed similar accuracy and movement kinematics between the control and pressure groups at early transfer. However, at late transfer, the introduction of pressure was associated with increased performance compared to control conditions. Analysis of kinematic variability throughout the movement suggested that the performance increase was due to participants adopting strategies to improve movement planning in response to pressure reducing the effectiveness of the online control system.

  19. [Establishment and application of "multi-dimensional structure and process dynamic quality control technology system" in preparation products of traditional Chinese medicine (I)].

    PubMed

    Gu, Jun-Fei; Feng, Liang; Zhang, Ming-Hua; Wu, Chan; Jia, Xiao-Bin

    2013-11-01

    Safety is an important component of the quality control of traditional Chinese medicine (TCM) preparation products, as well as an important guarantee for their clinical application. Currently, the quality control of TCMs in the Chinese Pharmacopoeia mostly focuses on compounds indicative of TCM efficacy. TCM preparation involves multiple links, from raw materials to products, and each procedure may have an impact on the safety of the preparation. We summarize and analyze the factors affecting safety during the preparation of TCM products, and then expound the important role of the "multi-dimensional structure and process dynamic quality control technology system" in the quality and safety of TCM preparations. Because the product quality of TCM preparations is closely related to safety, control over the safety-related material basis is an important component of the product quality control of TCM preparations. Implementing quality control over the dynamic process of TCM preparation, from raw materials to products, and improving TCM quality and safety control at the microcosmic level help lay a firm foundation for the modernization of TCM preparations.

  20. Impact of Active Climate Control Seats on Energy Use, Fuel Use, and CO2 Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreutzer, Cory J; Rugh, John P; Titov, Eugene V

    A project was developed through collaboration between Gentherm and NREL to determine the impact of climate control seats for light-duty vehicles in the United States. The project used a combination of experimentation and analysis, with experimental results providing critical input to the analysis process. First, outdoor stationary vehicle testing was performed at NREL's facility in Golden, CO using multiple occupants. Two pre-production Ford Focus electric vehicles were used for testing; one containing a standard inactive seat and the second vehicle containing a Gentherm climate control seat. Multiple maximum cool-down and steady-state cooling tests were performed in late summer conditions. The two vehicles were used to determine the increase in cabin temperature when using the climate control seat in comparison to the baseline vehicle cabin temperature with a standard seat at the equivalent occupant whole-body sensation. The experiments estimated that on average, the climate control seats allowed for a 2.61 degrees Celsius increase in vehicle cabin temperature at equivalent occupant body sensation compared to the baseline vehicle. The increased cabin air temperature along with their measured energy usage were then used as inputs to the national analysis process. The national analysis process was constructed from full vehicle cabin, HVAC, and propulsion models previously developed by NREL. In addition, three representative vehicle platforms, vehicle usage patterns, and vehicle registration weighted environmental data were integrated into the analysis process. Both the baseline vehicle and the vehicle with climate control seats were simulated, using the experimentally determined cabin temperature offset of 2.61 degrees Celsius and added seat energy as inputs to the climate control seat vehicle model. The U.S. composite annual fuel use savings for the climate control seats over the baseline A/C system was determined to be 5.1 gallons of gasoline per year per vehicle, corresponding to 4.0 grams of CO2/mile savings. Finally, the potential impact of 100 percent adoption of climate control seats on U.S. light-duty fleet A/C fuel use was calculated to be 1.3 billion gallons of gasoline annually, with a corresponding CO2 emissions reduction of 12.7 million tons. Direct comparison of the impact of the CCS to the ventilated seat off-cycle credit was not possible because the NREL analysis calculated a combined car/truck savings and the baseline A/C CO2 emissions were higher than EPA's. To enable comparison, the CCS national A/C CO2 emissions were split into car/truck components and the ventilated seat credit was scaled up. The split CO2 emissions savings due to the CCS were 3.5 g/mi for a car and 4.4 g/mi for a truck. The CCS saved an additional 2.0 g/mi and 2.5 g/mi over the adjusted ventilated seat credit for a car and truck, respectively.

  1. A method for identifying EMI critical circuits during development of a large C3

    NASA Astrophysics Data System (ADS)

    Barr, Douglas H.

    The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.

  2. Independent Orbiter Assessment (IOA): Analysis of the atmospheric revitalization pressure control subsystem

    NASA Technical Reports Server (NTRS)

    Saiidi, M. J.; Duffy, R. E.; Mclaughlin, T. D.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis/Critical Items List (FMEA/CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The independent analysis results corresponding to the Orbiter Atmospheric Revitalization and Pressure Control Subsystem (ARPCS) are documented. The ARPCS hardware was categorized into the following subdivisions: (1) Atmospheric Make-up and Control (including the Auxiliary Oxygen Assembly, Oxygen Assembly, and Nitrogen Assembly); and (2) Atmospheric Vent and Control (including the Positive Relief Vent Assembly, Negative Relief Vent Assembly, and Cabin Vent Assembly). The IOA analysis process utilized available ARPCS hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.I. Rudyka; Y.E. Zingerman; K.G. Lavrov

    Up-to-date mathematical methods, such as correlation analysis and expert systems, are employed in creating a model of the coking process. Automatic coking-control systems developed by Giprokoks rule out human error. At an existing coke battery, after introducing automatic control, the heating-gas consumption is reduced by ≥5%.

  4. Evaluation of the Social Motivation Hypothesis of Autism: A Systematic Review and Meta-analysis.

    PubMed

    Clements, Caitlin C; Zoltowski, Alisa R; Yankowitz, Lisa D; Yerys, Benjamin E; Schultz, Robert T; Herrington, John D

    2018-06-13

    The social motivation hypothesis posits that individuals with autism spectrum disorder (ASD) find social stimuli less rewarding than do neurotypical individuals. However, functional magnetic resonance imaging (fMRI) studies of reward processing have yielded mixed results. The objective was to examine whether individuals with ASD process rewarding stimuli differently than typically developing individuals (controls), whether differences are limited to social rewards, and whether contradictory findings in the literature might be due to sample characteristics. Articles were identified in PubMed, Embase, and PsycINFO from database inception until June 1, 2017. Functional MRI data from these articles were provided by most authors. Publications were included that provided brain activation contrasts between a sample with ASD and controls on a reward task, determined by multiple reviewer consensus. When fMRI data were not provided by authors, multiple reviewers extracted peak coordinates and effect sizes from articles to recreate statistical maps using seed-based d mapping software. Random-effects meta-analyses of responses to social, nonsocial, and restricted interest stimuli, as well as all of these domains together, were performed. Secondary analyses included meta-analyses of wanting and liking, meta-regression with age, and correlations with ASD severity. All procedures were conducted in accordance with Meta-analysis of Observational Studies in Epidemiology guidelines. The main outcome was brain activation differences between groups with ASD and typically developing controls while processing rewards. All analyses except the domain-general meta-analysis were planned before data collection. The meta-analysis included 13 studies (30 total fMRI contrasts) from 259 individuals with ASD and 246 controls. Autism spectrum disorder was associated with aberrant processing of both social and nonsocial rewards in striatal regions and increased activation in response to restricted interests (social reward, caudate cluster: d = -0.25 [95% CI, -0.41 to -0.08]; nonsocial reward, caudate and anterior cingulate cluster: d = -0.22 [95% CI, -0.42 to -0.02]; restricted interests, caudate and nucleus accumbens cluster: d = 0.42 [95% CI, 0.07 to 0.78]). Individuals with ASD show atypical processing of social and nonsocial rewards. Findings support a broader interpretation of the social motivation hypothesis of ASD whereby general atypical reward processing encompasses social reward, nonsocial reward, and perhaps restricted interests. This meta-analysis also suggests that prior mixed results could be driven by sample age differences, warranting further study of the developmental trajectory for reward processing in ASD.

  5. The effectiveness of science domain-based science learning integrated with local potency

    NASA Astrophysics Data System (ADS)

    Kurniawati, Arifah Putri; Prasetyo, Zuhdan Kun; Wilujeng, Insih; Suryadarma, I. Gusti Putu

    2017-08-01

    This research aimed to determine the significant effect of science domain-based science learning integrated with local potency on science process skills. The research method used was a quasi-experimental design with a nonequivalent control group design. The population of this research was all students of class VII SMP Negeri 1 Muntilan. The sample of this research was selected through cluster random sampling, namely class VII B as the experiment class (24 students) and class VII C as the control class (24 students). This research used a test instrument that was adapted from Agus Dwianto's research. The aspects of science process skills in this research were observation, classification, interpretation and communication. The data were analyzed with a one-factor ANOVA at the 0.05 significance level and the normalized gain score. The one-factor ANOVA on science process skills yielded a significance level of 0.000, which is less than alpha (0.05). This means that science domain-based science learning integrated with local potency had a significant effect on science process skills. The analysis shows that the normalized gain scores are 0.29 (low category) in the control class and 0.67 (medium category) in the experiment class.
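
    For readers unfamiliar with the two statistics reported, the sketch below runs a one-factor ANOVA on post-test scores and computes the normalized gain <g> = (post - pre) / (100 - pre); the scores are invented and far fewer than the study's 24 students per class.

```python
# Minimal sketch: one-factor ANOVA plus normalized gain scores.
import numpy as np
from scipy.stats import f_oneway

pre_exp,  post_exp  = np.array([40.0, 45, 50, 38]), np.array([80.0, 82, 85, 76])
pre_ctrl, post_ctrl = np.array([42.0, 44, 48, 40]), np.array([58.0, 60, 63, 57])

F, p = f_oneway(post_exp, post_ctrl)          # compare post-test means

def gain(pre, post):                          # Hake's normalized gain
    return float(((post - pre) / (100.0 - pre)).mean())

print(f"F = {F:.2f}, p = {p:.4f}")
print("normalized gain, experiment:", round(gain(pre_exp, post_exp), 2))
print("normalized gain, control:   ", round(gain(pre_ctrl, post_ctrl), 2))
```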

  6. Comparative genome analysis of the candidate functional starter culture strains Lactobacillus fermentum 222 and Lactobacillus plantarum 80 for controlled cocoa bean fermentation processes.

    PubMed

    Illeghems, Koen; De Vuyst, Luc; Weckx, Stefan

    2015-10-12

    Lactobacillus fermentum 222 and Lactobacillus plantarum 80, isolates from a spontaneous Ghanaian cocoa bean fermentation process, proved to be interesting functional starter culture strains for cocoa bean fermentations. Lactobacillus fermentum 222 is a thermotolerant strain, able to dominate the fermentation process, thereby converting citrate and producing mannitol. Lactobacillus plantarum 80 is an acid-tolerant and facultatively heterofermentative strain that is competitive during cocoa bean fermentation processes. In this study, whole-genome sequencing and comparative genome analysis were used to investigate the mechanisms by which these strains dominate the cocoa bean fermentation process. Through functional annotation and analysis of the high-coverage contigs obtained through 454 pyrosequencing, plantaricin production was predicted for L. plantarum 80. For L. fermentum 222, genes encoding a complete arginine deiminase pathway were identified. Further, in-depth functional analysis revealed the capacities of these strains associated with carbohydrate and amino acid metabolism, such as the ability to use alternative external electron acceptors, the presence of an extended pyruvate metabolism, and the occurrence of several amino acid conversion pathways. A comparative genome sequence analysis using publicly available genome sequences of strains of the species L. plantarum and L. fermentum revealed unique features of both strains studied. Indeed, L. fermentum 222 possessed genes encoding additional citrate transporters and enzymes involved in amino acid conversions, whereas L. plantarum 80 is the only member of this species that harboured a gene cluster involved in the uptake and consumption of fructose and/or sorbose. In-depth genome sequence analysis of the candidate functional starter culture strains L. fermentum 222 and L. plantarum 80 revealed their metabolic capacities, niche adaptations and functionalities that enable them to dominate the cocoa bean fermentation process. Further, these results offered insights into the cocoa bean fermentation ecosystem as a whole and will facilitate the selection of appropriate starter culture strains for controlled cocoa bean fermentation processes.

  7. An Analysis of Current Inventory Control and Practices to Provide Recommendations on How to Transfer Interim Supply Support From NAVSEA to NAVSUP and Then Manage That Material

    DTIC Science & Technology

    2011-06-01

    A more efficient ISS process would result in lower excess material and financial savings. The data set analyzed contained ISS purchased by both NAVAIR and NAVSEA as well as...demand data. Additionally, analysis of current ISS processes at NAVAIR and NAVSEA was conducted. The research resulted in finding that NAVAIR and...eventually categorized as excess material.

  8. Intelligent control system for continuous technological process of alkylation

    NASA Astrophysics Data System (ADS)

    Gebel, E. S.; Hakimov, R. A.

    2018-01-01

    The relevance of intelligent control for complex dynamic objects and processes is shown in this paper. A model of a virtual analyzer based on a neural network is proposed. Comparative analysis of mathematical models implemented in MATLAB software showed that the most effective model, from the point of view of reproducibility of the result, is the one with seven neurons in the hidden layer, trained using the scaled conjugate gradient method. Comparison of the data from the laboratory analysis and the theoretical model showed that the root-mean-square error does not exceed 3.5, and the calculated value of the correlation coefficient corresponds to a "strong" connection between the values.
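
    A virtual analyzer of the kind described reduces to a small regression network. The sketch below uses a single hidden layer of seven neurons, as in the paper; one substitution is flagged explicitly: scikit-learn does not ship a scaled conjugate gradient trainer, so L-BFGS stands in, and all data are synthetic stand-ins for the laboratory assays.

```python
# Minimal sketch: neural-network virtual analyzer with 7 hidden neurons.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))     # process measurements (T, P, flows, ...)
coefs = np.array([0.8, -0.5, 0.3, 0.1, 0.6])
y = np.tanh(X @ coefs) + rng.normal(0, 0.05, 200)   # lab-measured quality

model = MLPRegressor(hidden_layer_sizes=(7,), solver="lbfgs",
                     max_iter=2000, random_state=0).fit(X, y)

pred = model.predict(X)
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
corr = float(np.corrcoef(pred, y)[0, 1])
print(f"RMSE = {rmse:.3f}, correlation = {corr:.3f}")
```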

  9. An integrated exhaust gas analysis system with self-contained data processing and automatic calibration

    NASA Technical Reports Server (NTRS)

    Anderson, R. C.; Summers, R. L.

    1981-01-01

    An integrated gas analysis system designed to operate in automatic, semiautomatic, and manual modes from a remote control panel is described. The system measures carbon monoxide, oxygen, water vapor, total hydrocarbons, carbon dioxide, and oxides of nitrogen. A pull-through design provides increased reliability and eliminates the need for manual flow rate adjustment and pressure correction. The system contains two microprocessors to range the analyzers, calibrate the system, process the raw data into units of concentration, and provide information to the facility research computer and to the operator through a terminal and the control panels. After initial setup, the system operates for several hours without significant operator attention.

  10. Rapid evaluation and quality control of next generation sequencing data with FaQCs.

    PubMed

    Lo, Chien-Chi; Chain, Patrick S G

    2014-11-19

    Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
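
    FaQCs itself is a full pipeline, but the primitive at its heart is easy to sketch. The function below is a hypothetical 3'-end quality trimmer over Phred+33 scores, written only to illustrate the concept; it is not FaQCs code, and real tools add sliding-window statistics, adapter and PhiX filtering, and parallel I/O on top.

```python
# Minimal sketch: trim low-quality bases from the 3' end of a read.
def trim_3prime(seq, quals, min_q=20):
    """Drop trailing bases whose Phred score is below min_q."""
    end = len(seq)
    while end > 0 and quals[end - 1] < min_q:
        end -= 1
    return seq[:end], quals[:end]

# Phred+33 encoded quality string, as found in FASTQ files.
seq = "ACGTACGTAACC"
quals = [ord(c) - 33 for c in "IIIIIIIH##!!"]   # 8 good bases, then a bad tail
print(trim_3prime(seq, quals))                  # -> ('ACGTACGT', [40, ..., 39])
```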

  11. Work Flow Analysis Report Consisting of Work Management - Preventive Maintenance - Materials and Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JENNINGS, T.L.

    The Work Flow Analysis Report will be used to facilitate the requirements for implementing the Work Control module of Passport. The report consists of workflow integration processes for Work Management, Preventive Maintenance, and Materials and Equipment.

  12. Thermal sensors to control polymer forming. Challenge and solutions

    NASA Astrophysics Data System (ADS)

    Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.

    2017-10-01

    Thermal sensors have been used for many years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions…) the thermal measurement is often performed in the tool or close to its surface. Thus, it only gives partial and disturbed information. Reliable information about the heat flux exchanged between the tool and the material during the process would be very helpful to improve control of the process and to favor the development of new materials. In this work, we present several sensors developed in labs to study the molding steps of forming processes. The analysis of the obtained thermal measurements (temperature, heat flux) shows the sensitivity threshold required for thermal sensors to detect the rate of the thermal reaction on-line. Based on these data, we present new sensor designs which have been patented.

  13. Near-infrared spectroscopy monitoring and control of the fluidized bed granulation and coating processes-A review.

    PubMed

    Liu, Ronghua; Li, Lian; Yin, Wenping; Xu, Dongbo; Zang, Hengchang

    2017-09-15

    The fluidized bed granulation and pellet coating technologies are widely used in the pharmaceutical industry, because particles made in a fluidized bed have good flowability and compressibility, and the coating thickness of pellets is homogeneous. With the popularization of process analytical technology (PAT), real-time analysis of critical quality attributes (CQA) has been getting more attention. Near-infrared (NIR) spectroscopy, as a PAT tool, can realize real-time monitoring and control during the granulating and coating processes, which can optimize the manufacturing processes. This article reviews the application of NIR spectroscopy in monitoring CQA (moisture content, particle size and tablet/pellet thickness) during fluidized bed granulation and coating processes. Through this review, we would like to provide references for realizing automated control and intelligent production in fluidized bed granulation and pellet coating in the pharmaceutical industry. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Field propagation-induced directionality of carrier-envelope phase-controlled photoemission from nanospheres

    PubMed Central

    Süßmann, F.; Seiffert, L.; Zherebtsov, S.; Mondes, V.; Stierle, J.; Arbeiter, M.; Plenge, J.; Rupp, P.; Peltz, C.; Kessel, A.; Trushin, S. A.; Ahn, B.; Kim, D.; Graf, C.; Rühl, E.; Kling, M. F.; Fennel, T.

    2015-01-01

    Near-fields of non-resonantly laser-excited nanostructures enable strong localization of ultrashort light fields and have opened novel routes to fundamentally modify and control electronic strong-field processes. Harnessing spatiotemporally tunable near-fields for the steering of sub-cycle electron dynamics may enable ultrafast optoelectronic devices and unprecedented control in the generation of attosecond electron and photon pulses. Here we utilize unsupported sub-wavelength dielectric nanospheres to generate near-fields with adjustable structure and study the resulting strong-field dynamics via photoelectron imaging. We demonstrate field propagation-induced tunability of the emission direction of fast recollision electrons up to a regime, where nonlinear charge interaction effects become dominant in the acceleration process. Our analysis supports that the timing of the recollision process remains controllable with attosecond resolution by the carrier-envelope phase, indicating the possibility to expand near-field-mediated control far into the realm of high-field phenomena. PMID:26264422

  16. Process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial.

    PubMed

    Knowlden, Adam P; Sharma, Manoj

    2014-09-01

    Family- and home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in the assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial. The trial was composed of two web-based, mother-centered interventions for the prevention of obesity in children between 4 and 6 years of age. The process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose received, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). The process evaluation found that both groups (experimental and control) were equivalent and that the interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.

  17. Uterus segmentation in dynamic MRI using LBP texture descriptors

    NASA Astrophysics Data System (ADS)

    Namias, R.; Bellemare, M.-E.; Rahim, M.; Pirró, N.

    2014-03-01

    Pelvic floor disorders cover pathologies whose physiopathology is not well understood, and cases become more prevalent with an ageing population. Within the context of a project aiming at modeling the dynamics of pelvic organs, we have developed an efficient segmentation process. It aims at relieving the radiologist of a tedious image-by-image analysis. From a first contour delineating the uterus-vagina set, the organ border is tracked along a dynamic MRI sequence. The process combines movement prediction, local intensity and texture analysis, and active contour geometry control. Movement prediction provides a contour initialization for the next image in the sequence. Intensity analysis provides image-based local contour detection, enhanced by local binary pattern (LBP) texture descriptors. Geometry control prohibits self-intersections and smoothes the contour. Results show the efficiency of the method on images produced in clinical routine.
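
    The texture component can be sketched briefly. Below, uniform local binary patterns are computed with scikit-image and summarized as a histogram, the kind of compact descriptor that can be compared on either side of a candidate organ border; the random image and the parameters are placeholders, not values tuned for pelvic MRI.

```python
# Minimal sketch: LBP histogram as a local texture descriptor.
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(3)
img = rng.random((64, 64))          # stand-in for one dynamic MRI slice

P, R = 8, 1                         # 8 neighbours sampled at radius 1
lbp = local_binary_pattern(img, P, R, method="uniform")

# "uniform" LBP yields P + 2 distinct codes; their normalized histogram
# is the texture signature compared across regions.
hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
print(np.round(hist, 3))
```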

  18. Red and Processed Meat Intake Is Associated with Higher Gastric Cancer Risk: A Meta-Analysis of Epidemiological Observational Studies

    PubMed Central

    Zhang, Chi; Zhu, Chen; Tao, Guangzhou; Zhao, Lianjun; Tang, Shaowen; Shu, Zheng; Cai, Jing; Dai, Shengbin; Qin, Qin; Xu, Liping; Cheng, Hongyan; Sun, Xinchen

    2013-01-01

    Background Red and processed meat was classified as a limited-suggestive risk factor for gastric cancer by the World Cancer Research Fund. However, recent epidemiological studies have yielded inconclusive results. Methods We searched Medline, EMBASE, and the Cochrane Library from their inception to April 2013 for both cohort and case-control studies which assessed the association between red and/or processed meat intake and gastric cancer risk. Study-specific relative risk estimates were pooled by random-effect or fixed-effect models. Results Twelve cohort and thirty case-control studies were included in the meta-analysis. Significant associations were found between both red (RR: 1.45, 95% CI: 1.22–1.73) and processed (RR: 1.45, 95% CI: 1.26–1.65) meat intake and gastric cancer risk generally. Positive associations were also found for beef (RR: 1.28, 95% CI: 1.04–1.57), bacon (RR: 1.37, 95% CI: 1.17–1.61), ham (RR: 1.44, 95% CI: 1.00–2.06), and sausage (RR: 1.33, 95% CI: 1.16–1.52). When stratified by study design, the association was significant in case-control studies (RR: 1.63, 95% CI: 1.33–1.99) but not in cohort studies (RR: 1.02, 95% CI: 0.90–1.17) for red meat. Increased relative risks were seen in high-quality, adenocarcinoma, cardia, and European-population studies for red meat. Most subgroup analyses confirmed the significant association between processed meat intake and gastric cancer risk. Conclusions Our findings indicate that consumption of red and/or processed meat contributes to increased gastric cancer risk. However, further investigation is needed to confirm the association, especially for red meat. PMID:23967140
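
    The pooling step reported in the Methods can be illustrated with the standard DerSimonian-Laird random-effects estimator; the five study results below are invented, not the meta-analysis' data.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of relative risks.
import numpy as np

rr = np.array([1.45, 1.20, 1.80, 0.95, 1.50])   # study-specific relative risks
se = np.array([0.15, 0.20, 0.25, 0.18, 0.22])   # standard errors of log(RR)

y, v = np.log(rr), se**2
w = 1.0 / v                                      # fixed-effect weights
ybar = (w * y).sum() / w.sum()
Q = (w * (y - ybar) ** 2).sum()                  # Cochran's Q heterogeneity
tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1.0 / (v + tau2)                          # random-effects weights
est = (w_re * y).sum() / w_re.sum()
se_re = np.sqrt(1.0 / w_re.sum())
lo, hi = np.exp(est - 1.96 * se_re), np.exp(est + 1.96 * se_re)
print(f"pooled RR = {np.exp(est):.2f} (95% CI {lo:.2f}-{hi:.2f}), tau^2 = {tau2:.3f}")
```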

  19. Mass spectrometry for the characterization of brewing process.

    PubMed

    Vivian, Adriana Fu; Aoyagui, Caroline Tiemi; de Oliveira, Diogo Noin; Catharino, Rodrigo Ramos

    2016-11-01

    Beer is a carbonated alcoholic beverage produced by fermenting ingredients containing starch, especially malted cereals, together with water, hops and yeast. The process comprises five main steps: malting, mashing, boiling, fermentation and maturation. There has been growing interest in the subject, since demand for beer quality is increasing and beer is a ubiquitous alcoholic beverage in the world. This study is based on the manufacturing process of a Brazilian craft brewery: samples were withdrawn during key production stages and analyzed by electrospray ionization (ESI) high-resolution mass spectrometry (HRMS), a selective and reliable technique for identifying substances in an expeditious and practical way. Multivariate data analysis, namely partial least squares discriminant analysis (PLS-DA), was used to define markers of each stage. In the PLS-DA score plots of both positive and negative modes, it is possible to notice differences between the stages. VIP score analysis pointed out markers coherent with the process, such as barley components ((+)-catechin), small peptide varieties, hop content (humulone), yeast metabolic compounds and, in maturation, flavoring compounds (caproic acid, glutaric acid and 2,3-butanediol). Besides that, it was possible to identify other important substances such as off-flavor precursors and other trace compounds, according to the focus given. This is an attractive alternative for control in the food and beverage industry, allowing a quick assessment of process status before it is finished, preventing higher production costs, ensuring quality and helping to control desirable features such as flavor, foam stability and drinkability. Covering different classes of compounds, this approach suggests a novel analytical strategy, "processomics", aimed at understanding processes in detail, promoting control and enabling improvements. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. TU-FG-201-12: Designing a Risk-Based Quality Assurance Program for a Newly Implemented Y-90 Microspheres Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vile, D; Zhang, L; Cuttino, L

    2016-06-15

    Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each member. From the process map, potential failure modes were determined, as well as any current controls in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked based on their RPN. Additional controls were then added, with failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with failure modes associated with the highest risk being addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.

  1. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

    able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using... Uncertainty modeling for robust control; Robust closed-loop stability and performance; Robust H-infinity control; Robustness check using mu-analysis... Controlled feedback (reduces noise) 3. Statistical group response (reduces pressure toward conformity) When used as a tool to study a complex problem

  2. The Complexity of Trust-Control Relationships in Creative Organizations: Insights from a Qualitative Analysis of a Conductorless Orchestra

    ERIC Educational Resources Information Center

    Khodyakov, Dmitry M.

    2007-01-01

    Using a qualitative approach, I study two processes of intra-organizational cooperation and coordination--control and trust--in creative organizations. Specifically, I analyze the complex nature of trust-control relationships in Orpheus orchestra, the world's largest contemporary conductorless orchestra. I discuss how it rehearses and performs…

  3. SSS-A attitude control prelaunch analysis and operations plan

    NASA Technical Reports Server (NTRS)

    Werking, R. D.; Beck, J.; Gardner, D.; Moyer, P.; Plett, M.

    1971-01-01

    A description of the attitude control support being supplied by the Mission and Data Operations Directorate is presented. Descriptions of the computer programs being used to support the mission for attitude determination, prediction, control, and definitive attitude processing are included. In addition, descriptions of the operating procedures which will be used to accomplish mission objectives are provided.

  4. Translational Behavior Analysis: From Laboratory Science in Stimulus Control to Intervention with Persons with Neurodevelopmental Disabilities

    ERIC Educational Resources Information Center

    McIlvane, William J.

    2009-01-01

    Throughout its history, laboratory research in the experimental analysis of behavior has been successful in elucidating and clarifying basic learning principles and processes in both humans and nonhumans. In parallel, applied behavior analysis has shown how fundamental behavior-analytic principles and procedures can be employed to promote…

  5. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.
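
    The abstract does not spell out INCA's significance metric, so the sketch below uses one common generic choice, ranking second-order modes by their approximate resonant peak gain and truncating the rest. All modal data are invented; this is not necessarily the metric INCA implements.

```python
# Generic mode-significance ranking sketch for model order reduction.
import numpy as np

def rank_modes(omegas, zetas, residues):
    """Rank modes r/(s^2 + 2*zeta*w*s + w^2) by resonant peak gain.

    For light damping the peak gain is approximately |r| / (2*zeta*w^2).
    """
    w = np.asarray(omegas, float)
    z = np.asarray(zetas, float)
    r = np.asarray(residues, float)
    peak_gain = np.abs(r) / (2.0 * z * w**2)
    return np.argsort(peak_gain)[::-1]          # most significant first

order = rank_modes(omegas=[12.0, 45.0, 130.0, 300.0],
                   zetas=[0.005, 0.01, 0.02, 0.02],
                   residues=[0.8, 2.0, 0.3, 5.0])
print("retain modes:", order[:2], "truncate:", order[2:])
```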

  6. EMC analysis of MOS-1

    NASA Astrophysics Data System (ADS)

    Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.

    The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method which have been applied to test the components of the MOS-1 satellite are described. The merits and demerits of the problem solving, specification, and system approaches to EMC control are summarized, as are the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components. Examples of EMC design are mentioned, and the EMC design process and selection method for EMC critical points are shown along with sample EMC test results.

  7. The business process management software for successful quality management and organization: A case study from the University of Split School of Medicine.

    PubMed

    Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko

    2016-05-01

    Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school founded on a business process analysis (BPA) software tool. The BPA software tool was used as the core element for the description of all working processes in our medical school, and subsequently the system served as a comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear and inadequate points in these processes, and subsequently enables the respective improvements, an increase in the QM level, and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for the development of a common QM framework allowing continuous quality control, i.e. the adjustments and adaptation to contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  8. Thermal Stir Welding: A New Solid State Welding Process

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey

    2003-01-01

    Thermal stir welding is a new welding process developed at NASA's Marshall Space Flight Center in Huntsville, AL. Thermal stir welding is similar to friction stir welding in that it joins similar or dissimilar materials without melting the parent material. However, unlike friction stir welding, the heating, stirring and forging elements of the process are all independent of each other and are separately controlled. Furthermore, the heating element of the process can be either a solid-state process (such as a thermal blanket, induction-type process, etc.) or a fusion process (YAG laser, plasma torch, etc.). The separation of the heating, stirring and forging elements of the process allows more degrees of freedom for greater process control. This paper introduces the mechanics of the thermal stir welding process. In addition, weld mechanical property data is presented for selected alloys, as well as metallurgical analysis.

  9. Thermal Stir Welding: A New Solid State Welding Process

    NASA Technical Reports Server (NTRS)

    Ding, R. Jeffrey; Munafo, Paul M. (Technical Monitor)

    2002-01-01

    Thermal stir welding is a new welding process developed at NASA's Marshall Space Flight Center in Huntsville, AL. Thermal stir welding is similar to friction stir welding in that it joins similar or dissimilar materials without melting the parent material. However, unlike friction stir welding, the heating, stirring and forging elements of the process are all independent of each other and are separately controlled. Furthermore, the heating element of the process can be either a solid-state process (such as a thermal blanket, induction-type process, etc.) or a fusion process (YAG laser, plasma torch, etc.). The separation of the heating, stirring and forging elements of the process allows more degrees of freedom for greater process control. This paper introduces the mechanics of the thermal stir welding process. In addition, weld mechanical property data is presented for selected alloys, as well as metallurgical analysis.

  10. A comparison of a novel robust decentralised control strategy and MPC for industrial high purity, high recovery, multicomponent distillation.

    PubMed

    Udugama, Isuru A; Wolfenstetter, Florian; Kirkpatrick, Robert; Yu, Wei; Young, Brent R

    2017-07-01

    In this work we have developed a novel, robust, practical control structure to regulate an industrial methanol distillation column. The proposed control scheme is based on an override control framework and can manage a non-key trace ethanol product impurity specification while maintaining high product recovery. For comparison purposes, an MPC with a discrete process model (based on step tests) was also developed and tested. The results from process disturbance testing show that both the MPC and the proposed controller were capable of maintaining both the trace-level ethanol specification in the distillate (X_D) and high product recovery (β). Closer analysis revealed that the MPC has tighter X_D control, while the proposed controller was tighter in β control. The tight X_D control allowed the MPC to operate at a higher X_D set point (closer to the 10 ppm AA-grade methanol standard), allowing for savings in energy usage. Despite the energy savings of the MPC, the proposed control scheme has lower installation and running costs. An economic analysis revealed a multitude of other external economic and plant design factors that should be considered when making a decision between the two controllers. In general, we found relatively high energy costs favour MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
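
    The abstract names an override framework without detailing it, so the sketch below shows one generic low-select override arrangement in that spirit. The PI gains, limits, and the pairing of an impurity controller overriding a recovery controller are hypothetical, not the paper's design.

```python
# Generic low-select override sketch; gains, limits, and the controller
# pairing are hypothetical examples.
class PI:
    def __init__(self, kp, ki, u_min, u_max):
        self.kp, self.ki, self.u_min, self.u_max = kp, ki, u_min, u_max
        self.integral = 0.0

    def step(self, error, dt):
        u = self.kp * error + self.ki * (self.integral + error * dt)
        if self.u_min < u < self.u_max:      # anti-windup: only integrate
            self.integral += error * dt      # while the output is unsaturated
        return min(max(u, self.u_min), self.u_max)

impurity_ctrl = PI(kp=2.0, ki=0.1, u_min=0.0, u_max=100.0)   # holds X_D spec
recovery_ctrl = PI(kp=1.0, ki=0.05, u_min=0.0, u_max=100.0)  # holds recovery

def reflux_setpoint(x_d_error, beta_error, dt):
    # Normally the recovery controller sets the reflux; the impurity
    # controller overrides (low-select) when X_D approaches its limit.
    return min(impurity_ctrl.step(x_d_error, dt),
               recovery_ctrl.step(beta_error, dt))
```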

  11. Decoding of top-down cognitive processing for SSVEP-controlled BMI

    PubMed Central

    Min, Byoung-Kyong; Dähne, Sven; Ahn, Min-Hee; Noh, Yung-Kyun; Müller, Klaus-Robert

    2016-01-01

    We present a fast and accurate non-invasive brain-machine interface (BMI) based on demodulating steady-state visual evoked potentials (SSVEPs) in electroencephalography (EEG). Our study reports an SSVEP-BMI that, for the first time, decodes primarily based on top-down and not bottom-up visual information processing. The experimental setup presents a grid-shaped flickering line array that the participants observe while intentionally attending to a subset of flickering lines representing the shape of a letter. While the flickering pixels stimulate the participant’s visual cortex uniformly with equal probability, the participant’s intention groups the strokes and thus perceives a ‘letter Gestalt’. We observed decoding accuracy of 35.81% (up to 65.83%) with a regularized linear discriminant analysis; on average 2.05-fold, and up to 3.77-fold greater than chance levels in multi-class classification. Compared to the EEG signals, an electrooculogram (EOG) did not significantly contribute to decoding accuracies. Further analysis reveals that the top-down SSVEP paradigm shows the most focalised activation pattern around occipital visual areas; Granger causality analysis consistently revealed prefrontal top-down control over early visual processing. Taken together, the present paradigm provides the first neurophysiological evidence for the top-down SSVEP BMI paradigm, which potentially enables multi-class intentional control of EEG-BMIs without using gaze-shifting. PMID:27808125
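
    As a hedged illustration of the decoding approach mentioned above (regularized linear discriminant analysis), the sketch below runs shrinkage-regularized LDA from scikit-learn on synthetic stand-ins for multi-class SSVEP feature vectors; it is not the authors' pipeline.

```python
# Shrinkage-regularized LDA sketch for multi-class decoding; the data
# below are random stand-ins, so accuracy will hover near chance.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_features, n_classes = 120, 64, 4
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, n_classes, size=n_trials)

# 'lsqr' with automatic (Ledoit-Wolf) shrinkage regularizes the covariance
# estimate, which matters when trials are few relative to features.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"decoding accuracy: {acc:.2%} (chance ~ {1 / n_classes:.0%})")
```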

  12. Decoding of top-down cognitive processing for SSVEP-controlled BMI

    NASA Astrophysics Data System (ADS)

    Min, Byoung-Kyong; Dähne, Sven; Ahn, Min-Hee; Noh, Yung-Kyun; Müller, Klaus-Robert

    2016-11-01

    We present a fast and accurate non-invasive brain-machine interface (BMI) based on demodulating steady-state visual evoked potentials (SSVEPs) in electroencephalography (EEG). Our study reports an SSVEP-BMI that, for the first time, decodes primarily based on top-down and not bottom-up visual information processing. The experimental setup presents a grid-shaped flickering line array that the participants observe while intentionally attending to a subset of flickering lines representing the shape of a letter. While the flickering pixels stimulate the participant’s visual cortex uniformly with equal probability, the participant’s intention groups the strokes and thus perceives a ‘letter Gestalt’. We observed decoding accuracy of 35.81% (up to 65.83%) with a regularized linear discriminant analysis; on average 2.05-fold, and up to 3.77-fold greater than chance levels in multi-class classification. Compared to the EEG signals, an electrooculogram (EOG) did not significantly contribute to decoding accuracies. Further analysis reveals that the top-down SSVEP paradigm shows the most focalised activation pattern around occipital visual areas; Granger causality analysis consistently revealed prefrontal top-down control over early visual processing. Taken together, the present paradigm provides the first neurophysiological evidence for the top-down SSVEP BMI paradigm, which potentially enables multi-class intentional control of EEG-BMIs without using gaze-shifting.

  13. Stroop effects in persons with traumatic brain injury: selective attention, speed of processing, or color-naming? A meta-analysis.

    PubMed

    Ben-David, Boaz M; Nguyen, Linh L T; van Lieshout, Pascal H H M

    2011-03-01

    The color word Stroop test is the most common tool used to assess selective attention in persons with traumatic brain injury (TBI). A larger Stroop effect for TBI patients, as compared to controls, is generally interpreted as reflecting a decrease in selective attention. Alternatively, it has been suggested that this increase in Stroop effects is influenced by group differences in generalized speed of processing (SOP). The current study describes an overview and meta-analysis of 10 studies, where persons with TBI (N = 324) were compared to matched controls (N = 501) on the Stroop task. The findings confirmed that Stroop interference was significantly larger for TBI groups (p = .008). However, these differences may be strongly biased by TBI-related slowdown in generalized SOP (r² = .81 in a Brinley analysis). We also found that TBI-related changes in sensory processing may affect group differences. Mainly, a TBI-related increase in the latency difference between reading and naming the font color of a color-neutral word (r² = .96) was linked to Stroop effects. Our results suggest that, in using Stroop, it seems prudent to control for both sensory factors and SOP to differentiate potential changes in selective attention from other changes following TBI.

  14. Steam Hydrocarbon Cracking and Reforming

    ERIC Educational Resources Information Center

    Golombok, Michael

    2004-01-01

    The methods of steam hydrocarbon reforming and cracking used in the oil and chemical industries are scrutinized, with special focus on their similarities and differences. The two methods are illustrations of equilibrium-controlled and kinetically controlled processes, the analysis of which involves theories which overlap and balance each…

  15. 14 CFR 417.111 - Launch plans.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... controls identified by a launch operator's ground safety analysis and implementation of the ground safety.... (ii) For each toxic propellant, any hazard controls and process constraints determined under the... classification and compatibility group as defined by part 420 of this chapter. (3) A graphic depiction of the...

  16. Distributed Scene Analysis For Autonomous Road Vehicle Guidance

    NASA Astrophysics Data System (ADS)

    Mysliwetz, Birger D.; Dickmanns, E. D.

    1987-01-01

    An efficient distributed processing scheme has been developed for visual road boundary tracking by 'VaMoRs', a testbed vehicle for autonomous mobility and computer vision. Ongoing work described here is directed to improving the robustness of the road boundary detection process in the presence of shadows, ill-defined edges and other disturbing real-world effects. The system structure and the techniques applied for real-time scene analysis are presented along with experimental results. All subfunctions of road boundary detection for vehicle guidance, such as edge extraction, feature aggregation and camera pointing control, are executed in parallel by an onboard multiprocessor system. On the image processing level, local oriented edge extraction is performed in multiple 'windows', tightly controlled from a hierarchically higher, model-based level. The interpretation process, involving a geometric road model and the observer's relative position to the road boundaries, is capable of coping with ambiguity in measurement data. By using only selected measurements to update the model parameters, even high noise levels can be dealt with and misleading edges rejected.

  17. State machine analysis of sensor data from dynamic processes

    DOEpatents

    Cook, William R.; Brabson, John M.; Deland, Sharon M.

    2003-12-23

    A state machine model analyzes sensor data from dynamic processes at a facility to identify the actual processes that were performed at the facility during a period of interest for the purpose of remote facility inspection. An inspector can further input the expected operations into the state machine model and compare the expected, or declared, processes to the actual processes to identify undeclared processes at the facility. The state machine analysis enables the generation of knowledge about the state of the facility at all levels, from location of physical objects to complex operational concepts. Therefore, the state machine method and apparatus may benefit any agency or business with sensored facilities that stores or manipulates expensive, dangerous, or controlled materials or information.
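
    To make the replay-and-compare idea concrete, here is a minimal sketch of a state machine that infers facility states from sensor events and flags activity absent from the declared sequence. The states, events, and transitions are hypothetical, not those of the patented system.

```python
# Toy declared-versus-observed state machine comparison; the transition
# table below is invented for illustration.
TRANSITIONS = {
    ("idle", "door_open"): "loading",
    ("loading", "door_closed"): "sealed",
    ("sealed", "pressure_rise"): "processing",
    ("processing", "pressure_drop"): "idle",
}

def infer_states(events, state="idle"):
    """Replay sensor events through the model, returning the state trajectory."""
    states = [state]
    for e in events:
        state = TRANSITIONS.get((state, e), state)  # ignore irrelevant events
        states.append(state)
    return states

observed = infer_states(["door_open", "door_closed", "pressure_rise"])
declared = ["idle", "loading", "sealed", "idle"]        # inspector's input
undeclared = [s for s in observed if s not in declared]
print("observed:", observed, "| undeclared activity:", undeclared)
```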

  18. A Petri Net Approach Based Elementary Siphons Supervisor for Flexible Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Abdul-Hussin, Mowafak Hassan

    2015-05-01

    This paper presents an approach to constructing a class of S3PR nets for the modeling, simulation and control of processes occurring in flexible manufacturing systems (FMS), based on the elementary siphons of a Petri net. Siphons are very important to the analysis and control of deadlocks in FMS, which is a significant objective of siphon-based methods. Petri net models enable efficient structural analysis and utilization of FMSs, where different policies can be implemented that lead to deadlock prevention. We present an effective deadlock-free policy for a special class of Petri nets called S3PR. Petri net structural analysis and reachability graph analysis are used in simulation for the analysis and control of the nets. Petri nets have been successfully used as one of the most powerful tools for modelling FMS; using structural analysis, we show that liveness of such systems can be attributed to the absence of undermarked siphons.
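
    For readers unfamiliar with the formalism, the toy sketch below illustrates the basic token-marking and transition-firing rule on which structural and reachability analysis rest. The two-transition net is invented and far simpler than the paper's S3PR models.

```python
# Toy Petri net firing sketch: a transition is enabled when every input
# place holds enough tokens; firing moves tokens along the arcs.
pre  = {"t1": {"p1": 1}, "t2": {"p2": 1, "p3": 1}}   # input arcs (weights)
post = {"t1": {"p2": 1}, "t2": {"p1": 1, "p3": 1}}   # output arcs (weights)

def enabled(marking, t):
    return all(marking.get(p, 0) >= w for p, w in pre[t].items())

def fire(marking, t):
    m = dict(marking)
    for p, w in pre[t].items():
        m[p] -= w
    for p, w in post[t].items():
        m[p] = m.get(p, 0) + w
    return m

m0 = {"p1": 1, "p2": 0, "p3": 1}
print([t for t in pre if enabled(m0, t)])   # ['t1'] is enabled
print(fire(m0, "t1"))                        # a token moves from p1 to p2
```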

  19. A retrospective analysis of the change in anti-malarial treatment policy: Peru.

    PubMed

    Williams, Holly Ann; Vincent-Mark, Arlene; Herrera, Yenni; Chang, O Jaime

    2009-04-28

    National malaria control programmes must deal with the complex process of changing national malaria treatment guidelines, often without guidance on the process of change. Selecting a replacement drug is only one issue in this process. There is a paucity of literature describing successful malaria treatment policy changes to help guide control programs through this process. The aim was to understand the wider context in which national malaria treatment guidelines were formulated in a specific country (Peru). Using qualitative methods (individual and focus group interviews, stakeholder analysis and a review of documents), a retrospective analysis of the process of change in Peru's anti-malarial treatment policy from the early 1990s to 2003 was completed. The decision to change Peru's policies resulted from increasing levels of anti-malarial drug resistance, as well as complaints from providers that the drugs were no longer working. The change occurred at a time when Peru was changing national governments, which created extreme challenges in moving the change process forward. Peru successfully utilized a number of key strategies to ensure that policy change would occur. These included a) having the process directed by a group who shared a common interest in malaria and who had long-established social and professional networks among themselves, b) engaging in collaborative teamwork among nationals and between nationals and international collaborators, c) respect for and inclusion of district-level staff in all phases of the process, d) reliance on high levels of technical and scientific knowledge, e) use of standardized protocols to collect data, and f) transparency. Although not perfectly or fully implemented by 2003, the change in malaria treatment policy in Peru occurred very quickly compared to other countries. They identified a problem, collected the data necessary to justify the change, used political will in their favor, approved the policy, and moved to improve malaria control in their country. As such, they offer an excellent example for other countries as they contemplate or embark on policy changes.

  20. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared during downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.

  1. Safety Analysis and Protection Measures of the Control System of the Pulsed High Magnetic Field Facility in WHMFC

    NASA Astrophysics Data System (ADS)

    Shi, J. T.; Han, X. T.; Xie, J. F.; Yao, L.; Huang, L. T.; Li, L.

    2013-03-01

    A Pulsed High Magnetic Field Facility (PHMFF) has been established at the Wuhan National High Magnetic Field Center (WHMFC), and various protection measures are applied in its control system. In order to improve the reliability and robustness of the control system, a safety analysis of the PHMFF was carried out based on the Fault Tree Analysis (FTA) technique. The function and realization of five protection systems are given: a sequence experiment operation system, a safety assistant system, an emergency stop system, a fault detecting and processing system, and an accident isolating protection system. Tests and operation indicate that these measures improve the safety of the facility and ensure the safety of personnel.

  2. Preliminary Design and Analysis of the ARES Atmospheric Flight Vehicle Thermal Control System

    NASA Technical Reports Server (NTRS)

    Gasbarre, J. F.; Dillman, R. A.

    2003-01-01

    The Aerial Regional-scale Environmental Survey (ARES) is a proposed 2007 Mars Scout Mission that will be the first mission to deploy an atmospheric flight vehicle (AFV) on another planet. This paper will describe the preliminary design and analysis of the AFV thermal control system for its flight through the Martian atmosphere and also present other analyses broadening the scope of that design to include other phases of the ARES mission. Initial analyses are discussed and results of trade studies are presented which detail the design process for AFV thermal control. Finally, results of the most recent AFV thermal analysis are shown and the plans for future work are discussed.

  3. Accident analysis and control options in support of the sludge water system safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HEY, B.E.

    A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.

  4. Rapid self-assembly of DNA on a microfluidic chip

    PubMed Central

    Zheng, Yao; Footz, Tim; Manage, Dammika P; Backhouse, Christopher James

    2005-01-01

    Background: DNA self-assembly methods have played a major role in enabling methods for acquiring genetic information without having to resort to sequencing, a relatively slow and costly procedure. However, even self-assembly processes tend to be very slow when they rely upon diffusion on a large scale. Miniaturisation and integration therefore hold the promise of greatly increasing this speed of operation. Results: We have developed a rapid method for implementing the self-assembly of DNA within a microfluidic system by electrically extracting the DNA from an environment containing an uncharged denaturant. By controlling the parameters of the electrophoretic extraction and subsequent analysis of the DNA we are able to control when the hybridisation occurs as well as the degree of hybridisation. By avoiding off-chip processing or long thermal treatments we are able to perform this hybridisation rapidly and can perform hybridisation, sizing, heteroduplex analysis and single-stranded conformation analysis within a matter of minutes. The rapidity of this analysis allows the sampling of transient effects that may improve the sensitivity of mutation detection. Conclusions: We believe that this method will aid the integration of self-assembly methods upon microfluidic chips. The speed of this analysis also appears to provide information upon the dynamics of the self-assembly process. PMID:15717935

  5. Hydrochemical evolution and groundwater flow processes in the Galilee and Eromanga basins, Great Artesian Basin, Australia: a multivariate statistical approach.

    PubMed

    Moya, Claudio E; Raiber, Matthias; Taulis, Mauricio; Cox, Malcolm E

    2015-03-01

    The Galilee and Eromanga basins are sub-basins of the Great Artesian Basin (GAB). In this study, a multivariate statistical approach (hierarchical cluster analysis, principal component analysis and factor analysis) is carried out to identify hydrochemical patterns and assess the processes that control hydrochemical evolution within key aquifers of the GAB in these basins. The results of the hydrochemical assessment are integrated into a 3D geological model (previously developed) to support the analysis of spatial patterns of hydrochemistry, and to identify the hydrochemical and hydrological processes that control hydrochemical variability. In this area of the GAB, the hydrochemical evolution of groundwater is dominated by evapotranspiration near the recharge area resulting in a dominance of the Na-Cl water types. This is shown conceptually using two selected cross-sections which represent discrete groundwater flow paths from the recharge areas to the deeper parts of the basins. With increasing distance from the recharge area, a shift towards a dominance of carbonate (e.g. Na-HCO3 water type) has been observed. The assessment of hydrochemical changes along groundwater flow paths highlights how aquifers are separated in some areas, and how mixing between groundwater from different aquifers occurs elsewhere controlled by geological structures, including between GAB aquifers and coal bearing strata of the Galilee Basin. The results of this study suggest that distinct hydrochemical differences can be observed within the previously defined Early Cretaceous-Jurassic aquifer sequence of the GAB. A revision of the two previously recognised hydrochemical sequences is being proposed, resulting in three hydrochemical sequences based on systematic differences in hydrochemistry, salinity and dominant hydrochemical processes. The integrated approach presented in this study which combines different complementary multivariate statistical techniques with a detailed assessment of the geological framework of these sedimentary basins, can be adopted in other complex multi-aquifer systems to assess hydrochemical evolution and its geological controls. Copyright © 2014 Elsevier B.V. All rights reserved.
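
    As a compact illustration of the multivariate workflow named above (hierarchical cluster analysis plus principal component analysis), the sketch below clusters hypothetical major-ion data and inspects the PCA loadings. The ion list and concentrations are invented, not GAB data.

```python
# HCA + PCA sketch on invented hydrochemical data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = groundwater samples; columns = major ions (mg/L),
# e.g. Na, Cl, HCO3, SO4, Ca, Mg.
X = np.array([
    [950, 1400, 310, 120, 40, 25],
    [920, 1350, 330, 110, 38, 24],
    [180,  120, 540,  30, 15, 10],
    [200,  140, 520,  35, 17, 11],
])
Xs = StandardScaler().fit_transform(X)     # ions span very different scales

# Hierarchical cluster analysis (Ward linkage) to separate water types,
# e.g. Na-Cl-dominated versus HCO3-dominated samples.
clusters = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")

# PCA to see which ions drive the separation between clusters.
pca = PCA(n_components=2).fit(Xs)
print("clusters:", clusters)
print("PC1 loadings:", np.round(pca.components_[0], 2))
```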

  6. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  7. Impact of hospital care on incidence of bloodstream infection: the evaluation of processes and indicators in infection control study.

    PubMed Central

    Kritchevsky, S. B.; Braun, B. I.; Wong, E. S.; Solomon, S. L.; Steele, L.; Richards, C.; Simmons, B. P.

    2001-01-01

    The Evaluation of Processes and Indicators in Infection Control (EPIC) study assesses the relationship between hospital care and rates of central venous catheter-associated primary bacteremia in 54 intensive-care units (ICUs) in the United States and 14 other countries. Using the ICU rather than the patient as the primary unit of statistical analysis permits evaluation of factors that vary at the ICU level. The design of EPIC can serve as a template for studies investigating the relationship between process and event rates across health-care institutions. PMID:11294704

  8. Theory of agent-based market models with controlled levels of greed and anxiety

    NASA Astrophysics Data System (ADS)

    Papadopoulos, P.; Coolen, A. C. C.

    2010-01-01

    We use generating functional analysis to study minority-game-type market models with generalized strategy valuation updates that control the psychology of agents' actions. The agents' choice between trend-following and contrarian trading, and their vigor in each, depends on the overall state of the market. Even in 'fake history' models, the theory now involves an effective overall bid process (coupled to the effective agent process) which can exhibit profound remanence effects and new phase transitions. For some models the bid process can be solved directly, others require Maxwell-construction-type approximations.

  9. A Theoretical and Experimental Analysis of the Outside World Perception Process

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1978-01-01

    The outside scene is often an important source of information for manual control tasks. Important examples of these are car driving and aircraft control. This paper deals with modelling this visual scene perception process on the basis of linear perspective geometry and the relative motion cues. Model predictions utilizing psychophysical threshold data from base-line experiments and literature of a variety of visual approach tasks are compared with experimental data. Both the performance and workload results illustrate that the model provides a meaningful description of the outside world perception process, with a useful predictive capability.

  10. PAT-tools for process control in pharmaceutical film coating applications.

    PubMed

    Knop, Klaus; Kleinebudde, Peter

    2013-12-05

    Recent developments in analytical techniques to monitor the coating process of pharmaceutical solid dosage forms such as pellets and tablets are described. The progress from off- or at-line measurements to on- or in-line applications is shown for the spectroscopic methods near infrared (NIR) and Raman spectroscopy as well as for terahertz pulsed imaging (TPI) and image analysis. The common goal of all these methods is to control or at least to monitor the coating process and/or to estimate the coating end point through timely measurements. Copyright © 2013 Elsevier B.V. All rights reserved.
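
    One simple end-point heuristic consistent with this review is to watch consecutive in-line spectra and declare the end point once they stop changing. The sketch below is a hedged illustration of that idea with invented tolerance and patience settings, not a validated method.

```python
# End-point detection sketch: flag the end point once consecutive spectra
# stop changing; tolerance and patience values are arbitrary examples.
import numpy as np

def spectral_change(s_prev, s_curr):
    """1 - Pearson correlation between consecutive spectra (0 = identical)."""
    return 1.0 - np.corrcoef(s_prev, s_curr)[0, 1]

def endpoint_reached(spectra, tol=1e-4, patience=3):
    """End point once the change stays below tol for `patience` readings."""
    quiet = 0
    for prev, curr in zip(spectra, spectra[1:]):
        quiet = quiet + 1 if spectral_change(prev, curr) < tol else 0
        if quiet >= patience:
            return True
    return False

# Demo: simulated spectra that converge as the coating builds up.
rng = np.random.default_rng(0)
base = rng.random(100)
spectra = [base + rng.normal(scale=10.0**-k, size=100) for k in range(1, 10)]
print(endpoint_reached(spectra))   # True once successive spectra converge
```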

  11. [CMACPAR: a modified parallel neuro-controller for control processes].

    PubMed

    Ramos, E; Surós, R

    1999-01-01

    CMACPAR is a parallel neurocontroller oriented to real-time systems such as control processes. Its main characteristics are a fast learning algorithm, a reduced number of calculations, great generalization capacity, local learning and intrinsic parallelism. This type of neurocontroller is used in real-time applications required by refineries, hydroelectric plants, factories, etc. In this work we present the analysis and the parallel implementation of a modified scheme of the cerebellar model CMAC for n-dimensional space projection using a medium-granularity parallel neurocontroller. The proposed memory management allows for a significant reduction in training time and required memory size.
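
    For orientation, the following minimal single-input CMAC sketch shows the tile-coded, local weight updates that give the architecture its fast learning and generalization. The tiling sizes and learning rate are arbitrary examples, not the parallel CMACPAR scheme itself.

```python
# Minimal single-input CMAC: offset tilings, local weight updates.
import numpy as np

class CMAC:
    def __init__(self, n_tilings=8, n_tiles=16, lo=0.0, hi=1.0, alpha=0.2):
        self.n_tilings, self.n_tiles, self.alpha = n_tilings, n_tiles, alpha
        self.lo, self.hi = lo, hi
        self.w = np.zeros((n_tilings, n_tiles + 1))

    def _tiles(self, x):
        # Each tiling is offset by a fraction of a tile width, so an input
        # activates one cell per tiling (local generalization).
        x01 = (x - self.lo) / (self.hi - self.lo) * self.n_tiles
        return [(t, int(x01 + t / self.n_tilings)) for t in range(self.n_tilings)]

    def predict(self, x):
        return sum(self.w[t, i] for t, i in self._tiles(x))

    def train(self, x, target):
        err = target - self.predict(x)
        for t, i in self._tiles(x):          # local update: only active cells
            self.w[t, i] += self.alpha * err / self.n_tilings

cmac = CMAC()
for _ in range(2000):                        # learn y = sin(2*pi*x) on [0, 1]
    x = np.random.rand()
    cmac.train(x, np.sin(2 * np.pi * x))
print(round(cmac.predict(0.25), 2))          # approaches sin(pi/2) = 1.0
```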

  12. [A strategy of constructing the technological system for quality control of Chinese medicine based on process control and management].

    PubMed

    Cheng, Yi-Yu; Qian, Zhong-Zhi; Zhang, Bo-Li

    2017-01-01

    The current situation, bottleneck problems and severe challenges in quality control technology of Chinese Medicine (CM) are briefly described. It is proposed to change the reliance on post-production testing as the main means of drug regulation and the disregard for process control, to reverse the neglect of process control and management technology for pharmaceutical manufacturing, and to reconstruct the technological system for quality control of CM products. A regulation and technology system based on process control and management for controlling CM quality should be established to solve weighty practical problems of the CM industry at their root causes, including the backwardness of quality control technology, the weakness of quality risk control measures, the poor reputation of product quality and so on. In this way, the obstacles posed by the poor controllability of CM product quality could be removed. Concentrating on the difficult problems and weak links in the technical field of CM quality control, it is proposed to build a CMC (Chemistry, Manufacturing and Controls) regulation for CM products with Chinese characteristics and to promote its international recognition as soon as possible. A CMC technical framework was designed that is clinical efficacy-oriented, manufacturing manner-centered and process control-focused. To address the clinical characteristics of traditional Chinese medicine (TCM) and the production features of CM manufacture, it is suggested to establish quality control engineering for CM manufacturing by integrating pharmaceutical analysis, TCM chemistry, TCM pharmacology, pharmaceutical engineering, control engineering, management engineering and other disciplines. Further, a theoretical model of quality control engineering for CM manufacturing and a methodology of digital pharmaceutical engineering are proposed. A technology pathway for promoting CM standards and realizing the strategic goal of CM internationalization is elaborated. Copyright© by the Chinese Pharmaceutical Association.

  13. Does daily nurse staffing match ward workload variability? Three hospitals' experiences.

    PubMed

    Gabbay, Uri; Bukchin, Michael

    2009-01-01

    Nurse shortage and rising healthcare resource burdens mean that appropriate workforce use is imperative. This paper aims to evaluate whether daily nurse staffing meets ward workload needs. Nurse attendance and daily nurses' workload capacity in three hospitals were evaluated. Statistical process control was used to evaluate intra-ward nurse workload capacity and day-to-day variations. Statistical process control is a statistics-based method for process monitoring that uses charts with a predefined target measure and control limits. Standardization was performed for inter-ward analysis by converting ward-specific crude measures to ward-specific relative measures by dividing observed/expected. Two charts, acceptable and tolerable daily nurse workload intensity, were defined. Appropriate staffing indicators were defined as those exceeding predefined rates within acceptable and tolerable limits (50 percent and 80 percent, respectively). A total of 42 percent of the overall days fell within acceptable control limits and 71 percent within tolerable control limits. Appropriate staffing indicators were met in only 33 percent of wards regarding acceptable nurse workload intensity and in only 45 percent of wards regarding tolerable workloads. The study did not differentiate crude nurse attendance, and it did not take patient severity into account, since crude bed occupancy was used. Dual statistical process control charts and particular staffing indicators were used, which is open to debate. Wards that met appropriate staffing indicators prove the method's feasibility. Wards that did not meet appropriate staffing indicators prove the importance of, and the need for, process evaluations and monitoring. The methods presented for monitoring daily staffing appropriateness are simple to implement, either for intra-ward day-to-day variation by using nurse workload capacity statistical process control charts or for inter-ward evaluation using a standardized measure of nurse workload intensity. The real challenge will be to develop planning systems and implement corrective interventions such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method using available data for evaluating daily staffing appropriateness, which is easy to implement and operate, is presented. The statistical process control method enables intra-ward evaluation, while standardization by converting crude into relative measures enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. The method is not limited to crude measures; rather, it can use weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).
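
    The sketch below is a hedged illustration of the chart logic described here: a standardized observed/expected workload ratio tracked against two sets of control limits. The limit widths (2-sigma for "acceptable", 3-sigma for "tolerable") and the daily ratios are assumptions for illustration, not the study's definitions.

```python
# Two-limit control-chart sketch on a standardized workload ratio;
# daily values and limit widths are hypothetical.
import numpy as np

ratio = np.array([0.95, 1.02, 0.98, 1.34, 1.01, 0.97, 0.62, 1.04])  # daily O/E

center, sigma = ratio.mean(), ratio.std(ddof=1)
acceptable = (center - 2 * sigma, center + 2 * sigma)   # assumed width
tolerable = (center - 3 * sigma, center + 3 * sigma)    # assumed width

in_acc = np.mean((ratio >= acceptable[0]) & (ratio <= acceptable[1]))
in_tol = np.mean((ratio >= tolerable[0]) & (ratio <= tolerable[1]))
print(f"{in_acc:.0%} of days within acceptable limits, "
      f"{in_tol:.0%} within tolerable limits")
```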

  14. Microarray analysis of gene expression patterns in the leaf during potato tuberization in the potato somatic hybrid Solanum tuberosum and Solanum etuberosum.

    PubMed

    Tiwari, Jagesh Kumar; Devi, Sapna; Sundaresha, S; Chandel, Poonam; Ali, Nilofer; Singh, Brajesh; Bhardwaj, Vinay; Singh, Bir Pal

    2015-06-01

    Genes involved in photoassimilate partitioning and changes in hormonal balance are important for potato tuberization. In the present study, we investigated gene expression patterns in the tuber-bearing potato somatic hybrid (E1-3) and control non-tuberous wild species Solanum etuberosum (Etb) by microarray. Plants were grown under controlled conditions and leaves were collected at eight tuber developmental stages for microarray analysis. A t-test analysis identified a total of 468 genes (94 up-regulated and 374 down-regulated) that were statistically significant (p ≤ 0.05) and differentially expressed in E1-3 and Etb. Gene Ontology (GO) characterization of the 468 genes revealed that 145 were annotated and 323 were of unknown function. Further, these 145 genes were grouped based on GO biological processes followed by molecular function and (or) PGSC description into 15 gene sets, namely (1) transport, (2) metabolic process, (3) biological process, (4) photosynthesis, (5) oxidation-reduction, (6) transcription, (7) translation, (8) binding, (9) protein phosphorylation, (10) protein folding, (11) ubiquitin-dependent protein catabolic process, (12) RNA processing, (13) negative regulation of protein, (14) methylation, and (15) mitosis. RT-PCR analysis of 10 selected highly significant genes (p ≤ 0.01) confirmed the microarray results. Overall, we show that candidate genes induced in leaves of E1-3 were implicated in tuberization processes such as transport, carbohydrate metabolism, phytohormones, and transcription/translation/binding functions. Hence, our results provide an insight into the candidate genes induced in leaf tissues during tuberization in E1-3.

  15. ISO 9000 and/or Systems Engineering Capability Maturity Model?

    NASA Technical Reports Server (NTRS)

    Gholston, Sampson E.

    2002-01-01

    For businesses and organizations to remain competitive today they must have processes and systems in place that allow them first to identify customer needs and then to develop products/processes that will meet or exceed the customers' needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products/processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher quality products, lower cost products, and shorter development cycles. The Systems Engineering Capability Maturity Model (SE-CMM) allows companies to measure their systems engineering capability and continuously improve those capabilities. ISO 9000 and SE-CMM have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.

  16. Markov Modeling of Component Fault Growth over a Derived Domain of Feasible Output Control Effort Modifications

    NASA Technical Reports Server (NTRS)

    Bole, Brian; Goebel, Kai; Vachtsevanos, George

    2012-01-01

    This paper introduces a novel Markov process formulation of stochastic fault growth modeling, in order to facilitate the development and analysis of prognostics-based control adaptation. A metric representing the relative deviation between the nominal output of a system and the net output that is actually enacted by an implemented prognostics-based control routine, will be used to define the action space of the formulated Markov process. The state space of the Markov process will be defined in terms of an abstracted metric representing the relative health remaining in each of the system s components. The proposed formulation of component fault dynamics will conveniently relate feasible system output performance modifications to predictions of future component health deterioration.
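
    To make the formulation tangible, here is a toy Markov sketch in which a component's discrete health state degrades with probabilities that depend on how much a prognostics-based controller derates the output. The states, action parameterization, and probabilities are invented for illustration.

```python
# Toy Markov fault-growth model with an action-dependent transition matrix.
import numpy as np

HEALTH = ["good", "worn", "failed"]

def transition_matrix(derate):
    """derate in [0, 1]: 0 = full nominal output, 1 = maximum load shedding.
    Gentler use (higher derate) slows degradation; numbers are invented."""
    p = 0.10 * (1.0 - 0.8 * derate)
    return np.array([
        [1 - p, p,     0.0],
        [0.0,   1 - p, p  ],
        [0.0,   0.0,   1.0],     # 'failed' is absorbing
    ])

def simulate(derate, steps=50, rng=np.random.default_rng(1)):
    state = 0
    P = transition_matrix(derate)
    for _ in range(steps):
        state = rng.choice(3, p=P[state])
    return HEALTH[state]

print("aggressive:", simulate(derate=0.0),
      "| conservative:", simulate(derate=1.0))
```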

  17. Preliminary control system design and analysis for the Space Station Furnace Facility thermal control system

    NASA Technical Reports Server (NTRS)

    Jackson, M. E.

    1995-01-01

    This report presents the Space Station Furnace Facility (SSFF) thermal control system (TCS) preliminary control system design and analysis. The SSFF provides the necessary core systems to operate various materials processing furnaces. The TCS is defined as one of the core systems, and its function is to collect excess heat from furnaces and to provide precise cold temperature control of components and of certain furnace zones. Physical interconnection of parallel thermal control subsystems through a common pump implies the description of the TCS by coupled nonlinear differential equations in pressure and flow. This report formulates the system equations and develops the controllers that cause the interconnected subsystems to satisfy flow rate tracking requirements. Extensive digital simulation results are presented to show the flow rate tracking performance.

  18. Monitoring of beer fermentation based on hybrid electronic tongue.

    PubMed

    Kutyła-Olesiuk, Anna; Zaborowski, Michał; Prokaryn, Piotr; Ciosek, Patrycja

    2012-10-01

    Monitoring of biotechnological processes, including fermentation, is extremely important because of the rapidly occurring changes in the composition of the samples during production. In the case of beer, the analysis of physicochemical parameters allows for the determination of the stage of the fermentation process and the control of its possible perturbations. As a tool to control the beer production process, a sensor array can be used, composed of potentiometric and voltammetric sensors (a so-called hybrid Electronic Tongue, h-ET). The aim of this study is to apply an electronic tongue system to distinguish samples obtained during alcoholic fermentation. The samples originate from a batch of homemade beer fermentation and from two stages of the process: the fermentation reaction and the maturation of the beer. The applied sensor array consists of 10 miniaturized ion-selective electrodes (potentiometric ET) and silicon-based 3-electrode voltammetric transducers (voltammetric ET). The obtained results were processed using Partial Least Squares (PLS) and Partial Least Squares-Discriminant Analysis (PLS-DA). For potentiometric data, voltammetric data, and combined potentiometric and voltammetric data, a comparison of the classification ability was conducted based on Root Mean Squared Error (RMSE), sensitivity, specificity, and coefficient F calculation. It is shown that, in contrast to the separately used techniques, the developed hybrid system allowed for a better characterization of the beer samples. Data fusion in the hybrid ET enables better results to be obtained both in qualitative analysis (RMSE, specificity, sensitivity) and in quantitative analysis (RMSE, R², a, b). Copyright © 2012 Elsevier B.V. All rights reserved.
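
    As a hedged sketch of the fusion-plus-classification step, the code below runs PLS-DA (PLS regression onto one-hot class labels, then argmax) on a concatenated potentiometric/voltammetric feature block. The feature counts, class labels, and data are synthetic stand-ins for the paper's measurements.

```python
# PLS-DA sketch on fused (concatenated) sensor data; all values synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X_pot = rng.normal(size=(30, 10))        # 10 potentiometric sensors
X_volt = rng.normal(size=(30, 40))       # 40 voltammetric features
X = np.hstack([X_pot, X_volt])           # hybrid ET: fused feature block
y = rng.integers(0, 3, size=30)          # 3 fermentation stages

# PLS-DA: regress onto one-hot class indicators, classify by argmax.
Y = np.eye(3)[y]
pls = PLSRegression(n_components=4).fit(X, Y)
pred = pls.predict(X).argmax(axis=1)
print(f"training accuracy: {(pred == y).mean():.0%}")
```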

  19. New Ultrasonic Controller and Characterization System for Low Temperature Drying Process Intensification

    NASA Astrophysics Data System (ADS)

    Andrés, R. R.; Blanco, A.; Acosta, V. M.; Riera, E.; Martínez, I.; Pinto, A.

    Process intensification constitutes a highly interesting and promising industrial area. It aims to modify conventional processes or develop new technologies in order to reduce energy needs, increase yields and improve product quality. It has been demonstrated by this research group (CSIC) that power ultrasound has great potential in food drying processes. The effects associated with the application of power ultrasound can enhance heat and mass transfer and may constitute a way toward process intensification. The objective of this work has been the design and development of a new ultrasonic system for the power characterization of piezoelectric plate transducers, covering the excitation, monitoring, analysis, control and characterization of their nonlinear response. For this purpose, the system proposes a new, efficient and economic approach that separates the effects of the different process parameters and variables (voltage, current, frequency, impedance, vibration velocity, acoustic pressure and temperature) relating to the excitation, the medium and the transducer, by observing the electrical, mechanical, acoustical and thermal behavior and controlling the vibrational state.

  20. Acoustic Sample Deposition MALDI-MS (ASD-MALDI-MS): A Novel Process Flow for Quality Control Screening of Compound Libraries.

    PubMed

    Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M

    2016-02-01

    In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.

  1. Dynamic performance analysis of permanent magnet contactor with a flux-weakening control strategy

    NASA Astrophysics Data System (ADS)

    Wang, Xianbing; Lin, Heyun; Fang, Shuhua; Jin, Ping; Wang, Junhua; Ho, S. L.

    2011-04-01

    A new flux-weakening control strategy for permanent magnet contactors is proposed. By matching the dynamic attraction force and the antiforce, the terminal velocity and collision energy of the movable iron in the closing process are significantly reduced. The movable iron displacement is estimated by detecting the closing voltage and current under the proposed control. A dynamic mathematical model is also established under four kinds of excitation scenarios. The attraction force and flux linkage are predicted by the finite element method, and the dynamics of the closing process is simulated using the 4th-order Runge-Kutta algorithm. Experiments are carried out on a 250 A prototype with an intelligent control unit to verify the proposed control strategy.
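
    For reference, the sketch below shows a generic 4th-order Runge-Kutta integration of a one-degree-of-freedom closing model. The mass and the attraction/antiforce expressions are crude placeholders, not the paper's finite-element force data.

```python
# Generic RK4 integration of a placeholder contactor closing model.
import numpy as np

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def closing_dynamics(t, y, m=0.05):
    x, v = y                              # displacement (m), velocity (m/s)
    f_attr = 40.0 / (0.01 - x) ** 0.5     # placeholder attraction force (N)
    f_anti = 30.0 + 2000.0 * x            # placeholder spring antiforce (N)
    return np.array([v, (f_attr - f_anti) / m])

y, t, h = np.array([0.0, 0.0]), 0.0, 1e-5
while y[0] < 0.008:                       # integrate until near closure
    y = rk4_step(closing_dynamics, t, y, h)
    t += h
print(f"closing time ~ {t * 1e3:.2f} ms, terminal velocity ~ {y[1]:.2f} m/s")
```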

  2. Confirmatory factor analysis reveals a latent cognitive structure common to bipolar disorder, schizophrenia, and normal controls.

    PubMed

    Schretlen, David J; Peña, Javier; Aretouli, Eleni; Orue, Izaskun; Cascella, Nicola G; Pearlson, Godfrey D; Ojeda, Natalia

    2013-06-01

    We sought to determine whether a single hypothesized latent factor structure would characterize cognitive functioning in three distinct groups. We assessed 576 adults (340 community controls, 126 adults with bipolar disorder, and 110 adults with schizophrenia) using 15 measures derived from nine cognitive tests. Confirmatory factor analysis (CFA) was conducted to examine the fit of a hypothesized six-factor model. The hypothesized factors included attention, psychomotor speed, verbal memory, visual memory, ideational fluency, and executive functioning. The six-factor model provided an excellent fit for all three groups [for community controls, root mean square error of approximation (RMSEA) <0.048 and comparative fit index (CFI) = 0.99; for adults with bipolar disorder, RMSEA = 0.071 and CFI = 0.99; and for adults with schizophrenia, RMSEA = 0.06 and CFI = 0.98]. Alternate models that combined fluency with processing speed or verbal and visual memory reduced the goodness of fit. Multi-group CFA results supported factor invariance across the three groups. Confirmatory factor analysis supported a single six-factor structure of cognitive functioning among patients with schizophrenia or bipolar disorder and community controls. While the three groups clearly differ in level of performance, they share a common underlying architecture of information processing abilities. These cognitive factors could provide useful targets for clinical trials of treatments that aim to enhance information processing in persons with neurological and neuropsychiatric disorders. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Dynamic response characteristics analysis of the doubly-fed wind power system under grid voltage drop

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Wang, J.; Wang, H. H.; Yang, L.; Chen, W.; Xu, Y. T.

    2016-08-01

    The doubly-fed induction generator (DFIG) is sensitive to grid disturbances, so the security and stability of both the grid and the DFIG itself come under threat with the rapid increase in DFIG penetration. Therefore, it is important to study the dynamic response of the DFIG when a voltage drop fault happens in the power system. In this paper, mathematical models and the control strategies for the mechanical and electrical response processes are first introduced. Then, through analysis of the response process, it is concluded that the dynamic response characteristics are related to the voltage drop level, the operating status of the DFIG and the control strategy adopted on the rotor side. Finally, the correctness of this conclusion is validated by simulations of the mechanical and electrical response processes for different voltage drop levels and different DFIG output levels on the DIgSILENT/PowerFactory software platform.

  4. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 1

    NASA Technical Reports Server (NTRS)

    Krishen, Kumar (Compiler)

    1994-01-01

    This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.

  5. Drop-on-Demand Single Cell Isolation and Total RNA Analysis

    PubMed Central

    Moon, Sangjun; Kim, Yun-Gon; Dong, Lingsheng; Lombardi, Michael; Haeggstrom, Edward; Jensen, Roderick V.; Hsiao, Li-Li; Demirci, Utkan

    2011-01-01

    Technologies that rapidly isolate viable single cells from heterogeneous solutions have significantly contributed to the field of medical genomics. Challenges remain both in enabling efficient extraction, isolation and patterning of single cells from heterogeneous solutions and in keeping them alive during the process, owing to a limited degree of control over single-cell manipulation. Here, we present a microdroplet-based method to isolate and pattern single cells from heterogeneous cell suspensions (10% target cell mixture), preserve the viability of the extracted cells (97.0±0.8%), and obtain genomic information from the isolated cells relative to non-patterned controls. The cell encapsulation process is analyzed both experimentally and theoretically. Using the isolated cells, we identified 11 stem cell markers among 1000 genes and compared them to the controls. This automated platform, enabling high-throughput cell manipulation for subsequent genomic analysis, employs fewer handling steps than existing methods. PMID:21412416
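
    Droplet encapsulation of cells from a dilute suspension is commonly modelled as a Poisson process, which is presumably the kind of theoretical analysis referred to above. A short sketch follows; the loading density lam is an illustrative value, not the study's.

    ```python
    from math import exp, factorial

    def poisson_pmf(k: int, lam: float) -> float:
        """P(k cells in a droplet) under Poisson loading with mean lam."""
        return lam**k * exp(-lam) / factorial(k)

    lam = 0.1  # mean cells per droplet: dilute, a typical single-cell regime
    p_empty = poisson_pmf(0, lam)
    p_single = poisson_pmf(1, lam)
    p_multi = 1.0 - p_empty - p_single
    print(f"empty: {p_empty:.3f}, single: {p_single:.3f}, multiple: {p_multi:.4f}")
    # Of the occupied droplets, the single-cell purity is:
    print(f"single-cell purity: {p_single / (1.0 - p_empty):.3f}")
    ```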

  6. RF control at SSCL — an object oriented design approach

    NASA Astrophysics Data System (ADS)

    Dohan, D. A.; Osberg, E.; Biggs, R.; Bossom, J.; Chillara, K.; Richter, R.; Wade, D.

    1994-12-01

    The Superconducting Super Collider (SSC) in Texas, the construction of which was stopped in 1994, would have represented a major challenge in accelerator research and development. This paper addresses the issues encountered in the parallel design and construction of the control systems for the RF equipment of the five accelerators comprising the SSC. An extensive analysis of the components of the RF control systems was undertaken, based upon the Shlaer-Mellor object-oriented analysis and design (OOA/OOD) methodology. The RF subsystem components such as amplifiers, tubes, power supplies, PID loops, etc. were analyzed to produce OOA information, behavior and process models. Using these models, OOD was iteratively applied to develop a generic RF control system design. This paper describes the results of this analysis and the development of 'bridges' between the analysis objects and the EPICS-based software and underlying VME-based hardware architectures. The application of this approach to several of the SSCL RF control systems is discussed.

  7. Robotics-assisted mass spectrometry assay platform enabled by open-source electronics.

    PubMed

    Chiu, Shih-Hao; Urban, Pawel L

    2015-02-15

    Mass spectrometry (MS) is an important analytical technique with numerous applications in clinical analysis, biochemistry, environmental analysis, geology and physics. Its success builds on the ability of MS to determine the molecular weights of analytes and elucidate their structures. However, sample handling prior to MS requires considerable attention and labor. In this work we aimed to automate sample processing for MS so that analyses could be conducted with little supervision by experienced analysts. The goal of this study was to develop a robotics and information technology-oriented platform that could control the whole analysis process, including sample delivery, reaction-based assay, data acquisition, and interaction with the analyst. The proposed platform incorporates a robotic arm for handling sample vials delivered to the laboratory, and several auxiliary devices which facilitate and secure the analysis process. They include: a multi-relay board, infrared sensors, photo-interrupters, gyroscopes, force sensors, a fingerprint scanner, a barcode scanner, a touch screen panel, and an internet interface. The control of all the building blocks is achieved through open-source electronics (Arduino) and custom-written programs in the C language. The advantages of the proposed system include low cost, simplicity, small size, and facile automation of sample delivery and processing without the intervention of the analyst. It is envisaged that this simple robotic system may be the forerunner of automated laboratories dedicated to mass spectrometric analysis of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
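
    For flavour, a host-side control script for such a platform might look like the sketch below. The serial port name, command strings and line-based protocol are hypothetical assumptions, not the authors' actual Arduino firmware interface (which runs custom C programs).

    ```python
    import serial  # pyserial

    PORT, BAUD = "/dev/ttyACM0", 9600  # illustrative port and baud rate

    def run_assay(vial_barcode: str) -> None:
        with serial.Serial(PORT, BAUD, timeout=5) as link:
            link.write(b"PICK\n")               # robotic arm: fetch the vial
            link.write(b"SCAN\n")               # barcode scanner: identify it
            reply = link.readline().decode().strip()
            if reply != vial_barcode:
                link.write(b"ABORT\n")          # mismatch: stop before injection
                raise RuntimeError(f"expected {vial_barcode}, got {reply}")
            link.write(b"INJECT\n")             # deliver the sample to the MS inlet

    run_assay("SAMPLE-0042")
    ```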

  8. David A. Sievers | NREL

    Science.gov Websites

    fermentation and refining process. One of his favorite topics is the design and commissioning of custom research equipment. His areas of expertise include: project management; process design, equipment design, and fabrication; instrumentation and controls design and programming; data analysis and presentation

  9. 21 CFR 106.25 - In-process control.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... analyzed as specified in § 106.30(b)(1), the manufacturer shall analyze each in-process batch for: (1) Solids; (2) Protein, fat, and carbohydrates (carbohydrates either by analysis or by mathematical difference); (3) The indicator nutrient(s) in each nutrient premix; (4) Each nutrient added independently of...

  10. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  11. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  12. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  13. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  14. 9 CFR 417.1 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND... control point. A point, step, or procedure in a food process at which control can be applied and, as a...

  15. Environmental Impact Analysis Process, Environmental Assessment Space Test Experiments Platform Mission 1, Vandenberg Air Force Base, CA

    DTIC Science & Technology

    1994-01-05

    Information LOCAL AGENCIES Santa Barbara County Air Pollution Control District, California Hallerman, Richard Air Quality Specialist City of...Base. Provided by Richard Hallerman, SBCAPCD, to Bob Baxter, Engineering-Science. 11 June. ____, 1991. Santa Barbara County Air Pollution Control

  16. An Analysis of the Navy Regional Data Automation Center (NARDAC) chargeback System

    DTIC Science & Technology

    1986-09-01

    addition, operational control is concerned with performing predefined activities whereas management control relates to the organization's goals and...In effect, the management control system monitors the progress of operations and alerts the "appropriate management level" when performance as measured...architecture, the financial control processes, and the audit function (Brandon, 1978; Anderson, 1983). In an operating DP environment, however, non-financial

  17. Red and processed meat consumption and the risk of lung cancer: a dose-response meta-analysis of 33 published studies

    PubMed Central

    Xue, Xiu-Juan; Gao, Qing; Qiao, Jian-Hong; Zhang, Jie; Xu, Cui-Ping; Liu, Ju

    2014-01-01

    This meta-analysis summarizes the published studies on the association between red/processed meat consumption and the risk of lung cancer. Five databases were systematically reviewed, and a random-effects model was used to pool the study results and to assess dose-response relationships. Six cohort studies and twenty-eight case-control studies were included in this meta-analysis. The pooled risk ratios (RR) for total red meat and processed meat were 1.44 (95% CI, 1.29-1.61) and 1.23 (95% CI, 1.10-1.37), respectively. Dose-response analysis revealed that the risk of lung cancer increases by 35% for every increment of 120 grams of red meat per day and by 20% for every increment of 50 grams of processed meat per day. The present dose-response meta-analysis suggests that both red and processed meat consumption are positively associated with lung cancer risk. PMID:25035778
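
    A random-effects pooling of risk ratios of the kind reported above can be sketched with the DerSimonian-Laird estimator. The four studies below are hypothetical placeholders, not data from this meta-analysis.

    ```python
    import numpy as np

    def dersimonian_laird(rr, ci_low, ci_high):
        """Pool risk ratios with a DerSimonian-Laird random-effects model."""
        y = np.log(rr)                                   # log risk ratios
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
        w = 1.0 / se**2                                  # fixed-effect weights
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)   # Cochran's Q
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
        w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
        pooled = np.sum(w_re * y) / np.sum(w_re)
        se_p = np.sqrt(1.0 / np.sum(w_re))
        return tuple(np.exp([pooled, pooled - 1.96 * se_p, pooled + 1.96 * se_p]))

    rr, lo, hi = dersimonian_laird(rr=[1.20, 1.55, 1.35, 1.10],
                                   ci_low=[0.95, 1.10, 1.05, 0.80],
                                   ci_high=[1.52, 2.18, 1.74, 1.51])
    print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```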

  18. Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.

    2017-09-01

    Warpage often occurs during the injection moulding of thin-shell parts, depending on the process conditions. In this study, finite element (FE) analysis, Moldflow analysis and response surface methodology (RSM) were combined in a statistical design-of-experiments approach to minimize the warpage values in x, y and z on thin-shell plastic parts. The battery cover of a remote controller, a thin-shell plastic part produced by injection moulding, served as the case study, and the optimum process conditions were determined so as to minimize warpage. Four parameters were considered: packing pressure, cooling time, melt temperature and mould temperature. A two-level full factorial experimental design was conducted in Design Expert for the RSM analysis to combine these parameters. Analysis of variance (ANOVA) of the FE results identified the process parameters that most influenced warpage. Using RSM, a predictive response surface model for the warpage data is presented.
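
    A minimal RSM sketch: fit a second-order polynomial response surface to designed-experiment data and search it for a warpage minimum. The factor levels and responses below are random placeholders, not the Moldflow/Design Expert results of the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(30, 4))   # coded levels: packing pressure,
                                              # cooling time, melt and mould temps
    y = rng.uniform(0.1, 0.5, size=30)        # warpage responses (placeholder)

    # Quadratic response surface fitted by least squares.
    surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    surface.fit(X, y)

    candidates = rng.uniform(0.0, 1.0, size=(10_000, 4))   # coarse random search
    pred = surface.predict(candidates)
    best = candidates[np.argmin(pred)]
    print(f"predicted minimum warpage {pred.min():.3f} at settings {np.round(best, 2)}")
    ```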

  19. Microscopic Analysis of Activated Sludge. Training Manual.

    ERIC Educational Resources Information Center

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This training manual presents material on the use of a compound microscope to analyze the microscopic communities present in wastewater treatment processes, for operational control. Course topics include: sampling techniques, sample handling, laboratory analysis, identification of organisms, data interpretation, and use of the compound microscope.…

  20. Robotic solid phase extraction and high performance liquid chromatographic analysis of ranitidine in serum or plasma.

    PubMed

    Lloyd, T L; Perschy, T B; Gooding, A E; Tomlinson, J J

    1992-01-01

    A fully automated assay for the analysis of ranitidine in serum and plasma, with and without an internal standard, was validated. It utilizes robotic solid phase extraction with on-line high performance liquid chromatographic (HPLC) analysis. The ruggedness of the assay was demonstrated over a three-year period. A Zymark Py Technology II robotic system was used for serial processing from initial aspiration of samples from original collection containers, to final direct injection onto the on-line HPLC system. Automated serial processing with on-line analysis provided uniform sample history and increased productivity by freeing the chemist to analyse data and perform other tasks. The solid phase extraction efficiency was 94% throughout the assay range of 10-250 ng/mL. The coefficients of variation for within- and between-day quality control samples ranged from 1 to 6% and 1 to 5%, respectively. Mean accuracy for between-day standards and quality control results ranged from 97 to 102% of the respective theoretical concentrations.

  1. Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben

    2014-01-01

    This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend analysis beyond the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low-fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back into a conceptual design process are proposed for future work.

  2. Chemical Structure and Molecular Dimension As Controls on the Inherent Stability of Charcoal in Boreal Forest Soil

    NASA Astrophysics Data System (ADS)

    Hockaday, W. C.; Kane, E. S.; Ohlson, M.; Huang, R.; Von Bargen, J.; Davis, R.

    2014-12-01


  3. Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.

    2015-12-01

    Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio)geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we classify the hyporheic zone based on geology, geochemistry and microbiology, and seek to understand their interactive influences on the integrated biogeochemical distributions and processes. A number of measurements have been taken on 21 freeze-core samples along the Columbia River bank in the Hanford 300 Area, yielding unique datasets on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, and microbial community attributes/modules. To gain a complete understanding of the geological control on these variables and processes, the explanatory variables include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, and geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for the major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition, and its control of microbial and geochemical properties, to larger scales is discussed.
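
    The exploratory workflow described above (standardization, PCA, hierarchical clustering into facies-like classes) can be sketched as follows; the 21×8 data matrix is a random stand-in for the freeze-core measurements.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    cores = rng.normal(size=(21, 8))   # columns might be grain-size moments,
                                       # biomass, pH, %N/C/H/S, ... (placeholder)

    z = StandardScaler().fit_transform(cores)            # standardize variables
    scores = PCA(n_components=3).fit_transform(z)        # dominant gradients
    tree = linkage(scores, method="ward")                # hierarchical clustering
    facies = fcluster(tree, t=3, criterion="maxclust")   # three candidate facies
    print("cluster assignment per core:", facies)
    ```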

  4. Trends in Process Analytical Technology: Present State in Bioprocessing.

    PubMed

    Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian

    2017-08-04

    Process analytical technology (PAT), the regulatory initiative for incorporating quality into pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, it can increase process understanding and control, and mitigate the risk of substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical abstract: hierarchy of QbD components.

  5. A 2D systems approach to iterative learning control for discrete linear processes with zero Markov parameters

    NASA Astrophysics Data System (ADS)

    Hladowski, Lukasz; Galkowski, Krzysztof; Cai, Zhonglun; Rogers, Eric; Freeman, Chris T.; Lewin, Paul L.

    2011-07-01

    In this article a new approach to iterative learning control for the practically relevant case of deterministic discrete linear plants with uniform rank greater than unity is developed. The analysis is undertaken in a 2D systems setting that, by using a strong form of stability for linear repetitive processes, allows simultaneous consideration of both trial-to-trial error convergence and along the trial performance, resulting in design algorithms that can be computed using linear matrix inequalities (LMIs). Finally, the control laws are experimentally verified on a gantry robot that replicates a pick and place operation commonly found in a number of applications to which iterative learning control is applicable.
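
    For orientation, the sketch below implements the classic P-type trial-to-trial update, u_{k+1}(t) = u_k(t) + L * e_k(t), on a scalar plant whose first Markov parameter is nonzero (the easy case). Plants with zero leading Markov parameters, the subject of the paper, require the 2D/LMI machinery described above; the plant, gain and reference here are all illustrative.

    ```python
    import numpy as np

    A, B, C = 0.9, 0.5, 1.0                 # x(t+1) = A x + B u,  y = C x
    T, trials, L = 50, 30, 0.8              # trial length, iterations, gain
    ref = np.sin(np.linspace(0, 2 * np.pi, T + 1))[1:]   # reference trajectory

    u = np.zeros(T)
    for k in range(trials):
        x, y = 0.0, np.zeros(T)
        for t in range(T):
            x = A * x + B * u[t]
            y[t] = C * x                    # output already reflects u[t]
        e = ref - y
        u = u + L * e                       # trial-to-trial learning update
        if k % 10 == 0 or k == trials - 1:
            print(f"trial {k:2d}: ||e|| = {np.linalg.norm(e):.4f}")
    ```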

  6. Biochemical and genetic analysis of the yeast proteome with a movable ORF collection

    PubMed Central

    Gelperin, Daniel M.; White, Michael A.; Wilkinson, Martha L.; Kon, Yoshiko; Kung, Li A.; Wise, Kevin J.; Lopez-Hoyo, Nelson; Jiang, Lixia; Piccirillo, Stacy; Yu, Haiyuan; Gerstein, Mark; Dumont, Mark E.; Phizicky, Eric M.; Snyder, Michael; Grayhack, Elizabeth J.

    2005-01-01

    Functional analysis of the proteome is an essential part of genomic research. To facilitate different proteomic approaches, a MORF (moveable ORF) library of 5854 yeast expression plasmids was constructed, each expressing a sequence-verified ORF as a C-terminal ORF fusion protein, under regulated control. Analysis of 5573 MORFs demonstrates that nearly all verified ORFs are expressed, suggests the authenticity of 48 ORFs characterized as dubious, and implicates specific processes including cytoskeletal organization and transcriptional control in growth inhibition caused by overexpression. Global analysis of glycosylated proteins identifies 109 new confirmed N-linked and 345 candidate glycoproteins, nearly doubling the known yeast glycome. PMID:16322557

  7. Low cost solar array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Technical activities are reported in the design of process, facilities, and equipment for producing silicon at a rate and price commensurate with production goals for low-cost solar cell modules. The silane-to-silicon process has the potential to provide high-purity polysilicon on a commercial scale at a price of fourteen dollars per kilogram by 1986 (in 1980 dollars). Commercial process, economic analysis, process support research and development, and quality control are discussed.

  8. Comparison of automatic control systems

    NASA Technical Reports Server (NTRS)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator, which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulating process.

  9. [Research on optimal modeling strategy for licorice extraction process based on near-infrared spectroscopy technology].

    PubMed

    Wang, Hai-Xia; Suo, Tong-Chuan; Yu, He-Shui; Li, Zheng

    2016-10-01

    The manufacture of traditional Chinese medicine (TCM) products always involves processing complex raw materials and real-time monitoring of the manufacturing process. In this study, we investigated different modeling strategies for the extraction process of licorice. Near-infrared spectra associated with the extraction time were used to determine the states of the extraction process. Three modeling approaches, i.e., principal component analysis (PCA), partial least squares regression (PLSR) and parallel factor analysis-PLSR (PARAFAC-PLSR), were adopted for the prediction of the real-time status of the process. The overall results indicated that PCA, PLSR and PARAFAC-PLSR can effectively detect errors in the extraction procedure and predict the process trajectories, which is of great significance for the monitoring and control of extraction processes. Copyright© by the Chinese Pharmaceutical Association.
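
    A minimal PLSR sketch of the kind of calibration described above: regress a process-state variable on NIR spectra and cross-validate. The spectra are synthetic (one Gaussian band scaled by concentration), standing in for the licorice extraction data.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_samples, n_wavelengths = 60, 200
    conc = rng.uniform(0.0, 1.0, n_samples)              # extraction progress
    band = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 10.0) ** 2)
    X = np.outer(conc, band) + rng.normal(0.0, 0.01, (n_samples, n_wavelengths))

    pls = PLSRegression(n_components=3)
    r2 = cross_val_score(pls, X, conc, cv=5, scoring="r2")
    print(f"cross-validated R^2: {r2.mean():.3f} +/- {r2.std():.3f}")
    ```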

  10. Differences in cumulus cells gene expression between modified natural and stimulated in vitro fertilization cycles.

    PubMed

    Papler, Tanja Burnik; Bokal, Eda Vrtačnik; Tacer, Klementina Fon; Juvan, Peter; Virant Klun, Irma; Devjak, Rok

    2014-01-01

    The aim of our study was to determine whether there are any differences in the cumulus cell gene expression profile of mature oocytes derived from modified natural IVF and controlled ovarian hyperstimulation cycles, and whether these changes could help us understand why modified natural IVF has lower success rates. Cumulus cells surrounding mature oocytes that developed to morulae or blastocysts on day 5 after oocyte retrieval were submitted to microarray analysis. The obtained data were then validated using quantitative real-time PCR. There were 66 differentially expressed genes between cumulus cells of modified natural IVF and controlled ovarian hyperstimulation cycles. Gene ontology analysis revealed that the oxidation-reduction process, glutathione metabolic process, xenobiotic metabolic process and gene expression were significantly enriched biological processes in modified natural IVF cycles. Among the differentially expressed genes we observed a large group of small nucleolar RNAs whose role in folliculogenesis has not yet been established. The increased expression of genes involved in the oxidation-reduction process probably points to hypoxic conditions in modified natural IVF cycles. This finding opens up new perspectives for establishing the potential role that oxidation-reduction processes play in determining the success rates of modified natural IVF.

  11. Virtual reality measures in neuropsychological assessment: a meta-analytic review.

    PubMed

    Neguț, Alexandra; Matu, Silviu-Andrei; Sava, Florin Alin; David, Daniel

    2016-02-01

    Virtual reality-based assessment is a new paradigm for neuropsychological evaluation that might provide a more ecological assessment than paper-and-pencil or computerized neuropsychological testing. Previous research has focused on the use of virtual reality in neuropsychological assessment, but no meta-analysis has examined the sensitivity of virtual reality-based measures of cognitive processes across various populations. We found eighteen studies that compared cognitive performance between clinical patients and healthy controls on virtual reality measures. Based on a random-effects model, the results indicated a large effect size in favor of healthy controls (g = .95). For executive functions, memory and visuospatial analysis, subgroup analysis revealed moderate to large effect sizes, with superior performance in the case of healthy controls. Participants' mean age, type of clinical condition, type of exploration within virtual reality environments, and the presence of distractors were significant moderators. Our findings support the sensitivity of virtual reality-based measures in detecting cognitive impairment. They highlight the possibility of using virtual reality measures for neuropsychological assessment in research applications, as well as in clinical practice.

  12. Sensing in tissue bioreactors

    NASA Astrophysics Data System (ADS)

    Rolfe, P.

    2006-03-01

    Specialized sensing and measurement instruments are under development to aid the controlled culture of cells in bioreactors for the fabrication of biological tissues. Precisely defined physical and chemical conditions are needed for the correct culture of the many cell-tissue types now being studied, including chondrocytes (cartilage), vascular endothelial cells and smooth muscle cells (blood vessels), fibroblasts, hepatocytes (liver) and receptor neurones. Cell and tissue culture processes are dynamic and therefore, optimal control requires monitoring of the key process variables. Chemical and physical sensing is approached in this paper with the aim of enabling automatic optimal control, based on classical cell growth models, to be achieved. Non-invasive sensing is performed via the bioreactor wall, invasive sensing with probes placed inside the cell culture chamber and indirect monitoring using analysis within a shunt or a sampling chamber. Electroanalytical and photonics-based systems are described. Chemical sensing for gases, ions, metabolites, certain hormones and proteins, is under development. Spectroscopic analysis of the culture medium is used for measurement of glucose and for proteins that are markers of cell biosynthetic behaviour. Optical interrogation of cells and tissues is also investigated for structural analysis based on scatter.

  13. Proteomic analysis of Taenia ovis metacestodes by high performance liquid chromatography-coupled tandem mass spectrometry.

    PubMed

    Zheng, Yadong

    2017-03-15

    Taenia ovis metacestodes reside in the muscle of sheep and goats, and may cause great economic loss due to condemnation of carcasses if not effectively controlled. Although advances have been made in the control of T. ovis infection, our knowledge of T. ovis biology is limited. Herein the protein profiling of T. ovis metacestodes was determined by liquid chromatography-linked tandem mass spectrometry. A total of 966 proteins were identified and 25.1% (188/748) were annotated to be associated with metabolic pathways. Consistently, GO analysis returned a metabolic process (16.27%) as one of two main biological process terms. Moreover, it was found that 24 proteins, including very low-density lipoprotein receptor, enolase, paramyosin and endophilin B1, were abundant in T. ovis metacestodes. These proteins may be associated with motility, metabolism, signaling, stress, drug resistance and immune responses. Furthermore, comparative analysis of 5 cestodes revealed the presence of Taenia-specific enolases. These data provide clues for better understanding of T. ovis biology, which is informative for effective control of infection. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. [Conceptual self-management analysis of hypertensive individuals].

    PubMed

    Balduino, Anice de Fátima Ahmad; Mantovani, Maria de Fátima; Lacerda, Maria Ribeiro; Meier, Marineli Joaquim

    2013-12-01

    This research aimed to analyze the concept of self-management in hypertensive individuals. It is a theoretical and documentary study based on Walker and Avant's conceptual analysis, conducted through the Scientific Electronic Library Brazil and the Medical Literature Analysis and Retrieval System Online, via the Coordination for Higher Education Personnel Development (CAPES, in Portuguese) and the National Library of Medicine websites. Fourteen (14) articles and one (1) thesis in Portuguese and English, from the period January 2007 to September 2012, were selected and reviewed. Antecedents: missed doctor's appointments; non-compliance with blood pressure control treatment, with recommendations and with proper diet standards; and stress. Attributes: blood pressure control and disease management. Consequences: home monitoring of blood pressure with improved control, accomplishment of disease management, and compliance with and sharing of the creation of self-management goals and caring activities by the interdisciplinary team through individualized actions. It was concluded that self-management is a dynamic, active process which requires knowledge, attitude, discipline, determination, commitment, self-regulation, empowerment and self-efficacy in order to manage the disease and achieve healthy living.

  15. Inventory Control System for a Healthcare Apparel Service Centre with Stockout Risk: A Case Analysis

    PubMed Central

    Hui, Chi-Leung

    2017-01-01

    Based on the real-world inventory control problem of a capacitated healthcare apparel service centre in Hong Kong which provides tailor-made apparel-making services for elderly and disabled people, this paper studies a partially backordered continuous review inventory control problem in which the product demand follows a Poisson process with a constant lead time. The system is controlled by a (Q, r) inventory policy which incorporates the stockout risk, storage capacity, and partial backlog. The healthcare apparel service centre, under the capacity constraint, aims to minimize the inventory cost while achieving a low stockout risk. To address this challenge, an optimization problem is constructed. A real case-based data analysis is conducted, and the result shows that the expected total cost on an order cycle is reduced substantially, by around 20%, with the proposed optimal inventory control policy. An extensive sensitivity analysis is conducted to generate additional insights. PMID:29527283

  16. Inventory Control System for a Healthcare Apparel Service Centre with Stockout Risk: A Case Analysis.

    PubMed

    Pan, An; Hui, Chi-Leung

    2017-01-01

    Based on the real-world inventory control problem of a capacitated healthcare apparel service centre in Hong Kong which provides tailor-made apparel-making services for elderly and disabled people, this paper studies a partially backordered continuous review inventory control problem in which the product demand follows a Poisson process with a constant lead time. The system is controlled by a (Q, r) inventory policy which incorporates the stockout risk, storage capacity, and partial backlog. The healthcare apparel service centre, under the capacity constraint, aims to minimize the inventory cost while achieving a low stockout risk. To address this challenge, an optimization problem is constructed. A real case-based data analysis is conducted, and the result shows that the expected total cost on an order cycle is reduced substantially, by around 20%, with the proposed optimal inventory control policy. An extensive sensitivity analysis is conducted to generate additional insights.
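
    A textbook-style sketch of the (Q, r) trade-off under Poisson lead-time demand with partial backlogging follows; the cost coefficients, demand rate and approximations are illustrative, not the service centre's actual model.

    ```python
    from math import exp, factorial

    def poisson_pmf(k, lam):
        return lam**k * exp(-lam) / factorial(k)

    def cost_rate(Q, r, lam, K=50.0, h=2.0, pb=10.0, pl=30.0, beta=0.7):
        """Approximate long-run cost per unit time of a (Q, r) policy.

        K: ordering cost, h: holding cost, pb/pl: backorder and lost-sale
        penalties, beta: fraction of shortages that are backlogged.
        Demand over the constant lead time is Poisson with mean lam.
        """
        # Expected units short per cycle: E[(X - r)+] for X ~ Poisson(lam).
        shortage = sum((k - r) * poisson_pmf(k, lam) for k in range(r + 1, r + 80))
        penalty = beta * pb + (1.0 - beta) * pl
        cycles_per_time = lam / Q                 # orders placed per unit time
        return (K * cycles_per_time
                + h * max(Q / 2.0 + r - lam, 0.0)   # average on-hand stock
                + penalty * shortage * cycles_per_time)

    best = min(((Q, r) for Q in range(5, 40) for r in range(0, 15)),
               key=lambda qr: cost_rate(*qr, lam=6.0))
    print("least-cost (Q, r):", best, "cost rate:", round(cost_rate(*best, lam=6.0), 2))
    ```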

  17. A study of discrete control signal fault conditions in the shuttle DPS

    NASA Technical Reports Server (NTRS)

    Reddi, S. S.; Retter, C. T.

    1976-01-01

    An analysis of the effects of discrete failures on the data processing subsystem is presented. A functional description of each discrete together with a list of software modules that use this discrete are included. A qualitative description of the consequences that may ensue due to discrete failures is given followed by a probabilistic reliability analysis of the data processing subsystem. Based on the investigation conducted, recommendations were made to improve the reliability of the subsystem.

  18. Radar cross section studies/compact range research

    NASA Technical Reports Server (NTRS)

    Burnside, W. D.; Dominek, A. K.; Gupta, I. J.; Newman, E. H.; Pathak, P. H.; Peters, L., Jr.

    1989-01-01

    Achievements in advancing the state-of-the-art in the measurement, control, and analysis of electromagnetic scattering from general aerodynamic targets are summarized. The major topics associated with this study include: (1) electromagnetic scattering analysis; (2) indoor scattering measurement systems; (3) RCS control; (4) waveform processing techniques; (5) material scattering and design studies; (6) design and evaluation of standard targets; and (7) antenna studies. Progress in each of these areas is reported and related publications are listed.

  19. An integrated microfluidic analysis microsystems with bacterial capture enrichment and in-situ impedance detection

    NASA Astrophysics Data System (ADS)

    Liu, Hai-Tao; Wen, Zhi-Yu; Xu, Yi; Shang, Zheng-Guo; Peng, Jin-Lan; Tian, Peng

    2017-09-01

    In this paper, an integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection is proposed, based on the microfluidic-chip dielectrophoresis technique and the principle of electrochemical impedance detection. The microsystem includes a microfluidic chip, a main control module, a drive and control module, a signal detection and processing module, and a result display unit. The main control module produces the work sequence of the impedance detection system parts and handles data communication; the drive and control circuit generates an AC signal with adjustable amplitude and frequency, which is applied to the foodborne-pathogen impedance analysis microsystem to realize capture enrichment and impedance detection. The signal detection and processing circuit translates the current signal into the impedance of the bacteria and transfers it to a computer, where the final detection result is displayed. The experimental sample was prepared by adding an Escherichia coli standard sample to a chicken sample solution, and the samples were tested on the dielectrophoresis-chip capture enrichment and in-situ impedance detection microsystem with micro-array electrode microfluidic chips. The experiments show that the Escherichia coli detection limit of the microsystem is 5 × 10^4 CFU/mL and the detection time is within 6 min under the optimized operating conditions of 10 V detection voltage and 500 kHz detection frequency. The integrated microfluidic analysis microsystem lays a solid foundation for rapid, real-time, in-situ detection of bacteria.

  20. Mineral status in canine medial coronoid process disease: a cohort study using analysis of hair by mass spectrometry.

    PubMed

    Davies, M; West, J; Williams, C; Gardner, D S

    2017-05-06

    In several species, developmental skeletal diseases involving abnormal endochondral ossification have been associated with imbalanced mineral intake, and hair analysis reflects long-term mineral status. The objective was to determine the mineral content of hair from dogs with or without medial coronoid process disease (MCPD), under the hypothesis that dogs with MCPD have a different profile of the minerals known to influence metalloenzymes involved in endochondral ossification. After cleansing, chelation and acid digestion of hair samples (n=79 in total: control dogs, n=70 v MCPD, n=9), the mineral profile (7 major and 25 trace elements) was determined by inductively coupled plasma-mass spectrometry. Dogs were of similar age (control, 4.05 [1.85-7.70] v MCPD, 4.30 [3.25-6.53] median (IQR) years; P=0.78) and gender (control, n=43/27 v MCPD, n=4/5 males/females). 28/70 (40 per cent) of control and 8/9 (88 per cent) of MCPD dogs were neutered, respectively. Hair from dogs with MCPD contained significantly lower amounts (µg/g DM) of copper, sulphur and zinc (all at P<0.001). Age, sex and neutered status had no effect on hair mineral status. Based on hair analysis, a role for imbalance of minerals including copper, sulphur and zinc in the aetiopathogenesis of canine MCPD is suggested. Hair mineral analysis may prove useful as a biomarker for susceptible puppies. British Veterinary Association.

  1. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    PubMed

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.

  2. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a form of scientific method: empirical, inductive, deductive and systematic, relying on data and facts. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, in which 11 samples were repetitively analysed and Certified Reference Material (CRM) was included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline and a standardised approach to problem solving and process optimisation.

  3. ADP SYSTEMS ANALYSIS - COMMITTED VS. AVAILABLE MILITARY TRANSPORTATION (LMI T1).

    DTIC Science & Technology

    (*LOGISTICS, *MANAGEMENT ENGINEERING), (*DATA PROCESSING, LOGISTICS), INFORMATION RETRIEVAL, SYSTEMS ENGINEERING, MILITARY TRANSPORTATION, CARGO VEHICLES, SCHEDULING, COMPUTER PROGRAMMING, MANAGEMENT PLANNING AND CONTROL

  4. Optical Fourier diffractometry applied to degraded bone structure recognition

    NASA Astrophysics Data System (ADS)

    Galas, Jacek; Godwod, Krzysztof; Szawdyn, Jacek; Sawicki, Andrzej

    1993-09-01

    Image processing and recognition methods are useful in many fields. This paper presents a hybrid optical and digital method applied to the recognition of pathological changes in bones caused by metabolic bone diseases. The trabecular bone structure, registered by X-ray on photographic film, is analyzed in a new type of computer-controlled diffractometer. The set of image parameters extracted from the diffractogram is evaluated by statistical analysis. Synthetic image descriptors in a discriminant space, constructed by discriminant analysis on the basis of three training groups of images (control, osteoporosis and osteomalacia groups), allow bone samples with degraded structure to be recognized and the disease to be identified. About 89% of the images were classified correctly. After optimization, this method will be verified in medical investigations.

  5. [Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].

    PubMed

    Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia

    2008-07-01

    Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirement for rapid and on-line analysis because of their complex pretreatment processes and the experience they demand. In recent years, near-infrared spectroscopy has been used for rapid determination of active components, on-line quality control, identification of counterfeits, discrimination of the geographical origins of herbal medicines, and so on, owing to its advantages of simple pretreatment, high efficiency, and the convenience of solid diffuse-reflection spectroscopy and fiber optics. In the present paper, the principles and methods of near-infrared spectroscopy are introduced concisely. In particular, the applications of this technique in quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.

  6. Experimental power spectral density analysis for mid- to high-spatial frequency surface error control.

    PubMed

    Hoyo, Javier Del; Choi, Heejoo; Burge, James H; Kim, Geon-Hee; Kim, Dae Wook

    2017-06-20

    The control of surface errors as a function of spatial frequency is critical during the fabrication of modern optical systems. Large-scale surface figure error is controlled by a guided removal process, such as computer-controlled optical surfacing, while smaller-scale surface errors are controlled by the polishing process parameters. Surface errors with spatial periods of only a few millimeters may degrade the performance of an optical system, causing background noise from scattered light and reducing imaging contrast for large optical systems. Conventionally, microsurface roughness is often reported as the root mean square over a high spatial frequency range, with errors within a 0.5×0.5 mm local surface map of 500×500 pixels. This surface specification is not adequate to fully describe the characteristics required for advanced optical systems. The process for controlling and minimizing mid- to high-spatial-frequency surface errors with periods of up to ∼2-3 mm was investigated for many optical fabrication conditions using the measured surface power spectral density (PSD) of a finished Zerodur optical surface. The surface PSD was then systematically related to various fabrication process parameters, such as the grinding methods, polishing interface materials, and polishing compounds. The retraceable experimental polishing conditions and processes used to produce an optimal optical surface PSD are presented.
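
    A one-dimensional surface PSD of the kind used above can be estimated from a height profile with an FFT periodogram. The profile below is synthetic (a 2 mm-period ripple plus noise), not metrology data from the Zerodur surface discussed in the paper.

    ```python
    import numpy as np

    L, n = 10.0e-3, 1000                     # 10 mm scan, 1000 points
    x = np.linspace(0.0, L, n, endpoint=False)
    dx = x[1] - x[0]
    profile = 5e-9 * np.sin(2 * np.pi * x / 2e-3) \
              + 1e-9 * np.random.default_rng(3).normal(size=n)

    window = np.hanning(n)                   # taper to reduce spectral leakage
    z = (profile - profile.mean()) * window
    spec = np.fft.rfft(z)
    freq = np.fft.rfftfreq(n, d=dx)          # spatial frequency [1/m]
    psd = np.abs(spec) ** 2 * dx / (n * np.mean(window ** 2))  # window-corrected

    peak = freq[np.argmax(psd[1:]) + 1]      # skip the DC bin
    print(f"dominant spatial frequency ~ {peak:.0f} 1/m (period ~ {1e3 / peak:.1f} mm)")
    ```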

  7. The Launch Processing System for Space Shuttle.

    NASA Technical Reports Server (NTRS)

    Springer, D. A.

    1973-01-01

    In order to reduce costs and accelerate vehicle turnaround, a single automated system will be developed to support shuttle launch site operations, replacing a multiplicity of systems used in previous programs. The Launch Processing System will provide real-time control, data analysis, and information display for the checkout, servicing, launch, landing, and refurbishment of the launch vehicles, payloads, and all ground support systems. It will also provide real-time and historical data retrieval for management and sustaining engineering (test records and procedures, logistics, configuration control, scheduling, etc.).

  8. Analysis of Management Control Techniques for the Data Processing Department at the Navy Finance Center, Cleveland, Ohio.

    DTIC Science & Technology

    1983-03-01

    System are: Order processing coordinators, Order processing management, Credit and collections, Accounts receivable, Support management, Admin management... or sales secretary, then by order processing (OP). Phone-in orders go directly to OP. The information is next transcribed onto an order entry... ORDER PROCESSING: The central systems validate the order items and codes by processing them against the customer file, the product or parts file, and

  9. A New Hybrid BFOA-PSO Optimization Technique for Decoupling and Robust Control of Two-Coupled Distillation Column Process.

    PubMed

    Abdelkarim, Noha; Mohamed, Amr E; El-Garhy, Ahmed M; Dorrah, Hassen T

    2016-01-01

    The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostly, such a process is to be decoupled into several input/output pairings (loops), so that a single controller can be assigned for each loop. In the frame of this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for decoupling and control schemes, which ensures robust control behavior. In this regard, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized for the minimization of summation of the integral time-weighted squared errors (ITSEs) for all control loops. This optimization technique constitutes a hybrid between two techniques, which are the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures low mathematical burdens and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in the time domain behavior and robustness over the conventional PID controller.

  10. A New Hybrid BFOA-PSO Optimization Technique for Decoupling and Robust Control of Two-Coupled Distillation Column Process

    PubMed Central

    Mohamed, Amr E.; Dorrah, Hassen T.

    2016-01-01

    The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostly, such a process is to be decoupled into several input/output pairings (loops), so that a single controller can be assigned for each loop. In the frame of this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for decoupling and control schemes, which ensures robust control behavior. In this regard, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized for the minimization of summation of the integral time-weighted squared errors (ITSEs) for all control loops. This optimization technique constitutes a hybrid between two techniques, which are the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures low mathematical burdens and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in the time domain behavior and robustness over the conventional PID controller. PMID:27807444
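
    To make the cost function concrete, the sketch below tunes a conventional PID loop by minimizing the ITSE with a plain particle swarm on a first-order plant. It is a toy baseline, not the paper's BELBIC controller, BSO hybrid, or distillation-column model; all gains and plant parameters are illustrative.

    ```python
    import numpy as np

    def itse(gains, dt=0.05, t_end=10.0):
        """Integral of t * e(t)^2 for a PID loop on a first-order plant."""
        kp, ki, kd = gains
        y = integ = e_prev = cost = 0.0
        for i in range(int(t_end / dt)):
            t = i * dt
            e = 1.0 - y                          # unit step reference
            integ += e * dt
            deriv = (e - e_prev) / dt
            u = kp * e + ki * integ + kd * deriv
            y += dt * (u - y) / 1.5              # plant: tau = 1.5, unity gain
            e_prev = e
            cost += t * e * e * dt
        return cost

    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 5.0, size=(20, 3))    # 20 particles of (kp, ki, kd)
    vel = np.zeros_like(pos)
    pbest, pcost = pos.copy(), np.array([itse(p) for p in pos])
    for _ in range(40):
        gbest = pbest[np.argmin(pcost)]
        vel = (0.7 * vel + 1.5 * rng.random(pos.shape) * (pbest - pos)
                         + 1.5 * rng.random(pos.shape) * (gbest - pos))
        pos = np.clip(pos + vel, 0.0, 5.0)
        cost = np.array([itse(p) for p in pos])
        improved = cost < pcost
        pbest[improved], pcost[improved] = pos[improved], cost[improved]
    print("best (kp, ki, kd):", np.round(pbest[np.argmin(pcost)], 2),
          "ITSE =", round(pcost.min(), 4))
    ```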

  11. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.

  12. U.K. Foot and Mouth Disease: A Systemic Risk Assessment of Existing Controls.

    PubMed

    Delgado, João; Pollard, Simon; Pearn, Kerry; Snary, Emma L; Black, Edgar; Prpich, George; Longhurst, Phil

    2017-09-01

    This article details a systemic analysis of the controls in place and possible interventions available to further reduce the risk of a foot and mouth disease (FMD) outbreak in the United Kingdom. Using a research-based network analysis tool, we identify vulnerabilities within the multibarrier control system and their corresponding critical control points (CCPs). CCPs represent opportunities for active intervention that produce the greatest improvement to United Kingdom's resilience to future FMD outbreaks. Using an adapted 'features, events, and processes' (FEPs) methodology and network analysis, our results suggest that movements of animals and goods associated with legal activities significantly influence the system's behavior due to their higher frequency and ability to combine and create scenarios of exposure similar in origin to the U.K. FMD outbreaks of 1967/8 and 2001. The systemic risk assessment highlights areas outside of disease control that are relevant to disease spread. Further, it proves to be a powerful tool for demonstrating the need for implementing disease controls that have not previously been part of the system. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  13. Cyber-physical system for a water reclamation plant: Balancing aeration, energy, and water quality to maintain process resilience

    NASA Astrophysics Data System (ADS)

    Zhu, Junjie

    Aeration accounts for a large fraction of energy consumption in conventional water reclamation plants (WRPs). Although process operations at older WRPs can satisfy effluent permit requirements, they typically operate with excess aeration. More effective process control at older WRPs can be challenging as operators work to balance higher energy costs and more stringent effluent limitations while managing fluctuating loads. Understanding process resilience, the ability to quickly return to original operating conditions, is therefore important: a state-of-the-art WRP should maintain process resilience against different kinds of perturbations even after its energy demands are optimized. This work evaluated the applicability and feasibility of a cyber-physical system (CPS) for improving operation at the Metropolitan Water Reclamation District of Greater Chicago (MWRDGC) Calumet WRP. A process model was developed and used to better understand the conditions of the current Calumet WRP, supplemented by valuable information from two dissolved-oxygen field measurements. Meanwhile, a classification system was developed to reveal patterns in historical influent scenarios based on cluster analysis and cross-tabulation analysis, and typical process control options were investigated on the basis of the classification results. To ensure the feasibility of information acquisition, the reliability and flexibility of soft sensors were assessed under typical influent conditions. Finally, process resilience was investigated to better balance influent perturbations, energy demands, and effluent quality for long-term operation. These investigations show that the energy demands change with influent conditions and process controls: in general, aeration savings of up to 50% of current consumption are achievable, and with more complex process controls the saving could reach 70% under relatively steady-state conditions and at least 40% under challenging transient conditions. The soft sensors provide reliable and flexible target predictions, and the plant can maintain a similar level of process resilience after a 50% aeration saving, even during long-term perturbations. Overall, this work shows that more cost-effective operation of the Calumet WRP is feasible while keeping influent perturbations, effluent quality, and process resilience in balance.

  14. A system framework of inter-enterprise machining quality control based on fractal theory

    NASA Astrophysics Data System (ADS)

    Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng

    2014-03-01

    To meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework for inter-enterprise machining quality control based on fractal theory was proposed. In this framework, the fractal characteristics of the inter-enterprise machining quality control function were analysed, and a model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, a goal-driven strategy for inter-enterprise quality control and a dynamic organisation strategy for inter-enterprise quality improvement were constructed through characteristic analysis of this model. In addition, the architecture of inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study of the application was presented. The results showed that the proposed method is effective and can provide guidance for quality control and support for product reliability in inter-enterprise machining processes.

  15. An explorative analysis of the recruitment of patients to a randomised controlled trial in adolescents with dental anxiety.

    PubMed

    Wide Boman, Ulla; Broberg, Anders G; Krekmanova, Larisa; Staberg, Marie; Svensson, Carina; Robertson, Agneta

    2014-01-01

    Randomised controlled trials (RCTs) are considered to provide the most reliable evidence on the efficacy of interventions. The aim of this study was to describe the recruitment process of an RCT set up to evaluate a Cognitive Behavioural Therapy (CBT) intervention programme for adolescent patients with dental anxiety (DA). The participants were recruited from a consecutive sample of adolescent patients (12-19 yrs old) referred for DA to a specialised pediatric dentistry clinic. Age, gender, and reason for referral were recorded for the potentially eligible patients as part of the drop-out analysis of the recruitment process. Participants were then randomized to the intervention (CBT integrated with dental treatment) or control (adapted dental treatment) condition. In the recruitment process, 138 potentially eligible patients met the inclusion criteria; of these, 55 were enrolled, 44 declined participation and 39 patients were excluded. The patients enrolled in the RCT did not differ from the non-participants with regard to age, gender or cause of referral. As a result of difficulties in the recruitment process, the study period was extended. The considerable proportion of non-participants evident from the recruitment process may pose a threat to the external validity of the clinical trial. From a clinical perspective, the reasons for the lack of motivation to participate in behavioural interventions and the failure to appear warrant further investigation.
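
    The drop-out analysis described, comparing enrolled and non-enrolled patients on a categorical factor such as gender, can be sketched with a chi-square test of independence; the counts below are invented, not the study's data.

```python
# Hedged illustration of a participant vs. non-participant comparison.
from scipy.stats import chi2_contingency

#                 female  male
table = [[30, 25],   # enrolled (n=55)
         [45, 38]]   # declined or excluded (n=83)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
# A non-significant p-value is consistent with the reported finding that
# participants did not differ from non-participants on this factor.
```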

  16. The environmental control and life support system advanced automation project. Phase 1: Application evaluation

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1990-01-01

    The Environmental Control and Life Support System (ECLSS) is a Freedom Station distributed system with inherent applicability to advanced automation primarily due to the comparatively large reaction times of its subsystem processes. This allows longer contemplation times in which to form a more intelligent control strategy and to detect or prevent faults. The objective of the ECLSS Advanced Automation Project is to reduce the flight and ground manpower needed to support the initial and evolutionary ECLS system. The approach is to search out and make apparent those processes in the baseline system which are in need of more automatic control and fault detection strategies, to influence the ECLSS design by suggesting software hooks and hardware scars which will allow easy adaptation to advanced algorithms, and to develop complex software prototypes which fit into the ECLSS software architecture and will be shown in an ECLSS hardware testbed to increase the autonomy of the system. Covered here are the preliminary investigation and evaluation process, aimed at searching the ECLSS for candidate functions for automation and providing a software hooks and hardware scars analysis. This analysis shows changes needed in the baselined system for easy accommodation of knowledge-based or other complex implementations which, when integrated in flight or ground sustaining engineering architectures, will produce a more autonomous and fault tolerant Environmental Control and Life Support System.

  17. Optimizing Web-Based Instruction: A Case Study Using Poultry Processing Unit Operations

    ERIC Educational Resources Information Center

    O' Bryan, Corliss A.; Crandall, Philip G.; Shores-Ellis, Katrina; Johnson, Donald M.; Ricke, Steven C.; Marcy, John

    2009-01-01

    Food companies and supporting industries need inexpensive, revisable training methods for large numbers of hourly employees due to continuing improvements in Hazard Analysis Critical Control Point (HACCP) programs, new processing equipment, and high employee turnover. HACCP-based food safety programs have demonstrated their value by reducing the…

  18. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  19. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  20. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  1. AFRC2017-0076-1

    NASA Image and Video Library

    2017-04-04

    NASA Armstrong’s Mission Control Center, or MCC, is where the culmination of all data gathering occurs. Engineers, flight controllers and researchers monitor flights and missions as they are carried out. Data and video run through the MCC and are recorded, displayed and archived. Data are then processed and prepared for post-flight analysis.

  2. 78 FR 72626 - Notice of Request for Renewal of a Currently Approved Information Collection (Pathogen Reduction...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... reduction and Hazard Analysis and Critical Control Point (HACCP) Systems requirements because OMB approval... February 28, 2014. FSIS has established requirements applicable to meat and poultry establishments designed.... coli by slaughter establishments to verify the adequacy of the establishment's process controls for the...

  3. What Students Learn from Hands-On Activities

    ERIC Educational Resources Information Center

    Schwichow, Martin; Zimmerman, Corinne; Croker, Steve; Härtig, Hendrik

    2016-01-01

    The ability to design and interpret controlled experiments is an important scientific process skill and a common objective of science standards. Numerous intervention studies have investigated how the control-of-variables-strategy (CVS) can be introduced to students. However, a meta-analysis of 72 intervention studies found that the opportunity to…

  4. Analysis of ChIP-seq Data in R/Bioconductor.

    PubMed

    de Santiago, Ines; Carroll, Thomas

    2018-01-01

    The development of novel high-throughput sequencing methods for ChIP (chromatin immunoprecipitation) has provided a very powerful tool to study gene regulation in multiple conditions at unprecedented resolution and scale. Proactive quality control and appropriate data analysis techniques are of critical importance to extract the most meaningful results from the data. In recent years, an array of R/Bioconductor tools has been developed allowing researchers to process and analyze ChIP-seq data. This chapter provides an overview of the methods available to analyze ChIP-seq data based primarily on software packages from the open-source Bioconductor project. Protocols described in this chapter cover basic steps including data alignment, peak calling, quality control and data visualization, as well as more complex methods such as the identification of differentially bound regions and functional analyses to annotate regulatory regions. The steps in the data analysis process are illustrated on publicly available data sets and serve as a demonstration of the computational procedures routinely used for the analysis of ChIP-seq data in R/Bioconductor, from which readers can construct their own analysis pipelines.

  5. Electric terminal performance and characterization of solid oxide fuel cells and systems

    NASA Astrophysics Data System (ADS)

    Lindahl, Peter Allan

    Solid Oxide Fuel Cells (SOFCs) are electrochemical devices which can effect efficient, clean, and quiet conversion of chemical to electrical energy. In contrast to conventional electricity generation systems which feature multiple discrete energy conversion processes, SOFCs are direct energy conversion devices. That is, they feature a fully integrated chemical to electrical energy conversion process where the electric load demanded of the cell intrinsically drives the electrochemical reactions and associated processes internal to the cell. As a result, the cell's electric terminals provide a path for interaction between load side electric demand and the conversion side processes. The implication of this is twofold. First, the magnitude and dynamic characteristics of the electric load demanded of the cell can directly impact the long-term efficacy of the cell's chemical to electrical energy conversion. Second, the electric terminal response to dynamic loads can be exploited for monitoring the cell's conversion side processes and used in diagnostic analysis and degradation-mitigating control schemes. This dissertation presents a multi-tier investigation into this electric terminal based performance characterization of SOFCs through the development of novel test systems, analysis techniques and control schemes. First, a reference-based simulation system is introduced. This system scales up the electric terminal performance of a prototype SOFC system, e.g. a single fuel cell, to that of a full power-level stack. This allows realistic stack/load interaction studies while maintaining explicit ability for post-test analysis of the prototype system. Next, a time-domain least squares fitting method for electrochemical impedance spectroscopy (EIS) is developed for reduced-time monitoring of the electrochemical and physicochemical mechanics of the fuel cell through its electric terminals. The utility of the reference-based simulator and the EIS technique are demonstrated through their combined use in the performance testing of a hybrid-source power management (HSPM) system designed to allow in-situ EIS monitoring of a stack under dynamic loading conditions. The results from the latter study suggest that an HSPM controller allows an opportunity for in-situ electric terminal monitoring and control-based mitigation of SOFC degradation. As such, an exploration of control-based SOFC degradation mitigation is presented and ideas for further work are suggested.
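
    To make the impedance-fitting idea concrete, the sketch below performs a least-squares fit of a simple Rs + (Rct parallel Cdl) equivalent circuit to a synthetic spectrum. This only illustrates the fitting approach; the dissertation's time-domain formulation and SOFC measurements are not reproduced, and all parameter values are invented.

```python
# Minimal EIS fitting sketch against a Randles-type equivalent circuit.
import numpy as np
from scipy.optimize import least_squares

def z_model(params, w):
    rs, rct, cdl = params
    return rs + rct / (1.0 + 1j * w * rct * cdl)

# Synthetic "measured" spectrum from known parameters plus noise.
w = np.logspace(-1, 5, 60)                      # angular frequencies, rad/s
z_true = z_model([0.05, 0.30, 1e-2], w)
rng = np.random.default_rng(1)
z_meas = z_true + 1e-3 * (rng.standard_normal(60) + 1j * rng.standard_normal(60))

def residuals(params):
    dz = z_model(params, w) - z_meas
    return np.concatenate([dz.real, dz.imag])   # stack real and imaginary parts

fit = least_squares(residuals, x0=[0.1, 0.1, 1e-3], bounds=(0, np.inf))
print("R_s, R_ct, C_dl =", fit.x)
```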

  6. Identification of genes involved in the ACC-mediated control of root cell elongation in Arabidopsis thaliana

    PubMed Central

    2012-01-01

    Background Along the root axis of Arabidopsis thaliana, cells pass through different developmental stages. In the apical meristem repeated cycles of division increase the numbers of cells. Upon leaving the meristem, these cells pass the transition zone where they are physiologically and mechanically prepared to undergo subsequent rapid elongation. During the process of elongation epidermal cells increase their length by 300% in a couple of hours. When elongation ceases, the cells acquire their final size, shape and functions (in the differentiation zone). Ethylene administered as its precursor 1-aminocyclopropane-1-carboxylic acid (ACC) is capable of inhibiting elongation in a concentration-dependent way. Using a microarray analysis, genes and/or processes involved in this elongation arrest are identified. Results Using a CATMA-microarray analysis performed on control and 3h ACC-treated roots, 240 differentially expressed genes were identified. Quantitative Real-Time RT-PCR analysis of the 10 most up and down regulated genes combined with literature search confirmed the accurateness of the analysis. This revealed that inhibition of cell elongation is, at least partly, caused by restricting the events that under normal growth conditions initiate elongation and by increasing the processes that normally stop cellular elongation at the end of the elongation/onset of differentiation zone. Conclusions ACC interferes with cell elongation in the Arabidopsis thaliana roots by inhibiting cells from entering the elongation process and by immediately stimulating the formation of cross-links in cell wall components, diminishing the remaining elongation capacity. From the analysis of the differentially expressed genes, it becomes clear that many genes identified in this response, are also involved in several other kind of stress responses. This suggests that many responses originate from individual elicitors, but that somewhere in the downstream signaling cascade, these are converged to a ’common pathway’. Furthermore, several potential keyplayers, such as transcription factors and auxin-responsive genes, were identified by the microarray analysis. They await further analysis to reveal their exact role in the control of cell elongation. PMID:23134674

  7. Distributed Secure Coordinated Control for Multiagent Systems Under Strategic Attacks.

    PubMed

    Feng, Zhi; Wen, Guanghui; Hu, Guoqiang

    2017-05-01

    This paper studies a distributed secure consensus tracking control problem for multiagent systems subject to strategic cyber attacks modeled by a random Markov process. A hybrid stochastic secure control framework is established for designing a distributed secure control law such that mean-square exponential consensus tracking is achieved. A connectivity restoration mechanism is considered, and the properties of attack frequency and attack length rate are investigated, respectively. Based on the solutions of an algebraic Riccati equation and an algebraic Riccati inequality, a procedure to select the control gains is provided, and stability analysis is carried out using Lyapunov's method. The effect of strategic attacks on discrete-time systems is also investigated. Finally, numerical examples are provided to illustrate the effectiveness of the theoretical analysis.
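
    The gain-selection step based on an algebraic Riccati equation can be sketched as follows, using toy system matrices rather than the paper's multiagent consensus model; the distributed and stochastic aspects are omitted.

```python
# Hedged sketch of Riccati-based gain selection: K = R^{-1} B^T P.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-1.0, 0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)          # state weighting
R = np.array([[1.0]])  # input weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print("control gain K =", K)

# Closed-loop eigenvalues of A - B K should have negative real parts.
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```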

  8. Analysis and control of the METC fluid bed gasifier. Quarterly report, July 1--September 30, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-31

    In this work, three components will form the basis for design of a control scheme for the Fluidized Bed Gasifier (FBG) at METC: (1) a control systems analysis based on simple linear models derived from process data; (2) review of the literature on fluid bed gasifier operation and control; and (3) understanding of present FBG operation and real world considerations. Tasks accomplished during the present reporting period include: (1) observation of the FBG during the week of July 17 to July 21; (2) suggested improvements to the control of FBG backpressure and MGCR pressure; and (3) data collection from FBG run No. 11 and transfer of data to USC.
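
    Component (1), deriving simple linear models from process data, might look like the following sketch, which fits a first-order ARX model by least squares to synthetic input-output data standing in for FBG measurements.

```python
# Illustrative ARX fit: y[t] = a*y[t-1] + b*u[t-1], estimated by least squares.
import numpy as np

rng = np.random.default_rng(2)
n = 500
u = rng.standard_normal(n)            # input, e.g., backpressure valve position
y = np.zeros(n)                       # output, e.g., gasifier backpressure
for t in range(1, n):                 # "true" process: a=0.9, b=0.2, plus noise
    y[t] = 0.9 * y[t - 1] + 0.2 * u[t - 1] + 0.01 * rng.standard_normal()

# Regress y[t] on (y[t-1], u[t-1]) to recover the model coefficients.
X = np.column_stack([y[:-1], u[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print("estimated (a, b):", coef)      # should be close to (0.9, 0.2)
```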

  9. Experimental Design For Photoresist Characterization

    NASA Astrophysics Data System (ADS)

    Luckock, Larry

    1987-04-01

    In processing a semiconductor product (from discrete devices up to the most complex products produced) we find more photolithographic steps in wafer fabrication than any other kind of process step. Thus, the success of a semiconductor manufacturer hinges on the optimization of their photolithographic processes. Yet, we find few companies that have taken the time to properly characterize this critical operation; they are sitting in the "passenger's seat", waiting to see what will come out, hoping that the yields will improve someday. There is no "black magic" involved in setting up a process at its optimum conditions (i.e. minimum sensitivity to all variables at the same time). This paper gives an example of a real world situation for optimizing a photolithographic process by the use of a properly designed experiment, followed by adequate multidimensional analysis of the data. Basic SPC practices like plotting control charts will not, by themselves, improve yields; the control charts are, however, among the necessary tools used in the determination of the process capability and in the formulation of the problems to be addressed. The example we shall consider is the twofold objective of shifting the process average, while tightening the variance, of polysilicon line widths. This goal was identified from a Pareto analysis of yield-limiting mechanisms, plus inspection of the control charts. A key issue in a characterization of this type of process is the number of interactions between variables; this example rules out two-level full factorial and three-level fractional factorial designs (which cannot detect all of the interactions). We arrive at an experiment with five factors at five levels each. A full factorial design for five factors at five levels would require 3125 wafers (5^5 runs). Instead, we will use a design that allows us to run this experiment with only 25 wafers, for a significant reduction in time, materials and manufacturing interruption in order to complete the experiment. An optimum solution is then determined via response surface analysis and a series of 3-D and contour plots are shown. The offset between the mask dimensions and poly CD at the optimum operating conditions is discussed with respect to yield, profits and return-on-investment. The expert system used for process optimization covers all types of process steps, producing the best custom designed experiment based on the actual equipment used. The knowledge base contains parameter lists, by machine make and model, ranked by sensitivity and controllability. One option allows 3-D spatial characterization of equipment. For the purpose of this presentation, we will assume that we want to optimize a photo-lithographic process used for polysilicon pattern definition and that we have determined minimum and maximum line widths, based on electrical yield requirements of the product. For this MOS process, the minimum critical dimension (CD) for the poly gate was determined by punchthrough voltage, threshold voltage, etc., while the maximum CD was determined from other performance factors like access time. We will start with the product engineer's analysis.
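
    One standard way to obtain a 25-run design for five factors at five levels is an orthogonal array OA(25, 5, 5, 2) built from linear forms over GF(5); the sketch below shows that construction, which is not necessarily the exact design the expert system produced.

```python
# Build an orthogonal array OA(25, 5, 5, 2): 25 runs, 5 factors, 5 levels,
# strength 2 (every pair of factors sees each level pair exactly once).
def oa_25_5_5():
    runs = []
    for a in range(5):
        for b in range(5):
            # Five mutually orthogonal columns: a, b, a+b, a+2b, a+3b (mod 5).
            runs.append([a, b, (a + b) % 5, (a + 2 * b) % 5, (a + 3 * b) % 5])
    return runs

design = oa_25_5_5()
print(len(design), "runs")           # 25 wafers instead of 5**5 = 3125
for run in design[:5]:
    print(run)                       # factor levels coded 0..4 for each run
```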

  10. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.

  11. Independent Orbiter Assessment (IOA): Analysis of the electrical power distribution and control/remote manipulator system subsystem

    NASA Technical Reports Server (NTRS)

    Robinson, W. W.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the Electrical Power Distribution and Control (EPD and C)/Remote Manipulator System (RMS) hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained in the NASA FMEA/CIL documentation. This report documents the results of the independent analysis of the EPD and C/RMS (both port and starboard) hardware. The EPD and C/RMS subsystem hardware provides the electrical power and power control circuitry required to safely deploy, operate, control, and stow or guillotine and jettison two (one port and one starboard) RMSs. The EPD and C/RMS subsystem is subdivided into the following functional divisions: Remote Manipulator Arm; Manipulator Deploy Control; Manipulator Latch Control; Manipulator Arm Shoulder Jettison; and Retention Arm Jettison. The IOA analysis process utilized available EPD and C/RMS hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based on the severity of the effect for each failure mode.

  12. A randomized wait-list controlled analysis of the implementation integrity of team-initiated problem solving processes.

    PubMed

    Newton, J Stephen; Horner, Robert H; Algozzine, Bob; Todd, Anne W; Algozzine, Kate

    2012-08-01

    Members of Positive Behavior Interventions and Supports (PBIS) teams from 34 elementary schools participated in a Team-Initiated Problem Solving (TIPS) Workshop and follow-up technical assistance. Within the context of a randomized wait-list controlled trial, team members who were the first recipients of the TIPS intervention demonstrated greater implementation integrity in using the problem-solving processes during their team meetings than did members of PBIS Teams in the Wait-List Control group. The success of TIPS at improving implementation integrity of the problem-solving processes is encouraging and suggests the value of conducting additional research focused on determining whether there is a functional relation between use of these problem-solving processes and actual resolution of targeted student academic and social problems. Copyright © 2012 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  13. A Web-Based System for Monitoring and Controlling Multidisciplinary Design Projects

    NASA Technical Reports Server (NTRS)

    Salas, Andrea O.; Rogers, James L.

    1997-01-01

    In today's competitive environment, both industry and government agencies are under enormous pressure to reduce the time and cost of multidisciplinary design projects. A number of frameworks have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. An examination of current frameworks reveals weaknesses in various areas such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, in conjunction with an existing framework, can improve these areas of weakness. This paper describes a system that executes a sequence of programs, monitors and controls the design process through a Web-based interface, and visualizes intermediate and final results through the use of Java(Tm) applets. A small sample problem, which includes nine processes with two analysis programs that are coupled to an optimizer, is used to demonstrate the feasibility of this approach.

  14. Assessment and Control of Spacecraft Charging Risks on the International Space Station

    NASA Technical Reports Server (NTRS)

    Koontz, Steve; Valentine, Mark; Keeping, Thomas; Edeen, Marybeth; Spetch, William; Dalton, Penni

    2004-01-01

    The International Space Station (ISS) operates in the F2 region of Earth's ionosphere, orbiting at altitudes ranging from 350 to 450 km at an inclination of 51.6 degrees. The relatively dense, cool F2 ionospheric plasma suppresses surface charging processes much of the time, and the flux of relativistic electrons is low enough to preclude deep dielectric charging processes. The most important spacecraft charging processes in the ISS orbital environment are: 1) ISS electrical power system interactions with the F2 plasma, 2) magnetic induction processes resulting from flight through the geomagnetic field, and 3) charging processes that result from interaction with auroral electrons at high latitude. Recently, the continuing review and evaluation of putative ISS charging hazards required by the ISS Program Office revealed that ISS charging could produce an electrical shock hazard to the ISS crew during extravehicular activity (EVA). ISS charging risks are being evaluated in an ongoing measurement and analysis campaign. The results of ISS charging measurements are combined with a recently developed model of ISS charging (the Plasma Interaction Model) and an exhaustive analysis of historical ionospheric variability data (ISS Ionospheric Specification) to evaluate ISS charging risks using Probabilistic Risk Assessment (PRA) methods. The PRA combines estimates of the frequency of occurrence and severity of the charging hazards with estimates of the reliability of various hazard control systems, as required by NASA's safety and risk management programs, to enable design and selection of a hazard control approach that minimizes overall programmatic and personnel risk. The PRA provides a quantitative methodology for incorporating the results of the ISS charging measurement and analysis campaigns into the necessary hazard reports, EVA procedures, and ISS flight rules required for operating ISS in a safe and productive manner.
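
    The core PRA combination the abstract describes reduces, in its simplest form, to the expected frequency of an uncontrolled hazard: the initiating-event frequency times the probability that the hazard controls fail. The sketch below uses invented numbers purely to show the shape of the calculation, not ISS data.

```python
# Generic PRA-style arithmetic with assumed, illustrative values.
eva_per_year = 10          # assumed EVA frequency
p_charging_event = 0.05    # assumed probability of a hazardous charging level per EVA
p_control_failure = 0.01   # assumed probability the hazard controls fail on demand

expected_hazard_freq = eva_per_year * p_charging_event * p_control_failure
print(f"expected uncontrolled charging hazards per year: {expected_hazard_freq:.4f}")
```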

  15. Potential Air Pollutant Emissions and Permitting Classifications for Two Biorefinery Process Designs in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eberle, Annika; Bhatt, Arpit; Zhang, Yimin

    Advanced biofuel production facilities (biorefineries), such as those envisioned by the United States (U.S.) Renewable Fuel Standard and U.S. Department of Energy's research and development programs, often lack historical air pollutant emissions data, which can pose challenges for obtaining air emission permits that are required for construction and operation. To help fill this knowledge gap, we perform a thorough regulatory analysis and use engineering process designs to assess the applicability of federal air regulations and quantify air pollutant emissions for two feasibility-level biorefinery designs. We find that without additional emission-control technologies both biorefineries would likely be required to obtain major source permits under the Clean Air Act's New Source Review program. The permitting classification (so-called 'major' or 'minor') has implications for the time and effort required for permitting and therefore affects the cost of capital and the fuel selling price. Consequently, we explore additional technically feasible emission-control technologies and process modifications that have the potential to reduce emissions to achieve a minor source permitting classification. Finally, our analysis of air pollutant emissions and controls can assist biorefinery developers with the air permitting process and inform regulatory agencies about potential permitting pathways for novel biorefinery designs.

  16. Potential Air Pollutant Emissions and Permitting Classifications for Two Biorefinery Process Designs in the United States

    DOE PAGES

    Eberle, Annika; Bhatt, Arpit; Zhang, Yimin; ...

    2017-04-26

    Advanced biofuel production facilities (biorefineries), such as those envisioned by the United States (U.S.) Renewable Fuel Standard and U.S. Department of Energy's research and development programs, often lack historical air pollutant emissions data, which can pose challenges for obtaining air emission permits that are required for construction and operation. To help fill this knowledge gap, we perform a thorough regulatory analysis and use engineering process designs to assess the applicability of federal air regulations and quantify air pollutant emissions for two feasibility-level biorefinery designs. We find that without additional emission-control technologies both biorefineries would likely be required to obtain major source permits under the Clean Air Act's New Source Review program. The permitting classification (so-called 'major' or 'minor') has implications for the time and effort required for permitting and therefore affects the cost of capital and the fuel selling price. Consequently, we explore additional technically feasible emission-control technologies and process modifications that have the potential to reduce emissions to achieve a minor source permitting classification. Finally, our analysis of air pollutant emissions and controls can assist biorefinery developers with the air permitting process and inform regulatory agencies about potential permitting pathways for novel biorefinery designs.

  17. Potential Air Pollutant Emissions and Permitting Classifications for Two Biorefinery Process Designs in the United States.

    PubMed

    Eberle, Annika; Bhatt, Arpit; Zhang, Yimin; Heath, Garvin

    2017-06-06

    Advanced biofuel production facilities (biorefineries), such as those envisioned by the United States (U.S.) Renewable Fuel Standard and U.S. Department of Energy's research and development programs, often lack historical air pollutant emissions data, which can pose challenges for obtaining air emission permits that are required for construction and operation. To help fill this knowledge gap, we perform a thorough regulatory analysis and use engineering process designs to assess the applicability of federal air regulations and quantify air pollutant emissions for two feasibility-level biorefinery designs. We find that without additional emission-control technologies both biorefineries would likely be required to obtain major source permits under the Clean Air Act's New Source Review program. The permitting classification (so-called "major" or "minor") has implications for the time and effort required for permitting and therefore affects the cost of capital and the fuel selling price. Consequently, we explore additional technically feasible emission-control technologies and process modifications that have the potential to reduce emissions to achieve a minor source permitting classification. Our analysis of air pollutant emissions and controls can assist biorefinery developers with the air permitting process and inform regulatory agencies about potential permitting pathways for novel biorefinery designs.

  18. Validation of acid washes as critical control points in hazard analysis and critical control point systems.

    PubMed

    Dormedy, E S; Brashears, M M; Cutter, C N; Burson, D E

    2000-12-01

    A 2% lactic acid wash used in a large meat-processing facility was validated as an effective critical control point (CCP) in a hazard analysis and critical control point (HACCP) plan. We examined the microbial profiles of beef carcasses before the acid wash, beef carcasses immediately after the acid wash, beef carcasses 24 h after the acid wash, beef subprimal cuts from the acid-washed carcasses, and ground beef made from acid-washed carcasses. Total mesophilic, psychrotrophic, coliforms, generic Escherichia coli, lactic acid bacteria, pseudomonads, and acid-tolerant microorganisms were enumerated on all samples. The presence of Salmonella spp. was also determined. Acid washing significantly reduced all counts except for pseudomonads that were present at very low numbers before acid washing. All other counts continued to stay significantly lower (P < 0.05) than those on pre-acid-washed carcasses throughout all processing steps. Total bacteria, coliforms, and generic E. coli enumerated on ground beef samples were more than 1 log cycle lower than those reported in the U.S. Department of Agriculture Baseline data. This study suggests that acid washes may be effective CCPs in HACCP plans and can significantly reduce the total number of microorganisms present on the carcass and during further processing.
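
    The underlying validation arithmetic, the log10 reduction achieved by the acid wash, is simple to sketch; the counts below are hypothetical, not the study's data.

```python
# Minimal sketch of the log-reduction calculation behind validating a CCP.
import math

def log10_reduction(cfu_before, cfu_after):
    """Return the log10 reduction in counts from an intervention."""
    return math.log10(cfu_before) - math.log10(cfu_after)

before, after = 2.5e4, 1.8e3   # hypothetical aerobic counts pre/post acid wash
print(f"log reduction: {log10_reduction(before, after):.2f} log10 CFU/cm^2")
```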

  19. Mathematical modeling and characteristic analysis for over-under turbine based combined cycle engine

    NASA Astrophysics Data System (ADS)

    Ma, Jingxue; Chang, Juntao; Ma, Jicheng; Bao, Wen; Yu, Daren

    2018-07-01

    The turbine based combined cycle engine has become the most promising hypersonic airbreathing propulsion system owing to its advantages of ground self-starting, a wide flight envelope, and reusability. The simulation model of the turbine based combined cycle engine plays an important role in performance analysis and control system design. In this paper, a turbine based combined cycle engine mathematical model is built on the Simulink platform, including a dual-channel air intake system, a turbojet engine and a ramjet. It should be noted that the model of the air intake system is built on computational fluid dynamics calculations, which provide valuable raw data for modeling the turbine based combined cycle engine. The aerodynamic characteristics of the turbine based combined cycle engine in turbojet mode, ramjet mode and the mode transition process are studied with the mathematical model, and the influence of the dominant variables on the performance and safety of the engine is analyzed. Based on the stability requirement for thrust output and safety during the working process, a control law is proposed that guarantees steady thrust output by adjusting the control variables of the turbine based combined cycle engine throughout the whole working process.

  20. Fossil Energy Program

    NASA Astrophysics Data System (ADS)

    McNeese, L. E.

    1981-01-01

    Increased utilization of coal and other fossil fuel alternatives as sources of clean energy is reported. The following topics are discussed: coal conversion development, chemical research and development, materials technology, component development and process evaluation studies, technical support to major liquefaction projects, process analysis and engineering evaluations, fossil energy environmental analysis, flue gas desulfurization, solid waste disposal, coal preparation waste utilization, plant control development, atmospheric fluidized bed coal combustor for cogeneration, TVA FBC demonstration plant program technical support, PFBC systems analysis, fossil fuel applications assessments, performance assurance system support for fossil energy projects, international energy technology assessment, and general equilibrium models of liquid and gaseous fuel supplies.

  1. No Control Genes Required: Bayesian Analysis of qRT-PCR Data

    PubMed Central

    Matz, Mikhail V.; Wright, Rachel M.; Scott, James G.

    2013-01-01

    Background Model-based analysis of data from quantitative reverse-transcription PCR (qRT-PCR) is potentially more powerful and versatile than traditional methods. Yet existing model-based approaches cannot properly deal with the higher sampling variances associated with low-abundant targets, nor do they provide a natural way to incorporate assumptions about the stability of control genes directly into the model-fitting process. Results In our method, raw qPCR data are represented as molecule counts, and described using generalized linear mixed models under Poisson-lognormal error. A Markov Chain Monte Carlo (MCMC) algorithm is used to sample from the joint posterior distribution over all model parameters, thereby estimating the effects of all experimental factors on the expression of every gene. The Poisson-based model allows for the correct specification of the mean-variance relationship of the PCR amplification process, and can also glean information from instances of no amplification (zero counts). Our method is very flexible with respect to control genes: any prior knowledge about the expected degree of their stability can be directly incorporated into the model. Yet the method provides sensible answers without such assumptions, or even in the complete absence of control genes. We also present a natural Bayesian analogue of the “classic” analysis, which uses standard data pre-processing steps (logarithmic transformation and multi-gene normalization) but estimates all gene expression changes jointly within a single model. The new methods are considerably more flexible and powerful than the standard delta-delta Ct analysis based on pairwise t-tests. Conclusions Our methodology expands the applicability of the relative-quantification analysis protocol all the way to the lowest-abundance targets, and provides a novel opportunity to analyze qRT-PCR data without making any assumptions concerning target stability. These procedures have been implemented as the MCMC.qpcr package in R. PMID:23977043
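
    As a deliberately tiny illustration of the core idea, MCMC sampling over a log-scale rate for molecule counts, the sketch below runs a random-walk Metropolis sampler on a single Poisson rate with a vague normal prior on its log. It is not the MCMC.qpcr package (which is in R) and omits the paper's Poisson-lognormal mixed-model structure.

```python
# Minimal Metropolis sampler for a Poisson rate with a log link.
import numpy as np

rng = np.random.default_rng(3)
y = rng.poisson(lam=np.exp(2.0), size=20)       # synthetic molecule counts

def log_post(theta):
    # Poisson log-likelihood (up to a constant) plus a vague N(0, 10^2) prior.
    return np.sum(y * theta - np.exp(theta)) - theta**2 / (2 * 10.0**2)

theta, samples = 0.0, []
for _ in range(20000):                           # random-walk Metropolis
    prop = theta + 0.1 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

burned = np.array(samples[5000:])                # discard burn-in
print(f"posterior mean rate: {np.exp(burned).mean():.2f}")
```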

  2. Programmable Logic Controller Modification Attacks for use in Detection Analysis

    DTIC Science & Technology

    2014-03-27

    …and J. Lowe, “The Myths and Facts Behind Cyber Security Risks for Industrial Control Systems,” in Proceedings of the VDE Kongress, vol. 116, 2004. … AFIT-ENG-14-M-66. Abstract: Unprotected Supervisory Control and Data Acquisition (SCADA) systems offer… control and monitor physical industrial processes. Although attacks targeting SCADA systems have increased, there has been little work exploring the…

  3. Management of the General Process of Parenteral Nutrition Using mHealth Technologies: Evaluation and Validation Study.

    PubMed

    Cervera Peris, Mercedes; Alonso Rorís, Víctor Manuel; Santos Gago, Juan Manuel; Álvarez Sabucedo, Luis; Wanden-Berghe, Carmina; Sanz-Valero, Javier

    2018-04-03

    Any system applied to the control of parenteral nutrition (PN) ought to prove that the process meets the established requirements and include a repository of records to allow evaluation of the information about PN processes at any time. The goal of the research was to evaluate the mobile health (mHealth) app and validate its effectiveness in monitoring the management of the PN process. We studied the evaluation and validation of the general process of PN using an mHealth app. The units of analysis were the PN bags prepared and administered at the Son Espases University Hospital, Palma, Spain, from June 1 to September 6, 2016. For the evaluation of the app, we used the Poststudy System Usability Questionnaire and subsequent analysis with the Cronbach alpha coefficient. Validation was performed by checking the compliance of control for all operations on each of the stages (validation and transcription of the prescription, preparation, conservation, and administration) and by monitoring the operative control points and critical control points. The results obtained from 387 bags were analyzed, with 30 interruptions of administration. The fulfillment of stages was 100%, including noncritical nonconformities in the storage control. The average deviation in the weight of the bags was less than 5%, and the infusion time did not present deviations greater than 1 hour. The developed app successfully passed the evaluation and validation tests and was implemented to perform the monitoring procedures for the overall PN process. A new mobile solution to manage the quality and traceability of sensitive medicines such as blood-derivative drugs and hazardous drugs derived from this project is currently being deployed. ©Mercedes Cervera Peris, Víctor Manuel Alonso Rorís, Juan Manuel Santos Gago, Luis Álvarez Sabucedo, Carmina Wanden-Berghe, Javier Sanz-Valero. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 03.04.2018.
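
    The two numeric control checks reported for each bag, weight deviation under 5% and infusion-time deviation within one hour, can be sketched as a simple validation function. The function and field names below are hypothetical, based only on the thresholds quoted in the abstract; the app's actual record format is not reproduced.

```python
# Hedged sketch of per-bag numeric control checks with assumed field names.
def check_bag(expected_weight_g, measured_weight_g,
              planned_infusion_h, actual_infusion_h):
    weight_dev = abs(measured_weight_g - expected_weight_g) / expected_weight_g
    time_dev = abs(actual_infusion_h - planned_infusion_h)
    return {
        "weight_ok": weight_dev < 0.05,   # less than 5% weight deviation
        "time_ok": time_dev <= 1.0,       # within 1 hour of planned infusion time
    }

print(check_bag(expected_weight_g=2100, measured_weight_g=2030,
                planned_infusion_h=24, actual_infusion_h=24.5))
```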

  4. 18-Degree-of-Freedom Controller Design for the ST7 Disturbance Reduction System

    NASA Technical Reports Server (NTRS)

    Markley, F. L.; Maghami, P. G.; Houghton, M. B.; Hsu, O. C.

    2003-01-01

    This paper presents the overall design and analysis process of the spacecraft controller being developed at NASA's Goddard Space Flight Center to close the loop between the GRS and the micro-newton colloidal thrusters. The essential dynamics of the ST7-DRS are captured in a simulation including eighteen rigid-body dynamic degrees of freedom: three translations and three rotations for the spacecraft and for each test mass. The ST7 DRS comprises three control systems: the attitude control system (ACS) to maintain a sun-pointing attitude; the drag free control (DFC) to center the spacecraft about the test masses; and the test mass suspension control. This paper summarizes the control design and analysis of the ST7-DRS 18-DOF model, and is an extension of previous analyses employing a 7-DOF planar model of ST-7.

  5. Tobacco control in the Russian Federation--a policy analysis.

    PubMed

    Lunze, Karsten; Migliorini, Luigi

    2013-01-23

    The Russian Federation (Russia) has one of the highest smoking rates in the world. The purpose of this study is to analyze past and current trends of the tobacco epidemic in the Russian Federation, review current tobacco control policy responses, and identify areas of opportunity for policy priorities. We used a policy triangle as analytical framework to examine content, context, and processes of Russian tobacco control policy. The analysis was based on secondary data on supply and demand sides of the Russian tobacco epidemic, tobacco-related economic and health effects during Russia's economic transition, and compliance of Russian tobacco policy with international standards and regulations. Tobacco-promoting strategies have specifically targeted women and youth. Russia's approval of a "National Tobacco Control Concept" and draft for a comprehensive tobacco control bill increasingly align national legislature with the WHO Framework Convention on Tobacco Control (FCTC). However, several structural and cultural factors represent substantial barriers to the policy process. The influence of transnational tobacco companies on policy processes in Russia has so far impeded a full implementation of the FCTC mandates. Several strategies have been identified as having the potential to reduce the prevalence of tobacco use in Russia and decrease tobacco-related national health and economic burden: adjusting national tobacco policy by raising tobacco tax from the current lowest level in Europe to at least 70%; consequent enforcement of a complete smoking ban in public places; marketing restrictions; and smoking cessation interventions integrated into primary care. Russia's tobacco control efforts need to target women and youths specifically to efficiently counter industry efforts.

  6. Tobacco control in the Russian Federation- a policy analysis

    PubMed Central

    2013-01-01

    Background The Russian Federation (Russia) has one of the highest smoking rates in the world. The purpose of this study is to analyze past and current trends of the tobacco epidemic in the Russian Federation, review current tobacco control policy responses, and identify areas of opportunity for policy priorities. Methods We used a policy triangle as analytical framework to examine content, context, and processes of Russian tobacco control policy. The analysis was based on secondary data on supply and demand sides of the Russian tobacco epidemic, tobacco-related economic and health effects during Russia’s economic transition, and compliance of Russian tobacco policy with international standards and regulations. Results Tobacco-promoting strategies have specifically targeted women and youth. Russia’s approval of a “National Tobacco Control Concept” and draft for a comprehensive tobacco control bill increasingly align national legislature with the WHO Framework Convention on Tobacco Control (FCTC). However, several structural and cultural factors represent substantial barriers to the policy process. The influence of transnational tobacco companies on policy processes in Russia has so far impeded a full implementation of the FCTC mandates. Conclusions Several strategies have been identified as having the potential to reduce the prevalence of tobacco use in Russia and decrease tobacco-related national health and economic burden: adjusting national tobacco policy by raising tobacco tax from the current lowest level in Europe to at least 70%; consequent enforcement of a complete smoking ban in public places; marketing restrictions; and smoking cessation interventions integrated into primary care. Russia’s tobacco control efforts need to target women and youths specifically to efficiently counter industry efforts. PMID:23339756

  7. Approximation of the Newton Step by a Defect Correction Process

    NASA Technical Reports Server (NTRS)

    Arian, E.; Batterman, A.; Sachs, E. W.

    1999-01-01

    In this paper, an optimal control problem governed by a partial differential equation is considered. The Newton step for this system can be computed by solving a coupled system of equations. To do this efficiently with an iterative defect correction process, a modifying operator is introduced into the system. This operator is motivated by local mode analysis. The operator can be used also for preconditioning in Generalized Minimum Residual (GMRES). We give a detailed convergence analysis for the defect correction process and show the derivation of the modifying operator. Numerical tests are done on the small disturbance shape optimization problem in two dimensions for the defect correction process and for GMRES.
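
    A minimal numeric sketch of a defect correction iteration, x := x + M^-1 (b - A x), is given below. The modifying operator here is a simple Jacobi-style diagonal stand-in, whereas the paper derives its operator from local mode analysis of the PDE-constrained problem.

```python
# Defect correction on a well-conditioned test system with a diagonal M.
import numpy as np

rng = np.random.default_rng(4)
n = 50
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # test matrix near identity
b = rng.standard_normal(n)

M_inv = np.diag(1.0 / np.diag(A))   # approximate (modifying) operator
x = np.zeros(n)
for k in range(100):
    defect = b - A @ x              # residual of the current iterate
    x = x + M_inv @ defect          # defect correction step
    if np.linalg.norm(defect) < 1e-10:
        break

print(f"stopped after {k + 1} iterations, residual {np.linalg.norm(b - A @ x):.2e}")
```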

  8. Fracture control procedures for aircraft structural integrity

    NASA Technical Reports Server (NTRS)

    Wood, H. A.

    1972-01-01

    The application of applied fracture mechanics in the design, analysis, and qualification of aircraft structural systems are reviewed. Recent service experiences are cited. Current trends in high-strength materials application are reviewed with particular emphasis on the manner in which fracture toughness and structural efficiency may affect the material selection process. General fracture control procedures are reviewed in depth with specific reference to the impact of inspectability, structural arrangement, and material on proposed analysis requirements for safe crack growth. The relative impact on allowable design stress is indicated by example. Design criteria, material, and analysis requirements for implementation of fracture control procedures are reviewed together with limitations in current available data techniques. A summary of items which require further study and attention is presented.

  9. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group review the various data collection methods available. Our recommendation is the use of the laboratory information management systems as a recording mechanism for preanalytical errors as this provides the easiest and most standardized mechanism of data capture.
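
    Pareto analysis, one of the quality management processes named above, is easy to sketch: rank error categories by count and accumulate percentages to expose the vital few. The categories and counts below are invented for illustration.

```python
# Illustrative Pareto analysis of preanalytical error categories.
errors = {
    "hemolysed sample": 120,
    "insufficient volume": 85,
    "mislabelled tube": 40,
    "wrong container": 20,
    "clotted sample": 15,
}

total = sum(errors.values())
cumulative = 0
for category, count in sorted(errors.items(), key=lambda kv: -kv[1]):
    cumulative += count
    print(f"{category:22s} {count:4d}  cumulative {100 * cumulative / total:5.1f}%")
```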

  10. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: Choice, Control & Change

    PubMed Central

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2012-01-01

    Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n=8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers’ curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman’s Rho Correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers’ satisfaction with teaching the curriculum was highly correlated with students’ satisfaction (p <.05). Teachers’ perception of amount of student work was negatively correlated with implementation and with student satisfaction (p<.05). Conclusions and implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021

  11. Continuous welding of unidirectional fiber reinforced thermoplastic tape material

    NASA Astrophysics Data System (ADS)

    Schledjewski, Ralf

    2017-10-01

    Continuous welding techniques like thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes like autoclave consolidation, thermoforming, etc. However, several important processing issues still need to be solved before it becomes an economically viable process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition and simulation development. Today, process simulation is capable of predicting the resulting consolidation quality. The effects of material imperfections and process parameter variations are well known. But using this knowledge to control the process, based on online process monitoring and corresponding adaptation of the process parameters, is still challenging. Solving inverse problems and using methods for automated code generation that allow fast implementation of algorithms on target hardware are required. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques, and how to use them for a model-based process control system, are presented.

  12. Standards for the Analysis and Processing of Surface-Water Data and Information Using Electronic Methods

    USGS Publications Warehouse

    Sauer, Vernon B.

    2002-01-01

    Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U. S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
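
    One of the analyses such a system automates is fitting a stage-discharge rating. A hedged sketch follows, using a conventional power-law rating form Q = a(h - e)^b on synthetic stage and discharge values rather than USGS records.

```python
# Fit a power-law stage-discharge rating to synthetic gaging data.
import numpy as np
from scipy.optimize import curve_fit

def rating(h, a, e, b):
    return a * (h - e) ** b

h = np.array([1.2, 1.8, 2.5, 3.1, 4.0, 5.2])             # stage, ft
q = np.array([15.0, 60.0, 170.0, 320.0, 640.0, 1250.0])  # discharge, ft^3/s

# Bounds keep the gage-height offset e below the minimum stage so h - e > 0.
params, _ = curve_fit(rating, h, q, p0=[20.0, 0.5, 2.0],
                      bounds=([0.1, -2.0, 0.5], [1e4, 1.0, 5.0]))
a, e, b = params
print(f"Q = {a:.1f} * (h - {e:.2f})^{b:.2f}")
```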

  13. Launch Vehicle Control Center Architectures

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Epps, Amy; Woodruff, Van; Vachon, Michael Jacob; Monreal, Julio; Williams, Randall; McLaughlin, Tom

    2014-01-01

    This analysis is a survey of control center architectures of the NASA Space Launch System (SLS), United Launch Alliance (ULA) Atlas V and Delta IV, and the European Space Agency (ESA) Ariane 5. Each of these control center architectures has similarities in basic structure and differences in the functional distribution of responsibilities for the phases of operations: (a) Launch vehicles in the international community vary greatly in configuration and process; (b) Each launch site has a unique processing flow based on the specific configurations; (c) Launch and flight operations are managed through a set of control centers associated with each launch site, however flight operations may be conducted from a different control center than the launch center; and (d) The engineering support centers are primarily located at the design center with a small engineering support team at the launch site.

  14. Application of optimal control principles to describe the supervisory control behavior of AAA crew members

    NASA Technical Reports Server (NTRS)

    Hale, C.; Valentino, G. J.

    1982-01-01

    Supervisory decision making and control behavior within a C(3)-oriented, ground-based weapon system is being studied. The program involves empirical investigation of the sequence of control strategies used during engagement of aircraft targets. An engagement is conceptually divided into several stages, including initial information-processing activity, tracking, and ongoing adaptive control decisions. Following a brief description of the model parameters, two experiments that served as an initial investigation into the accuracy of assumptions about the importance of situation assessment in procedure selection are outlined. Preliminary analysis of the results upheld the validity of the assumptions regarding strategic information processing and cue-criterion relationship learning. These results indicate that this model structure should be useful in studies of supervisory decision behavior.

  15. [Design of a medical device management system supporting full life-cycle process management].

    PubMed

    Su, Peng; Zhong, Jianping

    2014-03-01

    Based on an analysis of the present status of medical device management, this paper optimized the management process and developed a web-based medical device management system. Information technology is used to dynamically track the state of each device across its entire life cycle. Through closed-loop management, with pre-event budgeting, mid-event control, and after-event analysis, the system improved the precision of medical device management, optimized asset allocation, and promoted effective operation of the devices.
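
    The closed-loop life cycle described above (pre-event budgeting, mid-event control, after-event analysis) can be rendered as a small state model; the stages, transitions, and names below are hypothetical, since the abstract does not give the system's actual data model.

    ```python
    # Hypothetical sketch of a device life-cycle state model.
    from enum import Enum, auto
    from dataclasses import dataclass, field

    class Stage(Enum):
        BUDGETED = auto()      # pre-event: budget approved
        PROCURED = auto()
        IN_USE = auto()        # mid-event: usage and maintenance tracked
        UNDER_REVIEW = auto()  # after-event: utilization analysis
        RETIRED = auto()

    # Allowed transitions close the loop: review feeds back into use.
    ALLOWED = {
        Stage.BUDGETED: {Stage.PROCURED},
        Stage.PROCURED: {Stage.IN_USE},
        Stage.IN_USE: {Stage.UNDER_REVIEW, Stage.RETIRED},
        Stage.UNDER_REVIEW: {Stage.IN_USE, Stage.RETIRED},
        Stage.RETIRED: set(),
    }

    @dataclass
    class Device:
        asset_id: str
        stage: Stage = Stage.BUDGETED
        history: list = field(default_factory=list)

        def advance(self, new_stage: Stage):
            if new_stage not in ALLOWED[self.stage]:
                raise ValueError(f"{self.stage} -> {new_stage} not allowed")
            self.history.append((self.stage, new_stage))
            self.stage = new_stage
    ```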

  16. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The proposed analysis scheme first establishes the geometry of the process and then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.
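
    The two-step scheme (lay out the process geometry, then conditionally mark vulnerable areas) can be illustrated with a small sketch; the process graph, node attributes, and marking rule below are invented for illustration.

    ```python
    # Illustrative sketch: a process as a graph, with nodes marked
    # vulnerable when their local conditions fall below a threshold.

    process = {
        "tank_pressurization": {"next": ["engine_feed"], "redundancy": 1},
        "engine_feed":         {"next": ["ignition"],    "redundancy": 2},
        "ignition":            {"next": [],              "redundancy": 1},
    }

    def mark_vulnerable(graph, min_redundancy=2):
        """Conditionally vulnerable: insufficient redundancy for the step."""
        marks = {}
        for name, node in graph.items():
            vulnerable = node["redundancy"] < min_redundancy
            marks[name] = "needs attention (sensing/control)" if vulnerable else "ok"
        return marks

    for step, verdict in mark_vulnerable(process).items():
        print(f"{step}: {verdict}")
    ```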

  17. Multi-factor Analysis of Pre-control Fracture Simulations about Projectile Material

    NASA Astrophysics Data System (ADS)

    Wan, Ren-Yi; Zhou, Wei

    2016-05-01

    The study of pre-controlled fracture in projectile materials helps improve the effective fragmentation of the projectile metal and the material utilization rate. Fragment muzzle velocity and lethality are affected by the explosive charge and the mode of initiation. Finite element software can simulate the explosive rupture of a projectile with a pre-groove in the shell surface and analyze how typical node velocities change with time, providing a reference for the design and optimization of pre-controlled fragmentation.
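
    A hedged sketch of the post-processing step the abstract describes: given a node's velocity history from a finite element run, find the peak velocity and when it occurs. The time series here is invented, not simulation output.

    ```python
    # Toy node velocity history: (time in microseconds, velocity in m/s).
    history = [
        (0.0, 0.0), (2.0, 150.0), (4.0, 820.0), (6.0, 1350.0),
        (8.0, 1480.0), (10.0, 1460.0), (12.0, 1440.0),
    ]

    # Peak velocity and its time, the quantities typically compared
    # across charge and initiation configurations.
    peak_time, peak_velocity = max(history, key=lambda tv: tv[1])
    print(f"peak node velocity {peak_velocity:.0f} m/s at t = {peak_time:.0f} us")
    ```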

  18. Cost Analysis In A Multi-Mission Operations Environment

    NASA Technical Reports Server (NTRS)

    Newhouse, M.; Felton, L.; Bornas, N.; Botts, D.; Roth, K.; Ijames, G.; Montgomery, P.

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short duration missions; and has included long duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and size and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology, but will be executed using future technology. Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process, and the associated lessons learned. These lessons can be used when planning for a new multi-mission operations center or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature service-oriented, multi-mission control centers to streamline or refine their cost analysis process.
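
    One element the abstract mentions, attaching an uncertainty or risk factor to estimates that will be executed with future technology, can be sketched as follows; the cost structure and all numbers are invented and are not the HOSC model.

    ```python
    # Hedged sketch of a service cost estimate with a risk factor.

    def estimate_service_cost(base_cost, mission_factor, duration_months,
                              risk_factor=1.0):
        """Return (low, nominal, high) estimates in the units of base_cost.
        risk_factor > 1 widens the band for estimates that extrapolate
        beyond current technology."""
        nominal = base_cost * mission_factor * duration_months
        spread = (risk_factor - 1.0) + 0.10  # assumed 10% baseline uncertainty
        return nominal * (1 - spread), nominal, nominal * (1 + spread)

    low, nominal, high = estimate_service_cost(
        base_cost=50_000, mission_factor=1.4, duration_months=12,
        risk_factor=1.25)
    print(f"estimate: {low:,.0f} - {high:,.0f} (nominal {nominal:,.0f})")
    ```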

  19. Field propagation-induced directionality of carrier-envelope phase-controlled photoemission from nanospheres

    DOE PAGES

    Süßmann, F.; Seiffert, L.; Zherebtsov, S.; ...

    2015-08-12

    Near-fields of non-resonantly laser-excited nanostructures enable strong localization of ultrashort light fields and have opened novel routes to fundamentally modify and control electronic strong-field processes. Harnessing spatiotemporally tunable near-fields for the steering of sub-cycle electron dynamics may enable ultrafast optoelectronic devices and unprecedented control in the generation of attosecond electron and photon pulses. Here we utilize unsupported sub-wavelength dielectric nanospheres to generate near-fields with adjustable structure and study the resulting strong-field dynamics via photoelectron imaging. We demonstrate field propagation-induced tunability of the emission direction of fast recollision electrons up to a regime where nonlinear charge interaction effects become dominant in the acceleration process. In conclusion, our analysis supports that the timing of the recollision process remains controllable with attosecond resolution by the carrier-envelope phase, indicating the possibility to expand near-field-mediated control far into the realm of high-field phenomena.
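
    A worked illustration of the carrier-envelope phase (CEP) control the abstract invokes: for a few-cycle pulse E(t) = E0 exp(-t^2 / (2 sigma^2)) cos(wt + phi), changing phi shifts where the strongest field half-cycle falls, which is what steers the recollision electrons. Pulse parameters below are illustrative few-cycle values, not the experiment's.

    ```python
    # CEP dependence of a few-cycle pulse's strongest positive half-cycle.
    import math

    def field(t_fs, phi, wavelength_nm=720.0, fwhm_fs=4.0):
        w = 2 * math.pi * 299.792458 / wavelength_nm   # carrier freq, rad/fs
        sigma = fwhm_fs / (2 * math.sqrt(2 * math.log(2)))
        return math.exp(-t_fs**2 / (2 * sigma**2)) * math.cos(w * t_fs + phi)

    # Scan the CEP and locate the most positive field half-cycle in time.
    for phi in (0.0, math.pi / 2, math.pi):
        peak_t = max((t / 10 for t in range(-60, 61)),
                     key=lambda t: field(t, phi))
        print(f"CEP = {phi:.2f} rad -> strongest half-cycle near t = {peak_t:+.1f} fs")
    ```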
