Sample records for process control variables

  1. Process for applying control variables having fractal structures

    DOEpatents

    Bullock, IV, Jonathan S.; Lawson, Roger L.

    1996-01-01

    A process and apparatus for the application of a control variable having a fractal structure to a body or process. The process of the present invention comprises the steps of generating a control variable having a fractal structure and applying the control variable to a body or process reacting in accordance with the control variable. The process is applicable to electroforming where first, second and successive pulsed-currents are applied to cause the deposition of material onto a substrate, such that the first pulsed-current, the second pulsed-current, and successive pulsed currents form a fractal pulsed-current waveform.

  2. Process for applying control variables having fractal structures

    DOEpatents

    Bullock, J.S. IV; Lawson, R.L.

    1996-01-23

    A process and apparatus are disclosed for the application of a control variable having a fractal structure to a body or process. The process of the present invention comprises the steps of generating a control variable having a fractal structure and applying the control variable to a body or process reacting in accordance with the control variable. The process is applicable to electroforming where first, second and successive pulsed-currents are applied to cause the deposition of material onto a substrate, such that the first pulsed-current, the second pulsed-current, and successive pulsed currents form a fractal pulsed-current waveform. 3 figs.

  3. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.

  4. Advanced multivariable control of a turboexpander plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Altena, D.; Howard, M.; Bullin, K.

    1998-12-31

    This paper describes an application of advanced multivariable control on a natural gas plant and compares its performance to the previous conventional feed-back control. This control algorithm utilizes simple models from existing plant data and/or plant tests to hold the process at the desired operating point in the presence of disturbances and changes in operating conditions. The control software is able to accomplish this due to effective handling of process variable interaction, constraint avoidance and feed-forward of measured disturbances. The economic benefit of improved control lies in operating closer to the process constraints while avoiding significant violations. The South Texas facility where this controller was implemented experienced reduced variability in process conditions which increased liquids recovery because the plant was able to operate much closer to the customer specified impurity constraint. An additional benefit of this implementation of multivariable control is the ability to set performance criteria beyond simple setpoints, including process variable constraints, relative variable merit and optimizing use of manipulated variables. The paper also details the control scheme applied to the complex turboexpander process and some of the safety features included to improve reliability.

  5. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    NASA Astrophysics Data System (ADS)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention on networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by an affinity propagation clustering algorithm, and each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
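
    The decomposition step can be illustrated with standard library tools. The sketch below is only a toy illustration on synthetic data: it clusters controlled variables with affinity propagation applied to their correlation profiles and then ranks candidate inputs for each cluster by canonical correlation. The variable layout, the use of the correlation matrix as clustering features and the select_inputs helper are assumptions made for illustration, not the authors' exact procedure.

    ```python
    import numpy as np
    from sklearn.cluster import AffinityPropagation
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    # Synthetic controlled variables y (two correlated groups of four) and
    # candidate inputs u (the first two are informative by construction).
    base = rng.normal(size=(500, 2))
    y = np.hstack([base[:, [0]] + 0.1 * rng.normal(size=(500, 4)),
                   base[:, [1]] + 0.1 * rng.normal(size=(500, 4))])
    u = rng.normal(size=(500, 20))
    u[:, :2] = base + 0.1 * rng.normal(size=(500, 2))

    # 1) Partition controlled variables into subsystems by clustering their
    #    pairwise correlation profiles with affinity propagation.
    corr = np.corrcoef(y, rowvar=False)
    ap = AffinityPropagation(random_state=0).fit(corr)
    clusters = {k: np.where(ap.labels_ == k)[0] for k in set(ap.labels_)}

    # 2) For each subsystem, rank candidate inputs by canonical correlation
    #    with the subsystem's controlled variables and keep the strongest ones.
    def select_inputs(u, y_sub, keep=5):
        scores = []
        for j in range(u.shape[1]):
            cca = CCA(n_components=1).fit(u[:, [j]], y_sub)
            a, b = cca.transform(u[:, [j]], y_sub)
            scores.append(abs(np.corrcoef(a[:, 0], b[:, 0])[0, 1]))
        return np.argsort(scores)[::-1][:keep]

    for k, cols in clusters.items():
        inputs = select_inputs(u, y[:, cols])
        print(f"subsystem {k}: outputs {cols.tolist()}, selected inputs {inputs.tolist()}")
    ```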

  6. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes in terms of dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function-mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at process end despite variations in the input and disturbances. The interactions between the single processes are thereby neglected, but they play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain backwards, based on predefined quality requirements for the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
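
    A backward sweep of that kind can be sketched in a few lines. The toy dynamic program below uses an invented scalar product state and an invented transformation function per process; it tabulates the cost-to-go and the optimal control for every discretized entry state, illustrating phase (1) only and not the authors' actual models.

    ```python
    import numpy as np

    # Toy backward dynamic program over a chain of processes with scalar state.
    states = np.linspace(0.0, 1.0, 21)
    controls = np.linspace(-0.2, 0.2, 9)
    n_proc, target = 3, 0.8

    def transition(x, u):                      # hypothetical process model
        return np.clip(0.9 * x + u + 0.05, 0.0, 1.0)

    # Cost-to-go at the end of the chain: squared deviation from the required
    # final product quality.
    J = (states - target) ** 2
    policy = []
    for _ in range(n_proc):                    # sweep backwards through the chain
        Ju = np.empty((states.size, controls.size))
        for i, x in enumerate(states):
            for j, u in enumerate(controls):
                Ju[i, j] = np.interp(transition(x, u), states, J)
        policy.insert(0, controls[Ju.argmin(axis=1)])   # optimal u per entry state
        J = Ju.min(axis=1)

    print("optimal control at each process for a mid-range entry state:",
          [float(p[10]) for p in policy])
    ```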

  7. Optimal regulation in systems with stochastic time sampling

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Lee, P. S.

    1980-01-01

    An optimal control theory that accounts for stochastic variable time sampling in a distributed microprocessor based flight control system is presented. The theory is developed by using a linear process model for the airplane dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved for the control law that minimizes the expected value of a quadratic cost function. The optimal cost obtained with a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained with a known and uniform information update interval.

  8. Detecting Anomalies in Process Control Networks

    NASA Astrophysics Data System (ADS)

    Rrushi, Julian; Kang, Kyoung-Don

    This paper presents the estimation-inspection algorithm, a statistical algorithm for anomaly detection in process control networks. The algorithm determines if the payload of a network packet that is about to be processed by a control system is normal or abnormal based on the effect that the packet will have on a variable stored in control system memory. The estimation part of the algorithm uses logistic regression integrated with maximum likelihood estimation in an inductive machine learning process to estimate a series of statistical parameters; these parameters are used in conjunction with logistic regression formulas to form a probability mass function for each variable stored in control system memory. The inspection part of the algorithm uses the probability mass functions to estimate the normalcy probability of a specific value that a network packet writes to a variable. Experimental results demonstrate that the algorithm is very effective at detecting anomalies in process control networks.
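
    A deliberately simplified sketch of the inspection idea follows: empirical probability mass functions are learned per control-system memory variable from training traffic, and the value a new packet would write is scored against them. The variable names, the training data and the plain frequency counts (standing in for the paper's logistic-regression and maximum-likelihood estimation) are illustrative assumptions.

    ```python
    from collections import Counter

    # Training writes observed per control-system memory variable (synthetic).
    training_writes = {
        "valve_setpoint": [40, 40, 45, 50, 45, 40, 50, 45, 40, 45],
        "pump_state": [0, 1, 1, 0, 1, 1, 1, 0, 1, 1],
    }

    pmfs = {var: Counter(vals) for var, vals in training_writes.items()}
    totals = {var: sum(c.values()) for var, c in pmfs.items()}

    def normalcy_probability(var, value):
        """Estimated probability that `value` is a normal write to `var`."""
        return pmfs[var][value] / totals[var]

    # Inspect incoming packet payloads (variable, value) before they are processed.
    for var, value in [("valve_setpoint", 45), ("valve_setpoint", 250)]:
        p = normalcy_probability(var, value)
        print(var, value, "normal" if p > 0.05 else "ANOMALOUS", round(p, 2))
    ```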

  9. Longitudinal Growth Curves of Brain Function Underlying Inhibitory Control through Adolescence

    PubMed Central

    Foran, William; Velanova, Katerina; Luna, Beatriz

    2013-01-01

    Neuroimaging studies suggest that developmental improvements in inhibitory control are primarily supported by changes in prefrontal executive function. However, studies are contradictory with respect to how activation in prefrontal regions changes with age, and they have yet to analyze longitudinal data using growth curve modeling, which allows characterization of dynamic processes of developmental change, individual differences in growth trajectories, and variables that predict any interindividual variability in trajectories. In this study, we present growth curves modeled from longitudinal fMRI data collected over 302 visits (across ages 9 to 26 years) from 123 human participants. Brain regions within circuits known to support motor response control, executive control, and error processing (i.e., aspects of inhibitory control) were investigated. Findings revealed distinct developmental trajectories for regions within each circuit and indicated that a hierarchical pattern of maturation of brain activation supports the gradual emergence of adult-like inhibitory control. Mean growth curves of activation in motor response control regions revealed no changes with age, although interindividual variability decreased with development, indicating equifinality with maturity. Activation in certain executive control regions decreased with age until adolescence, and variability was stable across development. Error-processing activation in the dorsal anterior cingulate cortex showed continued increases into adulthood and no significant interindividual variability across development, and was uniquely associated with task performance. These findings provide evidence that continued maturation of error-processing abilities supports the protracted development of inhibitory control over adolescence, while motor response control regions provide early-maturing foundational capacities and suggest that some executive control regions may buttress immature networks as error processing continues to mature. PMID:24227721

  10. Derivation of sequential, real-time, process-control programs

    NASA Technical Reports Server (NTRS)

    Marzullo, Keith; Schneider, Fred B.; Budhiraja, Navin

    1991-01-01

    The use of weakest-precondition predicate transformers in the derivation of sequential, process-control software is discussed. Only one extension to Dijkstra's calculus for deriving ordinary sequential programs was found to be necessary: function-valued auxiliary variables. These auxiliary variables are needed for reasoning about states of a physical process that exists during program transitions.

  11. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.

    1974-01-01

    A two year study of the major process variables associated with the manufacturing process for sealed, nickel-cadmium, aerospace cells is summarized. Effort was directed toward identifying the major process variables associated with a manufacturing process, experimentally assessing each variable's effect, and imposing the necessary changes (optimization) and controls for the critical process variables to improve results and uniformity. A critical process variable associated with the sintered nickel plaque manufacturing process was identified as the manual forming operation. Critical process variables identified with the positive electrode impregnation/polarization process were impregnation solution temperature, free acid content, vacuum impregnation, and sintered plaque strength. Positive and negative electrodes were identified as a major source of carbonate contamination in sealed cells.

  12. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process well set up. The critical dimension of components decreases following Moore's law, while the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot be reduced in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing, based on a lot of process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to put the spatial variability under control in real time with our SPC (Statistical Process Control) system. This paper outlines the architecture of an integrated process control system for shape monitoring in 3D, implemented in the waferfab.

  13. Information distribution in distributed microprocessor based flight control systems

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Lee, P. S.

    1977-01-01

    This paper presents an optimal control theory that accounts for variable time intervals in the information distribution to control effectors in a distributed microprocessor based flight control system. The theory is developed using a linear process model for the aircraft dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved that provides the control law that minimizes the expected value of a quadratic cost function. An example is presented where the theory is applied to the control of the longitudinal motions of the F8-DFBW aircraft. Theoretical and simulation results indicate that, for the example problem, the optimal cost obtained using a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained using a known uniform information update interval.
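
    As background for the quadratic-cost formulation mentioned in this and the companion record above, the sketch below solves a standard discrete-time LQR problem by backward Riccati iteration, the deterministic baseline that the variable-time-increment formulation extends. The system matrices are illustrative placeholders, not the F8-DFBW longitudinal model.

    ```python
    import numpy as np

    # Placeholder discrete-time plant and quadratic cost weights (assumed values).
    A = np.array([[1.0, 0.1], [0.0, 0.95]])
    B = np.array([[0.0], [0.1]])
    Q = np.eye(2)
    R = np.array([[0.1]])

    # Backward Riccati recursion iterated to its steady-state solution.
    P = Q.copy()
    for _ in range(500):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # feedback gain
        P = Q + A.T @ P @ (A - B @ K)                       # cost-to-go update

    print("steady-state feedback gain K:", K)
    ```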

  14. A Framework for Categorizing Important Project Variables

    NASA Technical Reports Server (NTRS)

    Parsons, Vickie S.

    2003-01-01

    While substantial research has led to theories concerning the variables that affect project success, no universal set of such variables has been acknowledged as the standard. The identification of a specific set of controllable variables is needed to minimize project failure. Much has been hypothesized about the need to match project controls and management processes to individual projects in order to increase the chance for success. However, an accepted taxonomy for facilitating this matching process does not exist. This paper surveyed existing literature on classification of project variables. After an analysis of those proposals, a simplified categorization is offered to encourage further research.

  15. The process of cognitive behaviour therapy for chronic fatigue syndrome: which changes in perpetuating cognitions and behaviour are related to a reduction in fatigue?

    PubMed

    Heins, Marianne J; Knoop, Hans; Burk, William J; Bleijenberg, Gijs

    2013-09-01

    Cognitive behaviour therapy (CBT) can significantly reduce fatigue in chronic fatigue syndrome (CFS), but little is known about the process of change taking place during CBT. Based on a recent treatment model (Wiborg et al. J Psych Res 2012), we examined how (changes in) cognitions and behaviour are related to the decrease in fatigue. We included 183 patients meeting the US Centers for Disease Control criteria for CFS, aged 18 to 65 years, starting CBT. We measured fatigue and possible process variables before treatment; after 6, 12 and 18 weeks; and after treatment. Possible process variables were sense of control over fatigue, focusing on symptoms, self-reported physical functioning, perceived physical activity and objective (actigraphic) physical activity. We built multiple regression models, explaining levels of fatigue during therapy by (changes in) proposed process variables. We observed large individual variation in the patterns of change in fatigue and process variables during CBT for CFS. Increases in the sense of control over fatigue, perceived activity and self-reported physical functioning, and decreases in focusing on symptoms explained 20 to 46% of the variance in fatigue. An increase in objective activity was not a process variable. A change in cognitive factors seems to be related to the decrease in fatigue during CBT for CFS. The pattern of change varies considerably between patients, but changes in process variables and fatigue occur mostly in the same period. © 2013.

  16. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    ERIC Educational Resources Information Center

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…

  17. Family climates: family factors specific to disturbed eating and bulimia nervosa.

    PubMed

    Laliberté, M; Boland, F J; Leichner, P

    1999-09-01

    More than a decade of research has characterized the families of individuals with bulimia and bulimia anorexia (Anorexia Nervosa, Binge/Purging Type) as less expressive, less cohesive, and experiencing more conflicts than normal control families. This two-part study investigated variables believed more directly related to disturbed eating and bulimia as contributing to a "family climate for eating disorders." In Study 1, a nonclinical sample of 324 women who had just left home for college and a sample of 121 mothers evaluated their families. Principal-components analyses revealed the same factor structure for both students and mothers, with Family Body Satisfaction, Family Social Appearance Orientation, and Family Achievement Emphasis loading together, representing the hypothesized family climate for eating disorders; the remaining variables loaded with the more traditional family process variables (conflict, cohesion, expressiveness), representing a more general family dysfunction. As predicted, the family climate for eating disorders factor score was a more powerful predictor of disturbed eating. Study 2 extended these findings into a clinical population, examining whether the family climate for eating disorders variables would distinguish individuals with bulimia from both depressed and healthy controls. Groups of eating-disordered patients (n = 40) and depressed (n = 17) and healthy (n = 27) controls completed family measures. The eating-disordered group scored significantly higher on family climate variables than control groups. Family process variables distinguished clinical groups (depressed and eating disordered) from healthy controls, but not from one another. Controlling for depression removed group differences on family process variables, but family climate variables continued to distinguish the eating-disordered group from both control groups. Indications for further research are discussed.

  18. Six Sigma methods applied to cryogenic coolers assembly line

    NASA Astrophysics Data System (ADS)

    Ventre, Jean-Marc; Germain-Lacour, Michel; Martin, Jean-Yves; Cauquil, Jean-Marc; Benschop, Tonny; Griot, René

    2009-05-01

    Six Sigma methods have been applied to the manufacturing process of a rotary Stirling cooler, the RM2. The name of the project is NoVa, as the main goal of the Six Sigma approach is to reduce variability (No Variability). The project has been based on the DMAIC guideline, following five stages: Define, Measure, Analyse, Improve, Control. The objective has been set on the rate of coolers passing the performance test at first attempt, with a goal value of 95%. A team has been gathered involving people and skills acting on the RM2 manufacturing line. Measurement System Analysis (MSA) has been applied to the test bench, and results after the R&R gage show that measurement is one of the root causes of variability in the RM2 process. Two more root causes have been identified by the team after process mapping analysis: regenerator filling factor and cleaning procedure. Causes of measurement variability have been identified and eradicated, as shown by new results from the R&R gage. Experimental results show that the regenerator filling factor impacts process variability and affects yield. An improved process has been set up after a new calibration process for the test bench, a new filling procedure for the regenerator and an additional cleaning stage were implemented. The objective of 95% of coolers passing the performance test at first attempt has been reached and kept for a significant period. The RM2 manufacturing process is now managed according to Statistical Process Control based on control charts. Improvements in process capability have enabled the introduction of a sample testing procedure before delivery.

  19. Self-organizing sensing and actuation for automatic control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, George Shu-Xing

    A Self-Organizing Process Control Architecture is introduced with a Sensing Layer, Control Layer, Actuation Layer, Process Layer, as well as Self-Organizing Sensors (SOS) and Self-Organizing Actuators (SOA). A Self-Organizing Sensor for a process variable with one or multiple input variables is disclosed. An artificial neural network (ANN) based dynamic modeling mechanism as part of the Self-Organizing Sensor is described. As a case example, a Self-Organizing Soft-Sensor for CFB Boiler Bed Height is presented. Also provided is a method to develop a Self-Organizing Sensor.

  20. Estimation of the processes controlling variability in phytoplankton pigment distributions on the southeastern U.S. continental shelf

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Ishizaka, Joji; Hofmann, Eileen E.

    1990-01-01

    Five coastal-zone-color-scanner images from the southeastern U.S. continental shelf are combined with concurrent moored current meter measurements to assess the processes controlling the variability in chlorophyll concentration and distribution in this region. An equation governing the space and time distribution of a nonconservative quantity such as chlorophyll is used in the calculations. The terms of the equation, estimated from observations, show that advective, diffusive, and local processes contribute to the plankton distributions and vary with time and location. The results from this calculation are compared with similar results obtained using a numerical physical-biological model with circulation fields derived from an optimal interpolation of the current meter observations and it is concluded that the two approaches produce different estimates of the processes controlling phytoplankton variability.
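
    A generic form of such a governing equation for a nonconservative tracer C (a sketch only; the exact source and sink terms used in the paper are not reproduced here) is

    $$\frac{\partial C}{\partial t} + \mathbf{u}\cdot\nabla C = \nabla\cdot\left(K\,\nabla C\right) + S(C),$$

    where the second left-hand term is advection by the flow field $\mathbf{u}$, the first right-hand term is turbulent diffusion with diffusivity $K$, and $S(C)$ lumps the local biological sources and sinks whose balance the study estimates from the imagery and current-meter observations.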

  1. Application of Advanced Process Control techniques to a pusher type reheating furnace

    NASA Astrophysics Data System (ADS)

    Zanoli, S. M.; Pepe, C.; Barboni, L.

    2015-11-01

    In this paper an Advanced Process Control system aimed at controlling and optimizing a pusher type reheating furnace located in an Italian steel plant is proposed. The designed controller replaced the previous control system, based on PID controllers manually conducted by process operators. A two-layer Model Predictive Control architecture has been adopted that, exploiting a chemical, physical and economic modelling of the process, overcomes the limitations of plant operators’ mental model and knowledge. In addition, an ad hoc decoupling strategy has been implemented, allowing the selection of the manipulated variables to be used for the control of each single process variable. Finally, in order to improve the system flexibility and resilience, the controller has been equipped with a supervision module. A profitable trade-off between conflicting specifications, e.g. safety, quality and production constraints, energy saving and pollution impact, has been guaranteed. Simulation tests and real plant results demonstrated the soundness and the reliability of the proposed system.

  2. Indicator organisms in meat and poultry slaughter operations: their potential use in process control and the role of emerging technologies.

    PubMed

    Saini, Parmesh K; Marks, Harry M; Dreyfuss, Moshe S; Evans, Peter; Cook, L Victor; Dessai, Uday

    2011-08-01

    Measurement of commonly occurring, nonpathogenic organisms on poultry products may be used in designing statistical process control systems that could result in reductions of pathogen levels. The extent of pathogen level reduction that could be obtained from actions resulting from monitoring these measurements over time depends upon the degree of understanding of cause-effect relationships between processing variables, selected output variables, and pathogens. For such measurements to be effective for controlling or improving processing to some capability level within the statistical process control context, sufficiently frequent measurements would be needed to help identify processing deficiencies. Ultimately the correct balance of sampling and resources is determined by those characteristics of deficient processing that are important to identify. We recommend strategies that emphasize flexibility, depending upon sampling objectives. Coupling the measurement of levels of indicator organisms with practical emerging technologies and suitable on-site platforms that decrease the time between sample collection and interpretation of results would enhance process control monitoring.

  3. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs.

    PubMed

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-11-11

    A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to Manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral with the method is a new format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field Trials have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, of the new geometric process control (GPC) method for improving the quality of both process operations and product by providing Process Alarms and Alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for Alarm Rationalisation that quickly delivers large sets of Consistent Alarm Limits, and the extension to full Alert Management with highlights from the Field Trials to indicate the overall effectiveness of the method in practice.

  4. Fluvial processes in Puget Sound rivers and the Pacific Northwest [Chapter 3

    Treesearch

    John M. Buffington; Richard D. Woodsmith; Derek B. Booth; David R. Montgomery

    2003-01-01

    The variability of topography, geology, climate, vegetation, and land use in the Pacific Northwest creates considerable spatial and temporal variability of fluvial processes and reach-scale channel type. Here we identify process domains of typical Pacific Northwest watersheds and examine local physiographic and geologic controls on channel processes and response...

  5. Modified Exponential Weighted Moving Average (EWMA) Control Chart on Autocorrelation Data

    NASA Astrophysics Data System (ADS)

    Herdiani, Erna Tri; Fandrilla, Geysa; Sunusi, Nurtiti

    2018-03-01

    In general, observations used in statistical process control are assumed to be mutually independent. However, this assumption is often violated in practice. Consequently, statistical process control charts have been developed for interrelated processes, including Shewhart, Cumulative Sum (CUSUM), and exponentially weighted moving average (EWMA) control charts for data that are autocorrelated. One researcher stated that such charts are not suitable if the same control limits as in the case of independent variables are used. For this reason, it is necessary to apply a time series model in building the control chart. A classical control chart for independent variables is usually applied to the residuals of the process; this procedure is permitted provided that the residuals are independent. In 1978, a Shewhart modification for the autoregressive process was introduced, using the distance between the sample mean and the target value compared to the standard deviation of the autocorrelated process. In this paper, we examine the mean of the EWMA for an autocorrelated process, derived from Montgomery and Patel. Performance was investigated by examining the Average Run Length (ARL) based on the Markov chain method.
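
    As a minimal illustration of charting an autocorrelated variable on residuals (the classical workaround the abstract refers to, not the modified chart or the Markov-chain ARL computation itself), the sketch below fits an AR(1) model to a synthetic series, applies an EWMA statistic to the residuals and flags points beyond the asymptotic limits. All data and parameter values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Simulate an AR(1) process variable with a sustained mean shift halfway through.
    n, phi = 400, 0.7
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(scale=1.0)
    x[200:] += 1.5

    # Fit AR(1) by least squares and chart the residuals, since classical charts
    # assume independent observations.
    phi_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    resid = x[1:] - phi_hat * x[:-1]

    # EWMA statistic and its asymptotic control limits.
    lam, L = 0.2, 3.0
    sigma = resid[:100].std(ddof=1)        # estimated from an in-control window
    z = np.zeros_like(resid)
    for t in range(len(resid)):
        prev = z[t - 1] if t else 0.0
        z[t] = lam * resid[t] + (1 - lam) * prev
    limit = L * sigma * np.sqrt(lam / (2 - lam))
    alarms = np.where(np.abs(z) > limit)[0]
    print(f"phi_hat={phi_hat:.2f}, first alarm at residual index {alarms[0] if alarms.size else None}")
    ```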

  6. A New Turbo-shaft Engine Control Law during Variable Rotor Speed Transient Process

    NASA Astrophysics Data System (ADS)

    Hua, Wei; Miao, Lizhen; Zhang, Haibo; Huang, Jinquan

    2015-12-01

    A closed-loop control law employing compressor guide vanes is first investigated to solve the problem of unacceptable fuel-flow dynamic changes under single-fuel control for turbo-shaft engines, especially for rotorcraft during variable rotor speed processes. Based on an Augmented Linear Quadratic Regulator (ALQR) algorithm, a dual-input, single-output robust control scheme is proposed for a turbo-shaft engine, involving closed-loop adjustment not only of the fuel flow but also of the compressor guide vanes. Furthermore, compared to single fuel control, digital simulation cases of variable rotor speed using this new scheme have been implemented on the basis of an integrated helicopter and engine model. The results show that command tracking of the free turbine rotor speed can be asymptotically realized. Moreover, the fuel flow transient process has been significantly improved, and the fuel consumption has been cut down by more than 2% while keeping the helicopter in unchanged level flight.

  7. Motion makes sense: an adaptive motor-sensory strategy underlies the perception of object location in rats.

    PubMed

    Saraf-Sinik, Inbar; Assa, Eldad; Ahissar, Ehud

    2015-06-10

    Tactile perception is obtained by coordinated motor-sensory processes. We studied the processes underlying the perception of object location in freely moving rats. We trained rats to identify the relative location of two vertical poles placed in front of them and measured at high resolution the motor and sensory variables (19 and 2 variables, respectively) associated with this whiskers-based perceptual process. We found that the rats developed stereotypic head and whisker movements to solve this task, in a manner that can be described by several distinct behavioral phases. During two of these phases, the rats' whiskers coded object position by first temporal and then angular coding schemes. We then introduced wind (in two opposite directions) and remeasured their perceptual performance and motor-sensory variables. Our rats continued to perceive object location in a consistent manner under wind perturbations while maintaining all behavioral phases and relatively constant sensory coding. Constant sensory coding was achieved by keeping one group of motor variables (the "controlled variables") constant, despite the perturbing wind, at the cost of strongly modulating another group of motor variables (the "modulated variables"). The controlled variables included coding-relevant variables, such as head azimuth and whisker velocity. These results indicate that consistent perception of location in the rat is obtained actively, via a selective control of perception-relevant motor variables. Copyright © 2015 the authors 0270-6474/15/358777-13$15.00/0.

  8. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality affects overall process performance significantly. Assessing whether a control system needs improvement requires relevant and constructive measures. There are various methods, such as time-domain measures, Minimum Variance, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena. But process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that the signals originating from industrial installations have multifractal properties and that such an analysis may extend the standard approach with further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process. It helps to discover internal dependencies and human factors, which are hardly detectable otherwise.

  9. Two-phase strategy of neural control for planar reaching movements: I. XY coordination variability and its relation to end-point variability.

    PubMed

    Rand, Miya K; Shimansky, Yury P

    2013-03-01

    A quantitative model of optimal transport-aperture coordination (TAC) during reach-to-grasp movements has been developed in our previous studies. The utilization of that model for data analysis allowed, for the first time, examination of the phase dependence of the precision demand specified by the CNS for neurocomputational information processing during an ongoing movement. It was shown that the CNS utilizes a two-phase strategy for movement control. That strategy consists of reducing the precision demand for neural computations during the initial phase, which decreases the cost of information processing at the expense of a lower extent of control optimality. To successfully grasp the target object, the CNS increases precision demand during the final phase, resulting in a higher extent of control optimality. In the present study, we generalized the model of optimal TAC to a model of optimal coordination between X and Y components of point-to-point planar movements (XYC). We investigated whether the CNS uses the two-phase control strategy for controlling those movements, and how the strategy parameters depend on the prescribed movement speed, movement amplitude and the size of the target area. The results indeed revealed a substantial similarity between the CNS's regulation of TAC and XYC. First, the variability of XYC within individual trials was minimal, meaning that execution noise during the movement was insignificant. Second, the inter-trial variability of XYC was considerable during the majority of the movement time, meaning that the precision demand for information processing was lowered, which is characteristic of the initial phase. That variability significantly decreased, indicating a higher extent of control optimality, during the shorter final movement phase. The final phase was the longest (shortest) under the most (least) challenging combination of speed and accuracy requirements, fully consistent with the concept of the two-phase control strategy. This paper further discusses the relationship between motor variability and XYC variability.

  10. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearings manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is carried out. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the result of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA, once the vibration readings exceed established quality limits.
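
    A stripped-down version of that statistical workflow (assumption checks followed by parametric and non-parametric comparisons of overall vibration readings across process-variable settings) can be sketched with SciPy. The data are synthetic, and Levene's test stands in for the homoscedasticity tests named in the abstract.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    # Overall vibration displacement readings under three illustrative settings
    # of one grinding-process variable (synthetic data).
    setups = [rng.normal(loc=m, scale=0.4, size=30) for m in (2.0, 2.1, 2.6)]

    # Check assumptions (normality per group, equal variances), then compare groups.
    print("normality p-values:", [round(stats.shapiro(g).pvalue, 3) for g in setups])
    print("equal-variance p-value:", round(stats.levene(*setups).pvalue, 3))
    print("one-way ANOVA:", stats.f_oneway(*setups))
    print("Kruskal-Wallis (non-parametric fallback):", stats.kruskal(*setups))
    ```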

  11. Optimization under variability and uncertainty: a case study for NOx emissions control for a gasification system.

    PubMed

    Chen, Jianjun; Frey, H Christopher

    2004-12-15

    Methods for optimization of process technologies considering the distinction between variability and uncertainty are developed and applied to case studies of NOx control for Integrated Gasification Combined Cycle systems. Existing methods of stochastic optimization (SO) and stochastic programming (SP) are demonstrated. A comparison of SO and SP results provides the value of collecting additional information to reduce uncertainty. For example, an expected annual benefit of 240,000 dollars is estimated if uncertainty can be reduced before a final design is chosen. SO and SP are typically applied to uncertainty. However, when applied to variability, the benefit of dynamic process control is obtained. For example, an annual savings of 1 million dollars could be achieved if the system is adjusted to changes in process conditions. When variability and uncertainty are treated distinctively, a coupled stochastic optimization and programming method and a two-dimensional stochastic programming method are demonstrated via a case study. For the case study, the mean annual benefit of dynamic process control is estimated to be 700,000 dollars, with a 95% confidence range of 500,000 dollars to 940,000 dollars. These methods are expected to be of greatest utility for problems involving a large commitment of resources, for which small differences in designs can produce large cost savings.

  12. Application of fluorescence spectroscopy for on-line bioprocess monitoring and control

    NASA Astrophysics Data System (ADS)

    Boehl, Daniela; Solle, D.; Toussaint, Hans J.; Menge, M.; Renemann, G.; Lindemann, Carsten; Hitzmann, Bernd; Scheper, Thomas-Helmut

    2001-02-01

    Modern bioprocess control requires fast data acquisition and in-time evaluation of bioprocess variables. On-line fluorescence spectroscopy for data acquisition and the use of chemometric methods fulfil these requirements. The presented investigations were performed with fluorescence spectrometers covering wide ranges of excitation and emission wavelengths. By detection of several biogenic fluorophors (amino acids, coenzymes and vitamins), a large amount of information about the state of the bioprocess is obtained. For the evaluation of the process variables, partial least squares regression is used. This technique was applied to several bioprocesses: the production of ergotamine by Claviceps purpurea, the production of t-PA (tissue plasminogen activator) by animal cells, and brewing processes. The main point of monitoring the brewing processes was to determine the process variables cell count and extract concentration.
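
    A minimal sketch of calibrating such a chemometric model follows, using scikit-learn's PLS regression on synthetic stand-ins for unfolded fluorescence spectra and two offline reference variables (e.g. cell count and extract concentration). The dimensions, data and component count are assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(5)
    # X: fluorescence spectra unfolded to vectors; y: offline reference values.
    X = rng.normal(size=(60, 200))
    y = X[:, :5] @ rng.normal(size=(5, 2)) + 0.1 * rng.normal(size=(60, 2))

    # Cross-validated prediction of the process variables from the spectra.
    pls = PLSRegression(n_components=5)
    y_hat = cross_val_predict(pls, X, y, cv=5)
    rmse = np.sqrt(((y - y_hat) ** 2).mean(axis=0))
    print("cross-validated RMSE per process variable:", np.round(rmse, 3))
    ```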

  13. Number versus Continuous Quantity in Numerosity Judgments by Fish

    ERIC Educational Resources Information Center

    Agrillo, Christian; Piffer, Laura; Bisazza, Angelo

    2011-01-01

    In quantity discrimination tasks, adults, infants and animals have been sometimes observed to process number only after all continuous variables, such as area or density, have been controlled for. This has been taken as evidence that processing number may be more cognitively demanding than processing continuous variables. We tested this hypothesis…

  14. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    García-Díaz, J. Carlos

    2009-11-01

    Fault detection and diagnosis is an important problem in process engineering, as process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing, and the increasingly stringent quality requirements in the automotive industry have also demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationship among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures and bath level. The entire data set, consisting of 48 galvanized steel coils, was divided into two sets: the first training data set comprised 25 conforming coils and the second data set comprised 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical; in most applications, the dependent variable is binary. The results show that the logistic generalized linear models provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
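
    A minimal sketch of this kind of logistic classification is shown below, with synthetic stand-ins for the monitored variables and the 25 conforming / 23 nonconforming coils. The column layout and the injected signal are assumptions, not the authors' data or model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    # Stand-ins for the monitored variables (strip velocity, bath temperatures,
    # bath level); column meanings are illustrative only.
    X = rng.normal(size=(48, 6))
    y = np.r_[np.zeros(25), np.ones(23)]       # 0 = conforming, 1 = nonconforming
    X[:, 0] += 0.8 * y                         # inject a weak class signal

    clf = make_pipeline(StandardScaler(), LogisticRegression())
    scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"cross-validated AUC: {scores.mean():.2f}")
    ```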

  15. Cognitive inconsistency in bipolar patients is determined by increased intra-individual variability in initial phase of task performance.

    PubMed

    Krukow, Paweł; Szaniawska, Ola; Harciarek, Michał; Plechawska-Wójcik, Małgorzata; Jonak, Kamil

    2017-03-01

    Bipolar patients show high intra-individual variability during cognitive processing. However, it is not known whether there are specific fluctuations of variability contributing to the overall high cognitive inconsistency. The objective was to compare dynamic performance profiles of patients and healthy controls to identify hypothetical differences and their associations with overall variability and processing speed. Changes in the reaction-time iSD during processing-speed test performance over time were measured by dividing the iSD for the whole task into four consecutive parts. Motor speed and cognitive effort were controlled. Patients with BD exhibited significantly lower processing speed and higher intra-individual variability compared with HC. The profile of intra-individual variability changes over the time of performance was significantly different in the BD versus HC group: F(3, 207) = 8.60, p < 0.0001, ηp² = 0.11. The iSD of BD patients in the initial phase of performance was three times higher than in the last. There were no significant differences between the four intervals in the HC group. The inter-group difference in the initial part of the profiles remained significant after controlling for several cognitive and clinical variables. The applied computer version of the Cognitive Speed Test was relatively new and, thus, replication studies are needed. The effect seen in the present study is driven mainly by BD type I. Patients with BD exhibit problems with setting a stimulus-response association in the starting phase of cognitive processing. This deficit may negatively interfere with other cognitive functions, decreasing the level of psychosocial functioning, and therefore should be explored in future studies. Copyright © 2017 Elsevier B.V. All rights reserved.
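
    The iSD partitioning described above (splitting a reaction-time series into four consecutive parts and computing the intra-individual standard deviation in each) can be illustrated in a few lines; the reaction-time data below are synthetic and purely illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    rt = rng.lognormal(mean=-0.5, sigma=0.3, size=80)   # synthetic reaction times (s)

    # Split the trial sequence into four consecutive parts and compute the
    # intra-individual SD (iSD) of reaction times within each part.
    parts = np.array_split(rt, 4)
    isd = [p.std(ddof=1) for p in parts]
    print("iSD per quarter of the task:", [round(v, 3) for v in isd])
    ```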

  16. Using Paper Helicopters to Teach Statistical Process Control

    ERIC Educational Resources Information Center

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  17. Bio-inspired online variable recruitment control of fluidic artificial muscles

    NASA Astrophysics Data System (ADS)

    Jenkins, Tyler E.; Chapman, Edward M.; Bryant, Matthew

    2016-12-01

    This paper details the creation of a hybrid variable recruitment control scheme for fluidic artificial muscle (FAM) actuators with an emphasis on maximizing system efficiency and switching control performance. Variable recruitment is the process of altering a system’s active number of actuators, allowing operation in distinct force regimes. Previously, FAM variable recruitment was only quantified with offline, manual valve switching; this study addresses the creation and characterization of novel, on-line FAM switching control algorithms. The bio-inspired algorithms are implemented in conjunction with a PID and model-based controller, and applied to a simulated plant model. Variable recruitment transition effects and chatter rejection are explored via a sensitivity analysis, allowing a system designer to weigh tradeoffs in actuator modeling, algorithm choice, and necessary hardware. Variable recruitment is further developed through simulation of a robotic arm tracking a variety of spline position inputs, requiring several levels of actuator recruitment. Switching controller performance is quantified and compared with baseline systems lacking variable recruitment. The work extends current variable recruitment knowledge by creating novel online variable recruitment control schemes, and exploring how online actuator recruitment affects system efficiency and control performance. Key topics associated with implementing a variable recruitment scheme, including the effects of modeling inaccuracies, hardware considerations, and switching transition concerns are also addressed.

  18. Time dependent analysis of assay comparability: a novel approach to understand intra- and inter-site variability over time

    NASA Astrophysics Data System (ADS)

    Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire

    2015-09-01

    We demonstrate here a novel use of statistical tools to study the intra- and inter-site variability of five early drug metabolism and pharmacokinetics in vitro assays over time. Firstly, a tool for process control is presented. It shows the overall assay variability, but it also allows changes due to assay adjustments to be followed and can additionally highlight other, potentially unexpected variations. Secondly, we define the minimum discriminatory difference/ratio to help projects understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3-month periods and followed over time for each assay. Again, assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking and in silico modeling.

  19. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, being a twin screw high shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations to the process continuity was also evaluated using Hotelling's T² and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
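
    A compact sketch of such PCA-based monitoring follows, with synthetic stand-ins for the 35 logged variables and simple percentile-based limits in place of the theoretical T² and Q limits usually employed; it is illustrative only and is not the ConsiGma™-25 model.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    noc = rng.normal(size=(300, 35))           # stand-in normal-operating-condition logs
    run = rng.normal(size=(60, 35))
    run[30:, :3] += 2.0                        # imposed disturbance on a few variables

    scaler = StandardScaler().fit(noc)
    pca = PCA(n_components=5).fit(scaler.transform(noc))

    def t2_q(block):
        z = scaler.transform(block)
        scores = pca.transform(z)
        t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)          # Hotelling's T2
        q = np.sum((z - pca.inverse_transform(scores)) ** 2, axis=1)      # Q / SPE
        return t2, q

    t2_noc, q_noc = t2_q(noc)
    t2_lim, q_lim = np.percentile(t2_noc, 99), np.percentile(q_noc, 99)   # empirical limits
    t2_run, q_run = t2_q(run)
    print("samples flagged:", int(np.sum((t2_run > t2_lim) | (q_run > q_lim))))
    ```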

  20. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped define which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results are expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).

  1. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped define which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results are expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).
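
    As a worked illustration of the long-term performance indices P(p) and P(pk) discussed in these two records, the sketch below computes them for simulated dose deviations against the ±4% clinical tolerances; the deviation data are synthetic, not the cancer center's quality control results.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    # Simulated dose deviations (%) between measured and calculated doses.
    dev = rng.normal(loc=0.3, scale=1.1, size=120)
    lsl, usl = -4.0, 4.0                      # clinical tolerances (±4%)

    mu, sigma = dev.mean(), dev.std(ddof=1)
    pp = (usl - lsl) / (6 * sigma)            # overall spread vs. tolerance width
    ppk = min(usl - mu, mu - lsl) / (3 * sigma)   # accounts for off-center mean
    print(f"Pp = {pp:.2f}, Ppk = {ppk:.2f}")
    ```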

  2. Controlling the COD removal of an A-stage pilot study with instrumentation and automatic process control.

    PubMed

    Miller, Mark W; Elliott, Matt; DeArmond, Jon; Kinyua, Maureen; Wett, Bernhard; Murthy, Sudhir; Bott, Charles B

    2017-06-01

    The pursuit of fully autotrophic nitrogen removal via the anaerobic ammonium oxidation (anammox) pathway has led to an increased interest in carbon removal technologies, particularly the A-stage of the adsorption/bio-oxidation (A/B) process. The high-rate operation of the A-stage and the lack of automatic process control often result in wide variations of chemical oxygen demand (COD) removal that can ultimately impact nitrogen removal in the downstream B-stage process. This study evaluated the use of dissolved oxygen (DO)- and mixed liquor suspended solids (MLSS)-based automatic control strategies through the use of in situ on-line sensors in the A-stage of an A/B pilot study. The objective of using these control strategies was to reduce the variability of COD removal by the A-stage and thus the variability of the effluent C/N. The use of cascade DO control in the A-stage did not impact COD removal at the conditions tested in this study, likely because the bulk DO concentration (>0.5 mg/L) was maintained above the half saturation coefficient of heterotrophic organisms for DO. MLSS-based solids retention time (SRT) control, where MLSS was used as a surrogate for SRT, did not significantly reduce the effluent C/N variability but it was able to reduce COD removal variation in the A-stage by 90%.

  3. Which Measures of Online Control Are Least Sensitive to Offline Processes?

    PubMed

    de Grosbois, John; Tremblay, Luc

    2018-02-28

    A major challenge to the measurement of online control is the contamination by offline, planning-based processes. The current study examined the sensitivity of four measures of online control to offline changes in reaching performance induced by prism adaptation and terminal feedback. These measures included the squared Z scores (Z²) of correlations of limb position at 75% movement time versus movement end, variable error, time after peak velocity, and a frequency-domain analysis (pPower). The results indicated that variable error and time after peak velocity were sensitive to the prism adaptation. Furthermore, only the Z² values were biased by the terminal feedback. Ultimately, the current study has demonstrated the sensitivity of limb kinematic measures to offline control processes and that pPower analyses may yield the most suitable measure of online control.
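
    As a rough illustration of two of the measures named above, the sketch below computes endpoint variable error and time after peak velocity from synthetic one-dimensional reach trajectories; the sampling rate, trajectory shape, and noise level are assumptions, not the study's data.

    ```python
    # Sketch: variable error and time after peak velocity from simulated reaches.
    import numpy as np

    fs = 200.0                              # sampling rate (Hz), assumed
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, int(fs))
    # 30 simulated reaches toward a 30 cm target (minimum-jerk-like profile + noise)
    reaches = [30 * (10*t**3 - 15*t**4 + 6*t**5) + rng.normal(0, 0.3, t.size)
               for _ in range(30)]

    # Variable error: SD of movement endpoints across trials
    endpoints = np.array([r[-1] for r in reaches])
    variable_error = endpoints.std(ddof=1)

    # Time after peak velocity: movement time minus time of peak velocity
    def time_after_peak_velocity(pos, fs):
        vel = np.gradient(pos, 1.0 / fs)
        t_peak = np.argmax(np.abs(vel)) / fs
        movement_time = (len(pos) - 1) / fs
        return movement_time - t_peak

    tapv = np.mean([time_after_peak_velocity(r, fs) for r in reaches])
    print(f"variable error = {variable_error:.2f} cm, "
          f"mean time after peak velocity = {tapv:.3f} s")
    ```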

  4. Dimensional control of die castings

    NASA Astrophysics Data System (ADS)

    Karve, Aniruddha Ajit

    The demand for net shape die castings, which require little or no machining, is steadily increasing. Stringent customer requirements are forcing die casters to deliver high quality castings in increasingly short lead times. Dimensional conformance to customer specifications is an inherent part of die casting quality. The dimensional attributes of a die casting are essentially dependent upon many factors--the quality of the die and the degree of control over the process variables being the two major sources of dimensional error in die castings. This study focused on investigating the nature and the causes of dimensional error in die castings. The two major components of dimensional error i.e., dimensional variability and die allowance were studied. The major effort of this study was to qualitatively and quantitatively study the effects of casting geometry and process variables on die casting dimensional variability and die allowance. This was accomplished by detailed dimensional data collection at production die casting sites. Robust feature characterization schemes were developed to describe complex casting geometry in quantitative terms. Empirical modeling was utilized to quantify the effects of the casting variables on dimensional variability and die allowance for die casting features. A number of casting geometry and process variables were found to affect dimensional variability in die castings. The dimensional variability was evaluated by comparisons with current published dimensional tolerance standards. The casting geometry was found to play a significant role in influencing the die allowance of the features measured. The predictive models developed for dimensional variability and die allowance were evaluated to test their effectiveness. Finally, the relative impact of all the components of dimensional error in die castings was put into perspective, and general guidelines for effective dimensional control in the die casting plant were laid out. The results of this study will contribute to enhancement of dimensional quality and lead time compression in the die casting industry, thus making it competitive with other net shape manufacturing processes.

  5. System-wide hybrid MPC-PID control of a continuous pharmaceutical tablet manufacturing process via direct compaction.

    PubMed

    Singh, Ravendra; Ierapetritou, Marianthi; Ramachandran, Rohit

    2013-11-01

    The next generation of QbD-based pharmaceutical products will be manufactured through continuous processing. This will allow the integration of online/inline monitoring tools, coupled with an efficient, advanced model-based feedback control system, to achieve precise control of process variables, so that the predefined product quality can be achieved consistently. The direct compaction process considered in this study is highly interactive and involves time delays for a number of process variables due to sensor placements, process equipment dimensions, and the flow characteristics of the solid material. A simple feedback regulatory control system (e.g., PI(D)) by itself may not be sufficient to achieve the tight process control that is mandated by regulatory authorities. The process presented herein comprises coupled dynamics involving slow and fast responses, indicating the requirement of a hybrid control scheme such as a combined MPC-PID control scheme. In this manuscript, an efficient system-wide hybrid control strategy for an integrated continuous pharmaceutical tablet manufacturing process via direct compaction has been designed. The designed control system is a hybrid scheme of MPC-PID control. An effective controller parameter tuning strategy involving an ITAE method coupled with an optimization strategy has been used for tuning of both MPC and PID parameters. The designed hybrid control system has been implemented in a first-principles model-based flowsheet that was simulated in gPROMS (Process System Enterprise). Results demonstrate enhanced performance of critical quality attributes (CQAs) under the hybrid control scheme compared to only PID or MPC control schemes, illustrating the potential of a hybrid control scheme in improving pharmaceutical manufacturing operations. Copyright © 2013 Elsevier B.V. All rights reserved.
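
    The ITAE-based tuning idea can be illustrated on a much simpler loop than the authors' gPROMS flowsheet. The sketch below tunes a discrete PID on a toy first-order process by minimizing the integral of time-weighted absolute error; the process model, initial gains, and setpoint are illustrative assumptions.

    ```python
    # Sketch: ITAE-based PID tuning on a toy first-order process (illustrative only).
    import numpy as np
    from scipy.optimize import minimize

    dt, T_end = 0.1, 50.0
    K, tau = 2.0, 5.0                     # first-order process: tau*dy/dt = -y + K*u

    def itae(params):
        Kp, Ki, Kd = params
        y, integ, prev_err, cost = 0.0, 0.0, 0.0, 0.0
        for k in range(int(T_end / dt)):
            err = 1.0 - y                 # unit setpoint change
            integ += err * dt
            deriv = (err - prev_err) / dt
            u = Kp * err + Ki * integ + Kd * deriv
            prev_err = err
            y += dt * (-y + K * u) / tau  # Euler step of the process
            if not np.isfinite(y):        # penalize unstable gain combinations
                return 1e9
            cost += (k * dt) * abs(err) * dt   # ITAE: integral of t*|e(t)|
        return cost

    res = minimize(itae, x0=[1.0, 0.1, 0.0], method="Nelder-Mead")
    print("ITAE-tuned PID gains (Kp, Ki, Kd):", np.round(res.x, 3))
    ```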

  6. Analytical design of an industrial two-term controller for optimal regulatory control of open-loop unstable processes under operational constraints.

    PubMed

    Tchamna, Rodrigue; Lee, Moonyong

    2018-01-01

    This paper proposes a novel optimization-based approach for the design of an industrial two-term proportional-integral (PI) controller for the optimal regulatory control of unstable processes subjected to three common operational constraints related to the process variable, manipulated variable and its rate of change. To derive analytical design relations, the constrained optimal control problem in the time domain was transformed into an unconstrained optimization problem in a new parameter space via an effective parameterization. The resulting optimal PI controller has been verified to yield optimal performance and stability of an open-loop unstable first-order process under operational constraints. The proposed analytical design method explicitly takes into account the operational constraints in the controller design stage and also provides useful insights into the optimal controller design. Practical procedures for designing optimal PI parameters and a feasible constraint set exclusive of complex optimization steps are also proposed. The proposed controller was compared with several other PI controllers to illustrate its performance. The robustness of the proposed controller against plant-model mismatch has also been investigated. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Enhanced Cumulative Sum Charts for Monitoring Process Dispersion

    PubMed Central

    Abujiya, Mu’azu Ramat; Riaz, Muhammad; Lee, Muhammad Hisyam

    2015-01-01

    The cumulative sum (CUSUM) control chart is widely used in industry for the detection of small and moderate shifts in process location and dispersion. For efficient monitoring of process variability, we present several CUSUM control charts for monitoring changes in the standard deviation of a normal process. The newly developed control charts, based on well-structured sampling techniques - extreme ranked set sampling, extreme double ranked set sampling and double extreme ranked set sampling - have significantly enhanced the CUSUM chart's ability to detect a wide range of shifts in process variability. The relative performances of the proposed CUSUM scale charts are evaluated in terms of the average run length (ARL) and standard deviation of run length, for a point shift in variability. Moreover, for overall performance, we use the average ratio ARL and average extra quadratic loss. A comparison of the proposed CUSUM control charts with the classical CUSUM R chart, the classical CUSUM S chart, the fast initial response (FIR) CUSUM R chart, the FIR CUSUM S chart, the ranked set sampling (RSS) based CUSUM R chart and the RSS based CUSUM S chart, among others, is presented. An illustrative example using a real dataset is given to demonstrate the practicability of the application of the proposed schemes. PMID:25901356
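
    A minimal sketch of a tabular CUSUM for dispersion is given below; it uses plain subgroup standard deviations rather than the ranked-set-sampling schemes developed in the paper, and the reference value and decision interval are illustrative.

    ```python
    # Sketch: tabular CUSUM on subgroup standard deviations for an upward shift
    # in dispersion. Not the paper's ranked-set-sampling variants; k and h are
    # illustrative choices.
    import numpy as np

    rng = np.random.default_rng(1)
    # 30 subgroups of size 5; dispersion shifts from sigma = 1.0 to 1.5 after subgroup 20
    subgroups = [rng.normal(0, 1.0 if i < 20 else 1.5, 5) for i in range(30)]
    S = np.array([g.std(ddof=1) for g in subgroups])

    sigma0 = 1.0
    c4 = 0.9400                   # unbiasing constant for n = 5
    target = c4 * sigma0          # in-control mean of S
    k, h = 0.25, 1.0              # reference value and decision interval

    c_plus = 0.0
    for i, s in enumerate(S, start=1):
        c_plus = max(0.0, c_plus + s - (target + k))
        if c_plus > h:
            print(f"Upward shift in dispersion signalled at subgroup {i}")
            break
    else:
        print("No shift signalled")
    ```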

  8. [Statistical Process Control (SPC) can help prevent treatment errors without increasing costs in radiotherapy].

    PubMed

    Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C

    2010-01-01

    Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, its root cause was identified and eliminated to prevent set-up errors. Set-up errors were measured for medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions and then the upper control limits were calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using sub-groups of 3 patients, three times each shift. These values were plotted on a control chart in real time. Control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on an X chart, helped monitor the set-up process stability and, if and when the stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified and corrective action was taken before proceeding with the treatment. The SPC protocol focuses on controlling the variability due to assignable causes instead of focusing on patient-to-patient variability, which normally does not exist. Compared to weekly sampling of set-up error in each and every patient, which may only ensure that just those sampled sessions were set up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces the control costs. Copyright © 2009 SECA. Published by Elsevier Espana. All rights reserved.
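
    The subgroup charting described above can be sketched as follows; the set-up error data, the subgroup size of 3, and the standard A2 chart constant are used for illustration only and are not the clinic's values.

    ```python
    # Sketch: X-bar chart limits from subgroups of 3 set-up error measurements
    # (one dimension, e.g. cranial-caudal), using the mean-range method.
    import numpy as np

    rng = np.random.default_rng(2)
    subgroups = rng.normal(0.0, 2.0, size=(25, 3))   # 25 subgroups of 3 patients (mm)

    xbar = subgroups.mean(axis=1)
    R = subgroups.max(axis=1) - subgroups.min(axis=1)
    A2 = 1.023                                       # Shewhart constant for n = 3

    center = xbar.mean()
    UCL = center + A2 * R.mean()
    LCL = center - A2 * R.mean()

    out_of_control = np.where((xbar > UCL) | (xbar < LCL))[0]
    print(f"CL={center:.2f} mm, UCL={UCL:.2f} mm, LCL={LCL:.2f} mm, "
          f"signals at subgroups: {out_of_control + 1}")
    ```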

  9. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on the product quality. Therefore, the largest possible editing window is required. Such parameters are, for example, the movement of the laser beam across the component for laser keyhole welding. That's why it is necessary to keep the formation of welding seams within specified limits. Therefore, the quality of laser welding processes is ensured by using post-process methods, like ultrasonic inspection, or special in-process methods. These in-process systems only achieve a simple evaluation which shows whether the weld seam is acceptable or not. Furthermore, in-process systems use no feedback for changing the control variables such as speed of the laser or adjustment of laser power. In this paper the research group presents current results of the research field of Online Monitoring, Online Controlling and Model predictive controlling in laser welding processes to increase the product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state space model is ascertained, which includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for the model and integrated into an NI-Real-Time-System.

  10. Trends in Solidification Grain Size and Morphology for Additive Manufacturing of Ti-6Al-4V

    NASA Astrophysics Data System (ADS)

    Gockel, Joy; Sheridan, Luke; Narra, Sneha P.; Klingbeil, Nathan W.; Beuth, Jack

    2017-12-01

    Metal additive manufacturing (AM) is used for both prototyping and production of final parts. Therefore, there is a need to predict and control the microstructural size and morphology. Process mapping is an approach that represents AM process outcomes in terms of input variables. In this work, analytical, numerical, and experimental approaches are combined to provide a holistic view of trends in the solidification grain structure of Ti-6Al-4V across a wide range of AM process input variables. The thermal gradient is shown to vary significantly through the depth of the melt pool, which precludes development of fully equiaxed microstructure throughout the depth of the deposit within any practical range of AM process variables. A strategy for grain size control is demonstrated based on the relationship between melt pool size and grain size across multiple deposit geometries, and additional factors affecting grain size are discussed.

  11. System properties, feedback control and effector coordination of human temperature regulation.

    PubMed

    Werner, Jürgen

    2010-05-01

    The aim of human temperature regulation is to protect body processes by establishing a relative constancy of deep body temperature (regulated variable), in spite of external and internal influences on it. This is basically achieved by a distributed multi-sensor, multi-processor, multi-effector proportional feedback control system. The paper explains why proportional control implies inherent deviations of the regulated variable from the value in the thermoneutral zone. The concept of feedback of the thermal state of the body, conveniently represented by a high-weighted core temperature (T(c)) and low-weighted peripheral temperatures (T(s)) is equivalent to the control concept of "auxiliary feedback control", using a main (regulated) variable (T(c)), supported by an auxiliary variable (T(s)). This concept implies neither regulation of T(s) nor feedforward control. Steady-states result in the closed control-loop, when the open-loop properties of the (heat transfer) process are compatible with those of the thermoregulatory processors. They are called operating points or balance points and are achieved due to the inherent property of dynamical stability of the thermoregulatory feedback loop. No set-point and no comparison of signals (e.g. actual-set value) are necessary. Metabolic heat production and sweat production, though receiving the same information about the thermal state of the body, are independent effectors with different thresholds and gains. Coordination between one of these effectors and the vasomotor effector is achieved by the fact that changes in the (heat transfer) process evoked by vasomotor control are taken into account by the metabolic/sweat processor.
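
    A minimal sketch of the proportional, auxiliary-feedback idea described above is given below; the weights, gain, and reference temperatures are illustrative assumptions rather than physiological estimates from the paper.

    ```python
    # Sketch: proportional effector drive from a high-weighted core and a
    # low-weighted skin temperature signal. All numbers are illustrative.
    def effector_drive(T_core, T_skin, gain,
                       T_core_ref=37.0, T_skin_ref=34.0,
                       w_core=0.8, w_skin=0.2):
        """Proportional output; positive drive -> heat-loss effectors (e.g. sweating),
        negative drive -> heat production (e.g. shivering)."""
        error = w_core * (T_core - T_core_ref) + w_skin * (T_skin - T_skin_ref)
        return gain * error

    # Example: slightly elevated core, cooled skin
    print(effector_drive(37.3, 33.0, gain=10.0))
    ```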

  12. A unified method for evaluating real-time computer controllers: A case study. [aircraft control

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Krishna, C. M.; Lee, Y. H.

    1982-01-01

    A real time control system consists of a synergistic pair, that is, a controlled process and a controller computer. Performance measures for real time controller computers are defined on the basis of the nature of this synergistic pair. A case study of a typical critical controlled process is presented in the context of new performance measures that express the performance of both controlled processes and real time controllers (taken as a unit) on the basis of a single variable: controller response time. Controller response time is a function of current system state, system failure rate, electrical and/or magnetic interference, etc., and is therefore a random variable. Control overhead is expressed as a monotonically nondecreasing function of the response time and the system suffers catastrophic failure, or dynamic failure, if the response time for a control task exceeds the corresponding system hard deadline, if any. A rigorous probabilistic approach is used to estimate the performance measures. The controlled process chosen for study is an aircraft in the final stages of descent, just prior to landing. First, the performance measures for the controller are presented. Secondly, control algorithms for solving the landing problem are discussed and finally the impact of the performance measures on the problem is analyzed.
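
    The notion of controller response time as a random variable with a hard deadline can be illustrated with a small Monte Carlo sketch; the response-time distribution, deadline, and overhead function below are assumptions chosen only to show the calculation.

    ```python
    # Sketch: probability of dynamic failure (response time > hard deadline) and
    # expected control overhead under a monotonically nondecreasing cost function.
    import numpy as np

    rng = np.random.default_rng(3)
    response_time = rng.lognormal(mean=-3.0, sigma=0.5, size=100_000)  # seconds
    hard_deadline = 0.2                                                 # seconds

    p_dynamic_failure = np.mean(response_time > hard_deadline)

    # A monotonically nondecreasing overhead: zero up to t0, then quadratic growth
    def overhead(t, t0=0.05):
        return np.where(t <= t0, 0.0, (t - t0) ** 2)

    expected_overhead = overhead(response_time).mean()
    print(f"P(dynamic failure) = {p_dynamic_failure:.4f}, "
          f"expected overhead = {expected_overhead:.3e}")
    ```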

  13. Variability in Proactive and Reactive Cognitive Control Processes Across the Adult Lifespan

    PubMed Central

    Karayanidis, Frini; Whitson, Lisa Rebecca; Heathcote, Andrew; Michie, Patricia T.

    2011-01-01

    Task-switching paradigms produce a highly consistent age-related increase in mixing cost [longer response time (RT) on repeat trials in mixed-task than single-task blocks] but a less consistent age effect on switch cost (longer RT on switch than repeat trials in mixed-task blocks). We use two approaches to examine the adult lifespan trajectory of control processes contributing to mixing cost and switch cost: latent variables derived from an evidence accumulation model of choice, and event-related potentials (ERP) that temporally differentiate proactive (cue-driven) and reactive (target-driven) control processes. Under highly practiced and prepared task conditions, aging was associated with increasing RT mixing cost but reducing RT switch cost. Both effects were largely due to the same cause: an age effect for mixed-repeat trials. In terms of latent variables, increasing age was associated with slower non-decision processes, slower rate of evidence accumulation about the target, and higher response criterion. Age effects on mixing costs were evident only on response criterion, the amount of evidence required to trigger a decision, whereas age effects on switch cost were present for all three latent variables. ERPs showed age-related increases in preparation for mixed-repeat trials, anticipatory attention, and post-target interference. Cue-locked ERPs that are linked to proactive control were associated with early emergence of age differences in response criterion. These results are consistent with age effects on strategic processes controlling decision caution. Consistent with an age-related decline in cognitive flexibility, younger adults flexibly adjusted response criterion from trial-to-trial on mixed-task blocks, whereas older adults maintained a high criterion for all trials. PMID:22073037

  14. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995-96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.

  15. Analysis on electronic control unit of continuously variable transmission

    NASA Astrophysics Data System (ADS)

    Cao, Shuanggui

    A continuously variable transmission (CVT) system can keep the engine operating along the line of best fuel economy, improving fuel economy, saving fuel, and reducing harmful gas emissions. At the same time, a continuously variable transmission makes vehicle speed changes smoother and improves ride comfort. Although CVT technology has developed considerably, it still has many shortcomings. The CVT systems of ordinary vehicles still suffer from low efficiency, poor starting performance, low transmitted power, imperfect control, high cost, and other issues. Therefore, many researchers have begun to study new types of continuously variable transmission. A transmission with electronic control can achieve automatic control of power transmission and exploit the full characteristics of the engine to achieve optimal control of the powertrain, so that the vehicle always operates near its best condition. The electronic control unit is composed of a core processor, input and output circuit modules, and other auxiliary circuit modules. The input module collects and processes the many signals sent by the sensors, such as throttle angle, brake signals, engine speed, input- and output-shaft speeds of the transmission, manual shift signals, mode selection signals, gear position, and the speed ratio signal, and conditions them for the controller core.

  16. Learning-based controller for biotechnology processing, and method of using

    DOEpatents

    Johnson, John A.; Stoner, Daphne L.; Larsen, Eric D.; Miller, Karen S.; Tolle, Charles R.

    2004-09-14

    The present invention relates to process control where some of the controllable parameters are difficult or impossible to characterize. The present invention relates to process control of such systems in biotechnology, but is not limited thereto. Additionally, the present invention relates to process control in biotechnological minerals processing. In the inventive method, an application of the present invention manipulates a minerals bioprocess to find local extrema (maxima or minima) for selected output variables/process goals by using a learning-based controller for bioprocess oxidation of minerals during hydrometallurgical processing. The learning-based controller operates with or without human supervision and works to find process optima without previously defined optima, due to the non-characterized nature of the process being manipulated.

  17. Control of variable speed variable pitch wind turbine based on a disturbance observer

    NASA Astrophysics Data System (ADS)

    Ren, Haijun; Lei, Xin

    2017-11-01

    In this paper, a novel sliding mode controller based on a disturbance observer (DOB), intended to optimize the efficiency of a variable speed variable pitch (VSVP) wind turbine, is developed and analyzed. Due to the high nonlinearity of the VSVP system, the model is linearized to obtain a state space model of the system. Then, a conventional sliding mode controller is designed and a DOB is added to estimate wind speed. The proposed control strategy can successfully deal with the random nature of wind speed, the nonlinearity of the VSVP system, parameter uncertainty, and external disturbances. Adding the observer to the sliding mode controller greatly reduces the chattering produced by the sliding mode switching gain. The simulation results show that the proposed control system is effective and robust.
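
    A minimal sketch of sliding mode control with a simple disturbance observer on a first-order surrogate plant is shown below; the plant model, gains, and disturbance are illustrative and do not reproduce the paper's VSVP wind turbine model.

    ```python
    # Sketch: sliding-mode control with a nonlinear disturbance observer (DOB)
    # on a first-order surrogate plant x_dot = a*x + b*(u + d). Illustrative only.
    import numpy as np

    dt, T = 0.001, 10.0
    a, b = -0.5, 1.0                    # surrogate plant parameters
    lam, eta = 2.0, 0.5                 # sliding-mode gains
    L = 20.0                            # observer gain

    x, z = 0.0, 0.0                     # plant state and observer internal state
    x_ref = 1.0                         # constant reference
    for k in range(int(T / dt)):
        t = k * dt
        d = 0.3 * np.sin(0.5 * t)       # unknown, slowly varying disturbance
        d_hat = z + L * x               # DOB estimate of d
        e = x - x_ref                   # sliding surface s = e
        # equivalent control + smoothed switching term + disturbance compensation
        u = (1.0 / b) * (-a * x - lam * e - eta * np.tanh(e / 0.05)) - d_hat
        # Euler integration of plant and observer
        x_dot = a * x + b * (u + d)
        z += dt * (-L * b * z - L * (b * L * x + a * x + b * u))
        x += dt * x_dot

    print("final tracking error:", abs(x - x_ref))
    ```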

  18. Effect of Scientific Argumentation on the Development of Scientific Process Skills in the Context of Teaching Chemistry

    ERIC Educational Resources Information Center

    Gultepe, Nejla; Kilic, Ziya

    2015-01-01

    This study was conducted in order to determine the differences in integrated scientific process skills (designing experiments, forming data tables, drawing graphs, graph interpretation, determining the variables and hypothesizing, changing and controlling variables) of students (n = 17) who were taught with an approach based on scientific…

  19. Wandering Minds and Wavering Rhythms: Linking Mind Wandering and Behavioral Variability

    ERIC Educational Resources Information Center

    Seli, Paul; Cheyne, James Allan; Smilek, Daniel

    2013-01-01

    Mind wandering is a pervasive feature of human cognition often associated with the withdrawal of task-related executive control processes. Here, we explore the possibility that, in tasks requiring executive control to sustain consistent responding, moments of mind wandering could be associated with moments of increased behavioral variability. To…

  20. Proactive Control Processes in Event-Based Prospective Memory: Evidence from Intraindividual Variability and Ex-Gaussian Analyses

    ERIC Educational Resources Information Center

    Ball, B. Hunter; Brewer, Gene A.

    2018-01-01

    The present study implemented an individual differences approach in conjunction with response time (RT) variability and distribution modeling techniques to better characterize the cognitive control dynamics underlying ongoing task cost (i.e., slowing) and cue detection in event-based prospective memory (PM). Three experiments assessed the relation…

  1. The role of physician characteristics in clinical trial acceptance: testing pathways of influence.

    PubMed

    Curbow, Barbara; Fogarty, Linda A; McDonnell, Karen A; Chill, Julia; Scott, Lisa Benz

    2006-03-01

    Eight videotaped vignettes were developed that assessed the effects of three physician-related experimental variables (in a 2 x 2 x 2 factorial design) on clinical trial (CT) knowledge, video knowledge, information processing, CT beliefs, affective evaluations (attitudes), and CT acceptance. It was hypothesized that the physician variables (community versus academic-based affiliation, enthusiastic versus neutral presentation of the trial, and new versus previous relationship with the patient) would serve as communication cues that would interrupt message processing, leading to lower knowledge gain but more positive beliefs, attitudes, and CT acceptance. A total of 262 women (161 survivors and 101 controls) participated in the study. The manipulated variables primarily influenced the intermediary variables of post-test CT beliefs and satisfaction with information rather than knowledge or information processing. Multiple regression results indicated that CT acceptance was associated with positive post-CT beliefs, a lower level of information processing, satisfaction with information, and control status. Based on these results, CT acceptance does not appear to be based on a rational decision-making model; this has implications for both the ethics of informed consent and research conceptual models.

  2. Development and demonstration of manufacturing processes for fabricating graphite/LARC-160 polyimide structural elements, part 4, paragraph B

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A quality assurance program was developed which included specifications for Celion/LARC-160 polyimide materials and quality control of materials and processes. The effects of monomer and/or polymer variables and prepreg variables on the processability of Celion/LARC prepreg were included. Processes for fabricating laminates, honeycomb core panels, and chopped fiber moldings were developed. Specimens were fabricated and tests conducted to qualify the processes for fabrication of demonstration components.

  3. Increased Force Variability Is Associated with Altered Modulation of the Motorneuron Pool Activity in Autism Spectrum Disorder (ASD).

    PubMed

    Wang, Zheng; Kwon, Minhyuk; Mohanty, Suman; Schmitt, Lauren M; White, Stormi P; Christou, Evangelos A; Mosconi, Matthew W

    2017-03-25

    Force control deficits have been repeatedly documented in autism spectrum disorder (ASD). They are associated with worse social and daily living skill impairments in patients suggesting that developing a more mechanistic understanding of the central and peripheral processes that cause them may help guide the development of treatments that improve multiple outcomes in ASD. The neuromuscular mechanisms underlying force control deficits are not yet understood. Seventeen individuals with ASD and 14 matched healthy controls completed an isometric index finger abduction test at 60% of their maximum voluntary contraction (MVC) during recording of the first dorsal interosseous (FDI) muscle to determine the neuromuscular processes associated with sustained force variability. Central modulation of the motorneuron pool activation of the FDI muscle was evaluated at delta (0-4 Hz), alpha (4-10 Hz), beta (10-35 Hz) and gamma (35-60 Hz) frequency bands. ASD patients showed greater force variability than controls when attempting to maintain a constant force. Relative to controls, patients also showed increased central modulation of the motorneuron pool at beta and gamma bands. For controls, reduced force variability was associated with reduced delta frequency modulation of the motorneuron pool activity of the FDI muscle and increased modulation at beta and gamma bands. In contrast, delta, beta, and gamma frequency oscillations were not associated with force variability in ASD. These findings suggest that alterations of central mechanisms that control motorneuron pool firing may underlie the common and often impairing symptoms of ASD.

  4. Deficits in auditory processing contribute to impairments in vocal affect recognition in autism spectrum disorders: A MEG study.

    PubMed

    Demopoulos, Carly; Hopkins, Joyce; Kopald, Brandon E; Paulson, Kim; Doyle, Lauren; Andrews, Whitney E; Lewine, Jeffrey David

    2015-11-01

    The primary aim of this study was to examine whether there is an association between magnetoencephalography-based (MEG) indices of basic cortical auditory processing and vocal affect recognition (VAR) ability in individuals with autism spectrum disorder (ASD). MEG data were collected from 25 children/adolescents with ASD and 12 control participants using a paired-tone paradigm to measure quality of auditory physiology, sensory gating, and rapid auditory processing. Group differences were examined in auditory processing and vocal affect recognition ability. The relationship between differences in auditory processing and vocal affect recognition deficits was examined in the ASD group. Replicating prior studies, participants with ASD showed longer M1n latencies and impaired rapid processing compared with control participants. These variables were significantly related to VAR, with the linear combination of auditory processing variables accounting for approximately 30% of the variability after controlling for age and language skills in participants with ASD. VAR deficits in ASD are typically interpreted as part of a core, higher order dysfunction of the "social brain"; however, these results suggest they also may reflect basic deficits in auditory processing that compromise the extraction of socially relevant cues from the auditory environment. As such, they also suggest that therapeutic targeting of sensory dysfunction in ASD may have additional positive implications for other functional deficits. (c) 2015 APA, all rights reserved.

  5. A Process Dynamics and Control Experiment for the Undergraduate Laboratory

    ERIC Educational Resources Information Center

    Spencer, Jordan L.

    2009-01-01

    This paper describes a process control experiment. The apparatus includes a three-vessel glass flow system with a variable flow configuration, means for feeding dye solution controlled by a stepper-motor driven valve, and a flow spectrophotometer. Students use impulse response data and nonlinear regression to estimate three parameters of a model…

  6. Pharmaceutical quality by design: product and process development, understanding, and control.

    PubMed

    Yu, Lawrence X

    2008-04-01

    The purpose of this paper is to discuss the pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. The QbD was described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. The QbD is a systematic approach to pharmaceutical development. It means designing and developing formulations and manufacturing processes to ensure predefined product quality. Some of the QbD elements include: Defining target product quality profile; Designing product and manufacturing processes; Identifying critical quality attributes, process parameters, and sources of variability; Controlling manufacturing processes to produce consistent quality over time. Using QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables. Product testing confirms the product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.

  7. Doing It Right: 366 answers to computing questions you didn't know you had

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herring, Stuart Davis

    Slides include information on history: version control, version control: branches, version control: Git, releases, requirements, readability, readability control flow, global variables, architecture, architecture redundancy, processes, input/output, unix, etcetera.

  8. Development of process data capturing, analysis and controlling for thermal spray techniques - SprayTracker

    NASA Astrophysics Data System (ADS)

    Kelber, C.; Marke, S.; Trommler, U.; Rupprecht, C.; Weis, S.

    2017-03-01

    Thermal spraying processes are becoming increasingly important in high-technology areas, such as automotive engineering and medical technology. The method offers the advantage of a local layer application with different materials and high deposition rates. Challenges in the application of thermal spraying result from the complex interaction of different influencing variables, which can be attributed to the properties of different materials, operating equipment supply, electrical parameters, flow mechanics, plasma physics and automation. In addition, spraying systems are subject to constant wear. Due to the process specification and the high demands on the produced coatings, innovative quality assurance tools are necessary. A central aspect, which has not yet been considered, is the data management in relation to the present measured variables, in particular the spraying system, the handling system, working safety devices and additional measuring sensors. Both the recording of all process-characterizing variables, their linking and evaluation as well as the use of the data for the active process control presuppose a novel, innovative control system (hardware and software) that was to be developed within the scope of the research project. In addition, new measurement methods and sensors are to be developed and qualified in order to improve the process reliability of thermal spraying.

  9. Contextual analysis of fluid intelligence.

    PubMed

    Salthouse, Timothy A; Pink, Jeffrey E; Tucker-Drob, Elliot M

    2008-01-01

    The nature of fluid intelligence was investigated by identifying variables that were, and were not, significantly related to this construct. Relevant information was obtained from three sources: re-analyses of data from previous studies, a study in which 791 adults performed storage-plus-processing working memory tasks, and a study in which 236 adults performed a variety of working memory, updating, and cognitive control tasks. The results suggest that fluid intelligence represents a broad individual difference dimension contributing to diverse types of controlled or effortful processing. The analyses also revealed that very few of the age-related effects on the target variables were statistically independent of effects on established cognitive abilities, which suggests most of the age-related influences on a wide variety of cognitive control variables overlap with age-related influences on cognitive abilities such as fluid intelligence, episodic memory, and perceptual speed.

  10. On the reliability of Shewhart-type control charts for multivariate process variability

    NASA Astrophysics Data System (ADS)

    Djauhari, Maman A.; Salleh, Rohayu Mohd; Zolkeply, Zunnaaim; Li, Lee Siaw

    2017-05-01

    We show that in the current practice of multivariate process variability monitoring, the reliability of Shewhart-type control charts cannot be measured except when the sub-group size n tends to infinity. However, the requirement of large n is meaningless not only in manufacturing industry, where n is small, but also in service industry, where n is moderate. In this paper, we introduce a new definition of control limits in the two most appreciated control charts in the literature, i.e., the improved generalized variance chart (IGV-chart) and the vector variance chart (VV-chart). With the new definition of control limits, the reliability of the control charts can be determined. Some important properties of the new control limits will be derived and the computational technique for the probability of false alarm will be presented.
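
    As a simplified illustration of charting multivariate dispersion, the sketch below monitors the sample generalized variance |S| per subgroup with empirical Phase-I percentile limits; it does not reproduce the IGV- or VV-chart limits derived in the paper.

    ```python
    # Sketch: monitoring multivariate dispersion with the sample generalized
    # variance |S| per subgroup, using simple empirical percentile limits.
    import numpy as np

    rng = np.random.default_rng(4)
    p, n, m = 2, 10, 500                      # 2 variables, subgroups of 10, 500 subgroups
    cov = np.array([[1.0, 0.3], [0.3, 1.0]])  # in-control covariance (illustrative)

    gv = []
    for _ in range(m):
        sample = rng.multivariate_normal(np.zeros(p), cov, size=n)
        S = np.cov(sample, rowvar=False)
        gv.append(np.linalg.det(S))
    gv = np.array(gv)

    lcl, ucl = np.percentile(gv, [0.135, 99.865])   # ~3-sigma-equivalent percentiles
    print(f"|S| chart (empirical): LCL={lcl:.3f}, UCL={ucl:.3f}, "
          f"out-of-control subgroups: {np.sum((gv < lcl) | (gv > ucl))}")
    ```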

  11. Dream controller

    DOEpatents

    Cheng, George Shu-Xing; Mulkey, Steven L; Wang, Qiang; Chow, Andrew J

    2013-11-26

    A method and apparatus for intelligently controlling continuous process variables. A Dream Controller comprises an Intelligent Engine mechanism and a number of Model-Free Adaptive (MFA) controllers, each of which is suitable to control a process with specific behaviors. The Intelligent Engine can automatically select the appropriate MFA controller and its parameters so that the Dream Controller can be easily used by people with limited control experience and those who do not have the time to commission, tune, and maintain automatic controllers.

  12. eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.

    PubMed

    Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre

    2016-11-01

    Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high throughput technologies, schema or model variability induced by large scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variabilities in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variabilities and we propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work through a dynamic integration process; 2) variability among studies using an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow controlling data completeness and consistency, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomic platform.

  13. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include glycosylation profile of the final product along with process attributes, such as viable cell density and level of antibody expression. These were related to process variables, raw materials components of the chemically defined hybridoma media, concentration of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
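
    The kind of multivariate analysis described above can be sketched with a partial least squares (PLS) model relating process variables to a quality attribute; the data below are synthetic placeholders and the variable names are assumptions.

    ```python
    # Sketch: a PLS model relating process variables and raw-material inputs to a
    # quality attribute (e.g. a glycan fraction). Data are synthetic placeholders.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    n_runs = 40
    # columns (assumed): glucose feed, methionine, DO setpoint, viable cell density, titer
    X = rng.normal(size=(n_runs, 5))
    # synthetic quality attribute loosely driven by two of the inputs + noise
    y = 0.6 * X[:, 0] - 0.4 * X[:, 2] + rng.normal(scale=0.2, size=n_runs)

    pls = PLSRegression(n_components=2)
    pls.fit(X, y)
    print("R^2 on training runs:", round(pls.score(X, y), 3))
    print("X loadings (which inputs drive the attribute):")
    print(np.round(pls.x_loadings_, 2))
    ```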

  14. The design of control system of livestock feeding processing

    NASA Astrophysics Data System (ADS)

    Sihombing, Juna; Napitupulu, Humala L.; Hidayati, Juliza

    2018-03-01

    PT. XYZ is a company that produces animal feed. One type of animal feed produced is 105 ISA P. In carrying out its production process, PT. XYZ faces the problem of rejected feed amounts during 2014 to June 2015 due to the amount of animal feed that exceeds the standard feed quality of 13% of moisture content and 3% for ash content. Therefore, the researchers analyzed the relationship between factors affecting the quality and extent of damage by using regression and correlation and determine the optimum value of each processing process. Analysis results found that variables affecting product quality are mixing time, steam conditioning temperature and cooling time. The most dominant variable affecting the product moisture content is mixing time with the correlation coefficient of (0.7959) and the most dominant variable affecting the ash content of the product during the processing is mixing time with the correlation coefficient of (0.8541). The design of the proposed product processing control is to run the product processing process with mixing time 235 seconds, steam conditioning temperature 87 0C and cooling time 192 seconds. Product quality 105 ISA P obtained by using this design is with 12.16% moisture content and ash content of 2.59%.

  15. Metrology for Fuel Cell Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stocker, Michael; Stanfield, Eric

    2015-02-04

    The project was divided into three subprojects. The first subproject is Fuel Cell Manufacturing Variability and Its Impact on Performance. The objective was to determine if flow field channel dimensional variability has an impact on fuel cell performance. The second subproject is Non-contact Sensor Evaluation for Bipolar Plate Manufacturing Process Control and Smart Assembly of Fuel Cell Stacks. The objective was to enable cost reduction in the manufacture of fuel cell plates by providing a rapid non-contact measurement system for in-line process control. The third subproject is Optical Scatterfield Metrology for Online Catalyst Coating Inspection of PEM Soft Goods. The objective was to evaluate the suitability of Optical Scatterfield Microscopy as a viable measurement tool for in situ process control of catalyst coatings.

  16. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study will point to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study on the process. Interpretation of such a study provides information about stability, process variability, changing of trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts provides the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
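
    The capability and sigma-level calculation described above reduces to a few lines; in the sketch below the specification limits, data, and the sigma-level convention (3 x Cpk, ignoring the 1.5-sigma shift) are illustrative assumptions, not the case study's values.

    ```python
    # Sketch: process capability and sigma level for a tablet critical quality
    # attribute (e.g. assay, % label claim). All numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(6)
    assay = rng.normal(loc=100.2, scale=1.1, size=120)   # % label claim
    LSL, USL = 95.0, 105.0

    mu, sigma = assay.mean(), assay.std(ddof=1)
    Cp = (USL - LSL) / (6 * sigma)
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma)
    sigma_level = 3 * Cpk        # one common convention; the 1.5-sigma shift is ignored
    print(f"Cp={Cp:.2f}, Cpk={Cpk:.2f}, sigma level ~ {sigma_level:.1f}")
    ```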

  17. Testing for Neuropsychological Endophenotypes in Siblings Discordant for ADHD

    PubMed Central

    Bidwell, L. Cinnamon; Willcutt, Erik G.; DeFries, John C.; Pennington, Bruce F.

    2007-01-01

    Objective Neurocognitive deficits associated with attention deficit-hyperactivity disorder (ADHD) may be useful intermediate endophenotypes for determining specific genetic pathways that contribute to ADHD. Methods This study administered 17 measures from prominent neuropsychological theories of ADHD (executive function, processing speed, arousal regulation and motivation/delay aversion) in dizygotic (DZ) twin pairs discordant for ADHD and control twin pairs (ages 8–18) in order to compare performance between twins affected with ADHD (n = 266), their unaffected co-twins (n = 228), and control children from twin pairs without ADHD or learning difficulties (n = 332). Results ADHD subjects show significant impairment on executive function, processing speed, and response variability measures compared to control subjects. Unaffected cotwins of ADHD subjects are significantly impaired on nearly all the same measures as their ADHD siblings, even when subclinical symptoms of ADHD are controlled. Conclusion Executive function, processing speed, and response variability deficits may be useful endophenotypes for genetic studies of ADHD. PMID:17585884

  18. Model reduction in integrated controls-structures design

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    1993-01-01

    It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.

  19. Variable sensory perception in autism.

    PubMed

    Haigh, Sarah M

    2018-03-01

    Autism is associated with sensory and cognitive abnormalities. Individuals with autism generally show normal or superior early sensory processing abilities compared to healthy controls, but deficits in complex sensory processing. In the current opinion paper, it will be argued that sensory abnormalities impact cognition by limiting the amount of signal that can be used to interpret and interact with environment. There is a growing body of literature showing that individuals with autism exhibit greater trial-to-trial variability in behavioural and cortical sensory responses. If multiple sensory signals that are highly variable are added together to process more complex sensory stimuli, then this might destabilise later perception and impair cognition. Methods to improve sensory processing have shown improvements in more general cognition. Studies that specifically investigate differences in sensory trial-to-trial variability in autism, and the potential changes in variability before and after treatment, could ascertain if trial-to-trial variability is a good mechanism to target for treatment in autism. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  20. Phase 1 of the automated array assembly task of the low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Pryor, R. A.; Grenon, L. A.; Coleman, M. G.

    1978-01-01

    The results of a study of process variables and solar cell variables are presented. Interactions between variables and their effects upon control ranges of the variables are identified. The results of a cost analysis for manufacturing solar cells are discussed. The cost analysis includes a sensitivity analysis of a number of cost factors.

  1. Reconfigurable pipelined processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saccardi, R.J.

    1989-09-19

    This patent describes a reconfigurable pipelined processor for processing data. It comprises: a plurality of memory devices for storing bits of data; a plurality of arithmetic units for performing arithmetic functions with the data; cross bar means for connecting the memory devices with the arithmetic units for transferring data therebetween; at least one counter connected with the cross bar means for providing a source of addresses to the memory devices; at least one variable tick delay device connected with each of the memory devices and arithmetic units; and means for providing control bits to the variable tick delay device for variably controlling the input and output operations thereof to selectively delay the memory devices and arithmetic units to align the data for processing in a selected sequence.

  2. Increased Force Variability Is Associated with Altered Modulation of the Motorneuron Pool Activity in Autism Spectrum Disorder (ASD)

    PubMed Central

    Wang, Zheng; Kwon, MinHyuk; Mohanty, Suman; Schmitt, Lauren M.; White, Stormi P.; Christou, Evangelos A.; Mosconi, Matthew W.

    2017-01-01

    Force control deficits have been repeatedly documented in autism spectrum disorder (ASD). They are associated with worse social and daily living skill impairments in patients suggesting that developing a more mechanistic understanding of the central and peripheral processes that cause them may help guide the development of treatments that improve multiple outcomes in ASD. The neuromuscular mechanisms underlying force control deficits are not yet understood. Seventeen individuals with ASD and 14 matched healthy controls completed an isometric index finger abduction test at 60% of their maximum voluntary contraction (MVC) during recording of the first dorsal interosseous (FDI) muscle to determine the neuromuscular processes associated with sustained force variability. Central modulation of the motorneuron pool activation of the FDI muscle was evaluated at delta (0–4 Hz), alpha (4–10 Hz), beta (10–35 Hz) and gamma (35–60 Hz) frequency bands. ASD patients showed greater force variability than controls when attempting to maintain a constant force. Relative to controls, patients also showed increased central modulation of the motorneuron pool at beta and gamma bands. For controls, reduced force variability was associated with reduced delta frequency modulation of the motorneuron pool activity of the FDI muscle and increased modulation at beta and gamma bands. In contrast, delta, beta, and gamma frequency oscillations were not associated with force variability in ASD. These findings suggest that alterations of central mechanisms that control motorneuron pool firing may underlie the common and often impairing symptoms of ASD. PMID:28346344

  3. Intra-individual variability in information processing speed reflects white matter microstructure in multiple sclerosis.

    PubMed

    Mazerolle, Erin L; Wojtowicz, Magdalena A; Omisade, Antonina; Fisk, John D

    2013-01-01

    Slowed information processing speed is commonly reported in persons with multiple sclerosis (MS), and is typically investigated using clinical neuropsychological tests, which provide sensitive indices of mean-level information processing speed. However, recent studies have demonstrated that within-person variability or intra-individual variability (IIV) in information processing speed may be a more sensitive indicator of neurologic status than mean-level performance on clinical tests. We evaluated the neural basis of increased IIV in mildly affected relapsing-remitting MS patients by characterizing the relation between IIV (controlling for mean-level performance) and white matter integrity using diffusion tensor imaging (DTI). Twenty women with relapsing-remitting MS and 20 matched control participants completed the Computerized Test of Information Processing (CTIP), from which both mean response time and IIV were calculated. Other clinical measures of information processing speed were also collected. Relations between IIV on the CTIP and DTI metrics of white matter microstructure were evaluated using tract-based spatial statistics. We observed slower and more variable responses on the CTIP in MS patients relative to controls. Significant relations between white matter microstructure and IIV were observed for MS patients. Increased IIV was associated with reduced integrity in more white matter tracts than was slowed information processing speed as measured by either mean CTIP response time or other neuropsychological test scores. Thus, despite the common use of mean-level performance as an index of cognitive dysfunction in MS, IIV may be more sensitive to the overall burden of white matter disease at the microstructural level. Furthermore, our study highlights the potential value of considering within-person fluctuations, in addition to mean-level performance, for uncovering brain-behavior relationships in neurologic disorders with widespread white matter pathology.

  4. Empirical mode decomposition processing to improve multifocal-visual-evoked-potential signal analysis in multiple sclerosis

    PubMed Central

    2018-01-01

    Objective To study the performance of multifocal-visual-evoked-potential (mfVEP) signals filtered using empirical mode decomposition (EMD) in discriminating, based on amplitude, between control and multiple sclerosis (MS) patient groups, and to reduce variability in interocular latency in control subjects. Methods MfVEP signals were obtained from controls, clinically definitive MS and MS-risk progression patients (radiologically isolated syndrome (RIS) and clinically isolated syndrome (CIS)). The conventional method of processing mfVEPs consists of using a 1–35 Hz bandpass frequency filter (XDFT). The EMD algorithm was used to decompose the XDFT signals into several intrinsic mode functions (IMFs). This signal processing was assessed by computing the amplitudes and latencies of the XDFT and IMF signals (XEMD). The amplitudes from the full visual field and from ring 5 (9.8–15° eccentricity) were studied. The discrimination index was calculated between controls and patients. Interocular latency values were computed from the XDFT and XEMD signals in a control database to study variability. Results Using the amplitude of the mfVEP signals filtered with EMD (XEMD) obtains higher discrimination index values than the conventional method when control, MS-risk progression (RIS and CIS) and MS subjects are studied. The lowest variability in interocular latency computations from the control patient database was obtained by comparing the XEMD signals with the XDFT signals. Even better results (amplitude discrimination and latency variability) were obtained in ring 5 (9.8–15° eccentricity of the visual field). Conclusions Filtering mfVEP signals using the EMD algorithm will result in better identification of subjects at risk of developing MS and better accuracy in latency studies. This could be applied to assess visual cortex activity in MS diagnosis and evolution studies. PMID:29677200
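
    As an illustration of how such an amplitude-based comparison might be computed, the sketch below assumes the PyEMD package for the decomposition, a hypothetical choice of retained IMFs, and a simple ROC-type index standing in for the paper's exact discrimination index.

```python
import numpy as np
from PyEMD import EMD   # assumption: the EMD-signal package is installed

def emd_filtered(signal, keep=(1, 2, 3)):
    """Decompose one mfVEP trace into IMFs and rebuild it from a subset of
    IMFs (which IMFs to keep is a hypothetical choice, not the paper's)."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))
    keep = [k for k in keep if k < len(imfs)]
    return imfs[keep].sum(axis=0) if keep else np.asarray(signal, dtype=float)

def peak_to_trough(signal, window):
    """Response amplitude as max minus min inside an analysis window (samples)."""
    seg = signal[window[0]:window[1]]
    return float(seg.max() - seg.min())

def discrimination_auc(control_amps, patient_amps):
    """Illustrative discrimination index: probability that a random control
    amplitude exceeds a random patient amplitude (an ROC AUC), used here only
    as a stand-in for the index reported in the paper."""
    a = np.asarray(control_amps)[:, None]
    b = np.asarray(patient_amps)[None, :]
    return float((a > b).mean() + 0.5 * (a == b).mean())
```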

  5. Real-time laser cladding control with variable spot size

    NASA Astrophysics Data System (ADS)

    Arias, J. L.; Montealegre, M. A.; Vidal, F.; Rodríguez, J.; Mann, S.; Abels, P.; Motmans, F.

    2014-03-01

    Laser cladding has been used in different industries to improve surface properties or to reconstruct damaged parts. In order to cover areas considerably larger than the diameter of the laser beam, successive partially overlapping tracks are deposited. Without control over the process variables, this leads to a progressive temperature increase that can degrade the mechanical properties of the clad material. Commonly, the process is monitored and controlled by a PC using cameras, but such control suffers from a lack of speed caused by the image-processing step. The aim of this work is to design and develop an FPGA-based laser cladding control system. This system modifies the laser beam power according to the melt pool width, which is measured using a CMOS camera. All control and monitoring tasks are carried out by an FPGA, taking advantage of its abundant resources and speed of operation. The robustness of the image-processing algorithm is assessed, as well as the control system performance. Laser power is decreased as substrate temperature increases, thus maintaining a constant clad width. This FPGA-based control system is integrated in an adaptive laser cladding system, which also includes an adaptive optical system that will control the laser focus distance on the fly. The whole system will constitute an efficient instrument for repairing parts with complex geometries and for coating selective surfaces. This will be a significant step toward the full industrial implementation of an automated laser cladding process.
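
    As a rough illustration of the width-tracking idea (not the actual FPGA implementation), a discrete PI loop that trims laser power toward a melt-pool width setpoint might look like the following sketch; the gains, power limits, and nominal power are assumptions.

```python
class CladWidthController:
    """Minimal discrete PI loop: adjust laser power so the measured melt-pool
    width tracks a setpoint. Gains and limits are illustrative only."""

    def __init__(self, width_setpoint_mm, p_nominal_w=1000.0,
                 kp=150.0, ki=20.0, p_min_w=200.0, p_max_w=2000.0):
        self.setpoint = width_setpoint_mm
        self.p_nominal = p_nominal_w
        self.kp, self.ki = kp, ki
        self.p_min, self.p_max = p_min_w, p_max_w
        self.integral = 0.0

    def update(self, measured_width_mm, dt_s):
        # Pool too narrow -> positive error -> raise power (and vice versa).
        error = self.setpoint - measured_width_mm
        self.integral += error * dt_s
        power = self.p_nominal + self.kp * error + self.ki * self.integral
        return min(max(power, self.p_min), self.p_max)

controller = CladWidthController(width_setpoint_mm=3.0)
print(controller.update(measured_width_mm=3.4, dt_s=0.001))  # one camera frame
```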

  6. Potential interactions among linguistic, autonomic, and motor factors in speech.

    PubMed

    Kleinow, Jennifer; Smith, Anne

    2006-05-01

    Though anecdotal reports link certain speech disorders to increases in autonomic arousal, few studies have described the relationship between arousal and speech processes. Additionally, it is unclear how increases in arousal may interact with other cognitive-linguistic processes to affect speech motor control. In this experiment we examine potential interactions between autonomic arousal, linguistic processing, and speech motor coordination in adults and children. Autonomic responses (heart rate, finger pulse volume, tonic skin conductance, and phasic skin conductance) were recorded simultaneously with upper and lower lip movements during speech. The lip aperture variability (LA variability index) across multiple repetitions of sentences that varied in length and syntactic complexity was calculated under low- and high-arousal conditions. High arousal conditions were elicited by performance of the Stroop color word task. Children had significantly higher lip aperture variability index values across all speaking tasks, indicating more variable speech motor coordination. Increases in syntactic complexity and utterance length were associated with increases in speech motor coordination variability in both speaker groups. There was a significant effect of Stroop task, which produced increases in autonomic arousal and increased speech motor variability in both adults and children. These results provide novel evidence that high arousal levels can influence speech motor control in both adults and children. (c) 2006 Wiley Periodicals, Inc.

  7. Confidence limits for contribution plots in multivariate statistical process control using bootstrap estimates.

    PubMed

    Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund

    2016-02-18

    In multivariate statistical process control (MSPC), when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying the process variables that were affected by, or might be the cause of, the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This can produce false readings purely because of differences in natural variation, measurement uncertainty, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under normal operating conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide CLs for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build CLs for all contribution plots in online PCA-based MSPC. The new strategy for estimating CLs is compared with previously reported CLs for contribution plots. An industrial batch process dataset is used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
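
    A minimal sketch of the re-sampling idea is shown below; it assumes scikit-learn's PCA, a squared-prediction-error contribution definition, and autoscaled NOC training data, none of which are taken from the paper itself.

```python
import numpy as np
from sklearn.decomposition import PCA

def spe_contributions(model, X):
    """Per-variable contributions to the squared prediction error (one common
    contribution definition; score contributions could be handled similarly)."""
    residuals = X - model.inverse_transform(model.transform(X))
    return residuals ** 2            # rows: observations, columns: variables

def bootstrap_contribution_limits(X_noc, n_components=3, n_boot=1000,
                                  quantile=0.95, seed=0):
    """Re-sample the normal-operating-conditions (NOC) training data to build
    an upper confidence limit for each variable's contribution."""
    rng = np.random.default_rng(seed)
    n, p = X_noc.shape
    stats = np.empty((n_boot, p))
    for b in range(n_boot):
        sample = X_noc[rng.integers(0, n, size=n)]        # bootstrap the rows
        z = (sample - sample.mean(0)) / sample.std(0, ddof=1)
        model = PCA(n_components=n_components).fit(z)
        stats[b] = spe_contributions(model, z).mean(axis=0)
    return np.quantile(stats, quantile, axis=0)            # one limit per variable
```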

  8. Interfacing sensory input with motor output: does the control architecture converge to a serial process along a single channel?

    PubMed Central

    van de Kamp, Cornelis; Gawthrop, Peter J.; Gollee, Henrik; Lakie, Martin; Loram, Ian D.

    2013-01-01

    Modular organization in control architecture may underlie the versatility of human motor control; but the nature of the interface relating sensory input through task-selection in the space of performance variables to control actions in the space of the elemental variables is currently unknown. Our central question is whether the control architecture converges to a serial process along a single channel? In discrete reaction time experiments, psychologists have firmly associated a serial single channel hypothesis with refractoriness and response selection [psychological refractory period (PRP)]. Recently, we developed a methodology and evidence identifying refractoriness in sustained control of an external single degree-of-freedom system. We hypothesize that multi-segmental whole-body control also shows refractoriness. Eight participants controlled their whole body to ensure a head marker tracked a target as fast and accurately as possible. Analysis showed enhanced delays in response to stimuli with close temporal proximity to the preceding stimulus. Consistent with our preceding work, this evidence is incompatible with control as a linear time invariant process. This evidence is consistent with a single-channel serial ballistic process within the intermittent control paradigm with an intermittent interval of around 0.5 s. A control architecture reproducing intentional human movement control must reproduce refractoriness. Intermittent control is designed to provide computational time for an online optimization process and is appropriate for flexible adaptive control. For human motor control we suggest that parallel sensory input converges to a serial, single channel process involving planning, selection, and temporal inhibition of alternative responses prior to low dimensional motor output. Such design could aid robots to reproduce the flexibility of human control. PMID:23675342

  9. Constant versus variable response signal delays in speed-accuracy trade-offs: effects of advance preparation for processing time.

    PubMed

    Miller, Jeff; Sproesser, Gudrun; Ulrich, Rolf

    2008-07-01

    In two experiments, we used response signals (RSs) to control processing time and trace out speed-accuracy trade-off (SAT) functions in a difficult perceptual discrimination task. Each experiment compared performance in blocks of trials with constant and, hence, temporally predictable RS lags against performance in blocks with variable, unpredictable RS lags. In both experiments, essentially equivalent SAT functions were observed with constant and variable RS lags. We conclude that there is little effect of advance preparation for a given processing time, suggesting that the discrimination mechanisms underlying SAT functions are driven solely by bottom-up information processing in perceptual discrimination tasks.

  10. Controlling for confounding variables in MS-omics protocol: why modularity matters.

    PubMed

    Smith, Rob; Ventura, Dan; Prince, John T

    2014-09-01

    As the field of bioinformatics research continues to grow, more and more novel techniques are proposed to meet new challenges and improvements upon solutions to long-standing problems. These include data processing techniques and wet lab protocol techniques. Although the literature is consistently thorough in experimental detail and variable-controlling rigor for wet lab protocol techniques, bioinformatics techniques tend to be less described and less controlled. As the validation or rejection of hypotheses rests on the experiment's ability to isolate and measure a variable of interest, we urge the importance of reducing confounding variables in bioinformatics techniques during mass spectrometry experimentation. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  11. A multi-process model of self-regulation: influences of mindfulness, integrative self-knowledge and self-control in Iran.

    PubMed

    Ghorbani, Nima; Watson, P J; Farhadi, Mehran; Chen, Zhuo

    2014-04-01

    Self-regulation presumably rests upon multiple processes that include an awareness of ongoing self-experience, enduring self-knowledge and self-control. The present investigation tested this multi-process model using the Five-Facet Mindfulness Questionnaire (FFMQ) and the Integrative Self-Knowledge and Brief Self-Control Scales. Using a sample of 1162 Iranian university students, we confirmed the five-factor structure of the FFMQ in Iran and documented its factorial invariance across males and females. Self-regulatory variables correlated negatively with Perceived Stress, Depression, and Anxiety and positively with Self-Esteem and Satisfaction with Life. Partial mediation effects confirmed that self-regulatory measures ameliorated the disturbing effects of Perceived Stress. Integrative Self-Knowledge and Self-Control interacted to partially mediate the association of Perceived Stress with lower levels of Satisfaction with Life. Integrative Self-Knowledge, alone or in interaction with Self-Control, was the only self-regulation variable to display the expected mediation of Perceived Stress associations with all other measures. Self-Control failed to be implicated in self-regulation only in the mediation of Anxiety. These data confirmed the need to further examine this multi-process model of self-regulation. © 2014 International Union of Psychological Science.

  12. Analysis And Control System For Automated Welding

    NASA Technical Reports Server (NTRS)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  13. Comparison of Grand Median and Cumulative Sum Control Charts on Shuttlecock Weight Variable in CV Marjoko Kompas dan Domas

    NASA Astrophysics Data System (ADS)

    Musdalifah, N.; Handajani, S. S.; Zukhronah, E.

    2017-06-01

    Competition between homogeneous companies forces each company to maintain production quality. To do so, the company controls production using statistical quality control based on control charts. The Shewhart control chart is suited to normally distributed data, but production data are often non-normally distributed and exhibit small process shifts. The grand median control chart is a control chart for non-normally distributed data, while the cumulative sum (cusum) control chart is sensitive to small process shifts. The purpose of this research is to compare grand median and cusum control charts on the shuttlecock weight variable in CV Marjoko Kompas dan Domas by generating data that follow the actual distribution. The generated data are used to simulate the standard deviation multiplier for the grand median and cusum control charts. The simulation is tuned to an average run length (ARL) of 370. The grand median control chart detects ten points that are out of control, while the cusum control chart detects one. It can be concluded that the grand median control chart is better than the cusum control chart for these data.
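
    For reference, the statistic monitored by a two-sided tabular cusum chart can be sketched as follows; k and h are textbook-style choices in units of the process standard deviation, whereas the paper tunes its chart constants by simulation to reach an in-control ARL of about 370.

```python
import numpy as np

def tabular_cusum(x, target, sigma, k=0.5, h=5.0):
    """Two-sided tabular CUSUM. Returns the indices of samples that signal
    an out-of-control condition; the chart restarts after each signal."""
    z = (np.asarray(x, dtype=float) - target) / sigma
    c_plus = c_minus = 0.0
    signals = []
    for i, zi in enumerate(z):
        c_plus = max(0.0, c_plus + zi - k)    # accumulates upward shifts
        c_minus = max(0.0, c_minus - zi - k)  # accumulates downward shifts
        if c_plus > h or c_minus > h:
            signals.append(i)
            c_plus = c_minus = 0.0
    return signals

# Example: simulated shuttlecock weights drifting upward after sample 30.
rng = np.random.default_rng(1)
weights = np.concatenate([rng.normal(5.0, 0.05, 30), rng.normal(5.04, 0.05, 30)])
print(tabular_cusum(weights, target=5.0, sigma=0.05))
```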

  14. Thermal Management in Friction-Stir Welding of Precipitation-Hardening Aluminum Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Reynolds, Anthony

    2015-05-25

    Process design and implementation in FSW is mostly dependent on empirical information gathered through experience. Basic science of friction stir welding and processing can only be complete when fundamental interrelationships between process control parameters and response variables and the resulting weld microstructure and properties are established to a reasonable extent. It is known that primary process control parameters like tool rotation and translation rate and forge axis force have complicated and interactive relationships to process response variables such as peak temperature and time at temperature. Of primary influence on the other process response parameters are temperature and its gradient at the deformation and heat affected zones. Through review of pertinent works in the literature and some experimental results from boundary condition work performed in precipitation hardening aluminum alloys, this paper will partially elucidate the nature and effects of temperature transients caused by variation of thermal boundaries in Friction Stir Welding.

  15. Thermal Management in Friction-Stir Welding of Precipitation-Hardened Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Upadhyay, Piyush; Reynolds, Anthony P.

    2015-05-01

    Process design and implementation in friction-stir welding (FSW) is mostly dependent on empirical information. Basic science of FSW and processing can only be complete when fundamental interrelationships between the process control parameters and response variables and the resulting weld microstructure and properties are established to a reasonable extent. It is known that primary process control parameters such as tool rotation, translation rates, and forge axis force have complicated and interactive relationships to process-response variables such as peak temperature and time at temperature. Of primary influence on the other process-response parameters are temperature and its gradient in the deformation and heat-affected zones. Through a review of pertinent works in the literature and results from boundary condition experiments performed in precipitation-hardening aluminum alloys, this article partially elucidates the nature and effects of temperature transients caused by variation of thermal boundaries in FSW.

  16. Integrated controls design optimization

    DOEpatents

    Lou, Xinsheng; Neuschaefer, Carl H.

    2015-09-01

    A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230), a cost algorithm (225), and chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process input and output variables are related to the income of the plant, and others are related to the cost of plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.
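
    A minimal sketch of this optimizer structure, with a hypothetical process model and illustrative income and cost functions (the patent does not specify these forms), could use an off-the-shelf numerical optimizer:

```python
import numpy as np
from scipy.optimize import minimize

def process_model(u):
    """Placeholder process model: predict outputs y from manipulated inputs u."""
    return np.array([2.0 * u[0] + 0.5 * u[1], u[1] - 0.1 * u[0] ** 2])

def income(u, y):
    return 10.0 * y[0]                                    # revenue tied to output 0

def cost(u, y):
    return 3.0 * u[0] + 1.5 * u[1] + 2.0 * abs(y[1])      # illustrative cost terms

def objective(u):
    y = process_model(u)
    return -(income(u, y) - cost(u, y))   # maximize profit = minimize its negative

result = minimize(objective, x0=[1.0, 1.0],
                  bounds=[(0.0, 5.0), (0.0, 5.0)])        # operating-range limits
print(result.x, -result.fun)              # optimized operating parameter solution
```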

  17. Effects of task and age on the magnitude and structure of force fluctuations: insights into underlying neuro-behavioral processes.

    PubMed

    Vieluf, Solveig; Temprado, Jean-Jacques; Berton, Eric; Jirsa, Viktor K; Sleimen-Malkoun, Rita

    2015-03-13

    The present study aimed at characterizing the effects of increasing (relative) force level and aging on isometric force control. To achieve this objective and to infer changes in the underlying control mechanisms, measures of information transmission, as well as magnitude and time-frequency structure of behavioral variability were applied to force-time-series. Older adults were found to be weaker, more variable, and less efficient than young participants. As a function of force level, efficiency followed an inverted-U shape in both groups, suggesting a similar organization of the force control system. The time-frequency structure of force output fluctuations was only significantly affected by task conditions. Specifically, a narrower spectral distribution with more long-range correlations and an inverted-U pattern of complexity changes were observed with increasing force level. Although not significant, older participants displayed on average a less complex behavior for low and intermediate force levels. The changes in the force signal's regularity presented a strong dependence on time-scales, which significantly interacted with age and condition. An inverted-U profile was only observed for the time-scale relevant to the sensorimotor control process. However, in both groups the peak was not aligned with the optimum of efficiency. Our results support the view that behavioral variability, in terms of magnitude and structure, has a functional meaning and affords non-invasive markers of the adaptations of the sensorimotor control system to various constraints. The measures of efficiency and variability ought to be considered as complementary since they convey specific information on the organization of control processes. The reported weak age effect on variability and complexity measures suggests that the behavioral expression of the loss of complexity hypothesis is not as straightforward as conventionally admitted. However, group differences did not completely vanish, which suggests that age differences can be more or less apparent depending on task properties and whether difficulty is scaled in relative or absolute terms.

  18. The methodology of variable management of propellant fuel consumption by jet-propulsion engines of a spacecraft

    NASA Astrophysics Data System (ADS)

    Kovtun, V. S.

    2012-12-01

    Traditionally, management of propellant fuel consumption on board a spacecraft is associated only with the operation of the jet-propulsion engines (JPE) that are the actuator devices of the motion control system (MCS). The efficiency of propellant fuel consumption, however, depends not only on the operation of the MCS but also, to one extent or another, on all systems functioning on board the spacecraft and on the processes that occur in them. Variable management of propellant fuel consumption by the JPEs is therefore considered as a constituent part of the control of the complex process of spacecraft flight.

  19. On the Design of a Fuzzy Logic-Based Control System for Freeze-Drying Processes.

    PubMed

    Fissore, Davide

    2016-12-01

    This article is focused on the design of a fuzzy logic-based control system to optimize a drug freeze-drying process. The goal of the system is to keep product temperature as close as possible to the threshold value of the formulation being processed, without exceeding it, in such a way that product quality is not jeopardized and the sublimation flux is maximized. The method involves the measurement of product temperature and a set of rules obtained through process simulation, with the goal of obtaining a single set of rules for products with very different characteristics. The input variables are the difference between the temperature of the product and the threshold value, the difference between the temperature of the heating fluid and that of the product, and the rate of change of product temperature. The output variables are the variation of the temperature of the heating fluid and the pressure in the drying chamber. The effect of the starting value of the input variables and of the control interval has been investigated, resulting in the optimal configuration of the control system. An experimental investigation carried out in a pilot-scale freeze-dryer validated the proposed system. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
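
    To make the rule-based idea concrete, a toy Sugeno-style fragment for one of the outputs (the heating-fluid temperature step) is sketched below; the membership breakpoints, rule set, and output steps are illustrative assumptions, not the rules obtained in the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fluid_temperature_step(margin_c, rate_c_per_min):
    """Rule idea: if the product is far below its threshold and warming slowly,
    raise the heating-fluid temperature; if it is close to the threshold or
    warming fast, back off. Returns a fluid temperature change in deg C."""
    far = tri(margin_c, 2.0, 6.0, 10.0)            # product well below threshold
    close = tri(margin_c, -1.0, 0.5, 2.5)          # product near the threshold
    slow = tri(rate_c_per_min, -0.2, 0.0, 0.2)     # temperature barely rising
    fast = tri(rate_c_per_min, 0.1, 0.5, 1.0)      # temperature rising quickly

    rules = [
        (min(far, slow), +2.0),    # plenty of margin, hardly warming -> heat more
        (min(close, fast), -2.0),  # near threshold and warming fast  -> heat less
        (close, -0.5),             # near threshold                   -> ease off
    ]
    total = sum(w for w, _ in rules)
    return sum(w * step for w, step in rules) / total if total else 0.0

print(fluid_temperature_step(margin_c=0.5, rate_c_per_min=0.5))
```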

  20. Preparation of Effective Operating Manuals to Support Waste Management Plant Operator Training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, S. R.

    2003-02-25

    Effective plant operating manuals used in a formal training program can make the difference between a successful operation and a failure. Once the plant process design and control strategies have been fixed, equipment has been ordered, and the plant is constructed, the only major variable affecting success is the capability of plant operating personnel. It is essential that the myriad details concerning plant operation are documented in comprehensive operating manuals suitable for training the non-technical personnel that will operate the plant. These manuals must cover the fundamental principles of each unit operation including how each operates, what process variables are important, and the impact of each variable on the overall process. In addition, operators must know the process control strategies, process interlocks, how to respond to alarms, each of the detailed procedures required to start up and optimize the plant, and every control loop, including when it is appropriate to take manual control. More than anything else, operating mistakes during the start-up phase can lead to substantial delays in achieving design processing rates as well as to problems with government authorities if environmental permit limits are exceeded. The only way to assure return on plant investment is to ensure plant operators have the knowledge to properly run the plant from the outset. A comprehensive set of operating manuals specifically targeted toward plant operators and supervisors written by experienced operating personnel is the only effective way to provide the necessary information for formal start-up training.

  1. Model-free adaptive control of supercritical circulating fluidized-bed boilers

    DOEpatents

    Cheng, George Shu-Xing; Mulkey, Steven L

    2014-12-16

    A novel 3-Input-3-Output (3×3) Fuel-Air Ratio Model-Free Adaptive (MFA) controller is introduced, which can effectively control key process variables including Bed Temperature, Excess O2, and Furnace Negative Pressure of combustion processes of advanced boilers. A novel 7-Input-7-Output (7×7) MFA control system is also described for controlling a combined 3-Input-3-Output (3×3) process of Boiler-Turbine-Generator (BTG) units and a 5×5 CFB combustion process of advanced boilers. Those boilers include Circulating Fluidized-Bed (CFB) Boilers and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  2. The relationship between context, structure, and processes with outcomes of 6 regional diabetes networks in Europe.

    PubMed

    Mahdavi, Mahdi; Vissers, Jan; Elkhuizen, Sylvia; van Dijk, Mattees; Vanhala, Antero; Karampli, Eleftheria; Faubel, Raquel; Forte, Paul; Coroian, Elena; van de Klundert, Joris

    2018-01-01

    While health service provisioning for the chronic condition Type 2 Diabetes (T2D) often involves a network of organisations and professionals, most evidence on the relationships between the structures and processes of service provisioning and the outcomes considers single organisations or solo practitioners. Extending Donabedian's Structure-Process-Outcome (SPO) model, we investigate how differences in quality of life, effective coverage of diabetes, and service satisfaction are associated with differences in the structures, processes, and context of T2D services in six regions in Finland, Germany, Greece, Netherlands, Spain, and UK. Data collection consisted of: a) systematic modelling of provider network's structures and processes, and b) a cross-sectional survey of patient reported outcomes and other information. The survey resulted in data from 1459 T2D patients, during 2011-2012. Stepwise linear regression models were used to identify how independent cumulative proportion of variance in quality of life and service satisfaction are related to differences in context, structure and process. The selected context, structure and process variables are based on Donabedian's SPO model, a service quality research instrument (SERVQUAL), and previous organization and professional level evidence. Additional analysis deepens the possible bidirectional relation between outcomes and processes. The regression models explain 44% of variance in service satisfaction, mostly by structure and process variables (such as human resource use and the SERVQUAL dimensions). The models explained 23% of variance in quality of life between the networks, much of which is related to contextual variables. Our results suggest that effectiveness of A1c control is negatively correlated with process variables such as total hours of care provided per year and cost of services per year. While the selected structure and process variables explain much of the variance in service satisfaction, this is less the case for quality of life. Moreover, it appears that the effect of the clinical outcome A1c control on processes is stronger than the other way around, as poorer control seems to relate to more service use, and higher cost. The standardized operational models used in this research prove to form a basis for expanding the network level evidence base for effective T2D service provisioning.

  3. The relationship between context, structure, and processes with outcomes of 6 regional diabetes networks in Europe

    PubMed Central

    Elkhuizen, Sylvia; van Dijk, Mattees; Vanhala, Antero; Karampli, Eleftheria; Faubel, Raquel; Forte, Paul; Coroian, Elena

    2018-01-01

    Background While health service provisioning for the chronic condition Type 2 Diabetes (T2D) often involves a network of organisations and professionals, most evidence on the relationships between the structures and processes of service provisioning and the outcomes considers single organisations or solo practitioners. Extending Donabedian’s Structure-Process-Outcome (SPO) model, we investigate how differences in quality of life, effective coverage of diabetes, and service satisfaction are associated with differences in the structures, processes, and context of T2D services in six regions in Finland, Germany, Greece, Netherlands, Spain, and UK. Methods Data collection consisted of: a) systematic modelling of provider network’s structures and processes, and b) a cross-sectional survey of patient reported outcomes and other information. The survey resulted in data from 1459 T2D patients, during 2011–2012. Stepwise linear regression models were used to identify how independent cumulative proportion of variance in quality of life and service satisfaction are related to differences in context, structure and process. The selected context, structure and process variables are based on Donabedian’s SPO model, a service quality research instrument (SERVQUAL), and previous organization and professional level evidence. Additional analysis deepens the possible bidirectional relation between outcomes and processes. Results The regression models explain 44% of variance in service satisfaction, mostly by structure and process variables (such as human resource use and the SERVQUAL dimensions). The models explained 23% of variance in quality of life between the networks, much of which is related to contextual variables. Our results suggest that effectiveness of A1c control is negatively correlated with process variables such as total hours of care provided per year and cost of services per year. Conclusions While the selected structure and process variables explain much of the variance in service satisfaction, this is less the case for quality of life. Moreover, it appears that the effect of the clinical outcome A1c control on processes is stronger than the other way around, as poorer control seems to relate to more service use, and higher cost. The standardized operational models used in this research prove to form a basis for expanding the network level evidence base for effective T2D service provisioning. PMID:29447220

  4. Explanatory Variables Associated with Campylobacter and Escherichia coli Concentrations on Broiler Chicken Carcasses during Processing in Two Slaughterhouses.

    PubMed

    Pacholewicz, Ewa; Swart, Arno; Wagenaar, Jaap A; Lipman, Len J A; Havelaar, Arie H

    2016-12-01

    This study aimed at identifying explanatory variables that were associated with Campylobacter and Escherichia coli concentrations throughout processing in two commercial broiler slaughterhouses. Quantitative data on Campylobacter and E. coli along the processing line were collected. Moreover, information on batch characteristics, slaughterhouse practices, process performance, and environmental variables was collected through questionnaires, observations, and measurements, resulting in data on 19 potential explanatory variables. Analysis was conducted separately in each slaughterhouse to identify which variables were related to changes in concentrations of Campylobacter and E. coli during the processing steps: scalding, defeathering, evisceration, and chilling. Associations with explanatory variables were different in the slaughterhouses studied. In the first slaughterhouse, there was only one significant association: poorer uniformity of the weight of carcasses within a batch was associated with less decrease in E. coli concentrations after defeathering. In the second slaughterhouse, significant statistical associations were found with variables including age, uniformity, average weight of carcasses, Campylobacter concentrations in excreta and ceca, and E. coli concentrations in excreta. Bacterial concentrations in excreta and ceca were found to be the most prominent variables, because they were associated with concentrations on carcasses at various processing points. Although the slaughterhouses produced specific products and had different batch characteristics and processing parameters, the effect of the significant variables was not always the same for each slaughterhouse. Therefore, each slaughterhouse needs to determine its particular relevant measures for hygiene control and process management. This identification could be supported by monitoring changes in bacterial concentrations during processing in individual slaughterhouses. In addition, the possibility that management and food handling practices in slaughterhouses contribute to the differences in bacterial contamination between slaughterhouses needs further investigation.

  5. Soil nitrate reducing processes – drivers, mechanisms for spatial variation, and significance for nitrous oxide production

    PubMed Central

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M.; Daniell, Tim J.

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils. PMID:23264770

  6. Soil nitrate reducing processes - drivers, mechanisms for spatial variation, and significance for nitrous oxide production.

    PubMed

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M; Daniell, Tim J

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils.

  7. Effects of aging on the relationship between cognitive demand and step variability during dual-task walking.

    PubMed

    Decker, Leslie M; Cignetti, Fabien; Hunt, Nathaniel; Potter, Jane F; Stergiou, Nicholas; Studenski, Stephanie A

    2016-08-01

    A U-shaped relationship between cognitive demand and gait control may exist in dual-task situations, reflecting opposing effects of external focus of attention and attentional resource competition. The purpose of the study was twofold: to examine whether gait control, as evaluated from step-to-step variability, is related to cognitive task difficulty in a U-shaped manner and to determine whether age modifies this relationship. Young and older adults walked on a treadmill without attentional requirement and while performing a dichotic listening task under three attention conditions: non-forced (NF), forced-right (FR), and forced-left (FL). The conditions increased in their attentional demand and requirement for inhibitory control. Gait control was evaluated by the variability of step parameters related to balance control (step width) and rhythmic stepping pattern (step length and step time). A U-shaped relationship was found for step width variability in both young and older adults and for step time variability in older adults only. Cognitive performance during dual tasking was maintained in both young and older adults. The U-shaped relationship, which presumably results from a trade-off between an external focus of attention and competition for attentional resources, implies that higher-level cognitive processes are involved in walking in young and older adults. Specifically, while these processes are initially involved only in the control of (lateral) balance during gait, they become necessary for the control of (fore-aft) rhythmic stepping pattern in older adults, suggesting that attentional resources turn out to be needed in all facets of walking with aging. Finally, despite the cognitive resources required by walking, both young and older adults spontaneously adopted a "posture second" strategy, prioritizing the cognitive task over the gait task.

  8. Controlled alignment of carbon nanofibers in a large-scale synthesis process

    NASA Astrophysics Data System (ADS)

    Merkulov, Vladimir I.; Melechko, A. V.; Guillorn, M. A.; Simpson, M. L.; Lowndes, D. H.; Whealton, J. H.; Raridon, R. J.

    2002-06-01

    Controlled alignment of catalytically grown carbon nanofibers (CNFs) at a variable angle to the substrate during a plasma-enhanced chemical vapor deposition process is achieved. The CNF alignment is controlled by the direction of the electric field lines during the synthesis process. Off normal CNF orientations are achieved by positioning the sample in the vicinity of geometrical features of the sample holder, where bending of the electric field lines occurs. The controlled growth of kinked CNFs that consist of two parts aligned at different angles to the substrate normal also is demonstrated.

  9. Integrated control-structure design

    NASA Technical Reports Server (NTRS)

    Hunziker, K. Scott; Kraft, Raymond H.; Bossi, Joseph A.

    1991-01-01

    A new approach for the design and control of flexible space structures is described. The approach integrates the structure and controller design processes, thereby providing extra opportunities for avoiding some of the disastrous effects of control-structures interaction and for discovering new, unexpected avenues of future structural design. A control formulation based on Boyd's implementation of Youla parameterization is employed. Control design parameters are coupled with structural design variables to produce a set of integrated-design variables, which are selected through an optimization-based methodology. A performance index reflecting spacecraft mission goals and constraints is formulated and optimized with respect to the integrated design variables. Initial studies have been concerned with achieving mission requirements with a lighter, more flexible space structure. Details of the formulation of the integrated-design approach are presented and results are given from a study involving the integrated redesign of a flexible geostationary platform.
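
    The flavor of such an integrated optimization can be sketched with a toy single-mode problem in which structural variables (stiffness, mass) and a control gain are searched together; the performance index and numbers below are illustrative assumptions, not the Youla-parameterized formulation used in the study.

```python
import numpy as np
from scipy.optimize import minimize

def performance_index(v):
    """Toy integrated design vector v = [stiffness k, mass m, rate gain kd].
    Trades structural mass against closed-loop damping of one flexible mode."""
    k, m, kd = v
    wn = np.sqrt(k / m)                      # mode frequency set by the structure
    zeta = kd / (2.0 * np.sqrt(k * m))       # damping added by rate feedback
    settle = 4.0 / max(zeta * wn, 1e-6)      # approximate 2% settling time
    return 1.0 * m + 0.5 * settle + 0.01 * kd   # mass + performance + effort

result = minimize(performance_index, x0=[50.0, 10.0, 5.0],
                  bounds=[(10.0, 200.0), (2.0, 20.0), (0.1, 50.0)])
print(result.x)   # jointly selected structural and control parameters
```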

  10. Evaluation of strength-controlling defects in paper by stress concentration analyses

    Treesearch

    John M. Considine; David W. Vahey; James W. Evans; Kevin T. Turner; Robert E. Rowlands

    2011-01-01

    Cellulosic webs, such as paper materials, are composed of an interwoven, bonded network of cellulose fibers. Strength-controlling parameters in these webs are influenced by constituent fibers and method of processing and manufacture. Instead of estimating the effect on tensile strength of each processing/manufacturing variable, this study modifies and compares the...

  11. Method and apparatus for manufacturing gas tags

    DOEpatents

    Gross, K.C.; Laug, M.T.

    1996-12-17

    For use in the manufacture of gas tags employed in a gas tagging failure detection system for a nuclear reactor, a plurality of commercial feed gases each having a respective noble gas isotopic composition are blended under computer control to provide various tag gas mixtures having selected isotopic ratios which are optimized for specified defined conditions such as cost. Using a new approach employing a discrete variable structure rather than the known continuous-variable optimization problem, the computer controlled gas tag manufacturing process employs an analytical formalism from condensed matter physics known as stochastic relaxation, which is a special case of simulated annealing, for input feed gas selection. For a tag blending process involving M tag isotopes with N distinct feed gas mixtures commercially available from an enriched gas supplier, the manufacturing process calculates the cost difference between multiple combinations and specifies gas mixtures which approach the optimum defined conditions. The manufacturing process is then used to control tag blending apparatus incorporating tag gas canisters connected by stainless-steel tubing with computer controlled valves, with the canisters automatically filled with metered quantities of the required feed gases. 4 figs.
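
    The discrete selection step can be sketched as a small simulated-annealing (stochastic relaxation) search over which feed gases to blend; the cost and penalty functions and all parameters below are placeholders, not the patent's formalism.

```python
import math
import random

def anneal_feed_gas_selection(costs, penalty, n_select, steps=5000,
                              t0=1.0, cooling=0.999, seed=0):
    """Pick n_select feed gases out of len(costs) candidates, minimizing
    price plus a user-supplied penalty(selection) that scores how far the
    blended isotopic ratios fall from the target tag (hypothetical signature)."""
    rng = random.Random(seed)
    n = len(costs)
    current = rng.sample(range(n), n_select)
    energy = lambda sel: sum(costs[i] for i in sel) + penalty(sel)
    e_curr = energy(current)
    best, e_best, t = list(current), e_curr, t0
    for _ in range(steps):
        # Propose swapping one selected gas for one currently unselected gas.
        cand = list(current)
        cand[rng.randrange(n_select)] = rng.choice(
            [i for i in range(n) if i not in current])
        e_cand = energy(cand)
        if e_cand < e_curr or rng.random() < math.exp((e_curr - e_cand) / t):
            current, e_curr = cand, e_cand
            if e_curr < e_best:
                best, e_best = list(current), e_curr
        t *= cooling                     # gradually lower the "temperature"
    return best, e_best
```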

  12. Method and apparatus for manufacturing gas tags

    DOEpatents

    Gross, Kenny C.; Laug, Matthew T.

    1996-01-01

    For use in the manufacture of gas tags employed in a gas tagging failure detection system for a nuclear reactor, a plurality of commercial feed gases each having a respective noble gas isotopic composition are blended under computer control to provide various tag gas mixtures having selected isotopic ratios which are optimized for specified defined conditions such as cost. Using a new approach employing a discrete variable structure rather than the known continuous-variable optimization problem, the computer controlled gas tag manufacturing process employs an analytical formalism from condensed matter physics known as stochastic relaxation, which is a special case of simulated annealing, for input feed gas selection. For a tag blending process involving M tag isotopes with N distinct feed gas mixtures commercially available from an enriched gas supplier, the manufacturing process calculates the cost difference between multiple combinations and specifies gas mixtures which approach the optimum defined conditions. The manufacturing process is then used to control tag blending apparatus incorporating tag gas canisters connected by stainless-steel tubing with computer controlled valves, with the canisters automatically filled with metered quantities of the required feed gases.

  13. Quality control of the tribological coating PS212

    NASA Technical Reports Server (NTRS)

    Sliney, Harold E.; Dellacorte, Christopher; Deadmore, Daniel L.

    1989-01-01

    PS212 is a self-lubricating, composite coating that is applied by the plasma spray process. It is a functional lubricating coating from 25 C (or lower) to 900 C. The coating is prepared from a blend of three different powders with very dissimilar properties. Therefore, the final chemical composition and lubricating effectiveness of the coatings are very sensitive to the process variables used in their preparation. Defined here are the relevant variables. The process and analytical procedures that will result in satisfactory tribological coatings are discussed.

  14. A digital controller for variable thrust liquid rocket engines

    NASA Astrophysics Data System (ADS)

    Feng, X.; Zhang, Y. L.; Chen, Q. Z.

    1993-06-01

    The paper describes the design and development of a built-in digital controller (BDC) for the variable thrust liquid rocket engine (VTLRE). Particular attention is given to the function requirements of the BDC, the hardware and software configuration, and the testing process, as well as to the VTLRE real-time computer simulation system used for the development of the BDC. A diagram of the VLTRE control system is presented as well as block diagrams illustrating the hardware and software configuration of the BDC.

  15. Modelling and control algorithms of the cross conveyors line with multiengine variable speed drives

    NASA Astrophysics Data System (ADS)

    Cheremushkina, M. S.; Baburin, S. V.

    2017-02-01

    The paper deals with the problem of developing a control algorithm that meets the technical requirements of mine belt conveyors and enables energy and resource savings while taking into account the random character of the traffic. The most effective way to solve these tasks is to build control systems that use variable speed drives for asynchronous motors. The authors designed a mathematical model of the system 'variable speed multiengine drive - conveyor - conveyor control system' that takes into account the dynamic processes occurring in the elements of the transport system and provides an assessment of the energy efficiency of applying the developed algorithms, which allows the dynamic overload in the belt to be reduced to 15-20%.

  16. Characterization of eco-hydraulic habitats for examining biogeochemical processes in rivers

    NASA Astrophysics Data System (ADS)

    McPhillips, L. E.; O'Connor, B. L.; Harvey, J. W.

    2009-12-01

    Spatial variability in biogeochemical reaction rates in streams is often attributed to sediment characteristics such as particle size, organic material content, and biota attached to or embedded within the sediments. Also important in controlling biogeochemical reaction rates are hydraulic conditions, which influence mass transfer of reactants from the stream to the bed, as well as hyporheic exchange within near-surface sediments. This combination of physical and ecological variables has the potential to create habitats that are unique not only in sediment texture but also in their biogeochemical processes and metabolism rates. In this study, we examine the two-dimensional (2D) variability of these habitats in an agricultural river in central Iowa. The streambed substratum was assessed using a grid-based survey identifying dominant particle size classes, as well as aerial coverage of green algae, benthic organic material, and coarse woody debris. Hydraulic conditions were quantified using a calibrated 2D model, and hyporheic exchange was assessed using a scaling relationship based on sediment and hydraulic characteristics. Point-metabolism rates were inferred from measured sediment dissolved oxygen profiles using an effective diffusion model and compared to traditional whole-stream measurements of metabolism. The 185 m study reach had contrasting geomorphologic and hydraulic characteristics in the upstream and downstream portions of an otherwise relatively straight run of a meandering river. The upstream portion contained a large central gravel bar (50 m in length) flanked by riffle-run segments and the downstream portion contained a deeper, fairly uniform channel cross-section. While relatively high flow velocities and gravel sediments were characteristic of the study river, the upstream island bar separated channels that differed with sandy gravels on one side and cobbley gravels on the other. Additionally, green algae was almost exclusively found in riffle portions of the cobbley gravel channel sediments while fine benthic organic material was concentrated at channel margins, regardless of the underlying sediments. A high degree of spatial variability in hyporheic exchange potential was the result of the complex 2D nature of topography and hydraulics. However, sediment texture classifications did a reasonable job in characterizing variability in hyporheic exchange potential because sediment texture mapping incorporates qualitative aspects of bed shear stress and hydraulic conductivity that control hyporheic exchange. Together these variables greatly influenced point-metabolism measurements in different sediment texture habitats separated by only 1 to 2 m. Results from this study suggest that spatial variability and complex interactions between geomorphology, hydraulics, and biological communities generate eco-hydraulic habitats that control variability in biogeochemical processes. The processes controlling variability are highly two-dimensional in nature and are not often accounted for in traditional one-dimensional analysis approaches of biogeochemical processes.

  17. Can Dynamic Visualizations with Variable Control Enhance the Acquisition of Intuitive Knowledge?

    NASA Astrophysics Data System (ADS)

    Wichmann, Astrid; Timpe, Sebastian

    2015-10-01

    An important feature of inquiry learning is to take part in science practices including exploring variables and testing hypotheses. Computer-based dynamic visualizations have the potential to open up various exploration possibilities depending on the level of learner control. It is assumed that variable control, e.g., by changing parameters of a variable, leads to deeper processing (Chang and Linn 2013; de Jong and Njoo 1992; Nerdel 2003; Trey and Khan 2008). Variable control may be helpful, in particular, for acquiring intuitive knowledge (Swaak and de Jong 2001). However, it bears the risk of mental exhaustion and thus may have detrimental effects on knowledge acquisition (Sweller 1998). Students (N = 118) from four chemistry classes followed inquiry cycles using the software Molecular Workbench (Xie and Tinker 2006). Variable control was varied across the conditions: (1) a No-Manipulation group and (2) a Manipulation group. By adding a third condition, (3) a Manipulation-Plus group, we tested whether adding an active hypothesis phase prepares students before changing parameters of a variable. As expected, students in the Manipulation group and Manipulation-Plus group performed better concerning intuitive knowledge (d = 1.14) than students in the No-Manipulation group. On a descriptive level, results indicated higher cognitive effort in the Manipulation group and the Manipulation-Plus group than in the No-Manipulation group. Unexpectedly, students in the Manipulation-Plus group did not benefit from the active hypothesis phase (intuitive knowledge: d = .36). Findings show that students benefit from variable control. Furthermore, findings point toward the direction that variable control evokes desirable difficulties (Bjork and Linn 2006).

  18. Analysis of the temperature of the hot tool in the cut of woven fabric using infrared images

    NASA Astrophysics Data System (ADS)

    Borelli, Joao E.; Verderio, Leonardo A.; Gonzaga, Adilson; Ruffino, Rosalvo T.

    2001-03-01

    Textile manufacturing occupies a prominent place in the national economy. Owing to its importance, research has been conducted on the development of new materials, equipment, and methods used in the production process. Fabric cutting is a basic early stage, followed by the subsequent steps in the making of clothes and other articles. In the hot cutting of fabric, one of the variables of great importance in the control of the process is the contact temperature between the tool and the fabric. This work presents a technique for measuring that temperature based on the processing of infrared images. For this purpose, a system was developed consisting of an infrared camera, a frame-grabber PC board, and software that analyzes the point temperature in the cut area, enabling the operator to achieve the necessary control of the other variables involved in the process.
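
    The temperature-extraction step can be sketched as reading a region of interest from each infrared frame and converting detector counts to temperature through a calibration; the linear calibration, ROI coordinates, and frame size below are assumptions for illustration only.

```python
import numpy as np

def cut_zone_temperature(ir_frame, roi, gain=0.04, offset=250.0):
    """Return the peak temperature (deg C) inside the tool/fabric contact
    region of one infrared frame. The linear count-to-temperature calibration
    (gain, offset) and the ROI are hypothetical."""
    r0, r1, c0, c1 = roi
    counts = np.asarray(ir_frame, dtype=float)[r0:r1, c0:c1]
    return float((gain * counts + offset).max())

# Example with a synthetic 240x320 frame and a 40x40 pixel region of interest.
frame = np.random.randint(0, 4096, size=(240, 320))
print(cut_zone_temperature(frame, roi=(100, 140, 150, 190)))
```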

  19. Science--A Process Approach, Product Development Report No. 8.

    ERIC Educational Resources Information Center

    Sanderson, Barbara A.; Kratochvil, Daniel W.

    Science - A Process Approach, a science program for grades kindergarten through sixth, mainly focuses on scientific processes: observing, classifying, using numbers, measuring, space/time relationships, communicating, predicting, inferring, defining operationally, formulating hypotheses, interpreting data, controlling variables, and experimenting.…

  20. So Many Brands and Varieties to Choose from: Does This Compromise the Control of Food Intake in Humans?

    PubMed

    Hardman, Charlotte A; Ferriday, Danielle; Kyle, Lesley; Rogers, Peter J; Brunstrom, Jeffrey M

    2015-01-01

    The recent rise in obesity is widely attributed to changes in the dietary environment (e.g., increased availability of energy-dense foods and larger portion sizes). However, a critical feature of our "obesogenic environment" may have been overlooked - the dramatic increase in "dietary variability" (the tendency for specific mass-produced foods to be available in numerous varieties that differ in energy content). In this study we tested the hypothesis that dietary variability compromises the control of food intake in humans. Specifically, we examined the effects of dietary variability in pepperoni pizza on two key outcome variables: (i) compensation for calories in pepperoni pizza and (ii) expectations about the satiating properties of pepperoni pizza (expected satiation). We reasoned that dietary variability might generate uncertainty about the postingestive effects of a food. An internet-based questionnaire was completed by 199 adults. This revealed substantial variation in exposure to different varieties of pepperoni pizza. In a follow-up study (n = 66; 65% female), high pizza variability was associated with (i) poorer compensation for calories in pepperoni pizza and (ii) lower expected satiation for pepperoni pizza. Furthermore, the effect of uncertainty on caloric compensation was moderated by individual differences in decision making (loss aversion). For the first time, these findings highlight a process by which dietary variability may compromise food-intake control in humans. This is important because it exposes a new feature of Western diets (processed foods in particular) that might contribute to overeating and obesity.

  1. Multivariate statistical analysis of a high rate biofilm process treating kraft mill bleach plant effluent.

    PubMed

    Goode, C; LeRoy, J; Allen, D G

    2007-01-01

    This study reports on a multivariate analysis of the moving bed biofilm reactor (MBBR) wastewater treatment system at a Canadian pulp mill. The modelling approach involved a data overview by principal component analysis (PCA) followed by partial least squares (PLS) modelling with the objective of explaining and predicting changes in the BOD output of the reactor. Over two years of data with 87 process measurements were used to build the models. Variables were collected from the MBBR control scheme as well as upstream in the bleach plant and in digestion. To account for process dynamics, a variable lagging approach was used for variables with significant temporal correlations. It was found that wood type pulped at the mill was a significant variable governing reactor performance. Other important variables included flow parameters, faults in the temperature or pH control of the reactor, and some potential indirect indicators of biomass activity (residual nitrogen and pH out). The most predictive model was found to have an RMSEP value of 606 kgBOD/d, representing a 14.5% average error. This was a good fit, given the measurement error of the BOD test. Overall, the statistical approach was effective in describing and predicting MBBR treatment performance.
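    As a rough illustration of the modelling sequence described above (a PCA overview followed by PLS prediction of reactor BOD, with lagged variables to capture process dynamics), the sketch below uses scikit-learn on synthetic data; the column names, lags, and component counts are placeholders, not the mill's actual 87 measurements.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the mill data: a few process measurements plus the
# reactor output to be explained. Column names, lags and component counts are
# placeholders only.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(730, 5)),
                  columns=["flow", "temp", "pH", "wood_type", "BOD_out"])

# Lag selected upstream variables to account for process dynamics (1-2 days here).
for col, lag in [("flow", 1), ("temp", 2)]:
    df[f"{col}_lag{lag}"] = df[col].shift(lag)
df = df.dropna()

X = StandardScaler().fit_transform(df.drop(columns="BOD_out"))
y = df["BOD_out"].to_numpy()

scores = PCA(n_components=2).fit_transform(X)    # data overview / outlier screening
pls = PLSRegression(n_components=2).fit(X, y)    # predictive model of reactor BOD
rmsep = float(np.sqrt(np.mean((pls.predict(X).ravel() - y) ** 2)))
print(scores.shape, round(rmsep, 3))
```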

  2. Process-based quality for thermal spray via feedback control

    NASA Astrophysics Data System (ADS)

    Dykhuizen, R. C.; Neiser, R. A.

    2006-09-01

    Quality control of a thermal spray system manufacturing process is difficult due to the many input variables that need to be controlled. Great care must be taken to ensure that the process remains constant to obtain a consistent quality of the parts. Control is greatly complicated by the fact that measurement of particle velocities and temperatures is a noisy stochastic process. This article illustrates the application of quality control concepts to a wire flame spray process. A central feature of the real-time control system is an automatic feedback control scheme that provides fine adjustments to ensure that uncontrolled variations are accommodated. It is shown how the control vectors can be constructed from simple process maps to independently control particle velocity and temperature. This control scheme is shown to perform well in a real production environment. We also demonstrate that slight variations in the feed wire curvature can greatly influence the process. Finally, the geometry of the spray system and sensor must remain constant for the best reproducibility.
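    One way to read "control vectors constructed from simple process maps" is as a local sensitivity matrix that is inverted to decouple the two particle-state targets; the sketch below illustrates that idea with made-up sensitivities and a damping gain, not the article's actual map.

```python
import numpy as np

# Hypothetical local process map: rows are the measured particle states
# (velocity in m/s, temperature in deg C), columns are actuator inputs
# (fuel flow, air flow). The sensitivities are illustrative only.
J = np.array([[ 2.0,  5.0],    # d(velocity)/d(fuel),     d(velocity)/d(air)
              [15.0, -8.0]])   # d(temperature)/d(fuel),  d(temperature)/d(air)

def control_move(v_err, t_err, gain=0.5):
    """Solve J @ du = desired correction; `gain` damps the feedback step."""
    du = np.linalg.solve(J, np.array([v_err, t_err]))
    return gain * du

# Particles 10 m/s too slow and 30 deg C too hot:
print(control_move(v_err=+10.0, t_err=-30.0))
```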

  3. The Role of Design-of-Experiments in Managing Flow in Compact Air Vehicle Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Miller, Daniel N.; Gridley, Marvin C.; Agrell, Johan

    2003-01-01

    It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments methodologies to arrive at microscale secondary flow control array designs that maintain optimal inlet performance over a wide range of the mission variables and to explore how these statistical methods provide a better understanding of the management of flow in compact air vehicle inlets. These statistical design concepts were used to investigate the robustness properties of low unit strength micro-effector arrays. Low unit strength micro-effectors are micro-vanes set at very low angles-of-incidence with very long chord lengths. They were designed to influence the near wall inlet flow over an extended streamwise distance, and their advantage lies in low total pressure loss and high effectiveness in managing engine face distortion. The term robustness is used in this paper in the same sense as it is used in the industrial problem solving community. It refers to minimizing the effects of the hard-to-control factors that influence the development of a product or process. In Robustness Engineering, the effects of the hard-to-control factors are often called "noise", and the hard-to-control factors themselves are referred to as the environmental variables or sometimes as the Taguchi noise variables. Hence Robust Optimization refers to minimizing the effects of the environmental or noise variables on the development (design) of a product or process. In the management of flow in compact inlets, the environmental or noise variables can be identified with the mission variables. Therefore this paper formulates a statistical design methodology that minimizes the impact of variations in the mission variables on inlet performance and demonstrates that these statistical design concepts can lead to simpler inlet flow management systems.
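    A minimal numerical illustration of the robust-design idea, assuming a hypothetical distortion response and treating the mission variables as Taguchi noise factors evaluated on a grid: each candidate control-factor level is scored by its mean response and a smaller-the-better signal-to-noise ratio.

```python
import numpy as np

# Hypothetical response surface: engine-face distortion as a function of one
# control factor (effector vane angle) and two mission "noise" variables
# (angle of attack, mass-flow ratio). Purely illustrative numbers.
def distortion(vane_angle, aoa, mfr):
    return 0.02 + 0.001 * (aoa - 2.0 * vane_angle) ** 2 + 0.005 * (mfr - 1.0) ** 2

aoa_grid = np.linspace(0.0, 8.0, 5)        # noise variable 1
mfr_grid = np.linspace(0.9, 1.1, 5)        # noise variable 2

for vane in (1.0, 2.0, 3.0):               # candidate control-factor levels
    y = np.array([distortion(vane, a, m) for a in aoa_grid for m in mfr_grid])
    sn = -10.0 * np.log10(np.mean(y ** 2))  # "smaller-the-better" S/N ratio
    print(f"vane {vane:3.1f} deg: mean distortion {y.mean():.4f}, S/N {sn:5.1f} dB")
```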

  4. Controls on the spatial variability of key soil properties: comparing field data with a mechanistic soilscape evolution model

    NASA Astrophysics Data System (ADS)

    Vanwalleghem, T.; Román, A.; Giraldez, J. V.

    2016-12-01

    There is a need for better understanding the processes influencing soil formation and the resulting distribution of soil properties. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Especially soil carbon pools in semi-arid, mountainous areas are highly uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of a geostatistical versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. Spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates only explained 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain variability in the soil carbon stocks in this complex terrain under natural vegetation. This is attributed to a high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.
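    The random forest step can be sketched as follows with scikit-learn on synthetic covariates (solar radiation and NDVI); the study reports that such covariates explained only about 25% of the observed variation, so the cross-validated R2 is the quantity to watch.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the 67 sampling locations: covariates are solar
# radiation and NDVI, target is the carbon stock (kg m-2). Values are invented.
rng = np.random.default_rng(1)
solar = rng.uniform(800.0, 1800.0, 67)
ndvi = rng.uniform(0.2, 0.8, 67)
carbon = 2.0 + 0.002 * solar + 3.0 * ndvi + rng.normal(0.0, 1.5, 67)

X = np.column_stack([solar, ndvi])
rf = RandomForestRegressor(n_estimators=500, random_state=0)
r2 = cross_val_score(rf, X, carbon, cv=5, scoring="r2").mean()
rf.fit(X, carbon)
print("cross-validated R2:", round(float(r2), 2))
print("covariate importances (solar, NDVI):", np.round(rf.feature_importances_, 2))
```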

  5. The research on visual industrial robot which adopts fuzzy PID control algorithm

    NASA Astrophysics Data System (ADS)

    Feng, Yifei; Lu, Guoping; Yue, Lulin; Jiang, Weifeng; Zhang, Ye

    2017-03-01

    The control system of a six-degrees-of-freedom visual industrial robot, based on multi-axis motion control cards and a PC, was researched. For the variable, non-linear characteristics of the industrial robot's servo system, an adaptive fuzzy PID controller was adopted, which achieved better control performance. In the vision system, a CCD camera was used to acquire signals and send them to a video processing card. After processing, the PC controls the motion of the six joints through the motion control cards. In experiments, the manipulator operated together with a machine tool and the vision system to grasp, process, and verify parts. The work is relevant to the manufacturing of industrial robots.
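    The paper does not list its fuzzy rule base, so the following is only a minimal sketch of the general idea of a fuzzy-adaptive PID: crude triangular memberships on the tracking error scale the base gains before the usual PID law is applied to one joint axis, with all constants chosen purely for illustration.

```python
import numpy as np

class FuzzyAdaptivePID:
    """Sketch of a fuzzy-gain-scheduled PID for one joint axis.

    Triangular memberships on |error| scale the base gains up when the error
    is large and down near the setpoint; a real rule table would come from the
    servo's identified dynamics (all constants here are illustrative).
    """

    def __init__(self, kp=2.0, ki=0.5, kd=0.1, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    @staticmethod
    def _gain_scale(err):
        big = min(abs(err) / 1.0, 1.0)          # membership of "error is big"
        small = 1.0 - big                       # membership of "error is small"
        return 1.0 + 0.5 * big - 0.3 * small    # defuzzified gain multiplier

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        k = self._gain_scale(err)
        return k * (self.kp * err + self.ki * self.integral + self.kd * deriv)

pid = FuzzyAdaptivePID()
pos = 0.0
for _ in range(200):                            # crude first-order joint response
    u = pid.update(setpoint=1.0, measured=pos)
    pos += 0.01 * (u - pos)
print(round(pos, 3))
```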

  6. Can Process Understanding Help Elucidate The Structure Of The Critical Zone? Comparing Process-Based Soil Formation Models With Digital Soil Mapping.

    NASA Astrophysics Data System (ADS)

    Vanwalleghem, T.; Román, A.; Peña, A.; Laguna, A.; Giráldez, J. V.

    2017-12-01

    There is a need for better understanding the processes influencing soil formation and the resulting distribution of soil properties in the critical zone. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Especially soil carbon pools in semi-arid, mountainous areas are highly uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of traditional digital soil mapping versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. Spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates only explained 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain variability in the soil carbon stocks in this complex terrain under natural vegetation. This is attributed to a high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.

  7. Assessment of tobacco smoke effects on neonatal cardiorespiratory control using a semi-automated processing approach.

    PubMed

    Al-Omar, Sally; Le Rolle, Virginie; Beuchée, Alain; Samson, Nathalie; Praud, Jean-Paul; Carrault, Guy

    2018-05-10

    A semi-automated processing approach was developed to assess the effects of early postnatal environmental tobacco smoke (ETS) on the cardiorespiratory control of newborn lambs. The system consists of several steps beginning with artifact rejection, followed by the selection of stationary segments, and ending with feature extraction. This approach was used in six lambs exposed to 20 cigarettes/day for the first 15 days of life, while another six control lambs were exposed to room air. On postnatal day 16, electrocardiograph and respiratory signals were obtained from a 6-h polysomnographic recording. The effects of postnatal ETS exposure on heart rate variability, respiratory rate variability, and cardiorespiratory interrelations were explored. The unique results suggest that early postnatal ETS exposure increases respiratory rate variability and decreases the coupling between cardiac and respiratory systems. Potentially harmful consequences in early life include unstable breathing and decreased adaptability of cardiorespiratory function, particularly during early life challenges, such as prematurity or viral infection.
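    A simplified sketch of the processing chain described above (artifact rejection, then feature extraction from heart-rate and respiratory series); the 20% jump criterion and the specific features (SDNN, RMSSD, coefficient of variation of breath intervals) are generic choices, not the study's exact definitions.

```python
import numpy as np

def cardiorespiratory_features(rr_ms, breath_s, artifact_tol=0.2):
    """Toy feature extraction from RR intervals (ms) and breath-to-breath
    intervals (s) after a crude artifact-rejection step (reject beats whose
    interval jumps by more than `artifact_tol` relative to the previous one)."""
    rr = np.asarray(rr_ms, dtype=float)
    keep = np.abs(np.diff(rr)) / rr[:-1] < artifact_tol
    rr = rr[1:][keep]
    sdnn = rr.std(ddof=1)                             # overall heart rate variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))        # short-term heart rate variability
    breath = np.asarray(breath_s, dtype=float)
    resp_cv = breath.std(ddof=1) / breath.mean()      # respiratory rate variability
    return sdnn, rmssd, resp_cv

rng = np.random.default_rng(0)
rr = 400.0 + 20.0 * rng.normal(size=300)      # synthetic lamb RR intervals, ms
breath = 1.2 + 0.3 * rng.random(100)          # synthetic breath intervals, s
print(cardiorespiratory_features(rr, breath))
```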

  8. Multi-objective optimization for model predictive control.

    PubMed

    Wojsznis, Willy; Mehta, Ashish; Wojsznis, Peter; Thiele, Dirk; Blevins, Terry

    2007-06-01

    This paper presents a technique of multi-objective optimization for Model Predictive Control (MPC) where the optimization has three levels of the objective function, in order of priority: handling constraints, maximizing economics, and maintaining control. The greatest weights are assigned dynamically to control or constraint variables that are predicted to be out of their limits. The weights assigned for economics have to outweigh those assigned for control objectives. Control variables (CV) can be controlled at fixed targets or within one- or two-sided ranges around the targets. Manipulated Variables (MV) can have assigned targets too, which may be predefined values or current actual values. This MV functionality is extremely useful when economic objectives are not defined for some or all of the MVs. To achieve this complex operation, handle process outputs predicted to go out of limits, and have a guaranteed solution for any condition, the technique makes use of the priority structure, penalties on slack variables, and redefinition of the constraint and control model. An engineering implementation of this approach is shown in the MPC embedded in an industrial control system. The optimization and control of a distillation column, the standard Shell heavy oil fractionator (HOF) problem, is adequately achieved with this MPC.
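    The priority structure can be illustrated with a one-step toy problem: a heavily penalised slack variable enforces the constraint level, a medium weight pulls the MV toward an economic target, and a small weight keeps the CV near its control target. Gains, weights, and limits below are invented for the example and solved with SciPy rather than the embedded industrial solver.

```python
import numpy as np
from scipy.optimize import minimize

# One-step toy problem: decision variables are an MV move and a slack on the
# CV high limit. Gains, limits, and weights are invented for the illustration.
gain = 1.5                          # steady-state CV response to the MV move
cv_now, cv_high = 8.0, 10.0         # current CV value and its high limit
mv_econ_target, cv_target = 6.0, 9.0
w_constraint, w_econ, w_control = 1e4, 1e2, 1.0   # priority ordering via weights

def cost(z):
    dmv, slack = z
    cv_pred = cv_now + gain * dmv
    return (w_constraint * slack ** 2                    # keep limit violations tiny
            + w_econ * (dmv - mv_econ_target) ** 2       # push MV toward economics
            + w_control * (cv_pred - cv_target) ** 2)    # then hold the CV target

def within_limit(z):                # cv_pred <= cv_high + slack
    dmv, slack = z
    return cv_high + slack - (cv_now + gain * dmv)

res = minimize(cost, x0=[0.0, 0.0],
               constraints=[{"type": "ineq", "fun": within_limit},
                            {"type": "ineq", "fun": lambda z: z[1]}])
print(np.round(res.x, 3))           # the MV move stops roughly where the CV limit binds
```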

  9. Cognitive switching processes in young people with attention-deficit/hyperactivity disorder.

    PubMed

    Oades, Robert D; Christiansen, Hanna

    2008-01-01

    Patients with attention-deficit/hyperactivity disorder (ADHD) can be slow at switching between stimuli, or between sets of stimuli to control behaviour appropriate to changing situations. We examined clinical and experimental parameters that may influence the speed of such processes measured in the trail-making (TMT) and switch-tasks in cases with ADHD combined type, their non-affected siblings and unrelated healthy controls. The latency for completion of the trail-making task controlling for psychomotor processing (TMT-B-A) was longer for ADHD cases, and correlated with Conners' ratings of symptom severity across all subjects. The effect decreased with age. Switch-task responses to questions of "Which number?" and "How many?" between sets of 1/111 or 3/333 elicited differential increases in latency with condition that affected all groups. But there was evidence for increased symptom-related intra-individual variability among the ADHD cases, and across all subjects. Young siblings showed familiality for some measures of TMT and switch-task performance but these were modest. The potential influences of moderator variables on the efficiency of processing stimulus change rather than the speed of processing are discussed.

  10. Evolution and Control of 2219 Aluminum Microstructural Features Through Electron Beam Freeform Fabrication

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.; Hafley, Robert A.; Domack, Marcia S.

    2006-01-01

    The layer-additive nature of the electron beam freeform fabrication (EBF3) process results in a tortuous thermal path producing complex microstructures including: small homogeneous equiaxed grains; dendritic growth contained within larger grains; and/or pervasive dendritic formation in the interpass regions of the deposits. Several process control variables contribute to the formation of these different microstructures, including translation speed, wire feed rate, beam current and accelerating voltage. In electron beam processing, higher accelerating voltages embed the energy deeper below the surface of the substrate. Two EBF3 systems have been established at NASA Langley, one with a low-voltage (10-30kV) and the other a high-voltage (30-60 kV) electron beam gun. Aluminum alloy 2219 was processed over a range of different variables to explore the design space and correlate the resultant microstructures with the processing parameters. This report is specifically exploring the impact of accelerating voltage. Of particular interest is correlating energy to the resultant material characteristics to determine the potential of achieving microstructural control through precise management of the heat flux and cooling rates during deposition.

  11. A critical assessment of in-flight particle state during plasma spraying of YSZ and its implications on coating properties and process reliability

    NASA Astrophysics Data System (ADS)

    Srinivasan, Vasudevan

    Air plasma spray is inherently complex due to the deviation from equilibrium conditions, its three-dimensional nature, the multitude of interrelated (controllable) parameters and (uncontrollable) variables involved, and stochastic variability at different stages. The resultant coatings are complex due to their layered, high-defect-density microstructure. Despite widespread use and commercial success for decades in the earthmoving, automotive, aerospace, and power generation industries, plasma spray has not been completely understood, and prime reliance for critical applications such as thermal barrier coatings on gas turbines is yet to be accomplished. This dissertation is aimed at understanding the in-flight particle state of the plasma spray process towards designing coatings and achieving coating reliability with the aid of noncontact in-flight particle and spray stream sensors. Key issues such as the phenomenon of optimum particle injection and the definition of the spray stream using particle state are investigated. A few strategies to modify the microstructure and properties of yttria-stabilized zirconia coatings are examined systematically using the framework of process maps. An approach to designing a process window based on design-relevant coating properties is presented. Options to control the process for enhanced reproducibility and reliability are examined, and the resultant variability is evaluated systematically at the different stages in the process. The 3D variability due to differences in plasma characteristics has been critically examined by investigating splats collected from the entire spray footprint.

  12. Seasonal-to-Interannual Precipitation Variability and Predictability in a Coupled Land-Atmosphere System

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Suarez, M. J.; Heiser, M.

    1998-01-01

    In an earlier GCM study, we showed that interactive land surface processes generally contribute more to continental precipitation variance than do variable sea surface temperatures (SSTs). A new study extends this result through an analysis of 16-member ensembles of multi-decade GCM simulations. We can now show that in many regions, although land processes determine the amplitude of the interannual precipitation anomalies, variable SSTs nevertheless control their timing. The GCM data can be processed into indices that describe geographical variations in (1) the potential for seasonal-to-interannual prediction, and (2) the extent to which the predictability relies on the proper representation of land-atmosphere feedback.

  13. A data-driven approach to identify controls on global fire activity from satellite and climate observations (SOFIA V1)

    NASA Astrophysics Data System (ADS)

    Forkel, Matthias; Dorigo, Wouter; Lasslop, Gitta; Teubner, Irene; Chuvieco, Emilio; Thonicke, Kirsten

    2017-12-01

    Vegetation fires affect human infrastructures, ecosystems, global vegetation distribution, and atmospheric composition. However, the climatic, environmental, and socioeconomic factors that control global fire activity in vegetation are only poorly understood and are represented with varying complexity and formulation in global process-oriented vegetation-fire models. Data-driven model approaches such as machine learning algorithms have successfully been used to identify and better understand controlling factors for fire activity. However, such machine learning models cannot be easily adapted or even implemented within process-oriented global vegetation-fire models. To overcome this gap between machine learning-based approaches and process-oriented global fire models, we introduce here a new flexible data-driven fire modelling approach (Satellite Observations to predict FIre Activity, SOFIA approach version 1). SOFIA models can use several predictor variables and functional relationships to estimate burned area, and they can be easily adapted to or integrated with more complex process-oriented vegetation-fire models. We created an ensemble of SOFIA models to test the importance of several predictor variables. SOFIA models result in the highest performance in predicting burned area if they account for a direct restriction of fire activity under wet conditions and if they include a land cover-dependent restriction or allowance of fire activity by vegetation density and biomass. The use of vegetation optical depth data from microwave satellite observations, a proxy for vegetation biomass and water content, achieves higher model performance than commonly used vegetation variables from optical sensors. We further analyse spatial patterns of the sensitivity between anthropogenic, climate, and vegetation predictor variables and burned area. We finally discuss how multiple observational datasets on climate, hydrological, vegetation, and socioeconomic variables together with data-driven modelling and model-data integration approaches can guide the future development of global process-oriented vegetation-fire models.
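    A toy version of the SOFIA idea, estimating fractional burned area as a product of limiting functions of soil moisture, biomass, and population density; the logistic forms and all parameter values are illustrative stand-ins for the fitted model ensemble.

```python
import numpy as np

def logistic(x, slope, x0):
    """Generic increasing (slope > 0) or decreasing (slope < 0) limiting function."""
    return 1.0 / (1.0 + np.exp(-slope * (x - x0)))

def burned_area_fraction(soil_moisture, biomass, pop_density, max_frac=0.35):
    """Toy SOFIA-style model: fractional burned area as a product of limiting
    functions. All parameter values are illustrative, not fitted."""
    wet_limit  = logistic(soil_moisture, slope=-12.0,  x0=0.35)   # suppression when wet
    fuel_limit = logistic(biomass,       slope=0.004,  x0=800.0)  # needs enough fuel
    human      = logistic(pop_density,   slope=-0.05,  x0=50.0)   # suppression by people
    return max_frac * wet_limit * fuel_limit * human

print(burned_area_fraction(soil_moisture=0.15, biomass=1500.0, pop_density=5.0))
print(burned_area_fraction(soil_moisture=0.45, biomass=1500.0, pop_density=5.0))
```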

  14. A Case for a Process Approach: The Warwick Experience.

    ERIC Educational Resources Information Center

    Screen, P.

    1988-01-01

    Describes the cyclical nature of a problem-solving sequence produced from observing children involved in the process. Discusses the generic qualities of science: (1) observing; (2) inferring; (3) classifying; (4) predicting; (5) controlling variables; and (6) hypothesizing. Explains the processes in use and advantages of a process-led course. (RT)

  15. Yield impact for wafer shape misregistration-based binning for overlay APC diagnostic enhancement

    NASA Astrophysics Data System (ADS)

    Jayez, David; Jock, Kevin; Zhou, Yue; Govindarajulu, Venugopal; Zhang, Zhen; Anis, Fatima; Tijiwa-Birk, Felipe; Agarwal, Shivam

    2018-03-01

    The importance of traditionally acceptable sources of variation has started to become more critical as semiconductor technologies continue to push into smaller technology nodes. New metrology techniques are needed to pursue the process uniformity requirements needed for controllable lithography. Process control for lithography has the advantage of being able to adjust for cross-wafer variability, but this requires that all processes are closely matched between process tools/chambers for each process step. When this is not the case, the cumulative line variability creates identifiable groups of wafers. This cumulative shape-based effect has been described as impacting overlay measurements and alignment by creating misregistration of the overlay marks. It is necessary to understand what requirements might go into developing a high-volume manufacturing approach which leverages this grouping methodology, the key inputs and outputs, and what can be extracted from such an approach. It will be shown that this line variability can be quantified as a loss of electrical yield, primarily at the edge of the wafer, and a methodology for root-cause identification and improvement is proposed. This paper will cover the concept of wafer shape-based grouping as a diagnostic tool for overlay control and containment, the challenges in implementing this in a manufacturing setting, and the limitations of this approach. This will be accomplished by showing that there are identifiable wafer shape-based signatures. These shape-based wafer signatures will be shown to be correlated with overlay misregistration, primarily at the edge. It will also be shown that by adjusting for this wafer shape signal, improvements can be made to both overlay and electrical yield. These improvements show an increase in edge yield and a reduction in yield variability.

  16. Contrasting controls of pH climatology in an open coast versus urban fjord estuary

    EPA Science Inventory

    Interactions of physical, chemical, and biological processes in the coastal zone can result in a highly variable carbonate chemistry regime. This characteristic variability in coastal areas has garnered renewed interest within the context of ocean acidification, yet the relative...

  17. Active Power and Flux Control of a Self-Excited Induction Generator for a Variable-Speed Wind Turbine Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Na, Woonki; Muljadi, Eduard; Leighty, Bill

    A Self-Excited Induction Generator (SEIG) for variable-speed wind turbine generation (VS-WG) is normally considered to be a good candidate for implementation in stand-alone applications such as battery charging, hydrogenation, water pumping, water purification, and water desalination. In this study, we examine active power and flux control strategies for a SEIG used in variable-speed wind turbine generation. The control analysis for the proposed system is carried out using PSCAD software. In the process, we can optimize the control design of the system, thereby enhancing and expediting the control design procedure for this application. With this study, this control design for a SEIG for VS-WG can become the industry standard for SEIG analysis and development.

  18. [Serotonin receptor (5-HTR2A) and dysbindin (DTNBP1) genes and component process variables of short-term verbal memory in schizophrenia].

    PubMed

    Alfimova, M V; Monakhov, M V; Abramova, L I; Golubev, S A; Golimbet, V E

    2009-01-01

    An association study of variations in the DTNBP1 (P1763 and P1578) and 5-HTR2A (T102C and A-1438G) genes with short-term verbal memory efficiency and its component process variables was carried out in 405 patients with schizophrenia and 290 healthy controls. All subjects were asked to recall immediately two sets of 10 words. Total recall, List 1 recall, immediate recall or attention span, proactive interference and a number of intrusions were measured. Patients significantly differed from controls by all memory variables. The efficiency of test performance, efficiency of immediate memory, effect of proactive interference as well as number of intrusions were decreased in the group of patients. Both 5-HTR2A polymorphisms were associated with short-term verbal memory efficiency in the combined sample, with the worst performance observed in carriers of homozygous CC (T102C) and GG (A-1438G) genotypes. The significant effect of the P1763 (DTNBP1) marker on the component process variables (proactive interference and intrusions) was found while its effect on the total recall was non-significant. The homozygotes for GG (P1763) had the worst scores. Overall, the data obtained are in line with the conception of DTNBP1 and 5-HTR2A involvement in different component process variables of memory in healthy subjects and patients with schizophrenia.

  19. Influence of Processing Parameters on the Flow Path in Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.; Nunes, A. C., Jr.

    2006-01-01

    Friction stir welding (FSW) is a solid phase welding process that unites thermal and mechanical aspects to produce a high quality joint. The process variables are rpm, translational weld speed, and downward plunge force. The strain-temperature history of a metal element at each point on the cross-section of the weld is determined by the individual flow path taken by the particular filament of metal flowing around the tool as influenced by the process variables. The resulting properties of the weld are determined by the strain-temperature history. Thus to control FSW properties, improved understanding of the processing parameters on the metal flow path is necessary.

  20. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    PubMed

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach to identifying sources of variability within a manufacturing process by NIR measurements of samples of intermediate material after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modelling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases when historical process data are not readily available. In the presented case, changes in lactose characteristics influenced the performance of the extrusion/spheronization process step. The pellet cores produced using one (considered less suitable) lactose source were on average larger and more fragile, leading to breakage of the cores during subsequent fluid-bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process, which leads to compromised film coating.
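    For comparison, the conventional MSPC side of the study can be sketched as a PCA-based Hotelling T2 chart: a model is built on in-control batches and new batches are flagged when their T2 exceeds a limit. Data, component count, and the percentile-based limit below are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

# Build a PCA model on in-control batches and flag new batches whose Hotelling
# T2 exceeds a control limit. Data, component count and limit are illustrative.
rng = np.random.default_rng(3)
X_train = rng.normal(size=(60, 12))            # 60 in-control batches, 12 variables
pca = PCA(n_components=3).fit(X_train)

def hotelling_t2(x):
    t = pca.transform(x.reshape(1, -1)).ravel()
    return float(np.sum(t ** 2 / pca.explained_variance_))

limit = np.percentile([hotelling_t2(x) for x in X_train], 99)
new_batch = rng.normal(size=12) + np.r_[np.zeros(10), 3.0, 3.0]   # e.g. a drifted raw-material lot
print(hotelling_t2(new_batch) > limit)
```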

  1. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
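    A 12-run Plackett-Burman design for up to 11 two-level factors can be generated from the standard cyclic generator row, with main effects estimated by simple contrasts, as sketched below; the factor names and the mock response (loosely echoing the reported temperature and amino-acid effects) are illustrative, not the study's data.

```python
import numpy as np

# 12-run Plackett-Burman design for up to 11 two-level factors, built from the
# standard cyclic generator row, with main effects estimated by contrasts on a
# mock response. Factor names and the response are illustrative only.
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
design = np.array([np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)])

factors = ["temp", "pH", "DO", "agitation", "feed_rate", "NEAA", "glucose",
           "glutamine", "seed_density", "osmolality", "antifoam"]
rng = np.random.default_rng(4)
response = 50.0 + 4.0 * design[:, 0] + 3.0 * design[:, 5] + rng.normal(0.0, 1.0, 12)

effects = design.T @ response / 6.0        # mean(+1 runs) - mean(-1 runs) per factor
for name, eff in sorted(zip(factors, effects), key=lambda t: -abs(t[1])):
    print(f"{name:>12s}: {eff:+.2f}")
```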

  2. Design of a robust fuzzy controller for the arc stability of CO(2) welding process using the Taguchi method.

    PubMed

    Kim, Dongcheol; Rhee, Sehun

    2002-01-01

    CO(2) welding is a complex process. Weld quality depends on arc stability and on minimizing the effects of disturbances or changes in the operating conditions that commonly occur during welding. In order to minimize these effects, a controller can be used. In this study, a fuzzy controller was used to stabilize the arc during CO(2) welding. The input variable of the controller was the Mita index, which quantitatively estimates the arc stability that is influenced by many welding process parameters. Because the welding process is complex, a mathematical model of the Mita index was difficult to derive. Therefore, the parameter settings of the fuzzy controller were determined by performing actual control experiments without using a mathematical model of the controlled process. As a solution, the Taguchi method was used to determine the optimal control parameter settings of the fuzzy controller, making the control performance robust and insensitive to changes in the operating conditions.

  3. Implementation of an adaptive controller for the startup and steady-state running of a biomethanation process operated in the CSTR mode.

    PubMed

    Renard, P; Van Breusegem, V; Nguyen, M T; Naveau, H; Nyns, E J

    1991-10-20

    An adaptive control algorithm has been implemented on a biomethanation process to maintain propionate concentration, a stable variable, at a given low value, by steering the dilution rate. It was thereby expected to ensure the stability of the process during the startup and during steady-state running with an acceptable performance. The methane pilot reactor was operated in the completely mixed, once-through mode and computer-controlled during 161 days. The results yielded the real-life validation of the adaptive control algorithm, and documented the stability and acceptable performance expected.

  4. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  5. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  6. The Strength and Characteristics of VPPA Welded 2219-T87 Aluminum Alloy

    NASA Technical Reports Server (NTRS)

    Jemian, W. A.

    1985-01-01

    A study of the variable polarity plasma arc (VPPA) welding process and those factors that control the structure and properties of VPPA welded aluminum alloy 2219-T87 was conducted. The importance of joint preparation, alignment of parts and welding process variables are already established. Internal weld defects have been eliminated. However, a variation of properties was found to be due to the size variation of interdendritic particles in the fusion zone. These particles contribute to the void formation process, which controls the ultimate tensile strength of the welded alloy. A variation of 150 microns in particle size correlated with a 10 ksi variation of ultimate tensile strength. It was found that all fracture surfaces were of the dimple rupture type, with fracture initiating within the fusion zone.

  7. Who shalt not kill? Individual differences in working memory capacity, executive control, and moral judgment.

    PubMed

    Moore, Adam B; Clark, Brian A; Kane, Michael J

    2008-06-01

    Recent findings suggest that exerting executive control influences responses to moral dilemmas. In our study, subjects judged how morally appropriate it would be for them to kill one person to save others. They made these judgments in 24 dilemmas that systematically varied physical directness of killing, personal risk to the subject, inevitability of the death, and intentionality of the action. All four of these variables demonstrated main effects. Executive control was indexed by scores on working-memory-capacity (WMC) tasks. People with higher WMC found certain types of killing more appropriate than did those with lower WMC and were more consistent in their judgments. We also report interactions between manipulated variables that implicate complex emotion-cognition integration processes not captured by current dual-process views of moral judgment.

  8. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    NASA Astrophysics Data System (ADS)

    James, M. R.; Robson, S.; d'Oleire-Oltmanns, S.; Niethammer, U.

    2017-03-01

    Structure-from-motion (SfM) algorithms greatly facilitate the production of detailed topographic models from photographs collected using unmanned aerial vehicles (UAVs). However, the survey quality achieved in published geomorphological studies is highly variable, and sufficient processing details are never provided to understand fully the causes of variability. To address this, we show how survey quality and consistency can be improved through a deeper consideration of the underlying photogrammetric methods. We demonstrate the sensitivity of digital elevation models (DEMs) to processing settings that have not been discussed in the geomorphological literature, yet are a critical part of survey georeferencing, and are responsible for balancing the contributions of tie and control points. We provide a Monte Carlo approach to enable geomorphologists to (1) carefully consider sources of survey error and hence increase the accuracy of SfM-based DEMs and (2) minimise the associated field effort by robust determination of suitable lower-density deployments of ground control. By identifying appropriate processing settings and highlighting photogrammetric issues such as over-parameterisation during camera self-calibration, processing artefacts are reduced and the spatial variability of error minimised. We demonstrate such DEM improvements with a commonly-used SfM-based software (PhotoScan), which we augment with semi-automated and automated identification of ground control points (GCPs) in images, and apply to two contrasting case studies - an erosion gully survey (Taroudant, Morocco) and an active landslide survey (Super-Sauze, France). In the gully survey, refined processing settings eliminated step-like artefacts of up to 50 mm in amplitude, and overall DEM variability with GCP selection improved from 37 to 16 mm. In the much more challenging landslide case study, our processing halved planimetric error to 0.1 m, effectively doubling the frequency at which changes in landslide velocity could be detected. In both case studies, the Monte Carlo approach provided a robust demonstration that field effort could be substantially reduced by only deploying approximately half the number of GCPs, with minimal effect on the survey quality. To reduce processing artefacts and promote confidence in SfM-based geomorphological surveys, published results should include processing details, including the image residuals for both tie points and GCPs, and ensure that these are considered appropriately within the workflow.
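    A stripped-down sketch of the Monte Carlo idea: repeatedly georeference with a random subset of GCPs, use the withheld points as checkpoints, and summarise the error distribution per deployment size. The `dem_rmse` helper is a hypothetical placeholder for re-running the bundle adjustment, which is the expensive step in the real workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

def dem_rmse(used_gcps, check_gcps):
    """Placeholder for re-running the SfM bundle adjustment with `used_gcps`
    and evaluating vertical error at the withheld `check_gcps` (hypothetical)."""
    return 0.016 + 0.04 / len(used_gcps) + rng.normal(0.0, 0.002)

def monte_carlo_gcp(gcps, n_keep, n_trials=500):
    """Repeatedly georeference with a random GCP subset and treat the withheld
    points as independent checkpoints, as in the Monte Carlo approach above."""
    errors = []
    for _ in range(n_trials):
        idx = rng.permutation(len(gcps))
        errors.append(dem_rmse(gcps[idx[:n_keep]], gcps[idx[n_keep:]]))
    return float(np.mean(errors)), float(np.std(errors))

gcps = rng.uniform(0.0, 100.0, size=(20, 3))    # synthetic GCP coordinates
for n_keep in (5, 10, 15):
    mean_err, spread = monte_carlo_gcp(gcps, n_keep)
    print(f"{n_keep:2d} GCPs used: RMSE {mean_err:.3f} m +/- {spread:.3f} m")
```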

  9. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, the multi-dimensional combination of operational variables were studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.

  10. [Effects of situational and individual variables on critical thinking expression].

    PubMed

    Tanaka, Yuko; Kusumi, Takashi

    2016-04-01

    The present study examined when people decide to choose an expression that is based on critical thinking, and how situational and individual variables affect such a decision process. Given a conversation scenario including overgeneralization with two friends, participants decided whether to follow the conversation by a critical-thinking expression or not. The authors controlled purpose and topic as situational variables, and measured critical-thinking ability, critical-thinking disposition, and self-monitoring as individual variables. We conducted an experiment in which the situational variables were counterbalanced in a within-subject design with 60 university students. The results of logistic regression analysis showed differences within individuals in the decision process whether to choose a critical-thinking expression, and that some situational factors and some subscales of the individual measurements were related to the differences.

  11. Intra- and inter-individual variation of BIS-index and Entropy during controlled sedation with midazolam/remifentanil and dexmedetomidine/remifentanil in healthy volunteers: an interventional study.

    PubMed

    Haenggi, Matthias; Ypparila-Wolters, Heidi; Hauser, Kathrin; Caviezel, Claudio; Takala, Jukka; Korhonen, Ilkka; Jakob, Stephan M

    2009-01-01

    We studied intra-individual and inter-individual variability of two online sedation monitors, BIS and Entropy, in volunteers under sedation. Ten healthy volunteers were sedated in a stepwise manner with doses of either midazolam and remifentanil or dexmedetomidine and remifentanil. One week later the procedure was repeated with the remaining drug combination. The doses were adjusted to achieve three different sedation levels (Ramsay Scores 2, 3 and 4) and controlled by a computer-driven drug-delivery system to maintain stable plasma concentrations of the drugs. At each level of sedation, BIS and Entropy (response entropy and state entropy) values were recorded for 20 minutes. Baseline recordings were obtained before the sedative medications were administered. Both inter-individual and intra-individual variability increased as the sedation level deepened. Entropy values showed greater variability than BIS(R) values, and the variability was greater during dexmedetomidine/remifentanil sedation than during midazolam/remifentanil sedation. The large intra-individual and inter-individual variability of BIS and Entropy values in sedated volunteers makes the determination of sedation levels by processed electroencephalogram (EEG) variables impossible. Reports in the literature which draw conclusions based on processed EEG variables obtained from sedated intensive care unit (ICU) patients may be inaccurate due to this variability. clinicaltrials.gov Nr. NCT00641563.

  12. Adaptive control and noise suppression by a variable-gain gradient algorithm

    NASA Technical Reports Server (NTRS)

    Merhav, S. J.; Mehta, R. S.

    1987-01-01

    An adaptive control system based on normalized LMS filters is investigated. The finite impulse response of the nonparametric controller is adaptively estimated using a given reference model. Specifically, the following issues are addressed: The stability of the closed loop system is analyzed and heuristically established. Next, the adaptation process is studied for piecewise constant plant parameters. It is shown that by introducing a variable gain in the gradient algorithm, a substantial reduction in the LMS adaptation rate can be achieved. Finally, process noise at the plant output generally causes a biased estimate of the controller. By introducing a noise suppression scheme, this bias can be substantially reduced and the response of the adapted system becomes very close to that of the reference model. Extensive computer simulations validate these assertions and demonstrate that the system can rapidly adapt to random jumps in plant parameters.
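    A minimal sketch of a normalized LMS identification loop with a variable gain: the step size shrinks as the running error power drops, which slows adaptation (and noise sensitivity) near convergence. The gain schedule and the FIR test system are illustrative choices, not the paper's exact law.

```python
import numpy as np

def vg_nlms(x, d, order=8, mu_max=0.5, eps=1e-6):
    """Identify an FIR model with normalized LMS and a variable gain: the step
    size shrinks as the running error power drops (illustrative schedule)."""
    w = np.zeros(order)
    err_pow = 1.0
    for n in range(order, len(x)):
        u = x[n - order + 1:n + 1][::-1]           # regressor of current and past inputs
        e = d[n] - w @ u
        err_pow = 0.99 * err_pow + 0.01 * e * e    # running estimate of error power
        mu = mu_max * err_pow / (err_pow + 0.1)    # variable gain in (0, mu_max)
        w += mu * e * u / (u @ u + eps)            # normalized LMS update
    return w

rng = np.random.default_rng(5)
x = rng.normal(size=4000)                          # excitation signal
true_fir = np.array([0.6, -0.3, 0.1, 0.05, 0.0, 0.0, 0.0, 0.0])
d = np.convolve(x, true_fir)[:len(x)] + 0.01 * rng.normal(size=len(x))
print(np.round(vg_nlms(x, d), 2))                  # should approach true_fir
```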

  13. B827 Chemical Synthesis Project - Industrial Control System Integration - Statement of Work & Specification with Attachments 1-14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wade, F. E.

    The Chemical Synthesis Pilot Process at the Lawrence Livermore National Laboratory (LLNL) Site 300 827 Complex will be used to synthesize small quantities of material to support research and development. The project will modernize and increase current capabilities for chemical synthesis at LLNL. The primary objective of this project is the conversion of a non-automated hands-on process to a remote-operation process, while providing enhanced batch process step control, stored recipe-specific parameter sets, process variable visibility, monitoring, alarm and warning handling, and comprehensive batch record data logging. This Statement of Work and Specification provides the industrial-grade process control requirements for the chemical synthesis batching control system, hereafter referred to as the “Control System”, to be delivered by the System Integrator.

  14. Optimization of PID Parameters Utilizing Variable Weight Grey-Taguchi Method and Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Azmi, Nur Iffah Mohamed; Arifin Mat Piah, Kamal; Yusoff, Wan Azhar Wan; Romlay, Fadhlur Rahman Mohd

    2018-03-01

    A controller that uses PID parameters requires a good tuning method in order to improve control system performance. PID tuning methods fall into two categories: classical methods and artificial intelligence methods. The particle swarm optimization (PSO) algorithm is one of the artificial intelligence methods, and researchers have previously integrated PSO algorithms into the PID parameter tuning process. This research aims to improve PSO-PID tuning algorithms by integrating the tuning process with the Variable Weight Grey-Taguchi Design of Experiment (DOE) method. This is done by conducting the DOE on the two PSO optimizing parameters: the particle velocity limit and the weight distribution factor. Computer simulations and physical experiments were conducted using the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE and the classical Ziegler-Nichols method, both implemented on a hydraulic positioning system. Simulation results show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE reduced the rise time by 48.13% and the settling time by 48.57% compared to the Ziegler-Nichols method. Furthermore, the physical experiment results also show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE tuning method responds better than Ziegler-Nichols tuning. In conclusion, this research has improved PSO-PID parameter tuning by applying the PSO-PID algorithm together with the Variable Weight Grey-Taguchi DOE method as a tuning method for the hydraulic positioning system.
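    The PSO side of the tuning loop can be sketched as below: particles carry candidate (kp, ki, kd) triples and are scored by the integral of squared error of a step response. The first-order plant stands in for the hydraulic positioning dynamics, and the swarm constants are generic defaults rather than the Grey-Taguchi-optimised values.

```python
import numpy as np

def ise(gains, dt=0.01, steps=600):
    """Integral of squared error for a PID driving a first-order lag plant
    (tau = 0.5 s, unit gain) through a unit step; a stand-in for the hydraulic
    positioning dynamics, which are not reproduced here."""
    kp, ki, kd = gains
    y = integ = e_prev = 0.0
    cost = 0.0
    for _ in range(steps):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - e_prev) / dt
        e_prev = e
        y += dt * (-y + u) / 0.5
        if not np.isfinite(y) or abs(y) > 1e6:
            return 1e9                             # unstable candidate
        cost += e * e * dt
    return cost

rng = np.random.default_rng(6)
n_particles, n_iter = 20, 60
pos = rng.uniform(0.0, 5.0, (n_particles, 3))      # candidate (kp, ki, kd) triples
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([ise(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                          # generic inertia/acceleration weights
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 5.0)
    f = np.array([ise(p) for p in pos])
    improved = f < pbest_f
    pbest[improved] = pos[improved]
    pbest_f[improved] = f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("tuned (kp, ki, kd):", np.round(gbest, 2), "ISE:", round(ise(gbest), 4))
```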

  15. Decomposing ADHD-Related Effects in Response Speed and Variability

    PubMed Central

    Karalunas, Sarah L.; Huang-Pollock, Cynthia L.; Nigg, Joel T.

    2012-01-01

    Objective: Slow and variable reaction times (RTs) on fast tasks are such a prominent feature of Attention Deficit Hyperactivity Disorder (ADHD) that any theory must account for them. However, this has proven difficult because the cognitive mechanisms responsible for this effect remain unexplained. Although speed and variability are typically correlated, it is unclear whether single or multiple mechanisms are responsible for group differences in each. RTs are a result of several semi-independent processes, including stimulus encoding, rate of information processing, speed-accuracy trade-offs, and motor response, which have not been previously well characterized. Method: A diffusion model was applied to RTs from a forced-choice RT paradigm in two large, independent case-control samples (N Cohort 1 = 214 and N Cohort 2 = 172). The decomposition measured three validated parameters that account for the full RT distribution, and assessed reproducibility of ADHD effects. Results: In both samples, group differences in traditional RT variables were explained by slow information processing speed, and unrelated to speed-accuracy trade-offs or non-decisional processes (e.g. encoding, motor response). Conclusions: RT speed and variability in ADHD may be explained by a single information processing parameter, potentially simplifying explanations that assume different mechanisms are required to account for group differences in the mean and variability of RTs. PMID:23106115
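    Full diffusion-model fitting is involved, but the closed-form EZ-diffusion approximation (Wagenmakers et al., 2007) conveys how accuracy plus the mean and variance of correct RTs map onto drift rate, boundary separation, and non-decision time; the sketch below is a hedged stand-in for that idea, not the decomposition actually used in the study.

```python
import numpy as np

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """EZ-diffusion approximation (Wagenmakers et al., 2007): recover drift
    rate v, boundary separation a, and non-decision time Ter from accuracy
    (pc) and the variance (vrt) and mean (mrt) of correct RTs in seconds."""
    L = np.log(pc / (1.0 - pc))                      # logit of accuracy
    x = L * (L * pc ** 2 - L * pc + pc - 0.5) / vrt
    v = np.sign(pc - 0.5) * s * x ** 0.25            # drift rate (processing speed)
    a = s ** 2 * L / v                               # boundary separation (caution)
    y = -v * a / s ** 2
    mdt = (a / (2.0 * v)) * (1.0 - np.exp(y)) / (1.0 + np.exp(y))
    return v, a, mrt - mdt                           # Ter = encoding + motor time

# A slower, more variable RT distribution maps onto a lower drift rate:
print(np.round(ez_diffusion(pc=0.93, vrt=0.10, mrt=0.75), 3))
print(np.round(ez_diffusion(pc=0.90, vrt=0.18, mrt=0.85), 3))
```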

  16. Quality control developments for graphite/PMR15 polyimide composites materials

    NASA Technical Reports Server (NTRS)

    Sheppard, C. H.; Hoggatt, J. T.

    1979-01-01

    The problem of lot-to-lot and within-lot variability of graphite/PMR-15 prepreg was investigated. The PMR-15 chemical characterization data were evaluated along with the processing conditions controlling the manufacture of PMR-15 resin and monomers. Manufacturing procedures were selected to yield a consistently reproducible graphite prepreg that could be processed into acceptable structural elements.

  17. SOS based robust H(∞) fuzzy dynamic output feedback control of nonlinear networked control systems.

    PubMed

    Chae, Seunghwan; Nguang, Sing Kiong

    2014-07-01

    In this paper, a methodology for designing a fuzzy dynamic output feedback controller for discrete-time nonlinear networked control systems is presented, where the nonlinear plant is modelled by a Takagi-Sugeno fuzzy model and the network-induced delays by a finite state Markov process. The transition probability matrix for the Markov process is allowed to be partially known, providing a more practical consideration of the real world. Furthermore, the fuzzy controller's membership functions and premise variables are not assumed to be the same as the plant's membership functions and premise variables; that is, the proposed approach can handle the case when the premise variables of the plant are not measurable or are delayed. The membership functions of the plant and the controller are approximated as polynomial functions and then incorporated into the controller design. Sufficient conditions for the existence of the controller are derived in terms of sum-of-squares inequalities, which are then solved by YALMIP. Finally, a numerical example is used to demonstrate the validity of the proposed methodology.

  18. Gaia DR1 documentation Chapter 6: Variability

    NASA Astrophysics Data System (ADS)

    Eyer, L.; Rimoldini, L.; Guy, L.; Holl, B.; Clementini, G.; Cuypers, J.; Mowlavi, N.; Lecoeur-Taïbi, I.; De Ridder, J.; Charnas, J.; Nienartowicz, K.

    2017-12-01

    This chapter describes the photometric variability processing of the Gaia DR1 data. Coordination Unit 7 (CU7) is responsible for the variability analysis of over a billion celestial sources, in particular for the definition, design, development, validation and provision of a software package for the data processing of photometrically variable objects. The Data Processing Centre Geneva (DPCG) responsibilities cover all issues related to the computational part of the CU7 analysis. These span hardware provisioning, including selection, deployment and optimisation of suitable hardware; choosing and developing the software architecture; defining data and scientific workflows; and operational activities such as configuration management, data import, time series reconstruction, storage and processing handling, visualisation and data export. CU7/DPCG is also responsible for interaction with other DPCs and CUs, software and programming training for the CU7 members, scientific software quality control, and management of the software and data lifecycle. Details about the specific data treatment steps of the Gaia DR1 data products are found in Eyer et al. (2017) and are not repeated here. The variability content of Gaia DR1 focusses on a subsample of Cepheids and RR Lyrae stars around the South ecliptic pole, showcasing the performance of the Gaia photometry with respect to variable objects.

  19. Temporal changes of spatial soil moisture patterns: controlling factors explained with a multidisciplinary approach

    NASA Astrophysics Data System (ADS)

    Martini, Edoardo; Wollschläger, Ute; Kögler, Simon; Behrens, Thorsten; Dietrich, Peter; Reinstorf, Frido; Schmidt, Karsten; Weiler, Markus; Werban, Ulrike; Zacharias, Steffen

    2016-04-01

    Characterizing the spatial patterns of soil moisture is critical for hydrological and meteorological models, as soil moisture is a key variable that controls matter and energy fluxes and soil-vegetation-atmosphere exchange processes. Deriving detailed process understanding at the hillslope scale is not trivial, because of the temporal variability of local soil moisture dynamics. Nevertheless, it remains a challenge to provide adequate information on the temporal variability of soil moisture and its controlling factors. Recent advances in wireless sensor technology allow monitoring of soil moisture dynamics with high temporal resolution at varying scales. In addition, mobile geophysical methods such as electromagnetic induction (EMI) have been widely used for mapping soil water content at the field scale with high spatial resolution, as being related to soil apparent electrical conductivity (ECa). The objective of this study was to characterize the spatial and temporal pattern of soil moisture at the hillslope scale and to infer the controlling hydrological processes, integrating well established and innovative sensing techniques, as well as new statistical methods. We combined soil hydrological and pedological expertise with geophysical measurements and methods from digital soil mapping for designing a wireless soil moisture monitoring network. For a hillslope site within the Schäfertal catchment (Central Germany), soil water dynamics were observed during 14 months, and soil ECa was mapped on seven occasions within this period of time using an EM38-DD device. Using the Spearman rank correlation coefficient, we described the temporal persistence of a dry and a wet characteristic state of soil moisture as well as the switching mechanisms, inferring the local properties that control the observed spatial patterns and the hydrological processes driving the transitions. Based on this, we evaluated the use of EMI for mapping the spatial pattern of soil moisture under different hydrologic conditions and the factors controlling the temporal variability of the ECa-soil moisture relationship. The approach provided valuable insight into the time-varying contribution of local and nonlocal factors to the characteristic spatial patterns of soil moisture and the transition mechanisms. The spatial organization of soil moisture was controlled by different processes in different soil horizons, and the topsoil's moisture did not mirror processes that take place within the soil profile. Results show that, for the Schäfertal hillslope site, which is presumed to be representative for non-intensively managed soils with moderate clay content, local soil properties (e.g., soil texture and porosity) are the major control on the spatial pattern of ECa. In contrast, the ECa-soil moisture correlation is weak and varies over time, indicating that ECa is not a good proxy for soil moisture estimation at the investigated site. Occasionally observed stronger correlations between ECa and soil moisture may be explained by background dependencies of ECa on other state variables such as pore water electrical conductivity. The results will help to improve conceptual understanding for hydrological model studies at similar or smaller scales, and to transfer observation concepts and process understanding to larger or less instrumented sites, as well as to constrain the use of EMI-based ECa data for hydrological applications.
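    The temporal-persistence analysis can be sketched as a matrix of Spearman rank correlations of the spatial soil moisture pattern between survey dates; blocks of high correlation then mark the dry and wet characteristic states. The sensor network below is synthetic and the two-state switch is imposed purely for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic network: 40 sensors observed on 14 dates, switching from a "dry"
# to a "wet" spatial pattern halfway through (imposed for illustration).
rng = np.random.default_rng(7)
n_sensors, n_dates = 40, 14
dry_pattern, wet_pattern = rng.normal(size=(2, n_sensors))
theta = np.array([(dry_pattern if d < 7 else wet_pattern)
                  + 0.1 * rng.normal(size=n_sensors) for d in range(n_dates)])

# Spearman rank correlation of the spatial pattern between every pair of dates
rho = np.ones((n_dates, n_dates))
for i in range(n_dates):
    for j in range(i + 1, n_dates):
        rho[i, j] = rho[j, i] = spearmanr(theta[i], theta[j]).correlation

print(np.round(rho[:3, -3:], 2))   # low correlation between dry-state and wet-state dates
```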

  20. Soil temperature variability in complex terrain measured using fiber-optic distributed temperature sensing

    USDA-ARS?s Scientific Manuscript database

    Soil temperature (Ts) exerts critical controls on hydrologic and biogeochemical processes but magnitude and nature of Ts variability in a landscape setting are rarely documented. Fiber optic distributed temperature sensing systems (FO-DTS) potentially measure Ts at high density over a large extent. ...

  1. Energy efficiency technologies in cement and steel industry

    NASA Astrophysics Data System (ADS)

    Zanoli, Silvia Maria; Cocchioni, Francesco; Pepe, Crescenzo

    2018-02-01

    In this paper, Advanced Process Control strategies aimed at achieving and improving energy efficiency in the cement and steel industries are proposed. A flexible and smart control structure composed of several functional modules and blocks has been developed. The designed control strategy is based on Model Predictive Control techniques formulated on linear models. Two industrial control solutions have been developed, oriented to energy efficiency and process control improvement in cement industry clinker rotary kilns (clinker production phase) and in steel industry billet reheating furnaces. Tailored customization procedures for the design of ad hoc control systems have been executed, based on the specific needs and specifications of the analysed processes. The installation of the developed controllers in cement and steel plants produced significant benefits in terms of process control, which resulted in operation closer to the imposed operating limits. Compared with the previous control systems, based on local controllers and/or manual operation by plant operators, more profitable configurations of the crucial process variables have been achieved.
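
    The abstract does not disclose the plant models, so the following is only an illustrative sketch of the receding-horizon idea behind linear Model Predictive Control: a hypothetical first-order model is used to choose a sequence of moves that tracks a setpoint while respecting actuator limits, and only the first move is applied at each step.

        # Illustrative linear MPC sketch with an assumed first-order model
        # (hypothetical gains, not the kiln/furnace models of the paper).
        import numpy as np
        from scipy.optimize import minimize

        a, b = 0.9, 0.2          # assumed discrete-time model: y[k+1] = a*y[k] + b*u[k]
        horizon, setpoint = 10, 1.0
        u_min, u_max = 0.0, 2.0  # operating limits on the manipulated variable

        def cost(u_seq, y0):
            y, j = y0, 0.0
            for u in u_seq:
                y = a * y + b * u
                j += (y - setpoint) ** 2 + 0.01 * u ** 2   # tracking error + move penalty
            return j

        y_current = 0.0
        res = minimize(cost, x0=np.zeros(horizon), args=(y_current,),
                       bounds=[(u_min, u_max)] * horizon)
        print("first control move to apply:", res.x[0])   # receding-horizon principle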

  2. Improvement of chemical vapor deposition process for production of large diameter carbon base monofilaments

    NASA Technical Reports Server (NTRS)

    Hough, R. L.; Richmond, R. D.

    1971-01-01

    Research was conducted to develop large diameter carbon monofilament, containing 25 to 35 mole % elemental boron, in the 2.0 to 10.0 mil diameter range using the chemical vapor deposition process. The objective of the program was to gain an understanding of the critical process variables and their effect on fiber properties. Synthesis equipment was modified to allow these variables to be studied. Improved control of the synthesis variables permitted a reduction in the scatter of the monofilament properties. Monofilaments have been synthesized in the 3.0 to nearly 6.0 mil diameter range, with measured values up to 552,000 psi for ultimate tensile strength and up to 30 million psi for elastic modulus.

  3. Knob manager (KM) operators guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-10-08

    KM, Knob Manager, is a tool which enables the user to use the SUNDIALS knob box to adjust the settings of the control system. The following are some features of KM: dynamic knob assignment through a user-friendly interface; user-defined gain for each knob; graphical display of the operating range and status of each assigned process variable; backup and restore of one or multiple process variables; and saving of the current settings to a file so that they can be recalled later.

  4. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.
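
    A response surface fitted to a small number of simulation runs, combined with Monte Carlo sampling of the uncertain inputs, is the core of the approach described above. The sketch below illustrates the idea with an invented quadratic surrogate relating die stress to two hypothetical process variables; the data, stress limit, and distributions are not from the study.

        # Fit a quadratic response surface, then propagate assumed input
        # uncertainty through it by Monte Carlo (all numbers invented).
        import numpy as np

        rng = np.random.default_rng(0)
        T = np.array([900, 950, 1000, 900, 1000, 950, 925, 975])      # billet temperature
        mu = np.array([0.1, 0.1, 0.1, 0.3, 0.3, 0.2, 0.25, 0.15])     # friction factor
        stress = np.array([410, 395, 385, 450, 430, 405, 420, 398])   # simulated die stress

        X = np.column_stack([np.ones_like(T), T, mu, T**2, mu**2, T*mu])
        coef, *_ = np.linalg.lstsq(X, stress, rcond=None)

        def surrogate(T, mu):
            return np.column_stack([np.ones_like(T), T, mu, T**2, mu**2, T*mu]) @ coef

        T_s = rng.normal(950, 15, 100_000)       # assumed variability of the inputs
        mu_s = rng.normal(0.2, 0.03, 100_000)
        p_fail = np.mean(surrogate(T_s, mu_s) > 440)   # hypothetical stress limit
        print(f"estimated probability of exceeding the die stress limit: {p_fail:.3%}")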

  5. Real-Time Variable Rate Spraying in Orchards and Vineyards: A Review

    NASA Astrophysics Data System (ADS)

    Wandkar, Sachin Vilas; Bhatt, Yogesh Chandra; Jain, H. K.; Nalawade, Sachin M.; Pawar, Shashikant G.

    2018-06-01

    Effective and efficient use of pesticides in orchards has been a concern for many years. With conventional constant-rate sprayers, an equal dose of pesticide is applied to each tree. Because tree size and shape vary greatly within an orchard, trees are either oversprayed or undersprayed. Real-time variable rate spraying technology applies pesticide in accordance with tree size. With suitable sensors, tree characteristics such as canopy volume and foliage density can be acquired, and a micro-processing unit running a proper algorithm can control electronic proportional valves, thereby adjusting the nozzle flow rate to the tree characteristics. Sensors can also detect the spaces between trees, which allows the spray to be shut off in those gaps. Variable rate spraying helps achieve precision in spraying operations, especially inside orchards. This paper reviews real-time variable rate spraying technology and the efforts made by various researchers toward real-time variable application in orchards and vineyards.
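
    The control logic the review describes amounts to scaling the spray flow to the sensed canopy and shutting the valves in the gaps between trees. A minimal sketch, with purely illustrative dose coefficients and flow limits:

        # Flow command for one control cycle of a variable-rate sprayer.
        # Coefficients and limits are illustrative, not from the review.
        def nozzle_flow(canopy_volume_m3, dose_l_per_m3=0.05,
                        min_flow_l_min=0.1, max_flow_l_min=2.0):
            """Return the commanded nozzle flow (L/min)."""
            if canopy_volume_m3 <= 0.0:          # gap between trees: shut the valve
                return 0.0
            flow = dose_l_per_m3 * canopy_volume_m3
            return max(min_flow_l_min, min(flow, max_flow_l_min))

        for v in [0.0, 4.0, 12.0, 60.0]:         # canopy volumes seen by the sensor
            print(v, "m3 ->", nozzle_flow(v), "L/min")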

  7. Chemical Structure and Molecular Dimension As Controls on the Inherent Stability of Charcoal in Boreal Forest Soil

    NASA Astrophysics Data System (ADS)

    Hockaday, W. C.; Kane, E. S.; Ohlson, M.; Huang, R.; Von Bargen, J.; Davis, R.

    2014-12-01

  8. Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.

    2015-12-01

    Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio)geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we classify the hyporheic zone based on its geology, geochemistry, and microbiology, and seek to understand their interactive influences on the integrated biogeochemical distributions and processes. A number of measurements have been taken for 21 freeze core samples along the Columbia River bank in the Hanford 300 Area, and unique datasets have been obtained on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, as well as microbial community attributes/modules. In order to gain a complete understanding of the geological control on these variables and processes, the explanatory variables include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, as well as geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition and its control of microbial and geochemical properties to larger scales is discussed.
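
    The exploratory workflow named above (standardize, extract principal components, cluster samples into facies-like groups) can be sketched as follows; the 21 x 6 data matrix here is a random stand-in for the freeze-core measurements:

        # PCA plus hierarchical clustering on synthetic stand-in data.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(1)
        # 21 samples x 6 variables (e.g., biomass, pH, %N, %C, mud fraction, sand fraction)
        X = rng.normal(size=(21, 6))

        Z = StandardScaler().fit_transform(X)
        scores = PCA(n_components=3).fit_transform(Z)      # PC scores per sample
        clusters = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")
        print("cluster assignment per freeze-core sample:", clusters)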

  9. Efficacious insect and disease control with laser-guided air-assisted sprayer

    USDA-ARS?s Scientific Manuscript database

    Efficacy of a newly developed air-assisted variable-rate sprayer was investigated for the control of arthropod pests and plant diseases in six commercial fields. The sprayer was integrated with a high-speed laser scanning sensor, a custom-designed signal processing program, an automatic flow control...

  10. Control of plasma process by use of harmonic frequency components of voltage and current

    DOEpatents

    Miller, Paul A.; Kamon, Mattan

    1994-01-01

    The present invention provides a technique for taking advantage of the intrinsic electrical non-linearity of processing plasmas to add additional control variables that affect process performance. The technique provides for adjustment of the electrical coupling circuitry, as well as the electrical excitation level, in response to measurements of the reactor voltage and current, and uses that capability to modify the plasma characteristics to obtain the desired performance.
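
    Because the plasma is electrically nonlinear, the measured voltage and current contain harmonics of the drive frequency, and their amplitudes are the kind of extra control variables referred to above. A sketch of extracting them from a sampled waveform (the waveform below is synthetic):

        # Extract harmonic amplitudes from a sampled RF current waveform.
        import numpy as np

        f0, fs, n = 13.56e6, 13.56e6 * 64, 4096     # drive frequency, sample rate, samples
        t = np.arange(n) / fs
        # synthetic nonlinear-plasma current: fundamental plus 2nd and 3rd harmonics
        i_rf = (1.0*np.sin(2*np.pi*f0*t) + 0.25*np.sin(2*np.pi*2*f0*t)
                + 0.1*np.sin(2*np.pi*3*f0*t))

        spectrum = np.abs(np.fft.rfft(i_rf)) * 2 / n
        freqs = np.fft.rfftfreq(n, d=1/fs)
        for k in (1, 2, 3):
            idx = np.argmin(np.abs(freqs - k*f0))
            print(f"harmonic {k}: amplitude ~ {spectrum[idx]:.2f}")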

  11. Assessing technical performance in differential gene expression experiments with external spike-in RNA control ratio mixtures.

    PubMed

    Munro, Sarah A; Lund, Steven P; Pine, P Scott; Binder, Hans; Clevert, Djork-Arné; Conesa, Ana; Dopazo, Joaquin; Fasold, Mario; Hochreiter, Sepp; Hong, Huixiao; Jafari, Nadereh; Kreil, David P; Łabaj, Paweł P; Li, Sheng; Liao, Yang; Lin, Simon M; Meehan, Joseph; Mason, Christopher E; Santoyo-Lopez, Javier; Setterquist, Robert A; Shi, Leming; Shi, Wei; Smyth, Gordon K; Stralis-Pavese, Nancy; Su, Zhenqiang; Tong, Weida; Wang, Charles; Wang, Jian; Xu, Joshua; Ye, Zhan; Yang, Yong; Yu, Ying; Salit, Marc

    2014-09-25

    There is a critical need for standard approaches to assess, report and compare the technical performance of genome-scale differential gene expression experiments. Here we assess technical performance with a proposed standard 'dashboard' of metrics derived from analysis of external spike-in RNA control ratio mixtures. These control ratio mixtures with defined abundance ratios enable assessment of diagnostic performance of differentially expressed transcript lists, limit of detection of ratio (LODR) estimates and expression ratio variability and measurement bias. The performance metrics suite is applicable to analysis of a typical experiment, and here we also apply these metrics to evaluate technical performance among laboratories. An interlaboratory study using identical samples shared among 12 laboratories with three different measurement processes demonstrates generally consistent diagnostic power across 11 laboratories. Ratio measurement variability and bias are also comparable among laboratories for the same measurement process. We observe different biases for measurement processes using different mRNA-enrichment protocols.

  12. Quality by Design approach for studying the impact of formulation and process variables on product quality of oral disintegrating films.

    PubMed

    Mazumder, Sonal; Pavurala, Naresh; Manda, Prashanth; Xu, Xiaoming; Cruz, Celia N; Krishnaiah, Yellela S R

    2017-07-15

    The present investigation was carried out to understand the impact of formulation and process variables on the quality of oral disintegrating films (ODF) using Quality by Design (QbD) approach. Lamotrigine (LMT) was used as a model drug. Formulation variable was plasticizer to film former ratio and process variables were drying temperature, air flow rate in the drying chamber, drying time and wet coat thickness of the film. A Definitive Screening Design of Experiments (DoE) was used to identify and classify the critical formulation and process variables impacting critical quality attributes (CQA). A total of 14 laboratory-scale DoE formulations were prepared and evaluated for mechanical properties (%elongation at break, yield stress, Young's modulus, folding endurance) and other CQA (dry thickness, disintegration time, dissolution rate, moisture content, moisture uptake, drug assay and drug content uniformity). The main factors affecting mechanical properties were plasticizer to film former ratio and drying temperature. Dissolution rate was found to be sensitive to air flow rate during drying and plasticizer to film former ratio. Data were analyzed for elucidating interactions between different variables, rank ordering the critical materials attributes (CMA) and critical process parameters (CPP), and for providing a predictive model for the process. Results suggested that plasticizer to film former ratio and process controls on drying are critical to manufacture LMT ODF with the desired CQA. Published by Elsevier B.V.

  13. CRN5EXP: Expert system for statistical quality control

    NASA Technical Reports Server (NTRS)

    Hentea, Mariana

    1991-01-01

    The purpose of the Expert System CRN5EXP is to assist in checking the quality of the coils at two very important mills, Hot Rolling and Cold Rolling, in a steel plant. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of the quality control techniques. The Expert System combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach used to extract data from the database, the rationale for combining certainty factors, and the architecture and use of the Expert System. However, the interpretation of control chart patterns requires the human expert's knowledge and lends itself to Expert System rules.
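
    The chart arithmetic the system reads can be sketched as a Shewhart X-bar/R computation. A minimal example with invented coil measurements, assuming the standard control-chart constants for subgroups of five (A2 = 0.577, D3 = 0, D4 = 2.114):

        # X-bar/R control limits from subgroup means and ranges (data invented).
        import numpy as np

        A2, D3, D4 = 0.577, 0.0, 2.114                     # constants for subgroup size 5
        rng = np.random.default_rng(2)
        subgroups = rng.normal(10.0, 0.2, size=(20, 5))    # 20 samples of 5 coil measurements

        xbar = subgroups.mean(axis=1)
        r = subgroups.max(axis=1) - subgroups.min(axis=1)
        xbar_bar, r_bar = xbar.mean(), r.mean()

        print("X-bar chart limits:", xbar_bar - A2*r_bar, xbar_bar + A2*r_bar)
        print("R chart limits:    ", D3*r_bar, D4*r_bar)
        print("points outside X-bar limits:",
              np.sum((xbar < xbar_bar - A2*r_bar) | (xbar > xbar_bar + A2*r_bar)))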

  14. Does cost-benefit analysis or self-control predict involvement in two forms of aggression?

    PubMed

    Archer, John; Fernández-Fuertes, Andrés A; Thanzami, Van Lal

    2010-01-01

    The main aim of this research was to assess the relative association between physical aggression and (1) self-control and (2) cost-benefit assessment, these variables representing the operation of impulsive and reflective processes. Study 1 involved direct and indirect aggression among young Indian men, and Study 2 physical aggression to dating partners among Spanish adolescents. In Study 1, perceived benefits and costs but not self-control were associated with direct aggression at other men, and the association remained when their close association with indirect aggression was controlled. In Study 2, benefits and self-control showed significant and independent associations (positive for benefits, negative for self-control) with physical aggression at other-sex partners. Although being victimized was also correlated in the same direction with self-control and benefits, perpetration and being victimized were highly correlated, and there was no association between being victimized and these variables when perpetration was controlled. These results support the theory that reflective (cost-benefit analyses) processes and impulsive (self-control) processes operate in parallel in affecting aggression. The finding that male adolescents perceived more costs and fewer benefits from physical aggression to a partner than female adolescents did is consistent with findings indicating greater social disapproval of men hitting women than vice versa, rather than with the view that male violence to women is facilitated by internalized patriarchal values. (c) 2010 Wiley-Liss, Inc.

  15. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This is achieved by building a process control system (PCS) that monitors the performance of the process by obtaining and analyzing data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points as a pattern, which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using that pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
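
    The SPC-to-rules hand-off described above can be pictured as follows: the last seven control points are reduced to a simple trend pattern (here via a regression slope) that a downstream rule base would map to candidate failure modes. The threshold and data are illustrative only:

        # Summarize the last seven control points as a trend code for a rule base.
        import numpy as np

        def trend_pattern(points, slope_limit=0.05):
            """Return 'up', 'down', or 'stable' from a regression slope."""
            x = np.arange(len(points))
            slope, _ = np.polyfit(x, points, 1)
            if slope > slope_limit:
                return "up"
            if slope < -slope_limit:
                return "down"
            return "stable"

        last_seven = [0.98, 1.01, 1.03, 1.06, 1.05, 1.09, 1.12]   # e.g. normalized etch rate
        print(trend_pattern(last_seven))   # a rule base would map "up" to likely causes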

  16. An investigative model evaluating how consumers process pictorial information on nonprescription medication labels.

    PubMed

    Sansgiry, S S; Cady, P S

    1997-01-01

    Currently marketed over-the-counter (OTC) medication labels were simulated and tested in a controlled environment to understand consumer evaluation of OTC label information. Two factors, consumer age (younger and older adults) and label design (picture-only, verbal-only, congruent picture-verbal, and noncongruent picture-verbal), were controlled and tested to evaluate consumer information processing. The effects of comprehension of label information (understanding) and of product evaluations (satisfaction, certainty, and perceived confusion) on the dependent variable, purchase intention, were evaluated. Intention, measured as purchase recommendation, was significantly related to product evaluations and affected by label design. Participants' level of perceived confusion was more important than actual understanding of information on OTC medication labels. A Label Evaluation Process Model was developed which could be used for future testing of OTC medication labels.

  17. Temporal dynamics of biogeochemical processes at the Norman Landfill site

    USGS Publications Warehouse

    Arora, Bhavna; Mohanty, Binayak P.; McGuire, Jennifer T.; Cozzarelli, Isabelle M.

    2013-01-01

    The temporal variability observed in redox sensitive species in groundwater can be attributed to coupled hydrological, geochemical, and microbial processes. These controlling processes are typically nonstationary, and distributed across various time scales. Therefore, the purpose of this study is to investigate biogeochemical data sets from a municipal landfill site to identify the dominant modes of variation and determine the physical controls that become significant at different time scales. Data on hydraulic head, specific conductance, δ2H, chloride, sulfate, nitrate, and nonvolatile dissolved organic carbon were collected between 1998 and 2000 at three wells at the Norman Landfill site in Norman, OK. Wavelet analysis on this geochemical data set indicates that variations in concentrations of reactive and conservative solutes are strongly coupled to hydrologic variability (water table elevation and precipitation) at 8 month scales, and to individual eco-hydrogeologic framework (such as seasonality of vegetation, surface-groundwater dynamics) at 16 month scales. Apart from hydrologic variations, temporal variability in sulfate concentrations can be associated with different sources (FeS cycling, recharge events) and sinks (uptake by vegetation) depending on the well location and proximity to the leachate plume. Results suggest that nitrate concentrations show multiscale behavior across temporal scales for different well locations, and dominant variability in dissolved organic carbon for a closed municipal landfill can be larger than 2 years due to its decomposition and changing content. A conceptual framework that explains the variability in chemical concentrations at different time scales as a function of hydrologic processes, site-specific interactions, and/or coupled biogeochemical effects is also presented.
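
    The multiscale decomposition described above can be sketched with a continuous wavelet transform, assuming the PyWavelets package is available; the monthly series below is synthetic, with built-in 8- and 16-month oscillations standing in for the solute records:

        # Continuous wavelet transform of a synthetic monthly solute record.
        import numpy as np
        import pywt

        months = np.arange(120)
        conc = (np.sin(2*np.pi*months/8) + 0.5*np.sin(2*np.pi*months/16)
                + 0.2*np.random.default_rng(3).normal(size=months.size))

        scales = np.arange(2, 40)
        coefs, freqs = pywt.cwt(conc, scales, "morl", sampling_period=1.0)  # 1 month
        power = np.abs(coefs) ** 2
        dominant_period = 1.0 / freqs[power.mean(axis=1).argmax()]
        print(f"dominant period ~ {dominant_period:.1f} months")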

  18. Real time monitoring of powder blend bulk density for coupled feed-forward/feed-back control of a continuous direct compaction tablet manufacturing process.

    PubMed

    Singh, Ravendra; Román-Ospino, Andrés D; Romañach, Rodolfo J; Ierapetritou, Marianthi; Ramachandran, Rohit

    2015-11-10

    The pharmaceutical industry is strictly regulated, and precise and accurate control of end product quality is necessary to ensure the effectiveness of drug products. For such control, process and raw material variability ideally need to be fed forward in real time into an automatic control system so that proactive action can be taken before it affects end product quality. Variations in raw material properties (e.g., particle size), feeder hopper level, amount of lubrication, milling and blending action, and applied shear in different processing stages can affect the blend density significantly and thereby tablet weight, hardness and dissolution. Therefore, real time monitoring of powder bulk density variability, and its incorporation into the automatic control system so that its effect can be mitigated proactively and efficiently, is highly desired. However, real time monitoring of powder bulk density is still a challenging task because of several levels of complexity. In this work, powder bulk density, which has a significant effect on the critical quality attributes (CQAs), has been monitored in real time in a pilot-plant facility using a NIR sensor. The sensitivity of the powder bulk density with respect to the critical process parameters (CPPs) and CQAs has been analyzed, and a feed-forward controller has been designed accordingly. The measured signal can be used for feed-forward control so that corrective actions on the density variations can be taken before they influence product quality. The coupled feed-forward/feed-back control system demonstrates improved control performance and improvements in final product quality in the presence of process and raw material variations. Copyright © 2015 Elsevier B.V. All rights reserved.
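
    The feed-forward principle is simple to illustrate: a measured upstream disturbance (here, blend bulk density from the NIR model) is used to correct a downstream setting before it can affect tablet weight. The sketch below keeps fill mass constant by scaling die fill depth inversely with density; the nominal values are invented:

        # Feed-forward correction of die fill depth from measured bulk density.
        def feed_forward_fill_depth(measured_density_g_ml,
                                    nominal_density_g_ml=0.55,
                                    nominal_fill_depth_mm=9.0):
            """Scale fill depth so fill mass stays on target as density drifts."""
            return nominal_fill_depth_mm * nominal_density_g_ml / measured_density_g_ml

        for rho in (0.50, 0.55, 0.60):     # densities reported by the NIR model
            print(f"density {rho:.2f} g/mL -> fill depth {feed_forward_fill_depth(rho):.2f} mm")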

  19. Controlling Contagion Processes in Activity Driven Networks

    NASA Astrophysics Data System (ADS)

    Liu, Suyu; Perra, Nicola; Karsai, Márton; Vespignani, Alessandro

    2014-03-01

    The vast majority of strategies aimed at controlling contagion processes on networks consider the connectivity pattern of the system either quenched or annealed. However, in the real world, many networks are highly dynamical and evolve, in time, concurrently with the contagion process. Here, we derive an analytical framework for the study of control strategies specifically devised for a class of time-varying networks, namely activity-driven networks. We develop a block variable mean-field approach that allows the derivation of the equations describing the coevolution of the contagion process and the network dynamic. We derive the critical immunization threshold and assess the effectiveness of three different control strategies. Finally, we validate the theoretical picture by simulating numerically the spreading process and control strategies in both synthetic networks and a large-scale, real-world, mobile telephone call data set.

  20. Variables Control Charts: A Measurement Tool to Detect Process Problems within Housing

    ERIC Educational Resources Information Center

    Luna, Andrew

    1999-01-01

    The purpose of this study was to use quality improvement tools to determine if the current process of supplying hot water to a high-rise residence hall for women at a southeastern Doctoral I granting institution was in control. After a series of focus groups among the residents in the hall, it was determined that they were mostly concerned about…

  1. Between-session intra-individual variability in sustained, selective, and integrational non-linguistic attention in aphasia.

    PubMed

    Villard, Sarah; Kiran, Swathi

    2015-01-01

    A number of studies have identified impairments in one or more types/aspects of attention processing in patients with aphasia (PWA) relative to healthy controls; person-to-person variability in performance on attention tasks within the PWA group has also been noted. Studies using non-linguistic stimuli have found evidence that attention is impaired in this population even in the absence of language processing demands. An underlying impairment in non-linguistic, or domain-general, attention processing could have implications for the ability of PWA to attend during therapy sessions, which in turn could impact long-term treatment outcomes. With this in mind, this study aimed to systematically examine the effect of task complexity on reaction time (RT) during a non-linguistic attention task, in both PWA and controls. Additional goals were to assess the effect of task complexity on between-session intra-individual variability (BS-IIV) in RT and to examine inter-individual differences in BS-IIV. Eighteen PWA and five age-matched neurologically healthy controls each completed a novel computerized non-linguistic attention task measuring five types of attention on each of four different non-consecutive days. A significant effect of task complexity on both RT and BS-IIV in RT was found for the PWA group, whereas the control group showed a significant effect of task complexity on RT but not on BS-IIV in RT. Finally, in addition to these group-level findings, it was noted that different patients exhibited different patterns of BS-IIV, indicating the existence of inter-individual variability in BS-IIV within the PWA group. Results may have implications for session-to-session fluctuations in attention during language testing and therapy for PWA. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Designed experiment evaluation of key variables affecting the cutting performance of rotary instruments.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Ercoli, Carlo

    2015-04-01

    Laboratory studies of tooth preparation are often performed under a limited range of conditions involving single values for all variables other than the 1 being tested. In contrast, in clinical settings not all variables can be tightly controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but in clinical practice, the instrument must make different cuts with individual dentists applying a range of different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies is difficult. The purpose of this study was to examine the effect of 9 process variables on dental cutting in a single experiment, allowing each variable to be robustly tested over a range of values for the other 8 and permitting a direct comparison of the relative importance of each on the cutting process. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures and Macor blocks as the cutting substrate. Analysis of Variance (ANOVA) was used to judge the statistical significance (α=.05). Five variables consistently produced large, statistically significant effects (target applied load, cut length, starting rpm, diamond grit size, and cut type), while 4 variables produced relatively small, statistically insignificant effects (number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate). The control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances as well as hardware choices. These results highlight the importance of local clinical conditions (procedure, dentist) in understanding dental cutting procedures and in designing adequate experimental methodologies for future studies. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
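
    Screening factor effects in a designed experiment of this kind is typically done with ANOVA on a coded two-level design matrix. A toy sketch (statsmodels assumed available; the data set is invented and far smaller than the study's nine-factor design):

        # ANOVA screening of coded two-level factors against cutting rate.
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        data = pd.DataFrame({
            "load": [-1, -1, -1, -1, 1, 1, 1, 1],      # target applied load (coded)
            "grit": [-1, -1, 1, 1, -1, -1, 1, 1],      # diamond grit size (coded)
            "rpm":  [-1, 1, -1, 1, -1, 1, -1, 1],      # starting rpm (coded)
            "rate": [1.8, 2.0, 2.6, 2.9, 3.1, 3.4, 4.2, 4.6],  # cutting rate, mm3/s
        })
        model = smf.ols("rate ~ load + grit + rpm", data=data).fit()
        print(anova_lm(model, typ=2))    # rank factors by sum of squares / p-value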

  3. Final Technical Report: The effects of climate, forest age, and disturbance history on carbon and water processes at AmeriFlux sites across gradients in Pacific Northwest forests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Law, Beverly E.

    The objective was to investigate the effects of disturbance and climate variables on the processes controlling carbon and water dynamics at AmeriFlux cluster sites in semi-arid and mesic forests in Oregon. The observations were made at three existing and productive AmeriFlux research sites that represent climate and disturbance gradients, serving as a natural experiment on the influence of climatic and hydrologic variability on carbon sequestration and the resulting atmospheric CO2 feedback, including anomalies during the warm/dry phase of the Pacific Decadal Oscillation.

  4. Time-division multiplexer uses digital gates

    NASA Technical Reports Server (NTRS)

    Myers, C. E.; Vreeland, A. E.

    1977-01-01

    Device eliminates errors caused by analog gates in multiplexing a large number of channels at high frequency. System was designed for use in aerospace work to multiplex signals for monitoring such variables as fuel consumption, pressure, temperature, strain, and stress. Circuit may be useful in monitoring variables in process control and medicine as well.

  5. Evaluation of two spike-and-recovery controls for assessment of extraction efficiency in microbial source tracking studies

    USGS Publications Warehouse

    Stoeckel, D.M.; Stelzer, E.A.; Dick, L.K.

    2009-01-01

    Quantitative PCR (qPCR), applied to complex environmental samples such as water, wastewater, and feces, is susceptible to methodological and sample related biases. In this study, we evaluated two exogenous DNA spike-and-recovery controls as proxies for recovery efficiency of Bacteroidales 16S rDNA gene sequences (AllBac and qHF183) that are used for microbial source tracking (MST) in river water. Two controls, (1) the plant pathogen Pantoea stewartii, carrying the chromosomal target gene cpsD, and (2) Escherichia coli, carrying the plasmid-borne target gene DsRed2, were added to raw water samples immediately prior to concentration and DNA extraction for qPCR. When applied to samples processed in replicate, recovery of each control was positively correlated with the observed concentration of each MST marker. Adjustment of MST marker concentrations according to recovery efficiency reduced variability in replicate analyses when consistent processing and extraction methodologies were applied. Although the effects of this procedure on accuracy could not be tested due to uncertainties in control DNA concentrations, the observed reduction in variability should improve the strength of statistical comparisons. These findings suggest that either of the tested spike-and-recovery controls can be useful to measure efficiency of extraction and recovery in routine laboratory processing. © 2009 Elsevier Ltd.
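
    The adjustment step referred to above is simple arithmetic: the spiked control quantifies losses during concentration and extraction, and the marker result is scaled by that efficiency. A sketch with illustrative numbers (not values from the study):

        # Recovery-efficiency adjustment of a qPCR marker concentration.
        def recovery_adjusted(marker_copies_per_100ml, control_recovered, control_spiked):
            efficiency = control_recovered / control_spiked   # e.g. DsRed2 or cpsD copies
            return marker_copies_per_100ml / efficiency

        observed_allbac = 5.0e4          # copies/100 mL measured by qPCR
        print(recovery_adjusted(observed_allbac, control_recovered=2.0e5, control_spiked=5.0e5))
        # efficiency = 0.4, so the adjusted AllBac estimate is 1.25e5 copies/100 mL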

  6. Improved process robustness by using closed loop control in deep drawing applications

    NASA Astrophysics Data System (ADS)

    Barthau, M.; Liewald, M.; Christian, Held

    2017-09-01

    The production of irregularly shaped deep-drawing parts with high quality requirements, which are common in today's automotive production, continually challenges production processes. High demands on lightweight construction of passenger car bodies, following European regulations through 2020, have substantially increased the use of high-strength steels in recent years and are leading to greater challenges in sheet metal part production. The increasingly complex shapes of today's car body shells, driven by modern and future design criteria, intensify the issue further. Metal forming technology tries to meet these challenges with highly sophisticated layouts of deep drawing dies that consider part quality requirements, process robustness and controlled material flow during the deep or stretch drawing process phase. A new method for controlling material flow using a closed loop system was developed at the IFU Stuttgart. In contrast to previous approaches, this new method allows a control intervention during the deep-drawing stroke. The blank holder force around the outline of the drawn part is used as the control variable. The closed loop is designed as a trajectory follow-up with feed-forward control. The command variable is the part-wall stress, which is measured with a piezo-electric measuring pin. In this paper the control loop is described in detail, together with the features of the experimental tool that was built for testing the new control approach. A method for deriving the follow-up trajectories from simulation is also presented. Furthermore, experimental results concerning the robustness of the deep drawing process and the gain in process performance with the developed control loop are shown. Finally, a new procedure for the industrial application of the new control method in deep drawing is presented, using a new kind of active element to influence the local blank holder pressure on the part flange.
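
    The structure described above, a pre-computed (e.g. simulated) part-wall-stress trajectory followed with feed-forward plus feedback correction, can be sketched as a simple PI-with-feed-forward update; gains, units, and trajectories are illustrative, not the IFU controller:

        # One control step: blank holder force = feed-forward term + PI correction
        # of the measured part-wall-stress error (all numbers illustrative).
        def control_step(stress_setpoint, stress_measured, ff_force,
                         integral, kp=0.8, ki=0.1, dt=0.01):
            error = stress_setpoint - stress_measured
            integral += error * dt
            force_command = ff_force + kp * error + ki * integral
            return force_command, integral

        integral = 0.0
        for sp, meas, ff in [(100.0, 95.0, 200.0), (110.0, 108.0, 205.0), (120.0, 123.0, 210.0)]:
            cmd, integral = control_step(sp, meas, ff, integral)
            print(f"blank holder force command: {cmd:.1f} kN")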

  7. So Many Brands and Varieties to Choose from: Does This Compromise the Control of Food Intake in Humans?

    PubMed Central

    Hardman, Charlotte A.; Ferriday, Danielle; Kyle, Lesley; Rogers, Peter J.; Brunstrom, Jeffrey M.

    2015-01-01

    The recent rise in obesity is widely attributed to changes in the dietary environment (e.g., increased availability of energy-dense foods and larger portion sizes). However, a critical feature of our “obesogenic environment” may have been overlooked - the dramatic increase in “dietary variability” (the tendency for specific mass-produced foods to be available in numerous varieties that differ in energy content). In this study we tested the hypothesis that dietary variability compromises the control of food intake in humans. Specifically, we examined the effects of dietary variability in pepperoni pizza on two key outcome variables; i) compensation for calories in pepperoni pizza and ii) expectations about the satiating properties of pepperoni pizza (expected satiation). We reasoned that dietary variability might generate uncertainty about the postingestive effects of a food. An internet-based questionnaire was completed by 199 adults. This revealed substantial variation in exposure to different varieties of pepperoni pizza. In a follow-up study (n= 66; 65% female), high pizza variability was associated with i) poorer compensation for calories in pepperoni pizza and ii) lower expected satiation for pepperoni pizza. Furthermore, the effect of uncertainty on caloric compensation was moderated by individual differences in decision making (loss aversion). For the first time, these findings highlight a process by which dietary variability may compromise food-intake control in humans. This is important because it exposes a new feature of Western diets (processed foods in particular) that might contribute to overeating and obesity. PMID:25923118

  8. Spatiotemporal Variability of Hillslope Soil Moisture Across Steep, Highly Dissected Topography

    NASA Astrophysics Data System (ADS)

    Jarecke, K. M.; Wondzell, S. M.; Bladon, K. D.

    2016-12-01

    Hillslope ecohydrological processes, including subsurface water flow and plant water uptake, are strongly influenced by soil moisture. However, the factors controlling spatial and temporal variability of soil moisture in steep, mountainous terrain are poorly understood. We asked: How do topography and soils interact to control the spatial and temporal variability of soil moisture in steep, Douglas-fir dominated hillslopes in the western Cascades? We will present a preliminary analysis of bimonthly soil moisture variability from July-November 2016 at 0-30 and 0-60 cm depth across spatially extensive convergent and divergent topographic positions in Watershed 1 of the H.J. Andrews Experimental Forest in central Oregon. Soil moisture monitoring locations were selected following a 5 m LIDAR analysis of topographic position, aspect, and slope. Topographic position index (TPI) was calculated as the difference in elevation to the mean elevation within a 30 m radius. Convergent (negative TPI values) and divergent (positive TPI values) monitoring locations were established along northwest to northeast-facing aspects and within 25-55 degree slopes. We hypothesized that topographic position (convergent vs. divergent), as well as soil physical properties (e.g., texture, bulk density), control variation in hillslope soil moisture at the sub-watershed scale. In addition, we expected the relative importance of hillslope topography to the spatial variability in soil moisture to differ seasonally. By comparing the spatiotemporal variability of hillslope soil moisture across topographic positions, our research provides a foundation for additional understanding of subsurface flow processes and plant-available soil-water in forests with steep, highly dissected terrain.
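
    The topographic position index used above to select monitoring locations is the difference between a cell's elevation and the mean elevation within a 30 m radius. A sketch on a synthetic 5 m DEM, using a square moving window as a stand-in for the circular neighbourhood:

        # TPI = elevation minus mean elevation in a surrounding window (synthetic DEM).
        import numpy as np
        from scipy.ndimage import uniform_filter

        rng = np.random.default_rng(4)
        dem = rng.normal(500, 20, size=(100, 100))      # synthetic 5 m DEM, elevations in m

        window_cells = 13                               # ~30 m radius at 5 m resolution
        tpi = dem - uniform_filter(dem, size=window_cells, mode="nearest")
        convergent = tpi < 0                            # negative TPI: convergent positions
        print("fraction of convergent cells:", convergent.mean())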

  9. Temporary stages and motivational variables: Two complementary perspectives in the help-seeking process for mental disorders.

    PubMed

    Del Valle Del Valle, Gema; Carrió, Carmen; Belloch, Amparo

    2017-10-09

    Help-seeking for mental disorders is a complex process, which includes different temporary stages, and in which motivational variables play an especially relevant role. However, there is a lack of instruments to evaluate in depth both the temporary and motivational variables involved in the help-seeking process. This study aims to analyse these two sets of variables in detail, using a specific instrument designed for the purpose, to gain a better understanding of the process of treatment seeking. A total of 152 patients seeking treatment in mental health outpatient clinics of the NHS were individually interviewed: 71 had Obsessive-Compulsive Disorder, 21 had Agoraphobia, 18 had Major Depressive Disorder, 20 had Anorexia Nervosa, and 22 had Cocaine Dependence. The patients completed a structured interview assessing the help-seeking process. Disorder severity and quality of life were also assessed. The patients with agoraphobia and with major depression took significantly less time to recognise their mental health symptoms. Similarly, patients with major depression were faster in seeking professional help. Motivational variables were grouped in 3 sets: motivators for seeking treatment, related to the negative impact of symptoms on mood and to loss of control over symptoms; motivators for delaying treatment, related to minimisation of the disorder; and stigma-associated variables. The results support the importance of considering the different motivational variables involved in the several stages of the help-seeking process. The interview designed to that end has shown its usefulness in this endeavour. Copyright © 2017 SEP y SEPB. Published by Elsevier España, S.L.U. All rights reserved.

  10. The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Saat, Rohaida Mohd

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…

  11. Strategies and Decision Support Systems for Integrating Variable Energy Resources in Control Centers for Reliable Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Lawrence E.

    This report provides findings from the field regarding the best ways in which to guide operational strategies, business processes and control room tools to support the integration of renewable energy into electrical grids.

  12. High-throughput assay for optimising microbial biological control agent production and delivery

    USDA-ARS?s Scientific Manuscript database

    Lack of technologies to produce and deliver effective biological control agents (BCAs) is a major barrier to their commercialization. A myriad of variables associated with BCA cultivation, formulation, drying, storage, and reconstitution processes complicates agent quality maximization. An efficie...

  13. Simulations of Control Schemes for Inductively Coupled Plasma Sources

    NASA Astrophysics Data System (ADS)

    Ventzek, P. L. G.; Oda, A.; Shon, J. W.; Vitello, P.

    1997-10-01

    Process control issues are becoming increasingly important in plasma etching. Numerical experiments are an excellent test-bench for evaluating a proposed control system. Models are generally reliable enough to provide information about controller robustness and the fitness of diagnostics. We will present results from a two-dimensional plasma transport code with a multi-species plasma chemistry obtained from a global model. [1-2] We will show a correlation of external etch parameters (e.g. input power) with internal plasma parameters (e.g. species fluxes) which in turn are correlated with etch results (etch rate, uniformity, and selectivity) either by comparison to experiment or by using a phenomenological etch model. After process characterization, a control scheme can be evaluated, since the variable to be controlled (e.g. uniformity) is related to the measurable variable (e.g. a density) and to the external parameter (e.g. coil current). We will present an evaluation using the HBr-Cl2 system as an example. [1] E. Meeks and J. W. Shon, IEEE Trans. on Plasma Sci., 23, 539, 1995. [2] P. Vitello, et al., IEEE Trans. on Plasma Sci., 24, 123, 1996.

  14. Adaptive adjustment of interval predictive control based on combined model and application in shell brand petroleum distillation tower

    NASA Astrophysics Data System (ADS)

    Sun, Chao; Zhang, Chunran; Gu, Xinfeng; Liu, Bin

    2017-10-01

    Constraints of the optimization objective often cannot be met when predictive control is applied to industrial production processes, and the online predictive controller then fails to find a feasible or globally optimal solution. To solve this problem, based on a Back Propagation-Auto Regressive with exogenous inputs (BP-ARX) combined control model, nonlinear programming is used to examine the feasibility of constrained predictive control, a feasibility decision theorem for the optimization objective is proposed, and a solution method for soft-constraint slack variables is given for the case in which the optimization objective is not feasible. On this basis, for interval control requirements on the controlled variables, the solved slack variables are introduced and an adaptive weighted interval predictive control algorithm is proposed, achieving adaptive regulation of the optimization objective, automatic adjustment of the infeasible interval range, expansion of the feasible region, and feasibility of the interval optimization objective. Finally, the feasibility and effectiveness of the algorithm are validated through comparative simulation experiments.
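
    The role of the slack variables can be illustrated with a tiny static example: when the desired output interval cannot be reached within the actuator limits, nonnegative slacks relax the interval bounds and are heavily penalized, so the optimization stays feasible and the interval is relaxed only as much as necessary. The model and numbers below are illustrative, not the distillation-tower controller:

        # Soft interval constraints via slack variables (assumed static model y = gain*u).
        import numpy as np
        from scipy.optimize import minimize

        gain = 0.5                     # assumed static model: y = gain * u
        y_low, y_high = 4.0, 5.0       # desired output interval
        u_low, u_high = 0.0, 6.0       # actuator limits (make the interval infeasible)

        def objective(z):
            u, s_low, s_high = z
            return 0.01 * u**2 + 1000.0 * (s_low**2 + s_high**2)   # heavy slack penalty

        constraints = [
            {"type": "ineq", "fun": lambda z: gain * z[0] - (y_low - z[1])},   # y >= y_low - s_low
            {"type": "ineq", "fun": lambda z: (y_high + z[2]) - gain * z[0]},  # y <= y_high + s_high
        ]
        bounds = [(u_low, u_high), (0, None), (0, None)]
        res = minimize(objective, x0=[1.0, 0.0, 0.0], bounds=bounds, constraints=constraints)
        u_opt, s_low, s_high = res.x
        print(f"u = {u_opt:.2f}, relaxed interval = [{y_low - s_low:.2f}, {y_high + s_high:.2f}]")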

  15. Finding Relevant Parameters for the Thin-film Photovoltaic Cells Production Process with the Application of Data Mining Methods.

    PubMed

    Ulaczyk, Jan; Morawiec, Krzysztof; Zabierowski, Paweł; Drobiazg, Tomasz; Barreau, Nicolas

    2017-09-01

    A data mining approach is proposed as a useful tool for analysing the control parameters of the 3-stage CIGSe photovoltaic cell production process, in order to find the variables that are most relevant for cell electric parameters and efficiency. The analysed data set consists of stage duration times, heater power values, and temperatures for the element sources and the substrate; there are 14 variables per sample in total. The most relevant variables of the process were found using random forest analysis with the Boruta algorithm. 118 CIGSe samples, prepared at Institut des Matériaux Jean Rouxel, were analysed. The results are consistent with experimental knowledge of the CIGSe cell production process and provide new evidence to guide the production parameters of new cells and further research. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
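
    The Boruta idea is to compare each real variable's random-forest importance against "shadow" copies whose values have been shuffled. A compact sketch of that comparison (scikit-learn assumed; the 118 x 14 matrix is random stand-in data, not the deposition logs):

        # Shadow-feature relevance screening in the spirit of the Boruta algorithm.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(5)
        n, p = 118, 14
        X = rng.normal(size=(n, p))
        efficiency = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=n)

        X_shadow = rng.permuted(X, axis=0)               # shuffle each column independently
        forest = RandomForestRegressor(n_estimators=300, random_state=0)
        forest.fit(np.hstack([X, X_shadow]), efficiency)

        real_imp = forest.feature_importances_[:p]
        shadow_max = forest.feature_importances_[p:].max()
        print("relevant parameter indices:", np.where(real_imp > shadow_max)[0])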

  16. Excessive motor overflow reveals abnormal inter-hemispheric connectivity in Friedreich ataxia.

    PubMed

    Low, Sze-Cheen; Corben, Louise A; Delatycki, Martin B; Ternes, Anne-Marie; Addamo, Patricia K; Georgiou-Karistianis, Nellie

    2013-07-01

    This study sought to characterise force variability and motor overflow in 12 individuals with Friedreich ataxia (FRDA) and 12 age- and gender-matched controls. Participants performed a finger-pressing task by exerting 30 and 70 % of their maximum finger force using the index finger of the right and left hand. Control of force production was measured as force variability, while any involuntary movements occurring on the finger of the other, passive hand, was measured as motor overflow. Significantly greater force variability in individuals with FRDA compared with controls is indicative of cortico-cerebellar disruption affecting motor control. Meanwhile, significantly greater motor overflow in this group provides the first evidence of possible abnormal inter-hemispheric activity that may be attributable to asymmetrical neuronal loss in the dentate nucleus. Overall, this study demonstrated a differential engagement in the underlying default processes of the motor system in FRDA.

  17. Predictive displays for a process-control schematic interface.

    PubMed

    Yin, Shanqing; Wickens, Christopher D; Helander, Martin; Laberge, Jason C

    2015-02-01

    Our objective was to examine the extent to which increasing precision of predictive (rate of change) information in process control will improve performance on a simulated process-control task. Predictive displays have been found to be useful in process control (as well as aviation and maritime industries). However, authors of prior research have not examined the extent to which predictive value is increased by increasing predictor resolution, nor has such research tied potential improvements to changes in process control strategy. Fifty nonprofessional participants each controlled a simulated chemical mixture process (honey mixer simulation) that simulated the operations found in process control. Participants in each of five groups controlled with either no predictor or a predictor ranging in the resolution of prediction of the process. Increasing detail resolution generally increased the benefit of prediction over the control condition although not monotonically so. The best overall performance, combining quality and predictive ability, was obtained by the display of intermediate resolution. The two displays with the lowest resolution were clearly inferior. Predictors with higher resolution are of value but may trade off enhanced sensitivity to variable change (lower-resolution discrete state predictor) with smoother control action (higher-resolution continuous predictors). The research provides guidelines to the process-control industry regarding displays that can most improve operator performance.

  18. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, the formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored through PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Applying the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. The established process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
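
    The two monitoring statistics named above can be computed from a PCA model of the normal batches: Hotelling T2 measures the distance within the model plane, and a DModX-like residual measures the distance to the model. A sketch with random stand-in spectra:

        # Hotelling T2 and a residual (DModX-like) statistic from a PCA model.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(6)
        X_normal = rng.normal(size=(50, 200))            # 50 reference spectra, 200 wavelengths
        pca = PCA(n_components=3).fit(X_normal)

        def t2_and_residual(x):
            scores = pca.transform(x.reshape(1, -1))[0]
            t2 = np.sum(scores**2 / pca.explained_variance_)      # Hotelling T2
            reconstruction = pca.mean_ + scores @ pca.components_
            spe = np.sum((x - reconstruction)**2)                 # squared residual distance
            return t2, spe

        t2, spe = t2_and_residual(rng.normal(size=200))  # a new test spectrum
        print(f"T2 = {t2:.2f}, residual SPE = {spe:.2f}")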

  19. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis

    USGS Publications Warehouse

    Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.

    2008-01-01

    Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.

  20. Habitat connectivity and in-stream vegetation control temporal variability of benthic invertebrate communities.

    PubMed

    Huttunen, K-L; Mykrä, H; Oksanen, J; Astorga, A; Paavola, R; Muotka, T

    2017-05-03

    One of the key challenges to understanding patterns of β diversity is to disentangle deterministic patterns from stochastic ones. Stochastic processes may mask the influence of deterministic factors on community dynamics, hindering identification of the mechanisms causing variation in community composition. We studied temporal β diversity (among-year dissimilarity) of macroinvertebrate communities in near-pristine boreal streams across 14 years. To assess whether the observed β diversity deviates from that expected by chance, and to identify processes (deterministic vs. stochastic) through which different explanatory factors affect community variability, we used a null model approach. We observed that at the majority of sites temporal β diversity was low indicating high community stability. When stochastic variation was unaccounted for, connectivity was the only variable explaining temporal β diversity, with weakly connected sites exhibiting higher community variability through time. After accounting for stochastic effects, connectivity lost importance, suggesting that it was related to temporal β diversity via random colonization processes. Instead, β diversity was best explained by in-stream vegetation, community variability decreasing with increasing bryophyte cover. These results highlight the potential of stochastic factors to dampen the influence of deterministic processes, affecting our ability to understand and predict changes in biological communities through time.

  1. Latent variable modeling to analyze the effects of process parameters on the dissolution of paracetamol tablet

    PubMed Central

    Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Shi, Xinyuan; Qiao, Yanjiang

    2017-01-01

    ABSTRACT Dissolution is one of the critical quality attributes (CQAs) of oral solid dosage forms because it relates to the absorption of the drug. In this paper, the influence of raw materials, granules, and process parameters on the dissolution of paracetamol tablets was analyzed using latent variable modeling methods. The variability in the raw materials and in the granules was each characterized using principal component analysis (PCA). A multi-block partial least squares (MBPLS) model was then used to determine the critical factors affecting dissolution. The results showed that the binder amount, the post-granulation time, the API content in the granules, the fill depth, and the punch tip separation distance were the critical factors, with variable importance in the projection (VIP) values larger than 1. The importance of each unit of the whole process was also ranked using the block importance in the projection (BIP) index. It was concluded that latent variable models (LVMs) are very useful tools for extracting information from the available data and improving the understanding of the dissolution behavior of paracetamol tablets. The obtained LVMs are also helpful for proposing the process design space and designing control strategies in further research. PMID:27689242
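    A minimal sketch of the VIP calculation used for this kind of screening, written against scikit-learn's ordinary PLS regression (not the multi-block MBPLS/BIP analysis of the paper) and synthetic data with hypothetical factor columns:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      def vip_scores(pls, X):
          """Variable importance in the projection for a fitted PLS model."""
          T = pls.transform(X)                  # X scores, shape (n, A)
          W = pls.x_weights_                    # shape (p, A)
          Q = pls.y_loadings_                   # shape (1, A) for a single response
          p, A = W.shape
          ss = np.array([np.sum(T[:, a] ** 2) * np.sum(Q[:, a] ** 2) for a in range(A)])
          w_norm = W / np.linalg.norm(W, axis=0, keepdims=True)
          return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

      rng = np.random.default_rng(0)
      X = rng.normal(size=(40, 6))              # 6 hypothetical material/process factors
      y = 1.5 * X[:, 0] - 0.8 * X[:, 3] + 0.2 * rng.normal(size=40)  # dissolution proxy

      pls = PLSRegression(n_components=2).fit(X, y)
      print("VIP > 1 flags candidate critical factors:", vip_scores(pls, X).round(2))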

  2. Typing pictures: Linguistic processing cascades into finger movements.

    PubMed

    Scaltritti, Michele; Arfé, Barbara; Torrance, Mark; Peressotti, Francesca

    2016-11-01

    The present study investigated the effect of psycholinguistic variables on measures of response latency and mean interkeystroke interval in a typewritten picture naming task, with the aim of outlining the functional organization of the stages of cognitive processing and response execution associated with typewritten word production. Onset latencies were modulated by lexical and semantic variables traditionally linked to lexical retrieval, such as word frequency, age of acquisition, and naming agreement. Orthographic variables, at both the lexical and sublexical levels, appeared to influence only within-word interkeystroke intervals, suggesting that orthographic information may play a relevant role in controlling actual response execution. Lexical-semantic variables also influenced the speed of execution. This points towards a cascaded flow of activation between the stages of lexical access and response execution. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Optimal Synthesis of Compliant Mechanisms using Subdivision and Commercial FEA (DETC2004-57497)

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Canfield, Stephen

    2004-01-01

    The field of distributed-compliance mechanisms has seen significant work in developing suitable topology optimization tools for their design. These optimal design tools have grown out of the techniques of structural optimization. This paper will build on the previous work in topology optimization and compliant mechanism design by proposing an alternative design space parameterization through control points and adding another step to the process, that of subdivision. The control points allow a specific design to be represented as a solid model during the optimization process. The process of subdivision creates an additional number of control points that help smooth the surface (for example, a C² continuous surface, depending on the method of subdivision chosen), creating a manufacturable design free of some traditional numerical instabilities. Note that these additional control points do not add to the number of design parameters. This alternative parameterization and description as a solid model effectively and completely separates the design variables from the analysis variables during the optimization procedure. The motivation behind this work is to create an automated design tool from task definition to functional prototype created on a CNC or rapid-prototype machine. This paper will describe the proposed compliant mechanism design process and will demonstrate the procedure on several examples common in the literature.

  4. Research on a dynamic workflow access control model

    NASA Astrophysics Data System (ADS)

    Liu, Yiliang; Deng, Jinxia

    2007-12-01

    In recent years, access control technology has been widely studied in workflow systems. Two typical approaches are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have been used successfully, to a certain extent, for role authorization and assignment. However, as a system's structure grows more complex, these two technologies cannot enforce least privilege and separation of duties, and they are inapplicable when users need to change the workflow process frequently. To avoid these weaknesses in practice, a variable-flow dynamic role-task-view fine-grained access control model (DRTVBAC) is constructed on the basis of the existing models. Within this model, an algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of least privilege and dynamic separation of duties. The DRTVBAC model was implemented in an actual system; the results show that tying tasks to dynamic role management makes role assignment more flexible with respect to granting and revoking authority, so that the principle of least privilege is met by activating only the permissions of the specific task a role is performing; authority is separated from the completion of duties within the workflow; disclosure of sensitive information is prevented through a concise, dynamic view interface; and the requirement of frequently varying task flows is satisfied.

  5. An analysis of variability in the manufacturing of dexosomes: implications for development of an autologous therapy.

    PubMed

    Patel, Sanjay; Mehta-Damani, Anita; Shu, Helen; Le Pecq, Jean-Bernard

    2005-10-20

    Dexosomes are nanometer-size vesicles released by dendritic cells, possessing much of the cellular machinery required to stimulate an immune response (i.e., MHC Class I and II). The ability of patient-derived dexosomes loaded with tumor antigens to elicit anti-tumor activity is currently being evaluated in clinical trials. Unlike conventional biologics, where variability between lots of product arises mostly from the manufacturing process, an autologous product has inherent variability in the starting material due to heterogeneity in the human population. In an effort to assess the variability arising from the dexosome manufacturing process versus the human starting material, 144 dexosome preparations from normal donors (111) and cancer patients (33) from two Phase I clinical trials were analyzed. A large variability in the quantity of dexosomes (measured as the number of MHC Class II molecules) produced between individual lots was observed (>50-fold). An analysis of intra-lot variability shows that the manufacturing process introduces relatively little of this variability. To identify the source(s) of variability arising from the human starting material, distributions of the key parameters involved in dexosome production were established, and a model created. Computer simulations using this model were performed and compared to the actual data observed. The main conclusion from these simulations is that the number of cells collected per individual and the productivity of these cells are the principal sources of variability in the production of Class II. The approach described here can be extended to other autologous therapies in general to evaluate control of manufacturing processes. Moreover, this analysis of process variability is directly applicable to production at a commercial scale, since the large-scale manufacture of autologous products entails exact process replication rather than scale-up in volume, as is the case with traditional drugs or biologics. Copyright 2005 Wiley Periodicals, Inc.
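    The kind of Monte Carlo simulation described here, propagating donor-level parameter distributions into lot-to-lot variability, can be sketched as follows; the distributions, means, and the process-recovery term are invented for illustration and are not the study's fitted model:

      import numpy as np

      rng = np.random.default_rng(1)
      n_lots = 10_000
      # hypothetical lognormal distributions for the two donor-level parameters
      cells_collected = rng.lognormal(mean=np.log(2e9), sigma=0.6, size=n_lots)
      class2_per_cell = rng.lognormal(mean=np.log(1e5), sigma=0.8, size=n_lots)
      # a comparatively tight process-recovery term, standing in for manufacturing noise
      process_recovery = rng.normal(loc=0.5, scale=0.05, size=n_lots).clip(0.2, 0.9)

      class2_yield = cells_collected * class2_per_cell * process_recovery
      print("fold range (99th/1st percentile):",
            round(np.percentile(class2_yield, 99) / np.percentile(class2_yield, 1), 1))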

  6. Advanced control of dissolved oxygen concentration in fed batch cultures during recombinant protein production.

    PubMed

    Kuprijanov, A; Gnoth, S; Simutis, R; Lübbert, A

    2009-02-01

    Design and experimental validation of advanced pO(2) controllers for fermentation processes operated in fed-batch mode are described. In most situations, the presented controllers are able to keep the pO(2) in fermentations for recombinant protein production exactly at the desired value. The controllers are based on the gain-scheduling approach to parameter-adaptive proportional-integral controllers. To cope with the most frequently occurring disturbances, the basic gain-scheduling feedback controller was complemented with a feedforward control component. This feedforward/feedback controller significantly improved pO(2) control. By means of numerical simulations, the controller behavior was tested and its parameters were determined. Validation runs were performed with three Escherichia coli strains producing different recombinant proteins. It is finally shown that the new controller leads to significant improvements in the signal-to-noise ratio of other key process variables and, thus, to higher process quality.
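    A minimal sketch of the general idea, a proportional-integral controller whose gains are scheduled on an auxiliary variable plus a static feedforward term, is shown below; the scheduling law, feedforward gain, and variable names are assumptions for illustration, not the published controller:

      # Discrete-time gain-scheduled PI controller with static feedforward (sketch).
      class GainScheduledPI:
          def __init__(self, schedule, ki_ratio=0.1, dt=1.0, u_min=0.0, u_max=100.0):
              self.schedule = schedule      # callable: feed rate -> proportional gain
              self.ki_ratio = ki_ratio      # integral gain as a fraction of Kp
              self.dt, self.u_min, self.u_max = dt, u_min, u_max
              self.integral = 0.0

          def step(self, setpoint, measurement, feed_rate):
              kp = self.schedule(feed_rate)         # gain scheduling on the feed rate
              ki = self.ki_ratio * kp
              error = setpoint - measurement
              self.integral += ki * error * self.dt
              feedforward = 0.3 * feed_rate         # hypothetical static FF gain
              u = kp * error + self.integral + feedforward
              return min(max(u, self.u_min), self.u_max)   # actuation, % of range

      controller = GainScheduledPI(schedule=lambda f: 2.0 + 0.5 * f)
      print(controller.step(setpoint=30.0, measurement=25.0, feed_rate=4.0))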

  7. Posttraumatic growth after cancer: The role of perceived threat and cognitive processing.

    PubMed

    Caspari, Jennifer M; Raque-Bogdan, Trisha L; McRae, Cynthia; Simoneau, Teresa L; Ash-Lee, Susan; Hultgren, Kristin

    2017-01-01

    This study examines the relation between perceived cognitive and physical threat after a cancer diagnosis and posttraumatic growth (PTG). In total, 169 breast, prostate, and colorectal cancer survivors completed questionnaires. Hierarchical regression models found that, after controlling for demographic and medical variables, depression, anxiety, and perceived threat accounted for 41.8% of the variance in positive cognitive processing, and that these variables, along with positive cognitive processing, accounted for 42.7% of the variance in PTG. Positive cognitive processing mediated the pathways between perceived physical threat and PTG. Cognitive processing appears to play a key role in the emergence of PTG following cancer. By exploring survivors' cognitions and perceived threat, psychosocial providers may help cancer survivors cultivate PTG.

  8. A longitudinal study of mortality and air pollution for São Paulo, Brazil.

    PubMed

    Botter, Denise A; Jørgensen, Bent; Peres, Antonieta A Q

    2002-09-01

    We study the effects of various air-pollution variables on the daily death counts for people over 65 years in São Paulo, Brazil, from 1991 to 1993, controlling for meteorological variables. We use a state space model where the air-pollution variables enter via the latent process, and the meteorological variables via the observation equation. The latent process represents the potential mortality due to air pollution, and is estimated by Kalman filter techniques. The effect of air pollution on mortality is found to be a function of the variation in the sulphur dioxide level for the previous 3 days, whereas the other air-pollution variables (total suspended particulates, nitrogen dioxide, carbon monoxide, ozone) are not significant when sulphur dioxide is in the equation. There are significant effects of humidity and up to lag 3 of temperature, and a significant seasonal variation.
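    The filtering step behind such a state space analysis can be illustrated with a scalar local-level model and a hand-written Kalman filter; the covariate structure of the actual mortality model is omitted, and all noise variances below are illustrative:

      import numpy as np

      def kalman_filter_local_level(y, q, r, m0=0.0, p0=1e3):
          """Filter a scalar local-level model: x_t = x_{t-1} + w_t, y_t = x_t + v_t."""
          m, p, out = m0, p0, []
          for obs in y:
              p = p + q                      # predict: state variance grows by q
              k = p / (p + r)                # Kalman gain
              m = m + k * (obs - m)          # update with the new observation
              p = (1 - k) * p
              out.append(m)
          return np.array(out)

      rng = np.random.default_rng(2)
      latent = np.cumsum(rng.normal(0, 0.3, 200))   # synthetic latent daily process
      y = latent + rng.normal(0, 1.0, 200)          # noisy daily observations
      print(kalman_filter_local_level(y, q=0.09, r=1.0)[-5:])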

  9. Boosted structured additive regression for Escherichia coli fed-batch fermentation modeling.

    PubMed

    Melcher, Michael; Scharl, Theresa; Luchner, Markus; Striedner, Gerald; Leisch, Friedrich

    2017-02-01

    The quality of biopharmaceuticals and patient safety are of the highest priority, and there are tremendous efforts to replace empirical production process designs with knowledge-based approaches. The main challenge in this context is that real-time access to process variables related to product quality and quantity is severely limited. To date, comprehensive on- and offline monitoring platforms are used to generate process data sets that allow for the development of mechanistic and/or data-driven models for real-time prediction of these important quantities. The ultimate goal is to implement model-based feedback control loops that facilitate online control of product quality. In this contribution, we explore structured additive regression (STAR) models in combination with boosting as a variable selection tool for modeling the cell dry mass, product concentration, and optical density on the basis of online available process variables and two-dimensional fluorescence spectroscopic data. STAR models are powerful extensions of linear models allowing for the inclusion of smooth effects or interactions between predictors. Boosting constructs the final model in a stepwise manner and provides a variable importance measure via predictor selection frequencies. Our results show that the cell dry mass can be modeled with a relative error of about ±3%, the optical density with ±6%, the soluble protein with ±16%, and the insoluble product with an accuracy of ±12%. Biotechnol. Bioeng. 2017;114: 321-334. © 2016 Wiley Periodicals, Inc.
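    The paper's component-wise boosting of STAR models is typically run with dedicated tools (e.g., mboost in R); as a rough, non-equivalent illustration of boosting-based variable ranking on synthetic "online signals", a gradient-boosting model with impurity-based importances can be sketched in Python:

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(3)
      n = 300
      X = rng.normal(size=(n, 10))          # 10 hypothetical online process signals
      # invented response standing in for cell dry mass: linear + smooth effect + noise
      y = 2.0 * X[:, 0] + np.sin(X[:, 3]) + 0.1 * rng.normal(size=n)

      model = GradientBoostingRegressor(n_estimators=300, max_depth=2,
                                        learning_rate=0.05).fit(X, y)
      ranking = np.argsort(model.feature_importances_)[::-1]
      print("most informative predictors:", ranking[:3])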

  10. Resistance controllability and variability improvement in a TaOx-based resistive memory for multilevel storage application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prakash, A.; Song, J.; Hwang, H.

    In order to obtain reliable multilevel cell (MLC) characteristics, resistance controllability between the different resistance levels is required, especially in resistive random access memory (RRAM), which is prone to resistance variability mainly due to the intrinsic random nature of defect generation and filament formation. In this study, we have thoroughly investigated the multilevel resistance variability in a TaOx-based nanoscale (<30 nm) RRAM operated in MLC mode. It is found that the resistance variability not only depends on the conductive filament size but is also a strong function of the oxygen vacancy concentration in it. Based on the insights gained through experimental observations and simulation, it is suggested that forming a thinner but denser conductive filament may greatly improve the temporal resistance variability even at low operation current, despite the inherently stochastic nature of the resistance switching process.

  11. Work environment risk factors for injuries in wood processing.

    PubMed

    Holcroft, Christina A; Punnett, Laura

    2009-01-01

    The reported injury rate for wood product manufacturing in Maine, 1987-2004, was almost twice the state-wide average for all jobs. A case-control study was conducted in wood processing plants to determine preventable risk factors for injury. A total of 157 cases with injuries reported to workers' compensation and 251 controls were interviewed. In multivariable analyses, variables associated with injury risk were high physical workload, machine-paced work or inability to take a break, lack of training, absence of a lockout/tagout program, low seniority, and male gender. Different subsets of these variables were significant when acute incidents and overexertions were analyzed separately and when all injuries were stratified by industry sub-sector. Generalizability may be limited somewhat by non-representative participation of workplaces and individuals. Nevertheless, these findings provide evidence that many workplace injuries occurring in wood processing could be prevented by application of ergonomics principles and improved work organization.

  12. Self-Control and Deviant Peer Network Structure

    ERIC Educational Resources Information Center

    McGloin, Jean Marie; Shermer, Lauren O'Neill

    2009-01-01

    From learning and opportunity perspectives, peer group structural dimensions shed light on social processes that can amplify or ameliorate the risk of having delinquent friends. Previous research has not accounted for a primary criminological variable, self-control, limiting theoretical clarity. The authors developed three hypotheses about…

  13. Microeconomics of process control in semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Monahan, Kevin M.

    2003-06-01

    Process window control enables accelerated design-rule shrinks for both logic and memory manufacturers, but simple microeconomic models that directly link the effects of process window control to maximum profitability are rare. In this work, we derive these links using a simplified model for the maximum rate of profit generated by the semiconductor manufacturing process. We show that the ability of process window control to achieve these economic objectives may be limited by variability in the larger manufacturing context, including measurement delays and process variation at the lot, wafer, x-wafer, x-field, and x-chip levels. We conclude that x-wafer and x-field CD control strategies will be critical enablers of density, performance and optimum profitability at the 90 and 65nm technology nodes. These analyses correlate well with actual factory data and often identify millions of dollars in potential incremental revenue and cost savings. As an example, we show that a scatterometry-based CD Process Window Monitor is an economically justified, enabling technology for the 65nm node.

  14. Inverted-U shaped dopamine actions on human working memory and cognitive control

    PubMed Central

    Cools, R; D’Esposito, M

    2011-01-01

    Brain dopamine has long been implicated in cognitive control processes, including working memory. However, the precise role of dopamine in cognition is not well understood, partly because there is large variability in the response to dopaminergic drugs both across different behaviors and across different individuals. We review evidence from a series of studies with experimental animals, healthy humans and patients with Parkinson’s disease, which highlight two important factors that contribute to this large variability. First, the existence of an optimum dopamine level for cognitive function implicates the need to take into account baseline levels of dopamine when isolating dopamine’s effects. Second, cognitive control is a multi-factorial phenomenon, requiring a dynamic balance between cognitive stability and cognitive flexibility. These distinct components might implicate the prefrontal cortex and the striatum respectively. Manipulating dopamine will thus have paradoxical consequences for distinct cognitive control processes depending on distinct basal or optimal levels of dopamine in different brain regions. PMID:21531388

  15. Challenges to a blow/fill/seal process with airborne microorganisms having different resistances to dry heat.

    PubMed

    Poisson, Patrick; Sinclair, Colin S; Tallentire, Alan

    2006-01-01

    Controlled challenges with air-dispersed microorganisms having widely different resistances to dry heat, carried out on a 624 BFS machine processing growth medium, have shown that the higher the heat resistance, the greater the extent of vial contamination. Differences in heat resistance also affected the extent of vial contamination when parison and vial formation were deliberately manipulated through changes made to each of three process variables: provision of ballooning air, mould vacuum delay, and parison extrusion rate. The findings demonstrate that, in this investigational system, exposure of challenge microorganisms to heat inherent in the process has a controlling influence on vial contamination, an influence that could also control microbiological risk in production environments.

  16. Mechanisms of the 40-70 Day Variability in the Yucatan Channel Volume Transport

    NASA Astrophysics Data System (ADS)

    van Westen, René M.; Dijkstra, Henk A.; Klees, Roland; Riva, Riccardo E. M.; Slobbe, D. Cornelis; van der Boog, Carine G.; Katsman, Caroline A.; Candy, Adam S.; Pietrzak, Julie D.; Zijlema, Marcel; James, Rebecca K.; Bouma, Tjeerd J.

    2018-02-01

    The Yucatan Channel connects the Caribbean Sea with the Gulf of Mexico and is the main outflow region of the Caribbean Sea. Moorings in the Yucatan Channel show high-frequency variability in kinetic energy (50-100 days) and transport (20-40 days), but the physical mechanisms controlling this variability are poorly understood. In this study, we show that the short-term variability in the Yucatan Channel transport has an upstream origin and arises from processes in the North Brazil Current. To establish this connection, we use data from altimetry and model output from several high-resolution global models. A significant 40-70 day variability is found in the sea surface height in the North Brazil Current retroflection region with a propagation toward the Lesser Antilles. The frequency of variability is generated by intrinsic processes associated with the shedding of eddies, rather than by atmospheric forcing. This sea surface height variability passes the Lesser Antilles, propagates westward with the background ocean flow in the Caribbean Sea, and finally affects the variability in the Yucatan Channel volume transport.

  17. Manufacturing challenge: An employee perception of the impact of BEM variables on motivation

    NASA Astrophysics Data System (ADS)

    Nyaude, Alaster

    The study examines the impact of Thomas F. Gilbert's Behavior Engineering Model (BEM) variables on employee perception of motivation at an aerospace equipment manufacturing plant in Georgia. The research process involved a literature review and the determination of an appropriate survey instrument for the study. The Hersey-Chevalier modified PROBE instrument (Appendix C) was used with Dr Roger Chevalier's validation. The participants' responses were further examined to determine the influence of the demographic control variables of age, gender, length of service with the company, and education on employee perception of motivation. The results indicated that the three most motivating variables were knowledge and skills, capacity, and resources: knowledge and skills was perceived as the most motivating variable, capacity as the second, and resources as the third. Interestingly, the fourth most motivating variable was information, the fifth was motives, and the sixth was incentives. The results also showed that the demographic control variables had no influence on employee perception of motivation. Further research may be required to understand to what extent these BEM variables impact employee perceptions of motivation.

  18. ATAD control goals through the analysis of process variables and evaluation of quality, production and cost.

    PubMed

    Nájera, S; Gil-Martínez, M; Zambrano, J A

    2015-01-01

    The aim of this paper is to establish and quantify different operational goals and control strategies in autothermal thermophilic aerobic digestion (ATAD). This technology appears as an alternative to conventional sludge digestion systems. During the batch-mode reaction, high temperatures promote sludge stabilization and pasteurization. The digester temperature is usually the only online, robust, measurable variable. The average temperature can be regulated by manipulating both the air injection and the sludge retention time. An improved performance of diverse biochemical variables can be achieved through proper manipulation of these inputs. However, a better quality of treated sludge usually implies major operating costs or a lower production rate. Thus, quality, production and cost indices are defined to quantify the outcomes of the treatment. Based on these, tradeoff control strategies are proposed and illustrated through some examples. This paper's results are relevant to guide plant operators, to design automatic control systems and to compare or evaluate the control performance on ATAD systems.

  19. Performance characteristics of a novel blood bag in-line closure device and subsequent product quality assessment

    PubMed Central

    Serrano, Katherine; Levin, Elena; Culibrk, Brankica; Weiss, Sandra; Scammell, Ken; Boecker, Wolfgang F; Devine, Dana V

    2010-01-01

    BACKGROUND In high-volume processing environments, manual breakage of in-line closures can result in repetitive strain injury (RSI). Furthermore, these closures may be incorrectly opened, causing shear-induced hemolysis. To overcome the variability of in-line closure use and minimize RSI, Fresenius Kabi developed a new in-line closure, the CompoFlow, with mechanical openers. STUDY DESIGN AND METHODS The consistency of the performance of the CompoFlow closure device was assessed, as was its effect on component quality. A total of 188 RBC units using CompoFlow blood bag systems and 43 using the standard bag systems were produced using the buffy coat manufacturing method. Twenty-six CompoFlow platelet (PLT) concentrates and 10 control concentrates were prepared from pools of four buffy coats. RBCs were assessed on Days 1, 21, and 42 for cellular variables and hemolysis. PLTs were assessed on Days 1, 3, and 7 for morphology, CD62P expression, glucose, lactate, and pH. A total of 308 closures were excised after processing and the apertures were measured using digital image analysis. RESULTS The use of the CompoFlow device significantly improved the mean extraction time: 0.46 ± 0.11 sec/mL for the CompoFlow units versus 0.52 ± 0.13 sec/mL for the control units. The CompoFlow closures showed a highly reproducible aperture after opening (coefficient of variation, 15%) and the device always remained open. PLT and RBC products showed acceptable storage variables with no differences between CompoFlow and control. CONCLUSIONS The CompoFlow closure devices improved the level of process control and the processing time of blood component production with no negative effects on product quality. PMID:20529007

  20. The numerical modelling and process simulation for the fault diagnosis of rotary kiln incinerator.

    PubMed

    Roh, S D; Kim, S W; Cho, W S

    2001-10-01

    Numerical modelling and process simulation for the fault diagnosis of a rotary kiln incinerator were carried out. Two models were applied to the kiln: a combustion chamber model, comprising the mass and energy balance equations for the two combustion chambers, and a 3D thermal model. The combustion chamber model predicts the temperature within the kiln, the flue gas composition and flux, and the heat of combustion. Using the combustion chamber model and the 3D thermal model, production rules for the process simulation can be obtained through interrelation analysis between control and operation variables. The process simulation of the kiln then runs on these production rules for automatic operation. The process simulation aims to provide fundamental solutions to problems in the incineration process by introducing an online expert control system that brings integrity to process control and management. Knowledge-based expert control systems use symbolic logic and heuristic rules to find solutions for various types of problems. The system was implemented as a hybrid intelligent expert control system by connecting it with the process control systems, giving it the capability of process diagnosis, analysis, and control.

  1. Controls on the Environmental Fate of Compounds Controlled by Coupled Hydrologic and Reactive Processes

    NASA Astrophysics Data System (ADS)

    Hixson, J.; Ward, A. S.; McConville, M.; Remucal, C.

    2017-12-01

    How compounds interact with hydrologic processes or with reactive processes, taken separately, is well established. However, the environmental fate of compounds that interact with both hydrologic and reactive processes is not well known, yet it is critical for evaluating environmental risk. Evaluations of risk are often simplified to homogenize processes in space and time and to assess processes independently of one another. Yet spatial heterogeneity and time-variable reactivities complicate predictions of environmental transport and fate, and the interaction of these processes complicates them further, limiting our ability to accurately predict risk. Compounds that interact with both systems, such as photolytic compounds, require that both components be fully understood in order to predict transport and fate. Release of photolytic compounds occurs through both unintentional releases and intentional loadings. Evaluating risks associated with unintentional releases and implementing best management practices for intentional releases require an in-depth understanding of the sensitivity of photolytic compounds to external controls. Lampricides, such as 3-trifluoromethyl-4-nitrophenol (TFM), are broadly applied in the Great Lakes system to control the population of invasive sea lamprey. Over-dosing can yield fish kills and other detrimental impacts. Still, planning accounts for time of passage and dilution, but not for the interaction of the physical and chemical systems (i.e., storage in the hyporheic zone and time-variable decay rates). In this study, we model a series of TFM applications to test the efficacy of dosing as a function of system characteristics. Overall, our results demonstrate the complexity associated with the passage of photo-sensitive compounds through stream-hyporheic systems and highlight the need to better understand how physical and chemical systems interact to control transport and fate in the environment.

  2. Locomotor sensory organization test: a novel paradigm for the assessment of sensory contributions in gait.

    PubMed

    Chien, Jung Hung; Eikema, Diderik-Jan Anthony; Mukherjee, Mukul; Stergiou, Nicholas

    2014-12-01

    Feedback based balance control requires the integration of visual, proprioceptive and vestibular input to detect the body's movement within the environment. When the accuracy of sensory signals is compromised, the system reorganizes the relative contributions through a process of sensory recalibration, for upright postural stability to be maintained. Whereas this process has been studied extensively in standing using the Sensory Organization Test (SOT), less is known about these processes in more dynamic tasks such as locomotion. In the present study, ten healthy young adults performed the six conditions of the traditional SOT to quantify standing postural control when exposed to sensory conflict. The same subjects performed these six conditions using a novel experimental paradigm, the Locomotor SOT (LSOT), to study dynamic postural control during walking under similar types of sensory conflict. To quantify postural control during walking, the net Center of Pressure sway variability was used. This corresponds to the Performance Index of the center of pressure trajectory, which is used to quantify postural control during standing. Our results indicate that dynamic balance control during locomotion in healthy individuals is affected by the systematic manipulation of multisensory inputs. The sway variability patterns observed during locomotion reflect similar balance performance with standing posture, indicating that similar feedback processes may be involved. However, the contribution of visual input is significantly increased during locomotion, compared to standing in similar sensory conflict conditions. The increased visual gain in the LSOT conditions reflects the importance of visual input for the control of locomotion. Since balance perturbations tend to occur in dynamic tasks and in response to environmental constraints not present during the SOT, the LSOT may provide additional information for clinical evaluation on healthy and deficient sensory processing.

  3. Analysis of factors driving stream water composition and synthesis of management tools--a case study on small/medium Greek catchments.

    PubMed

    Skoulikidis, N Th; Amaxidis, Y; Bertahas, I; Laschou, S; Gritzalis, K

    2006-06-01

    Twenty-nine small- and mid-sized permanent rivers (thirty-six sites) scattered throughout Greece and equally distributed within three geo-chemical-climatic zones have been investigated on a seasonal basis. Hydrochemical types have been determined and spatio-temporal variations have been interpreted in relation to environmental characteristics and anthropogenic pressures. Multivariate statistical techniques have been used to identify the factors and processes affecting hydrochemical variability and the driving forces that control aquatic composition. It has been shown that spatial variation of aquatic quality is mainly governed by geological and hydrogeological factors. Due to geological and climatic variability, the three zones have different hydrochemical characteristics. Temporal hydrological variations in combination with hydrogeological factors control seasonal hydrochemical trends. Respiration processes due to municipal wastewaters dominate in summer and enhance nutrient, chloride, and sodium concentrations, while nitrate originates primarily from agriculture. Photosynthetic processes dominate in spring. Carbonate chemistry is controlled by hydrogeological factors and biological activity. A possible enrichment of surface waters with nutrients in "pristine" forested catchments is attributed to soil leaching and mineralisation processes. Two management tools have been developed: a nutrient classification system and a rapid prediction of aquatic composition tool.

  4. The swiss army knife of job submission tools: grid-control

    NASA Astrophysics Data System (ADS)

    Stober, F.; Fischer, M.; Schleper, P.; Stadie, H.; Garbers, C.; Lange, J.; Kovalchuk, N.

    2017-10-01

    grid-control is a lightweight and highly portable open source submission tool that supports all common workflows in high energy physics (HEP). It has been used by a sizeable number of HEP analyses to process tasks that sometimes consist of up to 100k jobs. grid-control is built around a powerful plugin and configuration system, that allows users to easily specify all aspects of the desired workflow. Job submission to a wide range of local or remote batch systems or grid middleware is supported. Tasks can be conveniently specified through the parameter space that will be processed, which can consist of any number of variables and data sources with complex dependencies on each other. Dataset information is processed through a configurable pipeline of dataset filters, partition plugins and partition filters. The partition plugins can take the number of files, size of the work units, metadata or combinations thereof into account. All changes to the input datasets or variables are propagated through the processing pipeline and can transparently trigger adjustments to the parameter space and the job submission. While the core functionality is completely experiment independent, full integration with the CMS computing environment is provided by a small set of plugins.
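    The parameter-space idea can be illustrated generically; the sketch below is not grid-control's configuration syntax or API, just an illustration of expanding a few variables, one of them dependent, into per-job parameter sets:

      # Illustration only: expanding a small parameter space into per-job settings.
      from itertools import product

      datasets = ["/store/sampleA", "/store/sampleB"]     # hypothetical dataset names
      energies = [7, 13]                                  # hypothetical variable (TeV)

      jobs = []
      for dataset, energy in product(datasets, energies):
          jobs.append({
              "DATASET": dataset,
              "ENERGY": energy,
              # a dependent variable whose value follows from ENERGY
              "GLOBALTAG": "tag_run1" if energy == 7 else "tag_run2",
          })

      for i, job in enumerate(jobs):
          print(f"job {i}: {job}")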

  5. Prospective memory and intraindividual variability in ongoing task response times in an adult lifespan sample: the role of cue focality.

    PubMed

    Ihle, Andreas; Ghisletta, Paolo; Kliegel, Matthias

    2017-03-01

    To contribute to the ongoing conceptual debate of what traditional mean-level ongoing task (OT) costs tell us about the attentional processes underlying prospective memory (PM), we investigated costs to intraindividual variability (IIV) in OT response times as a potentially sensitive indicator of attentional processes. Particularly, we tested whether IIV in OT responses may reflect controlled employment of attentional processes versus lapses of controlled attention, whether these processes differ across adulthood, and whether it is moderated by cue focality. We assessed 150 individuals (19-82 years) in a focal and a nonfocal PM condition. In addition, external measures of inhibition and working memory were assessed. In line with the predictions of the lapses-of-attention/inefficient-executive-control account, our data support the view that costs to IIV in OT trials of PM tasks reflect fluctuations in the efficiency of executive functioning, which was related to failures in prospective remembering, particularly in nonfocal PM tasks, potentially due to their increased executive demands. The additional value of considering costs to IIV over and beyond traditional mean-level OT costs in PM research is discussed.

  6. Fundamental Principles of Coherent-Feedback Quantum Control

    DTIC Science & Technology

    2014-12-08

    AFRL-OSR-VA-TR-2015-0009, Fundamental Principles of Coherent-Feedback Quantum Control, Hideo Mabuchi, Leland Stanford Junior Univ CA, Final Report, 12/08. … foundations and potential applications of coherent-feedback quantum control. We have focused on potential applications in quantum-enhanced metrology (acceleration sensing, vibrometry, gravity wave detection) and in quantum information processing (continuous-variables quantum …)

  7. Numerical Simulation and Optimization of Directional Solidification Process of Single Crystal Superalloy Casting

    PubMed Central

    Zhang, Hang; Xu, Qingyan; Liu, Baicheng

    2014-01-01

    The rapid development of numerical modeling techniques has led to more accurate results in modeling metal solidification processes. In this study, the cellular automaton-finite difference (CA-FD) method was used to simulate the directional solidification (DS) process of single crystal (SX) superalloy blade samples. Experiments were carried out to validate the simulation results. Meanwhile, an intelligent model based on fuzzy control theory was built to optimize the complicated DS process. Several key parameters, such as the mushy zone width and the temperature difference at the cast-mold interface, were chosen as the input variables. The input variables were processed with multivariable fuzzy rules to obtain the output adjustment of the withdrawal rate (v), a key technological parameter. The multivariable fuzzy rules were built from the structural features of the casting, such as the relationship between the section area and the delay time of the temperature response to changes in v, as well as the professional experience of the operators. The fuzzy control model coupled with the CA-FD method could then be used to optimize v in real time during the manufacturing process. The optimized process proved more flexible and adaptive, yielding a steady, stray-grain-free DS process. PMID:28788535
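    A toy sketch of the control idea, two fuzzified inputs combined through rules into an adjustment of the withdrawal rate v, is given below; the membership functions, rule base, and numerical values are invented for illustration and are not the authors' fuzzy model:

      def tri(x, a, b, c):
          """Triangular membership function on [a, c] peaking at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def delta_v(width_err_mm, dT_K):
          """Weighted-average defuzzification of two illustrative rules."""
          too_wide = tri(width_err_mm, 0.0, 5.0, 10.0)      # mushy zone too wide
          too_cold = tri(dT_K, -30.0, -15.0, 0.0)           # interface too cold
          # Rule 1: IF mushy zone too wide THEN slow the withdrawal (negative dv)
          # Rule 2: IF interface too cold THEN speed up the withdrawal (positive dv)
          dv_slow, dv_fast = -0.5, +0.5                     # mm/min, hypothetical
          weight = too_wide + too_cold
          if weight == 0.0:
              return 0.0
          return (too_wide * dv_slow + too_cold * dv_fast) / weight

      print(delta_v(width_err_mm=6.0, dT_K=-20.0))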

  8. A first-principle model of 300 mm Czochralski single-crystal Si production process for predicting crystal radius and crystal growth rate

    NASA Astrophysics Data System (ADS)

    Zheng, Zhongchao; Seto, Tatsuru; Kim, Sanghong; Kano, Manabu; Fujiwara, Toshiyuki; Mizuta, Masahiko; Hasebe, Shinji

    2018-06-01

    The Czochralski (CZ) process is the dominant method for manufacturing large cylindrical single-crystal ingots for the electronics industry. Although many models and control methods for the CZ process have been proposed, they were only tested with small equipment, and only a few industrial applications have been reported. In this research, we constructed a first-principle model for controlling industrial CZ processes that produce 300 mm single-crystal silicon ingots. The developed model, which consists of energy, mass balance, hydrodynamic, and geometrical equations, calculates the crystal radius and the crystal growth rate as output variables by using the heater input, the crystal pulling rate, and the crucible rise rate as input variables. To improve accuracy, we modeled the CZ process by considering factors such as changes in the positions of the crucible and the melt level. The model was validated with operation data from an industrial 300 mm CZ process. We compared the calculated and actual values of the crystal radius and the crystal growth rate, and the results demonstrated that the developed model simulated the industrial process with high accuracy.

  9. Effective discharge analysis of ecological processes in streams

    USGS Publications Warehouse

    Doyle, Martin W.; Stanley, Emily H.; Strayer, David L.; Jacobson, Robert B.; Schmidt, John C.

    2005-01-01

    Discharge is a master variable that controls many processes in stream ecosystems. However, there is uncertainty of which discharges are most important for driving particular ecological processes and thus how flow regime may influence entire stream ecosystems. Here the analytical method of effective discharge from fluvial geomorphology is used to analyze the interaction between frequency and magnitude of discharge events that drive organic matter transport, algal growth, nutrient retention, macroinvertebrate disturbance, and habitat availability. We quantify the ecological effective discharge using a synthesis of previously published studies and modeling from a range of study sites. An analytical expression is then developed for a particular case of ecological effective discharge and is used to explore how effective discharge varies within variable hydrologic regimes. Our results suggest that a range of discharges is important for different ecological processes in an individual stream. Discharges are not equally important; instead, effective discharge values exist that correspond to near modal flows and moderate floods for the variable sets examined. We suggest four types of ecological response to discharge variability: discharge as a transport mechanism, regulator of habitat, process modulator, and disturbance. Effective discharge analysis will perform well when there is a unique, essentially instantaneous relationship between discharge and an ecological process and poorly when effects of discharge are delayed or confounded by legacy effects. Despite some limitations the conceptual and analytical utility of the effective discharge analysis allows exploring general questions about how hydrologic variability influences various ecological processes in streams.
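    The magnitude-frequency calculation behind effective discharge can be sketched numerically: multiply a flow-frequency density by a process rating curve and locate the peak of the product. The distribution, threshold, and exponent below are illustrative, not values from the study:

      import numpy as np
      from scipy import stats

      q = np.linspace(0.1, 100, 2000)                      # discharge, m^3/s
      freq = stats.lognorm(s=1.0, scale=5.0).pdf(q)        # flow-frequency density
      rating = np.where(q > 2.0, (q - 2.0) ** 1.5, 0.0)    # thresholded process rate
      effectiveness = freq * rating                        # frequency x magnitude
      print("effective discharge ~", round(q[np.argmax(effectiveness)], 1), "m^3/s")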

  10. Toward a Generative Model of the Teaching-Learning Process.

    ERIC Educational Resources Information Center

    McMullen, David W.

    Until the rise of cognitive psychology, models of the teaching-learning process (TLP) stressed external rather than internal variables. Models remained general descriptions until control theory introduced explicit system analyses. Cybernetic models emphasize feedback and adaptivity but give little attention to creativity. Research on artificial…

  11. What controls channel form in steep mountain streams?

    NASA Astrophysics Data System (ADS)

    Palucis, M. C.; Lamb, M. P.

    2017-07-01

    Steep mountain streams have channel morphologies that transition from alternate bar to step-pool to cascade with increasing bed slope, which affect stream habitat, flow resistance, and sediment transport. Experimental and theoretical studies suggest that alternate bars form under large channel width-to-depth ratios, step-pools form in near supercritical flow or when channel width is narrow compared to bed grain size, and cascade morphology is related to debris flows. However, the connection between these process variables and bed slope—the apparent dominant variable for natural stream types—is unclear. Combining field data and theory, we find that certain bed slopes have unique channel morphologies because the process variables covary systematically with bed slope. Multiple stable states are predicted for other ranges in bed slope, suggesting that a competition of underlying processes leads to the emergence of the most stable channel form.

  12. Ampicillin Nanoparticles Production via Supercritical CO2 Gas Antisolvent Process.

    PubMed

    Esfandiari, Nadia; Ghoreishi, Seyyed M

    2015-12-01

    The micronization of ampicillin via supercritical gas antisolvent (GAS) process was studied. The particle size distribution was significantly controlled with effective GAS variables such as initial solute concentration, temperature, pressure, and antisolvent addition rate. The effect of each variable in three levels was investigated. The precipitated particles were analyzed with scanning electron microscopy (SEM) and Zetasizer Nano ZS. The results indicated that decreasing the temperature and initial solute concentration while increasing the antisolvent rate and pressure led to a decrease in ampicillin particle size. The mean particle size of ampicillin was obtained in the range of 220-430 nm by varying the GAS effective variables. The purity of GAS-synthesized ampicillin nanoparticles was analyzed in contrast to unprocessed ampicillin by FTIR and HPLC. The results indicated that the structure of the ampicillin nanoparticles remained unchanged during the GAS process.

  13. The concept and science process skills analysis in bomb calorimeter experiment as a foundation for the development of virtual laboratory of bomb calorimeter

    NASA Astrophysics Data System (ADS)

    Kurniati, D. R.; Rohman, I.

    2018-05-01

    This study aims to analyze the concepts and science process skills involved in the bomb calorimeter experiment as a basis for developing a virtual bomb calorimeter laboratory. The study employed the research and development (R&D) method to answer the proposed problems; this paper discusses the concept and process skills analysis. The essential concepts and process skills associated with the bomb calorimeter were analyzed by optimizing the bomb calorimeter experiment. The concept analysis identified seven fundamental concepts to be addressed in developing the virtual laboratory: internal energy, burning heat, perfect combustion, incomplete combustion, the calorimeter constant, the bomb calorimeter itself, and Black's principle. Because the concepts of the bomb calorimeter and of perfect and incomplete combustion describe the real situation and contain controllable variables, they are displayed in the virtual laboratory as simulations. The remaining four concepts are presented as animations because they contain no variable to be controlled. The process skills analysis identified four notable skills to be developed: observation, experiment design, interpretation, and communication.
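    The two calculations underlying the calorimeter-constant and burning-heat concepts can be illustrated with a short worked example following q = C_cal · ΔT; the masses and temperature rises are hypothetical, while the benzoic acid value is the standard calibration heat of combustion:

      # Calibration: the heat released by a known mass of benzoic acid fixes C_cal.
      m_benzoic = 1.000          # g of benzoic acid burned in the calibration
      dH_benzoic = -26.43        # kJ/g, standard heat of combustion of benzoic acid
      dT_cal = 2.15              # K, observed temperature rise during calibration

      C_cal = -m_benzoic * dH_benzoic / dT_cal      # calorimeter constant, kJ/K
      print(f"calorimeter constant: {C_cal:.2f} kJ/K")

      # Measurement: the same constant converts a sample's temperature rise to heat.
      m_sample = 0.850           # g of sample
      dT_sample = 1.80           # K, temperature rise for the sample
      q_sample = C_cal * dT_sample                  # heat absorbed by calorimeter, kJ
      print(f"heat of combustion of the sample: {-q_sample / m_sample:.2f} kJ/g")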

  14. Reward Improves Cancellation and Restraint Inhibition Across Childhood and Adolescence

    PubMed Central

    Sinopoli, Katia J.; Schachar, Russell; Dennis, Maureen

    2011-01-01

    Inhibitory control allows for the regulation of thought and action, and interacts with motivational variables, such as reward, to modify behavior adaptively as environments change. We examined the effects of reward on two distinct forms of inhibitory control, cancellation and restraint. Typically developing children and adolescents completed two versions of the stop signal task (cancellation and restraint) under three reward conditions (neutral, low reward, and high reward), where rewards were earned for successful inhibitory control. Rewards improved both cancellation and restraint inhibition, with similar effects of reward on each form of inhibitory control. Rewards did not alter the speed of response execution in either task, suggesting that rewards specifically altered inhibition processes without influencing processes related to response execution. Adolescents were faster and less variable than children when executing and inhibiting their responses. There were similar developmental effects of reward on the speed of inhibitory control, but group differences were found in terms of accuracy of inhibition in the restraint task. These results clarify how reward modulates two different forms of regulatory behavior in children and adolescents. PMID:21744952

  15. Parental divorce during early adolescence in Caucasian families: the role of family process variables in predicting the long-term consequences for early adult psychosocial adjustment.

    PubMed

    Summers, P; Forehand, R; Armistead, L; Tannenbaum, L

    1998-04-01

    The relationship between parental divorce occurring during adolescence and young adult psychosocial adjustment was examined, as was the role of family process variables in clarifying this relationship. Participants were young Caucasian adults from divorced (n = 119) and married (n = 123) families. Assessments were conducted during adolescence and 6 years later during early adulthood. Young adults from married families reported more secure romantic attachments than those from divorced families; however, differences were not evident in other domains of psychosocial adjustment after demographic variables were controlled. Three family process variables (parent-adolescent relationship, interparental conflict, and maternal depressive symptoms) were examined as potential mediators and moderators of the association between parental divorce and young adult adjustment. No evidence supporting mediation or moderation was found; however, the parent-adolescent and parent-young adult relationships, particularly when the identified parent was the father, emerged as significant predictors of young adult psychosocial adjustment.

  16. Association of familial risk for schizophrenia with thalamic and medial prefrontal functional connectivity during attentional control.

    PubMed

    Antonucci, Linda A; Taurisano, Paolo; Fazio, Leonardo; Gelao, Barbara; Romano, Raffaella; Quarto, Tiziana; Porcelli, Annamaria; Mancini, Marina; Di Giorgio, Annabella; Caforio, Grazia; Pergola, Giulio; Popolizio, Teresa; Bertolino, Alessandro; Blasi, Giuseppe

    2016-05-01

    Anomalies in behavioral correlates of attentional processing and related brain activity are crucial correlates of schizophrenia and are associated with familial risk for this brain disorder. However, it is not clear how brain functional connectivity during attentional processes is key for schizophrenia and linked with trait- vs. state-related variables. To address this issue, we investigated patterns of functional connections during attentional control in healthy siblings of patients with schizophrenia, who share genetic features with probands but not variables related to the state of the disorder. A total of 356 controls, 55 patients with schizophrenia on stable treatment with antipsychotics, and 40 healthy siblings of patients with this brain disorder underwent the Variable Attentional Control (VAC) task during fMRI. Independent component analysis (ICA) was used to identify independent components (ICs) of the BOLD signal recorded during task performance. Results indicated reduced connectivity strength in patients with schizophrenia, as well as in their healthy siblings, in the left thalamus within an attentional control component, and greater connectivity in the right medial prefrontal cortex (PFC) within the so-called Default Mode Network (DMN), compared to healthy individuals. These results suggest a relationship between familial risk for schizophrenia and brain functional networks during attentional control, such that this biological phenotype may be considered a useful intermediate phenotype for linking gene effects to aspects of the pathophysiology of this brain disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
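    As a generic illustration of the ICA step (not the group-ICA fMRI pipeline used in the study), scikit-learn's FastICA can unmix synthetic sources from their linear mixtures:

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(6)
      t = np.linspace(0, 8, 2000)
      sources = np.column_stack([np.sin(2 * t), np.sign(np.sin(3 * t))])  # two sources
      mixing = np.array([[1.0, 0.5], [0.4, 1.2]])
      mixed = sources @ mixing.T + 0.02 * rng.normal(size=(2000, 2))      # observed mix

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(mixed)       # estimated independent components
      print("estimated mixing matrix:\n", ica.mixing_)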

  17. Multivariate analysis of sludge disintegration by microwave-hydrogen peroxide pretreatment process.

    PubMed

    Ya-Wei, Wang; Cheng-Min, Gui; Xiao-Tang, Ni; Mei-Xue, Chen; Yuan-Song, Wei

    2015-01-01

    Microwave irradiation (with H2O2) has been shown to offer considerable advantages owing to its flexible control, low overall cost, and resulting higher soluble chemical oxygen demand (SCOD); accordingly, the method has been proposed recently as a means of improving sludge disintegration. However, the key factor controlling this sludge pretreatment process, pH, has received insufficient attention to date. To address this, the response surface approach (central composite design) was applied to evaluate the effects of total suspended solids (TSS, 2-20 g/L), pH (4-10), and H2O2 dosage (0-2 w/w) and their interactions on 16 response variables (e.g., SCOD released, pH, H2O2 remaining). The results demonstrated that all three factors affect sludge disintegration significantly, and no pronounced interactions between response variables were observed during disintegration, except for three variables (TCOD, TSS remaining, and H2O2 remaining). Quadratic predictive models were constructed for all 16 response variables (R²: 0.871-0.991). Taking soluble chemical oxygen demand (SCOD) as an example, the model and coefficients derived above were able to predict the performance of microwave pretreatment (enhanced by H2O2 and pH adjustment) from previously published studies. The predictive models developed were able to optimize the treatment process for multiple disintegration objectives. Copyright © 2014 Elsevier B.V. All rights reserved.
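    Fitting a quadratic response-surface model of the kind used here can be sketched with synthetic data; the factor ranges follow the abstract, but the response values and coefficients are invented:

      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(4)
      n = 60
      tss = rng.uniform(2, 20, n)        # g/L
      ph = rng.uniform(4, 10, n)
      h2o2 = rng.uniform(0, 2, n)        # w/w
      X = np.column_stack([tss, ph, h2o2])
      # invented response with curvature and a pH*H2O2 interaction
      scod = 50 + 8 * tss - 0.2 * tss**2 + 30 * h2o2 - 5 * h2o2 * ph + rng.normal(0, 5, n)

      # degree-2 polynomial features give the full quadratic (squares + interactions)
      model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                            LinearRegression()).fit(X, scod)
      print("R^2 on the fitted data:", round(model.score(X, scod), 3))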

  18. Complexity in relational processing predicts changes in functional brain network dynamics.

    PubMed

    Cocchi, Luca; Halford, Graeme S; Zalesky, Andrew; Harding, Ian H; Ramm, Brentyn J; Cutmore, Tim; Shum, David H K; Mattingley, Jason B

    2014-09-01

    The ability to link variables is critical to many high-order cognitive functions, including reasoning. It has been proposed that limits in relating variables depend critically on relational complexity, defined formally as the number of variables to be related in solving a problem. In humans, the prefrontal cortex is known to be important for reasoning, but recent studies have suggested that such processes are likely to involve widespread functional brain networks. To test this hypothesis, we used functional magnetic resonance imaging and a classic measure of deductive reasoning to examine changes in brain networks as a function of relational complexity. As expected, behavioral performance declined as the number of variables to be related increased. Likewise, increments in relational complexity were associated with proportional enhancements in brain activity and task-based connectivity within and between 2 cognitive control networks: A cingulo-opercular network for maintaining task set, and a fronto-parietal network for implementing trial-by-trial control. Changes in effective connectivity as a function of increased relational complexity suggested a key role for the left dorsolateral prefrontal cortex in integrating and implementing task set in a trial-by-trial manner. Our findings show that limits in relational processing are manifested in the brain as complexity-dependent modulations of large-scale networks. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. Dynamic Modeling of the Main Blow in Basic Oxygen Steelmaking Using Measured Step Responses

    NASA Astrophysics Data System (ADS)

    Kattenbelt, Carolien; Roffel, B.

    2008-10-01

    In the control and optimization of basic oxygen steelmaking, it is important to have an understanding of the influence of control variables on the process. However, important process variables such as the composition of the steel and slag cannot be measured continuously. The decarburization rate and the accumulation rate of oxygen, which can be derived from the generally measured waste gas flow and composition, are an indication of changes in steel and slag composition. The influence of the control variables on the decarburization rate and the accumulation rate of oxygen can best be determined in the main blow period. In this article, the measured step responses of the decarburization rate and the accumulation rate of oxygen to step changes in the oxygen blowing rate, lance height, and the addition rate of iron ore during the main blow are presented. These measured step responses are subsequently used to develop a dynamic model for the main blow. The model consists of an iron oxide and a carbon balance and an additional equation describing the influence of the lance height and the oxygen blowing rate on the decarburization rate. With this simple dynamic model, the measured step responses can be explained satisfactorily.
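    A schematic of what a two-balance dynamic model with a step response looks like is sketched below; the rate laws and constants are invented placeholders and are not the authors' model equations:

      # Forward-Euler integration of a toy carbon balance and iron oxide balance,
      # with the decarburization rate depending on the oxygen blowing rate and the
      # bath FeO; a step in the blowing rate at t = 300 s gives a step response.
      dt, t_end = 1.0, 600.0
      carbon, feo = 0.040, 0.15          # mass fractions, illustrative initial values
      k_dec, k_feo = 2e-6, 5e-8          # invented rate constants

      for step in range(int(t_end / dt)):
          t = step * dt
          o2_rate = 600.0 if t < 300.0 else 700.0                # Nm^3/min, step change
          decarb_rate = k_dec * o2_rate * carbon * (0.5 + feo)   # toy rate law, 1/s
          feo_rate = k_feo * o2_rate - 0.8 * decarb_rate         # oxygen accumulates as FeO
          carbon = max(carbon - decarb_rate * dt, 0.0)
          feo = max(feo + feo_rate * dt, 0.0)
          if step % 150 == 0:
              print(f"t={t:5.0f}s  C={carbon:.4f}  FeO={feo:.4f}  dC/dt={decarb_rate:.6f}")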

  20. Analysis of positive control STR experiments reveals that results obtained for FGA, D3S1358, and D13S317 condition the success rate of the analysis of routine reference samples.

    PubMed

    Murigneux, Valentine; Dufour, Anne-Béatrice; Lobry, Jean R; Pène, Laurent

    2014-07-01

    About 120,000 reference samples are analyzed each year in the Forensic Laboratory of Lyon. A total of 1640 positive control experiments used to validate and optimize the analytical method in the routine process were submitted to a multivariate exploratory data analysis approach with the aim of better understanding the underlying sources of variability. The peak heights of the 16 genetic markers targeted by the AmpFℓSTR(®) Identifiler(®) STR kit were used as variables of interest. Six different 3130xl genetic analyzers located in the same controlled environment were involved. Two major sources of variability were found: (i) the DNA load of the sample modulates all peak heights in a similar way so that the 16 markers are highly correlated, (ii) the genetic analyzer used with a locus-specific response for peak height and a better sensitivity for the most recently acquired. Three markers (FGA, D3S1358, and D13S317) were found to be of special interest to predict the success rate observed in the routine process. © 2014 American Academy of Forensic Sciences.

  1. Sometimes processes don't matter: the general effect of short term climate variability on erosional systems.

    NASA Astrophysics Data System (ADS)

    Deal, Eric; Braun, Jean

    2017-04-01

    Climatic forcing undoubtedly plays an important role in shaping the Earth's surface. However, precisely how climate affects erosion rates, landscape morphology and the sedimentary record is highly debated. Recently there has been a focus on the influence of short-term variability in rainfall and river discharge on the relationship between climate and erosion rates. Here, we present a simple probabilistic argument, backed by modelling, that demonstrates that the way the Earth's surface responds to short-term climatic forcing variability is primarily determined by the existence and magnitude of erosional thresholds. We find that it is the ratio between the threshold magnitude and the mean magnitude of climatic forcing that determines whether variability matters or not and in which way. This is a fundamental result that applies regardless of the nature of the erosional process. This means, for example, that we can understand the role that discharge variability plays in determining fluvial erosion efficiency despite doubts about the processes involved in fluvial erosion. We can use this finding to reproduce the main conclusions of previous studies on the role of discharge variability in determining long-term fluvial erosion efficiency. Many aspects of the landscape known to influence discharge variability are affected by human activity, such as land use and river damming. Another important control on discharge variability, rainfall intensity, is also expected to increase with warmer temperatures. Among many other implications, our findings help provide a general framework to understand and predict the response of the Earth's surface to changes in mean and variability of rainfall and river discharge associated with anthropogenic activity. In addition, the process-independent nature of our findings suggests that previous work on river discharge variability and erosion thresholds can be applied to other erosional systems.
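
    The probabilistic argument can be reproduced in a few lines of Monte Carlo: erosion is assumed to respond only to forcing above a threshold, and the effect of variability is compared for different threshold-to-mean ratios. The gamma discharge model and the excess-over-threshold erosion law below are illustrative assumptions, not the specific laws used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_erosion(mean_q, variability, threshold, n=200_000):
    # Gamma-distributed daily discharge with fixed mean and tunable variability
    # (shape k sets the coefficient of variation: CV = 1/sqrt(k)).
    k = 1.0 / variability**2
    q = rng.gamma(shape=k, scale=mean_q / k, size=n)
    # Erosion proportional to the excess of discharge over the threshold.
    return np.mean(np.clip(q - threshold, 0.0, None))

for ratio in (0.0, 0.5, 2.0):            # threshold / mean forcing
    low = mean_erosion(1.0, 0.3, ratio)
    high = mean_erosion(1.0, 1.5, ratio)
    print(f"threshold/mean={ratio}: erosion low-var={low:.3f}, high-var={high:.3f}")
```

    With no threshold the mean response is insensitive to variability, while for a threshold well above the mean forcing, higher variability strongly increases the expected erosion, in line with the argument above.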

  2. Intradaily variability of water quality in a shallow tidal lagoon: Mechanisms and implications

    USGS Publications Warehouse

    Lucas, L.V.; Sereno, D.M.; Burau, J.R.; Schraga, T.S.; Lopez, C.B.; Stacey, M.T.; Parchevsky, K.V.; Parchevsky, V.P.

    2006-01-01

    Although surface water quality and its underlying processes vary over time scales ranging from seconds to decades, they have historically been studied at the lower (weekly to interannual) frequencies. The aim of this study was to investigate intradaily variability of three water quality parameters in a small freshwater tidal lagoon (Mildred Island, California). High frequency time series of specific conductivity, water temperature, and chlorophyll a at two locations within the habitat were analyzed in conjunction with supporting hydrodynamic, meteorological, biological, and spatial mapping data. All three constituents exhibited large amplitude intradaily (e.g., semidiurnal tidal and diurnal) oscillations, and periodicity varied across constituents, space, and time. Like other tidal embayments, this habitat is influenced by several processes with distinct periodicities including physical controls, such as tides, solar radiation, and wind, and biological controls, such as photosynthesis, growth, and grazing. A scaling approach was developed to estimate individual process contributions to the observed variability. Scaling results were generally consistent with observations and together with detailed examination of time series and time derivatives, revealed specific mechanisms underlying the observed periodicities, including interactions between the tidal variability, heating, wind, and biology. The implications for monitoring were illustrated through subsampling of the data set. This exercise demonstrated how quantities needed by scientists and managers (e.g., mean or extreme concentrations) may be misrepresented by low frequency data and how short-duration high frequency measurements can aid in the design and interpretation of temporally coarser sampling programs. The dispersive export of chlorophyll a from the habitat exhibited a fortnightly variability corresponding to the modulation of semidiurnal tidal currents with the diurnal cycle of phytoplankton variability, demonstrating how high frequency interactions can govern long-term trends. Process identification, as through the scaling analysis here, can help us anticipate changes in system behavior and adapt our own interactions with the system. © 2006 Estuarine Research Federation.

  3. Unconditional or Conditional Logistic Regression Model for Age-Matched Case-Control Data?

    PubMed

    Kuo, Chia-Ling; Duan, Yinghui; Grady, James

    2018-01-01

    Matching on demographic variables is commonly used in case-control studies to adjust for confounding at the design stage. There is a presumption that matched data need to be analyzed by matched methods. Conditional logistic regression has become a standard for matched case-control data to tackle the sparse data problem. The sparse data problem, however, may not be a concern for loose-matching data when the matching between cases and controls is not unique, and one case can be matched to other controls without substantially changing the association. Data matched on a few demographic variables are clearly loose-matching data, and we hypothesize that unconditional logistic regression is a proper method to perform. To address the hypothesis, we compare unconditional and conditional logistic regression models by precision in estimates and hypothesis testing using simulated matched case-control data. Our results support our hypothesis; however, the unconditional model is not as robust as the conditional model to the matching distortion that the matching process not only makes cases and controls similar for matching variables but also for the exposure status. When the study design involves other complex features or the computational burden is high, matching in loose-matching data can be ignored for negligible loss in testing and estimation if the distributions of matching variables are not extremely different between cases and controls.
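
    A small simulation makes the comparison concrete: loosely age-matched pairs are generated and the exposure effect is estimated with both an unconditional model (adjusting for age) and a conditional model (conditioning on the matched stratum). The effect sizes and matching scheme are assumptions, and the sketch presumes a recent statsmodels release in which ConditionalLogit is available.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.conditional_models import ConditionalLogit  # assumed available

# Simulated 1:1 age-matched case-control data with a true exposure odds ratio
# of roughly 2 (all parameters are illustrative).
rng = np.random.default_rng(2)
n_pairs = 500
ages = rng.integers(40, 80, size=n_pairs)
rows = []
for i, a in enumerate(ages):
    for case in (1, 0):
        # Exposure prevalence depends weakly on age and is higher among cases.
        exposure = rng.binomial(1, 0.2 + 0.004 * (a - 40) + 0.15 * case)
        rows.append({"stratum": i, "case": case, "age": a, "exposure": exposure})
df = pd.DataFrame(rows)

# Unconditional model: adjust for the matching variable directly.
X = sm.add_constant(df[["exposure", "age"]])
uncond = sm.Logit(df["case"], X).fit(disp=False)

# Conditional model: condition on the matched stratum (age drops out).
cond = ConditionalLogit(df["case"], df[["exposure"]], groups=df["stratum"]).fit()

print("unconditional OR:", np.exp(uncond.params["exposure"]))
print("conditional   OR:", np.exp(np.asarray(cond.params))[0])
```

    For loose matching of this kind the two odds ratios are typically close, which is the point the authors make about when the simpler unconditional model suffices.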

  4. Unconditional or Conditional Logistic Regression Model for Age-Matched Case–Control Data?

    PubMed Central

    Kuo, Chia-Ling; Duan, Yinghui; Grady, James

    2018-01-01

    Matching on demographic variables is commonly used in case–control studies to adjust for confounding at the design stage. There is a presumption that matched data need to be analyzed by matched methods. Conditional logistic regression has become a standard for matched case–control data to tackle the sparse data problem. The sparse data problem, however, may not be a concern for loose-matching data when the matching between cases and controls is not unique, and one case can be matched to other controls without substantially changing the association. Data matched on a few demographic variables are clearly loose-matching data, and we hypothesize that unconditional logistic regression is a proper method to perform. To address the hypothesis, we compare unconditional and conditional logistic regression models by precision in estimates and hypothesis testing using simulated matched case–control data. Our results support our hypothesis; however, the unconditional model is not as robust as the conditional model to the matching distortion that the matching process not only makes cases and controls similar for matching variables but also for the exposure status. When the study design involves other complex features or the computational burden is high, matching in loose-matching data can be ignored for negligible loss in testing and estimation if the distributions of matching variables are not extremely different between cases and controls. PMID:29552553

  5. Method and system for spatially variable rate application of agricultural chemicals based on remotely sensed vegetation data

    NASA Technical Reports Server (NTRS)

    Lewis, Mark David (Inventor); Seal, Michael R. (Inventor); Hood, Kenneth Brown (Inventor); Johnson, James William (Inventor)

    2007-01-01

    Remotely sensed spectral image data are used to develop a Vegetation Index file which represents spatial variations of actual crop vigor throughout a field that is under cultivation. The latter information is processed to place it in a format that can be used by farm personnel to correlate and calibrate it with actually observed crop conditions existing at control points within the field. Based on the results, farm personnel formulate a prescription request, which is forwarded via email or FTP to a central processing site, where the prescription is prepared. The latter is returned via email or FTP to on-site farm personnel, who can load it into a controller on a spray rig that directly applies inputs to the field at a spatially variable rate.
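
    A minimal sketch of the vegetation-index-to-prescription step is given below: red and near-infrared reflectance are combined into an NDVI grid, vigor classes are derived, and a lookup table maps classes to application rates. The reflectance values, class breaks, and rate table are hypothetical and do not reproduce the patented method.

```python
import numpy as np

# Simulated red and near-infrared reflectance for a 100 x 100 cell field grid.
rng = np.random.default_rng(3)
red = rng.uniform(0.05, 0.25, size=(100, 100))
nir = rng.uniform(0.30, 0.60, size=(100, 100))

# Normalized Difference Vegetation Index as a proxy for crop vigor.
ndvi = (nir - red) / (nir + red)

# Split into low/medium/high vigor classes (calibration against control points
# would set these breaks in practice), then map classes to application rates.
bins = np.quantile(ndvi, [0.33, 0.66])
vigor_class = np.digitize(ndvi, bins)          # 0 = low, 1 = medium, 2 = high
rate_lookup = np.array([12.0, 8.0, 4.0])       # e.g. gallons/acre, illustrative policy
prescription = rate_lookup[vigor_class]

np.savetxt("prescription_grid.csv", prescription, delimiter=",")  # file for the rig controller
print("mean application rate:", round(prescription.mean(), 2))
```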

  6. Method and apparatus for spatially variable rate application of agricultural chemicals based on remotely sensed vegetation data

    NASA Technical Reports Server (NTRS)

    Hood, Kenneth Brown (Inventor); Johnson, James William (Inventor); Seal, Michael R. (Inventor); Lewis, Mark David (Inventor)

    2004-01-01

    Remotely sensed spectral image data are used to develop a Vegetation Index file which represents spatial variations of actual crop vigor throughout a field that is under cultivation. The latter information is processed to place it in a format that can be used by farm personnel to correlate and calibrate it with actually observed crop conditions existing at control points within the field. Based on the results, farm personnel formulate a prescription request, which is forwarded via email or FTP to a central processing site, where the prescription is prepared. The latter is returned via email or FTP to on-site farm personnel, who can load it into a controller on a spray rig that directly applies inputs to the field at a spatially variable rate.

  7. Global patterns in the poleward expansion of mangrove forests

    NASA Astrophysics Data System (ADS)

    Cavanaugh, K. C.; Feller, I. C.

    2016-12-01

    Understanding the processes that limit the geographic ranges of species is one of the central goals of ecology and biogeography. This issue is particularly relevant for coastal wetlands given that climate change is expected to lead to a 'tropicalization' of temperate coastal and marine ecosystems. In coastal wetlands around the world, there have already been observations of mangroves expanding into salt marshes near the current poleward range limits of mangroves. However, there is still uncertainty regarding regional variability in the factors that control mangrove range limits. Here we used time series of Landsat satellite imagery to characterize patterns of mangrove abundance near their poleward range limits around the world. We tested the commonly held assumption that temporal variation in abundance should increase towards the edge of the range. We also compared variability in mangrove abundance to climate factors thought to set mangrove range limits (air temperature, water temperature, and aridity). In general, variability in mangrove abundance at range edges was high relative to range centers and this variability was correlated to one or more climate factors. However, the strength of these relationships varied among poleward range limits, suggesting that some mangrove range limits are controlled by processes other than climate, such as dispersal limitation.

  8. What Students Learn from Hands-On Activities

    ERIC Educational Resources Information Center

    Schwichow, Martin; Zimmerman, Corinne; Croker, Steve; Härtig, Hendrik

    2016-01-01

    The ability to design and interpret controlled experiments is an important scientific process skill and a common objective of science standards. Numerous intervention studies have investigated how the control-of-variables-strategy (CVS) can be introduced to students. However, a meta-analysis of 72 intervention studies found that the opportunity to…

  9. Health behavior change in advance care planning: an agent-based model.

    PubMed

    Ernecoff, Natalie C; Keane, Christopher R; Albert, Steven M

    2016-02-29

    A practical and ethical challenge in advance care planning research is controlling and intervening on human behavior. Additionally, observing dynamic changes in advance care planning (ACP) behavior proves difficult, though tracking changes over time is important for intervention development. Agent-based modeling (ABM) allows researchers to integrate complex behavioral data about advance care planning behaviors and thought processes into a controlled environment that is more easily alterable and observable. Literature to date has not addressed how best to motivate individuals, increase facilitators and reduce barriers associated with ACP. We aimed to build an ABM that applies the Transtheoretical Model of behavior change to ACP as a health behavior and accurately reflects: 1) the rates at which individuals complete the process, 2) how individuals respond to barriers, facilitators, and behavioral variables, and 3) the interactions between these variables. We developed a dynamic ABM of the ACP decision making process based on the stages of change posited by the Transtheoretical Model. We integrated barriers, facilitators, and other behavioral variables that agents encounter as they move through the process. We successfully incorporated ACP barriers, facilitators, and other behavioral variables into our ABM, forming a plausible representation of ACP behavior and decision-making. The resulting distributions across the stages of change replicated those found in the literature, with approximately half of participants in the action-maintenance stage in both the model and the literature. Our ABM is a useful method for representing dynamic social and experiential influences on the ACP decision making process. This model suggests structural interventions, e.g. increasing access to ACP materials in primary care clinics, in addition to improved methods of data collection for behavioral studies, e.g. incorporating longitudinal data to capture behavioral dynamics.
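
    The stage-based structure lends itself to a compact sketch: agents hold barrier and facilitator levels and move probabilistically through the Transtheoretical Model's stages of change. The transition probabilities and effect sizes below are illustrative assumptions, not the calibrated values of the published model.

```python
import numpy as np

rng = np.random.default_rng(4)
STAGES = ["precontemplation", "contemplation", "preparation", "action", "maintenance"]

class Agent:
    """One person moving through the ACP stages of change (toy parameters)."""
    def __init__(self):
        self.stage = 0
        self.barriers = rng.uniform(0, 1)       # e.g. discomfort discussing ACP
        self.facilitators = rng.uniform(0, 1)   # e.g. access to ACP materials

    def step(self):
        # Chance of advancing grows with facilitators, shrinks with barriers;
        # a small chance of relapsing grows with barriers (all rates assumed).
        p_forward = 0.10 + 0.15 * self.facilitators - 0.10 * self.barriers
        p_back = 0.02 + 0.05 * self.barriers
        u = rng.uniform()
        if u < max(p_forward, 0.0) and self.stage < len(STAGES) - 1:
            self.stage += 1
        elif u > 1.0 - p_back and self.stage > 0:
            self.stage -= 1

agents = [Agent() for _ in range(1000)]
for _ in range(52):                              # weekly steps over one year
    for a in agents:
        a.step()

counts = np.bincount([a.stage for a in agents], minlength=len(STAGES))
for name, c in zip(STAGES, counts):
    print(f"{name:16s} {c / len(agents):.2f}")
```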

  10. Advanced Control Synthesis for Reverse Osmosis Water Desalination Processes.

    PubMed

    Phuc, Bui Duc Hong; You, Sam-Sang; Choi, Hyeung-Six; Jeong, Seok-Kwon

    2017-11-01

    In this study, robust control synthesis has been applied to a reverse osmosis desalination plant whose product water flow and salinity are chosen as two controlled variables. The reverse osmosis process has been selected for study since it typically uses less energy than thermal distillation. The aim of the robust design is to overcome the limitation of classical controllers in dealing with large parametric uncertainties, external disturbances, sensor noises, and unmodeled process dynamics. The analyzed desalination process is modeled as a multi-input multi-output (MIMO) system with varying parameters. The control system is decoupled using a feedforward decoupling method to reduce the interactions between control channels. Both nominal and perturbed reverse osmosis systems have been analyzed using structured singular values for their stabilities and performances. Simulation results show that the system responses meet all the control requirements against various uncertainties. Finally, the reduced-order controller provides excellent robust performance, achieving decoupling, disturbance attenuation, and noise rejection. It can help to reduce the membrane cleanings, increase the robustness against uncertainties, and lower the energy consumption for process monitoring.
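
    The feedforward decoupling step mentioned above can be illustrated with a static decoupler for a 2x2 plant: premultiplying by the inverse of the steady-state gain matrix makes each loop see only its own channel at steady state. The gain matrix below is a made-up example, not the identified reverse osmosis plant.

```python
import numpy as np

# Assumed steady-state gain matrix G(0) of a 2x2 plant with controlled
# variables (permeate flow, permeate salinity) and two manipulated inputs.
G0 = np.array([[0.8, -0.3],     # d(flow)/d(u1),     d(flow)/d(u2)
               [0.2,  1.5]])    # d(salinity)/d(u1), d(salinity)/d(u2)

# Static decoupler: D = G(0)^-1 * diag(G(0)), so G(0) @ D is diagonal.
D = np.linalg.inv(G0) @ np.diag(np.diag(G0))

# Apparent plant seen by the two single-loop controllers after decoupling:
print(np.round(G0 @ D, 3))      # diagonal at steady state -> no loop interaction
```

    The paper's μ-synthesis design then addresses what a static decoupler cannot: dynamic interaction, parameter uncertainty, and noise.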

  11. Effects of variable practice on the motor learning outcomes in manual wheelchair propulsion.

    PubMed

    Leving, Marika T; Vegter, Riemer J K; de Groot, Sonja; van der Woude, Lucas H V

    2016-11-23

    Handrim wheelchair propulsion is a cyclic skill that needs to be learned during rehabilitation. It has been suggested that more variability in propulsion technique benefits the motor learning process of wheelchair propulsion. The purpose of this study was to determine the influence of variable practice on the motor learning outcomes of wheelchair propulsion in able-bodied participants. Variable practice was introduced in the form of wheelchair basketball practice and wheelchair-skill practice. Motor learning was operationalized as improvements in mechanical efficiency and propulsion technique. Eleven participants in the variable practice group and 12 participants in the control group performed an identical pre-test and a post-test. Pre- and post-test were performed in a wheelchair on a motor-driven treadmill (1.11 m/s) at a relative power output of 0.23 W/kg. Energy consumption and the propulsion technique variables with their respective coefficient of variation were calculated. Between the pre- and the post-test the variable practice group received 7 practice sessions. During the practice sessions participants performed one hour of variable practice, consisting of five wheelchair-skill tasks and a 30 min wheelchair basketball game. The control group did not receive any practice between the pre- and the post-test. Comparison of the pre- and the post-test showed that the variable practice group significantly improved the mechanical efficiency (4.5 ± 0.6% → 5.7 ± 0.7%) in contrast to the control group (4.5 ± 0.6% → 4.4 ± 0.5%) (group × time interaction effect p < 0.001). With regard to propulsion technique, both groups significantly reduced the push frequency and increased the contact angle of the hand with the handrim (within group, time effect). No significant group × time interaction effects were found for propulsion technique. With regard to propulsion variability, the variable practice group increased variability when compared to the control group (interaction effect p < 0.001). Compared to the control group, variable practice resulted in an increase in mechanical efficiency and increased variability. Interestingly, the large relative improvement in mechanical efficiency was concomitant with only moderate improvements in the propulsion technique, which were similar in the control group, suggesting that other factors besides propulsion technique contributed to the lower energy expenditure.

  12. Explaining the effects of an intervention designed to promote evidence-based diabetes care: a theory-based process evaluation of a pragmatic cluster randomised controlled trial

    PubMed Central

    Francis, Jillian J; Eccles, Martin P; Johnston, Marie; Whitty, Paula; Grimshaw, Jeremy M; Kaner, Eileen FS; Smith, Liz; Walker, Anne

    2008-01-01

    Background The results of randomised controlled trials can be usefully illuminated by studies of the processes by which they achieve their effects. The Theory of Planned Behaviour (TPB) offers a framework for conducting such studies. This study used TPB to explore the observed effects in a pragmatic cluster randomised controlled trial of a structured recall and prompting intervention to increase evidence-based diabetes care that was conducted in three Primary Care Trusts in England. Methods All general practitioners and nurses in practices involved in the trial were sent a postal questionnaire at the end of the intervention period, based on the TPB (predictor variables: attitude; subjective norm; perceived behavioural control, or PBC). It focussed on three clinical behaviours recommended in diabetes care: measuring blood pressure; inspecting feet; and prescribing statins. Multivariate analyses of variance and multiple regression analyses were used to explore changes in cognitions and thereby better understand trial effects. Results Fifty-nine general medical practitioners and 53 practice nurses (intervention: n = 55, 41.98% of trial participants; control: n = 57, 38.26% of trial participants) completed the questionnaire. There were no differences between groups in mean scores for attitudes, subjective norms, PBC or intentions. Control group clinicians had 'normatively-driven' intentions (i.e., related to subjective norm scores), whereas intervention group clinicians had 'attitudinally-driven' intentions (i.e., related to attitude scores) for foot inspection and statin prescription. After controlling for effects of the three predictor variables, this group difference was significant for foot inspection behaviour (trial group × attitude interaction, beta = 0.72, p < 0.05; trial group × subjective norm interaction, beta = -0.65, p < 0.05). Conclusion Attitudinally-driven intentions are proposed to be more consistently translated into action than normatively-driven intentions. This proposition was supported by the findings, thus offering an interpretation of the trial effects. This analytic approach demonstrates the potential of the TPB to explain trial effects in terms of different relationships between variables rather than differences in mean scores. This study illustrates the use of theory-based process evaluation to uncover processes underlying change in implementation trials. PMID:19019242

  13. Contribution au developpement d'une methode de controle des procedes dans une usine de bouletage

    NASA Astrophysics Data System (ADS)

    Gosselin, Claude

    This thesis, a collaborative effort between Ecole de technologie superieure and ArcelorMittal Company, presents the development of a methodology for monitoring and quality control of multivariable industrial production processes. This innovation research mandate was developed at ArcelorMittal Exploitation Miniere (AMEM) pellet plant in Port-Cartier (Quebec, Canada). With this undertaking, ArcelorMittal is striving to maintain its world class level of excellence and continues to pursue initiatives that can augment its competitive advantage worldwide. The plant's gravimetric classification process was retained as a prototype and development laboratory due to its effect on the company's competitiveness and its impact on subsequent steps leading to final production of iron oxide pellets. Concretely, the development of this expertise in process control and in situ monitoring will establish a firm basic knowledge in the fields of complex system physical modeling, data reconciliation, statistical observers, multivariate command and quality control using real-time monitoring of the desirability function. The hydraulic classifier is mathematically modeled. Using planned disturbances on the production line, an identification procedure was established to provide empirical estimations of the model's structural parameters. A new sampling campaign and a previously unpublished data collection and consolidation policy were implemented plant-wide. Access to these invaluable data sources has enabled the establishment of new thresholds that govern the production process and its control. Finally, as a substitute for the traditional quality control process, we have implemented a new strategy based on the use of the desirability function. Our innovation is not in using this function as an indicator of overall (economic) satisfaction in the production process, but rather in proposing it as an "observer" of the system's state. The first implementation steps have already demonstrated the method's feasibility as well as other numerous industrial impacts on production processes within the company. Namely, the emergence of the economical aspect as a strategic variable that assures better governance of production processes where quality variables present strategic issues.
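
    The desirability-as-observer idea can be sketched with a Derringer-Suich-type transform: each monitored quality variable is mapped onto [0, 1] and the geometric mean gives a single scalar indicator of the process state. The variables, targets, and bounds below are illustrative placeholders, not the plant's actual specifications.

```python
import numpy as np

def desirability_target(y, low, target, high, s=1.0, t=1.0):
    """Two-sided desirability: 1 at the target, 0 outside [low, high]."""
    y = np.asarray(y, dtype=float)
    d = np.zeros_like(y)
    left = (y >= low) & (y <= target)
    right = (y > target) & (y <= high)
    d[left] = ((y[left] - low) / (target - low)) ** s
    d[right] = ((high - y[right]) / (high - target)) ** t
    return d

# Hypothetical measurements of two quality variables of the classifier product.
particle_size = np.array([48.0, 52.0, 61.0])   # e.g. % passing a reference sieve
iron_grade = np.array([66.5, 65.8, 64.9])      # e.g. % Fe

d1 = desirability_target(particle_size, low=40, target=50, high=60)
d2 = desirability_target(iron_grade, low=64, target=66, high=68)
overall = (d1 * d2) ** 0.5                     # geometric mean = overall desirability
print(np.round(overall, 3))                    # drops toward 0 as quality degrades
```

    Tracking this scalar in real time is what lets the desirability act as an observer of the process state rather than only as an end-of-line economic score.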

  14. Robust parameter design for automatically controlled systems and nanostructure synthesis

    NASA Astrophysics Data System (ADS)

    Dasgupta, Tirthankar

    2007-12-01

    This research focuses on developing comprehensive frameworks for robust parameter design methodology for dynamic systems with automatic control and for synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, at large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a Multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations. The second part of the research deals with development of an experimental design methodology, tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design called Sequential Minimum Energy Design (SMED) is proposed for exploring the best process conditions for the synthesis of nanowires. The SMED is a novel approach to generate sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
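
    The two-step idea for the nanostructure part (fit a multinomial model of morphology probabilities, then search for settings that keep the desired-morphology probability high and insensitive to process-variable noise) can be sketched as follows. The data, classes, noise levels, and the use of scikit-learn's multinomial logistic regression in place of the thesis's iterative GLM algorithm are all assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated experiments: two scaled process variables (e.g. temperature,
# pressure) and three possible morphology classes (all values invented).
rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(300, 2))
logits = np.c_[3 * X[:, 0] - 2 * X[:, 1], 2 * X[:, 1] - 1, np.zeros(len(X))]
y = np.array([rng.choice(3, p=np.exp(l) / np.exp(l).sum()) for l in logits])

# Multinomial (softmax) logistic regression in recent scikit-learn releases.
model = LogisticRegression(max_iter=1000).fit(X, y)

def robust_score(x, noise=0.05, n=500, target_class=0):
    """Mean probability of the target morphology under process-variable noise."""
    perturbed = x + rng.normal(0, noise, size=(n, 2))
    return model.predict_proba(perturbed)[:, target_class].mean()

# Monte Carlo search for a setting that is both high-probability and robust.
candidates = rng.uniform(0, 1, size=(200, 2))
best = max(candidates, key=robust_score)
print("robust setting:", np.round(best, 2), "score:", round(robust_score(best), 3))
```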

  15. Patterned wafer geometry grouping for improved overlay control

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Woo, Jaeson; Park, Junbeom; Song, Changrock; Anis, Fatima; Vukkadala, Pradeep; Jeon, Sanghuck; Choi, DongSub; Huang, Kevin; Heo, Hoyoung; Smith, Mark D.; Robinson, John C.

    2017-03-01

    Process-induced overlay errors from outside the litho cell have become a significant contributor to the overlay error budget including non-uniform wafer stress. Previous studies have shown the correlation between process-induced stress and overlay and the opportunity for improvement in process control, including the use of patterned wafer geometry (PWG) metrology to reduce stress-induced overlay signatures. Key challenges of volume semiconductor manufacturing are how to improve not only the magnitude of these signatures, but also the wafer to wafer variability. This work involves a novel technique of using PWG metrology to provide improved litho-control by wafer-level grouping based on incoming process induced overlay, relevant for both 3D NAND and DRAM. Examples shown in this study are from 19 nm DRAM manufacturing.

  16. High-volume manufacturing device overlay process control

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Woo, Jaeson; Lee, DongYoung; Song, ChangRock; Heo, Hoyoung; Brinster, Irina; Choi, DongSub; Robinson, John C.

    2017-03-01

    Overlay control based on DI metrology of optical targets has been the primary basis for run-to-run process control for many years. In previous work we described a scenario where optical overlay metrology is performed on metrology targets on a high frequency basis including every lot (or most lots) at DI. SEM-based FI metrology is performed on-device, in-die, as-etched on an infrequent basis. Hybrid control schemes of this type have been in use for many process nodes. What is new is the relative size of the NZO as compared to the overlay spec, and the need to find more comprehensive solutions to characterize and control the size and variability of NZO at the 1x nm node: sampling, modeling, temporal frequency and control aspects, as well as trade-offs between SEM throughput and accuracy.

  17. Ultrasonics and Optics Would Control Shot Size

    NASA Technical Reports Server (NTRS)

    Morrison, A. D.

    1983-01-01

    Feedback system assures production of silicon shot of uniform size. Breakup of silicon stream into drops is controlled, in part, by varying frequency of vibrations imparted to stream by ultrasonic transducer. Drop size monitored by photodetector. Control method particularly advantageous in that constant size is maintained even while other process variables are changed deliberately or inadvertently. Applicable to materials other than silicon.

  18. The IRMIS object model and services API.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, C.; Dohan, D. A.; Arnold, N. D.

    2005-01-01

    The relational model developed for the Integrated Relational Model of Installed Systems (IRMIS) toolkit has been successfully used to capture the Advanced Photon Source (APS) control system software (EPICS process variables and their definitions). The relational tables are populated by a crawler script that parses each Input/Output Controller (IOC) start-up file when an IOC reboot is detected. User interaction is provided by a Java Swing application that acts as a desktop for viewing the process variable information. Mapping between the display objects and the relational tables was carried out with the Hibernate Object-Relational Mapping (ORM) framework. Work is well underway at the APS to extend the relational modeling to include control system hardware. For this work, due in part to the complex user interaction required, the primary application development environment has shifted from the relational database view to the object oriented (Java) perspective. With this approach, the business logic is executed in Java rather than in SQL stored procedures. This paper describes the object model used to represent control system software, hardware, and interconnects in IRMIS. We also describe the services API used to encapsulate the required behaviors for creating and maintaining the complex data. In addition to the core schema and object model, many important concepts in IRMIS are captured by the services API. IRMIS is an ambitious collaborative effort for defining and developing a relational database and associated applications to comprehensively document the large and complex EPICS-based control systems of today's accelerators. The documentation effort includes process variables, control system hardware, and interconnections. The approach could also be used to document all components of the accelerator, including mechanical, vacuum, power supplies, etc. One key aspect of IRMIS is that it is a documentation framework, not a design and development tool. We do not generate EPICS control system configurations from IRMIS, and hence do not impose any additional requirements on EPICS developers.

  19. Stochastic state-space temperature regulation of biochar production. Part I: Theoretical development.

    PubMed

    Cantrell, Keri B; Martin, Jerry H

    2012-02-01

    The concept of a designer biochar that targets the improvement of a specific soil property imposes the need for production processes to generate biochars with both high consistency and quality. These important production parameters can be affected by variations in process temperature that must be taken into account when controlling the pyrolysis of agricultural residues such as manures and other feedstocks. A novel stochastic state-space temperature regulator was developed to accurately match biochar batch production to a defined temperature input schedule. This was accomplished by describing the system's state-space with five temperature variables--four directly measured and one change in temperature. Relationships were derived between the observed state and the desired, controlled state. When testing the unit at two different temperatures, the actual pyrolytic temperature was within 3 °C of the control with no overshoot. This state-space regulator simultaneously controlled the indirect heat source and sample temperature by employing difficult-to-measure variables such as temperature stability in the description of the pyrolysis system's state-space. These attributes make a state-space controller an optimum control scheme for the production of a predictable, repeatable designer biochar. Published 2011 by John Wiley & Sons, Ltd.
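
    A generic discrete state-space regulator of the kind described here can be sketched with a linear temperature model and a quadratic-cost state-feedback gain. The model matrices, weights, and setpoint below are illustrative assumptions, not the identified pyrolysis unit or the stochastic formulation of the paper.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Assumed linear model x[k+1] = A x[k] + B u[k] for three coupled temperature
# states (heater, wall, sample); the heat input acts mainly on the heater node.
A = np.array([[0.95, 0.03, 0.00],
              [0.02, 0.96, 0.02],
              [0.00, 0.04, 0.94]])
B = np.array([[0.10], [0.02], [0.00]])
Q = np.diag([1.0, 1.0, 10.0])            # penalize sample-temperature error most
R = np.array([[0.01]])

# Discrete LQR gain from the algebraic Riccati equation.
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Simulate in deviation coordinates: e = temperature - setpoint schedule value.
e = np.array([-330.0, -330.0, -330.0])   # start 330 deg C below a 350 deg C target
for k in range(200):
    u = -K @ e                           # state feedback drives e toward zero
    e = A @ e + B @ u
print("final deviation from setpoint (deg C):", np.round(e, 1))
```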

  20. Using a statistical process control chart during the quality assessment of cancer registry data.

    PubMed

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found that the overall variation of the SS2000 variable was in control during the 2001 and 2002 diagnosis years, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, the regional stage exceeded the UCL in 2004, and the distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
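
    The p-chart logic behind this report is easy to reproduce: yearly proportions of cases in a stage category are compared against three-sigma limits around the pooled proportion. The yearly counts below are invented so that the pattern matches the description (in control in 2001-2002, beyond the LCL in 2003, beyond the UCL in 2004); they are not the registry's actual figures.

```python
import numpy as np

years = [2001, 2002, 2003, 2004]
n = np.array([6200, 6350, 6400, 6698])        # yearly caseloads (assumed)
x = np.array([2510, 2560, 2470, 2930])        # cases in the stage category (assumed)

p = x / n                                     # yearly proportion
p_bar = x.sum() / n.sum()                     # pooled proportion (center line)
sigma = np.sqrt(p_bar * (1 - p_bar) / n)      # binomial standard error per year
ucl, lcl = p_bar + 3 * sigma, p_bar - 3 * sigma

for yr, pi, lo, hi in zip(years, p, lcl, ucl):
    flag = "out of control" if (pi > hi or pi < lo) else "in control"
    print(f"{yr}: p={pi:.3f}  limits=({lo:.3f}, {hi:.3f})  {flag}")
```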

  1. Investigation of multidimensional control systems in the state space and wavelet medium

    NASA Astrophysics Data System (ADS)

    Fedosenkov, D. B.; Simikova, A. A.; Fedosenkov, B. A.

    2018-05-01

    The notions of “one-dimensional-point” and “multidimensional-point” automatic control systems are introduced. To demonstrate the joint use of approaches based on the concepts of state space and wavelet transforms, a method for optimal control in a state-space medium represented in the form of time-frequency representations (maps) is considered. The computer-aided control system is formed on the basis of the similarity transformation method, which makes it possible to exclude the use of reduced state-variable observers. 1D material-flow signals formed by primary transducers are converted by means of wavelet transformations into multidimensional, concentrated-at-a-point variables in the form of time-frequency distributions of Cohen’s class. The algorithm for synthesizing a stationary controller for feeding processes is given. It is concluded that forming an optimal control law with time-frequency distributions available improves the quality of transient processes in the feeding subsystems and the mixing unit. The efficiency of the presented method is illustrated by an example of the current registration of material flows in the multi-feeding unit.

  2. Experience with compound words influences their processing: An eye movement investigation with English compound words.

    PubMed

    Juhasz, Barbara J

    2016-11-14

    Recording eye movements provides information on the time-course of word recognition during reading. Juhasz and Rayner [Juhasz, B. J., & Rayner, K. (2003). Investigating the effects of a set of intercorrelated variables on eye fixation durations in reading. Journal of Experimental Psychology: Learning, Memory and Cognition, 29, 1312-1318] examined the impact of five word recognition variables, including familiarity and age-of-acquisition (AoA), on fixation durations. All variables impacted fixation durations, but the time-course differed. However, the study focused on relatively short, morphologically simple words. Eye movements are also informative for examining the processing of morphologically complex words such as compound words. The present study further examined the time-course of lexical and semantic variables during morphological processing. A total of 120 English compound words that varied in familiarity, AoA, semantic transparency, lexeme meaning dominance, sensory experience rating (SER), and imageability were selected. The impact of these variables on fixation durations was examined when length, word frequency, and lexeme frequencies were controlled in a regression model. The most robust effects were found for familiarity and AoA, indicating that a reader's experience with compound words significantly impacts compound recognition. These results provide insight into semantic processing of morphologically complex words during reading.

  3. Advances in photonic MOEMS-MEMS device thinning and polishing

    NASA Astrophysics Data System (ADS)

    McAneny, James J.; Kennedy, Mark; McGroggan, Tom

    2010-02-01

    As devices continue to increase in density and complexity, ever more stringent specifications are placed on the wafer scale equipment manufacturers to produce higher quality and higher output. This results in greater investment and more resource being diverted into producing tools and processes which can meet the latest demanding criteria. Substrate materials employed in the fabrication process range from Silicon through InP and include GaAs, InSb and other optical networking or waveguide materials. With this diversity of substrate materials presented, controlling the geometries and surfaces grows progressively more challenging. This article highlights the key parameters which require close monitoring and control in order to produce highly precise wafers as part of the fabrication process. Several as cut and commercially available standard polished wafer materials were used in empirical trials to test tooling options in generating high levels of geometric control over the dimensions while producing high quality surface finishes. Specific attention was given to the measurement and control of: flatness; parallelism/TTV; surface roughness and final target thickness as common specifications required by the industry. By combining the process variables of: plate speed, download pressure, slurry flow rate and concentration, pad type and wafer travel path across the polish pad, the effect of altering these variables was recorded and analysed to realize the optimum process conditions for the materials under test. The results being then used to design improved methods and tooling for the thinning and polishing of photonic materials applied to MOEMS-MEMS device fabrication.

  4. Static and Dynamic Aeroelastic Tailoring With Variable Camber Control

    NASA Technical Reports Server (NTRS)

    Stanford, Bret K.

    2016-01-01

    This paper examines the use of a Variable Camber Continuous Trailing Edge Flap (VCCTEF) system for aeroservoelastic optimization of a transport wingbox. The quasisteady and unsteady motions of the flap system are utilized as design variables, along with patch-level structural variables, towards minimizing wingbox weight via maneuver load alleviation and active flutter suppression. The resulting system is, in general, very successful at removing structural weight in a feasible manner. Limitations to this success are imposed by including load cases where the VCCTEF system is not active (open-loop) in the optimization process, and also by including actuator operating cost constraints.

  5. Multivariable model predictive control design of reactive distillation column for Dimethyl Ether production

    NASA Astrophysics Data System (ADS)

    Wahid, A.; Putra, I. G. E. P.

    2018-03-01

    Dimethyl ether (DME), an alternative clean energy carrier, has attracted growing attention in recent years. DME production via reactive distillation has potential for capital cost and energy requirement savings. However, combining reaction and distillation in a single column makes the reactive distillation process a very complex multivariable system with high non-linearity and strong interaction between process variables. This study investigates a multivariable model predictive control (MPC) scheme based on a two-point temperature control strategy for the DME reactive distillation column to maintain the purities of both product streams. The process model is estimated by a first order plus dead time model. The DME and water purities are maintained by controlling a stage temperature in the rectifying and stripping sections, respectively. The results show that the model predictive controller produced faster responses than the conventional PI controller, as indicated by smaller ISE values. In addition, the MPC controller is able to handle the loop interactions well.
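
    The first-order-plus-dead-time approximation used as the internal model can be written out directly; the step responses below illustrate one controlled temperature and one cross-coupling channel. The gains, time constants, and dead times are placeholders, not the identified column dynamics.

```python
import numpy as np

def fopdt_step(K, tau, theta, t):
    """Step response of the FOPDT transfer function K * exp(-theta*s) / (tau*s + 1)."""
    y = K * (1.0 - np.exp(-(t - theta) / tau))
    return np.where(t >= theta, y, 0.0)     # zero output before the dead time elapses

t = np.arange(0, 120, 1.0)                  # minutes
# e.g. rectifying-section stage temperature response to a reflux step (assumed)
y11 = fopdt_step(K=1.8, tau=12.0, theta=3.0, t=t)
# cross-coupling to the stripping-section temperature (assumed, weaker and slower)
y21 = fopdt_step(K=-0.6, tau=20.0, theta=5.0, t=t)
print("responses at t = 30 min:", round(y11[30], 3), round(y21[30], 3))
```

    A model predictive controller then uses a matrix of such step-response models, one per input-output pair, to predict and trade off the interacting temperature responses.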

  6. Implementation of "Quality by Design (QbD)" Approach for the Development of 5-Fluorouracil Loaded Thermosensitive Hydrogel.

    PubMed

    Dalwadi, Chintan; Patel, Gayatri

    2016-01-01

    The purpose of this study was to investigate the Quality by Design (QbD) principle for the preparation of hydrogel products, to prove both the practicability and utility of applying the QbD concept to hydrogel-based controlled release systems. Product and process understanding will help in decreasing the variability of critical material and process parameters, which gives a quality product output and reduces risk. This study includes the identification of the Quality Target Product Profiles (QTPPs) and Critical Quality Attributes (CQAs) from literature or preliminary studies. To identify and control the variability in process and material attributes, two QbD tools were utilized: Quality Risk Management (QRM) and experimental design. Further, it helps to identify the effect of these attributes on CQAs. Potential risk factors were identified from a fishbone diagram, screened by risk assessment, and optimized by a 3-level, 2-factor experimental design with center points in triplicate to analyze the precision of the target process. This optimized formulation was further characterized by gelling time, gelling temperature, rheological parameters, in-vitro biodegradation and in-vitro drug release. A design space was created using the experimental design tool, giving a control space; working within this controlled space reduces all the failure modes below the risk level. In conclusion, the QbD approach combined with the QRM tool provides an effective framework for building quality into the hydrogel.
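
    The screening design described here, a 3-level, 2-factor full factorial with the center point run in triplicate, can be laid out and fitted with a quadratic response-surface model as sketched below. The factor coding, simulated response, and coefficients are hypothetical.

```python
import itertools
import numpy as np

# 3-level, 2-factor full factorial (9 runs) plus two extra center-point runs,
# so the center point appears in triplicate (11 runs total).
levels = [-1, 0, 1]
design = list(itertools.product(levels, levels))
design += [(0, 0), (0, 0)]
X = np.array(design, dtype=float)

# Hypothetical response, e.g. gelation temperature (deg C), with noise.
rng = np.random.default_rng(7)
y = (32 + 2.5 * X[:, 0] - 1.5 * X[:, 1] + 1.0 * X[:, 0] * X[:, 1]
     - 0.8 * X[:, 0] ** 2 + rng.normal(0, 0.3, len(X)))

# Quadratic response-surface model: intercept, linear, interaction, squared terms.
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```

    The fitted surface is what defines the design space; settings whose predicted responses stay within the CQA limits form the control space referred to above.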

  7. Seismic Response Control Of Structures Using Semi-Active and Passive Variable Stiffness Devices

    NASA Astrophysics Data System (ADS)

    Salem, Mohamed M. A.

    Controllable devices such as Magneto-Rheological Fluid Dampers, Electro-Rheological Dampers, and controllable friction devices have been studied extensively with limited implementation in real structures. Such devices have shown great potential in reducing seismic demands, either as smart base isolation systems, or as smart devices for multistory structures. Although variable stiffness devices can be used for seismic control of structures, the vast majority of research effort has been given to the control of damping. The primary focus of this dissertation is to evaluate the seismic control of structures using semi-active and passive variable stiffness characteristics. Smart base isolation systems employing variable stiffness devices have been studied, and two semi-active control strategies are proposed. The control algorithms were designed to reduce the superstructure and base accelerations of seismically isolated structures subject to near-fault and far-field ground motions. Computational simulations of the proposed control algorithms on the benchmark structure have shown that excessive base displacements associated with the near-fault ground motions may be better mitigated with the use of variable stiffness devices. However, the device properties must be controllable to produce a wide range of stiffness changes for an effective control of the base displacements. The potential of controllable stiffness devices in limiting the base displacement due to near-fault excitation without compromising the performance of conventionally isolated structures, is illustrated. The application of passive variable stiffness devices for seismic response mitigation of multistory structures is also investigated. A stiffening bracing system (SBS) is proposed to replace the conventional bracing systems of braced frames. An optimization process for the SBS parameters has been developed. The main objective of the design process is to maintain a uniform inter-story drift angle over the building's height, which in turn would evenly distribute the seismic demand over the building. This behavior is particularly essential so that any possible damage is not concentrated in a single story. Furthermore, the proposed design ensures that additional damping devices distributed over the building's height work efficiently with their maximum design capacity, leading to a cost efficient design. An integrated and comprehensive design procedure that can be readily adopted by the current seismic design codes is proposed. An equivalent lateral force distribution is developed that shows a good agreement with the response history analyses in terms of seismic performance and demand prediction. This lateral force pattern explicitly accounts for the higher mode effect, the dynamic characteristics of the structure, the supplemental damping, and the site specific seismic hazard. Therefore, the proposed design procedure is considered as a standalone method for the design of SBS equipped buildings.

  8. A robust variable sampling time BLDC motor control design based upon μ-synthesis.

    PubMed

    Hung, Chung-Wen; Yen, Jia-Yush

    2013-01-01

    The variable sampling rate system is encountered in many applications. When the speed information is derived from the position marks along the trajectory, one would have a speed dependent sampling rate system. The conventional fixed or multisampling rate system theory may not work in these cases because the system dynamics include the uncertainties which resulted from the variable sampling rate. This paper derived a convenient expression for the speed dependent sampling rate system. The varying sampling rate effect is then translated into multiplicative uncertainties to the system. The design then uses the popular μ-synthesis process to achieve a robust performance controller design. The implementation on a BLDC motor demonstrates the effectiveness of the design approach.

  9. A Robust Variable Sampling Time BLDC Motor Control Design Based upon μ-Synthesis

    PubMed Central

    Yen, Jia-Yush

    2013-01-01

    The variable sampling rate system is encountered in many applications. When the speed information is derived from the position marks along the trajectory, one would have a speed dependent sampling rate system. The conventional fixed or multisampling rate system theory may not work in these cases because the system dynamics include the uncertainties which resulted from the variable sampling rate. This paper derived a convenient expression for the speed dependent sampling rate system. The varying sampling rate effect is then translated into multiplicative uncertainties to the system. The design then uses the popular μ-synthesis process to achieve a robust performance controller design. The implementation on a BLDC motor demonstrates the effectiveness of the design approach. PMID:24327804

  10. An Optimized Trajectory Planning for Welding Robot

    NASA Astrophysics Data System (ADS)

    Chen, Zhilong; Wang, Jun; Li, Shuting; Ren, Jun; Wang, Quan; Cheng, Qunchao; Li, Wentao

    2018-03-01

    In order to improve welding efficiency and quality, this paper studies the combined planning of welding parameters and spatial trajectory for a welding robot and proposes a trajectory planning method with high real-time performance, strong controllability, and small welding error. By adding a virtual joint at the end-effector, an appropriate virtual joint model is established and the welding process parameters are represented by the virtual joint variables. The trajectory planning is carried out in the robot joint space, which makes the control of the welding process parameters more intuitive and convenient. By using the virtual joint model combined with the affine invariance of B-spline curves, the welding process parameters are indirectly controlled by controlling the motion curves of the real joints. With minimum time as the objective, the welding process parameters and the joint-space trajectory are jointly optimized.
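
    The virtual-joint idea can be sketched by appending a process parameter (e.g., a scaled wire feed rate) as an extra coordinate of the way-points and fitting a single B-spline through all coordinates, so the process parameter is smoothed and planned together with the spatial path. The way-points and virtual-joint values below are made up for illustration.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Way-points along a weld seam: x, y, z in mm plus a "virtual joint" carrying
# a scaled process parameter such as wire feed rate (all values invented).
waypoints = np.array([
    [0.0,   0.0,  50.0, 0.2],
    [40.0,  10.0, 50.0, 0.5],
    [80.0,  25.0, 48.0, 0.8],
    [120.0, 30.0, 45.0, 0.6],
    [160.0, 30.0, 45.0, 0.3],
]).T                                   # shape (4 coordinates, 5 way-points)

# Fit one cubic B-spline through all four coordinates jointly (interpolating).
tck, u = splprep(waypoints, s=0.0, k=3)

# Sample the curve densely; the last coordinate is the planned process parameter.
uu = np.linspace(0.0, 1.0, 200)
x, y, z, feed = splev(uu, tck)
print("mid-path position (mm):", round(x[100], 1), round(y[100], 1), round(z[100], 1),
      "| virtual joint:", round(feed[100], 2))
```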

  11. Tangent linear super-parameterization: attributable, decomposable moist processes for tropical variability studies

    NASA Astrophysics Data System (ADS)

    Mapes, B. E.; Kelly, P.; Song, S.; Hu, I. K.; Kuang, Z.

    2015-12-01

    An economical 10-layer global primitive equation solver is driven by time-independent forcing terms, derived from a training process, to produce a realistic eddying basic state with a tracer q trained to act like water vapor mixing ratio. Within this basic state, linearized anomaly moist physics in the column are applied in the form of a 20x20 matrix. The control matrix was derived from the results of Kuang (2010, 2012) who fitted a linear response function from a cloud resolving model in a state of deep convecting equilibrium. By editing this matrix in physical space and eigenspace, scaling and clipping its action, and optionally adding terms for processes that do not conserve moist static energy (radiation, surface fluxes), we can decompose and explain the model's diverse moist process coupled variability. Rectified effects of this variability on the general circulation and climate, even in strictly zero-mean, centered anomaly physics cases, are also sometimes surprising.

  12. The variability puzzle in human memory.

    PubMed

    Kahana, Michael J; Aggarwal, Eash V; Phan, Tung D

    2018-04-26

    Memory performance exhibits a high level of variability from moment to moment. Much of this variability may reflect inadequately controlled experimental variables, such as word memorability, past practice and subject fatigue. Alternatively, stochastic variability in performance may largely reflect the efficiency of endogenous neural processes that govern memory function. To help adjudicate between these competing views, the authors conducted a multisession study in which subjects completed 552 trials of a delayed free-recall task. Applying a statistical model to predict variability in each subject's recall performance uncovered modest effects of word memorability, proactive interference, and other variables. In contrast to the limited explanatory power of these experimental variables, performance on the prior list strongly predicted current list recall. These findings suggest that endogenous factors underlying successful encoding and retrieval drive variability in performance. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. [Application of quality by design in granulation process for ginkgo leaf tablet (Ⅱ): identification of critical quality attributes].

    PubMed

    Xu, Bing; Cui, Xiang-Long; Yang, Chan; Wang, Xin; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    Quality by design (QbD) highlights the concept of "begin with the end", which means to thoroughly understand the target product quality first, and then guide pharmaceutical process development and quality control throughout the whole manufacturing process. In this paper, the Ginkgo biloba granules intermediates were taken as the research object, and the requirements of the tensile strength of tablets were treated as the goals to establish the methods for identification of granules' critical quality attributes (CQAs) and establishment of CQAs' limits. Firstly, the orthogonal partial least square (OPLS) model was adopted to build the relationship between the micromeritic properties of 29 batches of granules and the tensile strength of ginkgo leaf tablets, and thereby the potential critical quality attributes (pCQAs) were screened by variable importance in the projection (VIP) indexes. Then, a series of OPLS models were rebuilt by reducing pCQAs variables one by one in view of the rule of VIP values from low to high in sequence. The model performance results demonstrated that calibration and predictive performance of the model had no decreasing trend after variables reduction. In consideration of the results from variables selection as well as the collinearity test and testability of the pCQAs, the median particle size (D₅₀) and the bulk density (Da) were identified as critical quality attributes (CQAs). The design space of CQAs was developed based on a multiple linear regression model established between the CQAs (D₅₀ and Da) and the tensile strength. The control constraints of the CQAs were determined as 170 μm< D₅₀<500 μm and 0.30 g•cm⁻³
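
    The pCQA screening step can be approximated with an ordinary PLS regression plus the standard VIP formula (the paper used an OPLS model, so plain PLS serves as a stand-in here). The simulated granule properties, response, and component count below are assumptions for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Simulated data: 29 batches, 8 granule micromeritic properties (X) and tablet
# tensile strength (y); two of the properties are made truly influential.
rng = np.random.default_rng(8)
n, p = 29, 8
X = rng.normal(size=(n, p))
y = 1.2 * X[:, 0] - 0.8 * X[:, 3] + rng.normal(0, 0.3, n)

pls = PLSRegression(n_components=2).fit(X, y)

def vip_scores(pls, X):
    """Variable Importance in Projection for a fitted single-response PLS model."""
    t = pls.transform(X)                       # X scores, shape (n, a)
    w = pls.x_weights_                         # X weights, shape (p, a)
    q = pls.y_loadings_                        # y loadings, shape (1, a)
    ssy = np.sum(t ** 2, axis=0) * q.ravel() ** 2      # y variance explained per component
    wnorm = (w / np.linalg.norm(w, axis=0)) ** 2
    return np.sqrt(X.shape[1] * (wnorm @ ssy) / ssy.sum())

print("VIP scores:", np.round(vip_scores(pls, X), 2))   # values > 1 flag potential CQAs
```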

  14. Age-related decline in cognitive control: the role of fluid intelligence and processing speed

    PubMed Central

    2014-01-01

    Background Research on cognitive control suggests an age-related decline in proactive control abilities whereas reactive control seems to remain intact. However, the reason of the differential age effect on cognitive control efficiency is still unclear. This study investigated the potential influence of fluid intelligence and processing speed on the selective age-related decline in proactive control. Eighty young and 80 healthy older adults were included in this study. The participants were submitted to a working memory recognition paradigm, assessing proactive and reactive cognitive control by manipulating the interference level across items. Results Repeated measures ANOVAs and hierarchical linear regressions indicated that the ability to appropriately use cognitive control processes during aging seems to be at least partially affected by the amount of available cognitive resources (assessed by fluid intelligence and processing speed abilities). Conclusions This study highlights the potential role of cognitive resources on the selective age-related decline in proactive control, suggesting the importance of a more exhaustive approach considering the confounding variables during cognitive control assessment. PMID:24401034

  15. Speed but not amplitude of visual feedback exacerbates force variability in older adults.

    PubMed

    Kim, Changki; Yacoubi, Basma; Christou, Evangelos A

    2018-06-23

    Magnification of visual feedback (VF) impairs force control in older adults. In this study, we aimed to determine whether the age-associated increase in force variability with magnification of visual feedback is a consequence of increased amplitude or speed of visual feedback. Seventeen young and 18 older adults performed a constant isometric force task with the index finger at 5% of MVC. We manipulated the vertical (force gain) and horizontal (time gain) aspect of the visual feedback so participants performed the task with the following VF conditions: (1) high amplitude-fast speed; (2) low amplitude-slow speed; (3) high amplitude-slow speed. Changing the visual feedback from low amplitude-slow speed to high amplitude-fast speed increased force variability in older adults but decreased it in young adults (P < 0.01). Changing the visual feedback from low amplitude-slow speed to high amplitude-slow speed did not alter force variability in older adults (P > 0.2), but decreased it in young adults (P < 0.01). Changing the visual feedback from high amplitude-slow speed to high amplitude-fast speed increased force variability in older adults (P < 0.01) but did not alter force variability in young adults (P > 0.2). In summary, increased force variability in older adults with magnification of visual feedback was evident only when the speed of visual feedback increased. Thus, we conclude that in older adults deficits in the rate of processing visual information and not deficits in the processing of more visual information impair force control.

  16. Increased Intra-Participant Variability in Children with Autistic Spectrum Disorders: Evidence from Single-Trial Analysis of Evoked EEG

    PubMed Central

    Milne, Elizabeth

    2011-01-01

    Intra-participant variability in clinical conditions such as autistic spectrum disorder (ASD) is an important indicator of pathophysiological processing. The data reported here illustrate that trial-by-trial variability can be reliably measured from EEG, and that intra-participant EEG variability is significantly greater in those with ASD than in neuro-typical matched controls. EEG recorded at the scalp is a linear mixture of activity arising from muscle artifacts and numerous concurrent brain processes. To minimize these additional sources of variability, EEG data were subjected to two different methods of spatial filtering. (i) The data were decomposed using infomax independent component analysis, a method of blind source separation which un-mixes the EEG signal into components with maximally independent time-courses, and (ii) a surface Laplacian transform was performed (current source density interpolation) in order to reduce the effects of volume conduction. Data are presented from 13 high functioning adolescents with ASD without co-morbid ADHD, and 12 neuro-typical age-, IQ-, and gender-matched controls. Comparison of variability between the ASD and neuro-typical groups indicated that intra-participant variability of P1 latency and P1 amplitude was greater in the participants with ASD, and inter-trial α-band phase coherence was lower in the participants with ASD. These data support the suggestion that individuals with ASD are less able to synchronize the activity of stimulus-related cell assemblies than neuro-typical individuals, and provide empirical evidence in support of theories of increased neural noise in ASD. PMID:21716921
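
    A minimal sketch of the two spatial-filtering steps described above, assuming epoched EEG held in an MNE-Python Epochs object; the file name and component count are hypothetical placeholders rather than the study's actual settings.

      import mne
      from mne.preprocessing import ICA, compute_current_source_density

      epochs = mne.read_epochs("subject01-epo.fif")        # hypothetical epoched EEG file

      # (i) infomax ICA: un-mix the EEG into maximally independent component time courses
      ica = ICA(n_components=20, method="infomax", random_state=0)
      ica.fit(epochs)
      sources = ica.get_sources(epochs)                    # per-trial component activations

      # (ii) surface Laplacian (current source density) to reduce volume conduction
      epochs_csd = compute_current_source_density(epochs)

      # trial-by-trial variability, e.g. the inter-trial standard deviation per channel/time point
      data = epochs_csd.get_data()                         # (n_trials, n_channels, n_times)
      intertrial_sd = data.std(axis=0)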

  17. Automatic Processing of Reactive Polymers

    NASA Technical Reports Server (NTRS)

    Roylance, D.

    1985-01-01

    A series of process modeling computer codes were examined. The codes use finite element techniques to determine the time-dependent process parameters operative during nonisothermal reactive flows such as can occur in reaction injection molding or composites fabrication. The use of these analytical codes to perform experimental control functions is examined; since the models can determine the state of all variables everywhere in the system, they can be used in a manner similar to currently available experimental probes. A small but well instrumented reaction vessel in which fiber-reinforced plaques are cured using computer control and data acquisition was used. The finite element codes were also extended to treat this particular process.

  18. Synaptic dynamics contribute to long-term single neuron response fluctuations.

    PubMed

    Reinartz, Sebastian; Biro, Istvan; Gal, Asaf; Giugliano, Michele; Marom, Shimon

    2014-01-01

    Firing rate variability at the single neuron level is characterized by long-memory processes and complex statistics over a wide range of time scales (from milliseconds up to several hours). Here, we focus on the contribution of the non-stationary efficacy of the ensemble of synapses activated in response to a given stimulus to single neuron response variability. We present and validate a method tailored for controlled and specific long-term activation of a single cortical neuron in vitro via synaptic or antidromic stimulation, enabling a clear separation between two determinants of neuronal response variability: membrane excitability dynamics vs. synaptic dynamics. Applying this method we show that, within the range of physiological activation frequencies, the synaptic ensemble of a given neuron is a key contributor to the neuronal response variability, long-memory processes and complex statistics observed over extended time scales. Synaptic transmission dynamics affect response variability at stimulation rates substantially lower than those that drive excitability resources to fluctuate. Implications for network-embedded neurons are discussed.

  19. Intrinsic movement variability at work. How long is the path from motor control to design engineering?

    PubMed

    Gaudez, C; Gilles, M A; Savin, J

    2016-03-01

    For several years, increasing numbers of studies have highlighted the existence of movement variability. Before that, it was neglected in movement analysis, and it is still almost completely ignored in workstation design. This article reviews motor control theories and factors influencing movement execution, and indicates how intrinsic movement variability is part of task completion. These background clarifications should help ergonomists and workstation designers to gain a better understanding of these concepts, which can then be used to improve design tools. We also question which techniques (kinematics, kinetics or muscular activity) and descriptors are most appropriate for describing intrinsic movement variability and for integration into design tools. In this way, simulations generated by designers for workstation design should be closer to the real movements performed by workers. This review emphasises the complexity of identifying, describing and processing intrinsic movement variability in occupational activities. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  20. Effective reinforcement learning following cerebellar damage requires a balance between exploration and motor noise.

    PubMed

    Therrien, Amanda S; Wolpert, Daniel M; Bastian, Amy J

    2016-01-01

    Reinforcement and error-based processes are essential for motor learning, with the cerebellum thought to be required only for the error-based mechanism. Here we examined learning and retention of a reaching skill under both processes. Control subjects learned similarly from reinforcement and error-based feedback, but showed much better retention under reinforcement. To apply reinforcement to cerebellar patients, we developed a closed-loop reinforcement schedule in which task difficulty was controlled based on recent performance. This schedule produced substantial learning in cerebellar patients and controls. Cerebellar patients varied in their learning under reinforcement but fully retained what was learned. In contrast, they showed complete lack of retention in error-based learning. We developed a mechanistic model of the reinforcement task and found that learning depended on a balance between exploration variability and motor noise. While the cerebellar and control groups had similar exploration variability, the patients had greater motor noise and hence learned less. Our results suggest that cerebellar damage indirectly impairs reinforcement learning by increasing motor noise, but does not interfere with the reinforcement mechanism itself. Therefore, reinforcement can be used to learn and retain novel skills, but optimal reinforcement learning requires a balance between exploration variability and motor noise. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain.

  1. Effective reinforcement learning following cerebellar damage requires a balance between exploration and motor noise

    PubMed Central

    Therrien, Amanda S.; Wolpert, Daniel M.

    2016-01-01

    Abstract See Miall and Galea (doi: 10.1093/awv343 ) for a scientific commentary on this article. Reinforcement and error-based processes are essential for motor learning, with the cerebellum thought to be required only for the error-based mechanism. Here we examined learning and retention of a reaching skill under both processes. Control subjects learned similarly from reinforcement and error-based feedback, but showed much better retention under reinforcement. To apply reinforcement to cerebellar patients, we developed a closed-loop reinforcement schedule in which task difficulty was controlled based on recent performance. This schedule produced substantial learning in cerebellar patients and controls. Cerebellar patients varied in their learning under reinforcement but fully retained what was learned. In contrast, they showed complete lack of retention in error-based learning. We developed a mechanistic model of the reinforcement task and found that learning depended on a balance between exploration variability and motor noise. While the cerebellar and control groups had similar exploration variability, the patients had greater motor noise and hence learned less. Our results suggest that cerebellar damage indirectly impairs reinforcement learning by increasing motor noise, but does not interfere with the reinforcement mechanism itself. Therefore, reinforcement can be used to learn and retain novel skills, but optimal reinforcement learning requires a balance between exploration variability and motor noise. PMID:26626368
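
    A minimal sketch of a closed-loop reinforcement schedule of the kind described above: a binary reward window is widened or narrowed according to the success rate over recent trials, so that task difficulty tracks recent performance. The window size, target success rate, and update step are illustrative assumptions, not the authors' published parameters.

      from collections import deque

      def closed_loop_schedule(trial_errors, target_rate=0.65, window=10,
                               init_width=2.0, step=0.1, min_width=0.2):
          """trial_errors: absolute reach errors (e.g. degrees), one per trial."""
          width = init_width                     # current reward window (task difficulty)
          recent = deque(maxlen=window)          # rolling record of recent successes
          rewarded = []
          for err in trial_errors:
              success = abs(err) <= width        # binary reinforcement only, no error vector
              rewarded.append(success)
              recent.append(success)
              if len(recent) == window:
                  rate = sum(recent) / window
                  # widen the window when the learner struggles, narrow it when they succeed
                  width = max(width + (step if rate < target_rate else -step), min_width)
          return rewarded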

  2. Counter-propagation network with variable degree variable step size LMS for single switch typing recognition.

    PubMed

    Yang, Cheng-Huei; Luo, Ching-Hsing; Yang, Cheng-Hong; Chuang, Li-Yeh

    2004-01-01

    Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, including mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as a communication adaptive device for disabled persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool. This restriction is a major hindrance. Therefore, a switch adaptive automatic recognition method with a high recognition rate is needed. The proposed system combines counter-propagation networks with a variable degree variable step size LMS algorithm. It is divided into five stages: space recognition, tone recognition, learning process, adaptive processing, and character recognition. Statistical analyses demonstrated that the proposed method elicited a better recognition rate in comparison to alternative methods in the literature.
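
    A minimal sketch of a variable step size LMS update from the same general family as the algorithm named above (the paper's specific variable degree variable step size rule is not reproduced; the error-energy-driven step-size adaptation shown here is a common illustrative choice, and all constants are assumptions).

      import numpy as np

      def vss_lms(x, d, order=4, mu_init=0.05, alpha=0.97, gamma=0.01,
                  mu_min=1e-4, mu_max=0.5):
          """Adapt filter weights w so that w . x approximates the desired signal d."""
          n = len(x)
          w = np.zeros(order)
          mu = mu_init
          y = np.zeros(n)
          e = np.zeros(n)
          for k in range(order, n):
              xk = x[k - order:k][::-1]                    # most recent samples first
              y[k] = w @ xk
              e[k] = d[k] - y[k]
              # step size grows with recent error energy and shrinks as the error dies down
              mu = float(np.clip(alpha * mu + gamma * e[k] ** 2, mu_min, mu_max))
              w = w + 2.0 * mu * e[k] * xk                 # standard LMS weight update
          return w, y, e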

  3. On the synergistic use of microwave and infrared satellite observations to monitor soil moisture and flooding

    USDA-ARS?s Scientific Manuscript database

    Extreme hydrological processes are often very dynamic and destructive. A better understanding of these processes requires an accurate mapping of key variables that control them. In this regard, soil moisture is perhaps the most important parameter that impacts the magnitude of flooding events as it c...

  4. Direct Evidence for a Dual Process Model of Deductive Inference

    ERIC Educational Resources Information Center

    Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie

    2013-01-01

    In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences…

  5. Relation of Childhood Worry to Information-Processing Factors in an Ethnically Diverse Community Sample

    ERIC Educational Resources Information Center

    Suarez-Morales, Lourdes; Bell, Debora

    2006-01-01

    This study examined information-processing variables in relation to worry in a sample of 292 fifth-grade children from Caucasian, African American, and Hispanic backgrounds. Results revealed that worry was related to threat interpretations for hypothetical situations and, when stress level was not controlled, to higher estimates of future…

  6. A Pilot-Scale Heat Recovery System for Computer Process Control Teaching and Research.

    ERIC Educational Resources Information Center

    Callaghan, P. J.; And Others

    1988-01-01

    Describes the experimental system and equipment including an interface box for displaying variables. Discusses features which make the circuit suitable for teaching and research in computing. Feedforward, decoupling, and adaptive control, examination of digital filtering, and a cascade loop are teaching experiments utilizing this rig. Diagrams and…
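
    As an illustration of one of the listed experiments, the cascade loop, here is a minimal discrete-time sketch in which an outer temperature PI controller supplies the setpoint for an inner flow PI controller; the toy plant dynamics and gains are assumptions, not the rig's.

      def make_pi(kp, ki, dt):
          """Simple discrete PI controller with internal integral state."""
          state = {"integral": 0.0}
          def controller(error):
              state["integral"] += error * dt
              return kp * error + ki * state["integral"]
          return controller

      dt = 0.1
      outer = make_pi(kp=0.8, ki=0.05, dt=dt)        # temperature (primary) loop, slow
      inner = make_pi(kp=2.0, ki=0.50, dt=dt)        # flow (secondary) loop, fast

      temp, flow = 20.0, 0.0                         # illustrative initial states
      temp_setpoint = 60.0
      for _ in range(3000):
          flow_setpoint = outer(temp_setpoint - temp)        # outer loop output is a setpoint
          valve = inner(flow_setpoint - flow)                # inner loop drives the valve
          flow += dt * (-0.5 * flow + 0.5 * valve)           # toy flow dynamics
          temp += dt * (-0.05 * temp + 0.10 * flow + 1.0)    # toy heat-recovery dynamics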

  7. Utilizing multiple state variables to improve the dynamic range of analog switching in a memristor

    NASA Astrophysics Data System (ADS)

    Jeong, YeonJoo; Kim, Sungho; Lu, Wei D.

    2015-10-01

    Memristors and memristive systems have been extensively studied for data storage and computing applications such as neuromorphic systems. To act as synapses in neuromorphic systems, the memristor needs to exhibit analog resistive switching (RS) behavior with incremental conductance change. In this study, we show that the dynamic range of the analog RS behavior can be significantly enhanced in a tantalum-oxide-based memristor. By controlling different state variables enabled by different physical effects during the RS process, the gradual filament expansion stage can be selectively enhanced without strongly affecting the abrupt filament length growth stage. Detailed physics-based modeling further verified the observed experimental effects and revealed the roles of oxygen vacancy drift and diffusion processes, and how the diffusion process can be selectively enhanced during the filament expansion stage. These findings lead to more desirable and reliable memristor behaviors for analog computing applications. Additionally, the ability to selectively control different internal physical processes demonstrated in the current study provides guidance for continued device optimization of memristor devices in general.

  8. Phytoplankton dynamics of a subtropical reservoir controlled by the complex interplay among hydrological, abiotic, and biotic variables.

    PubMed

    Kuo, Yi-Ming; Wu, Jiunn-Tzong

    2016-12-01

    This study was conducted to identify the key factors related to the spatiotemporal variations in phytoplankton abundance in a subtropical reservoir from 2006 to 2010 and to assist in developing strategies for water quality management. Dynamic factor analysis (DFA), a dimension-reduction technique, was used to identify interactions between explanatory variables (i.e., environmental variables) and abundance (biovolume) of predominant phytoplankton classes. The optimal DFA model significantly described the dynamic changes in abundances of predominant phytoplankton groups (including dinoflagellates, diatoms, and green algae) at five monitoring sites. Water temperature, electrical conductivity, water level, nutrients (total phosphorus, NO3-N, and NH3-N), macro-zooplankton, and zooplankton were the key factors affecting the dynamics of the aforementioned phytoplankton. Therefore, transformations of nutrients and reactions between water quality variables and the aforementioned processes, as altered by hydrological conditions, may also control the abundance dynamics of phytoplankton, which may represent common trends in the DFA model. The meandering shape of Shihmen Reservoir and its surrounding rivers caused a complex interplay between hydrological conditions and abiotic and biotic variables, resulting in phytoplankton abundance that could not be estimated using certain variables. Additional monitoring of water quality and hydrological variables in the surrounding rivers should be carried out a few days before and after reservoir operations and heavy storms, which would assist in developing site-specific preventive strategies to control phytoplankton abundance.

  9. Training attentional control in older adults

    PubMed Central

    MacKay-Brandt, Anna

    2013-01-01

    Recent research has demonstrated benefits for older adults from training attentional control using a variable priority strategy, but the construct validity of the training task and the degree to which benefits of training transfer to other contexts are unclear. The goal of this study was to characterize baseline performance on the training task in a sample of 105 healthy older adults and to test for transfer of training in a subset (n = 21). Training gains after 5 days and the extent of transfer were compared with those of another subset (n = 20) that served as a control group. Baseline performance on the training task was characterized by a two-factor model of working memory and processing speed. Processing speed correlated with the training task. Training gains in speed and accuracy were reliable and robust (ps <.001, η2 = .57 to .90). Transfer to an analogous task was observed (ps <.05, η2 = .10 to .17). The beneficial effect of training did not translate to improved performance on related measures of processing speed. This study highlights the robust effect of training and transfer to a similar context using a variable priority training task. Although processing speed is an important aspect of the training task, training benefit is either related to an untested aspect of the training task or transfer of training is limited to the training context. PMID:21728889

  10. Mathematical modeling and characteristic analysis for over-under turbine based combined cycle engine

    NASA Astrophysics Data System (ADS)

    Ma, Jingxue; Chang, Juntao; Ma, Jicheng; Bao, Wen; Yu, Daren

    2018-07-01

    The turbine based combined cycle engine has become the most promising hypersonic airbreathing propulsion system for its superiority of ground self-starting, wide flight envelope and reusability. The simulation model of the turbine based combined cycle engine plays an important role in the research of performance analysis and control system design. In this paper, a turbine based combined cycle engine mathematical model is built on the Simulink platform, including a dual-channel air intake system, a turbojet engine and a ramjet. It should be noted that the model of the air intake system is built based on computational fluid dynamics calculation, which provides valuable raw data for modeling of the turbine based combined cycle engine. The aerodynamic characteristics of the turbine based combined cycle engine in turbojet mode, ramjet mode and the mode transition process are studied by the mathematical model, and the influence of dominant variables on performance and safety of the turbine based combined cycle engine is analyzed. According to the stability requirement of thrust output and the safety in the working process of the turbine based combined cycle engine, a control law is proposed that could guarantee the steady output of thrust by controlling the control variables of the turbine based combined cycle engine in the whole working process.

  11. A rugged landscape model for self-organization and emergent leadership in creative problem solving and production groups.

    PubMed

    Guastello, Stephen J; Craven, Joanna; Zygowicz, Karen M; Bock, Benjamin R

    2005-07-01

    The process by which an initially leaderless group differentiates into one containing leadership and secondary role structures was examined using the swallowtail catastrophe model and principles of self-organization. The objectives were to identify the control variables in the process of leadership emergence in creative problem solving groups and production groups. In the first of two experiments, groups of university students (total N = 114) played a creative problem solving game. Participants later rated each other on leadership behavior, styles, and variables related to the process of conversation. A performance quality measure was also included. Control parameters in the swallowtail catastrophe model were identified through a combination of factor analysis and nonlinear regression. Leaders displayed a broad spectrum of behaviors in the general categories of Controlling the Conversation and Creativity in their role-play. In the second experiment, groups of university students (total N = 197) engaged in a laboratory work experiment that had a substantial production goal component. The same system of ratings and modeling strategy was used along with a work production measure. Leaders in the production task emerged to the extent that they exhibited control over both the creative and production aspects of the task, they could keep tension low, and the externally imposed production goals were realistic.

  12. Integrating Software Modules For Robot Control

    NASA Technical Reports Server (NTRS)

    Volpe, Richard A.; Khosla, Pradeep; Stewart, David B.

    1993-01-01

    Reconfigurable, sensor-based control system uses state variables in systematic integration of reusable control modules. Designed for open-architecture hardware including many general-purpose microprocessors, each having own local memory plus access to global shared memory. Implemented in software as extension of Chimera II real-time operating system. Provides transparent computing mechanism for intertask communication between control modules and generic process-module architecture for multiprocessor realtime computation. Used to control robot arm. Proves useful in variety of other control and robotic applications.
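
    A minimal sketch of the state-variable-table idea described above, in which reusable modules communicate only by reading and writing named variables in a shared table that an executive cycles each period; this illustrates the concept only and is not the Chimera II API.

      # Shared state-variable table: modules never call each other directly; they
      # exchange data only through named entries updated once per control cycle.
      table = {"joint_angle": 0.0, "joint_velocity": 0.0, "torque_cmd": 0.0, "angle_ref": 1.0}

      def sensor_module(t, dt=0.01):
          """Stands in for a sensor driver: integrates a toy joint model and publishes state."""
          t["joint_velocity"] += dt * (t["torque_cmd"] - 0.2 * t["joint_velocity"])
          t["joint_angle"] += dt * t["joint_velocity"]

      def controller_module(t):
          """PD control law reading published state and writing the torque command."""
          t["torque_cmd"] = 5.0 * (t["angle_ref"] - t["joint_angle"]) - 1.0 * t["joint_velocity"]

      modules = [sensor_module, controller_module]     # reconfigurable module list

      for _ in range(1000):                            # one pass = one real-time cycle
          for module in modules:
              module(table)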

  13. Transport induced by mean-eddy interaction: II. Analysis of transport processes

    NASA Astrophysics Data System (ADS)

    Ide, Kayo; Wiggins, Stephen

    2015-03-01

    We present a framework for the analysis of transport processes resulting from the mean-eddy interaction in a flow. The framework is based on the Transport Induced by the Mean-Eddy Interaction (TIME) method presented in a companion paper (Ide and Wiggins, 2014) [1]. The TIME method estimates the (Lagrangian) transport across stationary (Eulerian) boundaries defined by chosen streamlines of the mean flow. Our framework proceeds after first carrying out a sequence of preparatory steps that link the flow dynamics to the transport processes. This includes the construction of the so-called "instantaneous flux" as the Hovmöller diagram. Transport processes are studied by linking the signals of the instantaneous flux field to the dynamical variability of the flow. This linkage also reveals how the variability of the flow contributes to the transport. The spatio-temporal analysis of the flux diagram can be used to assess the efficiency of the variability in transport processes. We apply the method to the double-gyre ocean circulation model in the situation where the Rossby-wave mode dominates the dynamic variability. The spatio-temporal analysis shows that the inter-gyre transport is controlled by the circulating eddy vortices in the fast eastward jet region, whereas the basin-scale Rossby waves have very little impact.

  14. Work environment risk factors for injuries in wood processing

    PubMed Central

    Holcroft, Christina A.; Punnett, Laura

    2018-01-01

    Problem: The reported injury rate for wood product manufacturing in Maine, 1987–2004, was almost twice the state-wide average for all jobs. Method: A case-control study was conducted in wood processing plants to determine preventable risk factors for injury. A total of 157 cases with injuries reported to workers’ compensation and 251 controls were interviewed. Results: In multivariable analyses, variables associated with injury risk were high physical workload, machine-paced work or inability to take a break, lack of training, absence of a lockout/tagout program, low seniority, and male gender. Different subsets of these variables were significant when acute incidents and overexertions were analyzed separately and when all injuries were stratified by industry sub-sector. Impact on industry: Generalizability may be limited somewhat by non-representative participation of workplaces and individuals. Nevertheless, these findings provide evidence that many workplace injuries occurring in wood processing could be prevented by application of ergonomics principles and improved work organization. PMID:19778648

  15. The symbol-grounding problem in numerical cognition: A review of theory, evidence, and outstanding questions.

    PubMed

    Leibovich, Tali; Ansari, Daniel

    2016-03-01

    How do numerical symbols, such as number words, acquire semantic meaning? This question, also referred to as the "symbol-grounding problem," is a central problem in the field of numerical cognition. Present theories suggest that symbols acquire their meaning by being mapped onto an approximate system for the nonsymbolic representation of number (Approximate Number System or ANS). In the present literature review, we first asked to what extent current behavioural and neuroimaging data support this theory, and second, to what extent the ANS, upon which symbolic numbers are assumed to be grounded, is numerical in nature. We conclude that (a) current evidence that has examined the association between the ANS and number symbols does not support the notion that number symbols are grounded in the ANS and (b) given the strong correlation between numerosity and continuous variables in nonsymbolic number processing tasks, it is next to impossible to measure the pure association between symbolic and nonsymbolic numerosity. Instead, it is clear that significant cognitive control resources are required to disambiguate numerical from continuous variables during nonsymbolic number processing. Thus, if there exists any mapping between the ANS and symbolic number, then this process of association must be mediated by cognitive control. Taken together, we suggest that studying the role of both cognitive control and continuous variables in numerosity comparison tasks will provide a more complete picture of the symbol-grounding problem. (c) 2016 APA, all rights reserved.

  16. Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.

    1997-01-01

    An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model, a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance, under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process, to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain specified pointing performance with the integrated design approach.

  17. VPPA weld model evaluation

    NASA Technical Reports Server (NTRS)

    Mccutcheon, Kimble D.; Gordon, Stephen S.; Thompson, Paul A.

    1992-01-01

    NASA uses the Variable Polarity Plasma Arc Welding (VPPAW) process extensively for fabrication of Space Shuttle External Tanks. This welding process has been in use at NASA since the late 1970's but the physics of the process have never been satisfactorily modeled and understood. In an attempt to advance the level of understanding of VPPAW, Dr. Arthur C. Nunes, Jr., (NASA) has developed a mathematical model of the process. The work described in this report evaluated and used two versions (level-0 and level-1) of Dr. Nunes' model, and a model derived by the University of Alabama at Huntsville (UAH) from Dr. Nunes' level-1 model. Two series of VPPAW experiments were done, using over 400 different combinations of welding parameters. Observations were made of VPPAW process behavior as a function of specific welding parameter changes. Data from these weld experiments was used to evaluate and suggest improvements to Dr. Nunes' model. Experimental data and correlations with the model were used to develop a multi-variable control algorithm for use with a future VPPAW controller. This algorithm is designed to control weld widths (both on the crown and root of the weld) based upon the weld parameters, base metal properties, and real-time observation of the crown width. The algorithm exhibited accuracy comparable to that of the weld width measurements for both aluminum and mild steel welds.

  18. A Context-Driven Model for the Flat Roofs Construction Process through Sensing Systems, Internet-of-Things and Last Planner System

    PubMed Central

    Andújar-Montoya, María Dolores

    2017-01-01

    The main causes of building defects are errors in the design and the construction phases. These causes related to construction are mainly due to the general lack of control of construction work and represent approximately 75% of the anomalies. In particular, one of the main causes of such anomalies, which end in building defects, is the lack of control over the physical variables of the work environment during the execution of tasks. Therefore, the high percentage of defects detected in buildings that have the root cause in the construction phase could be avoidable with a more accurate and efficient control of the process. The present work proposes a novel integration model based on information and communications technologies for the automation of both construction work and its management at the execution phase, specifically focused on the flat roof construction process. Roofs represent the second area where more defects are claimed. The proposed model is based on a Web system, supported by a service oriented architecture, for the integral management of tasks through the Last Planner System methodology, but incorporating the management of task restrictions from the physical environment variables by designing specific sensing systems. Likewise, all workers are integrated into the management process by Internet-of-Things solutions that guide them throughout the execution process in a non-intrusive and transparent way. PMID:28737693

  19. A Context-Driven Model for the Flat Roofs Construction Process through Sensing Systems, Internet-of-Things and Last Planner System.

    PubMed

    Andújar-Montoya, María Dolores; Marcos-Jorquera, Diego; García-Botella, Francisco Manuel; Gilart-Iglesias, Virgilio

    2017-07-22

    The main causes of building defects are errors in the design and the construction phases. These causes related to construction are mainly due to the general lack of control of construction work and represent approximately 75% of the anomalies. In particular, one of the main causes of such anomalies, which end in building defects, is the lack of control over the physical variables of the work environment during the execution of tasks. Therefore, the high percentage of defects detected in buildings that have the root cause in the construction phase could be avoidable with a more accurate and efficient control of the process. The present work proposes a novel integration model based on information and communications technologies for the automation of both construction work and its management at the execution phase, specifically focused on the flat roof construction process. Roofs represent the second area where more defects are claimed. The proposed model is based on a Web system, supported by a service oriented architecture, for the integral management of tasks through the Last Planner System methodology, but incorporating the management of task restrictions from the physical environment variables by designing specific sensing systems. Likewise, all workers are integrated into the management process by Internet-of-Things solutions that guide them throughout the execution process in a non-intrusive and transparent way.

  20. Quantification of the oxygen uptake rate in a dissolved oxygen controlled oscillating jet-driven microbioreactor.

    PubMed

    Kirk, Timothy V; Marques, Marco Pc; Radhakrishnan, Anand N Pallipurath; Szita, Nicolas

    2016-03-01

    Microbioreactors have emerged as a new tool for early bioprocess development. The technology has advanced rapidly in the last decade and obtaining real-time quantitative data of process variables is nowadays state of the art. In addition, control over process variables has also been achieved. The aim of this study was to build a microbioreactor capable of controlling dissolved oxygen (DO) concentrations and to determine oxygen uptake rate in real time. An oscillating jet driven, membrane-aerated microbioreactor was developed without comprising any moving parts. Mixing times of ∼7 s and kLa values of ∼170 h⁻¹ were achieved. DO control was achieved by varying the duty cycle of a solenoid microvalve, which changed the gas mixture in the reactor incubator chamber. The microbioreactor supported Saccharomyces cerevisiae growth over 30 h and cell densities of 6.7 g dcw L⁻¹. Oxygen uptake rates of ∼34 mmol L⁻¹ h⁻¹ were achieved. The results highlight the potential of DO-controlled microbioreactors to obtain real-time information on oxygen uptake rate, and by extension on cellular metabolism for a variety of cell types over a broad range of processing conditions. © 2015 The Authors. Journal of Chemical Technology & Biotechnology published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
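
    A minimal sketch of the duty-cycle-based dissolved-oxygen control described above: a PI controller maps the DO error onto the open fraction of a solenoid valve, which in turn sets the equilibrium oxygen concentration of the gas mixture in a toy oxygen balance. The gains, the uptake term, and the balance itself are illustrative assumptions, not the authors' plant model.

      import numpy as np

      def simulate_do_control(setpoint=30.0, hours=10.0, dt=10.0 / 3600.0,
                              kla=170.0, our=900.0, kp=0.02, ki=2.0):
          """DO in % air saturation; kLa in h^-1; the uptake term 'our' is a toy value."""
          n = int(hours / dt)
          do, integral = 50.0, 0.0
          do_trace = np.zeros(n)
          duty_trace = np.zeros(n)
          for k in range(n):
              error = setpoint - do
              integral += error * dt
              duty = float(np.clip(kp * error + ki * integral, 0.0, 1.0))  # valve open fraction
              do_star = 100.0 * duty               # equilibrium DO implied by the gas mixture
              do = max(do + dt * (kla * (do_star - do) - our), 0.0)        # toy oxygen balance
              do_trace[k], duty_trace[k] = do, duty
          return do_trace, duty_trace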

  1. Long-term variability in the water budget and its controls in an oak-dominated temperate forest

    Treesearch

    Jing Xie; Ge Sun; Hou-Sen Chu; Junguo Liu; Steven G. McNulty; Asko Noormets; Ranjeet John; Zutao Ouyang; Tianshan Zha; Haitao Li; Wenbin Guan; Jiquan Chen

    2014-01-01

    Water availability is one of the key environmental factors that control ecosystem functions in temperate forests. Changing climate is likely to alter the ecohydrology and other ecosystem processes, which affect forest structures and functions. We constructed a multi-year water budget (2004–2010) and quantified environmental controls on an evapotranspiration (ET) in a...

  2. Sequencing batch-reactor control using Gaussian-process models.

    PubMed

    Kocijan, Juš; Hvala, Nadja

    2013-06-01

    This paper presents a Gaussian-process (GP) model for the design of sequencing batch-reactor (SBR) control for wastewater treatment. The GP model is a probabilistic, nonparametric model with uncertainty predictions. In the case of SBR control, it is used for the on-line optimisation of the batch-phases duration. The control algorithm follows the course of the indirect process variables (pH, redox potential and dissolved oxygen concentration) and recognises the characteristic patterns in their time profile. The control algorithm uses GP-based regression to smooth the signals and GP-based classification for the pattern recognition. When tested on the signals from an SBR laboratory pilot plant, the control algorithm provided a satisfactory agreement between the proposed completion times and the actual termination times of the biodegradation processes. In a set of tested batches the final ammonia and nitrate concentrations were below 1 and 0.5 mg L(-1), respectively, while the aeration time was shortened considerably. Copyright © 2013 Elsevier Ltd. All rights reserved.
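
    A minimal sketch of the GP-regression smoothing step described above, applied to a synthetic pH-like indirect process variable so that its smoothed derivative can be inspected for characteristic patterns; the kernel, data, and derivative-based cue are illustrative assumptions, and the GP-classification stage is not shown.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      t = np.linspace(0.0, 4.0, 200)[:, None]                   # time within a batch phase, h
      ph_true = 7.2 - 0.4 * np.tanh(2.0 * (t.ravel() - 2.0))    # synthetic pH profile
      ph_noisy = ph_true + 0.05 * np.random.default_rng(0).normal(size=t.shape[0])

      kernel = 1.0 * RBF(length_scale=0.5) + WhiteKernel(noise_level=0.01)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, ph_noisy)

      ph_smooth, ph_std = gp.predict(t, return_std=True)        # smoothed signal with uncertainty
      slope = np.gradient(ph_smooth, t.ravel())                 # derivative used for pattern cues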

  3. An individual differences analysis of memory control

    PubMed Central

    Salthouse, Timothy A.; Siedlecki, Karen L.; Krueger, Lacy E.

    2013-01-01

    Performance on a wide variety of memory tasks can be hypothesized to be influenced by processes associated with controlling the contents of memory. In this project 328 adults ranging from 18 to 93 years of age performed six tasks (e.g., multiple trial recall with an interpolated interference list, directed forgetting, proactive interference, and retrieval inhibition) postulated to yield measures of the effectiveness of memory control. Although most of the patterns from earlier studies were replicated, only a few of the measures of memory control were reliable at the level of individual differences. Furthermore, the memory control measures had very weak relations with the age of the participant. Analyses examining the relations between established cognitive abilities and variables from the experimental tasks revealed that most of the variables were related only to episodic memory ability. PMID:24347812

  4. GEECS (Generalized Equipment and Experiment Control System)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GONSALVES, ANTHONY; DESHMUKH, AALHAD

    2017-01-12

    GEECS (Generalized Equipment and Experiment Control System) monitors and controls equipment distributed across a network, performs experiments by scanning input variables, and collects and stores various types of data synchronously from devices. Examples of devices include cameras, motors and pressure gauges. GEECS is based upon LabView graphical object oriented programming (GOOP), allowing for a modular and scalable framework. Data from an arbitrary number of variables is published for subscription over TCP. A secondary framework allows easy development of graphical user interfaces for combined control of any available devices on the control system without the need for programming knowledge. This allows for rapid integration of GEECS into a wide variety of systems. A database interface provides for device and process configuration while allowing the user to save large quantities of data to local or network drives.

  5. Microeconomics of advanced process window control for 50-nm gates

    NASA Astrophysics Data System (ADS)

    Monahan, Kevin M.; Chen, Xuemei; Falessi, Georges; Garvin, Craig; Hankinson, Matt; Lev, Amir; Levy, Ady; Slessor, Michael D.

    2002-07-01

    Fundamentally, advanced process control enables accelerated design-rule reduction, but simple microeconomic models that directly link the effects of advanced process control to profitability are rare or non-existent. In this work, we derive these links using a simplified model for the rate of profit generated by the semiconductor manufacturing process. We use it to explain why and how microprocessor manufacturers strive to avoid commoditization by producing only the number of dies required to satisfy the time-varying demand in each performance segment. This strategy is realized using the tactic known as speed binning, the deliberate creation of an unnatural distribution of microprocessor performance that varies according to market demand. We show that the ability of APC to achieve these economic objectives may be limited by variability in the larger manufacturing context, including measurement delays and process window variation.

  6. Documentation for the State Variables Package for the Groundwater-Management Process of MODFLOW-2005 (GWM-2005)

    USGS Publications Warehouse

    Ahlfeld, David P.; Barlow, Paul M.; Baker, Kristine M.

    2011-01-01

    Many groundwater-management problems are concerned with the control of one or more variables that reflect the state of a groundwater-flow system or a coupled groundwater/surface-water system. These system state variables include the distribution of heads within an aquifer, streamflow rates within a hydraulically connected stream, and flow rates into or out of aquifer storage. This report documents the new State Variables Package for the Groundwater-Management Process of MODFLOW-2005 (GWM-2005). The new package provides a means to explicitly represent heads, streamflows, and changes in aquifer storage as state variables in a GWM-2005 simulation. The availability of these state variables makes it possible to include system state in the objective function and enhances existing capabilities for constructing constraint sets for a groundwater-management formulation. The new package can be used to address groundwater-management problems such as the determination of withdrawal strategies that meet water-supply demands while simultaneously maximizing heads or streamflows, or minimizing changes in aquifer storage. Four sample problems are provided to demonstrate use of the new package for typical groundwater-management applications.

  7. Automatic Information Processing and High Performance Skills: Individual Differences and Mechanisms of Performance Improvement in Search-Detection and Complex Task

    DTIC Science & Technology

    1992-09-01

    ...abilities is fit along with the autoregressive process. Initially, the influences on search performance of within-group age and sex were included as control... Results: Performance/Ability Structure. Measurement Model: Ability Structure. The correlations between all the ability measures, age, and sex are... subsequent analyses for young adults. Age and sex were included as control variables. There was an age range of 15 years; this range is sufficiently large that

  8. Stabilizing laser energy density on a target during pulsed laser deposition of thin films

    DOEpatents

    Dowden, Paul C.; Jia, Quanxi

    2016-05-31

    A process for stabilizing laser energy density on a target surface during pulsed laser deposition of thin films controls the focused laser spot on the target. The process involves imaging an image-aperture positioned in the beamline. This eliminates changes in the beam dimensions of the laser. A continuously variable attenuator located in between the output of the laser and the imaged image-aperture adjusts the energy to a desired level by running the laser in a "constant voltage" mode. The process provides reproducibility and controllability for deposition of electronic thin films by pulsed laser deposition.

  9. Mechanisms and kinetics of cellulose fermentation for protein production

    NASA Technical Reports Server (NTRS)

    Dunlap, C. A.

    1971-01-01

    The development of a process (and ancillary processing and analytical techniques) to produce bacterial single-cell protein of good nutritional quality from waste cellulose is discussed. A fermentation pilot plant and laboratory were developed and have been in operation for about two years. Single-cell protein (SCP) can be produced from sugarcane bagasse--a typical agricultural cellulosic waste. The optimization and understanding of this process and its controlling variables are examined. Both batch and continuous fermentation runs have been made under controlled conditions in the 535 liter pilot plant vessel and in the laboratory 14-liter fermenters.

  10. Control Design for an Advanced Geared Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Litt, Jonathan S.

    2017-01-01

    This paper describes the design process for the control system of an advanced geared turbofan engine. This process is applied to a simulation that is representative of a 30,000 pound-force thrust class concept engine with two main spools, ultra-high bypass ratio, and a variable area fan nozzle. Control system requirements constrain the non-linear engine model as it operates throughout its flight envelope of sea level to 40,000 feet and from 0 to 0.8 Mach. The purpose of this paper is to review the engine control design process for an advanced turbofan engine configuration. The control architecture selected for this project was developed from literature and reflects a configuration that utilizes a proportional integral controller with sets of limiters that enable the engine to operate safely throughout its flight envelope. Simulation results show the overall system meets performance requirements without exceeding operational limits.
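
    A minimal sketch of the architecture named above: a main proportional-integral fan-speed loop whose fuel-flow rate command is min-selected against limit regulators before being integrated, so the most restrictive regulator wins at any instant. Signal names, limits, and gains are illustrative assumptions, not the reference engine's values.

      def pi_rate(kp, ki):
          """PI regulator returning a fuel-flow rate-of-change command."""
          state = {"i": 0.0}
          def step(error, dt):
              state["i"] += ki * error * dt
              return kp * error + state["i"]
          return step

      main_loop = pi_rate(kp=0.02, ki=0.01)       # fan-speed tracking loop
      t4_limit = pi_rate(kp=0.05, ki=0.02)        # turbine-temperature ceiling regulator
      surge_limit = pi_rate(kp=0.05, ki=0.02)     # surge-margin floor regulator

      def fuel_flow_step(wf, dt, n_fan_ref, n_fan, t4, t4_max, sm, sm_min):
          rate_main = main_loop(n_fan_ref - n_fan, dt)
          rate_t4 = t4_limit(t4_max - t4, dt)      # turns negative when T4 exceeds its limit
          rate_sm = surge_limit(sm - sm_min, dt)   # turns negative when the margin is too small
          rate = min(rate_main, rate_t4, rate_sm)  # most restrictive regulator takes over
          return max(wf + rate * dt, 0.0)          # integrate to fuel flow, floored at zero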

  11. Nonlinear model predictive control for chemical looping process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Abhinaya; Lei, Hao; Lou, Xinsheng

    A control system for optimizing a chemical looping ("CL") plant includes a reduced order mathematical model ("ROM") that is designed by eliminating mathematical terms that have minimal effect on the outcome. A non-linear optimizer provides various inputs to the ROM and monitors the outputs to determine the optimum inputs that are then provided to the CL plant. An estimator estimates the values of various internal state variables of the CL plant. The system has one structure adapted to control a CL plant that only provides pressure measurements in the CL loops A and B, a second structure adapted to a CL plant that provides pressure measurements and solid levels in both loops A and B, and a third structure adapted to control a CL plant that provides full information on internal state variables. A final structure provides a neural network NMPC controller to control operation of loops A and B.
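
    A minimal sketch of the optimizer/reduced-order-model pattern described above: a nonlinear optimizer searches over a short input horizon by repeatedly simulating a (deliberately toy, one-state) reduced model, applies the first move, and a simple estimator corrects the state between moves. The model, horizon, cost weights, and estimator gain are illustrative assumptions, not the patented controller.

      import numpy as np
      from scipy.optimize import minimize

      def rom_step(x, u, dt=1.0):
          """Toy one-state reduced-order model (e.g. a loop solids inventory)."""
          return x + dt * (-0.2 * x + 0.5 * u)

      def nmpc_cost(u_seq, x_init, x_ref, r=0.05):
          x, cost = x_init, 0.0
          for u in u_seq:
              x = rom_step(x, u)
              cost += (x - x_ref) ** 2 + r * u ** 2      # tracking error plus control effort
          return cost

      def nmpc_control(x_init, x_ref, horizon=10, u_max=2.0):
          res = minimize(nmpc_cost, np.zeros(horizon), args=(x_init, x_ref),
                         bounds=[(0.0, u_max)] * horizon, method="L-BFGS-B")
          return res.x[0]                                # apply only the first optimal move

      x_true, x_est = 1.0, 0.8                           # plant state vs estimator state
      rng = np.random.default_rng(0)
      for k in range(20):
          u = nmpc_control(x_est, x_ref=2.0)
          x_true = rom_step(x_true, u) + 0.01 * rng.normal()   # "plant" with a small disturbance
          x_pred = rom_step(x_est, u)
          x_est = x_pred + 0.5 * (x_true - x_pred)             # crude estimator correction step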

  12. Internally Generated and Externally Forced Multidecadal Oceanic Modes and their Influence on the Summer Rainfall over East Asia

    NASA Astrophysics Data System (ADS)

    Si, D.; Hu, A.

    2017-12-01

    Interdecadal oceanic variability can be generated by both internal and external processes, and these variations can significantly modulate our climate on global and regional scales, such as the warming slowdown in the early 21st century and the rainfall in East Asia. By analyzing simulations from a unique Community Earth System Model (CESM) Large Ensemble (CESM_LE) project, we show that the Interdecadal Pacific Oscillation (IPO) is primarily an internally generated oceanic variability, while the Atlantic Multidecadal Oscillation (AMO) may be an oceanic variability generated by internal oceanic processes and modulated by external forcings in the 20th century. Although the observed relationship between the IPO and the Yangtze-Huaihe River valley (YHRV) summer rainfall in China is well simulated in both the preindustrial control and the 20th century ensemble, none of the 20th century ensemble members can reproduce the observed time evolution of both the IPO and YHRV rainfall, owing to the unpredictable nature of the IPO on multidecadal timescales. On the other hand, although CESM_LE cannot reproduce the observed relationship between the AMO and Huanghe River valley (HRV) summer rainfall of China in the preindustrial control simulation, this relationship in the 20th century simulations is well reproduced, and the chance of reproducing the observed time evolution of both the AMO and HRV rainfall is about 30%, indicating the important role of the interaction between the internal processes and the external forcing in realistically simulating the AMO and HRV rainfall.

  13. Controls on morphological variability and role of stream power distribution pattern, Yamuna River, western India

    NASA Astrophysics Data System (ADS)

    Bawa, Nupur; Jain, Vikrant; Shekhar, Shashank; Kumar, Niraj; Jyani, Vikas

    2014-12-01

    Understanding the controls on the morphological variability of river systems constitutes one of the fundamental questions in geomorphic investigation. Channel morphology is an important indicator of river processes and is of significance for mapping the hydrology-ecologic connectivity in a river system and for predicting the future trajectory of river health in response to external forcings. This paper documents the spatial morphological variability and its natural and anthropogenic controls for the Yamuna River, a major tributary of the Ganga River, India. The Yamuna River runs through a major urban centre i.e. Delhi National Capital Region. The Yamuna River was divided into eight geomorphically distinct reaches on the basis of the assemblages of geomorphic units and the association of landscape, valley and floodplain settings. The morphological variability was analysed through stream power distribution and sediment load data at various stations. Stream power distribution of the Yamuna River basin is characterised by a non-linear pattern that was used to distinguish (a) high energy ‘natural' upstream reaches, (b) ‘anthropogenically altered', low energy middle stream reaches, and (c) ‘rejuvenated' downstream reaches again with higher stream power. The relationship between stream power and channel morphology in these reaches was integrated with sediment load data to define the maximum flow efficiency (MFE) as the threshold for geomorphic transition. This analysis supports the continuity of river processes and the significance of a holistic, basin-scale approach rather than isolated local scale analysis in river studies.

  14. Reduced neural activity of the prefrontal cognitive control circuitry during response inhibition to negative words in people with schizophrenia

    PubMed Central

    Vercammen, Ans; Morris, Richard; Green, Melissa J.; Lenroot, Rhoshel; Kulkarni, Jayashri; Carr, Vaughan J.; Weickert, Cynthia Shannon; Weickert, Thomas W.

    2012-01-01

    Background: Schizophrenia is characterized by deficits in executive control and impairments in emotion processing. This study assessed the nature and extent of potential alterations in the neural substrates supporting the interaction between cognitive control mechanisms and emotion attribution processes in people with schizophrenia. Methods: Functional magnetic resonance imaging was performed during a verbal emotional go/no-go task. People with schizophrenia and healthy controls responded to word stimuli of a prespecified emotional valence (positive, negative or neutral) while inhibiting responses to stimuli of a different valence. Results: We enrolled 20 people with schizophrenia and 23 controls in the study. Healthy controls activated an extensive dorsal prefrontal–parietal network while inhibiting responses to negative words compared to neutral words, but showed deactivation of the midcingulate cortex while inhibiting responses to positive words compared to neutral words. People with schizophrenia failed to activate this network during response inhibition to negative words, whereas during response inhibition to positive words they did not deactivate the cingulate, but showed increased responsivity in the frontal cortex. Limitations: Sample heterogeneity is characteristic of studies of schizophrenia and may have contributed to more variable neural responses in the patient sample despite the care taken to control for potentially confounding variables. Conclusion: Our results showed that schizophrenia is associated with aberrant modulation of neural responses during the interaction between cognitive control and emotion processing. Failure of the frontal circuitry to regulate goal-directed behaviour based on emotion attributions may contribute to deficits in psychosocial functioning in daily life. PMID:22617625

  15. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  16. Automatic and controlled components of judgment and decision making.

    PubMed

    Ferreira, Mario B; Garcia-Marques, Leonel; Sherman, Steven J; Sherman, Jeffrey W

    2006-11-01

    The categorization of inductive reasoning into largely automatic processes (heuristic reasoning) and controlled analytical processes (rule-based reasoning) put forward by dual-process approaches of judgment under uncertainty (e.g., K. E. Stanovich & R. F. West, 2000) has been primarily a matter of assumption with a scarcity of direct empirical findings supporting it. The present authors use the process dissociation procedure (L. L. Jacoby, 1991) to provide convergent evidence validating a dual-process perspective to judgment under uncertainty based on the independent contributions of heuristic and rule-based reasoning. Process dissociations based on experimental manipulation of variables were derived from the most relevant theoretical properties typically used to contrast the two forms of reasoning. These include processing goals (Experiment 1), cognitive resources (Experiment 2), priming (Experiment 3), and formal training (Experiment 4); the results consistently support the author's perspective. They conclude that judgment under uncertainty is neither an automatic nor a controlled process but that it reflects both processes, with each making independent contributions.
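
    A minimal worked example of the process dissociation estimates referred to above, assuming the standard inclusion/exclusion equations I = C + (1 - C) * A and E = (1 - C) * A; the input proportions are illustrative.

      def process_dissociation(p_inclusion, p_exclusion):
          """Solve I = C + (1 - C) * A and E = (1 - C) * A for C (controlled) and A (automatic)."""
          controlled = p_inclusion - p_exclusion
          automatic = p_exclusion / (1.0 - controlled) if controlled < 1.0 else float("nan")
          return controlled, automatic

      C, A = process_dissociation(p_inclusion=0.80, p_exclusion=0.30)
      print(f"controlled = {C:.2f}, automatic = {A:.2f}")    # controlled = 0.50, automatic = 0.60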

  17. Research environments that promote integrity.

    PubMed

    Jeffers, Brenda Recchia; Whittemore, Robin

    2005-01-01

    The body of empirical knowledge about research integrity and the factors that promote research integrity in nursing research environments remains small. The aim of this article is to propose an internal control model as an innovative framework for the design and structure of nursing research environments that promote integrity. An internal control model is adapted to illustrate its use for conceptualizing and designing research environments that promote integrity. The internal control model integrates both the organizational elements necessary to promote research integrity and the processes needed to assess research environments. The model provides five interrelated process components within which any number of research integrity variables and processes may be used and studied: internal control environment, risk assessment, internal control activities, monitoring, and information and communication. The components of the proposed research integrity internal control model comprise an integrated conceptualization of the processes that provide reasonable assurance that research integrity will be promoted within the nursing research environment. Schools of nursing can use the model to design, implement, and evaluate systems that promote research integrity. The model process components need further exploration to substantiate the use of the model in nursing research environments.

  18. Optimal control of the gear shifting process for shift smoothness in dual-clutch transmissions

    NASA Astrophysics Data System (ADS)

    Li, Guoqiang; Görges, Daniel

    2018-03-01

    The control of the transmission system in vehicles is significant for the driving comfort. In order to design a controller for smooth shifting and comfortable driving, a dynamic model of a dual-clutch transmission is presented in this paper. A finite-time linear quadratic regulator is proposed for the optimal control of the two friction clutches in the torque phase for the upshift process. An integral linear quadratic regulator is introduced to regulate the relative speed difference between the engine and the slipping clutch under the optimization of the input torque during the inertia phase. The control objective focuses on smoothing the upshift process so as to improve the driving comfort. Considering the available sensors in vehicles for feedback control, an observer design is presented to track the immeasurable variables. Simulation results show that the jerk can be reduced both in the torque phase and inertia phase, indicating good shift performance. Furthermore, compared with conventional controllers for the upshift process, the proposed control method can reduce shift jerk and improve shift quality.
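
    A minimal sketch of an LQR design in the spirit of the controllers described above, using a steady-state (infinite-horizon) regulator on a toy two-state driveline model in place of the paper's finite-time and integral formulations; all matrices are illustrative assumptions.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Toy driveline model: states x = [clutch slip speed, shaft torsion],
      # inputs u = [oncoming clutch torque, offgoing clutch torque].
      A = np.array([[-0.5, 1.0],
                    [0.0, -0.2]])
      B = np.array([[1.0, -1.0],
                    [0.0, 0.5]])
      Q = np.diag([10.0, 1.0])    # weight slip (which drives jerk and shift comfort) more heavily
      R = np.diag([0.1, 0.1])     # penalize clutch torque effort

      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)          # optimal state feedback u = -K x

      closed_loop_eigs = np.linalg.eigvals(A - B @ K)
      print(K)
      print(closed_loop_eigs.real)             # all negative: regulated slip decays smoothly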

  19. Can we (control) Engineer the degree learning process?

    NASA Astrophysics Data System (ADS)

    White, A. S.; Censlive, M.; Neilsen, D.

    2014-07-01

    This paper investigates how control theory could be applied to learning processes in engineering education. The initial point for the analysis is White's Double Loop learning model of human automation control, modified for the education process, in which a set of governing principles is chosen, probably by the course designer. After initial training the student decides, unknowingly, on a mental map or model. After observing how the real world is behaving, a strategy to achieve the governing variables is chosen and a set of actions selected. This may not be a conscious operation; it may be completely instinctive. These actions will cause some consequences, but only after a certain time delay. The current model is compared with the work of Hollenbeck on goal setting, Nelson's model of self-regulation and that of Abdulwahed, Nagy and Blanchard at Loughborough, who investigated control methods applied to the learning process.

  20. Are your covariates under control? How normalization can re-introduce covariate effects.

    PubMed

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
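
    The ordering issue is easy to reproduce in simulation. The sketch below is a minimal illustration, assuming a Blom-type INT and a single simulated covariate with a skewed outcome; it is not the authors' code or data.

```python
import numpy as np
from scipy.stats import rankdata, norm

def rank_based_int(x, c=3.0 / 8.0):
    """Blom-type rank-based inverse normal transformation."""
    r = rankdata(x)
    return norm.ppf((r - c) / (len(x) - 2 * c + 1))

rng = np.random.default_rng(0)
n = 5000
covariate = rng.normal(size=n)
y = 0.5 * covariate + rng.exponential(size=n)   # skewed dependent variable

# Approach A (problematic): regress out the covariate, then INT the residuals
beta = np.polyfit(covariate, y, 1)
y_int_after = rank_based_int(y - np.polyval(beta, covariate))

# Approach B (recommended by the study): INT the dependent variable first,
# then control for the covariate in the analysis model
y_int_first = rank_based_int(y)
resid_b = y_int_first - np.polyval(np.polyfit(covariate, y_int_first, 1), covariate)

print("corr(covariate, INT applied after residualizing):",
      np.corrcoef(covariate, y_int_after)[0, 1])
print("corr(covariate, residuals when INT applied first):",
      np.corrcoef(covariate, resid_b)[0, 1])
```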

  1. Soil moisture control of sap-flow response to biophysical factors in a desert-shrub species, Artemisia ordosica

    NASA Astrophysics Data System (ADS)

    Zha, Tianshan; Qian, Duo; Jia, Xin; Bai, Yujie; Tian, Yun; Bourque, Charles P.-A.; Ma, Jingyong; Feng, Wei; Wu, Bin; Peltola, Heli

    2017-10-01

    The current understanding of how desert-shrub species acclimate to drought stress in dryland ecosystems is still incomplete. In this study, we measured sap flow in Artemisia ordosica and associated environmental variables throughout the growing seasons of 2013 and 2014 (May-September period of each year) to better understand the environmental controls on the temporal dynamics of sap flow. We found that the occurrence of drought in the dry year of 2013 during the leaf-expansion and leaf-expanded periods caused sap flow per leaf area (Js) to decline significantly, resulting in transpiration being 34 % lower in 2013 than in 2014. Sap flow per leaf area correlated positively with radiation (Rs), air temperature (T), and water vapor pressure deficit (VPD) when volumetric soil water content (VWC) was greater than 0.10 m³ m⁻³. Diurnal Js was generally ahead of Rs by as much as 6 hours. This time lag, however, decreased with increasing VWC. The relative response of Js to the environmental variables (i.e., Rs, T, and VPD) varied with VWC, Js being more strongly controlled by plant-physiological processes during periods of dryness, as indicated by a low decoupling coefficient and low sensitivity to the environmental variables. According to this study, soil moisture is shown to control sap-flow (and, therefore, plant-transpiration) response in Artemisia ordosica to diurnal variations in biophysical factors. This species escaped (acclimated to) water limitations by invoking a water-conservation strategy with the regulation of stomatal conductance and advancement of Js peaking time, manifesting in a hysteresis effect. The findings of this study add to the knowledge of acclimation processes in desert-shrub species under drought-associated stress. This knowledge is essential in modeling desert-shrub-ecosystem functioning under changing climatic conditions.

  2. Children's Learning in Scientific Thinking: Instructional Approaches and Roles of Variable Identification and Executive Function

    NASA Astrophysics Data System (ADS)

    Blums, Angela

    The present study examines instructional approaches and cognitive factors involved in elementary school children's thinking and learning the Control of Variables Strategy (CVS), a critical aspect of scientific reasoning. Previous research has identified several features related to effective instruction of CVS, including using a guided learning approach, the use of self-reflective questions, and learning in individual and group contexts. The current study examined the roles of procedural and conceptual instruction in learning CVS and investigated the role of executive function in the learning process. Additionally, this study examined how learning to identify variables is a part of the CVS process. In two studies (individual and classroom experiments), 139 third, fourth, and fifth grade students participated in hands-on and paper-and-pencil CVS learning activities and, in each study, were assigned to either a procedural instruction, conceptual instruction, or control (no instruction) group. Participants also completed a series of executive function tasks. The study was carried out in two parts: Study 1 used an individual context, and Study 2 was carried out in a group setting. Results indicated that procedural and conceptual instruction were more effective than no instruction, and the ability to identify variables was identified as a key component of the CVS process. Executive function predicted the ability to identify variables and predicted success on CVS tasks. Developmental differences were present, in that older children outperformed younger children on CVS tasks, and that conceptual instruction was slightly more effective for older children. Some differences between individual and group instruction were found, with those in the individual context showing some advantage over those in the group setting in learning CVS concepts. Conceptual implications about scientific thinking and practical implications in science education are discussed.

  3. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
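
    A hedged sketch of the multivariate part of such a workflow is shown below: PCA summarizes the design factors and PLS relates them to several responses. The factor ranges, response models, and batch count are invented for illustration and do not come from the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Simulated DOE batches with three design factors (water amount, wet massing
# time, lubrication time) -- purely illustrative values, not the study's data.
X = rng.uniform(low=[20, 1, 2], high=[35, 10, 8], size=(12, 3))
# Simulated responses: blend flow, compressibility, dissolution
Y = np.column_stack([
    0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=12),
    -0.3 * X[:, 0] + 0.9 * X[:, 2] + rng.normal(scale=0.5, size=12),
    0.4 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(scale=0.5, size=12),
])

# PCA summarizes correlated factors; PLS relates factors to the response block
pca = PCA(n_components=2).fit(X)
pls = PLSRegression(n_components=2).fit(X, Y)
print("PCA explained variance ratio:", pca.explained_variance_ratio_)
print("PLS R^2 on training batches:", pls.score(X, Y))
```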

  4. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data that are related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed by lot number, part type, or individual serial number. Relationships between manufacturing process and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered, including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining, and web-based client/server architectures are discussed in the context of composite material manufacturing.
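
    A minimal sketch of such a schema, using SQLite and invented table and column names, is shown below; it only illustrates the indexing-by-lot-number idea, not the schema described in the paper.

```python
import sqlite3

# Illustrative schema -- table and column names are assumptions for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE lots (
    lot_number   TEXT PRIMARY KEY,
    part_type    TEXT,
    cure_temp_c  REAL,        -- process variable
    press_time_s REAL         -- process variable
);
CREATE TABLE qa_measurements (
    serial_number TEXT PRIMARY KEY,
    lot_number    TEXT REFERENCES lots(lot_number),
    void_content  REAL,       -- quality assurance measurement
    tensile_mpa   REAL        -- quality assurance measurement
);
""")
conn.execute("INSERT INTO lots VALUES ('L001', 'spar', 180.0, 3600.0)")
conn.execute("INSERT INTO qa_measurements VALUES ('S-0001', 'L001', 1.2, 815.0)")

# Correlate process variables with quality data across lots via a join
rows = conn.execute("""
    SELECT l.cure_temp_c, l.press_time_s, q.void_content, q.tensile_mpa
    FROM qa_measurements q JOIN lots l ON q.lot_number = l.lot_number
""").fetchall()
print(rows)
```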

  5. Emotion dysregulation in alexithymia: Startle reactivity to fearful affective imagery and its relation to heart rate variability.

    PubMed

    Panayiotou, Georgia; Constantinou, Elena

    2017-09-01

    Alexithymia is associated with deficiencies in recognizing and expressing emotions and impaired emotion regulation, though few studies have verified the latter assertion using objective measures. This study examined startle reflex modulation by fearful imagery and its associations with heart rate variability in alexithymia. Fifty-four adults (27 alexithymic) imagined previously normed fear scripts. Startle responses were assessed during baseline, first exposure, and reexposure. During first exposure, participants, in separate trials, engaged in either shallow or deep emotion processing, giving emphasis on descriptive or affective aspects of imagery, respectively. Resting heart rate variability was assessed during 2 min of rest prior to the experiment, with high alexithymic participants demonstrating significantly higher LF/HF (low frequency/high frequency) ratio than controls. Deep processing was associated with nonsignificantly larger and faster startle responses at first exposure for alexithymic participants. Lower LF/HF ratio, reflecting higher parasympathetic cardiac activity, predicted greater startle amplitude habituation for alexithymia but lower habituation for controls. Results suggest that, when exposed to prolonged threat, alexithymics may adjust poorly, showing a smaller initial defensive response but slower habituation. This pattern seems related to their low emotion regulation ability as indexed by heart rate variability. © 2017 Society for Psychophysiological Research.

  6. A Computational Model for Aperture Control in Reach-to-Grasp Movement Based on Predictive Variability

    PubMed Central

    Takemura, Naohiro; Fukui, Takao; Inui, Toshio

    2015-01-01

    In human reach-to-grasp movement, visual occlusion of a target object leads to a larger peak grip aperture compared to conditions where online vision is available. However, no previous computational and neural network models for reach-to-grasp movement explain the mechanism of this effect. We simulated the effect of online vision on the reach-to-grasp movement by proposing a computational control model based on the hypothesis that the grip aperture is controlled to compensate for both motor variability and sensory uncertainty. In this model, the aperture is formed to achieve a target aperture size that is sufficiently large to accommodate the actual target; it also includes a margin to ensure proper grasping despite sensory and motor variability. To this end, the model considers: (i) the variability of the grip aperture, which is predicted by the Kalman filter, and (ii) the uncertainty of the object size, which is affected by visual noise. Using this model, we simulated experiments in which the effect of the duration of visual occlusion was investigated. The simulation replicated the experimental result wherein the peak grip aperture increased when the target object was occluded, especially in the early phase of the movement. Both predicted motor variability and sensory uncertainty play important roles in the online visuomotor process responsible for grip aperture control. PMID:26696874
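
    The sketch below illustrates the general idea in scalar form: a Kalman filter estimates the object size from noisy visual samples, and the planned peak aperture adds a margin that grows with the remaining estimation uncertainty plus an assumed motor-variability term. All numbers and the specific margin rule are assumptions, not the authors' model; raising the visual noise (as under occlusion) leaves more residual uncertainty and hence a larger planned aperture.

```python
import numpy as np

# Toy scalar Kalman filter estimating object size from noisy vision; the noise
# values and safety factor are illustrative assumptions, not the paper's model.
q_process = 0.1     # process noise variance (object size is actually constant)
r_vision = 4.0      # visual measurement noise variance (larger under occlusion)
true_size = 60.0    # mm

size_est, p_est = 50.0, 25.0    # prior estimate of object size and its variance
rng = np.random.default_rng(2)
for step in range(30):
    p_pred = p_est + q_process                            # predict
    z = true_size + rng.normal(scale=np.sqrt(r_vision))   # noisy visual sample
    k = p_pred / (p_pred + r_vision)                      # Kalman gain
    size_est += k * (z - size_est)                        # update estimate
    p_est = (1.0 - k) * p_pred                            # update uncertainty

# Grip aperture = estimated size + margin that grows with remaining sensory
# uncertainty and with an assumed motor-variability term sigma_motor.
sigma_motor = 3.0
target_aperture = size_est + 2.0 * np.sqrt(p_est + sigma_motor**2)
print(f"size estimate {size_est:.1f} mm, target peak aperture {target_aperture:.1f} mm")
```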

  7. Bioprocess development workflow: Transferable physiological knowledge instead of technological correlations.

    PubMed

    Reichelt, Wieland N; Haas, Florian; Sagmeister, Patrick; Herwig, Christoph

    2017-01-01

    Microbial bioprocesses need to be designed to be transferable from lab scale to production scale as well as between setups. Although substantial effort is invested to control technological parameters, usually the only true constant parameter is the actual producer of the product: the cell. Hence, instead of solely controlling technological process parameters, the focus should increasingly be placed on physiological parameters. This contribution aims at illustrating a workflow of data life cycle management with special focus on physiology. Information processing condenses the data into physiological variables, while information mining condenses the variables further into physiological descriptors. This basis facilitates data analysis for a physiological explanation for observed phenomena in productivity. Targeting transferability, we demonstrate this workflow using an industrially relevant Escherichia coli process for recombinant protein production and substantiate the following three points: (1) The postinduction phase is independent, in terms of productivity and physiology, of the preinduction variables specific growth rate and biomass at induction. (2) The specific substrate uptake rate during the induction phase was found to significantly impact the maximum specific product titer. (3) The time point of maximum specific titer can be predicted by an easily accessible physiological variable: while the maximum specific titers were reached at different time points (19.8 ± 7.6 h), those maxima were all reached within a very narrow window of cumulatively consumed substrate dSn (3.1 ± 0.3 g/g). In conclusion, this contribution provides a workflow for gaining a physiological view of the process and illustrates potential benefits. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 33:261-270, 2017. © 2016 American Institute of Chemical Engineers.
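
    As a sketch of how such physiological descriptors can be computed from routine time-series data, the example below derives a specific substrate uptake rate qS and a cumulative consumed-substrate variable from invented biomass and feed profiles; the definition of dSn used here is one plausible reading, not necessarily the authors' exact formulation.

```python
import numpy as np

# Illustrative time series (h, g/L); values are made up for this sketch, not the
# E. coli process data from the study.
t             = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])        # time since induction
biomass       = np.array([10.0, 12.0, 14.5, 17.0, 19.0, 20.5])    # g/L dry cell weight
substrate_fed = np.array([0.0, 8.0, 17.0, 27.0, 38.0, 50.0])      # cumulative g/L consumed

# Specific substrate uptake rate qS(t) = (dS/dt) / X  [g substrate / g biomass / h]
dS_dt = np.gradient(substrate_fed, t)
qS = dS_dt / biomass

# Cumulatively consumed substrate per biomass, dSn = integral of qS dt  [g/g]
dSn = np.concatenate(([0.0], np.cumsum(0.5 * (qS[1:] + qS[:-1]) * np.diff(t))))
print("qS [g/g/h]:", np.round(qS, 3))
print("dSn [g/g]:", np.round(dSn, 3))
```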

  8. Autonomous Control of Space Nuclear Reactors

    NASA Technical Reports Server (NTRS)

    Merk, John

    2013-01-01

    Nuclear reactors to support future robotic and manned missions impose new and innovative technological requirements for their control and protection instrumentation. Long-duration surface missions necessitate reliable autonomous operation, and manned missions impose added requirements for failsafe reactor protection. There is a need for an advanced instrumentation and control system for space-nuclear reactors that addresses both aspects of autonomous operation and safety. The Reactor Instrumentation and Control System (RICS) consists of two functionally independent systems: the Reactor Protection System (RPS) and the Supervision and Control System (SCS). Through these two systems, the RICS both supervises and controls a nuclear reactor during normal operational states, as well as monitors the operation of the reactor and, upon sensing a system anomaly, automatically takes the appropriate actions to prevent an unsafe or potentially unsafe condition from occurring. The RPS encompasses all electrical and mechanical devices and circuitry, from sensors to actuation device output terminals. The SCS contains a comprehensive data acquisition system to measure continuously different groups of variables consisting of primary measurement elements, transmitters, or conditioning modules. These reactor control variables can be categorized into two groups: those directly related to the behavior of the core (known as nuclear variables) and those related to secondary systems (known as process variables). Reliable closed-loop reactor control is achieved by processing the acquired variables and actuating the appropriate device drivers to maintain the reactor in a safe operating state. The SCS must prevent a deviation from the reactor nominal conditions by managing limitation functions in order to avoid RPS actions. The RICS has four identical redundancies that comply with physical separation, electrical isolation, and functional independence. This architecture complies with the safety requirements of a nuclear reactor and provides high availability to the host system. The RICS is intended to interface with a host computer (the computer of the spacecraft where the reactor is mounted). The RICS leverages the safety features inherent in Earth-based reactors and also integrates the wide range neutron detector (WRND). A neutron detector provides the input that allows the RICS to do its job. The RICS is based on proven technology currently in use at a nuclear research facility. In its most basic form, the RICS is a ruggedized, compact data-acquisition and control system that could be adapted to support a wide variety of harsh environments. As such, the RICS could be a useful instrument outside the scope of a nuclear reactor, including military applications where failsafe data acquisition and control is required with stringent size, weight, and power constraints.

  9. Document Preparation (for Filming). ERIC Processing Manual, Appendix B.

    ERIC Educational Resources Information Center

    Brandhorst, Ted, Ed.; And Others

    The technical report or "fugitive" literature collected by ERIC is produced using a wide variety of printing techniques, many formats, and variable degrees of quality control. Since the documents processed by ERIC go on to be microfilmed and reproduced in microfiche and paper copy for sale to users, it is essential that the ERIC document…

  10. Wood-based composites and panel products

    Treesearch

    John A. Youngquist

    1999-01-01

    Because wood properties vary among species, between trees of the same species, and between pieces from the same tree, solid wood cannot match reconstituted wood in the range of properties that can be controlled in processing. When processing variables are properly selected, the end result can sometimes surpass nature’s best effort. With solid wood, changes in...

  11. Effortful Control and Adaptive Functioning of Homeless Children: Variable-Focused and Person-Focused Analyses

    ERIC Educational Resources Information Center

    Obradovic, Jelena

    2010-01-01

    Homeless children show significant developmental delays across major domains of adaptation, yet research on protective processes that may contribute to resilient adaptation in this highly disadvantaged group of children is extremely rare. This study examined the role of effortful control for adaption in 58 homeless children, ages 5-6, during their…

  12. Method of operating an oil shale kiln

    DOEpatents

    Reeves, Adam A.

    1978-05-23

    Continuously determining the bulk density of raw and retorted oil shale, the specific gravity of the raw oil shale, and the richness of the raw oil shale provides an accurate means of controlling the process variables of oil shale retorting, predicting oil production, and determining mining strategy, and it aids in controlling shale placement in the kiln during retorting.

  13. LLRW disposal facility siting approaches: Connecticut's innovative volunteer approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forcella, D.; Gingerich, R.E.; Holeman, G.R.

    1994-12-31

    The Connecticut Hazardous Waste Management Service (CHWMS) has embarked on a volunteer approach to siting an LLRW disposal facility in Connecticut. This effort comes after an unsuccessful effort to site a facility using a step-wise, criteria-based site screening process that was a classic example of the decide/announce/defend approach. While some of the specific features of the CHWMS' volunteer process reflect the unique challenge presented by the state's physical characteristics, political structure and recent unsuccessful siting experience, the basic elements of the process are applicable to siting LLRW disposal facilities in many parts of the United States. The CHWMS' volunteer process is structured to reduce the "outrage" dimension of two of the variables that affect the public's perception of risk. The two variables are the degree to which the risk is taken on voluntarily (voluntary risks are accepted more readily than those that are imposed) and the amount of control one has over the risk (risks under individual control are accepted more readily than those under government control). In the volunteer process, the CHWMS will only consider sites that have been voluntarily offered by the community in which they are located, and the CHWMS will share control over the development and operation of the facility with the community. In addition to these elements, which have broad applicability, the CHWMS has tailored the volunteer approach to take advantage of the unique opportunities made possible by the earlier statewide site screening process. Specifically, the approach presents a "win-win" situation for elected officials in many communities if they decide to participate in the process.

  14. Multi-Objective Control Optimization for Greenhouse Environment Using Evolutionary Algorithms

    PubMed Central

    Hu, Haigen; Xu, Lihong; Wei, Ruihua; Zhu, Bingkun

    2011-01-01

    This paper investigates the issue of tuning the Proportional-Integral-Derivative (PID) controller parameters for a greenhouse climate control system using an Evolutionary Algorithm (EA) based on multiple performance measures, such as good static and dynamic performance specifications and smooth control action. A model of nonlinear thermodynamic laws between numerous system variables affecting the greenhouse climate is formulated. The proposed tuning scheme is tested for greenhouse climate control by minimizing the integrated time square error (ITSE) and the control increment or rate in a simulation experiment. The results show that by tuning the gain parameters the controllers can achieve good control performance in step responses, such as small overshoot, fast settling time, and reduced rise time and steady-state error. Moreover, the scheme can be applied to tuning systems with different properties, such as strong interactions among variables, nonlinearities, and conflicting performance criteria. The results indicate that multi-objective optimization algorithms provide an effective and promising tuning method for complex greenhouse production. PMID:22163927
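
    A minimal sketch of the tuning idea is shown below: a cost combining ITSE and a control-increment penalty is evaluated on a step response and minimized by a simple (1+1) evolutionary search. The plant is a toy first-order system, not the nonlinear greenhouse model, and the cost weights and search settings are assumptions.

```python
import numpy as np

def simulate_cost(kp, ki, kd, dt=0.1, steps=300):
    """Step-response cost for a toy first-order plant (not the greenhouse model):
    ITSE plus a small penalty on the control increment."""
    y, integ, prev_err, prev_u = 0.0, 0.0, 1.0, 0.0
    itse, du_pen = 0.0, 0.0
    for k in range(steps):
        t = k * dt
        err = 1.0 - y                        # unit setpoint
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        du_pen += (u - prev_u) ** 2
        y += dt * (-0.5 * y + 0.5 * u)       # first-order plant, time constant 2
        itse += t * err ** 2 * dt
        prev_err, prev_u = err, u
    return itse + 1e-3 * du_pen

# Minimal (1+1) evolutionary search over the PID gains
rng = np.random.default_rng(3)
best = np.array([1.0, 0.1, 0.0])
best_cost = simulate_cost(*best)
for gen in range(200):
    cand = np.clip(best + rng.normal(scale=0.2, size=3), 0.0, 10.0)
    cost = simulate_cost(*cand)
    if cost < best_cost:
        best, best_cost = cand, cost
print("tuned gains (kp, ki, kd):", np.round(best, 3), "cost:", round(best_cost, 4))
```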

  15. Discrimination Learning and the Effects of Interference on Short and Long Term Retention Process of Retarded and Normal Children. Final Report.

    ERIC Educational Resources Information Center

    Stukuls, Henry I.

    Eighteen retarded Ss (mean IQ 50 and mean age 14 years) and 18 normal Ss (mean IQ 100 and mean age 7 years) participated in a study to isolate variables that differentially control discrimination learning and retention processes, and to evaluate contrasting theories on discrimination learning and memory processes of retarded and normal children.…

  16. Advances in deep-UV processing using cluster tools

    NASA Astrophysics Data System (ADS)

    Escher, Gary C.; Tepolt, Gary; Mohondro, Robert D.

    1993-09-01

    Deep-UV laser lithography has shown the capability of supporting the manufacture of multiple generations of integrated circuits (ICs) due to its wide process latitude and depth of focus (DOF) for 0.2 micrometers to 0.5 micrometers feature sizes. This capability has been attained through improvements in deep-UV wide field lens technology, excimer lasers, steppers, and chemically amplified, positive deep-UV resists. Chemically amplified deep-UV resists are required for 248 nm lithography due to the poor absorption and sensitivity of conventional novolac resists. The acid-catalyzed processes of the new resists require control of the thermal history and environmental conditions of the lithographic process. Work is currently underway at several resist vendors to reduce the need for these controls, but practical manufacturing solutions exist today. One of these solutions is the integration of steppers and resist tracks into a 'cluster tool' or 'Lithocell' to ensure a consistent thermal profile for the resist process and reduce the time the resist is exposed to atmospheric contamination. The work here reports processing and system integration results with a Machine Technology, Inc. (MTI) post-exposure bake (PEB) track interfaced with an advanced GCA XLS 7800 deep-UV stepper [31 mm diameter, variable NA (0.35 - 0.53) and variable sigma (0.3 - 0.74)].

  17. Instrumenting an upland research catchment in Canterbury, New Zealand to study controls on variability of soil moisture, shallow groundwater and streamflow

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Srinivasan, Ms

    2015-04-01

    Hydrologists recognise the importance of vertical drainage and deep flow paths in runoff generation, even in headwater catchments. Both soil and groundwater stores are highly variable over multiple scales, and the distribution of water has a strong control on flow rates and timing. In this study, we instrumented an upland headwater catchment in New Zealand to measure the temporal and spatial variation in unsaturated and saturated-zone responses. In NZ, upland catchments are the source of much of the water used in lowland agriculture, but the hydrology of such catchments and their role in water partitioning, storage and transport is poorly understood. The study area is the Langs Gully catchment in the North Branch of the Waipara River, Canterbury: this catchment was chosen to be representative of the foothills environment, with lightly managed dryland pasture and native Matagouri shrub vegetation cover. Over a period of 16 months we measured continuous soil moisture at 32 locations and near-surface water table (< 2 m) at 14 locations, as well as measuring flow at 3 stream gauges. The distributed measurement sites were located to allow comparisons between North and South facing locations, near-stream versus hillslope locations, and convergent versus divergent hillslopes. We found that temporal variability is strongly controlled by the climatic seasonal cycle, for both soil moisture and water table, and for both the mean and extremes of their distributions. Groundwater is a larger water storage component than soil moisture, and the difference increases with catchment wetness. The spatial standard deviation of both soil moisture and groundwater is larger in winter than in summer. It peaks during rainfall events due to partial saturation of the catchment, and also rises in spring as different locations dry out at different rates. The most important controls on spatial variability are aspect and distance from stream. South-facing and near-stream locations have higher water tables and more, larger soil moisture wetting events. Typical hydrological models do not explicitly account for aspect, but our results suggest that it is an important factor in hillslope runoff generation. Co-measurement of soil moisture and water table level allowed us to identify interrelationships between the two. Locations where water tables peaked closest to the surface had consistently wetter soils and higher water tables. These wetter sites were the same across seasons. However, temporary patterns of strong soil moisture response to summer storms did not correspond to the wetter sites. Total catchment spatial variability is composed of multiple variability sources, and the dominant type is sensitive to those stores that are close to a threshold such as field capacity or saturation. Therefore, we classified spatial variability as 'summer mode' or 'winter mode'. In summer mode, variability is controlled by shallow processes e.g. interactions of water with soils and vegetation. In winter mode, variability is controlled by deeper processes e.g. groundwater movement and bypass flow. Double flow peaks observed during some events show the direct impact of groundwater variability on runoff generation. Our results suggest that emergent catchment behaviour depends on the combination of these multiple, time varying components of variability.

  18. Online analysis and process control in recombinant protein production (review).

    PubMed

    Palmer, Shane M; Kunji, Edmund R S

    2012-01-01

    Online analysis and control are essential for efficient and reproducible bioprocesses. A key factor in real-time control is the ability to measure critical variables rapidly. Online in situ measurements are the preferred option and minimize the potential loss of sterility. The challenge is to provide sensors with a good lifespan that withstand harsh bioprocess conditions, remain stable for the duration of a process without the need for recalibration, and offer a suitable working range. In recent decades, many new techniques have arisen that promise to extend the possibilities of analysis and control, not only by providing new parameters for analysis but also by improving accepted, well-practiced measurements.

  19. Evolutionary grinding model for nanometric control of surface roughness for aspheric optical surfaces.

    PubMed

    Han, Jeong-Yeol; Kim, Sug-Whan; Han, Inwoo; Kim, Geon-Hee

    2008-03-17

    A new evolutionary grinding process model has been developed for nanometric control of material removal from an aspheric surface of a Zerodur substrate. The model incorporates novel control features such as i) a growing database; ii) an evolving, multi-variable regression equation; and iii) an adaptive correction factor for target surface roughness (Ra) for the next machine run. This process model demonstrated a unique evolutionary controllability of machining performance, resulting in a final grinding accuracy (i.e., the average difference between target and measured surface roughness) of -0.2 ± 2.3 (σ) nm Ra over seven trial machine runs for target surface roughness values ranging from 115 nm to 64 nm Ra.
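
    The sketch below illustrates the three listed features in a generic way: a database that grows with each run, a multivariable regression refit on the full database, and a correction factor updated from the prediction error of the last run. The predictor variables, data values, and update rule are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Growing database of past runs: [wheel speed, feed rate, depth of cut] -> Ra (nm).
# The predictor choices and values are illustrative assumptions, not the paper's.
X_hist = [[3000, 10.0, 5.0], [3200, 12.0, 4.0], [2800, 9.0, 6.0], [3100, 11.0, 5.5]]
Ra_hist = [110.0, 95.0, 120.0, 102.0]
correction = 0.0   # adaptive correction factor (nm), updated after each run

def plan_next_run(target_Ra, candidates):
    """Refit the multivariable regression on the full (growing) database and pick
    the candidate settings whose corrected prediction is closest to the target Ra."""
    A = np.column_stack([np.ones(len(X_hist)), np.array(X_hist)])
    coef, *_ = np.linalg.lstsq(A, np.array(Ra_hist), rcond=None)
    preds = [coef[0] + np.dot(coef[1:], c) + correction for c in candidates]
    i = int(np.argmin([abs(p - target_Ra) for p in preds]))
    return candidates[i], preds[i]

# Plan a run targeting 90 nm Ra, then feed the measurement back into the database
settings, predicted = plan_next_run(90.0, [[3300, 12.5, 3.5], [3250, 12.0, 4.0]])
measured = 93.0                               # measured Ra after the machine run
X_hist.append(settings); Ra_hist.append(measured)
correction += 0.5 * (measured - predicted)    # adapt the correction factor
print("chosen settings:", settings, "predicted Ra:", round(predicted, 1), "measured Ra:", measured)
```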

  20. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ampomah, William; Balch, Robert; Will, Robert

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a scenario baseline case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU. The optimization process predicted more than 94% of CO2 storage and, most importantly, about 28% of incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.
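
    The sampling-plus-weighted-objective idea can be sketched generically as below, using Latin Hypercube sampling over a few control variables and a stand-in response surface in place of the reservoir simulator or neural-network proxy. Variable ranges, the surrogate function, and the objective weights are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import qmc

# Latin Hypercube samples over three control variables; ranges are illustrative.
# (bhp = bottom-hole injection pressure [psi], wag = WAG cycle length [months],
#  rate = gas injection group rate [MMscf/d])
sampler = qmc.LatinHypercube(d=3, seed=4)
unit = sampler.random(n=50)
lo, hi = np.array([2500.0, 1.0, 5.0]), np.array([4500.0, 12.0, 30.0])
designs = qmc.scale(unit, lo, hi)

def toy_surrogate(x):
    """Stand-in response surface returning (oil recovery %, CO2 stored %);
    a placeholder for the reservoir simulator or neural-network proxy."""
    bhp, wag, rate = x
    oil = 20 + 0.004 * (bhp - 2500) - 0.3 * abs(wag - 6) + 0.2 * rate
    stored = 60 + 0.006 * (bhp - 2500) + 0.5 * wag + 0.3 * rate
    return oil, min(stored, 100.0)

# Weighted co-optimization objective evaluated over the sampled designs
w_oil, w_co2 = 0.5, 0.5
scores = [w_oil * toy_surrogate(x)[0] + w_co2 * toy_surrogate(x)[1] for x in designs]
best = designs[int(np.argmax(scores))]
print("best sampled design (bhp, wag, rate):", np.round(best, 1))
```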

  1. Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty

    DOE PAGES

    Ampomah, William; Balch, Robert; Will, Robert; ...

    2017-07-01

    This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin Hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a scenario baseline case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU. The optimization process predicted more than 94% of CO2 storage and, most importantly, about 28% of incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved to be a robust approach to co-optimize oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.

  2. Analysis of the control structures for an integrated ethanol processor for proton exchange membrane fuel cell systems

    NASA Astrophysics Data System (ADS)

    Biset, S.; Nieto Deglioumini, L.; Basualdo, M.; Garcia, V. M.; Serra, M.

    The aim of this work is to investigate a good preliminary plantwide control structure for the process of hydrogen production from bioethanol to be used in a proton exchange membrane (PEM) fuel cell, using only steady-state information. The objective is to keep the process at the optimal operating point, that is, performing energy integration to achieve maximum efficiency. Ethanol, produced from renewable feedstocks, feeds a fuel processor based on steam reforming, followed by high- and low-temperature shift reactors and preferential oxidation, which are coupled to a polymeric fuel cell. Applying steady-state simulation techniques and thermodynamic models, the performance of the complete system with two different control structures has been evaluated for the most typical perturbations. A sensitivity analysis of the key process variables, together with the rigorous operability requirements of the fuel cell, is taken into account to define an acceptable plantwide control structure. This is the first work showing an alternative control structure applied to this kind of process.

  3. A multi-model ensemble view of winter heat flux dynamics and the dipole mode in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Liguori, Giovanni; Di Lorenzo, Emanuele; Cabos, William

    2017-02-01

    Changes in surface heat fluxes affect several climate processes controlling the Mediterranean climate. These include the winter formation of deep waters, which is the primary driver of the Mediterranean Sea overturning circulation. Previous studies that characterize the spatial and temporal variability of surface heat flux anomalies over the basin reveal the existence of two statistically dominant patterns of variability: a monopole of uniform sign and an east-west dipole of opposite signs. In this work, we use the 12 regional climate model ensemble from the EU-FP6 ENSEMBLES project to diagnose the large-scale atmospheric processes that control the variability of heat fluxes over the Mediterranean Sea from interannual to decadal timescales (here defined as timescales > 6 year). Our findings suggest that while the monopole structure captures variability in the winter-to-winter domain-average net heat flux, the dipole pattern tracks changes in the Mediterranean climate that are connected to the East Atlantic/Western Russia (EA/WR) atmospheric teleconnection pattern. Furthermore, while the monopole exhibits significant differences in the spatial structure across the multi-model ensemble, the dipole pattern is very robust and more clearly identifiable in the anomaly maps of individual years. A heat budget analysis of the dipole pattern reveals that changes in winds associated with the EA/WR pattern exert dominant control through both a direct effect on the latent heat flux (i.e., wind speed) and an indirect effect through specific humidity (e.g., wind advection). A simple reconstruction of the heat flux variability over the deep-water formation regions of the Gulf of Lion and the Aegean Sea reveals that the combination of the monopole and dipole time series explains over 90 % of the heat flux variance in these regions. Given the important role that surface heat flux anomalies play in deep-water formation and the regional climate, improving our knowledge on the dynamics controlling the leading modes of heat flux variability may enhance our predictability of the climate of the Mediterranean area.

  4. Timing matters: change depends on the stage of treatment in cognitive behavioral therapy for panic disorder with agoraphobia.

    PubMed

    Gloster, Andrew T; Klotsche, Jens; Gerlach, Alexander L; Hamm, Alfons; Ströhle, Andreas; Gauggel, Siegfried; Kircher, Tilo; Alpers, Georg W; Deckert, Jürgen; Wittchen, Hans-Ulrich

    2014-02-01

    The mechanisms of action underlying treatment are inadequately understood. This study examined 5 variables implicated in the treatment of panic disorder with agoraphobia (PD/AG): catastrophic agoraphobic cognitions, anxiety about bodily sensations, agoraphobic avoidance, anxiety sensitivity, and psychological flexibility. The relative importance of these process variables was examined across treatment phases: (a) psychoeducation/interoceptive exposure, (b) in situ exposure, and (c) generalization/follow-up. Data came from a randomized controlled trial of cognitive behavioral therapy for PD/AG (n = 301). Outcomes were the Panic and Agoraphobia Scale (Bandelow, 1995) and functioning as measured in the Clinical Global Impression scale (Guy, 1976). The effect of process variables on subsequent change in outcome variables was calculated using bivariate latent difference score modeling. Change in panic symptomatology was preceded by catastrophic appraisal and agoraphobic avoidance across all phases of treatment, by anxiety sensitivity during generalization/follow-up, and by psychological flexibility during exposure in situ. Change in functioning was preceded by agoraphobic avoidance and psychological flexibility across all phases of treatment, by fear of bodily symptoms during generalization/follow-up, and by anxiety sensitivity during exposure. The effects of process variables on outcomes differ across treatment phases and outcomes (i.e., symptomatology vs. functioning). Agoraphobic avoidance and psychological flexibility should be investigated and therapeutically targeted in addition to cognitive variables. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  5. Inferring Master Painters' Esthetic Biases from the Statistics of Portraits

    PubMed Central

    Aleem, Hassan; Correa-Herran, Ivan; Grzywacz, Norberto M.

    2017-01-01

    The Processing Fluency Theory posits that the ease of sensory information processing in the brain facilitates esthetic pleasure. Accordingly, the theory would predict that master painters should display biases toward visual properties such as symmetry, balance, and moderate complexity. Have these biases been occurring, and if so, have painters been optimizing these properties (fluency variables)? Here, we address these questions with statistics of portrait paintings from the Early Renaissance period. To do this, we first developed different computational measures for each of the aforementioned fluency variables. Then, we measured their statistics in 153 portraits from 26 master painters, in 27 photographs of people in three controlled poses, and in 38 quickly snapped photographs of individual persons. A statistical comparison between Early Renaissance portraits and quickly snapped photographs revealed that painters showed a bias toward balance, symmetry, and moderate complexity. However, a comparison between portraits and controlled-pose photographs showed that painters did not optimize each of these properties. Instead, different painters presented biases toward different, narrow ranges of fluency variables. Further analysis suggested that the painters' individuality stemmed in part from having to resolve the tension between complexity vs. symmetry and balance. We additionally found that constraints on the use of different painting materials by distinct painters modulated these fluency variables systematically. In conclusion, the Processing Fluency Theory of Esthetic Pleasure would need expansion if we were to apply it to the history of visual art since it cannot explain the lack of optimization of each fluency variable. To expand the theory, we propose the existence of a Neuroesthetic Space, which encompasses the possible values that each of the fluency variables can reach in any given art period. We discuss the neural mechanisms of this Space and propose that it has a distributed representation in the human brain. We further propose that different artists reside in different, small sub-regions of the Space. This Neuroesthetic-Space hypothesis raises the question of how painters and their paintings evolve across art periods. PMID:28337133

  6. Mediators of weight loss in a family-based intervention presented over the internet.

    PubMed

    White, Marney A; Martin, Pamela D; Newton, Robert L; Walden, Heather M; York-Crowe, Emily E; Gordon, Stewart T; Ryan, Donna H; Williamson, Donald A

    2004-07-01

    To assess the process variables involved in a weight loss program for African-American adolescent girls. Several process variables have been identified as affecting success in in vivo weight loss programs for adults and children, including program adherence, self-efficacy, and social support. The current study sought to broaden the understanding of these process variables as they pertain to an intervention program that is presented using the Internet. It was hypothesized that variables such as program adherence, dietary self-efficacy, psychological factors, and family environment factors would mediate the effect of the experimental condition on weight loss. Participants were 57 adolescent African-American girls who joined the program with one obese parent; family pairs were randomized to either a behavioral or control condition in an Internet-based weight loss program. Outcome data (weight loss) are reported for the first 6 months of the intervention. Results partially supported the hypotheses. For weight loss among adolescents, parent variables pertaining to life and family satisfaction were the strongest mediating variables. For parental weight loss, changes in dietary practices over the course of 6 months were the strongest mediators. The identification of factors that enhance or impede weight loss for adolescents is an important step in improving weight loss programs for this group. The current findings suggest that family/parental variables exert a strong influence on weight loss efforts for adolescents and should be considered in developing future programs. Copyright 2004 NAASO

  7. Study of Variable Frequency Induction Heating in Steel Making Process

    NASA Astrophysics Data System (ADS)

    Fukutani, Kazuhiko; Umetsu, Kenji; Itou, Takeo; Isobe, Takanori; Kitahara, Tadayuki; Shimada, Ryuichi

    Induction heating technologies have been the standard technologies employed in steel making processes because they are clean, they have a high energy density, and they are highly controllable. However, there is a problem in using them: in general, the frequencies of the electric circuits have to be kept fixed to improve their power factors, and this constraint makes the processes inflexible. In order to overcome this problem, we have developed a new heating technique: a variable frequency power supply with magnetic energy recovery switching. This technique helps improve both the quality of steel products and productivity. We have also performed numerical calculations and experiments to evaluate its effect on temperature distributions on heated steel plates. The obtained results indicate that the application of the technique in steel making processes would be advantageous.

  8. A Sequential Shifting Algorithm for Variable Rotor Speed Control

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Edwards, Jason M.; DeCastro, Jonathan A.

    2007-01-01

    A proof of concept of a continuously variable rotor speed control methodology for rotorcraft is described. Variable rotor speed is desirable for several reasons including improved maneuverability, agility, and noise reduction. However, it has been difficult to implement because turboshaft engines are designed to operate within a narrow speed band, and a reliable drive train that can provide continuous power over a wide speed range does not exist. The new methodology proposed here is a sequential shifting control for twin-engine rotorcraft that coordinates the disengagement and engagement of the two turboshaft engines in such a way that the rotor speed may vary over a wide range, but the engines remain within their prescribed speed bands and provide continuous torque to the rotor; two multi-speed gearboxes facilitate the wide rotor speed variation. The shifting process begins when one engine slows down and disengages from the transmission by way of a standard freewheeling clutch mechanism; the other engine continues to apply torque to the rotor. Once one engine disengages, its gear shifts, the multi-speed gearbox output shaft speed resynchronizes and it re-engages. This process is then repeated with the other engine. By tailoring the sequential shifting, the rotor may perform large, rapid speed changes smoothly, as demonstrated in several examples. The emphasis of this effort is on the coordination and control aspects for proof of concept. The engines, rotor, and transmission are all simplified linear models, integrated to capture the basic dynamics of the problem.

  9. Impact of Hydrologic Variability on Ecosystem Dynamics and the Sustainable Use of Soil and Water Resources

    NASA Astrophysics Data System (ADS)

    Porporato, A. M.

    2013-05-01

    We discuss the key processes by which hydrologic variability affects the probabilistic structure of soil moisture dynamics in water-controlled ecosystems. These in turn impact biogeochemical cycling and ecosystem structure through plant productivity and biodiversity as well as nitrogen availability and soil conditions. Once the long-term probabilistic structure of these processes is quantified, the results become useful to understand the impact of climatic changes and human activities on ecosystem services, and can be used to find optimal strategies of water and soil resources management under unpredictable hydro-climatic fluctuations. Particular applications regard soil salinization, phytoremediation and optimal stochastic irrigation.

  10. Channel Access in Erlang

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicklaus, Dennis J.

    2013-10-13

    We have developed an Erlang language implementation of the Channel Access protocol. Included are low-level functions for encoding and decoding Channel Access protocol network packets as well as higher level functions for monitoring or setting EPICS process variables. This provides access to EPICS process variables for the Fermilab Acnet control system via our Erlang-based front-end architecture without having to interface to C/C++ programs and libraries. Erlang is a functional programming language originally developed for real-time telecommunications applications. Its network programming features and list management functions make it particularly well-suited for the task of managing multiple Channel Access circuits and PV monitors.

  11. Lithologic Controls on Critical Zone Processes in a Variably Metamorphosed Shale-Hosted Watershed

    NASA Astrophysics Data System (ADS)

    Eldam Pommer, R.; Navarre-Sitchler, A.

    2017-12-01

    Local and regional shifts in thermal maturity within sedimentary shale systems impart significant variation in chemical and physical rock properties, such as pore-network morphology, mineralogy, organic carbon content, and solute release potential. Even slight variations in these properties on a watershed scale can strongly impact surface and shallow subsurface processes that drive soil formation, landscape evolution, and bioavailability of nutrients. Our ability to map and quantify the effects of this heterogeneity on critical zone processes is hindered by the complex coupling of the multi-scale nature of rock properties, geochemical signatures, and hydrological processes. This study addresses each of these complexities by synthesizing chemical and physical characteristics of variably metamorphosed shales in order to link rock heterogeneity with modern earth surface and shallow subsurface processes. More than 80 samples of variably metamorphosed Mancos Shale were collected in the East River Valley, Colorado, a headwater catchment of the Upper Colorado River Basin. Chemical and physical analyses of the samples show that metamorphism decreases overall rock porosity, pore anisotropy, and surface area, and introduces unique chemical signatures. All of these changes result in lower overall solute release from the Mancos Shale in laboratory dissolution experiments and a change in rock-derived solute chemistry with decreasing organic carbon and cation exchange capacity (Ca, Na, Mg, and K). The increase in rock competency and decrease in reactivity of the more thermally mature shales appear to subsequently control river morphology, with lower channel sinuosity associated with areas of the catchment underlain by metamorphosed Mancos Shale. This work illustrates the formative role of the geologic template on critical zone processes and landscape development within and across watersheds.

  12. Switching and optimizing control for coal flotation process based on a hybrid model

    PubMed Central

    Dong, Zhiyong; Wang, Ranfeng; Fan, Minqiang; Fu, Xiang

    2017-01-01

    Flotation is an important part of coal preparation, and the flotation column is widely applied as efficient flotation equipment. This process is complex and affected by many factors, with the froth depth and reagent dosage being two of the most important and frequently manipulated variables. This paper proposes a new method of switching and optimizing control for the coal flotation process. A hybrid model is built and evaluated using industrial data. First, wavelet analysis and principal component analysis (PCA) are applied for signal pre-processing. Second, a control model for optimizing the set point of the froth depth is constructed based on fuzzy control, and a control model is designed to optimize the reagent dosages based on expert system. Finally, the least squares-support vector machine (LS-SVM) is used to identify the operating conditions of the flotation process and to select one of the two models (froth depth or reagent dosage) for subsequent operation according to the condition parameters. The hybrid model is developed and evaluated on an industrial coal flotation column and exhibits satisfactory performance. PMID:29040305
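
    The switching layer can be sketched generically as below: a classifier identifies the operating condition from (pre-processed) measurements and dispatches to either a froth-depth or a reagent-dosage sub-controller. A standard RBF SVC stands in for the LS-SVM, and the features, labels, and controller outputs are invented placeholders, not the paper's models.

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative training data: two condition features (e.g., feed ash, air flow)
# with labels 0 = adjust froth-depth set point, 1 = adjust reagent dosage.
rng = np.random.default_rng(7)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

clf = SVC(kernel="rbf").fit(X, y)   # standard SVC as a stand-in for LS-SVM

def froth_depth_controller(features):
    return {"froth_depth_setpoint_cm": 40.0}      # placeholder fuzzy-control output

def reagent_dosage_controller(features):
    return {"collector_dosage_g_per_t": 120.0}    # placeholder expert-system output

def switching_control(features):
    """Select and run the appropriate sub-controller for the current condition."""
    mode = clf.predict(np.atleast_2d(features))[0]
    return froth_depth_controller(features) if mode == 0 else reagent_dosage_controller(features)

print(switching_control([0.8, -0.2]))
```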

  13. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

    The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) is important to achieve high product quality, a low frequency of failures, and cost reduction in a production process. However, some points about their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many control chart design studies consider only the economic aspect, whereas statistical restrictions must also be considered to achieve charts with low probabilities of falsely detecting failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, together with reductions in the sampling frequency of units for testing under SPC. PMID:23527082
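
    For reference, the statistical side of a joint X-bar-S design reduces to computing both charts' limits from subgroup data; the sketch below does this for simulated subgroups of size five using the standard chart constants (A3, B3, B4). The economic cost model and PM integration from the paper are not represented here.

```python
import numpy as np

# Simulated subgroups of size n = 5 of a measured quality characteristic;
# the data are illustrative, not from the paper's economic-statistical design.
rng = np.random.default_rng(5)
subgroups = rng.normal(loc=10.0, scale=0.2, size=(25, 5))

xbar = subgroups.mean(axis=1)
s = subgroups.std(axis=1, ddof=1)
xbar_bar, s_bar = xbar.mean(), s.mean()

# Standard control-chart constants for subgroup size n = 5
A3, B3, B4 = 1.427, 0.0, 2.089

xbar_limits = (xbar_bar - A3 * s_bar, xbar_bar + A3 * s_bar)
s_limits = (B3 * s_bar, B4 * s_bar)
print("X-bar chart: CL=%.3f, LCL=%.3f, UCL=%.3f" % (xbar_bar, *xbar_limits))
print("S chart:     CL=%.3f, LCL=%.3f, UCL=%.3f" % (s_bar, *s_limits))

out_of_control = np.where((xbar < xbar_limits[0]) | (xbar > xbar_limits[1]) |
                          (s < s_limits[0]) | (s > s_limits[1]))[0]
print("out-of-control subgroups:", out_of_control)
```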

  14. Field oriented control of induction motors

    NASA Technical Reports Server (NTRS)

    Burrows, Linda M.; Zinger, Don S.; Roth, Mary Ellen

    1990-01-01

    Induction motors have always been known for their simple, rugged construction, but until recently were not suitable for variable-speed or servo drives due to the inherent complexity of the controls. With the advent of field oriented control (FOC), however, the induction motor has become an attractive option for these types of drive systems. An FOC system that utilizes the pulse population modulation method to synthesize the motor drive frequencies is examined. This system allows for a variable voltage to frequency ratio and enables the user to have independent control of both the speed and torque of an induction motor. A second generation of the control boards was developed and tested, with the next point of focus being the minimization of the size and complexity of these controls. Many options were considered, with the best approach being the use of a digital signal processor (DSP) due to its inherent ability to quickly evaluate control algorithms. The present test results of the system and the status of the optimization process using a DSP are discussed.

  15. Biomechanics as a window into the neural control of movement

    PubMed Central

    2016-01-01

    Biomechanics and motor control are discussed as parts of a more general science, the physics of living systems. Major problems of biomechanics deal with the exact definition of variables and their experimental measurement. In motor control, major problems are associated with formulating currently unknown laws of nature specific for movements by biological objects. Mechanics-based hypotheses in motor control, such as those originating from notions of a generalized motor program and internal models, are non-physical. The famous problem of motor redundancy is wrongly formulated; it has to be replaced by the principle of abundance, which does not pose computational problems for the central nervous system. Biomechanical methods play a central role in motor control studies. This is illustrated with studies involving the reconstruction of hypothetical control variables and with studies exploring motor synergies within the framework of the uncontrolled manifold hypothesis. Biomechanics and motor control have to merge into the physics of living systems, and the earlier this process starts, the better. PMID:28149390

  16. Risk-Sensitivity in Sensorimotor Control

    PubMed Central

    Braun, Daniel A.; Nagengast, Arne J.; Wolpert, Daniel M.

    2011-01-01

    Recent advances in theoretical neuroscience suggest that motor control can be considered as a continuous decision-making process in which uncertainty plays a key role. Decision-makers can be risk-sensitive with respect to this uncertainty in that they may consider not only the average payoff of an outcome but also the variability of the payoffs. Although such risk-sensitivity is a well-established phenomenon in psychology and economics, it has been much less studied in motor control. In fact, leading theories of motor control, such as optimal feedback control, assume that motor behaviors can be explained as the optimization of a given expected payoff or cost. Here we review evidence that humans exhibit risk-sensitivity in their motor behaviors, thereby demonstrating sensitivity to the variability of “motor costs.” Furthermore, we discuss how risk-sensitivity can be incorporated into optimal feedback control models of motor control. We conclude that risk-sensitivity is an important concept in understanding individual motor behavior under uncertainty. PMID:21283556
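
    One common way to formalize this distinction is a mean-variance (risk-sensitive) objective in which a risk parameter weights the variability of the motor cost. The brief sketch below contrasts it with the risk-neutral expected cost; the cost samples and the risk parameter are illustrative assumptions, not values from the review.

    ```python
    import numpy as np

    def expected_cost(costs):
        """Risk-neutral criterion: only the mean cost matters."""
        return np.mean(costs)

    def risk_sensitive_cost(costs, theta=0.5):
        """Mean-variance criterion: theta > 0 = risk-averse, theta < 0 = risk-seeking."""
        return np.mean(costs) + 0.5 * theta * np.var(costs)

    rng = np.random.default_rng(3)
    safe = rng.normal(10.0, 1.0, 10_000)    # movement strategy with low cost variability
    risky = rng.normal(10.0, 4.0, 10_000)   # same average cost, high variability
    for name, c in [("safe", safe), ("risky", risky)]:
        print(name, round(expected_cost(c), 2), round(risk_sensitive_cost(c), 2))
    # A risk-neutral controller is indifferent; a risk-averse one prefers "safe".
    ```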

  17. Variability of suspended-sediment concentration at tidal to annual time scales in San Francisco Bay, USA

    USGS Publications Warehouse

    Schoellhamer, D.H.

    2002-01-01

    Singular spectrum analysis for time series with missing data (SSAM) was used to reconstruct components of a 6-yr time series of suspended-sediment concentration (SSC) from San Francisco Bay. Data were collected every 15 min and the time series contained missing values that primarily were due to sensor fouling. SSAM was applied in a sequential manner to calculate reconstructed components with time scales of variability that ranged from tidal to annual. Physical processes that controlled SSC and their contribution to the total variance of SSC were (1) diurnal, semidiurnal, and other higher frequency tidal constituents (24%), (2) semimonthly tidal cycles (21%), (3) monthly tidal cycles (19%), (4) semiannual tidal cycles (12%), and (5) annual pulses of sediment caused by freshwater inflow, deposition, and subsequent wind-wave resuspension (13%). Of the total variance 89% was explained and subtidal variability (65%) was greater than tidal variability (24%). Processes at subtidal time scales accounted for more variance of SSC than processes at tidal time scales because sediment accumulated in the water column and the supply of easily erodible bed sediment increased during periods of increased subtidal energy. This large range of time scales that each contained significant variability of SSC and associated contaminants can confound design of sampling programs and interpretation of resulting data.
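
    The decomposition described here rests on singular spectrum analysis. A minimal sketch of basic SSA is given below (complete data only, without the missing-value handling of SSAM); the window length and the synthetic example series are assumptions, not the study's settings.

    ```python
    import numpy as np

    def ssa_components(x, L, k=3):
        """Basic SSA: embed the series in an L-lagged trajectory matrix, take the SVD,
        and reconstruct the first k elementary components by anti-diagonal averaging."""
        x = np.asarray(x, float)
        N = len(x)
        K = N - L + 1
        X = np.column_stack([x[i:i + L] for i in range(K)])      # L x K trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        comps = []
        for i in range(k):
            Xi = s[i] * np.outer(U[:, i], Vt[i])                 # rank-1 elementary matrix
            comps.append(np.array([Xi[::-1, :].diagonal(t - (L - 1)).mean()
                                   for t in range(N)]))          # Hankelization
        return np.array(comps)

    # Example: tidal-band vs. slowly varying structure in a synthetic SSC-like series
    t = np.arange(2000)
    x = (10 + 3 * np.sin(2 * np.pi * t / 12.4) + 2 * np.sin(2 * np.pi * t / 350)
         + np.random.default_rng(0).normal(0, 0.5, t.size))
    rc = ssa_components(x, L=200, k=4)
    ```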

  18. Musculoskeletal motion flow fields using hierarchical variable-sized block matching in ultrasonographic video sequences.

    PubMed

    Revell, J D; Mirmehdi, M; McNally, D S

    2004-04-01

    We examine tissue deformations using non-invasive dynamic musculoskeletal ultrasonography, and quantify its performance on controlled in vitro gold-standard (ground-truth) sequences followed by clinical in vivo data. The proposed approach employs a two-dimensional variable-sized block matching algorithm with a hierarchical full search. We extend this process by refining displacements to sub-pixel accuracy. We show by application that this technique yields quantitatively reliable results.
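
    The displacement estimation step can be illustrated with a basic fixed-size, exhaustive SAD block-matching sketch; the hierarchical, variable-sized search and the sub-pixel refinement described in the record are omitted, and the block and search-window sizes below are arbitrary.

    ```python
    import numpy as np

    def block_match(ref, cur, block=16, search=8):
        """Exhaustive SAD block matching: for each block in `ref`, find the integer
        displacement (within +/- search pixels) that best matches `cur`."""
        H, W = ref.shape
        flow = np.zeros((H // block, W // block, 2), int)
        for by in range(0, H - block + 1, block):
            for bx in range(0, W - block + 1, block):
                patch = ref[by:by + block, bx:bx + block].astype(float)
                best, best_dv = np.inf, (0, 0)
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        y, x = by + dy, bx + dx
                        if 0 <= y and y + block <= H and 0 <= x and x + block <= W:
                            sad = np.abs(cur[y:y + block, x:x + block] - patch).sum()
                            if sad < best:
                                best, best_dv = sad, (dy, dx)
                flow[by // block, bx // block] = best_dv
        return flow

    # Example on two synthetic frames:
    rng = np.random.default_rng(2)
    frame0 = rng.integers(0, 255, size=(64, 64)).astype(float)
    frame1 = np.roll(frame0, shift=(2, -1), axis=(0, 1))     # known shift of (2, -1)
    print(block_match(frame0, frame1)[1, 1])
    ```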

  19. Interaction Between Ecohydrologic Dynamics and Microtopographic Variability Under Climate Change

    NASA Astrophysics Data System (ADS)

    Le, Phong V. V.; Kumar, Praveen

    2017-10-01

    Vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with responses to increased temperature and altered rainfall patterns, is expected to result in emergent behavior in ecologic and hydrologic functions. We hypothesize that microtopographic variability, comprising landscape features typically at length scales on the order of meters, such as topographic depressions, will play an important role in determining these dynamics by altering the persistence and variability of moisture. To investigate these emergent ecohydrologic dynamics, we develop a modeling framework, Dhara, which explicitly incorporates the control of microtopographic variability on vegetation, moisture, and energy dynamics. The intensive computational demand of such a framework, which couples multilayer modeling of the soil-vegetation continuum with 3-D surface-subsurface flow processes, is addressed using a hybrid CPU-GPU parallel computing approach. The study is performed for different climate change scenarios for an intensively managed agricultural landscape in central Illinois, USA, which is dominated by row-crop agriculture, primarily soybean (Glycine max) and maize (Zea mays). We show that rising CO2 concentration will decrease evapotranspiration, thus increasing soil moisture and surface water ponding in topographic depressions. However, increased atmospheric demand from higher air temperature overcomes this conservative behavior, resulting in a net increase of evapotranspiration and leading to reductions in both soil moisture storage and the persistence of ponding. These results shed light on the linkage between vegetation acclimation under climate change and microtopographic variability controls on ecohydrologic processes.

  20. Elucidating the functional relationship between working memory capacity and psychometric intelligence: a fixed-links modeling approach for experimental repeated-measures designs.

    PubMed

    Thomas, Philipp; Rammsayer, Thomas; Schweizer, Karl; Troche, Stefan

    2015-01-01

    Numerous studies have reported a strong link between working memory capacity (WMC) and fluid intelligence (Gf), although views differ with respect to how closely these two constructs are related. In the present study, we used a WMC task with five levels of task demands to assess the relationship between WMC and Gf by means of a new methodological approach referred to as fixed-links modeling. Fixed-links models belong to the family of confirmatory factor analysis (CFA) and are of particular interest for experimental, repeated-measures designs. With this technique, processes systematically varying across task conditions can be disentangled from processes unaffected by the experimental manipulation. Proceeding from the assumption that experimental manipulation in a WMC task leads to increasing demands on WMC, the processes systematically varying across task conditions can be assumed to be WMC-specific. Processes not varying across task conditions, on the other hand, are probably independent of WMC. Fixed-links models allow for representing these two kinds of processes by two independent latent variables. In contrast to traditional CFA, where a common latent variable is derived from the different task conditions, fixed-links models facilitate a more precise, or purified, representation of the WMC-related processes of interest. By using fixed-links modeling to analyze data from 200 participants, we identified a non-experimental latent variable, representing processes that remained constant irrespective of the WMC task conditions, and an experimental latent variable which reflected processes that varied as a function of experimental manipulation. This latter variable represents the increasing demands on WMC and, hence, was considered a purified measure of WMC controlled for the constant processes. Fixed-links modeling showed that both the purified measure of WMC (β = .48) and the constant processes involved in the task (β = .45) were related to Gf. Taken together, these two latent variables explained the same portion of variance in Gf as a single latent variable obtained by traditional CFA (β = .65), indicating that traditional CFA overestimates the effective relationship between WMC and Gf. Thus, fixed-links modeling provides a feasible method for a more valid investigation of the functional relationship between specific constructs.
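
    In schematic form, the fixed-links decomposition described above can be written as follows; the specific fixed loadings shown are illustrative, not the values used in the study.

    ```latex
    % Fixed-links decomposition of the five task-condition scores (i = 1,...,5):
    % one latent variable with loadings fixed to a constant, one with loadings
    % fixed to an increasing sequence; Gf is regressed on both latent variables.
    \begin{align*}
      x_i &= \lambda_{c,i}\,\eta_{\mathrm{const}} + \lambda_{e,i}\,\eta_{\mathrm{exp}} + \varepsilon_i,
        \qquad \lambda_{c,i} = 1,\quad \lambda_{e,i} = i, \\
      \mathrm{Gf} &= \beta_{\mathrm{const}}\,\eta_{\mathrm{const}} + \beta_{\mathrm{exp}}\,\eta_{\mathrm{exp}} + \zeta .
    \end{align*}
    ```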

  1. [Effect of a life review process to improve quality of life for the homebound elderly in Japan].

    PubMed

    Imuta, Hiromi; Yasumura, Seiji; Ahiko, Tadayuki

    2004-07-01

    This study examined the therapeutic effects of Life Review processes on physical and psychological functions of homebound elderly people in Japan. From 1998, a cohort of people aged 65 and over living in two cities in Yamagata Prefecture has been followed. Sixty-three subjects (24 men, 39 women) were classified as rank A (homebound). Fifty-two persons completed the baseline survey in 1999, and 46 eligible persons (18 men and 28 women) were allocated to intervention and control groups whose age and sex distributions were matched. The intervention entailed providing health information and a Life Review process for four months, twice a month on average. Each session started with provision of health information, followed by the Life Review process, which took an hour to complete. All subjects of both groups were assessed for dependent variables at the beginning and the end of the intervention period (pretest and post-test). Dependent variables were physical (Activities of Daily Living, visual deficit, and others), psychological (subjective health, life satisfaction, self-efficacy scale, and others), and social (functional ability and frequency of getting out of the house). The control group received only the pretest and the post-test. Pretest scores for all physical, psychological, and social variables did not significantly differ between the two groups. The rates of improvement/no change were higher for hearing deficit, ADL (eating, dressing), cognition, subjective health, ikigai, and frequency of getting out of the house in the intervention group than in the control group, but the differences were not significant. The developed intervention program, featuring delivery of health information and a structured Life Review process, had no negative influence on physical and psychosocial functions, and its practicability was suggested. However, the study highlights problems such as the selection of subjects and the duration and method of intervention.

  2. Evolution and Control of 2219 Aluminum Microstructural Features through Electron Beam Freeform Fabrication

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.; Hafley, Robert A.; Domack, Marcia S.

    2006-01-01

    Electron beam freeform fabrication (EBF3) is a new layer-additive process that has been developed for near-net shape fabrication of complex structures. EBF3 uses an electron beam to create a molten pool on the surface of a substrate. Wire is fed into the molten pool and the part is translated with respect to the beam to build up a 3-dimensional structure one layer at a time. Unlike many other freeform fabrication processes, the energy coupling of the electron beam is extremely well suited to processing of aluminum alloys. The layer-additive nature of the EBF3 process results in a tortuous thermal path, producing complex microstructures including small homogeneous equiaxed grains; dendritic growth contained within larger grains; and/or pervasive dendritic formation in the interpass regions of the deposits. Several process control variables contribute to the formation of these different microstructures, including translation speed, wire feed rate, beam current, and accelerating voltage. In electron beam processing, higher accelerating voltages embed the energy deeper below the surface of the substrate. Two EBF3 systems have been established at NASA Langley, one with a low-voltage (10-30 kV) and the other a high-voltage (30-60 kV) electron beam gun. Aluminum alloy 2219 was processed over a range of different variables to explore the design space and correlate the resultant microstructures with the processing parameters. This report specifically explores the impact of accelerating voltage. Of particular interest is correlating energy to the resultant material characteristics to determine the potential of achieving microstructural control through precise management of the heat flux and cooling rates during deposition.

  3. NASA Goddard Space Flight Center Robotic Processing System Program Automation Systems, volume 2

    NASA Technical Reports Server (NTRS)

    Dobbs, M. E.

    1991-01-01

    Topics related to robot operated materials processing in space (RoMPS) are presented in view graph form. Some of the areas covered include: (1) mission requirements; (2) automation management system; (3) Space Transportation System (STS) Hitchhiker Payload; (4) Spacecraft Command Language (SCL) scripts; (5) SCL software components; (6) RoMPS EasyLab Command & Variable summary for rack stations and annealer module; (7) support electronics assembly; (8) SCL uplink packet definition; (9) SC-4 EasyLab System Memory Map; (10) Servo Axis Control Logic Suppliers; and (11) annealing oven control subsystem.

  4. Role of optical computers in aeronautical control applications

    NASA Technical Reports Server (NTRS)

    Baumbick, R. J.

    1981-01-01

    The role that optical computers can play in aircraft control is examined. The optical computer has the potential to provide the high-speed capability required, especially for matrix-matrix operations. It also has the potential for handling nonlinear simulations in real time, and it is more compatible with fiber-optic signal transmission. Optics also permit the use of passive sensors to measure process variables; no electrical energy need be supplied to the sensor. Complex interfacing between optical sensors and the optical computer is avoided if the optical sensor outputs can be processed directly by the optical computer.

  5. Training attentional control in older adults.

    PubMed

    Mackay-Brandt, Anna

    2011-07-01

    Recent research has demonstrated benefits for older adults from training attentional control using a variable priority strategy, but the construct validity of the training task and the degree to which benefits of training transfer to other contexts are unclear. The goal of this study was to characterize baseline performance on the training task in a sample of 105 healthy older adults and to test for transfer of training in a subset (n = 21). Training gains after 5 days and the extent of transfer were compared to another subset (n = 20) that served as a control group. Baseline performance on the training task was characterized by a two-factor model of working memory and processing speed. Processing speed correlated with the training task. Training gains in speed and accuracy were reliable and robust (ps < .001, η² = .57 to .90). Transfer to an analogous task was observed (ps < .05, η² = .10 to .17). The beneficial effect of training did not translate to improved performance on related measures of processing speed. This study highlights the robust effect of training and transfer to a similar context using a variable priority training task. Although processing speed is an important aspect of the training task, training benefit is either related to an untested aspect of the training task or transfer of training is limited to the training context.

  6. Using statistical process control to make data-based clinical decisions.

    PubMed

    Pfadt, A; Wheeler, D J

    1995-01-01

    Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior, as well as the course of treatment, to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
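
    For a single behavioral time series, the individuals and moving-range (XmR) chart is one of the simplest Shewhart-style charts of the kind mentioned here. The sketch below computes its limits with the standard constants for subgroups of size two; the input would be, for example, daily counts of a target behavior (this is an illustrative example, not the procedure described in the paper).

    ```python
    import numpy as np

    def xmr_limits(x):
        """Individuals (X) and moving-range (mR) chart limits for a single series."""
        x = np.asarray(x, float)
        mr = np.abs(np.diff(x))
        mr_bar = mr.mean()
        center = x.mean()
        ucl_x, lcl_x = center + 2.66 * mr_bar, center - 2.66 * mr_bar   # 3/d2, d2 = 1.128
        ucl_mr = 3.267 * mr_bar                                         # D4 for n = 2
        return (lcl_x, center, ucl_x), (0.0, mr_bar, ucl_mr)

    # Example: hypothetical daily counts of a target behavior
    counts = [4, 6, 5, 7, 5, 4, 6, 12, 5, 6, 4, 5]
    print(xmr_limits(counts))   # a value outside the X limits signals a special cause
    ```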

  7. Data analytics using canonical correlation analysis and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Rickman, Jeffrey M.; Wang, Yan; Rollett, Anthony D.; Harmer, Martin P.; Compson, Charles

    2017-07-01

    A canonical correlation analysis is a generic parametric model used in the statistical analysis of data involving interrelated or interdependent input and output variables. It is especially useful in data analytics as a dimensional reduction strategy that simplifies a complex, multidimensional parameter space by identifying a relatively few combinations of variables that are maximally correlated. One shortcoming of the canonical correlation analysis, however, is that it provides only a linear combination of variables that maximizes these correlations. With this in mind, we describe here a versatile, Monte Carlo based methodology that is useful in identifying non-linear functions of the variables that lead to strong input/output correlations. We demonstrate that our approach leads to a substantial enhancement of correlations, as illustrated by two experimental applications of substantial interest to the materials science community, namely: (1) determining the interdependence of processing and microstructural variables associated with doped polycrystalline aluminas, and (2) relating microstructural descriptors to the electrical and optoelectronic properties of thin-film solar cells based on CuInSe2 absorbers. Finally, we describe how this approach facilitates experimental planning and process control.
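
    A toy version of this idea can be sketched with scikit-learn's CCA plus a simple Monte Carlo search over nonlinear (power) transforms of the input columns; the data, the candidate transforms, and the search budget below are illustrative assumptions, not the paper's method.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    def leading_canonical_corr(X, Y):
        """Fit a one-component CCA and return the correlation of the first variate pair."""
        cca = CCA(n_components=1).fit(X, Y)
        Xc, Yc = cca.transform(X, Y)
        return np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))                                     # e.g. processing variables
    Y = 0.5 * X[:, :2] ** 2 + rng.normal(scale=0.3, size=(200, 2))    # nonlinear responses

    base = leading_canonical_corr(X, Y)
    best, best_p = base, np.ones(X.shape[1])
    for _ in range(500):                       # Monte Carlo proposals of per-column power transforms
        p = rng.choice([0.5, 1.0, 2.0, 3.0], size=X.shape[1])
        r = leading_canonical_corr(np.sign(X) * np.abs(X) ** p, Y)
        if r > best:
            best, best_p = r, p
    print(f"linear CCA r = {base:.2f}, best transformed r = {best:.2f}, powers = {best_p}")
    ```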

  8. Human θ burst stimulation enhances subsequent motor learning and increases performance variability.

    PubMed

    Teo, James T H; Swayne, Orlando B C; Cheeran, Binith; Greenwood, Richard J; Rothwell, John C

    2011-07-01

    Intermittent theta burst stimulation (iTBS) transiently increases motor cortex excitability in healthy humans by a process thought to involve synaptic long-term potentiation (LTP), and this is enhanced by nicotine. Acquisition of a ballistic motor task is likewise accompanied by increased excitability and presumed intracortical LTP. Here, we test how iTBS and nicotine influence subsequent motor learning. Ten healthy subjects participated in a double-blinded, placebo-controlled trial testing the effects of iTBS and nicotine. iTBS alone increased the rate of learning, but this increase was blocked by nicotine. We then investigated factors other than synaptic strengthening that may play a role. Behavioral analysis and modeling suggested that iTBS increased performance variability, which correlated with learning outcome. A control experiment confirmed the increase in motor output variability by showing that iTBS increased the dispersion of involuntary transcranial magnetic stimulation-evoked thumb movements. We suggest that in addition to the effect on synaptic plasticity, iTBS may have facilitated performance by increasing motor output variability; nicotine negated this effect on variability, perhaps via increasing the signal-to-noise ratio in cerebral cortex.

  9. Do attentional capacities and processing speed mediate the effect of age on executive functioning?

    PubMed

    Gilsoul, Jessica; Simon, Jessica; Hogge, Michaël; Collette, Fabienne

    2018-02-06

    Executive processes are well known to decline with age, and similar findings exist for attentional capacities and processing speed. We therefore investigated whether these latter two nonexecutive variables mediate the effect of age on executive functions (inhibition, shifting, updating, and dual-task coordination). We administered a large battery of executive, attentional, and processing speed tasks to 104 young and 71 older people, and we performed mediation analyses with variables showing a significant age effect. All executive and processing speed measures showed age-related effects, while only visual scanning performance (selective attention) was explained by age when controlling for gender and educational level. Regarding the mediation analyses, visual scanning partially mediated the age effect on updating, while processing speed partially mediated the age effect on shifting, updating, and dual-task coordination. In a more exploratory way, inhibition was also found to partially mediate the effect of age on the three other executive functions. Attention did not greatly influence executive functioning in aging while, in agreement with the literature, processing speed appears to be a major mediator of the age effect on these processes. Interestingly, the global pattern of results also indicates an influence of inhibition, but further studies are needed to confirm the role of that variable as a mediator and its relative importance in comparison with processing speed.
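
    The mediation logic used here (a total effect decomposed into direct and indirect paths) can be sketched with ordinary regressions; the data below are synthetic and the variable names are placeholders, not the study's measures.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 175
    age = rng.uniform(20, 80, n)                          # hypothetical predictor
    speed = -0.03 * age + rng.normal(scale=0.5, size=n)   # processing-speed mediator
    exec_fn = 0.8 * speed - 0.005 * age + rng.normal(scale=0.5, size=n)

    a = sm.OLS(speed, sm.add_constant(age)).fit().params[1]                   # age -> mediator
    m = sm.OLS(exec_fn, sm.add_constant(np.column_stack([age, speed]))).fit()
    c_prime, b = m.params[1], m.params[2]                                     # direct path, mediator path
    c = sm.OLS(exec_fn, sm.add_constant(age)).fit().params[1]                 # total effect
    print(f"total={c:.3f}, direct={c_prime:.3f}, indirect=a*b={a * b:.3f}")
    ```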

  10. Gate sequence for continuous variable one-way quantum computation

    PubMed Central

    Su, Xiaolong; Hao, Shuhong; Deng, Xiaowei; Ma, Lingyu; Wang, Meihong; Jia, Xiaojun; Xie, Changde; Peng, Kunchi

    2013-01-01

    Measurement-based one-way quantum computation using cluster states as resources provides an efficient model to perform computation and information processing of quantum codes. Arbitrary Gaussian quantum computation can be implemented by sufficiently long single-mode and two-mode gate sequences. However, continuous variable gate sequences had not been realized previously due to the absence of cluster states larger than four submodes. Here we present the first continuous variable gate sequence consisting of a single-mode squeezing gate and a two-mode controlled-phase gate based on a six-mode cluster state. The quantum property of this gate sequence is confirmed by the fidelities and the quantum entanglement of the two output modes, which depend on both the squeezing and controlled-phase gates. The experiment demonstrates the feasibility of implementing Gaussian quantum computation by means of accessible gate sequences.

  11. Selection of internal control genes for quantitative real-time RT-PCR studies during tomato development process

    PubMed Central

    Expósito-Rodríguez, Marino; Borges, Andrés A; Borges-Pérez, Andrés; Pérez, José A

    2008-01-01

    Background The elucidation of gene expression patterns leads to a better understanding of biological processes. Real-time quantitative RT-PCR has become the standard method for in-depth studies of gene expression. A biologically meaningful reporting of target mRNA quantities requires accurate and reliable normalization in order to identify real gene-specific variation. The purpose of normalization is to control several variables such as different amounts and quality of starting material, variable enzymatic efficiencies of reverse transcription from RNA to cDNA, or differences between tissues or cells in overall transcriptional activity. The validity of a housekeeping gene as an endogenous control relies on the stability of its expression level across the sample panel being analysed. In the present report we describe the first systematic evaluation of potential internal controls during the tomato development process to identify which are the most reliable for transcript quantification by real-time RT-PCR. Results In this study, we assess the expression stability of 7 traditional and 4 novel housekeeping genes in a set of 27 samples representing different tissues and organs of tomato plants at different developmental stages. First, we designed, tested and optimized amplification primers for real-time RT-PCR. Then, expression data from each candidate gene were evaluated with three complementary approaches based on different statistical procedures. Our analysis suggests that SGN-U314153 (CAC), SGN-U321250 (TIP41), SGN-U346908 ("Expressed") and SGN-U316474 (SAND) genes provide superior transcript normalization in tomato development studies. We recommend different combinations of these exceptionally stable housekeeping genes for suitable normalization of different developmental series, including the complete tomato development process. Conclusion This work constitutes the first effort to select optimal endogenous controls for quantitative real-time RT-PCR studies of gene expression during the tomato development process. From our study, a tool-kit of control genes emerges that outperforms the traditional genes in terms of expression stability. PMID:19102748
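
    Expression-stability ranking of candidate reference genes is typically based on pairwise variation measures; a simplified, geNorm-style sketch is shown below with synthetic log2 quantities. It is only an illustration of the general idea, not the three statistical procedures actually used in the study.

    ```python
    import numpy as np

    def genorm_m(log2_expr):
        """Simplified geNorm-style stability measure M for candidate reference genes.
        log2_expr: samples x genes array of log2 relative quantities.
        Lower M means more stable expression across the sample panel."""
        E = np.asarray(log2_expr, float)
        n_genes = E.shape[1]
        M = np.empty(n_genes)
        for j in range(n_genes):
            ratio_sds = [np.std(E[:, j] - E[:, k], ddof=1)
                         for k in range(n_genes) if k != j]
            M[j] = np.mean(ratio_sds)
        return M

    rng = np.random.default_rng(7)
    panel = rng.normal(0, [0.2, 0.25, 0.8, 1.1], size=(27, 4))   # 27 samples, 4 candidates
    print(genorm_m(panel))   # the two low-variance candidates should score lowest
    ```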

  12. 40 CFR 439.12 - Effluent limitations attainable by the application of the best practicable control technology...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... percent reduction in the long-term average daily BOD5 load of the raw (untreated) process wastewater, multiplied by a variability factor of 3.0. (1) The long-term average daily BOD5 load of the raw process... concentration value reflecting a reduction in the long-term average daily COD load in the raw (untreated...

  13. 40 CFR 439.12 - Effluent limitations attainable by the application of the best practicable control technology...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... percent reduction in the long-term average daily BOD5 load of the raw (untreated) process wastewater, multiplied by a variability factor of 3.0. (1) The long-term average daily BOD5 load of the raw process... concentration value reflecting a reduction in the long-term average daily COD load in the raw (untreated...

  14. The Ecology of Older Adult Locus of Control, Mindlessness, and Self-Esteem: A Review of Research and Educational Implications.

    ERIC Educational Resources Information Center

    Schiamberg, Lawrence B.; And Others

    A review of research literature pertaining to locus of control in older adults and its application to social and educational settings indicates that reliable generalizations about the self-concept of older adults require a careful consideration of both personal and situational variables. Four separate processes are useful in understanding the…

  15. Apparatus and process for active pulse intensity control of laser beam

    DOEpatents

    Wilcox, Russell B.

    1992-01-01

    An optically controlled laser pulse energy control apparatus and process is disclosed wherein variations in the energy of a portion of the laser beam are used to vary the resistance of a photodetector such as a photoresistor through which a control voltage is fed to a light intensity controlling device through which a second portion of the laser beam passes. Light attenuation means are provided to vary the intensity of the laser light used to control the resistance of the photodetector. An optical delay path is provided through which the second portion of the beam travels before reaching the light intensity controlling device. The control voltage is supplied by a variable power supply. The apparatus may be tuned to properly attenuate the laser beam passing through the intensity controlling device by adjusting the power supply, the optical delay path, or the light attenuating means.

  16. Robotic Variable Polarity Plasma Arc (VPPA) Welding

    NASA Technical Reports Server (NTRS)

    Jaffery, Waris S.

    1993-01-01

    The need for automated plasma welding was identified in the early stages of the Space Station Freedom Program (SSFP) because it requires approximately 1.3 miles of welding for assembly. As a result of the Variable Polarity Plasma Arc Welding (VPPAW) process's ability to make virtually defect-free welds in aluminum, it was chosen to fulfill the welding needs. Space Station Freedom will be constructed of 2219 aluminum utilizing the computer controlled VPPAW process. The 'Node Radial Docking Port', with its saddle-shaped weld path, has a constantly changing surface angle over 360 deg of the 282 inch weld. The automated robotic VPPAW process requires eight axes of motion (six axes of robot and two axes of positioner movement). The robot control system is programmed to maintain Torch Center Point (TCP) orientation perpendicular to the part while the part positioner is tilted and rotated to maintain the vertical-up orientation required by the VPPAW process. The combined speeds of the robot and the positioner are integrated to maintain a constant speed between the part and the torch. A laser-based vision sensor system has also been integrated to track the seam and map the surface of the profile during welding.

  17. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into variable-specific meanings and directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitate the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.

  18. Robotic Variable Polarity Plasma Arc (VPPA) welding

    NASA Astrophysics Data System (ADS)

    Jaffery, Waris S.

    1993-02-01

    The need for automated plasma welding was identified in the early stages of the Space Station Freedom Program (SSFP) because it requires approximately 1.3 miles of welding for assembly. As a result of the Variable Polarity Plasma Arc Welding (VPPAW) process's ability to make virtually defect-free welds in aluminum, it was chosen to fulfill the welding needs. Space Station Freedom will be constructed of 2219 aluminum utilizing the computer controlled VPPAW process. The 'Node Radial Docking Port', with its saddle-shaped weld path, has a constantly changing surface angle over 360 deg of the 282 inch weld. The automated robotic VPPAW process requires eight axes of motion (six axes of robot and two axes of positioner movement). The robot control system is programmed to maintain Torch Center Point (TCP) orientation perpendicular to the part while the part positioner is tilted and rotated to maintain the vertical-up orientation required by the VPPAW process. The combined speeds of the robot and the positioner are integrated to maintain a constant speed between the part and the torch. A laser-based vision sensor system has also been integrated to track the seam and map the surface of the profile during welding.

  19. Current aspects of Salmonella contamination in the US poultry production chain and the potential application of risk strategies in understanding emerging hazards.

    PubMed

    Rajan, Kalavathy; Shi, Zhaohao; Ricke, Steven C

    2017-05-01

    One of the leading causes of foodborne illness in poultry products is Salmonella enterica. Salmonella hazards in poultry may be estimated and possible control methods modeled and evaluated through the use of quantitative microbiological risk assessment (QMRA) models and tools. From farm to table, there are many possible routes of Salmonella dissemination and contamination in poultry. From the time chicks are hatched through growth, transportation, processing, storage, preparation, and finally consumption, the product could be contaminated through exposure to different materials and sources. Examination of each step of the process is necessary as well as an examination of the overall picture to create effective countermeasures against contamination and prevent disease. QMRA simulation models can use either point estimates or probability distributions to examine variables such as Salmonella concentrations at retail or at any given point of processing to gain insight on the chance of illness due to Salmonella ingestion. For modeling Salmonella risk in poultry, it is important to look at variables such as Salmonella transfer and cross contamination during processing. QMRA results may be useful for the identification and control of critical sources of Salmonella contamination.
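
    A probability-distribution treatment of the kind QMRA tools support can be sketched with a small Monte Carlo simulation and an exponential dose-response model; every input value below is an illustrative assumption for demonstration, not a validated parameter from the literature.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    # Illustrative (not literature-validated) inputs:
    conc = rng.lognormal(mean=-2.0, sigma=1.0, size=n)      # Salmonella CFU/g on raw product
    serving = rng.normal(150, 30, size=n).clip(min=10)      # g consumed per serving
    log_reduction = rng.uniform(3, 6, size=n)               # cooking effect, log10 reduction
    dose = conc * serving * 10.0 ** (-log_reduction)        # ingested CFU per serving

    r = 2.1e-3                                              # exponential dose-response parameter (assumed)
    p_ill = 1.0 - np.exp(-r * dose)                         # probability of illness per serving
    print(f"mean risk per serving ~ {p_ill.mean():.2e}, "
          f"95th percentile ~ {np.percentile(p_ill, 95):.2e}")
    ```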

  20. Implicit theories about interrelations of anger components in 25 countries.

    PubMed

    Alonso-Arbiol, Itziar; van de Vijver, Fons J R; Fernandez, Itziar; Paez, Dario; Campos, Miryam; Carrera, Pilar

    2011-02-01

    We were interested in the cross-cultural comparison of implicit theories of the interrelations of eight anger components (antecedents, body sensations, cognitive reactions, verbal expressions, nonverbal expressions, interpersonal responses, and primary and secondary self-control). Self-report scales of each of these components were administered to a total of 5,006 college students in 25 countries. Equivalence of the scales was supported in that scales showed acceptable congruence coefficients in almost all comparisons. A multigroup confirmatory factor model with three latent variables (labeled internal processes, behavioral outcomes, and self-control mechanisms) could well account for the interrelations of the eight observed variables; measurement and structural weights were invariant. Behavioral outcomes and self-control mechanisms were only associated through their common dependence on internal processes. Verbal expressions and cognitive reactions showed the largest cross-cultural differences in means, whereas self-control mechanisms scales showed the smallest differences. Yet, cultural differences between the countries were small. It is concluded that anger, as measured by these scales, shows more pronounced cross-cultural similarities than differences in terms of both interrelations and mean score levels. PsycINFO Database Record (c) 2011 APA, all rights reserved.

  1. Variability of 13C-labeling in plant leaves.

    PubMed

    Nguyen Tu, Thanh Thuy; Biron, Philippe; Maseyk, Kadmiel; Richard, Patricia; Zeller, Bernd; Quénéa, Katell; Alexis, Marie; Bardoux, Gérard; Vaury, Véronique; Girardin, Cyril; Pouteau, Valérie; Billiou, Daniel; Bariac, Thierry

    2013-09-15

    Plant tissues artificially labeled with (13)C are increasingly used in environmental studies to unravel biogeochemical and ecophysiological processes. However, the variability of (13)C-content in labeled tissues has never been carefully investigated. Hence, this study aimed at documenting the variability of (13)C-content in artificially labeled leaves. European beech and Italian ryegrass were subjected to long-term (13)C-labeling in a controlled-environment growth chamber. The (13)C-content of the leaves obtained after several months of labeling was determined by isotope ratio mass spectrometry. The (13)C-content of the labeled leaves exhibited inter- and intra-leaf variability much higher than that naturally occurring in unlabeled plants, which does not exceed a few per mil. This variability was correlated with labeling intensity: the isotope composition of leaves varied over ranges of ca 60‰ and 90‰ for experiments that led to average leaf (13)C-contents of ca +15‰ and +450‰, respectively. The reported variability of isotope composition in (13)C-enriched leaves is critical, and should be taken into account in subsequent experimental investigations of environmental processes using (13)C-labeled plant tissues. Copyright © 2013 John Wiley & Sons, Ltd.

  2. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
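
    The split between control functions and description functions can be illustrated with a toy, BMI-flavored component. The class below is a stripped-down sketch, not the full CSDMS Basic Model Interface, and the variable names only imitate the style of the CSDMS Standard Names.

    ```python
    class LinearReservoir:
        """A toy, BMI-flavored hydrologic component (a sketch, not the full CSDMS BMI)."""

        _inputs = ("atmosphere_water__precipitation_rate",)   # standard-name-style label (illustrative)
        _outputs = ("channel_water__outflow_rate",)           # standard-name-style label (illustrative)

        # Control functions: make the model fully controllable by a caller.
        def initialize(self, k=0.1, storage=0.0):
            self.k, self.storage, self.time, self.precip = k, storage, 0.0, 0.0

        def update(self, dt=1.0):
            self.storage += (self.precip - self.k * self.storage) * dt
            self.time += dt

        def finalize(self):
            pass

        # Description and data-access functions: make the model self-describing.
        def get_input_var_names(self):
            return self._inputs

        def get_output_var_names(self):
            return self._outputs

        def set_value(self, name, value):
            if name == self._inputs[0]:
                self.precip = value

        def get_value(self, name):
            if name == self._outputs[0]:
                return self.k * self.storage

    # A lightweight "framework" only needs the standardized calls above:
    comp = LinearReservoir()
    comp.initialize()
    for precip in [5.0, 0.0, 2.0, 0.0]:
        comp.set_value("atmosphere_water__precipitation_rate", precip)
        comp.update()
        print(comp.time, round(comp.get_value("channel_water__outflow_rate"), 3))
    comp.finalize()
    ```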

  3. Modeling of an Adjustable Beam Solid State Light Project

    NASA Technical Reports Server (NTRS)

    Clark, Toni

    2015-01-01

    This proposal is for the development of a computational model of a prototype variable beam light source using optical modeling software, Zemax Optics Studio. The variable beam light source would be designed to generate flood, spot, and directional beam patterns, while maintaining the same average power usage. The optical model would demonstrate the possibility of such a light source and its ability to address several issues: commonality of design, human task variability, and light source design process improvements. An adaptive lighting solution that utilizes the same electronics footprint and power constraints while addressing variability of lighting needed for the range of exploration tasks can save costs and allow for the development of common avionics for lighting controls.

  4. Structure of weakly 2-dependent siphons

    NASA Astrophysics Data System (ADS)

    Chao, Daniel Yuh; Chen, Jiun-Ting

    2013-09-01

    Deadlocks arising from insufficiently marked siphons in flexible manufacturing systems can be controlled by adding a monitor to each siphon, which becomes impractical for large systems. Li and Zhou add monitors to elementary siphons only, controlling the remaining (dependent) siphons by adjusting the control depth variables of the elementary siphons, so that only a linear number of monitors is required. The control of weakly dependent siphons (WDSs) is rather conservative since only positive terms were considered. The structure of strongly dependent siphons (SDSs) has been studied earlier; based on this structure, the optimal sequence for adding monitors was discovered, and better controllability was achieved, giving faster and more permissive control. The results have been extended earlier to S3PGR2 (systems of simple sequential processes with general resource requirements). This paper explores the structures of WDSs, which, as found in this paper, involve elementary resource circuits interconnecting at more than (for SDSs, exactly) one resource place. This saves the time needed to compute compound siphons, their complementary sets, and T-characteristic vectors. It also allows us (1) to improve the controllability of WDSs and control siphons and (2) to avoid the time needed to find independent vectors for elementary siphons. We propose a sufficient and necessary test for adjusting control depth variables in S3PR (systems of simple sequential processes with resources) to avoid the sufficient-only, time-consuming linear integer programming (LIP) test (an NP-complete problem) required previously for some cases.

  5. Modeling of feed-forward control using the partial least squares regression method in the tablet compression process.

    PubMed

    Hattori, Yusuke; Otsuka, Makoto

    2017-05-30

    In the pharmaceutical industry, the implementation of continuous manufacturing has been widely promoted in lieu of the traditional batch manufacturing approach. More specifically, in recent years, the innovative concept of feed-forward control has been introduced in relation to process analytical technology. In the present study, we successfully developed a feed-forward control model for the tablet compression process by integrating data obtained from near-infrared (NIR) spectra and the physical properties of granules. In the pharmaceutical industry, batch manufacturing routinely allows for the preparation of granules with the desired properties through the manual control of process parameters. On the other hand, continuous manufacturing demands the automatic determination of these process parameters. Here, we proposed the development of a control model using the partial least squares regression (PLSR) method. The most significant feature of this method is the use of a dataset integrating both the NIR spectra and the physical properties of the granules. Using our model, we determined that the properties of the products, such as tablet weight and thickness, need to be included as independent variables in the PLSR analysis in order to predict unknown process parameters. Copyright © 2017 Elsevier B.V. All rights reserved.
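
    The prediction step can be illustrated with scikit-learn's PLS regression; the synthetic spectra, granule properties, and process parameter below are placeholders for the kinds of inputs and outputs the record describes, not the study's data.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n = 120
    nir = rng.normal(size=(n, 200))                    # NIR spectra (200 wavelengths, synthetic)
    granule = rng.normal(size=(n, 3))                  # e.g. particle size, density, moisture
    target = rng.normal(size=(n, 2))                   # desired tablet weight and thickness
    X = np.hstack([nir, granule, target])              # independent variables
    Y = (0.5 * granule[:, :1] + 0.3 * target[:, :1]
         + rng.normal(scale=0.1, size=(n, 1)))         # process parameter, e.g. compression force

    pls = PLSRegression(n_components=5).fit(X, Y)
    suggested_setting = pls.predict(X[:1])             # feed-forward: suggest a parameter for a new batch
    print(suggested_setting)
    ```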

  6. Cholinergic enhancement reduces functional connectivity and BOLD variability in visual extrastriate cortex during selective attention.

    PubMed

    Ricciardi, Emiliano; Handjaras, Giacomo; Bernardi, Giulio; Pietrini, Pietro; Furey, Maura L

    2013-01-01

    Enhancing cholinergic function improves performance on various cognitive tasks and alters neural responses in task specific brain regions. We have hypothesized that the changes in neural activity observed during increased cholinergic function reflect an increase in neural efficiency that leads to improved task performance. The current study tested this hypothesis by assessing neural efficiency based on cholinergically-mediated effects on regional brain connectivity and BOLD signal variability. Nine subjects participated in a double-blind, placebo-controlled crossover fMRI study. Following an infusion of physostigmine (1 mg/h) or placebo, echo-planar imaging (EPI) was conducted as participants performed a selective attention task. During the task, two images comprised of superimposed pictures of faces and houses were presented. Subjects were instructed periodically to shift their attention from one stimulus component to the other and to perform a matching task using hand held response buttons. A control condition included phase-scrambled images of superimposed faces and houses that were presented in the same temporal and spatial manner as the attention task; participants were instructed to perform a matching task. Cholinergic enhancement improved performance during the selective attention task, with no change during the control task. Functional connectivity analyses showed that the strength of connectivity between ventral visual processing areas and task-related occipital, parietal and prefrontal regions reduced significantly during cholinergic enhancement, exclusively during the selective attention task. Physostigmine administration also reduced BOLD signal temporal variability relative to placebo throughout temporal and occipital visual processing areas, again during the selective attention task only. Together with the observed behavioral improvement, the decreases in connectivity strength throughout task-relevant regions and BOLD variability within stimulus processing regions support the hypothesis that cholinergic augmentation results in enhanced neural efficiency. This article is part of a Special Issue entitled 'Cognitive Enhancers'. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Cholinergic enhancement reduces functional connectivity and BOLD variability in visual extrastriate cortex during selective attention

    PubMed Central

    Ricciardi, Emiliano; Handjaras, Giacomo; Bernardi, Giulio; Pietrini, Pietro; Furey, Maura L.

    2012-01-01

    Enhancing cholinergic function improves performance on various cognitive tasks and alters neural responses in task specific brain regions. Previous findings by our group strongly suggested that the changes in neural activity observed during increased cholinergic function may reflect an increase in neural efficiency that leads to improved task performance. The current study was designed to assess the effects of cholinergic enhancement on regional brain connectivity and BOLD signal variability. Nine subjects participated in a double-blind, placebo-controlled crossover functional magnetic resonance imaging (fMRI) study. Following an infusion of physostigmine (1mg/hr) or placebo, echo-planar imaging (EPI) was conducted as participants performed a selective attention task. During the task, two images comprised of superimposed pictures of faces and houses were presented. Subjects were instructed periodically to shift their attention from one stimulus component to the other and to perform a matching task using hand held response buttons. A control condition included phase-scrambled images of superimposed faces and houses that were presented in the same temporal and spatial manner as the attention task; participants were instructed to perform a matching task. Cholinergic enhancement improved performance during the selective attention task, with no change during the control task. Functional connectivity analyses showed that the strength of connectivity between ventral visual processing areas and task-related occipital, parietal and prefrontal regions was reduced significantly during cholinergic enhancement, exclusively during the selective attention task. Cholinergic enhancement also reduced BOLD signal temporal variability relative to placebo throughout temporal and occipital visual processing areas, again during the selective attention task only. Together with the observed behavioral improvement, the decreases in connectivity strength throughout task-relevant regions and BOLD variability within stimulus processing regions provide further support to the hypothesis that cholinergic augmentation results in enhanced neural efficiency. PMID:22906685

  8. The interaction between practice and performance pressure on the planning and control of fast target directed movement.

    PubMed

    Allsop, Jonathan E; Lawrence, Gavin P; Gray, Robert; Khan, Michael A

    2017-09-01

    Pressure to perform often results in decrements to both outcome accuracy and the kinematics of motor skills. Furthermore, this pressure-performance relationship is moderated by the amount of accumulated practice or the experience of the performer. However, the interactive effects of performance pressure and practice on the underlying processes of motor skills are far from clear. Movement execution involves both an offline pre-planning process and an online control process. The present experiment aimed to investigate the interaction between pressure and practice on these two motor control processes. Two groups of participants (control and pressure; N = 12 and 12, respectively) practiced a video aiming amplitude task and were transferred to either a non-pressure (control group) or a pressure condition (pressure group) both early and late in practice. Results revealed similar accuracy and movement kinematics between the control and pressure groups at early transfer. However, at late transfer, the introduction of pressure was associated with increased performance compared to control conditions. Analysis of kinematic variability throughout the movement suggested that the performance increase was due to participants adopting strategies to improve movement planning in response to pressure reducing the effectiveness of the online control system.

  9. The Wax and Wane of Narcissism: Grandiose Narcissism as a Process or State.

    PubMed

    Giacomin, Miranda; Jordan, Christian H

    2016-04-01

    Though grandiose narcissism has predominantly been studied in structural terms, focused on individuals' general tendencies to be more or less narcissistic, we tested whether it also has a meaningful process or state component. Using a daily diary study methodology and multilevel modeling (N = 178 undergraduates, 146 female; mean age = 18.86, SD = 2.21), we examined whether there is significant variability in daily state narcissism and whether this variability relates systematically to other psychological states (i.e., self-esteem, stress) and daily events. We assessed state narcissism and daily experiences over a 10-day period. We observed significant within-person variability in daily narcissism. Notably, this variability was not simply random error, as it related systematically to other psychological states and daily events. Specifically, state narcissism was higher when people experienced more positive agentic outcomes (e.g., having power over someone) or more positive communal outcomes (e.g., helping someone with a problem). State narcissism was lower on days people experienced greater felt stress. These relations held when state self-esteem, gender, and trait narcissism were controlled. These findings suggest that grandiose narcissism has a meaningful process or state component. © 2014 Wiley Periodicals, Inc.

  10. Complexity Variability Assessment of Nonlinear Time-Varying Cardiovascular Control

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Garcia, Ronald G.; Taylor, Jessica Noggle; Toschi, Nicola; Barbieri, Riccardo

    2017-02-01

    The application of complex systems theory to physiology and medicine has provided meaningful information about the nonlinear aspects underlying the dynamics of a wide range of biological processes and their disease-related aberrations. However, no studies have investigated whether meaningful information can be extracted by quantifying second-order moments of time-varying cardiovascular complexity. To this extent, we introduce a novel mathematical framework termed complexity variability, in which the variance of instantaneous Lyapunov spectra estimated over time serves as a reference quantifier. We apply the proposed methodology to four exemplary studies involving disorders which stem from cardiology, neurology and psychiatry: Congestive Heart Failure (CHF), Major Depression Disorder (MDD), Parkinson’s Disease (PD), and Post-Traumatic Stress Disorder (PTSD) patients with insomnia under a yoga training regime. We show that complexity assessments derived from simple time-averaging are not able to discern pathology-related changes in autonomic control, and we demonstrate that between-group differences in measures of complexity variability are consistent across pathologies. Pathological states such as CHF, MDD, and PD are associated with an increased complexity variability when compared to healthy controls, whereas wellbeing derived from yoga in PTSD is associated with lower time-variance of complexity.
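
    The core idea, that the variance of an instantaneous complexity index over time carries information its time average misses, can be shown with a toy example; the "instantaneous Lyapunov exponent" series below are synthetic stand-ins, not estimates from heartbeat data.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    t = np.arange(600)                      # e.g. one instantaneous estimate per analysis window
    group_a = 0.2 + 0.02 * np.sin(2 * np.pi * t / 150) + rng.normal(0, 0.01, t.size)
    group_b = 0.2 + 0.10 * np.sin(2 * np.pi * t / 150) + rng.normal(0, 0.05, t.size)

    for name, lam in [("group A", group_a), ("group B", group_b)]:
        print(f"{name}: time-averaged complexity = {lam.mean():.3f}, "
              f"complexity variability = {lam.var():.4f}")
    # The time averages are nearly identical; only the second-order moment separates the series.
    ```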

  11. Method of and apparatus for thermomagnetically processing a workpiece

    DOEpatents

    Kisner, Roger A.; Rios, Orlando; Wilgen, John B.; Ludtka, Gerard M.; Ludtka, Gail M.

    2014-08-05

    A method of thermomagnetically processing a material includes disposing a workpiece within a bore of a magnet; exposing the workpiece to a magnetic field of at least about 1 Tesla generated by the magnet; and, while exposing the workpiece to the magnetic field, applying heat energy to the workpiece at a plurality of frequencies to achieve spatially-controlled heating of the workpiece. An apparatus for thermomagnetically processing a material comprises: a high field strength magnet having a bore extending therethrough for insertion of a workpiece therein; and an energy source disposed adjacent to an entrance to the bore. The energy source is an emitter of variable frequency heat energy, and the bore comprises a waveguide for propagation of the variable frequency heat energy from the energy source to the workpiece.

  12. Conservation and Variability of Meiosis Across the Eukaryotes.

    PubMed

    Loidl, Josef

    2016-11-23

    Comparisons among a variety of eukaryotes have revealed considerable variability in the structures and processes involved in their meiosis. Nevertheless, conventional forms of meiosis occur in all major groups of eukaryotes, including early-branching protists. This finding confirms that meiosis originated in the common ancestor of all eukaryotes and suggests that primordial meiosis may have had many characteristics in common with conventional extant meiosis. However, it is possible that the synaptonemal complex and the delicate crossover control related to its presence were later acquisitions. Later still, modifications to meiotic processes occurred within different groups of eukaryotes. Better knowledge on the spectrum of derived and uncommon forms of meiosis will improve our understanding of many still mysterious aspects of the meiotic process and help to explain the evolutionary basis of functional adaptations to the meiotic program.

  13. New Ultrasonic Controller and Characterization System for Low Temperature Drying Process Intensification

    NASA Astrophysics Data System (ADS)

    Andrés, R. R.; Blanco, A.; Acosta, V. M.; Riera, E.; Martínez, I.; Pinto, A.

    Process intensification constitutes a highly interesting and promising industrial area. It aims to modify conventional processes or develop new technologies in order to reduce energy needs, increase yields and improve product quality. It has been demonstrated by this research group (CSIC) that power ultrasound has great potential in food drying processes. The effects associated with the application of power ultrasound can enhance heat and mass transfer and may constitute a route to process intensification. The objective of this work has been the design and development of a new ultrasonic system for the power characterization of piezoelectric plate transducers, covering the excitation, monitoring, analysis, control and characterization of their nonlinear response. For this purpose, the system proposes a new, efficient and economical approach that separates the effects of the different parameters and variables of the process, namely those of the excitation, the medium and the transducer (voltage, current, frequency, impedance, vibration velocity, acoustic pressure and temperature), by observing the electrical, mechanical, acoustical and thermal behavior and controlling the vibrational state.

  14. How holistic processing of faces relates to cognitive control and intelligence.

    PubMed

    Gauthier, Isabel; Chua, Kao-Wei; Richler, Jennifer J

    2018-04-16

    The Vanderbilt Holistic Processing Test for faces (VHPT-F) is the first standard test designed to measure individual differences in holistic processing. The test measures failures of selective attention to face parts through congruency effects, an operational definition of holistic processing. However, this conception of holistic processing has been challenged by the suggestion that it may tap into the same selective attention or cognitive control mechanisms that yield congruency effects in Stroop and Flanker paradigms. Here, we report data from 130 subjects on the VHPT-F, several versions of Stroop and Flanker tasks, as well as fluid IQ. Results suggested a small degree of shared variance in Stroop and Flanker congruency effects, which did not relate to congruency effects on the VHPT-F. Variability on the VHPT-F was also not correlated with Fluid IQ. In sum, we find no evidence that holistic face processing as measured by congruency in the VHPT-F is accounted for by domain-general control mechanisms.

  15. Reducing The Risk Of Fires In Conveyor Transport

    NASA Astrophysics Data System (ADS)

    Cheremushkina, M. S.; Poddubniy, D. A.

    2017-01-01

    The paper deals with the topical problem of increasing the operational safety of belt conveyors in mines. A control algorithm was developed that meets the technical requirements of mine belt conveyors, reduces the risk of conveyor belt fires, and enables energy and resource savings while taking into account the random character of the traffic load. The most effective way to address such tasks is to build control systems around variable-speed drives for asynchronous motors. A mathematical model of the system "variable-speed multi-motor drive - conveyor - conveyor control system" was designed that takes into account the dynamic processes occurring in the elements of the transport system and provides an assessment of the energy efficiency of the developed algorithms, which allows the dynamic overload in the belt to be reduced by 15-20%.

  16. Evidentiary, extraevidentiary, and deliberation process predictors of real jury verdicts.

    PubMed

    Devine, Dennis J; Krouse, Paige C; Cavanaugh, Caitlin M; Basora, Jaime Colon

    2016-12-01

    In contrast to the extensive literature based on mock jurors, large-sample studies of decision making by real juries are relatively rare. In this field study, we examined relationships between jury verdicts and variables representing 3 classes of potential determinants-evidentiary, extraevidentiary, and deliberation process-using a sample of 114 criminal jury trials. Posttrial data were collected from 11 presiding judges, 31 attorneys, and 367 jurors using a Web-based questionnaire. The strength of the prosecution's evidence was strongly related to the occurrence of a conviction, whereas most extraevidentiary and deliberation process variables were only weakly to modestly related in bivariate form and when the prosecution's evidence strength was controlled. Notable exceptions to this pattern were jury demographic diversity as represented by the number of different race-gender subgroups (e.g., Black males) present in the jury, and several deliberation process variables reflecting advocacy for acquittal (e.g., presence of an identifiable proacquittal faction within the jury and proacquittal advocacy by the foreperson). Variables reflecting advocacy for conviction were essentially unrelated to jury verdict. Sets of extraevidentiary and deliberation variables were each able to modestly improve the explanation of jury verdicts over prosecution evidence strength in multivariate models. This study highlights the predictive efficacy of prosecution evidence strength with respect to jury verdicts, as well as the potential importance of jury demographic diversity and advocacy for acquittal during deliberation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Control Design for an Advanced Geared Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Litt, Jonathan S.

    2017-01-01

    This paper describes the design process for the control system of an advanced geared turbofan engine. This process is applied to a simulation that is representative of a 30,000 lbf thrust class concept engine with two main spools, ultra-high bypass ratio, and a variable area fan nozzle. Control system requirements constrain the non-linear engine model as it operates throughout its flight envelope of sea level to 40,000 ft and from 0 to 0.8 Mach. The control architecture selected for this project was developed from literature and reflects a configuration that utilizes a proportional integral controller integrated with sets of limiters that enable the engine to operate safely throughout its flight envelope. Simulation results show the overall system meets performance requirements without exceeding system operational limits.
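
    A minimal sketch, under assumed gains and limits, of the control structure named above: a proportional-integral speed loop whose command is clipped by a limiter set, with simple anti-windup so the integrator does not keep growing while a limit is active. Variable names and numbers are hypothetical, not the paper's values.

        def pi_with_limiters(error, state, dt, kp=0.02, ki=0.05,
                             u_min=0.1, u_max=1.0):
            """PI law on a speed error; `state` carries the integrator between calls."""
            state["integral"] += error * dt
            u = kp * error + ki * state["integral"]
            u_limited = min(max(u, u_min), u_max)   # operational-limit protection
            if u != u_limited:
                state["integral"] -= error * dt     # anti-windup: freeze integrator at the limit
            return u_limited

        state = {"integral": 0.0}
        fuel_flow_cmd = pi_with_limiters(error=150.0, state=state, dt=0.02)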

  18. Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.

    PubMed

    Landin, Mariana

    2017-01-01

    The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales from 25 L to 600 L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operating conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R² > 86.78%) for equipment of all sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries and can be improved by implementing additional characteristics of the process, such as composition variables or operating parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control manufacturing processes using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  19. Geomorphic effectiveness of a long profile shape and the role of inherent geological controls in the Himalayan hinterland area of the Ganga River basin, India

    NASA Astrophysics Data System (ADS)

    Sonam; Jain, Vikrant

    2018-03-01

    Long profiles of rivers provide a platform to analyse interaction between geological and geomorphic processes operating at different time scales. Identification of an appropriate model for river long profile becomes important in order to establish a quantitative relationship between the profile shape, its geomorphic effectiveness, and inherent geological characteristics. This work highlights the variability in the long profile shape of the Ganga River and its major tributaries, its impact on stream power distribution pattern, and role of the geological controls on it. Long profile shapes are represented by the sum of two exponential functions through the curve fitting method. We have shown that coefficients of river long profile equations are governed by the geological characteristics of subbasins. These equations further define the spatial distribution pattern of stream power and help to understand stream power variability in different geological terrains. Spatial distribution of stream power in different geological terrains successfully explains spatial variability in geomorphic processes within the Himalayan hinterland area. In general, the stream power peaks of larger rivers lie in the Higher Himalaya, and rivers in the eastern hinterland area are characterised by the highest magnitude of stream power.
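
    A minimal sketch (synthetic data, assumed coefficients) of the workflow described above: fit a long profile with the sum of two exponentials, z(x) = a1*exp(-b1*x) + a2*exp(-b2*x), then derive local slope and stream power from the standard relation omega = rho * g * Q * S with an assumed discharge law.

        import numpy as np
        from scipy.optimize import curve_fit

        def long_profile(x, a1, b1, a2, b2):
            return a1 * np.exp(-b1 * x) + a2 * np.exp(-b2 * x)

        x = np.linspace(0.0, 300e3, 200)                    # downstream distance (m)
        z_obs = long_profile(x, 3500, 2e-5, 1500, 4e-6)     # synthetic elevations (m)
        z_obs += np.random.default_rng(1).normal(0, 5, x.size)

        popt, _ = curve_fit(long_profile, x, z_obs, p0=[3000, 1e-5, 1000, 1e-6])
        slope = -np.gradient(long_profile(x, *popt), x)     # channel slope S
        discharge = 50.0 + 0.002 * x                        # assumed Q(x) in m^3/s
        stream_power = 1000.0 * 9.81 * discharge * slope    # W per metre of channel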

  20. Closed-loop control of grasping with a myoelectric hand prosthesis: which are the relevant feedback variables for force control?

    PubMed

    Ninu, Andrei; Dosen, Strahinja; Muceli, Silvia; Rattay, Frank; Dietl, Hans; Farina, Dario

    2014-09-01

    In closed-loop control of grasping by hand prostheses, the feedback information sent to the user is usually the actual controlled variable, i.e., the grasp force. Although this choice is intuitive and logical, the force production is only the last step in the process of grasping. Therefore, this study evaluated the performance in controlling grasp strength using a hand prosthesis operated through a complete grasping sequence while varying the feedback variables (e.g., closing velocity, grasping force), which were provided to the user visually or through vibrotactile stimulation. The experiments were conducted on 13 volunteers who controlled the Otto Bock Sensor Hand Speed prosthesis. Results showed that vibrotactile patterns were able to replace the visual feedback. Interestingly, the experiments demonstrated that direct force feedback was not essential for the control of grasping force. The subjects were indeed able to control the grip strength, predictively, by estimating the grasping force from the prosthesis velocity of closing. Therefore, grasping without explicit force feedback is not completely blind, contrary to what is usually assumed. In our study we analyzed grasping with a specific prosthetic device, but the outcomes are also applicable for other devices, with one or more degrees-of-freedom. The necessary condition is that the electromyography (EMG) signal directly and proportionally controls the velocity/grasp force of the hand, which is a common approach among EMG controlled prosthetic devices. The results provide important indications on the design of closed-loop EMG controlled prosthetic systems.

  1. A Soft Sensor for Bioprocess Control Based on Sequential Filtering of Metabolic Heat Signals

    PubMed Central

    Paulsson, Dan; Gustavsson, Robert; Mandenius, Carl-Fredrik

    2014-01-01

    Soft sensors are the combination of robust on-line sensor signals with mathematical models for deriving additional process information. Here, we apply this principle to a microbial recombinant protein production process in a bioreactor by exploiting bio-calorimetric methodology. Temperature sensor signals from the cooling system of the bioreactor were used for estimating the metabolic heat of the microbial culture and from that the specific growth rate and active biomass concentration were derived. By applying sequential digital signal filtering, the soft sensor was made more robust for industrial practice with cultures generating low metabolic heat in environments with high noise level. The estimated specific growth rate signal obtained from the three stage sequential filter allowed controlled feeding of substrate during the fed-batch phase of the production process. The biomass and growth rate estimates from the soft sensor were also compared with an alternative sensor probe and a capacitance on-line sensor, for the same variables. The comparison showed similar or better sensitivity and lower variability for the metabolic heat soft sensor suggesting that using permanent temperature sensors of a bioreactor is a realistic and inexpensive alternative for monitoring and control. However, both alternatives are easy to implement in a soft sensor, alone or in parallel. PMID:25264951
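
    A minimal sketch of the soft-sensor idea (not the authors' implementation): a metabolic heat signal derived from temperature sensors is passed through a three-stage sequence of digital filters, and the specific growth rate is taken from the slope of the log of the filtered heat. Filter choices, window lengths, and constants are assumptions.

        import numpy as np
        from scipy.signal import medfilt

        def sequential_filter(signal, median_win=5, ma_win=9, alpha=0.1):
            s = medfilt(np.asarray(signal, float), kernel_size=median_win)  # stage 1: despike
            s = np.convolve(s, np.ones(ma_win) / ma_win, mode="same")       # stage 2: smooth
            out = np.empty_like(s)                                          # stage 3: exponential filter
            out[0] = s[0]
            for i in range(1, len(s)):
                out[i] = alpha * s[i] + (1 - alpha) * out[i - 1]
            return out

        def specific_growth_rate(metabolic_heat_W, dt_h):
            q = sequential_filter(metabolic_heat_W)
            return np.gradient(np.log(q), dt_h)   # mu ~ d ln(Q)/dt during exponential growth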

  2. A soft sensor for bioprocess control based on sequential filtering of metabolic heat signals.

    PubMed

    Paulsson, Dan; Gustavsson, Robert; Mandenius, Carl-Fredrik

    2014-09-26

    Soft sensors are the combination of robust on-line sensor signals with mathematical models for deriving additional process information. Here, we apply this principle to a microbial recombinant protein production process in a bioreactor by exploiting bio-calorimetric methodology. Temperature sensor signals from the cooling system of the bioreactor were used for estimating the metabolic heat of the microbial culture and from that the specific growth rate and active biomass concentration were derived. By applying sequential digital signal filtering, the soft sensor was made more robust for industrial practice with cultures generating low metabolic heat in environments with high noise level. The estimated specific growth rate signal obtained from the three stage sequential filter allowed controlled feeding of substrate during the fed-batch phase of the production process. The biomass and growth rate estimates from the soft sensor were also compared with an alternative sensor probe and a capacitance on-line sensor, for the same variables. The comparison showed similar or better sensitivity and lower variability for the metabolic heat soft sensor suggesting that using permanent temperature sensors of a bioreactor is a realistic and inexpensive alternative for monitoring and control. However, both alternatives are easy to implement in a soft sensor, alone or in parallel.

  3. Instrumentation, control, and automation for submerged anaerobic membrane bioreactors.

    PubMed

    Robles, Ángel; Durán, Freddy; Ruano, María Victoria; Ribes, Josep; Rosado, Alfredo; Seco, Aurora; Ferrer, José

    2015-01-01

    A submerged anaerobic membrane bioreactor (AnMBR) demonstration plant with two commercial hollow-fibre ultrafiltration systems (PURON®, Koch Membrane Systems, PUR-PSH31) was designed and operated for urban wastewater treatment. An instrumentation, control, and automation (ICA) system was designed and implemented for proper process performance. Several single-input-single-output (SISO) feedback control loops based on conventional on-off and PID algorithms were implemented to control the following operating variables: flow-rates (influent, permeate, sludge recycling and wasting, and recycled biogas through both reactor and membrane tanks), sludge wasting volume, temperature, transmembrane pressure, and gas sparging. The proposed ICA for AnMBRs for urban wastewater treatment enables the optimization of this new technology to be achieved with a high level of process robustness towards disturbances.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckdahn, Rainer, E-mail: Rainer.Buckdahn@univ-brest.fr; Li, Juan, E-mail: juanli@sdu.edu.cn; Ma, Jin, E-mail: jinma@usc.edu

    In this paper we study the optimal control problem for a class of general mean-field stochastic differential equations, in which the coefficients depend, nonlinearly, on both the state process and its law. In particular, we assume that the control set is a general open set that is not necessarily convex, and the coefficients are only continuous in the control variable, without any further regularity or convexity. We validate the approach of Peng (SIAM J Control Optim 2(4):966–979, 1990) by considering the second-order variational equations and the corresponding second-order adjoint process in this setting, and we extend the Stochastic Maximum Principle of Buckdahn et al. (Appl Math Optim 64(2):197–216, 2011) to this general case.
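
    A schematic statement, with notation assumed rather than taken from the paper, of the kind of controlled mean-field dynamics and cost functional discussed above:

        % Controlled mean-field SDE; the coefficients depend on the state X_t,
        % its law P_{X_t}, and a control u_t valued in a possibly non-convex open set U.
        \[
          dX_t = b\bigl(t, X_t, \mathbb{P}_{X_t}, u_t\bigr)\,dt
               + \sigma\bigl(t, X_t, \mathbb{P}_{X_t}, u_t\bigr)\,dW_t, \qquad X_0 = x_0,
        \]
        \[
          J(u) = \mathbb{E}\Bigl[\int_0^T f\bigl(t, X_t, \mathbb{P}_{X_t}, u_t\bigr)\,dt
               + \Phi\bigl(X_T, \mathbb{P}_{X_T}\bigr)\Bigr] \;\longrightarrow\; \min_u,
        \]
        % and a Peng-type maximum principle then involves first- and second-order
        % adjoint equations associated with this cost.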

  5. Density-dependent resistance of the gypsy moth, Lymantria dispar, to its nucleopolyhedrovirus

    Treesearch

    James R. Reilly; Ann E. Hajek

    2007-01-01

    The processes controlling disease resistance can strongly influence the population dynamics of insect outbreaks. Evidence that disease resistance is density-dependent is accumulating, but the exact form of this relationship is highly variable from species to species.

  6. Evaluation of the ADAPTIR System for Work Zone Traffic Control

    DOT National Transportation Integrated Search

    1999-11-01

    The ADAPTIR system (Automated Data Acquisition and Processing of Traffic Information in Real Time) uses variable message signs (VMS) equipped with radar units, along with a software program to interpret the data, to display appropriate warning and ad...

  7. A Jump Systems Formalism for the Optimal Recirculation of Broke in a Paper Machine (Un formalisme de systemes a sauts pour la recirculation optimale des casses dans une machine a papier)

    NASA Astrophysics Data System (ADS)

    Khanbaghi, Maryam

    Increasing closure of white water circuits makes mill productivity and the quality of the paper produced increasingly sensitive to the occurrence of paper breaks. In this thesis the main objective is the development of white water and broke recirculation policies. The thesis consists of three main parts, respectively corresponding to the synthesis of a statistical model of paper breaks in a paper mill, the basic mathematical setup for the formulation of white water and broke recirculation policies in the mill as a jump linear quadratic regulation problem, and finally the tuning of the control law based on first passage-time theory, and its extension to the case of control-sensitive paper break rates. More specifically, in the first part a statistical model of paper machine breaks is developed. We start from the hypothesis that the break process is a Markov chain with three states: the first state is the operational one, while the two others are associated with the general types of paper breaks that can take place in the mill (wet breaks and dry breaks). The Markovian hypothesis is empirically validated. We also establish how paper-break rates are correlated with machine speed and broke recirculation ratio. Subsequently, we show how the obtained Markov chain model of paper breaks can be used to formulate a machine operating speed parameter optimization problem. In the second part, upon recognizing that paper breaks can be modelled as a Markov chain type of process which, when interacting with the continuous mill dynamics, yields a jump Markov model, jump linear theory is proposed as a means of constructing white water and broke recirculation strategies which minimize process variability. Reduced process variability comes at the expense of relatively large swings in white water and broke tank levels. Since the linear design does not specifically account for constraints on the state-space, under the resulting law, damaging events of tank overflow or emptiness can occur. A heuristic simulation-based approach is proposed to choose the performance measure design parameters so as to keep the mean time between incidents of the broke and white water tanks either overflowing or reaching dangerously low levels sufficiently long. In the third part, a methodology, mainly founded on the first passage-time theory of stochastic processes, is proposed to choose the performance measure design parameters to limit process variability while accounting for the possibility of undesirable tank overflows or tank emptiness. The heart of the approach is an approximation technique for evaluating mean first passage-times of the controlled tank levels. This technique appears to have an applicability which largely exceeds the problem area it was designed for. Furthermore, the introduction of control-sensitive break rates and the analysis of the ensuing control problem are presented. This is to account for the experimentally observed increase in breaks concomitant with flow rate variability.
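
    A minimal sketch (all numbers illustrative) of the modelling idea described above: a three-state Markov chain for the break process (0 = running, 1 = wet break, 2 = dry break) driving a jump-linear model of a broke tank level under a linear recirculation law.

        import numpy as np

        P = np.array([[0.990, 0.006, 0.004],   # per-minute transition probabilities (assumed)
                      [0.100, 0.900, 0.000],
                      [0.120, 0.000, 0.880]])
        inflow = {0: 0.0, 1: 2.0, 2: 1.5}      # broke generated in each mode (m^3/min, assumed)

        def simulate(minutes=1000, level0=50.0, recirc_gain=0.02, seed=0):
            rng = np.random.default_rng(seed)
            mode, level, levels = 0, level0, []
            for _ in range(minutes):
                mode = rng.choice(3, p=P[mode])                  # jump process
                outflow = recirc_gain * level                    # linear recirculation law
                level = max(0.0, level + inflow[mode] - outflow)
                levels.append(level)
            return np.array(levels)

        tank_trace = simulate()   # e.g. inspect swings and time spent near overflow or emptiness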

  8. Location specific solidification microstructure control in electron beam melting of Ti-6Al-4V

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narra, Sneha P.; Cunningham, Ross; Beuth, Jack

    Relationships between prior beta grain size in solidified Ti-6Al-4V and melting process parameters in the Electron Beam Melting (EBM) process are investigated. Samples are built by varying a machine-dependent proprietary speed function to cover the process space. Optical microscopy is used to measure prior beta grain widths and assess the number of prior beta grains present in a melt pool in the raster region of the build. Despite the complicated evolution of beta grain sizes, the beta grain width scales with melt pool width. The resulting understanding of the relationship between primary machine variables and prior beta grain widths is a key step toward enabling the location specific control of as-built microstructure in the EBM process. Control of grain width in separate specimens and within a single specimen is demonstrated.

  9. The origins of age of acquisition and typicality effects: Semantic processing in aphasia and the ageing brain.

    PubMed

    Räling, Romy; Schröder, Astrid; Wartenburger, Isabell

    2016-06-01

    Age of acquisition (AOA) has frequently been shown to influence response times and accuracy rates in word processing and constitutes a meaningful variable in aphasic language processing, while its origin in the language processing system is still under debate. To find out where AOA originates and whether and how it is related to another important psycholinguistic variable, namely semantic typicality (TYP), we studied healthy, elderly controls and semantically impaired individuals using semantic priming. For this purpose, we collected reaction times and accuracy rates as well as event-related potential data in an auditory category-member-verification task. The present results confirm a semantic origin of TYP, but question the same for AOA while favouring its origin at the phonology-semantics interface. The data are further interpreted in consideration of recent theories of ageing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Analysis of field-oriented controlled induction motor drives under sensor faults and an overview of sensorless schemes.

    PubMed

    Arun Dominic, D; Chelliah, Thanga Raj

    2014-09-01

    To obtain high dynamic performance from induction motor drives (IMD), variable-voltage and variable-frequency operation has to be performed by measuring the speed of rotation and the stator currents through sensors and feeding them back to the controllers. When the sensors undergo a fault, the stability of the control system, which may be designed for an industrial process, is disturbed. This paper studies the negative effects on a 12.5 hp induction motor drive when the field-oriented control system is subjected to sensor faults. To illustrate the importance of this study, a mine hoist load diagram is considered as the shaft load of the tested machine. The methods to recover the system from sensor faults are discussed. In addition, the various speed-sensorless schemes are reviewed comprehensively. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  11. High-order sliding-mode control for blood glucose regulation in the presence of uncertain dynamics.

    PubMed

    Hernández, Ana Gabriela Gallardo; Fridman, Leonid; Leder, Ron; Andrade, Sergio Islas; Monsalve, Cristina Revilla; Shtessel, Yuri; Levant, Arie

    2011-01-01

    The success of automatic blood glucose regulation depends on the robustness of the control algorithm used. This is a difficult task to perform due to the complexity of the glucose-insulin regulation system. The variety of existing models reflects the great number of phenomena involved in the process, and the inter-patient variability of the parameters represents another challenge. In this research a high-order sliding-mode controller is proposed. It is applied to two well-known models, the Bergman Minimal Model and the Sorensen Model, to test its robustness with respect to uncertain dynamics and patients' parameter variability. The controller designed on the basis of the simulations is tested with the specific Bergman Minimal Model of a diabetic patient whose parameters were identified from an in vivo assay. To minimize the insulin infusion rate and avoid the risk of hypoglycemia, the glucose target is a dynamical profile.
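
    A minimal sketch, not the paper's controller: the Bergman Minimal Model integrated with Euler steps and a super-twisting (second-order sliding-mode) law acting on the sliding variable s = G - G_target. All parameter values, gains, and the sign convention (more insulin when glucose is high) are assumptions and are not tuned for any real patient.

        import numpy as np

        p1, p2, p3 = 0.028, 0.025, 1.3e-5   # illustrative Bergman parameters (1/min)
        n, Vi      = 0.09, 12.0             # insulin clearance (1/min) and distribution volume (L)
        Gb, Ib     = 80.0, 7.0              # basal glucose (mg/dL) and insulin (uU/mL)
        G_target   = 90.0
        k1, k2     = 0.5, 0.01              # super-twisting gains (untuned)

        def simulate(minutes=600, dt=1.0, meal_at=120, meal_size=3.0):
            G, X, I, w = 180.0, 0.0, Ib, 0.0          # start from a hyperglycaemic state
            for t in range(minutes):
                D = meal_size if t == meal_at else 0.0                 # crude meal disturbance
                s = G - G_target
                u = max(0.0, k1 * np.sqrt(abs(s)) * np.sign(s) + w)    # infusion must be >= 0
                w += k2 * np.sign(s) * dt
                G += (-p1 * (G - Gb) - X * G + D) * dt
                X += (-p2 * X + p3 * (I - Ib)) * dt
                I += (-n * (I - Ib) + u / Vi) * dt
            return G

        final_glucose = simulate()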

  12. A strong control of the South American SeeSaw on the intra-seasonal variability of the isotopic composition of precipitation in the Bolivian Andes

    NASA Astrophysics Data System (ADS)

    Vimeux, Françoise; Tremoy, Guillaume; Risi, Camille; Gallaire, Robert

    2011-07-01

    Water stable isotopes (δ) in tropical regions are a valuable tool to study both convective processes and climate variability provided that local and remote controls on δ are well known. Here, we examine the intra-seasonal variability of the event-based isotopic composition of precipitation (δD Zongo) in the Bolivian Andes (Zongo valley, 16°20'S-67°47'W) from September 1st, 1999 to August 31st, 2000. We show that the local amount effect is a very poor parameter to explain δD Zongo. We thus explore the property of water isotopes to integrate both temporal and spatial convective activities. We first show that the local convective activity averaged over the 7-8 days preceding the rainy event is an important control on δD Zongo during the rainy season (~ 40% of the δD Zongo variability is captured). This could be explained by the progressive depletion of local water vapor by unsaturated downdrafts of convective systems. The exploration of remote convective controls on δD Zongo shows a strong influence of the South American SeeSaw (SASS) which is the first climate mode controlling the precipitation variability in tropical South America during austral summer. Our study clearly evidences that temporal and spatial controls are not fully independent as the 7-day averaged convection in the Zongo valley responds to the SASS. Our results are finally used to evaluate a water isotope enabled atmospheric general circulation model (LMDZ-iso), using the stretched grid functionality to run zoomed simulations over the entire South American continent (15°N-55°S; 30°-85°W). We find that zoomed simulations capture the intra-seasonal isotopic variation and its controls, though with an overestimated local sensitivity, and confirm the role of a remote control on δ according to a SASS-like dipolar structure.

  13. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    PubMed

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. Since, by nature, the variability of the sampling method and of the reference method is included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and of tedious off-line analyses. © 2013 Published by Elsevier B.V.
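
    A minimal sketch of the chemometric step (not the validated method): a partial least squares regression from NIR spectra to API and residual-methanol content, with the number of latent variables chosen by cross-validation. The arrays below are random stand-ins for real spectra and reference assays.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 700))    # stand-in for 40 NIR spectra (absorbance per wavelength)
        Y = rng.normal(size=(40, 2))      # stand-in reference assays: [API % w/w, methanol % w/w]

        best_k = max(
            range(1, 11),
            key=lambda k: cross_val_score(PLSRegression(n_components=k), X, Y,
                                          scoring="neg_root_mean_squared_error", cv=5).mean(),
        )
        model = PLSRegression(n_components=best_k).fit(X, Y)
        api_pct, methanol_pct = model.predict(X[:1])[0]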

  14. PSC algorithm description

    NASA Technical Reports Server (NTRS)

    Nobbs, Steven G.

    1995-01-01

    An overview of the performance seeking control (PSC) algorithm and details of the important components of the algorithm are given. The onboard propulsion system models, the linear programming optimization, and the engine control interface are described. The PSC algorithm receives input from various computers on the aircraft including the digital flight computer, digital engine control, and electronic inlet control. The PSC algorithm contains compact models of the propulsion system including the inlet, engine, and nozzle. The models compute propulsion system parameters, such as inlet drag and fan stall margin, which are not directly measurable in flight. The compact models also compute sensitivities of the propulsion system parameters to changes in the control variables. The engine model consists of a linear steady state variable model (SSVM) and a nonlinear model. The SSVM is updated with efficiency factors calculated in the engine model update logic, or Kalman filter. The efficiency factors are used to adjust the SSVM to match the actual engine. The propulsion system models are mathematically integrated to form an overall propulsion system model. The propulsion system model is then optimized using a linear programming optimization scheme. The goal of the optimization is determined from the selected PSC mode of operation. The resulting trims are used to compute a new operating point about which the optimization process is repeated. This process is continued until an overall (global) optimum is reached before applying the trims to the controllers.
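
    A minimal sketch of the optimization step (not the flight code): linearized sensitivities of thrust and fan stall margin to two hypothetical trims are fed to a linear program that maximizes predicted thrust subject to a stall-margin floor and trim-authority limits, after which the models would be re-linearized and the step repeated. All numbers are illustrative.

        import numpy as np
        from scipy.optimize import linprog

        # d(thrust), d(stall margin) per unit trim of [nozzle area, fuel flow] (assumed values)
        thrust_sens = np.array([120.0, 300.0])
        stall_sens  = np.array([-0.4, -1.1])
        stall_margin_now, stall_margin_min = 12.0, 8.0

        res = linprog(
            c=-thrust_sens,                              # maximize predicted thrust gain
            A_ub=[-stall_sens],                          # keep margin above the floor:
            b_ub=[stall_margin_now - stall_margin_min],  #   -stall_sens @ x <= now - min
            bounds=[(-0.05, 0.05), (-0.1, 0.1)],         # trim authority limits
            method="highs",
        )
        trims = res.x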

  15. Process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial.

    PubMed

    Knowlden, Adam P; Sharma, Manoj

    2014-09-01

    Family-and-home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial. The trial was composed of two web-based, mother-centered interventions for prevention of obesity in children between 4 and 6 years of age. Process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose exposure, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). Process evaluation results found that both groups (experimental and control) were equivalent, and interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.

  16. Integrating nanosphere lithography in device fabrication

    NASA Astrophysics Data System (ADS)

    Laurvick, Tod V.; Coutu, Ronald A.; Lake, Robert A.

    2016-03-01

    This paper discusses the integration of nanosphere lithography (NSL) with other fabrication techniques, allowing for nano-scaled features to be realized within larger microelectromechanical system (MEMS) based devices. Nanosphere self-patterning methods have been researched for over three decades, but typically not for use as a lithography process. Only recently has progress been made towards integrating many of the best practices from these publications and determining a process that yields large areas of coverage with repeatability, and that enables precise placement of nanospheres relative to other features. Discussed are two of the more common self-patterning methods used in NSL (i.e. spin-coating and dip coating) as well as a more recently conceived variation of dip coating. Recent work has suggested that the repeatability of any method depends on a number of variables, so to better understand how these variables affect the process a series of test vessels was developed and fabricated. Commercially available 3-D printing technology was used to incrementally alter the test vessels, allowing each variable to be investigated individually. With these deposition vessels, NSL can now be used in conjunction with other fabrication steps to integrate features otherwise unattainable through current methods, within the overall fabrication process of larger MEMS devices. Patterned regions in 1800 series photoresist with a thickness of ~700 nm are used to capture regions of self-assembled nanospheres. These regions are roughly 2-5 microns in width, and are able to control the placement of 500 nm polystyrene spheres by controlling where monolayer self-assembly occurs. The resulting combination of photoresist and nanospheres can then be used with traditional deposition or etch methods to utilize these fine scale features in the overall design.

  17. "Congratulations, you have been randomized into the control group!(?)": issues to consider when recruiting schools for matched-pair randomized control trials of prevention programs.

    PubMed

    Ji, Peter; DuBois, David L; Flay, Brian R; Brechling, Vanessa

    2008-03-01

    Recruiting schools into a matched-pair randomized control trial (MP-RCT) to evaluate the efficacy of a school-level prevention program presents challenges for researchers. We considered which of 2 procedures would be most effective for recruiting schools into the study and assigning them to conditions. In 1 procedure (recruit and match/randomize), we would recruit schools and match them prior to randomization, and in the other (match/randomize and recruitment), we would match schools and randomize them prior to recruitment. We considered how each procedure impacted the randomization process and our ability to recruit schools into the study. After implementing the selected procedure, the equivalence of both treatment and control group schools and the participating and nonparticipating schools on school demographic variables was evaluated. We decided on the recruit and match/randomize procedure because we thought it would provide the opportunity to build rapport with the schools and prepare them for the randomization process, thereby increasing the likelihood that they would accept their randomly assigned conditions. Neither the treatment and control group schools nor the participating and nonparticipating schools exhibited statistically significant differences from each other on any of the school demographic variables. Recruitment of schools prior to matching and randomization in an MP-RCT may facilitate the recruitment of schools and thus enhance both the statistical power and the representativeness of study findings. Future research would benefit from the consideration of a broader range of variables (eg, readiness to implement a comprehensive prevention program) both in matching schools and in evaluating their representativeness to nonparticipating schools.
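
    A minimal sketch of the "recruit and match/randomize" procedure chosen above: recruited schools are ranked on a composite demographic score, paired with their nearest neighbour, and one member of each pair is randomly assigned to treatment. School labels and scores are hypothetical.

        import random

        schools = {"A": 0.82, "B": 0.79, "C": 0.55, "D": 0.51, "E": 0.33, "F": 0.30}

        def matched_pair_randomize(score_by_school, seed=42):
            rng = random.Random(seed)
            ranked = sorted(score_by_school, key=score_by_school.get)
            assignment = {}
            for i in range(0, len(ranked) - 1, 2):   # pair adjacent schools in the ranking
                pair = [ranked[i], ranked[i + 1]]
                rng.shuffle(pair)
                assignment[pair[0]] = "treatment"
                assignment[pair[1]] = "control"
            return assignment

        print(matched_pair_randomize(schools))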

  18. Linking Inflammation, Cardiorespiratory Variability, and Neural Control in Acute Inflammation via Computational Modeling

    PubMed Central

    Dick, Thomas E.; Molkov, Yaroslav I.; Nieman, Gary; Hsieh, Yee-Hsee; Jacono, Frank J.; Doyle, John; Scheff, Jeremy D.; Calvano, Steve E.; Androulakis, Ioannis P.; An, Gary; Vodovotz, Yoram

    2012-01-01

    Acute inflammation leads to organ failure by engaging catastrophic feedback loops in which stressed tissue evokes an inflammatory response and, in turn, inflammation damages tissue. Manifestations of this maladaptive inflammatory response include cardio-respiratory dysfunction that may be reflected in reduced heart rate and ventilatory pattern variabilities. We have developed signal-processing algorithms that quantify non-linear deterministic characteristics of variability in biologic signals. Now, coalescing under the aegis of the NIH Computational Biology Program and the Society for Complexity in Acute Illness, two research teams performed iterative experiments and computational modeling on inflammation and cardio-pulmonary dysfunction in sepsis as well as on neural control of respiration and ventilatory pattern variability. These teams, with additional collaborators, have recently formed a multi-institutional, interdisciplinary consortium, whose goal is to delineate the fundamental interrelationship between the inflammatory response and physiologic variability. Multi-scale mathematical modeling and complementary physiological experiments will provide insight into autonomic neural mechanisms that may modulate the inflammatory response to sepsis and simultaneously reduce heart rate and ventilatory pattern variabilities associated with sepsis. This approach integrates computational models of neural control of breathing and cardio-respiratory coupling with models that combine inflammation, cardiovascular function, and heart rate variability. The resulting integrated model will provide mechanistic explanations for the phenomena of respiratory sinus-arrhythmia and cardio-ventilatory coupling observed under normal conditions, and the loss of these properties during sepsis. This approach holds the potential of modeling cross-scale physiological interactions to improve both basic knowledge and clinical management of acute inflammatory diseases such as sepsis and trauma. PMID:22783197

  19. Linking Inflammation, Cardiorespiratory Variability, and Neural Control in Acute Inflammation via Computational Modeling.

    PubMed

    Dick, Thomas E; Molkov, Yaroslav I; Nieman, Gary; Hsieh, Yee-Hsee; Jacono, Frank J; Doyle, John; Scheff, Jeremy D; Calvano, Steve E; Androulakis, Ioannis P; An, Gary; Vodovotz, Yoram

    2012-01-01

    Acute inflammation leads to organ failure by engaging catastrophic feedback loops in which stressed tissue evokes an inflammatory response and, in turn, inflammation damages tissue. Manifestations of this maladaptive inflammatory response include cardio-respiratory dysfunction that may be reflected in reduced heart rate and ventilatory pattern variabilities. We have developed signal-processing algorithms that quantify non-linear deterministic characteristics of variability in biologic signals. Now, coalescing under the aegis of the NIH Computational Biology Program and the Society for Complexity in Acute Illness, two research teams performed iterative experiments and computational modeling on inflammation and cardio-pulmonary dysfunction in sepsis as well as on neural control of respiration and ventilatory pattern variability. These teams, with additional collaborators, have recently formed a multi-institutional, interdisciplinary consortium, whose goal is to delineate the fundamental interrelationship between the inflammatory response and physiologic variability. Multi-scale mathematical modeling and complementary physiological experiments will provide insight into autonomic neural mechanisms that may modulate the inflammatory response to sepsis and simultaneously reduce heart rate and ventilatory pattern variabilities associated with sepsis. This approach integrates computational models of neural control of breathing and cardio-respiratory coupling with models that combine inflammation, cardiovascular function, and heart rate variability. The resulting integrated model will provide mechanistic explanations for the phenomena of respiratory sinus-arrhythmia and cardio-ventilatory coupling observed under normal conditions, and the loss of these properties during sepsis. This approach holds the potential of modeling cross-scale physiological interactions to improve both basic knowledge and clinical management of acute inflammatory diseases such as sepsis and trauma.

  20. Multibeam sonar backscatter data processing

    NASA Astrophysics Data System (ADS)

    Schimel, Alexandre C. G.; Beaudoin, Jonathan; Parnum, Iain M.; Le Bas, Tim; Schmidt, Val; Keith, Gordon; Ierodiaconou, Daniel

    2018-06-01

    Multibeam sonar systems now routinely record seafloor backscatter data, which are processed into backscatter mosaics and angular responses, both of which can assist in identifying seafloor types and morphology. Those data products are obtained from the multibeam sonar raw data files through a sequence of data processing stages that follows a basic plan, but the implementation of which varies greatly between sonar systems and software. In this article, we provide a comprehensive review of this backscatter data processing chain, with a focus on the variability in the possible implementation of each processing stage. Our objective for undertaking this task is twofold: (1) to provide an overview of backscatter data processing for the consideration of the general user and (2) to provide suggestions to multibeam sonar manufacturers, software providers and the operators of these systems and software for eventually reducing the lack of control, uncertainty and variability associated with current data processing implementations and the resulting backscatter data products. One such suggestion is the adoption of a nomenclature for increasingly refined levels of processing, akin to the nomenclature adopted for satellite remote-sensing data deliverables.

  1. Quality by design approach for understanding the critical quality attributes of cyclosporine ophthalmic emulsion.

    PubMed

    Rahman, Ziyaur; Xu, Xiaoming; Katragadda, Usha; Krishnaiah, Yellela S R; Yu, Lawrence; Khan, Mansoor A

    2014-03-03

    Restasis is an ophthalmic cyclosporine emulsion used for the treatment of dry eye syndrome. There are no generic products for this product, probably because of the limitations on establishing in vivo bioequivalence methods and lack of alternative in vitro bioequivalence testing methods. The present investigation was carried out to understand and identify the appropriate in vitro methods that can discriminate the effect of formulation and process variables on critical quality attributes (CQA) of cyclosporine microemulsion formulations having the same qualitative (Q1) and quantitative (Q2) composition as that of Restasis. Quality by design (QbD) approach was used to understand the effect of formulation and process variables on critical quality attributes (CQA) of cyclosporine microemulsion. The formulation variables chosen were mixing order method, phase volume ratio, and pH adjustment method, while the process variables were temperature of primary and raw emulsion formation, microfluidizer pressure, and number of pressure cycles. The responses selected were particle size, turbidity, zeta potential, viscosity, osmolality, surface tension, contact angle, pH, and drug diffusion. The selected independent variables showed statistically significant (p < 0.05) effect on droplet size, zeta potential, viscosity, turbidity, and osmolality. However, the surface tension, contact angle, pH, and drug diffusion were not significantly affected by independent variables. In summary, in vitro methods can detect formulation and manufacturing changes and would thus be important for quality control or sameness of cyclosporine ophthalmic products.
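
    A minimal sketch of the screening logic (not the paper's design or data): a two-level full factorial over three of the process variables named above, with the main effect on a response such as droplet size estimated as the difference between the mean response at the high and low levels. Levels and response values are made up for illustration.

        from itertools import product
        import numpy as np

        factors = {"emulsification_temp": (35, 55),      # degC, assumed levels
                   "microfluidizer_pressure": (10, 20),  # kpsi, assumed levels
                   "pressure_cycles": (1, 3)}

        design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

        def main_effect(runs, responses, factor):
            """Mean response at the high level minus mean response at the low level."""
            lo, hi = factors[factor]
            y = np.asarray(responses, float)
            return (y[[r[factor] == hi for r in runs]].mean()
                    - y[[r[factor] == lo for r in runs]].mean())

        droplet_size_nm = [140, 132, 120, 118, 125, 117, 108, 105]   # hypothetical measurements
        print(main_effect(design, droplet_size_nm, "microfluidizer_pressure"))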

  2. Development of process control capability through the Browns Ferry Integrated Computer System using Reactor Water Clanup System as an example. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, J.; Mowrey, J.

    1995-12-01

    This report describes the design, development and testing of process controls for selected system operations in the Browns Ferry Nuclear Plant (BFNP) Reactor Water Cleanup System (RWCU) using a Computer Simulation Platform which simulates the RWCU System and the BFNP Integrated Computer System (ICS). This system was designed to demonstrate the feasibility of the soft control (video touch screen) of nuclear plant systems through an operator console. The BFNP Integrated Computer System, which has recently been installed at BFNP Unit 2, was simulated to allow for operator control functions of the modeled RWCU system. The BFNP Unit 2 RWCU system was simulated using the RELAP5 Thermal/Hydraulic Simulation Model, which provided the steady-state and transient RWCU process variables and simulated the response of the system to control system inputs. Descriptions of the hardware and software developed are also included in this report. The testing and acceptance program and results are also detailed in this report. A discussion of potential installation of an actual RWCU process control system in BFNP Unit 2 is included. Finally, this report contains a section on industry issues associated with installation of process control systems in nuclear power plants.

  3. Model-free adaptive control of advanced power plants

    DOEpatents

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  4. Development of a working Hovercraft model

    NASA Astrophysics Data System (ADS)

    Noor, S. H. Mohamed; Syam, K.; Jaafar, A. A.; Mohamad Sharif, M. F.; Ghazali, M. R.; Ibrahim, W. I.; Atan, M. F.

    2016-02-01

    This paper presents the development process used to fabricate a working hovercraft model. The purpose of this study is to design and investigate a fully functional hovercraft, based on the studies that had been done. Different hovercraft model designs were made and tested, but only one of the models is presented in this paper. In this work, the weight, the thrust, the lift and the drag force of the model were measured, and the electrical and mechanical parts are also presented. The processing unit of this model is an Arduino Uno, with a PSP2 (PlayStation 2) gamepad used as the controller. Since the prototype should function on all kinds of ground surfaces, the model was also tested on different floor conditions, including water, grass, cement and tile. The speed of the model was measured in every case as the response variable, with current (I) as the manipulated variable and voltage (V) as the constant variable.

  5. ERPs and Psychopathology. I. Behavioral process issues.

    PubMed

    Roth, W T; Tecce, J J; Pfefferbaum, A; Rosenbloom, M; Callaway, E

    1984-01-01

    The clinical study of ERPs has an inherent defect--a self-selection of clinical populations that hampers equating of clinically defined groups on factors extraneous to the independent variables. Such ex post facto studies increase the likelihood of confounding variables in the interpretation of findings. Hence, the development of lawful relationships between clinical variables and ERPs is impeded and the fulfillment of description, explanation, prediction, and control in brain science is thwarted. Proper methodologies and theory development can increase the likelihood of establishing these lawful relationships. One methodology of potential value in the clinical application of ERPs, particularly in studies of aging, is that of divided attention. Two promising theoretical developments in the understanding of brain functioning and aging are the distraction-arousal hypothesis and the controlled-automatic attention model. The evaluation of ERPs in the study of brain-behavior relations in clinical populations might be facilitated by the differentiation of concurrent, predictive, content, and construct validities.

  6. Optimization of an angle-beam ultrasonic approach for characterization of impact damage in composites

    NASA Astrophysics Data System (ADS)

    Henry, Christine; Kramb, Victoria; Welter, John T.; Wertz, John N.; Lindgren, Eric A.; Aldrin, John C.; Zainey, David

    2018-04-01

    Advances in NDE method development are greatly aided by model-guided experimentation. In the case of ultrasonic inspections, models which provide insight into complex mode conversion processes and sound propagation paths are essential for understanding the experimental data and inverting the experimental data into relevant information. However, models must also be verified using experimental data obtained under well-documented and understood conditions. Ideally, researchers would utilize the model simulations and experimental approach to efficiently converge on the optimal solution. However, variability in experimental parameters introduces extraneous signals that are difficult to differentiate from the anticipated response. This paper discusses the results of an ultrasonic experiment designed to evaluate the effect of controllable variables on the anticipated signal, and the effect of unaccounted-for experimental variables on the uncertainty in those results. Controlled experimental parameters include the transducer frequency, incidence beam angle and focal depth.

  7. Fuzzy logic control of rotating drum bioreactor for improved production of amylase and protease enzymes by Aspergillus oryzae in solid-state fermentation.

    PubMed

    Sukumprasertsri, Monton; Unrean, Pornkamol; Pimsamarn, Jindarat; Kitsubun, Panit; Tongta, Anan

    2013-03-01

    In this study, we compared the performance of two control systems, fuzzy logic control (FLC) and conventional control (CC). The control systems were applied for controlling temperature and substrate moisture content in a solid-state fermentation for the biosynthesis of amylase and protease enzymes by Aspergillus oryzae. The fermentation process was carried out in a 200 L rotating drum bioreactor. Three factors affecting temperature and moisture content in the solid-state fermentation were considered: inlet air velocity, speed of the rotating drum bioreactor, and spray water addition. The fuzzy logic control system was designed using four input variables: air velocity, substrate temperature, fermentation time, and rotation speed. The temperature was controlled by two variables, inlet air velocity and rotational speed of the bioreactor, while the moisture content was controlled by spray water. Experimental results confirmed that the FLC system could control the temperature and moisture content of the substrate more effectively than the CC system, resulting in increased enzyme production by A. oryzae. Thus, fuzzy logic control is a promising control system that can be applied for enhanced production of enzymes in solid-state fermentation.
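
    A minimal sketch of a Mamdani-style fuzzy rule set for the temperature loop only (not the authors' four-input controller): triangular memberships on the temperature error and a weighted-average defuzzification of the drum-speed command. Membership breakpoints and rule outputs are assumptions.

        def tri(x, a, b, c):
            """Triangular membership function rising from a, peaking at b, falling to c."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def drum_speed_command(temp_error_C):
            # Fuzzify: how "cold", "ok", and "hot" the bed is relative to the setpoint.
            cold = tri(temp_error_C, -10.0, -5.0, 0.0)
            ok   = tri(temp_error_C,  -2.0,  0.0, 2.0)
            hot  = tri(temp_error_C,   0.0,  5.0, 10.0)
            # Rules: a hot bed calls for faster rotation (more convective cooling).
            rule_outputs = {5.0: cold, 10.0: ok, 20.0: hot}   # rpm singletons per rule
            num = sum(rpm * w for rpm, w in rule_outputs.items())
            den = sum(rule_outputs.values()) or 1.0
            return num / den   # weighted-average defuzzification

        print(drum_speed_command(3.5))   # slightly hot bed -> faster drum speed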

  8. Control system design for the MOD-5A 7.3 MW wind turbine generator

    NASA Technical Reports Server (NTRS)

    Barton, Robert S.; Hosp, Theodore J.; Schanzenbach, George P.

    1995-01-01

    This paper provides descriptions of the requirements analysis, hardware development and software development phases of the Control System design for the MOD-5A 7.3 MW Wind Turbine Generator. The system, designed by General Electric Company, Advanced Energy Programs Department, under contract DEN 3-153 with NASA Lewis Research Center and DOE, provides real time regulation of rotor speed by control of both generator torque and rotor torque. A variable speed generator system is used to provide both airgap torque control and reactive power control. The wind rotor is designed with segmented ailerons which are positioned to control blade torque. The central component of the control system, selected early in the design process, is a programmable controller used for sequencing, alarm monitoring, communication, and real time control. Development of requirements for use of aileron controlled blades and a variable speed generator required an analytical simulation that combined drivetrain, tower and blade elastic modes with wind disturbances and control behavior. An orderly two phase plan was used for controller software development. A microcomputer based turbine simulator was used to facilitate hardware and software integration and test.

  9. Process control systems: integrated for future process technologies

    NASA Astrophysics Data System (ADS)

    Botros, Youssry; Hajj, Hazem M.

    2003-06-01

    Process Control Systems (PCS) are becoming more crucial to the success of Integrated Circuit makers due to their direct impact on product quality, cost, and Fab output. The primary objective of PCS is to minimize variability by detecting and correcting non optimal performance. Current PCS implementations are considered disparate, where each PCS application is designed, deployed and supported separately. Each implementation targets a specific area of control such as equipment performance, wafer manufacturing, and process health monitoring. With Intel entering the nanometer technology era, tighter process specifications are required for higher yields and lower cost. This requires areas of control to be tightly coupled and integrated to achieve the optimal performance. This requirement can be achieved via consistent design and deployment of the integrated PCS. PCS integration will result in several benefits such as leveraging commonalities, avoiding redundancy, and facilitating sharing between implementations. This paper will address PCS implementations and focus on benefits and requirements of the integrated PCS. Intel integrated PCS Architecture will be then presented and its components will be briefly discussed. Finally, industry direction and efforts to standardize PCS interfaces that enable PCS integration will be presented.

  10. Model-Data Fusion to Test Hypothesized Drivers of Lake Carbon Cycling Reveals Importance of Physical Controls

    NASA Astrophysics Data System (ADS)

    Hararuk, Oleksandra; Zwart, Jacob A.; Jones, Stuart E.; Prairie, Yves; Solomon, Christopher T.

    2018-03-01

    Formal integration of models and data to test hypotheses about the processes controlling carbon dynamics in lakes is rare, despite the importance of lakes in the carbon cycle. We built a suite of models (n = 102) representing different hypotheses about lake carbon processing, fit these models to data from a north-temperate lake using data assimilation, and identified which processes were essential for adequately describing the observations. The hypotheses that we tested concerned organic matter lability and its variability through time, temperature dependence of biological decay, photooxidation, microbial dynamics, and vertical transport of water via hypolimnetic entrainment and inflowing density currents. The data included epilimnetic and hypolimnetic CO2 and dissolved organic carbon, hydrologic fluxes, carbon loads, gross primary production, temperature, and light conditions at high frequency for one calibration and one validation year. The best models explained 76-81% and 64-67% of the variability in observed epilimnetic CO2 and dissolved organic carbon content in the validation data. Accurately describing C dynamics required accounting for hypolimnetic entrainment and inflowing density currents, in addition to accounting for biological transformations. In contrast, neither photooxidation nor variable organic matter lability improved model performance. The temperature dependence of biological decay (Q10) was estimated at 1.45, significantly lower than the commonly assumed Q10 of 2. By confronting multiple models of lake C dynamics with observations, we identified processes essential for describing C dynamics in a temperate lake at daily to annual scales, while also providing a methodological roadmap for using data assimilation to further improve understanding of lake C cycling.
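
    A minimal worked example of the Q10 temperature dependence referred to above: the decay rate at temperature T scales as Q10 ** ((T - T_ref) / 10), so a 10 degC warming multiplies the rate by Q10 itself. The base rate and reference temperature are illustrative; 1.45 is the fitted value reported above and 2 the commonly assumed one.

        def decay_rate(T_celsius, base_rate=0.01, T_ref=20.0, Q10=1.45):
            """First-order organic-carbon decay rate (per day) adjusted for temperature."""
            return base_rate * Q10 ** ((T_celsius - T_ref) / 10.0)

        print(decay_rate(25.0) / decay_rate(15.0))                    # -> 1.45
        print(decay_rate(25.0, Q10=2.0) / decay_rate(15.0, Q10=2.0))  # -> 2.0 for comparison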

  11. Anomalous CO2 Emissions in Different Ecosystems Around the World

    NASA Astrophysics Data System (ADS)

    Sanchez-Canete, E. P.; Moya Jiménez, M. R.; Kowalski, A. S.; Serrano-Ortiz, P.; López-Ballesteros, A.; Oyonarte, C.; Domingo, F.

    2016-12-01

    As an important tool for understanding and monitoring ecosystem dynamics at the ecosystem level, the eddy covariance (EC) technique allows the assessment of the diurnal and seasonal variation of the net ecosystem exchange (NEE). Despite the high temporal resolution data available, there are still many processes (in addition to photosynthesis and respiration) that, although they are being monitored, have been neglected. Only a few authors have studied anomalous (non-biological) CO2 emissions, and have related them to soil ventilation, photodegradation or geochemical processes. The aims of this study are: 1) to identify anomalous short-term CO2 emissions in different ecosystems distributed around the world, 2) to determine the meteorological variables that are influencing these emissions, and 3) to explore the potential processes that can be involved. We have studied EC data together with other meteorological ancillary variables obtained from the FLUXNET database (version 2015) and have found more than 50 sites with anomalous CO2 emissions in different ecosystem types such as grasslands, croplands or savannas. Data were filtered according to the FLUXNET quality control flags (only data with a quality control flag equal to 0 were used) and correlation analyses were performed with NEE and ancillary data. Preliminary results showed strong and highly significant correlations between meteorological variables and anomalous CO2 emissions. Correlation results showed clearly differing behaviors between ecosystem types, which could be related to the different processes involved in the anomalous CO2 emissions. We suggest that anomalous CO2 emissions are happening globally and therefore their contribution to the global net ecosystem carbon balance requires further investigation in order to better understand their drivers.

  12. Species associations overwhelm abiotic conditions to dictate the structure and function of wood-decay fungal communities.

    PubMed

    Maynard, Daniel S; Covey, Kristofer R; Crowther, Thomas W; Sokol, Noah W; Morrison, Eric W; Frey, Serita D; van Diepen, Linda T A; Bradford, Mark A

    2018-04-01

    Environmental conditions exert strong controls on the activity of saprotrophic microbes, yet abiotic factors often fail to adequately predict wood decomposition rates across broad spatial scales. Given that species interactions can have significant positive and negative effects on wood-decay fungal activity, one possibility is that biotic processes serve as the primary controls on community function, with abiotic controls emerging only after species associations are accounted for. Here we explore this hypothesis in a factorial field warming- and nitrogen-addition experiment by examining relationships among wood decomposition rates, fungal activity, and fungal community structure. We show that functional outcomes and community structure are largely unrelated to abiotic conditions, with microsite and plot-level abiotic variables explaining at most 19% of the total variability in decomposition and fungal activity, and 2% of the variability in richness and evenness. In contrast, taxonomic richness, evenness, and species associations (i.e., co-occurrence patterns) exhibited strong relationships with community function, accounting for 52% of the variation in decomposition rates and 73% in fungal activity. A greater proportion of positive vs. negative species associations in a community was linked to strong declines in decomposition rates and richness. Evenness emerged as a key mediator between richness and function, with highly even communities exhibiting a positive richness-function relationship and uneven communities exhibiting a negative or null response. These results suggest that community-assembly processes and species interactions are important controls on the function of wood-decay fungal communities, ultimately overwhelming substantial differences in abiotic conditions. © 2018 by the Ecological Society of America.

  13. Moderators of neuropsychological mechanism in attention-deficit hyperactivity disorder.

    PubMed

    Nikolas, Molly A; Nigg, Joel T

    2015-02-01

    Neuropsychological measures have been proposed both as a way to tap mechanisms and as endophenotypes for child ADHD. However, substantial evidence supporting heterogeneity in neuropsychological performance among youth with ADHD, as well as apparent effect differences by sex, age, and comorbidity, have slowed progress. To address this, it is important to understand sibling effects in relation to these moderators. 461 youth ages 6-17 years (54.8% male, including 251 youth with ADHD, 107 of their unaffected biological siblings, and 103 non-ADHD controls) completed diagnostic interviews and a theoretically informed battery of neuropsychological measures. A structural equation model was used to consolidate neuropsychological domains. Group differences between unaffected siblings of youth with ADHD and controls across each domain were first examined as the primary endophenotype test for ADHD. Moderation of these effects was evaluated via investigation of interactions between diagnostic group and both proband and individual level characteristics, including sex, age, and comorbidity status. Unaffected siblings performed worse than control youth in the domains of inhibition, response time variability, and temporal information processing. Individual age moderated these effects, such that differences between controls and unaffected siblings were pronounced among younger children (ages 6-10 years) but absent among older youth (ages 11-17 years). Evidence for moderation of effects by proband sex and comorbidity status produced more variable and smaller effects. Results support the utility of inhibition, response time variability, and temporal processing as useful endophenotypes for ADHD in future genetic association studies of the disorder, but suggest this value will vary by age among unaffected family members.

  14. CHAM: weak signals detection through a new multivariate algorithm for process control

    NASA Astrophysics Data System (ADS)

    Bergeret, François; Soual, Carole; Le Gratiet, B.

    2016-10-01

    Derivative technologies based on core CMOS processes are aggressive in terms of design rules and process control requirements. The process control plan is derived from Process Assumption (PA) calculations, which result in design rules based on known process variability capabilities, with enough margin to be safe not only for yield but especially for reliability. Even though process assumptions are calculated with a known 4-sigma process capability margin, efficient and competitive designs challenge the process, especially for derivative technologies at the 40 and 28 nm nodes. For wafer fab process control, PAs are broken down into monovariate control charts (layer1 CD, layer2 CD, layer2-to-layer1 overlay, layer3 CD, etc.) with appropriate specifications and control limits, which together secure the silicon. This works well so far, but such a system is not sensitive to weak signals arising from interactions of multiple key parameters (for example, a high layer2 CD combined with a high layer3 CD). CHAM is a software package built on an advanced statistical algorithm specifically designed to detect small signals, especially when there are many parameters to control and when the parameters can interact to create yield issues. In this presentation we first present the CHAM algorithm, then a case study on critical dimensions with its results, and we conclude with future work. This partnership between Ippon and STM is part of E450LMDAP, a European project dedicated to metrology and lithography development for future technology nodes, especially 10 nm.
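
    The CHAM algorithm itself is not spelled out in the abstract, so the sketch below shows a classical point of reference for the same task rather than CHAM: a Hotelling T² multivariate control chart, which flags combinations of individually in-spec parameters (for example, layer2 CD and layer3 CD both high) that monovariate charts would accept. All parameter names, means and covariances are hypothetical.

    ```python
    # Not the CHAM algorithm: a minimal Hotelling T^2 multivariate control chart,
    # the classical way to catch unusual combinations of parameters that are each
    # individually within their monovariate limits. All values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical in-control reference data: 500 lots x 3 correlated CDs (nm).
    cov = np.array([[1.0, 0.6, 0.3],
                    [0.6, 1.0, 0.5],
                    [0.3, 0.5, 1.0]])
    reference = rng.multivariate_normal(mean=[40.0, 55.0, 32.0], cov=cov, size=500)

    center = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

    def hotelling_t2(x):
        """T^2 distance of one observation from the reference distribution."""
        d = x - center
        return float(d @ cov_inv @ d)

    # A lot whose CDs are each within ~2 sigma but whose combination is unusual
    # relative to the learned correlation structure.
    suspect_lot = center + np.array([1.8, 1.8, -1.8])
    print(f"T^2 = {hotelling_t2(suspect_lot):.1f}")  # compare to a chi-square(3) limit, ~11.3 at alpha = 0.01
    ```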

  15. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215
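
    As a rough illustration of the system-identification step described above (not the authors' model, apparatus or data), the sketch below fits a first-order ARX model to synthetic step-change data by least squares; the input stands in for a galactose setpoint and the output for %galactosylation.

    ```python
    # Generic system-identification sketch: fit y[k+1] = a*y[k] + b*u[k] by least
    # squares to serialized step-change data, as one simple stand-in for the kind
    # of predictive model a model predictive controller needs. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)

    u = np.repeat([0.0, 2.0, 1.0, 3.0], 25)            # serialized step changes (input)
    y = np.zeros(len(u))                               # simulated response (output)
    for k in range(len(u) - 1):
        y[k + 1] = 0.9 * y[k] + 0.05 * u[k] + rng.normal(scale=0.01)

    # Least-squares estimate of [a, b] from regressors [y[k], u[k]].
    X = np.column_stack([y[:-1], u[:-1]])
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    a_hat, b_hat = theta
    print(f"a = {a_hat:.3f}, b = {b_hat:.3f}, steady-state gain = {b_hat / (1 - a_hat):.2f}")
    ```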

  16. Optimality and stability of intentional and unintentional actions: I. Origins of drifts in performance.

    PubMed

    Parsa, Behnoosh; Terekhov, Alexander; Zatsiorsky, Vladimir M; Latash, Mark L

    2017-02-01

    We address the nature of unintentional changes in performance in two papers. This first paper tested a hypothesis that unintentional changes in performance variables during continuous tasks without visual feedback are due to two processes. First, there is a drift of the referent coordinate for the salient performance variable toward the actual coordinate of the effector. Second, there is a drift toward minimum of a cost function. We tested this hypothesis in four-finger isometric pressing tasks that required the accurate production of a combination of total moment and total force with natural and modified finger involvement. Subjects performed accurate force-moment production tasks under visual feedback, and then visual feedback was removed for some or all of the salient variables. Analytical inverse optimization was used to compute a cost function. Without visual feedback, both force and moment drifted slowly toward lower absolute magnitudes. Over 15 s, the force drop could reach 20% of its initial magnitude while moment drop could reach 30% of its initial magnitude. Individual finger forces could show drifts toward both higher and lower forces. The cost function estimated using the analytical inverse optimization reduced its value as a consequence of the drift. We interpret the results within the framework of hierarchical control with referent spatial coordinates for salient variables at each level of the hierarchy combined with synergic control of salient variables. The force drift is discussed as a natural relaxation process toward states with lower potential energy in the physical (physiological) system involved in the task.

  17. Optimality and stability of intentional and unintentional actions: I. Origins of drifts in performance

    PubMed Central

    Parsa, Behnoosh; Terekhov, Alexander; Zatsiorsky, Vladimir M.; Latash, Mark L.

    2016-01-01

    We address the nature of unintentional changes in performance in two papers. This first paper tested a hypothesis that unintentional changes in performance variables during continuous tasks without visual feedback are due to two processes. First, there is a drift of the referent coordinate for the salient performance variable toward the actual coordinate of the effector. Second, there is a drift toward minimum of a cost function. We tested this hypothesis in four-finger isometric pressing tasks that required the accurate production of a combination of total moment and total force with natural and modified finger involvement. Subjects performed accurate force/moment production tasks under visual feedback, and then visual feedback was removed for some or all of the salient variables. Analytical inverse optimization was used to compute a cost function. Without visual feedback, both force and moment drifted slowly toward lower absolute magnitudes. Over 15 s, the force drop could reach 20% of its initial magnitude while moment drop could reach 30% of its initial magnitude. Individual finger forces could show drifts toward both higher and lower forces. The cost function estimated using the analytical inverse optimization reduced its value as a consequence of the drift. We interpret the results within the framework of hierarchical control with referent spatial coordinates for salient variables at each level of the hierarchy combined with synergic control of salient variables. The force drift is discussed as a natural relaxation process toward states with lower potential energy in the physical (physiological) system involved in the task. PMID:27785549

  18. Functional variability of habitats within the Sacramento-San Joaquin Delta: Restoration implications

    USGS Publications Warehouse

    Lucas, L.V.; Cloern, J.E.; Thompson, J.K.; Monsen, N.E.

    2002-01-01

    We have now entered an era of large-scale attempts to restore ecological functions and biological communities in impaired ecosystems. Our knowledge base of complex ecosystems and interrelated functions is limited, so the outcomes of specific restoration actions are highly uncertain. One approach for exploring that uncertainty and anticipating the range of possible restoration outcomes is comparative study of existing habitats similar to future habitats slated for construction. Here we compare two examples of one habitat type targeted for restoration in the Sacramento-San Joaquin River Delta. We compare one critical ecological function provided by these shallow tidal habitats - production and distribution of phytoplankton biomass as the food supply to pelagic consumers. We measured spatial and short-term temporal variability of phytoplankton biomass and growth rate and quantified the hydrodynamic and biological processes governing that variability. Results show that the production and distribution of phytoplankton biomass can be highly variable within and between nearby habitats of the same type, due to variations in phytoplankton sources, sinks, and transport. Therefore, superficially similar, geographically proximate habitats can function very differently, and that functional variability introduces large uncertainties into the restoration process. Comparative study of existing habitats is one way ecosystem science can elucidate and potentially minimize restoration uncertainties, by identifying processes shaping habitat functionality, including those that can be controlled in the restoration design.

  19. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
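
    As a minimal worked example of the orthogonal-array and signal-to-noise modeling mentioned above (illustrative, not taken from the review), the sketch below computes larger-is-better S/N ratios for a hypothetical L4 array and estimates each factor's main effect.

    ```python
    # Minimal Taguchi-style analysis: "larger-is-better" signal-to-noise ratios
    # for an L4 (2^3) orthogonal array, then main-effect estimates per factor.
    # The replicate yields are hypothetical (e.g., product titre).
    import numpy as np

    # L4 orthogonal array: 4 runs x 3 two-level factors (levels coded 1/2).
    l4 = np.array([[1, 1, 1],
                   [1, 2, 2],
                   [2, 1, 2],
                   [2, 2, 1]])

    yields = np.array([[12.1, 11.8],   # replicates for run 1
                       [14.0, 13.6],
                       [ 9.9, 10.4],
                       [13.2, 12.7]])

    # Larger-is-better S/N ratio: -10*log10(mean(1/y^2)) per run.
    sn = -10.0 * np.log10(np.mean(1.0 / yields**2, axis=1))

    # Main effect of each factor = mean S/N at level 2 minus mean S/N at level 1.
    for j in range(l4.shape[1]):
        effect = sn[l4[:, j] == 2].mean() - sn[l4[:, j] == 1].mean()
        print(f"factor {j + 1}: main effect on S/N = {effect:+.2f} dB")
    ```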

  20. Mental training enhances attentional stability: Neural and behavioral evidence

    PubMed Central

    Lutz, Antoine; Slagter, Heleen A.; Rawlings, Nancy B.; Francis, Andrew D.; Greischar, Lawrence L.; Davidson, Richard J.

    2009-01-01

    The capacity to stabilize the content of attention over time varies among individuals and its impairment is a hallmark of several mental illnesses. Impairments in sustained attention in patients with attention disorders have been associated with increased trial-to-trial variability in reaction time and event-related potential (ERP) deficits during attention tasks. At present, it is unclear whether the ability to sustain attention and its underlying brain circuitry are transformable through training. Here, we show, with dichotic listening task performance and electroencephalography (EEG), that training attention, as cultivated by meditation, can improve the ability to sustain attention. Three months of intensive meditation training reduced variability in attentional processing of target tones, as indicated by both enhanced theta-band phase consistency of oscillatory neural responses over anterior brain areas and reduced reaction time variability. Furthermore, those individuals who showed the greatest increase in neural response consistency showed the largest decrease in behavioral response variability. Notably, we also observed reduced variability in neural processing, in particular in low-frequency bands, regardless of whether the deviant tone was attended or unattended. Focused attention meditation may thus affect both distracter and target processing, perhaps by enhancing entrainment of neuronal oscillations to sensory input rhythms; a mechanism important for controlling the content of attention. These novel findings highlight the mechanisms underlying focused attention meditation, and support the notion that mental training can significantly affect attention and brain function. PMID:19846729

  1. High-resolution IP25-based reconstruction of sea-ice variability in the western North Pacific and Bering Sea during the past 18,000 years

    NASA Astrophysics Data System (ADS)

    Méheust, Marie; Stein, Ruediger; Fahl, Kirsten; Max, Lars; Riethdorf, Jan-Rainer

    2016-04-01

    Due to its strong influence on heat and moisture exchange between the ocean and the atmosphere, sea ice is an essential component of the global climate system. In the context of its alarming decrease in terms of concentration, thickness and duration, understanding the processes controlling sea-ice variability and reconstructing paleo-sea-ice extent in polar regions have become of great interest for the scientific community. In this study, for the first time, IP25, a recently developed biomarker sea-ice proxy, was used for a high-resolution reconstruction of the sea-ice extent and its variability in the western North Pacific and western Bering Sea during the past 18,000 years. To identify mechanisms controlling the sea-ice variability, IP25 data were associated with published sea-surface temperature as well as diatom and biogenic opal data. The results indicate that a seasonal sea-ice cover existed during cold periods (Heinrich Stadial 1 and Younger Dryas), whereas during warmer intervals (Bølling-Allerød and Holocene) reduced sea ice or ice-free conditions prevailed in the study area. The variability in sea-ice extent seems to be linked to climate anomalies and sea-level changes controlling the oceanographic circulation between the subarctic Pacific and the Bering Sea, especially the Alaskan Stream injection through the Aleutian passes.

  2. Coupling the Solar-Wind/IMF to the Ionosphere through the High Latitude Cusps

    NASA Technical Reports Server (NTRS)

    Maynard, Nelson C.

    2003-01-01

    Magnetic merging is a primary means for coupling energy from the solar wind into the magnetosphere-ionosphere system. The location and nature of the process remain open questions. By correlating measurements from diverse locations and using large-scale MHD models to put the measurements in context, it is possible to constrain our interpretations of the global and meso-scale dynamics of magnetic merging. Recent evidence demonstrates that merging often occurs at high latitudes in the vicinity of the cusps. The location is in part controlled by the clock angle in the interplanetary magnetic field (IMF) Y-Z plane. In fact, B(sub Y) bifurcates the cusp relative to source regions. The newly opened field lines may couple to the ionosphere at MLT locations as much as 3 hr away from local noon. On the other side of noon the cusp may be connected to merging sites in the opposite hemisphere. In fact, the small convection cell is generally driven by opposite-hemisphere merging. B(sub X) controls the timing of the interaction and merging sites in each hemisphere, which may respond to planar features in the IMF at different times. Correlation times are variable and are controlled by the dynamics of the tilt of the interplanetary electric field phase plane. The orientation of the phase plane may change significantly on time scales of tens of minutes. Merging is temporally variable and may be occurring at multiple sites simultaneously. Accelerated electrons from the merging process excite optical signatures at the foot of the newly opened field lines. All-sky photometer observations of 557.7 nm emissions in the cusp region provide a "television picture" of the merging process and may be used to infer the temporal and spatial variability of merging, tied to variations in the IMF.

  3. Controls and variability of solute and sedimentary fluxes in Arctic and sub-Arctic Environments

    NASA Astrophysics Data System (ADS)

    Dixon, John

    2015-04-01

    Six major factors consistently emerge as controls on the spatial and temporal variability in sediment and solute fluxes in cold climates. They are climatic, geologic, physiographic or relief, biologic, hydrologic, and regolith factors. The impact of these factors on sediment and solute mass transfer in Arctic and sub-Arctic environments is examined. Comparison of non-glacierized Arctic vs. subarctic drainage basins reveals the effects of these controls. All drainage basins exhibit considerable variability in rates of sediment and solute fluxes. For the non-glacierized drainage basins there is a consistent increase in sediment mass transfer by slope processes and fluvial processes as relief increases. Similarly, a consistent increase in sediment mass transfer by slope and fluvial processes is observed as total precipitation increases. Similar patterns are also observed with respect to solute transport and relief and precipitation. Lithologic factors are most strongly observed in the contrast between volcanic vs. plutonic igneous bedrock substrates. Basins underlain by volcanic rocks display greater mass transfers than those underlain by plutonic rocks. Biologic influences are most strongly expressed by variations in extent of vegetation cover and the degree of human interference, with human impacted basins generating greater fluxes. For glacierized basins the fundamental difference to non-glacierized basins is an overall increase in mean annual mass transfers of sediment and a generally smaller magnitude solute transfer. The principal role of geology is observed with respect to lithology. Catchments underlain by limestone demonstrate substantially greater solute mass transfers than sediment transfer. The influence of relief is seen in the contrast in mass transfers between upland and lowland drainage basins with upland basins generating greater sediment and solute transfers than lowland basins. For glacierized basins the effects of biology and regolith appear to be largely overridden by the hydrologic impacts of glacierization.

  4. Geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin

    NASA Astrophysics Data System (ADS)

    Vibhava, F.; Graham, W. D.; Maxwell, R. M.

    2012-12-01

    Streamflow at any given location and time is representative of surface and subsurface contributions from various sources. The ability to fully identify the factors controlling these contributions is key to successfully understanding the transport of contaminants through the system. In this study we developed a fully integrated 3D surface water-groundwater-land surface model, PARFLOW, to evaluate geologic and climatic controls on streamflow generation processes in a complex eogenetic karst basin in North Central Florida. In addition to traditional model evaluation criterion, such as comparing field observations to model simulated streamflow and groundwater elevations, we quantitatively evaluated the model's predictions of surface-groundwater interactions over space and time using a suite of binary end-member mixing models that were developed using observed specific conductivity differences among surface and groundwater sources throughout the domain. Analysis of model predictions showed that geologic heterogeneity exerts a strong control on both streamflow generation processes and land atmospheric fluxes in this watershed. In the upper basin, where the karst aquifer is overlain by a thick confining layer, approximately 92% of streamflow is "young" event flow, produced by near stream rainfall. Throughout the upper basin the confining layer produces a persistent high surficial water table which results in high evapotranspiration, low groundwater recharge and thus negligible "inter-event" streamflow. In the lower basin, where the karst aquifer is unconfined, deeper water tables result in less evapotranspiration. Thus, over 80% of the streamflow is "old" subsurface flow produced by diffuse infiltration through the epikarst throughout the lower basin, and all surface contributions to streamflow originate in the upper confined basin. Climatic variability provides a secondary control on surface-subsurface and land-atmosphere fluxes, producing significant seasonal and interannual variability in these processes. Spatial and temporal patterns of evapotranspiration, groundwater recharge and streamflow generation processes reveal potential hot spots and hot moments for surface and groundwater contamination in this basin.

  5. Differences in baseline and process variables between non-responders and responders in Internet-based cognitive behavior therapy for chronic tinnitus.

    PubMed

    Probst, Thomas; Weise, Cornelia; Andersson, Gerhard; Kleinstäuber, Maria

    2018-06-06

    Although Internet-based cognitive behavior therapy (iCBT) is an effective treatment for chronic tinnitus, several patients do not improve. In the current study, baseline and process variables were compared between non-responders and responders. Data from patients participating in two randomized controlled trials on iCBT for chronic tinnitus were re-analyzed. Based on the literature, a pre-post improvement on the "Tinnitus Handicap Inventory" (THI) of less than seven points was used to operationalize non-response. Associations between non-response and baseline variables (age, gender, and questionnaire scores), patient progress (THI), the process of the therapeutic alliance ("Working Alliance Inventory-Short Revised"; WAI-SR), as well as other process variables (number of logins, number of messages sent from therapists to patients) were investigated. The results showed that non-responders had a less favorable change on the THI than responders as early as mid-treatment (p < .05). The alliance (WAI-SR) during iCBT was not associated with non-response. Non-responders showed more severe sleep disturbances, logged in to the iCBT platform less often, and received fewer messages from the therapists than responders, but most of these differences were no longer significant after correcting for multiple testing. In conclusion, a lack of symptom change in the first half of iCBT for chronic tinnitus is a risk factor for not benefiting from iCBT.

  6. Comparison of the basin-scale effect of dredging operations and natural estuarine processes on suspended sediment concentration

    USGS Publications Warehouse

    Schoellhamer, D.H.

    2002-01-01

    Suspended sediment concentration (SSC) data from San Pablo Bay, California, were analyzed to compare the basin-scale effect of dredging and disposal of dredged material (dredging operations) and natural estuarine processes. The analysis used twelve 3-wk to 5-wk periods of mid-depth and near-bottom SSC data collected at Point San Pablo every 15 min from 1993-1998. Point San Pablo is within a tidal excursion of a dredged-material disposal site. The SSC data were compared to dredging volume, Julian day, and hydrodynamic and meteorological variables that could affect SSC. Kendall's τ, Spearman's ρ, and weighted (by the fraction of valid data in each period) Spearman's ρw correlation coefficients of the variables indicated which variables were significantly correlated with SSC. Wind-wave resuspension had the greatest effect on SSC. Median water-surface elevation was the primary factor affecting mid-depth SSC. Greater depths inhibit wind-wave resuspension of bottom sediment and indicate greater influence of less turbid water from down estuary. Seasonal variability in the supply of erodible sediment is the primary factor affecting near-bottom SSC. Natural physical processes in San Pablo Bay are more areally extensive, of equal or longer duration, and as frequent as dredging operations (when occurring), and they affect SSC at the tidal time scale. Natural processes control SSC at Point San Pablo even when dredging operations are occurring.
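
    A brief sketch of the rank-correlation screening described above, using SciPy's Kendall τ and Spearman ρ on synthetic data; the study's period-weighted Spearman variant and the actual SSC series are not reproduced here.

    ```python
    # Rank-correlation screening of a candidate driver against SSC, on synthetic
    # data (hypothetical wind speeds and SSC values; not the San Pablo Bay data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    wind_speed = rng.gamma(shape=2.0, scale=3.0, size=200)           # hypothetical daily wind (m/s)
    ssc = 20 + 4.0 * wind_speed + rng.normal(scale=8.0, size=200)    # hypothetical SSC (mg/L)

    tau, p_tau = stats.kendalltau(wind_speed, ssc)
    rho, p_rho = stats.spearmanr(wind_speed, ssc)
    print(f"Kendall tau = {tau:.2f} (p = {p_tau:.1e}); Spearman rho = {rho:.2f} (p = {p_rho:.1e})")
    ```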

  7. Frontal Alpha Oscillations and Attentional Control: A Virtual Reality Neurofeedback Study.

    PubMed

    Berger, Anna M; Davelaar, Eddy J

    2018-05-15

    Two competing views about alpha oscillations suggest that cortical alpha reflect either cortical inactivity or cortical processing efficiency. We investigated the role of alpha oscillations in attentional control, as measured with a Stroop task. We used neurofeedback to train 22 participants to increase their level of alpha amplitude. Based on the conflict/control loop theory, we selected to train prefrontal alpha and focus on the Gratton effect as an index of deployment of attentional control. We expected an increase or a decrease in the Gratton effect with increase in neural learning depending on whether frontal alpha oscillations reflect cortical idling or enhanced processing efficiency, respectively. In order to induce variability in neural learning beyond natural occurring individual differences, we provided half of the participants with feedback on alpha amplitude in a 3-dimensional (3D) virtual reality environment and the other half received feedback in a 2D environment. Our results showed variable neural learning rates, with larger rates in the 3D compared to the 2D group, corroborating prior evidence of individual differences in EEG-based learning and the influence of a virtual environment. Regression analyses revealed a significant association between the learning rate and changes on deployment of attentional control, with larger learning rates being associated with larger decreases in the Gratton effect. This association was not modulated by feedback medium. The study supports the view of frontal alpha oscillations being associated with efficient neurocognitive processing and demonstrates the utility of neurofeedback training in addressing theoretical questions in the non-neurofeedback literature. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.

  8. Study of flexural rigidity of weavable powder-coated towpreg

    NASA Technical Reports Server (NTRS)

    Hirt, Douglas E.; Marchello, Joseph M.; Baucom, Robert M.

    1990-01-01

    An effort has been made to weave powder-impregnated tow into a two-dimensional preform, controlling process variables to obtain high flexural rigidity in the warp direction and greater flexibility in the fill direction. The resulting prepregs have been consolidated into laminates with LaRC-TPI matrices. Complementary SEM and DSC studies have been performed to deepen understanding of the relationship between tow flexibility and heat treatment. Attention is also given to the effects of the oven temperature and residence time variables on powder/fiber fusion.

  9. Interannual variability of terrestrial NEP and its attributions to carbon uptake amplitude and period

    NASA Astrophysics Data System (ADS)

    Niu, S.

    2015-12-01

    The Earth system exhibits strong interannual variability (IAV) in the global carbon cycle, as reflected in the year-to-year anomalies of the atmospheric CO2 concentration. Although various analyses suggest that land ecosystems contribute most of the IAV of atmospheric CO2 concentration, the processes leading to the IAV in the terrestrial carbon (C) cycle are far from clear, which hinders our effort to predict the IAV of the global C cycle. Previous studies on the IAV of the global C cycle have focused on the regulation of climatic variables in tropical or semiarid areas, but have generated inconsistent conclusions. Using long-term eddy-flux measurements of net ecosystem production (NEP), atmospheric CO2 inversion NEP, and the MODIS-derived gross primary production (GPP), we demonstrate that seasonal carbon uptake amplitude (CUA) and period (CUP) are two key processes that control the IAV in the terrestrial C cycle. The two processes together explain 78% of the variation in the IAV of eddy covariance NEP, 70% in global atmospheric inversion NEP, and 53% in the IAV of GPP. Moreover, the three lines of evidence consistently show that variability in CUA is much more important than that of CUP in determining the variation of NEP at most eddy-flux sites and in most grid cells of global NEP and GPP. Our results suggest that the maximum carbon uptake potential in the peak growing season is a determinant of the interannual variability of the global C cycle, and that the carbon uptake period may play a less important role than previously expected. This study uncovers the most parsimonious, proximate processes underlying the IAV in the global C cycle of the Earth system. Future research is needed to identify how climate factors affect the IAV in the terrestrial C cycle through their influence on CUA and CUP.

  10. Ecohydrologic processes and soil thickness feedbacks control limestone-weathering rates in a karst landscape

    DOE PAGES

    Dong, Xiaoli; Cohen, Matthew J.; Martin, Jonathan B.; ...

    2018-05-18

    Here, chemical weathering of bedrock plays an essential role in the formation and evolution of Earth's critical zone. Over geologic time, the negative feedback between temperature and chemical weathering rates contributes to the regulation of Earth's climate. The challenge of understanding weathering rates and the resulting evolution of critical zone structures lies in complicated interactions and feedbacks among environmental variables, local ecohydrologic processes, and soil thickness, the relative importance of which remains unresolved. We investigate these interactions using a reactive-transport kinetics model, focusing on a low-relief, wetland-dominated karst landscape (Big Cypress National Preserve, South Florida, USA) as a case study. Across a broad range of environmental variables, model simulations highlight primary controls of climate and soil biological respiration, where soil thickness both supplies and limits transport of biologically derived acidity. Consequently, the weathering rate maximum occurs at intermediate soil thickness. The value of the maximum weathering rate and the precise soil thickness at which it occurs depend on several environmental variables, including precipitation regime, soil inundation, vegetation characteristics, and rate of groundwater drainage. Simulations for environmental conditions specific to Big Cypress suggest that wetland depressions in this landscape began to form around the beginning of the Holocene with gradual dissolution of limestone bedrock and attendant soil development, highlighting the large influence of age-varying soil thickness on weathering rates and consequent landscape development. While climatic variables are often considered most important for chemical weathering, our results indicate that soil thickness and biotic activity are equally important. Weathering rates reflect complex interactions among soil thickness, climate, and local hydrologic and biotic processes, which jointly shape the supply and delivery of chemical reactants, and the resulting trajectories of critical zone and karst landscape development.

  11. Ecohydrologic processes and soil thickness feedbacks control limestone-weathering rates in a karst landscape

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Xiaoli; Cohen, Matthew J.; Martin, Jonathan B.

    Here, chemical weathering of bedrock plays an essential role in the formation and evolution of Earth's critical zone. Over geologic time, the negative feedback between temperature and chemical weathering rates contributes to the regulation of Earth's climate. The challenge of understanding weathering rates and the resulting evolution of critical zone structures lies in complicated interactions and feedbacks among environmental variables, local ecohydrologic processes, and soil thickness, the relative importance of which remains unresolved. We investigate these interactions using a reactive-transport kinetics model, focusing on a low-relief, wetland-dominated karst landscape (Big Cypress National Preserve, South Florida, USA) as a case study. Across a broad range of environmental variables, model simulations highlight primary controls of climate and soil biological respiration, where soil thickness both supplies and limits transport of biologically derived acidity. Consequently, the weathering rate maximum occurs at intermediate soil thickness. The value of the maximum weathering rate and the precise soil thickness at which it occurs depend on several environmental variables, including precipitation regime, soil inundation, vegetation characteristics, and rate of groundwater drainage. Simulations for environmental conditions specific to Big Cypress suggest that wetland depressions in this landscape began to form around the beginning of the Holocene with gradual dissolution of limestone bedrock and attendant soil development, highlighting the large influence of age-varying soil thickness on weathering rates and consequent landscape development. While climatic variables are often considered most important for chemical weathering, our results indicate that soil thickness and biotic activity are equally important. Weathering rates reflect complex interactions among soil thickness, climate, and local hydrologic and biotic processes, which jointly shape the supply and delivery of chemical reactants, and the resulting trajectories of critical zone and karst landscape development.

  12. Mountain Hydrology of the Semi-Arid Western U.S.: Research Needs, Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Bales, R.; Dozier, J.; Molotch, N.; Painter, T.; Rice, R.

    2004-12-01

    In the semi-arid Western U.S., water resources are being stressed by the combination of climate warming, changing land use, and population growth. Multiple consensus planning documents point to this region as perhaps the highest priority for new hydrologic understanding. Three main hydrologic issues illustrate research needs in the snow-driven hydrology of the region. First, despite the hydrologic importance of mountainous regions, the processes controlling their energy, water and biogeochemical fluxes are not well understood. Second, there exists a need to realize, at various spatial and temporal scales, the feedback systems between hydrological fluxes and biogeochemical and ecological processes. Third, the paucity of adequate observation networks in mountainous regions hampers improvements in understanding these processes. For example, we lack an adequate description of factors controlling the partitioning of snowmelt into runoff versus infiltration and evapotranspiration, and need strategies to accurately measure the variability of precipitation, snow cover and soil moisture. The amount of mountain-block and mountain-front recharge and how recharge patterns respond to climate variability are poorly known across the mountainous West. Moreover, hydrologic modelers and those measuring important hydrologic variables from remote sensing and distributed in situ sites have failed to bridge rifts between modeling needs and available measurements. Research and operational communities will benefit from data fusion/integration, improved measurement arrays, and rapid data access. For example, the hydrologic modeling community would advance if given new access to single rather than disparate sources of bundles of cutting-edge remote sensing retrievals of snow covered area and albedo, in situ measurements of snow water equivalent and precipitation, and spatio-temporal fields of variables that drive models. In addition, opportunities exist for the deployment of new technologies, taking advantage of research in spatially distributed sensor networks that can enhance data recovery and analysis.

  13. Assessing the sources and magnitude of diurnal nitrate variability in the San Joaquin River (California) with an in situ optical nitrate sensor and dual nitrate isotopes

    USGS Publications Warehouse

    Pellerin, Brian A.; Downing, Bryan D.; Kendall, Carol; Dahlgren, Randy A.; Kraus, Tamara E.C.; Saraceno, John Franco; Spencer, Robert G. M.; Bergamaschi, Brian A.

    2009-01-01

    1. We investigated diurnal nitrate (NO3−) concentration variability in the San Joaquin River using an in situ optical NO3− sensor and discrete sampling during a 5‐day summer period characterized by high algal productivity. Dual NO3− isotopes (δ15NNO3 and δ18ONO3) and dissolved oxygen isotopes (δ18ODO) were measured over 2 days to assess NO3− sources and biogeochemical controls over diurnal time‐scales. 2. Concerted temporal patterns of dissolved oxygen (DO) concentrations and δ18ODO were consistent with photosynthesis, respiration and atmospheric O2 exchange, providing evidence of diurnal biological processes independent of river discharge. 3. Surface water NO3− concentrations varied by up to 22% over a single diurnal cycle and up to 31% over the 5‐day study, but did not reveal concerted diurnal patterns at a frequency comparable to DO concentrations. The decoupling of δ15NNO3 and δ18ONO3 isotopes suggests that algal assimilation and denitrification are not major processes controlling diurnal NO3− variability in the San Joaquin River during the study. The lack of a clear explanation for NO3− variability likely reflects a combination of riverine biological processes and time‐varying physical transport of NO3− from upstream agricultural drains to the mainstem San Joaquin River. 4. The application of an in situ optical NO3− sensor along with discrete samples provides a view into the fine temporal structure of hydrochemical data and may allow for greater accuracy in pollution assessment.

  14. Characterization of Noise Signatures of Involuntary Head Motion in the Autism Brain Imaging Data Exchange Repository

    PubMed Central

    Caballero, Carla; Mistry, Sejal; Vero, Joe; Torres, Elizabeth B

    2018-01-01

    The variability inherently present in biophysical data is partly contributed by disparate sampling resolutions across instrumentations. This poses a potential problem for statistical inference using pooled data in open access repositories. Such repositories combine data collected from multiple research sites using variable sampling resolutions. One example is the Autism Brain Imaging Data Exchange repository containing thousands of imaging and demographic records from participants in the spectrum of autism and age-matched neurotypical controls. Further, statistical analyses of groups from different diagnoses and demographics may be challenging, owing to the disparate number of participants across different clinical subgroups. In this paper, we examine the noise signatures of head motion data extracted from resting state fMRI data harnessed under different sampling resolutions. We characterize the quality of the noise in the variability of the raw linear and angular speeds for different clinical phenotypes in relation to age-matched controls. Further, we use bootstrapping methods to ensure compatible group sizes for statistical comparison and report the ranges of physical involuntary head excursions of these groups. We conclude that different sampling rates do affect the quality of noise in the variability of head motion data and, consequently, the type of random process appropriate to characterize the time series data. Further, given a qualitative range of noise, from pink to brown noise, it is possible to characterize different clinical subtypes and distinguish them in relation to ranges of neurotypical controls. These results may be of relevance to the pre-processing stages of the pipeline of analyses of resting state fMRI data, whereby head motion enters the criteria to clean imaging data from motion artifacts. PMID:29556179
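
    The group-size balancing step mentioned above can be illustrated with a simple bootstrap, sketched below on synthetic head-motion speeds; this is illustrative only and not the authors' exact procedure.

    ```python
    # Bootstrap-resample a large control group down to the size of a smaller
    # clinical subgroup and summarize the spread of a head-motion statistic.
    # All speeds below are synthetic.
    import numpy as np

    rng = np.random.default_rng(4)
    controls = rng.gamma(shape=2.0, scale=0.05, size=400)   # hypothetical linear speeds (mm/s)
    subgroup = rng.gamma(shape=2.5, scale=0.06, size=60)    # hypothetical clinical subgroup

    n_boot, n_match = 2000, len(subgroup)
    medians = np.array([np.median(rng.choice(controls, size=n_match, replace=True))
                        for _ in range(n_boot)])

    lo, hi = np.percentile(medians, [2.5, 97.5])
    print(f"control median (matched n={n_match}): 95% interval [{lo:.3f}, {hi:.3f}] mm/s")
    print(f"clinical subgroup median: {np.median(subgroup):.3f} mm/s")
    ```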

  15. The relationship of working memory, inhibition, and response variability in child psychopathology.

    PubMed

    Verté, Sylvie; Geurts, Hilde M; Roeyers, Herbert; Oosterlaan, Jaap; Sergeant, Joseph A

    2006-02-15

    The aim of this study was to investigate the relationship between working memory and inhibition in children with attention deficit hyperactivity disorder (ADHD), high-functioning autism (HFA), and Tourette syndrome (TS), compared to normally developing children. Furthermore, the contribution of variation in processing speed on working memory and inhibition was investigated in these childhood psychopathologies. Four groups of children are reported in this study: 65 children with ADHD, 66 children with HFA, 24 children with TS, and 82 normal control children. All children were in the age range of 6-13 years. The relationship between working memory and inhibition was similar in children with ADHD, HFA, TS, and normally developing children. The relationship between both domains did not alter significantly for any of the groups, when variation in processing speed was taken into account. More symptoms of hyperactivity/impulsivity are related to a poorer inhibitory process and greater response variability. More symptoms of autism are related to a poorer working memory process. The current study showed that working memory, inhibition, and response variability, are distinct, but related cognitive domains in children with developmental psychopathologies. Research with experimental manipulations is needed to tackle the exact relationship between these cognitive domains.

  16. Carbothermic Synthesis of ~820-μm UN Kernels. Investigation of Process Variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindemer, Terrence; Silva, Chinthaka M; Henry, Jr, John James

    2015-06-01

    This report details the continued investigation of process variables involved in converting sol-gel-derived, urania-carbon microspheres to ~820-μm-dia. UN fuel kernels in flow-through, vertical refractory-metal crucibles at temperatures up to 2123 K. Experiments included calcining of air-dried UO3-H2O-C microspheres in Ar and H2-containing gases, conversion of the resulting UO2-C kernels to dense UO2:2UC in the same gases and vacuum, and its conversion in N2 to UC1-xNx. The thermodynamics of the relevant reactions were applied extensively to interpret and control the process variables. Producing the precursor UO2:2UC kernel at ~96% of theoretical density was required, but its subsequent conversion to UC1-xNx at 2123 K was not accompanied by sintering and resulted in ~83-86% of theoretical density. Decreasing the UC1-xNx kernel carbide component via HCN evolution was shown to be quantitatively consistent with present and past experiments and the only useful application of H2 in the entire process.

  17. A fuzzy decision tree for fault classification.

    PubMed

    Zio, Enrico; Baraldi, Piero; Popescu, Irina C

    2008-02-01

    In plant accident management, the control room operators are required to identify the causes of the accident based on the different patterns of evolution that the monitored process variables develop. This task is often quite challenging, given the large number of process parameters monitored and the intense emotional states under which it is performed. To aid the operators, various techniques of fault classification have been engineered. An important requirement for their practical application is the physical interpretability of the relationships among the process variables underpinning the fault classification. In this view, the present work propounds a fuzzy approach to fault classification, which relies on fuzzy if-then rules inferred from the clustering of available preclassified signal data and then organized in a logical and transparent decision tree structure. The advantages offered by the proposed approach are precisely that a transparent fault classification model is mined out of the signal data and that the underlying physical relationships among the process variables are easily interpretable as linguistic if-then rules that can be explicitly visualized in the decision tree structure. The approach is applied to a case study regarding the classification of simulated faults in the feedwater system of a boiling water reactor.
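
    To make the notion of interpretable fuzzy if-then rules concrete, the sketch below evaluates one hand-written rule with triangular membership functions. The paper's rules are instead inferred by clustering preclassified signal data and arranged in a decision tree; the variables, fault label and thresholds here are hypothetical.

    ```python
    # Evaluate one fuzzy if-then rule with triangular memberships and min() as the
    # AND operator. Variables, units and thresholds are hypothetical.

    def triangular(x, a, b, c):
        """Membership of x in a triangular fuzzy set with support [a, c] and peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def rule_feedwater_fault(flow_rate, level_trend):
        """IF flow_rate is LOW AND drum level is FALLING THEN fault 'feedwater loss'."""
        flow_low = triangular(flow_rate, 0.0, 20.0, 40.0)        # hypothetical flow units
        level_falling = triangular(level_trend, -2.0, -1.0, 0.0) # hypothetical m/min
        return min(flow_low, level_falling)                      # rule activation degree

    print(f"activation = {rule_feedwater_fault(flow_rate=25.0, level_trend=-0.8):.2f}")
    ```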

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowder, Stephen V.

    This document outlines a statistical framework for establishing a shelf-life program for components whose performance is measured by the value of a continuous variable such as voltage or function time. The approach applies to both single measurement devices and repeated measurement devices, although additional process control charts may be useful in the case of repeated measurements. The approach is to choose a sample size that protects the margin associated with a particular variable over the life of the component. Deviations from expected performance of the measured variable are detected prior to the complete loss of margin. This ensures the reliability of the component over its lifetime.
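
    One common way to formalize the margin-protection idea described above is a one-sided normal tolerance bound; the sketch below is an assumption on our part rather than the report's documented framework, computing the tolerance factor from the noncentral t distribution and applying it to hypothetical function-time data.

    ```python
    # One-sided lower tolerance bound xbar - k*s that covers the p-th percentile
    # of a normal population with confidence gamma; k comes from the noncentral t
    # distribution. Data and requirement are hypothetical.
    import numpy as np
    from scipy import stats

    def tolerance_factor(n, p=0.95, gamma=0.95):
        """k such that xbar - k*s bounds the lower p-quantile with confidence gamma."""
        delta = stats.norm.ppf(p) * np.sqrt(n)
        return stats.nct.ppf(gamma, df=n - 1, nc=delta) / np.sqrt(n)

    rng = np.random.default_rng(3)
    times = rng.normal(loc=5.0, scale=0.2, size=20)   # hypothetical function times (s)
    k = tolerance_factor(len(times))
    lower_bound = times.mean() - k * times.std(ddof=1)
    print(f"k = {k:.2f}; lower tolerance bound = {lower_bound:.2f} s (compare to the margin requirement)")
    ```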

  19. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.

  20. In-session behaviours and adolescents' self-concept and loneliness: A psychodrama process-outcome study.

    PubMed

    Orkibi, Hod; Azoulay, Bracha; Snir, Sharon; Regev, Dafna

    2017-11-01

    As adolescents spend many hours a day in school, it is crucial to examine the ways in which therapeutic practices in schools promote their well-being. This longitudinal pilot study examined the contribution of school-based psychodrama group therapy to the self-concept dimensions and perceived loneliness of 40 Israeli adolescents (aged 13-16, 60% boys) in public middle schools. From a process-outcome perspective, we also examined the understudied trajectory of adolescents' in-session behaviours (process variables) and its associations with changes in their self-concepts and loneliness (outcome variables). Psychodrama participants reported increases in global, social, and behavioural self-concepts and a decrease in loneliness compared to the control group. In-session productive behaviours increased and resistance decreased throughout the therapy, but varied process-outcome relationships were found. The study suggests that conducting further research into the process-outcome relationships in psychodrama group therapy is warranted to pinpoint specific mechanisms of change. Suggestions for future studies are provided. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Delay compensation in integrated communication and control systems. I - Conceptual development and analysis

    NASA Technical Reports Server (NTRS)

    Luck, Rogelio; Ray, Asok

    1990-01-01

    A procedure for compensating for the effects of distributed network-induced delays in integrated communication and control systems (ICCS) is proposed. The problem of analyzing systems with time-varying and possibly stochastic delays could be circumvented by use of a deterministic observer which is designed to perform under certain restrictive but realistic assumptions. The proposed delay-compensation algorithm is based on a deterministic state estimator and a linear state-variable-feedback control law. The deterministic observer can be replaced by a stochastic observer without any structural modifications of the delay compensation algorithm. However, if a feedforward-feedback control law is chosen instead of the state-variable feedback control law, the observer must be modified as a conventional nondelayed system would be. Under these circumstances, the delay compensation algorithm would be accordingly changed. The separation principle of the classical Luenberger observer holds true for the proposed delay compensator. The algorithm is suitable for ICCS in advanced aircraft, spacecraft, manufacturing automation, and chemical process applications.
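
    The sketch below illustrates the general observer-plus-prediction idea in discrete time; it is a simplified stand-in rather than the paper's ICCS algorithm. A Luenberger observer tracks the delayed state from late-arriving measurements, and the state-feedback law acts on that estimate propagated forward through the known delay using the buffered inputs. The plant matrices, gains and delay are hypothetical.

    ```python
    # Observer-based compensation of a known, constant measurement delay of d
    # samples: estimate the d-steps-old state, then predict forward with the
    # inputs applied in the meantime. All numbers are hypothetical.
    import numpy as np

    A = np.array([[1.0, 0.1],
                  [0.0, 1.0]])              # discretized double-integrator plant
    B = np.array([[0.005], [0.1]])
    C = np.array([[1.0, 0.0]])
    K = np.array([[4.0, 3.0]])              # state-feedback gain (hand-picked, stabilizing)
    L = np.array([[0.8], [1.5]])            # observer gain (hand-picked, stabilizing)
    d = 3                                   # network-induced delay, in samples

    x = np.array([[1.0], [0.0]])            # true plant state
    xhat = np.zeros((2, 1))                 # estimate of the state d steps ago
    y_buf = [C @ x] * d                     # measurements still "in transit"
    u_buf = [np.zeros((1, 1))] * d          # inputs applied during the delay window

    for _ in range(200):
        y_buf.append(C @ x)                 # measurement taken now, delivered d steps later
        y_old = y_buf.pop(0)                # measurement taken d steps ago, arriving now
        # Correct the delayed estimate, then advance it one step with the matching input.
        xhat = A @ (xhat + L @ (y_old - C @ xhat)) + B @ u_buf[0]
        # Propagate to the present using the remaining buffered inputs.
        xpred = xhat
        for u_past in u_buf[1:]:
            xpred = A @ xpred + B @ u_past
        u = -K @ xpred                      # feedback on the predicted current state
        x = A @ x + B @ u                   # plant update
        u_buf = u_buf[1:] + [u]

    print(f"final state norm = {np.linalg.norm(x):.4f}")   # decays toward zero if gains stabilize
    ```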

  2. Self-regulation and recovery: approaching an understanding of the process of recovery from stress.

    PubMed

    Beckmann, Jürgen; Kellmann, Michael

    2004-12-01

    Stress has been studied extensively in psychology. Only recently, however, has research started to address the question of how individuals manage to recover from stress. Recovery from stress is analyzed as a process of self-regulation. Several individual difference variables which affect the efficiency of self-regulation have been integrated into a structured model of the recovery process. Such variables are action versus state orientation (a tendency to ruminate, e.g., about a past experience) and volitional components, such as self-determination, self-motivation, emotion control, rumination, and self-discipline. Some of these components are assumed to promote recovery from stress, whereas others are assumed to further the perseverance of stress. The model was supported by the empirical findings of three independent studies (Study 1, N=58; Study 2, N=221; Study 3, N=105). Kuhl's Action Control Scale measured action versus state orientation. Volitional components were assessed with Kuhl and Fuhrmann's Volitional Components Questionnaire. The amounts of experienced stress and recovery from stress were assessed with Kellmann and Kallus's Recovery-Stress Questionnaire. As hypothesized in the model, the disposition towards action versus state orientation was a more distant determinant of the recovery from stress and perseverance of stress. The volitional components are more proximal determinants in the recovery process. Action orientation promotes recovery from stress via adequate volitional skills, e.g., self-determination, self-motivation, emotion control, whereas state orientation furthers a perseverance of stress through rumination and self-discipline.

  3. Physical Processes Dictate Early Biogeochemical Dynamics of Soil Pyrogenic Organic Matter in a Subtropical Forest Ecosystem

    NASA Astrophysics Data System (ADS)

    Stuart, Jason M.; Anderson, Russell; Lazzarino, Patrick; Kuehn, Kevin A.; Harvey, Omar R.

    2018-05-01

    Quantifying links between pyrogenic organic matter (pyOM) dynamics, environmental factors and processes is central to predicting ecosystem function and response to future perturbations. In this study, changes in carbon (TC), nitrogen (TN), pH and relative recalcitrance (R50) for pine- and cordgrass-derived pyOM were measured at 3-6 week intervals throughout the first year of burial in the soil. Objectives were to 1) identify key environmental factors and processes driving early-stage pyOM dynamics, and 2) develop quantitative relationships between environmental factors and changes in pyOM properties. The study was conducted in sandy soils of a forested ecosystem in the Longleaf pine range, US, with a focus on links between changes in pyOM properties, fire history (FH), cumulative precipitation (Pcum), average temperature (Tavg) and soil residence time (SRT). Pcum, SRT and Tavg were the main factors controlling TC and TN, accounting for 77-91% and 64-96% of their respective variability. Fire history, along with Pcum, SRT and Tavg, exhibited significant controlling effects on pyOM pH and R50, accounting for 48-91% and 88-93% of their respective variability. Loss of volatiles and leaching of water-soluble components (in summer) and the sorption of exogenous organic matter (fall through spring) most plausibly controlled pyOM dynamics in this study. Overall, our results point to climatic and land management factors and physicochemical processes as the main drivers of pyOM dynamics in the pine ecosystems of the Southeastern US.

  4. The segmented non-uniform dielectric module design for uniformity control of plasma profile in a capacitively coupled plasma chamber

    NASA Astrophysics Data System (ADS)

    Xia, Huanxiong; Xiang, Dong; Yang, Wang; Mou, Peng

    2014-12-01

    Low-temperature plasma processing is critical to IC manufacturing steps such as etching and thin-film deposition, and plasma uniformity strongly affects process quality, so designing for plasma uniformity control is important but difficult. Controlling the discharge parameters or modifying the structure in zero-dimensional space cannot finely or flexibly regulate the spatial distribution of the plasma in the chamber; it can only adjust the overall level of the process factors. In view of this problem, a segmented non-uniform dielectric module design is proposed for regulating the plasma profile in a CCP chamber. The solution achieves refined and flexible regulation of the radial plasma profile by configuring the relative permittivity and width of each segment. To solve this design problem, a novel simulation-based auto-design approach is proposed that automatically designs a positional sequence of multiple independent variables so that the output profile of the parameterized simulation model approximates the one the user presets. The approach employs the idea of a quasi-closed-loop control system and works iteratively: it starts from initial values of the design-variable sequences, predicts better sequences from the feedback of the profile error between the output profile and the expected one, and stops once the profile error falls within the preset tolerance.
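    The iterative, quasi-closed-loop idea described above can be sketched generically: start from an initial design-variable sequence, run the forward model, feed the profile error back to predict a better sequence, and stop inside the tolerance band. In the sketch below the toy smoothing model, feedback gain, and tolerance are all assumptions standing in for the parameterized plasma simulation and its settings.

```python
# A minimal sketch of the quasi-closed-loop, iterative auto-design idea:
# adjust a sequence of design variables (standing in for per-segment relative
# permittivities) until a simulated output profile matches a preset target
# profile within tolerance. The toy forward model is purely illustrative.
import numpy as np

def forward_model(design):
    """Stand-in for the plasma simulation: maps design values to a profile."""
    kernel = np.array([0.2, 0.6, 0.2])          # each segment influences neighbours
    return np.convolve(design, kernel, mode="same")

target = np.linspace(1.0, 2.0, 8)    # desired (preset) radial profile
design = np.ones(8)                  # initial design-variable sequence
gain, tol = 0.8, 1e-3                # feedback gain and error tolerance (assumed)

for iteration in range(200):
    profile = forward_model(design)
    error = target - profile         # profile error (the feedback signal)
    if np.max(np.abs(error)) < tol:  # stop once inside the tolerance band
        break
    design += gain * error           # predict a better design sequence

print(f"converged after {iteration} iterations, max error "
      f"{np.max(np.abs(target - forward_model(design))):.2e}")
```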

  5. Scheduling the blended solution as industrial CO2 absorber in separation process by back-propagation artificial neural networks.

    PubMed

    Abdollahi, Yadollah; Sairi, Nor Asrina; Said, Suhana Binti Mohd; Abouzari-lotf, Ebrahim; Zakaria, Azmi; Sabri, Mohd Faizul Bin Mohd; Islam, Aminul; Alias, Yatimah

    2015-11-05

    It is believed that 80% of industrial carbon dioxide can be controlled by separation and storage technologies that use blended ionic liquid absorbers. Among the blended absorbers, the mixture of water, N-methyldiethanolamine (MDEA) and guanidinium trifluoromethane sulfonate (gua) has shown superior stripping qualities. However, the blended solution has high viscosity, which affects the cost of the separation process. In this work, fabrication of the blend was scheduled, that is, the process was arranged, controlled and optimized. The blend's components and operating temperature were modeled and optimized as the effective input variables, with viscosity as the final output to be minimized, using a back-propagation artificial neural network (ANN). The modeling was carried out with four mathematical algorithms, each with its own experimental design, to obtain the optimum topology based on root mean squared error (RMSE), R-squared (R(2)) and absolute average deviation (AAD). The final model (QP-4-8-1), with the minimum RMSE and AAD as well as the highest R(2), was selected to guide the fabrication of the blended solution. The model was then applied to obtain the optimum initial levels of the input variables, which included temperature (303-323 K), x[gua] (0-0.033), x[MDEA] (0.3-0.4), and x[H2O] (0.7-1.0). Moreover, the model gave the relative importance order of the variables as x[gua]>temperature>x[MDEA]>x[H2O], so none of the variables was negligible in the fabrication. Furthermore, the model predicted the optimum points of the variables for minimizing the viscosity, which were validated by further experiments. The validated results confirmed the schedulability of the model. Accordingly, the ANN succeeded in modeling the initial components of the blended solutions as a CO2 capture absorber for separation technologies and is amenable to industrial scale-up. Copyright © 2015 Elsevier B.V. All rights reserved.
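    As a rough illustration of the model-selection metrics named above (RMSE, R-squared and AAD) around a 4-8-1 feed-forward topology, the sketch below trains a small back-propagation network on synthetic placeholder data spanning the stated variable ranges; the data, scaling pipeline, and hyperparameters are assumptions, not the study's experimental design.

```python
# A minimal sketch of fitting a 4-8-1 back-propagation network (temperature
# and three mole fractions in, viscosity out) and scoring it with RMSE, R^2
# and AAD. The training data are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
# Columns: temperature (K), x[gua], x[MDEA], x[H2O] -- illustrative ranges only
X = np.column_stack([
    rng.uniform(303, 323, 200),
    rng.uniform(0.0, 0.033, 200),
    rng.uniform(0.3, 0.4, 200),
    rng.uniform(0.7, 1.0, 200),
])
# Synthetic "viscosity" that falls with temperature and rises with x[gua]
y = 50.0 - 0.1 * X[:, 0] + 300.0 * X[:, 1] + 20.0 * X[:, 2] + rng.normal(0, 0.2, 200)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X, y)
pred = model.predict(X)

rmse = np.sqrt(mean_squared_error(y, pred))
r2 = r2_score(y, pred)
aad = np.mean(np.abs((y - pred) / y)) * 100.0   # absolute average deviation, %
print(f"RMSE={rmse:.3f}  R^2={r2:.3f}  AAD={aad:.2f}%")
```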

  6. Soil nutrient-landscape relationships in a lowland tropical rainforest in Panama

    USGS Publications Warehouse

    Barthold, F.K.; Stallard, R.F.; Elsenbeer, H.

    2008-01-01

    Soils play a crucial role in biogeochemical cycles as spatially distributed sources and sinks of nutrients. Any spatial patterns depend on soil forming processes, our understanding of which is still limited, especially with regard to tropical rainforests. The objective of our study was to investigate the effects of landscape properties, with an emphasis on the geometry of the land surface, on the spatial heterogeneity of soil chemical properties, and to test the suitability of soil-landscape modeling as an appropriate technique to predict the spatial variability of exchangeable K and Mg in a humid tropical forest in Panama. We used a design-based, stratified sampling scheme to collect soil samples at 108 sites on Barro Colorado Island, Panama. Stratifying variables were lithology, vegetation and topography. Topographic variables were generated from high-resolution digital elevation models with a grid size of 5 m. We took samples from five depths down to 1 m, and analyzed for total and exchangeable K and Mg. We used simple explorative data analysis techniques to elucidate the importance of lithology for soil total and exchangeable K and Mg. Classification and Regression Trees (CART) were adopted to investigate the importance of topography, lithology and vegetation for the spatial distribution of exchangeable K and Mg, with the intention of developing models that regionalize the point observations using digital terrain data as explanatory variables. Our results suggest that topography and vegetation do not control the spatial distribution of the selected soil chemical properties at a landscape scale and lithology is important to some degree. Exchangeable K is distributed equally across the study area, indicating that processes other than landscape processes, e.g. biogeochemical processes, are responsible for its spatial distribution. Lithology contributes to the spatial variation of exchangeable Mg but controlling variables could not be detected. The spatial variation of soil total K and Mg is mainly influenced by lithology. Copyright © 2007 Elsevier B.V. All rights reserved.
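    The CART step lends itself to a short sketch: fit a regression tree of exchangeable K on terrain and categorical covariates, then inspect which predictors the tree actually uses. The covariate names, coding, and synthetic data below are illustrative assumptions, not the Barro Colorado dataset.

```python
# A minimal sketch of the CART approach: a regression tree relating
# exchangeable K to terrain and categorical covariates, reporting feature
# importances. Data are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 108  # same order as the number of sampling sites
X = np.column_stack([
    rng.uniform(0, 30, n),      # slope (degrees)
    rng.uniform(0, 1, n),       # topographic wetness index (scaled)
    rng.integers(0, 3, n),      # lithology class (coded 0-2)
    rng.integers(0, 2, n),      # vegetation class (coded 0-1)
])
# Synthetic response dominated by lithology, with a weak topographic signal
y = 0.2 * X[:, 2] + 0.02 * X[:, 1] + rng.normal(0, 0.05, n)

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10, random_state=0)
tree.fit(X, y)

names = ["slope", "wetness_index", "lithology", "vegetation"]
for name, importance in zip(names, tree.feature_importances_):
    print(f"{name:>15s}: {importance:.2f}")
```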

  7. The variable polarity plasma arc welding process: Characteristics and performance

    NASA Technical Reports Server (NTRS)

    Hung, R. J.; Zhu, G. J.

    1991-01-01

    Significant advantages of the Variable Polarity Plasma Arc (VPPA) Welding Process include faster welding, fewer repairs, less joint preparation, reduced weldment distortion, and absence of porosity. The power distribution was analyzed for an argon plasma gas flow constituting the fluid in the VPPA Welding Process. The major heat loss is by convective heat transfer at the torch nozzle; by radiative heat transfer in the space between the nozzle outlet and the workpiece; and by convective heat transfer in the keyhole in the workpiece. The power absorbed at the workpiece produces the molten puddle that solidifies into the weld bead. Crown and root widths, and crown and root heights of the weld bead are predicted. The basis is provided for an algorithm for automatic control of VPPA welding machine parameters to obtain desired weld bead dimensions.

  8. Do Children and Adolescents with Anorexia Nervosa Display an Inefficient Cognitive Processing Style?

    PubMed

    Lang, Katie; Lloyd, Samantha; Khondoker, Mizanur; Simic, Mima; Treasure, Janet; Tchanturia, Kate

    2015-01-01

    This study aimed to examine neuropsychological processing in children and adolescents with Anorexia Nervosa (AN). The relationship of clinical and demographic variables to neuropsychological functioning within the AN group was also explored. The performance of 41 children and adolescents with a diagnosis of AN was compared to that of 43 healthy control (HC) participants on a number of neuropsychological measures. There were no differences in IQ between AN and HC groups. However, children and adolescents with AN displayed significantly more perseverative errors on the Wisconsin Card Sorting Test, and lower Style and Central Coherence scores on the Rey Osterrieth Complex Figure Test relative to HCs. Inefficient cognitive processing in the AN group was independent of clinical and demographic variables, suggesting it might represent an underlying trait for AN. The implications of these findings are discussed.

  9. Forced reeling of Bombyx mori silk: separating behavior and processing conditions.

    PubMed

    Mortimer, Beth; Holland, Chris; Vollrath, Fritz

    2013-10-14

    Controlled reeling is a powerful tool to investigate the details of silk processing. However, consistent forced reeling of silkworms is hindered by the significant degree of behaviorally induced variation caused by the animal. This paper proposes silkworm paralysis as a novel method to control the animal and thus in vivo spinning conditions. Using these methods, we achieve low and consistent reeling forces during the collection of over 500 m of individual silk fiber while monitoring filament variability, morphology, and properties. Novel techniques to measure the irregular silk cross-sectional areas lead to the more accurate calculation of the true engineering values and mechanical property variation of individual silk fibers. Combining controlled reeling and accurate thread measurement techniques allows us to present the relative contributions of processing and behavior in the performance envelope of Bombyx mori silk.

  10. Biowaste home composting: experimental process monitoring and quality control.

    PubMed

    Tatàno, Fabio; Pagliaro, Giacomo; Di Giovanni, Paolo; Floriani, Enrico; Mangani, Filippo

    2015-04-01

    Because home composting is a prevention option in managing biowaste at local levels, the objective of the present study was to contribute to the knowledge of the process evolution and compost quality that can be expected and obtained, respectively, in this decentralized option. In this study, organized as the research portion of a provincial project on home composting in the territory of Pesaro-Urbino (Central Italy), four experimental composters were first initiated and temporally monitored. Second, two small sub-sets of selected provincial composters (directly operated by households involved in the project) underwent quality control on their compost products at two different temporal steps. The monitored experimental composters showed overall decreasing profiles versus composting time for moisture, organic carbon, and C/N, as well as overall increasing profiles for electrical conductivity and total nitrogen, which represented qualitative indications of progress in the process. Comparative evaluations of the monitored experimental composters also suggested some interactions in home composting, i.e., high C/N ratios limiting organic matter decomposition rates and final humification levels; high moisture contents restricting the internal temperature regime; nearly horizontal phosphorus and potassium evolutions contributing to limit the rates of increase in electrical conductivity; and prolonged biowaste additions contributing to limit the rate of decrease in moisture. The measures of parametric data variability in the two sub-sets of controlled provincial composters showed decreased variability in moisture, organic carbon, and C/N from the seventh to fifteenth month of home composting, as well as increased variability in electrical conductivity, total nitrogen, and humification rate, which could be considered compatible with the respective nature of decreasing and increasing parameters during composting. The modeled parametric kinetics in the monitored experimental composters, along with the evaluation of the parametric central tendencies in the sub-sets of controlled provincial composters, all indicate that 12-15 months is a suitable duration for the appropriate development of home composting in final and simultaneous compliance with typical reference limits. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Evolving fuzzy rules in a learning classifier system

    NASA Technical Reports Server (NTRS)

    Valenzuela-Rendon, Manuel

    1993-01-01

    The fuzzy classifier system (FCS) combines the ideas of fuzzy logic controllers (FLC's) and learning classifier systems (LCS's). It brings together the expressive powers of fuzzy logic as it has been applied in fuzzy controllers to express relations between continuous variables, and the ability of LCS's to evolve co-adapted sets of rules. The goal of the FCS is to develop a rule-based system capable of learning in a reinforcement regime, and that can potentially be used for process control.
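    A minimal sketch of the kind of rule an FCS evolves is given below: triangular membership functions over a continuous input, with rule strengths combined into a crisp output by a weighted average. The rule base, membership shapes, and output values are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of fuzzy rule evaluation over a continuous variable:
# triangular memberships, rule firing strengths, and a weighted-average
# defuzzification step. The rule base is illustrative only.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_controller(error):
    # Rule base: IF error is <fuzzy set> THEN output is <value>
    rules = [
        (tri(error, -2.0, -1.0, 0.0), -0.5),   # error negative -> decrease
        (tri(error, -1.0,  0.0, 1.0),  0.0),   # error near zero -> hold
        (tri(error,  0.0,  1.0, 2.0),  0.5),   # error positive -> increase
    ]
    num = sum(strength * out for strength, out in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den > 0 else 0.0

print(fuzzy_controller(0.4))   # partially fires the "hold" and "increase" rules
```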

  12. Water Rockets and Indirect Measurement.

    ERIC Educational Resources Information Center

    Inman, Duane

    1997-01-01

    Describes an activity that teaches a number of scientific concepts including indirect measurement, Newton's third law of motion, manipulating and controlling variables, and the scientific method of inquiry. Uses process skills such as observation, inference, prediction, mensuration, and communication as well as problem solving and higher-order…

  13. Does Copper Metal React with Acetic Acid?

    ERIC Educational Resources Information Center

    DeMeo, Stephen

    1997-01-01

    Describes an activity that promotes analytical thinking and problem solving. Gives students experience with important scientific processes that can be generalized to other new laboratory experiences. Provides students with the opportunity to hypothesize answers, control variables by designing an experiment, and make logical deductions based on…

  14. Investigation of Mercury Wet Deposition Physicochemistry in the Ohio River Valley through Automated Sequential Sampling

    EPA Science Inventory

    Intra-storm variability and soluble fractionation was explored for summer-time rain events in Steubenville, Ohio to evaluate the physical processes controlling mercury (Hg) in wet deposition in this industrialized region. Comprehensive precipitation sample collection was conducte...

  15. Fostering Argumentation Skills: Doing What Real Scientists Really Do

    ERIC Educational Resources Information Center

    Llewellyn, Douglas; Rajesh, Hema

    2011-01-01

    Elementary and middle school teachers often provide students with hands-on activities or even inquiry-based investigations that emphasize science process skills such as observing, classifying, identifying and controlling variables, hypothesizing, experimenting, and collecting and analyzing data. These activities and investigations are frequently…

  16. The Development of Curricula/Programs for Indian and Metis People.

    ERIC Educational Resources Information Center

    Whyte, Kenn

    1982-01-01

    Discusses various multicultural education programs in Canada, recommending an understanding and appreciation of cultural heritage, analysis of contemporary conditions, and flexibility, variability, and greater control by minority groups of educational processes. Available: Department of Educational Foundations, 5-109 Education North, University of…

  17. Effects of social cognitive impairment on speech disorder in schizophrenia.

    PubMed

    Docherty, Nancy M; McCleery, Amanda; Divilbiss, Marielle; Schumann, Emily B; Moe, Aubrey; Shakeel, Mohammed K

    2013-05-01

    Disordered speech in schizophrenia impairs social functioning because it impedes communication with others. Treatment approaches targeting this symptom have been limited by an incomplete understanding of its causes. This study examined the process underpinnings of speech disorder, assessed in terms of communication failure. Contributions of impairments in 2 social cognitive abilities, emotion perception and theory of mind (ToM), to speech disorder were assessed in 63 patients with schizophrenia or schizoaffective disorder and 21 nonpsychiatric participants, after controlling for the effects of verbal intelligence and impairments in basic language-related neurocognitive abilities. After removal of the effects of the neurocognitive variables, impairments in emotion perception and ToM each explained additional variance in speech disorder in the patients but not the controls. The neurocognitive and social cognitive variables, taken together, explained 51% of the variance in speech disorder in the patients. Schizophrenic disordered speech may be less a concomitant of "positive" psychotic process than of illness-related limitations in neurocognitive and social cognitive functioning.

  18. Statistical process control applied to mechanized peanut sowing as a function of soil texture.

    PubMed

    Zerbato, Cristiano; Furlani, Carlos Eduardo Angeli; Ormond, Antonio Tassio Santana; Gírio, Lucas Augusto da Silva; Carneiro, Franciele Morlin; da Silva, Rouverson Pereira

    2017-01-01

    The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing.
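    The SPC stability check reported above can be illustrated with an individuals control chart, whose limits come from the average moving range (the conventional 2.66 × MRbar rule). The sowing-depth numbers below are synthetic placeholders standing in for the 80 sampling points per soil type.

```python
# A minimal sketch of an individuals control chart: center line, control
# limits from the average moving range, and a check for out-of-control points.
# The data are synthetic placeholders for sowing depth (cm).
import numpy as np

rng = np.random.default_rng(2)
depth = rng.normal(5.0, 0.3, 80)            # illustrative sowing-depth samples

center = depth.mean()
moving_range = np.abs(np.diff(depth))
mr_bar = moving_range.mean()
ucl = center + 2.66 * mr_bar                # upper control limit
lcl = center - 2.66 * mr_bar                # lower control limit

out_of_control = np.flatnonzero((depth > ucl) | (depth < lcl))
print(f"center={center:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")
print("out-of-control points:", out_of_control if out_of_control.size else "none")
```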

  19. Similarity and scale in catchment storm response

    NASA Technical Reports Server (NTRS)

    Wood, Eric F.; Sivapalan, Murugesu; Beven, Keith

    1993-01-01

    Until recently, very little progress had been made in understanding the relationship between small-scale variability of topography, soil, and rainfall and the storm response seen at the catchment scale. The work reviewed here represents the first attempt at a systematic theoretical framework for such understanding in the context of surface runoff generation by different processes. The parameterization of hydrological processes over a range of scales is examined, and the concept of the 'representative elementary area' (REA) is introduced. The REA is a fundamental scale for catchment modeling at which continuum assumptions can be applied for the spatially variable controls and parameters, and spatial patterns no longer have to be considered explicitly. The investigation of scale leads into the concept of hydrologic similarity, in which the effects of the environmental controls on runoff generation and flood frequency response can be investigated independently of catchment scale. The paper reviews the authors' initial results and will hopefully motivate others to investigate the issues of hydrologic scale and similarity.

  20. How emotions affect eating: a five-way model.

    PubMed

    Macht, Michael

    2008-01-01

    Despite the importance of affective processes in eating behaviour, it remains difficult to predict how emotions affect eating. Emphasizing individual differences, previous research did not pay full attention to the twofold variability of emotion-induced changes of eating (variability across both individuals and emotions). By contrast, the present paper takes into account both individual characteristics and emotion features, and specifies five classes of emotion-induced changes of eating: (1) emotional control of food choice, (2) emotional suppression of food intake, (3) impairment of cognitive eating controls, (4) eating to regulate emotions, and (5) emotion-congruent modulation of eating. These classes are distinguished by antecedent conditions, eating responses and mediating mechanisms. They point to basic functional principles underlying the relations between emotions and biologically based motives: interference, concomitance and regulation. Thus, emotion-induced changes of eating can be a result of interference of eating by emotions, a by-product of emotions, and a consequence of regulatory processes (i.e., emotions may regulate eating, and eating may regulate emotions).

  1. Introduction of Transplant Registry Unified Management Program 2 (TRUMP2): scripts for TRUMP data analyses, part I (variables other than HLA-related data).

    PubMed

    Atsuta, Yoshiko

    2016-01-01

    Collection and analysis of information on diseases and post-transplant courses of allogeneic hematopoietic stem cell transplant recipients have played important roles in improving therapeutic outcomes in hematopoietic stem cell transplantation. Efficient, high-quality data collection systems are essential. The introduction of the Second-Generation Transplant Registry Unified Management Program (TRUMP2) is intended to improve data quality and enable more efficient data management. The TRUMP2 system will also expand possible uses of data, as it is capable of building a more complex relational database. The construction of an accessible system for adequate data utilization by researchers would promote greater research activity. Study approval and management processes and authorship guidelines also need to be organized within this context. Quality control of processes for data manipulation and analysis will also affect study outcomes. Shared scripts have been introduced to define variables according to standard definitions, for quality control and to improve the efficiency of registry studies using TRUMP data.

  2. Statistical process control applied to mechanized peanut sowing as a function of soil texture

    PubMed Central

    Furlani, Carlos Eduardo Angeli; da Silva, Rouverson Pereira

    2017-01-01

    The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing. PMID:28742095

  3. Dynamics of genetic variability in Anastrepha fraterculus (Diptera: Tephritidae) during adaptation to laboratory rearing conditions.

    PubMed

    Parreño, María A; Scannapieco, Alejandra C; Remis, María I; Juri, Marianela; Vera, María T; Segura, Diego F; Cladera, Jorge L; Lanzavecchia, Silvia B

    2014-01-01

    Anastrepha fraterculus is one of the most important fruit fly pests on the American continent, and only chemical control is applied in the field to diminish its population densities. A better understanding of the genetic variability during the introduction and adaptation of wild A. fraterculus populations to laboratory conditions is required for the development of stable and vigorous experimental colonies and mass-reared strains in support of successful Sterile Insect Technique (SIT) efforts. The present study aims to analyze the dynamics of changes in genetic variability during the first six generations under artificial rearing conditions in two populations: a) a wild population recently introduced to laboratory culture, named TW, and b) a long-established control line, named CL. Results showed a declining tendency of genetic variability in TW. In CL, the relatively high values of genetic variability appear to be maintained across generations and could denote an intrinsic capacity to avoid the loss of genetic diversity over time. The impact of evolutionary forces on this species during the adaptation process, as well as the best strategies for introducing experimental and mass-reared A. fraterculus strains into SIT programs, are discussed.

  4. Dynamics of genetic variability in Anastrepha fraterculus (Diptera: Tephritidae) during adaptation to laboratory rearing conditions

    PubMed Central

    2014-01-01

    Background Anastrepha fraterculus is one of the most important fruit fly pests on the American continent, and only chemical control is applied in the field to diminish its population densities. A better understanding of the genetic variability during the introduction and adaptation of wild A. fraterculus populations to laboratory conditions is required for the development of stable and vigorous experimental colonies and mass-reared strains in support of successful Sterile Insect Technique (SIT) efforts. Methods The present study aims to analyze the dynamics of changes in genetic variability during the first six generations under artificial rearing conditions in two populations: a) a wild population recently introduced to laboratory culture, named TW, and b) a long-established control line, named CL. Results Genetic variability showed a declining tendency in TW. In CL, the relatively high values of genetic variability appear to be maintained across generations and could denote an intrinsic capacity to avoid the loss of genetic diversity over time. Discussion The impact of evolutionary forces on this species during the adaptation process, as well as the best strategies for introducing experimental and mass-reared A. fraterculus strains into SIT programs, are discussed. PMID:25471362

  5. The association of external knee adduction moment with biomechanical variables in osteoarthritis: a systematic review.

    PubMed

    Foroughi, Nasim; Smith, Richard; Vanwanseele, Benedicte

    2009-10-01

    Osteoarthritis (OA) is a musculoskeletal disorder primarily affecting the older population and resulting in chronic pain and disability. Biomechanical variables associated with OA severity, such as the external knee adduction moment (KAM) and joint malalignment, may affect the disease process by altering the bone-on-bone forces during gait. The aim was to investigate the association between biomechanical variables and KAM in knee OA. A systematic search for published studies' titles and abstracts was performed on Ovid Medline, Cumulative Index to Nursing and Allied Health, PREMEDLINE, EBM reviews and SPORTDiscus. Fourteen studies met the inclusion criteria and were considered for the review. The magnitude and time course of KAM during gait appeared to be consistent across laboratories and computational methods. Only two of the included studies that compared patients with OA to a control group reported a higher peak KAM for the OA group. Knee adduction moment increased with OA severity and was directly proportional to varus malalignment. Classifying the patients on the basis of disease severity decreased the group variability, permitting the differences to be more detectable. Biomechanical variables such as varus malalignment are associated with KAM and therefore may affect the disease process. These variables should be taken into consideration when developing therapeutic interventions for individuals suffering from knee OA.

  6. Precipitation and carbon-water coupling jointly control the interannual variability of global land gross primary production

    NASA Astrophysics Data System (ADS)

    Zhang, Yao; Xiao, Xiangming; Guanter, Luis; Zhou, Sha; Ciais, Philippe; Joiner, Joanna; Sitch, Stephen; Wu, Xiaocui; Nabel, Julia; Dong, Jinwei; Kato, Etsushi; Jain, Atul K.; Wiltshire, Andy; Stocker, Benjamin D.

    2016-12-01

    Carbon uptake by terrestrial ecosystems is increasing along with the rising of atmospheric CO2 concentration. Embedded in this trend, recent studies suggested that the interannual variability (IAV) of global carbon fluxes may be dominated by semi-arid ecosystems, but the underlying mechanisms of this high variability in these specific regions are not well known. Here we derive an ensemble of gross primary production (GPP) estimates using the average of three data-driven models and eleven process-based models. These models are weighted by their spatial representativeness of the satellite-based solar-induced chlorophyll fluorescence (SIF). We then use this weighted GPP ensemble to investigate the GPP variability for different aridity regimes. We show that semi-arid regions contribute to 57% of the detrended IAV of global GPP. Moreover, in regions with higher GPP variability, GPP fluctuations are mostly controlled by precipitation and strongly coupled with evapotranspiration (ET). This higher GPP IAV in semi-arid regions is co-limited by supply (precipitation)-induced ET variability and GPP-ET coupling strength. Our results demonstrate the importance of semi-arid regions to the global terrestrial carbon cycle and posit that there will be larger GPP and ET variations in the future with changes in precipitation patterns and dryland expansion.
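    One common way to attribute detrended interannual variability (IAV) to regions, in the spirit of the analysis above, is to detrend each regional time series and score each region by the covariance of its anomaly with the global anomaly relative to the global variance. The sketch below uses that metric on synthetic GPP series; it is not necessarily the exact attribution formula used in the paper.

```python
# A minimal sketch of attributing detrended IAV of a global flux to regions.
# All GPP numbers are synthetic placeholders (Pg C / yr).
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1982, 2012)
regions = {
    "semi-arid": 18 + 0.05 * (years - years[0]) + rng.normal(0, 0.8, years.size),
    "tropical":  40 + 0.08 * (years - years[0]) + rng.normal(0, 0.4, years.size),
    "boreal":    12 + 0.03 * (years - years[0]) + rng.normal(0, 0.3, years.size),
}

def detrend(series):
    """Remove a linear trend, leaving the interannual anomaly."""
    coeffs = np.polyfit(years, series, 1)
    return series - np.polyval(coeffs, years)

anomalies = {name: detrend(series) for name, series in regions.items()}
global_anom = sum(anomalies.values())

for name, anom in anomalies.items():
    # Covariance with the global anomaly, normalized by the global variance;
    # the shares sum to 100% by construction.
    share = np.cov(anom, global_anom)[0, 1] / np.var(global_anom, ddof=1)
    print(f"{name:>10s} contributes {100 * share:.0f}% of global GPP IAV")
```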

  7. Precipitation and Carbon-Water Coupling Jointly Control the Interannual Variability of Global Land Gross Primary Production

    NASA Technical Reports Server (NTRS)

    Zhang, Yao; Xiao, Xiangming; Guanter, Luis; Zhou, Sha; Ciais, Philippe; Joiner, Joanna; Sitch, Stephen; Wu, Xiaocui; Nabel, Julian; Dong, Jinwei; hide

    2016-01-01

    Carbon uptake by terrestrial ecosystems is increasing along with the rising of atmospheric CO2 concentration. Embedded in this trend, recent studies suggested that the interannual variability (IAV) of global carbon fluxes may be dominated by semi-arid ecosystems, but the underlying mechanisms of this high variability in these specific regions are not well known. Here we derive an ensemble of gross primary production (GPP) estimates using the average of three data-driven models and eleven process-based models. These models are weighted by their spatial representativeness of the satellite-based solar-induced chlorophyll fluorescence (SIF). We then use this weighted GPP ensemble to investigate the GPP variability for different aridity regimes. We show that semi-arid regions contribute to 57% of the detrended IAV of global GPP. Moreover, in regions with higher GPP variability, GPP fluctuations are mostly controlled by precipitation and strongly coupled with evapotranspiration (ET). This higher GPP IAV in semi-arid regions is co-limited by supply (precipitation)-induced ET variability and GPP-ET coupling strength. Our results demonstrate the importance of semi-arid regions to the global terrestrial carbon cycle and posit that there will be larger GPP and ET variations in the future with changes in precipitation patterns and dryland expansion.

  8. Quantifying Linkages between Biogeochemical Processes in a Contaminated Aquifer-Wetland System Using Multivariate Statistics and HP1

    NASA Astrophysics Data System (ADS)

    Arora, B.; Mohanty, B. P.; McGuire, J. T.

    2009-12-01

    Fate and transport of contaminants in saturated and unsaturated zones in the subsurface are controlled by complex biogeochemical processes such as precipitation, sorption-desorption, ion-exchange, redox, etc. In dynamic systems such as wetlands and anaerobic aquifers, these processes are coupled and can interact non-linearly with each other. Variability in measured hydrological, geochemical and microbiological parameters thus corresponds to multiple processes simultaneously. To infer the contributing processes, it is important to eliminate correlations and to identify inter-linkages between factors. The objective of this study is to develop quantitative relationships between hydrological (initial and boundary conditions, hydraulic conductivity ratio, and soil layering), geochemical (mineralogy, surface area, redox potential, and organic matter) and microbiological factors (MPN) that alter the biogeochemical processes at the column scale. Data used in this study were collected from controlled flow experiments in: i) two homogeneous soil columns, ii) a layered soil column, iii) a soil column with embedded clay lenses, and iv) a soil column with embedded clay lenses and one central macropore. The soil columns represent increasing levels of soil structural heterogeneity to better mimic the Norman Landfill research site. The Norman Landfill is a closed municipal facility with prevalent organic contamination. The sources of variation in the dataset were explored using multivariate statistical techniques and dominant biogeochemical processes were obtained using principal component analysis (PCA). Furthermore, artificial neural networks (ANN) coupled with HP1 were used to develop mathematical rules identifying different combinations of factors that trigger, sustain, accelerate/decelerate, or discontinue the biogeochemical processes. Experimental observations show that infiltrating water triggers biogeochemical processes in all soil columns. Similarly, slow release of water from low permeability clay lenses sustains biogeochemical cycling for a longer period of time than in homogeneous soil columns. Preliminary results indicate: i) certain variables (anion, cation concentrations, etc.) do not follow normal or lognormal distributions even at the column scale, ii) strong correlations exist between parameters related to redox geochemistry (pH with S2- concentrations), and iii) PCA can identify dominant processes (e.g. iron and sulfate reduction) occurring in the system by grouping together causative variables (e.g. dominant TEAPs).
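    The PCA step can be sketched briefly: standardize the measured variables, extract principal components, and read the loadings to see which variables group together (for example, redox-related ones such as pH, sulfide and dissolved Fe). The data below are synthetic placeholders, not the column-experiment measurements.

```python
# A minimal sketch of using PCA to group causative geochemical variables.
# Data are synthetic placeholders driven by a hidden "redox intensity" factor.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 120
redox_driver = rng.normal(0, 1, n)                       # hidden factor
data = np.column_stack([
    7.0 - 0.3 * redox_driver + rng.normal(0, 0.1, n),    # pH
    0.5 + 0.4 * redox_driver + rng.normal(0, 0.1, n),    # sulfide (S2-)
    1.0 + 0.5 * redox_driver + rng.normal(0, 0.1, n),    # dissolved Fe
    rng.normal(2.0, 0.5, n),                             # chloride (unrelated)
])
names = ["pH", "sulfide", "Fe", "chloride"]

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(data))
for i, component in enumerate(pca.components_, start=1):
    loading = ", ".join(f"{v}={w:+.2f}" for v, w in zip(names, component))
    print(f"PC{i} ({pca.explained_variance_ratio_[i - 1]:.0%}): {loading}")
```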

  9. NASA. Langley Research Center dry powder towpreg system

    NASA Technical Reports Server (NTRS)

    Baucom, Robert M.; Marchello, Joseph M.

    1990-01-01

    Dry powder polymer impregnated carbon fiber tows were produced for preform weaving and composite materials molding applications. In the process, fluidized powder is deposited on spread tow bundles and melted on the fibers by radiant heating to adhere the polymer to the fiber. Unit design theory and operating correlations were developed to provide the basis for scale up of the process to commercial operation. Special features of the operation are the pneumatic tow spreader, fluidized bed, resin feeder, and quality control system. Bench scale experiments, at tow speeds up to 50 cm/sec, demonstrated that process variables can be controlled to produce weavable LARC-TPI carbon fiber towpreg. The towpreg made by the dry powder process was formed into unidirectional fiber moldings and was woven and molded into preform material of good quality.

  10. Design and implementation of a control structure for quality products in a crude oil atmospheric distillation column.

    PubMed

    Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis

    2017-11-01

    In recent years, interest in petrochemical processes has been increasing, especially in the refining area. However, the high variability in the dynamic characteristics present in the atmospheric distillation column poses a challenge to obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new design of a control strategy in a conventional crude oil distillation plant, defined using formal interaction analysis tools. The process dynamics and its control are simulated in the Aspen HYSYS® dynamic environment under real operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Controlled electromigration protocol revised

    NASA Astrophysics Data System (ADS)

    Zharinov, Vyacheslav S.; Baumans, Xavier D. A.; Silhanek, Alejandro V.; Janssens, Ewald; Van de Vondel, Joris

    2018-04-01

    Electromigration has evolved from an important cause of failure in electronic devices to an appealing method, capable of modifying the material properties and geometry of nanodevices. Although this technique has been successfully used by researchers to investigate low dimensional systems and nanoscale objects, its low controllability remains a serious limitation. This is in part due to the inherent stochastic nature of the process, but also due to the inappropriate identification of the relevant control parameters. In this study, we identify a suitable process variable and propose a novel control algorithm that enhances the controllability and, at the same time, minimizes the intervention of an operator. As a consequence, the algorithm facilitates the application of electromigration to systems that require exceptional control of, for example, the width of a narrow junction. It is demonstrated that the electromigration rate can be stabilized on pre-set values, which eventually defines the final geometry of the electromigrated structures.
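    The spirit of a rate-stabilized electromigration feedback loop can be sketched as follows: measure a thinning-rate process variable, compare it with a pre-set value, and adjust the applied bias accordingly until the desired final geometry is reached. The toy junction model, setpoint, and gain below are assumptions; the paper's specific process variable and algorithm are not reproduced here.

```python
# A minimal sketch of rate-stabilized feedback: hold the relative rate of
# conductance decrease of a junction on a pre-set value by adjusting the bias.
# The junction model and gains are toy placeholders.
import numpy as np

def em_rate(G, V, k=5e-3, G0=1.0):
    """Toy electromigration rate: faster at higher dissipated power and thinner wire."""
    return k * V**2 * (G0 / G)

setpoint = 1e-3      # target conductance-loss rate per time step (assumed)
gain = 50.0          # proportional gain on the rate error (assumed)
dt = 1.0
G, V = 1.0, 0.3      # initial conductance (normalized) and bias

for step in range(1000):
    rate = em_rate(G, V)
    V += gain * (setpoint - rate) * dt      # speed up or slow down the bias
    V = max(V, 0.0)
    G -= rate * dt                          # junction thins at the actual rate
    if G < 0.2:                             # stop at the desired final geometry
        break

print(f"stopped at step {step}, G={G:.3f}, V={V:.3f}, rate={rate:.2e}")
```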

  12. Instruments speak global language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nudo, L.

    1993-07-01

    If all goes as planned, companies that use instruments for measurement and control will get more complete, reliable and repeatable information about their processes with advanced digital devices that speak a global language. That language, in technical terms, is known as international fieldbus. But it's not much different from English's role as the international language of business. Companies that use a remote measurement device for environmental applications, such as pH control and fugitive emissions control, are candidates for fieldbus devices, which are much faster and measure more process variables than their counterpart analog devices. With the advent of a globalmore » fieldbus, users will see digital valves, solenoids and multivariable transmitters. Fieldbus technology redefines the roles of the control system and field devices. The control system still serves as a central clearinghouse, but field devices will handle more control and reporting functions and generate data that can be used for trending and preventive maintenance.« less

  13. Adaptive hybrid optimal quantum control for imprecisely characterized systems.

    PubMed

    Egger, D J; Wilhelm, F K

    2014-06-20

    Optimal quantum control theory carries a huge promise for quantum technology. Its experimental application, however, is often hindered by imprecise knowledge of the input variables, the quantum system's parameters. We show how to overcome this by adaptive hybrid optimal control, using a protocol named Ad-HOC. This protocol combines open- and closed-loop optimal control by first performing a gradient search towards a near-optimal control pulse and then an experimental fidelity estimation with a gradient-free method. For typical settings in solid-state quantum information processing, adaptive hybrid optimal control enhances gate fidelities by an order of magnitude, making optimal control theory applicable and useful.
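    The two-stage hybrid idea maps naturally onto a short sketch: a gradient-based search against an imperfect model gets the control parameters close, and a gradient-free search against the (here simulated) real system polishes them. The quadratic toy objectives and scipy optimizers below are stand-ins for the pulse-fidelity landscapes and the actual Ad-HOC implementation.

```python
# A minimal sketch of the two-stage hybrid idea: open-loop gradient search on
# an imprecise model, then closed-loop gradient-free refinement on the "real"
# system. Objectives are toy quadratics, not quantum dynamics.
import numpy as np
from scipy.optimize import minimize

true_params = np.array([1.00, 0.50])     # the real system (unknown exactly)
model_params = np.array([0.90, 0.55])    # our imprecise model of it

def infidelity(x, params):
    """Toy 1 - fidelity landscape: minimized when x matches the system parameters."""
    return float(np.sum((x - params) ** 2))

x0 = np.zeros(2)
# Stage 1: gradient-based search on the model (cheap, many evaluations)
stage1 = minimize(infidelity, x0, args=(model_params,), method="BFGS")
# Stage 2: gradient-free refinement against the actual system
stage2 = minimize(infidelity, stage1.x, args=(true_params,), method="Nelder-Mead")

print("after model-based stage :", stage1.x, infidelity(stage1.x, true_params))
print("after experimental stage:", stage2.x, infidelity(stage2.x, true_params))
```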

  14. Fuzzy efficiency optimization of AC induction motors

    NASA Technical Reports Server (NTRS)

    Jani, Yashvant; Sousa, Gilberto; Turner, Wayne; Spiegel, Ron; Chappell, Jeff

    1993-01-01

    This paper describes the early stages of work to implement a fuzzy logic controller to optimize the efficiency of AC induction motor/adjustable speed drive (ASD) systems running at less than optimal speed and torque conditions. In this paper, the process by which the membership functions of the controller were tuned is discussed and a controller which operates on frequency as well as voltage is proposed. The membership functions for this dual-variable controller are sketched. Additional topics include an approach for applying fuzzy logic to motor current control which can be used with vector-controlled drives. Incorporation of a fuzzy controller as an application-specific integrated circuit (ASIC) microchip is planned.

  15. PID feedback controller used as a tactical asset allocation technique: The G.A.M. model

    NASA Astrophysics Data System (ADS)

    Gandolfi, G.; Sabatini, A.; Rossolini, M.

    2007-09-01

    The objective of this paper is to illustrate a tactical asset allocation technique utilizing the PID controller. The proportional-integral-derivative (PID) controller is widely applied in most industrial processes; it has been successfully used for over 50 years and is used in more than 95% of plant processes. It is a robust and easily understood algorithm that can provide excellent control performance in spite of the diverse dynamic characteristics of the process plant. In finance, the process plant controlled by the PID controller can be represented by financial market assets forming a portfolio. More specifically, in the present work, the plant is represented by a risk-adjusted return variable. Money and portfolio managers’ main target is to achieve a relevant risk-adjusted return in their managing activities. In the literature and in the financial industry, numerous kinds of return/risk ratios are commonly studied and used. The aim of this work is to perform a tactical asset allocation technique consisting of the optimization of risk-adjusted return by means of asset allocation methodologies based on the PID model-free feedback control modeling procedure. The process plant does not need to be mathematically modeled: the PID control action lies in altering the portfolio asset weights, according to the PID algorithm and its Ziegler-Nichols-tuned parameters, in order to approach the desired portfolio risk-adjusted return efficiently.
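    A discrete PID update with classic Ziegler-Nichols tuning, the control structure the G.A.M. model applies to the portfolio, can be sketched as below. The first-order "plant", the ultimate gain Ku and period Pu, and the return target are toy assumptions standing in for the risk-adjusted-return process.

```python
# A minimal sketch of a discrete PID loop tuned with the classic
# Ziegler-Nichols PID rules (Kp = 0.6 Ku, Ti = Pu/2, Td = Pu/8).
# The "plant" is a toy first-order response, not a market model.
class PID:
    def __init__(self, kp, ti, td, dt):
        self.kp, self.ti, self.td, self.dt = kp, ti, td, dt
        self.integral, self.prev_error = 0.0, 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * (error + self.integral / self.ti + self.td * derivative)

Ku, Pu, dt = 2.0, 8.0, 1.0                      # ultimate gain/period (assumed)
pid = PID(kp=0.6 * Ku, ti=Pu / 2.0, td=Pu / 8.0, dt=dt)

target = 0.08                                   # desired risk-adjusted return (assumed)
y = 0.0                                         # current plant output
for step in range(120):
    u = pid.update(target - y)                  # controller acts on the error
    y += dt * (-0.5 * y + 0.1 * u)              # toy first-order plant response
print(f"output after {step + 1} steps: {y:.4f} (target {target})")
```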

  16. Feedback enhanced plasma spray tool

    DOEpatents

    Gevelber, Michael Alan; Wroblewski, Donald Edward; Fincke, James Russell; Swank, William David; Haggard, Delon C.; Bewley, Randy Lee

    2005-11-22

    An improved automatic feedback control scheme enhances plasma spraying of powdered material through reduction of process variability and providing better ability to engineer coating structure. The present inventors discovered that controlling centroid position of the spatial distribution along with other output parameters, such as particle temperature, particle velocity, and molten mass flux rate, vastly increases control over the sprayed coating structure, including vertical and horizontal cracks, voids, and porosity. It also allows improved control over graded layers or compositionally varying layers of material, reduces variations, including variation in coating thickness, and allows increasing deposition rate. Various measurement and system control schemes are provided.

  17. Examination of a carton sealing line using a thermographic scanner

    NASA Astrophysics Data System (ADS)

    Kleinfeld, Jack M.

    1999-03-01

    A study of the operation and performance of natural gas-fired sealing lines for polyethylene-coated beverage containers was performed. Both thermal and geometric data were extracted from the thermal scans and used to characterize the performance of the sealing line. The impact of process operating variables such as line speed and carton-to-carton spacing was studied. Recommendations for system improvements, instrumentation and process control were made.

  18. Proteomic Prediction of Breast Cancer Risk: A Cohort Study

    DTIC Science & Technology

    2007-03-01

    (c) Data processing. Data analysis was performed using in-house software (Du P, Angeletti RH. Automatic deconvolution of ... isotope-resolved mass spectra using variable selection and quantized peptide mass distribution. Anal Chem., 78:3385-92, 2006; P Du, R Sudha, MB ... control). Reportable Outcomes: So far our publications have been on the development of algorithms for signal processing: 1. Du P, Angeletti RH ...

  19. Reliability Studies of Ceramic Capacitors.

    DTIC Science & Technology

    1984-10-01

    Virginia Polytechnic: BaTiO3 specimens with variable composition, density and grain size to be used to make carrier concentration, mobility, thermoelectric ... At low fields, observed steady-state electrical behavior will be controlled by the bulk properties of the insulator, the second phase of the conduction ... = carrier mobility, E = applied field. Note that bulk properties of the insulator control the conduction process. From this equation it can be seen that a ...

  20. Planform and mobility in the Meaípe-Maimbá embayed beach on the South East coast of Brazil

    NASA Astrophysics Data System (ADS)

    Albino, Jacqueline; Jiménez, José A.; Oliveira, Tiago C. A.

    2016-01-01

    The Meaípe-Maimbá embayed beach (MMEB) on the south-east coast of Brazil has been subject to anthropogenic pressures since the 1970s. In this study we discuss the adequacy and contribution of the parabolic planform model for determining the planform and variability of the MMEB, taking into consideration variation in wave conditions. The role of different controlling conditions in planform variability is analyzed, as well as the morphological and planform mobility. The MMEB exhibited a new configuration in response to the construction of a harbor, which interrupted the longshore sediment transport. After four decades, three particular morphodynamic sectors have been recognized along the beach. The central sector is more exposed to normal wave incidence and cross-shore processes predominate. The northern and southern sectors are influenced by wave diffraction processes around the headlands and port, respectively. In the northern sector, the presence of secondary headlands and inner islands imposed a geomorphological control on beach morphology and coastal processes. The use of the parabolic planform model provided useful insights for the assessment of potential planform mobility, since the decadal shoreline evolution combined with beach profiles and sediment characteristics allowed understanding of the beach mobility processes and supported the interpretation of modeling results.
