Sample records for critical process variables

  1. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.

    1974-01-01

A two-year study of the major process variables associated with the manufacturing process for sealed, nickel-cadmium, aerospace cells is summarized. Effort was directed toward identifying the major process variables associated with the manufacturing process, experimentally assessing each variable's effect, and imposing the necessary changes (optimization) and controls on the critical process variables to improve results and uniformity. The critical process variable associated with the sintered nickel plaque manufacturing process was identified as the manual forming operation. Critical process variables identified with the positive electrode impregnation/polarization process were impregnation solution temperature, free acid content, vacuum impregnation, and sintered plaque strength. Positive and negative electrodes were identified as a major source of carbonate contamination in sealed cells.

  2. [Effects of situational and individual variables on critical thinking expression].

    PubMed

    Tanaka, Yuko; Kusumi, Takashi

    2016-04-01

The present study examined when people choose an expression based on critical thinking, and how situational and individual variables affect that decision process. Given a conversation scenario with two friends that included an overgeneralization, participants decided whether or not to continue the conversation with a critical-thinking expression. Purpose and topic were controlled as situational variables, and critical-thinking ability, critical-thinking disposition, and self-monitoring were measured as individual variables. The experiment counterbalanced the situational variables in a within-subject design with 60 university students. Logistic regression analysis showed within-individual differences in the decision whether to choose a critical-thinking expression, and showed that some situational factors and some subscales of the individual measures were related to these differences.

  3. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, driven by gradually introducing water into the ternary system naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D count-time-chord-length map revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord length distribution were evaluated systematically using a 3³ full factorial design. Critical process variables were identified via ANOVA for both the transition and steady states. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effect of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R² of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the critical process variables' ranges. This work demonstrated the utility of an integrated PAT approach for QbD development. Published by Elsevier B.V.
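
A full factorial screening like the 3³ design above (three variables at three levels each, giving 27 runs) is straightforward to enumerate. The sketch below is illustrative only; the level values assigned to slurry temperature, stirring rate, and water addition rate are assumptions, not values from the paper.

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Enumerate every run of a full factorial design.

    levels_per_factor: dict mapping factor name -> list of levels.
    Returns a list of dicts, one per experimental run.
    """
    names = list(levels_per_factor)
    return [dict(zip(names, combo))
            for combo in product(*(levels_per_factor[n] for n in names))]

# Hypothetical level settings for the three high-risk variables.
design = full_factorial({
    "slurry_temp_C": [5, 15, 25],
    "stir_rate_rpm": [200, 400, 600],
    "water_rate_mL_min": [1, 2, 4],
})
print(len(design))  # 3**3 = 27 runs
```

Each level of each factor appears in exactly 9 of the 27 runs, which is what lets ANOVA separate the main effects cleanly.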

  4. Quality by Design approach for studying the impact of formulation and process variables on product quality of oral disintegrating films.

    PubMed

    Mazumder, Sonal; Pavurala, Naresh; Manda, Prashanth; Xu, Xiaoming; Cruz, Celia N; Krishnaiah, Yellela S R

    2017-07-15

The present investigation was carried out to understand the impact of formulation and process variables on the quality of oral disintegrating films (ODF) using a Quality by Design (QbD) approach. Lamotrigine (LMT) was used as a model drug. The formulation variable was the plasticizer-to-film-former ratio, and the process variables were drying temperature, air flow rate in the drying chamber, drying time, and wet coat thickness of the film. A Definitive Screening Design of Experiments (DoE) was used to identify and classify the critical formulation and process variables impacting critical quality attributes (CQA). A total of 14 laboratory-scale DoE formulations were prepared and evaluated for mechanical properties (% elongation at break, yield stress, Young's modulus, folding endurance) and other CQA (dry thickness, disintegration time, dissolution rate, moisture content, moisture uptake, drug assay, and drug content uniformity). The main factors affecting mechanical properties were the plasticizer-to-film-former ratio and drying temperature. Dissolution rate was found to be sensitive to air flow rate during drying and to the plasticizer-to-film-former ratio. Data were analyzed to elucidate interactions between different variables, rank-order the critical material attributes (CMA) and critical process parameters (CPP), and provide a predictive model for the process. Results suggested that the plasticizer-to-film-former ratio and process controls on drying are critical to manufacturing LMT ODF with the desired CQA. Published by Elsevier B.V.

  5. Rate of Information-Processing as a Variable of Critical Reading.

    ERIC Educational Resources Information Center

    Van Voorhees, Sylvia Nash

This study investigated the relationship between rate of information processing and critical reading and the ancillary effect of anxiety on the two variables. A tenth-grade sample, consisting of 52 fast readers and 52 slow readers, was identified. All subjects in the sample had intelligence quotients of 120 or higher and vocabulary and…

  6. Costing improvement of remanufacturing crankshaft by integrating Mahalanobis-Taguchi System and Activity based Costing

    NASA Astrophysics Data System (ADS)

    Abu, M. Y.; Nor, E. E. Mohd; Rahman, M. S. Abd

    2018-04-01

Integration between quality and costing systems is crucial to achieving accurate product cost and profit. In current practice, most remanufacturers do not optimize the remanufacturing process, so incorrect variables are fed into the costing system; meanwhile, the traditional cost accounting they practice distorts unit costs, leading to inaccurate product costs. The aim of this work is to identify the critical and non-critical variables in the remanufacturing process using the Mahalanobis-Taguchi System and simultaneously estimate cost using the Activity Based Costing method. An orthogonal array was applied to indicate the contribution of each variable in the factorial effect graph, and the critical variables were matched with the overhead costs of the activities that actually demand them. This work improved quality inspection together with the costing system to produce accurate profitability information. As a result, the cost per unit of a remanufactured crankshaft for the MAN engine model, with 5 critical crankpins, is MYR 609.50, while for the Detroit engine model, with 4 critical crankpins, it is MYR 1254.80. The significance of the output is demonstrated through promoting green practice: reducing the re-melting of damaged parts ensures a consistent benefit from returned cores.

  7. Latent variable modeling to analyze the effects of process parameters on the dissolution of paracetamol tablet

    PubMed Central

    Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Shi, Xinyuan; Qiao, Yanjiang

    2017-01-01

The dissolution is one of the critical quality attributes (CQAs) of oral solid dosage forms because it relates to the absorption of the drug. In this paper, the influence of raw materials, granules, and process parameters on the dissolution of paracetamol tablets was analyzed using latent variable modeling methods. The variability in raw materials and in granules was characterized using principal component analysis (PCA). A multi-block partial least squares (MBPLS) model was used to determine the critical factors affecting dissolution. The results showed that the binder amount, the post-granulation time, the API content in the granules, the fill depth, and the punch tip separation distance were the critical factors, with variable importance in the projection (VIP) values larger than 1. The importance of each unit of the whole process was also ranked using the block importance in the projection (BIP) index. It was concluded that latent variable models (LVMs) are very useful tools for extracting information from the available data and improving understanding of the dissolution behavior of paracetamol tablets. The obtained LVMs are also helpful for proposing the process design space and for designing control strategies in further research. PMID:27689242
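
The PCA step described above, summarizing batch-to-batch variability in material and granule properties, can be sketched with a minimal SVD-based implementation. The data here are synthetic and the dimensions (30 batches, 6 properties) are assumptions for illustration:

```python
import numpy as np

def pca(X, k=2):
    """Minimal PCA via SVD: returns scores, loadings, and the
    fraction of total variance explained by each of k components."""
    Xc = X - X.mean(axis=0)                 # mean-center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var_explained = s**2 / np.sum(s**2)     # per-component variance ratio
    scores = U[:, :k] * s[:k]               # sample (batch) coordinates
    loadings = Vt[:k].T                     # variable contributions
    return scores, loadings, var_explained[:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 6))                # 30 batches, 6 granule properties
scores, loadings, ratio = pca(X, k=2)
```

Batches that fall far from the origin in the score plot are the atypical ones; the loadings show which measured properties drive that separation.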

  8. [Clinical reasoning in undergraduate nursing education: a scoping review].

    PubMed

    Menezes, Sáskia Sampaio Cipriano de; Corrêa, Consuelo Garcia; Silva, Rita de Cássia Gengo E; Cruz, Diná de Almeida Monteiro Lopes da

    2015-12-01

    This study aimed at analyzing the current state of knowledge on clinical reasoning in undergraduate nursing education. A systematic scoping review through a search strategy applied to the MEDLINE database, and an analysis of the material recovered by extracting data done by two independent reviewers. The extracted data were analyzed and synthesized in a narrative manner. From the 1380 citations retrieved in the search, 23 were kept for review and their contents were summarized into five categories: 1) the experience of developing critical thinking/clinical reasoning/decision-making process; 2) teaching strategies related to the development of critical thinking/clinical reasoning/decision-making process; 3) measurement of variables related to the critical thinking/clinical reasoning/decision-making process; 4) relationship of variables involved in the critical thinking/clinical reasoning/decision-making process; and 5) theoretical development models of critical thinking/clinical reasoning/decision-making process for students. The biggest challenge for developing knowledge on teaching clinical reasoning seems to be finding consistency between theoretical perspectives on the development of clinical reasoning and methodologies, methods, and procedures in research initiatives in this field.

  9. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.; Doan, D. J.; Carr, E. S.

    1971-01-01

    A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.

  10. Statistical modeling methods to analyze the impacts of multiunit process variability on critical quality attributes of Chinese herbal medicine tablets.

    PubMed

    Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Yang, Chan; Cui, Xianglong; Shi, Xinyuan; Qiao, Yanjiang

    2016-01-01

    The quality of Chinese herbal medicine tablets suffers from batch-to-batch variability due to a lack of manufacturing process understanding. In this paper, the Panax notoginseng saponins (PNS) immediate release tablet was taken as the research subject. By defining the dissolution of five active pharmaceutical ingredients and the tablet tensile strength as critical quality attributes (CQAs), influences of both the manipulated process parameters introduced by an orthogonal experiment design and the intermediate granules' properties on the CQAs were fully investigated by different chemometric methods, such as the partial least squares, the orthogonal projection to latent structures, and the multiblock partial least squares (MBPLS). By analyzing the loadings plots and variable importance in the projection indexes, the granule particle sizes and the minimal punch tip separation distance in tableting were identified as critical process parameters. Additionally, the MBPLS model suggested that the lubrication time in the final blending was also important in predicting tablet quality attributes. From the calculated block importance in the projection indexes, the tableting unit was confirmed to be the critical process unit of the manufacturing line. The results demonstrated that the combinatorial use of different multivariate modeling methods could help in understanding the complex process relationships as a whole. The output of this study can then be used to define a control strategy to improve the quality of the PNS immediate release tablet.
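
The VIP > 1 cutoff used above for flagging critical process parameters has a built-in interpretation: VIP scores are scaled so that their squared mean is exactly 1, so a score above 1 marks an above-average contribution. A minimal single-component PLS sketch on synthetic data (not the paper's multiblock model; the dimensions and coefficients are assumptions):

```python
import numpy as np

def vip_one_component(X, y):
    """VIP scores from a single-component PLS (NIPALS) model.

    For one latent variable, VIP_j reduces to sqrt(p) * |w_j|, where
    w is the unit-norm weight vector and p the number of variables,
    so mean(VIP**2) is always exactly 1.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # unit-norm PLS weight vector
    p = X.shape[1]
    return np.sqrt(p) * np.abs(w)

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 8))        # 40 batches, 8 process parameters
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.1, size=40)
vip = vip_one_component(X, y)
critical = np.where(vip > 1.0)[0]   # candidate critical parameters
```

With y driven by parameters 0 and 3, those two columns receive VIP scores well above 1 while the inert parameters fall below it.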

  11. How Do Microphysical Processes Influence Large-Scale Precipitation Variability and Extremes?

    DOE PAGES

    Hagos, Samson; Ruby Leung, L.; Zhao, Chun; ...

    2018-02-10

Convection permitting simulations using the Model for Prediction Across Scales-Atmosphere (MPAS-A) are used to examine how microphysical processes affect large-scale precipitation variability and extremes. An episode of the Madden-Julian Oscillation is simulated using MPAS-A with a refined region at 4-km grid spacing over the Indian Ocean. It is shown that cloud microphysical processes regulate the precipitable water (PW) statistics. Because of the non-linear relationship between precipitation and PW, PW exceeding a certain critical value (PWcr) contributes disproportionately to precipitation variability. However, the frequency of PW exceeding PWcr decreases rapidly with PW, so changes in microphysical processes that shift the column PW statistics relative to PWcr even slightly have large impacts on precipitation variability. Furthermore, precipitation variance and extreme precipitation frequency are approximately linearly related to the difference between the mean and critical PW values. Thus observed precipitation statistics could be used to directly constrain model microphysical parameters, as this study demonstrates using radar observations from the DYNAMO field campaign.
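
The disproportionate sensitivity described above can be illustrated with an idealized "ramp" relation between precipitation and column PW: rain is negligible below PWcr and increases steeply above it. The numbers below (critical value, slope, PW distribution) are illustrative assumptions, not values from the MPAS-A simulations:

```python
import numpy as np

PW_CR = 60.0        # critical precipitable water, mm (assumed)
ALPHA = 2.0         # ramp slope above PWcr (assumed)

def precip(pw):
    """Precipitation as a nonlinear ramp function of PW."""
    return ALPHA * np.maximum(pw - PW_CR, 0.0)

rng = np.random.default_rng(2)
pw = rng.normal(loc=52.0, scale=6.0, size=100_000)  # column PW samples, mm
p = precip(pw)

# Columns exceeding PWcr are rare yet supply all of the rain, so even a
# small shift of the PW distribution toward PWcr changes precipitation
# variance disproportionately.
frac_above = np.mean(pw > PW_CR)
var_base = p.var()
var_shifted = precip(pw + 2.0).var()  # mean PW shifted up by just 2 mm
```

Shifting the mean PW by only 2 mm (a few percent) substantially inflates the precipitation variance, mirroring the paper's point that slight microphysical shifts of the PW statistics relative to PWcr have large impacts.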

  12. How Do Microphysical Processes Influence Large-Scale Precipitation Variability and Extremes?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Ruby Leung, L.; Zhao, Chun

Convection permitting simulations using the Model for Prediction Across Scales-Atmosphere (MPAS-A) are used to examine how microphysical processes affect large-scale precipitation variability and extremes. An episode of the Madden-Julian Oscillation is simulated using MPAS-A with a refined region at 4-km grid spacing over the Indian Ocean. It is shown that cloud microphysical processes regulate the precipitable water (PW) statistics. Because of the non-linear relationship between precipitation and PW, PW exceeding a certain critical value (PWcr) contributes disproportionately to precipitation variability. However, the frequency of PW exceeding PWcr decreases rapidly with PW, so changes in microphysical processes that shift the column PW statistics relative to PWcr even slightly have large impacts on precipitation variability. Furthermore, precipitation variance and extreme precipitation frequency are approximately linearly related to the difference between the mean and critical PW values. Thus observed precipitation statistics could be used to directly constrain model microphysical parameters, as this study demonstrates using radar observations from the DYNAMO field campaign.

  13. HIGH-SHEAR GRANULATION PROCESS: INFLUENCE OF PROCESSING PARAMETERS ON CRITICAL QUALITY ATTRIBUTES OF ACETAMINOPHEN GRANULES AND TABLETS USING DESIGN OF EXPERIMENT APPROACH.

    PubMed

    Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Al-Shedfat, Ramadan I

    2017-01-01

Application of quality by design (QbD) to the high-shear granulation process is critical and requires recognizing the correlation between the granulation process parameters and the properties of the intermediate (granules) and corresponding final product (tablets). The present work examined the influence of water amount (X₁) and wet massing time (X₂) as independent process variables on the critical quality attributes of granules and corresponding tablets using the design of experiment (DoE) technique. A two-factor, three-level (3²) full factorial design was performed; each variable was investigated at three levels to characterize its strength and interactions. The dried granules were analyzed for size distribution, density, and flow pattern. Additionally, the produced tablets were investigated for weight uniformity, crushing strength, friability and percent capping, disintegration time, and drug dissolution. A statistically significant impact (p < 0.05) of water amount was identified for granule growth, percent fines, distribution width, and flow behavior. Granule density and compressibility were found to be significantly influenced (p < 0.05) by both operating conditions. Water amount also had a significant effect (p < 0.05) on tablet weight uniformity, friability, and percent capping. Moreover, tablet disintegration time and drug dissolution appeared to be significantly influenced (p < 0.05) by both process variables. The relationship of the process parameters to the critical quality attributes of the granules and the final tablet product was thus identified and correlated. Ultimately, judicious selection of process parameters in the high-shear granulation process will allow products of the desired quality to be produced.

  14. Effects of process variables on the properties of YBa2Cu3O(7-x) ceramics formed by investment casting

    NASA Technical Reports Server (NTRS)

    Hooker, M. W.; Taylor, T. D.; Leigh, H. D.; Wise, S. A.; Buckley, J. D.; Vasquez, P.; Buck, G. M.; Hicks, L. P.

    1993-01-01

    An investment casting process has been developed to produce net-shape, superconducting ceramics. In this work, a factorial experiment was performed to determine the critical process parameters for producing cast YBa2Cu3O7 ceramics with optimum properties. An analysis of variance procedure indicated that the key variables in casting superconductive ceramics are the particle size distribution and sintering temperature. Additionally, the interactions between the sintering temperature and the other process parameters (e.g., particle size distribution and the use of silver dopants) were also found to influence the density, porosity, and critical current density of the fired ceramics.

15. Whole blood flow cytometry measurements of in vivo platelet activation in critically-ill patients are influenced by variability in blood sampling techniques.

    PubMed

    Rondina, Matthew T; Grissom, Colin K; Men, Shaohua; Harris, Estelle S; Schwertz, Hansjorg; Zimmerman, Guy A; Weyrich, Andrew S

    2012-06-01

Flow cytometry is often used to measure in vivo platelet activation in critically-ill patients. Variability in blood sampling techniques, which may confound these measurements, remains poorly characterized. Platelet activation was measured by flow cytometry performed on arterial and venous blood from 116 critically-ill patients. We determined how variability in vascular sampling site, processing times, and platelet counts influenced levels of platelet-monocyte aggregates (PMA), PAC-1 binding (for glycoprotein (GP) IIbIIIa), and P-selectin (P-SEL) expression. Levels of PMA, but not PAC-1 binding or P-SEL expression, were significantly affected by variability in vascular sampling site. Average PMA levels were approximately 60% higher in whole blood drawn from an arterial vessel than in venous blood (16.2±1.8% vs. 10.7±1.2%, p<0.05). Levels of PMA in both arterial and venous blood increased significantly during ex vivo processing delays (a 1.7% increase for every 10-minute delay, p<0.05). In contrast, PAC-1 binding and P-SEL expression were unaffected by processing delays. Levels of PMA, but not PAC-1 binding or P-SEL expression, were correlated with platelet count quartiles (9.4±1.6% for the lowest quartile versus 15.4±1.6% for the highest quartile, p<0.05). In critically-ill patients, variability in vascular sampling site, processing times, and platelet counts influences levels of PMA, but not PAC-1 binding or P-SEL expression. These data demonstrate the need for rigorous adherence to blood sampling protocols, particularly when levels of PMA, which are most sensitive to variations in blood collection, are measured for detection of in vivo platelet activation. Copyright © 2011 Elsevier Ltd. All rights reserved.
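
The reported drift (roughly 1.7 percentage points of PMA per 10 minutes of processing delay) implies a simple back-correction of a delayed measurement to the time of blood draw. The linear form and the function below are an illustrative assumption, not a validated correction from the paper:

```python
# Drift in platelet-monocyte aggregates (PMA) during ex vivo processing,
# from the abstract: ~1.7 percentage points per 10 minutes of delay.
DRIFT_PER_MIN = 1.7 / 10.0   # percentage points of PMA per minute (assumed linear)

def pma_at_draw(measured_pma_pct, delay_min):
    """Estimate PMA at the time of blood draw from a delayed measurement."""
    return measured_pma_pct - DRIFT_PER_MIN * delay_min

print(round(pma_at_draw(16.2, 30), 1))  # 16.2 - 5.1 -> 11.1
```

Even a 30-minute delay shifts the measurement by about 5 percentage points, comparable to the arterial-versus-venous difference reported above, which is why standardized processing times matter.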

  16. Statistical modeling methods to analyze the impacts of multiunit process variability on critical quality attributes of Chinese herbal medicine tablets

    PubMed Central

    Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Yang, Chan; Cui, Xianglong; Shi, Xinyuan; Qiao, Yanjiang

    2016-01-01

    The quality of Chinese herbal medicine tablets suffers from batch-to-batch variability due to a lack of manufacturing process understanding. In this paper, the Panax notoginseng saponins (PNS) immediate release tablet was taken as the research subject. By defining the dissolution of five active pharmaceutical ingredients and the tablet tensile strength as critical quality attributes (CQAs), influences of both the manipulated process parameters introduced by an orthogonal experiment design and the intermediate granules’ properties on the CQAs were fully investigated by different chemometric methods, such as the partial least squares, the orthogonal projection to latent structures, and the multiblock partial least squares (MBPLS). By analyzing the loadings plots and variable importance in the projection indexes, the granule particle sizes and the minimal punch tip separation distance in tableting were identified as critical process parameters. Additionally, the MBPLS model suggested that the lubrication time in the final blending was also important in predicting tablet quality attributes. From the calculated block importance in the projection indexes, the tableting unit was confirmed to be the critical process unit of the manufacturing line. The results demonstrated that the combinatorial use of different multivariate modeling methods could help in understanding the complex process relationships as a whole. The output of this study can then be used to define a control strategy to improve the quality of the PNS immediate release tablet. PMID:27932865

  17. A critical assessment of in-flight particle state during plasma spraying of YSZ and its implications on coating properties and process reliability

    NASA Astrophysics Data System (ADS)

    Srinivasan, Vasudevan

Air plasma spray is inherently complex owing to its deviation from equilibrium conditions, its three-dimensional nature, the multitude of interrelated (controllable) parameters and (uncontrollable) variables involved, and stochastic variability at different stages. The resultant coatings are complex owing to their layered, high-defect-density microstructure. Despite widespread use and commercial success for decades in the earthmoving, automotive, aerospace, and power generation industries, plasma spray has not been completely understood, and prime reliance for critical applications such as thermal barrier coatings on gas turbines is yet to be accomplished. This dissertation is aimed at understanding the in-flight particle state of the plasma spray process, toward designing coatings and achieving coating reliability with the aid of noncontact in-flight particle and spray stream sensors. Key issues such as the phenomenon of optimum particle injection and the definition of the spray stream using particle state are investigated. A few strategies to modify the microstructure and properties of yttria-stabilized zirconia coatings are examined systematically within the framework of process maps. An approach to designing a process window based on design-relevant coating properties is presented. Options for controlling the process for enhanced reproducibility and reliability are examined, and the resultant variability is evaluated systematically at the different stages of the process. The 3D variability due to differences in plasma characteristics has been critically examined by investigating splats collected from the entire spray footprint.

  18. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.

    1972-01-01

    The effort and results of a program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells are reported. During the period, the impregnation/polarization process variable study was brought to a close with the completion of a series of related experiments. The results of the experiments are summarized. During this period, a general characterization of cell separator materials was initiated. The major conclusions resulting from the characterization of materials are included.

  19. Quality by design approach for understanding the critical quality attributes of cyclosporine ophthalmic emulsion.

    PubMed

    Rahman, Ziyaur; Xu, Xiaoming; Katragadda, Usha; Krishnaiah, Yellela S R; Yu, Lawrence; Khan, Mansoor A

    2014-03-03

Restasis is an ophthalmic cyclosporine emulsion used for the treatment of dry eye syndrome. There is no generic version of this product, probably because of the limitations of established in vivo bioequivalence methods and the lack of alternative in vitro bioequivalence testing methods. The present investigation was carried out to understand and identify appropriate in vitro methods that can discriminate the effect of formulation and process variables on critical quality attributes (CQA) of cyclosporine microemulsion formulations having the same qualitative (Q1) and quantitative (Q2) composition as Restasis. A quality by design (QbD) approach was used to understand the effect of formulation and process variables on the CQA of the cyclosporine microemulsion. The formulation variables chosen were mixing order method, phase volume ratio, and pH adjustment method, while the process variables were the temperature of primary and raw emulsion formation, microfluidizer pressure, and number of pressure cycles. The responses selected were particle size, turbidity, zeta potential, viscosity, osmolality, surface tension, contact angle, pH, and drug diffusion. The selected independent variables showed a statistically significant (p < 0.05) effect on droplet size, zeta potential, viscosity, turbidity, and osmolality. However, surface tension, contact angle, pH, and drug diffusion were not significantly affected by the independent variables. In summary, in vitro methods can detect formulation and manufacturing changes and would thus be important for quality control or for demonstrating sameness of cyclosporine ophthalmic products.

  20. Computer Optimization of Biodegradable Nanoparticles Fabricated by Dispersion Polymerization.

    PubMed

    Akala, Emmanuel O; Adesina, Simeon; Ogunwuyi, Oluwaseun

    2015-12-22

Quality by design (QbD) in the pharmaceutical industry involves designing and developing drug formulations and manufacturing processes that ensure predefined drug product specifications. QbD helps in understanding how process and formulation variables affect product characteristics, and in the subsequent optimization of these variables vis-à-vis final specifications. Statistical design of experiments (DoE) identifies the important parameters in a pharmaceutical dosage form design and then optimizes those parameters with respect to certain specifications. DoE establishes, in mathematical form, the relationships between critical process parameters, critical material attributes, and critical quality attributes. We focused on the fabrication of biodegradable nanoparticles by dispersion polymerization. Aided by statistical software, a d-optimal mixture design was used to vary the components (crosslinker, initiator, stabilizer, and macromonomers) to obtain twenty nanoparticle formulations (PLLA-based nanoparticles) and thirty formulations (poly-ɛ-caprolactone-based nanoparticles). Scheffe polynomial models were generated to predict particle size (nm), zeta potential, and yield (%) as functions of the composition of the formulations. Simultaneous optimizations were carried out on the response variables. Solutions were returned from simultaneous optimization of the response variables for component combinations to (1) minimize nanoparticle size; (2) maximize the surface negative zeta potential; and (3) maximize percent yield to make nanoparticle fabrication an economic proposition.
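
Scheffe polynomial models for mixtures, as used above, have no intercept because the component proportions sum to 1. The sketch below fits a second-order Scheffe model for a three-component mixture via least squares; the design points shown are a simplex-centroid design and the coefficients are invented for illustration (the paper's actual d-optimal points and component identities are not reproduced here):

```python
import numpy as np

def scheffe_quadratic(X):
    """Model matrix for the Scheffe second-order mixture polynomial
    y = b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3."""
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1*x2, x1*x3, x2*x3])

# Simplex-centroid design: proportions of each blend sum to 1.
design = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1],              # pure components
    [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5],  # binary blends
    [1/3, 1/3, 1/3],                              # overall centroid
])
true_b = np.array([250.0, 180.0, 300.0, -60.0, 40.0, 20.0])  # assumed
y = scheffe_quadratic(design) @ true_b            # e.g. particle size, nm
b_hat, *_ = np.linalg.lstsq(scheffe_quadratic(design), y, rcond=None)
```

With noise-free responses and a full-rank design, the fit recovers the coefficients exactly; the cross terms (b12, b13, b23) capture synergistic or antagonistic blending effects.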

  1. A quality by design approach to understand formulation and process variability in pharmaceutical melt extrusion processes.

    PubMed

    Patwardhan, Ketaki; Asgarzadeh, Firouz; Dassinger, Thomas; Albers, Jessica; Repka, Michael A

    2015-05-01

In this study, the principles of quality by design (QbD) have been uniquely applied to a pharmaceutical melt extrusion process for an immediate-release formulation with a low-melting model drug, ibuprofen. Two qualitative risk assessment tools - a Fishbone diagram and failure mode effect analysis - were utilized to strategically narrow down the most influential parameters. Selected variables were further assessed using a Plackett-Burman screening study, which was upgraded to a response surface design consisting of the critical factors to study the interactions between the study variables. In-process torque, glass transition temperature (Tg) of the extrudates, assay, dissolution, and phase change were measured as responses to evaluate the critical quality attributes (CQAs) of the extrudates. The effect of each study variable on the measured responses was analysed using multiple regression for the screening design and partial least squares for the optimization design. Experimental limits for formulation and process parameters to attain optimum processing have been outlined. A design space plot describing the domain of experimental variables within which the CQAs remained unchanged was developed. A comprehensive approach for melt extrusion product development based on the QbD methodology has been demonstrated. Drug loading concentrations between 40-48% w/w and extrusion temperatures in the range of 90-130°C were found to be the most optimum. © 2015 Royal Pharmaceutical Society.
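
A Plackett-Burman screening design like the one used above estimates many main effects in very few runs. The classic 12-run design (screening up to 11 factors) is built from cyclic shifts of the standard generating row of Plackett and Burman (1946); the sketch below is generic, not the specific factor assignment from this study:

```python
import numpy as np

# Standard 12-run Plackett-Burman generating row (+1/-1 factor settings).
GEN = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

def pb12():
    """12-run Plackett-Burman design: 11 cyclic shifts of the
    generating row plus a final row of all -1 settings."""
    rows = [np.roll(GEN, k) for k in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

D = pb12()
# Each of the 11 columns carries one candidate variable (unused columns
# estimate error). Every column is balanced (six +1, six -1) and any two
# columns are orthogonal, so main effects are estimated independently.
```

Orthogonality is what allows the screening study to rank 11 factors from only 12 extrusion runs before the critical few are carried into the response surface design.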

  2. Consequences of variable reproduction for seedling recruitment in three neotropical tree species

    Treesearch

    Diane De Steven; S. Joesph Wright

    2002-01-01

Variable seed production may have important consequences for recruitment, but these are poorly documented for frugivore-dispersed tropical trees. Recruitment limitation may also be a critical spatial process affecting forest dynamics, but it is rarely assessed at the scale of individual trees. Over an 11-yr period, we studied the consequences of variable seed production for...

  3. Formulation and process factors influencing product quality and in vitro performance of ophthalmic ointments.

    PubMed

    Xu, Xiaoming; Al-Ghabeish, Manar; Rahman, Ziyaur; Krishnaiah, Yellela S R; Yerlikaya, Firat; Yang, Yang; Manda, Prashanth; Hunt, Robert L; Khan, Mansoor A

    2015-09-30

    Owing to its unique anatomical and physiological functions, the ocular surface presents special challenges for both the design and the performance evaluation of ophthalmic ointment drug products formulated with a variety of bases. The current investigation was carried out to understand and identify in vitro methods suitable for quality and performance evaluation of ophthalmic ointments, and to study the effect of formulation and process variables on their critical quality attributes (CQAs). The critical formulation variables evaluated were initial API particle size, drug percentage, and mineral oil percentage, while the critical process parameters were mixing rate, temperature, time, and cooling rate. The quality and performance attributes investigated were drug assay, content uniformity, API particle size in the ointment, rheological characteristics, in vitro drug release, and in vitro transcorneal drug permeation. Using design of experiments (DoE) as well as a novel principal component analysis approach, five of the quality and performance attributes (API particle size, storage modulus of the ointment, high-shear viscosity of the ointment, in vitro drug release constant, and in vitro transcorneal drug permeation rate constant) were found to be highly influenced by the formulation, in particular the strength of the API, and to a lesser degree by the processing variables. Correlating ocular physiology with the physicochemical characteristics of acyclovir ophthalmic ointment suggested that in vitro quality metrics could be a valuable predictor of its in vivo performance. Published by Elsevier B.V.
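    The principal component analysis step described above reduces a batches-by-attributes table to a few orthogonal components. A minimal numpy sketch of that idea (the matrix here is random stand-in data, not the study's measurements; batch and attribute counts are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
Q = rng.normal(size=(12, 5))            # 12 batches x 5 quality attributes

Qc = Q - Q.mean(axis=0)                 # mean-center each attribute
# Thin SVD of the centered data: rows of Vt are the principal axes.
U, s, Vt = np.linalg.svd(Qc, full_matrices=False)
scores = U * s                          # batch scores on each component
explained = s**2 / np.sum(s**2)         # fraction of variance per component
```

In practice one would also scale each attribute to unit variance before the SVD, since the CQAs (particle size, viscosity, release constants) live on very different scales.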

  4. Can Process Understanding Help Elucidate The Structure Of The Critical Zone? Comparing Process-Based Soil Formation Models With Digital Soil Mapping.

    NASA Astrophysics Data System (ADS)

    Vanwalleghem, T.; Román, A.; Peña, A.; Laguna, A.; Giráldez, J. V.

    2017-12-01

    There is a need to better understand the processes influencing soil formation and the resulting distribution of soil properties in the critical zone. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas are especially uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of traditional digital soil mapping versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m⁻². Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates explained only 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain the variability in soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.

  5. [Application of quality by design in granulation process for ginkgo leaf tablet (Ⅱ): identification of critical quality attributes].

    PubMed

    Xu, Bing; Cui, Xiang-Long; Yang, Chan; Wang, Xin; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    Quality by design (QbD) highlights the concept of "beginning with the end": thoroughly understand the target product quality first, and then use it to guide pharmaceutical process development and quality control throughout the whole manufacturing process. In this paper, Ginkgo biloba granule intermediates were taken as the research object, and the requirements on the tensile strength of the tablets were treated as the goal, in order to establish methods for identifying the granules' critical quality attributes (CQAs) and setting their limits. Firstly, an orthogonal partial least squares (OPLS) model was adopted to relate the micromeritic properties of 29 batches of granules to the tensile strength of the ginkgo leaf tablets, and potential critical quality attributes (pCQAs) were screened by the variable importance in projection (VIP) index. Then, a series of OPLS models was rebuilt by removing pCQA variables one at a time in order of increasing VIP value. The results demonstrated that the calibration and predictive performance of the model showed no decreasing trend after variable reduction. Considering the results of variable selection together with the collinearity and testability of the pCQAs, the median particle size (D₅₀) and the bulk density (Da) were identified as critical quality attributes (CQAs). The design space of the CQAs was developed from a multiple linear regression model established between the CQAs (D₅₀ and Da) and the tensile strength. The control limits of the CQAs were determined as 170 μm < D₅₀ < 500 μm and 0.30 g•cm⁻³
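    The VIP screening step above can be sketched compactly: a minimal single-response PLS (NIPALS) fit in plain numpy, from which variable-importance-in-projection scores are computed. This is a generic illustration of the PLS/VIP idea rather than the authors' OPLS model, and the data below are random stand-ins for the 29 batches.

```python
import numpy as np

def pls1_vip(X, y, n_comp=2):
    """Minimal PLS1 via NIPALS; returns one VIP score per predictor column."""
    Xd, yd = X - X.mean(0), y - y.mean()
    W, T, Q = [], [], []
    for _ in range(n_comp):
        w = Xd.T @ yd
        w /= np.linalg.norm(w)              # unit weight vector
        t = Xd @ w                          # component scores
        q = (yd @ t) / (t @ t)              # y-loading
        p = (Xd.T @ t) / (t @ t)            # x-loadings
        Xd -= np.outer(t, p)                # deflate X
        yd = yd - q * t                     # deflate y
        W.append(w); T.append(t); Q.append(q)
    W, T, Q = np.array(W).T, np.array(T).T, np.array(Q)
    ssy = Q**2 * np.einsum('ij,ij->j', T, T)  # y-variance captured per component
    return np.sqrt(X.shape[1] * (W**2 @ ssy) / ssy.sum())

rng = np.random.default_rng(7)
X = rng.normal(size=(29, 8))                # 29 batches x 8 granule attributes
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=29)
vip = pls1_vip(X, y)    # attributes with VIP > 1 are the usual pCQA candidates
```

The common screening rule, as in the paper, is to retain variables with VIP greater than 1 and then re-fit after dropping the lowest-ranked ones.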

  6. Soil temperature variability in complex terrain measured using fiber-optic distributed temperature sensing

    USDA-ARS?s Scientific Manuscript database

    Soil temperature (Ts) exerts critical controls on hydrologic and biogeochemical processes but magnitude and nature of Ts variability in a landscape setting are rarely documented. Fiber optic distributed temperature sensing systems (FO-DTS) potentially measure Ts at high density over a large extent. ...

  7. A comprehensive strategy in the development of a cyclodextrin-modified microemulsion electrokinetic chromatographic method for the assay of diclofenac and its impurities: Mixture-process variable experiments and quality by design.

    PubMed

    Orlandini, S; Pasquini, B; Caprini, C; Del Bubba, M; Squarcialupi, L; Colotta, V; Furlanetto, S

    2016-09-30

    A comprehensive strategy involving the mixture-process variable (MPV) approach and Quality by Design principles has been applied in the development of a capillary electrophoresis method for the simultaneous determination of the anti-inflammatory drug diclofenac and its five related substances. The selected operative mode consisted of microemulsion electrokinetic chromatography with the addition of methyl-β-cyclodextrin. The critical process parameters included both the mixture components (MCs) of the microemulsion and the process variables (PVs). The MPV approach allowed the simultaneous investigation of the effects of MCs and PVs on the critical resolution between diclofenac and its 2-deschloro-2-bromo analogue and on analysis time. MPV experiments were used both in the screening phase and in the Response Surface Methodology, making it possible to draw MC and PV contour plots and to find important interactions between MCs and PVs. Robustness testing was carried out by MPV experiments, and validation was performed following International Conference on Harmonisation guidelines. The method was applied to a real sample of diclofenac gastro-resistant tablets. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. [Is the critical patient competent for decision taking? Psychological and psychopathological reasons of cognitive impairment].

    PubMed

    Bernat-Adell, M D; Ballester-Arnal, R; Abizanda-Campos, R

    2012-01-01

    Emotional factors may lead to cognitive impairment that can adversely affect patients' capacity to reason and thereby limit their participation in decision taking. The aims were to analyze critical patients' aptitude for decision taking and to identify variables that may influence competence. An observational descriptive study was carried out in an intensive care unit, with 29 critically ill patients as participants. Social, demographic and psychological variables were analyzed, and functional capacities and psychological reactions during the ICU stay were assessed. The patients are of the firm opinion that they should have the last word in decision taking; they prefer bad news to be given by the physician; and they feel that the presence of a psychologist would make the process easier. Failure on the part of the professional to answer their questions is perceived as the greatest stress factor. Greater depression is associated with lower cognitive capacity, and for patients with impaired cognitive capacity, participation in the decision-taking process constitutes a burden. The variables anxiety and depression are significantly related to decision-taking capacity. Copyright © 2011 Elsevier España, S.L. and SEMICYUC. All rights reserved.

  9. Lithologic Controls on Critical Zone Processes in a Variably Metamorphosed Shale-Hosted Watershed

    NASA Astrophysics Data System (ADS)

    Eldam Pommer, R.; Navarre-Sitchler, A.

    2017-12-01

    Local and regional shifts in thermal maturity within sedimentary shale systems impart significant variation in chemical and physical rock properties, such as pore-network morphology, mineralogy, organic carbon content, and solute release potential. Even slight variations in these properties on a watershed scale can strongly impact surface and shallow subsurface processes that drive soil formation, landscape evolution, and bioavailability of nutrients. Our ability to map and quantify the effects of this heterogeneity on critical zone processes is hindered by the complex coupling of the multi-scale nature of rock properties, geochemical signatures, and hydrological processes. This study addresses each of these complexities by synthesizing chemical and physical characteristics of variably metamorphosed shales in order to link rock heterogeneity with modern earth surface and shallow subsurface processes. More than 80 samples of variably metamorphosed Mancos Shale were collected in the East River Valley, Colorado, a headwater catchment of the Upper Colorado River Basin. Chemical and physical analyses of the samples show that metamorphism decreases overall rock porosity, pore anisotropy, and surface area, and introduces unique chemical signatures. All of these changes result in lower overall solute release from the Mancos Shale in laboratory dissolution experiments and a change in rock-derived solute chemistry with decreasing organic carbon and cation exchange capacity (Ca, Na, Mg, and K). The increase in rock competency and decrease in reactivity of the more thermally mature shales appear to subsequently control river morphology, with lower channel sinuosity associated with areas of the catchment underlain by metamorphosed Mancos Shale. This work illustrates the formative role of the geologic template on critical zone processes and landscape development within and across watersheds.

  10. Natural language processing to ascertain two key variables from operative reports in ophthalmology.

    PubMed

    Liu, Liyan; Shorstein, Neal H; Amsden, Laura B; Herrinton, Lisa J

    2017-04-01

    Antibiotic prophylaxis is critical to ophthalmology and other surgical specialties. We performed natural language processing (NLP) of 743 838 operative notes recorded for 315 246 surgeries to ascertain two variables needed to study the comparative effectiveness of antibiotic prophylaxis in cataract surgery. The first key variable was an exposure variable, intracameral antibiotic injection. The second was an intraoperative complication, posterior capsular rupture (PCR), which functioned as a potential confounder. To help other researchers use NLP in their settings, we describe our NLP protocol and lessons learned. For each of the two variables, we used SAS Text Miner and other SAS text-processing modules with a training set of 10 000 (1.3%) operative notes to develop a lexicon. The lexica identified misspellings, abbreviations, and negations, and linked words into concepts (e.g. "antibiotic" linked with "injection"). We confirmed the NLP tools by iteratively obtaining random samples of 2000 (0.3%) notes, with replacement. The NLP tools identified approximately 60 000 intracameral antibiotic injections and 3500 cases of PCR. The positive and negative predictive values for intracameral antibiotic injection exceeded 99%. For the intraoperative complication, they exceeded 94%. NLP was a valid and feasible method for obtaining critical variables needed for a research study of surgical safety. These NLP tools were intended for use in the study sample. Use with external datasets or future datasets in our own setting would require further testing. Copyright © 2017 John Wiley & Sons, Ltd.
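    The lexicon-plus-negation idea can be illustrated with a toy pattern matcher. The actual study used SAS Text Miner; the regexes, the example misspellings/abbreviations, and the 40-character negation window below are all invented for illustration.

```python
import re

# Concept: "intracameral antibiotic injection", tolerating an abbreviation
# ("ic", "abx") and a plausible misspelling ("antibotic").
CONCEPT = re.compile(
    r'\b(intracameral|ic)\s+(antibiotic|abx|antibotic)s?\s+inject\w*', re.I)
# Negation cue shortly before the concept mention, within the same sentence.
NEGATION = re.compile(r'\b(no|not|without|denies|deferred)\b[^.]{0,40}$', re.I)

def has_injection(note: str) -> bool:
    m = CONCEPT.search(note)
    if not m:
        return False
    # Negated if a cue word appears in the window just before the match.
    return not NEGATION.search(note[:m.start()])

has_injection("Intracameral antibiotic injection given at close.")              # True
has_injection("Procedure completed without intracameral antibiotic injection.")  # False
```

A production lexicon would of course handle far more variants and scoped negation, which is what the iterative 2000-note validation samples in the study were checking.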

  11. Natural Language Processing to Ascertain Two Key Variables from Operative Reports in Ophthalmology

    PubMed Central

    Liu, Liyan; Shorstein, Neal H.; Amsden, Laura B; Herrinton, Lisa J.

    2016-01-01

    Purpose Antibiotic prophylaxis is critical to ophthalmology and other surgical specialties. We performed natural language processing (NLP) of 743,838 operative notes recorded for 315,246 surgeries to ascertain two variables needed to study the comparative effectiveness of antibiotic prophylaxis in cataract surgery. The first key variable was an exposure variable, intracameral antibiotic injection. The second was an intraoperative complication, posterior capsular rupture (PCR), that functioned as a potential confounder. To help other researchers use NLP in their settings, we describe our NLP protocol and lessons learned. Methods For each of the two variables, we used SAS Text Miner and other SAS text-processing modules with a training set of 10,000 (1.3%) operative notes to develop a lexicon. The lexica identified misspellings, abbreviations, and negations, and linked words into concepts (e.g., “antibiotic” linked with “injection”). We confirmed the NLP tools by iteratively obtaining random samples of 2,000 (0.3%) notes, with replacement. Results The NLP tools identified approximately 60,000 intracameral antibiotic injections and 3,500 cases of PCR. The positive and negative predictive values for intracameral antibiotic injection exceeded 99%. For the intraoperative complication, they exceeded 94%. Conclusion NLP was a valid and feasible method for obtaining critical variables needed for a research study of surgical safety. These NLP tools were intended for use in the study sample. Use with external datasets or future datasets in our own setting would require further testing. PMID:28052483

  12. Identification of critical process variables affecting particle size following precipitation using a supercritical fluid.

    PubMed

    Sacha, Gregory A; Schmitt, William J; Nail, Steven L

    2006-01-01

    The critical processing parameters affecting average particle size, particle size distribution, yield, and level of residual carrier solvent using the supercritical anti-solvent method (SAS) were identified. Carbon dioxide was used as the supercritical fluid. Methylprednisolone acetate was used as the model solute in tetrahydrofuran. Parameters examined included pressure of the supercritical fluid, agitation rate, feed solution flow rate, impeller diameter, and nozzle design. Pressure was identified as the most important process parameter affecting average particle size, either through the effect of pressure on dispersion of the feed solution into the precipitation vessel or through the effect of pressure on solubility of drug in the CO2/organic solvent mixture. Agitation rate, impeller diameter, feed solution flow rate, and nozzle design had significant effects on particle size, which suggests that dispersion of the feed solution is important. Crimped HPLC tubing was the most effective method of introducing feed solution into the precipitation vessel, largely because it resulted in the least amount of clogging during the precipitation. Yields of 82% or greater were consistently produced and were not affected by the processing variables. Similarly, the level of residual solvent was independent of the processing variables and was present at 0.0002% wt/wt THF or less.

  13. Variability metrics in Josephson Junction fabrication for Quantum Computing circuits

    NASA Astrophysics Data System (ADS)

    Rosenblatt, Sami; Hertzberg, Jared; Brink, Markus; Chow, Jerry; Gambetta, Jay; Leng, Zhaoqi; Houck, Andrew; Nelson, J. J.; Plourde, Britton; Wu, Xian; Lake, Russell; Shainline, Jeff; Pappas, David; Patel, Umeshkumar; McDermott, Robert

    Multi-qubit gates depend on the relative frequencies of the qubits. Reliably building multi-qubit devices therefore requires careful fabrication of Josephson junctions in order to precisely set their critical currents. The Ambegaokar-Baratoff relation between tunnel conductance and critical current implies a correlation between the spread in qubit frequencies and the spread in tunnel junction resistances. Here we discuss measurements of large numbers of tunnel junctions to assess these resistance spreads, which can exceed 5% of the mean resistance. With the goal of minimizing these spreads, we investigate process parameters such as lithographic junction area, evaporation and masking scheme, oxidation conditions, and substrate choice, as well as the test environment, design, and setup. In addition, trends of junction resistance with temperature are compared with theoretical models for further insight into process and test variability.
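    The Ambegaokar-Baratoff relation referred to above, Ic·Rn = (πΔ/2e)·tanh(Δ/2kBT), reduces at T ≪ Tc to Ic ≈ πΔ/(2eRn), so a fractional spread in junction resistance maps directly onto a critical-current spread. A rough numpy sketch of the propagation to qubit frequency (the aluminum gap value, nominal resistance, and the 5% spread are illustrative assumptions, not the paper's data):

```python
import numpy as np

e = 1.602176634e-19          # elementary charge, C
Delta = 180e-6 * e           # superconducting gap of Al, ~180 ueV, in J

# 1000 junctions with a nominal 8 kOhm resistance and a 5% normal spread.
rng = np.random.default_rng(1)
Rn = 8e3 * (1 + 0.05 * rng.normal(size=1000))

# Ambegaokar-Baratoff at T << Tc: Ic = pi * Delta / (2 * e * Rn).
Ic = np.pi * Delta / (2 * e * Rn)

# A transmon frequency scales roughly as sqrt(Ic) (f ~ sqrt(8*EJ*EC)/h,
# EJ proportional to Ic), so the fractional frequency spread is about half
# the fractional resistance spread.
f_rel_spread = np.std(np.sqrt(Ic)) / np.mean(np.sqrt(Ic))
```

With a 5% resistance spread this gives roughly a 2.5% frequency spread, which is why tightening the resistance distribution is the fabrication target.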

  14. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2017-07-01

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small-molecule as well as bio-pharma products where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of Stage 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; a process capability and quality dashboard (PCQd); and an enhanced control strategy. The US FDA guidance on Process Validation: General Principles and Practices (January 2011) encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and for future risk mitigation of similar products and processes. The elements of the Stage 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.

  15. Development process and initial validation of the Ethical Conflict in Nursing Questionnaire-Critical Care Version.

    PubMed

    Falcó-Pegueroles, Anna; Lluch-Canut, Teresa; Guàrdia-Olmos, Joan

    2013-06-01

    Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables 'frequency' and 'degree of conflict'. In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable 'exposure to conflict', as well as considering six 'types of ethical conflict'. An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.

  16. Development process and initial validation of the Ethical Conflict in Nursing Questionnaire-Critical Care Version

    PubMed Central

    2013-01-01

    Background Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables ‘frequency’ and ‘degree of conflict’. In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable ‘exposure to conflict’, as well as considering six ‘types of ethical conflict’. An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach’s alpha was used to evaluate the instrument’s reliability. All analyses were performed using the statistical software PASW v19. Results Cronbach’s alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV. PMID:23725477

  17. A comparison of six potential evapotranspiration methods for regional use in the Southeastern United States

    Treesearch

    Jianbiao Lu; Ge Sun; Steven G. McNulty; Devendra Amatya

    2005-01-01

    Potential evapotranspiration (PET) is an important index of hydrologic budgets at different spatial scales and is a critical variable for understanding regional biological processes. It is often an important variable in estimating actual evapotranspiration (AET) in rainfall-runoff and ecosystem modeling. However, PET is defined in different ways in the literature and...

  18. New gentle-wing high-shear granulator: impact of processing variables on granules and tablets characteristics of high-drug loading formulation using design of experiment approach.

    PubMed

    Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Al-Shdefat, Ramadan I

    2017-10-01

    The aim of this work was to apply a design of experiments (DoE) approach to defining the design space for granulation and tableting processes using a novel gentle-wing high-shear granulator. From a quality-by-design (QbD) perspective, the critical attributes of granules and tablets should be ensured by manufacturing process design. A face-centered central composite design was employed to investigate the effect of water amount (X1), impeller speed (X2), wet massing time (X3), and water addition rate (X4) as independent process variables on granule and tablet characteristics. Acetaminophen was used as a model drug, and granulation experiments were carried out using dry addition of povidone K30. The dried granules were analyzed for size distribution, density, and flow pattern. The resulting tablets were investigated for weight uniformity, breaking force, friability and percent capping, disintegration time, and drug dissolution. Regression analysis showed that water amount, impeller speed, and wet massing time had significant (p < .05) effects on granule and tablet characteristics, with water amount having the most pronounced effect, as indicated by its higher parameter estimate. Water addition rate, on the other hand, had minimal impact on granule and tablet properties. In conclusion, water amount, impeller speed, and wet massing time can be considered critical process variables. Understanding the relationship between these variables and the quality attributes of the granules and corresponding tablets provides the basis for adjusting granulation variables in order to optimize product performance.
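    The face-centered central composite design named above has a simple structure: 2^k factorial corners, 2k axial points on the faces of the cube (alpha = 1), and replicated center points. A generic sketch for four coded variables (the center-point count is an assumption; the abstract does not state the run total):

```python
from itertools import product

def ccf_design(k, n_center=3):
    """Face-centered central composite design in coded units (-1, 0, +1)."""
    corners = [list(p) for p in product([-1, 1], repeat=k)]   # 2^k factorial runs
    axial = []
    for i in range(k):                                        # 2k face points
        for s in (-1, 1):
            pt = [0] * k
            pt[i] = s
            axial.append(pt)
    centers = [[0] * k for _ in range(n_center)]              # replicated centers
    return corners + axial + centers

runs = ccf_design(4)   # 16 corner + 8 axial + 3 center runs = 27 runs
```

Each coded run is then mapped to real settings (water amount, impeller speed, wet massing time, addition rate) by scaling between each factor's low and high levels.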

  19. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore, Guidance for Product Quality Review (2008), and International Organization for Standardization ISO 9000:2005, support the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts were employed for the selection of critical quality attributes and the determination of normal probability distribution, statistical stability, and capability of the production processes, respectively. The objective of this study was to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is a most advantageous tool for determining the quality of any production process, and it is new to the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment enables selection of critical quality attributes from among the quality control parameters.
    Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes, which are candidates for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
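    The capability step described above reduces to comparing the process spread against the specification limits. A minimal sketch using Python's statistics module; the tablet-weight data and the specification limits are invented for illustration:

```python
import statistics

def cpk(data, lsl, usl):
    """Process capability index: distance from mean to the nearer spec
    limit, in units of three sample standard deviations."""
    mu = statistics.fmean(data)
    sd = statistics.stdev(data)
    return min(usl - mu, mu - lsl) / (3 * sd)

# Hypothetical tablet weights (mg) against a 248-252 mg specification.
weights = [249.2, 250.1, 250.8, 249.7, 250.3, 249.9, 250.4, 249.6]
capability = cpk(weights, lsl=248.0, usl=252.0)   # ~1.3 for these numbers
```

A Cpk of about 1.33 corresponds to roughly a four-sigma process; the six-sigma target discussed in the abstract corresponds to Cpk of 2.0.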

  20. Validation of the SEPHIS Program for the Modeling of the HM Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyser, E.A.

    The SEPHIS computer program is currently being used to evaluate the effect of process variables on the criticality safety of the HM 1st Uranium Cycle process in H Canyon. Its use has three main purposes: (1) to provide a better technical basis for those process variables that have no realistic effect on the criticality safety of the process; (2) to qualitatively study conditions previously recognized to affect the nuclear safety of the process, or additional conditions that modeling has indicated may pose a criticality safety issue; and (3) to judge the adequacy of existing or future neutron monitor locations for detecting the initial stages of reflux in specific scenarios. Although SEPHIS generally over-predicts the distribution of uranium to the organic phase, it is a capable simulation tool as long as the user recognizes its biases and takes special care when using the program for scenarios where the prediction bias is non-conservative. The temperature coefficient used by SEPHIS poorly predicts the effect of temperature on uranium extraction for the 7.5 percent TBP used in the HM process; therefore, SEPHIS should not be used to study temperature-related scenarios. It may, however, be used within normal operating temperatures when other process variables are being studied, provided care is given to understanding the prediction bias and its effect on any conclusion for the particular scenario under consideration. Uranium extraction with aluminum nitrate is over-predicted more severely than for nitric acid systems; however, the extraction section of the 1A bank has sufficient excess capacity that these errors, while relatively large, still allow SEPHIS to be used to develop reasonable qualitative assessments for reflux scenarios. High losses to the 1AW stream, however, cannot be modeled by SEPHIS.

  1. Improvement of chemical vapor deposition process for production of large diameter carbon base monofilaments

    NASA Technical Reports Server (NTRS)

    Hough, R. L.; Richmond, R. D.

    1971-01-01

Research was conducted to develop large-diameter carbon monofilament containing 25 to 35 mole % boron, in the 2.0 to 10.0 mil diameter range, using the chemical vapor deposition process. The objective of the program was to gain an understanding of the critical process variables and their effect on fiber properties. Synthesis equipment was modified to allow these variables to be studied. Improved control of synthesis variables permitted a reduction in the scatter of monofilament properties. Monofilaments have been synthesized in the 3.0 to nearly 6.0 mil diameter range, with measured values up to 552,000 psi for ultimate tensile strength and up to 30 million psi for elastic modulus.

  2. Kpejigaou: an indigenous, high-protein, low-fat, cowpea-based griddled food proposed for coastal West Africa.

    PubMed

    Amonsou, Eric Oscar; Sakyi-Dawson, Esther; Saalia, Firibu Kwesi; Houssou, Paul

    2008-12-01

Griddled cowpea paste foods have high nutritional potential because they are low in fat but high in protein. A good understanding of the process and product characteristics of kpejigaou is necessary to improve its quality and enhance its acceptability. The aims were to describe the product, evaluate critical variables in traditional processing, and determine consumer quality criteria and preferences for kpejigaou. A survey of kpejigaou processing was carried out among processors and regular consumers of kpejigaou. Kpejigaou is flat and circular in shape, with uniform thickness and a porous structure. The production process of kpejigaou was found to be simple and rapid, but the quality of the finished product varied among processors and among batches. Critical processing variables affecting quality were dehulling of the cowpeas, type of griddling equipment, and griddling temperature. Texture (sponginess) is the most important quality index determining the preference and acceptability of kpejigaou by consumers. Traditionally processed kpejigaou does not meet current standards for high-quality foods. This study provides the basis for efforts to standardize the kpejigaou process to ensure consistent product quality and enhance the acceptability of kpejigaou among consumers. Kpejigaou has potential for success if marketed as a low-fat, nutritious fast food.

  3. Individual Variability in the Semantic Processing of English Compound Words

    ERIC Educational Resources Information Center

    Schmidtke, Daniel; Van Dyke, Julie A.; Kuperman, Victor

    2018-01-01

    Semantic transparency effects during compound word recognition provide critical insight into the organization of semantic knowledge and the nature of semantic processing. The past 25 years of psycholinguistic research on compound semantic transparency has produced discrepant effects, leaving the existence and nature of its influence unresolved. In…

  4. Ecohydrologic processes and soil thickness feedbacks control limestone-weathering rates in a karst landscape

    DOE PAGES

    Dong, Xiaoli; Cohen, Matthew J.; Martin, Jonathan B.; ...

    2018-05-18

Here, chemical weathering of bedrock plays an essential role in the formation and evolution of Earth's critical zone. Over geologic time, the negative feedback between temperature and chemical weathering rates contributes to the regulation of Earth's climate. The challenge of understanding weathering rates and the resulting evolution of critical zone structures lies in complicated interactions and feedbacks among environmental variables, local ecohydrologic processes, and soil thickness, the relative importance of which remains unresolved. We investigate these interactions using a reactive-transport kinetics model, focusing on a low-relief, wetland-dominated karst landscape (Big Cypress National Preserve, South Florida, USA) as a case study. Across a broad range of environmental variables, model simulations highlight primary controls of climate and soil biological respiration, where soil thickness both supplies and limits transport of biologically derived acidity. Consequently, the weathering rate maximum occurs at intermediate soil thickness. The value of the maximum weathering rate and the precise soil thickness at which it occurs depend on several environmental variables, including precipitation regime, soil inundation, vegetation characteristics, and rate of groundwater drainage. Simulations for environmental conditions specific to Big Cypress suggest that wetland depressions in this landscape began to form around the beginning of the Holocene with gradual dissolution of limestone bedrock and attendant soil development, highlighting the large influence of age-varying soil thickness on weathering rates and consequent landscape development. While climatic variables are often considered most important for chemical weathering, our results indicate that soil thickness and biotic activity are equally important. Weathering rates reflect complex interactions among soil thickness, climate, and local hydrologic and biotic processes, which jointly shape the supply and delivery of chemical reactants and the resulting trajectories of critical zone and karst landscape development.

  5. Ecohydrologic processes and soil thickness feedbacks control limestone-weathering rates in a karst landscape

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Xiaoli; Cohen, Matthew J.; Martin, Jonathan B.

Here, chemical weathering of bedrock plays an essential role in the formation and evolution of Earth's critical zone. Over geologic time, the negative feedback between temperature and chemical weathering rates contributes to the regulation of Earth's climate. The challenge of understanding weathering rates and the resulting evolution of critical zone structures lies in complicated interactions and feedbacks among environmental variables, local ecohydrologic processes, and soil thickness, the relative importance of which remains unresolved. We investigate these interactions using a reactive-transport kinetics model, focusing on a low-relief, wetland-dominated karst landscape (Big Cypress National Preserve, South Florida, USA) as a case study. Across a broad range of environmental variables, model simulations highlight primary controls of climate and soil biological respiration, where soil thickness both supplies and limits transport of biologically derived acidity. Consequently, the weathering rate maximum occurs at intermediate soil thickness. The value of the maximum weathering rate and the precise soil thickness at which it occurs depend on several environmental variables, including precipitation regime, soil inundation, vegetation characteristics, and rate of groundwater drainage. Simulations for environmental conditions specific to Big Cypress suggest that wetland depressions in this landscape began to form around the beginning of the Holocene with gradual dissolution of limestone bedrock and attendant soil development, highlighting the large influence of age-varying soil thickness on weathering rates and consequent landscape development. While climatic variables are often considered most important for chemical weathering, our results indicate that soil thickness and biotic activity are equally important. Weathering rates reflect complex interactions among soil thickness, climate, and local hydrologic and biotic processes, which jointly shape the supply and delivery of chemical reactants and the resulting trajectories of critical zone and karst landscape development.

  6. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

Advanced technology nodes require more and more information to set up the wafer process correctly. The critical dimension of components decreases following Moore's law, but the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportions. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on extensive process context information; they can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to put the spatial variability under control in real time with our SPC (Statistical Process Control) system. This paper outlines the architecture of an integrated process control system for 3D shape monitoring, implemented in the waferfab.

  7. Spatiotemporal variability of snow depletion curves derived from SNODAS for the conterminous United States, 2004-2013

    USGS Publications Warehouse

    Driscoll, Jessica; Hay, Lauren E.; Bock, Andrew R.

    2017-01-01

Assessment of water resources at a national scale is critical for understanding their vulnerability to future change in policy and climate. Representation of the spatiotemporal variability in snowmelt processes in continental-scale hydrologic models is critical for assessment of water resource response to continued climate change. Continental-extent hydrologic models such as the U.S. Geological Survey National Hydrologic Model (NHM) represent snowmelt processes through the application of snow depletion curves (SDCs). SDCs relate normalized snow water equivalent (SWE) to normalized snow-covered area (SCA) over a snowmelt season for a given modeling unit. SDCs were derived using output from the operational Snow Data Assimilation System (SNODAS) snow model as daily 1-km gridded SWE over the conterminous United States. Daily SNODAS output was aggregated to a predefined watershed-scale geospatial fabric and also used to calculate SCA from October 1, 2004 to September 30, 2013. The spatiotemporal variability in SNODAS output at the watershed scale was evaluated through the spatial distribution of the median and standard deviation over the time period. Representative SDCs for each watershed-scale modeling unit over the conterminous United States (n = 54,104) were selected using a consistent methodology and used to create categories of snowmelt based on SDC shape. The relation of SDC categories to topographic and climatic variables allows national-scale categorization of snowmelt processes.
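The SWE-SCA normalization behind an SDC can be sketched in a few lines. This is a hedged illustration of the general idea only, not the USGS selection methodology, and the daily values below are invented:

```python
def depletion_curve(swe_series, sca_series):
    """Pair normalized SWE with normalized SCA over a melt season.

    Each series is normalized by its seasonal maximum, so the curve
    runs from (1, 1) at peak snowpack toward (0, 0) at melt-out.
    """
    max_swe, max_sca = max(swe_series), max(sca_series)
    return [(s / max_swe, a / max_sca)
            for s, a in zip(swe_series, sca_series)]

# Invented daily SWE (mm) and SCA (fraction) for one modeling unit:
curve = depletion_curve([120.0, 60.0, 12.0, 0.0],
                        [0.9, 0.72, 0.45, 0.0])
```

The shape of the resulting normalized curve (convex, concave, linear) is what a categorization scheme like the one in the record would operate on.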

  8. Complexity in relational processing predicts changes in functional brain network dynamics.

    PubMed

    Cocchi, Luca; Halford, Graeme S; Zalesky, Andrew; Harding, Ian H; Ramm, Brentyn J; Cutmore, Tim; Shum, David H K; Mattingley, Jason B

    2014-09-01

The ability to link variables is critical to many high-order cognitive functions, including reasoning. It has been proposed that limits in relating variables depend critically on relational complexity, defined formally as the number of variables to be related in solving a problem. In humans, the prefrontal cortex is known to be important for reasoning, but recent studies have suggested that such processes are likely to involve widespread functional brain networks. To test this hypothesis, we used functional magnetic resonance imaging and a classic measure of deductive reasoning to examine changes in brain networks as a function of relational complexity. As expected, behavioral performance declined as the number of variables to be related increased. Likewise, increments in relational complexity were associated with proportional enhancements in brain activity and task-based connectivity within and between two cognitive control networks: a cingulo-opercular network for maintaining task set, and a fronto-parietal network for implementing trial-by-trial control. Changes in effective connectivity as a function of increased relational complexity suggested a key role for the left dorsolateral prefrontal cortex in integrating and implementing task set in a trial-by-trial manner. Our findings show that limits in relational processing are manifested in the brain as complexity-dependent modulations of large-scale networks. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. Using machine learning for improving knowledge on antibacterial effect of bioactive glass.

    PubMed

    Echezarreta-López, M M; Landin, M

    2013-09-10

The aim of this work was to find relationships between critical bioactive glass characteristics and their antibacterial behaviour using an artificial intelligence tool. A large dataset including ingredients and process variables of bioactive glass production, bacterial characteristics, and microbiological experimental conditions was generated from the literature and analyzed with neurofuzzy logic technology. Our findings help explain the variability in antibacterial behaviour reported by different authors and support general conclusions about the critical parameters of bioactive glasses to be considered in order to achieve activity against some of the most common skin and implant surgery pathogens. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Bioenergy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenney, Kevin; Gresham, Garold

    Scientists and engineers at Idaho National Laboratory are working with partners throughout the bioenergy industry in preprocessing and characterization to ensure optimum feedstock quality. This elite team understands that addressing feedstock variability is a critical component in the biofuel production process.

  11. Bioenergy

    ScienceCinema

    Kenney, Kevin; Gresham, Garold

    2018-06-06

    Scientists and engineers at Idaho National Laboratory are working with partners throughout the bioenergy industry in preprocessing and characterization to ensure optimum feedstock quality. This elite team understands that addressing feedstock variability is a critical component in the biofuel production process.

  12. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.
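The surrogate-plus-uncertainty idea in the record can be sketched as follows. The quadratic "response surface" and its coefficients are invented stand-ins for the fitted model in the article, and the robustness criterion (mean minus one standard deviation of predicted die life under billet-temperature uncertainty) is an assumption, not the authors' reliability-based formulation:

```python
import random
import statistics

def surrogate_die_life(billet_temp, press_speed):
    # Hypothetical quadratic response surface standing in for the
    # time-consuming forging simulation (coefficients illustrative).
    return (100.0
            - 0.005 * (billet_temp - 300.0) ** 2
            - 8.0 * (press_speed - 2.0) ** 2)

def robust_score(billet_temp, press_speed, temp_sd=5.0, n=500, seed=0):
    # Mean minus one standard deviation of predicted die life when the
    # billet temperature varies randomly: a simple robustness measure.
    rng = random.Random(seed)
    life = [surrogate_die_life(rng.gauss(billet_temp, temp_sd), press_speed)
            for _ in range(n)]
    return statistics.mean(life) - statistics.stdev(life)
```

Because the surrogate is cheap to evaluate, such a score can be maximized over the design variables far faster than rerunning the full process simulation for every candidate design.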

  13. The critical domain size of stochastic population models.

    PubMed

    Reimer, Jody R; Bonsall, Michael B; Maini, Philip K

    2017-02-01

Identifying the critical domain size necessary for a population to persist is an important question in ecology. Both demographic and environmental stochasticity impact a population's ability to persist, and here we explore ways of including this variability. We study populations with distinct dispersal and sedentary stages, which have traditionally been modelled using a deterministic integrodifference equation (IDE) framework. Individual-based models (IBMs) are the most intuitive stochastic analogues to IDEs but yield few analytic insights. We explore two alternative approaches: one scales up to the population level using the Central Limit Theorem; the other is a variation on both Galton-Watson branching processes and branching processes in random environments. These branching process models closely approximate the IBM and yield insight into the factors determining the critical domain size for a given population subject to stochasticity.
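The link between domain size and persistence in a branching-process setting can be illustrated with a toy simulation. This is not the authors' model: the Poisson offspring number, Gaussian dispersal, absorbing boundaries, and all parameter values are assumptions made for the sketch:

```python
import math
import random

def poisson(rng, lam):
    # Knuth's method; adequate for small lambda.
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def extinction_prob(domain_len, mean_offspring=1.6, disp_sd=1.0,
                    n_gens=40, n_reps=200, seed=1):
    """Monte-Carlo extinction probability for a toy branching process
    with dispersal on [0, L]: each individual leaves a Poisson number
    of offspring displaced by Gaussian dispersal, and offspring that
    land outside the domain are lost (absorbing boundaries)."""
    rng = random.Random(seed)
    extinct = 0
    for _ in range(n_reps):
        pop = [domain_len / 2.0]          # one founder at mid-domain
        for _ in range(n_gens):
            nxt = [x + rng.gauss(0.0, disp_sd)
                   for x in pop
                   for _ in range(poisson(rng, mean_offspring))]
            pop = [y for y in nxt if 0.0 <= y <= domain_len]
            if not pop:
                extinct += 1
                break
            if len(pop) > 100:            # treat as persistent
                break
    return extinct / n_reps
```

Sweeping `domain_len` shows extinction probability falling from near 1 on small domains toward the boundary-free Galton-Watson extinction probability on large ones, which is the qualitative signature of a critical domain size.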

  14. Critical shear stress measurement of cohesive soils in streams: identifying device-dependent variability using an in-situ jet test device and conduit flume

    NASA Astrophysics Data System (ADS)

    Mahalder, B.; Schwartz, J. S.; Palomino, A.; Papanicolaou, T.

    2016-12-01

The erodibility and threshold shear stress of cohesive stream bed and bank soils depend on both soil physical and geochemical properties, in association with channel vegetative conditions. These properties can be spatially variable, making critical shear stress measurement in cohesive soils challenging and creating a need for a more comprehensive understanding of erosional processes in streams. Several in-situ and flume-type devices for estimating critical shear stress have been introduced by different researchers; however, the shear stress estimates reported for different devices vary by orders of magnitude. Each device has advantages and disadvantages. In-situ test devices leave the bed and/or bank material relatively undisturbed and can capture the variable nature of field soil conditions, whereas laboratory flumes provide a means to control environmental conditions that can be quantified and tested. This study was conducted to observe differences in critical shear stress measured with a jet tester and with a well-controlled conduit flume. Soil samples were collected from the jet test locations and tested in a pressurized flume, following standard operational procedure, to calculate the critical shear stress. The results were compared using statistical analysis (a mean-separation ANOVA procedure) to identify possible differences. In addition to the device comparison, the mini-jet device was used to measure critical shear stress across geologically diverse regions of Tennessee, USA. Statistical correlations between critical shear stress and soil physical and geochemical properties indicate that geological origin plays a significant role in predicting critical shear stress for cohesive soils. Finally, the critical shear stress prediction equations derived from the jet test data were examined, with possible modifications suggested on the basis of the flume test results.

  15. Characteristics of Combined Submerged Membrane Bioreactor with Granular Activated Carbon (GAC) in Treating Lineal Alkylbenzene Sulphonates (LAS) Wastewater

    NASA Astrophysics Data System (ADS)

    Guo, Jifeng; Xia, Siqing; Lu, Yanjun

    2010-11-01

A combined MBR (cMBR) with granular activated carbon (GAC) was used as a backbone system to treat synthetic lineal alkylbenzene sulphonate (LAS) wastewater. The GAC was added to the MBR to improve resistance to membrane fouling. A parallel conventional MBR (pMBR) without GAC was run as a control. The results demonstrate that the cMBR process was more efficient than the pMBR: the transmembrane pressure (TMP) of the cMBR rose more slowly, and after a cleaning period the cMBR's membrane performed better than the pMBR's. It was the scrubbing action of the GAC against the membrane that delayed membrane fouling in the cMBR. A variable critical flux was observed in the MBRs: the cMBR maintained a higher critical flux than the pMBR during the run, but the GAC could not improve the critical flux at the end of the period because of severe membrane fouling. On this basis, a variable critical flux J was put forward for the MBR, with its relationship to time t given by J = 16.081·e^(−0.0177t).
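The reported flux-time relationship can be evaluated directly. Note the abstract does not state the flux or run-time units explicitly, so this sketch simply uses the study's own units as given:

```python
import math

def critical_flux(t):
    """Empirical variable critical flux for the MBR,
    J(t) = 16.081 * exp(-0.0177 * t),
    in the flux and run-time units of the study."""
    return 16.081 * math.exp(-0.0177 * t)
```

The exponential form implies the critical flux starts at 16.081 at t = 0 and decays monotonically as fouling accumulates, which matches the qualitative trend described in the abstract.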

  16. Western Wind Strategy: Addressing Critical Issues for Wind Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglas Larson; Thomas Carr

    2012-03-30

The goal of the Western Wind Strategy project was to help remove critical barriers to wind development in the Western Interconnection. The four stated objectives of this project were to: (1) identify the barriers, particularly barriers to the operational integration of renewables and barriers identified by load-serving entities (LSEs) that will be buying wind generation; (2) communicate the barriers to state officials; (3) create a collaborative process to address those barriers with the Western states, utilities, and the renewable industry; and (4) provide a role model for other regions. The project has been at the forefront of identifying and informing state policy makers and utility regulators of critical issues related to wind energy and the integration of variable generation. The project has been a critical component in the efforts of states to push forward important reforms and innovations that will enable states to meet their renewable energy goals and lower the cost to consumers of integrating variable generation.

  17. Negative Life Events and Adolescent Suicidal Behavior: A Critical Analysis from the Stress Process Perspective.

    ERIC Educational Resources Information Center

    Sandin, Bonifacio; Chorot, Paloma; Santed, Miguel A.; Valiente, Rosa M.; Joiner, Thomas E., Jr.

    1998-01-01

    Empirical evidence relating negative life events and adolescent suicidal behavior is reviewed. The contribution of life events tends to be moderate or weak. A stress process model is presented. Past research has not incorporated mediating and moderating variables into pathways that link psychosocial stressors and suicidal outcomes, providing a…

  18. Influence of raw material properties upon critical quality attributes of continuously produced granules and tablets.

    PubMed

    Fonteyne, Margot; Wickström, Henrika; Peeters, Elisabeth; Vercruysse, Jurgen; Ehlers, Henrik; Peters, Björn-Hendrik; Remon, Jean Paul; Vervaet, Chris; Ketolainen, Jarkko; Sandler, Niklas; Rantanen, Jukka; Naelapää, Kaisa; De Beer, Thomas

    2014-07-01

Continuous manufacturing is gaining increasing interest within the pharmaceutical industry. The International Conference on Harmonisation (ICH) states in its Q8 'Pharmaceutical Development' guideline that the manufacturer of pharmaceuticals should have an enhanced knowledge of product performance over a range of raw material attributes, manufacturing process options, and process parameters. This fits within the Process Analytical Technology (PAT) and Quality by Design (QbD) framework. The present study evaluates the effect of variation in critical raw material properties on the critical quality attributes of granules and tablets produced by a continuous from-powder-to-tablet wet granulation line. The granulation process parameters were kept constant to examine the differences in end-product quality caused by the variability of the raw material properties only. Theophylline-Lactose-PVP (30-67.5-2.5%) was used as the model formulation. Seven different grades of theophylline were granulated, and the obtained granules were then tableted. The characteristics of both granules and tablets were determined. The results show that differences in raw material properties affect both their processability and several critical quality attributes of the resulting granules and tablets. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Critical shear stress for erosion of cohesive soils subjected to temperatures typical of wildfires

    USGS Publications Warehouse

    Moody, J.A.; Dungan, Smith J.; Ragan, B.W.

    2005-01-01

Increased erosion is a well-known response after wildfire. Predicting and modeling erosion at the landscape scale requires knowledge of the critical shear stress for the initiation of motion of soil particles. Because this soil property is temperature-dependent, a quantitative relation is required between critical shear stress and the temperatures to which soils were subjected during a wildfire. In this study the critical shear stress was measured in a recirculating flume using samples of forest soil exposed to different temperatures (40°-550°C) for 1 hour. Results were obtained for four replicates of soils derived from three different types of parent material (granitic bedrock, sandstone, and volcanic tuffs). In general, the relation between critical shear stress and temperature can be separated into three temperature ranges (below 175°C, 175°-275°C, and above 275°C), similar to those found for water repellency. The critical shear stress was most variable (1.0-2.0 N m⁻²) at the lowest temperatures, was highest (>2.0 N m⁻²) between 175° and 275°C, and was essentially constant (0.5-0.8 N m⁻²) for temperatures above 275°C. The changes in critical shear stress with temperature were found to be essentially independent of soil type, suggesting that erosion processes in burned watersheds can be modeled more simply than those in unburned watersheds. Wildfire reduces the spatial variability of soil erodibility associated with unburned watersheds by eliminating the complex effects of vegetation in protecting soils and by reducing the range of cohesion associated with different types of unburned soils. Our results indicate that modeling the erosional response after a wildfire depends primarily on determining the spatial distribution of the maximum soil temperatures reached during the wildfire. Copyright 2005 by the American Geophysical Union.

  20. Simulation and sensitivity analysis of carbon storage and fluxes in the New Jersey Pinelands

    Treesearch

    Zewei Miao; Richard G. Lathrop; Ming Xu; Inga P. La Puma; Kenneth L. Clark; John Hom; Nicholas Skowronski; Steve Van Tuyl

    2011-01-01

    A major challenge in modeling the carbon dynamics of vegetation communities is the proper parameterization and calibration of eco-physiological variables that are critical determinants of the ecosystem process-based model behavior. In this study, we improved and calibrated a biochemical process-based WxBGC model by using in situ AmeriFlux eddy covariance tower...

  1. Forest cover change, climate variability, and hydrological responses

    Treesearch

    Xiaohua Wei; Rita Winkler; Ge Sun

    2017-01-01

Understanding ecohydrological response to environmental change is critical for protecting watershed functions, sustaining clean water supply and other ecosystem services, safeguarding public safety, mitigating floods, and responding to drought. Understanding ecohydrological processes and their implications for forest and water management has become increasingly important...

  2. Methods and Techniques of Revenue Forecasting.

    ERIC Educational Resources Information Center

    Caruthers, J. Kent; Wentworth, Cathi L.

    1997-01-01

    Revenue forecasting is the critical first step in most college and university budget-planning processes. While it seems a straightforward exercise, effective forecasting requires consideration of a number of interacting internal and external variables, including demographic trends, economic conditions, and broad social priorities. The challenge…

  3. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  4. Understanding human management of automation errors.

    PubMed

    McBride, Sara E; Rogers, Wendy A; Fisk, Arthur D

    2014-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance.

  5. Characterization and optimization of cell seeding in scaffolds by factorial design: quality by design approach for skeletal tissue engineering.

    PubMed

    Chen, Yantian; Bloemen, Veerle; Impens, Saartje; Moesen, Maarten; Luyten, Frank P; Schrooten, Jan

    2011-12-01

    Cell seeding into scaffolds plays a crucial role in the development of efficient bone tissue engineering constructs. Hence, it becomes imperative to identify the key factors that quantitatively predict reproducible and efficient seeding protocols. In this study, the optimization of a cell seeding process was investigated using design of experiments (DOE) statistical methods. Five seeding factors (cell type, scaffold type, seeding volume, seeding density, and seeding time) were selected and investigated by means of two response parameters, critically related to the cell seeding process: cell seeding efficiency (CSE) and cell-specific viability (CSV). In addition, cell spatial distribution (CSD) was analyzed by Live/Dead staining assays. Analysis identified a number of statistically significant main factor effects and interactions. Among the five seeding factors, only seeding volume and seeding time significantly affected CSE and CSV. Also, cell and scaffold type were involved in the interactions with other seeding factors. Within the investigated ranges, optimal conditions in terms of CSV and CSD were obtained when seeding cells in a regular scaffold with an excess of medium. The results of this case study contribute to a better understanding and definition of optimal process parameters for cell seeding. A DOE strategy can identify and optimize critical process variables to reduce the variability and assists in determining which variables should be carefully controlled during good manufacturing practice production to enable a clinically relevant implant.
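A full factorial layout of the kind used for DOE screening can be enumerated with the standard library. The factor names and levels below are illustrative placeholders, not the study's actual seeding settings:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every run of a full factorial design.

    `factors` maps factor name -> list of levels; the result is one
    dict per experimental run, covering all level combinations.
    """
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

# Three two-level factors -> 2**3 = 8 runs (hypothetical levels):
design = full_factorial({
    "seeding_volume_ul": [50, 200],
    "seeding_density_per_ml": [1e5, 1e6],
    "seeding_time_h": [1, 4],
})
```

Running all combinations is what lets a DOE analysis estimate not only main effects (e.g., seeding volume) but also the factor interactions the record describes.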

  6. Finite-Size Scaling Analysis of Binary Stochastic Processes and Universality Classes of Information Cascade Phase Transition

    NASA Astrophysics Data System (ADS)

    Mori, Shintaro; Hisakado, Masato

    2015-05-01

We propose a finite-size scaling analysis method for binary stochastic processes X(t) in {0, 1} based on the second-moment correlation length ξ for the autocorrelation function C(t). The purpose is to clarify the critical properties and provide a new data analysis method for information cascades. As a simple model to represent the different behaviors of subjects in information cascade experiments, we assume that X(t) is a mixture of an independent random variable that takes 1 with probability q and a random variable that depends on the ratio z of the variables taking 1 among the most recent r variables. We consider two types of the probability f(z) that the latter takes 1: (i) analog [f(z) = z] and (ii) digital [f(z) = θ(z - 1/2)]. We study the universal scaling functions for ξ and the integrated correlation time τ. For finite r, C(t) decays exponentially as a function of t, and there is only one stable renormalization group (RG) fixed point. In the limit r → ∞, where X(t) depends on all the previous variables, C(t) in model (i) obeys a power law and the system becomes scale invariant. In model (ii) with q ≠ 1/2, there are two stable RG fixed points, which correspond to the ordered and disordered phases of the information cascade phase transition, with the critical exponents β = 1 and ν∥ = 2.
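As a rough sketch of the quantities named above, the following estimates a second-moment correlation length and an integrated correlation time from a synthetic, exponentially decaying C(t). The exact estimator conventions (the 1/2 factor, the τ definition) are common choices assumed here, not taken from the paper.

```python
import numpy as np

# Synthetic autocorrelation function C(t) = exp(-t / xi_true), standing in for
# the measured autocorrelation of the binary process X(t).
xi_true = 10.0
t = np.arange(1000)
C = np.exp(-t / xi_true)

# Second-moment estimate: xi^2 ~ sum(t^2 C(t)) / (2 * sum(C(t)))
xi_est = np.sqrt((t**2 * C).sum() / (2.0 * C.sum()))

# Integrated correlation time: tau = 1/2 + sum_{t>=1} C(t) / C(0)
tau = 0.5 + C[1:].sum() / C[0]

print(xi_est, tau)  # both close to xi_true = 10 for an exponential C(t)
```

For a finite-size scaling analysis, such estimates would be computed for a range of system sizes and collapsed onto the universal scaling functions.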

  7. Real Time Land-Surface Hydrologic Modeling Over Continental US

    NASA Technical Reports Server (NTRS)

    Houser, Paul R.

    1998-01-01

The land surface component of the hydrological cycle is fundamental to the overall functioning of atmospheric and climate processes. Spatially and temporally variable rainfall and available energy, combined with land surface heterogeneity, cause complex variations in all processes related to surface hydrology. The characterization of the spatial and temporal variability of water and energy cycles is critical to improving our understanding of land surface-atmosphere interaction and the impact of land surface processes on climate extremes. Because accurate knowledge of these processes and their variability is important for climate predictions, most Numerical Weather Prediction (NWP) centers have incorporated land surface schemes in their models. However, errors in the NWP forcing accumulate in the surface water and energy stores, leading to incorrect surface water and energy partitioning and related processes. This has motivated NWP centers to impose ad hoc corrections to the land surface states to prevent this drift. A proposed methodology is to develop Land Data Assimilation Schemes (LDAS), which are uncoupled models forced with observations and therefore not affected by NWP forcing biases. The proposed research is being implemented as a real-time operation using an existing Surface Vegetation Atmosphere Transfer Scheme (SVATS) model at 40 km resolution across the United States to evaluate these critical science questions. The model will be forced with real-time output from numerical prediction models, satellite data, and radar precipitation measurements. Model parameters will be derived from existing GIS vegetation and soil coverages. The model results will be aggregated to various scales to assess water and energy balances, and these will be validated against various in-situ observations.

  8. Development of a design space and predictive statistical model for capsule filling of low-fill-weight inhalation products.

    PubMed

    Faulhammer, E; Llusa, M; Wahl, P R; Paudel, A; Lawrence, S; Biserni, S; Calzolari, V; Khinast, J G

    2016-01-01

The objectives of this study were to develop a predictive statistical model for low-fill-weight capsule filling of inhalation products with dosator nozzles via the quality by design (QbD) approach and, based on that, to create refined models that include quadratic terms for significant parameters. Various controllable process parameters and uncontrolled material attributes of 12 powders were initially screened using a linear model with partial least squares (PLS) regression to determine their effect on the critical quality attributes (CQAs; fill weight and weight variability). After identifying critical material attributes (CMAs) and critical process parameters (CPPs) that influenced the CQAs, model refinement was performed to study whether interactions or quadratic terms influence the model. Based on the assessment of the effects of the CPPs and CMAs on fill weight and weight variability for low-fill-weight inhalation products, we developed an excellent linear predictive model for fill weight (R² = 0.96, Q² = 0.96 for powders with good flow properties and R² = 0.94, Q² = 0.93 for cohesive powders) and a model that provides a good approximation of the fill-weight variability for each powder group. We validated the model, established a design space for the performance of different types of inhalation-grade lactose in low-fill-weight capsule filling, and successfully used the CMAs and CPPs to predict the fill weight of powders that were not included in the development set.

  9. Development of nanostructured lipid carriers containing salicylic acid for dermal use based on the Quality by Design method.

    PubMed

    Kovács, A; Berkó, Sz; Csányi, E; Csóka, I

    2017-03-01

The aim of our present work was to evaluate the applicability of the Quality by Design (QbD) methodology in the development and optimization of nanostructured lipid carriers containing salicylic acid (NLC SA). Within the QbD methodology, special emphasis is laid on the adaptation of the initial risk assessment step in order to properly identify the critical material attributes and critical process parameters in formulation development. NLC SA products were formulated by the ultrasonication method using Compritol 888 ATO as the solid lipid, Miglyol 812 as the liquid lipid and Cremophor RH 60® as the surfactant. LeanQbD Software and StatSoft Inc. Statistica for Windows 11 were employed to identify the risks. Three highly critical quality attributes (CQAs) for NLC SA were identified, namely particle size, particle size distribution and aggregation. Five attributes of medium influence were identified, including dissolution rate, dissolution efficiency, pH, lipid solubility of the active pharmaceutical ingredient (API) and entrapment efficiency. Three critical material attributes (CMAs) and critical process parameters (CPPs) were identified: surfactant concentration, solid lipid/liquid lipid ratio and ultrasonication time. The CMAs and CPPs are considered independent variables and the CQAs are defined as dependent variables. A 2³ factorial design was used to evaluate the roles of the independent and dependent variables. Based on our experiments, an optimal formulation can be obtained when the surfactant concentration is set to 5%, the solid lipid/liquid lipid ratio is 7:3 and the ultrasonication time is 20 min. The optimal NLC SA showed a narrow size distribution (0.857 ± 0.014) with a mean particle size of 114 ± 2.64 nm. The NLC SA product showed a significantly higher in vitro drug release compared to the micro-particle reference preparation containing salicylic acid (MP SA). Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Progress in Titanium Metal Powder Injection Molding.

    PubMed

    German, Randall M

    2013-08-20

Metal powder injection molding is a shaping technology that has achieved solid scientific underpinnings. It is from this science base that recent progress has occurred in titanium powder injection molding. Much of the progress awaited development of the required particles with specific characteristics of particle size, particle shape, and purity. The production of titanium components by injection molding is stabilized by a good understanding of how each process variable impacts density and impurity level. As summarized here, recent research has isolated the four critical success factors in titanium metal powder injection molding (Ti-MIM) that must be simultaneously satisfied: density, purity, alloying, and microstructure. The critical role of density and impurities, and the inability to remove impurities with sintering, compels attention to starting Ti-MIM with high quality alloy powders. This article addresses the four critical success factors to rationalize Ti-MIM processing conditions to the requirements for demanding applications in aerospace and medical fields. Based on extensive research, a baseline process is identified and reported here with attention to linking mechanical properties to the four critical success factors.

  11. Progress in Titanium Metal Powder Injection Molding

    PubMed Central

    German, Randall M.

    2013-01-01

    Metal powder injection molding is a shaping technology that has achieved solid scientific underpinnings. It is from this science base that recent progress has occurred in titanium powder injection molding. Much of the progress awaited development of the required particles with specific characteristics of particle size, particle shape, and purity. The production of titanium components by injection molding is stabilized by a good understanding of how each process variable impacts density and impurity level. As summarized here, recent research has isolated the four critical success factors in titanium metal powder injection molding (Ti-MIM) that must be simultaneously satisfied—density, purity, alloying, and microstructure. The critical role of density and impurities, and the inability to remove impurities with sintering, compels attention to starting Ti-MIM with high quality alloy powders. This article addresses the four critical success factors to rationalize Ti-MIM processing conditions to the requirements for demanding applications in aerospace and medical fields. Based on extensive research, a baseline process is identified and reported here with attention to linking mechanical properties to the four critical success factors. PMID:28811458

  12. Sequential chemical-biological processes for the treatment of industrial wastewaters: review of recent progresses and critical assessment.

    PubMed

    Guieysse, Benoit; Norvill, Zane N

    2014-02-28

When direct wastewater biological treatment is unfeasible, a cost- and resource-efficient alternative to direct chemical treatment consists of combining biological treatment with a chemical pre-treatment aimed at converting the hazardous pollutants into more biodegradable compounds. Whereas the principles and advantages of sequential treatment have been demonstrated for a broad range of pollutants and process configurations, recent progress (2011-present) in the field provides the basis for refining assessments of feasibility, costs, and environmental impacts. This paper thus reviews recent real-wastewater demonstrations at pilot and full scale as well as new process configurations. It also discusses new insights on the potential impacts of microbial community dynamics on process feasibility, design and operation. Finally, it sheds light on a critical issue that has not yet been properly addressed in the field: integration requires complex and tailored optimization and, of paramount importance to full-scale application, is sensitive to uncertainty and variability in the inputs used for process design and operation. Future research is therefore critically needed to improve process control and better assess the real potential of sequential chemical-biological processes for industrial wastewater treatment. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Report to the High Order Language Working Group (HOLWG)

    DTIC Science & Technology

    1977-01-14

    as running, runnable, suspended or dormant, may be synchronized by semaphore variables, may be scheduled using clock and duration data types and mpy...Recursive and non-recursive routines G6. Parallel processes, synchronization, critical regions G7. User defined parameterized exception handling G8...typed and lacks extensibility, parallel processing, synchronization and real-time features. Overall Evaluation IBM strongly recommended PL/I as a

  14. Media milling process optimization for manufacture of drug nanoparticles using design of experiments (DOE).

    PubMed

    Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj

    2015-01-01

Design of experiments (DOE), a component of Quality by Design (QbD), is the systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study to understand the effects of process variables in a bead milling process used for the manufacture of drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed and bead volume. The responses analyzed for evaluating these effects and interactions were milling time, particle size and process yield. Process validation batches were executed using the optimum process conditions obtained from the software Design-Expert® to evaluate both the repeatability and reproducibility of the bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). A desirability function was used to optimize the response variables, and the predicted responses were in agreement with experimental values. These results demonstrated the reliability of the selected model for the manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in a considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of the DOE approach to optimize critical process parameters in the manufacture of drug nanoparticles.
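For reference, the run layout of a 3-factor face-centered CCD like the one described above can be generated in a few lines. The factor names and the three-level structure come from the abstract; the point counts (8 corners, 6 axial, 3 center replicates) are the standard face-centered CCD layout and are assumed here, not quoted from the article.

```python
import itertools
import numpy as np

# Coded units -1/0/+1 for the three factors named in the abstract:
# motor speed, pump speed, bead volume.
factorial = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 corner runs

# Face-centered axial runs: one factor at -1 or +1, the others at 0.
axial = np.array([sign * np.eye(3)[i] for i in range(3) for sign in (-1, 1)])  # 6 runs

center = np.zeros((3, 3))  # 3 center-point replicates (error estimate)

design = np.vstack([factorial, axial, center])
print(design.shape)  # (17, 3): 17 runs of 3 coded factor settings
```

Each row is one experimental run; mapping coded levels back to physical units (rpm, mL, etc.) is done per factor when the experiments are executed.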

  15. Understanding the uncertainty associated with particle-bound pollutant build-up and wash-off: A critical review.

    PubMed

    Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2016-09-15

    Accurate prediction of stormwater quality is essential for developing effective pollution mitigation strategies. The use of models incorporating simplified mathematical replications of pollutant processes is the common practice for determining stormwater quality. However, an inherent process uncertainty arises due to the intrinsic variability associated with pollutant processes, which has neither been comprehensively understood, nor well accounted for in uncertainty assessment of stormwater quality modelling. This review provides the context for defining and quantifying the uncertainty associated with pollutant build-up and wash-off on urban impervious surfaces based on the hypothesis that particle size is predominant in influencing process variability. Critical analysis of published research literature brings scientific evidence together in order to establish the fact that particle size changes with time, and different sized particles exhibit distinct behaviour during build-up and wash-off, resulting in process variability. Analysis of the different adsorption behaviour of particles confirmed that the variations in pollutant load and composition are influenced by particle size. Particle behaviour and variations in pollutant load and composition are related due to the strong affinity of pollutants such as heavy metals and hydrocarbons for specific particle size ranges. As such, the temporal variation in particle size is identified as the key to establishing a basis for assessing build-up and wash-off process uncertainty. Therefore, accounting for pollutant build-up and wash-off process variability, which is influenced by particle size, would facilitate the assessment of the uncertainty associated with modelling outcomes. 
Furthermore, the review identified fundamental knowledge gaps where further research is needed in relation to: (1) the aggregation of particles suspended in the atmosphere during build-up; (2) particle re-suspension during wash-off; (3) pollutant re-adsorption by different particle size fractions; (4) development of evidence-based techniques for assessing uncertainty; and (5) methods for translating the knowledge acquired from the investigation of process mechanisms at small scale into catchment scale for stormwater quality modelling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Variability in, variability out: best practice recommendations to standardize pre-analytical variables in the detection of circulating and tissue microRNAs.

    PubMed

    Khan, Jenna; Lieberman, Joshua A; Lockwood, Christina M

    2017-05-01

    microRNAs (miRNAs) hold promise as biomarkers for a variety of disease processes and for determining cell differentiation. These short RNA species are robust, survive harsh treatment and storage conditions and may be extracted from blood and tissue. Pre-analytical variables are critical confounders in the analysis of miRNAs: we elucidate these and identify best practices for minimizing sample variation in blood and tissue specimens. Pre-analytical variables addressed include patient-intrinsic variation, time and temperature from sample collection to storage or processing, processing methods, contamination by cells and blood components, RNA extraction method, normalization, and storage time/conditions. For circulating miRNAs, hemolysis and blood cell contamination significantly affect profiles; samples should be processed within 2 h of collection; ethylene diamine tetraacetic acid (EDTA) is preferred while heparin should be avoided; samples should be "double spun" or filtered; room temperature or 4 °C storage for up to 24 h is preferred; miRNAs are stable for at least 1 year at -20 °C or -80 °C. For tissue-based analysis, warm ischemic time should be <1 h; cold ischemic time (4 °C) <24 h; common fixative used for all specimens; formalin fix up to 72 h prior to processing; enrich for cells of interest; validate candidate biomarkers with in situ visualization. Most importantly, all specimen types should have standard and common workflows with careful documentation of relevant pre-analytical variables.

  17. Formulation characteristics and in vitro release testing of cyclosporine ophthalmic ointments.

    PubMed

    Dong, Yixuan; Qu, Haiou; Pavurala, Naresh; Wang, Jiang; Sekar, Vasanthakumar; Martinez, Marilyn N; Fahmy, Raafat; Ashraf, Muhammad; Cruz, Celia N; Xu, Xiaoming

    2018-06-10

The aim of the present study was to investigate the relationship between formulation/process variables and the critical quality attributes (CQAs) of cyclosporine ophthalmic ointments, and to explore the feasibility of using an in vitro approach to assess product sameness. A definitive screening design (DSD) was used to evaluate the impact of formulation and process variables. The formulation variables included drug percentage, percentage of corn oil and lanolin alcohol. The process variables studied were mixing temperature, mixing time and the method of mixing. The quality and performance attributes examined included drug assay, content uniformity, image analysis, rheology (storage modulus, shear viscosity) and in vitro drug release. Of the formulation variables evaluated, the percentage of the drug substance and the percentage of corn oil in the matrix were the most influential factors with respect to in vitro drug release. Conversely, the process parameters tested were observed to have minimal impact. An evaluation of the release mechanism of cyclosporine from the ointment revealed an interplay between formulation (e.g. physicochemical properties of the drug and ointment matrix type) and the release medium. These data provide a scientific basis to guide method development for in vitro drug release testing of ointment dosage forms. These results demonstrate that the in vitro methods used in this investigation were fit-for-purpose for detecting formulation and process changes and therefore amenable to assessment of product sameness. Published by Elsevier B.V.

  18. Assessment of reservoir system variable forecasts

    NASA Astrophysics Data System (ADS)

    Kistenmacher, Martin; Georgakakos, Aris P.

    2015-05-01

    Forecast ensembles are a convenient means to model water resources uncertainties and to inform planning and management processes. For multipurpose reservoir systems, forecast types include (i) forecasts of upcoming inflows and (ii) forecasts of system variables and outputs such as reservoir levels, releases, flood damage risks, hydropower production, water supply withdrawals, water quality conditions, navigation opportunities, and environmental flows, among others. Forecasts of system variables and outputs are conditional on forecasted inflows as well as on specific management policies and can provide useful information for decision-making processes. Unlike inflow forecasts (in ensemble or other forms), which have been the subject of many previous studies, reservoir system variable and output forecasts are not formally assessed in water resources management theory or practice. This article addresses this gap and develops methods to rectify potential reservoir system forecast inconsistencies and improve the quality of management-relevant information provided to stakeholders and managers. The overarching conclusion is that system variable and output forecast consistency is critical for robust reservoir management and needs to be routinely assessed for any management model used to inform planning and management processes. The above are demonstrated through an application from the Sacramento-American-San Joaquin reservoir system in northern California.

  19. Advanced in-production hotspot prediction and monitoring with micro-topography

    NASA Astrophysics Data System (ADS)

    Fanton, P.; Hasan, T.; Lakcher, A.; Le-Gratiet, B.; Prentice, C.; Simiz, J.-G.; La Greca, R.; Depre, L.; Hunsche, S.

    2017-03-01

At the 28nm technology node and below, hotspot prediction and process window control across production wafers have become increasingly critical to prevent hotspots from becoming yield-limiting defects. We previously established proof of concept for a systematic approach to identify the most critical pattern locations, i.e. hotspots, in a reticle layout by computational lithography, combining the process window characteristics of these patterns with across-wafer process variation data to predict where hotspots may become yield-impacting defects [1,2]. The current paper establishes the impact of micro-topography on a 28nm metal layer and its correlation with hotspot best focus variations across a production chip layout. Detailed topography measurements are obtained from an offline tool, and pattern-dependent best focus (BF) shifts are determined from litho simulations that include mask-3D effects. We also establish hotspot metrology and defect verification by SEM image contour extraction and contour analysis. This enables detection of catastrophic defects as well as quantitative characterization of pattern variability, i.e. local and global CD uniformity, across a wafer to establish hotspot defect and variability maps. Finally, we combine defect prediction and verification capabilities for process monitoring by on-product, guided hotspot metrology, i.e. with sampling locations determined from the defect prediction model, achieving a prediction accuracy (capture rate) of around 75%.

  20. Proposed Interoperability Readiness Level Assessment for Mission Critical Interfaces During Navy Acquisition

    DTIC Science & Technology

    2010-12-01

    This involves zeroing and recreating the interoperability arrays and other variables used in the simulation. Since the constants do not change from run......Using this algorithm, the process of encrypting/decrypting data requires very little computation, and the generation of the random pads can be

  1. SimilarityExplorer: A visual inter-comparison tool for multifaceted climate data

    Treesearch

    J. Poco; A. Dasgupta; Y. Wei; W. Hargrove; C. Schwalm; R. Cook; E. Bertini; C. Silva

    2014-01-01

    Inter-comparison and similarity analysis to gauge consensus among multiple simulation models is a critical visualization problem for understanding climate change patterns. Climate models, specifically, Terrestrial Biosphere Models (TBM) represent time and space variable ecosystem processes, for example, simulations of photosynthesis and respiration, using algorithms...

  2. Optimal Experimental Design for Model Discrimination

    ERIC Educational Resources Information Center

    Myung, Jay I.; Pitt, Mark A.

    2009-01-01

    Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it…

  3. COMPARATIVE EVALUATION OF SHORT-TERM LEACH TESTS FOR HEAVY METAL RELEASE FROM MINERAL PROCESSING WASTE

    EPA Science Inventory

    Evaluation of metal leaching using a single leach test such as the Toxicity Characteristic Leaching Procedure (TCLP) is often questionable. The pH, redox potential (Eh), particle size and contact time are critical variables in controlling metal stability, not accounted...

  4. Buckling and Post-Buckling Behaviors of a Variable Stiffness Composite Laminated Wing Box Structure

    NASA Astrophysics Data System (ADS)

    Wang, Peiyan; Huang, Xinting; Wang, Zhongnan; Geng, Xiaoliang; Wang, Yuansheng

    2018-04-01

    The buckling and post-buckling behaviors of variable stiffness composite laminates (VSCL) with curvilinear fibers were investigated and compared with constant stiffness composite laminates (CSCL) with straight fibers. A VSCL box structure was evaluated under a pure bending moment. The results of the comparative test showed that the critical buckling load of the VSCL box was approximately 3% higher than that of the CSCL box. However, the post-buckling load-bearing capacity was similar due to the layup angle and the immature status of the material processing technology. The properties of the VSCL and CSCL boxes under a pure bending moment were simulated using the Hashin criterion and cohesive interface elements. The simulation results are consistent with the experimental results in stiffness, critical buckling load and failure modes but not in post-buckling load capacity. The results of the experiment, the simulation and laminated plate theory show that VSCL greatly improves the critical buckling load but has little influence on the post-buckling load-bearing capacity.

  5. Mental Status Documentation: Information Quality and Data Processes

    PubMed Central

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk, prevention, identification, communication and mitigation of harm. PMID:28269919

  6. Mental Status Documentation: Information Quality and Data Processes.

    PubMed

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk, prevention, identification, communication and mitigation of harm.

  7. Rivers of the Andes and the Amazon Basin: Deciphering global change from the hydroclimatic variability in the critical zone

    NASA Astrophysics Data System (ADS)

    Moreira-Turcq, Patricia; Carlo Espinoza, Jhan; Filizola, Naziano; Martinez, Jean-Michel

    2018-01-01

    The Critical Zone has been defined as the thin layer of the continental surfaces extending from fresh bedrock and the bottom of groundwater up to vegetation canopy, where soil, rock, water, air, and living organisms interact (Banwart et al., 2012; Lin et al., 2011). Despite the Critical Zone's importance to terrestrial life, it remains poorly understood. In this context, understanding the complex interactions between physical, chemical, and biological processes of the Critical Zone requires long-term observations (Anderson et al., 2012; Brantley et al., 2017), not only because different mechanisms have varying time frames, but also because it is necessary to monitor its natural and anthropogenic evolution in response to global climate and environmental changes.

  8. Prioritizing the organization and management of intensive care services in the United States: the PrOMIS Conference.

    PubMed

    Barnato, Amber E; Kahn, Jeremy M; Rubenfeld, Gordon D; McCauley, Kathleen; Fontaine, Dorrie; Frassica, Joseph J; Hubmayr, Rolf; Jacobi, Judith; Brower, Roy G; Chalfin, Donald; Sibbald, William; Asch, David A; Kelley, Mark; Angus, Derek C

    2007-04-01

    Adult critical care services are a large, expensive part of U.S. health care. The current agenda for response to workforce shortages and rising costs has largely been determined by members of the critical care profession without input from other stakeholders. We sought to elicit the perceived problems and solutions to the delivery of critical care services from a broad set of U.S. stakeholders. A consensus process involving purposive sampling of identified stakeholders, preconference Web-based survey, and 2-day conference. Participants represented healthcare providers, accreditation and quality-oversight groups, federal sponsoring institutions, healthcare vendors, and institutional and individual payers. We identified 39 stakeholders for the field of critical care medicine. Thirty-six (92%) completed the preconference survey and 37 (95%) attended the conference. None. Participants expressed moderate to strong agreement with the concerns identified by the critical care professionals and additionally expressed consternation that the critical care delivery system was fragmented, variable, and not patient-centered. Recommended solutions included regionalizing the adult critical care system into "tiers" defined by explicit triage criteria and professional competencies, achieved through voluntary hospital accreditation, supported through an expanded process of competency certification, and monitored through process and outcome surveillance; implementing mechanisms for improved communication across providers and settings and between providers and patients/families; and conducting market research and a public education campaign regarding critical care's promises and limitations. This consensus conference confirms that agreement on solutions to complex healthcare delivery problems can be achieved and that problem and solution frames expand with broader stakeholder participation. 
This process can be used as a model by other specialties to address priority setting in an era of shifting demographics and increasing resource constraints.

  9. Identification of critical formulation and processing variables for metoprolol tartrate extended-release (ER) matrix tablets.

    PubMed

    Rekhi, G S; Nellore, R V; Hussain, A S; Tillman, L G; Malinowski, H J; Augsburger, L L

    1999-06-02

The objective of this study was to examine the influence of critical formulation and processing variables, as described in the AAPS/FDA Workshop II report on scale-up of oral extended-release dosage forms, using the hydrophilic polymer hydroxypropyl methylcellulose (Methocel K100LV). A face-centered central composite design (26 runs + 3 center points) was selected, and the variables studied were: filler ratio (lactose:dicalcium phosphate (50:50)), polymer level (15/32.5/50%), magnesium stearate level (1/1.5/2%), lubricant blend time (2/6/10 min) and compression force (400/600/800 kg). Granulations (1.5 kg, 3000 units) were manufactured using a fluid-bed process and lubricated, and tablets (100 mg metoprolol tartrate) were compressed on an instrumented Manesty D3B rotary tablet press. Dissolution tests were performed using USP apparatus 2, at 50 rpm, in 900 ml phosphate buffer (pH 6.8). Responses studied included percent drug released at Q1 (1 h), Q4, Q6 and Q12. Analysis of variance indicated that polymer level was the most significant factor affecting drug release. Increases in dicalcium phosphate level and compression force were found to affect the percent released at the later dissolution time points. Some interaction effects between the variables studied were also statistically significant. The drug release mechanism was found to be predominantly Fickian diffusion controlled (n=0.46-0.59). Response surface plots and regression models were developed that adequately described the experimental space. Three formulations having slow-, medium- and fast-releasing dissolution profiles were identified for a future bioavailability/bioequivalency study. The results of this study provide the framework for further work involving both in vivo studies and scale-up.
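The reported release exponent (n = 0.46-0.59, indicating largely Fickian diffusion) comes from the Korsmeyer-Peppas power law, which can be sketched as a simple log-log fit; the release data below are invented for illustration, not measurements from the study:

```python
import numpy as np

# Hypothetical fractional-release data (time in hours, fraction of drug
# released); illustrative values only, not the study's measurements.
t = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 12.0])
frac = np.array([0.18, 0.26, 0.37, 0.45, 0.52, 0.63])

# Korsmeyer-Peppas power law: Mt/Minf = k * t**n. Taking logs gives a
# straight line whose slope is the release exponent n; n near 0.5
# signals diffusion-controlled (Fickian) release from a matrix.
n, log_k = np.polyfit(np.log(t), np.log(frac), 1)
print(f"release exponent n = {n:.2f}, k = {np.exp(log_k):.2f}")
```

For the fabricated data above the fitted exponent lands near 0.5, the Fickian regime the abstract reports for the real tablets.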

  10. Impact of auditory selective attention on verbal short-term memory and vocabulary development.

    PubMed

    Majerus, Steve; Heiligenstein, Lucie; Gautherot, Nathalie; Poncelet, Martine; Van der Linden, Martial

    2009-05-01

    This study investigated the role of auditory selective attention capacities as a possible mediator of the well-established association between verbal short-term memory (STM) and vocabulary development. A total of 47 6- and 7-year-olds were administered verbal immediate serial recall and auditory attention tasks. Both task types probed processing of item and serial order information because recent studies have shown this distinction to be critical when exploring relations between STM and lexical development. Multiple regression and variance partitioning analyses highlighted two variables as determinants of vocabulary development: (a) a serial order processing variable shared by STM order recall and a selective attention task for sequence information and (b) an attentional variable shared by selective attention measures targeting item or sequence information. The current study highlights the need for integrative STM models, accounting for conjoined influences of attentional capacities and serial order processing capacities on STM performance and the establishment of the lexical language network.

  11. Implications of contact metamorphism of Mancos Shale for critical zone processes

    NASA Astrophysics Data System (ADS)

    Navarre-Sitchler, A.

    2016-12-01

Bedrock lithology imparts control on some critical zone processes, for example rates and extent of chemical weathering, solute release through mineral dissolution, and water flow. Bedrock can be very heterogeneous, resulting in spatial variability of these processes throughout a catchment. In the East River watershed outside of Crested Butte, Colorado, the bedrock is dominantly the Mancos Shale, a Cretaceous-aged, organic-carbon-rich marine shale. However, in some areas the Mancos Shale appears to have been contact metamorphosed by nearby igneous intrusions, producing a lithologic gradient in part of the watershed where the impacts of lithology on critical zone processes can be evaluated. Samples were collected in the East River valley along a transect from the contact between the Tertiary Gothic Mountain laccolith of the Mount Carbon igneous system and the underlying Mancos Shale. Porosity of these samples was analyzed by small-angle and ultra-small-angle neutron scattering. Results indicate that contact metamorphism decreases the porosity of the shale and changes the pore shape from slightly anisotropic pores aligned with bedding in the unmetamorphosed shale to isotropic pores with no bedding alignment in the metamorphosed shales. The porosity analysis, combined with clay mineralogy, surface area, carbon content and oxidation state, and solute release rates determined from column experiments, will be used to develop a full understanding of the impact of contact metamorphism on critical zone processes in the East River.

  12. Climate variability and extremes, interacting with nitrogen storage, amplify eutrophication risk

    USGS Publications Warehouse

    Lee, Minjin; Shevliakova, Elena; Malyshev, Sergey; Milly, P.C.D.; Jaffe, Peter R.

    2016-01-01

Despite 30 years of basin-wide nutrient-reduction efforts, severe hypoxia continues to be observed in the Chesapeake Bay. Here we demonstrate the critical influence of climate variability, interacting with nitrogen (N) accumulated over multiple decades, on Susquehanna River dissolved nitrogen (DN) loads, known precursors of hypoxia in the Bay. We used the process model LM3-TAN (Terrestrial and Aquatic Nitrogen), which is capable of capturing both seasonal and decadal-to-century changes in vegetation-soil-river N storage, and produced nine scenarios of DN-load distributions under different short-term scenarios of climate variability and extremes. We illustrate that after 1- to 3-year-long dry spells, the likelihood of exceeding a threshold DN load (56 kt yr−1) increases by 40 to 65% due to flushing of N accumulated throughout the dry spells and altered microbial processes. Our analyses suggest that possible future increases in climate variability and extremes, specifically high precipitation occurring after multiyear dry spells, could lead to high DN-load anomalies and hypoxia.

  13. Test Operations Procedure (TOP) 06-2-301 Wind Testing

    DTIC Science & Technology

    2017-06-14

    critical to ensure that the test item is exposed to the required wind speeds. This may be an iterative process as the fan blade pitch, fan speed...fan speed is the variable that is adjusted to reach the required velocities. Calibration runs with a range of fan speeds are performed and a

  14. Using the Terms "Hypothesis" and "Variable" for Qualitative Work: A Critical Reflection

    ERIC Educational Resources Information Center

    Lareau, Annette

    2012-01-01

    Ralph LaRossa's (2012) thoughtful piece suggested that qualitative researchers' self-awareness (and clear articulation) of their conceptual and empirical goals can help their manuscripts in many ways, including during the review process. If authors self-consciously embrace particular orientations, then it will be easier for reviewers to evaluate…

  15. How fast and how often: The pharmacokinetics of drug use are decisive in addiction.

    PubMed

    Allain, Florence; Minogianis, Ellie-Anna; Roberts, David C S; Samaha, Anne-Noël

    2015-09-01

    How much, how often and how fast a drug reaches the brain determine the behavioural and neuroplastic changes associated with the addiction process. Despite the critical nature of these variables, the drug addiction field often ignores pharmacokinetic issues, which we argue can lead to false conclusions. First, we review the clinical data demonstrating the importance of the speed of drug onset and of intermittent patterns of drug intake in psychostimulant drug addiction. This is followed by a review of the preclinical literature demonstrating that pharmacokinetic variables play a decisive role in determining behavioural and neurobiological outcomes in animal models of addiction. This literature includes recent data highlighting the importance of intermittent, 'spiking' brain levels of drug in producing an increase in the motivation to take drug over time. Rapid drug onset and intermittent drug exposure both appear to push the addiction process forward most effectively. This has significant implications for refining animal models of addiction and for better understanding the neuroadaptations that are critical for the disorder. Copyright © 2015 Elsevier Ltd. All rights reserved.
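The intermittent, "spiking" brain levels discussed above can be illustrated with a one-compartment pharmacokinetic sketch; the model form (a sum of Bateman functions) is standard, but the rate constants and 2-hourly unit-dose schedule here are arbitrary illustrative choices, not parameters from the review:

```python
import math

# Sketch of how intermittent drug intake produces "spiking" brain levels,
# using a one-compartment model with first-order absorption (Bateman
# function). Rate constants and doses are illustrative assumptions.
KA, KE = 3.0, 0.8  # absorption / elimination rate constants (1/h)

def level(t, dose_times):
    """Drug level at time t from unit doses given at dose_times (hours)."""
    total = 0.0
    for t0 in dose_times:
        dt = t - t0
        if dt >= 0:
            total += (KA / (KA - KE)) * (math.exp(-KE * dt) - math.exp(-KA * dt))
    return total

doses = [0.0, 2.0, 4.0, 6.0]                  # intermittent "spiking" schedule
grid = [i * 0.05 for i in range(161)]         # 0..8 h in 3-min steps
levels = [level(t, doses) for t in grid]
peak = max(levels)
trough = min(l for t, l in zip(grid, levels) if t > 0.5)  # post-onset trough
print(f"peak/trough ratio = {peak / trough:.1f}")
```

With these assumed constants the profile oscillates around a roughly threefold peak-to-trough swing, the kind of repeated rise and fall that a slow continuous infusion would not produce.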

  16. Vascular Glucose Sensor Symposium: Continuous Glucose Monitoring Systems (CGMS) for Hospitalized and Ambulatory Patients at Risk for Hyperglycemia, Hypoglycemia, and Glycemic Variability.

    PubMed

    Joseph, Jeffrey I; Torjman, Marc C; Strasma, Paul J

    2015-07-01

Hyperglycemia, hypoglycemia, and glycemic variability have been associated with increased morbidity, mortality, length of stay, and cost in a variety of critical care and non-critical care patient populations in the hospital. The results from prospective randomized clinical trials designed to determine the risks and benefits of intensive insulin therapy and tight glycemic control have been confusing and, at times, conflicting. The limitations of point-of-care blood glucose (BG) monitoring in the hospital highlight the great clinical need for an automated real-time continuous glucose monitoring system (CGMS) that can accurately measure the concentration of glucose every few minutes. Automation and standardization of the glucose measurement process have the potential to significantly improve BG control, clinical outcome, safety and cost. © 2015 Diabetes Technology Society.
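As a concrete example of quantifying glycemic variability, one widely used metric is the coefficient of variation of glucose readings; the readings below are invented, and the ~36% instability threshold comes from general CGM consensus guidance rather than this article:

```python
import statistics

# Sketch: %CV = SD / mean * 100 over a window of glucose readings is a
# common glycemic-variability metric; consensus guidance often flags
# %CV above ~36% as unstable. Values are illustrative, not patient data.
readings = [110, 145, 180, 95, 160, 210, 88, 132, 170, 150]  # mg/dL

mean_bg = statistics.mean(readings)
cv_pct = statistics.stdev(readings) / mean_bg * 100  # sample SD
print(f"mean = {mean_bg:.0f} mg/dL, %CV = {cv_pct:.1f}")
```

A CGMS sampling every few minutes would compute this over a rolling window rather than ten spot checks, which is precisely the resolution advantage the abstract argues for.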

  17. A qualitative analysis of how advanced practice nurses use clinical decision support systems.

    PubMed

    Weber, Scott

    2007-12-01

    The purpose of this study was to generate a grounded theory that will reflect the experiences of advanced practice nurses (APNs) working as critical care nurse practitioners (NPs) and clinical nurse specialists (CNS) with computer-based decision-making systems. A study design using grounded theory qualitative research methods and convenience sampling was employed in this study. Twenty-three APNs (13 CNS and 10 NPs) were recruited from 16 critical care units located in six large urban medical centers in the U.S. Midwest. Single-structured in-depth interviews with open-ended audio-taped questions were conducted with each APN. Through this process, APNs defined what they consider to be relevant themes and patterns of clinical decision system use in their critical care practices, and they identified the interrelatedness of the conceptual categories that emerged from the results. Data were analyzed using the constant comparative analysis method of qualitative research. APN participants were predominantly female, white/non-Hispanic, had a history of access to the clinical decision system used in their critical care settings for an average of 14 months, and had attended a formal training program to learn how to use clinical decision systems. "Forecasting decision outcomes," which was defined as the voluntary process employed to forecast the outcomes of patient care decisions in critical care prior to actual decision making, was the core variable describing system use that emerged from the responses. This variable consisted of four user constructs or components: (a) users' perceptions of their initial system learning experience, (b) users' sense of how well they understand how system technology works, (c) users' understanding of how system inferences are created or derived, and (d) users' relative trust of system-derived data. 
Each of these categories was further described through the grounded theory research process, and the relationships between the categories were identified. The findings of this study suggest that the main reason critical care APNs choose to integrate clinical decision systems into their practices is to provide an objective, scientifically derived, technology-based backup for human forecasting of the outcomes of patient care decisions prior to their actual decision making. Implications for nursing, health care, and technology research are presented.

  18. On the Temporal and Functional Origin of L2 Disadvantages in Speech Production: A Critical Review

    PubMed Central

    Runnqvist, Elin; Strijkers, Kristof; Sadat, Jasmin; Costa, Albert

    2011-01-01

    Despite a large amount of psycholinguistic research devoted to the issue of processing differences between a first and a second language, there is no consensus regarding the locus where these emerge or the mechanism behind them. The aim of this article is to briefly examine both the behavioral and neuroscientific evidence in order to critically assess three hypotheses that have been put forward in the literature to explain such differences: the weaker links, executive control, and post-lexical accounts. We conclude that (a) while all stages of processing are likely to be slowed down when speaking in an L2 compared to an L1, the differences seem to originate at a lexical stage; and (b) frequency of use seems to be the variable mainly responsible for these bilingual processing disadvantages. PMID:22203812

  19. [Rapid assessment of critical quality attributes of Chinese materia medica (II): strategy of NIR assignment].

    PubMed

    Pei, Yan-Ling; Wu, Zhi-Sheng; Shi, Xin-Yuan; Zhou, Lu-Wei; Qiao, Yan-Jiang

    2014-09-01

This paper first reviews the research progress and main methods of NIR spectral assignment, together with our own results. Principal component analysis focuses on characteristic signal extraction to reflect spectral differences. Partial least squares is concerned with variable selection to discover characteristic absorption bands. Two-dimensional correlation spectroscopy is mainly adopted for spectral assignment: autocorrelation peaks are obtained from spectral changes induced by external perturbations such as concentration, temperature and pressure. Density functional theory is used to calculate energies from molecular structure, establishing the relationship between molecular energy and spectral change. Based on the methods reviewed above, and taking the NIR spectral assignment of chlorogenic acid as an example, a reliable spectral assignment for critical quality attributes of Chinese materia medica (CMM) was established using deuterium technology and spectral variable selection. The results demonstrated the consistency of the assignment between the spectral features of different concentrations of chlorogenic acid and the variable-selection region of the online NIR model of the extraction process. Although the spectral assignment was initially performed on a single active pharmaceutical ingredient, the approach holds promise for the complex components of CMM. It therefore provides a methodology for NIR spectral assignment of critical quality attributes in CMM.

  20. Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.

    PubMed

    Landin, Mariana

    2017-01-01

The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales, from 25 L to 600 L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operating conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R² > 86.78%) for equipment of all sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries and can be improved by incorporating additional characteristics of the process, such as composition variables or operation parameters (e.g., batch size, chopper speed). The principles and methodology proposed here can be applied to understand and control manufacturing processes using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  1. Examining the causes of memory strength variability: Recollection, attention failure, or encoding variability?

    PubMed Central

    Koen, Joshua D.; Aly, Mariam; Wang, Wei-Chun; Yonelinas, Andrew P.

    2013-01-01

    A prominent finding in recognition memory is that studied items are associated with more variability in memory strength than new items. Here, we test three competing theories for why this occurs - the encoding variability, attention failure, and recollection accounts. Distinguishing amongst these theories is critical because each provides a fundamentally different account of the processes underlying recognition memory. The encoding variability and attention failure accounts propose that old item variance will be unaffected by retrieval manipulations because the processes producing this effect are ascribed to encoding. The recollection account predicts that both encoding and retrieval manipulations that preferentially affect recollection will affect memory variability. These contrasting predictions were tested by examining the effect of response speeding (Experiment 1), dividing attention at retrieval (Experiment 2), context reinstatement (Experiment 3), and increased test delay (Experiment 4) on recognition performance. The results of all four experiments confirmed the predictions of the recollection account, and were inconsistent with the encoding variability account. The evidence supporting the attention failure account was mixed, with two of the four experiments confirming the account and two disconfirming the account. These results indicate that encoding variability and attention failure are insufficient accounts of memory variance, and provide support for the recollection account. Several alternative theoretical accounts of the results are also considered. PMID:23834057

  2. Empirical Investigation of Critical Transitions in Paleoclimate

    NASA Astrophysics Data System (ADS)

    Loskutov, E. M.; Mukhin, D.; Gavrilov, A.; Feigin, A.

    2016-12-01

In this work we apply a new empirical method for the analysis of complex spatially distributed systems to paleoclimate data. The method consists of two general parts: (i) revealing the optimal phase-space variables and (ii) constructing an empirical prognostic model from observed time series. The construction of phase-space variables is based on decomposing the data into nonlinear dynamical modes, an approach that was successfully applied to the global SST field, where it clearly separated time scales and revealed a climate shift in the observed data interval [1]. The second part, a Bayesian approach to optimal reconstruction of the evolution operator from time series, is based on representing the evolution operator as a nonlinear stochastic function modeled by artificial neural networks [2,3]. Here we focus on critical transitions, the abrupt changes in climate dynamics, in much longer time scale processes. It is well known that there were a number of critical transitions on different time scales in the past. We demonstrate the first results of applying our empirical methods to the analysis of paleoclimate variability. In particular, we discuss the possibility of detecting, identifying and predicting such critical transitions by means of nonlinear empirical modeling using paleoclimate record time series. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS). 1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510 
2. Molkov, Ya. I., Mukhin, D. N., Loskutov, E. M., & Feigin, A. M. (2012). Random dynamical models from time series. Phys. Rev. E, 85(3). 3. Mukhin, D., Kondrashov, D., Loskutov, E., Gavrilov, A., Feigin, A., & Ghil, M. (2015). Predicting critical transitions in ENSO models. Part II: Spatially dependent models. Journal of Climate, 28(5), 1962-1976. http://doi.org/10.1175/JCLI-D-14-00240.1

  3. Development of clinical process measures for pediatric burn care: Understanding variation in practice patterns.

    PubMed

    Kazis, Lewis E; Sheridan, Robert L; Shapiro, Gabriel D; Lee, Austin F; Liang, Matthew H; Ryan, Colleen M; Schneider, Jeffrey C; Lydon, Martha; Soley-Bori, Marina; Sonis, Lily A; Dore, Emily C; Palmieri, Tina; Herndon, David; Meyer, Walter; Warner, Petra; Kagan, Richard; Stoddard, Frederick J; Murphy, Michael; Tompkins, Ronald G

    2018-04-01

    There has been little systematic examination of variation in pediatric burn care clinical practices and its effect on outcomes. As a first step, current clinical care processes need to be operationally defined. The highly specialized burn care units of the Shriners Hospitals for Children system present an opportunity to describe the processes of care. The aim of this study was to develop a set of process-based measures for pediatric burn care and examine adherence to them by providers in a cohort of pediatric burn patients. We conducted a systematic literature review to compile a set of process-based indicators. These measures were refined by an expert panel of burn care providers, yielding 36 process-based indicators in four clinical areas: initial evaluation and resuscitation, acute excisional surgery and critical care, psychosocial and pain control, and reconstruction and aftercare. We assessed variability in adherence to the indicators in a cohort of 1,076 children with burns at four regional pediatric burn programs in the Shriners Hospital system. The percentages of the cohort at each of the four sites were as follows: Boston, 20.8%; Cincinnati, 21.1%; Galveston, 36.0%; and Sacramento, 22.1%. The cohort included children who received care between 2006 and 2010. Adherence to the process indicators varied both across sites and by clinical area. Adherence was lowest for the clinical areas of acute excisional surgery and critical care, with a range of 35% to 48% across sites, followed by initial evaluation and resuscitation (range, 34%-60%). In contrast, the clinical areas of psychosocial and pain control and reconstruction and aftercare had relatively high adherence across sites, with ranges of 62% to 93% and 71% to 87%, respectively. Of the 36 process indicators, 89% differed significantly in adherence between clinical sites (p < 0.05). Acute excisional surgery and critical care exhibited the most variability. 
The development of this set of process-based measures represents an important step in the assessment of clinical practice in pediatric burn care. Substantial variation was observed in practices of pediatric burn care. However, further research is needed to link these process-based measures to clinical outcomes. Therapeutic/care management, level IV.

  4. Variability of 13C-labeling in plant leaves.

    PubMed

    Nguyen Tu, Thanh Thuy; Biron, Philippe; Maseyk, Kadmiel; Richard, Patricia; Zeller, Bernd; Quénéa, Katell; Alexis, Marie; Bardoux, Gérard; Vaury, Véronique; Girardin, Cyril; Pouteau, Valérie; Billiou, Daniel; Bariac, Thierry

    2013-09-15

Plant tissues artificially labeled with (13)C are increasingly used in environmental studies to unravel biogeochemical and ecophysiological processes. However, the variability of (13)C-content in labeled tissues has never been carefully investigated. Hence, this study aimed at documenting the variability of (13)C-content in artificially labeled leaves. European beech and Italian ryegrass were subjected to long-term (13)C-labeling in a controlled-environment growth chamber. The (13)C-content of the leaves obtained after several months of labeling was determined by isotope ratio mass spectrometry. The (13)C-content of the labeled leaves exhibited inter- and intra-leaf variability much higher than that naturally occurring in unlabeled plants, which does not exceed a few per mil. This variability was correlated with labeling intensity: the isotope composition of leaves varied over ranges of ca 60‰ and 90‰ for experiments that led to average leaf (13)C-contents of ca +15‰ and +450‰, respectively. The reported variability of isotope composition in (13)C-enriched leaves is critical, and should be taken into account in subsequent experimental investigations of environmental processes using (13)C-labeled plant tissues. Copyright © 2013 John Wiley & Sons, Ltd.
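The per-mil values quoted above use standard delta notation relative to the VPDB reference; a minimal sketch, with sample isotope ratios invented to mirror the unlabeled and labeled leaves described:

```python
# Delta notation for 13C-content: δ13C (per mil) expresses a sample's
# 13C/12C ratio relative to the VPDB standard. Sample ratios below are
# constructed for illustration.
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample):
    """Per-mil deviation of a sample isotope ratio from VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# An unlabeled C3 leaf sits near -28 per mil; a labeled leaf at +450
# per mil corresponds to a markedly higher 13C/12C ratio.
r_unlabeled = R_VPDB * (1 - 28.0 / 1000.0)
r_labeled = R_VPDB * (1 + 450.0 / 1000.0)
print(delta13C(r_unlabeled), delta13C(r_labeled))
```

Inverting the same formula is how enrichment targets (e.g., the +15‰ and +450‰ averages above) translate back into the raw ratios a mass spectrometer measures.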

  5. Process Improvement at the Aircraft Intermediate Maintenance Detachment (AIMD) at Naval Air Station Whidbey Island

    DTIC Science & Technology

    2006-12-01

the goal of achieving zero waste is impractical. Thus, the concept of Lean has to be slightly modified to adjust for the uncertainty and variability...personnel are qualified as Black or Green belts, this may become an issue for them down the road. 2. Criticism Two The goal of Lean is to achieve “Zero ... Waste,” therefore, how can the military achieve Lean in such a vast area of uncertainty and variability? Under the environment that DoD operates in

  6. Relationship between operational variables, fundamental physics and foamed cement properties in lab and field generated foamed cement slurries

    DOE PAGES

    Glosser, D.; Kutchko, B.; Benge, G.; ...

    2016-03-21

Foamed cement is a critical component for wellbore stability. The mechanical performance of a foamed cement depends on its microstructure, which in turn depends on the preparation method and attendant operational variables. Determination of cement stability for field use is based on laboratory testing protocols governed by API Recommended Practice 10B-4 (API RP 10B-4, 2015). However, laboratory and field operational variables contrast considerably in terms of scale, as well as slurry mixing and foaming processes. In this paper, laboratory and field operational processes are characterized within a physics-based framework. It is shown that the "atomization energy" imparted by the high-pressure injection of nitrogen gas into the field-mixed foamed cement slurry is, by a significant margin, the highest-energy process, and has a major impact on the void system in the cement slurry. There is no analog for this high energy exchange in current laboratory cement preparation and testing protocols. Quantifying the energy exchanges across the laboratory and field processes provides a basis for understanding the relative impacts of these variables on cement structure, and can ultimately lead to the development of practices to improve cement testing and performance.

  7. Meta-analysis of variables affecting mouse protection efficacy of whole organism Brucella vaccines and vaccine candidates

    PubMed Central

    2013-01-01

    Background Vaccine protection investigation includes three processes: vaccination, pathogen challenge, and vaccine protection efficacy assessment. Many variables can affect the results of vaccine protection. Brucella, a genus of facultative intracellular bacteria, is the etiologic agent of brucellosis in humans and multiple animal species. Extensive research has been conducted in developing effective live attenuated Brucella vaccines. We hypothesized that some variables play a more important role than others in determining vaccine protective efficacy. Using Brucella vaccines and vaccine candidates as study models, this hypothesis was tested by meta-analysis of Brucella vaccine studies reported in the literature. Results Nineteen variables related to vaccine-induced protection of mice against infection with virulent brucellae were selected based on modeling investigation of the vaccine protection processes. The variable "vaccine protection efficacy" was set as a dependent variable while the other eighteen were set as independent variables. Discrete or continuous values were collected from papers for each variable of each data set. In total, 401 experimental groups were manually annotated from 74 peer-reviewed publications containing mouse protection data for live attenuated Brucella vaccines or vaccine candidates. Our ANOVA analysis indicated that nine variables contributed significantly (P-value < 0.05) to Brucella vaccine protection efficacy: vaccine strain, vaccination host (mouse) strain, vaccination dose, vaccination route, challenge pathogen strain, challenge route, challenge-killing interval, colony forming units (CFUs) in mouse spleen, and CFU reduction compared to control group. The other 10 variables (e.g., mouse age, vaccination-challenge interval, and challenge dose) were not found to be statistically significant (P-value > 0.05). 
The protection level of RB51 was sacrificed when the values of several variables (e.g., vaccination route, vaccine viability, and challenge pathogen strain) changed. The results also suggest that it is difficult to protect against aerosol challenge. Somewhat counter-intuitively, our results indicate that intraperitoneal and subcutaneous vaccinations are much more effective than intranasal vaccination at protecting against aerosol Brucella challenge. Conclusions Literature meta-analysis identified variables that significantly contribute to Brucella vaccine protection efficacy. The results obtained provide critical information for rational vaccine study design. Literature meta-analysis is generic and can be applied to analyze variables critical for vaccine protection against other infectious diseases. PMID:23735014
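The variable-screening step can be illustrated with a hand-rolled one-way ANOVA on synthetic data; the groups and values below are fabricated, with "vaccination route" chosen only because the meta-analysis found that variable significant:

```python
# Sketch of the kind of one-way ANOVA used to screen variables: does a
# categorical factor (here, a hypothetical "vaccination route") explain
# variation in protection efficacy? Values are invented log-protection
# units, not data from the annotated studies.
groups = {
    "intraperitoneal": [2.1, 2.4, 2.2, 2.5],
    "subcutaneous":    [2.0, 2.3, 2.1, 2.2],
    "intranasal":      [1.1, 1.3, 1.0, 1.2],
}

values = [v for g in groups.values() for v in g]
grand = sum(values) / len(values)

# Between-group and within-group sums of squares give the F statistic.
ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups.values())
ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups.values() for v in g)
df_b, df_w = len(groups) - 1, len(values) - len(groups)
F = (ss_between / df_b) / (ss_within / df_w)
print(f"F({df_b},{df_w}) = {F:.1f}")
```

A large F relative to the F(df_b, df_w) critical value corresponds to the P < 0.05 significance criterion the abstract reports for nine of the variables.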

  8. Evidence of opposing fitness effects of parental heterozygosity and relatedness in a critically endangered marine turtle?

    PubMed

    Phillips, K P; Jorgensen, T H; Jolliffe, K G; Richardson, D S

    2017-11-01

    How individual genetic variability relates to fitness is important in understanding evolution and the processes affecting populations of conservation concern. Heterozygosity-fitness correlations (HFCs) have been widely used to study this link in wild populations, where key parameters that affect both variability and fitness, such as inbreeding, can be difficult to measure. We used estimates of parental heterozygosity and genetic similarity ('relatedness') derived from 32 microsatellite markers to explore the relationship between genetic variability and fitness in a population of the critically endangered hawksbill turtle, Eretmochelys imbricata. We found no effect of maternal MLH (multilocus heterozygosity) on clutch size or egg success rate, and no single-locus effects. However, we found effects of paternal MLH and parental relatedness on egg success rate that interacted in a way that may result in both positive and negative effects of genetic variability. Multicollinearity in these tests was within safe limits, and null simulations suggested that the effect was not an artefact of using paternal genotypes reconstructed from large samples of offspring. Our results could imply a tension between inbreeding and outbreeding depression in this system, which is biologically feasible in turtles: female-biased natal philopatry may elevate inbreeding risk and local adaptation, and both processes may be disrupted by male-biased dispersal. Although this conclusion should be treated with caution due to a lack of significant identity disequilibrium, our study shows the importance of considering both positive and negative effects when assessing how variation in genetic variability affects fitness in wild systems. © 2017 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2017 European Society For Evolutionary Biology.
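The multilocus heterozygosity (MLH) measure used in such HFC studies is simply the proportion of typed loci at which an individual carries two different alleles; a minimal sketch with a made-up six-locus microsatellite genotype:

```python
# Sketch of multilocus heterozygosity (MLH): the fraction of typed loci
# that are heterozygous. The genotype below is invented; allele values
# stand in for microsatellite fragment lengths, None marks an untyped locus.
def mlh(genotype):
    """genotype: list of (allele1, allele2) pairs; None = untyped locus."""
    typed = [locus for locus in genotype if locus is not None]
    het = sum(1 for a, b in typed if a != b)
    return het / len(typed)

turtle = [(152, 156), (200, 200), (98, 102), None, (310, 314), (87, 87)]
print(f"MLH = {mlh(turtle):.2f}")
```

In the study's setting, parental MLH values like this (over 32 markers) and pairwise relatedness are the predictors regressed against clutch size and egg success rate.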

  9. Implementation of "Quality by Design (QbD)" Approach for the Development of 5-Fluorouracil Loaded Thermosensitive Hydrogel.

    PubMed

    Dalwadi, Chintan; Patel, Gayatri

    2016-01-01

    The purpose of this study was to investigate the Quality by Design (QbD) principle for the preparation of hydrogel products, demonstrating both the practicability and the utility of applying the QbD concept to hydrogel-based controlled release systems. Product and process understanding helps decrease the variability of critical material and process parameters, yielding a quality product and reducing risk. This study includes the identification of the Quality Target Product Profile (QTPP) and Critical Quality Attributes (CQAs) from the literature or preliminary studies. To identify and control variability in process and material attributes, two QbD tools were utilized: Quality Risk Management (QRM) and experimental design. These tools also help identify the effect of these attributes on the CQAs. Potential risk factors were identified from a fishbone diagram, screened by risk assessment, and optimized by a 3-level, 2-factor experimental design with center points in triplicate to analyze the precision of the target process. The optimized formulation was further characterized by gelling time, gelling temperature, rheological parameters, in-vitro biodegradation, and in-vitro drug release. A design space was created using the experimental design tool, giving the control space; working within this control space keeps all failure modes below the risk level. In conclusion, the QbD approach, together with QRM tools, provides a potent and effective framework for building quality into the hydrogel.

  10. QbD for pediatric oral lyophilisates development: risk assessment followed by screening and optimization.

    PubMed

    Casian, Tibor; Iurian, Sonia; Bogdan, Catalina; Rus, Lucia; Moldovan, Mirela; Tomuta, Ioan

    2017-12-01

    This study proposed the development of oral lyophilisates with respect to pediatric medicine development guidelines, applying risk management strategies and DoE as an integrated QbD approach. Product critical quality attributes were surveyed by generating Ishikawa diagrams for risk assessment purposes, considering process-, formulation-, and methodology-related parameters. Failure Mode Effect Analysis was applied to highlight critical formulation and process parameters with an increased probability of occurrence and a high impact on product performance. To investigate the effect of qualitative and quantitative formulation variables, D-optimal designs were used for screening and optimization purposes. Process parameters related to suspension preparation and lyophilization were classified as significant factors and were controlled by implementing risk mitigation strategies. Both quantitative and qualitative formulation variables introduced in the experimental design influenced the product's disintegration time, mechanical resistance, and dissolution properties, selected as CQAs. The optimum formulation selected from the Design Space presented an ultra-fast disintegration time (5 seconds) and a good dissolution rate (above 90%), combined with high mechanical resistance (above a 600 g load). Combining FMEA and DoE allowed the science-based development of a product with respect to the defined quality target profile by providing better insight into the relevant parameters throughout the development process. The utility of risk management tools in pharmaceutical development was demonstrated.

  11. Fluctuation theorem: A critical review

    NASA Astrophysics Data System (ADS)

    Malek Mansour, M.; Baras, F.

    2017-10-01

    The fluctuation theorem for entropy production is revisited in the framework of stochastic processes. The applicability of the fluctuation theorem to physico-chemical systems, and the resulting stochastic thermodynamics, is analyzed. Some unexpected limitations are highlighted in the context of jump Markov processes. We show that these limitations handicap the ability of the resulting stochastic thermodynamics to correctly describe the state of non-equilibrium systems in terms of the thermodynamic properties of the individual processes therein. Finally, we consider the case of diffusion processes and prove that the fluctuation theorem for entropy production becomes irrelevant at the stationary state in the case of one-variable systems.

  12. An integrated quality by design and mixture-process variable approach in the development of a capillary electrophoresis method for the analysis of almotriptan and its impurities.

    PubMed

    Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S

    2014-04-25

    The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal polarity microemulsion electrokinetic chromatography (MEEKC) as the operative mode, which allowed good selectivity to be achieved within a short analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH on critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in a MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π=90%. The DS was identified by risk-of-failure maps, which were drawn on the basis of Monte-Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, and system suitability criteria allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets.
Copyright © 2014 Elsevier B.V. All rights reserved.
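
The probability-based design space described above can be sketched with a small Monte-Carlo simulation: sample the model coefficients from their uncertainty, predict the CQA, and map where the acceptance probability exceeds π = 90%. The response model, coefficients, and acceptance limit below are illustrative assumptions, not the fitted MEEKC model from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fitted response model for one CQA (a critical resolution Rs)
# as a function of two coded process variables in [-1, +1]. The coefficients
# and their standard errors are invented for illustration.
beta = np.array([2.0, 0.4, -0.3])        # intercept, voltage, buffer pH
beta_se = np.array([0.15, 0.10, 0.10])   # coefficient standard errors

def prob_acceptable(x_voltage, x_ph, rs_min=1.5, n_sim=5000):
    """Monte-Carlo probability that the predicted Rs meets its acceptance limit."""
    b = rng.normal(beta, beta_se, size=(n_sim, 3))
    rs = b[:, 0] + b[:, 1] * x_voltage + b[:, 2] * x_ph
    return np.mean(rs >= rs_min)

# Map the acceptance probability over a grid of operating conditions;
# the design space is the region where the probability is at least 0.90.
grid = np.linspace(-1, 1, 5)
pmap = np.array([[prob_acceptable(v, p) for p in grid] for v in grid])
```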

  13. Quantifying Vegetation Biophysical Variables from Imaging Spectroscopy Data: A Review on Retrieval Methods

    NASA Astrophysics Data System (ADS)

    Verrelst, Jochem; Malenovský, Zbyněk; Van der Tol, Christiaan; Camps-Valls, Gustau; Gastellu-Etchegorry, Jean-Philippe; Lewis, Philip; North, Peter; Moreno, Jose

    2018-06-01

    An unprecedented spectroscopic data stream will soon become available with forthcoming Earth-observing satellite missions equipped with imaging spectroradiometers. This data stream will open up a vast array of opportunities to quantify a diversity of biochemical and structural vegetation properties. The processing requirements for such large data streams require reliable retrieval techniques enabling the spatiotemporally explicit quantification of biophysical variables. With the aim of preparing for this new era of Earth observation, this review summarizes the state-of-the-art retrieval methods that have been applied in experimental imaging spectroscopy studies inferring all kinds of vegetation biophysical variables. Identified retrieval methods are categorized into: (1) parametric regression, including vegetation indices, shape indices and spectral transformations; (2) nonparametric regression, including linear and nonlinear machine learning regression algorithms; (3) physically based, including inversion of radiative transfer models (RTMs) using numerical optimization and look-up table approaches; and (4) hybrid regression methods, which combine RTM simulations with machine learning regression methods. For each of these categories, an overview of widely applied methods with application to mapping vegetation properties is given. In view of processing imaging spectroscopy data, a critical aspect involves the challenge of dealing with spectral multicollinearity. The ability to provide robust estimates, retrieval uncertainties and acceptable retrieval processing speed are other important aspects in view of operational processing. Recommendations towards new-generation spectroscopy-based processing chains for operational production of biophysical variables are given.
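
Of the physically based methods listed above, look-up table (LUT) inversion is the most compact to sketch: simulate spectra over a grid of variable values, then retrieve the value whose simulated spectrum best matches the measurement. The `toy_rtm` function below is a stand-in for a real radiative transfer model such as PROSAIL, with leaf area index (LAI) as an illustrative variable:

```python
import numpy as np

wavelengths = np.linspace(400, 2400, 50)  # nm

def toy_rtm(lai):
    """Toy stand-in for a radiative transfer model: reflectance vs. LAI.

    A real retrieval chain would call PROSAIL or a comparable RTM here.
    """
    return 0.5 * (1.0 - np.exp(-0.4 * lai)) * np.ones_like(wavelengths)

# 1. Build the look-up table over the variable of interest.
lai_grid = np.linspace(0.0, 8.0, 401)
lut = np.array([toy_rtm(lai) for lai in lai_grid])

# 2. Invert: pick the LAI whose simulated spectrum minimizes the spectral RMSE.
def retrieve_lai(measured_spectrum):
    rmse = np.sqrt(np.mean((lut - measured_spectrum) ** 2, axis=1))
    return lai_grid[np.argmin(rmse)]

measured = toy_rtm(3.0) + 0.001  # a "measurement" with a small offset
lai_hat = retrieve_lai(measured)
```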

  14. Quality by Design approach to spray drying processing of crystalline nanosuspensions.

    PubMed

    Kumar, Sumit; Gokhale, Rajeev; Burgess, Diane J

    2014-04-10

    Quality by Design (QbD) principles were explored to understand the spray drying process for converting liquid nanosuspensions into solid nano-crystalline dry powders, using indomethacin as a model drug. The effects of the critical process variables (inlet temperature, flow rate, and aspiration rate) on the critical quality attributes (CQAs) of particle size, moisture content, percent yield, and crystallinity were investigated employing a full factorial design. A central cubic design was employed to generate the response surface for particle size and percent yield. Multiple linear regression analysis and ANOVA were employed to identify and estimate the effects of the critical parameters, establish their relationship with the CQAs, create the design space, and model the spray drying process. Inlet temperature was identified as the only significant factor (p value <0.05) affecting dry powder particle size: higher inlet temperatures caused drug surface melting and hence aggregation of the dried nano-crystalline powders. Aspiration and flow rates were identified as significant factors affecting yield (p value <0.05); higher yields were obtained at higher aspiration and lower flow rates. All formulations had less than 3% (w/w) moisture content, with formulations dried at higher inlet temperatures having lower moisture than those dried at lower inlet temperatures. Published by Elsevier B.V.
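
The full-factorial-plus-regression workflow described here can be sketched as follows; the factor names follow the abstract, but the response data are synthetic and the coefficients are invented for illustration:

```python
import itertools
import numpy as np

# Two-level full factorial design in coded units (-1/+1) for the three
# process variables: inlet temperature, flow rate, aspiration rate.
X = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Synthetic particle-size response (nm): only inlet temperature has a real
# effect, mirroring the paper's finding; numbers are made up for illustration.
rng = np.random.default_rng(7)
y = 250 + 40 * X[:, 0] + rng.normal(0, 1, len(X))

# Fit the main-effects model y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
# coef[1] (inlet temperature) should dominate coef[2] and coef[3],
# identifying it as the significant factor for particle size.
```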

  15. Structure, process, and annual ICU mortality across 69 centers: United States Critical Illness and Injury Trials Group Critical Illness Outcomes Study.

    PubMed

    Checkley, William; Martin, Greg S; Brown, Samuel M; Chang, Steven Y; Dabbagh, Ousama; Fremont, Richard D; Girard, Timothy D; Rice, Todd W; Howell, Michael D; Johnson, Steven B; O'Brien, James; Park, Pauline K; Pastores, Stephen M; Patil, Namrata T; Pietropaoli, Anthony P; Putman, Maryann; Rotello, Leo; Siner, Jonathan; Sajid, Sahul; Murphy, David J; Sevransky, Jonathan E

    2014-02-01

    Hospital-level variations in structure and process may affect clinical outcomes in ICUs. We sought to characterize the organizational structure, processes of care, use of protocols, and standardized outcomes in a large sample of U.S. ICUs. We surveyed 69 ICUs about organization, size, volume, staffing, processes of care, use of protocols, and annual ICU mortality. ICUs participating in the United States Critical Illness and Injury Trials Group Critical Illness Outcomes Study. Sixty-nine intensivists completed the survey. We characterized structure and process variables across ICUs, investigated relationships between these variables and annual ICU mortality, and adjusted for illness severity using Acute Physiology and Chronic Health Evaluation II. Ninety-four ICU directors were invited to participate in the study and 69 ICUs (73%) were enrolled, of which 25 (36%) were medical, 24 (35%) were surgical, and 20 (29%) were of mixed type; 64 (93%) were located in teaching hospitals, with a median of five trainees per ICU. Average annual ICU mortality was 10.8%, average Acute Physiology and Chronic Health Evaluation II score was 19.3, 58% were closed units, and 41% had a 24-hour in-house intensivist. In multivariable linear regression adjusted for Acute Physiology and Chronic Health Evaluation II and multiple ICU structure and process factors, annual ICU mortality was lower in surgical ICUs than in medical ICUs (5.6% lower [95% CI, 2.4-8.8%]) or mixed ICUs (4.5% lower [95% CI, 0.4-8.7%]). We also found a lower annual ICU mortality among ICUs that had a daily plan of care review (5.8% lower [95% CI, 1.6-10.0%]) and a lower bed-to-nurse ratio (1.8% lower when the ratio decreased from 2:1 to 1.5:1 [95% CI, 0.25-3.4%]). In contrast, 24-hour intensivist coverage (p = 0.89) and closed ICU status (p = 0.16) were not associated with a lower annual ICU mortality.
In a sample of 69 ICUs, a daily plan of care review and a lower bed-to-nurse ratio were both associated with a lower annual ICU mortality. In contrast to 24-hour intensivist staffing, improvement in team communication is a low-cost, process-targeted intervention strategy that may improve clinical outcomes in ICU patients.

  16. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies play a significant role in investigating process variation, which is important for achieving target product quality characteristics. Capability indices measure the inherent variability of a process and are used to improve process performance. The main objective of this paper is to assess whether the process of a soft drinks processing unit, a premier brand marketed in India, is capable of producing within specification. A few selected critical parameters in soft drinks processing were considered for this study: gas volume concentration, Brix concentration, and crown cork torque. Relevant statistical parameters were assessed from a process capability indices perspective: short-term capability and long-term capability. Real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India, were used for the assessment. The results suggest reasons for variations in the process, which were validated using ANOVA; a Taguchi loss function was also applied to estimate the predicted waste in monetary terms, which the organization can use to improve its process parameters. This research work has substantially benefitted the organization in understanding the variations of the selected critical parameters in pursuit of zero rejection.
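
The short- and long-term capability indices mentioned here follow the standard Cp/Cpk definitions; a minimal sketch with illustrative specification limits (the actual limits for gas volume, Brix, or torque are not given in the abstract):

```python
def process_capability(mean, std, lsl, usl):
    """Standard process capability indices.

    Cp  compares the specification width to the process spread (potential
        capability); Cpk also penalizes off-center processes (actual capability).
    """
    cp = (usl - lsl) / (6.0 * std)
    cpu = (usl - mean) / (3.0 * std)
    cpl = (mean - lsl) / (3.0 * std)
    cpk = min(cpu, cpl)
    return cp, cpk

# Illustrative: a process centered at 6 with sigma 1 against specs [0, 10].
cp, cpk = process_capability(6.0, 1.0, 0.0, 10.0)
# cp = 1.667 (potentially capable), cpk = 1.333 (off-center penalty)
```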

  17. False memory in aging: effects of emotional valence on word recognition accuracy.

    PubMed

    Piguet, Olivier; Connally, Emily; Krendl, Anne C; Huot, Jessica R; Corkin, Suzanne

    2008-06-01

    Memory is susceptible to distortions. Valence and increasing age are variables known to affect memory accuracy and may increase false alarm production. Interaction between these variables and their impact on false memory was investigated in 36 young (18-28 years) and 36 older (61-83 years) healthy adults. At study, participants viewed lists of neutral words orthographically related to negative, neutral, or positive critical lures (not presented). Memory for these words was subsequently tested with a remember-know procedure. At test, items included the words seen at study and their associated critical lures, as well as sets of orthographically related neutral words not seen at study and their associated unstudied lures. Positive valence was shown to have two opposite effects on older adults' discrimination of the lures: It improved correct rejection of unstudied lures but increased false memory for critical lures (i.e., lures associated with words studied previously). Thus, increased salience triggered by positive valence may disrupt memory accuracy in older adults when discriminating among similar events. These findings likely reflect a source memory deficit due to decreased efficiency in cognitive control processes with aging.

  18. Human Language Technology: Opportunities and Challenges

    DTIC Science & Technology

    2005-01-01

    because of the connections to and reliance on signal processing. Audio diarization critically includes indexing of speakers [12], since speaker ...to reduce inter-speaker variability in training. Standard techniques include vocal-tract length normalization, adaptation of acoustic models using...maximum likelihood linear regression (MLLR), and speaker-adaptive training based on MLLR. The acoustic models are mixtures of Gaussians, typically with

  19. A Retention Assessment Process: Utilizing Total Quality Management Principles and Focus Groups

    ERIC Educational Resources Information Center

    Codjoe, Henry M.; Helms, Marilyn M.

    2005-01-01

    Retaining students is a critical topic in higher education. Methodologies abound to gather attrition data as well as key variables important to retention. Using the theories of total quality management and focus groups, this case study gathers and reports data from current college students. Key results, suggestions for replication, and areas for…

  20. Reversing the Speed-IQ Correlation: Intra-Individual Variability and Attentional Control in the Inspection Time Paradigm

    ERIC Educational Resources Information Center

    Fox, Mark C.; Roring, Roy W.; Mitchum, Ainsley L.

    2009-01-01

    Elementary cognitive tasks (ECTs) are simple tasks involving basic cognitive processes for which speed of performance typically correlates with IQ. Inspection time (IT) has the strongest IQ correlations and is considered critical evidence for neural speed underlying individual differences in intelligence. However, results from Bors et al. [Bors,…

  1. Dissociating Temporal Preparation Processes as a Function of the Inter-Trial Interval Duration

    ERIC Educational Resources Information Center

    Vallesi, Antonino; Lozano, Violeta N.; Correa, Angel

    2013-01-01

    Preparation over time is a ubiquitous capacity which implies decreasing uncertainty about when critical events will occur. This capacity is usually studied with the variable foreperiod paradigm, which consists in the random variation of the time interval (foreperiod) between a warning stimulus and a target. With this paradigm, response time (RT)…

  2. The Effects of Dynamical Rates on Species Coexistence in a Variable Environment: The Paradox of the Plankton Revisited.

    PubMed

    Li, Lina; Chesson, Peter

    2016-08-01

    Hutchinson's famous hypothesis for the "paradox of the plankton" has been widely accepted, but critical aspects have remained unchallenged. Hutchinson argued that environmental fluctuations would promote coexistence when the timescale for environmental change is comparable to the timescale for competitive exclusion. Using a consumer-resource model, we do find that timescales of processes are important. However, it is not the time to exclusion that must be compared with the time for environmental change but the time for resource depletion. Fast resource depletion, when resource consumption is favored for different species at different times, strongly promotes coexistence. The time for exclusion is independent of the rate of resource depletion. Therefore, the widely believed predictions of Hutchinson are misleading. Fast resource depletion, as determined by environmental conditions, ensures strong coupling of environmental processes and competition, which leads to enhancement over time of intraspecific competition relative to interspecific competition as environmental shifts favor different species at different times. This critical coupling is measured by the covariance between environment and competition. Changes in this quantity as densities change determine the stability of coexistence and provide the key to rigorous analysis, both theoretically and empirically, of coexistence in a variable environment. These ideas apply broadly to diversity maintenance in variable environments whether the issue is species diversity or genetic diversity and competition or apparent competition.

  3. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
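
The 12-run Plackett-Burman design used here to screen 11 factors can be generated from the standard cyclic generator row. This is the textbook Plackett-Burman construction, not the authors' specific run order or factor assignment:

```python
import numpy as np

# Standard first row for the 12-run Plackett-Burman design (11 factors).
generator = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def plackett_burman_12():
    """11 cyclic shifts of the generator row plus a final all-minus row."""
    rows = [np.roll(generator, i) for i in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

design = plackett_burman_12()  # shape (12, 11), entries +/-1
```

Each run (row) sets all 11 factors high or low at once; main effects are estimated from column contrasts, which is why every column is balanced and pairwise orthogonal.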

  4. Sources of Variability in Chlorophyll Analysis by Fluorometry and High-Performance Liquid Chromatography in a SIMBIOS Inter-Calibration Exercise

    NASA Technical Reports Server (NTRS)

    VanHeukelem, Laurie; Thomas, Crystal S.; Gilbert, Patricia M.; Fargion, Giulietta S. (Editor); McClain, Charles R. (Editor)

    2002-01-01

    The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities, NASA Research Announcement (NRA) research status, satellite data processing, data product validation, and field calibration. This documentation is necessary to ensure that critical information is related to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational project. This particular document focuses on the variability in chlorophyll pigment measurements resulting from differences in the methodologies and laboratories conducting the pigment analysis.

  5. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    PubMed

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach for identifying sources of variability within a manufacturing process: NIR measurements of samples of intermediate material taken after each consecutive unit operation (the interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose as having the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially when historical process data are not readily available. In the presented case, changes in lactose characteristics influence the performance of the extrusion/spheronization process step. The pellet cores produced using one (less suitable) lactose source were on average larger and more fragile, leading to breakage of the cores during subsequent fluid-bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process, which leads to compromised film coating.
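
A common form of the MSPC model referenced here is a PCA-based control chart monitored with Hotelling's T² scores. A minimal sketch on synthetic in-control data (the batch data, variable count, and number of components are illustrative assumptions, not the model from the paper):

```python
import numpy as np

def fit_pca_mspc(X, n_components=2):
    """Fit a PCA model on in-control batches (rows = batches, cols = variables)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                        # loadings
    var = (s[:n_components] ** 2) / (len(X) - 1)   # score variances
    return mu, P, var

def hotelling_t2(x, mu, P, var):
    """T^2 distance of a batch in the reduced PCA score space."""
    t = (x - mu) @ P
    return float(np.sum(t ** 2 / var))

rng = np.random.default_rng(1)
X_incontrol = rng.normal(size=(30, 6))   # 30 in-control batches, 6 variables
mu, P, var = fit_pca_mspc(X_incontrol)
t2_new = hotelling_t2(X_incontrol[0], mu, P, var)
```

New batches whose T² exceeds a control limit derived from the in-control distribution would be flagged for investigation.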

  6. Method and apparatus for monitoring dynamic cardiovascular function using n-dimensional representatives of critical functions

    NASA Technical Reports Server (NTRS)

    Syroid, Noah (Inventor); Westinskow, Dwayne (Inventor); Agutter, James (Inventor); Albert, Robert (Inventor); Strayer, David (Inventor); Wachter, S. Blake (Inventor); Drews, Frank (Inventor)

    2010-01-01

    A method, system, apparatus and device for the monitoring, diagnosis and evaluation of the state of a dynamic pulmonary system is disclosed. This method and system provides the processing means for receiving sensed and/or simulated data, converting such data into a displayable object format and displaying such objects in a manner such that the interrelationships between the respective variables can be correlated and identified by a user. This invention provides for the rapid cognitive grasp of the overall state of a pulmonary critical function with respect to a dynamic system.

  7. Theory of sampling: four critical success factors before analysis.

    PubMed

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  8. Micro-topographic hydrologic variability due to vegetation acclimation under climate change

    NASA Astrophysics Data System (ADS)

    Le, P. V.; Kumar, P.

    2012-12-01

    Land surface micro-topography and vegetation cover have fundamental effects on land-atmosphere interactions. The altered temperature and precipitation variability associated with climate change will affect water and energy processes both directly and through vegetation-mediated pathways. Since climate change induces vegetation acclimation that leads to shifts in evapotranspiration and heat fluxes, it further modifies the microclimate and near-surface hydrological processes. In this study, we investigate the impacts of vegetation acclimation to climate change on micro-topographic hydrologic variability. Accurately predicting these impacts requires the simultaneous consideration of biochemical, ecophysiological, and hydrological processes. A multilayer canopy-root-soil system model coupled with a conjunctive surface-subsurface flow model is used to capture the acclimatory responses and analyze the changes in the dynamics of structure and connectivity of micro-topographic storage and in the magnitudes of runoff. The study is performed using Light Detection and Ranging (LiDAR) topographic data in the Birds Point-New Madrid floodway in Missouri, U.S.A. The results indicate that both climate change and its associated vegetation acclimation play critical roles in altering micro-topographic hydrological responses.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glosser, D.; Kutchko, B.; Benge, G.

    Foamed cement is a critical component for wellbore stability. The mechanical performance of a foamed cement depends on its microstructure, which in turn depends on the preparation method and attendant operational variables. Determination of cement stability for field use is based on laboratory testing protocols governed by API Recommended Practice 10B-4 (API RP 10B-4, 2015). However, laboratory and field operational variables contrast considerably in terms of scale, as well as slurry mixing and foaming processes. Here in this paper, laboratory and field operational processes are characterized within a physics-based framework. It is shown that the “atomization energy” imparted by the high pressure injection of nitrogen gas into the field mixed foamed cement slurry is – by a significant margin – the highest energy process, and has a major impact on the void system in the cement slurry. There is no analog for this high energy exchange in current laboratory cement preparation and testing protocols. Quantifying the energy exchanges across the laboratory and field processes provides a basis for understanding relative impacts of these variables on cement structure, and can ultimately lead to the development of practices to improve cement testing and performance.

  10. Examining the causes of memory strength variability: recollection, attention failure, or encoding variability?

    PubMed

    Koen, Joshua D; Aly, Mariam; Wang, Wei-Chun; Yonelinas, Andrew P

    2013-11-01

    A prominent finding in recognition memory is that studied items are associated with more variability in memory strength than new items. Here, we test 3 competing theories for why this occurs: the encoding variability, attention failure, and recollection accounts. Distinguishing among these theories is critical because each provides a fundamentally different account of the processes underlying recognition memory. The encoding variability and attention failure accounts propose that old item variance will be unaffected by retrieval manipulations because the processes producing this effect are ascribed to encoding. The recollection account predicts that both encoding and retrieval manipulations that preferentially affect recollection will affect memory variability. These contrasting predictions were tested by examining the effect of response speeding (Experiment 1), dividing attention at retrieval (Experiment 2), context reinstatement (Experiment 3), and increased test delay (Experiment 4) on recognition performance. The results of all 4 experiments confirm the predictions of the recollection account and are inconsistent with the encoding variability account. The evidence supporting the attention failure account is mixed, with 2 of the 4 experiments confirming the account and 2 disconfirming it. These results indicate that encoding variability and attention failure are insufficient accounts of memory variance and provide support for the recollection account. Several alternative theoretical accounts of the results are also considered. PsycINFO Database Record (c) 2013 APA, all rights reserved.
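
The greater memory-strength variability for studied items discussed above is conventionally estimated from the zROC slope: under a Gaussian signal detection model the slope approximates σ_new/σ_old, so slopes below 1 indicate greater old-item variance. A simulation sketch with illustrative parameters:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)
z = NormalDist().inv_cdf

# Unequal-variance Gaussian model: old (studied) items are both stronger
# and more variable than new items (parameters are illustrative).
new = rng.normal(0.0, 1.00, 200_000)
old = rng.normal(1.0, 1.25, 200_000)

# Hit and false-alarm rates at a range of response criteria, z-transformed.
criteria = np.linspace(-0.5, 1.5, 9)
z_hit = np.array([z((old > c).mean()) for c in criteria])
z_fa = np.array([z((new > c).mean()) for c in criteria])

# zROC slope via least squares; expected near sigma_new / sigma_old = 0.8.
slope = np.polyfit(z_fa, z_hit, 1)[0]
```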

  11. Microstructure and mesh sensitivities of mesoscale surrogate driving force measures for transgranular fatigue cracks in polycrystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castelluccio, Gustavo M.; McDowell, David L.

The number of cycles required to form and grow microstructurally small fatigue cracks in metals exhibits substantial variability, particularly for low applied strain amplitudes. This variability is commonly attributed to the heterogeneity of cyclic plastic deformation within the microstructure, and presents a challenge to minimum life design of fatigue resistant components. Our paper analyzes sources of variability that contribute to the driving force of transgranular fatigue cracks within nucleant grains. We also employ crystal plasticity finite element simulations that explicitly render the polycrystalline microstructure and Fatigue Indicator Parameters (FIPs) averaged over different volume sizes and shapes relative to the anticipated fatigue damage process zone. Volume averaging is necessary both to achieve a description of a finite fatigue damage process zone and to regularize mesh dependence in simulations. Furthermore, results from constant amplitude remote applied straining are characterized in terms of the extreme value distributions of volume averaged FIPs. Grain averaged FIP values effectively mitigate mesh sensitivity, but they smear out variability within grains. In contrast, volume averaging over bands that encompass critical transgranular slip planes appears to present the most attractive approach to mitigate mesh sensitivity while preserving variability within grains.

  12. Microstructure and mesh sensitivities of mesoscale surrogate driving force measures for transgranular fatigue cracks in polycrystals

    DOE PAGES

    Castelluccio, Gustavo M.; McDowell, David L.

    2015-05-22

The number of cycles required to form and grow microstructurally small fatigue cracks in metals exhibits substantial variability, particularly for low applied strain amplitudes. This variability is commonly attributed to the heterogeneity of cyclic plastic deformation within the microstructure, and presents a challenge to minimum life design of fatigue resistant components. Our paper analyzes sources of variability that contribute to the driving force of transgranular fatigue cracks within nucleant grains. We also employ crystal plasticity finite element simulations that explicitly render the polycrystalline microstructure and Fatigue Indicator Parameters (FIPs) averaged over different volume sizes and shapes relative to the anticipated fatigue damage process zone. Volume averaging is necessary both to achieve a description of a finite fatigue damage process zone and to regularize mesh dependence in simulations. Furthermore, results from constant amplitude remote applied straining are characterized in terms of the extreme value distributions of volume averaged FIPs. Grain averaged FIP values effectively mitigate mesh sensitivity, but they smear out variability within grains. In contrast, volume averaging over bands that encompass critical transgranular slip planes appears to present the most attractive approach to mitigate mesh sensitivity while preserving variability within grains.

  13. How long bones grow children: Mechanistic paths to variation in human height growth.

    PubMed

    Lampl, Michelle; Schoen, Meriah

    2017-03-01

    Eveleth and Tanner's descriptive documentation of worldwide variability in human growth provided evidence of the interaction between genetics and environment during development that has been foundational to the science of human growth. There remains a need, however, to describe the mechanistic foundations of variability in human height growth patterns. A review of research documenting cellular activities at the endochondral growth plate aims to show how the unique microenvironment and cell functions during the sequential phases of the chondrocyte lifecycle affect long bone elongation, a fundamental source of height growth. There are critical junctures within the chondrocytic differentiation cascade at which environmental influences are integrated and have the ability to influence progression to the hypertrophic chondrocyte phase, the primary driver of long bone elongation. Phenotypic differences in height growth patterns reflect variability in amplitude and frequency of discretely timed hypertrophic cellular expansion events, the cellular basis of saltation and stasis growth biology. Final height is a summary of the dynamic processes carried out by the growth plate cellular machinery. As these cell-level mechanisms unfold in an individual, time-specific manner, there are many critical points at which a genetic growth program can be enhanced or perturbed. Recognizing both the complexity and fluidity of this adaptive system questions the likelihood of a single, optimal growth pattern and instead identifies a larger bandwidth of saltatory frequencies for "normal" growth. Further inquiry into mechanistic sources of variability acting at critical organizational points of chondrogenesis can provide new opportunities for growth interventions. © 2017 Wiley Periodicals, Inc.

  14. Research and development supporting risk-based wildfire effects prediction for fuels and fire management: Status and needs

    Treesearch

    Kevin Hyde; Matthew B. Dickinson; Gil Bohrer; David Calkin; Louisa Evers; Julie Gilbertson-Day; Tessa Nicolet; Kevin Ryan; Christina Tague

    2013-01-01

    Wildland fire management has moved beyond a singular focus on suppression, calling for wildfire management for ecological benefit where no critical human assets are at risk. Processes causing direct effects and indirect, long-term ecosystem changes are complex and multidimensional. Robust risk-assessment tools are required that account for highly variable effects on...

  15. Automated Welding System

    NASA Technical Reports Server (NTRS)

    Bayless, E. O.; Lawless, K. G.; Kurgan, C.; Nunes, A. C.; Graham, B. F.; Hoffman, D.; Jones, C. S.; Shepard, R.

    1993-01-01

Fully automated variable-polarity plasma arc (VPPA) welding system developed at Marshall Space Flight Center. System eliminates defects caused by human error. Integrates many sensors with mathematical model of the weld and computer-controlled welding equipment. Sensors provide real-time information on geometry of weld bead, location of weld joint, and wire-feed entry. Mathematical model relates geometry of weld to critical parameters of welding process.

  16. Empirical modeling of spatial and temporal variation in warm season nocturnal air temperatures in two North Idaho mountain ranges, USA

    Treesearch

    Zachery A. Holden; Michael A. Crimmins; Samuel A. Cushman; Jeremy S. Littell

    2010-01-01

    Accurate, fine spatial resolution predictions of surface air temperatures are critical for understanding many hydrologic and ecological processes. This study examines the spatial and temporal variability in nocturnal air temperatures across a mountainous region of Northern Idaho. Principal components analysis (PCA) was applied to a network of 70 Hobo temperature...

  17. Mammalian cell culture monitoring using in situ spectroscopy: Is your method really optimised?

    PubMed

    André, Silvère; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Duponchel, Ludovic

    2017-03-01

In recent years, as a result of the process analytical technology initiative of the US Food and Drug Administration, many different works have been carried out on direct and in situ monitoring of critical parameters for mammalian cell cultures by Raman spectroscopy and multivariate regression techniques. However, despite interesting results, it cannot be said that the proposed monitoring strategies are really optimized to reduce errors of the regression models and thus the confidence limits of the predictions. Hence, the aim of this article is to optimize some critical steps of spectroscopic acquisition and data treatment in order to reach a higher level of accuracy and robustness of bioprocess monitoring. In this way, we propose first an original strategy to assess the most suited Raman acquisition time for the processes involved. In a second part, we demonstrate the importance of the interbatch variability on the accuracy of the predictive models with a particular focus on the optical probes adjustment. Finally, we propose a methodology for the optimization of the spectral variables selection in order to decrease prediction errors of multivariate regressions. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:308-316, 2017. © 2017 American Institute of Chemical Engineers.

  18. [Prescribing monitoring in clinical practice: from enlightened empiricism to rational strategies].

    PubMed

    Buclin, Thierry; Herzig, Lilli

    2013-05-15

Monitoring of a medical condition is the periodic measurement of one or several physiological or biological variables to detect a signal regarding its clinical progression or its response to treatment. We distinguish the different medical situations, across the diagnostic, clinical and therapeutic process, in which monitoring applies. Many clinical variables can be used for monitoring, once their intrinsic properties (normal range, critical difference, kinetics, reactivity) and external validity (pathophysiological importance, predictive power for clinical outcomes) are established. A formal conceptualization of monitoring is being developed and should support the rational development of monitoring strategies and their validation through appropriate clinical trials.

  19. Functional variability of habitats within the Sacramento-San Joaquin Delta: Restoration implications

    USGS Publications Warehouse

    Lucas, L.V.; Cloern, J.E.; Thompson, J.K.; Monsen, N.E.

    2002-01-01

    We have now entered an era of large-scale attempts to restore ecological functions and biological communities in impaired ecosystems. Our knowledge base of complex ecosystems and interrelated functions is limited, so the outcomes of specific restoration actions are highly uncertain. One approach for exploring that uncertainty and anticipating the range of possible restoration outcomes is comparative study of existing habitats similar to future habitats slated for construction. Here we compare two examples of one habitat type targeted for restoration in the Sacramento-San Joaquin River Delta. We compare one critical ecological function provided by these shallow tidal habitats - production and distribution of phytoplankton biomass as the food supply to pelagic consumers. We measured spatial and short-term temporal variability of phytoplankton biomass and growth rate and quantified the hydrodynamic and biological processes governing that variability. Results show that the production and distribution of phytoplankton biomass can be highly variable within and between nearby habitats of the same type, due to variations in phytoplankton sources, sinks, and transport. Therefore, superficially similar, geographically proximate habitats can function very differently, and that functional variability introduces large uncertainties into the restoration process. Comparative study of existing habitats is one way ecosystem science can elucidate and potentially minimize restoration uncertainties, by identifying processes shaping habitat functionality, including those that can be controlled in the restoration design.

  20. The Taguchi methodology as a statistical tool for biotechnological applications: a critical appraisal.

    PubMed

    Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J

    2008-04-01

    Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
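    The "signal-to-noise ratio" modeling mentioned above can be sketched in a few lines. The following is a minimal illustration of the standard Taguchi larger-is-better S/N statistic; the replicate yield values and factor settings are hypothetical, invented purely for illustration:

    ```python
    import math

    def sn_larger_is_better(ys):
        """Taguchi larger-is-better signal-to-noise ratio:
        S/N = -10 * log10(mean(1 / y_i^2)), in dB."""
        return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

    # Hypothetical replicate yields (g/L) at two settings of a control factor
    low_setting = [8.1, 7.9, 8.4]
    high_setting = [9.6, 9.9, 9.7]

    sn_low = sn_larger_is_better(low_setting)
    sn_high = sn_larger_is_better(high_setting)

    # The setting with the higher S/N is preferred: it rewards a high mean
    # response and penalizes spread across replicates in one statistic.
    print(f"S/N low = {sn_low:.2f} dB, S/N high = {sn_high:.2f} dB")
    ```

    In a full Taguchi study each row of the orthogonal array yields one such S/N value, and the "main effect only" analysis averages these values per factor level to rank the control variables.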

  1. Early language processing efficiency predicts later receptive vocabulary outcomes in children born preterm.

    PubMed

    Marchman, Virginia A; Adams, Katherine A; Loi, Elizabeth C; Fernald, Anne; Feldman, Heidi M

    2016-01-01

    As rates of prematurity continue to rise, identifying which preterm children are at increased risk for learning disabilities is a public health imperative. Identifying continuities between early and later skills in this vulnerable population can also illuminate fundamental neuropsychological processes that support learning in all children. At 18 months adjusted age, we used socioeconomic status (SES), medical variables, parent-reported vocabulary, scores on the Bayley Scales of Infant and Toddler Development (third edition) language composite, and children's lexical processing speed in the looking-while-listening (LWL) task as predictor variables in a sample of 30 preterm children. Receptive vocabulary as measured by the Peabody Picture Vocabulary Test (fourth edition) at 36 months was the outcome. Receptive vocabulary was correlated with SES, but uncorrelated with degree of prematurity or a composite of medical risk. Importantly, lexical processing speed was the strongest predictor of receptive vocabulary (r = -.81), accounting for 30% unique variance. Individual differences in lexical processing efficiency may be able to serve as a marker for information processing skills that are critical for language learning.

  2. Pharmaceutical quality by design: product and process development, understanding, and control.

    PubMed

    Yu, Lawrence X

    2008-04-01

The purpose of this paper is to discuss the pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. The QbD was described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. The QbD is a systematic approach to pharmaceutical development. It means designing and developing formulations and manufacturing processes to ensure predefined product quality. Some of the QbD elements include: Defining target product quality profile; Designing product and manufacturing processes; Identifying critical quality attributes, process parameters, and sources of variability; Controlling manufacturing processes to produce consistent quality over time. Using QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables. Product testing confirms the product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.

  3. Critical Thinking in Social Work Education: A Research Synthesis

    ERIC Educational Resources Information Center

    Samson, Patricia L.

    2016-01-01

    In a meta-analytic review of critical thinking in social work education, findings revealed variability in research designs, methods, and subsequent findings. The 10 studies reviewed assessed different components of critical thinking and highlighted different potential moderator variables. Although there are significant limitations to all the…

  4. Perceiving speech in context: Compensation for contextual variability during acoustic cue encoding and categorization

    NASA Astrophysics Data System (ADS)

    Toscano, Joseph Christopher

    Several fundamental questions about speech perception concern how listeners understand spoken language despite considerable variability in speech sounds across different contexts (the problem of lack of invariance in speech). This contextual variability is caused by several factors, including differences between individual talkers' voices, variation in speaking rate, and effects of coarticulatory context. A number of models have been proposed to describe how the speech system handles differences across contexts. Critically, these models make different predictions about (1) whether contextual variability is handled at the level of acoustic cue encoding or categorization, (2) whether it is driven by feedback from category-level processes or interactions between cues, and (3) whether listeners discard fine-grained acoustic information to compensate for contextual variability. Separating the effects of cue- and category-level processing has been difficult because behavioral measures tap processes that occur well after initial cue encoding and are influenced by task demands and linguistic information. Recently, we have used the event-related brain potential (ERP) technique to examine cue encoding and online categorization. Specifically, we have looked at differences in the auditory N1 as a measure of acoustic cue encoding and the P3 as a measure of categorization. This allows us to examine multiple levels of processing during speech perception and can provide a useful tool for studying effects of contextual variability. Here, I apply this approach to determine the point in processing at which context has an effect on speech perception and to examine whether acoustic cues are encoded continuously. Several types of contextual variability (talker gender, speaking rate, and coarticulation), as well as several acoustic cues (voice onset time, formant frequencies, and bandwidths), are examined in a series of experiments. 
The results suggest that (1) at early stages of speech processing, listeners encode continuous differences in acoustic cues, independent of phonological categories; (2) at post-perceptual stages, fine-grained acoustic information is preserved; and (3) there is preliminary evidence that listeners encode cues relative to context via feedback from categories. These results are discussed in relation to proposed models of speech perception and sources of contextual variability.

  5. A quality by design study applied to an industrial pharmaceutical fluid bed granulation.

    PubMed

    Lourenço, Vera; Lochmann, Dirk; Reich, Gabriele; Menezes, José C; Herdling, Thorsten; Schewitz, Jens

    2012-06-01

The pharmaceutical industry is encouraged within Quality by Design (QbD) to apply science-based manufacturing principles to assure quality not only of new but also of existing processes. This paper presents how QbD principles can be applied to an existing industrial pharmaceutical fluid bed granulation (FBG) process. A three-step approach is presented as follows: (1) implementation of Process Analytical Technology (PAT) monitoring tools at the industrial scale process, combined with multivariate data analysis (MVDA) of process and PAT data to increase the process knowledge; (2) execution of scaled-down designed experiments at a pilot scale, with adequate PAT monitoring tools, to investigate the process response to intended changes in Critical Process Parameters (CPPs); and finally (3) the definition of a process Design Space (DS) linking CPPs to Critical to Quality Attributes (CQAs), within which product quality is ensured by design, and after scale-up enabling its use at the industrial process scale. The proposed approach was developed for an existing industrial process. Through the enhanced process knowledge established, a significant reduction in the variability of product CQAs, already within quality specification ranges, was achieved by a better choice of CPP values. The results of such step-wise development and implementation are described. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Improvements in sub-grid, microphysics averages using quadrature based approaches

    NASA Astrophysics Data System (ADS)

    Chowdhary, K.; Debusschere, B.; Larson, V. E.

    2013-12-01

    Sub-grid variability in microphysical processes plays a critical role in atmospheric climate models. In order to account for this sub-grid variability, Larson and Schanen (2013) propose placing a probability density function on the sub-grid cloud microphysics quantities, e.g. autoconversion rate, essentially interpreting the cloud microphysics quantities as a random variable in each grid box. Random sampling techniques, e.g. Monte Carlo and Latin Hypercube, can be used to calculate statistics, e.g. averages, on the microphysics quantities, which then feed back into the model dynamics on the coarse scale. We propose an alternate approach using numerical quadrature methods based on deterministic sampling points to compute the statistical moments of microphysics quantities in each grid box. We have performed a preliminary test on the Kessler autoconversion formula, and, upon comparison with Latin Hypercube sampling, our approach shows an increased level of accuracy with a reduction in sample size by almost two orders of magnitude. Application to other microphysics processes is the subject of ongoing research.
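    The contrast drawn above between random sampling and deterministic quadrature can be illustrated with a minimal sketch: a 3-point Gauss-Hermite rule versus plain Monte Carlo for the grid-box mean of a Kessler-like autoconversion rate, assuming a Gaussian sub-grid distribution. The rate constant, threshold, and moments below are invented for illustration and are not taken from the paper:

    ```python
    import math
    import random

    # 3-point Gauss-Hermite rule for the weight e^{-t^2} (standard tabulated values)
    GH_NODES = [-math.sqrt(1.5), 0.0, math.sqrt(1.5)]
    GH_WEIGHTS = [math.sqrt(math.pi) / 6, 2 * math.sqrt(math.pi) / 3, math.sqrt(math.pi) / 6]

    def gauss_hermite_mean(f, mu, sigma):
        """E[f(X)] for X ~ N(mu, sigma^2): the substitution x = mu + sqrt(2)*sigma*t
        maps the Gaussian density onto the e^{-t^2} quadrature weight."""
        return sum(w * f(mu + math.sqrt(2) * sigma * t)
                   for t, w in zip(GH_NODES, GH_WEIGHTS)) / math.sqrt(math.pi)

    def monte_carlo_mean(f, mu, sigma, n, seed=0):
        """Random-sampling estimate of the same expectation."""
        rng = random.Random(seed)
        return sum(f(rng.gauss(mu, sigma)) for _ in range(n)) / n

    def autoconversion(q, k=1e-3, q_crit=0.5):
        """Hypothetical Kessler-like rate: proportional to cloud water above a threshold."""
        return k * max(0.0, q - q_crit)

    mean_q, sd_q = 0.6, 0.2  # illustrative sub-grid cloud water moments (g/kg)
    print("quadrature :", gauss_hermite_mean(autoconversion, mean_q, sd_q))
    print("monte carlo:", monte_carlo_mean(autoconversion, mean_q, sd_q, 10000))
    ```

    For smooth integrands the 3-point rule is exact up to degree-5 polynomials, which is why a handful of deterministic nodes can match thousands of random samples; the kink at `q_crit` is what limits the accuracy here and motivates higher-order rules in practice.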

  7. Recent updates in developing a statistical pseudo-dynamic source-modeling framework to capture the variability of earthquake rupture scenarios

    NASA Astrophysics Data System (ADS)

    Song, Seok Goo; Kwak, Sangmin; Lee, Kyungbook; Park, Donghee

    2017-04-01

Predicting the intensity and variability of strong ground motions is a critical element of seismic hazard assessment. The characteristics and variability of earthquake rupture process may be a dominant factor in determining the intensity and variability of near-source strong ground motions. Song et al. (2014) demonstrated that the variability of earthquake rupture scenarios could be effectively quantified in the framework of 1-point and 2-point statistics of earthquake source parameters, constrained by rupture dynamics and past events. The developed pseudo-dynamic source modeling schemes were also validated against the recorded ground motion data of past events and empirical ground motion prediction equations (GMPEs) at the broadband platform (BBP) developed by the Southern California Earthquake Center (SCEC). Recently we improved the computational efficiency of the developed pseudo-dynamic source-modeling scheme by adopting the nonparametric co-regionalization algorithm, introduced and applied initially in geostatistics. We also investigated the effect of earthquake rupture process on near-source ground motion characteristics in the framework of 1-point and 2-point statistics, particularly focusing on the forward directivity region. Finally we will discuss whether the pseudo-dynamic source modeling can reproduce the variability (standard deviation) of empirical GMPEs and the efficiency of 1-point and 2-point statistics to address the variability of ground motions.

  8. Punishment induced behavioural and neurophysiological variability reveals dopamine-dependent selection of kinematic movement parameters

    PubMed Central

    Galea, Joseph M.; Ruge, Diane; Buijink, Arthur; Bestmann, Sven; Rothwell, John C.

    2013-01-01

    Action selection describes the high-level process which selects between competing movements. In animals, behavioural variability is critical for the motor exploration required to select the action which optimizes reward and minimizes cost/punishment, and is guided by dopamine (DA). The aim of this study was to test in humans whether low-level movement parameters are affected by punishment and reward in ways similar to high-level action selection. Moreover, we addressed the proposed dependence of behavioural and neurophysiological variability on DA, and whether this may underpin the exploration of kinematic parameters. Participants performed an out-and-back index finger movement and were instructed that monetary reward and punishment were based on its maximal acceleration (MA). In fact, the feedback was not contingent on the participant’s behaviour but pre-determined. Blocks highly-biased towards punishment were associated with increased MA variability relative to blocks with either reward or without feedback. This increase in behavioural variability was positively correlated with neurophysiological variability, as measured by changes in cortico-spinal excitability with transcranial magnetic stimulation over the primary motor cortex. Following the administration of a DA-antagonist, the variability associated with punishment diminished and the correlation between behavioural and neurophysiological variability no longer existed. Similar changes in variability were not observed when participants executed a pre-determined MA, nor did DA influence resting neurophysiological variability. Thus, under conditions of punishment, DA-dependent processes influence the selection of low-level movement parameters. We propose that the enhanced behavioural variability reflects the exploration of kinematic parameters for less punishing, or conversely more rewarding, outcomes. PMID:23447607

  9. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and facilitate the comparison of analyses and results between different studies. We conclude with challenges for the evolving science of variability analysis.
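    Two of the simplest measures from the statistical domain of the classification above can be sketched directly; the RR-interval series below is hypothetical, purely for illustration:

    ```python
    import math

    def sd(xs):
        """Sample standard deviation: overall dispersion of the time-series."""
        m = sum(xs) / len(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

    def rmssd(xs):
        """Root mean square of successive differences: a classic
        statistical-domain variability measure sensitive to beat-to-beat change."""
        diffs = [b - a for a, b in zip(xs, xs[1:])]
        return math.sqrt(sum(d * d for d in diffs) / len(diffs))

    def cv(xs):
        """Coefficient of variation: dispersion normalized by the mean,
        allowing comparison across patients with different baselines."""
        return sd(xs) / (sum(xs) / len(xs))

    # Hypothetical RR-interval series (ms)
    rr = [812, 797, 825, 803, 818, 790, 806]
    print(f"SD = {sd(rr):.1f} ms, RMSSD = {rmssd(rr):.1f} ms, CV = {cv(rr):.3f}")
    ```

    Measures from the other domains the review identifies (geometric, energetic, informational, invariant) generally require a transform of the series first, e.g. a spectral or entropy estimate, which is the "process of calculation" the abstract alludes to.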

  10. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and facilitate the comparison of analyses and results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  11. Word type effects in false recall: concrete, abstract, and emotion word critical lures.

    PubMed

    Bauer, Lisa M; Olheiser, Erik L; Altarriba, Jeanette; Landi, Nicole

    2009-01-01

    Previous research has demonstrated that definable qualities of verbal stimuli have implications for memory. For example, the distinction between concrete and abstract words has led to the finding that concrete words have an advantage in memory tasks (i.e., the concreteness effect). However, other word types, such as words that label specific human emotions, may also affect memory processes. This study examined the effects of word type on the production of false memories by using a list-learning false memory paradigm. Participants heard lists of words that were highly associated to nonpresented concrete, abstract, or emotion words (i.e., the critical lures) and then engaged in list recall. Emotion word critical lures were falsely recalled at a significantly higher rate (with the effect carried by the positively valenced critical lures) than concrete and abstract critical lures. These findings suggest that the word type variable has implications for our understanding of the mechanisms that underlie recall and false recall.

  12. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field there still exist fundamental needs for considering pre-analytical variability that can introduce bias to the subsequent analytical process and decrease the reliability of the results and moreover confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Model based estimation of sediment erosion in groyne fields along the River Elbe

    NASA Astrophysics Data System (ADS)

    Prohaska, Sandra; Jancke, Thomas; Westrich, Bernhard

    2008-11-01

    River water quality is still a vital environmental issue, even though ongoing emissions of contaminants are being reduced in several European rivers. The mobility of historically contaminated deposits is a key issue in sediment management strategies and remediation planning. Resuspension of contaminated sediments impacts water quality and is thus important for river engineering and ecological rehabilitation. The erodibility of the sediments and associated contaminants is difficult to predict due to complex, time-dependent physical, chemical, and biological processes, as well as to the lack of information. Therefore, in engineering practice the values for erosion parameters are usually assumed to be constant despite their high spatial and temporal variability, which leads to large uncertainty in the erosion parameters. The goal of the presented study is to compare the deterministic approach, which assumes a constant critical erosion shear stress, with an innovative approach that treats the critical erosion shear stress as a random variable. Furthermore, the effective value of the critical erosion shear stress, its applicability in numerical models, and the erosion probability are estimated. The results presented here are based on field measurements and numerical modelling of the River Elbe groyne fields.

  14. Video Game Telemetry as a Critical Tool in the Study of Complex Skill Learning

    PubMed Central

    Thompson, Joseph J.; Blair, Mark R.; Chen, Lihan; Henrey, Andrew J.

    2013-01-01

    Cognitive science has long shown interest in expertise, in part because prediction and control of expert development would have immense practical value. Most studies in this area investigate expertise by comparing experts with novices. The reliance on contrastive samples in studies of human expertise only yields deep insight into development where differences are important throughout skill acquisition. This reliance may be pernicious where the predictive importance of variables is not constant across levels of expertise. Before the development of sophisticated machine learning tools for data mining larger samples, and indeed, before such samples were available, it was difficult to test the implicit assumption of static variable importance in expertise development. To investigate if this reliance may have imposed critical restrictions on the understanding of complex skill development, we adopted an alternative method, the online acquisition of telemetry data from a common daily activity for many: video gaming. Using measures of cognitive-motor, attentional, and perceptual processing extracted from game data from 3360 Real-Time Strategy players at 7 different levels of expertise, we identified 12 variables relevant to expertise. We show that the static variable importance assumption is false - the predictive importance of these variables shifted as the levels of expertise increased - and, at least in our dataset, that a contrastive approach would have been misleading. The finding that variable importance is not static across levels of expertise suggests that large, diverse datasets of sustained cognitive-motor performance are crucial for an understanding of expertise in real-world contexts. We also identify plausible cognitive markers of expertise. PMID:24058656

  15. Video game telemetry as a critical tool in the study of complex skill learning.

    PubMed

    Thompson, Joseph J; Blair, Mark R; Chen, Lihan; Henrey, Andrew J

    2013-01-01

    Cognitive science has long shown interest in expertise, in part because prediction and control of expert development would have immense practical value. Most studies in this area investigate expertise by comparing experts with novices. The reliance on contrastive samples in studies of human expertise only yields deep insight into development where differences are important throughout skill acquisition. This reliance may be pernicious where the predictive importance of variables is not constant across levels of expertise. Before the development of sophisticated machine learning tools for data mining larger samples, and indeed, before such samples were available, it was difficult to test the implicit assumption of static variable importance in expertise development. To investigate if this reliance may have imposed critical restrictions on the understanding of complex skill development, we adopted an alternative method, the online acquisition of telemetry data from a common daily activity for many: video gaming. Using measures of cognitive-motor, attentional, and perceptual processing extracted from game data from 3360 Real-Time Strategy players at 7 different levels of expertise, we identified 12 variables relevant to expertise. We show that the static variable importance assumption is false--the predictive importance of these variables shifted as the levels of expertise increased--and, at least in our dataset, that a contrastive approach would have been misleading. The finding that variable importance is not static across levels of expertise suggests that large, diverse datasets of sustained cognitive-motor performance are crucial for an understanding of expertise in real-world contexts. We also identify plausible cognitive markers of expertise.

  16. Modeling critical zone processes in intensively managed environments

    NASA Astrophysics Data System (ADS)

    Kumar, Praveen; Le, Phong; Woo, Dong; Yan, Qina

    2017-04-01

    Processes in the Critical Zone (CZ), which sustain terrestrial life, are tightly coupled across hydrological, physical, biochemical, and many other domains over both short and long timescales. In addition, vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with response to increased temperature and altered rainfall pattern, is expected to result in emergent behaviors in ecologic and hydrologic functions, subsequently controlling CZ processes. We hypothesize that the interplay between micro-topographic variability and these emergent behaviors will shape complex responses of a range of ecosystem dynamics within the CZ. Here, we develop a modeling framework ('Dhara') that explicitly incorporates micro-topographic variability based on lidar topographic data with coupling of multi-layer modeling of the soil-vegetation continuum and 3-D surface-subsurface transport processes to study ecological and biogeochemical dynamics. We further couple a C-N model with a physically based hydro-geomorphologic model to quantify (i) how topographic variability controls the spatial distribution of soil moisture, temperature, and biogeochemical processes, and (ii) how farming activities modify the interaction between soil erosion and soil organic carbon (SOC) dynamics. To address the intensive computational demand from high-resolution modeling at lidar data scale, we use a hybrid CPU-GPU parallel computing architecture run over large supercomputing systems for simulations. Our findings indicate that rising CO2 concentration and air temperature have opposing effects on soil moisture, surface water and ponding in topographic depressions. Further, the relatively higher soil moisture and lower soil temperature contribute to decreased soil microbial activities in the low-lying areas due to anaerobic conditions and reduced temperatures. 
These reduced microbial processes lower nitrification rates, resulting in relatively lower nitrate concentrations. Results from the geomorphologic model also suggest that soil erosion and deposition play a dominant role in SOC dynamics both above- and below-ground. In addition, tillage can change the amplitude and frequency of C-N oscillations. This work sheds light on practical means for reducing soil erosion and carbon loss when the landscape is affected by human activities.

  17. A method for developing outcome measures in the clinical laboratory.

    PubMed

    Jones, J

    1996-01-01

    Measuring and reporting outcomes in health care is becoming more important for quality assessment, utilization assessment, accreditation standards, and negotiating contracts in managed care. How does one develop an outcome measure for the laboratory to assess the value of the services? A method is described which outlines seven steps in developing outcome measures for a laboratory service or process. These steps include the following: 1. Identify the process or service to be monitored for performance and outcome assessment. 2. If necessary, form a multidisciplinary team of laboratory staff, other department staff, physicians, and pathologists. 3. State the purpose of the test or service including a review of published data for the clinical pathological correlation. 4. Prepare a process cause and effect diagram including steps critical to the outcome. 5. Identify key process variables that contribute to positive or negative outcomes. 6. Identify outcome measures that are not process measures. 7. Develop an operational definition, identify data sources, and collect data. Examples, including a process cause and effect diagram, process variables, and outcome measures, are given using the Therapeutic Drug Monitoring (TDM) service. A summary of conclusions and precautions for outcome measurement is then provided.

  18. Indian monsoon variability on millennial-orbital timescales.

    PubMed

    Kathayat, Gayatri; Cheng, Hai; Sinha, Ashish; Spötl, Christoph; Edwards, R Lawrence; Zhang, Haiwei; Li, Xianglei; Yi, Liang; Ning, Youfeng; Cai, Yanjun; Liu, Weiguo; Breitenbach, Sebastian F M

    2016-04-13

    The Indian summer monsoon (ISM) is critical to billions of people living in the region. Yet, significant debates remain on primary ISM drivers on millennial-orbital timescales. Here, we use speleothem oxygen isotope (δ(18)O) data from Bittoo cave, Northern India, to reconstruct ISM variability over the past 280,000 years. We find strong coherence between North Indian and Chinese speleothem δ(18)O records from the East Asian monsoon domain, suggesting that both Asian monsoon subsystems exhibit a coupled response to changes in Northern Hemisphere summer insolation (NHSI) without significant temporal lags, supporting the view that the tropical-subtropical monsoon variability is driven directly by precession-induced changes in NHSI. Comparisons of the North Indian record with both Antarctic ice core and sea-surface temperature records from the southern Indian Ocean over the last glacial period do not suggest a dominant role of Southern Hemisphere climate processes in regulating the ISM variability on millennial-orbital timescales.

  19. Density profiles of the exclusive queuing process

    NASA Astrophysics Data System (ADS)

    Arita, Chikashi; Schadschneider, Andreas

    2012-12-01

    The exclusive queuing process (EQP) incorporates the exclusion principle into classic queuing models. It is characterized by, in addition to the entrance probability α and exit probability β, a third parameter: the hopping probability p. The EQP can be interpreted as an exclusion process of variable system length. Its phase diagram in the parameter space (α,β) is divided into a convergent phase and a divergent phase by a critical line which consists of a curved part and a straight part. Here we extend previous studies of this phase diagram. We identify subphases in the divergent phase, which can be distinguished by means of the shape of the density profile, and determine the velocity of the system length growth. This is done for EQPs with different update rules (parallel, backward sequential and continuous time). We also investigate the dynamics of the system length and the number of customers on the critical line. They are diffusive or subdiffusive with non-universal exponents that also depend on the update rules.
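The parallel-update dynamics summarized above can be sketched directly. The following is a minimal illustrative simulation, not the authors' code: the occupancy-list representation, the trimming of trailing empty sites, and all parameter values are assumptions of this sketch. Site 0 is the service (exit) site, customers exit there with probability β, hop forward with probability p under the exclusion rule, and arrive behind the last customer with probability α.

```python
import random

def eqp_step(occ, alpha, beta, p, rng):
    """One parallel update of an exclusive queuing process sketch.

    occ is a list of 0/1 occupancies; occ[0] is the exit site, and the
    list is truncated at the last customer (its length is the system
    length). All hop/exit decisions use the OLD configuration, as in a
    parallel (synchronous) update.
    """
    n = len(occ)
    new = occ[:]
    # Exit from the front with probability beta.
    if n > 0 and occ[0] == 1 and rng.random() < beta:
        new[0] = 0
    # Hops toward the front with probability p (exclusion: target must
    # be empty in the old configuration).
    for i in range(1, n):
        if occ[i] == 1 and occ[i - 1] == 0 and rng.random() < p:
            new[i] = 0
            new[i - 1] = 1
    # Arrival behind the last customer with probability alpha.
    if rng.random() < alpha:
        new.append(1)
    # System length is the position of the last customer, so drop
    # trailing empty sites.
    while new and new[-1] == 0:
        new.pop()
    return new
```

Iterating this map with α well above β drives the divergent phase (the queue length grows steadily), while α well below β keeps the system in the convergent phase with a short, fluctuating queue.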

  20. Variable and complex food web structures revealed by exploring missing trophic links between birds and biofilm.

    PubMed

    Kuwae, Tomohiro; Miyoshi, Eiichi; Hosokawa, Shinya; Ichimi, Kazuhiko; Hosoya, Jun; Amano, Tatsuya; Moriya, Toshifumi; Kondoh, Michio; Ydenberg, Ronald C; Elner, Robert W

    2012-04-01

    Food webs comprise a network of trophic interactions and are essential to elucidating ecosystem processes and functions. However, the presence of unknown but critical networks hampers understanding of complex and dynamic food webs in nature. Here, we empirically demonstrate a missing link, both critical and variable, by revealing that direct predator-prey relationships between shorebirds and biofilm are widespread and mediated by multiple ecological and evolutionary determinants. Food source mixing models and energy budget estimates indicate that the strength of the missing linkage is dependent on predator traits (body mass and foraging action rate) and the environment that determines food density. Morphological analyses, showing that smaller bodied species possess more developed feeding apparatus to consume biofilm, suggest that the linkage is also phylogenetically dependent and affords a compelling re-interpretation of niche differentiation. We contend that exploring missing links is a necessity for revealing true network structure and dynamics. © 2012 Blackwell Publishing Ltd/CNRS.

  1. Evaluation of Critical Thinking in Higher Education in Oman

    ERIC Educational Resources Information Center

    Kumar.R, Renjith; James, Rajani

    2015-01-01

    The study aims to identify the level of critical thinking of students in higher education. It focuses on evaluating the level of critical thinking variables among the students of Nizwa College of Technology and on determining whether these variables are influenced by gender and department. The data for the research is collected from 281 diploma…

  2. Investigating Pre-Service Science Teachers' Critical Thinking Dispositions and Problem Solving Skills in Terms of Different Variables

    ERIC Educational Resources Information Center

    Yenice, Nilgun

    2011-01-01

    This study was conducted to examine pre-service science teachers' critical thinking dispositions and problem solving skills based on gender, grade level and graduated high school variables. The relationship between pre-service science teachers' critical thinking dispositions and problem solving skills was also examined based on gender, grade level and…

  3. Epigenetic modification of the oxytocin receptor gene influences the perception of anger and fear in the human brain

    PubMed Central

    Puglia, Meghan H.; Lillard, Travis S.; Morris, James P.; Connelly, Jessica J.

    2015-01-01

    In humans, the neuropeptide oxytocin plays a critical role in social and emotional behavior. The actions of this molecule are dependent on a protein that acts as its receptor, which is encoded by the oxytocin receptor gene (OXTR). DNA methylation of OXTR, an epigenetic modification, directly influences gene transcription and is variable in humans. However, the impact of this variability on specific social behaviors is unknown. We hypothesized that variability in OXTR methylation impacts social perceptual processes often linked with oxytocin, such as perception of facial emotions. Using an imaging epigenetic approach, we established a relationship between OXTR methylation and neural activity in response to emotional face processing. Specifically, high levels of OXTR methylation were associated with greater amounts of activity in regions associated with face and emotion processing including amygdala, fusiform, and insula. Importantly, we found that these higher levels of OXTR methylation were also associated with decreased functional coupling of amygdala with regions involved in affect appraisal and emotion regulation. These data indicate that the human endogenous oxytocin system is involved in attenuation of the fear response, corroborating research implicating intranasal oxytocin in the same processes. Our findings highlight the importance of including epigenetic mechanisms in the description of the endogenous oxytocin system and further support a central role for oxytocin in social cognition. This approach linking epigenetic variability with neural endophenotypes may broadly explain individual differences in phenotype including susceptibility or resilience to disease. PMID:25675509

  4. The Importance of Freshwater to Spatial Variability of Aragonite Saturation State in the Gulf of Alaska

    NASA Astrophysics Data System (ADS)

    Siedlecki, Samantha A.; Pilcher, Darren J.; Hermann, Albert J.; Coyle, Ken; Mathis, Jeremy

    2017-11-01

    High-latitude and subpolar regions like the Gulf of Alaska (GOA) are more vulnerable than equatorial regions to rising carbon dioxide (CO2) levels, in part due to local processes that amplify the global signal. Recent field observations have shown that the shelf of the GOA is currently experiencing seasonal corrosive events (carbonate mineral saturation states Ω, Ω < 1), including suppressed Ω in response to ocean acidification as well as local processes like increased low-alkalinity glacial meltwater discharge. While the glacial discharge mainly influences the inner shelf, on the outer shelf, upwelling brings corrosive waters from the deep GOA. In this work, we develop a high-resolution model for carbon dynamics in the GOA, identify regions of high variability of Ω, and test the sensitivity of those regions to changes in the chemistry of glacial meltwater discharge. Results indicate the importance of this climatically sensitive and relatively unconstrained regional freshwater forcing for Ω variability in the nearshore. The increase was nearly linear at 0.002 Ω per 100 µmol/kg increase in alkalinity in the freshwater runoff. We find that the local winds, biological processes, and freshwater forcing all contribute to the spatial distribution of Ω and identify which of these three is highly correlated to the variability in Ω. Given that the timing and magnitude of these processes will likely change during the next few decades, it is critical to elucidate the effect of local processes on the background ocean acidification signal using robust models, such as the one described here.
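The nearly linear runoff sensitivity reported above is simple enough to state as a one-line function. This is an illustrative sketch (the function name and argument framing are mine); the slope of 0.002 Ω per 100 µmol/kg of added alkalinity is taken from the abstract.

```python
def omega_increase(delta_alk_umol_per_kg, slope=0.002, per=100.0):
    """Nearly linear response reported in the study: ~0.002 Omega per
    100 umol/kg of alkalinity added to the freshwater runoff."""
    return slope * delta_alk_umol_per_kg / per
```

For example, raising runoff alkalinity by 500 µmol/kg would shift Ω by roughly 0.01 under this linearization.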

  5. Mechanism and simulation of droplet coalescence in molten steel

    NASA Astrophysics Data System (ADS)

    Ni, Bing; Zhang, Tao; Ni, Hai-qi; Luo, Zhi-guo

    2017-11-01

    Droplet coalescence in liquid steel was carefully investigated through observations of the distribution pattern of inclusions in solidified steel samples. The process of droplet coalescence was slow, and the critical Weber number (We) was used to evaluate the coalescence or separation of droplets. The relationship between the collision parameter and the critical We indicated whether slow coalescence or bouncing of droplets occurred. The critical We was 5.5, which means that the droplets gradually coalesce when We ≤ 5.5, whereas they bounce when We > 5.5. For the carbonate wire feeding into liquid steel, a mathematical model implementing a combined computational fluid dynamics (CFD)-discrete element method (DEM) approach was developed to simulate the movement and coalescence of variably sized droplets in a bottom-argon-blowing ladle. In the CFD model, the flow field was solved on the premise that the fluid was a continuous medium. Meanwhile, the droplets were dispersed in the DEM model, and the coalescence criterion of the particles was added to simulate the collision-coalescence process of the particles. The numerical simulation results and observations of inclusion coalescence in steel samples are consistent.
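The coalescence criterion above translates into a small predicate. The Weber number here uses its standard definition We = ρ v² d / σ (density, relative velocity, characteristic diameter, surface tension); the 5.5 threshold is the value reported in the abstract, and the function names are mine.

```python
def weber_number(rho, v_rel, d, sigma):
    """Standard Weber number: inertia vs. surface tension,
    We = rho * v_rel**2 * d / sigma."""
    return rho * v_rel ** 2 * d / sigma

def droplets_coalesce(we, we_crit=5.5):
    """Criterion from the study: gradual coalescence for We <= 5.5,
    bouncing for We > 5.5."""
    return we <= we_crit
```

A slow collision between small droplets yields a small We and hence predicted coalescence, while fast collisions push We past the critical value and predict bouncing.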

  6. Effectiveness of Formal Logic Course on the Reasoning Skills of Students in Nizwa College of Technology, Oman

    ERIC Educational Resources Information Center

    Kumar, R. Renjith

    2017-01-01

    The study of formal logic helps to improve the process of thinking and tries to refine and improve the thinking ability. The objectives of this study are to assess the effectiveness of the formal logic course and to determine which critical thinking variables it improves and which it does not. A sample of 214 students is selected from all the…

  7. Bio-Optics of the Chesapeake Bay from Measurements and Radiative Transfer Calculations

    NASA Technical Reports Server (NTRS)

    Tzortziou, Maria; Herman, Jay R.; Gallegos, Charles L.; Neale, Patrick J.; Subramaniam, Ajit; Harding, Lawrence W., Jr.; Ahmad, Ziauddin

    2005-01-01

    We combined detailed bio-optical measurements and radiative transfer (RT) modeling to perform an optical closure experiment for optically complex and biologically productive Chesapeake Bay waters. We used this experiment to evaluate certain assumptions commonly used when modeling bio-optical processes, and to investigate the relative importance of several optical characteristics needed to accurately model and interpret remote sensing ocean-color observations in these Case 2 waters. Direct measurements were made of the magnitude, variability, and spectral characteristics of backscattering and absorption that are critical for accurate parameterizations in satellite bio-optical algorithms and underwater RT simulations. We found that the ratio of backscattering to total scattering in the mid-mesohaline Chesapeake Bay varied considerably depending on particulate loading, distance from land, and mixing processes, and had an average value of 0.0128 at 530 nm. Incorporating information on the magnitude, variability, and spectral characteristics of particulate backscattering into the RT model, rather than using a volume scattering function commonly assumed for turbid waters, was critical to obtaining agreement between RT calculations and measured radiometric quantities. In situ measurements of absorption coefficients need to be corrected for systematic overestimation due to scattering errors, and this correction commonly employs the assumption that absorption by particulate matter at near infrared wavelengths is zero.

  8. Determination of Optimal Subsidy for Materials Saving Investment through Recycle/Recovery at Industrial Level

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.

    2009-08-01

    This work deals with a methodological framework under the form of a simple/short algorithmic procedure (including 11 activity steps and 3 decision nodes) designed/developed for the determination of optimal subsidy for materials saving investment through recycle/recovery (RR) at industrial level. Two case examples are presented, covering both aspects, without and with recycling. The expected Relative Cost Decrease (RCD) because of recycling, which forms a critical index for decision making on subsidizing, is estimated. The developed procedure can be extended outside the industrial unit to include collection/transportation/processing of recyclable wasted products. Since, in such a case, transportation cost and processing cost are conflicting variables dependent on the quantity collected/processed Q (the independent/explanatory variable), the determination of Qopt is examined under energy crisis conditions, when corresponding subsidies might be granted to re-establish the original equilibrium and avoid putting the recycling enterprise in jeopardy due to a dangerous lowering of the first break-even point.
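The conflict between the two Q-dependent costs can be illustrated with assumed functional forms, which are not from the paper: here transport unit cost grows linearly with Q (a wider collection area) while processing unit cost falls as 1/Q (economies of scale), so the total unit cost has an interior minimum.

```python
def unit_cost(Q, a=0.02, b=50.0):
    """Illustrative total unit cost: a*Q models transport cost growing
    with the collected quantity, b/Q models processing cost falling
    with throughput. Coefficients a and b are assumed for illustration."""
    return a * Q + b / Q

def q_opt(a=0.02, b=50.0):
    """Analytic minimiser of a*Q + b/Q: setting d/dQ = a - b/Q**2 = 0
    gives Q = sqrt(b/a)."""
    return (b / a) ** 0.5
```

With these assumed coefficients the optimum falls at Q = sqrt(50/0.02) = 50; a subsidy that shifts a or b (e.g., fuel costs during an energy crisis) shifts Qopt accordingly.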

  9. Assessing technical performance in differential gene expression experiments with external spike-in RNA control ratio mixtures.

    PubMed

    Munro, Sarah A; Lund, Steven P; Pine, P Scott; Binder, Hans; Clevert, Djork-Arné; Conesa, Ana; Dopazo, Joaquin; Fasold, Mario; Hochreiter, Sepp; Hong, Huixiao; Jafari, Nadereh; Kreil, David P; Łabaj, Paweł P; Li, Sheng; Liao, Yang; Lin, Simon M; Meehan, Joseph; Mason, Christopher E; Santoyo-Lopez, Javier; Setterquist, Robert A; Shi, Leming; Shi, Wei; Smyth, Gordon K; Stralis-Pavese, Nancy; Su, Zhenqiang; Tong, Weida; Wang, Charles; Wang, Jian; Xu, Joshua; Ye, Zhan; Yang, Yong; Yu, Ying; Salit, Marc

    2014-09-25

    There is a critical need for standard approaches to assess, report and compare the technical performance of genome-scale differential gene expression experiments. Here we assess technical performance with a proposed standard 'dashboard' of metrics derived from analysis of external spike-in RNA control ratio mixtures. These control ratio mixtures with defined abundance ratios enable assessment of diagnostic performance of differentially expressed transcript lists, limit of detection of ratio (LODR) estimates and expression ratio variability and measurement bias. The performance metrics suite is applicable to analysis of a typical experiment, and here we also apply these metrics to evaluate technical performance among laboratories. An interlaboratory study using identical samples shared among 12 laboratories with three different measurement processes demonstrates generally consistent diagnostic power across 11 laboratories. Ratio measurement variability and bias are also comparable among laboratories for the same measurement process. We observe different biases for measurement processes using different mRNA-enrichment protocols.

  10. Downstream processing from melt granulation towards tablets: In-depth analysis of a continuous twin-screw melt granulation process using polymeric binders.

    PubMed

    Grymonpré, W; Verstraete, G; Vanhoorne, V; Remon, J P; De Beer, T; Vervaet, C

    2018-03-01

    The concept of twin-screw melt granulation (TSMG) has steadily (re)-gained interest in pharmaceutical formulation development as an intermediate step during tablet manufacturing. However, to be considered as a viable processing option for solid oral dosage forms there is a need to understand all critical sources of variability which could affect this granulation technique. The purpose of this study was to provide an in-depth analysis of the continuous TSMG process in order to expose the critical process parameters (CPP) and elucidate the impact of process and formulation parameters on the critical quality attributes (CQA) of granules and tablets during continuous TSMG. A first part of the study dealt with the screening of various amorphous polymers as binder for producing high-dosed melt granules of two model drugs (i.e. acetaminophen and hydrochlorothiazide). The second part of this study described a quality-by-design (QbD) approach for melt granulation of hydrochlorothiazide in order to thoroughly evaluate the TSMG, milling and tableting stages of the continuous TSMG line. Using amorphous polymeric binders resulted in melt granules with high milling efficiency due to their brittle behaviour without producing excessive amounts of fines, providing high granule yields with low friability. This makes them extremely suitable for further downstream processing. One of the most important CPPs during TSMG with polymeric binders was the granulation torque, which - in the case of polymers with high Tg - increased during longer granulation runs to critical levels, endangering the continuous process flow. However, by optimizing both screw speed and throughput or changing to polymeric binders with lower Tg it was possible to significantly reduce this risk. This research paper highlighted that TSMG must be considered as a viable option during formulation development of solid oral dosage forms based on the robustness of the CQAs of both melt granules and tablets. 
Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Real time monitoring of powder blend bulk density for coupled feed-forward/feed-back control of a continuous direct compaction tablet manufacturing process.

    PubMed

    Singh, Ravendra; Román-Ospino, Andrés D; Romañach, Rodolfo J; Ierapetritou, Marianthi; Ramachandran, Rohit

    2015-11-10

    The pharmaceutical industry is strictly regulated, where precise and accurate control of the end product quality is necessary to ensure the effectiveness of the drug products. For such control, the process and raw materials variability ideally need to be fed forward in real time into an automatic control system so that proactive action can be taken before it can affect the end product quality. Variations in raw material properties (e.g., particle size), feeder hopper level, amount of lubrication, milling and blending action, and applied shear in different processing stages can affect the blend density significantly and thereby tablet weight, hardness and dissolution. Therefore, real time monitoring of powder bulk density variability and its incorporation into the automatic control system, so that its effect can be mitigated proactively and efficiently, is highly desired. However, real time monitoring of powder bulk density is still a challenging task because of several levels of complexity. In this work, powder bulk density, which has a significant effect on the critical quality attributes (CQAs), has been monitored in real time in a pilot-plant facility, using a NIR sensor. The sensitivity of the powder bulk density to critical process parameters (CPPs) and CQAs has been analyzed, and a feed-forward controller has been designed accordingly. The measured signal can be used for feed-forward control so that the corrective actions on the density variations can be taken before they can influence the product quality. The coupled feed-forward/feed-back control system demonstrates improved control performance and improvements in the final product quality in the presence of process and raw material variations. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Mixed-order phase transition of the contact process near multiple junctions.

    PubMed

    Juhász, Róbert; Iglói, Ferenc

    2017-02-01

    We have studied the phase transition of the contact process near a multiple junction of M semi-infinite chains by Monte Carlo simulations. As opposed to the continuous transitions of the translationally invariant (M=2) and semi-infinite (M=1) system, the local order parameter is found to be discontinuous for M>2. Furthermore, the temporal correlation length diverges algebraically as the critical point is approached, but with different exponents on the two sides of the transition. In the active phase, the estimate is compatible with the bulk value, while in the inactive phase it exceeds the bulk value and increases with M. The unusual local critical behavior is explained by a scaling theory with an irrelevant variable, which becomes dangerous in the inactive phase. Quenched spatial disorder is found to make the transition continuous in agreement with earlier renormalization group results.
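The bulk dynamics underlying this model can be sketched with a simplified random-sequential Monte Carlo of the one-dimensional contact process. This is an illustrative sketch only: it uses a single periodic chain rather than the junction geometry (M semi-infinite chains) studied in the paper, and all sizes and rates are assumptions. Each active site recovers at rate 1 and infects a random neighbor at rate λ, which is implemented by picking an active site and recovering with probability 1/(1+λ).

```python
import random

def contact_process(L, lam, steps, rng):
    """Simplified 1D contact process on a periodic chain of L sites.

    Starts fully active; each event picks an active site uniformly and
    either recovers it (prob 1/(1+lam)) or attempts to infect a random
    neighbor (no-op if the neighbor is already active). Returns the
    final density of active sites.
    """
    active = set(range(L))  # start from the fully active configuration
    for _ in range(steps):
        if not active:
            break  # absorbing (inactive) state reached
        site = rng.choice(tuple(active))
        if rng.random() < 1.0 / (1.0 + lam):
            active.discard(site)                    # recovery
        else:
            nb = (site + rng.choice((-1, 1))) % L   # infection attempt
            active.add(nb)
    return len(active) / L
```

Well below the bulk critical point (λc ≈ 3.3 for this update scheme) the process falls into the absorbing state, while well above it a finite density of active sites survives for very long times.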

  13. Measurement and analysis of critical crack tip processes during fatigue crack growth

    NASA Technical Reports Server (NTRS)

    Davidson, D. L.; Hudak, S. J.; Dexter, R. J.

    1985-01-01

    The mechanics of fatigue crack growth under constant-amplitude and variable-amplitude loading were examined. Critical loading histories involving relatively simple overload and overload/underload cycles were studied to provide a basic understanding of the underlying physical processes controlling crack growth. The material used for this study was 7091-T7E69, a powder metallurgy aluminum alloy. Local crack-tip parameters were measured at various times before, during, and after the overloads; these include crack-tip opening loads and displacements, and crack-tip strain fields. The latter were used, in combination with the material's cyclic and monotonic stress-strain properties, to compute crack-tip residual stresses. The experimental results are also compared with analytical predictions obtained using the FAST-2 computer code. The sensitivity of the analytical model to constant-amplitude fatigue crack growth rate properties and to through-thickness constraint is also studied.

  14. Risk analysis of the thermal sterilization process. Analysis of factors affecting the thermal resistance of microorganisms.

    PubMed

    Akterian, S G; Fernandez, P S; Hendrickx, M E; Tobback, P P; Periago, P M; Martinez, A

    1999-03-01

    A risk analysis was applied to experimental heat resistance data. This analysis is an approach for processing experimental thermobacteriological data in order to study the variability of the D and z values of target microorganisms over the range of deviations of environmental factors, to determine the critical factors, and to specify their critical tolerances. It is based on sets of sensitivity functions applied to a specific case of experimental data on the thermoresistance of Clostridium sporogenes and Bacillus stearothermophilus spores. The effects of the following factors were analyzed: the type of target microorganism; the nature of the heating substrate; pH; temperature; the type of acid employed; and NaCl concentration. The type of target microorganism to be inactivated, the nature of the substrate (reference or real food) and the heating temperature were identified as critical factors, determining about 90% of the alteration of the microbiological risk. The effects of the type of acid used for the acidification of products and of the NaCl concentration can be treated as negligible for the purposes of engineering calculations. The critical non-uniformity in temperature during thermobacteriological studies was set at 0.5%, and the critical tolerances of pH value and NaCl concentration at 5%. These results relate to a specific case study, so they cannot be directly generalized.
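    The D and z values at the heart of this kind of analysis obey the classical log-linear (Bigelow) relationship: a z-degree temperature change alters D tenfold. A minimal sketch, with placeholder numbers rather than the paper's data:

```python
def d_value(t, d_ref=1.5, t_ref=121.1, z=10.0):
    """D value (minutes per decimal reduction) at temperature t (degC),
    extrapolated from a reference D at t_ref via the Bigelow model:
    D(t) = D_ref * 10**((t_ref - t) / z). Numbers are illustrative."""
    return d_ref * 10 ** ((t_ref - t) / z)

def log_reductions(t, hold_min, **kw):
    """log10 cycles of microbial inactivation after holding
    hold_min minutes at constant temperature t."""
    return hold_min / d_value(t, **kw)
```

This makes the sensitivity argument concrete: a 10 degC drop in process temperature (one z value here) stretches D by a factor of ten, which is why the heating temperature dominates the microbiological risk while small NaCl or pH deviations barely move it.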

  15. So Many Brands and Varieties to Choose from: Does This Compromise the Control of Food Intake in Humans?

    PubMed

    Hardman, Charlotte A; Ferriday, Danielle; Kyle, Lesley; Rogers, Peter J; Brunstrom, Jeffrey M

    2015-01-01

    The recent rise in obesity is widely attributed to changes in the dietary environment (e.g., increased availability of energy-dense foods and larger portion sizes). However, a critical feature of our "obesogenic environment" may have been overlooked: the dramatic increase in "dietary variability" (the tendency for specific mass-produced foods to be available in numerous varieties that differ in energy content). In this study we tested the hypothesis that dietary variability compromises the control of food intake in humans. Specifically, we examined the effects of dietary variability in pepperoni pizza on two key outcome variables: (i) compensation for calories in pepperoni pizza and (ii) expectations about the satiating properties of pepperoni pizza (expected satiation). We reasoned that dietary variability might generate uncertainty about the postingestive effects of a food. An internet-based questionnaire was completed by 199 adults. This revealed substantial variation in exposure to different varieties of pepperoni pizza. In a follow-up study (n = 66; 65% female), high pizza variability was associated with (i) poorer compensation for calories in pepperoni pizza and (ii) lower expected satiation for pepperoni pizza. Furthermore, the effect of uncertainty on caloric compensation was moderated by individual differences in decision making (loss aversion). For the first time, these findings highlight a process by which dietary variability may compromise food-intake control in humans. This is important because it exposes a new feature of Western diets (processed foods in particular) that might contribute to overeating and obesity.

  16. Characterization of Machine Variability and Progressive Heat Treatment in Selective Laser Melting of Inconel 718

    NASA Technical Reports Server (NTRS)

    Prater, T.; Tilson, W.; Jones, Z.

    2015-01-01

    The absence of an economy of scale in spaceflight hardware makes additive manufacturing an immensely attractive option for propulsion components. As additive manufacturing techniques are increasingly adopted by government and industry to produce propulsion hardware in human-rated systems, significant development efforts are needed to establish these methods as reliable alternatives to conventional subtractive manufacturing. One of the critical challenges facing powder bed fusion techniques in this application is variability between the machines used to perform builds. Even with robust process controls in place, two machines operating at identical parameters with equivalent base materials can produce specimens with slightly different material properties. The machine variability study presented here evaluates 60 specimens of identical geometry built using the same parameters: 30 samples were produced on machine 1 (M1) and the other 30 on machine 2 (M2). Each 30-sample set was further subdivided into three subsets of 10 specimens to assess the effect of progressive heat treatment on machine variability. The three post-processing categories were: stress relief; stress relief followed by hot isostatic press (HIP); and stress relief followed by HIP followed by heat treatment per AMS 5664. Each specimen (a round, smooth tensile bar) was mechanically tested per ASTM E8. Two formal statistical techniques, hypothesis testing for equivalency of means and one-way analysis of variance (ANOVA), were applied to characterize the impact of machine variability and heat treatment on the material properties: tensile stress, yield stress, modulus of elasticity, fracture elongation, and reduction of area. This work represents the type of development effort that is critical as NASA, academia, and the industrial base work collaboratively to establish a path to certification for additively manufactured parts. For future flight programs, NASA and its commercial partners will procure parts from vendors using a diverse range of machines, so it is essential that the AM community develop a sound understanding of the degree to which machine variability affects material properties.
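    The one-way ANOVA used to compare machines reduces to a ratio of between-group to within-group variance. A minimal from-scratch sketch (the toy numbers in the test are invented, not the study's tensile data):

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA, computed from first principles.

    groups: sequences of measurements, e.g. one sequence of ultimate
    tensile strengths per machine. Large F means between-machine
    differences dwarf within-machine scatter.
    """
    grand = mean(x for g in groups for x in g)   # grand mean of all data
    k = len(groups)                              # number of groups
    n = sum(len(g) for g in groups)              # total sample size
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)            # between-group variance
    ms_within = ss_within / (n - k)              # within-group variance
    return ms_between / ms_within
```

An F near 1 suggests machine-to-machine differences are no larger than the scatter within a single machine; the F would then be compared against the F distribution with (k-1, n-k) degrees of freedom.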

  17. Model-supported estimation of mortality rates in Baltic cod (Gadus morhua callarias L.) larvae: the varying impact of 'critical periods'

    PubMed Central

    Voss, Rüdiger; Hinrichsen, Hans-Harald; Wieland, Kai

    2001-01-01

    Background: Changes in the survival rate during the larval phase may strongly influence the recruitment level in marine fish species. Different 'critical periods' within the larval phase have been discussed, e.g. the hatching period and the first-feeding period. No such information was available for the Baltic cod stock, a commercially important stock that has shown reproduction failure in recent years. We calculated field-based mortality rates for larval Baltic cod during these phases using basin-wide abundance estimates from two consecutive surveys. Survey information was corrected by three-dimensional hydrodynamic model runs. Results: The corrections applied for transport were of variable impact, depending on the prevailing circulation patterns. Especially under high wind forcing, abundance estimates can be biased if transport processes are not taken into account. In May 1988, mortality between hatch and first feeding amounted to approximately 20% per day. Mortality rates during the onset of feeding were considerably lower, at only 7% per day. In August 1991 the situation was reversed: extremely low mortality rates of 0.08% per day were calculated between hatch and first feeding, while the period between the onset of feeding and the state of an established feeder was more critical, with mortality rates of 22% per day. Conclusions: Mortality rates during the different proposed 'critical periods' were found to be highly variable. Survival rates of Baltic cod are not influenced by a single 'critical period' alone, but can be limited at different points during the larval phase, depending on several biotic and abiotic factors. PMID:11737879
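    Daily mortality rates of the kind reported here follow from assuming exponential decline between two abundance estimates. A minimal sketch (the abundances in the test are invented, not the survey data):

```python
import math

def daily_mortality(n1, n2, days):
    """Percent mortality per day between two abundance estimates,
    assuming exponential decay N2 = N1 * exp(-Z * days).

    n1, n2: (transport-corrected) abundances from consecutive surveys
    days:   elapsed time between the surveys
    """
    z = math.log(n1 / n2) / days        # instantaneous rate per day
    return (1.0 - math.exp(-z)) * 100.0  # fraction dying each day, in %
```

The transport correction matters precisely because it adjusts n1 and n2 before this calculation: advection of larvae into or out of the survey area would otherwise masquerade as mortality or survival.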

  18. Progress with lossy compression of data from the Community Earth System Model

    NASA Astrophysics Data System (ADS)

    Xu, H.; Baker, A.; Hammerling, D.; Li, S.; Clyne, J.

    2017-12-01

    Climate models, such as the Community Earth System Model (CESM), generate massive quantities of data, particularly when run at high spatial and temporal resolutions. The burden of storage is further exacerbated by creating large ensembles, generating large numbers of variables, outputting at high frequencies, and duplicating data archives (to protect against disk failures). Applying lossy compression methods to CESM datasets is an attractive means of reducing data storage requirements, but ensuring that the loss of information does not negatively impact science objectives is critical. In particular, test methods are needed to evaluate whether critical features (e.g., extreme values and spatial and temporal gradients) have been preserved and to boost scientists' confidence in the lossy compression process. We will provide an overview of our progress in applying lossy compression to CESM output and describe our unique suite of metric tests that evaluate the impact of information loss. Further, we will describe our process for choosing an appropriate compression algorithm (and its associated parameters) given the diversity of CESM data (e.g., variables may be constant, smooth, change abruptly, contain missing values, or have large ranges). Traditional compression algorithms, such as those used for images, are not necessarily well suited to floating-point climate simulation data, and different methods may have different strengths and be more effective for certain types of variables than others. We will discuss our progress towards our ultimate goal of developing an automated multi-method parallel approach for compression of climate data that both maximizes data reduction and minimizes the impact of data loss on science results.
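    One simple way to check whether a lossy scheme preserves critical features is to bound the pointwise reconstruction error. The uniform quantizer below is a toy stand-in of ours, not one of the compressors evaluated for CESM, but the same error check applies to any lossy method:

```python
def quantize(values, step):
    """Toy lossy 'compressor': snap each value to the nearest multiple
    of step. Guarantees a pointwise error of at most step / 2."""
    return [round(v / step) * step for v in values]

def max_abs_error(original, reconstructed):
    """Worst-case pointwise loss: a crude proxy for whether extreme
    values survive compression intact."""
    return max(abs(a - b) for a, b in zip(original, reconstructed))
```

A real test suite would add feature-aware metrics (gradients, correlation structure, handling of missing values) on top of such simple pointwise bounds.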

  19. Moving forward socio-economically focused models of deforestation.

    PubMed

    Dezécache, Camille; Salles, Jean-Michel; Vieilledent, Ghislain; Hérault, Bruno

    2017-09-01

    Whilst high-resolution spatial variables contribute to a good fit of spatially explicit deforestation models, socio-economic processes are often beyond the scope of these models. Such a low level of interest in the socio-economic dimension of deforestation limits the relevance of these models for decision-making and may be the cause of their failure to accurately predict observed deforestation trends in the medium term. This study proposes a flexible methodology for taking into account multiple drivers of deforestation in tropical forested areas, in which the intensity of deforestation is explicitly predicted from socio-economic variables. By coupling a model of deforestation location based on spatial environmental variables with several sub-models of deforestation intensity based on socio-economic variables, we were able to create a map of predicted deforestation over the period 2001-2014 in French Guiana. This map was compared to a reference map for accuracy assessment, not only at the pixel scale but also over cells ranging from 1 to approximately 600 sq. km. Highly significant relationships were explicitly established between deforestation intensity and several socio-economic variables: population growth, the amount of agricultural subsidies, and gold and wood production. Such a precise characterization of socio-economic processes makes it possible to avoid overestimation biases in high-deforestation areas, suggesting a better integration of socio-economic processes into the models. Whilst considering deforestation as a purely geographical process leads to conservative models unable to effectively assess changes in the socio-economic and political contexts influencing deforestation trends, this explicit characterization of the socio-economic dimension of deforestation is critical for the creation of deforestation scenarios in REDD+ projects. © 2017 John Wiley & Sons Ltd.

  20. Pool and flow boiling in variable and microgravity

    NASA Technical Reports Server (NTRS)

    Merte, Herman, Jr.

    1994-01-01

    As is well known, boiling is an effective mode of heat transfer in that high heat flux levels are possible with relatively small temperature differences. Its optimal application requires that the process be adequately understood. A measure of the understanding of any physical event lies in the ability to predict its behavior in terms of the relevant parameters. Despite many years of research, the predictability of boiling is currently possible only for quite specialized circumstances, e.g., the critical heat flux and film boiling for the pool boiling case, and then only with special geometries. Variable gravity down to microgravity provides the opportunity to test this understanding but, possibly more important, by changing the dimensional and time scales involved it permits more detailed observations of elements of the boiling process, and perhaps discloses phenomena heretofore unknown. The focus here is on nucleate boiling although, as will be demonstrated below, under certain circumstances in microgravity it can take place concurrently with the dryout process. In the presence of earth gravity or forced convection effects, the latter process is usually referred to as film boiling. However, no vapor film as such forms with pool boiling in microgravity, only dryout. Initial results are presented here for pool boiling in microgravity, made possible at such an early date by the availability of the Get-Away-Specials (GAS). Also presented are some results of ground testing of a flow loop for the study of low-velocity boiling, eventually to take place in microgravity as well. In the interim, variable buoyancy normal to the heater surface is achieved by rotation of the entire loop relative to earth gravity, at the expense, of course, of varying the buoyancy parallel to the heater surface.
Two questions which must be resolved early in the study of flow boiling in microgravity are (1) the lower limit of liquid flow velocity at which buoyancy effects become significant to the boiling process, and (2) the effect of lower liquid flow velocities on the critical heat flux when buoyancy is removed. Results of initial efforts in these directions are presented, albeit restricted currently to the ever-present earth gravity.

  1. Challenges and Potential Solutions – Individualised Antibiotic Dosing at the Bedside for Critically Ill Patients: a structured review

    PubMed Central

    Roberts, Jason A.; Aziz, Mohd Hafiz Abdul; Lipman, Jeffrey; Mouton, Johan W.; Vinks, Alexander A.; Felton, Timothy W.; Hope, William W.; Farkas, Andras; Neely, Michael N.; Schentag, Jerome J.; Drusano, George; Frey, Otto R.; Theuretzbacher, Ursula; Kuti, Joseph L.

    2014-01-01

    Summary: Infections in critically ill patients are associated with persistently poor clinical outcomes. These patients have severely altered and variable antibiotic pharmacokinetics and are infected by less susceptible pathogens. Antibiotic dosing that does not account for these features is likely to result in sub-optimal outcomes. In this paper, we review the patient- and pathogen-related challenges that contribute to inadequate antibiotic dosing and discuss how a process for individualised antibiotic therapy, one that increases the accuracy of dosing, can be implemented to further optimise care for the critically ill patient. The process for optimised antibiotic dosing firstly requires determination of the physiological derangements in the patient that can alter antibiotic concentrations, including altered fluid status, microvascular failure, serum albumin concentrations, and altered renal and hepatic function. Secondly, the susceptibility of the infecting pathogen should be determined through liaison with the microbiology laboratory. The patient and pathogen challenges can then be addressed by combining susceptibility data with measured antibiotic concentration data (where possible) in clinical dosing software. Such software uses pharmacokinetic-pharmacodynamic (PK/PD) models from critically ill patients to accurately predict the dosing requirements for the individual patient, with the aim of optimising antibiotic exposure and maximising effectiveness. PMID:24768475
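    As a toy illustration of the kind of PK model such dosing software embeds, consider the generic textbook one-compartment IV bolus model (this is our own sketch; the parameter values are invented and are not from the paper or any particular software):

```python
import math

def concentration(dose_mg, cl_l_per_h, v_l, t_h):
    """Plasma concentration (mg/L) t_h hours after an IV bolus in a
    one-compartment model: C(t) = (dose / V) * exp(-(CL / V) * t).

    dose_mg:     administered dose
    cl_l_per_h:  clearance (L/h) - altered by renal/hepatic dysfunction
    v_l:         volume of distribution (L) - altered by fluid status
    """
    k = cl_l_per_h / v_l                 # elimination rate constant (1/h)
    return (dose_mg / v_l) * math.exp(-k * t_h)
```

The clinical point of the abstract maps directly onto these two parameters: fluid resuscitation inflates V (lowering peak concentrations) while augmented or failing renal clearance moves CL, which is why population-average dosing can badly miss the PK/PD target in an individual critically ill patient.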

  2. Evaluation of the Correlation between Learning Styles and Critical Thinking Dispositions of the Students of School of Physical Education and Sports

    ERIC Educational Resources Information Center

    Çetin, Mehmet Çagri

    2014-01-01

    The study was conducted in order to detect the critical thinking dispositions and learning styles of students in a school of physical education and sports, to explore whether there were significant differences in terms of the gender and academic department variables, and to discover the correlation between critical thinking tendencies and…

  3. Disturbance Impacts on Thermal Hot Spots and Hot Moments at the Peatland-Atmosphere Interface

    NASA Astrophysics Data System (ADS)

    Leonard, R. M.; Kettridge, N.; Devito, K. J.; Petrone, R. M.; Mendoza, C. A.; Waddington, J. M.; Krause, S.

    2018-01-01

    Soil-surface temperature acts as a master variable driving nonlinear terrestrial ecohydrological, biogeochemical, and micrometeorological processes, inducing short-lived or spatially isolated extremes across heterogeneous landscape surfaces. However, subcanopy soil-surface temperatures have to date been characterized through isolated, spatially discrete measurements. Using spatially complex forested northern peatlands as an exemplar ecosystem, we explore the high-resolution spatiotemporal thermal behavior of this critical interface and its response to disturbances by using Fiber-Optic Distributed Temperature Sensing. Soil-surface thermal patterning was identified from 1.9 million temperature measurements under undisturbed, trees-removed, and vascular-subcanopy-removed conditions. Removing layers of the structurally diverse vegetation canopy not only increased mean temperatures but also shifted the spatial and temporal distribution, range, and longevity of thermal hot spots and hot moments. We argue that linking hot spots and/or hot moments with spatially variable ecosystem processes and feedbacks is key for predicting ecosystem function and resilience.

  4. Individual differences in the joint effects of semantic priming and word frequency: The role of lexical integrity

    PubMed Central

    Yap, Melvin J.; Tse, Chi-Shing; Balota, David A.

    2009-01-01

    Word frequency and semantic priming effects are among the most robust effects in visual word recognition, and it has been generally assumed that these two variables produce interactive effects in lexical decision performance, with larger priming effects for low-frequency targets. The results from four lexical decision experiments indicate that the joint effects of semantic priming and word frequency are critically dependent upon differences in the vocabulary knowledge of the participants. Specifically, across two Universities, additive effects of the two variables were observed in participants with more vocabulary knowledge, while interactive effects were observed in participants with less vocabulary knowledge. These results are discussed with reference to Borowsky and Besner’s (1993) multistage account and Plaut and Booth’s (2000) single-mechanism model. In general, the findings are also consistent with a flexible lexical processing system that optimizes performance based on processing fluency and task demands. PMID:20161653

  5. Conserved and variable domains of RNase MRP RNA.

    PubMed

    Dávila López, Marcela; Rosenblad, Magnus Alm; Samuelsson, Tore

    2009-01-01

    Ribonuclease MRP is a eukaryotic ribonucleoprotein complex consisting of one RNA molecule and 7-10 protein subunits. One important function of MRP is to catalyze an endonucleolytic cleavage during processing of rRNA precursors. RNase MRP is evolutionarily related to RNase P, which is critical for tRNA processing. The large number of MRP RNA sequences now available has been used to identify conserved primary and secondary structure features of the molecule. MRP RNA has structural features in common with P RNA, such as a conserved catalytic core, but it also has unique features and is characterized by a domain that is highly variable between species. Information regarding primary and secondary structure features is of interest not only in basic studies of the function of MRP RNA, but also because mutations in the RNA give rise to human genetic diseases such as cartilage-hair hypoplasia.

  6. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  7. Mechanisms controlling the complete accretionary beach state sequence

    NASA Astrophysics Data System (ADS)

    Dubarbier, Benjamin; Castelle, Bruno; Ruessink, Gerben; Marieu, Vincent

    2017-06-01

    The accretionary down-state beach sequence is a key element of the nearshore morphological variability observed along sandy coasts. We present and analyze the first numerical simulation of such a sequence using a process-based morphodynamic model that solves the coupling between waves, depth-integrated currents, and sediment transport. The simulation evolves from an alongshore-uniform barred beach (storm profile) to an almost featureless shore-welded terrace (summer profile) through the highly alongshore-variable detached crescentic bar and transverse bar/rip system states. A global analysis of the full sequence allows the varying contributions of the different hydro-sedimentary processes to be determined. Sediment transport driven by orbital velocity skewness is critical to the overall onshore sandbar migration, while gravitational downslope sediment transport acts as a damping term inhibiting further channel growth enforced by rip flow circulation. Accurate morphological diffusivity and inclusion of orbital velocity skewness open new perspectives in morphodynamic modeling of real beaches.

  8. Chapter 6: Temperature

    USGS Publications Warehouse

    Jones, Leslie A.; Muhlfeld, Clint C.; Hauer, F. Richard; F. Richard Hauer,; Lamberti, G.A.

    2017-01-01

    Stream temperature has direct and indirect effects on stream ecology and is critical in determining both abiotic and biotic system responses across a hierarchy of spatial and temporal scales. Temperature variation is primarily driven by solar radiation, while landscape topography, geology, and reach-scale ecosystem processes contribute to local variability. Spatiotemporal heterogeneity in freshwater ecosystems influences habitat distributions, physiological functions, and phenology of all aquatic organisms. In this chapter we provide an overview of methods for monitoring stream temperature, characterization of thermal profiles, and modeling approaches to stream temperature prediction. Recent advances in temperature monitoring allow for more comprehensive studies of the underlying processes influencing annual variation of temperatures and of how thermal variability may impact aquatic organisms at individual, population, and community scales. Likewise, the development of spatially explicit predictive models provides a framework for simulating natural and anthropogenic effects on thermal regimes, which is integral to sustainable management of freshwater systems.

  9. Comparison of 2 intravenous insulin protocols: Glycemia variability in critically ill patients.

    PubMed

    Gómez-Garrido, Marta; Rodilla-Fiz, Ana M; Girón-Lacasa, María; Rodríguez-Rubio, Laura; Martínez-Blázquez, Anselmo; Martínez-López, Fernando; Pardo-Ibáñez, María Dolores; Núñez-Marín, Juan M

    2017-05-01

    Glycemic variability is an independent predictor of mortality in critically ill patients. The objective of this study was to compare two intravenous insulin protocols in critically ill patients with regard to glycemic variability. This was a retrospective observational study performed by reviewing clinical records of patients from a Critical Care Unit over 4 consecutive months. First, a simpler Scale-Based Intravenous Insulin Protocol (SBIIP) was reviewed, and it was later compared for the same months of the following year with a Sliding Scale-Based Intravenous Insulin Protocol (SSBIIP). All adult patients admitted to the unit during the referred months were included. Patients in whom the protocol was not adequately followed were excluded. A total of 557 patients were reviewed, of whom 73 in the first group and 52 in the second had needed intravenous insulin. Four and two patients were excluded in each group respectively. Glycemic variability, for both day 1 (SD1) and the total stay (SDT), was lower in SSBIIP patients than in SBIIP patients: SD1 34.88 vs 18.16 and SDT 36.45 vs 23.65 (P<.001). A glycemic management protocol based on sliding scales decreases glycemic variability in critically ill patients. Copyright © 2017 SEEN. Publicado por Elsevier España, S.L.U. All rights reserved.

  10. Infant perceptual development for faces and spoken words: An integrated approach

    PubMed Central

    Watson, Tamara L; Robbins, Rachel A; Best, Catherine T

    2014-01-01

    There are obvious differences between recognizing faces and recognizing spoken words or phonemes that might suggest development of each capability requires different skills. Recognizing faces and perceiving spoken language, however, are in key senses extremely similar endeavors. Both perceptual processes are based on richly variable, yet highly structured input from which the perceiver needs to extract categorically meaningful information. This similarity could be reflected in the perceptual narrowing that occurs within the first year of life in both domains. We take the position that the perceptual and neurocognitive processes by which face and speech recognition develop are based on a set of common principles. One common principle is the importance of systematic variability in the input as a source of information rather than noise. Experience of this variability leads to perceptual tuning to the critical properties that define individual faces or spoken words versus their membership in larger groupings of people and their language communities. We argue that parallels can be drawn directly between the principles responsible for the development of face and spoken language perception. PMID:25132626

  11. Lung Cancer Screening Participation: Developing a Conceptual Model to Guide Research

    PubMed Central

    Carter-Harris, Lisa; Davis, Lorie L.; Rawl, Susan M.

    2017-01-01

    Purpose To describe the development of a conceptual model to guide research focused on lung cancer screening participation from the perspective of the individual in the decision-making process. Methods Based on a comprehensive review of empirical and theoretical literature, a conceptual model was developed linking key psychological variables (stigma, medical mistrust, fatalism, worry, and fear) to the health belief model and precaution adoption process model. Results Proposed model concepts have been examined in prior research of either lung or other cancer screening behavior. To date, a few studies have explored a limited number of variables that influence screening behavior in lung cancer specifically. Therefore, relationships among concepts in the model have been proposed and future research directions presented. Conclusion This proposed model is an initial step to support theoretically based research. As lung cancer screening becomes more widely implemented, it is critical to theoretically guide research to understand variables that may be associated with lung cancer screening participation. Findings from future research guided by the proposed conceptual model can be used to refine the model and inform tailored intervention development. PMID:28304262

  12. Lung Cancer Screening Participation: Developing a Conceptual Model to Guide Research.

    PubMed

    Carter-Harris, Lisa; Davis, Lorie L; Rawl, Susan M

    2016-11-01

    To describe the development of a conceptual model to guide research focused on lung cancer screening participation from the perspective of the individual in the decision-making process. Based on a comprehensive review of empirical and theoretical literature, a conceptual model was developed linking key psychological variables (stigma, medical mistrust, fatalism, worry, and fear) to the health belief model and precaution adoption process model. Proposed model concepts have been examined in prior research of either lung or other cancer screening behavior. To date, a few studies have explored a limited number of variables that influence screening behavior in lung cancer specifically. Therefore, relationships among concepts in the model have been proposed and future research directions presented. This proposed model is an initial step to support theoretically based research. As lung cancer screening becomes more widely implemented, it is critical to theoretically guide research to understand variables that may be associated with lung cancer screening participation. Findings from future research guided by the proposed conceptual model can be used to refine the model and inform tailored intervention development.

  13. A Sensory Material Approach for Reducing Variability in Additively Manufactured Metal Parts.

    PubMed

    Franco, B E; Ma, J; Loveall, B; Tapia, G A; Karayagiz, K; Liu, J; Elwany, A; Arroyave, R; Karaman, I

    2017-06-15

    Despite the recent growth in interest for metal additive manufacturing (AM) in the biomedical and aerospace industries, variability in the performance, composition, and microstructure of AM parts remains a major impediment to its widespread adoption. The underlying physical mechanisms that cause variability, as well as the scale and nature of that variability, are not well understood, and current methods are ineffective at capturing these details. Here, a nickel-titanium alloy is used as a sensory material in order to quantitatively, and rather rapidly, observe compositional and/or microstructural variability in parts manufactured by selective laser melting, thereby providing a means to evaluate the role of process parameters in the variability. We perform detailed microstructural investigations using transmission electron microscopy at various locations to reveal the origins of microstructural variability in this sensory material. This approach helped reveal how reducing the distance between adjacent laser scans below a critical value greatly reduces both the in-sample and sample-to-sample variability. Microstructural investigations revealed that when the laser scan distance is wide, there is an inhomogeneity in subgrain size, precipitate distribution, and dislocation density in the microstructure, which is responsible for the observed variability. These results provide an important first step towards understanding the nature of variability in additively manufactured parts.

  14. Do network relationships matter? Comparing network and instream habitat variables to explain densities of juvenile coho salmon (Oncorhynchus kisutch) in mid-coastal Oregon, USA

    Treesearch

    Rebecca L. Flitcroft; Kelly M. Burnett; Gordon H. Reeves; Lisa M. Ganio

    2012-01-01

    Aquatic ecologists are working to develop theory and techniques for analysis of dynamic stream processes and communities of organisms. Such work is critical for the development of conservation plans that are relevant at the scale of entire ecosystems. The stream network is the foundation upon which stream systems are organized. Natural and human disturbances in streams...

  15. Relationship between oral health in children and poverty related factors.

    PubMed

    Squassi, Aldo; Mauro, Silvia; Mauro, María José; Sánchez, Gabriel; Bordoni, Noemí

    2008-01-01

    The aim of this investigation was to analyze the variables related to poverty and their influence on oral health in children living in a suburban area of Buenos Aires, Argentina. The study population consisted of 1,049 children. 579 children at social risk (Group I) were recruited from five neighborhoods with critical lacks (Katzman, 1989) and divided into 2 subgroups according to age: (A) preschool children and (B) school children. 470 preschool and school children from the same district but living in homes without critical lacks served as controls (Group II). The following variables associated with poverty were analyzed: (a) parents' instructional level, (b) employment conditions, and (c) accessibility to regular oral health care. Group I comprised children from five neighborhoods categorized according to the incidence rate of each variable. Clinical examinations were performed under similar conditions by three calibrated investigators. DMFS, dmfs, total DMFS + dmfs, DS + ds, Care Index and Loe & Silness plaque index were recorded and analyzed using Student's t test, ANOVA and the chi-square test (level of significance p < 0.05). Dental indicators were significantly higher in Group I than in Group II. The dental caries indicators increased as the incidence rate of the poverty-related variables rose. The highest number of children with high cariogenic risk was observed in neighborhoods with the highest social risk (χ2 = 30.48; p < 0.005). The analyzed poverty-related variables seemed to be associated with factors that play a role in the dental caries development process in school and preschool children living in the Metropolitan area of Buenos Aires.
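
    The group comparison above rests on a chi-square test of independence. A minimal sketch of that computation for a 2x2 table (group x high cariogenic risk), worked from first principles, is shown below; the cell counts are hypothetical, since the abstract reports only the resulting statistic.

```python
# Chi-square test of independence for a 2x2 table, computed from expected
# counts under independence. The counts below are hypothetical placeholders,
# not the study's raw data.
def chi_square_2x2(a, b, c, d):
    """Rows: Group I / Group II; columns: high risk / not high risk."""
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

chi2 = chi_square_2x2(320, 259, 180, 290)   # hypothetical counts
print(round(chi2, 2))
```

    For one degree of freedom, any value above 3.84 is significant at p < 0.05, so a statistic of the magnitude reported in the study comfortably rejects independence.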

  16. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-behnken experimental design space.

    PubMed

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Lacidipine (LCDP) is a poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting, and reduces variability by uniform drug-binder solution distribution on carrier molecules. The main objective of this quality risk management (QRM) study is to provide a sophisticated "robust and rugged" Fluidized Bed Process (FBP) for the preparation of LCDP tablets with desired quality (stability) and performance (dissolution) using the quality by design (QbD) concept. This study principally focuses on a thorough mechanistic understanding of the FBP by which the product is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process, analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode and Effects Analysis (FMEA) to identify and rank parameters with the potential to impact In-Process/Finished-Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy to achieve consistent finished product quality at lab scale itself and prevent possible product failure at larger manufacturing scale.
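
    The Box-Behnken design named in the title has a fixed combinatorial structure: for each pair of factors, the four (±1, ±1) corner combinations are run with every other factor at its center level, plus replicated center points. A minimal sketch of generating such a design in coded units (factor identities left abstract, since the abstract does not list the chosen CPPs) is:

```python
from itertools import combinations

def box_behnken(k, centers=1):
    """Generate a Box-Behnken design for k factors in coded units (-1, 0, +1).
    Each pair of factors takes the four (+/-1, +/-1) combinations while all
    other factors are held at their center level (0)."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * k
                run[i], run[j] = a, b
                runs.append(run)
    runs.extend([[0] * k for _ in range(centers)])
    return runs

design = box_behnken(3, centers=3)
print(len(design))  # 3 factors -> 12 edge-midpoint runs + 3 center points = 15
```

    With three factors this gives 15 runs, far fewer than the 27 of a full three-level factorial, which is why Box-Behnken designs are popular for mapping a design space over a handful of CPPs.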

  17. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-behnken experimental design space

    PubMed Central

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Introduction: Lacidipine (LCDP) is a poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting, and reduces variability by uniform drug-binder solution distribution on carrier molecules. Materials and Methods: The main objective of this quality risk management (QRM) study is to provide a sophisticated “robust and rugged” Fluidized Bed Process (FBP) for the preparation of LCDP tablets with desired quality (stability) and performance (dissolution) using the quality by design (QbD) concept. Results and Conclusion: This study principally focuses on a thorough mechanistic understanding of the FBP by which the product is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process, analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode and Effects Analysis (FMEA) to identify and rank parameters with the potential to impact In-Process/Finished-Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy to achieve consistent finished product quality at lab scale itself and prevent possible product failure at larger manufacturing scale. PMID:23799202

  18. Analytical modeling and tolerance analysis of a linear variable filter for spectral order sorting.

    PubMed

    Ko, Cheng-Hao; Chang, Kuei-Ying; Huang, You-Min

    2015-02-23

    This paper proposes an innovative method to overcome the low production rate of current linear variable filter (LVF) fabrication. During the fabrication process, a commercial coater is combined with a local mask on a substrate. The proposed analytical thin film thickness model, which is based on the geometry of the commercial coater, is developed to more effectively calculate the profiles of LVFs. Thickness tolerance, LVF zone width, thin film layer structure, transmission spectrum and the effects of variations in critical parameters of the coater are analyzed. Profile measurements demonstrate the efficacy of local mask theory in the prediction of evaporation profiles with a high degree of accuracy.

  19. Stability and Variability in Aesthetic Experience: A Review

    PubMed Central

    Jacobsen, Thomas; Beudt, Susan

    2017-01-01

    Based on psychophysics’ pragmatic dualism, we trace the cognitive neuroscience of stability and variability in aesthetic experience. With regard to different domains of aesthetic processing, we touch upon the relevance of cognitive schemata for aesthetic preference. Attitudes and preferences are explored in detail. Evolutionary constraints on attitude formation or schema generation are elucidated, just as the often seemingly arbitrary influences of social, societal, and cultural nature are. A particular focus is put on the concept of critical periods during an individual’s ontogenesis. The latter contrasts with high-frequency changes, such as fashion influences. Taken together, these analyses document the state of the art in the field and, potentially, highlight avenues for future research. PMID:28223955

  20. Stability and Variability in Aesthetic Experience: A Review.

    PubMed

    Jacobsen, Thomas; Beudt, Susan

    2017-01-01

    Based on psychophysics' pragmatic dualism, we trace the cognitive neuroscience of stability and variability in aesthetic experience. With regard to different domains of aesthetic processing, we touch upon the relevance of cognitive schemata for aesthetic preference. Attitudes and preferences are explored in detail. Evolutionary constraints on attitude formation or schema generation are elucidated, just as the often seemingly arbitrary influences of social, societal, and cultural nature are. A particular focus is put on the concept of critical periods during an individual's ontogenesis. The latter contrasts with high-frequency changes, such as fashion influences. Taken together, these analyses document the state of the art in the field and, potentially, highlight avenues for future research.

  1. An overview of current applications, challenges, and future trends in distributed process-based models in hydrology

    USGS Publications Warehouse

    Fatichi, Simone; Vivoni, Enrique R.; Odgen, Fred L; Ivanov, Valeriy Y; Mirus, Benjamin B.; Gochis, David; Downer, Charles W; Camporese, Matteo; Davison, Jason H; Ebel, Brian A.; Jones, Norm; Kim, Jongho; Mascaro, Giuseppe; Niswonger, Richard G.; Restrepo, Pedro; Rigon, Riccardo; Shen, Chaopeng; Sulis, Mauro; Tarboton, David

    2016-01-01

    Process-based hydrological models have a long history dating back to the 1960s. Criticized by some as over-parameterized, overly complex, and difficult to use, a more nuanced view is that these tools are necessary in many situations and, in a certain class of problems, they are the most appropriate type of hydrological model. This is especially the case in situations where knowledge of flow paths or distributed state variables and/or preservation of physical constraints is important. Examples of this include: spatiotemporal variability of soil moisture, groundwater flow and runoff generation, sediment and contaminant transport, or when feedbacks among various Earth’s system processes or understanding the impacts of climate non-stationarity are of primary concern. These are situations where process-based models excel and other models are unverifiable. This article presents this pragmatic view in the context of existing literature to justify the approach where applicable and necessary. We review how improvements in data availability, computational resources and algorithms have made detailed hydrological simulations a reality. Avenues for the future of process-based hydrological models are presented suggesting their use as virtual laboratories, for design purposes, and with a powerful treatment of uncertainty.

  2. Modeling and validation of heat and mass transfer in individual coffee beans during the coffee roasting process using computational fluid dynamics (CFD).

    PubMed

    Alonso-Torres, Beatriz; Hernández-Pérez, José Alfredo; Sierra-Espinoza, Fernando; Schenker, Stefan; Yeretzian, Chahan

    2013-01-01

    Heat and mass transfer in individual coffee beans during roasting were simulated using computational fluid dynamics (CFD). Numerical equations for heat and mass transfer inside the coffee bean were solved using the finite volume technique in the commercial CFD code Fluent; the software was complemented with specific user-defined functions (UDFs). To experimentally validate the numerical model, a single coffee bean was placed in a cylindrical glass tube and roasted by a hot air flow, using the identical geometrical 3D configuration and hot air flow conditions as the ones used for numerical simulations. Temperature and humidity calculations obtained with the model were compared with experimental data. The model predicts the actual process quite accurately and represents a useful approach to monitor the coffee roasting process in real time. It provides valuable information on time-resolved process variables that are otherwise difficult to obtain experimentally, but critical to a better understanding of the coffee roasting process at the individual bean level. This includes variables such as time-resolved 3D profiles of bean temperature and moisture content, and temperature profiles of the roasting air in the vicinity of the coffee bean.
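
    The paper's model is a 3D finite-volume simulation in Fluent with user-defined functions, but the underlying transient-conduction idea can be illustrated in one dimension. Below is a minimal explicit sketch of heat conduction into a slab with a convective (hot-air) surface; every property value is an illustrative assumption, not data from the study.

```python
import numpy as np

# Minimal 1D explicit finite-volume sketch of transient heat conduction with a
# convective boundary, loosely analogous to hot air heating a bean surface.
# All property values below are illustrative assumptions.
alpha = 1.0e-7      # thermal diffusivity [m^2/s] (assumed)
h, k = 50.0, 0.2    # convective coefficient [W/m^2K], conductivity [W/mK] (assumed)
L, n = 3e-3, 30     # half-thickness [m], number of control volumes
dx = L / n
dt = 0.4 * dx**2 / alpha          # respects the explicit stability limit dt <= dx^2/(2*alpha)
T = np.full(n, 25.0)              # initial bean temperature [C]
T_air = 200.0                     # roasting air temperature [C]

for _ in range(2000):
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    Tn[0] = Tn[1]                                       # symmetry (insulated) at the core
    Tn[-1] = (k / dx * Tn[-2] + h * T_air) / (k / dx + h)  # convective surface balance
    T = Tn

print(round(T[-1], 1))  # surface temperature after the simulated interval
```

    The time-resolved interior profile `T` is exactly the kind of variable the abstract notes is hard to measure experimentally but easy to extract from such a model.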

  3. Collaborative Research: Improving Decadal Prediction of Arctic Climate Variability and Change Using a Regional Arctic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutowski, William J.

    This project developed and applied a regional Arctic System model for enhanced decadal predictions. It built on successful research by four of the current PIs with support from the DOE Climate Change Prediction Program, which has resulted in the development of a fully coupled Regional Arctic Climate Model (RACM) consisting of atmosphere, land-hydrology, ocean and sea ice components. An expanded RACM, a Regional Arctic System Model (RASM), has been set up to include ice sheets, ice caps, mountain glaciers, and dynamic vegetation to allow investigation of coupled physical processes responsible for decadal-scale climate change and variability in the Arctic. RASM can have high spatial resolution (~4-20 times higher than currently practical in global models) to advance modeling of critical processes and determine the need for their explicit representation in Global Earth System Models (GESMs). The pan-Arctic region is a key indicator of the state of global climate through polar amplification. However, a system-level understanding of critical arctic processes and feedbacks needs further development. Rapid climate change has occurred in a number of Arctic System components during the past few decades, including retreat of the perennial sea ice cover, increased surface melting of the Greenland ice sheet, acceleration and thinning of outlet glaciers, reduced snow cover, thawing permafrost, and shifts in vegetation. Such changes could have significant ramifications for global sea level, the ocean thermohaline circulation and heat budget, ecosystems, native communities, natural resource exploration, and commercial transportation. The overarching goal of the RASM project has been to advance understanding of past and present states of arctic climate and to improve seasonal to decadal predictions. To do this, the project has focused on variability and long-term change of energy and freshwater flows through the arctic climate system.
The three foci of this research are: (1) changes in the freshwater flux between arctic climate system components resulting from decadal changes in land and sea ice, seasonal snow, vegetation, and ocean circulation; (2) changing energetics due to decadal changes in ice mass, vegetation, and air-sea interactions; and (3) the role of small-scale atmospheric and oceanic processes that influence decadal variability. This research has been addressing modes of natural climate variability as well as extreme and rapid climate change. RASM can facilitate studies of climate impacts (e.g., droughts and fires) and of ecosystem adaptations to these impacts.

  4. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    NASA Astrophysics Data System (ADS)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects for health. The yogurt production process is highly susceptible to failure because it involves bacteria and fermentation. For a producer, such risks can cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify risks in detail and determine how to prevent and handle them, so that they are minimized. Therefore, this study analyzes the risks of the production process in a case study at CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed six risks across equipment, raw material, and process variables. These include the critical risk of a lack of an aseptic process, specifically damage to the yogurt starter through contamination by fungus or other bacteria, and a lack of equipment sanitation. The quantitative FTA showed that the highest probability is that of the lack of an aseptic process, with a risk of 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment; controlling the yogurt starter; improving production planning; and sanitizing equipment using hot water immersion.
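
    The quantitative FTA step combines basic-event probabilities through the tree's logic gates. A minimal sketch of that arithmetic (assuming independent basic events, with illustrative probabilities rather than the study's data) is:

```python
# Minimal fault-tree evaluation sketch: basic events combine through AND gates
# (product of probabilities) and OR gates (1 - product of complements),
# assuming independence. Event names and probabilities are illustrative
# placeholders, not values from the study.
def and_gate(*p):
    prob = 1.0
    for x in p:
        prob *= x
    return prob

def or_gate(*p):
    prob = 1.0
    for x in p:
        prob *= (1.0 - x)
    return 1.0 - prob

p_contaminated_starter = 0.02   # assumed basic-event probability
p_poor_sanitation = 0.02        # assumed basic-event probability
p_non_aseptic = or_gate(p_contaminated_starter, p_poor_sanitation)
print(round(p_non_aseptic, 5))  # -> 0.0396
```

    Chaining such gates from basic events to the top event is how a figure like the study's 3.902% for "lack of aseptic process" would be obtained from estimated basic-event probabilities.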

  5. Geochemical evolution of the Critical Zone across variable time scales informs concentration-discharge relationships: Jemez River Basin Critical Zone Observatory

    NASA Astrophysics Data System (ADS)

    McIntosh, Jennifer C.; Schaumberg, Courtney; Perdrial, Julia; Harpold, Adrian; Vázquez-Ortega, Angélica; Rasmussen, Craig; Vinson, David; Zapata-Rios, Xavier; Brooks, Paul D.; Meixner, Thomas; Pelletier, Jon; Derry, Louis; Chorover, Jon

    2017-05-01

    This study investigates the influence of water, carbon, and energy fluxes on solute production and transport through the Jemez Critical Zone (CZ) and impacts on C-Q relationships over variable spatial and temporal scales. Chemical depletion-enrichment profiles of soils, combined with regolith thickness and groundwater data indicate the importance to stream hydrochemistry of incongruent dissolution of silicate minerals during deep bedrock weathering, which is primarily limited by water fluxes, in this highly fractured, young volcanic terrain. Under high flow conditions (e.g., spring snowmelt), wetting of soil and regolith surfaces and presence of organic acids promote mineral dissolution and provide a constant supply of base cations, Si, and DIC to soil water and groundwater. Mixing of waters from different hydrochemical reservoirs in the near stream environment during "wet" periods leads to the chemostatic behavior of DIC, base cations, and Si in stream flow. Metals transported by organic matter complexation (i.e., Ge, Al) and/or colloids (i.e., Al) during periods of soil saturation and lateral connectivity to the stream display a positive relationship with Q. Variable Si-Q relationships, under all but the highest flow conditions, can be explained by nonconservative transport and precipitation of clay minerals, which influences long versus short-term Si weathering fluxes. By combining measurements of the CZ obtained across different spatial and temporal scales, we were able to constrain weathering processes in different hydrological reservoirs that may be flushed to the stream during hydrologic events, thereby informing C-Q relationships.
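
    Concentration-discharge behavior of the kind described above is commonly summarized by fitting C = aQ^b in log-log space: |b| near zero indicates chemostatic behavior (DIC, base cations, Si), while b > 0 indicates concentrations rising with flow (organically complexed or colloidal metals). A sketch on synthetic data (illustrative only, not JRB-CZO measurements) is:

```python
import numpy as np

# Fit the C-Q power-law exponent b in C = a * Q^b by log-log regression.
# Synthetic series below stand in for a chemostatic solute and a metal
# mobilized at high flow; all coefficients are assumed for illustration.
rng = np.random.default_rng(42)
Q = 10 ** rng.uniform(-1, 2, 200)                       # discharge over 3 orders of magnitude
C_si = 5.0 * Q ** 0.02 * rng.lognormal(0, 0.05, 200)    # near-chemostatic solute
C_al = 0.1 * Q ** 0.60 * rng.lognormal(0, 0.05, 200)    # colloid-borne metal, rises with Q

b_si = np.polyfit(np.log(Q), np.log(C_si), 1)[0]
b_al = np.polyfit(np.log(Q), np.log(C_al), 1)[0]
print(round(b_si, 2), round(b_al, 2))
```

    Comparing fitted exponents across solutes is one simple way to separate the mixing-dominated (chemostatic) from the transport-limited behavior the abstract describes.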

  6. High-Volume Production of Lightweight Multijunction Solar Cells

    NASA Technical Reports Server (NTRS)

    Youtsey, Christopher

    2015-01-01

    MicroLink Devices, Inc., has transitioned its 6-inch epitaxial lift-off (ELO) solar cell fabrication process into a manufacturing platform capable of sustaining large-volume production. This Phase II project improves the ELO process by reducing cycle time and increasing the yield of large-area devices. In addition, all critical device fabrication processes have transitioned to 6-inch production tool sets designed for volume production. An emphasis on automated cassette-to-cassette and batch processes minimizes operator dependence and cell performance variability. MicroLink Devices established a pilot production line capable of at least 1,500 6-inch wafers per month at greater than 80 percent yield. The company also increased the yield and manufacturability of the 6-inch reclaim process, which is crucial to reducing the cost of the cells.

  7. Complexity associated with the optimisation of capability options in military operations

    NASA Astrophysics Data System (ADS)

    Pincombe, A.; Bender, A.; Allen, G.

    2005-12-01

    In the context of a military operation, even if the intended actions, the geographic location, and the capabilities of the opposition are known, there are still some critical uncertainties that could have a major impact on the effectiveness of a given set of capabilities. These uncertainties include unpredictable events and the response alternatives that are available to the command and control elements of the capability set. They greatly complicate any a priori mathematical description. In a forecasting approach, the most likely future might be chosen and a solution sought that is optimal for that case. With scenario analysis, futures are proposed on the basis of critical uncertainties and the option that is most robust is chosen. We use scenario analysis but our approach is different in that we focus on the complexity and use the coupling between scenarios and options to create information on ideal options. The approach makes use of both soft and hard operations research methods, with subject matter expertise being used to define plausible responses to scenarios. In each scenario, uncertainty affects only a subset of the system-inherent variables and the variables that describe system-environment interactions. It is this scenario-specific reduction of variables that makes the problem mathematically tractable. The process we define is significantly different to existing scenario analysis processes, so we have named it adversarial scenario analysis. It can be used in conjunction with other methods, including recent improvements to the scenario analysis process. To illustrate the approach, we undertake a tactical level scenario analysis for a logistics problem that is defined by a network, expected throughputs to end users, the transport capacity available, the infrastructure at the nodes and the capacities of roads, stocks etc. The throughput capacity, e.g. the effectiveness, of the system relies on all of these variables and on the couplings between them. 
The system is initially in equilibrium for a given level of demand. However, different, and simpler, solutions emerge as the balance of couplings and the importance of variables change. The scenarios describe such changes in conditions. For each scenario it was possible to define measures that describe the differences between options. As with agent-based distillations, the solution is essentially qualitative and exploratory, bringing awareness of possible future difficulties and of the capabilities that are necessary if we are to deal successfully with those difficulties.

  8. Early-warning signals for catastrophic soil degradation

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek

    2010-05-01

    Many earth systems have critical thresholds at which the system shifts abruptly from one state to another. Such critical transitions have been described, among others, for climate, vegetation, animal populations, and geomorphology. Predicting the timing of critical transitions before they are reached is of importance because of the large impact on nature and society associated with the transition. However, it is notably difficult to predict the timing of a transition. This is because the state variables of the system show little change before the threshold is reached. As a result, the precision of field observations is often too low to provide predictions of the timing of a transition. A possible solution is the use of spatio-temporal patterns in state variables as leading indicators of a transition. It is becoming clear that the critically slowing down of a system causes spatio-temporal autocorrelation and variance to increase before the transition. Thus, spatio-temporal patterns are important candidates for early-warning signals. In this research we will show that these early-warning signals also exist in geomorphological systems. We consider a modelled vegetation-soil system under a gradually increasing grazing pressure causing an abrupt shift towards extensive soil degradation. It is shown that changes in spatio-temporal patterns occur well ahead of this catastrophic transition. A distributed model describing the coupled processes of vegetation growth and geomorphological denudation is adapted. The model uses well-studied simple process representations for vegetation and geomorphology. A logistic growth model calculates vegetation cover as a function of grazing pressure and vegetation growth rate. Evolution of the soil thickness is modelled by soil creep and wash processes, as a function of net rain reaching the surface. 
The vegetation and soil system are coupled by 1) decreasing vegetation growth with decreasing soil thickness and 2) increasing soil wash with decreasing vegetation cover. The model describes a critical, catastrophic transition of an underexploited system with low grazing pressure towards an overexploited system. The underexploited state has high vegetation cover and well developed soils, while the overexploited state has low vegetation cover and largely degraded soils. We first show why spatio-temporal patterns in vegetation cover, morphology, erosion rate, and sediment load should be expected to change well before the critical transition towards the overexploited state. Subsequently, spatio-temporal patterns are quantified by calculating statistics, in particular first order statistics and autocorrelation in space and time. It is shown that these statistics gradually change before the transition is reached. This indicates that the statistics may serve as early-warning signals in real-world applications. We also discuss the potential use of remote sensing to predict the critical transition in real-world landscapes.
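
    The early-warning statistics described above (rising variance and temporal autocorrelation as the system slows down) can be sketched with a toy stochastic series whose recovery rate is slowly eroded; this is a generic AR(1) illustration under assumed parameters, not the vegetation-soil model itself.

```python
import numpy as np

# Critical slowing down sketch: as the AR(1) coefficient phi approaches 1,
# windowed variance and lag-1 autocorrelation both rise, the early-warning
# signals discussed above. The ramp in phi is a synthetic stand-in for
# gradually increasing grazing pressure.
rng = np.random.default_rng(0)
n = 4000
phi = np.linspace(0.1, 0.95, n)      # slowly lengthening recovery time
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal(0, 1.0)

def lag1_autocorr(w):
    return np.corrcoef(w[:-1], w[1:])[0, 1]

early, late = x[:500], x[-500:]
print(round(np.var(early), 2), round(np.var(late), 2))
print(round(lag1_autocorr(early), 2), round(lag1_autocorr(late), 2))
```

    In a field setting the same two statistics would be computed over moving windows of an observed state variable (e.g., vegetation cover), with a sustained upward trend in both serving as the warning signal.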

  9. A multi-resolution analysis of lidar-DTMs to identify geomorphic processes from characteristic topographic length scales

    NASA Astrophysics Data System (ADS)

    Sangireddy, H.; Passalacqua, P.; Stark, C. P.

    2013-12-01

    Characteristic length scales are often present in topography, and they reflect the driving geomorphic processes. The wide availability of high resolution lidar Digital Terrain Models (DTMs) allows us to measure such characteristic scales, but new methods of topographic analysis are needed in order to do so. Here, we explore how transitions in probability distributions (pdfs) of topographic variables such as log(area/slope), defined as the topoindex by Beven and Kirkby [1979], can be measured by Multi-Resolution Analysis (MRA) of lidar DTMs [Stark and Stark, 2001; Sangireddy et al., 2012] and used to infer dominant geomorphic processes such as non-linear diffusion and critical shear. We show this correlation between dominant geomorphic processes and characteristic length scales by comparing results from a landscape evolution model to natural landscapes. The landscape evolution model MARSSIM [Howard, 1994] includes components for modeling rock weathering, mass wasting by non-linear creep, detachment-limited channel erosion, and bedload sediment transport. We use MARSSIM to simulate steady state landscapes for a range of hillslope diffusivity and critical shear stresses. Using the MRA approach, we estimate modal values and inter-quartile ranges of slope, curvature, and topoindex as a function of resolution. We also construct pdfs at each resolution and identify and extract characteristic scale breaks. Following the approach of Tucker et al. [2001], we measure the average length to channel from ridges, within the GeoNet framework developed by Passalacqua et al. [2010], and compute pdfs for hillslope lengths at each scale defined in the MRA. We compare the hillslope diffusivity used in MARSSIM against inter-quartile ranges of topoindex and hillslope length scales, and observe power law relationships between the compared variables for simulated landscapes at steady state.
We plot similar measures for natural landscapes and are able to qualitatively infer the dominant geomorphic processes. Also, we explore the variability in hillslope length scales as a function of hillslope diffusivity coefficients and critical shear stress in natural landscapes and show that we can infer signatures of dominant geomorphic processes by analyzing characteristic topographic length scales present in topography. References: Beven, K. and Kirkby, M. J.: A physically based variable contributing area model of basin hydrology, Hydrol. Sci. Bull., 24, 43-69, 1979 Howard, A. D. (1994). A detachment-limited model of drainage basin evolution.Water resources research, 30(7), 2261-2285. Passalacqua, P., Do Trung, T., Foufoula Georgiou, E., Sapiro, G., & Dietrich, W. E. (2010). A geometric framework for channel network extraction from lidar: Nonlinear diffusion and geodesic paths. Journal of Geophysical. Research: Earth Surface (2003-2012), 115(F1). Sangireddy, H., Passalacqua, P., Stark, C.P.(2012). Multi-resolution estimation of lidar-DTM surface flow metrics to identify characteristic topographic length scales, EP13C-0859: AGU Fall meeting 2012. Stark, C. P., & Stark, G. J. (2001). A channelization model of landscape evolution. American Journal of Science, 301(4-5), 486-512. Tucker, G. E., Catani, F., Rinaldo, A., & Bras, R. L. (2001). Statistical analysis of drainage density from digital terrain data. Geomorphology, 36(3), 187-202.
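
    The topoindex of Beven and Kirkby [1979] used above is simply ln(contributing area / slope) evaluated per grid cell. A minimal sketch on a tiny synthetic grid (real use would take drainage-area and slope rasters derived from a lidar DTM) is:

```python
import numpy as np

# Topoindex sketch: ln(contributing area / slope), per Beven and Kirkby [1979].
# The arrays below are tiny synthetic stand-ins for DTM-derived rasters;
# all values are illustrative assumptions.
area = np.array([[1.0, 5.0], [50.0, 500.0]])    # specific contributing area (assumed units)
slope = np.array([[0.30, 0.20], [0.05, 0.01]])  # local slope, tan(beta)

topoindex = np.log(area / slope)
print(np.round(topoindex, 2))
```

    Large contributing area combined with gentle slope gives the highest topoindex (valley floors); histograms of this raster at successive resolutions are the pdfs whose scale breaks the MRA approach tracks.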

  10. Soil nitrate reducing processes – drivers, mechanisms for spatial variation, and significance for nitrous oxide production

    PubMed Central

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M.; Daniell, Tim J.

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils. PMID:23264770

  11. Soil nitrate reducing processes - drivers, mechanisms for spatial variation, and significance for nitrous oxide production.

    PubMed

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M; Daniell, Tim J

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3(-)) and production of the potent greenhouse gas, nitrous oxide (N(2)O). A number of factors are known to control these processes, including O(2) concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N(2)O production from soils.

  12. Virtual sensors for on-line wheel wear and part roughness measurement in the grinding process.

    PubMed

    Arriandiaga, Ander; Portillo, Eva; Sánchez, Jose A; Cabanes, Itziar; Pombo, Iñigo

    2014-05-19

    Grinding is an advanced machining process for the manufacturing of valuable complex and accurate parts for high added value sectors such as aerospace, wind generation, etc. Due to the extremely severe conditions inside grinding machines, critical process variables such as part surface finish or grinding wheel wear cannot be easily and cheaply measured on-line. In this paper a virtual sensor for on-line monitoring of those variables is presented. The sensor is based on the modelling ability of Artificial Neural Networks (ANNs) for stochastic and non-linear processes such as grinding; the selected architecture is the Layer-Recurrent neural network. The sensor makes use of the relation between the variables to be measured and power consumption in the wheel spindle, which can be easily measured. A sensor calibration methodology is presented, and the levels of error that can be expected are discussed. Validation of the new sensor is carried out by comparing the sensor's results with actual measurements carried out in an industrial grinding machine. Results show excellent estimation performance for both wheel wear and surface roughness. In the case of wheel wear, the absolute error is within the range of microns (average value 32 μm). In the case of surface finish, the absolute error is well below Ra 1 μm (average value 0.32 μm). The present approach can be easily generalized to other grinding operations.
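
    As a rough illustration of the layer-recurrent architecture the virtual sensor is built on, the sketch below runs a minimal Elman-style recurrent cell over a sequence of normalized spindle-power readings, producing one surface-roughness estimate per time step. The network size, weights, and variable names are hypothetical (untrained and randomly initialized); the paper's actual network is trained on calibration data from the grinding machine.

```python
import numpy as np

def elman_forward(powers, W_in, W_rec, W_out, b_h, b_o):
    """Run a minimal Elman (layer-recurrent) cell over a sequence of
    spindle-power readings; return one roughness estimate per step."""
    h = np.zeros(W_rec.shape[0])              # hidden state, initially zero
    estimates = []
    for p in powers:
        # new hidden state mixes the current reading with the previous state
        h = np.tanh(W_in * p + W_rec @ h + b_h)
        estimates.append(float(W_out @ h + b_o))
    return estimates

rng = np.random.default_rng(0)
n_hidden = 8
W_in = 0.5 * rng.normal(size=n_hidden)        # input weights (power -> hidden)
W_rec = 0.1 * rng.normal(size=(n_hidden, n_hidden))   # recurrent weights
W_out = 0.5 * rng.normal(size=n_hidden)       # hidden -> roughness readout
b_h = np.zeros(n_hidden)
b_o = 0.0

powers = [1.0, 1.2, 1.5, 1.4, 1.6]            # normalized spindle-power sequence
estimates = elman_forward(powers, W_in, W_rec, W_out, b_h, b_o)
print(len(estimates))                         # one estimate per power sample
```

The recurrent state is what lets the sensor track a stochastic, history-dependent process such as wheel wear from a cheap on-line signal like spindle power.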

  13. Yield impact for wafer shape misregistration-based binning for overlay APC diagnostic enhancement

    NASA Astrophysics Data System (ADS)

    Jayez, David; Jock, Kevin; Zhou, Yue; Govindarajulu, Venugopal; Zhang, Zhen; Anis, Fatima; Tijiwa-Birk, Felipe; Agarwal, Shivam

    2018-03-01

    The importance of traditionally acceptable sources of variation has become more critical as semiconductor technologies continue to push into smaller technology nodes. New metrology techniques are needed to achieve the process uniformity required for controllable lithography. Process control for lithography has the advantage of being able to adjust for cross-wafer variability, but this requires that all process tools/chambers are closely matched for each process. When this is not the case, the cumulative line variability creates identifiable groups of wafers. This cumulative shape-based effect impacts overlay measurements and alignment by creating misregistration of the overlay marks. It is necessary to understand what requirements might go into developing a high-volume manufacturing approach which leverages this grouping methodology, the key inputs and outputs, and what can be extracted from such an approach. It will be shown that this line variability can be quantified as a loss of electrical yield, primarily at the edge of the wafer, and a methodology is proposed for root-cause identification and improvement. This paper will cover the concept of wafer-shape-based grouping as a diagnostic tool for overlay control and containment, the challenges in implementing this in a manufacturing setting, and the limitations of this approach. This will be accomplished by showing that there are identifiable wafer-shape-based signatures. These signatures will be shown to be correlated with overlay misregistration, primarily at the edge. It will also be shown that by adjusting for this wafer-shape signal, improvements can be made to both overlay and electrical yield. These improvements show an increase in edge yield and a reduction in yield variability.

  14. Application of quality by design concept to develop a dual gradient elution stability-indicating method for cloxacillin forced degradation studies using combined mixture-process variable models.

    PubMed

    Zhang, Xia; Hu, Changqin

    2017-09-08

    Penicillins are typical of complex ionic samples that are likely to contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high-performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development for cloxacillin. Rules relating the structures, retention, and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and response surface methodology (RSM) was used to find the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type, and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to estimate their probability of meeting the specifications of the critical quality attributes (CQAs). A Plackett-Burman design was performed to test robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study to use an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.
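
    The Monte Carlo evaluation of a design space can be sketched in a few lines: sample the CPPs from distributions around a candidate operating point, push them through a response-surface model, and count how often the predicted CQA meets its specification. The sketch below uses an entirely hypothetical linear model with illustrative coefficients, not the fitted cloxacillin model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Sample hypothetical CPPs around a candidate operating point:
ph       = rng.normal(5.0, 0.05, n)    # buffer pH
organic  = rng.normal(0.30, 0.01, n)   # organic-modifier fraction (a mixture variable)
gradient = rng.normal(20.0, 0.5, n)    # gradient time in minutes (a process variable)

# Hypothetical response-surface model for a critical resolution (a CQA);
# the coefficients are illustrative, not fitted to cloxacillin data.
resolution = (1.8 + 0.6 * (ph - 5.0) - 4.0 * (organic - 0.30)
              + 0.02 * (gradient - 20.0) + rng.normal(0.0, 0.05, n))

# Probability that the CQA meets its specification (resolution >= 1.5):
p_meet = float(np.mean(resolution >= 1.5))
print(p_meet > 0.9)
```

Repeating this calculation across a grid of operating points, and keeping those where the probability exceeds a chosen threshold, yields the probabilistic design space the abstract refers to.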

  15. Assessment of variability in the hydrological cycle of the Loess Plateau, China: examining dependence structures of hydrological processes

    NASA Astrophysics Data System (ADS)

    Guo, A.; Wang, Y.

    2017-12-01

    Investigating variability in the dependence structures of hydrological processes is of critical importance for understanding the mechanisms of hydrological cycles in changing environments. Focusing on this topic, the present work involves the following: (1) identifying and eliminating serial correlation and conditional heteroscedasticity in monthly streamflow (Q), precipitation (P), and potential evapotranspiration (PE) series using the ARMA-GARCH model (ARMA: autoregressive moving average; GARCH: generalized autoregressive conditional heteroscedasticity); (2) describing dependence structures of hydrological processes using a partial copula coupled with the ARMA-GARCH model and identifying their variability via a copula-based likelihood-ratio test; and (3) determining the conditional probability of annual Q under different climate scenarios based on the above results. This framework enables us to describe hydrological variables in the presence of conditional heteroscedasticity and to examine dependence structures of hydrological processes while excluding the influence of covariates via the partial-copula-based ARMA-GARCH model. Eight major catchments across the Loess Plateau (LP) are used as study regions. Results indicate that (1) the occurrence of change points in the dependence structures of Q and P (PE) varies across the LP; (2) change points of the P-PE dependence structures in all regions almost fully correspond to the initiation of global warming, i.e., the early 1980s; and (3) conditional probabilities of annual Q under various P and PE scenarios can be estimated from the 3-dimensional joint distribution of (Q, P, PE) based on the above change points. These findings shed light on mechanisms of the hydrological cycle and can guide water supply planning and management, particularly in changing environments.
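
    The conditional heteroscedasticity that the GARCH component captures can be illustrated with a minimal GARCH(1,1) simulation: the recursion sigma_t^2 = omega + alpha*eps_{t-1}^2 + beta*sigma_{t-1}^2 makes squared residuals autocorrelated (volatility clustering) even when the residuals themselves are serially uncorrelated. A sketch with illustrative parameters, not values fitted to the Loess Plateau series:

```python
import numpy as np

def simulate_garch11(n, omega, alpha, beta, seed=0):
    """Simulate eps_t = sigma_t * z_t with
    sigma_t^2 = omega + alpha*eps_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    eps = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return eps, sigma2

eps, sigma2 = simulate_garch11(5000, omega=0.1, alpha=0.1, beta=0.8)

# Volatility clustering: squared residuals are positively autocorrelated
# at lag 1, even though the residuals themselves are serially uncorrelated.
e2 = eps ** 2
acf_e2 = np.corrcoef(e2[:-1], e2[1:])[0, 1]
print(acf_e2 > 0.0)
```

In the paper's workflow this filtering step comes first, so that the copula then models dependence between the standardized residuals rather than between raw, heteroscedastic series.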

  16. Clinical review: International comparisons in critical care - lessons learned.

    PubMed

    Murthy, Srinivas; Wunsch, Hannah

    2012-12-12

    Critical care medicine is a global specialty and epidemiologic research among countries provides important data on availability of critical care resources, best practices, and alternative options for delivery of care. Understanding the diversity across healthcare systems allows us to explore that rich variability and understand better the nature of delivery systems and their impact on outcomes. However, because the delivery of ICU services is complex (for example, interplay of bed availability, cultural norms and population case-mix), the diversity among countries also creates challenges when interpreting and applying data. This complexity has profound influences on reported outcomes, often obscuring true differences. Future research should emphasize determination of resource data worldwide in order to understand current practices in different countries; this will permit rational pandemic and disaster planning, allow comparisons of in-ICU processes of care, and facilitate addition of pre- and post-ICU patient data to better interpret outcomes.

  17. A New Method for FMECA Based on Fuzzy Theory and Expert System

    NASA Astrophysics Data System (ADS)

    Byeon, Yoong-Tae; Kim, Dong-Jin; Kim, Jin-O.

    2008-10-01

    Failure Mode, Effects and Criticality Analysis (FMECA) is one of the most widely used methods in modern engineering systems for investigating potential failure modes and their severity for the system. FMECA evaluates the criticality and severity of each failure mode and visualizes the risk-level matrix by assigning those indices to its column and row variables, respectively. Generally, these indices are determined subjectively by experts and operators, a process that inevitably involves uncertainty. In this paper, a method for eliciting expert opinions that accounts for this uncertainty is proposed to evaluate criticality and severity. In addition, a fuzzy expert system is constructed to determine a crisp risk-level value for each failure mode. Finally, an illustrative example system is analyzed in the case study. The results are worth considering when deciding appropriate policies for each component of the system.
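
    A minimal sketch of the kind of fuzzy inference described here: triangular membership functions, two Mamdani-style rules combining severity and criticality, and centroid defuzzification to a crisp risk level. The membership shapes, rule base, and 0-10 scales are hypothetical; the paper's actual expert system is more elaborate.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_risk(severity, criticality):
    """Map severity and criticality scores (0-10) to a crisp risk level using
    two Mamdani-style rules and centroid defuzzification."""
    z = np.linspace(0.0, 10.0, 501)                  # risk universe of discourse
    # Rule 1: IF severity is HIGH AND criticality is HIGH THEN risk is HIGH
    w_high = min(tri(severity, 5, 10, 15), tri(criticality, 5, 10, 15))
    # Rule 2: IF severity is LOW OR criticality is LOW THEN risk is LOW
    w_low = max(tri(severity, -5, 0, 5), tri(criticality, -5, 0, 5))
    # Clip each output set by its rule strength, aggregate by max:
    agg = np.maximum(np.minimum(w_high, tri(z, 5, 10, 15)),
                     np.minimum(w_low, tri(z, -5, 0, 5)))
    return float(np.sum(z * agg) / np.sum(agg))      # centroid of the aggregate

print(fuzzy_risk(9, 8) > fuzzy_risk(2, 3))           # higher scores -> higher risk
```

The centroid step is what turns the experts' uncertain, overlapping judgments into the single crisp value needed to place a failure mode in the risk-level matrix.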

  18. Variable-rate nitrogen application algorithm based on canopy reflected spectrum and its influence on wheat

    NASA Astrophysics Data System (ADS)

    Liang, Hongxia; Zhao, Chunjiang; Huang, Wenjiang; Liu, Liangyun; Wang, Jihua; Ma, Youhua

    2005-01-01

    This study aimed to develop a time-specific, time-critical method to overcome the limitations of traditional field sampling methods for variable-rate fertilization. Farmers, agricultural managers, and grain processing enterprises are interested in measuring and assessing soil and crop status in order to apply adequate fertilizer quantities for crop growth. This paper focused on the relationship between a vegetation index (OSAVI) and nitrogen content to determine the amount of nitrogen fertilizer recommended for variable-rate management in precision agriculture. Traditional uniform-rate fertilizer management was chosen as the control (CK). Grain yield, ear numbers, 1000-grain weight, and grain protein content were measured for the CK, uniform treatments, and variable-rate fertilizer treatments. Results indicated that, compared to traditional uniform application, variable-rate fertilization reduced the variability of wheat yield, ear numbers, and dry biomass, but it did not significantly increase crop yield or grain protein content, nor did it decrease the variability of 1000-grain weight. Nitrogen fertilizer use efficiency was improved; accordingly, variable-rate technology based on the vegetation index could be used to prevent groundwater pollution and environmental deterioration.
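
    The vegetation index used here, OSAVI (Rondeaux et al., 1996), is computed from near-infrared and red canopy reflectances as OSAVI = (NIR − Red)/(NIR + Red + 0.16). A short sketch with illustrative reflectances, not field measurements from this study:

```python
import numpy as np

def osavi(nir, red):
    """Optimized Soil-Adjusted Vegetation Index (Rondeaux et al., 1996):
    OSAVI = (NIR - Red) / (NIR + Red + 0.16), reflectances in [0, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 0.16)

# Illustrative canopy reflectances (not field data from this study):
nir = np.array([0.45, 0.50, 0.30])
red = np.array([0.08, 0.06, 0.15])
print(np.round(osavi(nir, red), 3))   # [0.536 0.611 0.246]
```

The fixed 0.16 soil-adjustment term is what distinguishes OSAVI from plain NDVI and reduces the influence of soil background at partial canopy cover.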

  19. Antecedent Moisture and Biological Inertia as Predictors of Plant and Ecosystem Productivity in Arid and Semiarid Systems

    NASA Astrophysics Data System (ADS)

    Ogle, K.

    2011-12-01

    Many plant and ecosystem processes in arid and semiarid systems may be affected by antecedent environmental conditions (e.g., precipitation patterns, soil water availability, temperature) that integrate over past days, weeks, months, seasons, or years. However, the importance of such antecedent exogenous effects relative to conditions occurring at the time of the observed process is relatively unexplored. Even less is known about the potential importance of antecedent endogenous effects that describe the influence of past ecosystem states on the current ecosystem state; e.g., how is current ecosystem productivity related to past productivity patterns? We hypothesize that incorporation of antecedent exogenous and endogenous factors can improve our predictive understanding of many plant and ecosystem processes, especially in arid and semiarid ecosystems. Furthermore, the common approach to quantifying the effects of antecedent (exogenous) variables relies on arbitrary, deterministic definitions of antecedent variables that (1) may not accurately describe the role of antecedent conditions and (2) ignore uncertainty associated with applying deterministic definitions. In this study, we employ a stochastic framework for (1) computing the antecedent variables that estimates the relative importance of conditions experienced each time unit into the past, also providing insight into potential lag responses, and (2) estimating the effect of antecedent factors on the response variable of interest. 
We employ this approach to explore the potential roles of antecedent exogenous and endogenous influences in three settings that illustrate: (1) the importance of antecedent precipitation for net primary productivity in the shortgrass steppe in northern Colorado, (2) the dependency of tree growth on antecedent precipitation and past growth states for pinyon growing in western Colorado, and (3) the influence of antecedent soil water and prior root status on observed root growth in the Mojave Desert FACE experiment. All three examples suggest that antecedent conditions are critical to predicting different indices of productivity, such that the incorporation of antecedent effects explained an additional 20-40% of the variation in the productivity responses. Antecedent endogenous factors were important for understanding tree and root growth, suggesting a potential biological inertia effect that is likely linked to labile carbon storage and allocation strategies. The role of antecedent exogenous (water) variables suggests a lag response whose duration and timing differ according to the time scale of the response variable. In summary, antecedent water availability and past endogenous states appear critical to understanding plant and ecosystem productivity in arid and semiarid systems, and this study describes a stochastic framework for quantifying the potential influence of such antecedent conditions.
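
    The stochastic computation of an antecedent variable can be sketched as a weighted sum of past conditions, with lag weights constrained to the simplex (positive, summing to 1) so their relative sizes express the importance of each month into the past. In the sketch below the weights are simply drawn from a Dirichlet prior and the precipitation series is synthetic; in the actual framework the weights would be estimated from data.

```python
import numpy as np

rng = np.random.default_rng(1)

precip = rng.gamma(2.0, 20.0, size=120)   # 10 years of synthetic monthly precipitation (mm)
n_lags = 12                               # integrate conditions up to 12 months back

# Lag weights live on the simplex (positive, summing to 1); here one candidate
# vector is drawn from a flat Dirichlet prior. In the actual framework these
# weights are estimated, and their sizes reveal which past months matter most.
w = rng.dirichlet(np.ones(n_lags))

# Antecedent precipitation at month t: weighted sum over the previous n_lags
# months, with w[0] weighting the most recent month.
ante = np.array([np.dot(w, precip[t - n_lags:t][::-1])
                 for t in range(n_lags, len(precip))])
print(ante.shape)                         # (108,)
```

Estimating the weights rather than fixing them is what replaces the arbitrary, deterministic definitions of antecedent variables criticized in the abstract, and the posterior weight profile directly reveals lag responses.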

  20. Mapping standing dead trees (snags) in the aftermath of the 2013 Rim Fire using airborne LiDAR data.

    NASA Astrophysics Data System (ADS)

    Casas Planes, Á.; Garcia-Alonso, M.; Koltunov, A.; Ustin, S.; Falk, M.; Ramirez, C.; Siegel, R.

    2014-12-01

    Abundance and spatial distribution of standing dead trees (snags) are key indicators of forest biodiversity and ecosystem health and represent a critical component of habitat for various wildlife species, including the great grey owl and the black-backed woodpecker. In this work we assess the potential of light detection and ranging (LiDAR) to discriminate snags from live trees and map their distribution. The study area encompasses the burn perimeter of the Rim Fire, the third largest wildfire in California's recorded history (~104,000 ha), and represents a heterogeneous mosaic of mixed conifer forests, hardwood, and meadows. The snag mapping procedure is based on 3D single-tree detection using a watershed algorithm and the extraction of height and intensity metrics within each segment. Variables selected using Gaussian processes form a feature space for a classifier to distinguish between dead and live trees. Finally, snag density and snag diameter classes that are relevant for avian species are mapped. This work shows the use of LiDAR metrics to quantify ecological variables related to the vertical heterogeneity of the forest canopy that are important in the identification of snags, for example, fractional cover. We observed that intensity-related variables are critical to the successful identification of snags and their distribution. Our study highlights the importance of high-density LiDAR for characterizing the forest structural variables that contribute to the assessment of wildlife habitat suitability.

  1. Variability and change of sea level and its components in the Indo-Pacific region during the altimetry era

    NASA Astrophysics Data System (ADS)

    Wu, Quran; Zhang, Xuebin; Church, John A.; Hu, Jianyu

    2017-03-01

    Previous studies have shown that regional sea level exhibits interannual and decadal variations associated with modes of climate variability. A better understanding of these low-frequency sea level variations benefits the detection and attribution of climate change signals. Nonetheless, the contributions of thermosteric, halosteric, and mass sea level components to sea level variability and trend patterns remain unclear. By focusing on signals associated with dominant climate modes in the Indo-Pacific region, we estimate the interannual and decadal fingerprints and trend of each sea level component using a multivariate linear regression of two adjoint-based ocean reanalyses. Sea level interannual, decadal, and trend patterns primarily come from thermosteric sea level (TSSL). Halosteric sea level (HSSL) is of regional importance in the Pacific Ocean on decadal time scales and dominates sea level trends in the northeast subtropical Pacific. Compensation between TSSL and HSSL is identified in their decadal variability and trends. The interannual and decadal variability of temperature generally peaks in the subsurface around 100 m, whereas that of salinity tends to be surface-intensified. Decadal temperature and salinity signals extend deeper into the ocean in some regions than their interannual equivalents. Mass sea level (MassSL) is critical for the interannual and decadal variability of sea level over shelf seas. Inconsistencies exist in MassSL trend patterns among the various estimates. This study highlights regions where multiple processes work together to control sea level variability and change. Further work is required to better understand the interaction of different processes in those regions.
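
    The multivariate regression step can be sketched as a single least-squares solve that recovers the trend and the amplitude of each climate-mode fingerprint simultaneously. The sea level series and mode indices below are synthetic stand-ins; the study itself uses adjoint-based ocean reanalyses and observed climate-mode indices.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 240                                    # 20 years of monthly samples

t = np.arange(n) / 12.0                    # time in years
enso = rng.standard_normal(n)              # synthetic interannual mode index
pdo = np.sin(2 * np.pi * t / 10.0)         # synthetic decadal mode (10-year cycle)

# Synthetic sea level anomaly (mm): trend + mode fingerprints + noise.
sla = 3.0 * t + 12.0 * enso + 8.0 * pdo + rng.normal(0.0, 2.0, n)

# One least-squares solve recovers intercept, trend, and both fingerprint
# amplitudes simultaneously.
X = np.column_stack([np.ones(n), t, enso, pdo])
coef, *_ = np.linalg.lstsq(X, sla, rcond=None)
trend, a_enso, a_pdo = coef[1], coef[2], coef[3]
print(round(trend, 1), round(a_enso, 1), round(a_pdo, 1))
```

Applied grid point by grid point to each sea level component, the fitted coefficients form the interannual and decadal fingerprint maps and the trend pattern discussed in the abstract.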

  2. Critical Zone Architecture and the Last Glacial Legacy in Unglaciated North America

    NASA Astrophysics Data System (ADS)

    Marshall, J. A.; Roering, J. J.; Rempel, A. W.; Bartlein, P. J.; Merritts, D. J.; Walter, R. C.

    2015-12-01

    As fresh bedrock is exhumed into the Critical Zone and intersects with water and life, rock attributes controlling geochemical reactions, hydrologic routing, accommodation space for roots, surface area, and the mobile fraction of regolith are set not just by present-day processes, but are predicated on the 'ghosts' of past processes embedded in the subsurface architecture. Easily observable modern ecosystem processes such as tree throw can erase the past and bias our interpretation of landscape evolution. Abundant paleoenvironmental records demonstrate that unglaciated regions experienced profound climate changes through the late Pleistocene-Holocene transition, but studies quantifying how environmental variables affect erosion and weathering rates in these settings often marginalize or even forego consideration of the role of past climate regimes. Here we combine seven downscaled Last Glacial Maximum (LGM) paleoclimate reconstructions with a state-of-the-art frost cracking model to explore frost weathering potential across the North American continent at 21 ka. We analyze existing evidence of LGM periglacial processes and features to better constrain the frost weathering model predictions. All seven models predict frost cracking across a large swath to the west of the Continental Divide, with the southernmost extent at ~35° N latitude, increasing in latitude towards the buffering influence of the Pacific Ocean. All models predict significant frost cracking in the unglaciated Rocky Mountains. To the east of the Continental Divide, model results diverge more, but all predict regions with LGM temperatures too cold for significant frost cracking (mean annual temperatures < −15 °C), corroborated by observations of permafrost relics such as ice wedges in some areas.
Our results provide a framework for coupling paleoclimate reconstructions with a predictive frost weathering model, and importantly, suggest that modeling modern Critical Zone process evolution may require a consideration of vastly different processes when rock was first exhumed into the Critical Zone reactor.

  3. Indian monsoon variability on millennial-orbital timescales

    PubMed Central

    Kathayat, Gayatri; Cheng, Hai; Sinha, Ashish; Spötl, Christoph; Edwards, R. Lawrence; Zhang, Haiwei; Li, Xianglei; Yi, Liang; Ning, Youfeng; Cai, Yanjun; Liu, Weiguo; Breitenbach, Sebastian F. M.

    2016-01-01

    The Indian summer monsoon (ISM) is critical to billions of people living in the region. Yet, significant debates remain on the primary drivers of the ISM on millennial-orbital timescales. Here, we use speleothem oxygen isotope (δ18O) data from Bittoo cave, Northern India, to reconstruct ISM variability over the past 280,000 years. We find strong coherence between North Indian and Chinese speleothem δ18O records from the East Asian monsoon domain, suggesting that both Asian monsoon subsystems exhibit a coupled response to changes in Northern Hemisphere summer insolation (NHSI) without significant temporal lags, supporting the view that tropical-subtropical monsoon variability is driven directly by precession-induced changes in NHSI. Comparisons of the North Indian record with both Antarctic ice core and sea-surface temperature records from the southern Indian Ocean over the last glacial period do not suggest a dominant role of Southern Hemisphere climate processes in regulating ISM variability on millennial-orbital timescales. PMID:27071753

  4. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. This comprises risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, as well as studies assessing the linkage of the unit operations. The outcome of the studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' which the manufacturing process can have on the variability of the CQAs, which in turn is used to define the testing and monitoring strategy. Copyright © 2016 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  5. Effects of Processing Variables on Tantalum Nitride by Reactive-Ion-Assisted Magnetron Sputtering Deposition

    NASA Astrophysics Data System (ADS)

    Wei, Chao‑Tsang; Shieh, Han‑Ping D.

    2006-08-01

    The binary compound tantalum nitride (TaN) and ternary tantalum tungsten nitrides (Ta1-xWxNy) exhibit interesting properties such as a high melting point, high hardness, and chemical inertness. Such nitrides were deposited on a tungsten carbide (WC) die and silicon wafers by ion-beam-sputter evaporation of the respective metal under nitrogen ion-assisted deposition (IAD). The effects of the processing variables N2/Ar flux ratio, post-annealing, ion-assisted deposition, deposition rate, and W doping on hardness, critical scratch load, oxidation resistance, stress, and surface roughness were investigated. The optimum N2/Ar flux ratios, in terms of the hardness and critical load of the TaN and Ta1-xWxNy films, ranged from 0.9 to 1.0. Doping W into TaN to form Ta1-xWxNy films led to significant increases in hardness, critical load, and oxidation resistance, and reduced surface roughness. The optimum doping ratio was [W/(W+Ta)] = 0.85. The deposition rate and IAD experiments showed that the stress in the films is contributed mainly by sputtered atoms. A lower deposition rate at a high N2/Ar flux ratio resulted in higher compressive stress, and a high compressive residual stress accounts for high hardness. Because the relatively high compressive stress was attributed primarily to peening by atoms, ions, and electrons during film growth, the Ta1-xWxNy films showed excellent hardness and strength at high temperature, and sticking phenomena can essentially be avoided through their use. Ta1-xWxNy films outperformed the TaN film in terms of mechanical properties and oxidation resistance.

  6. EPE analysis of sub-N10 BEoL flow with and without fully self-aligned via using Coventor SEMulator3D

    NASA Astrophysics Data System (ADS)

    Franke, Joern-Holger; Gallagher, Matt; Murdoch, Gayle; Halder, Sandip; Juncker, Aurelie; Clark, William

    2017-03-01

    During the last few decades, the semiconductor industry has been able to scale device performance up while driving costs down. What started off as simple geometrical scaling, driven mostly by advances in lithography, has recently been accompanied by advances in processing techniques and in device architectures. The trend of combining efforts in process technology and lithography is expected to intensify, as further scaling becomes ever more difficult. One promising component of future nodes is "scaling boosters", i.e. processing techniques that enable further scaling. An indispensable component in developing these ever more complex processing techniques is semiconductor process modeling software. Visualization of complex 3D structures in SEMulator3D, along with budget analysis on film thicknesses, CD, and etch budgets, allows process integrators to compare flows before any physical wafers are run. Hundreds of "virtual" wafers allow comparison of different processing approaches, along with EUV or DUV patterning options for defined layers and different overlay schemes. This "virtual fabrication" technology produces massively parallel process variation studies that would be highly time-consuming or expensive in experiment. Here, we focus on one particular scaling booster, the fully self-aligned via (FSAV). We compare metal-via-metal (me-via-me) chains with self-aligned and fully self-aligned vias using a calibrated model of imec's N7 BEoL flow. To model overall variability, 3D Monte Carlo modeling of as many variability sources as possible is critical. We use Coventor SEMulator3D to extract minimum me-me distances and contact areas and show how fully self-aligned vias allow better me-via distance control and tighter via-me contact area variability compared with the standard self-aligned via (SAV) approach.

  7. The Predictive Brain State: Asynchrony in Disorders of Attention?

    PubMed Central

    Ghajar, Jamshid; Ivry, Richard B.

    2015-01-01

    It is postulated that a key function of attention in goal-oriented behavior is to reduce performance variability by generating anticipatory neural activity that can be synchronized with expected sensory information. A network encompassing the prefrontal cortex, parietal lobe, and cerebellum may be critical in the maintenance and timing of such predictive neural activity. Dysfunction of this temporal process may constitute a fundamental defect in attention, causing working memory problems, distractibility, and decreased awareness. PMID:19074688

  8. Journalists' Occupational Stress: A Comparative Study between Reporting Critical Events and Domestic News.

    PubMed

    Monteiro, Susana; Marques-Pinto, Alexandra

    2017-07-27

    Nowadays, journalism is considered a stressful occupation, not only due to the stress perceived in journalists' daily work but also due to the critical, potentially traumatic events they report. However, research on journalists' occupational stress in both these professional settings is still scarce. This study aims to characterize and compare occupational stress variables perceived by journalists in their daily work and in critical scenarios. Taking the Holistic Model of Occupational Stress by Nelson and Simmons (2003) as a framework, 25 Portuguese journalists, all with experience in reporting critical events, were interviewed on their perceptions of some core variables of the model: occupational stressors, distress and eustress emotional reactions, and the consequences of these experiences on their well-being. Differences among these core variables, according to the number of deployments to a critical event, were statistically analysed in order to ascertain whether repeated exposure to trauma influenced journalists' occupational stress perceptions. The data content analysis showed that occupational stressors and emotional reactions differed across settings, while the consequences associated with journalists' experiences were perceived as being mainly negative in both occupational contexts. Significant differences were identified in some of these variables according to the number of deployments to a critical event (p < .05). These findings may contribute to a reflection on the role of media organizations in preparing and supporting journalists in their work performance, and on the promotion of occupational health within the scope of journalists' daily work and critical events. The article closes with considerations for future studies.

  9. Functional Relationships for Investigating Cognitive Processes

    PubMed Central

    Wright, Anthony A.

    2013-01-01

    Functional relationships (from systematic manipulation of critical variables) are advocated for revealing fundamental processes of (comparative) cognition—through examples from my work in psychophysics, learning, and memory. Functional relationships for pigeon wavelength (hue) discrimination revealed best discrimination at the spectral points of hue transition for pigeons—a correspondence (i.e., functional relationship) similar to that for humans. Functional relationships for learning revealed: Item-specific or relational learning in matching to sample as a function of the pigeons’ sample-response requirement, and same/different abstract-concept learning as a function of the training set size for rhesus monkeys, capuchin monkeys, and pigeons. Functional relationships for visual memory revealed serial position functions (a 1st order functional relationship) that changed systematically with retention delay (a 2nd order relationship) for pigeons, capuchin monkeys, rhesus monkeys, and humans. Functional relationships for rhesus-monkey auditory memory also revealed systematic changes in serial position functions with delay, but these changes were opposite to those for visual memory. Functional relationships for proactive interference revealed interference that varied as a function of a ratio of delay times. Functional relationships for change detection memory revealed (qualitative) similarities and (quantitative) differences in human and monkey visual short term memory as a function of the number of memory items. It is concluded that these findings were made possible by varying critical variables over a substantial portion of the manipulable range to generate functions and derive relationships. PMID:23174335

  10. Distracted driving in elderly and middle-aged drivers.

    PubMed

    Thompson, Kelsey R; Johnson, Amy M; Emerson, Jamie L; Dawson, Jeffrey D; Boer, Erwin R; Rizzo, Matthew

    2012-03-01

    Automobile driving is a safety-critical real-world example of multitasking. A variety of roadway and in-vehicle distracter tasks create information processing loads that compete for the neural resources needed to drive safely. Drivers with mind and brain aging may be particularly susceptible to distraction due to waning cognitive resources and control over attention. This study examined distracted driving performance in an instrumented vehicle (IV) in 86 elderly (mean=72.5 years, SD=5.0 years) and 51 middle-aged drivers (mean=53.7 years, SD=9.3 years) under a concurrent auditory-verbal processing load created by the Paced Auditory Serial Addition Task (PASAT). Compared to baseline (no-task) driving performance, distraction was associated with reduced steering control in both groups, with middle-aged drivers showing a greater increase in steering variability. The elderly drove slower and showed decreased speed variability during distraction compared to middle-aged drivers. They also tended to "freeze up", spending significantly more time holding the gas pedal steady, another tactic that may mitigate time-pressured integration and control of information, thereby freeing mental resources to maintain situation awareness. While 39% of elderly and 43% of middle-aged drivers committed significantly more driving safety errors during distraction, 28% and 18%, respectively, actually improved, compatible with allocation of attention resources to safety-critical tasks under a cognitive load. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Distracted Driving in Elderly and Middle-Aged Drivers

    PubMed Central

    Thompson, Kelsey R.; Johnson, Amy M.; Emerson, Jamie L.; Dawson, Jeffrey D.; Boer, Erwin R.

    2011-01-01

    Automobile driving is a safety-critical real-world example of multitasking. A variety of roadway and in-vehicle distracter tasks create information processing loads that compete for the neural resources needed to drive safely. Drivers with mind and brain aging may be particularly susceptible to distraction due to waning cognitive resources and control over attention. This study examined distracted driving performance in an instrumented vehicle (IV) in 86 elderly (mean = 72.5 years, SD = 5.0 years) and 51 middle-aged drivers (mean = 53.7 years, SD = 9.3 years) under a concurrent auditory-verbal processing load created by the Paced Auditory Serial Addition Task (PASAT). Compared to baseline (no-task) driving performance, distraction was associated with reduced steering control in both groups, with middle-aged drivers showing a greater increase in steering variability. The elderly drove slower and showed decreased speed variability during distraction compared to middle-aged drivers. They also tended to “freeze up”, spending significantly more time holding the gas pedal steady, another tactic that may mitigate time-pressured integration and control of information, thereby freeing mental resources to maintain situation awareness. While 39% of elderly and 43% of middle-aged drivers committed significantly more driving safety errors during distraction, 28% and 18%, respectively, actually improved, compatible with allocation of attention resources to safety-critical tasks under a cognitive load. PMID:22269561

  12. A molecular investigation of soil organic carbon composition across a subalpine catchment

    USGS Publications Warehouse

    Hsu, Hsiao-Tieh; Lawrence, Corey R.; Winnick, Matthew J.; Bargar, John R.; Maher, Katharine

    2018-01-01

    The dynamics of soil organic carbon (SOC) storage and turnover are a critical component of the global carbon cycle. Mechanistic models seeking to represent these complex dynamics require detailed SOC compositions, which are currently difficult to characterize quantitatively. Here, we address this challenge by using a novel approach that combines Fourier transform infrared spectroscopy (FT-IR) and bulk carbon X-ray absorption spectroscopy (XAS) to determine the abundance of SOC functional groups, using elemental analysis (EA) to constrain the total amount of SOC. We used this SOC functional group abundance (SOC-fga) method to compare variability in SOC compositions as a function of depth across a subalpine watershed (East River, Colorado, USA) and found a large degree of variability in SOC functional group abundances between sites at different elevations. Soils at a lower elevation are predominantly composed of polysaccharides, while soils at a higher elevation have more substantial portions of carbonyl, phenolic, or aromatic carbon. We discuss the potential drivers of differences in SOC composition between these sites, including vegetation inputs, internal processing and losses, and elevation-driven environmental factors. Although numerical models would facilitate the understanding and evaluation of the observed SOC distributions, quantitative and meaningful measurements of SOC molecular compositions are required to guide such models. Comparison among commonly used characterization techniques on shared reference materials is a critical next step for advancing our understanding of the complex processes controlling SOC compositions.

  13. Critical care nursing: Embedded complex systems.

    PubMed

    Trinier, Ruth; Liske, Lori; Nenadovic, Vera

    2016-01-01

    Variability in parameters such as heart rate, respiratory rate and blood pressure defines healthy physiology and the ability of the person to adequately respond to stressors. Critically ill patients have lost this variability and require highly specialized nursing care to support life and monitor changes in condition. The critical care environment is a dynamic system through which information flows. The critical care unit is typically designed as a tree structure with generally one attending physician and multiple nurses and allied health care professionals. Information flow through the system allows for identification of deteriorating patient status and timely intervention for rescue from further deleterious effects. Nurses provide the majority of direct patient care in the critical care setting in 2:1, 1:1 or 1:2 nurse-to-patient ratios. The bedside nurse-critically ill patient relationship represents the primary, real-time feedback loop of information exchange, monitoring and treatment. Variables that enhance information flow through this loop and support timely nursing intervention can improve patient outcomes, while barriers can lead to errors and adverse events. Examining patient information flow in the critical care environment from a dynamic systems perspective provides insights into how nurses deliver effective patient care and prevent adverse events.

  14. Therapists' experiences and perceptions of teamwork in neurological rehabilitation: critical happenings in effective and ineffective teamwork.

    PubMed

    Suddick, Kitty M; De Souza, Lorraine H

    2007-12-01

    This paper reports the second part of an exploratory study into occupational therapists' and physiotherapists' perceptions and experiences of teamwork in neurological rehabilitation: the factors that were thought to influence effective and ineffective teamwork, and the meaning behind effective and ineffective teamwork in neurological rehabilitation. The study was undertaken through semi-structured interviews of 10 therapists from three different neurological rehabilitation teams based in the United Kingdom, and used the critical incident technique. Through analysis of the data, several main themes emerged regarding the perceived critical happenings in effective and ineffective teamwork. These were: team events and characteristics, team members' characteristics, shared and collaborative working practices, communication, specific organizational structures, environmental, external, and patient and family-related factors. Effective and ineffective teamwork was perceived to impact on a number of levels: having implications for the team, the patient, individual team members, and the neurological rehabilitation service. The study supported the perceived value of teamwork within neurological rehabilitation. It also indicated the extensive and variable factors that may influence the team-working process as well as the complex and diverse nature of the process.

  15. Bivariate and multivariate analyses of the correlations between stability of the erythrocyte membrane, serum lipids and hematological variables.

    PubMed

    Bernardino Neto, M; de Avelar, E B; Arantes, T S; Jordão, I A; da Costa Huss, J C; de Souza, T M T; de Souza Penha, V A; da Silva, S C; de Souza, P C A; Tavares, M; Penha-Silva, N

    2013-01-01

    The observation that the fluidity must remain within a critical interval, outside which the stability and functionality of the cell tend to decrease, shows that stability, fluidity and function are related and that the measure of erythrocyte stability allows inferences about the fluidity or functionality of these cells. This study determined the biochemical and hematological variables that are directly or indirectly related to erythrocyte stability in a population of 71 volunteers. Data were evaluated by bivariate and multivariate analysis. The erythrocyte stability showed a greater association with hematological variables than with the biochemical variables. The RDW stands out for its strong correlation with the stability of the erythrocyte membrane, without being heavily influenced by other factors. Regarding the biochemical variables, the erythrocyte stability was most sensitive to LDL-C. Erythrocyte stability was significantly associated with RDW and LDL-C. Thus, the level of LDL-C is a consistent link between stability and functionality, suggesting that a measure of stability could be one more indirect parameter for assessing the risk of degenerative processes associated with high levels of LDL-C.
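
    The bivariate-versus-multivariate contrast described in this abstract can be sketched in a few lines. The variables (rdw, ldl, a stability index), effect sizes, and data below are invented stand-ins for the study's measurements, used only to show the two analysis styles side by side.

```python
import numpy as np

# Hedged sketch of bivariate vs. multivariate analysis of the kind the
# abstract describes. The variables, effect sizes, and data are invented
# stand-ins, not the study's measurements.
rng = np.random.default_rng(0)
n = 71                                     # sample size from the abstract
rdw = rng.normal(13.5, 1.0, n)             # red cell distribution width (%)
ldl = rng.normal(110.0, 25.0, n)           # LDL-C (mg/dL)
# construct a stability index driven mainly by RDW, weakly by LDL-C
stability = 0.8 * rdw + 0.004 * ldl + rng.normal(0, 0.3, n)

# bivariate: simple Pearson correlations, one predictor at a time
r_rdw = np.corrcoef(stability, rdw)[0, 1]
r_ldl = np.corrcoef(stability, ldl)[0, 1]

# multivariate: ordinary least squares on both predictors at once
X = np.column_stack([np.ones(n), rdw, ldl])
beta, *_ = np.linalg.lstsq(X, stability, rcond=None)
```

    On this construction the bivariate correlation with RDW dominates, mirroring the abstract's finding that RDW tracks membrane stability more strongly than the biochemical variables do.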

  16. Reducing Design Risk Using Robust Design Methods: A Dual Response Surface Approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Yeniay, Ozgur; Lepsch, Roger A. (Technical Monitor)

    2003-01-01

    Space transportation system conceptual design is a multidisciplinary process containing a considerable element of risk. Risk here is defined as the variability in the estimated (output) performance characteristic of interest resulting from the uncertainties in the values of several disciplinary design and/or operational parameters. Uncertainties from one discipline (and/or subsystem) may propagate to another through linking parameters, and the final system output may have a significant accumulation of risk. This variability can result in significant deviations from the expected performance. Therefore, an estimate of variability (which is called design risk in this study) together with the expected performance characteristic value (e.g. mean empty weight) is necessary for multidisciplinary optimization for a robust design. Robust design in this study is defined as a solution that minimizes variability subject to a constraint on mean performance characteristics. Even though multidisciplinary design optimization has gained wide attention and applications, the treatment of uncertainties to quantify and analyze design risk has received little attention. This research effort explores the dual response surface approach to quantify variability (risk) in critical performance characteristics (such as weight) during conceptual design.
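
    The robust-design formulation above (minimize variability subject to a mean constraint) can be sketched with the dual response surface idea: one surface for the mean of a performance characteristic and one for its standard deviation. Both quadratic surfaces below are invented placeholders, not the study's fitted models.

```python
import numpy as np

# Hedged sketch of the dual response surface approach: fit one surface for
# the mean of a performance characteristic (e.g., empty weight) and one for
# its standard deviation (design risk), then pick the design minimizing
# variability subject to a mean constraint. Both quadratics are invented.
def mean_weight(x1, x2):            # expected performance (hypothetical units)
    return 60 + 5 * x1 - 3 * x2 + 2 * x1 ** 2

def std_weight(x1, x2):             # variability (risk) surface
    return 4 + 3 * (x1 - 0.5) ** 2 + 2 * (x2 + 0.2) ** 2

# robust design: minimize std subject to mean <= target, via grid search
grid = np.linspace(-1, 1, 41)       # coded design variables in [-1, 1]
best = None
for x1 in grid:
    for x2 in grid:
        if mean_weight(x1, x2) <= 65:        # mean-performance constraint
            s = std_weight(x1, x2)
            if best is None or s < best[0]:
                best = (s, x1, x2)
```

    The grid search stands in for whatever optimizer is actually used; the point is that the objective is the variability surface while the mean surface enters only as a constraint.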

  17. Handgrip fatiguing exercise can provide objective assessment of cancer-related fatigue: a pilot study.

    PubMed

    Veni, T; Boyas, S; Beaune, B; Bourgeois, H; Rahmani, A; Landry, S; Bochereau, A; Durand, S; Morel, B

    2018-06-24

    As a subjective symptom, cancer-related fatigue is assessed via patient-reported outcomes. Due to the inherent bias of such evaluation, screening and treatment for cancer-related fatigue remain suboptimal. The purpose is to evaluate whether objective mechanical parameters of cancer patients' hand muscles (maximal force, critical force, force variability), extracted from a fatiguing handgrip exercise, may be correlated with the different dimensions (physical, emotional, and cognitive) of cancer-related fatigue. Fourteen women with advanced breast cancer, still under or having previously received chemotherapy within the preceding 3 months, and 11 healthy women participated in the present study. Cancer-related fatigue was first assessed through the EORTC QLQ-30 and its fatigue module. Fatigability was then measured during 60 maximal repeated handgrip contractions. The maximum force, critical force (asymptote of the force-time evolution), and force variability (root mean square of the successive differences) were extracted. Multiple regression models were performed to investigate the influence of the force parameters on cancer-related fatigue's dimensions. The multiple linear regression analysis evidenced that physical fatigue was best explained by maximum force and critical force (r = 0.81; p = 0.029). The emotional fatigue was best explained by maximum force, critical force, and force variability (r = 0.83; p = 0.008). The cognitive fatigue was best explained by critical force and force variability (r = 0.62; p = 0.035). The handgrip maximal force, critical force, and force variability may offer objective measures of the different dimensions of cancer-related fatigue and could provide a complementary approach to patient-reported outcomes.
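
    The three force parameters named in this abstract can be illustrated on synthetic data. The exponential force-decay model, the plateau window used for the asymptote, and all numbers below are assumptions for illustration, not the authors' exact extraction procedure.

```python
import numpy as np

# Hedged sketch of the three force parameters on synthetic data: maximal
# force, critical force (asymptote of force over contractions), and force
# variability (RMS of successive differences). All values are invented.
rng = np.random.default_rng(1)
n = 60                                    # 60 maximal repeated contractions
k = np.arange(n)
f_max_true, f_crit_true = 350.0, 210.0    # Newtons (hypothetical)
# simulated peak force per contraction: decay from maximum to an asymptote
force = f_crit_true + (f_max_true - f_crit_true) * np.exp(-k / 15) \
        + rng.normal(0, 5.0, n)

maximal_force = force.max()               # maximal force
critical_force = force[-15:].mean()       # late plateau approximates asymptote
# force variability: root mean square of the successive differences
force_variability = np.sqrt(np.mean(np.diff(force) ** 2))
```

    In the study the asymptote is presumably obtained by fitting the force-time evolution rather than averaging a plateau window; the mean of the late contractions is used here only as a simple stand-in.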

  18. Optimization of critical quality attributes in continuous twin-screw wet granulation via design space validated with pilot scale experimental data.

    PubMed

    Liu, Huolong; Galbraith, S C; Ricart, Brendon; Stanton, Courtney; Smith-Goettler, Brandye; Verdi, Luke; O'Connor, Thomas; Lee, Sau; Yoon, Seongkyu

    2017-06-15

    In this study, the influence of key process variables (screw speed, throughput and liquid to solid (L/S) ratio) of a continuous twin screw wet granulation (TSWG) was investigated using a central composite face-centered (CCF) experimental design method. Regression models were developed to predict the process responses (motor torque, granule residence time), granule properties (size distribution, volume average diameter, yield, relative width, flowability) and tablet properties (tensile strength). The effects of the three key process variables were analyzed via contour and interaction plots. The experimental results have demonstrated that all the process responses, granule properties and tablet properties are influenced by changing the screw speed, throughput and L/S ratio. The TSWG process was optimized to produce granules with a specific volume average diameter of 150 μm and a yield of 95% based on the developed regression models. A design space (DS) was built based on a volume average granule diameter between 90 and 200 μm and a granule yield larger than 75%, with a failure probability analysis using Monte Carlo simulations. Validation experiments successfully validated the robustness and accuracy of the DS generated using the CCF experimental design in optimizing a continuous TSWG process. Copyright © 2017 Elsevier B.V. All rights reserved.
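
    The Monte Carlo failure-probability analysis behind such a design space can be sketched as follows. The two response surfaces below are invented placeholders, not the paper's fitted regression models; only the acceptance limits (90-200 μm diameter, yield above 75%) follow the abstract.

```python
import numpy as np

# Hedged sketch of a Monte Carlo failure-probability check over a design
# space. Response surfaces and noise levels are hypothetical placeholders.
rng = np.random.default_rng(2)

def granule_diameter(screw_speed, ls_ratio):
    # hypothetical response surface: volume average diameter (um)
    return 60 + 0.2 * screw_speed + 400 * ls_ratio - 0.0004 * screw_speed ** 2

def granule_yield(screw_speed, ls_ratio):
    # hypothetical response surface: yield (%)
    return 70 + 0.02 * screw_speed + 60 * ls_ratio

def failure_probability(screw_speed, ls_ratio, n=10_000):
    """P(diameter outside [90, 200] um or yield <= 75%) under model noise."""
    d = granule_diameter(screw_speed, ls_ratio) + rng.normal(0, 5.0, n)
    y = granule_yield(screw_speed, ls_ratio) + rng.normal(0, 2.0, n)
    return np.mean((d < 90) | (d > 200) | (y <= 75))
```

    A candidate operating point is kept inside the design space only if its failure probability stays below a chosen risk threshold; under these placeholder surfaces a mid-range setting passes while an extreme one fails.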

  19. Impact of youth cultural orientation on perception of family process and development among Korean Americans.

    PubMed

    Choi, Yoonsun; Kim, Tae Yeun; Pekelnicky, Dina Drankus; Kim, Kihyun; Kim, You Seung

    2017-04-01

    This study examined how cultural orientations influence youth perception of family processes in Korean American families and how these family processes, in turn, predict depressive symptoms and antisocial behaviors among youth. Family processes were examined separately for maternal and paternal variables. This study used survey data from Korean American families living in the Midwest (256 youth and their parents) across 2 time periods spanning over a year. At the time of the first interview, the average age of youth was 13 (SD = 1.00). Using structural equation modeling, this study tested the hypothesized associations concurrently, longitudinally, and accounting for earlier outcomes. Results show that identity and behavioral enculturation in one's heritage culture are predictors of bonding with parents, which is notably protective for youth. The results highlight the critical effect of enculturation in enhancing youth perception of the parent-child relationship. Behavioral acculturation to mainstream culture, in contrast, predicts youth problems, although the effect may not necessarily always be via family processes. Similarly, Korean and English language proficiencies predict fewer youth problems, but not always by way of family processes. A few differences emerged across maternal and paternal variables, although there was much commonality in the hypothesized relationships. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Impact of Youth Cultural Orientation on Perception of Family Process and Development among Korean Americans

    PubMed Central

    Choi, Yoonsun; Kim, Tae Yeun; Pekelnicky, Dina Drankus; Kim, Kihyun; Kim, You Seung

    2016-01-01

    Objectives This study examined how cultural orientations influence youth perception of family processes in Korean American families, and how these family processes in turn predict depressive symptoms and antisocial behaviors among youth. Family processes were examined separately for maternal and paternal variables. Methods This study used survey data from Korean American families living in the Midwest (256 youth and their parents) across two time periods spanning over a year. At the time of the first interview, the average age of youth was 13 (SD=1.00). Using Structural Equation Modeling, this study tested the hypothesized associations concurrently, longitudinally, and accounting for earlier outcomes. Results and Conclusion Results show that identity and behavioral enculturation in one’s heritage culture are predictors of bonding with parents, which is notably protective for youth. The results highlight the critical effect of enculturation in enhancing youth perception of the parent-child relationship. Behavioral acculturation to mainstream culture, in contrast, predicts youth problems, although the effect may not necessarily always be via family processes. Similarly, Korean and English language proficiencies predict fewer youth problems, but not always by way of family processes. A few differences emerged across maternal and paternal variables, although there was much commonality in the hypothesized relationships. PMID:27429061

  1. A critical view of the quest for brain structural markers of Albert Einstein's special talents (a pot of gold under the rainbow).

    PubMed

    Colombo, Jorge A

    2018-06-01

    Assertions attempting to link glial and macrostructural brain events with Albert Einstein's cognitive performance are critically reviewed. One basic problem arises from attempting to draw causal relationships regarding complex, delicately interactive functional processes involving finely tuned molecular and connectivity phenomena expressed in cognitive performance, based on highly variable brain structural events of a single, aged, formalin-fixed brain. Data weaknesses and logical flaws are considered. In other instances, similar neuroanatomical observations have received different interpretations and conclusions from those drawn here, e.g., in schizophrenic brains. Observations on white matter events also raise methodological queries. Additionally, neurocognitive considerations on other intellectual aptitudes of A. Einstein were simply ignored.

  2. Estimating heterotrophic respiration at large scales: Challenges, approaches, and next steps

    DOE PAGES

    Bond-Lamberty, Ben; Epron, Daniel; Harden, Jennifer; ...

    2016-06-27

    Heterotrophic respiration (HR), the aerobic and anaerobic processes mineralizing organic matter, is a key carbon flux but one impossible to measure at scales significantly larger than small experimental plots. This impedes our ability to understand carbon and nutrient cycles, benchmark models, or reliably upscale point measurements. Given that a new generation of highly mechanistic, genomic-specific global models is not imminent, we suggest that a useful step to improve this situation would be the development of Decomposition Functional Types (DFTs). Analogous to plant functional types (PFTs), DFTs would abstract and capture important differences in HR metabolism and flux dynamics, allowing modelers and experimentalists to efficiently group and vary these characteristics across space and time. We argue that DFTs should be initially informed by top-down expert opinion, but ultimately developed using bottom-up, data-driven analyses, and provide specific examples of potential dependent and independent variables that could be used. We present an example clustering analysis to show how annual HR can be broken into distinct groups associated with global variability in biotic and abiotic factors, and demonstrate that these groups are distinct from (but complementary to) already-existing PFTs. A similar analysis incorporating observational data could form the basis for future DFTs. Finally, we suggest next steps and critical priorities: collection and synthesis of existing data; more in-depth analyses combining open data with rigorous testing of analytical results; using point measurements and realistic forcing variables to constrain process-based models; and planning by the global modeling community for decoupling decomposition from fixed site data. These are all critical steps to build a foundation for DFTs in global models, thus providing the ecological and climate change communities with robust, scalable estimates of HR.

  3. Intrinsic vs. spurious long-range memory in high-frequency records of environmental radioactivity. Critical re-assessment and application to indoor 222Rn concentrations from Coimbra, Portugal

    NASA Astrophysics Data System (ADS)

    Donner, R. V.; Potirakis, S. M.; Barbosa, S. M.; Matos, J. A. O.; Pereira, A. J. S. C.; Neves, L. J. P. F.

    2015-05-01

    The presence or absence of long-range correlations in environmental radioactivity fluctuations has recently attracted considerable interest. Among a multiplicity of practically relevant applications, identifying and disentangling the environmental factors controlling the variable concentrations of the radioactive noble gas radon is important for estimating its effect on human health and the efficiency of possible measures for reducing the corresponding exposure. In this work, we present a critical re-assessment of a multiplicity of complementary methods that have been previously applied for evaluating the presence of long-range correlations and fractal scaling in environmental radon variations, with a particular focus on the specific properties of the underlying time series. As an illustrative case study, we subsequently re-analyze two high-frequency records of indoor radon concentrations from Coimbra, Portugal, each of which spans several weeks of continuous measurements at a high temporal resolution of five minutes. Our results reveal that at the study site, radon concentrations exhibit complex multi-scale dynamics with qualitatively different properties at different time-scales: (i) essentially white noise in the high-frequency part (up to time-scales of about one hour), (ii) spurious indications of a non-stationary, apparently long-range correlated process (at time-scales between some hours and one day) arising from marked periodic components, and (iii) low-frequency variability indicating a true long-range dependent process. In the presence of such multi-scale variability, common estimators of long-range memory in time series are prone to fail if applied to the raw data without previous separation of time-scales with qualitatively different dynamics.
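
    One standard estimator in this literature is detrended fluctuation analysis (DFA). The sketch below is a generic textbook DFA-1 on synthetic series, not the authors' multi-method pipeline; it illustrates the scaling exponent α ≈ 0.5 expected for white noise versus the much larger exponent of a strongly persistent signal.

```python
import numpy as np

# Hedged sketch of detrended fluctuation analysis (DFA-1), one common
# estimator of long-range memory; a generic implementation on synthetic
# data, not the authors' exact analysis.
def dfa_exponent(x, scales):
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        resid = []
        for seg in segs:                          # linear detrend per window
            coef = np.polyfit(t, seg, 1)
            resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(resid)))
    # scaling exponent: ~0.5 for white noise, larger for persistent signals
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(3)
scales = [16, 32, 64, 128, 256]
alpha_white = dfa_exponent(rng.normal(size=4096), scales)
alpha_brown = dfa_exponent(np.cumsum(rng.normal(size=4096)), scales)
```

    As the abstract warns, applying such an estimator to the raw radon record without first separating time-scales would mix regimes (i)-(iii) and bias the fitted exponent.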

  4. Estimating heterotrophic respiration at large scales: Challenges, approaches, and next steps

    USGS Publications Warehouse

    Bond-Lamberty, Ben; Epron, Daniel; Harden, Jennifer W.; Harmon, Mark E.; Hoffman, Forrest; Kumar, Jitendra; McGuire, Anthony David; Vargas, Rodrigo

    2016-01-01

    Heterotrophic respiration (HR), the aerobic and anaerobic processes mineralizing organic matter, is a key carbon flux but one impossible to measure at scales significantly larger than small experimental plots. This impedes our ability to understand carbon and nutrient cycles, benchmark models, or reliably upscale point measurements. Given that a new generation of highly mechanistic, genomic-specific global models is not imminent, we suggest that a useful step to improve this situation would be the development of “Decomposition Functional Types” (DFTs). Analogous to plant functional types (PFTs), DFTs would abstract and capture important differences in HR metabolism and flux dynamics, allowing modelers and experimentalists to efficiently group and vary these characteristics across space and time. We argue that DFTs should be initially informed by top-down expert opinion, but ultimately developed using bottom-up, data-driven analyses, and provide specific examples of potential dependent and independent variables that could be used. We present an example clustering analysis to show how annual HR can be broken into distinct groups associated with global variability in biotic and abiotic factors, and demonstrate that these groups are distinct from (but complementary to) already-existing PFTs. A similar analysis incorporating observational data could form the basis for future DFTs. Finally, we suggest next steps and critical priorities: collection and synthesis of existing data; more in-depth analyses combining open data with rigorous testing of analytical results; using point measurements and realistic forcing variables to constrain process-based models; and planning by the global modeling community for decoupling decomposition from fixed site data. These are all critical steps to build a foundation for DFTs in global models, thus providing the ecological and climate change communities with robust, scalable estimates of HR.
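
    The example clustering analysis described above can be illustrated with a plain k-means on synthetic data. The two climate "groups", the choice of variables, and all numbers below are invented for illustration and are not the paper's data or method.

```python
import numpy as np

# Hedged sketch: grouping annual heterotrophic respiration (HR)
# observations by climatic drivers with k-means. Synthetic data only.
rng = np.random.default_rng(4)
# columns: mean annual temperature (C), precipitation (mm), annual HR (gC/m2)
cold_dry = rng.normal([2, 400, 250], [2, 50, 40], size=(50, 3))
warm_wet = rng.normal([24, 2200, 900], [2, 150, 80], size=(50, 3))
X = np.vstack([cold_dry, warm_wet])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize features

def kmeans(X, k, iters=50, seed=0):
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2),
                           axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

labels = kmeans(Xs, 2)   # should recover the two climate-driven groups
```

    A DFT-style analysis would then ask whether such data-driven groups align with, or cut across, existing PFT boundaries.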

  5. Translational applications of evaluating physiologic variability in human endotoxemia

    PubMed Central

    Scheff, Jeremy D.; Mavroudis, Panteleimon D.; Calvano, Steve E.; Androulakis, Ioannis P.

    2012-01-01

    Dysregulation of the inflammatory response is a critical component of many clinically challenging disorders such as sepsis. Inflammation is a biological process designed to lead to healing and recovery, ultimately restoring homeostasis; however, the failure to fully achieve those beneficial results can leave a patient in a dangerous persistent inflammatory state. One of the primary challenges in developing novel therapies in this area is that inflammation is comprised of a complex network of interacting pathways. Here, we discuss our approaches towards addressing this problem through computational systems biology, with a particular focus on how the presence of biological rhythms and the disruption of these rhythms in inflammation may be applied in a translational context. By leveraging the information content embedded in physiologic variability, ranging in scale from oscillations in autonomic activity driving short-term heart rate variability (HRV) to circadian rhythms in immunomodulatory hormones, there is significant potential to gain insight into the underlying physiology. PMID:23203205
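
    One of the variability signals mentioned, short-term heart rate variability (HRV), is commonly quantified in the time domain as the RMSSD of successive RR intervals. The interval values below are illustrative only, not data from the paper.

```python
import numpy as np

# Hedged sketch: RMSSD, a standard time-domain HRV index, computed on a
# short, invented run of RR intervals (milliseconds).
rr_ms = np.array([800, 812, 790, 805, 820, 798, 810], dtype=float)
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))    # ms
mean_hr = 60_000.0 / rr_ms.mean()                # beats per minute
```

    Reduced RMSSD during systemic inflammation is one example of the loss of physiologic variability the authors propose to exploit translationally.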

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dritz, K.W.; Boyle, J.M.

    This paper addresses the problem of measuring and analyzing the performance of fine-grained parallel programs running on shared-memory multiprocessors. Such processors use locking (either directly in the application program, or indirectly in a subroutine library or the operating system) to serialize accesses to global variables. Given sufficiently high rates of locking, the chief factor preventing linear speedup (besides lack of adequate inherent parallelism in the application) is lock contention - the blocking of processes that are trying to acquire a lock currently held by another process. We show how a high-resolution, low-overhead clock may be used to measure both lock contention and lack of parallel work. Several ways of presenting the results are covered, culminating in a method for calculating, in a single multiprocessing run, both the speedup actually achieved and the speedup lost to contention for each lock and to lack of parallel work. The speedup losses are reported in the same units, "processor-equivalents," as the speedup achieved. Both are obtained without having to perform the usual one-process comparison run. We also chronicle a variety of experiments motivated by actual results obtained with our measurement method. The insights into program performance that we gained from these experiments helped us to refine the parts of our programs concerned with communication and synchronization. Ultimately these improvements reduced lock contention to a negligible amount and yielded nearly linear speedup in applications not limited by lack of parallel work. We describe two generally applicable strategies ("code motion out of critical regions" and "critical-region fissioning") for reducing lock contention and one ("lock/variable fusion") applicable only on certain architectures.
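
    The core measurement idea, timestamping lock acquisition with a high-resolution clock and attributing blocked time to contention, can be sketched as follows. This is a Python illustration of the idea only; the class and variable names are hypothetical, and the paper's "processor-equivalents" accounting is not reproduced.

```python
import threading
import time

# Hedged sketch: wrap a lock so that time spent blocked on acquisition
# (contention) is accumulated via a high-resolution clock.
class InstrumentedLock:
    def __init__(self):
        self._lock = threading.Lock()
        self._meta = threading.Lock()      # protects the counter itself
        self.wait_time = 0.0               # total time spent blocked (s)

    def __enter__(self):
        t0 = time.perf_counter()           # high-resolution clock
        self._lock.acquire()
        dt = time.perf_counter() - t0      # time blocked acquiring the lock
        with self._meta:
            self.wait_time += dt
        return self

    def __exit__(self, *exc):
        self._lock.release()

lock = InstrumentedLock()
counter = 0                                # the serialized global variable

def worker():
    global counter
    for _ in range(1000):
        with lock:
            counter += 1

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

    Per-lock wait totals like `lock.wait_time`, gathered across many locks, are the raw material from which speedup lost to contention can be estimated.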

  7. Assessing the vulnerability of economic sectors to climate variability to improve the usability of seasonal to decadal climate forecasts in Europe - a preliminary concept

    NASA Astrophysics Data System (ADS)

    Funk, Daniel

    2015-04-01

    Climate variability poses major challenges for decision-makers in climate-sensitive sectors. Seasonal to decadal (S2D) forecasts provide potential value for management decisions especially in the context of climate change where information from present or past climatology loses significance. However, usable and decision-relevant tailored climate forecasts are still sparse for Europe and successful examples of application require elaborate and individual producer-user interaction. The assessment of sector-specific vulnerabilities to critical climate conditions at specific temporal scale will be a great step forward to increase the usability and efficiency of climate forecasts. A concept for a sector-specific vulnerability assessment (VA) to climate variability is presented. The focus of this VA is on the provision of usable vulnerability information which can be directly incorporated in decision-making processes. This is done by developing sector-specific climate-impact-decision-pathways and the identification of their specific time frames using data from both bottom-up and top-down approaches. The structure of common VA's for climate change related issues is adopted which envisages the determination of exposure, sensitivity and coping capacity. However, the application of the common vulnerability components within the context of climate service application poses some fundamental considerations: Exposure - the effect of climate events on the system of concern may be modified and delayed due to interconnected systems (e.g. catchment). The critical time-frame of a climate event or event sequence is dependent on system-internal thresholds and initial conditions. But also on decision-making processes which require specific lead times of climate information to initiate respective coping measures. Sensitivity - in organizational systems climate may pose only one of many factors relevant for decision making. 
The scope of "sensitivity" in this concept comprises both the potential physical response of the system of concern and the criticality of climate-related decision-making processes. Coping capacity - in an operational context coping capacity can only reduce vulnerability if it can be applied purposefully. With respect to climate vulnerabilities this refers to the availability of suitable, usable and skillful climate information. The focus for this concept is on existing S2D climate service products and their match with user needs. The outputs of the VA are climate-impact-decision-pathways which characterize critical climate conditions, estimate the role of climate in decision-making processes and evaluate the availability and potential usability of S2D climate forecast products. A classification scheme is developed for each component of the impact-pathway to assess its specific significance. The systemic character of these schemes enables a broad application of this VA across sectors where quantitative data is limited. This concept is developed and will be tested within the context of the EU-FP7 project "European Provision Of Regional Impacts Assessments on Seasonal and Decadal Timescales" EUPORIAS.

  8. Preload assessment and optimization in critically ill patients.

    PubMed

    Voga, Gorazd

    2010-01-01

    Preload assessment and optimization is the basic hemodynamic intervention in the critically ill. Besides clinical assessment, non-invasive or invasive measurement of various pressure or volume hemodynamic variables is helpful for estimating preload and fluid responsiveness. The use of dynamic variables is useful in particular subgroups of critically ill patients. In patients with inadequate preload, fluid responsiveness, and inadequate flow, treatment with crystalloids or colloids is mandatory. When a rapid hemodynamic response is necessary, colloids are preferred.

  9. Fracture Patterns within the Shale Hills Critical Zone Observatory

    NASA Astrophysics Data System (ADS)

    Singha, K.; White, T.; Perron, J.; Chattopadhyay, P. B.; Duffy, C.

    2012-12-01

    Rock fractures are known to exist within the deep Critical Zone and are expected to influence groundwater flow, but there are limited data on their orientation and spatial arrangement and no general framework for systematically predicting their effects. Here, we explore fracture patterns within the Susquehanna-Shale Hills Critical Zone Observatory, and consider how they may be influenced by weathering, rock structure, and stress via field observations of variable fracture orientation within the site, with implications for the spatial variability of structural control on hydrologic processes. Based on field observations from 16-m deep boreholes and surface outcrop, we suggest that the appropriate structural model for the watershed is steeply dipping strata with meter- to decimeter-scale folds superimposed, including a superimposed fold at the mouth of the watershed that creates a short fold limb with gently dipping strata. These settings would produce an anisotropy in the hydraulic conductivity and perhaps also flow, especially within the context of the imposed stress field. Recently conducted 2-D numerical stress modeling indicates that the proxy for shear fracture declines more rapidly with depth beneath valleys than beneath ridgelines, which may produce or enhance the spatial variability in permeability. Even if topographic stresses do not cause new fractures, they could activate and cause displacement on old fractures, making the rocks easier to erode and increasing the permeability, and potentially driving a positive feedback that enhances the growth of valley relief. Calculated stress fields are consistent with field observations, which show a rapid decline in fracture abundance with increasing depth below the valley floor, and predict a more gradual trend beneath ridgetops, leading to a more consistent (and lower) hydraulic conductivity with depth on the ridgetops when compared to the valley, where values are higher but more variable with depth. 
Hydraulic conductivity is a fundamental property controlling the zone of active flow within the watershed.

  10. Role of slack variables in quasi-Newton methods for constrained optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tapia, R.A.

    In constrained optimization the technique of converting an inequality constraint into an equality constraint by the addition of a squared slack variable is well known but rarely used. In choosing an active constraint philosophy over the slack variable approach, researchers quickly justify their choice with the standard criticisms: the slack variable approach increases the dimension of the problem, is numerically unstable, and gives rise to singular systems. It is shown that these criticisms of the slack variable approach need not apply and that the two seemingly distinct approaches are actually very closely related. In fact, the squared slack variable formulation can be used to develop a superior and more comprehensive active constraint philosophy.
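    The squared-slack conversion the abstract describes can be sketched in a few lines. The objective, constraint, and gradient-descent step size below are illustrative assumptions, not the paper's formulation: an inequality g(x) <= 0 is rewritten as g(x) + s^2 = 0, which here lets us eliminate x and minimize over the slack s alone.

    ```python
    def objective(x):
        # f(x) = (x - 2)^2, unconstrained minimum at x = 2
        return (x - 2.0) ** 2

    # Inequality constraint x <= 1 rewritten with a squared slack variable:
    #   x - 1 + s^2 = 0  =>  x = 1 - s^2  (s real, so x <= 1 holds automatically)
    def reduced_objective(s):
        return objective(1.0 - s * s)

    # Minimize over the slack s with simple gradient descent (numeric gradient)
    s = 0.5
    for _ in range(2000):
        eps = 1e-6
        grad = (reduced_objective(s + eps) - reduced_objective(s - eps)) / (2 * eps)
        s -= 0.01 * grad

    x_opt = 1.0 - s * s  # constrained minimum sits on the boundary x = 1
    print(round(x_opt, 3))
    ```

    Note how the slack substitution turns a constrained problem into an unconstrained one at the cost of one extra variable per inequality, which is exactly the trade-off the standard criticisms target.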

  11. Controlling Contagion Processes in Activity Driven Networks

    NASA Astrophysics Data System (ADS)

    Liu, Suyu; Perra, Nicola; Karsai, Márton; Vespignani, Alessandro

    2014-03-01

    The vast majority of strategies aimed at controlling contagion processes on networks consider the connectivity pattern of the system either quenched or annealed. However, in the real world, many networks are highly dynamic and evolve in time concurrently with the contagion process. Here, we derive an analytical framework for the study of control strategies specifically devised for a class of time-varying networks, namely activity-driven networks. We develop a block-variable mean-field approach that allows the derivation of the equations describing the coevolution of the contagion process and the network dynamics. We derive the critical immunization threshold and assess the effectiveness of three different control strategies. Finally, we validate the theoretical picture by numerically simulating the spreading process and control strategies in both synthetic networks and a large-scale, real-world mobile telephone call data set.
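    The class of model the authors analyze can be illustrated with a minimal SIS simulation on an activity-driven network; the activity distribution, rates, and the random-immunization control below are assumptions for this sketch, not the paper's calibrated parameters or strategies.

    ```python
    import random

    def simulate(N=500, m=3, beta=0.6, mu=0.2, steps=80, immunized_frac=0.0, seed=1):
        """SIS epidemic on an activity-driven network (sketch).
        At every step the instantaneous network is rebuilt from scratch:
        each active node fires m random links, mimicking activity-driven dynamics."""
        rng = random.Random(seed)
        activity = [0.1 + 0.9 * rng.random() for _ in range(N)]  # assumed activity rates
        immune = set(rng.sample(range(N), int(immunized_frac * N)))
        infected = set(rng.sample([i for i in range(N) if i not in immune], N // 10))
        for _ in range(steps):
            new_infected = set(infected)
            for i in range(N):
                if rng.random() < activity[i]:            # node i activates...
                    for j in rng.sample(range(N), m):     # ...and fires m random links
                        for a, b in ((i, j), (j, i)):     # transmission in both directions
                            if a in infected and b not in immune and rng.random() < beta:
                                new_infected.add(b)
            # recovery: each infected node recovers with probability mu
            infected = {i for i in new_infected if rng.random() > mu}
        return len(infected) / N

    baseline = simulate()                          # no control
    controlled = simulate(immunized_frac=0.4)      # random immunization of 40% of nodes
    print(baseline, controlled)
    ```

    Random immunization is the simplest of the control strategies one could plug into this skeleton; targeted strategies would replace the `rng.sample` choice of the immune set.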

  12. Flow Observations with Tufts and Lampblack of the Stalling of Four Typical Airfoil Sections in the NACA Variable-density Tunnel

    NASA Technical Reports Server (NTRS)

    Abbott, Ira H; Sherman, Albert

    1938-01-01

    A preliminary investigation of the stalling processes of four typical airfoil sections was made over the critical range of the Reynolds Number. Motion pictures were taken of the movements of small silk tufts on the airfoil surface as the angle of attack increased through a range of angles including the stall. At certain angles of attack, the boundary-layer flow was also indicated by the patterns formed by a suspension of lampblack in oil brushed onto the airfoil surface. These observations were analyzed together with corresponding force-test measurements to derive a picture of the stalling processes of airfoils.

  13. Evaluation of the impact of sodium lauryl sulfate source variability on solid oral dosage form development.

    PubMed

    Qiang, Dongmei; Gunn, Jocelyn A; Schultz, Leon; Li, Z Jane

    2010-12-01

    The objective of this study was to investigate the effects of sodium lauryl sulfate (SLS) from different sources on the solubilization/wetting, granulation process, and tablet dissolution of BILR 355, and to explore the potential causes. The particle size distribution, morphology, and thermal behaviors of two pharmaceutical grades of SLS, from Spectrum and Cognis, were characterized. The surface tension and drug solubility in SLS solutions were measured. BILR 355 tablets were prepared by a wet granulation process and their dissolution was evaluated. The critical micelle concentration was lower for Spectrum SLS, which resulted in a higher BILR 355 solubility. During wet granulation, less water was required to reach the same end point using Spectrum than Cognis SLS. In general, BILR 355 tablets prepared with Spectrum SLS showed higher dissolution than the tablets containing Cognis SLS. Micronization of SLS achieved the same improved tablet dissolution as micronized active pharmaceutical ingredient. The observed differences in wetting and solubilization were likely due to the different impurity levels in SLS from the two sources. This study demonstrated that SLS from different sources can have a significant impact on the wet granulation process and dissolution. Therefore, it is critical to evaluate the properties of SLS from different suppliers and then identify optimal formulation and process parameters to ensure the robustness of the drug product manufacturing process and its performance.

  14. Current aspects of Salmonella contamination in the US poultry production chain and the potential application of risk strategies in understanding emerging hazards.

    PubMed

    Rajan, Kalavathy; Shi, Zhaohao; Ricke, Steven C

    2017-05-01

    One of the leading causes of foodborne illness in poultry products is Salmonella enterica. Salmonella hazards in poultry may be estimated and possible control methods modeled and evaluated through the use of quantitative microbiological risk assessment (QMRA) models and tools. From farm to table, there are many possible routes of Salmonella dissemination and contamination in poultry. From the time chicks are hatched through growth, transportation, processing, storage, preparation, and finally consumption, the product could be contaminated through exposure to different materials and sources. Examination of each step of the process is necessary as well as an examination of the overall picture to create effective countermeasures against contamination and prevent disease. QMRA simulation models can use either point estimates or probability distributions to examine variables such as Salmonella concentrations at retail or at any given point of processing to gain insight on the chance of illness due to Salmonella ingestion. For modeling Salmonella risk in poultry, it is important to look at variables such as Salmonella transfer and cross contamination during processing. QMRA results may be useful for the identification and control of critical sources of Salmonella contamination.
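    The point-estimate versus probability-distribution contrast the abstract draws can be made concrete with a toy Monte Carlo QMRA. Every parameter value below (retail concentration distribution, serving size, cooking reduction, dose-response parameter) is an illustrative assumption, not real Salmonella data:

    ```python
    import math
    import random

    def qmra_risk(n_sims=100_000, seed=42):
        """Toy QMRA sketch: per-serving probability of illness from Salmonella.
        All distributions and the exponential dose-response parameter r are
        assumptions for illustration only."""
        rng = random.Random(seed)
        r = 5e-5                 # exponential dose-response parameter (assumed)
        total = 0.0
        for _ in range(n_sims):
            log10_conc = rng.gauss(-2.0, 1.0)       # CFU/g at retail (assumed lognormal)
            serving_g = rng.uniform(100, 300)       # serving size in grams (assumed)
            log10_kill = rng.gauss(6.0, 0.5)        # cooking log-reduction (assumed)
            dose = (10 ** log10_conc) * serving_g * 10 ** (-log10_kill)
            total += 1.0 - math.exp(-r * dose)      # exponential dose-response model
        return total / n_sims

    risk = qmra_risk()
    print(f"mean per-serving illness probability: {risk:.2e}")
    ```

    Replacing any of the sampled distributions with its mean collapses the model back to a point estimate, which is exactly the simplification that distribution-based QMRA avoids.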

  15. EPR and Bell's theorem: A critical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stapp, H.P.

    1991-01-01

    The argument of Einstein, Podolsky, and Rosen is reviewed with attention to logical structure and character of assumptions. Bohr's reply is discussed. Bell's contribution is formulated without use of hidden variables, and efforts to equate hidden variables to realism are critically examined. An alternative derivation of nonlocality that makes no use of hidden variables, microrealism, counterfactual definiteness, or any other assumption alien to orthodox quantum thinking is described in detail, with particular attention to the quartet or broken-square question.

  16. Causes of the great mass extinction of marine organisms in the Late Devonian

    NASA Astrophysics Data System (ADS)

    Barash, M. S.

    2016-11-01

    The second of the five great mass extinctions of the Phanerozoic occurred in the Late Devonian. The number of species decreased by 70-82%. Major crises occurred at the Frasnian-Famennian and Devonian-Carboniferous boundaries. The lithological and geochemical compositions of sediments, volcanic deposits, impactites, carbon and oxygen isotope ratios, evidence of climate variability, and sea level changes reflect the processes that led to the critical conditions. Critical intervals are marked by layers of black shales, which were deposited in euxinic or anoxic environments. These conditions were the main direct causes of the extinctions. The Late Devonian mass extinction was determined by a combination of impact events and extensive volcanism. These produced similar effects: emissions of harmful chemical compounds and aerosols that caused greenhouse warming; darkening of the atmosphere, which prevented photosynthesis; and stagnation of the oceans and development of anoxia. Food chains collapsed and biological productivity decreased. As a result, all vital processes were disturbed and a large portion of the biota became extinct.

  17. State Equation Determination of Cow Dung Biogas

    NASA Astrophysics Data System (ADS)

    Marzuki, A.; Wicaksono, L. B.

    2017-08-01

    A state function is a thermodynamic function relating various macroscopically measurable properties of a system (state variables) that describe the state of matter under a given set of physical conditions. A good understanding of the biogas state function plays a very important role in efforts to optimize biogas processes and to help predict combustion performance. This paper presents the step-by-step process of an experimental study aimed at determining the equation of state of cow dung biogas. The equation was derived from experimental measurements of compressibility (κ) and expansivity (β), following the general form of the gas state equation dV = βdT + κdP, where dV is the gas volume variation, dT the temperature variation, and dP the pressure variation. From these results, we formulated a unique state equation from which the biogas critical temperature (Tc) and critical pressure (Pc) were determined (Tc = 266.7 K, Pc = 5096647.5 Pa).
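    The differential form dV = βdT + κdP can be integrated numerically along a path in (T, P) once β and κ are known. The coefficient values below are illustrative assumptions, not the measured biogas values from the paper, and both coefficients are treated as constants for the sketch:

    ```python
    def volume_change(beta, kappa, dT, dP, steps=1000):
        """Integrate dV = beta*dT + kappa*dP along a straight path in (T, P).
        beta and kappa are held constant here; in general they depend on state."""
        dV = 0.0
        for _ in range(steps):
            dV += beta * (dT / steps) + kappa * (dP / steps)
        return dV

    # Illustrative (assumed) coefficients, not the paper's measured values:
    beta = 3.4e-3    # expansivity term, m^3/K
    kappa = -2.0e-7  # compressibility term, m^3/Pa (negative: volume falls as P rises)
    print(volume_change(beta, kappa, dT=10.0, dP=50_000.0))
    ```

    With constant coefficients the integral reduces to β·ΔT + κ·ΔP; the stepwise loop only matters once β and κ vary with temperature and pressure, as they do for a real gas near its critical point.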

  18. Contact patterning strategies for 32nm and 28nm technology

    NASA Astrophysics Data System (ADS)

    Morgenfeld, Bradley; Stobert, Ian; An, Ju j.; Kanai, Hideki; Chen, Norman; Aminpur, Massud; Brodsky, Colin; Thomas, Alan

    2011-04-01

    As 193 nm immersion lithography is extended indefinitely to sustain technology roadmaps, there is increasing pressure to contain escalating lithography costs by identifying patterning solutions that minimize the use of multiple-pass processes. Contact patterning for the 32/28 nm technology nodes has been greatly facilitated by the just-in-time introduction of new process enablers that allow the simultaneous support of flexible foundry-oriented ground rules alongside high-performance technology, while also migrating to a single-pass patterning process. The incorporation of device-based performance metrics, along with rigorous patterning and structural variability studies, was critical in the evaluation of material innovation for improved resolution and CD shrink, along with novel data preparation flows utilizing aggressive strategies for SRAF insertion and retargeting.

  19. Tailored metal matrix composites for high-temperature performance

    NASA Technical Reports Server (NTRS)

    Morel, M. R.; Saravanos, D. A.; Chamis, C. C.

    1992-01-01

    A multi-objective tailoring methodology is presented to maximize the stiffness and load-carrying capacity of a metal matrix cross-ply laminate at elevated temperatures. The fabrication process and fiber volume ratio are used as the design variables. A unique feature is the concurrent treatment of the effects of fabrication, residual stresses, material nonlinearity, and thermo-mechanical loading on the laminate properties in the post-fabrication phase. For a (0/90)(sub s) graphite/copper laminate, strong coupling was observed between the fabrication process, laminate characteristics, and thermo-mechanical loading. The multi-objective tailoring was found to be more effective than single-objective tailoring. Results indicate the potential to increase laminate stiffness and load-carrying capacity by controlling the critical parameters of the fabrication process and the laminate.

  20. Children's Learning in Scientific Thinking: Instructional Approaches and Roles of Variable Identification and Executive Function

    NASA Astrophysics Data System (ADS)

    Blums, Angela

    The present study examines instructional approaches and cognitive factors involved in elementary school children's thinking and learning of the Control of Variables Strategy (CVS), a critical aspect of scientific reasoning. Previous research has identified several features related to effective instruction of CVS, including using a guided learning approach, the use of self-reflective questions, and learning in individual and group contexts. The current study examined the roles of procedural and conceptual instruction in learning CVS and investigated the role of executive function in the learning process. Additionally, this study examined how learning to identify variables is a part of the CVS process. In two studies (individual and classroom experiments), 139 third, fourth, and fifth grade students participated in hands-on and paper-and-pencil CVS learning activities and, in each study, were assigned to either a procedural instruction, conceptual instruction, or control (no instruction) group. Participants also completed a series of executive function tasks. The study had two parts: Study 1 used an individual context and Study 2 was carried out in a group setting. Results indicated that procedural and conceptual instruction were more effective than no instruction, and the ability to identify variables was identified as a key piece of the CVS process. Executive function predicted the ability to identify variables and success on CVS tasks. Developmental differences were present, in that older children outperformed younger children on CVS tasks, and conceptual instruction was slightly more effective for older children. Some differences between individual and group instruction were found, with those in the individual context showing some advantage over those in the group setting in learning CVS concepts. Conceptual implications for scientific thinking and practical implications for science education are discussed.

  1. A realistic chemical system presenting a self-organized critical behavior

    NASA Astrophysics Data System (ADS)

    Gaveau, Bernard; Latrémolière, Daniel; Moreau, Michel

    2003-04-01

    We consider a realistic example of chemical system which presents self-organized criticality. We can study the kinetic equations analytically, and show that the conditions for self-organized criticality are satisfied. We find power relaxation laws for certain variables near the critical state, confirming the self-organized critical behavior.

  2. Job satisfaction and work related variables in Chinese cardiac critical care nurses.

    PubMed

    Liu, Yun-E; While, Alison; Li, Shu-Jun; Ye, Wen-Qin

    2015-05-01

    To explore critical care nurses' views of their job satisfaction and the relationship with job burnout, practice environment, coping style, social support, intention to stay in current employment and other work-related variables. Nurse shortage is a global issue, especially in critical care. Job satisfaction is the most frequently cited factor linked to nurses' turnover. A convenience sample of cardiac critical care nurses (n = 215; 97.7% response rate) from 12 large general hospitals in Shanghai was surveyed from December 2010 to March 2011. Over half of the sample reported satisfaction with their jobs. Nurses with 10-20 years of professional experience and those who had taken all their holiday entitlement reported higher levels of job satisfaction. The independent variables of practice environment, intention to stay, emotional exhaustion, personal accomplishment and positive coping style explained about 55% of the variance in job satisfaction. Chinese cardiac critical care nurses' job satisfaction was related to work related variables, which are amenable to managerial action. Our findings highlight the imperative of improving intrinsic and extrinsic rewards, together with the flexibility of work schedules to promote job satisfaction and staff retention. A clinical ladder system is needed to provide promotion opportunities for Chinese nurses. © 2013 John Wiley & Sons Ltd.

  3. 28nm node process optimization: a lithography centric view

    NASA Astrophysics Data System (ADS)

    Seltmann, Rolf

    2014-10-01

    Many experts claim that the 28nm technology node will be the most cost-effective technology node ever. This results primarily from the cost of manufacturing, since 28nm is the last true Single Patterning (SP) node, and is also driven by the dramatic increase in design costs and the limited shrink factor of the following nodes. Thus, it is assumed that this technology will still be alive for many years. To be cost competitive, high yields are mandatory. Meanwhile, leading-edge foundries have optimized the yield of the 28nm node to such a level that it is nearly exclusively defined by random defectivity. However, it was a long way to come to that level. In my talk I will concentrate on the contribution of lithography to this yield learning curve, choosing a critical metal patterning application. I will show what was needed to optimize the process window to a level beyond the usual OPC model work that was common on previous nodes. Reducing process (in particular focus) variability is a complementary need. It will be shown which improvements were needed in tooling, process control, and design-mask-wafer interaction to remove all systematic yield detractors. Over the last couple of years, new scanner platforms were introduced that targeted both better productivity and better parametric performance. But this was not a clear run-path: it needed extra efforts by the tool suppliers together with the fab to bring tool variability down to the necessary level. Another important topic for reducing variability is the interaction of wafer non-planarity and lithography optimization. Accurate knowledge of within-die topography is essential for optimum patterning. By completing both the variability-reduction work and the process-window enhancement work, we were able to turn the original marginal process budget into a robust positive budget, ensuring high yield and low cost.

  4. Follow-on proposal identifying environmental features for land management decisions

    NASA Technical Reports Server (NTRS)

    Wright, P. M.; Ridd, M. K.

    1986-01-01

    Urban morphology (an examination of spatial fabric and structure), natural ecosystems (investigations emphasizing biophysical processes and patterns), and human ecosystems (emphasizing socio-economic and engineering parameters) were studied. Transpiration, the most critical variable in the ASPCON model created by Jaynes (1978) to describe the hydrology of aspen-to-conifer succession, was studied to improve the model's accuracy. Transpiration is determined by a canopy transpiration model which estimates consumptive water use (CWU) for specific species and a plant activity index. Pinyon-Juniper woodland erosion was also studied.

  5. Perceptions of nursing students after performing an individual activity designed to develop their critical thinking: The "critical card" tool.

    PubMed

    Urcola-Pardo, Fernando; Blázquez-Ornat, Isabel; Anguas-Gracia, Ana; Gasch-Gallen, Ángel; Germán-Bes, Concepción

    2018-03-01

    Critical thinking in the Health Sciences is among the transversal competences of the Nursing Degree. The critical card is an individual learning tool designed to develop critical thinking, set in the process of environmental health learning. Every student must complete the activity to obtain the highest qualification in the Community Health Nursing subject. The aim of this project was to evaluate this learning tool using the students' perceptions after its performance. The evaluation was based on the answers to a questionnaire obtained from third-year students of the Nursing Degree at the University of Zaragoza. The questionnaire was made up of 14 Likert-type questions grouped in four dimensions. The student participation rate was higher than 50%. The analysis of the questionnaire yielded 67.8% positive answers. The variability between dimensions ranged from 49% positive answers for applicability to other subjects to 87% positive answers for improvements applicable to the instrument. The students agreed that the critical card is a useful learning tool that could be applied in other subjects. However, the weight it is given in the global evaluation of the subject is considered too low, given the time needed to complete the activity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Virtual Sensors for On-line Wheel Wear and Part Roughness Measurement in the Grinding Process

    PubMed Central

    Arriandiaga, Ander; Portillo, Eva; Sánchez, Jose A.; Cabanes, Itziar; Pombo, Iñigo

    2014-01-01

    Grinding is an advanced machining process for the manufacturing of valuable complex and accurate parts for high added value sectors such as aerospace, wind generation, etc. Due to the extremely severe conditions inside grinding machines, critical process variables such as part surface finish or grinding wheel wear cannot be easily and cheaply measured on-line. In this paper a virtual sensor for on-line monitoring of those variables is presented. The sensor is based on the modelling ability of Artificial Neural Networks (ANNs) for stochastic and non-linear processes such as grinding; the selected architecture is the Layer-Recurrent neural network. The sensor makes use of the relation between the variables to be measured and power consumption in the wheel spindle, which can be easily measured. A sensor calibration methodology is presented, and the levels of error that can be expected are discussed. Validation of the new sensor is carried out by comparing the sensor's results with actual measurements carried out in an industrial grinding machine. Results show excellent estimation performance for both wheel wear and surface roughness. In the case of wheel wear, the absolute error is within the range of microns (average value 32 μm). In the case of surface finish, the absolute error is well below Ra 1 μm (average value 0.32 μm). The present approach can be easily generalized to other grinding operations. PMID:24854055
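    The core idea of the virtual sensor, inferring hard-to-measure variables from easily measured spindle power, can be illustrated with a toy regression on synthetic data. The linear power-to-roughness relation and every number below are assumptions made for this sketch; the paper itself trains a Layer-Recurrent ANN on real grinding signals:

    ```python
    import random

    # Synthetic stand-in data: spindle power (kW) vs. surface roughness Ra (um).
    # An assumed linear relation Ra = 0.1 + 0.04 * power plus measurement noise.
    rng = random.Random(0)
    power = [5.0 + 0.05 * i for i in range(100)]
    roughness = [0.1 + 0.04 * p + rng.gauss(0, 0.01) for p in power]

    # One-variable least squares fit: Ra ~ a * power + b
    n = len(power)
    mean_p = sum(power) / n
    mean_r = sum(roughness) / n
    a = sum((p - mean_p) * (r - mean_r) for p, r in zip(power, roughness)) / \
        sum((p - mean_p) ** 2 for p in power)
    b = mean_r - a * mean_p

    # "Virtual sensor": predict roughness from an unseen power reading
    estimate = a * 7.0 + b
    print(round(estimate, 3))
    ```

    A recurrent network plays the same role as the fitted line here, but can capture the stochastic, history-dependent behavior of the grinding process that a static regression cannot.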

  7. Imperfect physician assistant and physical therapist admissions processes in the United States

    PubMed Central

    2014-01-01

    We compared and contrasted physician assistant and physical therapy profession admissions processes based on the similar number of accredited programs in the United States and the co-existence of many programs in the same school of health professions, because both professions conduct similar centralized application procedures administered by the same organization. Many studies are critical of the fallibility and inadequate scientific rigor of the high-stakes nature of health professions admissions decisions, yet typical admission processes remain very similar. Cognitive variables, most notably undergraduate grade point averages, have been shown to be the best predictors of academic achievement in the health professions. The variability of non-cognitive attributes assessed and the methods used to measure them have come under increasing scrutiny in the literature. The variance in health professions students’ performance in the classroom and on certifying examinations remains unexplained, and cognitive considerations vary considerably between and among programs that describe them. One uncertainty resulting from this review is whether or not desired candidate attributes highly sought after by individual programs are more student-centered or graduate-centered. Based on the findings from the literature, we suggest that student success in the classroom versus the clinic is based on a different set of variables. Given the range of positions and general lack of reliability and validity in studies of non-cognitive admissions attributes, we think that health professions admissions processes remain imperfect works in progress. PMID:24810020

  8. Application of a quality by design approach to the cell culture process of monoclonal antibody production, resulting in the establishment of a design space.

    PubMed

    Nagashima, Hiroaki; Watari, Akiko; Shinoda, Yasuharu; Okamoto, Hiroshi; Takuma, Shinya

    2013-12-01

    This case study describes the application of Quality by Design elements to the process of culturing Chinese hamster ovary cells in the production of a monoclonal antibody. All steps in the cell culture process and all process parameters in each step were identified by using a cause-and-effect diagram. Prospective risk assessment using failure mode and effects analysis identified the following four potential critical process parameters in the production culture step: initial viable cell density, culture duration, pH, and temperature. These parameters and lot-to-lot variability in raw material were then evaluated by process characterization utilizing a design of experiments approach consisting of a face-centered central composite design integrated with a full factorial design. Process characterization was conducted using a scaled down model that had been qualified by comparison with large-scale production data. Multivariate regression analysis was used to establish statistical prediction models for performance indicators and quality attributes; with these, we constructed contour plots and conducted Monte Carlo simulation to clarify the design space. The statistical analyses, especially for raw materials, identified set point values, which were most robust with respect to the lot-to-lot variability of raw materials while keeping the product quality within the acceptance criteria. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
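    The face-centered central composite design used in the process characterization is easy to enumerate in coded units. The sketch below assumes four factors, mirroring the four potential critical process parameters, but the run structure (one center point, axial distance alpha = 1) is a generic illustration rather than the study's exact design:

    ```python
    from itertools import product

    def face_centered_ccd(k):
        """Face-centered central composite design in coded units (alpha = 1).
        Returns 2^k factorial corners, 2k face-centered axial points,
        and one center point."""
        corners = [list(p) for p in product([-1, 1], repeat=k)]
        axial = []
        for i in range(k):
            for level in (-1, 1):
                pt = [0] * k
                pt[i] = level
                axial.append(pt)
        center = [[0] * k]
        return corners + axial + center

    design = face_centered_ccd(4)  # four factors, e.g. density, duration, pH, temperature
    print(len(design))  # 16 corners + 8 axial + 1 center = 25 runs
    ```

    Because the axial points sit on the faces of the factorial cube, no factor ever leaves its low/high range, which is convenient when parameter limits are fixed by equipment or cell viability.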

  9. So Many Brands and Varieties to Choose from: Does This Compromise the Control of Food Intake in Humans?

    PubMed Central

    Hardman, Charlotte A.; Ferriday, Danielle; Kyle, Lesley; Rogers, Peter J.; Brunstrom, Jeffrey M.

    2015-01-01

    The recent rise in obesity is widely attributed to changes in the dietary environment (e.g., increased availability of energy-dense foods and larger portion sizes). However, a critical feature of our “obesogenic environment” may have been overlooked - the dramatic increase in “dietary variability” (the tendency for specific mass-produced foods to be available in numerous varieties that differ in energy content). In this study we tested the hypothesis that dietary variability compromises the control of food intake in humans. Specifically, we examined the effects of dietary variability in pepperoni pizza on two key outcome variables; i) compensation for calories in pepperoni pizza and ii) expectations about the satiating properties of pepperoni pizza (expected satiation). We reasoned that dietary variability might generate uncertainty about the postingestive effects of a food. An internet-based questionnaire was completed by 199 adults. This revealed substantial variation in exposure to different varieties of pepperoni pizza. In a follow-up study (n= 66; 65% female), high pizza variability was associated with i) poorer compensation for calories in pepperoni pizza and ii) lower expected satiation for pepperoni pizza. Furthermore, the effect of uncertainty on caloric compensation was moderated by individual differences in decision making (loss aversion). For the first time, these findings highlight a process by which dietary variability may compromise food-intake control in humans. This is important because it exposes a new feature of Western diets (processed foods in particular) that might contribute to overeating and obesity. PMID:25923118

  10. Time-Warp–Invariant Neuronal Processing

    PubMed Central

    Gütig, Robert; Sompolinsky, Haim

    2009-01-01

    Fluctuations in the temporal durations of sensory signals constitute a major source of variability within natural stimulus ensembles. The neuronal mechanisms through which sensory systems can stabilize perception against such fluctuations are largely unknown. An intriguing instantiation of such robustness occurs in human speech perception, which relies critically on temporal acoustic cues that are embedded in signals with highly variable duration. Across different instances of natural speech, auditory cues can undergo temporal warping that ranges from 2-fold compression to 2-fold dilation without significant perceptual impairment. Here, we report that time-warp–invariant neuronal processing can be subserved by the shunting action of synaptic conductances that automatically rescales the effective integration time of postsynaptic neurons. We propose a novel spike-based learning rule for synaptic conductances that adjusts the degree of synaptic shunting to the temporal processing requirements of a given task. Applying this general biophysical mechanism to the example of speech processing, we propose a neuronal network model for time-warp–invariant word discrimination and demonstrate its excellent performance on a standard benchmark speech-recognition task. Our results demonstrate the important functional role of synaptic conductances in spike-based neuronal information processing and learning. The biophysics of temporal integration at neuronal membranes can endow sensory pathways with powerful time-warp–invariant computational capabilities. PMID:19582146

  11. Critical thinking and creativity in nursing: learners' perspectives.

    PubMed

    Chan, Zenobia C Y

    2013-05-01

    Although the development of critical thinking and the development of creativity are major areas in nursing programmes, little has been explored about learners' perspectives towards these two concepts, especially in Chinese contexts. This study aimed to reveal nursing learners' perspectives on creativity and critical thinking. Qualitative data collection methods were adopted, namely group interviews and concept map drawings. The process of data collection was conducted in private rooms at a university. Thirty-six nursing students from two problem-based learning classes were recruited in two groups for the study. After data collection, content analysis with an axial coding approach was conducted to explore the narrative themes, to summarise the main ideas, and to make valid inferences from the connections among critical thinking, creativity, and other exogenous variables. Based on the findings, six major themes were identified: "revisiting the meanings of critical thinking"; "critical thinking and knowledge: partners or rivals?"; "is critical thinking criticising?"; "revising the meanings of creativity"; "creativity and experience: partners or rivals?"; and "should creativity be practical?". This study showed that learners had diverse perspectives towards critical thinking and creativity, and their debate on these two domains has implications for nursing education, since the voices of learners are crucial in teaching. By closing the gap between learners and educators, this study offers some insights into nursing education in the new curriculum, in particular how to co-construct nursing knowledge that is student-driven and how to consider students' voices in understanding and applying creativity and critical thinking in nursing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Use of near-infrared spectroscopy (NIRs) in the biopharmaceutical industry for real-time determination of critical process parameters and integration of advanced feedback control strategies using MIDUS control.

    PubMed

    Vann, Lucas; Sheppard, John

    2017-12-01

    Control of biopharmaceutical processes is critical to achieving consistent product quality. The most challenging unit operation to control is cell growth in bioreactors, due to the exquisitely sensitive and complex nature of the cells that are converting raw materials into new cells and products. Current monitoring capabilities are increasing; however, the main challenge is now becoming the ability to use the data generated in an effective manner. There are a number of contributors to this challenge, including the integration of different monitoring systems as well as the functionality to perform data analytics in real-time to generate process knowledge and understanding. In addition, there is a lack of ability to easily generate strategies and close the loop to feed back into the process for advanced process control (APC). The current research aims to demonstrate the use of advanced monitoring tools along with data analytics to generate process understanding in an Escherichia coli fermentation process. NIR spectroscopy was used to measure glucose and critical amino acids in real-time to help determine the root cause of failures associated with different lots of yeast extract. First, scale-down of the process was required to execute a simple design of experiments, followed by scale-up to build NIR models as well as soft sensors for advanced process control. In addition, the research demonstrates the potential of a novel platform technology that enables manufacturers to consistently achieve "golden-batch" performance through monitoring, integration, data analytics, understanding, strategy design and control (MIDUS control). MIDUS control was employed to increase batch-to-batch consistency in final product titers, decrease the coefficient of variability from 8.49% to 1.16%, predict possible exhaust filter failures, and close the loop to prevent their occurrence and avoid lost batches.
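
The batch-to-batch consistency figure quoted above (coefficient of variability falling from 8.49% to 1.16%) is a simple statistic; a minimal sketch of its computation, with invented titer values rather than the study's data:

```python
from statistics import mean, stdev

def coefficient_of_variability(values):
    """Return the coefficient of variability of a sample, in percent."""
    return stdev(values) / mean(values) * 100.0

# Hypothetical final product titers for five batches, before and after
# closing the control loop (values invented for illustration).
before = [9.1, 10.4, 8.7, 11.0, 9.9]
after = [10.1, 10.0, 10.2, 9.9, 10.1]

print(round(coefficient_of_variability(before), 2))  # high variability
print(round(coefficient_of_variability(after), 2))   # much tighter
```

Note that `stdev` is the sample standard deviation; a population definition would give slightly smaller values.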

  13. New type side weir discharge coefficient simulation using three novel hybrid adaptive neuro-fuzzy inference systems

    NASA Astrophysics Data System (ADS)

    Bonakdari, Hossein; Zaji, Amir Hossein

    2018-03-01

    In many hydraulic structures, side weirs play a critical role. Accurately predicting the discharge coefficient is one of the most important stages in the side weir design process. In the present paper, a new, highly efficient side weir is investigated. To simulate the discharge coefficient of these side weirs, three novel soft computing methods are used. The process involves modeling the discharge coefficient with the hybrid Adaptive Neuro-Fuzzy Inference System (ANFIS) and three optimization algorithms, namely Differential Evolution (ANFIS-DE), Genetic Algorithm (ANFIS-GA) and Particle Swarm Optimization (ANFIS-PSO). In addition, sensitivity analysis is performed to find the most efficient input variables for modeling the discharge coefficient of these types of side weirs. According to the results, the ANFIS method performs better when using simpler input variables. In addition, ANFIS-DE, with an RMSE of 0.077, outperforms the ANFIS-GA and ANFIS-PSO methods, with RMSEs of 0.079 and 0.096, respectively.
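
The three hybrid models above are ranked by RMSE; a minimal sketch of that metric, using invented discharge-coefficient values (the study's own data are not reproduced here):

```python
import math

def rmse(observed, predicted):
    """Root-mean-square error between paired observations and predictions."""
    return math.sqrt(
        sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)
    )

# Invented discharge coefficients: one "better" and one "worse" model.
observed = [0.52, 0.58, 0.61, 0.55]
model_a = [0.51, 0.57, 0.63, 0.54]
model_b = [0.48, 0.60, 0.66, 0.51]

assert rmse(observed, model_a) < rmse(observed, model_b)
```

The same comparison underlies the 0.077 vs. 0.079 vs. 0.096 ranking quoted in the abstract.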

  14. Variability of the reflectance coefficient of skylight from the ocean surface and its implications to ocean color.

    PubMed

    Gilerson, Alexander; Carrizo, Carlos; Foster, Robert; Harmel, Tristan

    2018-04-16

    The value and spectral dependence of the reflectance coefficient (ρ) of skylight from wind-roughened ocean surfaces is critical for determining accurate water-leaving radiance and remote sensing reflectances from shipborne, AERONET-Ocean Color and satellite observations. Using a vector radiative transfer code, spectra of the reflectance coefficient and corresponding radiances near the ocean surface and at the top of the atmosphere (TOA) are simulated for a broad range of parameters including flat and windy ocean surfaces with wind speeds up to 15 m/s, aerosol optical thicknesses of 0-1 at 440 nm, wavelengths of 400-900 nm, and variable Sun and viewing zenith angles. Results revealed a profound impact of the aerosol load and type on the spectral values of ρ. Such impacts, not yet included in standard processing, may produce significant inaccuracies in the reflectance spectra retrieved from above-water radiometry and satellite observations. Implications for satellite cal/val activities as well as potential changes in measurement and data processing schemes are discussed.

  15. Adventures in holistic ecosystem modelling: the cumberland basin ecosystem model

    NASA Astrophysics Data System (ADS)

    Gordon, D. C.; Keizer, P. D.; Daborn, G. R.; Schwinghamer, P.; Silvert, W. L.

    A holistic ecosystem model has been developed for the Cumberland Basin, a turbid macrotidal estuary at the head of Canada's Bay of Fundy. The model was constructed as a group exercise involving several dozen scientists. Philosophy of approach and methods were patterned after the BOEDE Ems-Dollard modelling project. The model is one-dimensional, has 3 compartments and 3 boundaries, and is composed of 3 separate submodels (physical, pelagic and benthic). The 28 biological state variables cover the complete estuarine ecosystem and represent broad functional groups of organisms based on trophic relationships. Although still under development and not yet validated, the model has been verified and has reached the stage where most state variables provide reasonable output. The modelling process has stimulated interdisciplinary discussion, identified important data gaps and produced a quantitative tool which can be used to examine ecological hypotheses and determine critical environmental processes. As a result, Canadian scientists have a much better understanding of the Cumberland Basin ecosystem and are better able to provide competent advice on environmental management.

  16. Controls on the spatial variability of key soil properties: comparing field data with a mechanistic soilscape evolution model

    NASA Astrophysics Data System (ADS)

    Vanwalleghem, T.; Román, A.; Giraldez, J. V.

    2016-12-01

    There is a need to better understand the processes influencing soil formation and the resulting distribution of soil properties. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas especially are highly uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of a geostatistical versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates explained only 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain variability in the soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.
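
The validation figure above (covariates explaining only 25% of the variation) is a coefficient of determination; a minimal sketch of the R² computation, using invented observed and predicted carbon stocks rather than the study's data:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: fraction of variance explained."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Invented soil carbon stocks (kg m^-2), observed vs. model-predicted.
observed = [3.2, 4.1, 5.6, 4.8, 6.0]
predicted = [3.6, 4.0, 5.1, 5.2, 5.5]
print(round(r_squared(observed, predicted), 2))
```

A value of 0.25, as in the study, would mean three quarters of the spatial variance remains unexplained by the covariates.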

  17. Things that go bump in the light - On the optical specification of contact severity

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Phatak, Anil V.

    1993-01-01

    Psychologists are intrigued with the idea that optical variables can specify not only the time until an object impacts an observer but also the severity of the impact. However, the mapping between the optical variables and the kinematic variables has been misstated, erroneously implying that there exist critical values of the optical variables used for locomotion and control. In this commentary, the mathematical relationship between the optical and kinematic variables is reexamined and the erroneous assumptions that have led to the proposal of critical values are exposed. Also examined are the empirical data on deceleration during approach, to assess whether the proposed optical variables are likely candidates for control strategies. Finally, problems associated with numerical approximations to dynamic systems, particularly when analytic solutions exist, are discussed.

  18. A Correlational Study on Critical Thinking in Nursing as an Outcome Variable for Success

    ERIC Educational Resources Information Center

    Porter, Rebecca Jean

    2018-01-01

    Critical thinking is a required curricular outcome for nursing education; however, the literature shows a gap related to valid and reliable tools to measure critical thinking specific to nursing and relating that critical thinking measurement to meaningful outcomes. This study examined critical thinking scores, as measured by Assessment…

  19. Quality by design approach: application of artificial intelligence techniques to tablets manufactured by direct compression.

    PubMed

    Aksu, Buket; Paradkar, Anant; de Matas, Marcel; Ozer, Ozgen; Güneri, Tamer; York, Peter

    2012-12-01

    The publication of the International Conference on Harmonisation (ICH) Q8, Q9, and Q10 guidelines paved the way for the standardization of quality after the Food and Drug Administration issued current Good Manufacturing Practices guidelines in 2003. "Quality by Design", described in the ICH Q8 guideline, offers a better scientific understanding of critical process and product qualities using knowledge obtained during the life cycle of a product. In this scope, the "knowledge space" is a summary of all process knowledge obtained during product development, and the "design space" is the region within which a product can be manufactured within acceptable limits. To create these spaces, artificial neural networks (ANNs) can be used to capture the multidimensional interactions of input variables and to bind these variables closely to a design space. This helps guide the experimental design process to include interactions among the input variables, along with modeling and optimization of pharmaceutical formulations. The objective of this study was to develop an integrated multivariate approach to obtain a quality product based on an understanding of the cause-effect relationships between formulation ingredients and product properties, using ANNs and genetic programming, for ramipril tablets prepared by the direct compression method. In this study, the data were generated through the systematic application of design of experiments (DoE) principles and optimization studies using artificial neural networks and neurofuzzy logic programs.
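
The systematic DoE data generation mentioned above can be sketched as a full-factorial run plan; the factor names and levels below are hypothetical, not those of the ramipril study:

```python
from itertools import product

# Hypothetical formulation/process factors and their levels.
factors = {
    "binder_pct": [1.0, 2.5],
    "compression_force_kN": [10, 15, 20],
    "lubricant_pct": [0.5, 1.0],
}

# Full-factorial design: every combination of factor levels is one run,
# giving the ANN a systematically spanned input space to train on.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2 * 3 * 2 = 12 experimental runs
```

Fractional designs would reduce the run count at the cost of confounded interactions.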

  20. Utilisation d'analyse de concepts formels pour la gestion de variabilite d'un logiciel configure dynamiquement

    NASA Astrophysics Data System (ADS)

    Menguy, Theotime

    Because of its critical nature, the avionics industry is bound by numerous constraints, such as security standards and certifications, while having to fulfill clients' desires for personalization. In this context, variability management is a very important issue for re-engineering projects of avionics software. In this thesis, we propose a new approach, based on formal concept analysis and the semantic web, to support variability management. The first goal of this research is to identify characteristic behaviors and interactions of configuration variables in a dynamically configured system. To identify such elements, we applied formal concept analysis at different levels of abstraction in the system and defined new metrics. Then, we built a classification of the configuration variables and their relations in order to enable quick identification of a variable's behavior in the system. This classification could help in finding a systematic approach to processing variables during a re-engineering operation, depending on their category. To gain a better understanding of the system, we also studied the sections of code jointly controlled by several configuration variables. A second objective of this research is to build a knowledge platform that gathers the results of all the analyses performed and stores any additional element relevant to the variability management context, for instance new results helping to define a re-engineering process for each of the categories. To address this goal, we built a solution based on the semantic web, defining a new, extensive ontology that enables inferences about the evolution processes. The approach presented here is, to the best of our knowledge, the first classification of the configuration variables of a dynamically configured software system and an original use of documentation and variability management techniques based on the semantic web in the aeronautic field. The analyses performed and the final results show that formal concept analysis can identify specific properties and behaviors, and that the semantic web is a good solution for storing and exploring the results. However, using formal concept analysis with new boolean relations, such as the link between configuration variables and files, and defining new inferences may lead to better conclusions. Applying the same methodology to other systems would make it possible to validate the approach in other contexts.

  1. Criticality meets learning: Criticality signatures in a self-organizing recurrent neural network

    PubMed Central

    Del Papa, Bruno; Priesemann, Viola

    2017-01-01

    Many experiments have suggested that the brain operates close to a critical state, based on signatures of criticality such as power-law distributed neuronal avalanches. In neural network models, criticality is a dynamical state that maximizes information processing capacities, e.g. sensitivity to input, dynamical range and storage capacity, which makes it a favorable candidate state for brain function. Although models that self-organize towards a critical state have been proposed, the relation between criticality signatures and learning is still unclear. Here, we investigate signatures of criticality in a self-organizing recurrent neural network (SORN). Investigating criticality in the SORN is of particular interest because it was not developed to show criticality. Instead, the SORN has been shown to exhibit spatio-temporal pattern learning through a combination of neural plasticity mechanisms, and it reproduces a number of biological findings on neural variability and the statistics and fluctuations of synaptic efficacies. We show that, after a transient, the SORN spontaneously self-organizes into a dynamical state that shows criticality signatures comparable to those found in experiments. The plasticity mechanisms are necessary to attain that dynamical state, but not to maintain it. Furthermore, the onset of external input transiently changes the slope of the avalanche distributions, matching recent experimental findings. Interestingly, the membrane noise level necessary for the occurrence of the criticality signatures reduces the model's performance in simple learning tasks. Overall, our work shows that the biologically inspired plasticity and homeostasis mechanisms responsible for the SORN's spatio-temporal learning abilities can give rise to criticality signatures in its activity when driven by random input, but these break down under the structured input of short repeating sequences. PMID:28552964
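
One of the criticality signatures above, power-law distributed neuronal avalanches, starts from segmenting binned network activity into contiguous active runs; a minimal sketch with an invented toy series:

```python
def avalanche_sizes(activity):
    """Split a per-bin spike-count series at silent bins; return the
    total spike count of each contiguous active run (one avalanche)."""
    sizes, current = [], 0
    for count in activity:
        if count > 0:
            current += count
        elif current:
            sizes.append(current)
            current = 0
    if current:
        sizes.append(current)
    return sizes

# Toy binned spike counts: three avalanches separated by silent bins.
print(avalanche_sizes([0, 2, 3, 0, 0, 1, 0, 4, 4, 1, 0]))  # → [5, 1, 9]
```

In criticality analyses, the distribution of these sizes is then tested against a power law.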

  2. Assessing groundwater vulnerability to agrichemical contamination in the Midwest US

    USGS Publications Warehouse

    Burkart, M.R.; Kolpin, D.W.; James, D.E.

    1999-01-01

    Agrichemicals (herbicides and nitrate) are significant sources of diffuse pollution to groundwater. Indirect methods are needed to assess the potential for groundwater contamination by diffuse sources because groundwater monitoring is too costly to adequately define the geographic extent of contamination at a regional or national scale. This paper presents examples of the application of statistical, overlay and index, and process-based modeling methods for groundwater vulnerability assessments to a variety of data from the Midwest U.S. The principles for vulnerability assessment include both intrinsic (pedologic, climatologic, and hydrogeologic factors) and specific (contaminant and other anthropogenic factors) vulnerability of a location. Statistical methods use the frequency of contaminant occurrence, contaminant concentration, or contamination probability as a response variable. Statistical assessments are useful for defining the relations among explanatory and response variables whether they define intrinsic or specific vulnerability. Multivariate statistical analyses are useful for ranking variables critical to estimating water quality responses of interest. Overlay and index methods involve intersecting maps of intrinsic and specific vulnerability properties and indexing the variables by applying appropriate weights. Deterministic models use process-based equations to simulate contaminant transport and are distinguished from the other methods in their potential to predict contaminant transport in both space and time. An example of a one-dimensional leaching model linked to a geographic information system (GIS) to define a regional metamodel for contamination in the Midwest is included.
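
The overlay-and-index method described above can be sketched as a weighted combination of mapped attribute scores; the attributes, scores and weights below are hypothetical, in the spirit of DRASTIC-style indices rather than any specific assessment in the paper:

```python
# Hypothetical attribute weights for an index-based vulnerability map.
weights = {"soil_permeability": 0.35, "depth_to_water": 0.40, "recharge": 0.25}

def vulnerability_index(scores):
    """Weighted sum of attribute scores (each rated 1-10) for one map cell."""
    return sum(weights[k] * scores[k] for k in weights)

# One map cell after overlaying the intrinsic-property layers.
cell = {"soil_permeability": 8, "depth_to_water": 6, "recharge": 4}
print(round(vulnerability_index(cell), 2))  # 0.35*8 + 0.40*6 + 0.25*4 = 6.2
```

In a GIS, the same weighted sum is applied cell by cell across the intersected map layers.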

  3. Abortion law reforms in Colombia and Nicaragua: issue networks and opportunity contexts.

    PubMed

    Reuterswärd, Camilla; Zetterberg, Pär; Thapar-Björkert, Suruchi; Molyneux, Maxine

    2011-01-01

    This article analyses two instances of abortion law reform in Latin America. In 2006, after a decades-long impasse, the highly controversial issue of abortion came to dominate the political agenda when Colombia liberalized its abortion law and Nicaragua adopted a total ban on abortion. The article analyses the central actors in the reform processes, their strategies and the opportunity contexts. Drawing on Htun's (2003) framework, it examines why these processes concluded with opposing legislative outcomes. The authors argue for the need to understand the state as a non-unitary site of politics and policy, and for judicial processes to be seen as a key variable in facilitating gender policy reforms in Latin America. In addition, they argue that ‘windows of opportunity’ such as the timing of elections can be critically important in legislative change processes.

  4. Dimensional and material characteristics of direct deposited tool steel by CO2 laser

    NASA Astrophysics Data System (ADS)

    Choi, J.

    2006-01-01

    Laser aided direct metal/material deposition (DMD) builds metallic parts layer-by-layer directly from the CAD representation. In general, the process uses powdered metals/materials fed into a melt pool, creating fully dense parts. Success of this technology in the die and tool industry depends on the part quality that can be achieved. To obtain designed geometric dimensions and material properties, delicate control of parameters such as laser power, spot diameter, traverse speed and powder mass flow rate is critical. In this paper, the dimensional and material characteristics of direct deposited H13 tool steel by CO2 laser are investigated for the DMD process with a feedback height control system. The relationships between DMD process variables and the product characteristics are analyzed using statistical techniques. The performance of the DMD process is examined with respect to the material characteristics of hardness, porosity, microstructure, and composition.

  5. The longitudinal effect of concept map teaching on critical thinking of nursing students.

    PubMed

    Lee, Weillie; Chiang, Chi-Hua; Liao, I-Chen; Lee, Mei-Li; Chen, Shiah-Lian; Liang, Tienli

    2013-10-01

    The concept map is a useful cognitive tool for enhancing a student's critical thinking by encouraging students to process information deeply for understanding. However, there is limited understanding of the longitudinal effects of concept map teaching on students' critical thinking. The purpose of the study was to investigate the growth of, and the other factors influencing, the development of critical thinking in response to concept mapping as an interventional strategy for nursing students in a two-year registered nurse baccalaureate program. The study was a quasi-experimental and longitudinal follow-up design. A convenience sample was drawn from a university in central Taiwan. Data were collected at different time points at the beginning of each semester using structured questionnaires, including the Critical Thinking Scale and Approaches to Learning and Studying. The intervention of concept map teaching was given in the second semester, in the Medical-Surgical Nursing course. The findings revealed that students started with a mean critical thinking score of 41.32, which decreased at a rate of 0.42 over time, although not significantly. After controlling for individual characteristics, the final model revealed that the experimental group gained a higher critical thinking score across time than the control group. The best predictive variables of initial status in critical thinking were lack of clinical experience and a higher pre-test score. Growth in critical thinking was predicted best by a lower pre-test score and lower scores on surface approach and organized study. Our study suggested that the concept map is a useful teaching strategy to enhance student critical thinking. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Resolving critical dimension drift over time in plasma etching through virtual metrology based wafer-to-wafer control

    NASA Astrophysics Data System (ADS)

    Lee, Ho Ki; Baek, Kye Hyun; Shin, Kyoungsub

    2017-06-01

    As semiconductor devices are scaled down to sub-20 nm, the process window of plasma etching becomes extremely small, so that process drift or shift becomes more significant. This study addresses a typical process drift issue caused by consumable parts erosion over time and provides a feasible solution using virtual metrology (VM) based wafer-to-wafer control. Since erosion of a shower head has center-to-edge area dependency, critical dimensions (CDs) at the wafer center and edge areas become reversed over time. That CD trend is successfully estimated on a wafer-to-wafer basis by a partial least squares (PLS) model which combines variables from optical emission spectroscopy (OES), a VI-probe and equipment state gauges. The R2 of the PLS model reaches 0.89 and its prediction performance is confirmed in a mass production line. As a result, the model can be exploited as a VM for wafer-to-wafer control. With the VM, an advanced process control (APC) strategy is implemented to solve the CD drift. The three-σ CD variation across the wafer is improved from the range 1.3-2.9 nm to the range 0.79-1.7 nm. The results introduced in this paper will hopefully contribute to accelerating the implementation of VM-based APC strategies in the semiconductor industry.
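
A wafer-to-wafer feedback loop of the kind described above can be sketched with a standard EWMA run-to-run update; the gains and CD values below are invented, and this generic scheme stands in for, rather than reproduces, the paper's APC strategy:

```python
def ewma_update(offset, vm_cd, target_cd, weight=0.3, gain=1.0):
    """Blend the newest virtual-metrology CD error into the running
    recipe offset (exponentially weighted moving average)."""
    error = vm_cd - target_cd
    return (1 - weight) * offset + weight * gain * error

# Drifting per-wafer CD estimates (nm) from a VM model, invented here.
offset = 0.0
for vm_cd in [21.2, 21.0, 20.7, 20.4]:
    offset = ewma_update(offset, vm_cd, target_cd=20.0)
    # the etch recipe for the next wafer would be corrected by -offset
print(round(offset, 3))
```

The EWMA weight trades responsiveness to drift against sensitivity to metrology noise.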

  7. Development of metoprolol tartrate extended-release matrix tablet formulations for regulatory policy consideration.

    PubMed

    Nellore, R V; Rekhi, G S; Hussain, A S; Tillman, L G; Augsburger, L L

    1998-01-02

    This research study was designed to develop model extended-release (ER) matrix tablet formulations for metoprolol tartrate (100 mg) sufficiently sensitive to manufacturing variables, and to serve as the scientific basis for regulatory policy development on scale-up and post-approval changes for modified-release dosage forms (SUPAC-MR). Several grades and levels of hydroxypropyl methylcellulose (Methocel K4M, K15M, K100M and K100LV), fillers and binders were studied. Three granulation processes were evaluated: direct compression, fluid-bed and high-shear granulation. Lubrication was performed in a V-blender and tablets were compressed on an instrumented rotary tablet press. Direct compression formulations exhibited poor flow, and picking and sticking problems during tableting. High-shear granulation resulted in the formation of hard granules that were difficult to mill but yielded good tablets. Fluid-bed granulations were made using various binders and appeared satisfactory in terms of flow and tableting performance. In vitro drug release testing was performed in pH 6.8 phosphate buffer using USP apparatus 2 (paddle) at 50 rpm. At a fixed polymer level, drug release from the higher viscosity grades (K100M) was slower compared with the lower viscosity grades (K100LV). In addition, release from K100LV was found to be more sensitive to polymer level changes. Increasing the polymer level from 10 to 40% and/or changing the filler from lactose to dicalcium phosphate resulted in about a 25-30% decrease in the amount of metoprolol released after 12 h. The results of this study led to the choice of Methocel K100LV as the hydrophilic matrix polymer and fluid-bed granulation as the process of choice for further evaluation of critical and non-critical formulation and processing variables.

  8. Improving cardiac operating room to intensive care unit handover using a standardised handover process.

    PubMed

    Gleicher, Yehoshua; Mosko, Jeffrey David; McGhee, Irene

    2017-01-01

    Handovers from the cardiovascular operating room (CVOR) to the cardiovascular intensive care unit (CVICU) are complex processes involving the transfer of information, equipment and responsibility at a time when the patient is most vulnerable. This transfer is typically variable in structure, content and execution. This variability can lead to the omission or miscommunication of critical information, leading to patient harm. We set out to improve the quality of patient handover from the CVOR to the CVICU by introducing a standardised handover protocol. This study is an interventional time-series study over a 4-month period at an adult cardiac surgery centre. A standardised handover protocol was developed using quality improvement methodologies. The protocol included a handover content checklist and the introduction of a formal 'sterile cockpit' timeout. Implementation of the protocol was refined using monthly iterative Plan-Do-Study-Act cycles. The primary outcome was the quality of handovers, measured by a Handover Score comprising handover content, teamwork and patient care planning indicators. Secondary outcomes included handover duration, adherence to the standardised handover protocol and handover team satisfaction surveys. 37 handovers were observed (6 pre-intervention and 31 post-intervention). The mean handover score increased from 6.5 to 14.0 (maximum 18 points). Specific improvements included fewer handover interruptions and more frequent postoperative patient care planning. Average handover duration increased slightly from 2:40 to 2:57 min. Caregivers noted improvements in teamwork, content received and patient care planning. The majority (>95%) agreed that the intervention was a valuable addition to the CVOR to CVICU handover process. Implementation of a standardised handover protocol for postcardiac surgery patients was associated with fewer interruptions during handover, more reliable transfer of critical content and improved patient care planning.

  9. Integrated Water Resources Planning and Management in Arid/Semi-arid Regions: Data, Modeling, and Assessment

    NASA Astrophysics Data System (ADS)

    Gupta, H.; Liu, Y.; Wagener, T.; Durcik, M.; Duffy, C.; Springer, E.

    2005-12-01

    Water resources in arid and semi-arid regions are highly sensitive to climate variability and change. As the demand for water continues to increase due to economic and population growth, planning and management of available water resources under climate uncertainties becomes increasingly critical in order to achieve basin-scale water sustainability (i.e., to ensure a long-term balance between supply and demand of water). The tremendous complexity of the interactions between the natural hydrologic system and the human environment means that modeling is the only available mechanism for properly integrating new knowledge into the decision-making process. Basin-scale integrated models have the potential to allow us to study the feedback processes between the physical and human systems (including institutional, engineering, and behavioral components); and an integrated assessment of the potential second- and higher-order effects of political and management decisions can aid in the selection of a rational water-resources policy. Data and information, especially hydrological and water-use data, are critical to the integrated modeling and assessment for water resources management of any region. To this end we are in the process of developing a multi-resolution integrated modeling and assessment framework for the south-western USA, which can be used to generate simulations of the probable effects of human actions while taking into account the uncertainties brought about by future climatic variability and change. Data are being collected (including the development of a hydro-geospatial database) and used in support of the modeling and assessment activities. This paper will present a blueprint of the modeling framework, describe achievements so far and discuss the science questions which still require answers, with a particular emphasis on issues related to dry regions.

  10. A self-organized criticality model for ion temperature gradient mode driven turbulence in confined plasma

    NASA Astrophysics Data System (ADS)

    Isliker, H.; Pisokas, Th.; Strintzi, D.; Vlahos, L.

    2010-08-01

    A new self-organized criticality (SOC) model is introduced in the form of a cellular automaton (CA) for ion temperature gradient (ITG) mode driven turbulence in fusion plasmas. The main characteristics of the model are that it is constructed in terms of the actual physical variable, the ion temperature, and that the temporal evolution of the CA, which necessarily is in the form of rules, mimics actual physical processes as they are considered to be active in the system, i.e., a heating process and a local diffusive process that sets in when a threshold in the normalized ITG R/LT is exceeded. The model reaches the SOC state and yields ion temperature profiles of exponential shape, which exhibit very high stiffness, in that they are basically independent of the loading pattern applied. This implies that anomalous heat transport is present in the system, despite the fact that diffusion at the local level is imposed to be of a normal kind. The distributions of the heat fluxes in the system and of the heat out-fluxes are of power-law shape. The basic properties of the model are in good qualitative agreement with experimental results.
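
    The rule structure described above (random heating plus diffusion triggered only where a gradient threshold is exceeded) can be sketched as a minimal one-dimensional cellular automaton. This is an illustrative toy, not the authors' actual ITG model; the threshold, heating increment, and open boundary are assumptions.

```python
import random

def run_soc_ca(n_cells=30, threshold=2.0, heat=0.5, steps=8000, seed=1):
    """Minimal 1-D sandpile-style CA: random local heating plus a diffusive
    relaxation that sets in only where the local gradient exceeds a
    threshold (loosely analogous to a critical normalized gradient)."""
    random.seed(seed)
    T = [0.0] * n_cells  # temperature-like field; last cell is a heat sink
    for _ in range(steps):
        # heating process: deposit a small increment at a random interior cell
        T[random.randrange(n_cells - 1)] += heat
        # relaxation: diffuse locally wherever the gradient exceeds threshold
        unstable = True
        while unstable:
            unstable = False
            for i in range(n_cells - 1):
                if T[i] - T[i + 1] > threshold:
                    excess = 0.5 * (T[i] - T[i + 1] - threshold)
                    T[i] -= excess      # local diffusion: move the excess
                    T[i + 1] += excess  # toward the open boundary
                    unstable = True
            T[-1] = 0.0  # open boundary: heat leaves the system here
    return T

profile = run_soc_ca()
```

In the SOC state the profile is stiff: its slope stays pinned near the threshold regardless of where the heat is deposited.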

  11. The Impeller Exit Flow Coefficient As a Performance Map Variable for Predicting Centrifugal Compressor Off-Design Operation Applied to a Supercritical CO2 Working Fluid

    DOE PAGES

    Liese, Eric; Zitney, Stephen E.

    2017-06-26

    A multi-stage centrifugal compressor model is presented with emphasis on analyzing use of an exit flow coefficient vs. an inlet flow coefficient performance parameter to predict off-design conditions in the critical region of a supercritical carbon dioxide (CO2) power cycle. A description of the performance parameters is given along with their implementation in a design model (number of stages, basic sizing, etc.) and a dynamic model (for use in transient studies). A design case is shown for two compressors, a bypass compressor and a main compressor, as defined in a process simulation of a 10 megawatt (MW) supercritical CO2 recompression Brayton cycle. Simulation results are presented for a simple open cycle and closed cycle process with changes to the inlet temperature of the main compressor, which operates near the CO2 critical point. Results showed some differences between the exit and inlet flow coefficient corrections; however, these were not significant for the range of conditions examined. This paper also serves as a reference for future works, including a full process simulation of the 10 MW recompression Brayton cycle.
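
    The two performance-map variables being compared can be illustrated with generic textbook-style definitions (these are not necessarily the paper's exact correlations, and the geometry and flow values below are hypothetical):

```python
import math

def flow_coefficients(m_dot, rho_in, rho_exit, U2, D2, b2):
    """Generic inlet and exit flow coefficients for a centrifugal impeller.
    m_dot: mass flow [kg/s]; rho_in, rho_exit: densities [kg/m^3];
    U2: impeller tip speed [m/s]; D2: exit diameter [m]; b2: exit width [m]."""
    phi_inlet = m_dot / (rho_in * U2 * D2 ** 2)  # based on inlet volume flow
    A2 = math.pi * D2 * b2                       # impeller exit flow area
    cm2 = m_dot / (rho_exit * A2)                # exit meridional velocity
    phi_exit = cm2 / U2                          # based on exit conditions
    return phi_inlet, phi_exit

# Near the CO2 critical point the inlet density swings strongly, which moves
# phi_inlet but leaves phi_exit (evaluated at exit conditions) unchanged.
phi_in_a, phi_ex_a = flow_coefficients(80.0, 600.0, 650.0, 200.0, 0.10, 0.008)
phi_in_b, phi_ex_b = flow_coefficients(80.0, 500.0, 650.0, 200.0, 0.10, 0.008)
```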

  12. The Impeller Exit Flow Coefficient As a Performance Map Variable for Predicting Centrifugal Compressor Off-Design Operation Applied to a Supercritical CO2 Working Fluid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liese, Eric; Zitney, Stephen E.

    A multi-stage centrifugal compressor model is presented with emphasis on analyzing use of an exit flow coefficient vs. an inlet flow coefficient performance parameter to predict off-design conditions in the critical region of a supercritical carbon dioxide (CO2) power cycle. A description of the performance parameters is given along with their implementation in a design model (number of stages, basic sizing, etc.) and a dynamic model (for use in transient studies). A design case is shown for two compressors, a bypass compressor and a main compressor, as defined in a process simulation of a 10 megawatt (MW) supercritical CO2 recompression Brayton cycle. Simulation results are presented for a simple open cycle and closed cycle process with changes to the inlet temperature of the main compressor, which operates near the CO2 critical point. Results showed some differences between the exit and inlet flow coefficient corrections; however, these were not significant for the range of conditions examined. This paper also serves as a reference for future works, including a full process simulation of the 10 MW recompression Brayton cycle.

  13. Firm Size, a Self-Organized Critical Phenomenon: Evidence from the Dynamical Systems Theory

    NASA Astrophysics Data System (ADS)

    Chandra, Akhilesh

    This research draws upon a recent innovation in the dynamical systems literature called the theory of self-organized criticality (SOC) (Bak, Tang, and Wiesenfeld 1988) to develop a computational model of a firm's size by relating its internal and the external sub-systems. As a holistic paradigm, the theory of SOC implies that a firm as a composite system of many degrees of freedom naturally evolves to a critical state in which a minor event starts a chain reaction that can affect either a part or the system as a whole. Thus, the global features of a firm cannot be understood by analyzing its individual parts separately. The causal framework builds upon a constant capital resource to support a volume of production at the existing level of efficiency. The critical size is defined as the production level at which the average product of a firm's factors of production attains its maximum value. The non-linearity is inferred by a change in the nature of relations at the border of criticality, between size and the two performance variables, viz., the operating efficiency and the financial efficiency. The effect of breaching the critical size is examined on the stock price reactions. Consistent with the theory of SOC, it is hypothesized that the temporal response of a firm breaching the level of critical size should behave as a flicker noise (1/f) process. The flicker noise is characterized by correlations extended over a wide range of time scales, indicating some sort of cooperative effect among a firm's degrees of freedom. It is further hypothesized that a firm's size evolves to a spatial structure with scale-invariant, self-similar (fractal) properties. The system is said to be self-organized inasmuch as it naturally evolves to the state of criticality without any detailed specifications of the initial conditions. In this respect, the critical state is an attractor of the firm's dynamics.
Another set of hypotheses examines the relations between the size and the performance variables during the sub-critical (below the critical size) and the supra-critical (above the critical size) states. Since the dynamics of any two firms are likely to be different, the analysis is performed individually for each company within the Pharmaceuticals and the Perfume industries. The statistical results of this study provide evidence in support of the hypotheses. The size of a firm is found to be a self-organized critical phenomenon. The presence of 1/f noise and the spatial power-law behavior is taken as evidence of the firm's size as a self-organized critical phenomenon. (Abstract shortened by UMI.)

  14. Individual Differences in Pain: Understanding the Mosaic that Makes Pain Personal

    PubMed Central

    Fillingim, Roger B.

    2016-01-01

    The experience of pain is characterized by tremendous inter-individual variability. Multiple biological and psychosocial variables contribute to these individual differences in pain, including demographic variables, genetic factors, and psychosocial processes. For example, sex, age and ethnic group differences in the prevalence of chronic pain conditions have been widely reported. Moreover, these demographic factors have been associated with responses to experimentally-induced pain. Similarly, both genetic and psychosocial factors contribute to clinical and experimental pain responses. Importantly, these different biopsychosocial influences interact with each other in complex ways to sculpt the experience of pain. Some genetic associations with pain have been found to vary across sex and ethnic group. Moreover, genetic factors also interact with psychosocial factors, including stress and pain catastrophizing, to influence pain. The individual and combined influences of these biological and psychosocial variables result in a unique mosaic of factors that contributes to pain in each individual. Understanding these mosaics is critically important in order to provide optimal pain treatment, and future research to further elucidate the nature of these biopsychosocial interactions is needed in order to provide more informed and personalized pain care. PMID:27902569

  15. Observations and Models of Highly Intermittent Phytoplankton Distributions

    PubMed Central

    Mandal, Sandip; Locke, Christopher; Tanaka, Mamoru; Yamazaki, Hidekatsu

    2014-01-01

    The measurement of phytoplankton distributions in ocean ecosystems provides the basis for elucidating the influences of physical processes on plankton dynamics. Technological advances allow for measurement of phytoplankton data at greater resolution, displaying high spatial variability. In conventional mathematical models, the mean value of the measured variable is approximated to compare with the model output, which may misrepresent the reality of planktonic ecosystems, especially at the microscale level. To account for the intermittency of variables, a new modelling approach to the planktonic ecosystem, called the closure approach, is applied in this work. Using this approach for a simple nutrient-phytoplankton model, we have shown how consideration of the fluctuating parts of model variables can affect system dynamics. We have also found a critical value of the variance of the overall fluctuating terms below which the conventional non-closure model and the mean value from the closure model exhibit the same result. This analysis gives an idea of the importance of the fluctuating parts of model variables and of when to use the closure approach. Comparison of plots of mean versus standard deviation of phytoplankton at different depths, obtained using this new approach, with real observations shows good agreement. PMID:24787740
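
    The difference between a conventional mean-value model and a closure treatment can be sketched with a toy nutrient-phytoplankton pair whose nonlinear uptake term uses E[NP] = N·P + cov(N,P). The rate constants and the covariance value below are illustrative assumptions, not the paper's equations:

```python
def np_model_mean(n0, p0, cov, a=0.5, m=0.1, dt=0.01, steps=2000):
    """Euler integration of mean-field nutrient-phytoplankton equations in
    which the nonlinear uptake term uses E[N*P] = N*P + cov; setting cov = 0
    recovers the conventional (non-closure) model."""
    N, P = n0, p0
    for _ in range(steps):
        uptake = a * (N * P + cov)  # closure: mean product plus covariance
        dP = uptake - m * P         # phytoplankton growth minus mortality
        dN = -uptake + m * P        # nutrient uptake and recycling
        N += dt * dN
        P += dt * dP
    return N, P

N0, P0 = np_model_mean(1.0, 0.1, 0.0)   # conventional (non-closure) model
N1, P1 = np_model_mean(1.0, 0.1, 0.05)  # with a fluctuation covariance term
```

Total nutrient N + P is conserved by construction, and a positive covariance raises the effective uptake, shifting the phytoplankton equilibrium upward relative to the mean-value model.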

  16. Ground Water and Climate Change

    NASA Technical Reports Server (NTRS)

    Taylor, Richard G.; Scanlon, Bridget; Doell, Petra; Rodell, Matt; van Beek, Rens; Wada, Yoshihide; Longuevergne, Laurent; Leblanc, Marc; Famiglietti, James S.; Edmunds, Mike

    2013-01-01

    As the world's largest distributed store of fresh water, ground water plays a central part in sustaining ecosystems and enabling human adaptation to climate variability and change. The strategic importance of ground water for global water and food security will probably intensify under climate change as more frequent and intense climate extremes (droughts and floods) increase variability in precipitation, soil moisture and surface water. Here we critically review recent research assessing the impacts of climate on ground water through natural and human-induced processes as well as through groundwater-driven feedbacks on the climate system. Furthermore, we examine the possible opportunities and challenges of using and sustaining groundwater resources in climate adaptation strategies, and highlight the lack of groundwater observations, which, at present, limits our understanding of the dynamic relationship between ground water and climate.

  17. Quantum interference magnetoconductance of polycrystalline germanium films in the variable-range hopping regime

    NASA Astrophysics Data System (ADS)

    Li, Zhaoguo; Peng, Liping; Zhang, Jicheng; Li, Jia; Zeng, Yong; Zhan, Zhiqiang; Wu, Weidong

    2018-06-01

    Direct evidence of quantum interference magnetotransport in polycrystalline germanium films in the variable-range hopping (VRH) regime is reported. The temperature dependence of the conductivity of the germanium films follows the Mott VRH mechanism, σ(T) ∝ exp[-(T0/T)^(1/4)], in the low-temperature regime. For the magnetotransport behaviour of our germanium films in the VRH regime, a crossover from negative magnetoconductance at low field to positive magnetoconductance at high field is observed when the zero-field conductivity is higher than a critical value. Below this critical conductivity, the magnetoconductance is positive and quadratic in the field for some germanium films. These features are in agreement with the VRH magnetotransport theory based on the quantum interference effect among random paths in the hopping process.
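
    The Mott law is commonly verified by checking that ln σ is linear in T^(-1/4). A minimal fit of that form, on synthetic data with assumed illustrative parameter values (T0 = 5000 K, σ0 = 100 S/m), looks like:

```python
import math

def mott_sigma(T, sigma0, T0):
    """3-D Mott VRH law: sigma(T) = sigma0 * exp(-(T0 / T)**0.25)."""
    return sigma0 * math.exp(-(T0 / T) ** 0.25)

def fit_t0(temps, sigmas):
    """Least-squares line through ln(sigma) vs T**(-1/4);
    the slope equals -T0**0.25, so T0 = slope**4."""
    xs = [t ** -0.25 for t in temps]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return slope ** 4  # (-T0**0.25)**4 = T0

# Synthetic check: exact Mott data should return the T0 that generated it
temps = [2.0, 4.0, 8.0, 16.0, 32.0, 64.0]
sigmas = [mott_sigma(t, 100.0, 5000.0) for t in temps]
t0_fit = fit_t0(temps, sigmas)
```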

  18. Ground water and climate change

    USGS Publications Warehouse

    Taylor, Richard G.; Scanlon, Bridget R.; Döll, Petra; Rodell, Matt; van Beek, Rens; Wada, Yoshihide; Longuevergne, Laurent; Leblanc, Marc; Famiglietti, James S.; Edmunds, Mike; Konikow, Leonard F.; Green, Timothy R.; Chen, Jianyao; Taniguchi, Makoto; Bierkens, Marc F.P.; MacDonald, Alan; Fan, Ying; Maxwell, Reed M.; Yechieli, Yossi; Gurdak, Jason J.; Allen, Diana M.; Shamsudduha, Mohammad; Hiscock, Kevin; Yeh, Pat J.-F.; Holman, Ian; Treidel, Holger

    2012-01-01

    As the world's largest distributed store of fresh water, ground water plays a central part in sustaining ecosystems and enabling human adaptation to climate variability and change. The strategic importance of ground water for global water and food security will probably intensify under climate change as more frequent and intense climate extremes (droughts and floods) increase variability in precipitation, soil moisture and surface water. Here we critically review recent research assessing the impacts of climate on ground water through natural and human-induced processes as well as through groundwater-driven feedbacks on the climate system. Furthermore, we examine the possible opportunities and challenges of using and sustaining groundwater resources in climate adaptation strategies, and highlight the lack of groundwater observations, which, at present, limits our understanding of the dynamic relationship between ground water and climate.

  19. Probing Gas Adsorption in Zeolites by Variable-Temperature IR Spectroscopy: An Overview of Current Research.

    PubMed

    Garrone, Edoardo; Delgado, Montserrat R; Bonelli, Barbara; Arean, Carlos O

    2017-09-15

    The current state of the art in the application of variable-temperature IR (VTIR) spectroscopy to the study of (i) adsorption sites in zeolites, including dual cation sites; (ii) the structure of adsorption complexes and (iii) gas-solid interaction energy is reviewed. The main focus is placed on the potential use of zeolites for gas separation, purification and transport, but possible extension to the field of heterogeneous catalysis is also envisaged. A critical comparison with classical IR spectroscopy and adsorption calorimetry shows that the main merits of VTIR spectroscopy are (i) its ability to provide simultaneously the spectroscopic signature of the adsorption complex and the standard enthalpy change involved in the adsorption process; and (ii) the enhanced potential of VTIR to be site specific in favorable cases.

  20. A thermomechanical constitutive model for cemented granular materials with quantifiable internal variables. Part I-Theory

    NASA Astrophysics Data System (ADS)

    Tengattini, Alessandro; Das, Arghya; Nguyen, Giang D.; Viggiani, Gioacchino; Hall, Stephen A.; Einav, Itai

    2014-10-01

    This is the first of two papers introducing a novel thermomechanical continuum constitutive model for cemented granular materials. Here, we establish the theoretical foundations of the model and highlight its novelties. At the limit of no cement, the model is fully consistent with the original Breakage Mechanics model. An essential ingredient of the model is the use of measurable, micro-mechanics based internal variables describing the evolution of the dominant inelastic processes. This imposes a link between the macroscopic mechanical behavior and the statistically averaged evolution of the microstructure. As a consequence, this model requires only a few physically identifiable parameters, including those of the original breakage model and new ones describing the cement: its volume fraction, its critical damage energy and bulk stiffness, and its cohesion.

  1. Dielectric properties of magnetorheological elastomers with different microstructure

    NASA Astrophysics Data System (ADS)

    Moucka, R.; Sedlacik, M.; Cvek, M.

    2018-03-01

    Composite materials containing magnetic particles organised within the polymer matrix by means of an external magnetic field during the curing process were prepared, and their dielectric properties were compared with their isotropic analogues of the same filler concentration but homogeneous spatial distribution. A substantial dielectric response observed for anisotropic systems in the form of relaxation processes was explained as charge transport via the mechanism of variable-range hopping. The changes in the critical frequency and spectral shape of the registered relaxations with the filler concentration were discussed in terms of the decreasing anisotropy of the system. Knowledge of the dielectric response of the studied systems is essential for their practical applications such as piezoresistive sensors or radio-absorbing materials.

  2. Low-sensitivity, frequency-selective amplifier circuits for hybrid and bipolar fabrication.

    NASA Technical Reports Server (NTRS)

    Pi, C.; Dunn, W. R., Jr.

    1972-01-01

    A network is described which is suitable for realizing a low-sensitivity high-Q second-order frequency-selective amplifier for high-frequency operation. Circuits are obtained from this network which are well suited for realizing monolithic integrated circuits and which do not require any process steps more critical than those used for conventional monolithic operational and video amplifiers. A single chip version using compatible thin-film techniques for the frequency determination elements is then feasible. Center frequency and bandwidth can be set independently by trimming two resistors. The frequency selective circuits have a low sensitivity to the process variables, and the sensitivity of the center frequency and bandwidth to changes in temperature is very low.

  3. Phytoremediation and bioremediation of polychlorinated biphenyls (PCBs): state of knowledge and research perspectives.

    PubMed

    Passatore, Laura; Rossetti, Simona; Juwarkar, Asha A; Massacci, Angelo

    2014-08-15

    This review summarizes the bioremediation and phytoremediation technologies proposed so far to detoxify PCB-contaminated sites. A critical analysis of the potential and limits of PCB pollution treatment strategies by means of plants, fungi and bacteria is presented, including the new insights that have emerged from recent studies on the rhizosphere potential and on the implementation of simultaneous aerobic and anaerobic biodegradation processes. The review describes the biodegradation and phytoremediation processes and elaborates on the environmental variables affecting contaminant degradation rates, summarizing the amendments recommended to enhance PCB degradation. Additionally, issues connected with PCB toxicology, actual field remediation strategies and economic evaluation are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Online analysis and process control in recombinant protein production (review).

    PubMed

    Palmer, Shane M; Kunji, Edmund R S

    2012-01-01

    Online analysis and control are essential for efficient and reproducible bioprocesses. A key factor in real-time control is the ability to measure critical variables rapidly. Online in situ measurements are the preferred option and minimize the potential loss of sterility. The challenge is to provide sensors with a good lifespan that withstand harsh bioprocess conditions, remain stable for the duration of a process without the need for recalibration, and offer a suitable working range. In recent decades, many new techniques have arisen that promise to extend the possibilities of analysis and control, not only by providing new parameters for analysis but also by improving accepted, well-practiced measurements.

  5. REFLECTIVE PRACTICE IN ORGANIZATIONAL LEARNING, CULTURAL SELF-UNDERSTANDING, AND COMMUNITY SELF-STRENGTHENING.

    PubMed

    Sparrow, Joshua

    2016-11-01

    The infant mental health field can amplify its effects when it extends its purview beyond the dyad to the larger contexts in which infants and adult caregivers interact and develop over time. Within health, mental health, education, and other human service organizations, the quality of relationships is a critical variable in the individual-level outcomes that such organizations seek. The goals of this work and the means for accomplishing them are highly dependent on human qualities and interactions that are shaped by organizational processes. In communities, too, processes that shape relationships also strongly influence child-, family-, and community-level outcomes. The Touchpoints approach to reflective practice can guide relational processes among professionals, parents, and infants in organizations and communities that influence these outcomes. © 2016 Michigan Association for Infant Mental Health.

  6. Daily affect variability and context-specific alcohol consumption.

    PubMed

    Mohr, Cynthia D; Arpin, Sarah; McCabe, Cameron T

    2015-11-01

    Research explored the effects of variability in negative and positive affect on alcohol consumption, specifying daily fluctuation in affect as a critical form of emotion dysregulation. Using daily process methodology allows for a more objective calculation of affect variability relative to traditional self-reports. The present study models within-person negative and positive affect variabilities as predictors of context-specific consumption (i.e. solitary vs. social drinking), controlling for mean levels of affect. A community sample of moderate-to-heavy drinkers (n = 47; 49% women) from a US metropolitan area reported on affect and alcohol consumption thrice daily for 30 days via a handheld electronic interviewer. Within-person affect variability was calculated using daily standard deviations in positive and negative affect. Within person, greater negative and positive variabilities were related to greater daily solitary and social consumption. Across study days, mean levels of negative and positive affect variabilities were related to greater social consumption between persons; yet aggregated negative affect variability was related to less solitary consumption. Results affirm affect variability as a unique predictor of alcohol consumption, independent of mean affect levels. Yet it is important to differentiate the social context of consumption, as well as the type of affect variability, particularly at the between-person level. These distinctions help clarify inconsistencies in the self-medication literature regarding associations between average levels of affect and consumption. Importantly, consistent within-person relationships for both variabilities support arguments that both negative and positive affect variabilities are detrimental and reflect an inability to regulate emotional experience. © 2015 Australasian Professional Society on Alcohol and other Drugs.
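
    The variability measure used here, the standard deviation of a person's own daily reports taken separately from their mean level, is straightforward to compute; the person IDs and ratings below are made-up examples:

```python
from statistics import mean, pstdev

def within_person_variability(daily_scores):
    """daily_scores: {person_id: [daily affect ratings]} ->
    {person_id: (mean level, day-to-day standard deviation)}.
    The SD captures affect variability independent of the mean level."""
    return {pid: (mean(xs), pstdev(xs)) for pid, xs in daily_scores.items()}

scores = {
    "p1": [3.0, 3.1, 2.9, 3.0, 3.0],   # stable affect
    "p2": [1.0, 5.0, 2.0, 4.5, 2.5],   # same mean level, high variability
}
stats = within_person_variability(scores)
```

Both hypothetical participants have the same mean affect, yet very different variability, which is exactly the distinction the study's within-person analyses exploit.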

  7. Fostering Collaboration Across the U.S. Critical Zone Observatories Network

    NASA Astrophysics Data System (ADS)

    Sharkey, S.; White, T. S.

    2017-12-01

    The Critical Zone (CZ) is defined as the permeable layer from the top of the vegetation canopy to the bottom of freely circulating groundwater where rock, soil, water, air and life meet. The study of the CZ is motivated by an overall lack of understanding of the coupled physical, chemical, and biological processes in this zone at differing spatial and temporal scales. Critical Zone Observatories (CZOs), supported by the U.S. National Science Foundation's Geosciences Directorate, are natural laboratories that aim to provide infrastructure, data and models to gain understanding of the evolution and function of the CZ from grain-to-watershed scales. The nine U.S. observatories span a range of climatic, ecologic, geologic, and physiographic environments from California to Puerto Rico, working on site-specific hypotheses and network-scale goals. CZO research infrastructure allows for teams of cross-disciplinary scientists at each site to further CZ science using field and theoretical approaches, education and outreach, and cross-CZO science. Cross-CZO science emerges from a set of common CZ science questions and hypotheses focused on CZ structure and evolution, event-based and continuous fluxes across CZ interfaces, and changes in storage of major CZ reservoirs at the catchment scale. CZO research seeks to understand coupled processes across all timescales using quantitative models parameterized from observations of meteorological variables, streams, and groundwater, and sampling and analyzing landforms, bedrock, soils, and ecosystems. Each observatory strives to apply common infrastructure, protocols and measurements that help quantify the composition and fluxes of energy, water, solutes, sediments, and mass across boundaries of the CZ system through both space and time. 
This type of approach enables researchers to access and integrate data in a way that allows for the isolation of environmental variables and comparison of processes and responses across environmental gradients. There is opportunity to foster cross-collaborations with existing research infrastructure (i.e. LTER, NEON, international CZOs) to promote cross-site science and expand upon geologic, climatic, ecological, land use and hydrologic gradients required to understand the CZ.

  8. From Cocoa to Chocolate: The Impact of Processing on In Vitro Antioxidant Activity and the Effects of Chocolate on Antioxidant Markers In Vivo

    PubMed Central

    Di Mattia, Carla D.; Sacchetti, Giampiero; Mastrocola, Dino; Serafini, Mauro

    2017-01-01

    Chocolate is a product processed from cocoa rich in flavonoids, antioxidant compounds, and bioactive ingredients that have been associated with both its healthy and sensory properties. Chocolate production consists of a multistep process which, starting from cocoa beans, involves fermentation, drying, roasting, nib grinding and refining, conching, and tempering. During cocoa processing, the naturally occurring antioxidants (flavonoids) are lost, while others, such as Maillard reaction products, are formed. The final content of antioxidant compounds and the antioxidant activity of chocolate is a function of several variables, some related to the raw material and others related to processing and formulation. The aim of this mini-review is to survey the literature on the impact of full processing on the in vitro antioxidant activity of chocolate, providing a critical analysis of the implications of processing on the evaluation of the antioxidant effect of chocolate in in vivo studies in humans. PMID:29033932

  9. From Cocoa to Chocolate: The Impact of Processing on In Vitro Antioxidant Activity and the Effects of Chocolate on Antioxidant Markers In Vivo.

    PubMed

    Di Mattia, Carla D; Sacchetti, Giampiero; Mastrocola, Dino; Serafini, Mauro

    2017-01-01

    Chocolate is a product processed from cocoa rich in flavonoids, antioxidant compounds, and bioactive ingredients that have been associated with both its healthy and sensory properties. Chocolate production consists of a multistep process which, starting from cocoa beans, involves fermentation, drying, roasting, nib grinding and refining, conching, and tempering. During cocoa processing, the naturally occurring antioxidants (flavonoids) are lost, while others, such as Maillard reaction products, are formed. The final content of antioxidant compounds and the antioxidant activity of chocolate is a function of several variables, some related to the raw material and others related to processing and formulation. The aim of this mini-review is to survey the literature on the impact of full processing on the in vitro antioxidant activity of chocolate, providing a critical analysis of the implications of processing on the evaluation of the antioxidant effect of chocolate in in vivo studies in humans.

  10. Microeconomics of process control in semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Monahan, Kevin M.

    2003-06-01

    Process window control enables accelerated design-rule shrinks for both logic and memory manufacturers, but simple microeconomic models that directly link the effects of process window control to maximum profitability are rare. In this work, we derive these links using a simplified model for the maximum rate of profit generated by the semiconductor manufacturing process. We show that the ability of process window control to achieve these economic objectives may be limited by variability in the larger manufacturing context, including measurement delays and process variation at the lot, wafer, x-wafer, x-field, and x-chip levels. We conclude that x-wafer and x-field CD control strategies will be critical enablers of density, performance and optimum profitability at the 90 and 65nm technology nodes. These analyses correlate well with actual factory data and often identify millions of dollars in potential incremental revenue and cost savings. As an example, we show that a scatterometry-based CD Process Window Monitor is an economically justified, enabling technology for the 65nm node.

  11. A unified method for evaluating real-time computer controllers: A case study. [aircraft control

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Krishna, C. M.; Lee, Y. H.

    1982-01-01

    A real-time control system consists of a synergistic pair, that is, a controlled process and a controller computer. Performance measures for real-time controller computers are defined on the basis of the nature of this synergistic pair. A case study of a typical critical controlled process is presented in the context of new performance measures that express the performance of both controlled processes and real-time controllers (taken as a unit) on the basis of a single variable: controller response time. Controller response time is a function of current system state, system failure rate, electrical and/or magnetic interference, etc., and is therefore a random variable. Control overhead is expressed as a monotonically nondecreasing function of the response time, and the system suffers catastrophic failure, or dynamic failure, if the response time for a control task exceeds the corresponding system hard deadline, if any. A rigorous probabilistic approach is used to estimate the performance measures. The controlled process chosen for study is an aircraft in the final stages of descent, just prior to landing. First, the performance measures for the controller are presented. Second, control algorithms for solving the landing problem are discussed; finally, the impact of the performance measures on the problem is analyzed.
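
    The core idea, a control overhead that grows with response time plus a dynamic failure whenever the response exceeds a hard deadline, can be sketched probabilistically. The exponential response-time distribution and the cost constants here are assumptions for illustration, not the paper's model:

```python
import math

def dynamic_failure_prob(mean_response, deadline):
    """P(response > deadline) for an (assumed) exponentially distributed
    controller response time with the given mean."""
    return math.exp(-deadline / mean_response)

def expected_cost(mean_response, deadline, base_cost=1.0, penalty=1000.0):
    """Monotone control overhead plus a catastrophic penalty on a deadline
    miss (dynamic failure).  Toy overhead model: cost grows linearly with
    the expected response time."""
    p_fail = dynamic_failure_prob(mean_response, deadline)
    return base_cost * mean_response + penalty * p_fail
```

Even this toy version shows the qualitative trade-off: a slower controller pays both a larger steady overhead and a sharply rising risk of dynamic failure.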

  12. Convection in the Rayleigh-Bénard flow with all fluid properties variable

    NASA Astrophysics Data System (ADS)

    Sassos, Athanasios; Pantokratoras, Asterios

    2011-10-01

    In the present paper, the effect of variable fluid properties (density, viscosity, thermal conductivity and specific heat) on the convection in the classical Rayleigh-Bénard problem is investigated. The investigation concerns water, air, and engine oil by taking into account the variation of fluid properties with temperature. The results are obtained by numerically solving the governing equations, using the SIMPLE algorithm and covering large temperature differences. It is found that the critical Rayleigh number increases as the temperature difference increases considering all fluid properties variable. However, when the fluid properties are kept constant, calculated at the mean temperature, and only density is considered variable, the critical Rayleigh number either decreases or remains constant.
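
    The constant-property criterion that the variable-property results are compared against is the classical critical Rayleigh number. A quick evaluation with order-of-magnitude water properties (assumed illustrative values, not the paper's cases):

```python
def rayleigh_number(g, beta, dT, L, nu, kappa):
    """Ra = g * beta * dT * L**3 / (nu * kappa)."""
    return g * beta * dT * L ** 3 / (nu * kappa)

# Classical critical value for rigid-rigid horizontal plates,
# constant fluid properties
RA_CRITICAL = 1708.0

# Water layer 1 cm deep with 1 K across it (rough 20 C property values):
# beta: thermal expansion [1/K], nu: kinematic viscosity [m^2/s],
# kappa: thermal diffusivity [m^2/s]
ra = rayleigh_number(g=9.81, beta=2.1e-4, dT=1.0, L=0.01,
                     nu=1.0e-6, kappa=1.4e-7)
```

Since Ra exceeds the critical value, this layer would convect under the constant-property criterion; the paper's point is that large temperature differences shift that threshold when all properties vary.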

  13. Natural Diversity in Heat Resistance of Bacteria and Bacterial Spores: Impact on Food Safety and Quality.

    PubMed

    den Besten, Heidy M W; Wells-Bennik, Marjon H J; Zwietering, Marcel H

    2018-03-25

    Heat treatments are widely used in food processing often with the aim of reducing or eliminating spoilage microorganisms and pathogens in food products. The efficacy of applying heat to control microorganisms is challenged by the natural diversity of microorganisms with respect to their heat robustness. This review gives an overview of the variations in heat resistances of various species and strains, describes modeling approaches to quantify heat robustness, and addresses the relevance and impact of the natural diversity of microorganisms when assessing heat inactivation. This comparison of heat resistances of microorganisms facilitates the evaluation of which (groups of) organisms might be troublesome in a production process in which heat treatment is critical to reducing the microbial contaminants, and also allows fine-tuning of the process parameters. Various sources of microbiological variability are discussed and compared for a range of species, including spore-forming and non-spore-forming pathogens and spoilage organisms. This benchmarking of variability factors gives crucial information about the most important factors that should be included in risk assessments to realistically predict heat inactivation of bacteria and spores as part of the measures for controlling shelf life and safety of food products.
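    The modeling approaches surveyed in such reviews commonly start from first-order (log-linear) inactivation kinetics, summarized by D-values (time for a tenfold reduction at a given temperature) and z-values (temperature change that shifts the D-value tenfold). A minimal sketch with illustrative parameter values, not taken from the review:

```python
# Hedged sketch of classical log-linear heat inactivation (D- and z-values).
# Parameter values are illustrative, not from the review.
def d_value_at(T, D_ref, T_ref, z):
    """Decimal reduction time (min) at temperature T, from a reference D-value."""
    return D_ref * 10 ** ((T_ref - T) / z)

def log_reduction(T, t, D_ref, T_ref, z):
    """log10 reductions achieved by holding t minutes at temperature T."""
    return t / d_value_at(T, D_ref, T_ref, z)

# e.g. a hypothetical spore former with D_121 = 0.2 min and z = 10 degC:
r = log_reduction(T=121.0, t=2.4, D_ref=0.2, T_ref=121.0, z=10.0)
# 2.4 min / 0.2 min = 12 log10 reductions (a "12D" process)
```

    The natural strain-to-strain diversity discussed above enters this picture as variability in D_ref and z, which is why the review argues those distributions, not single point values, belong in risk assessments.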

  14. Carbon Nanotube Chopped Fiber for Enhanced Properties in Additive Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menchhofer, Paul A.; Johnson, Joseph E.; Lindahl, John M.

    2016-06-06

    Nanocomp Technologies, Inc. is working with Oak Ridge National Laboratory to develop carbon nanotube (CNT) composite materials and evaluate their use in additive manufacturing (3D printing). The first phase demonstrated feasibility and improvements for carbon nanotube (CNT)-acrylonitrile butadiene styrene (ABS) composite filaments for use in additive manufacturing, with potential future work centering on further improvements. By focusing the initial phase on standard processing methods (developed mainly for the incorporation of carbon fibers in ABS) and characterization techniques, a basis of knowledge for the incorporation of CNTs in ABS was established. The ability to understand the various processing variables is critical to the successful development of these composites. From the degradation effects on ABS (caused by excessive temperatures), to the length of time the ABS is in the melt state, to the order of addition of constituents, and also to the many possible mixing approaches, a workable flow sequence that addresses each processing step is critical to the final material properties. Although this initial phase could not deal with each of these variables in depth, a future study is recommended that will build on the lessons learned from this effort.

  15. Teachers' Expectations of Educational Leaders' Leadership Approach and Perspectives on the Principalship: Identifying Critical Leadership Paradigms for the 21st Century

    ERIC Educational Resources Information Center

    Thompson, Canute Sylvester

    2017-01-01

    There has been acceptance of the view that leadership is a critical variable in determining the outcomes of schools. Leithwood et al. (2004) contend that effective leadership is second only to the quality of classroom instruction in determining student outcomes. The quality of classroom instruction is a function of a number of variables including…

  16. Quantitative effects of composting state variables on C/N ratio through GA-aided multivariate analysis.

    PubMed

    Sun, Wei; Huang, Guo H; Zeng, Guangming; Qin, Xiaosheng; Yu, Hui

    2011-03-01

    It is widely known that variation of the C/N ratio is dependent on many state variables during composting processes. This study attempted to develop a genetic-algorithm-aided stepwise cluster analysis (GASCA) method to describe the nonlinear relationships between the selected state variables and the C/N ratio in food waste composting. The experimental data from six bench-scale composting reactors were used to demonstrate the applicability of GASCA. Within the GASCA framework, GA searched optimal sets of both specified state variables and SCA's internal parameters; SCA established statistical nonlinear relationships between state variables and the C/N ratio; and, to avoid unnecessary and time-consuming calculation, a proxy table was introduced that saved around 70% of the computational effort. The obtained GASCA cluster trees had smaller sizes and higher prediction accuracy than the conventional SCA trees. Based on the optimal GASCA tree, the effects of the GA-selected state variables on the C/N ratio were ranked in descending order as: NH₄+-N concentration > Moisture content > Ash content > Mean temperature > Mesophilic bacteria biomass. Such a ranking implied that the variation of ammonium nitrogen concentration, the associated temperature and moisture conditions, the total loss of both organic matter and available mineral constituents, and the mesophilic bacteria activity were critical factors affecting the C/N ratio during the investigated food waste composting. This first application of GASCA to composting modelling indicated that more direct search algorithms could be coupled with SCA or other multivariate analysis methods to analyze complicated relationships during composting and many other environmental processes. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Taking the pulse of mountains: Ecosystem responses to climatic variability

    USGS Publications Warehouse

    Fagre, Daniel B.; Peterson, David L.; Hessl, Amy E.

    2003-01-01

    An integrated program of ecosystem modeling and field studies in the mountains of the Pacific Northwest (U.S.A.) has quantified many of the ecological processes affected by climatic variability. Paleoecological and contemporary ecological data in forest ecosystems provided model parameterization and validation at broad spatial and temporal scales for tree growth, tree regeneration and treeline movement. For subalpine tree species, winter precipitation has a strong negative correlation with growth; this relationship is stronger at higher elevations and west-side sites (which have more precipitation). Temperature affects tree growth at some locations with respect to length of growing season (spring) and severity of drought at drier sites (summer). Furthermore, variable but predictable climate-growth relationships across elevation gradients suggest that tree species respond differently to climate at different locations, making a uniform response of these species to future climatic change unlikely. Multi-decadal variability in climate also affects ecosystem processes. Mountain hemlock growth at high-elevation sites is negatively correlated with winter snow depth and positively correlated with the winter Pacific Decadal Oscillation (PDO) index. At low elevations, the reverse is true. Glacier mass balance and fire severity are also linked to PDO. Rapid establishment of trees in subalpine ecosystems during this century is increasing forest cover and reducing meadow cover at many subalpine locations in the western U.S.A. and precipitation (snow depth) is a critical variable regulating conifer expansion. Lastly, modeling potential future ecosystem conditions suggests that increased climatic variability will result in increasing forest fire size and frequency, and reduced net primary productivity in drier, east-side forest ecosystems. 
As additional empirical data and modeling output become available, we will improve our ability to predict the effects of climatic change across a broad range of climates and mountain ecosystems in the northwestern U.S.A.

  18. Latent mnemonic strengths are latent: a comment on Mickes, Wixted, and Wais (2007).

    PubMed

    Rouder, Jeffrey N; Pratte, Michael S; Morey, Richard D

    2010-06-01

    Mickes, Wixted, and Wais (2007) proposed a simple test of latent strength variability in recognition memory. They asked participants to rate their confidence using either a 20-point or a 99-point strength scale and plotted distributions of the resulting ratings. They found 25% more variability in ratings for studied than for new items, which they interpreted as providing evidence that latent mnemonic strength distributions are 25% more variable for studied than for new items. We show here that this conclusion is critically dependent on assumptions--so much so that these assumptions determine the conclusions. In fact, opposite conclusions, such that study does not affect the variability of latent strength, may be reached by making different but equally plausible assumptions. Because all measurements of mnemonic strength variability are critically dependent on untestable assumptions, all are arbitrary. Hence, there is no principled method for assessing the relative variability of latent mnemonic strength distributions.

  19. Introduction of a theoretical splashing degree to assess the performance of low-viscosity oils in filling of capsules.

    PubMed

    Niederquell, Andreas; Kuentz, Martin

    2011-03-01

    Liquid-filled hard capsules are nowadays an alternative to soft capsules. Their filling technology was investigated earlier with highly viscous formulations, while hardly any academic research has focused on low-viscosity systems. Accordingly, this work addressed the filling of such oils, which splash during the dosing process. The aim was first to study capsule filling, using middle-chain triglycerides as a reference oil, and then to evaluate the concept of a new theoretical splashing degree for different oils. A laboratory-scale filling machine was used that included capsule sealing; thus, liquid encapsulation by microspray technology was employed to seal the dosage form. As a result of the study with the reference oil, the filling volume and the temperature were found to be significant for the rate of leaking capsules. The filling volume was also important for the weight variability of the capsules. However, most critical for this variability was the diameter of the filling nozzle. We proposed a power law for the coefficient of weight variability as a function of the nozzle diameter, and the obtained exponent agreed with the proposed theory. Subsequently, a comparison of different oils revealed that the relative splashing degree correlated with the coefficient of capsule weight variability (Pearson product-moment correlation of r=0.990). The novel theoretical concept was therefore found to be predictive for the weight variability of the filled capsules. Finally, guidance was provided for the process development of liquid-filled capsules using low-viscosity oils. © 2011 American Association of Pharmaceutical Scientists
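    A power law of the form CV = a·d^b, as proposed above for weight variability versus nozzle diameter, can be fitted by ordinary least squares in log-log space. The sketch below uses made-up data chosen to follow an exact power law; the function name and all values are illustrative, not from the study.

```python
import math

# Hedged sketch: fitting a power law CV = a * d**b via log-log least squares.
# The data points are fabricated for demonstration (exactly cv = 2 / d).
def power_law_fit(d, cv):
    """Return (a, b) minimizing squared error of log(cv) vs log(d)."""
    xs = [math.log(x) for x in d]
    ys = [math.log(y) for y in cv]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

diam = [0.5, 1.0, 2.0, 4.0]  # nozzle diameters (mm, hypothetical)
cv = [4.0, 2.0, 1.0, 0.5]    # weight CV (%, hypothetical)
a, b = power_law_fit(diam, cv)  # recovers a = 2, b = -1
```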

  20. Tropical Ocean Surface Energy Balance Variability: Linking Weather to Climate Scales

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Clayson, Carol Anne

    2013-01-01

    Radiative and turbulent surface exchanges of heat and moisture across the atmosphere-ocean interface are fundamental components of the Earth's energy and water balance. Characterizing the spatiotemporal variability of these exchanges of heat and moisture is critical to understanding global water and energy cycle variations, quantifying atmosphere-ocean feedbacks, and improving model predictability. These fluxes are integral components of tropical ocean-atmosphere variability; they can drive ocean mixed layer variations and modify atmospheric boundary layer properties including moist static stability, thereby influencing larger-scale tropical dynamics. Non-parametric cluster-based classification of atmospheric and ocean surface properties has shown an ability to identify coherent weather regimes, each typically associated with similar properties and processes. Using satellite-based observational radiative and turbulent energy flux products, this study investigates the relationship between these weather states and surface energy processes within the context of tropical climate variability. Investigations of surface energy variations accompanying intraseasonal and interannual tropical variability often use composite-based analyses of the mean quantities of interest. Here, a similar compositing technique is employed, but the focus is on the distribution of the heat and moisture fluxes within their weather regimes. Are the observed changes in surface energy components dominated by changes in the frequency of the weather regimes or by changes in the associated fluxes within those regimes? It is this question that the presented work intends to address. The distribution of the surface heat and moisture fluxes is evaluated for both normal and non-normal states. By examining both phases of the climatic oscillations, the symmetry of energy and water cycle responses is considered.

  1. Interpatient Variability in Dexmedetomidine Response: A Survey of the Literature

    PubMed Central

    Holliday, Samantha F.; Kane-Gill, Sandra L.; Empey, Philip E.; Buckley, Mitchell S.; Smithburger, Pamela L.

    2014-01-01

    Fifty-five thousand patients are cared for in the intensive care unit (ICU) daily, with sedation utilized to reduce anxiety and agitation while optimizing comfort. The Society of Critical Care Medicine (SCCM) released updated guidelines for management of pain, agitation, and delirium in the ICU and recommended nonbenzodiazepines, such as dexmedetomidine and propofol, as first-line sedation agents. Dexmedetomidine, an alpha-2 agonist, offers many benefits, yet its use is hampered by the inability to consistently achieve sedation goals. Three hypotheses that could affect dexmedetomidine response are presented: patient traits/characteristics, pharmacokinetics in critically ill patients, and clinically relevant genetic polymorphisms. Studies of patient traits have yielded conflicting results regarding the role of race, yet suggest that dexmedetomidine may produce more consistent results in less critically ill patients and with home antidepressant use. The pharmacokinetics of critically ill patients are reported as similar to those of healthy individuals, yet wide, unexplained interpatient variability in dexmedetomidine serum levels exists. Genetic polymorphisms in both metabolism and receptor response have been evaluated in few studies, and the results remain inconclusive. To fully understand the role of dexmedetomidine, it is vital to further evaluate what prompts such marked interpatient variability in critically ill patients. PMID:24558330

  2. Corrosion of Ceramic Materials

    NASA Technical Reports Server (NTRS)

    Opila, Elizabeth J.; Jacobson, Nathan S.

    1999-01-01

    Non-oxide ceramics are promising materials for a range of high temperature applications. Selected current and future applications are listed. In all such applications, the ceramics are exposed to high temperature gases. Therefore it is critical to understand the response of these materials to their environment. The variables to be considered here include both the type of ceramic and the environment to which it is exposed. Non-oxide ceramics include borides, nitrides, and carbides. Most high temperature corrosion environments contain oxygen and hence the emphasis of this chapter will be on oxidation processes.

  3. Challenges in Melt Furnace Tests

    NASA Astrophysics Data System (ADS)

    Belt, Cynthia

    2014-09-01

    Measurement is a critical part of running a cast house. Key performance indicators such as energy intensity, production (or melt rate), downtime (or OEE), and melt loss must all be understood and monitored on a weekly or monthly basis. Continuous process variables such as bath temperature, flue temperature, and furnace pressure should be used to control the furnace systems along with storing the values in databases for later analysis. While using measurement to track furnace performance over time is important, there is also a time and place for short-term tests.

  4. Human factors in spacecraft design

    NASA Technical Reports Server (NTRS)

    Harrison, Albert A.; Connors, Mary M.

    1990-01-01

    This paper describes some of the salient implications of evolving mission parameters for spacecraft design. Among the requirements for future spacecraft are new, higher standards of living, increased support of human productivity, and greater accommodation of physical and cultural variability. Design issues include volumetric allowances, architecture and layouts, closed life support systems, health maintenance systems, recreational facilities, automation, privacy, and decor. An understanding of behavioral responses to design elements is a precondition for critical design decisions. Human factors research results must be taken into account early in the course of the design process.

  5. Phase and vortex correlations in superconducting Josephson-junction arrays at irrational magnetic frustration.

    PubMed

    Granato, Enzo

    2008-07-11

    Phase coherence and vortex order in a Josephson-junction array at irrational frustration are studied by extensive Monte Carlo simulations using the parallel-tempering method. A scaling analysis of the correlation length of phase variables in the fully equilibrated system shows that the critical temperature vanishes with a power-law divergent correlation length and critical exponent ν_ph, in agreement with recent results from resistivity scaling analysis. A similar scaling analysis for vortex variables reveals a different critical exponent ν_v, suggesting that there are two distinct correlation lengths associated with a decoupled zero-temperature phase transition.

  6. Uncertainties in hydrological extremes projections and its effects on decision-making processes in an Amazonian sub-basin.

    NASA Astrophysics Data System (ADS)

    Andres Rodriguez, Daniel; Garofolo, Lucas; Lazaro Siqueira Junior, Jose

    2013-04-01

    Uncertainties in climate change projections are affected by irreducible uncertainties due to limitations of knowledge, the chaotic nature of the climate system, and the human decision-making process. Such uncertainties affect impact studies, complicating the decision-making processes aimed at mitigation and adaptation. However, these uncertainties open the possibility of developing exploratory analyses of a system's vulnerability to different scenarios. Through such analyses it is possible to identify critical issues which must be studied more deeply. For this study we used several future projections from General Circulation Models to feed a hydrological model applied to the Amazonian sub-basin of Ji-Paraná. Hydrological model integrations are performed for the present historical period (1970-1990) and for the future period (2010-2100). Extreme value analyses are performed on each simulated time series, and the results are compared with extreme events in the present time. A simple approach to identify potential vulnerabilities consists of evaluating the hydrologic system's response to climate variability and extreme events observed in the past, comparing them with the conditions projected for the future. Thus it is possible to identify critical issues that need attention and more detailed studies. For the goal of this work, we used socio-economic data from the Brazilian Institute of Geography and Statistics, the Operator of the National Electric System, the Brazilian National Water Agency, and scientific and press published information. This information is used to characterize impacts associated with extreme hydrological events in the basin during the present historical period and to evaluate potential impacts in the future in the face of the different hydrological projections. Results show that inter-model variability produces a broad dispersion in projected extreme values. 
The impact of such dispersion differs for different aspects of socio-economic and natural systems and must be carefully addressed in order to help in decision-making processes.

  7. Holocene climate variability in Texas, USA: An integration of existing paleoclimate data and modeling with a new, high-resolution speleothem record

    NASA Astrophysics Data System (ADS)

    Wong, Corinne I.; Banner, Jay L.; Musgrove, MaryLynn

    2015-11-01

    Delineating the climate processes governing precipitation variability in drought-prone Texas is critical for predicting and mitigating climate change effects, and requires the reconstruction of past climate beyond the instrumental record. We synthesize existing paleoclimate proxy data and climate simulations to provide an overview of climate variability in Texas during the Holocene. Conditions became progressively warmer and drier transitioning from the early to mid Holocene, culminating between 7 and 3 ka (thousand years ago), and were more variable during the late Holocene. The timing and relative magnitude of Holocene climate variability, however, is poorly constrained owing to considerable variability among the different records. To help address this, we present a new speleothem (NBJ) reconstruction from a central Texas cave that comprises the highest resolution proxy record to date, spanning the mid to late Holocene. NBJ trace-element concentrations indicate variable moisture conditions with no clear temporal trend. There is a decoupling between NBJ growth rate, trace-element concentrations, and δ18O values, which indicates that (i) the often direct relation between speleothem growth rate and moisture availability is likely complicated by changes in the overlying ecosystem that affect subsurface CO2 production, and (ii) speleothem δ18O variations likely reflect changes in moisture source (i.e., proportion of Pacific- vs. Gulf of Mexico-derived moisture) that appear not to be linked to moisture amount.

  8. Modeling financial markets by self-organized criticality

    NASA Astrophysics Data System (ADS)

    Biondo, Alessio Emanuele; Pluchino, Alessandro; Rapisarda, Andrea

    2015-10-01

    We present a financial market model, characterized by self-organized criticality, that is able to generate endogenously a realistic price dynamics and to reproduce well-known stylized facts. We consider a community of heterogeneous traders, composed of chartists and fundamentalists, and focus on the role of informative pressure on market participants, showing how the spreading of information, based on a realistic imitative behavior, drives contagion and causes market fragility. In this model imitation is not intended as a change in the agent's group of origin, but refers only to the price formation process. We also introduce into the community a variable number of random traders in order to study their possible beneficial role in stabilizing the market, as found in other studies. Finally, we also suggest some counterintuitive policy strategies able to dampen fluctuations by means of a partial reduction of information.
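    Self-organized criticality itself is often introduced through the Bak-Tang-Wiesenfeld sandpile: a slowly driven system that organizes itself into a state where a single added grain can trigger avalanches of any size. The minimal one-dimensional sketch below is a generic illustration of that mechanism only; it is not the authors' market model, and all parameters are arbitrary.

```python
import random

# Generic 1D Bak-Tang-Wiesenfeld-style sandpile (NOT the authors' model).
# Grains are dropped at random sites; any site holding >= 2 grains topples,
# sending one grain to each neighbor; grains at the edges fall off.
def sandpile(n_sites=20, n_grains=300, seed=0):
    """Return (final grid, avalanche size per dropped grain)."""
    rng = random.Random(seed)
    grid = [0] * n_sites
    sizes = []
    for _ in range(n_grains):
        grid[rng.randrange(n_sites)] += 1
        size = 0
        while any(h >= 2 for h in grid):
            i = next(i for i, h in enumerate(grid) if h >= 2)
            grid[i] -= 2
            size += 1  # count topplings in this avalanche
            if i > 0:
                grid[i - 1] += 1
            if i + 1 < n_sites:
                grid[i + 1] += 1
        sizes.append(size)
    return grid, sizes

grid, sizes = sandpile()
```

    After an initial transient, the pile hovers near its critical configuration, and the avalanche-size sequence mixes many tiny events with occasional system-spanning ones, the qualitative signature the market model above exploits.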

  9. 1,500 Year Periodicity in Central Texas Moisture Source Variability Reconstructed from Speleothems

    NASA Astrophysics Data System (ADS)

    Wong, C. I.; James, E. W.; Silver, M. M.; Banner, J. L.; Musgrove, M.

    2014-12-01

    Delineating the climate processes governing precipitation variability in drought-prone Texas is critical for predicting and mitigating climate change effects, and requires the reconstruction of past climate beyond the instrumental record. Presently, there are few high-resolution Holocene climate records for this region, which limits the assessment of precipitation variability during a relatively stable climatic interval that comprises the closest analogue to the modern climate state. To address this, we present speleothem growth rate and δ18O records from two central Texas caves that span the mid to late Holocene, and assess hypotheses about the climate processes that can account for similarity in the timing and periodicity of variability with other regional and global records. A key finding is the independent variation of speleothem growth rate and δ18O values, suggesting the decoupling of moisture amount and source. This decoupling likely occurs because (i) the often direct relation between speleothem growth rate and moisture availability is complicated by changes in the overlying ecosystem that affect subsurface CO2 production, and (ii) speleothem δ18O variations reflect changes in moisture source (i.e., proportion of Pacific- vs. Gulf of Mexico-derived moisture) that appear not to be linked to moisture amount. Furthermore, we document a 1,500-year periodicity in δ18O values that is consistent with variability in the percent of hematite-stained grains in North Atlantic sediments, North Pacific SSTs, and El Niño events preserved in an Ecuadorian lake. Previous modeling experiments and analysis of observational data delineate the coupled atmosphere-ocean processes that can account for the coincidence of such variability in climate archives across the northern hemisphere. Reduction of the thermohaline circulation results in North Atlantic cooling, which translates to cooler North Pacific SSTs. 
The resulting reduction of the meridional SST gradient in the Pacific weakens the air-sea coupling that modulates ENSO activity, resulting in faster growth of interannual anomalies and larger mature El Niño relative to La Niña events. The asymmetrically enhanced ENSO variability can account for a greater portion of Pacific-derived moisture reflected by speleothem δ18O values.

  10. Spray drying formulation of amorphous solid dispersions.

    PubMed

    Singh, Abhishek; Van den Mooter, Guy

    2016-05-01

    Spray drying is a well-established manufacturing technique which can be used to formulate amorphous solid dispersions (ASDs), an effective strategy to deliver poorly water-soluble drugs (PWSDs). However, the inherently complex nature of the spray drying process, coupled with specific characteristics of ASDs, makes it an interesting area to explore. Numerous diverse factors interact in an inter-dependent manner to determine the final product properties. This review discusses the basic background of ASDs, the various formulation and process variables influencing the critical quality attributes (CQAs) of ASDs, and aspects of downstream processing. Various aspects of spray drying such as instrumentation, thermodynamics, drying kinetics, the particle formation process, and scale-up challenges are also included. Recent advances in spray-based drying techniques are mentioned, along with some future avenues where a major research thrust is needed. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. High pressure processing and its application to the challenge of virus-contaminated foods.

    PubMed

    Kingsley, David H

    2013-03-01

    High pressure processing (HPP) is an increasingly popular non-thermal food processing technology. Study of HPP's potential to inactivate foodborne viruses has defined general pressure levels required to inactivate hepatitis A virus, norovirus surrogates, and human norovirus itself within foods such as shellfish and produce. The sensitivity of a number of different picornaviruses to HPP is variable. Experiments suggest that HPP inactivates viruses via denaturation of capsid proteins which render the virus incapable of binding to its receptor on the surface of its host cell. Beyond the primary consideration of treatment pressure level, the effects of extending treatment times, temperature of initial pressure application, and matrix composition have been identified as critical parameters for designing HPP inactivation strategies. Research described here can serve as a preliminary guide to whether a current commercial process could be effective against HuNoV or HAV.

  12. Global Scale Remote Sensing Monitoring of Endorheic Lake Systems

    NASA Astrophysics Data System (ADS)

    Scuderi, L. A.

    2010-12-01

    Semi-arid regions of the world contain thousands of endorheic lakes in large shallow basins. Due to their generally remote locations few are continuously monitored. Documentation of recent variability is essential to assessing how endorheic lakes respond to short-term meteorological conditions and longer-term decadal-scale climatic variability and is critical in determining future disturbance of hydrological regimes with respect to predicted warming and drying in the mid-latitudes. Short- and long-term departures from climatic averages, rapid environmental shifts and increased population pressures may result in significant fluctuations in the hydrologic budgets of these lakes and adversely impact endorheic lake/basin ecosystems. Information on flooding variability is also critical in estimating changes in P/E balances and on the production of exposed and easily deflated surfaces that may impact dust loading locally and regionally. In order to provide information on how these lakes respond we need to understand how entire systems respond hydrologically to different climatic inputs. This requires monitoring and analysis of regional to continental-scale systems. To date, this level of monitoring has not been achieved in an operational system. In order to assess the possibility of creating a global-scale lake inundation database we analyzed two contrasting lake systems in western North America (Mexico and New Mexico, USA) and China (Inner Mongolia). We asked two major questions: 1) is it possible to quickly and accurately quantify current lake inundation events in near real time using remote sensing? and, 2) is it possible to differentiate variable meteorological sources and resultant lake inundation responses using this type of database? With respect to these results we outline an automated lake monitoring approach using MODIS data and real-time processing systems that may provide future global monitoring capabilities.

  13. Diving Behaviors and Habitat Use of Adult Female Steller Sea Lion (Eumetopias jubatus), A Top Predator of the Bering Sea and North Pacific Ocean Ecosystems

    NASA Astrophysics Data System (ADS)

    Lander, M. E.; Fadely, B.; Gelatt, T.; Sterling, J.; Johnson, D.; Haulena, M.; McDermott, S.

    2016-02-01

    Decreased natality resulting from nutritional stress is one hypothesized mechanism for declines of Steller sea lions (SSLs; Eumetopias jubatus) in western Alaska, but little is known of the winter foraging habitats or behavior of adult females. To address this critical data need, adult female Steller sea lions were chemically immobilized and tagged with Fastloc® GPS satellite transmitters during the fall at Southeast Alaska (SEAK) during 2010 (n=3), and the central and western Aleutian Islands (AI) from 2011-2014 (n=9). To identify habitat features of biological importance to these animals, location data were processed with a continuous-time correlated random walk model and kernel density estimates of predicted locations were used to compute individual-based utilization distributions. Kernel density estimates and diving behaviors (i.e. mean, maximum, and frequency of dive depths) were examined with respect to a series of static and dynamic environmental variables using linear mixed-effects models. Habitat use varied within and among individuals, but overall, all response variables were significantly related to a combination of the predictor variables season, distance to nearest SSL site, bathymetric slope, on/off shelf, sea surface temperature, sea surface height, proportion of daylight, and some interaction effects (P≤0.05). The habitat use of SSL from SEAK was consistent with previous reports and reflected the seasonal distribution of predictable forage fish, whereas SSL from the AI used a variety of marine ecosystems and habitat use was more variable, likely reflecting specific prey behaviors encountered in different areas. These results have improved our understanding of the habitat features necessary for the conservation of adult female SSL and have been useful for reviewing designated critical habitat for Steller sea lions throughout the U.S. range.

  14. Submesoscale Sea Surface Temperature Variability from UAV and Satellite Measurements

    NASA Astrophysics Data System (ADS)

    Castro, S. L.; Emery, W. J.; Tandy, W., Jr.; Good, W. S.

    2017-12-01

Technological advances in the spatial resolution of observations have revealed the importance of short-lived ocean processes with scales of O(1 km). These submesoscale processes play an important role in the transfer of energy from the meso- to small scales and in generating significant spatial and temporal intermittency in the upper ocean, critical for the mixing of the oceanic boundary layer. Submesoscales have been observed in sea surface temperatures (SST) from satellites. Satellite SST measurements are spatial averages over the footprint of the satellite. When the variance of the SST distribution within the footprint is small, the average value is representative of the SST over the whole pixel. If the variance is large, the spatial heterogeneity is a source of uncertainty in satellite-derived SSTs. Here we show evidence that submesoscale variability in SST at spatial scales of 1 km is responsible for the spatial variability within satellite footprints. Previous studies of spatial variability in SST using ship-based radiometric data suggested that variability at scales smaller than 1 km is significant and affects the uncertainty of satellite-derived skin SSTs. We examine data collected by a calibrated thermal infrared radiometer, the Ball Experimental Sea Surface Temperature (BESST), flown on a UAV over the Arctic Ocean and compare them with coincident measurements from the MODIS spaceborne radiometer to assess the spatial variability of SST within 1 km pixels. By taking the standard deviation of all the BESST measurements within individual MODIS pixels, we show that significant spatial variability exists within the footprints. The distribution of the surface variability measured by BESST shows a peak value of O(0.1 K), with 95% of the pixels showing σ < 0.45 K. More importantly, high-variability pixels are located at density fronts in the marginal ice zone, which are a primary source of submesoscale intermittency near the surface in the Arctic Ocean.
Wavenumber spectra of the BESST SSTs indicate a spectral slope of -2, consistent with the presence of submesoscale processes. Furthermore, the BESST wavenumber spectra not only match the MODIS SST spectra well, but also extend the -2 spectral slope by two decades relative to MODIS, from wavelengths of 8 km down to 0.08 km.
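
The core calculation, taking the standard deviation of fine-scale measurements within each coarse satellite pixel, can be sketched with synthetic data; the grid sizes, noise level, and front amplitude below are illustrative, not BESST or MODIS values:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical fine-scale (UAV-like) SST field: a 20 x 20 grid of coarse
# "satellite" pixels, each containing 25 x 25 fine-scale samples.
n_pix, sub = 20, 25
sst = 271.5 + rng.normal(0.0, 0.1, size=(n_pix * sub, n_pix * sub))
# Add a sharp front through part of the scene, as at a marginal ice zone.
sst[:, : n_pix * sub // 3] += 0.8

# Sub-pixel variability: standard deviation of fine samples in each pixel.
blocks = sst.reshape(n_pix, sub, n_pix, sub).swapaxes(1, 2)  # (20, 20, 25, 25)
pixel_std = blocks.std(axis=(2, 3))

print(f"median sub-pixel sigma: {np.median(pixel_std):.3f} K")
print(f"maximum (at the front): {pixel_std.max():.3f} K")
```

Pixels wholly inside either water mass show only the noise-level sigma, while pixels straddling the front show several times that, mirroring the paper's finding that high-variability pixels coincide with fronts.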

  15. Virtual tryout planning in automotive industry based on simulation metamodels

    NASA Astrophysics Data System (ADS)

    Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.

    2016-11-01

Deep drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations in process and material properties often lead to robustness problems. Therefore, numerical simulations are used to detect the critical regions. To enhance agreement with real process conditions, the material data are acquired through a variety of experiments. Furthermore, the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point as well as to adjust process settings in case the process becomes unstable. Furthermore, the time for tool tryout can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, save time, and recognize complex relationships.
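
A minimal sketch of the metamodel idea, with a hypothetical thinning response and a quadratic response surface standing in for real finite element simulations (the variables, failure criterion, and ranges below are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical stand-in for an FE simulation: maximum thinning (%) as a
# function of blank-holder force F (kN) and friction coefficient mu.
def fe_sim(F, mu):
    return 8.0 + 0.0001 * (F - 300) ** 2 + 40 * (mu - 0.10) + rng.normal(0, 0.2)

# Sample the input space and fit a quadratic response-surface metamodel.
F = rng.uniform(200, 400, 60)
mu = rng.uniform(0.05, 0.15, 60)
thin = np.array([fe_sim(f, m) for f, m in zip(F, mu)])
X = np.stack([np.ones(60), F, mu, F**2, mu**2, F * mu], axis=1)
coef, *_ = np.linalg.lstsq(X, thin, rcond=None)

# Virtual process window: grid points where predicted thinning stays
# below an (illustrative) 10% failure criterion.
Fg, mg = np.meshgrid(np.linspace(200, 400, 41), np.linspace(0.05, 0.15, 41))
G = np.stack([np.ones(Fg.size), Fg.ravel(), mg.ravel(),
              Fg.ravel()**2, mg.ravel()**2, (Fg * mg).ravel()], axis=1)
ok = (G @ coef < 10.0).reshape(Fg.shape)
print(f"{ok.mean():.0%} of the (F, mu) grid lies in the process window")
```

Once fitted, the metamodel can be evaluated everywhere at negligible cost, which is what makes displaying process windows and shifting the operating point practical during tryout.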

  16. Systematic review of the neural basis of social cognition in patients with mood disorders.

    PubMed

    Cusi, Andrée M; Nazarov, Anthony; Holshausen, Katherine; Macqueen, Glenda M; McKinnon, Margaret C

    2012-05-01

This review integrates neuroimaging studies of 2 domains of social cognition--emotion comprehension and theory of mind (ToM)--in patients with major depressive disorder and bipolar disorder. The influence of key clinical and methodological variables on patterns of neural activation during social cognitive processing is also examined. Studies were identified using PsycINFO and PubMed (January 1967 to May 2011). The search terms were "fMRI," "emotion comprehension," "emotion perception," "affect comprehension," "affect perception," "facial expression," "prosody," "theory of mind," "mentalizing" and "empathy" in combination with "major depressive disorder," "bipolar disorder," "major depression," "unipolar depression," "clinical depression" and "mania." Taken together, neuroimaging studies of social cognition in patients with mood disorders reveal enhanced activation in limbic and emotion-related structures and attenuated activity within frontal regions associated with emotion regulation and higher cognitive functions. These results suggest an overall lack of inhibition by higher-order cognitive structures over limbic and emotion-related structures during social cognitive processing in patients with mood disorders. Critically, key variables, including illness burden, symptom severity, comorbidity, medication status and cognitive load, may moderate this pattern of neural activation. Studies that did not include control tasks or a comparator group were included in this review. Further work is needed to examine the contribution of key moderator variables and to further elucidate the neural networks underlying altered social cognition in patients with mood disorders. The neural networks underlying higher-order social cognitive processes, including empathy, remain unexplored in patients with mood disorders.

  17. Lack of Critical Slowing Down Suggests that Financial Meltdowns Are Not Critical Transitions, yet Rising Variability Could Signal Systemic Risk.

    PubMed

    Guttal, Vishwesha; Raghavendra, Srinivas; Goel, Nikunj; Hoarau, Quentin

    2016-01-01

Analysis inspired by complex systems theory suggests the hypothesis that financial meltdowns are abrupt critical transitions that occur when the system reaches a tipping point. Theoretical and empirical studies on climatic and ecological dynamical systems have shown that the approach to tipping points is preceded by a generic phenomenon called critical slowing down, i.e. an increasingly slow response of the system to perturbations. Therefore, it has been suggested that critical slowing down may be used as an early warning signal of imminent critical transitions. Whether financial markets exhibit critical slowing down prior to meltdowns remains unclear. Here, our analysis reveals that three major US markets (Dow Jones Index, S&P 500 and NASDAQ) and two European markets (DAX and FTSE) did not exhibit critical slowing down prior to major financial crashes over the last century. However, all markets showed strong trends of rising variability, quantified by time series variance and the spectral function at low frequencies, prior to crashes. These results suggest that financial crashes are not critical transitions that occur in the vicinity of a tipping point. Using a simple model, we argue that financial crashes are likely to be stochastic transitions, which can occur even when the system is far away from the tipping point. Specifically, we show that a gradually increasing strength of stochastic perturbations may have caused abrupt transitions in the financial markets. Broadly, our results highlight the importance of stochastically driven abrupt transitions in real-world scenarios. Our study offers rising variability as a precursor of financial meltdowns, albeit with the limitation that it may signal false alarms.
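
The paper's distinction can be illustrated with a toy AR(1) series in which the recovery rate is fixed (so there is no critical slowing down) while the perturbation strength grows: the rolling variance rises, but the lag-1 autocorrelation, the usual slowing-down indicator, does not. All parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy daily series: an AR(1) process with a fixed mean-reversion rate
# (no critical slowing down) but noise that strengthens over time.
n = 2000
x = np.zeros(n)
noise = np.linspace(0.5, 2.0, n)  # rising perturbation strength
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + noise[t] * rng.normal()

def rolling(series, win, fn):
    return np.array([fn(series[i - win:i]) for i in range(win, len(series))])

win = 250
var = rolling(x, win, np.var)
ac1 = rolling(x, win, lambda s: np.corrcoef(s[:-1], s[1:])[0, 1])

# Variance trends strongly upward; lag-1 autocorrelation (the critical
# slowing down indicator) stays near the fixed value implied by 0.8.
print(f"variance: first window {var[0]:.2f}, last window {var[-1]:.2f}")
print(f"lag-1 AC: first window {ac1[0]:.2f}, last window {ac1[-1]:.2f}")
```

This is exactly the signature the authors report in market data: rising variance without rising autocorrelation, consistent with strengthening stochastic perturbations rather than an approaching tipping point.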

  18. Criticism and Depression among the Caregivers of At-Risk Mental State and First-Episode Psychosis Patients

    PubMed Central

    Hamaie, Yumiko; Ohmuro, Noriyuki; Katsura, Masahiro; Obara, Chika; Kikuchi, Tatsuo; Ito, Fumiaki; Miyakoshi, Tetsuo; Matsuoka, Hiroo; Matsumoto, Kazunori

    2016-01-01

Expressed emotion (EE), especially criticism, is an important predictor of patient outcomes across a wide range of mental health problems. To understand the complex links between EE and various relevant variables in early-phase psychosis, this study examined criticism, caregiver distress, and other patient and caregiver variables, and the links between criticism and these variables, in those with at-risk mental state (ARMS) for psychosis and first-episode psychosis (FEP). The participants were 56 patients (mean age 18.8 ± 4.2 years) with ARMS and their caregivers (49.4 ± 5.8 years) and 43 patients (21.7 ± 5.2 years) with FEP and their caregivers (49.3 ± 7.4 years). We investigated criticism by caregivers using the Japanese version of the Family Attitude Scale and caregiver depressive symptoms via the self-report Beck Depression Inventory. We also assessed psychiatric symptoms and functioning of the patients. Approximately one-third of caregivers of patients with ARMS or FEP had depressive symptoms, predominantly at mild-to-moderate symptom levels, whereas only a small portion exhibited high criticism. The levels of criticism and depression were comparable between ARMS and FEP caregivers. Links between criticism, caregivers’ depression, and patients’ symptoms were observed in FEP but not in ARMS caregivers. These findings imply that the interaction between criticism and caregivers’ and patients’ mental states may develop during or after the onset of established psychosis, and that interventions for caregivers should be tailored to the patient’s specific stage of illness. Interventions for FEP caregivers should target their emotional distress and include education about the patient’s general symptoms. PMID:26918629

  19. Lack of Critical Slowing Down Suggests that Financial Meltdowns Are Not Critical Transitions, yet Rising Variability Could Signal Systemic Risk

    PubMed Central

    Hoarau, Quentin

    2016-01-01

Analysis inspired by complex systems theory suggests the hypothesis that financial meltdowns are abrupt critical transitions that occur when the system reaches a tipping point. Theoretical and empirical studies on climatic and ecological dynamical systems have shown that the approach to tipping points is preceded by a generic phenomenon called critical slowing down, i.e. an increasingly slow response of the system to perturbations. Therefore, it has been suggested that critical slowing down may be used as an early warning signal of imminent critical transitions. Whether financial markets exhibit critical slowing down prior to meltdowns remains unclear. Here, our analysis reveals that three major US markets (Dow Jones Index, S&P 500 and NASDAQ) and two European markets (DAX and FTSE) did not exhibit critical slowing down prior to major financial crashes over the last century. However, all markets showed strong trends of rising variability, quantified by time series variance and the spectral function at low frequencies, prior to crashes. These results suggest that financial crashes are not critical transitions that occur in the vicinity of a tipping point. Using a simple model, we argue that financial crashes are likely to be stochastic transitions, which can occur even when the system is far away from the tipping point. Specifically, we show that a gradually increasing strength of stochastic perturbations may have caused abrupt transitions in the financial markets. Broadly, our results highlight the importance of stochastically driven abrupt transitions in real-world scenarios. Our study offers rising variability as a precursor of financial meltdowns, albeit with the limitation that it may signal false alarms. PMID:26761792

  20. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance National Oceanic and Atmospheric Administration (NOAA) NWS field offices' ability to efficiently access, manipulate, and interpret local climate data and to characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and gridded reanalysis data and various statistical techniques for climate analysis. The analysis results will be used in climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as it applies to diverse variables appropriate to each locality. The main emphasis of LCAT is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, and severe storms. LCAT will close a critical gap in NWS local climate services because it will make it possible to address climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from LCAT outputs that can be easily incorporated into their own analysis and/or delivery systems. To date, we have identified five requirements for local climate studies: (1) local impacts of climate change; (2) local impacts of climate variability; (3) drought studies; (4) attribution of severe meteorological and hydrological events; and (5) climate studies for water resources. The methodologies for the first three requirements will be included in the first-phase implementation of LCAT.
The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge fit, (2) Optimal Climate Normals (running means over optimal time periods), and (3) exponentially weighted moving average. Root mean squared error is used to select the trend that fits the observations with the least error. Studies of climate variability impacts on local extremes use composite techniques applied to various definitions of local variables, from specified percentiles to critical thresholds. Drought studies combine the visual capabilities of Google maps with statistical estimates of drought severity indices. The development process will be linked to local office interactions with users to ensure the tool meets their needs and provides adequate training. A rigorous internal and tiered peer-review process will be implemented to ensure that the studies are scientifically sound before they are published and submitted to the local studies catalog (database) and eventually to external sources, such as the Climate Portal.
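
The three trend techniques and the RMSE-based selection can be sketched on a synthetic annual series; the hinge year, window length, and smoothing factor below are illustrative assumptions, not LCAT's actual settings:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical 60-year annual temperature series: flat until a 1975
# "hinge", then a 0.03 degC/yr trend, plus interannual noise.
years = np.arange(1951, 2011)
signal = 10.0 + 0.03 * np.clip(years - 1975, 0, None)
temp = signal + rng.normal(0.0, 0.3, years.size)

def rmse(fit):
    return float(np.sqrt(np.mean((temp - fit) ** 2)))

# (1) Hinge fit: constant before the change point, linear afterwards.
t = np.clip(years - 1975, 0, None).astype(float)
A = np.stack([np.ones(years.size), t], axis=1)
coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
hinge = A @ coef

# (2) Running mean (a stand-in for Optimal Climate Normals), with the
# edges normalized by the effective window length.
win = np.ones(15)
running = np.convolve(temp, win, "same") / np.convolve(np.ones(years.size), win, "same")

# (3) Exponentially weighted moving average.
alpha, ewma = 0.2, np.empty_like(temp)
ewma[0] = temp[0]
for i in range(1, temp.size):
    ewma[i] = alpha * temp[i] + (1 - alpha) * ewma[i - 1]

fits = {"hinge": hinge, "running mean": running, "EWMA": ewma}
best = min(fits, key=lambda k: rmse(fits[k]))
print({k: round(rmse(v), 3) for k, v in fits.items()}, "-> best:", best)
```

The ensemble mean of the three fitted trends would then give the local rate of change, with RMSE identifying the best-fitting member.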

  1. Rotordynamic Feasibility of a Conceptual Variable-Speed Power Turbine Propulsion System for Large Civil Tilt-Rotor Applications

    NASA Technical Reports Server (NTRS)

    Howard, Samuel

    2012-01-01

A variable-speed power turbine concept is analyzed for rotordynamic feasibility in a Large Civil Tilt-Rotor (LCTR) class engine. Implementation of a variable-speed power turbine in a rotorcraft engine would enable high-efficiency propulsion at the high forward velocities anticipated for large tilt-rotor vehicles. Rotordynamics is therefore a critical issue for this engine concept. A preliminary feasibility study is presented herein to address this concern and determine whether variable speed is possible in a conceptual engine sized for the LCTR. The analysis considers critical speed placement in the operating speed envelope, stability up to the maximum anticipated operating speed, and potential unbalance response amplitudes; it concludes that a variable-speed power turbine is likely to be challenging, but not impossible, to achieve in a tilt-rotor propulsion engine.

  2. Comparison of Critical Listening Proficiency of Teacher Candidates in Terms of Several Variables

    ERIC Educational Resources Information Center

    Kazu, Hilal; Demiralp, Demet

    2017-01-01

    Purpose: The research has been designed to determine the level of critical listening proficiency of the teacher candidates. It aims at finding answers to the following questions: (1) What is the level of critical listening proficiency of teacher candidates? (2) Do the teacher candidates' levels of critical listening proficiency indicate a…

  3. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    NASA Astrophysics Data System (ADS)

    James, M. R.; Robson, S.; d'Oleire-Oltmanns, S.; Niethammer, U.

    2017-03-01

    Structure-from-motion (SfM) algorithms greatly facilitate the production of detailed topographic models from photographs collected using unmanned aerial vehicles (UAVs). However, the survey quality achieved in published geomorphological studies is highly variable, and sufficient processing details are never provided to understand fully the causes of variability. To address this, we show how survey quality and consistency can be improved through a deeper consideration of the underlying photogrammetric methods. We demonstrate the sensitivity of digital elevation models (DEMs) to processing settings that have not been discussed in the geomorphological literature, yet are a critical part of survey georeferencing, and are responsible for balancing the contributions of tie and control points. We provide a Monte Carlo approach to enable geomorphologists to (1) carefully consider sources of survey error and hence increase the accuracy of SfM-based DEMs and (2) minimise the associated field effort by robust determination of suitable lower-density deployments of ground control. By identifying appropriate processing settings and highlighting photogrammetric issues such as over-parameterisation during camera self-calibration, processing artefacts are reduced and the spatial variability of error minimised. We demonstrate such DEM improvements with a commonly-used SfM-based software (PhotoScan), which we augment with semi-automated and automated identification of ground control points (GCPs) in images, and apply to two contrasting case studies - an erosion gully survey (Taroudant, Morocco) and an active landslide survey (Super-Sauze, France). In the gully survey, refined processing settings eliminated step-like artefacts of up to 50 mm in amplitude, and overall DEM variability with GCP selection improved from 37 to 16 mm. 
In the much more challenging landslide case study, our processing halved planimetric error to 0.1 m, effectively doubling the frequency at which changes in landslide velocity could be detected. In both case studies, the Monte Carlo approach provided a robust demonstration that field effort could be substantially reduced by deploying only approximately half the number of GCPs, with minimal effect on survey quality. To reduce processing artefacts and promote confidence in SfM-based geomorphological surveys, published results should include processing details, including the image residuals for both tie points and GCPs, and ensure that these are considered appropriately within the workflow.
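
The Monte Carlo idea, repeatedly re-fitting the georeferencing with random GCP deployments and scoring against withheld check points, can be sketched with a simple 2-D affine transform standing in for the full bundle adjustment; all coordinates and noise levels below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical survey: 20 candidate ground control points (GCPs); a 2-D
# affine georeferencing step stands in for the full bundle adjustment.
true_pts = rng.uniform(0, 500, size=(20, 2))                  # ground (m)
A_true = np.array([[1.001, 0.002], [-0.002, 0.999]])
img_pts = true_pts @ A_true.T + 3.0                           # survey frame
img_pts += rng.normal(0, 0.02, img_pts.shape)                 # 20 mm noise

def checkpoint_rmse(gcp_idx):
    """Fit the affine transform on a GCP subset; score on the rest."""
    G = np.hstack([img_pts[gcp_idx], np.ones((len(gcp_idx), 1))])
    coef, *_ = np.linalg.lstsq(G, true_pts[gcp_idx], rcond=None)
    rest = np.setdiff1d(np.arange(len(true_pts)), gcp_idx)
    pred = np.hstack([img_pts[rest], np.ones((len(rest), 1))]) @ coef
    return float(np.sqrt(np.mean((pred - true_pts[rest]) ** 2)))

# Monte Carlo over random GCP deployments of decreasing density.
results = {}
for n_gcp in (12, 6, 3):
    errs = np.array([checkpoint_rmse(rng.choice(20, n_gcp, replace=False))
                     for _ in range(200)])
    results[n_gcp] = errs.mean()
    print(f"{n_gcp:2d} GCPs: mean check-point RMSE {errs.mean()*1000:.0f} mm")
```

The spread of check-point errors across the Monte Carlo draws is what indicates how far GCP density can be reduced before survey quality degrades, which is the decision the paper's approach supports.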

  4. Automated in situ observations of upper ocean biogeochemistry, bio-optics, and physics and their potential use for global studies

    NASA Technical Reports Server (NTRS)

    Dickey, Tommy D.; Granata, Timothy C.; Taupier-Letage, Isabelle

    1992-01-01

The processes controlling the flux of carbon in the upper ocean have dynamic ranges in space and time of at least nine orders of magnitude. These processes depend on a broad suite of inter-related biogeochemical, bio-optical, and physical variables. These variables should be sampled on scales matching the relevant phenomena. Traditional ship-based sampling, while critical for detailed and more comprehensive observations, can span only limited portions of these ranges because of logistical and financial constraints. Further, remote observations from satellite platforms enable broad horizontal coverage, though one restricted to the upper few meters of the ocean. For these main reasons, automated subsurface measurement systems are important for the fulfillment of research goals related to the regional and global estimation and modeling of time-varying biogeochemical fluxes. Within the past few years, new sensors and systems capable of autonomously measuring several of the critical variables have been developed. The platforms for deploying these systems now include moorings and drifters, and it is likely that autonomous underwater vehicles (AUVs) will become available for use in the future. Each of these platforms satisfies particular sampling needs and can be used to complement both shipboard and satellite observations. In the present review, (1) sampling considerations will be summarized, (2) examples of data obtained from some of the existing automated in situ sampling systems will be highlighted, (3) future sensors and systems will be discussed, (4) data management issues for present and future automated systems will be considered, and (5) the status of near real-time data telemetry will be outlined.
Finally, we wish to make it clear at the outset that the perspectives presented here are those of the authors and are not intended to represent those of the United States JGOFS program, the International JGOFS program, NOAA's C&GC program, or other global ocean programs.

  5. Intrinsic vs. spurious long-range memory in high-frequency records of environmental radioactivity - Critical re-assessment and application to indoor 222Rn concentrations from Coimbra, Portugal

    NASA Astrophysics Data System (ADS)

    Donner, Reik V.; Potirakis, Stelios M.; Barbosa, Susana M.; Matos, Jose A. O.

    2015-04-01

The presence or absence of long-range correlations in environmental radioactivity fluctuations has recently attracted considerable interest. Among a multiplicity of practically relevant applications, identifying and disentangling the environmental factors controlling the variable concentrations of the radioactive noble gas Radon is important for estimating its effect on human health and the efficiency of possible measures for reducing the corresponding exposure. In this work, we present a critical re-assessment of a multiplicity of complementary methods that have previously been applied for evaluating the presence of long-range correlations and fractal scaling in environmental Radon variations, with a particular focus on the specific properties of the underlying time series. As an illustrative case study, we subsequently re-analyze two high-frequency records of indoor Radon concentrations from Coimbra, Portugal, each of which spans several months of continuous measurements at a high temporal resolution of five minutes. Our results reveal that at the study site, Radon concentrations exhibit complex multi-scale dynamics with qualitatively different properties at different time-scales: (i) essentially white noise in the high-frequency part (up to time-scales of about one hour), (ii) spurious indications of a non-stationary, apparently long-range correlated process (at time-scales between hours and one day) arising from marked periodic components probably related to tidal frequencies, and (iii) low-frequency variability indicating a true long-range dependent process, which might be dominated by a response to meteorological drivers. In the presence of such multi-scale variability, common estimators of long-range memory in time series are necessarily prone to fail if applied to the raw data without previous separation of time-scales with qualitatively different dynamics.
We emphasize that similar properties can be found in other types of geophysical time series (for example, tide gauge records), calling for a careful application of time series analysis tools when studying such data.
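
The spurious-memory effect of periodic components (point ii above) can be reproduced with a short detrended fluctuation analysis (DFA) sketch: white noise alone yields a scaling exponent near 0.5, but adding a tidal-like cycle inflates the exponent at sub-period scales. The period and amplitude below are illustrative, not values from the Coimbra records:

```python
import numpy as np

rng = np.random.default_rng(5)

def dfa_exponent(x, scales):
    """First-order detrended fluctuation analysis scaling exponent."""
    y = np.cumsum(x - x.mean())  # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        t = np.arange(s)
        resid = []
        for seg in y[: n_seg * s].reshape(n_seg, s):
            c = np.polyfit(t, seg, 1)           # least-squares detrend
            resid.append(seg - np.polyval(c, t))
        F.append(np.sqrt(np.mean(np.square(resid))))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

scales = np.array([16, 32, 64, 128, 256])
n = 8192
white = rng.normal(size=n)
periodic = white + 1.5 * np.sin(2 * np.pi * np.arange(n) / 288)  # "tidal" cycle

a_white = dfa_exponent(white, scales)
a_periodic = dfa_exponent(periodic, scales)
print(f"white noise alone:      alpha = {a_white:.2f}")
print(f"white noise + periodic: alpha = {a_periodic:.2f}")
```

Because the analysis scales here all sit below the cycle period, the periodic component masquerades as a long-memory trend, which is exactly why the authors call for separating time-scales before estimating long-range dependence.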

  6. The impact of 14-nm photomask uncertainties on computational lithography solutions

    NASA Astrophysics Data System (ADS)

    Sturtevant, John; Tejnil, Edita; Lin, Tim; Schultze, Steffen; Buck, Peter; Kalk, Franklin; Nakagawa, Kent; Ning, Guoxiang; Ackmann, Paul; Gans, Fritz; Buergel, Christian

    2013-04-01

Computational lithography solutions rely upon accurate process models to faithfully represent the imaging system output for a defined set of process and design inputs. These models, which must balance accuracy demands against simulation runtime constraints, rely upon the accurate representation of multiple parameters associated with the scanner and the photomask. While certain system input variables, such as scanner numerical aperture, can be empirically tuned to wafer CD data over a small range around the presumed set point, it can be dangerous to do so since CD errors can alias across multiple input variables. Therefore, many input variables for simulation are based upon designed or recipe-requested values or independent measurements. It is known, however, that certain measurement methodologies, while precise, can have significant inaccuracies. Additionally, there are known errors associated with the representation of certain system parameters. With shrinking total CD control budgets, appropriate accounting for all sources of error becomes more important, and the cumulative consequence of input errors to the computational lithography model can become significant. In this work, we examine, with a simulation sensitivity study, the impact of errors in the representation of photomask properties including CD bias, corner rounding, refractive index, thickness, and sidewall angle. The factors most critical to represent accurately in the model are cataloged. CD bias values are based on state-of-the-art mask manufacturing data, while changes in the other variables are hypothesized, highlighting the need for improved metrology and awareness.

  7. State of the Art in Large-Scale Soil Moisture Monitoring

    NASA Technical Reports Server (NTRS)

Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of (1) emerging in situ and proximal sensing techniques, (2) dedicated soil moisture remote sensing missions, (3) soil moisture monitoring networks, and (4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has yet been made in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  8. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure compared to the use of discrete point estimates as inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than as a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are affected by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health practice. PMID:26302336
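
The contrast between a point estimate and a probabilistic exposure characterization can be sketched with a lognormal exposure distribution, the usual model for shift-average exposures; the geometric mean, geometric standard deviation, and exposure limit below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical shift-average exposures: lognormal with geometric mean
# 0.05 mg/m3 and geometric standard deviation 2.5; the 0.1 mg/m3
# occupational exposure limit (OEL) is likewise illustrative.
gm, gsd, oel = 0.05, 2.5, 0.1
exposures = rng.lognormal(np.log(gm), np.log(gsd), 100_000)

# A point-estimate view: the arithmetic mean sits below the OEL...
print(f"mean exposure: {exposures.mean():.3f} mg/m3 (OEL {oel} mg/m3)")
# ...while the distribution shows how often the limit is exceeded.
exceed = (exposures > oel).mean()
print(f"fraction of shifts above the OEL: {exceed:.1%}")
```

A single mean value would suggest an acceptable exposure, while the distribution reveals a substantial exceedance fraction, which is the kind of additional information the article argues probabilistic tools provide.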

  9. A critical assessment of the Burning Index in Los Angeles County, California

    USGS Publications Warehouse

    Schoenberg, F.P.; Chang, H.-C.; Keeley, J.E.; Pompa, J.; Woods, J.; Xu, H.

    2007-01-01

The Burning Index (BI) is commonly used as a predictor of wildfire activity. An examination of data on the BI and wildfires in Los Angeles County, California, from January 1976 to December 2000 reveals that although the BI is positively associated with wildfire occurrence, its predictive value is quite limited. Wind speed alone has a higher correlation with burn area than the BI, for instance, and a simple alternative point process model using wind speed, relative humidity, precipitation and temperature substantially outperforms the BI in terms of predictive power. The BI is generally far too high in winter and too low in fall, and may exaggerate the impact of individual variables such as wind speed or temperature during times when other variables, such as precipitation or relative humidity, render the environment ill-suited for wildfires. © IAWF 2007.
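
A minimal sketch of a weather-driven occurrence model of the kind the authors favor over the BI: a logistic regression for daily fire probability on synthetic wind and humidity data, fitted by Newton's method. The coefficients and variables are illustrative, not the paper's fitted point process model:

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical daily weather record: wind speed (m/s) and relative
# humidity (%), with fire occurrence driven mainly by wind and RH.
n = 3000
wind = rng.gamma(2.0, 2.0, n)
rh = np.clip(rng.normal(55, 20, n), 5, 100)
true_logit = -3.0 + 0.4 * wind - 0.04 * rh
fire = rng.random(n) < 1 / (1 + np.exp(-true_logit))

# Fit a logistic model for daily fire probability by Newton's method
# (a minimal stand-in for the paper's point process model).
X = np.stack([np.ones(n), wind, rh], axis=1)
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    H = X.T @ (X * W[:, None])              # Fisher information
    beta += np.linalg.solve(H, X.T @ (fire - p))

print("fitted (intercept, wind, RH):", np.round(beta, 3))
```

The fitted signs, fire risk rising with wind and falling with humidity, are the relationships the paper reports the BI captures poorly in the wrong seasons.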

  10. Security Requirements Management in Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Mellado, Daniel; Fernández-Medina, Eduardo; Piattini, Mario

    Security requirements engineering is both a central task and a critical success factor in product line development due to the complexity and extensive nature of product lines. However, most of the current product line practices in requirements engineering do not adequately address security requirements engineering. Therefore, in this chapter we will propose a security requirements engineering process (SREPPLine) driven by security standards and based on a security requirements decision model along with a security variability model to manage the variability of the artefacts related to security requirements. The aim of this approach is to deal with security requirements from the early stages of the product line development in a systematic way, in order to facilitate conformance with the most relevant security standards with regard to the management of security requirements, such as ISO/IEC 27001 and ISO/IEC 15408.

  11. Active and passive interaction mechanism of smart materials for health monitoring of engineering structures: a review

    NASA Astrophysics Data System (ADS)

    Annamdas, Venu Gopal Madhav; Annamdas, Kiran Kumar

    2009-03-01

Smart materials, when interacting with engineering structures, should have the capability to sense, measure, process, and detect any change in the selected variables (stress, damage) at critical locations. These smart materials can be classified as active or passive depending on the type of structure, the variables to be monitored, and the interaction mechanism (surface bonding or embedment). Some of the prominent smart materials are piezoelectric materials, micro-fiber composites, polymers, shape memory alloys, electrostrictive and magnetostrictive materials, electrorheological and magnetorheological fluids, and fiber optics. In addition, host structures have properties that either support or resist the use of smart materials on or inside them. This paper presents some of the most widely used smart materials and their interaction mechanisms for structural health monitoring of engineering structures.

  12. [Metabolic control in the critically ill patient, an update: hyperglycemia, glucose variability, hypoglycemia and relative hypoglycemia].

    PubMed

    Pérez-Calatayud, Ángel Augusto; Guillén-Vidaña, Ariadna; Fraire-Félix, Irving Santiago; Anica-Malagón, Eduardo Daniel; Briones Garduño, Jesús Carlos; Carrillo-Esper, Raúl

Metabolic changes of glucose in critically ill patients increase morbidity and mortality. The appropriate level of blood glucose has not been established so far and should be adjusted for different populations. However, concepts such as glucose variability and relative hypoglycemia in critically ill patients are changing management methods and leading to closer monitoring. The purpose of this review is to present new data about the management and metabolic control of patients in critical areas. Currently, glucose can no longer be regarded as an innocent element in critical patients; both hyperglycemia and hypoglycemia increase the morbidity and mortality of patients. Protocols and better instruments for continuous measurement are necessary to achieve the metabolic control of our patients. Copyright © 2016 Academia Mexicana de Cirugía A.C. Publicado por Masson Doyma México S.A. All rights reserved.

  13. CBT Specific Process in Exposure-Based Treatments: Initial Examination in a Pediatric OCD Sample

    PubMed Central

    Benito, Kristen Grabill; Conelea, Christine; Garcia, Abbe M.; Freeman, Jennifer B.

    2012-01-01

    Cognitive-Behavioral theory and empirical support suggest that optimal activation of fear is a critical component for successful exposure treatment. Using this theory, we developed coding methodology for measuring CBT-specific process during exposure. We piloted this methodology in a sample of young children (N = 18) who previously received CBT as part of a randomized controlled trial. Results supported the preliminary reliability and predictive validity of coding variables with 12 week and 3 month treatment outcome data, generally showing results consistent with CBT theory. However, given our limited and restricted sample, additional testing is warranted. Measurement of CBT-specific process using this methodology may have implications for understanding mechanism of change in exposure-based treatments and for improving dissemination efforts through identification of therapist behaviors associated with improved outcome. PMID:22523609

  14. Forcing functions governing salt transport processes in coastal navigation canals and connectivity to surrounding marshes in South Louisiana using Houma Navigation Canal as a surrogate

    USGS Publications Warehouse

    Snedden, Gregg

    2014-01-01

Understanding how circulation and mixing processes in coastal navigation canals influence the exchange of salt between marshes and the coastal ocean, and how those processes are modulated by external physical processes, is critical to anticipating the effects of future actions and circumstances. Examples of such circumstances include deepening the channel, placement of locks in the channel, changes in freshwater discharge down the channel, changes in outer continental shelf (OCS) vessel traffic volume, and sea level rise. The study builds on previous BOEM-funded studies by investigating salt flux variability through the Houma Navigation Canal (HNC). It examines how external physical factors, such as buoyancy forcing and mixing from tidal stirring and OCS vessel wakes, influence dispersive and advective fluxes through the HNC, and the impact of this salt flux on salinity in nearby marshes. This study quantifies salt transport processes and salinity variability in the HNC and the surrounding Terrebonne marshes. Data collected for this study include time-series data of salinity and velocity in the HNC, monthly salinity-depth profiles along the length of the channel, hourly vertical profiles of velocity and salinity over multiple tidal cycles, and salinity time series at three locations in the surrounding marshes along a transect of increasing distance from the HNC. Two modes of vertical current structure were identified. The first mode, making up 90% of the total flow field variability, strongly resembled a barotropic current structure and was coherent with alongshelf wind stress over the coastal Gulf of Mexico. The second mode was indicative of gravitational circulation and was linked to variability in tidal stirring and the longitudinal salinity gradients along the channel's length. Diffusive processes were the dominant drivers of up-estuary salt transport, except during periods of minimal tidal stirring, when gravitational circulation became more important.
Salinity in the surrounding marshes was much more responsive to salinity variations in the HNC than it was to variations in the lower Terrebonne marshes, suggesting that the HNC is the primary conduit for saltwater intrusion to the middle Terrebonne marshes. Finally, salt transport to the middle Terrebonne marshes directly associated with vessel wakes was negligible.

  15. A conceptual framework for implementation fidelity

    PubMed Central

    Carroll, Christopher; Patterson, Malcolm; Wood, Stephen; Booth, Andrew; Rick, Jo; Balain, Shashi

    2007-01-01

    Background Implementation fidelity refers to the degree to which an intervention or programme is delivered as intended. Only by understanding and measuring whether an intervention has been implemented with fidelity can researchers and practitioners gain a better understanding of how and why an intervention works, and the extent to which outcomes can be improved. Discussion The authors undertook a critical review of existing conceptualisations of implementation fidelity and developed a new conceptual framework for understanding and measuring the process. The resulting theoretical framework requires testing by empirical research. Summary Implementation fidelity is an important source of variation affecting the credibility and utility of research. The conceptual framework presented here offers a means for measuring this variable and understanding its place in the process of intervention implementation. PMID:18053122

  16. The Effects of Mobile-Computer-Supported Collaborative Learning: Meta-Analysis and Critical Synthesis.

    PubMed

    Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh

    2017-08-01

    One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000-2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL.
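
    An overall mean effect size such as the 0.516 reported above is conventionally computed as an inverse-variance-weighted average of the per-study effect sizes. A minimal fixed-effect sketch, using made-up study values rather than the 48 studies in this meta-analysis:

```python
def weighted_mean_effect(effects, variances):
    """Fixed-effect meta-analytic mean: inverse-variance-weighted average
    of per-study effect sizes."""
    weights = [1.0 / v for v in variances]
    return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

# Hypothetical per-study standardized mean differences and their variances.
effects = [0.30, 0.55, 0.70, 0.45]
variances = [0.02, 0.05, 0.04, 0.03]
overall = weighted_mean_effect(effects, variances)
```

    Precisely estimated studies (small variance) pull the overall mean toward their effect, which is why a weighted mean, not a simple average, is reported.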

  17. Reconstructing historical habitat data with predictive models

    USGS Publications Warehouse

    Zweig, Christa L.; Kitchens, Wiley M.

    2014-01-01

Historical vegetation data are important to ecological studies, as many structuring processes operate at long time scales, from decades to centuries. Capturing the pattern of variability within a system (enough to declare a significant change from past to present) relies on correct assumptions about the temporal scale of the processes involved. Sufficient long-term data are often lacking, and current techniques have their weaknesses. To address this concern, we constructed multistate and artificial neural network (ANN) models to forecast and hindcast vegetation communities considered critical foraging habitat for an endangered bird, the Florida Snail Kite (Rostrhamus sociabilis). Multistate models were not able to hindcast because our data did not satisfy a detailed balance requirement for time reversibility in Markovian dynamics. Multistate models were useful for forecasting and for providing environmental variables for the ANN. Results from our ANN hindcast closely mirrored the collapse of the Snail Kite population using only environmental data to inform the model. The parallel between the two gives us confidence in the hindcasting results and their use in future demographic models.

  18. The Effects of Mobile-Computer-Supported Collaborative Learning: Meta-Analysis and Critical Synthesis

    PubMed Central

    Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh

    2017-01-01

    One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000–2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL. PMID:28989193

  19. Climate and demography in early prehistory: using calibrated ¹⁴C dates as population proxies.

    PubMed

    Riede, Felix

    2009-04-01

Although difficult to estimate for prehistoric hunter-gatherer populations, demographic variables (population size, density, and the connectedness of demes) are critical for a better understanding of the processes of material culture change, especially in deep prehistory. Demography is the middle-range link between climatic changes and both the biological and cultural evolutionary trajectories of human populations. Much of human material culture functions as a buffer against climatic changes, and the study of prehistoric population dynamics, estimated through changing frequencies of calibrated radiocarbon dates, therefore affords insights into how effectively such buffers operated and when they failed. In reviewing a number of case studies (Mesolithic Ireland, the origin of the Bromme culture, and the earliest late glacial human recolonization of southern Scandinavia), I suggest that a greater awareness of demographic processes, and in particular of demographic declines, provides many fresh insights into what structured the archaeological record. I argue that we cannot sideline climatic and environmental factors or extreme geophysical events in our reconstructions of prehistoric culture change. The implications of accepting demographic variability as a departure point for evaluating the archaeological record are discussed.

  20. First-principles modeling of laser-matter interaction and plasma dynamics in nanosecond pulsed laser shock processing

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang

    2018-02-01

Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10^5-10^6 s^-1) on the surface of the target materials. Although LSP processes have been extensively studied experimentally, few efforts have been made to elucidate the underlying process mechanisms through a physics-based process model. In particular, the development of a first-principles model is critical for process optimization and novel process design. This work aims to introduce such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. The model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated against experimental data.

  1. Optimization of process parameters for a quasi-continuous tablet coating system using design of experiments.

    PubMed

    Cahyadi, Christine; Heng, Paul Wan Sia; Chan, Lai Wah

    2011-03-01

The aim of this study was to identify and optimize the critical process parameters of the newly developed Supercell quasi-continuous coater for optimal tablet coat quality. Design of experiments, aided by multivariate analysis techniques, was used to quantify the effects of various coating process conditions and their interactions on the quality of film-coated tablets. The process parameters varied included batch size, inlet temperature, atomizing pressure, plenum pressure, spray rate, and coating level. An initial screening stage was carried out using a 2^(6-1) resolution IV fractional factorial design. Following these preliminary experiments, an optimization study was carried out using the Box-Behnken design. The main response variables measured were drug-loading efficiency, coat thickness variation, and the extent of tablet damage. Apparent optimum conditions were determined using response surface plots. The process parameters exerted various effects on the different response variables; hence, trade-offs between individual optima were necessary to obtain the best compromised set of conditions. The adequacy of the optimized process conditions in meeting the combined goals for all responses was indicated by the composite desirability value. By using response surface methodology and optimization, coating conditions which produced coated tablets of high drug-loading efficiency, low incidence of tablet damage, and low coat thickness variation were defined. Optimal conditions were found to vary over a large spectrum when different responses were considered. Changes in processing parameters across the design space did not result in drastic changes to coat quality, thereby demonstrating the robustness of the Supercell coating process. © 2010 American Association of Pharmaceutical Scientists.

  2. Variability in alignment of central venous pressure transducer to physiologic reference point in the intensive care unit-A descriptive and correlational study.

    PubMed

    Sjödin, Carl; Sondergaard, Soren; Johansson, Lotta

    2018-06-01

    The phlebostatic axis is the most commonly used anatomical external reference point for central venous pressure measurements. Deviation in the central venous pressure transducer alignment from the phlebostatic axis causes inadequate pressure readings, which may affect treatment decisions for critically ill patients in intensive care units. The primary aim of the study was to assess the variability in central venous pressure transducer levelling in the intensive care unit. We also assessed whether patient characteristics impacted on central venous pressure transducer alignment deviation. A sample of 61 critical care nurses was recruited and asked to place a transducer at the appropriate level for central venous pressure measurement. The measurements were performed in the intensive care unit on critically ill patients in supine and Fowler's positions. The variability among the participants using eyeball levelling and a laser levelling device was calculated in both sessions and adjusted for patient characteristics. A significant variation was found among critical care nurses in the horizontal levelling of the pressure transducer placement when measuring central venous pressure in the intensive care unit. Using a laser levelling device did not reduce the deviation from the phlebostatic axis. Patient characteristics had little impact on the deviation in the measurements. The anatomical external landmark for the phlebostatic axis varied between critical care nurses, as the variation in the central venous pressure transducer placement was not reduced with a laser levelling device. Standardisation of a zero-level for vascular pressures should be considered to reduce the variability in vascular pressure readings in the intensive care unit to improve patient treatment decisions. 
Further studies are needed to evaluate critical care nurses' knowledge and use of central venous pressure monitoring and whether assistive tools and/or routines can improve the accuracy in vascular pressure measurements in intensive care units. Copyright © 2018 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.

  3. Simulation-Driven Design Approach for Design and Optimization of Blankholder

    NASA Astrophysics Data System (ADS)

    Sravan, Tatipala; Suddapalli, Nikshep R.; Johan, Pilthammar; Mats, Sigvant; Christian, Johansson

    2017-09-01

Reliable design of stamping dies is desired for efficient and safe production. The design of stamping dies is today mostly based on casting feasibility, although it can also be based on criteria such as fatigue, stiffness, safety, and economy. The current work presents an approach built on Simulation Driven Design, enabling Design Optimization to address this issue. A structural finite element model of a stamping die, used to produce doors for Volvo V70/S80 car models, is studied. This die had developed cracks during use. To understand the behaviour of the stress distribution in the stamping die, a structural analysis of the die is conducted and critical regions with high stresses are identified. The results from the structural FE models are compared with analytical calculations pertaining to the fatigue properties of the material. To arrive at an optimum design with increased stiffness and lifetime, topology and free-shape optimization are performed. In the optimization routine, the identified critical regions of the die are set as design variables; other optimization variables are set to maintain the manufacturability of the resultant stamping die. Thereafter, a CAD model is built based on the geometrical results from the topology and free-shape optimizations, and this CAD model is subjected to structural analysis to visualize the new stress distribution. The process is iterated until a satisfactory result is obtained. The final results show a reduction in stress levels by 70% with a more homogeneous distribution. Even though the mass of the die is increased by 17%, overall a stiffer die with a longer lifetime is obtained. Finally, by reflecting on the entire process, a coordinated approach to handle such situations efficiently is presented.

  4. The critical role of NIR spectroscopy and statistical process control (SPC) strategy towards captopril tablets (25 mg) manufacturing process understanding: a case study.

    PubMed

    Curtivo, Cátia Panizzon Dal; Funghi, Nathália Bitencourt; Tavares, Guilherme Diniz; Barbosa, Sávio Fujita; Löbenberg, Raimar; Bou-Chacra, Nádia Araci

    2015-05-01

In this work, a near-infrared spectroscopy (NIRS) method was used to evaluate the uniformity of dosage units of three commercial batches of captopril 25 mg tablets. The performance of the calibration method was assessed by determination of the Q value (0.9986), the standard error of estimation (C-set SEE = 1.956), the standard error of prediction (V-set SEP = 2.076), and the consistency (106.1%). These results indicated the adequacy of the selected model. The method validation revealed agreement between the reference high pressure liquid chromatography (HPLC) method and the NIRS method. The process evaluation using the NIRS method showed that the variability was due to common causes and that the process delivered predictable results consistently. The Cp and Cpk values were 2.05 and 1.80, respectively. These results revealed a process that is not centered on the target average (100% w/w) within the specified range (85-115%). The predicted probability of failure was 21 per 100 million captopril tablets. NIRS, in combination with partial least squares (PLS) regression for multivariate calibration, allowed the development of a methodology for evaluating the uniformity of dosage units of captopril 25 mg tablets. The statistical process control strategy associated with the NIRS method as PAT (process analytical technology) played a critical role in understanding the sources and degree of variation and their impact on the process. This approach led towards a better process understanding and provided a sound scientific basis for its continuous improvement.
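
    The Cp and Cpk indices above follow from the standard definitions Cp = (USL - LSL)/(6σ) and Cpk = min(USL - μ, μ - LSL)/(3σ). A minimal sketch, using an illustrative mean and standard deviation back-calculated to be consistent with the reported values (the abstract itself does not state μ and σ):

```python
def process_capability(mean, sigma, lsl, usl):
    """Return (Cp, Cpk) for a process with the given mean and standard
    deviation against lower/upper specification limits."""
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative values consistent with the reported indices: a standard
# deviation of about 2.44% w/w and a mean shifted below the 100% target.
cp, cpk = process_capability(mean=98.17, sigma=2.439, lsl=85.0, usl=115.0)
```

    Cpk < Cp is exactly the signature of a capable but off-center process, as the abstract describes.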

  5. Iontophoretic delivery of lisinopril: Optimization of process variables by Box-Behnken statistical design.

    PubMed

    Gannu, Ramesh; Yamsani, Vamshi Vishnu; Palem, Chinna Reddy; Yamsani, Shravan Kumar; Yamsani, Madhusudan Rao

    2010-01-01

The objective of the investigation was to optimize the iontophoresis process parameters of lisinopril (LSP) using a three-factor, three-level Box-Behnken statistical design. LSP is an ideal candidate for iontophoretic delivery, which avoids the incomplete absorption associated with its oral administration. The independent variables selected were current (X1), salt (sodium chloride) concentration (X2), and medium/pH (X3). The dependent variables studied were the amount of LSP permeated in 4 h (Y1: Q4) and 24 h (Y2: Q24) and the lag time (Y3). Mathematical equations and response surface plots were used to relate the dependent and independent variables. The regression equations generated for the iontophoretic permeation were:

    Y1 = 1.98 + 1.23X1 - 0.49X2 + 0.025X3 - 0.49X1X2 + 0.040X1X3 - 0.010X2X3 + 0.58X1² - 0.17X2² - 0.18X3²
    Y2 = 7.28 + 3.32X1 - 1.52X2 + 0.22X3 - 1.30X1X2 + 0.49X1X3 - 0.090X2X3 + 0.79X1² - 0.62X2² - 0.33X3²
    Y3 = 0.60 + 0.0038X1 + 0.12X2 - 0.011X3 + 0.005X1X2 - 0.018X1X3 - 0.015X2X3 - 0.00075X1² + 0.017X2² - 0.11X3²

    The statistical validity of the polynomials was established, and optimized process parameters were selected by feasibility and grid search. Validation of the optimization study with 8 confirmatory runs indicated a high degree of prognostic ability of the response surface methodology. The use of the Box-Behnken design approach helped in identifying the critical process parameters in the iontophoretic delivery of lisinopril.
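
    The fitted response surfaces can be evaluated directly from the coded-factor polynomials. A minimal sketch using the Y1 (Q4) equation reported above, where -1, 0, and +1 are the coded low, center, and high factor levels:

```python
def q4_permeated(x1, x2, x3):
    """Predicted amount of LSP permeated in 4 h (Y1) from the fitted
    Box-Behnken polynomial in coded factor levels (-1, 0, +1)."""
    return (1.98 + 1.23 * x1 - 0.49 * x2 + 0.025 * x3
            - 0.49 * x1 * x2 + 0.040 * x1 * x3 - 0.010 * x2 * x3
            + 0.58 * x1 ** 2 - 0.17 * x2 ** 2 - 0.18 * x3 ** 2)

# Center point of the design: the intercept alone.
center = q4_permeated(0, 0, 0)     # 1.98
# High current with low salt concentration favours permeation,
# consistent with the positive X1 and negative X2 coefficients.
high = q4_permeated(1, -1, 0)
```

    Grid search over the coded cube, as the study describes, amounts to evaluating such polynomials at candidate (X1, X2, X3) combinations and picking the feasible optimum.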

  6. Is Critical Thinking a Mediator Variable of Student Performance in School?

    ERIC Educational Resources Information Center

    Walter, Christel; Walter, Paul

    2018-01-01

    The study explores the influences of critical thinking and interests on students' performance at school. The tested students attended German grammar schools ("Gymnasien"). Separate regression analyses showed the expected moderate positive influences of critical thinking and interests on school performance. But analyzed simultaneously,…

  7. Development and evaluation of paclitaxel nanoparticles using a quality-by-design approach.

    PubMed

    Yerlikaya, Firat; Ozgen, Aysegul; Vural, Imran; Guven, Olgun; Karaagaoglu, Ergun; Khan, Mansoor A; Capan, Yilmaz

    2013-10-01

The aims of this study were to develop and characterize paclitaxel nanoparticles, to identify and control critical sources of variability in the process, and to understand the impact of formulation and process parameters on the critical quality attributes (CQAs) using a quality-by-design (QbD) approach. To this end, a risk assessment study was performed with various formulation and process parameters to determine their impact on the CQAs of the nanoparticles, which were determined to be average particle size, zeta potential, and encapsulation efficiency. Potential risk factors were identified using an Ishikawa diagram and screened by a Plackett-Burman design, and finally the nanoparticles were optimized using a Box-Behnken design. The optimized formulation was further characterized by Fourier transform infrared spectroscopy, X-ray diffractometry, differential scanning calorimetry, scanning electron microscopy, atomic force microscopy, and gas chromatography. It was observed that paclitaxel transformed from the crystalline to the amorphous state while being totally encapsulated into the nanoparticles. The nanoparticles were spherical, smooth, and homogeneous, with no dichloromethane residue. An in vitro cytotoxicity test showed that the developed nanoparticles are more efficient than free paclitaxel in terms of antitumor activity (by more than 25%). In conclusion, this study demonstrated that understanding formulation and process parameters with the philosophy of QbD is useful for the optimization of complex drug delivery systems. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.

  8. Evaluation of graphic cardiovascular display in a high-fidelity simulator.

    PubMed

    Agutter, James; Drews, Frank; Syroid, Noah; Westneskow, Dwayne; Albert, Rob; Strayer, David; Bermudez, Julio; Weinger, Matthew B

    2003-11-01

"Human error" in anesthesia can be attributed to misleading information from patient monitors or to the physician's failure to recognize a pattern. A graphic representation of monitored data may provide better support for detection, diagnosis, and treatment. Using a user-centered design process, we developed a novel object-oriented graphic display of hemodynamic variables containing emergent features and functional relationships. Twenty anesthesiologists were asked to assume care of a simulated patient. Half the participants used the graphic cardiovascular display; the other half used a Datex AS/3 monitor. One scenario was a total hip replacement with a transfusion reaction to mismatched blood. The second scenario was a radical prostatectomy with 1.5 L of blood loss and myocardial ischemia. Subjects who used the graphic display detected myocardial ischemia 2 min sooner than those who did not use the display, and treatment was initiated sooner (2.5 versus 4.9 min). There were no significant differences between groups in the hip replacement scenario. Systolic blood pressure deviated less from baseline, central venous pressure was closer to its baseline, and arterial oxygen saturation was higher at the end of the case when the graphic display was used. The study lends some support to the hypothesis that presenting clinical information graphically, in a display designed with emergent features and functional relationships, can improve clinicians' ability to detect, diagnose, manage, and treat critical cardiovascular events in a simulated environment.

  9. Criticism in the Romantic Relationships of Individuals With Social Anxiety.

    PubMed

    Porter, Eliora; Chambless, Dianne L; Keefe, John R

    2017-07-01

Social anxiety is associated with difficulties in intimate relationships. Because fear of negative evaluation is a cardinal feature of social anxiety disorder, perceived criticism and upset due to criticism from partners may play a significant role in socially anxious individuals' intimate relationships. In the present study, we examine associations between social anxiety and perceived, observed, and expressed criticism in interactions with romantic partners. In Study 1, we collected self-report data from 343 undergraduates and their romantic partners on social anxiety symptoms, perceived and expressed criticism, and upset due to criticism. One year later, couples reported whether they were still in this relationship. Results showed that social anxiety was associated with being more critical of one's partner and, among women, being more upset by criticism from a partner. Social anxiety was not related to perceived criticism, nor did the criticism variables predict relationship status at Time 2. In Study 2, undergraduate couples with a partner high (n = 26) or low (n = 26) in social anxiety completed a 10-minute, video-recorded problem-solving task. Both partners rated their perceived and expressed criticism and upset due to criticism following the interaction, and observers coded the interactions for criticism. Results indicated that social anxiety was not significantly related to any of the criticism variables, but post hoc analyses cast doubt on the external validity of the problem-solving task. Results are discussed in light of known difficulties with intimacy among individuals with social anxiety. Copyright © 2016. Published by Elsevier Ltd.

  10. Healthy Variability in Organizational Behavior: Empirical Evidence and New Steps for Future Research.

    PubMed

    Navarro, José; Rueff-Lopes, Rita

    2015-10-01

The healthy variability thesis suggests that healthy systems function in a complex manner over time. This thesis is well established in fields like physiology. In the field of organizational behavior, however, this relation is only starting to be explored. The objective of this article is threefold. First, we aim to provide a comprehensive review of the healthy variability thesis, including some of the most important findings across different fields, with a focus on evidence from organizational research on work motivation and performance. Second, we discuss the opposite pattern, unhealthy stability, i.e., the relationship between unhealthy behaviors and lower variability; again, we provide evidence from diverse areas, from affective processes to disruptive organizational behaviors such as mobbing. Third, we provide a critical evaluation of current methodological trends and highlight what we believe to be the main factors that are keeping organizational research from advancing in the field. Theoretical, methodological, and epistemological implications are discussed. To conclude, we compile the lessons learned, which we hope will provide insights for prolific research avenues. Our main purpose is to raise awareness of the healthy variability thesis and to encourage organizational researchers to consider it in order to advance existing knowledge, revisit old theories, and create new ones.

  11. Developing a 'critical' approach to patient and public involvement in patient safety in the NHS: learning lessons from other parts of the public sector?

    PubMed

    Ocloo, Josephine E; Fulop, Naomi J

    2012-12-01

    There has been considerable momentum within the NHS over the last 10 years to develop greater patient and public involvement (PPI). This commitment has been reflected in numerous policy initiatives. In patient safety, the drive to increase involvement has increasingly been seen as an important way of building a safety culture. Evidence suggests, however, that progress has been slow and even more variable than in health care generally. Given this context, the paper analyses some of the key underlying drivers for involvement in the wider context of health and social care and makes some suggestions on what lessons can be learned for developing the PPI agenda in patient safety. To develop PPI further, it is argued that a greater understanding is needed of the contested nature of involvement in patient safety and how this has similarities to the emergence of user involvement in other parts of the public services. This understanding has led to the development of a range of critical theories to guide involvement that also make more explicit the underlying factors that support and hinder involvement processes, often related to power inequities and control. Achieving greater PPI in patient safety is therefore seen to require a more critical framework for understanding processes of involvement that can also help guide and evaluate involvement practices. © 2011 Blackwell Publishing Ltd.

  12. Variation of organic matter quantity and quality in streams at Critical Zone Observatory watersheds

    USGS Publications Warehouse

    Miller, Matthew P.; Boyer, Elizabeth W.; McKnight, Diane M.; Brown, Michael G.; Gabor, Rachel S.; Hunsaker, Carolyn T.; Iavorivska, Lidiia; Inamdar, Shreeram; Kaplan, Louis A.; Johnson, Dale W.; Lin, Henry; McDowell, William H.; Perdrial, Julia N.

    2016-01-01

    The quantity and chemical composition of dissolved organic matter (DOM) in surface waters influence ecosystem processes and anthropogenic use of freshwater. However, despite the importance of understanding spatial and temporal patterns in DOM, measures of DOM quality are not routinely included as part of large-scale ecosystem monitoring programs and variations in analytical procedures can introduce artifacts. In this study, we used consistent sampling and analytical methods to meet the objective of defining variability in DOM quantity and quality and other measures of water quality in streamflow issuing from small forested watersheds located within five Critical Zone Observatory sites representing contrasting environmental conditions. Results show distinct separations among sites as a function of water quality constituents. Relationships among rates of atmospheric deposition, water quality conditions, and stream DOM quantity and quality are consistent with the notion that areas with relatively high rates of atmospheric nitrogen and sulfur deposition and high concentrations of divalent cations result in selective transport of DOM derived from microbial sources, including in-stream microbial phototrophs. We suggest that the critical zone as a whole strongly influences the origin, composition, and fate of DOM in streams. This study highlights the value of consistent DOM characterization methods included as part of long-term monitoring programs for improving our understanding of interactions among ecosystem processes as controls on DOM biogeochemistry.

  13. Quality-by-design III: application of near-infrared spectroscopy to monitor roller compaction in-process and product quality attributes of immediate release tablets.

    PubMed

    Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W

    2015-02-01

    The objective of this study is to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified from our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least square (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing the binder source did not statistically influence the quality attributes of the granules and tablets. However, lubricant type significantly impacted granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models yielded higher SEP values for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors that could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics as a powerful tool to monitor critical quality attributes (CQA) identified during formulation development.

  14. Improved spectral comparisons of paleoclimate models and observations via proxy system modeling: Implications for multi-decadal variability

    NASA Astrophysics Data System (ADS)

    Dee, S. G.; Parsons, L. A.; Loope, G. R.; Overpeck, J. T.; Ault, T. R.; Emile-Geay, J.

    2017-10-01

    The spectral characteristics of paleoclimate observations spanning the last millennium suggest the presence of significant low-frequency (multi-decadal to centennial scale) variability in the climate system. Since this low-frequency climate variability is critical for climate predictions on societally-relevant scales, it is essential to establish whether general circulation models (GCMs) are able to simulate it faithfully. Recent studies find large discrepancies between models and paleoclimate data at low frequencies, prompting concerns surrounding the ability of GCMs to predict long-term, high-magnitude variability under greenhouse forcing (Laepple and Huybers, 2014a, 2014b). However, efforts to ground climate model simulations directly in paleoclimate observations are impeded by fundamental differences between models and the proxy data: proxy systems often record a multivariate and/or nonlinear response to climate, precluding a direct comparison to GCM output. In this paper we bridge this gap via a forward proxy modeling approach, coupled to an isotope-enabled GCM. This allows us to disentangle the various contributions to signals embedded in ice cores, speleothem calcite, coral aragonite, tree-ring width, and tree-ring cellulose. The paper addresses the following questions: (1) do forward-modeled "pseudoproxies" exhibit variability comparable to proxy data? (2) if not, which processes alter the shape of the spectrum of simulated climate variability, and are these processes broadly distinguishable from climate? We apply our method to representative case studies, and broaden these insights with an analysis of the PAGES2k database (PAGES2K Consortium, 2013). We find that current proxy system models (PSMs) can help resolve model-data discrepancies on interannual to decadal timescales, but cannot account for the mismatch in variance on multi-decadal to centennial timescales.
We conclude that, specific to this set of PSMs and isotope-enabled model, the paleoclimate record may exhibit larger low-frequency variability than GCMs currently simulate, indicative of incomplete physics and/or forcings.
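
    The kind of spectral model-data comparison described above can be illustrated with a toy pseudoproxy: adding white observational noise to a persistent (red) "climate" series flattens its spectrum, changing the ratio of low- to high-frequency power. The AR(1) series, noise level, and frequency bands below are hypothetical, chosen only to show the mechanics, not taken from the paper's PSMs.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2048

# AR(1) "climate" series: persistent, hence a red (low-frequency-rich) spectrum.
phi = 0.9
clim = np.zeros(n)
for i in range(1, n):
    clim[i] = phi * clim[i - 1] + rng.normal()

# Pseudoproxy: climate signal plus white noise, a crude stand-in for the
# noise stage of a proxy system model.
proxy = clim + rng.normal(scale=2.0, size=n)

def periodogram(x):
    """Raw periodogram; drops the zero frequency."""
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freq = np.fft.rfftfreq(len(x))
    return freq[1:], spec[1:]

f, s_clim = periodogram(clim)
_, s_proxy = periodogram(proxy)

def lf_hf_ratio(f, s):
    """Mean low-frequency power over mean high-frequency power."""
    return s[f < 0.05].mean() / s[f > 0.25].mean()

# White noise adds power at all frequencies, so the proxy spectrum is
# flatter: its low/high ratio is smaller than the climate signal's.
r_clim, r_proxy = lf_hf_ratio(f, s_clim), lf_hf_ratio(f, s_proxy)
```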

  15. On Colour, Category Effects, and Alzheimer's Disease: A Critical Review of Studies and Further Longitudinal Evidence

    PubMed Central

    Moreno-Martínez, F. Javier; Rodríguez-Rojo, Inmaculada C.

    2015-01-01

    The role of colour in object recognition is controversial; in this study, a critical review of previous studies, as well as a longitudinal study, was conducted. We examined whether colour benefits the ability of Alzheimer's disease (AD) patients and normal controls (NC) when naming items differing in colour diagnosticity: living things (LT) versus nonliving things (NLT). Eleven AD patients were evaluated twice with a temporal interval of 3 years; 26 NC were tested once. The participants performed a naming task (colour and greyscale photographs); the impact of nuisance variables (NVs) and potential ceiling effects were also controlled. Our results showed that (i) colour slightly favoured processing of items with higher colour diagnosticity (i.e., LT) in both groups; (ii) AD patients used colour information similarly to NC, retaining this ability over time; (iii) NVs played a significant role as naming predictors in all the participants, relegating domain to a minor plane; and (iv) category effects (better processing of NLT) were present in both groups. Finally, although patients underwent semantic longitudinal impairment, this was independent of colour deterioration. This finding provides better support to the view that colour is effective at the visual rather than at the semantic level of object processing. PMID:26074675

  16. On Colour, Category Effects, and Alzheimer's Disease: A Critical Review of Studies and Further Longitudinal Evidence.

    PubMed

    Moreno-Martínez, F Javier; Rodríguez-Rojo, Inmaculada C

    2015-01-01

    The role of colour in object recognition is controversial; in this study, a critical review of previous studies, as well as a longitudinal study, was conducted. We examined whether colour benefits the ability of Alzheimer's disease (AD) patients and normal controls (NC) when naming items differing in colour diagnosticity: living things (LT) versus nonliving things (NLT). Eleven AD patients were evaluated twice with a temporal interval of 3 years; 26 NC were tested once. The participants performed a naming task (colour and greyscale photographs); the impact of nuisance variables (NVs) and potential ceiling effects were also controlled. Our results showed that (i) colour slightly favoured processing of items with higher colour diagnosticity (i.e., LT) in both groups; (ii) AD patients used colour information similarly to NC, retaining this ability over time; (iii) NVs played a significant role as naming predictors in all the participants, relegating domain to a minor plane; and (iv) category effects (better processing of NLT) were present in both groups. Finally, although patients underwent semantic longitudinal impairment, this was independent of colour deterioration. This finding provides better support to the view that colour is effective at the visual rather than at the semantic level of object processing.

  17. WRC bulletin. A review of underclad cracking in pressure-vessel components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinckier, A.G.; Pense, A.W.

    1974-01-01

    This review of cracking beneath weld cladding was undertaken to determine what factors contribute to the condition and to outline means for alleviating or eliminating it. Considerable data on the manufacture, heat treatment, and cladding of heavy-section pressure-vessel steels for nuclear service are also included. Three factors in combination promote underclad cracking: a susceptible microstructure, a favorable residual-stress pattern, and a thermal treatment bringing the steel into a critical temperature region (600-650 °C) where creep ductility is low. High-heat-input weld-overlay cladding produces the susceptible microstructure and residual-stress pattern, and postweld heat treatment produces the critical temperature. Most underclad cracking was found in SA508 Class 2 steel forgings clad with one-layer submerged-arc strip electrodes or multi-electrode processes. It was not produced in SA533 Grade B plate or when multilayer overlay processes were used. Underclad cracking can be reduced or eliminated by a two-layer cladding technique, by controlling welding process variables (low heat input), by renormalizing the sensitive HAZ region prior to heat treatment, by use of nonsusceptible materials, or by eliminating the postweld heat treatment. Results of a questionnaire survey are also included. 50 references.

  18. Mycotoxin Contamination in the EU Feed Supply Chain: A Focus on Cereal Byproducts.

    PubMed

    Pinotti, Luciano; Ottoboni, Matteo; Giromini, Carlotta; Dell'Orto, Vittorio; Cheli, Federica

    2016-02-15

    Mycotoxins represent a risk to the feed supply chain with an impact on economies and international trade. A high percentage of feed samples have been reported to be contaminated with more than one mycotoxin. In most cases, the concentrations were low enough to ensure compliance with the European Union (EU) guidance values or maximum admitted levels. However, mycotoxin co-contamination might still exert adverse effects on animals due to additive/synergistic interactions. Studies on the fate of mycotoxins during cereal processing, such as milling, production of ethanol fuels, and beer brewing, have shown that mycotoxins are concentrated into fractions that are commonly used as animal feed. Published data show a high variability in mycotoxin repartitioning, mainly due to the type of mycotoxins, the level and extent of fungal contamination, and a failure to understand the complexity of food processing technologies. Precise knowledge of mycotoxin repartitioning during technological processes is critical and may provide a sound technical basis for feed managers to conform to legislation requirements and reduce the risk of severe adverse market and trade repercussions. Regular, economical and straightforward feed testing is critical to reach a quick and accurate diagnosis of feed quality. The use of rapid methods represents a future challenge.

  19. Mycotoxin Contamination in the EU Feed Supply Chain: A Focus on Cereal Byproducts

    PubMed Central

    Pinotti, Luciano; Ottoboni, Matteo; Giromini, Carlotta; Dell’Orto, Vittorio; Cheli, Federica

    2016-01-01

    Mycotoxins represent a risk to the feed supply chain with an impact on economies and international trade. A high percentage of feed samples have been reported to be contaminated with more than one mycotoxin. In most cases, the concentrations were low enough to ensure compliance with the European Union (EU) guidance values or maximum admitted levels. However, mycotoxin co-contamination might still exert adverse effects on animals due to additive/synergistic interactions. Studies on the fate of mycotoxins during cereal processing, such as milling, production of ethanol fuels, and beer brewing, have shown that mycotoxins are concentrated into fractions that are commonly used as animal feed. Published data show a high variability in mycotoxin repartitioning, mainly due to the type of mycotoxins, the level and extent of fungal contamination, and a failure to understand the complexity of food processing technologies. Precise knowledge of mycotoxin repartitioning during technological processes is critical and may provide a sound technical basis for feed managers to conform to legislation requirements and reduce the risk of severe adverse market and trade repercussions. Regular, economical and straightforward feed testing is critical to reach a quick and accurate diagnosis of feed quality. The use of rapid methods represents a future challenge. PMID:26891326

  20. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    PubMed

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means of controlling non-point source pollution within a watershed. To further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulated total nitrogen (TN) loss intensity of all 38 subbasins, the spatial distribution characteristics of nitrogen loss and the critical source areas were analyzed at three time scales: yearly average, monthly average, and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed significant spatial differences in TN loss in the Shanmei Reservoir watershed at the different time scales, with the degree of spatial differentiation of nitrogen loss decreasing in the order monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At all time scales, land use type (such as farmland and forest) was the dominant factor affecting the spatial distribution of nitrogen loss, whereas precipitation and runoff affected nitrogen loss only in months without fertilization and during several storm flood events on non-fertilization dates. This was mainly due to the significant spatial variation in land use and fertilization, together with the low spatial variability of precipitation and runoff.
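
    One simple way to quantify the "degree of spatial differentiation" of loss intensity across subbasins at different time scales, in the spirit of the comparison above, is the coefficient of variation. The sketch below uses invented loss intensities, not SWAT output, and the dictionary keys and distributions are placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical TN loss intensity (kg/ha) for 38 subbasins at three
# time scales; values are invented, not SWAT output.
tn = {
    "yearly": rng.lognormal(mean=1.0, sigma=0.5, size=38),
    "monthly": rng.lognormal(mean=0.0, sigma=0.8, size=38),
    "storm": rng.lognormal(mean=-1.0, sigma=0.3, size=38),
}

def cv(x):
    """Coefficient of variation: a scale-free measure of how strongly
    loss intensity differs across subbasins."""
    return np.std(x) / np.mean(x)

spatial_cv = {scale: cv(vals) for scale, vals in tn.items()}
# Rank the time scales by spatial differentiation, largest CV first.
ranking = sorted(spatial_cv, key=spatial_cv.get, reverse=True)
```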

  1. Long-Distance Rescue and Slow Extinction Dynamics Govern Multiscale Metapopulations.

    PubMed

    Huth, Géraldine; Haegeman, Bart; Pitard, Estelle; Munoz, François

    2015-10-01

    Rare long-distance dispersal is known to be critical for species dynamics, but how the interplay between short- and long-distance colonization influences regional persistence in a fragmented habitat remains poorly understood. We propose a metapopulation model that combines local colonization within habitat islands and long-distance colonization between islands. We study how regional occupancy dynamics are affected by the multiscale colonization process. We find that the island size distribution (ISD) is a key driver of the long-term occupancy dynamics. When the ISD is heterogeneous, that is, when island sizes are variable, we show that extinction dynamics become very slow. We demonstrate that this behavior is unrelated to the well-known extinction debt near the critical extinction threshold. Hence, this finding questions the equivalence between extinction debt and critical transitions in the context of metapopulation collapse. Furthermore, we show that long-distance colonization can rescue small islands from extinction and sustain a steady regional occupancy. These results provide novel theoretical and practical insights into extinction dynamics and persistence in fragmented habitats and are thus relevant for the design of conservation strategies.
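
    A minimal stochastic patch-occupancy simulation in the spirit of this model (not the authors' exact formulation) combines within-island colonization, rare long-distance colonization proportional to regional occupancy, and per-site extinction. The island size distribution, rates, and time horizon below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Islands of heterogeneous size (number of habitat sites per island).
sizes = rng.geometric(p=0.1, size=30)           # heterogeneous ISD
islands = [rng.random(int(s)) < 0.5 for s in sizes]  # initial occupancy

c_local, c_long, e = 0.4, 0.02, 0.15            # illustrative rates

def step(islands):
    """One synchronous update of site occupancy on every island."""
    total_occ = sum(isl.sum() for isl in islands)
    total_sites = sum(isl.size for isl in islands)
    new = []
    for isl in islands:
        frac_local = isl.mean()                 # within-island occupancy
        frac_global = total_occ / total_sites   # regional occupancy
        p_col = c_local * frac_local + c_long * frac_global
        colonize = (~isl) & (rng.random(isl.size) < p_col)
        go_extinct = isl & (rng.random(isl.size) < e)
        new.append((isl | colonize) & ~go_extinct)
    return new

occupancy = []                                  # regional occupancy trace
for _ in range(200):
    islands = step(islands)
    occupancy.append(sum(i.sum() for i in islands) / sum(i.size for i in islands))
```

    The long-distance term `c_long * frac_global` is what can rescue a small, locally extinct island, the rescue effect highlighted in the abstract.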

  2. Corrected and Republished from: BCL11A Is a Critical Component of a Transcriptional Network That Activates Recombinase Activating Gene Expression and V(D)J Recombination

    PubMed Central

    Lee, Baeck-Seung; Lee, Bum-Kyu; Iyer, Vishwanath R.; Sleckman, Barry P.; Shaffer, Arthur L.; Ippolito, Gregory C.

    2017-01-01

    Recombination activating gene 1 (RAG1) and RAG2 are critical enzymes for initiating variable-diversity-joining [V(D)J] segment recombination, an essential process for antigen receptor expression and lymphocyte development. The BCL11A transcription factor is required for B cell and plasmacytoid dendritic cell (pDC) development, but its molecular function(s) in early B cell fate specification and commitment is unknown. We show here that the major B cell isoform, BCL11A-XL, binds directly to the RAG1 promoter as well as directly to regulatory regions of transcription factors previously implicated in both B cell and pDC development to activate RAG1 and RAG2 gene transcription in pro- and pre-B cells. We employed BCL11A overexpression with recombination substrates to demonstrate direct consequences of BCL11A/RAG modulation on V(D)J recombination. We conclude that BCL11A is a critical component of a transcriptional network that regulates B cell fate by controlling V(D)J recombination. PMID:29038163

  3. Identification of Young Stellar Variables with KELT for K2 . I. Taurus Dippers and Rotators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Joseph E.; Cargile, Phillip A.; Ansdell, Megan

    One of the most well-studied young stellar associations, Taurus–Auriga, was observed by the extended Kepler mission, K2, in the spring of 2017. K2 Campaign 13 (C13) is a unique opportunity to study many stars in this young association at high photometric precision and cadence. Using observations from the Kilodegree Extremely Little Telescope (KELT) survey, we identify “dippers,” aperiodic and periodic variables among K2 C13 target stars. This release of the KELT data (light curve data in e-tables) provides the community with long-time baseline observations to assist in the understanding of the more exotic variables in the association. Transient-like phenomena on timescales of months to years are known characteristics in the light curves of young stellar objects, making contextual pre- and post-K2 observations critical to understanding their underlying processes. We are providing a comprehensive set of the KELT light curves for known Taurus–Auriga stars in K2 C13. The combined data sets from K2 and KELT should permit a broad array of investigations related to star formation, stellar variability, and protoplanetary environments.

  4. Using present day observations to detect when ocean acidification exceeds natural variability of surface seawater Ωaragonite

    NASA Astrophysics Data System (ADS)

    Sutton, A.; Sabine, C. L.; Feely, R. A.

    2016-02-01

    One of the major challenges to assessing the impact of ocean acidification on marine life is the need to better understand the magnitude of long-term change in the context of natural variability. High-frequency moored observations can be highly effective in defining interannual, seasonal, and subseasonal variability at key locations. Here we present a monthly aragonite saturation state (Ωaragonite) climatology for 15 open ocean, coastal, and coral reef locations using 3-hourly moored observations of surface seawater pCO2 and pH collected together since as early as 2009. We then use these present-day surface mooring observations to estimate pre-industrial variability at each location and compare these results to previous modeling studies addressing global-scale variability and change. Our observations suggest that open ocean sites, especially in the subtropics, are experiencing Ωaragonite values throughout much of the year that are outside the range of pre-industrial values. In coastal and coral reef ecosystems, which have higher natural variability, seasonal patterns in which present-day Ωaragonite values exceed pre-industrial bounds are emerging, with some sites exhibiting subseasonal conditions approaching Ωaragonite = 1. Linking these seasonal patterns in carbonate chemistry to biological processes in these regions is critical to identifying when and where marine life may encounter Ωaragonite values outside the conditions to which it has adapted.
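
    Building a monthly climatology from high-frequency data and flagging months whose present-day range lies outside an assumed pre-industrial envelope can be sketched as follows. The Ωaragonite series, its seasonal cycle, and the pre-industrial offset are synthetic placeholders, not mooring data; the ±2σ envelope is one simple choice of "bounds".

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 3-hourly aragonite saturation state for one year:
# a seasonal cycle around a present-day mean, plus high-frequency noise.
hours = np.arange(0, 365 * 24, 3)
month = (hours // (24 * 30.4)).astype(int).clip(0, 11)
omega = (3.0 + 0.3 * np.sin(2 * np.pi * hours / (365 * 24))
         + rng.normal(scale=0.05, size=hours.size))

# Monthly climatology: mean and spread per calendar month.
clim_mean = np.array([omega[month == m].mean() for m in range(12)])
clim_sd = np.array([omega[month == m].std() for m in range(12)])

# Assumed pre-industrial envelope (hypothetical numbers): the same
# seasonal cycle and variability shifted to a higher mean.
pre_mean = clim_mean + 0.4
pre_lo = pre_mean - 2 * clim_sd

# Months where even the upper edge of the present-day range stays below
# the pre-industrial lower bound have "emerged" from natural variability.
emerged = (clim_mean + 2 * clim_sd) < pre_lo
```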

  5. Effects of climatic factors and ecosystem responses on the inter-annual variability of evapotranspiration in a coniferous plantation in subtropical China.

    PubMed

    Xu, Mingjie; Wen, Xuefa; Wang, Huimin; Zhang, Wenjiang; Dai, Xiaoqin; Song, Jie; Wang, Yidong; Fu, Xiaoli; Liu, Yunfen; Sun, Xiaomin; Yu, Guirui

    2014-01-01

    Because evapotranspiration (ET) is the second largest component of the water cycle and a critical process in terrestrial ecosystems, understanding the inter-annual variability of ET is important in the context of global climate change. Eight years of continuous eddy covariance measurements (2003-2010) in a subtropical coniferous plantation were used to investigate the impacts of climatic factors and ecosystem responses on the inter-annual variability of ET. The mean and standard deviation of annual ET for 2003-2010 were 786.9 and 103.4 mm (with a coefficient of variation of 13.1%), respectively. The inter-annual variability of ET was largely created in three periods: March, May-June, and October, which are the transition periods between seasons. A set of look-up table approaches were used to separate the sources of inter-annual variability of ET. The annual ETs were calculated by assuming that (a) both the climate and ecosystem responses among years are variable (Vcli-eco), (b) the climate is variable but the ecosystem responses are constant (Vcli), and (c) the climate is constant but ecosystem responses are variable (Veco). The ETs that were calculated under the above assumptions suggested that the inter-annual variability of ET was dominated by ecosystem responses and that there was a negative interaction between the effects of climate and ecosystem responses. These results suggested that for long-term predictions of water and energy balance in global climate change projections, the ecosystem responses must be taken into account to better constrain the uncertainties associated with estimation.
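
    The look-up table logic above (recompute annual ET while holding either climate or the ecosystem responses fixed at their mean) can be caricatured with a simple multiplicative model. The model form, the variability measure, and all numbers below are invented for illustration; they are not the study's flux-tower calculation.

```python
import numpy as np

rng = np.random.default_rng(4)

years = 8
climate = rng.normal(1000, 120, size=years)    # e.g. an annual climate driver
response = rng.normal(0.8, 0.1, size=years)    # ecosystem "response" per year

# Annual ET under the three assumptions of the look-up table approach:
et_cli_eco = response * climate                # (a) both vary: V_cli-eco
et_cli = response.mean() * climate             # (b) climate varies only: V_cli
et_eco = response * climate.mean()             # (c) responses vary only: V_eco

# Inter-annual variability of each reconstruction (standard deviation).
iav = {k: float(np.std(v)) for k, v in
       [("cli_eco", et_cli_eco), ("cli", et_cli), ("eco", et_eco)]}

# What the full variability leaves unexplained after the two single-factor
# contributions are removed; a negative value corresponds to the negative
# climate-response interaction reported above.
interaction = iav["cli_eco"] - iav["cli"] - iav["eco"]
```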

  6. Effects of Climatic Factors and Ecosystem Responses on the Inter-Annual Variability of Evapotranspiration in a Coniferous Plantation in Subtropical China

    PubMed Central

    Xu, Mingjie; Wen, Xuefa; Wang, Huimin; Zhang, Wenjiang; Dai, Xiaoqin; Song, Jie; Wang, Yidong; Fu, Xiaoli; Liu, Yunfen; Sun, Xiaomin; Yu, Guirui

    2014-01-01

    Because evapotranspiration (ET) is the second largest component of the water cycle and a critical process in terrestrial ecosystems, understanding the inter-annual variability of ET is important in the context of global climate change. Eight years of continuous eddy covariance measurements (2003–2010) in a subtropical coniferous plantation were used to investigate the impacts of climatic factors and ecosystem responses on the inter-annual variability of ET. The mean and standard deviation of annual ET for 2003–2010 were 786.9 and 103.4 mm (with a coefficient of variation of 13.1%), respectively. The inter-annual variability of ET was largely created in three periods: March, May–June, and October, which are the transition periods between seasons. A set of look-up table approaches were used to separate the sources of inter-annual variability of ET. The annual ETs were calculated by assuming that (a) both the climate and ecosystem responses among years are variable (Vcli-eco), (b) the climate is variable but the ecosystem responses are constant (Vcli), and (c) the climate is constant but ecosystem responses are variable (Veco). The ETs that were calculated under the above assumptions suggested that the inter-annual variability of ET was dominated by ecosystem responses and that there was a negative interaction between the effects of climate and ecosystem responses. These results suggested that for long-term predictions of water and energy balance in global climate change projections, the ecosystem responses must be taken into account to better constrain the uncertainties associated with estimation. PMID:24465610

  7. Investigating Academic Achievements and Critical Thinking Dispositions of Teacher Candidates

    ERIC Educational Resources Information Center

    Karagöl, Ibrahim; Bekmezci, Sinan

    2015-01-01

    The aim of this study is to examine the relationship between academic achievements and critical thinking dispositions of teacher candidates in Faculty of Education and to find out whether critical thinking dispositions and academic achievements scores of teacher candidates differ according to different variables. The population consists of the…

  8. Understanding the Nature and Determinants of Critical Thinking among Senior Business Undergraduate Students

    ERIC Educational Resources Information Center

    Brown, F. William; Bielinska-Kwapisz, Agnieszka

    2015-01-01

    The authors examine the dimensions and determinants of critical thinking skills, as measured by the California Critical Thinking Skills Test, among graduating senior students enrolled in an Association to Advance Collegiate Schools of Business-accredited undergraduate business program. Utilizing explanatory variables, a methodology for predicting…

  9. System-wide hybrid MPC-PID control of a continuous pharmaceutical tablet manufacturing process via direct compaction.

    PubMed

    Singh, Ravendra; Ierapetritou, Marianthi; Ramachandran, Rohit

    2013-11-01

    The next generation of QbD-based pharmaceutical products will be manufactured through continuous processing. This will allow the integration of online/inline monitoring tools, coupled with an efficient advanced model-based feedback control system, to achieve precise control of process variables, so that the predefined product quality can be achieved consistently. The direct compaction process considered in this study is highly interactive and involves time delays for a number of process variables due to sensor placements, process equipment dimensions, and the flow characteristics of the solid material. A simple feedback regulatory control system (e.g., PI(D)) by itself may not be sufficient to achieve the tight process control that is mandated by regulatory authorities. The process presented herein comprises coupled dynamics involving slow and fast responses, indicating the need for a hybrid control scheme such as a combined MPC-PID scheme. In this manuscript, an efficient system-wide hybrid control strategy for an integrated continuous pharmaceutical tablet manufacturing process via direct compaction has been designed. The designed control system is a hybrid MPC-PID scheme. An effective controller parameter tuning strategy involving an ITAE method coupled with an optimization strategy has been used to tune both the MPC and PID parameters. The designed hybrid control system has been implemented in a first-principles model-based flowsheet simulated in gPROMS (Process Systems Enterprise). Results demonstrate enhanced performance of critical quality attributes (CQAs) under the hybrid control scheme compared to PID-only or MPC-only schemes, illustrating the potential of a hybrid control scheme for improving pharmaceutical manufacturing operations. Copyright © 2013 Elsevier B.V. All rights reserved.
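
    The ITAE criterion used in the tuning step can be sketched for a plain PI controller on a first-order plant: simulate the closed loop, accumulate time-weighted absolute error, and search over gains. The plant parameters, the gain grid, and the use of a simple PI loop with grid search (rather than the paper's MPC-PID scheme and optimizer in gPROMS) are deliberate simplifications.

```python
import numpy as np

def itae_cost(kp, ki, dt=0.1, t_end=50.0, tau=5.0, gain=2.0):
    """ITAE (integral of time-weighted absolute error) for a PI controller
    on a first-order process  tau*dy/dt = -y + gain*u, tracking a unit
    step setpoint, integrated with forward Euler."""
    n = int(t_end / dt)
    y, integ, cost = 0.0, 0.0, 0.0
    for k in range(n):
        t = k * dt
        e = 1.0 - y                      # setpoint error
        integ += e * dt                  # integral of error
        u = kp * e + ki * integ          # PI control law
        y += dt * (-y + gain * u) / tau  # first-order plant update
        cost += t * abs(e) * dt          # time-weighted |error|
    return cost

# Coarse grid search over gains, a stand-in for the optimization step.
grid = [(kp, ki) for kp in (0.5, 1.0, 2.0) for ki in (0.1, 0.3, 1.0)]
best = min(grid, key=lambda g: itae_cost(*g))
```

    Weighting the error by time penalizes slow settling more than the initial transient, which is why ITAE tuning tends to give well-damped, fast-settling responses.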

  10. Kiloampere, Variable-Temperature, Critical-Current Measurements of High-Field Superconductors

    PubMed Central

    Goodrich, LF; Cheggour, N; Stauffer, TC; Filla, BJ; Lu, XF

    2013-01-01

    We review variable-temperature, transport critical-current (Ic) measurements made on commercial superconductors over a range of critical currents from less than 0.1 A to about 1 kA. We have developed and used a number of systems to make these measurements over the last 15 years. Two exemplary variable-temperature systems with coil sample geometries will be described: a probe that is only variable-temperature and a probe that is variable-temperature and variable-strain. The most significant challenge for these measurements is temperature stability, since large amounts of heat can be generated by the flow of high current through the resistive sample fixture. Therefore, a significant portion of this review is focused on the reduction of temperature errors to less than ±0.05 K in such measurements. A key feature of our system is a pre-regulator that converts a flow of liquid helium to gas and heats the gas to a temperature close to the target sample temperature. The pre-regulator is not in close proximity to the sample and it is controlled independently of the sample temperature. This allows us to independently control the total cooling power, and thereby fine-tune the sample cooling power at any sample temperature. The same general temperature-control philosophy is used in all of our variable-temperature systems, but the addition of another variable, such as strain, forces compromises in design and results in some differences in operation and protocol. These aspects are analyzed to assess the extent to which the protocols for our systems might be generalized to other systems at other laboratories. Our approach to variable-temperature measurements is also placed in the general context of measurement-system design, and the perceived advantages and disadvantages of design choices are presented.
To verify the accuracy of the variable-temperature measurements, we compared critical-current values obtained on a specimen immersed in liquid helium (“liquid” or Ic liq) at 5 K to those measured on the same specimen in flowing helium gas (“gas” or Ic gas) at the same temperature. These comparisons indicate the temperature control is effective over the superconducting wire length between the voltage taps, and this condition is valid for all types of sample investigated, including Nb-Ti, Nb3Sn, and MgB2 wires. The liquid/gas comparisons are used to study the variable-temperature measurement protocol that was necessary to obtain the “correct” critical current, which was assumed to be the Ic liq. We also calibrated the magnetoresistance effect of resistive thermometers for temperatures from 4 K to 35 K and magnetic fields from 0 T to 16 T. This calibration reduces systematic errors in the variable-temperature data, but it does not affect the liquid/gas comparison since the same thermometers are used in both cases. PMID:26401435
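
The critical current Ic discussed above is conventionally extracted from the measured voltage-current curve, which for practical superconductors follows a power law V = Vc(I/Ic)^n, where Vc is the criterion voltage. A minimal sketch of that extraction (the criterion value, Ic, and n-value below are illustrative assumptions, not values from the paper):

```python
import numpy as np

def fit_ic(currents, voltages, v_criterion):
    """Fit log V = n*log I + b to the V-I transition and solve for the
    current Ic at which V equals the criterion voltage."""
    n, b = np.polyfit(np.log(currents), np.log(voltages), 1)
    ic = np.exp((np.log(v_criterion) - b) / n)
    return ic, n

# Synthetic V-I data: Ic = 100 A, n-value = 30, 1 uV criterion voltage.
vc = 1e-6
i = np.linspace(90, 110, 50)
v = vc * (i / 100.0) ** 30
ic_est, n_est = fit_ic(i, v, vc)
print(round(ic_est, 2), round(n_est, 1))  # → 100.0 30.0
```

In practice the fit would be restricted to the voltage range near the criterion, since self-field and current-transfer effects distort the curve far from it.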

  11. Hydraulic and Wave Aspects of Novorossiysk Bora

    NASA Astrophysics Data System (ADS)

    Shestakova, Anna A.; Moiseenko, Konstantin B.; Toropov, Pavel A.

    2018-02-01

    Bora in Novorossiysk (a seaport on the Black Sea coast of the Caucasus) is one of the strongest and most prominent downslope windstorms on the territory of Russia. In this paper, we evaluate the applicability of the hydraulic and wave hypotheses, which are widely used for downslope winds around the world, to the Novorossiysk bora on the basis of observational data, reanalysis, and mesoscale numerical modeling with WRF-ARW. It is shown that the mechanism of formation of the Novorossiysk bora is essentially mixed, expressed in the simultaneous presence of gravity-wave breaking and a hydraulic jump, as well as in the significant variability of the contribution of wave processes to the windstorm dynamics. The effectiveness of each mechanism depends on the intensity of the elevated inversion and the height of the mean-state critical level. The most favorable conditions for both mechanisms acting together are a moderate or weak inversion and a high or absent critical level.

  12. From physics to biology by extending criticality and symmetry breakings.

    PubMed

    Longo, G; Montévil, M

    2011-08-01

    Symmetries play a major role in physics, in particular since the work of E. Noether and H. Weyl in the first half of the last century. Herein, we briefly review their role by recalling how symmetry changes allow one to move conceptually from classical to relativistic and quantum physics. We then introduce our ongoing theoretical analysis in biology and show that symmetries play a radically different role in this discipline when compared to those in current physics. Through this comparison, we stress that symmetries must be understood in relation to conservation and stability properties, as represented in the theories. We posit that the dynamics of biological organisms, at their various levels of organization, are not "just" processes, but permanent (extended, in our terminology) critical transitions and, thus, symmetry changes. Within the limits of a relative structural stability (or interval of viability), variability is at the core of these transitions. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. What can be Learned from X-Ray Spectroscopy Concerning Hot Gas in the Local Bubble and Charge Exchange Processes

    NASA Technical Reports Server (NTRS)

    Snowden, Steven L.

    2007-01-01

    Solar wind charge exchange produces diffuse X-ray emission with a variable surface brightness comparable to that of the cosmic background. While the temporal variation of the charge exchange emission allows some separation of the components, there remains a great deal of uncertainty as to the zero level of both. Because the production mechanisms of the two components are considerably different, their spectra would provide critical diagnostics to the understanding of both. However, current X-ray observatories are very limited in both spectral resolution and sensitivity in the critical soft X-ray (less than 1.0 keV) energy range. Non-dispersive high-resolution spectrometers, such as the calorimeter proposed for the Spectrum Roentgen Gamma mission, will be extremely useful in distinguishing the cascade emission of charge exchange from the spectra of thermal bremsstrahlung cosmic plasmas.

  14. Iterative Stable Alignment and Clustering of 2D Transmission Electron Microscope Images

    PubMed Central

    Yang, Zhengfan; Fang, Jia; Chittuluru, Johnathan; Asturias, Francisco J.; Penczek, Pawel A.

    2012-01-01

    Identification of homogeneous subsets of images in a macromolecular electron microscopy (EM) image data set is a critical step in single-particle analysis. The task is handled by iterative algorithms whose performance is compromised by the compounded limitations of image alignment and K-means clustering. Here we describe an approach, iterative stable alignment and clustering (ISAC), that, relying on a new clustering method and on the concepts of stability and reproducibility, can extract validated, homogeneous subsets of images. ISAC requires only a small number of simple parameters and, with minimal human intervention, can eliminate bias from two-dimensional image clustering and maximize the quality of group averages that can be used for ab initio three-dimensional structural determination and analysis of macromolecular conformational variability. Repeated testing of the stability and reproducibility of a solution within ISAC eliminates heterogeneous or incorrect classes and introduces critical validation to the process of EM image clustering. PMID:22325773
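
The stability and reproducibility testing at the core of ISAC amounts to comparing cluster assignments obtained from independent runs; one standard agreement score for two partitions is the adjusted Rand index. A minimal, library-free sketch of such a comparison (illustrative only; this is not ISAC's actual matching procedure, and the label vectors are made up):

```python
import numpy as np

def adjusted_rand_index(a, b):
    """Adjusted Rand index between two labelings of the same items:
    1.0 for identical partitions (up to label permutation), ~0 for chance."""
    a, b = np.asarray(a), np.asarray(b)
    n = a.size
    # Contingency table of co-assignments between the two labelings.
    _, ia = np.unique(a, return_inverse=True)
    _, ib = np.unique(b, return_inverse=True)
    table = np.zeros((ia.max() + 1, ib.max() + 1), dtype=np.int64)
    for x, y in zip(ia, ib):
        table[x, y] += 1
    comb2 = lambda x: x * (x - 1) // 2  # pairs within a count
    sum_ij = comb2(table).sum()
    sum_a = comb2(table.sum(axis=1)).sum()
    sum_b = comb2(table.sum(axis=0)).sum()
    expected = sum_a * sum_b / comb2(n)
    max_index = (sum_a + sum_b) / 2
    return (sum_ij - expected) / (max_index - expected)

# Two hypothetical clustering runs over nine images; run 2 permutes the
# labels and moves one image into a different class.
labels_run1 = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
labels_run2 = np.array([1, 1, 1, 0, 0, 2, 2, 2, 2])
print(round(adjusted_rand_index(labels_run1, labels_run1), 3))  # → 1.0
print(round(adjusted_rand_index(labels_run1, labels_run2), 3))
```

Classes whose membership stays stable across many such comparisons would be retained; the rest would be rejected as irreproducible.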

  15. Cortisol Secretion and Change in Sleep Problems in Early Childhood: Moderation by Maternal Overcontrol

    PubMed Central

    Kiel, Elizabeth J.; Hummel, Alexandra C.; Luebbe, Aaron M.

    2015-01-01

    Childhood sleep problems are prevalent and relate to a wide range of negative psychological outcomes. However, it remains unclear how biological processes, such as HPA activity, may predict sleep problems over time in childhood in the context of certain parenting environments. Fifty-one mothers and their 18- to 20-month-old toddlers participated in a short-term longitudinal study assessing how shared variance among morning levels, diurnal change, and nocturnal change in toddlers’ cortisol secretion predicted change in sleep problems in the context of maternal overprotection and critical control. A composite characterized by low variability in cortisol and, to a lesser extent, high morning cortisol values predicted increasing sleep problems from age 2 to age 3 when mothers reported high critical control. Results suggest value in assessing shared variance among different indices of cortisol secretion patterns, and the interaction between cortisol and the environment, in predicting sleep problems in early childhood. PMID:25766262

  16. Online counseling: a narrative and critical review of the literature.

    PubMed

    Richards, Derek; Viganó, Noemi

    2013-09-01

    This article aimed to critically review the literature on online counseling. Database and hand-searches were made using search terms and eligibility criteria, yielding a total of 123 studies. The review begins with what characterizes online counseling. Outcome and process research in online counseling is reviewed. Features and cyberbehaviors of online counseling, such as anonymity and disinhibition, convenience, time-delay, the loss of social signaling, and writing behavior in cyberspace, are discussed. Ethical behavior, professional training, client suitability, and clients' and therapists' attitudes and experiences of online counseling are reviewed. A growing body of knowledge to date is positive in showing that online counseling can have an impact similar to, and is capable of replicating the facilitative conditions of, face-to-face encounters. A need remains for stronger empirical evidence to establish efficacy and effectiveness and to better understand the unique mediating and facilitative variables. © 2013 Wiley Periodicals, Inc.

  17. Efficient affinity maturation of antibody variable domains requires co-selection of compensatory mutations to maintain thermodynamic stability

    PubMed Central

    Julian, Mark C.; Li, Lijuan; Garde, Shekhar; Wilen, Rebecca; Tessier, Peter M.

    2017-01-01

    The ability of antibodies to accumulate affinity-enhancing mutations in their complementarity-determining regions (CDRs) without compromising thermodynamic stability is critical to their natural function. However, it is unclear if affinity mutations in the hypervariable CDRs generally impact antibody stability and to what extent additional compensatory mutations are required to maintain stability during affinity maturation. Here we have experimentally and computationally evaluated the functional contributions of mutations acquired by a human variable (VH) domain that was evolved using strong selections for enhanced stability and affinity for the Alzheimer’s Aβ42 peptide. Interestingly, half of the key affinity mutations in the CDRs were destabilizing. Moreover, the destabilizing effects of these mutations were compensated for by a subset of the affinity mutations that were also stabilizing. Our findings demonstrate that the accumulation of both affinity and stability mutations is necessary to maintain thermodynamic stability during extensive mutagenesis and affinity maturation in vitro, which is similar to findings for natural antibodies that are subjected to somatic hypermutation in vivo. These findings for diverse antibodies and antibody fragments specific for unrelated antigens suggest that the formation of the antigen-binding site is generally a destabilizing process and that co-enrichment for compensatory mutations is critical for maintaining thermodynamic stability. PMID:28349921

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Yiqi; Ahlström, Anders; Allison, Steven D.

    Soil carbon (C) is a critical component of Earth system models (ESMs) and its diverse representations are a major source of the large spread across models in the terrestrial C sink from the 3rd to 5th assessment reports of the Intergovernmental Panel on Climate Change (IPCC). Improving soil C projections is of a high priority for Earth system modeling in the future IPCC and other assessments. To achieve this goal, we suggest that (1) model structures should reflect real-world processes, (2) parameters should be calibrated to match model outputs with observations, and (3) external forcing variables should accurately prescribe the environmental conditions that soils experience. Firstly, most soil C cycle models simulate C input from litter production and C release through decomposition. The latter process has traditionally been represented by 1st-order decay functions, regulated primarily by temperature, moisture, litter quality, and soil texture. While this formulation well captures macroscopic SOC dynamics, better understanding is needed of their underlying mechanisms as related to microbial processes, depth-dependent environmental controls, and other processes that strongly affect soil C dynamics. Secondly, incomplete use of observations in model parameterization is a major cause of bias in soil C projections from ESMs. Optimal parameter calibration with both pool- and flux-based datasets through data assimilation is among the highest priorities for near-term research to reduce biases among ESMs. Thirdly, external variables are represented inconsistently among ESMs, leading to differences in modeled soil C dynamics. We recommend the implementation of traceability analyses to identify how external variables and model parameterizations influence SOC dynamics in different ESMs. 
Overall, projections of the terrestrial C sink can be substantially improved when reliable datasets are available to select the most representative model structure, constrain parameters, and prescribe forcing fields.

  19. Predicting Vaccination Intention and Benefit and Risk Perceptions: The Incorporation of Affect, Trust, and Television Influence in a Dual-Mode Model.

    PubMed

    Chen, Nien-Tsu Nancy

    2015-07-01

    Major health behavior change models tend to consider health decisions as primarily resulting from a systematic appraisal of relevant beliefs, such as the perceived benefits and risks of a pharmacological intervention. Drawing on research from the disciplines of risk management, communication, and psychology, this study proposed the inclusion of a heuristic route in established theory and tested the direction of influence between heuristic and systematic process variables. Affect and social trust were included as key heuristics in the proposed dual-mode framework of health decision making. Furthermore, exposure to health-related coverage on television was considered potentially influential over both heuristic and systematic process variables. To test this framework, data were collected from a national probability sample of 584 adults in the United States in 2012 regarding their decision to vaccinate against a hypothetical avian flu. The results provided some support for the bidirectional influence between heuristic and systematic processing. Affect toward flu vaccination and trust in the Food and Drug Administration were found to be powerful predictors of vaccination intention, enhancing intention both directly and indirectly via certain systematic process variables. The direction of influence between perceived susceptibility and severity, on the one hand, and affect, on the other, is less clear, suggesting the need for further research. Contrary to the opinion of media critics, exposure to televised health coverage was negatively associated with the perceived risks of vaccination. Results from this study carry theoretical and practical implications, and applying this model to the acceptance of different health interventions constitutes an area for future inquiries. © 2015 Society for Risk Analysis.

  20. A comparison of four typical green exercise environments and prediction of psychological health outcomes.

    PubMed

    Rogerson, Mike; Brown, Daniel K; Sandercock, Gavin; Wooller, John-James; Barton, Jo

    2016-05-01

    'Green exercise' (GE) is physical activity while simultaneously being exposed to nature. GE comprises three physical components: the individual, the exercise and the environment, and one processes component encompassing a range of psychological and physiological processes. Previous research has consistently shown affective benefits of GE compared to equivalent non-GE. Investigating the possibility of optimum GE environments may help maximise health benefits. The aim of this study was to compare affective outcomes of GE participation between four different typical GE environments (beach, grasslands, riverside, heritage), and further examine influences of several physical component-related variables and one processes component-related variable, on these outcomes. Participants (N = 331) completed questionnaires before and after a 5km run, at one of four parkrun event locations. Self-esteem (Δ = 1.61, 95% confidence interval (CI) = (1.30, 1.93)), stress (Δ = -2.36, 95% CI = (-3.01, -1.71)) and mood (Δ = -5.25, 95% CI = (-7.45, -3.05)) all significantly improved from pre- to post-run (p < .05). Improvements in these measures were not significantly different between environments. Several component-related variables significantly predicted these improvements, accounting for 9% of self-esteem improvement, 1.6% of perceived stress improvement, and 9.5% of mood improvement. GE offers accessible provision for improving acute psychological wellbeing. Although nature-based exercise environments can facilitate affective outcomes, the overall type of nature may be less critical. Other characteristics of the individual, exercise and environment can significantly influence attainment of psychological GE benefits. However, the results support a greater importance of the processes component in attaining previously reported affective outcomes. © Royal Society for Public Health 2015.

  1. Process Network Approach to Understanding How Forest Ecosystems Adapt to Changes

    NASA Astrophysics Data System (ADS)

    Kim, J.; Yun, J.; Hong, J.; Kwon, H.; Chun, J.

    2011-12-01

    Sustainability challenges are transforming science and its role in society. Complex systems science has emerged as an inevitable field of education and research, which transcends disciplinary boundaries and focuses on understanding the dynamics of complex social-ecological systems (SES). An SES is a combined system of social and ecological components and drivers that interact and give rise to results that could not be understood on the basis of sociological or ecological considerations alone. However, both systems may be viewed as a network of processes, and such a network hierarchy may serve as a hinge to bridge social and ecological systems. As a first step toward such an effort, we attempted to delineate and interpret such process networks in forest ecosystems, which play a critical role in the cycles of carbon and water from local to global scales. These cycles and their variability, in turn, play an important role in the emergent and self-organizing interactions between forest ecosystems and their environment. Ruddell and Kumar (2009) define a process network as a network of feedback loops and the related time scales, which describe the magnitude and direction of the flow of energy, matter, and information between the different variables in a complex system. Observational evidence, based on micrometeorological eddy covariance measurements, suggests that heterogeneity and disturbances in forest ecosystems in monsoon East Asia may help build resilience for adaptation to change. Yet, the principles that characterize the role of variability in these interactions remain elusive. In this presentation, we report results from the analysis of multivariate ecohydrologic and biogeochemical time series data obtained from temperate forest ecosystems in East Asia based on information flow statistics.

  2. Toward more realistic projections of soil carbon dynamics by Earth system models

    USGS Publications Warehouse

    Luo, Y.; Ahlström, Anders; Allison, Steven D.; Batjes, Niels H.; Brovkin, V.; Carvalhais, Nuno; Chappell, Adrian; Ciais, Philippe; Davidson, Eric A.; Finzi, Adien; Georgiou, Katerina; Guenet, Bertrand; Hararuk, Oleksandra; Harden, Jennifer; He, Yujie; Hopkins, Francesca; Jiang, L.; Koven, Charles; Jackson, Robert B.; Jones, Chris D.; Lara, M.; Liang, J.; McGuire, A. David; Parton, William; Peng, Changhui; Randerson, J.; Salazar, Alejandro; Sierra, Carlos A.; Smith, Matthew J.; Tian, Hanqin; Todd-Brown, Katherine E. O; Torn, Margaret S.; van Groenigen, Kees Jan; Wang, Ying; West, Tristram O.; Wei, Yaxing; Wieder, William R.; Xia, Jianyang; Xu, Xia; Xu, Xiaofeng; Zhou, T.

    2016-01-01

    Soil carbon (C) is a critical component of Earth system models (ESMs), and its diverse representations are a major source of the large spread across models in the terrestrial C sink from the third to fifth assessment reports of the Intergovernmental Panel on Climate Change (IPCC). Improving soil C projections is of a high priority for Earth system modeling in the future IPCC and other assessments. To achieve this goal, we suggest that (1) model structures should reflect real-world processes, (2) parameters should be calibrated to match model outputs with observations, and (3) external forcing variables should accurately prescribe the environmental conditions that soils experience. First, most soil C cycle models simulate C input from litter production and C release through decomposition. The latter process has traditionally been represented by first-order decay functions, regulated primarily by temperature, moisture, litter quality, and soil texture. While this formulation well captures macroscopic soil organic C (SOC) dynamics, better understanding is needed of their underlying mechanisms as related to microbial processes, depth-dependent environmental controls, and other processes that strongly affect soil C dynamics. Second, incomplete use of observations in model parameterization is a major cause of bias in soil C projections from ESMs. Optimal parameter calibration with both pool- and flux-based data sets through data assimilation is among the highest priorities for near-term research to reduce biases among ESMs. Third, external variables are represented inconsistently among ESMs, leading to differences in modeled soil C dynamics. We recommend the implementation of traceability analyses to identify how external variables and model parameterizations influence SOC dynamics in different ESMs. 
Overall, projections of the terrestrial C sink can be substantially improved when reliable data sets are available to select the most representative model structure, constrain parameters, and prescribe forcing fields.
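
The first-order decay formulation described above amounts to dC/dt = I − k·ξ(T, W)·C for each pool, with litter input I, a base decay rate k, and an environmental rate modifier ξ. A single-pool sketch under assumed, purely illustrative parameter values (not drawn from any particular ESM):

```python
import numpy as np

def simulate_soc(c0, litter_in, k_base, temp_scalar, moist_scalar, dt, steps):
    """Euler integration of a single-pool, first-order soil C model:
    dC/dt = I - k * xi * C, with xi = temp_scalar * moist_scalar."""
    k = k_base * temp_scalar * moist_scalar
    c = c0
    trajectory = [c]
    for _ in range(steps):
        c += (litter_in - k * c) * dt
        trajectory.append(c)
    return np.array(trajectory)

# Illustrative values: 50 g C m^-2 yr^-1 litter input, 2% yr^-1 base decay,
# neutral environmental scalars; integrated over 500 years.
traj = simulate_soc(c0=1000.0, litter_in=50.0, k_base=0.02,
                    temp_scalar=1.0, moist_scalar=1.0, dt=0.1, steps=5000)
# The pool relaxes toward the steady state C* = I / k = 50 / 0.02 = 2500.
print(round(traj[-1], 1))
```

The steady state C* = I/k makes clear why both the decay parameters and the forcing-dependent scalars must be constrained by observations: an error in either propagates directly into the projected soil C stock.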

  3. Immigration and early life stages recruitment of the European flounder (Platichthys flesus) to an estuarine nursery: The influence of environmental factors

    NASA Astrophysics Data System (ADS)

    Amorim, Eva; Ramos, Sandra; Elliott, Michael; Bordalo, Adriano A.

    2016-01-01

    Connectivity between coastal spawning grounds and estuarine nurseries is a critical step in the life cycle of many fish species. Larval immigration and transport-associated physical-biological processes are determinants of recruitment success to nursery areas. The recruitment of the European flounder, Platichthys flesus, to estuarine nurseries located at the southern edge of the species distribution range has usually been investigated during its juvenile stages, while estuarine recruitment during the earlier planktonic life stage remains largely unstudied. The present study investigated the patterns of flounder larval recruitment and the influence of environmental factors on the immigration of the early life stages to the Lima estuary (NW Portugal), integrating data on fish larvae and post-settlement individuals (< 50 mm length) collected over 7 years. Late-stage larvae arrived at the estuary between February and July, and peak abundances were observed in April. Post-settlement individuals (< 50 mm) occurred later, between April and October, whereas newly-settled ones (< 20 mm) were found only in May and June. Variables associated with the spawning, survival and growth of larvae in the ocean (sea surface temperature, chlorophyll a and inland hydrological variables) were the major drivers of flounder occurrence in the estuarine nursery. Although the adjacent coastal area is characterized by a current system with strong seasonality and mesoscale variability, we did not identify any influence of variables related to physical processes (currents and upwelling) on the occurrence of early life stages in the estuary. A wider knowledge of the influence of coastal circulation variability and its associated effects upon ocean-estuarine connectivity is required to improve our understanding of the population dynamics of marine spawning fish that use estuarine nurseries.

  4. Intensity of interprofessional collaboration among intensive care nurses at a tertiary hospital.

    PubMed

    Serrano-Gemes, G; Rich-Ruiz, M

    To measure the intensity of interprofessional collaboration (IPC) among nurses of an intensive care unit (ICU) at a tertiary hospital, to check differences between the dimensions of the Intensity of Interprofessional Collaboration Questionnaire, and to identify the influence of personal variables. A cross-sectional descriptive study was conducted with 63 intensive care nurses selected by simple random sampling. Explanatory variables: age, sex, years of experience in nursing, years of experience in critical care, workday type and work shift type; outcome variable: IPC. IPC was measured with the Intensity of Interprofessional Collaboration Questionnaire. Descriptive and bivariate statistical analysis (IPC and its dimensions with explanatory variables) was performed. 73.8% were women, with a mean age of 46.54 (±6.076) years. The average years of experience in nursing and critical care were 23.03 (±6.24) and 14.25 (±8.532), respectively. 77% worked full time and 95.1% worked a rotating shift. 62.3% obtained average IPC values. Statistically significant differences were found (P<.05) between IPC (overall score) and overall assessment with years of experience in critical care. This study shows average levels of IPC; the nurses with less experience in critical care obtained higher IPC and overall assessment scores. Copyright © 2016 Sociedad Española de Enfermería Intensiva y Unidades Coronarias (SEEIUC). Publicado por Elsevier España, S.L.U. All rights reserved.

  5. Holocene climate variability in Texas, USA: An integration of existing paleoclimate data and modeling with a new, high-resolution speleothem record

    USGS Publications Warehouse

    Wong, Corinne I.; Banner, Jay L.; Musgrove, MaryLynn

    2015-01-01

    Delineating the climate processes governing precipitation variability in drought-prone Texas is critical for predicting and mitigating climate change effects, and requires the reconstruction of past climate beyond the instrumental record. We synthesize existing paleoclimate proxy data and climate simulations to provide an overview of climate variability in Texas during the Holocene. Conditions became progressively warmer and drier transitioning from the early to mid Holocene, culminating between 7 and 3 ka (thousand years ago), and were more variable during the late Holocene. The timing and relative magnitude of Holocene climate variability, however, is poorly constrained owing to considerable variability among the different records. To help address this, we present a new speleothem (NBJ) reconstruction from a central Texas cave that comprises the highest resolution proxy record to date, spanning the mid to late Holocene. NBJ trace-element concentrations indicate variable moisture conditions with no clear temporal trend. There is a decoupling between NBJ growth rate, trace-element concentrations, and δ18O values, which indicate that (i) the often direct relation between speleothem growth rate and moisture availability is likely complicated by changes in the overlying ecosystem that affect subsurface CO2 production, and (ii) speleothem δ18O variations likely reflect changes in moisture source (i.e., proportion of Pacific-vs. Gulf of Mexico-derived moisture) that appear not to be linked to moisture amount.

  6. Compact touchless fingerprint reader based on digital variable-focus liquid lens

    NASA Astrophysics Data System (ADS)

    Tsai, C. W.; Wang, P. J.; Yeh, J. A.

    2014-09-01

    Identity certification in the cyberworld has always been troublesome when critical information and financial transactions must be processed. Biometric identification is the most effective measure to circumvent identity issues in mobile devices. Due to their bulky and pricey optical design, conventional optical fingerprint readers have been discarded for mobile applications. In this paper, a digital variable-focus liquid lens was adopted to capture a floating finger via fast focus-plane scanning. Simply placing a finger in front of the camera completes the fingerprint ID process. This prototyped fingerprint reader scans multiple focal planes from 30 mm to 15 mm in 0.2 second. From the multiple images at various focuses, one image is chosen for extraction of the fingerprint minutiae used for identity certification. In the optical design, a digital liquid lens atop a webcam with a fixed-focus lens module fast-scans a floating finger at preset focus planes. The distance, rolling angle and pitching angle of the finger are stored as crucial parameters for the matching process of fingerprint minutiae. This innovative compact touchless fingerprint reader could be packed into a minute size of 9.8 × 9.8 × 5 mm once the optical design and multiple focus-plane scan function are optimized.

  7. The use of bibliometrics to measure research quality in UK higher education institutions.

    PubMed

    Adams, Jonathan

    2009-01-01

    Research assessment in the UK has evolved over a quarter of a century from a loosely structured, peer-review based process to one with a well-understood data portfolio and assessment methodology. After 2008, the assessment process will shift again, to the use of indicators based largely on publication and citation data. These indicators will in part follow the format introduced in 2008, with a profiling of assessment outcomes at national and international levels. However, the shift from peer assessment to a quantitative methodology raises critical issues about which metrics are appropriate and informative, and how such metrics should be managed to produce weighting factors for funding formulae. The link between publication metrics and other perceptions of research quality needs to be thoroughly tested and reviewed, and may vary between disciplines. Many of the indicators that drop out of publication data are poorly linked to quality and should not be used at all. There are also issues about which publications are the correct base for assessment, which staff should be included in a review, how subjects should be structured, and how citation data should be normalised to account for discipline-dependent variables. Finally, it is vital to consider the effect that any assessment process will have on the behaviour of those to be assessed.

  8. Goal specificity: a proxy measure for improvements in environmental outcomes in collaborative governance.

    PubMed

    Biddle, Jennifer C; Koontz, Tomas M

    2014-12-01

    Critics of collaborative governance continually call for evidence to support its prevalent use. As is often the case in environmental policy, environmental outcomes occur at a rate incompatible with political agendas. In addition, a multitude of possibly confounding variables makes it difficult to correlate collaborative governance processes with environmental outcomes. The findings of this study offer empirical evidence that collaborative processes have a measurable, beneficial effect on environmental outcomes. Through the use of a unique paired-waterbody design, our dataset reduced the potential for confounding variables to impact our environmental outcome measurements. The results of a path analysis indicate that the output of setting specific pollutant reduction goals is significantly related to watershed partnerships' level of attainment of their environmental improvement goals. The action of setting specific goals (e.g. percentage of load reductions in pollutant levels) is fostered by sustained participation from partnership members throughout the lifecycle of the collaborative. In addition, this study demonstrates the utility of logic modeling for environmental planning and management, and suggests that the process of setting specific pollutant reduction goals is a useful proxy measure for reporting progress towards improvements in environmental outcomes when long-term environmental data are not available. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Stochastic Growth of Ion Cyclotron And Mirror Waves In Earth's Magnetosheath

    NASA Technical Reports Server (NTRS)

    Cairns, Iver H.; Grubits, K. A.

    2001-01-01

    Electromagnetic ion cyclotron and mirror waves in Earth's magnetosheath are bursty, have widely variable fields, and are unexpectedly persistent, properties difficult to reconcile with uniform secular growth. Here it is shown for specific periods that stochastic growth theory (SGT) quantitatively accounts for the functional form of the wave statistics and qualitatively explains the wave properties. The wave statistics are inconsistent with uniform secular growth or self-organized criticality, but nonlinear processes sometimes play a role at high fields. The results show SGT's relevance near marginal stability and suggest that it is widely relevant to space and astrophysical plasmas.

  10. Spares Management: Optimizing Hardware Usage for the Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Gulbrandsen, K. A.

    1999-01-01

    The complexity of the Space Shuttle Main Engine (SSME), combined with mounting requirements to reduce operations costs, has increased demands for accurate tracking, maintenance, and projection of SSME assets. The SSME Logistics Team is developing an integrated asset management process. This PC-based tool provides a user-friendly asset database for daily decision making, plus a variable-input hardware usage simulation whose complex logic yields output addressing essential asset management issues. Cycle times on critical tasks are significantly reduced, and associated costs have decreased as asset data quality and decision-making capability have increased.

  11. Quantum Hamiltonian identification from measurement time traces.

    PubMed

    Zhang, Jun; Sarovar, Mohan

    2014-08-22

    Precise identification of parameters governing quantum processes is a critical task for quantum information and communication technologies. In this Letter, we consider a setting where system evolution is determined by a parametrized Hamiltonian, and the task is to estimate these parameters from temporal records of a restricted set of system observables (time traces). Based on the notion of system realization from linear systems theory, we develop a constructive algorithm that provides estimates of the unknown parameters directly from these time traces. We illustrate the algorithm and its robustness to measurement noise by applying it to a one-dimensional spin chain model with variable couplings.

  12. Real-time determination of critical quality attributes using near-infrared spectroscopy: a contribution for Process Analytical Technology (PAT).

    PubMed

    Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalà, Manel

    2012-08-15

    Process Analytical Technology (PAT) is playing a central role in current regulations on pharmaceutical production processes. Proper understanding of all operations and variables connecting the raw materials to end products is one of the keys to ensuring product quality and continuous improvement in production. Near infrared spectroscopy (NIRS) has been successfully used to develop faster, non-invasive quantitative methods for real-time prediction of critical quality attributes (CQA) of pharmaceutical granulates (API content, pH, moisture, flowability, angle of repose, and particle size). NIR spectra were acquired from the bin blender after the granulation process, in a non-classified area, without the need for sample withdrawal. The methodology used for data acquisition, calibration modelling, and method application in this context is relatively inexpensive and can be easily implemented by most pharmaceutical laboratories. The Partial Least-Squares (PLS) algorithm was used to calculate multivariate calibration models, which provided acceptable Root Mean Square Error of Prediction (RMSEP) values (RMSEP(API)=1.0 mg/g; RMSEP(pH)=0.1; RMSEP(Moisture)=0.1%; RMSEP(Flowability)=0.6 g/s; RMSEP(Angle of repose)=1.7°; RMSEP(Particle size)=2.5%), allowing application to routine analyses of production batches. The proposed method affords quality assessment of end products and determination of important parameters with a view to understanding the production processes used by the pharmaceutical industry. As shown here, the NIRS technique is a highly suitable tool for Process Analytical Technology. Copyright © 2012 Elsevier B.V. All rights reserved.
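
    The calibration workflow described above, fitting a PLS model and reporting RMSEP on held-out samples, can be sketched in outline. The NIPALS PLS1 implementation and the synthetic "spectra" below are illustrative assumptions, not the paper's data or software:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """NIPALS PLS1 regression on mean-centered data;
    returns coefficients and the centering terms."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    X, y = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)        # weight vector
        t = X @ w                     # scores
        tt = t @ t
        p = X.T @ t / tt              # X loadings
        q = (y @ t) / tt              # y loading
        X = X - np.outer(t, p)        # deflate X
        y = y - q * t                 # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return B, x_mean, y_mean

rng = np.random.default_rng(1)
n, p = 200, 100
conc = rng.uniform(80.0, 120.0, n)                  # hypothetical API content, mg/g
band = np.exp(-0.5 * ((np.arange(p) - 50) / 8)**2)  # synthetic absorption band
X = np.outer(conc, band) + 0.5 * rng.standard_normal((n, p))

cal, val = np.arange(150), np.arange(150, n)        # calibration / validation split
B, x_mean, y_mean = pls1_fit(X[cal], conc[cal], n_components=2)
y_pred = (X[val] - x_mean) @ B + y_mean
rmsep = np.sqrt(np.mean((conc[val] - y_pred)**2))   # root mean square error of prediction
```

    The reported figures of merit (e.g. RMSEP(API)=1.0 mg/g) are exactly this statistic computed on production-scale validation sets rather than synthetic data.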

  13. Soil Erosion as a stochastic process

    NASA Astrophysics Data System (ADS)

    Casper, Markus C.

    2015-04-01

    The main tools for estimating the risk and amount of erosion are different types of soil erosion models: on the one hand, empirically based model concepts; on the other hand, more physically based or process-based models. However, both types of models have substantial weak points. Empirical model concepts are only capable of providing rough estimates over larger temporal and spatial scales, and they do not account for many driving factors that are in the scope of scenario-related analysis. The physically based models, in turn, contain important empirical parts, so the demand for universality and transferability is not met. As a common feature, we find that all models rely on parameters and input variables that are, to a certain extent, spatially and temporally averaged. A central question is whether the apparent heterogeneity of soil properties or the random nature of driving forces needs to be better considered in our modelling concepts. Traditionally, researchers have attempted to remove spatial and temporal variability through homogenization, achieved either by physical manipulation of the system or by statistical averaging procedures. The price for these homogenized (average) model concepts of soils and soil-related processes has often been a failure to recognize the profound importance of heterogeneity in many of the properties and processes that we study. Soil infiltrability and erosion resistance (also called "critical shear stress" or "critical stream power") are the most important empirical factors of physically based erosion models. The erosion resistance is theoretically a substrate-specific parameter, but in practice the threshold at which soil erosion begins is determined experimentally. Soil infiltrability is often calculated with empirical relationships (e.g. based on grain size distribution) and consequently needs to be corrected experimentally to better fit reality. To overcome this disadvantage of our current models, soil erosion models are needed that can directly use stochastic variables and parameter distributions. There are only a few minor approaches in this direction; the most advanced is the model "STOSEM" proposed by Sidorchuk in 2005. In this model, only a small part of the soil erosion process is described: aggregate detachment and aggregate transport by flowing water. The concept is highly simplified; for example, many parameters are temporally invariant. The main problem, however, is that our existing measurements and experiments are not geared to providing stochastic parameters (e.g. as probability density functions); in the best case they deliver a statistical validation of the mean values. Again, we get effective parameters, spatially and temporally averaged. There is an urgent need for laboratory and field experiments on overland flow structure, raindrop effects, and erosion rate that deliver information on the spatial and temporal structure of soil and surface properties and processes.
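
    A minimal sketch of the stochastic treatment the author calls for, assuming a hypothetical lognormal distribution for the critical shear stress rather than a single effective value:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical: critical shear stress as a lognormal random variable
# (median 2.0 Pa, geometric standard deviation ~1.5), standing in for the
# spatial heterogeneity that a single "effective" threshold averages away.
tau_crit = rng.lognormal(mean=np.log(2.0), sigma=np.log(1.5), size=100_000)
applied_tau = 2.5  # Pa, shear stress exerted by overland flow (assumed)

# With a distributed threshold, only part of the surface erodes:
eroding_fraction = np.mean(applied_tau > tau_crit)
# A deterministic model using the median threshold answers "all or nothing":
deterministic = applied_tau > np.median(tau_crit)
```

    Here the deterministic comparison says the whole surface erodes, while the distributed threshold predicts only a fraction does, which is the qualitative difference a stochastic erosion model would capture.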

  14. Variability of dissolved organic carbon in precipitation during storms at the Shale Hills Critical Zone Observatory

    USGS Publications Warehouse

    Iavorivska, Lidiia; Boyer, Elizabeth W.; Grimm, Jeffrey W.; Miller, Matthew P.; DeWalle, David R.; Davis, Kenneth J.; Kaye, Margot W.

    2017-01-01

    Organic compounds are removed from the atmosphere and deposited to the earth's surface via precipitation. In this study, we quantified variations of dissolved organic carbon (DOC) in precipitation during storm events at the Shale Hills Critical Zone Observatory, a forested watershed in central Pennsylvania (USA). Precipitation samples were collected consecutively throughout the storm during 13 events, which spanned a range of seasons and synoptic meteorological conditions, including a hurricane. Further, we explored factors that affect the temporal variability by considering relationships of DOC in precipitation with atmospheric and storm characteristics. Concentrations and chemical composition of DOC changed considerably during storms, with the magnitude of change within individual events being comparable to or greater than the range of variation in average event composition among events. While some previous studies observed that concentrations of other elements in precipitation typically decrease over the course of individual storm events, results of this study show that DOC concentrations in precipitation are highly variable. During most storm events concentrations decreased over time, possibly as a result of washout of the below-cloud atmosphere. However, increasing concentrations observed in the later stages of some storm events highlight that DOC removal with precipitation is not merely a dilution response. Increases in DOC during events could result from advection of air masses, local emissions during breaks in precipitation, or chemical transformations in the atmosphere that enhance the solubility of organic carbon compounds. This work advances understanding of processes occurring during storms that are relevant to studies of atmospheric chemistry, carbon cycling, and ecosystem responses.

  15. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with data produced using surrogate methods. Unlike for continuous movement, no appropriate surrogate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying the fine-scale dynamics of the observed signal while maintaining its macro structural characteristics. Comparison of entropy estimates indicated that observed signals had greater regularity than their surrogates and were the result not only of stochastic but also of deterministic processes. The proposed surrogate method is both a valid and a reliable technique for investigating determinism in other discrete human movement time series.
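
    The surrogate logic described above can be sketched with a generic random-shuffle surrogate and a simple discriminating statistic (lag-1 autocorrelation); this is an illustrative stand-in for the authors' method, and the simulated "movement" series is an assumption:

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulated "discrete movement" series: deterministic structure plus noise
t = np.arange(500)
observed = np.sin(2 * np.pi * t / 25) + 0.2 * rng.standard_normal(t.size)

def lag1_autocorr(x):
    """Lag-1 autocorrelation, a cheap stand-in for an entropy estimate."""
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x @ x)

# Random-shuffle surrogates keep the amplitude distribution but destroy
# temporal (fine-scale) structure:
surrogate_ac = [lag1_autocorr(rng.permutation(observed)) for _ in range(200)]
observed_ac = lag1_autocorr(observed)

# Determinism is indicated when the observed statistic falls outside the
# surrogate distribution:
p_like = np.mean(np.abs(surrogate_ac) >= np.abs(observed_ac))
```

    A small `p_like` means the observed temporal structure cannot be explained by a purely stochastic process with the same amplitude distribution, which is the rationale for surrogate testing before entropy analysis.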

  16. New directions: Time for a new approach to modeling surface-atmosphere exchanges in air quality models?

    NASA Astrophysics Data System (ADS)

    Saylor, Rick D.; Hicks, Bruce B.

    2016-03-01

    Just as the exchange of heat, moisture and momentum between the Earth's surface and the atmosphere are critical components of meteorological and climate models, the surface-atmosphere exchange of many trace gases and aerosol particles is a vitally important process in air quality (AQ) models. Current state-of-the-art AQ models treat the emission and deposition of most gases and particles as separate model parameterizations, even though evidence has accumulated over time that the emission and deposition processes of many constituents are often two sides of the same coin, with the upward (emission) or downward (deposition) flux over a landscape depending on a range of environmental, seasonal and biological variables. In this note we argue that the time has come to integrate the treatment of these processes in AQ models to provide biological, physical and chemical consistency and improved predictions of trace gases and particles.

  17. Total quality management in orthodontic practice.

    PubMed

    Atta, A E

    1999-12-01

    Quality is the buzzword for the new millennium. Patients demand it, and we must deliver it; yet first one must define it. Quality is not imaging or public relations; it is a business process. This short article presents quality as a balance of three critical notions: core clinical competence, the perceived values that our patients seek and want, and the cost of quality. Customer satisfaction is a variable that must be identified for each practice. In my practice, patients perceive quality as communication and time, be it treatment or waiting time. Time is a value and a cost that must be managed effectively. Total quality management is a business function; it involves diagnosis, design, implementation, and measurement of the process, the people, and the service. Kaizen is a function that reduces non-value-added services, eliminates waste, and manages time and cost in the process. Total quality management is a total commitment to continuous improvement.

  18. High-resolution eye tracking using V1 neuron activity

    PubMed Central

    McFarland, James M.; Bondy, Adrian G.; Cumming, Bruce G.; Butts, Daniel A.

    2014-01-01

    Studies of high-acuity visual cortical processing have been limited by the inability to track eye position with sufficient accuracy to precisely reconstruct the visual stimulus on the retina. As a result, studies on primary visual cortex (V1) have been performed almost entirely on neurons outside the high-resolution central portion of the visual field (the fovea). Here we describe a procedure for inferring eye position using multi-electrode array recordings from V1 coupled with nonlinear stimulus processing models. We show that this method can be used to infer eye position with one arc-minute accuracy – significantly better than conventional techniques. This allows for analysis of foveal stimulus processing, and provides a means to correct for eye-movement induced biases present even outside the fovea. This method could thus reveal critical insights into the role of eye movements in cortical coding, as well as their contribution to measures of cortical variability. PMID:25197783

  19. An optics-based variable-temperature assay system for characterizing thermodynamics of biomolecular reactions on solid support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fei, Yiyan; Landry, James P.; Zhu, X. D., E-mail: xdzhu@physics.ucdavis.edu

    A biological state is equilibrium of multiple concurrent biomolecular reactions. The relative importance of these reactions depends on physiological temperature typically between 10 °C and 50 °C. Experimentally the temperature dependence of binding reaction constants reveals thermodynamics and thus details of these biomolecular processes. We developed a variable-temperature opto-fluidic system for real-time measurement of multiple (400–10 000) biomolecular binding reactions on solid supports from 10 °C to 60 °C within ±0.1 °C. We illustrate the performance of this system with investigation of binding reactions of plant lectins (carbohydrate-binding proteins) with 24 synthetic glycans (i.e., carbohydrates). We found that the lectin-glycan reactions in general can be enthalpy-driven, entropy-driven, or both, and water molecules play critical roles in the thermodynamics of these reactions.

  20. Association between periodontal disease and dementia: A literature review.

    PubMed

    Pazos, P; Leira, Y; Domínguez, C; Pías-Peleteiro, J M; Blanco, J; Aldrey, J M

    2016-10-22

    Periodontal disease and dementia are very prevalent, especially in elderly populations. Multiple studies have shown a link between these diseases; however, the conditions are highly heterogeneous and so is the diagnostic methodology, which may hinder interpretation and comparison of the results. The aim of this article is to provide a critical review of the literature linking these 2 processes. We retrieved 22 studies, most of which were retrospective, and analysed various methodological variables including study population, diagnosis of periodontitis, definition of dementia, adjusted variables, and results. The different aetiopathogenic mechanisms that may affect the progression and interaction of these 2 conditions were also analysed. Although available evidence indicates a positive association between periodontitis and dementia, both the strength of that association and the presence of a causal relationship have yet to be determined. Copyright © 2016 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  1. Large-scale climatic anomalies affect marine predator foraging behaviour and demography.

    PubMed

    Bost, Charles A; Cotté, Cedric; Terray, Pascal; Barbraud, Christophe; Bon, Cécile; Delord, Karine; Gimenez, Olivier; Handrich, Yves; Naito, Yasuhiko; Guinet, Christophe; Weimerskirch, Henri

    2015-10-27

    Determining the links between the behavioural and population responses of wild species to environmental variations is critical for understanding the impact of climate variability on ecosystems. Using long-term data sets, we show how large-scale climatic anomalies in the Southern Hemisphere affect the foraging behaviour and population dynamics of a key marine predator, the king penguin. When large-scale subtropical dipole events occur simultaneously in both subtropical Southern Indian and Atlantic Oceans, they generate tropical anomalies that shift the foraging zone southward. Consequently the distances that penguins foraged from the colony and their feeding depths increased and the population size decreased. This represents an example of a robust and fast impact of large-scale climatic anomalies affecting a marine predator through changes in its at-sea behaviour and demography, despite lack of information on prey availability. Our results highlight a possible behavioural mechanism through which climate variability may affect population processes.

  2. Consequences of Part Temperature Variability in Electron Beam Melting of Ti-6Al-4V

    NASA Astrophysics Data System (ADS)

    Fisher, Brian A.; Mireles, Jorge; Ridwan, Shakerur; Wicker, Ryan B.; Beuth, Jack

    2017-12-01

    To facilitate adoption of Ti-6Al-4V (Ti64) parts produced via additive manufacturing (AM), the ability to ensure part quality is critical. Measuring temperatures is an important component of part quality monitoring in all direct metal AM processes. In this work, surface temperatures were monitored using a custom infrared camera system attached to an Arcam electron beam melting (EBM®) machine. These temperatures were analyzed to understand their possible effect on solidification microstructure based on solidification cooling rates extracted from finite element simulations. Complicated thermal histories were seen during part builds, and temperature changes occurring during typical Ti64 builds may be large enough to affect solidification microstructure. There is, however, enough time between fusion of individual layers for spatial temperature variations (i.e., hot spots) to dissipate. This means that an effective thermal control strategy for EBM® can be based on average measured surface temperatures, ignoring temperature variability.

  3. Using health education theories to explain behavior change: a cross-country analysis. 2000-2001.

    PubMed

    Murray-Johnson, Lisa; Witte, Kim; Boulay, Marc; Figueroa, Maria Elena; Storey, Douglas; Tweedie, Ian

    Scholars within the fields of public health, health education, health promotion, and health communication look to specific theories to explain health behavior change. The purpose of this article is to critically compare four health theories, and key variables within them, with regard to behavior change in the area of reproductive health. Using cross-country analyses of Ghana, Nepal, and Nicaragua (data sets provided by the Center for Communication Programs, Johns Hopkins University), the authors examined the Health Belief Model, the Theory of Reasoned Action, the Extended Parallel Process Model, and Social Cognitive Theory against these objectives. Results show that all four theories provide an excellent fit to the data, but that certain variables within them may have particular value for understanding specific aspects of behavior change. Recommendations are provided for selecting theories to guide the design and evaluation of reproductive health programs.

  4. Large-scale climatic anomalies affect marine predator foraging behaviour and demography

    NASA Astrophysics Data System (ADS)

    Bost, Charles A.; Cotté, Cedric; Terray, Pascal; Barbraud, Christophe; Bon, Cécile; Delord, Karine; Gimenez, Olivier; Handrich, Yves; Naito, Yasuhiko; Guinet, Christophe; Weimerskirch, Henri

    2015-10-01

    Determining the links between the behavioural and population responses of wild species to environmental variations is critical for understanding the impact of climate variability on ecosystems. Using long-term data sets, we show how large-scale climatic anomalies in the Southern Hemisphere affect the foraging behaviour and population dynamics of a key marine predator, the king penguin. When large-scale subtropical dipole events occur simultaneously in both subtropical Southern Indian and Atlantic Oceans, they generate tropical anomalies that shift the foraging zone southward. Consequently the distances that penguins foraged from the colony and their feeding depths increased and the population size decreased. This represents an example of a robust and fast impact of large-scale climatic anomalies affecting a marine predator through changes in its at-sea behaviour and demography, despite lack of information on prey availability. Our results highlight a possible behavioural mechanism through which climate variability may affect population processes.

  5. Computer modeling of thermoelectric generator performance

    NASA Technical Reports Server (NTRS)

    Chmielewski, A. B.; Shields, V.

    1982-01-01

    Features of the DEGRA 2 computer code for simulating the operation of a spacecraft thermoelectric generator are described. The code models the physical processes occurring during operation. Input variables include the thermoelectric couple geometry and composition, the thermoelectric materials' properties, interfaces and insulation in the thermopile, the heat source characteristics, mission trajectory, and generator electrical requirements. Time steps can be specified, and sublimation of the leg and hot shoe is accounted for, as are shorts between legs. Calculations are performed for conduction, Peltier, Thomson, and Joule heating; the cold junction can be adjusted for solar radiation; and the legs of the thermoelectric couple are segmented to enhance approximation accuracy. A trial run covering 18 couple modules yielded results within 0.3% of test data. The model has been used successfully with selenide materials, SiGe, and SiN4, with output of all critical operational variables.
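
    One elementary piece of such a model, the electrical output of a single thermoelectric couple driven by its Seebeck voltage, can be sketched as follows; the material values are hypothetical illustrations, not DEGRA 2 inputs:

```python
# Hypothetical couple parameters (not DEGRA 2 data):
S = 200e-6         # Seebeck coefficient, V/K
R_internal = 0.01  # couple internal electrical resistance, ohm
dT = 400.0         # hot-to-cold junction temperature difference, K

def couple_power(R_load):
    """Electrical power delivered to the load by one thermoelectric couple."""
    V_oc = S * dT                     # open-circuit Seebeck voltage
    I = V_oc / (R_internal + R_load)  # series-circuit current
    return I**2 * R_load              # Joule power dissipated in the load

# Output power is maximized at the matched load, R_load = R_internal:
p_matched = couple_power(R_internal)
p_mismatched = couple_power(3 * R_internal)
```

    A full generator code layers onto this the thermal side (conduction, Peltier, and Thomson heat flows that set dT), time-dependent degradation such as sublimation, and the series/parallel wiring of many couples.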

  6. An optics-based variable-temperature assay system for characterizing thermodynamics of biomolecular reactions on solid support

    NASA Astrophysics Data System (ADS)

    Fei, Yiyan; Landry, James P.; Li, Yanhong; Yu, Hai; Lau, Kam; Huang, Shengshu; Chokhawala, Harshal A.; Chen, Xi; Zhu, X. D.

    2013-11-01

    A biological state is equilibrium of multiple concurrent biomolecular reactions. The relative importance of these reactions depends on physiological temperature typically between 10 °C and 50 °C. Experimentally the temperature dependence of binding reaction constants reveals thermodynamics and thus details of these biomolecular processes. We developed a variable-temperature opto-fluidic system for real-time measurement of multiple (400-10 000) biomolecular binding reactions on solid supports from 10 °C to 60 °C within ±0.1 °C. We illustrate the performance of this system with investigation of binding reactions of plant lectins (carbohydrate-binding proteins) with 24 synthetic glycans (i.e., carbohydrates). We found that the lectin-glycan reactions in general can be enthalpy-driven, entropy-driven, or both, and water molecules play critical roles in the thermodynamics of these reactions.

  7. Breast self-examination pamphlets: a content analysis grounded in fear appeal research.

    PubMed

    Kline, K N; Mattson, M

    2000-01-01

    In this study, we used the topic of breast self-examination (BSE) to illustrate how content analysis of promotional texts (already in existence, in the process of being created, or both) can provide supplementary data to that derived from audience analysis. Specifically, we used content analysis to isolate messages in BSE pamphlets that are consistent with the variables of severity, susceptibility, response efficacy, and self-efficacy, identified by existing fear appeal research and supported by other persuasion research as critical to the construction of effective health promotion messages. We then used statistical analyses to describe the relation among these 4 message variables. Our findings suggested that BSE pamphlets contain an unbalanced proportion of threat to efficacy arguments. Additionally, the efficacy messages were substantively weak. We contrasted these messages against the relatively strong mammography arguments contained in these pamphlets. We then provided recommendations for formulating stronger persuasive arguments in BSE promotional materials.

  8. Critical Thinking Dispositions of Pre-Service Teachers

    ERIC Educational Resources Information Center

    Bakir, Selda

    2015-01-01

    This study investigated the critical thinking dispositions of pre-service teachers in terms of various variables. The study included 1106 participants and used the survey model and the Turkish version (CCTDI-T) of the California Critical Thinking Disposition Inventory (CCTDI). The reliability of the scale for this study was found to be 0.82. The…

  9. Statistically Characterizing Intra- and Inter-Individual Variability in Children with Developmental Coordination Disorder

    ERIC Educational Resources Information Center

    King, Bradley R.; Harring, Jeffrey R.; Oliveira, Marcio A.; Clark, Jane E.

    2011-01-01

    Previous research investigating children with Developmental Coordination Disorder (DCD) has consistently reported increased intra- and inter-individual variability during motor skill performance. Statistically characterizing this variability is not only critical for the analysis and interpretation of behavioral data, but also may facilitate our…

  10. Seasonal controls of the short term variability of pCO2 at the Scotian Shelf

    NASA Astrophysics Data System (ADS)

    Thomas, H.; Craig, S.; Greenan, B. J. W.; Burt, W.; Herndl, G. J.; Higginson, S.; Salt, L.; Shadwick, E. H.; Urrego-Blanco, J.

    2012-04-01

    Much of the surface ocean carbon cycle variability can be attributed to the availability of sunlight, through processes such as heat fluxes or photosynthesis, which regulate the ocean carbon cycle over a wide range of time scales. The critical processes occurring on timescales of a day or less, however, have undergone few investigations, and most of those have been limited to time spans of several days to months or, exceptionally, longer periods. Optical methods have helped to infer short-term biological variability, but corresponding investigations of the oceanic CO2 system are lacking. Here, we employ high-frequency CO2 system and optical observations covering the full seasonal cycle on the Scotian Shelf, Northwestern Atlantic Ocean, in order to unravel the daily periodicity of the surface ocean carbon cycle and its effects on annual budgets. We show that significant daily periodicity occurs only if the water column is sufficiently stable, as observed during seasonal warming. During that time, biological CO2 drawdown, or net community production (NCP), is delayed for several hours relative to the daylight cycle due to the daily build-up of essential chlorophyll a, to cell physiology, and to grazing effects, all of which restrict or hinder photosynthesis in the early morning hours. NCP collapses by more than 90% in summer, when the mixed layer depth reaches its seasonal minimum, which eventually makes the observed daily periodicity of the CO2 system vanish.

  11. Robustness of solvent/detergent treatment of plasma derivatives: a data collection from Plasma Protein Therapeutics Association member companies.

    PubMed

    Dichtelmüller, Herbert O; Biesert, Lothar; Fabbrizzi, Fabrizio; Gajardo, Rodrigo; Gröner, Albrecht; von Hoegen, Ilka; Jorquera, Juan I; Kempf, Christoph; Kreil, Thomas R; Pifat, Dominique; Osheroff, Wendy; Poelsler, Gerhard

    2009-09-01

    Solvent/detergent (S/D) treatment is an established virus inactivation technology that has been applied in the manufacture of medicinal products derived from human plasma for more than 20 years. Data on the inactivation of enveloped viruses by S/D treatment collected from seven Plasma Protein Therapeutics Association member companies demonstrate the robustness, reliability, and efficacy of this virus inactivation method. The results from 308 studies reflecting production conditions as well as technical variables significantly beyond the product release specification were evaluated for virus inactivation, comprising different combinations of solvent and detergent (tri(n-butyl) phosphate [TNBP]/Tween 80, TNBP/Triton X-100, TNBP/Na-cholate) and different products (Factor [F]VIII, F IX, and intravenous and intramuscular immunoglobulins). Neither product class, process temperature, protein concentration, nor pH value has a significant impact on virus inactivation. A variable that did appear to be critical was the concentration of solvent and detergent. The data presented here demonstrate the robustness of virus inactivation by S/D treatment for a broad spectrum of enveloped test viruses and process variables. Our data substantiate the fact that no transmission of viruses such as human immunodeficiency virus, hepatitis B virus, hepatitis C virus, or of other enveloped viruses was reported for licensed plasma derivatives since the introduction of S/D treatment.

  12. Criticality as a Set-Point for Adaptive Behavior in Neuromorphic Hardware

    PubMed Central

    Srinivasa, Narayan; Stepp, Nigel D.; Cruz-Albrecht, Jose

    2015-01-01

    Neuromorphic hardware is designed by drawing inspiration from biology to overcome limitations of current computer architectures while forging the development of a new class of autonomous systems that can exhibit adaptive behaviors. Several designs in the recent past are capable of emulating large-scale networks but avoid complexity in network dynamics by minimizing the number of dynamic variables that are supported and tunable in hardware. We believe that this is due to the lack of a clear understanding of how to design self-tuning complex systems. It has been widely demonstrated that criticality appears to be the default state of the brain, manifesting in the form of spontaneous scale-invariant cascades of neural activity. Experiment, theory, and recent models have shown that neuronal networks at criticality demonstrate optimal information transfer, learning, and information processing capabilities that affect behavior. In this perspective article, we argue that understanding how large-scale neuromorphic electronics can be designed to enable emergent adaptive behavior will require an understanding of how networks emulated by such hardware can self-tune local parameters to maintain criticality as a set-point. We believe that such capability will enable the design of truly scalable intelligent systems using neuromorphic hardware that embrace complexity in network dynamics rather than avoiding it. PMID:26648839

  13. Sleep dynamics: A self-organized critical system

    NASA Astrophysics Data System (ADS)

    Comte, J. C.; Ravassard, P.; Salin, P. A.

    2006-05-01

    In psychiatric and neurological diseases, sleep is often perturbed. Moreover, recent work on humans and animals tends to show that sleep plays a strong role in memory processes; reciprocally, sleep dynamics following a learning task are modified [Hubert, Nature (London) 02663, 1 (2004); Peigneux, Neuron 44, 535 (2004)]. However, sleep analysis in humans and animals is often limited to quantifying total sleep and wake duration, and these two parameters cannot fully characterize sleep dynamics. In mammals, sleep presents a complex organization with an alternation of slow wave sleep (SWS) and paradoxical sleep (PS) episodes. Moreover, it has been shown recently that these sleep episodes are frequently interrupted by micro-arousals (without awakening). We present here a detailed analysis of the basal sleep properties emerging from the mechanisms underlying the alternation of vigilance states in an animal model. These properties bear the signature of a self-organized critical system and reveal the existence of two W states, two SWS states, and a PS state, a structure exhibiting criticality as met in sand piles. We propose a theoretical model of sleep dynamics based on several interacting neuronal populations. This new model presents the same properties as experimentally observed and explains the variability of the collected data. This experimental and theoretical study suggests that sleep dynamics shares several common features with critical systems.
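
    The "criticality as met in sand piles" analogy refers to the Bak-Tang-Wiesenfeld sandpile model, which can be sketched directly. This generic model illustrates avalanche-style dynamics with a broad, heavy-tailed event-size distribution; it is an illustration of self-organized criticality, not the authors' sleep model:

```python
import numpy as np

def btw_sandpile(n=20, n_grains=5000, seed=3):
    """Bak-Tang-Wiesenfeld sandpile on an n x n open-boundary grid.
    Drops grains at random sites; a site holding >= 4 grains topples,
    sending one grain to each neighbor (grains fall off at the edges).
    Returns the avalanche size (number of topplings) per dropped grain."""
    rng = np.random.default_rng(seed)
    z = np.zeros((n, n), dtype=int)
    sizes = []
    for _ in range(n_grains):
        i, j = rng.integers(n, size=2)
        z[i, j] += 1
        size = 0
        unstable = [(i, j)] if z[i, j] >= 4 else []
        while unstable:
            a, b = unstable.pop()
            if z[a, b] < 4:
                continue
            z[a, b] -= 4
            size += 1
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                na, nb = a + da, b + db
                if 0 <= na < n and 0 <= nb < n:
                    z[na, nb] += 1
                    if z[na, nb] >= 4:
                        unstable.append((na, nb))
        sizes.append(size)
    return np.array(sizes)

sizes = btw_sandpile()
```

    After an initial loading phase the pile self-organizes to a critical state in which most drops cause nothing while occasional drops trigger large avalanches, the scale-invariant mix of quiet intervals and cascades that the sleep-state statistics are said to resemble.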

  14. Toward a framework for levels of robot autonomy in human-robot interaction.

    PubMed

    Beer, Jenay M; Fisk, Arthur D; Rogers, Wendy A

    2014-07-01

    A critical construct related to human-robot interaction (HRI) is autonomy, which varies widely across robot platforms. Levels of robot autonomy (LORA), ranging from teleoperation to fully autonomous systems, influence the way in which humans and robots may interact with one another. Thus, there is a need to understand HRI by identifying variables that influence - and are influenced by - robot autonomy. Our overarching goal is to develop a framework for levels of robot autonomy in HRI. To reach this goal, the framework draws links between HRI and human-automation interaction, a field with a long history of studying and understanding human-related variables. The construct of autonomy is reviewed and redefined within the context of HRI. Additionally, the framework proposes a process for determining a robot's autonomy level, by categorizing autonomy along a 10-point taxonomy. The framework is intended to be treated as guidelines to determine autonomy, categorize the LORA along a qualitative taxonomy, and consider which HRI variables (e.g., acceptance, situation awareness, reliability) may be influenced by the LORA.

  15. Toward a framework for levels of robot autonomy in human-robot interaction

    PubMed Central

    Beer, Jenay M.; Fisk, Arthur D.; Rogers, Wendy A.

    2017-01-01

    A critical construct related to human-robot interaction (HRI) is autonomy, which varies widely across robot platforms. Levels of robot autonomy (LORA), ranging from teleoperation to fully autonomous systems, influence the way in which humans and robots may interact with one another. Thus, there is a need to understand HRI by identifying variables that influence – and are influenced by – robot autonomy. Our overarching goal is to develop a framework for levels of robot autonomy in HRI. To reach this goal, the framework draws links between HRI and human-automation interaction, a field with a long history of studying and understanding human-related variables. The construct of autonomy is reviewed and redefined within the context of HRI. Additionally, the framework proposes a process for determining a robot’s autonomy level, by categorizing autonomy along a 10-point taxonomy. The framework is intended to be treated as guidelines to determine autonomy, categorize the LORA along a qualitative taxonomy, and consider which HRI variables (e.g., acceptance, situation awareness, reliability) may be influenced by the LORA. PMID:29082107

  16. Supercritical Fluid: Liquid, Gas, Both or Neither? A Different Approach.

    ERIC Educational Resources Information Center

    Meyer, Edwin F.; Meyer, Thomas P.

    1986-01-01

    Presents a laboratory experiment which determines critical temperature and density of carbon dioxide. Discusses critical point and provides equations to estimate liquid volume fraction. Analyzes experimental results in terms of variables. (JM)

  17. Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).

    PubMed

    Paivio, Allan

    2013-02-01

    Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics.

  18. Silicone absorption of elastomeric closures--an accelerated study.

    PubMed

    Degrazio, F L; Hlobik, T; Vaughan, S

    1998-01-01

    There is a trend in the parenteral industry to move from elastomeric closures that are washed, siliconized, dried and sterilized in-house at the pharmaceutical manufacturer's site to pre-prepared closures purchased from the closure supplier. This preparation can consist of washing to reduce particle load and bioburden, siliconization, and placement in ready-to-sterilize bags, and may eventually extend to sterilization by steam autoclave or gamma irradiation. Since silicone oil lubrication is critical to the processability/machinability of closures, research was designed to investigate this phenomenon in closures prepared using the Westar RS (Ready-to-Sterilize) process. This paper presents the data gathered in a study of the characteristics of silicone absorption into elastomeric closures under accelerated conditions. Variables such as silicone viscosity, rubber formulation, effect of sterilization and others are considered.

  19. High-temperature optical fiber instrumentation for gas flow monitoring in gas turbine engines

    NASA Astrophysics Data System (ADS)

    Roberts, Adrian; May, Russell G.; Pickrell, Gary R.; Wang, Anbo

    2002-02-01

    In the design and testing of gas turbine engines, real-time data about such physical variables as temperature, pressure and acoustics are of critical importance. The high-temperature environment inside the engines makes conventional electronic sensor devices difficult to apply. Therefore, there is a need for innovative sensors that can operate reliably under these high-temperature conditions with the desired resolution and frequency response. A fiber-optic high-temperature sensor system for dynamic pressure measurement is presented in this paper. The sensor is based on a new sensor technology, the self-calibrated interferometric/intensity-based (SCIIB) sensor, recently developed at Virginia Tech. State-of-the-art digital signal processing (DSP) methods are applied to process the signal from the sensor to acquire high-speed frequency response.

  20. Enhancing the usability of seasonal to decadal (S2D) climate information - an evidence-based framework for the identification and assessment of sector-specific vulnerabilities

    NASA Astrophysics Data System (ADS)

    Funk, Daniel

    2016-04-01

    The successful provision of seasonal to decadal (S2D) climate service products to sector-specific users depends on specific problem characteristics and on individual user needs and decision-making processes. Climate information requires an impact on decision making to have any value (Rodwell and Doblas-Reyes, 2006). For that reason, knowledge of sector-specific vulnerabilities to S2D climate variability is very valuable information for both climate service producers and users. In this context, a concept for a vulnerability assessment framework was developed to (i) identify climate events (and especially their temporal scales) critical for sector-specific problems, in order to assess the basic requirements for appropriate climate-service product development; and to (ii) assess the potential impact or value of related climate information for decision-makers. The concept was developed within the EUPORIAS project (European Provision of Regional Impacts Assessments on Seasonal and Decadal Timescales) based on ten project-related case studies from different sectors across Europe. At the present stage, the framework may be useful as a preliminary assessment or 'quick-scan' of the vulnerability of specific systems to climate variability in the context of S2D climate service provision. The assessment strategy of the framework is user-focused, using predominantly a bottom-up approach (vulnerability as state) but also a top-down approach (vulnerability as outcome), generally based on qualitative data (surveys, interviews, etc.) and literature research for system understanding. The starting point of analysis is a climate-sensitive 'critical situation' of the considered system which requires a decision and is defined by the user. From this basis, the related 'critical climate conditions' are assessed and 'climate information needs' are derived. This mainly refers to the critical period of time of the climate event or sequence of events.
The relevant period of time of problem-specific critical climate conditions may be assessed by the resilience of the system of concern, by the response time of an interconnected system (i.e. a top-down approach using a bottom-up methodology) or, alternatively, by the critical time frame of decision-making processes (bottom-up approach). This approach counters the challenges for a vulnerability assessment of economic sectors to S2D climate events which originate from the inherent role of climate for economic sectors: climate may affect economic sectors as a hazard, resource, production or regulation factor. This implies that climate dependencies are often indirect and nonlinear. Consequently, climate events which are critical for affected systems do not necessarily correlate with common climatological extremes. One important output of the framework is a classification system of 'climate-impact types' which classifies sector-specific problems in a systemic way. This system proves to be promising because (i) it reflects and thus differentiates the cause for the climate relevance of a specific problem (compositions of buffer factors); (ii) it integrates decision-making processes, which proved to be a significant factor; (iii) it indicates a potential usability of S2D climate service products and thus integrates coping options; and (iv) it is a systemic approach which goes beyond the established 'snap-shot' of vulnerability assessments.

  1. Simplifying global biogeochemistry models to evaluate methane emissions

    NASA Astrophysics Data System (ADS)

    Gerber, S.; Alonso-Contes, C.

    2017-12-01

    Process-based models are important tools to quantify wetland methane emissions, particularly under climate change scenarios. However, evaluating these models is often cumbersome, as they are embedded in larger land-surface models where the fluctuating water table and the carbon cycle (including new, readily decomposable plant material) are predicted variables. Here, we build on these large-scale models, but instead of modeling water table and plant productivity we provide their values as boundary conditions. In contrast, aerobic and anaerobic decomposition, as well as soil-column transport of oxygen and methane, are predicted by the model. Because of these simplifications, the model has the potential to be more readily adaptable to the analysis of field-scale data. Here we determine the sensitivity of the model to specific setups, parameter choices, and boundary conditions in order to determine setup needs and inform which critical auxiliary variables need to be measured to better predict field-scale methane emissions from wetland soils. To that end we performed a global sensitivity analysis that also considers nonlinear interactions between processes. The global sensitivity analysis revealed, not surprisingly, that water table dynamics (both mean level and amplitude of fluctuations) and the rate of the carbon cycle (i.e. net primary productivity) are critical determinants of methane emissions. The depth scale where most of the potential decomposition occurs also affects methane emissions. Different transport mechanisms compensate each other to some degree: if plant conduits are constrained, diffusive flux and ebullition compensate to some degree; however, annual emissions are higher when plants help methane bypass methanotrophs in temporarily unsaturated upper layers. Finally, while oxygen consumption by plant roots helps create anoxic conditions, it has little effect on overall methane emissions.
Our initial sensitivity analysis helps guide further model development and improvement. However, an important goal for our model is to use it in field settings as a tool to deconvolve the different processes that contribute to the net transfer of methane from soils to the atmosphere.
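    The variance-based global sensitivity analysis described above can be sketched with a pick-freeze (Saltelli-style) estimator of first-order Sobol indices; the estimator below is generic, and the toy linear emission model used in the note is a hypothetical stand-in, not the authors' model.

```python
import numpy as np

def sobol_first_order(f, n_vars, n_samples=100_000, rng=None):
    """Estimate first-order Sobol sensitivity indices of f on [0, 1]^n_vars
    using the pick-freeze estimator S_i = (E[f(A) f(AB_i)] - f0^2) / Var(f),
    where AB_i is sample B with column i replaced by A's column i."""
    rng = np.random.default_rng(rng)
    A = rng.random((n_samples, n_vars))
    B = rng.random((n_samples, n_vars))
    yA = f(A)
    f0, var_y = yA.mean(), yA.var()
    S = np.empty(n_vars)
    for i in range(n_vars):
        ABi = B.copy()
        ABi[:, i] = A[:, i]            # freeze variable i at A's values
        S[i] = (np.mean(yA * f(ABi)) - f0**2) / var_y
    return S
```

    For a linear toy model y = 3*x1 + x2 + 0.5*x3 on the unit cube, the exact first-order indices are ai^2 / sum(aj^2), i.e. about 0.88, 0.10 and 0.02, which the estimator recovers at large sample sizes.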

  2. Effect of flow rate on environmental variables and phytoplankton dynamics: results from field enclosures

    NASA Astrophysics Data System (ADS)

    Zhang, Haiping; Chen, Ruihong; Li, Feipeng; Chen, Ling

    2015-03-01

    To investigate the effects of flow rate on phytoplankton dynamics and related environmental variables, a set of enclosure experiments with different flow rates was conducted in an artificial lake. We monitored nutrients, temperature, dissolved oxygen, pH, conductivity, turbidity, chlorophyll-a and phytoplankton levels. The lower biomass in all flowing enclosures showed that flow rate significantly inhibited the growth of phytoplankton. A critical flow rate occurred near 0.06 m/s, at which the relative inhibitory rate was lowest. Changes in flow conditions affected algal competition for light, resulting in a dramatic shift in phytoplankton composition, from blue-green algae in still waters to green algae in flowing conditions. These findings indicate that the critical flow rate can be useful in developing methods to reduce the occurrence of algal blooms. However, flow rate significantly enhanced the inter-relationships among environmental variables, in particular by inducing higher water turbidity and vegetative reproduction of periphyton (Spirogyra). These changes were accompanied by a decrease in underwater light intensity, which consequently inhibited the photosynthetic intensity of phytoplankton. These results warn that a universal critical flow rate might not exist, because the effect of flow rate on phytoplankton is interlinked with many other environmental variables.

  3. Advanced metrology by offline SEM data processing

    NASA Astrophysics Data System (ADS)

    Lakcher, Amine; Schneider, Loïc; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime

    2017-06-01

    Today's technology nodes contain more and more complex designs, bringing increasing challenges to chip manufacturing process steps. It is necessary to have efficient metrology to assess the process variability of these complex patterns and thus extract relevant data to generate process-aware design rules and to improve OPC models. Today, process variability is mostly addressed through the analysis of in-line monitoring features, which are often designed to support robust measurements and as a consequence are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in the chip manufacturing process, but it is challenged when it comes to measuring metrics like tip-to-tip, tip-to-line, areas or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and convert them into a GDS. Contours can be seen as the signature of the shape, as they contain all the dimensional data. Thus the methodology is to use the CD-SEM to take high-quality images, then generate SEM contours and create a database from them. Contours are used to feed an offline metrology tool that processes them to extract different metrics. It was shown in two previous papers that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously is expanded to improve its robustness and combined with the use of phylogeny to classify the SEM images according to their geometrical proximities.
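    A contour-based metric of the tip-to-tip flavor can be sketched as the minimum distance between two extracted contour point sets; production metrology tools measure along defined gauge lines, so this brute-force version (function name and inputs are hypothetical) is purely illustrative.

```python
import numpy as np

def min_gap(contour_a, contour_b):
    """Smallest point-to-point distance between two SEM contours, each given
    as an (n, 2) list of (x, y) points; a brute-force 'tip-to-tip' style metric."""
    A = np.asarray(contour_a, float)[:, None, :]   # (n, 1, 2)
    B = np.asarray(contour_b, float)[None, :, :]   # (1, m, 2)
    return float(np.sqrt(((A - B) ** 2).sum(-1)).min())
```

    With dense contours this pairwise approach is O(n*m); a real offline tool would use spatial indexing and measure along predefined cut lines instead.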

  4. Characterization of Three L-Asparaginases from Maritime Pine (Pinus pinaster Ait.).

    PubMed

    Van Kerckhoven, Sonia H; de la Torre, Fernando N; Cañas, Rafael A; Avila, Concepción; Cantón, Francisco R; Cánovas, Francisco M

    2017-01-01

    Asparaginases (ASPG, EC 3.5.1.1) catalyze the hydrolysis of the amide group of L-asparagine, producing L-aspartate and ammonium. Three ASPG, PpASPG1, PpASPG2, and PpASPG3, have been identified in the transcriptome of maritime pine (Pinus pinaster Ait.) and were transiently expressed in Nicotiana benthamiana by agroinfection. The three recombinant proteins were processed in planta into active enzymes, and all mature forms exhibited dual asparaginase/isoaspartyl dipeptidase activity, but only PpASPG1 was able to catalyze L-asparagine hydrolysis efficiently. PpASPG1 contains a variable region of 77 amino acids that is critical for proteolytic processing of the precursor and is retained in the mature enzyme. Furthermore, the functional analysis of deletion mutants demonstrated that this protein fragment is required for specific recognition of the substrate and favors enzyme stability. Potassium has a limited effect on the activation of maritime pine ASPG, which is consistent with the lack of a critical residue essential for cation interaction. Taken together, the results presented here highlight the specific features of ASPG from conifers compared to the enzymes from angiosperms.

  5. Planning for long-duration space exploration: Interviews with NASA subject matter experts

    NASA Astrophysics Data System (ADS)

    McIntosh, Tristan; Mulhearn, Tyler; Gibson, Carter; Mumford, Michael D.; Yammarino, Francis J.; Connelly, Shane; Day, Eric A.; Vessey, William B.

    2016-12-01

    Planning is critical to organizations, especially those involved in pursuing technological, scientific, and innovative ventures. Examination of planning processes is particularly important in high-stakes and high-risk environments. In the present study, to highlight the significance of planning in the context of long-duration space missions, 11 current and former National Aeronautics and Space Administration (NASA) personnel were interviewed to gain a better understanding of astronaut and Mission Control leadership in preparing for and carrying out space missions. Interviewees focused their responses on perceptions of leadership and thoughts on how long-duration spaceflight leadership should differ from current and short-term spaceflight. Notes from these interviews were content coded and qualitatively analyzed. We found that cognitive planning skills and case-based reasoning were among the variables rated most highly as critical to the success of long-duration space missions. Moreover, qualitative analyses revealed new considerations for long-duration space missions, such as granting greater autonomy to crewmembers and the need for more near-term forecasting. The implications of these findings for understanding the planning processes and necessary characteristics of individuals tasked with planning are discussed.

  6. Characterization of Three L-Asparaginases from Maritime Pine (Pinus pinaster Ait.)

    PubMed Central

    Van Kerckhoven, Sonia H.; de la Torre, Fernando N.; Cañas, Rafael A.; Avila, Concepción; Cantón, Francisco R.; Cánovas, Francisco M.

    2017-01-01

    Asparaginases (ASPG, EC 3.5.1.1) catalyze the hydrolysis of the amide group of L-asparagine, producing L-aspartate and ammonium. Three ASPG, PpASPG1, PpASPG2, and PpASPG3, have been identified in the transcriptome of maritime pine (Pinus pinaster Ait.) and were transiently expressed in Nicotiana benthamiana by agroinfection. The three recombinant proteins were processed in planta into active enzymes, and all mature forms exhibited dual asparaginase/isoaspartyl dipeptidase activity, but only PpASPG1 was able to catalyze L-asparagine hydrolysis efficiently. PpASPG1 contains a variable region of 77 amino acids that is critical for proteolytic processing of the precursor and is retained in the mature enzyme. Furthermore, the functional analysis of deletion mutants demonstrated that this protein fragment is required for specific recognition of the substrate and favors enzyme stability. Potassium has a limited effect on the activation of maritime pine ASPG, which is consistent with the lack of a critical residue essential for cation interaction. Taken together, the results presented here highlight the specific features of ASPG from conifers compared to the enzymes from angiosperms. PMID:28690619

  7. Delayed clarification: information, clarification and ethical decisions in critical care in Norway.

    PubMed

    Bunch, E H

    2000-12-01

    Following the analysis of about 150 hours of field observation in a critical care unit in Norway, a theory was generated to explain the actual ethical decision-making process. This was grounded in the empirical reality of physicians, nurses and family. The core theme in this study was delayed clarification in assessing the prognosis of accident victims with neurosurgical traumas. The physicians, nurses and family had to wait for the clinical picture to clarify, during which time there was an exchange and emergence of information. Exchanging information, a subprocess of delayed clarification, involved a continuous flow of collecting and dispersing information about the clinical status of the patient. The nurses engaged in two useful strategies: grading information to the family when the patient's prognosis was poor, and providing grieving strategies for themselves, colleagues and family members. The core variable, delayed clarification, has three dimensions: clinical, psychological and ethical. The nurses participated in the decision-making process to discontinue treatment as passive participants; they did not engage in collegial deliberations with the physicians. Ethical dilemmas concerned end-of-life questions, resource allocation, and questions of justice and organ transplants.

  8. Can the Relationship Between Rapid Automatized Naming and Word Reading Be Explained by a Catastrophe? Empirical Evidence From Students With and Without Reading Difficulties.

    PubMed

    Sideridis, Georgios D; Simos, Panagiotis; Mouzaki, Angeliki; Stamovlasis, Dimitrios; Georgiou, George K

    2018-05-01

    The purpose of the present study was to explain the moderating role of rapid automatized naming (RAN) in word reading with a cusp catastrophe model. We hypothesized that increases in RAN performance speed beyond a critical point would be associated with the disruption of word reading, consistent with a "generic shutdown" hypothesis. Participants were 587 elementary schoolchildren (Grades 2-4), among whom 87 had reading comprehension difficulties per the IQ-achievement discrepancy criterion. Data were analyzed via a cusp catastrophe model derived from nonlinear dynamical systems theory. Results indicated that for children with reading comprehension difficulties, as naming speed falls below a critical level, the association between core reading processes (word recognition and decoding) becomes chaotic and unpredictable. However, after the significant common variance attributed to motivation, emotional, and internalizing symptoms measures was partialed out of RAN scores, its role as a bifurcation variable was no longer evident. Taken together, these findings suggest that RAN represents a salient cognitive measure that may be associated with psychoemotional processes that are, at least in part, responsible for unpredictable and chaotic word reading behavior among children with reading comprehension deficits.
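    The canonical cusp catastrophe has the form dy/dt = alpha + beta*y - y**3: inside the bifurcation set 27*alpha**2 < 4*beta**3 two stable states coexist, so small parameter changes can produce discontinuous jumps in behavior. A small numerical sketch follows; mapping RAN speed onto the bifurcation factor is the paper's hypothesis, not part of the canonical model.

```python
import numpy as np

def cusp_equilibria(alpha, beta):
    """Equilibria of the cusp catastrophe dy/dt = alpha + beta*y - y**3,
    i.e. the real roots of y**3 - beta*y - alpha = 0.
    alpha: asymmetry (normal) factor; beta: bifurcation factor."""
    roots = np.roots([1.0, 0.0, -beta, -alpha])
    return np.sort(roots[np.abs(roots.imag) < 1e-6].real)

def is_bimodal(alpha, beta):
    """True inside the bifurcation set, where two stable equilibria coexist
    and the behavioral variable can jump discontinuously between them."""
    return 27 * alpha**2 < 4 * beta**3
```

    Outside the bifurcation set the cubic has a single real root and behavior varies smoothly with the control parameters; crossing into it creates the bistability that underlies catastrophe-style "shutdown" effects.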

  9. Linking Nurse Leadership and Work Characteristics to Nurse Burnout and Engagement.

    PubMed

    Lewis, Heather Smith; Cunningham, Christopher J L

    2016-01-01

    Burnout and engagement are critical conditions affecting patient safety and the functioning of healthcare organizations; the areas-of-worklife model suggests that work environment characteristics may impact employee burnout and general worklife quality. The purpose was to present and test a conditional process model linking perceived transformational nurse leadership to nurse staff burnout and engagement via important work environment characteristics. Working nurses (N = 120) provided perceptions of the core study variables via an Internet- or paper-based survey. The hypothesized model was tested using the PROCESS analysis tool, which enables simultaneous testing of multiple, parallel indirect effects within the SPSS statistical package. Findings support the areas-of-worklife model and suggest that transformational leadership is strongly associated with work environment characteristics that are further linked to nurse burnout and engagement. Interestingly, different work characteristics appear to be critical channels through which transformational leadership impacts nurse burnout and engagement. There are several methodological and practical implications of this work for researchers and practitioners interested in preventing burnout and promoting occupational health within healthcare organizations. These implications are tied to the connections observed between transformational leadership, specific work environment characteristics, and burnout and engagement outcomes.
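    The parallel indirect effects that tools like PROCESS test can be sketched with an ordinary bootstrap over two mediator regressions; the variable roles below and the synthetic data used to exercise the function are illustrative assumptions, not the study's measures.

```python
import numpy as np

def parallel_indirect_effects(x, m1, m2, y, n_boot=2000, seed=0):
    """Bootstrap the two indirect effects a1*b1 and a2*b2 of a parallel
    two-mediator model (x -> m1 -> y and x -> m2 -> y).
    Returns bootstrap means and percentile 95% confidence intervals."""
    rng = np.random.default_rng(seed)
    x, m1, m2, y = map(np.asarray, (x, m1, m2, y))

    def paths(ix):
        X, M1, M2, Y = x[ix], m1[ix], m2[ix], y[ix]
        a1 = np.polyfit(X, M1, 1)[0]              # a-path: x -> m1
        a2 = np.polyfit(X, M2, 1)[0]              # a-path: x -> m2
        design = np.column_stack([np.ones_like(X), X, M1, M2])
        coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
        return a1 * coef[2], a2 * coef[3]         # a1*b1, a2*b2

    idx = np.arange(len(x))
    boots = np.array([paths(rng.choice(idx, len(idx))) for _ in range(n_boot)])
    return boots.mean(axis=0), np.percentile(boots, [2.5, 97.5], axis=0)
```

    An indirect effect is considered supported when its percentile confidence interval excludes zero, which is the same decision rule PROCESS-style bootstrapping applies.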

  10. Characterizing hyporheic exchange processes using high-frequency electrical conductivity-discharge relationships on subhourly to interannual timescales

    NASA Astrophysics Data System (ADS)

    Singley, Joel G.; Wlostowski, Adam N.; Bergstrom, Anna J.; Sokol, Eric R.; Torrens, Christa L.; Jaros, Chris; Wilson, Colleen E.; Hendrickson, Patrick J.; Gooseff, Michael N.

    2017-05-01

    Concentration-discharge (C-Q) relationships are often used to quantify source water contributions and biogeochemical processes occurring within catchments, especially during discrete hydrological events. Yet, the interpretation of C-Q hysteresis is often confounded by complexity of the critical zone, such as numerous source waters and hydrochemical nonstationarity. Consequently, researchers must often ignore important runoff pathways and geochemical sources/sinks, especially the hyporheic zone because it lacks a distinct hydrochemical signature. Such simplifications limit efforts to identify processes responsible for the transience of C-Q hysteresis over time. To address these limitations, we leverage the hydrologic simplicity and long-term, high-frequency Q and electrical conductivity (EC) data from streams in the McMurdo Dry Valleys, Antarctica. In this two end-member system, EC can serve as a proxy for the concentration of solutes derived from the hyporheic zone. We utilize a novel approach to decompose loops into subhysteretic EC-Q dynamics to identify individual mechanisms governing hysteresis across a wide range of timescales. We find that hydrologic and hydraulic processes govern EC response to diel and seasonal Q variability and that the effects of hyporheic mixing processes on C-Q transience differ in short and long streams. We also observe that variable hyporheic turnover rates govern EC-Q patterns at daily to interannual timescales. Last, subhysteretic analysis reveals a period of interannual freshening of glacial meltwater streams related to the effects of unsteady flow on hyporheic exchange. The subhysteretic analysis framework we introduce may be applied more broadly to constrain the processes controlling C-Q transience and advance understanding of catchment evolution.
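    One common way to quantify C-Q (here EC-Q) hysteresis is a limb-difference index: normalize C and Q over an event, then compare C on the rising and falling limbs at matched discharge levels. The sketch below assumes a single-peaked event with monotonic limbs and is a generic index, not the authors' subhysteretic decomposition.

```python
import numpy as np

def hysteresis_index(q, c, n_points=11):
    """Normalized C-Q hysteresis index for one single-peaked event with
    monotonic rising and falling limbs. Positive values indicate clockwise
    hysteresis (higher C on the rising limb); negative, anticlockwise."""
    q = np.asarray(q, float)
    c = np.asarray(c, float)
    qn = (q - q.min()) / (q.max() - q.min())
    cn = (c - c.min()) / (c.max() - c.min())
    peak = q.argmax()
    levels = np.linspace(0.05, 0.95, n_points)
    # interpolate normalized C on each limb at common discharge levels
    rise = np.interp(levels, qn[:peak + 1], cn[:peak + 1])
    fall = np.interp(levels, qn[peak:][::-1], cn[peak:][::-1])
    return float(np.mean(rise - fall))
```

    With a solute sourced from hyporheic storage, the sign and magnitude of this index over successive events give a compact summary of how exchange and turnover modulate the loop direction.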

  11. Haemodialysis, nutritional disorders and hypoglycaemia in critical care.

    PubMed

    Crespo, Jeiel Carlos Lamonica; Gomes, Vanessa Rossato; Barbosa, Ricardo Luís; Padilha, Katia Grillo; Secoli, Silvia Regina

    2017-03-09

    This study aimed to determine the incidence of hypoglycaemia and its associated factors in critically ill patients. It examined a retrospective cohort of 106 critically ill adult patients with 48 hours of glycaemic control and 72 hours of follow-up. The dependent variable, hypoglycaemia (≤70 mg/dl), was assessed with respect to the independent variables: age, diet, insulin, catecholamines, haemodialysis, nursing workload and the Simplified Acute Physiology Score. Statistical analysis was performed using Student's t-test, Fisher's exact test and logistic regression at a 5% significance level. The incidence of hypoglycaemia was 14.2%. Hypoglycaemia was higher in the group of patients on catecholamines (p=0.040), with higher glycaemic variability (p<0.001) and death in the intensive care unit (p=0.008). Risk factors identified were absence of oral diet (OR 5.11; 95% CI 1.04-25.10) and haemodialysis (OR 4.28; 95% CI 1.16-15.76). Patients on haemodialysis and with no oral diet should have their glycaemic control intensified in order to prevent and/or manage hypoglycaemic episodes.
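    The reported risk factors are odds ratios with 95% confidence intervals. For a single binary exposure, the unadjusted version can be computed from a 2x2 table with the Woolf method; the study itself used multivariable logistic regression, so this sketch covers only the univariate case.

```python
import math

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 table with a Woolf (log-based)
    95% confidence interval. All four cell counts must be nonzero."""
    a, b, c, d = exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)
```

    A confidence interval whose lower bound stays above 1 (as for haemodialysis and absence of oral diet here) is what marks an exposure as a statistically significant risk factor.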

  12. [Competence of triage nurses in hospital emergency departments].

    PubMed

    Martínez-Segura, Estrella; Lleixà-Fortuño, Mar; Salvadó-Usach, Teresa; Solà-Miravete, Elena; Adell-Lleixà, Mireia; Chanovas-Borrás, Manel R; March-Pallarés, Gemma; Mora-López, Gerard

    2017-06-01

    To identify associations between sociodemographic variables and the competence levels of triage nurses in hospital emergency departments. Descriptive, cross-sectional, multicenter study of triage nurses in hospital emergency departments in the southwestern area of Catalonia (Ebre River territory). We used an instrument for evaluating competencies (the COM_VA questionnaire) and recorded sociodemographic variables (age, sex, total work experience, emergency department experience, training in critical patient care and triage) and perceived confidence when performing triage. We then analyzed the association between these variables and competency scores. Competency scores on the COM_VA questionnaire were significantly higher in nurses with training in critical patient care (P=.001) and triage (P=.002) and in those with longer emergency department experience (P<.0001). Perceived confidence when performing triage increased with competency score (P<.0001), training in critical patient care (P<.0001) and training in triage (P=.045). The competence of triage nurses and their perceived confidence when performing triage increase with emergency department experience and training.

  13. Learning Additional Languages as Hierarchical Probabilistic Inference: Insights From First Language Processing.

    PubMed

    Pajak, Bozena; Fine, Alex B; Kleinschmidt, Dave F; Jaeger, T Florian

    2016-12-01

    We present a framework of second and additional language (L2/Ln) acquisition motivated by recent work on socio-indexical knowledge in first language (L1) processing. The distribution of linguistic categories covaries with socio-indexical variables (e.g., talker identity, gender, dialects). We summarize evidence that implicit probabilistic knowledge of this covariance is critical to L1 processing, and propose that L2/Ln learning uses the same type of socio-indexical information to probabilistically infer latent hierarchical structure over previously learned and new languages. This structure guides the acquisition of new languages based on their inferred place within that hierarchy, and is itself continuously revised based on new input from any language. This proposal unifies L1 processing and L2/Ln acquisition as probabilistic inference under uncertainty over socio-indexical structure. It also offers a new perspective on crosslinguistic influences during L2/Ln learning, accommodating gradient and continued transfer (both negative and positive) from previously learned to novel languages, and vice versa.

  14. Learning Additional Languages as Hierarchical Probabilistic Inference: Insights From First Language Processing

    PubMed Central

    Pajak, Bozena; Fine, Alex B.; Kleinschmidt, Dave F.; Jaeger, T. Florian

    2015-01-01

    We present a framework of second and additional language (L2/Ln) acquisition motivated by recent work on socio-indexical knowledge in first language (L1) processing. The distribution of linguistic categories covaries with socio-indexical variables (e.g., talker identity, gender, dialects). We summarize evidence that implicit probabilistic knowledge of this covariance is critical to L1 processing, and propose that L2/Ln learning uses the same type of socio-indexical information to probabilistically infer latent hierarchical structure over previously learned and new languages. This structure guides the acquisition of new languages based on their inferred place within that hierarchy, and is itself continuously revised based on new input from any language. This proposal unifies L1 processing and L2/Ln acquisition as probabilistic inference under uncertainty over socio-indexical structure. It also offers a new perspective on crosslinguistic influences during L2/Ln learning, accommodating gradient and continued transfer (both negative and positive) from previously learned to novel languages, and vice versa. PMID:28348442

  15. Tables of critical-flow functions and thermodynamic properties for methane and computational procedures for both methane and natural gas

    NASA Technical Reports Server (NTRS)

    Johnson, R. C.

    1972-01-01

    Procedures for calculating the mass flow rate of methane and natural gas through nozzles are given, along with the FORTRAN 4 subroutines used to make these calculations. Three sets of independent variables are permitted in these routines. In addition to the plenum pressure and temperature, the third independent variable is either nozzle exit pressure, Mach number, or temperature. A critical-flow factor, which provides a convenient means of determining the mass flow rate of methane through critical-flow nozzles, is tabulated. Other tables are included for nozzle throat velocity and critical pressure, density, and temperature ratios, along with some thermodynamic properties of methane, including compressibility factor, enthalpy, entropy, specific heat, specific-heat ratio, and speed of sound. These tabulations cover a temperature range from 120 to 600 K and pressures to 3 million N/sq m.
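    The report's procedures rest on the standard choked-flow relation, in which a critical-flow factor C* scales the one-dimensional mass flow rate: m_dot = C* A p0 / sqrt(R T0). The sketch below uses the textbook ideal-gas form of C*, not the report's FORTRAN subroutines or its real-gas methane tables; the gamma, throat area, and gas-constant values are illustrative only.

```python
import math

def ideal_cstar(gamma):
    """Ideal-gas critical-flow factor:
    C* = sqrt(gamma) * (2 / (gamma + 1)) ** ((gamma + 1) / (2 * (gamma - 1)))."""
    return math.sqrt(gamma) * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

def critical_mass_flow(c_star, throat_area, p0, t0, r_gas):
    """Choked (critical) mass flow rate through a nozzle: m_dot = C* A p0 / sqrt(R T0).

    throat_area in m^2, p0 in Pa (plenum pressure), t0 in K (plenum
    temperature), r_gas in J/(kg K). For real methane at high pressure,
    C* would instead be read from tables such as those in this report.
    """
    return c_star * throat_area * p0 / math.sqrt(r_gas * t0)
```

For air-like gamma = 1.4 the ideal factor is about 0.685; deviations of the tabulated real-gas values from this ideal form are exactly what makes the report's tables useful.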

  16. Soil carbon cycling proxies: Understanding their critical role in predicting climate change feedbacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, Vanessa L.; Bond-Lamberty, Ben; DeAngelis, Kristen

    The complexity of processes and interactions that drive soil C dynamics necessitates the use of proxy variables to represent soil characteristics that cannot be directly measured (correlative proxies), or that aggregate information about multiple soil characteristics into one variable (integrative proxies). These proxies have proven useful for understanding the soil C cycle, which is highly variable in both space and time, and are now being used to make predictions of C fate and persistence under future climate scenarios. As these proxies are used at increasingly larger scales, the C pools and processes that proxies represent must be thoughtfully considered in order to minimize uncertainties in empirical understanding, as well as in model parameters and in model outcomes. The importance of these uncertainties is further amplified by the current need to make predictions of the C cycle for the non-steady-state environmental conditions resulting from global climate change. To clarify the appropriate uses of proxy variables, we provide specific examples of proxy variables that could improve decision making, adaptation choices, and modeling skill, while not foreclosing on, and indeed encouraging, continued work on their mechanistic underpinnings. We explore the use of three common soil proxies used to study soil organic matter: metabolic quotient, clay content, and physical fractionation. We also consider emerging data types, specifically genome-sequence data, and how these serve as proxies for microbial community activities. We opine that the demand for increasing mechanistic detail, and the flood of data from new imaging and genetic techniques, do not replace the value of correlative and integrative proxies: variables that are simpler, easier, or cheaper to measure. By closely examining the current knowledge gaps and broad assumptions in soil C cycling with the proxies already in use, we can develop new hypotheses and specify criteria for new and needed proxies.

  17. Impact of Vial Capping on Residual Seal Force and Container Closure Integrity.

    PubMed

    Mathaes, Roman; Mahler, Hanns-Christian; Roggo, Yves; Ovadia, Robert; Lam, Philippe; Stauch, Oliver; Vogt, Martin; Roehl, Holger; Huwyler, Joerg; Mohl, Silke; Streubel, Alexander

    2016-01-01

    The vial capping process is a critical unit operation during drug product manufacturing, as it could possibly generate cosmetic defects or even affect container closure integrity. Yet there is significant variability in capping equipment and processes, and their relation to potential defects or container closure integrity has not been thoroughly studied. In this study we applied several methods (a residual seal force tester, a self-developed piezo force sensor measurement system, and computed tomography) to characterize different container closure system combinations that had been sealed using different capping process parameter settings. Additionally, container closure integrity of these samples was measured using helium leakage (physical container closure integrity) and compared to the characterization data. The different capping equipment settings led to residual seal force values from 7 to 115 N. High residual seal force values were achieved with high capping pre-compression force and a short distance between the capping plate and the plunger. The choice of container closure system influenced the obtained residual seal force values. The residual seal force tester and the piezoelectric measurements showed similar trends. All vials passed physical container closure integrity testing, and no stopper rupture was seen with any of the settings applied, suggesting that container closure integrity was warranted for the studied container closure system within the chosen capping setting ranges. The residual seal force tester can analyze a variety of different container closure systems independent of the capping equipment. An adequate and safe residual seal force range for each container closure system configuration can be established with the residual seal force tester and additional methods such as computed tomography scans and leak testing. In the residual seal force range studied, the physical container closure integrity of the container closure system was warranted. © PDA, Inc. 2016.

  18. Prior exposure to hyperglycaemia attenuates the relationship between glycaemic variability during critical illness and mortality.

    PubMed

    Plummer, Mark P; Finnis, Mark E; Horsfall, Matthew; Ly, Marleesa; Kar, Palash; Abdelhamid, Yasmine Ali; Deane, Adam M

    2016-09-01

    Our primary objective was to determine the impact of prior exposure to hyperglycaemia on the association between glycaemic variability during critical illness and mortality. Our secondary objectives included evaluating the relationships between prior hyperglycaemia and hyperglycaemia or hypoglycaemia during critical illness and mortality. A single-centre, retrospective, observational study in a tertiary intensive care unit. Patients admitted to the ICU between 1 September 2011 and 30 June 2015, with diabetes recorded using ICD-10-AM coding or a glycated haemoglobin (HbA1c) level of ≥ 6.5%, were stratified by prior hyperglycaemic level (HbA1c < 6.5%, 6.5%-8.5%, or > 8.5%). Glycaemic variability was assessed as the blood glucose coefficient of variation during the patient's stay in the ICU. Multivariate logistic regression and marginal predictive plots were used to assess the impact of prior hyperglycaemia on the association between glycaemic metrics and mortality. We studied 1569 patients with diabetes (HbA1c < 6.5%, n = 495; HbA1c 6.5%-8.5%, n = 731; and HbA1c > 8.5%, n = 343). Glycaemic variability was strongly associated with hospital mortality (P = 0.001), but this association showed a significant interaction with prior hyperglycaemia (P = 0.011), such that for patients with HbA1c > 8.5%, increasing glycaemic variability was not associated with increased mortality. Acute hyperglycaemia was strongly associated with mortality (P < 0.0001) and also showed a significant interaction with prior hyperglycaemia (P = 0.001), such that for patients with HbA1c > 8.5%, acute hyperglycaemia was not associated with mortality. Hypoglycaemia was also associated with mortality (P < 0.0001), but prior exposure to hyperglycaemia had a lesser effect on this relationship. Prior exposure to hyperglycaemia attenuates the association between glycaemic variability and mortality in critically ill patients with diabetes.
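    The glycaemic variability metric this study uses, the blood glucose coefficient of variation over the ICU stay, is simple to compute. A minimal sketch (the readings below are invented, not patient data):

```python
import statistics

def glycaemic_variability(glucose_readings):
    """Glycaemic variability as the coefficient of variation (%) of a
    patient's blood glucose readings: 100 * SD / mean."""
    mean = statistics.mean(glucose_readings)
    return 100.0 * statistics.stdev(glucose_readings) / mean
```

Because the CV is normalised by the mean, it separates the swings in glucose (variability) from the overall level (acute hyperglycaemia), which is why the two metrics can be analysed as distinct predictors of mortality.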

  19. Elucidating Critical Zone Process Interactions with an Integrated Hydrology Model in a Headwaters Research Catchment

    NASA Astrophysics Data System (ADS)

    Collins, C.; Maxwell, R. M.

    2017-12-01

    Providence Creek (P300) watershed is an alpine headwaters catchment located at the Southern Sierra Critical Zone Observatory (SSCZO). Evidence of groundwater-dependent vegetation and drought-induced tree mortality at P300, along with the effect of subsurface characterization on mountain ecohydrology, motivates this study. A hyper-resolution integrated hydrology model of this site, along with extensive instrumentation, provides an opportunity to study the effects of lateral groundwater flow on vegetation's tolerance to drought. ParFlow-CLM is a fully integrated surface-subsurface model that is driven with reconstructed meteorology, such as the North American Land Data Assimilation System project phase 2 (NLDAS-2) dataset. However, large-scale data products mute orographic effects on climate at smaller scales. Climate variables often do not behave uniformly in highly heterogeneous mountain regions. Therefore, forcing physically based integrated hydrologic models, especially of mountain headwaters catchments, with a large-scale data product is a major challenge. Obtaining reliable observations in complex terrain is challenging, and while climate data products likewise introduce uncertainties, documented discrepancies between several data products and P300 observations suggest these data products may suffice. To tackle these issues, a suite of simulations was run to parse out (1) the effects of climate data source (data products versus observations) and (2) the effects of climate data spatial variability. One tool for evaluating the effect of climate data on model outputs is the relationship of latent heat flux (LH) and evapotranspiration (ET) partitioning with water table depth (WTD). This zone of LH sensitivity to WTD is referred to as the "critical zone." Preliminary results suggest that these critical zone relationships are preserved regardless of the forcing used, albeit with significant shifts in magnitude. These results demonstrate that integrated hydrology models are sensitive to climate data, thereby impacting the accuracy of hydrologic modeling of headwaters catchments used for water management and planning purposes and for exploring the effects of climate change perturbations.

  20. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created by dedicated structures such as a static-random-access-memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact-model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to insufficient N-well-to-P-well spacing. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, analysis of a large area of a high-density standard cell library was done. Another set of monitoring, focused on a high-density SRAM array, is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as a feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  1. Climate reddening increases the chance of critical transitions

    NASA Astrophysics Data System (ADS)

    van der Bolt, Bregje; van Nes, Egbert H.; Bathiany, Sebastian; Vollebregt, Marlies E.; Scheffer, Marten

    2018-06-01

    Climate change research often focuses on trends in the mean and variance. However, analyses of palaeoclimatic and contemporary dynamics reveal that climate memory — as measured for instance by temporal autocorrelation — may also change substantially over time. Here, we show that elevated temporal autocorrelation in climatic variables should be expected to increase the chance of critical transitions in climate-sensitive systems with tipping points. We demonstrate that this prediction is consistent with evidence from forests, coral reefs, poverty traps, violent conflict and ice sheet instability. In each example, the duration of anomalous dry or warm events elevates chances of invoking a critical transition. Understanding the effects of climate variability thus requires research not only on variance, but also on climate memory.
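    The "reddening" in this record is elevated temporal autocorrelation. Why that lengthens anomalous events can be seen with an AR(1) process, the standard minimal model of red noise. This sketch is illustrative only, not the authors' analysis; the lag-1 autocorrelation serves as the memory measure and a zero threshold stands in for an anomaly criterion:

```python
import random

def ar1_series(phi, n, seed=0):
    """AR(1) ('red') noise: x[t] = phi * x[t-1] + e[t].
    Higher phi means more temporal autocorrelation, i.e. more climate memory."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def lag1_autocorr(xs):
    """Sample lag-1 autocorrelation of a series."""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((v - mean) ** 2 for v in xs)
    return num / den

def longest_anomaly(xs, threshold=0.0):
    """Length of the longest consecutive run above the threshold: a simple
    stand-in for the duration of anomalous dry or warm events."""
    best = run = 0
    for v in xs:
        run = run + 1 if v > threshold else 0
        best = max(best, run)
    return best
```

Comparing a white series (phi = 0) with a red one (phi = 0.9) of equal length shows markedly longer excursions above the threshold in the red series, the mechanism by which reddening pushes climate-sensitive systems past tipping points.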

  2. Statistical relations of salt and selenium loads to geospatial characteristics of corresponding subbasins of the Colorado and Gunnison Rivers in Colorado

    USGS Publications Warehouse

    Leib, Kenneth J.; Linard, Joshua I.; Williams, Cory A.

    2012-01-01

    Elevated loads of salt and selenium can impair the quality of water for both anthropogenic and natural uses. Understanding the environmental processes controlling how salt and selenium are introduced to streams is critical to managing and mitigating the effects of elevated loads. Dominant relations between salt and selenium loads and environmental characteristics can be established by using geospatial data. The U.S. Geological Survey, in cooperation with the Bureau of Reclamation, investigated statistical relations between seasonal salt or selenium loads emanating from the Upper Colorado River Basin and geospatial data. Salt and selenium loads measured during the irrigation and nonirrigation seasons were related to geospatial variables for 168 subbasins within the Gunnison and Colorado River Basins. These geospatial variables represented subbasin characteristics of the physical environment, precipitation, geology, land use, and the irrigation network. All subbasin variables with units of area had statistically significant relations with load. The few variables that were not in units of area but were statistically significant helped to identify types of geospatial data that might influence salt and selenium loading. Following a stepwise approach, combinations of these statistically significant variables were used to develop multiple linear regression models. The models can be used to help prioritize areas where salt and selenium control projects might be most effective.
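    The stepwise model-building approach described here can be sketched generically. Below is a hedged illustration of forward stepwise selection by R² gain, a simplified stand-in for the study's significance-based stepwise regression; the function names and the min_gain stopping threshold are invented for illustration:

```python
import numpy as np

def fit_r2(X, y):
    """R^2 of an ordinary-least-squares fit with an intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

def forward_stepwise(X, y, min_gain=0.01):
    """Greedy forward selection: repeatedly add the predictor column that
    most improves R^2, stopping when the best gain falls below min_gain."""
    selected, best_r2 = [], 0.0
    while len(selected) < X.shape[1]:
        candidates = [(fit_r2(X[:, selected + [k]], y), k)
                      for k in range(X.shape[1]) if k not in selected]
        r2, j = max(candidates)
        if r2 - best_r2 < min_gain:
            break
        selected.append(j)
        best_r2 = r2
    return selected, best_r2
```

In the study's setting the columns of X would be the subbasin geospatial variables and y the seasonal salt or selenium load; a published stepwise procedure would typically use partial F-tests or p-values rather than a fixed R² gain.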

  3. Variability in Cell Response of Cronobacter sakazakii after Mild-Heat Treatments and Its Impact on Food Safety

    PubMed Central

    Parra-Flores, Julio; Juneja, Vijay; Garcia de Fernando, Gonzalo; Aguirre, Juan

    2016-01-01

    Cronobacter spp. have been responsible for severe infections in infants associated with consumption of powdered infant formula and follow-up formulae. Despite several risk assessments described in published studies, few approaches have considered the tremendous variability in cell response that small micropopulations or single cells can have in infant formula during storage, preparation, or post-process/preparation before the feeding of infants. Stochastic approaches can better describe microbial single-cell response than deterministic models, as we show in this study. A large variability of lag phase was observed in single cells and micropopulations of ≤50 cells. This variability increased as the heat shock increased and the growth temperature decreased. Variability in the growth of individual Cronobacter sakazakii cells is clearly affected by inoculum size and growth temperature, and the probability that cells are able to grow under the imposed experimental conditions should be taken into account, especially when errors in bottle-preparation practices, such as improper holding temperatures or manipulation, may lead to growth of the pathogen to a critical cell level. The mean probability of illness from an initial inoculum size of 1 cell was below 0.2 in all the cases, while for an inoculum size of 50 cells the mean probability of illness was, in most of the cases, above 0.7. PMID:27148223
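    One reason inoculum size matters so much, as the abstract's 1-cell versus 50-cell contrast suggests, is that the chance of at least one cell resuming growth rises quickly with cell number. A minimal sketch under the simplifying assumption of independent single-cell outcomes (this is not the authors' stochastic model, and the probability value below is illustrative):

```python
def p_any_cell_grows(p_single, n_cells):
    """Probability that at least one cell of an n-cell inoculum resumes
    growth, assuming independent single-cell outcomes: 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p_single) ** n_cells
```

Even a modest single-cell growth probability becomes near-certain growth for a 50-cell micropopulation, which is consistent with the sharp jump in mean probability of illness between the two inoculum sizes.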

  4. Preliminary Design of Critical Function Monitoring System of PGSFR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-07-01

    A PGSFR (Prototype Gen-IV Sodium-cooled Fast Reactor) is under development at the Korea Atomic Energy Research Institute. A critical function monitoring system (CFMS) of the PGSFR is preliminarily studied. The functions of the CFMS are to display critical plant variables related to the safety of the plant during normal and accident conditions and to guide the operators' corrective actions to keep the plant in a safe condition and mitigate the consequences of accidents. The minimal critical functions of the PGSFR are composed of reactivity control, reactor core cooling, reactor coolant system integrity, primary heat transfer system (PHTS) heat removal, sodium water reaction mitigation, radiation control, and containment conditions. The variables and alarm legs of each critical function of the PGSFR are as follows: - Reactivity control: The variables of the reactivity control function are power range neutron flux instrumentation, intermediate range neutron flux instrumentation, source range neutron flux instrumentation, and control rod bottom contacts. The alarm leg to display the reactivity controls consists of the status of control rod drop malfunction, high post-trip power, and thermal reactivity addition. - Reactor core cooling: The variables are PHTS sodium level, hot pool temperature of the PHTS, subassembly exit temperature, cold pool temperature of the PHTS, PHTS pump current, and PHTS pump breaker status. The alarm leg consists of high core delta temperature, low sodium level of the PHTS, high subassembly exit temperature, and low PHTS pump load. - Reactor coolant system integrity: The variables are PHTS sodium level, cover gas pressure, and safeguard vessel sodium level. The alarm leg is composed of low sodium level of the PHTS, high cover gas pressure, and high sodium level of the safeguard vessel.
    - PHTS heat removal: The variables are PHTS sodium level, hot pool temperature of the PHTS, core exit temperature, cold pool temperature of the PHTS, flow rate of the passive residual heat removal system, flow rate of the active residual heat removal system, and air heat exchanger temperatures of the residual heat removal systems. The alarm legs are composed of two legs, 'passive residual heat removal system not cooling' and 'active residual heat removal system not cooling'. - Sodium water reaction mitigation: The variables are intermediate heat transfer system (IHTS) pressure, pressure and temperature and level of the sodium dump tank, the status of the rupture disk, hydrogen concentration in the IHTS, and a direct sodium-water-reaction measurement. The alarm leg consists of high IHTS pressure, the status of the sodium water reaction mitigation system, and the indication of the direct measurement. - Radiation control: The variables are radiation of the PHTS, radiation of the IHTS, and radiation of containment purge. The alarm leg is composed of high radiation of the PHTS and IHTS, and the containment purge system. - Containment condition: The variables are containment pressure, containment isolation status, and sodium fire. The alarm leg consists of high containment pressure, status of containment isolation, and status of sodium fire. (authors)

  5. Certification Processes for Safety-Critical and Mission-Critical Aerospace Software

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy

    2003-01-01

    This document is a quick reference guide with an overview of the processes required to certify safety-critical and mission-critical flight software at selected NASA centers and the FAA. Researchers and software developers can use this guide to jumpstart their understanding of how to get new or enhanced software onboard an aircraft or spacecraft. The introduction contains aerospace industry definitions of safety and safety-critical software, as well as the current rationale for certification of safety-critical software. The Standards for Safety-Critical Aerospace Software section lists and describes current standards, including NASA standards and RTCA DO-178B. The Mission-Critical versus Safety-Critical Software section explains the difference between two important classes of software: safety-critical software, involving the potential for loss of life due to software failure, and mission-critical software, involving the potential for aborting a mission due to software failure. The DO-178B Safety-Critical Certification Requirements section describes the special processes and methods required to obtain a safety-critical certification for aerospace software flying on vehicles under the auspices of the FAA. The final two sections give an overview of the certification process used at Dryden Flight Research Center and the approval process at the Jet Propulsion Laboratory (JPL).

  6. Water and Nitrogen Limitations of Ecosystem Processes Across Three Dryland Plant Communities

    NASA Astrophysics Data System (ADS)

    Beltz, C.; Lauenroth, W. K.; Burke, I. C.

    2017-12-01

    The availability of water and nitrogen (N) plays a major role in controlling the distribution of ecosystem types and the rates of ecosystem processes across the globe. Both of these resources are being altered by human activity. Anthropogenic fixation of N has increased inputs into the biosphere from 0.5 kg N ha⁻¹ yr⁻¹ to upwards of 10 kg N ha⁻¹ yr⁻¹, while the amount and seasonality of precipitation are expected to continue to change. Within dryland environments, the relationships between increasingly available N and ecosystem processes are especially complex due to drylands' characteristic strong limitation by low and highly variable precipitation. Other experiments have shown that this interplay between N and water can cause temporally complex co-limitation and spatially complex responses with variable effects on ecosystems, such as effects on net primary productivity, soil respiration, and plant community composition. Research spanning multiple dryland plant communities is critical for generalizing findings to the 40% of the Earth's terrestrial surface covered in dryland ecosystems. Given IPCC projections in which both N availability and precipitation are altered, examining their interactive effect across multiple plant communities is critical to increasing our understanding of the limitations on ecosystem processes in drylands. We are studying a gradient of three plant communities representing a C4 grassland (shortgrass steppe), a C3/C4 grassland (mixed grass prairie), and a shrub-dominated ecosystem with C3 and C4 grasses (sagebrush steppe). We added two levels of N (10 kg N ha⁻¹ and 100 kg N ha⁻¹) and increased summer monthly precipitation by 20%. Sites responded differently to treatments, with the scale of effect varying by treatment. The high-level nitrogen addition increased soil N availability and soil respiration, while decreasing soil carbon in the labile pool in the upper soil layers. These results will allow for better understanding of increased N in combination with altered water availability across different plant communities and ecosystems, particularly helping to close the gap in knowledge on the effects of low-level, chronic N addition in drylands.

  7. Constructing critical thinking in health professional education.

    PubMed

    Kahlke, Renate; Eva, Kevin

    2018-04-04

    Calls for enabling 'critical thinking' are ubiquitous in health professional education. However, there is little agreement in the literature or in practice as to what this term means and efforts to generate a universal definition have found limited traction. Moreover, the variability observed might suggest that multiplicity has value that the quest for universal definitions has failed to capture. In this study, we sought to map the multiple conceptions of critical thinking in circulation in health professional education to understand the relationships and tensions between them. We used an inductive, qualitative approach to explore conceptions of critical thinking with educators from four health professions: medicine, nursing, pharmacy, and social work. Four participants from each profession participated in two individual in-depth semi-structured interviews, the latter of which induced reflection on a visual depiction of results generated from the first set of interviews. Three main conceptions of critical thinking were identified: biomedical, humanist, and social justice-oriented critical thinking. 'Biomedical critical thinking' was the dominant conception. While each conception had distinct features, the particular conceptions of critical thinking espoused by individual participants were not stable within or between interviews. Multiple conceptions of critical thinking likely offer educators the ability to express diverse beliefs about what 'good thinking' means in variable contexts. The findings suggest that any single definition of critical thinking in the health professions will be inherently contentious and, we argue, should be. Such debates, when made visible to educators and trainees, can be highly productive.

  8. Predictive model for early math skills based on structural equations.

    PubMed

    Aragón, Estíbaliz; Navarro, José I; Aguilar, Manuel; Cerda, Gamal; García-Sedeño, Manuel

    2016-12-01

    Early math skills are determined by higher cognitive processes that are particularly important for acquiring and developing skills during a child's early education. Such processes could be a critical target for identifying students at risk for math learning difficulties. Few studies have considered the use of a structural equation method to rationalize these relations. Participating in this study were 207 preschool students ages 59 to 72 months, 108 boys and 99 girls. Performance with respect to early math skills, early literacy, general intelligence, working memory, and short-term memory was assessed. A structural equation model explaining 64.3% of the variance in early math skills was applied. Early literacy exhibited the highest statistical significance (β = 0.443, p < 0.05), followed by intelligence (β = 0.286, p < 0.05), working memory (β = 0.220, p < 0.05), and short-term memory (β = 0.213, p < 0.05). Correlations between the independent variables were also significant (p < 0.05). According to the results, cognitive variables should be included in remedial intervention programs. © 2016 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
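    The β values reported here are standardized regression weights. A hedged sketch of how such weights are obtained, via ordinary least squares on z-scored variables (this is illustrative, not the study's structural equation model, and the simulated data below are invented):

```python
import numpy as np

def standardized_betas(X, y):
    """Standardized regression weights (betas): OLS on z-scored predictors
    and outcome. Each beta is the expected change, in SD units, of the
    outcome per one-SD change in a predictor, holding the others fixed."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    z_y = (y - y.mean()) / y.std()
    betas, *_ = np.linalg.lstsq(Z, z_y, rcond=None)
    return betas
```

With a single predictor, the standardized beta reduces to the Pearson correlation, which is why betas of this kind are directly comparable across predictors measured on different scales.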

  9. Predator-guided sampling reveals biotic structure in the bathypelagic.

    PubMed

    Benoit-Bird, Kelly J; Southall, Brandon L; Moline, Mark A

    2016-02-24

    We targeted a habitat used differentially by deep-diving, air-breathing predators to empirically sample their prey's distributions off southern California. Fine-scale measurements of the spatial variability of potential prey animals from the surface to 1,200 m were obtained using conventional fisheries echosounders aboard a surface ship and uniquely integrated into a deep-diving autonomous vehicle. Significant spatial variability in the size, composition, total biomass, and spatial organization of biota was evident over all spatial scales examined and was consistent with the general distribution patterns of foraging Cuvier's beaked whales (Ziphius cavirostris) observed in separate studies. Striking differences found in prey characteristics between regions at depth, however, did not reflect differences observed in surface layers. These differences in deep pelagic structure horizontally and relative to surface structure, absent clear physical differences, change our long-held views of this habitat as uniform. The revelation that animals deep in the water column are so spatially heterogeneous at scales from 10 m to 50 km critically affects our understanding of the processes driving predator-prey interactions, energy transfer, biogeochemical cycling, and other ecological processes in the deep sea, and the connections between the productive surface mixed layer and the deep-water column. © 2016 The Author(s).

  10. Coral bleaching pathways under the control of regional temperature variability

    NASA Astrophysics Data System (ADS)

    Langlais, C. E.; Lenton, A.; Heron, S. F.; Evenhuis, C.; Sen Gupta, A.; Brown, J. N.; Kuchinke, M.

    2017-11-01

    Increasing sea surface temperatures (SSTs) are predicted to adversely impact coral populations worldwide through increasing thermal bleaching events. Future bleaching is unlikely to be spatially uniform. Therefore, understanding what determines regional differences will be critical for adaptation management. Here, using a cumulative heat stress metric, we show that the characteristics of regional SST determine future bleaching risk patterns. Incorporating observed information on SST variability in assessing future bleaching risk provides novel options for management strategies. As a consequence, the known biases in climate model variability and the uncertainties in regional warming rate across climate models are less detrimental than previously thought. We also show that the thresholds used to indicate reef viability can strongly influence a decision on what constitutes a potential refugium. Observing and understanding the drivers of regional variability, and the viability limits of coral reefs, is therefore critical for making meaningful projections of coral bleaching risk.

  11. Just tell me what to do: bringing back experimenter control in active contingency tasks with the command-performance procedure and finding cue density effects along the way.

    PubMed

    Hannah, Samuel D; Beneteau, Jennifer L

    2009-03-01

    Active contingency tasks, such as those used to explore judgments of control, suffer from variability in the actual values of critical variables. The authors debut a new, easily implemented procedure that restores control over these variables to the experimenter simply by telling participants when to respond and when to withhold responding. This command-performance procedure not only restores control over critical variables such as actual contingency, it also allows response frequency to be manipulated independently of contingency or outcome frequency. This yields the first demonstration, to our knowledge, of the equivalent of a cue density effect in an active contingency task. Judgments of control are biased by response frequency, just as they are also biased by outcome frequency. (c) 2009 APA, all rights reserved
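    The "actual contingency" that the command-performance procedure holds fixed is conventionally measured as ΔP, the difference between the outcome probability given a response and given no response. A minimal sketch (illustrative; the trial data below are invented):

```python
def delta_p(outcomes_when_responding, outcomes_when_withholding):
    """Actual contingency: Delta-P = P(outcome | response) - P(outcome | no response).
    Each argument is a list of 0/1 outcome flags for the respective trials."""
    p_resp = sum(outcomes_when_responding) / len(outcomes_when_responding)
    p_withhold = sum(outcomes_when_withholding) / len(outcomes_when_withholding)
    return p_resp - p_withhold
```

When participants choose their own response rate, the trial counts in the two lists (and hence the achievable ΔP values) drift out of the experimenter's control; commanding when to respond fixes both counts by design.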

  12. Controls on the Environmental Fate of Compounds Controlled by Coupled Hydrologic and Reactive Processes

    NASA Astrophysics Data System (ADS)

    Hixson, J.; Ward, A. S.; McConville, M.; Remucal, C.

    2017-12-01

    How compounds interact with hydrologic processes or with reactive processes is individually well established. However, the environmental fate of compounds that interact with hydrologic AND reactive processes is not well known, yet it is critical for evaluating environmental risk. Evaluations of risk are often simplified to homogenize processes in space and time and to assess processes independently of one another. However, we know that spatial heterogeneity and time-variable reactivities complicate predictions of environmental transport and fate, and that the interaction of these processes complicates predictions further, limiting our ability to accurately predict risk. Compounds that interact with both systems, such as photolytic compounds, require that both components are fully understood in order to predict transport and fate. Release of photolytic compounds occurs through both unintentional releases and intentional loadings. Evaluating risks associated with unintentional releases and implementing best management practices for intentional releases require an in-depth understanding of the sensitivity of photolytic compounds to external controls. Lampricides, such as 3-trifluoromethyl-4-nitrophenol (TFM), are broadly applied in the Great Lakes system to control the population of invasive sea lamprey. Over-dosing can yield fish kills and other detrimental impacts. Still, planning accounts for time of passage and dilution, but not for the interaction of the physical and chemical systems (i.e., storage in the hyporheic zone and time-variable decay rates). In this study, we model a series of TFM applications to test the efficacy of dosing as a function of system characteristics. Overall, our results demonstrate the complexity associated with the transport of photo-sensitive compounds through stream-hyporheic systems, and highlight the need to better understand how physical and chemical systems interact to control transport and fate in the environment.

  13. Promoting group empowerment and self-reliance through participatory research: a case study of people with physical disability.

    PubMed

    Stewart, R; Bhagwanjee, A

    1999-07-01

    Despite the growing popularity of the empowerment construct among social scientists, relatively few empowerment studies involving groupwork with people with physical disabilities exist. This article accordingly describes and analyses the organic development of the empowerment process within a spinal cord injury self-help group, set against the backdrop of policy imperatives for disability in post-apartheid South Africa. The treatise on the group empowerment process is located within the context of a group evaluation conducted within a participatory research framework. Key variables informing the research approach included: quality of participation, control over resources and decision-making, shift in critical consciousness and understanding, malleability of roles within the group and role of the health professional. Group members assumed ownership of group management and decision-making and shifted from a professionally-led to a peer-led self-help group. Group objectives changed from providing mutual support to community education and outreach activities. The role of the health professional shifted from group facilitator to invited consultant. This case study demonstrates how group participation, promoted by a critically informed therapeutic and research praxis, can unlock the inherent potential for self-reliance and empowerment of socially marginalized collectives. It offers important insights with regard to group process, participatory research and the role of the health professional in creating opportunities for empowerment and self-reliance of people with disability.

  14. Cultural context and a critical approach to eliminating health disparities.

    PubMed

    Griffith, Derek M; Johnson, Jonetta; Ellis, Katrina R; Schulz, Amy Jo

    2010-01-01

    The science of eliminating racial health disparities requires a clear understanding of the underlying social processes that drive persistent differences in health outcomes by self-identified race. Understanding these social processes requires analysis of cultural notions of race as these are instantiated in institutional policies and practices that ultimately contribute to health disparities. Racism provides a useful framework for understanding how social, political and economic factors directly and indirectly influence health outcomes. While it is important to capture how individuals are influenced by their psychological experience of prejudice and discrimination, racism is more than an intrapersonal or interpersonal variable. Considerable attention has focused on race-based residential segregation and other forms of institutional racism but less focus has been placed on how cultural values, frameworks and meanings shape institutional policies and practices. In this article, we highlight the intersection of cultural and institutional racism as a critical mechanism through which racial inequities in social determinants of health not only develop but persist. This distinction highlights and helps to explain processes and structures that contribute to racial disparities persisting across time and outcomes. Using two historical examples, the National Negro Health Movement and hospital desegregation during the Civil Rights Era, we identify key questions that an analysis of cultural racism might add to the more common focus on overt policy decisions and practices.

  15. Family centred care before and during life-sustaining treatment withdrawal in intensive care: A survey of information provided to families by Australasian critical care nurses.

    PubMed

    Ranse, Kristen; Bloomer, Melissa; Coombs, Maureen; Endacott, Ruth

    2016-11-01

    A core component of family-centred nursing care during the provision of end-of-life care in intensive care settings is information sharing with families. Yet little is known about information provided in these circumstances. To identify information most frequently given by critical care nurses to families in preparation for and during withdrawal of life-sustaining treatment. An online cross-sectional survey. During May 2015, critical care nurses in Australia and New Zealand were invited to complete the Preparing Families for Treatment Withdrawal questionnaire. Data analysis included descriptive statistics to identify areas of information most and least frequently shared with families. Cross tabulations with demographic data were used to explore any associations in the data. From the responses of 159 critical care nurses, information related to the emotional care and support of the family was most frequently provided to families in preparation for and during withdrawal of life-sustaining treatment. Variation was noted in the frequency of provision of information across body systems and their associated physical changes during the dying process. Significant associations (p<0.05) were identified between the variables gender, nursing experience and critical care experiences and some of the information items most and least frequently provided. The provision of information during end-of-life care reflects a family-centred care approach by critical care nurses with information pertaining to emotional care and support of the family paramount. The findings of this study provide a useful framework for the development of interventions to improve practice and support nurses in communicating with families at this time. Copyright © 2016 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.

  16. Interregional migration in socialist countries: the case of China.

    PubMed

    Wei, Y

    1997-03-01

    "This paper analyzes changing interregional migration in China and reveals that the recent eastward migration reverses patterns of migration under Mao. It finds that investment variables are more important than the conventional variables of income and job opportunities in determining China's recent interregional migration. It suggests that both state policy and the global force influence interregional migration, challenging the popular view that the socialist state is the only critical determinant. This paper also criticizes Mao's approach to interregional migration and discusses the impact of migration on development." excerpt

  17. Variability of DKA Management Among Pediatric Emergency Room and Critical Care Providers: A Call for More Evidence-Based and Cost-Effective Care?

    PubMed

    Clark, Matthew G; Dalabih, Abdallah

    2014-09-01

    Management protocols have been shown to be effective in the pediatric emergency medicine (PEM) and pediatric critical care (PCC) settings. Treatment protocols define clear goals, which are achieved through consistency in implementation. Over the last decade, many new recommendations have been proposed for managing diabetic ketoacidosis (DKA). Although no perfect set of guidelines exists, many institutions are developing DKA treatment protocols. We sought to determine the variability between institutions in the implementation of these protocols.

  18. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  19. Optimization and development of stable w/o/w cosmetic multiple emulsions by means of the Quality by Design approach.

    PubMed

    Kovács, A; Erős, I; Csóka, I

    2016-04-01

    The aim of our present work was to develop stable water-in-oil-in-water (w/o/w) cosmetic multiple emulsions that are proper for cosmetic use and can also be applied on the skin as pharmaceutical vehicles by means of the Quality by Design (QbD) concept. This product design concept consists of a risk assessment step and also the 'predetermination' of the critical material attributes and process parameters of a stable multiple emulsion system. We have set up the hypothesis that the stability of multiple emulsions can be improved by development based on such systematic planning - making a map of critical product parameters - so their industrial usage can be increased. The risk assessment and the determination of critical physical-chemical stability parameters of w/o/w multiple emulsions to define critical control points were performed by means of quality tools and the LeanQbD™ (QbD Works LLC, Fremont, CA, U.S.A.) software. Critical materials and process parameters: based on the results of preformulation experiments, three factors, namely entrapped active agent, preparation methodology and shear rate, were found to be highly critical factors for critical quality attributes (CQAs) and for stability, whereas the nature of the oil was found to be a medium-level risk factor. The results of the risk assessment are the following: (i) droplet structure and size distribution should be evaluated together to be able to predict stability issues, (ii) the presence of entrapped active agents had a great impact on droplet structure, (iii) the viscosity curves represent the structural changes during storage, and if the decrease in relative viscosity is >15%, the emulsion disintegrates, and (iv) a relative centrifugal force (RCF) between 34 g and 116 g is sufficient. 
CQAs: By risk assessment, we found that four factors should be considered higher-risk variables than the others: droplet size, droplet structure, viscosity and multiple character were identified as highly critical attributes. The preformulation experiment is part of a development plan. On the basis of these results, the control strategy can be defined and a stable multiple emulsion can be ensured that meets the relevant stakeholders' quality expectations. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  20. Life hassles and delusional ideation: Scoping the potential role of cognitive and affective mediators.

    PubMed

    Kingston, Cara; Schuurmans-Stekhoven, James

    2016-12-01

    An intertemporal association between major psychological stress and subsequent delusion formation has been established by others. The current study explores (1) whether the stress from life hassles predicts delusional ideation and (2) if so, whether self-criticism, self-reassurance, and positive and negative affectivity (PA and NA, respectively) mediate this link. This paper thus aimed to scope out viable psychological processes involved in the formation of stress-induced delusions. A cross-sectional survey using a non-clinical community sample. Responses (N = 251) to an online community survey were tested via a nonparametric bootstrap sampling approach to examine the effects of multiple mediators. Self-criticism and NA appear to mediate a connection found between life hassles and delusions. A second mediation analysis found that life hassles positively predicts NA directly and indirectly (via self-criticism). NA in turn predicted delusional tendencies. Life events had direct statistical effects on delusions in all models. Neither PA nor self-reassurance mediated the stress-delusion link. Self-criticism and NA seem to be viable mediators worth contemplating when elaborating upon the connection between life hassles and delusions. Compared to self-criticism, NA appears to be the intervening variable most proximal to delusions and explains more variance. Even if these cross-sectional results were interpreted as causative, life hassles and delusions remained directly interconnected in all mediation models (suggesting much of the association remains unexplained). Although the results are theory-consistent, investigations using longitudinal, known-group, and experimental methods are now warranted to establish causation and possible feedback loops - especially from delusion to life hassles. 
Self-criticism and negative affectivity (NA) mediate the link between stressful life events and delusions, suggesting they might actively elicit delusional ideation, whereas self-reassurance and PA (although negatively associated with life hassles) have no unique predictive link to delusions. This study offers initial evidence that NA and self-criticism may be viable clinical intervention targets for early psychosis sufferers under stress - especially for medically non-compliant and marginal (where drug treatment is not clinically indicated) cases. The clinical efficacy of alleviating self-criticism and/or negative emotional processes in those displaying early psychosis or at high risk appears worthy of exploration using both practice-based case studies and formal experimental research methods. © 2016 The British Psychological Society.
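The nonparametric bootstrap test of indirect effects used in mediation studies like this one can be sketched as follows. All names and the synthetic data below are illustrative assumptions (real analyses typically use dedicated tools such as the PROCESS macro); the idea is simply to resample cases, re-estimate the indirect effect a*b each time, and read off a percentile confidence interval:

```python
import random
from statistics import fmean

def cov(u, v):
    """Sample covariance of two equal-length sequences."""
    mu, mv = fmean(u), fmean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

def indirect_effect(x, m, y):
    # a: slope of mediator M on predictor X; b: partial slope of M in Y ~ X + M
    a = cov(x, m) / cov(x, x)
    det = cov(x, x) * cov(m, m) - cov(x, m) ** 2
    b = (cov(y, m) * cov(x, x) - cov(y, x) * cov(x, m)) / det
    return a * b

def bootstrap_ci(x, m, y, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for the indirect (mediated) effect a*b."""
    rng = random.Random(seed)
    n, estimates = len(x), []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        try:
            estimates.append(indirect_effect([x[i] for i in idx],
                                             [m[i] for i in idx],
                                             [y[i] for i in idx]))
        except ZeroDivisionError:
            continue  # skip degenerate (collinear) resamples
    estimates.sort()
    nb = len(estimates)
    return estimates[int(alpha / 2 * nb)], estimates[int((1 - alpha / 2) * nb) - 1]

# Synthetic example: hassles -> self-criticism (mediator) -> delusional ideation
hassles = list(range(40))
self_crit = [2 * v + (-1) ** v for v in hassles]
delusions = [3 * s for s in self_crit]
lo, hi = bootstrap_ci(hassles, self_crit, delusions)
# Mediation is inferred when the CI excludes zero.
```

With multiple candidate mediators, the same resampling loop is run with each mediator's indirect effect estimated per resample, which is how the study could compare self-criticism and NA against PA and self-reassurance.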

  1. The Relationship between Academic Achievement, Reading Habits and Critical Thinking Dispositions of Turkish Tertiary Level EFL Learners

    ERIC Educational Resources Information Center

    Genç, Gülten

    2017-01-01

    The aim of this study was to describe EFL learners' critical thinking levels and to examine the relationship between participants' critical thinking levels and selected variables such as gender, academic achievement in EFL, subject area, and self-reported reading. The overall design of the study was based on the quantitative research method. Data…

  2. National Longitudinal Study of the High School Class of 1972: Critical Data Base. 22U-884.

    ERIC Educational Resources Information Center

    Talbert, Robin

    The National Longitudinal Study of the High School Class of 1972 (NLS) critical data base contains 151 items (plus background information) from the base year and followup questionnaires; about thirty-seven percent of all items. This set of critical items consists of: (1) basic demographic variables; (2) items necessary for defining activity states…

  3. False recognition production indexes in forward associative strength (FAS) lists with three critical words.

    PubMed

    Beato, María Soledad; Arndt, Jason

    2014-01-01

    False memory illusions have been widely studied using the Deese/Roediger-McDermott (DRM) paradigm. In this paradigm, participants study words semantically related to a single nonpresented critical word. In a memory test, critical words are often falsely recalled and recognized. The present study was conducted to measure the levels of false recognition for seventy-five Spanish DRM word lists that have multiple critical words per list. Lists included three critical words (e.g., HELL, LUCIFER, and SATAN) simultaneously associated with six studied words (e.g., devil, demon, fire, red, bad, and evil). Different levels of forward associative strength (FAS) between the critical words and their studied associates were used in the construction of the lists. Specifically, we selected lists with the highest FAS values possible, and FAS was continuously decreased in order to obtain the 75 lists. Six words per list, simultaneously associated with three critical words, were sufficient to produce false recognition. Furthermore, there was wide variability in rates of false recognition (e.g., 53% for DUNGEON, PRISON, and GRATES; 1% for BRACKETS, GARMENT, and CLOTHING). Finally, there was no correlation between false recognition and associative strength. False recognition variability could not be attributed to differences in forward associative strength.

  4. [Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].

    PubMed

    Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin

    2017-07-01

    In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential process critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space parameters. The recommended operation ranges for the critical dropping process parameters of Ginkgo biloba dropping pills were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills. Copyright© by the Chinese Pharmaceutical Association.
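A probability-based design space of the kind described above can be sketched as follows. The second-order model coefficients, noise level, specification limit, and acceptance probability below are all hypothetical placeholders (not the fitted values from the study); the point is the workflow: a fitted quadratic model per critical quality attribute, Monte Carlo simulation of the attribute at each candidate operating point, and a design space defined as the region where the probability of meeting specification exceeds a threshold:

```python
import random

# Hypothetical second-order (Box-Behnken-style) model for one CQA,
# e.g. pill-weight RSD in %; coefficients are illustrative only.
def cqa_model(distance_cm, speed_dpm):
    d, s = distance_cm - 6.0, speed_dpm - 58.0   # centered factors
    return 2.0 + 0.8 * d + 0.5 * s + 0.6 * d * d + 0.3 * s * s - 0.2 * d * s

def meet_probability(distance_cm, speed_dpm, spec_limit=3.0,
                     noise_sd=0.4, n_sim=5000, seed=7):
    """Monte Carlo probability that the CQA stays within its spec limit."""
    rng = random.Random(seed)
    mean = cqa_model(distance_cm, speed_dpm)
    hits = sum(1 for _ in range(n_sim)
               if rng.gauss(mean, noise_sd) <= spec_limit)
    return hits / n_sim

# Design space: operating points where P(CQA <= spec) >= 0.9
design_space = [(d, s)
                for d in [5.5, 6.0, 6.5, 7.0]
                for s in [56, 58, 60]
                if meet_probability(d, s) >= 0.9]
```

In practice each CQA gets its own fitted model, and the design space is the intersection of the acceptable regions across all CQAs, which is what makes "operating within the design space" a quality guarantee.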

  5. Cognitive and Emotion Regulation Change Processes in Cognitive Behavioural Therapy for Social Anxiety Disorder.

    PubMed

    O'Toole, Mia S; Mennin, Douglas S; Hougaard, Esben; Zachariae, Robert; Rosenberg, Nicole K

    2015-01-01

    The objective of the study was to investigate variables, derived from both cognitive and emotion regulation conceptualizations of social anxiety disorder (SAD), as possible change processes in cognitive behaviour therapy (CBT) for SAD. Several proposed change processes were investigated: estimated probability, estimated cost, safety behaviours, acceptance of emotions, cognitive reappraisal and expressive suppression. Participants were 50 patients with SAD, receiving a standard manualized CBT program, conducted in groups or individually. All variables were measured pre-therapy, mid-therapy and post-therapy. Lower level mediation models revealed that while a change in most process measures significantly predicted clinical improvement, only changes in estimated probability and cost and acceptance of emotions showed significant indirect effects of CBT for SAD. The results are in accordance with previous studies supporting the mediating role of changes in cognitive distortions in CBT for SAD. In addition, acceptance of emotions may also be a critical component to clinical improvement in SAD during CBT, although more research is needed on which elements of acceptance are most helpful for individuals with SAD. The study's lack of a control condition limits any conclusion regarding the specificity of the findings to CBT. Change in estimated probability and cost, and acceptance of emotions showed an indirect effect of CBT for SAD. Cognitive distortions appear relevant to target with cognitive restructuring techniques. Finding acceptance to have an indirect effect could be interpreted as support for contemporary CBT approaches that include acceptance-based strategies. Copyright © 2014 John Wiley & Sons, Ltd.

  6. Using multi-disciplinary strategic master facilities planning for organizations experiencing programmatic re-direction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heubach, J.G.; Weimer, W.C.; Bruce, W.A.

    Facility master planning is critical to the future productivity of a laboratory and the quality of worklife for the laboratory staff. For organizations undergoing programmatic re-direction, a master facility planning approach linked to the organization's strategic planning process is even more important. Major changes in an organization such as programmatic re-direction can significantly impact a broad range of variables which exceed the expertise of traditional planning teams, e.g., capacity variability, work team organization, organizational culture, and work process simplification. By expanding the diversity of the participants of the planning team, there is a greater likelihood that a research organization's scientific, organizational, economic, and employees' needs can be meshed in the strategic plan and facility plan. Recent recommendations from facility planners suggest drawing from diverse fields in building multi-disciplinary planning teams: architecture, engineering, natural science, social psychology, and strategic planning (Gibson, 1993). For organizations undergoing significant operational or culture change, the master facility planning team should also include members with expertise in organizational effectiveness, industrial engineering, human resources, and environmental psychology. A recent planning and design project provides an example which illustrates the use of an expanded multi-disciplinary team engaged in planning laboratory renovations for a research organization undergoing programmatic re-direction. The purpose of the proposed poster session is to present a multi-disciplinary master facility planning process linked to an organization's strategic planning process or organizational strategies.

  7. C-terminal motifs in promyelocytic leukemia protein isoforms critically regulate PML nuclear body formation.

    PubMed

    Li, Chuang; Peng, Qiongfang; Wan, Xiao; Sun, Haili; Tang, Jun

    2017-10-15

    Promyelocytic leukemia protein (PML) nuclear bodies (NBs), which are sub-nuclear protein structures, are involved in a variety of important cellular functions. PML-NBs are assembled by PML isoforms, and contacts between small ubiquitin-like modifiers (SUMOs) and the SUMO interaction motif (SIM) are critically involved in this process. PML isoforms contain a common N-terminal region and a variable C-terminus. However, the contribution of the C-terminal regions to PML-NB formation remains poorly defined. Here, using high-resolution microscopy, we show that mutation of the SIM distinctively influences the structure of NBs formed by each individual PML isoform, with that of PML-III and PML-V minimally changed, and PML-I and PML-IV dramatically impaired. We further identify several C-terminal elements that are important in regulating NB structure and provide strong evidence to suggest that the 8b element in PML-IV possesses a strong ability to interact with SUMO-1 and SUMO-2, and critically participates in NB formation. Our findings highlight the importance of PML C-termini in NB assembly and function, and provide molecular insight into the PML-NB assembly of each distinctive isoform. © 2017. Published by The Company of Biologists Ltd.

  8. Probabilistic analysis of the influence of the bonding degree of the stem-cement interface in the performance of cemented hip prostheses.

    PubMed

    Pérez, M A; Grasa, J; García-Aznar, J M; Bea, J A; Doblaré, M

    2006-01-01

    The long-term behavior of the stem-cement interface is one of the most frequent topics of discussion in the design of cemented total hip replacements, especially with regards to the process of damage accumulation in the cement layer. This effect is analyzed here comparing two different situations of the interface: completely bonded and debonded with friction. This comparative analysis is performed using a probabilistic computational approach that considers the variability and uncertainty of determinant factors that directly compromise the damage accumulation in the cement mantle. This stochastic technique is based on the combination of probabilistic finite elements (PFEM) and a cumulative damage approach known as B-model. Three random variables were considered: muscle and joint contact forces at the hip (both for walking and stair climbing), cement damage and fatigue properties of the cement. The results predicted that the regions with higher failure probability in the bulk cement are completely different depending on the stem-cement interface characteristics. In a bonded interface, critical sites appeared at the distal and medial parts of the cement, while for debonded interfaces, the critical regions were found distally and proximally. In bonded interfaces, the failure probability was higher than in debonded ones. The same conclusion may be established for stair climbing in comparison with walking activity.

  9. Spatiotemporal Variability in Snow Phenology over the Eurasian Continent during 1966-2012

    NASA Astrophysics Data System (ADS)

    Zhong, X.; Zhang, T.; Wang, K.; Zheng, L.; Wang, H.

    2016-12-01

    Snow cover is a key part of the cryosphere, which is a critical component of the global climate system. Snow cover phenology critically affects the surface energy budget, the surface albedo and hydrological processes. In this study, the climatology and spatiotemporal variability of snow cover phenology were investigated using long-term (1966-2012) ground-based measurements of daily snow depth from 1103 stations across the Eurasian Continent. The results showed that the distributions of the first date, last date, snow cover duration and number of snow cover days generally followed latitudinal zonality over the Eurasian Continent, and there were significant elevation gradient patterns in the Tibetan Plateau. From 1966 through 2012, the first date of snow cover was delayed by about 1.2 day decade⁻¹, the last date of snow cover advanced at a rate of -1.2 day decade⁻¹, and the snow cover duration and number of snow cover days shortened by about 2.7 and 0.6 day decade⁻¹, respectively. Compared with precipitation, the correlation between snow cover phenology and air temperature was more significant. The changes in snow cover duration were mainly controlled by the changes of air temperature in autumn and spring. The shortened number of snow cover days was affected by rising temperature during the cold season, in addition to the air temperature in autumn and spring.
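Decadal rates like those reported above come from fitting linear trends to annual series. A minimal sketch (using an illustrative synthetic series, not the station data) of an ordinary-least-squares slope expressed per decade:

```python
import statistics

def trend_per_decade(years, values):
    """OLS slope of an annual series, expressed per decade (10x per-year slope)."""
    my, mv = statistics.fmean(years), statistics.fmean(values)
    slope = (sum((y - my) * (v - mv) for y, v in zip(years, values))
             / sum((y - my) ** 2 for y in years))
    return 10.0 * slope

# Illustrative series: a first-snow date (day of year) delaying 0.12 d per year
years = list(range(1966, 2013))
first_date = [290 + 0.12 * (y - 1966) for y in years]
print(round(trend_per_decade(years, first_date), 2))  # 1.2 day decade^-1
```

Applied per station, the same slope calculation yields the maps of spatial variability in snow-phenology trends described in the abstract.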

  10. Defining critical habitats of threatened and endemic reef fishes with a multivariate approach.

    PubMed

    Purcell, Steven W; Clarke, K Robert; Rushworth, Kelvin; Dalton, Steven J

    2014-12-01

    Understanding critical habitats of threatened and endemic animals is essential for mitigating extinction risks, developing recovery plans, and siting reserves, but assessment methods are generally lacking. We evaluated critical habitats of 8 threatened or endemic fish species on coral and rocky reefs of subtropical eastern Australia, by measuring physical and substratum-type variables of habitats at fish sightings. We used nonmetric and metric multidimensional scaling (nMDS, mMDS), analysis of similarities (ANOSIM), similarity percentages analysis (SIMPER), permutational analysis of multivariate dispersions (PERMDISP), and other multivariate tools to distinguish critical habitats. Niche breadth was widest for 2 endemic wrasses, and reef inclination was important for several species, often found in relatively deep microhabitats. Critical habitats of mainland reef species included small caves or habitat-forming hosts such as gorgonian corals and black coral trees. Hard corals appeared important for reef fishes at Lord Howe Island, and red algae for mainland reef fishes. A wide range of habitat variables is required to assess critical habitats owing to the varied affinities of species for different habitat features. We advocate assessments of critical habitats matched to the spatial scale used by the animals and a combination of multivariate methods. Our multivariate approach furnishes a general template for assessing the critical habitats of species, understanding how these vary among species, and determining differences in the degree of habitat specificity. © 2014 Society for Conservation Biology.
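Of the multivariate tools listed, the ANOSIM statistic is simple enough to sketch directly. This minimal version (names and toy data are illustrative; it assumes no tied distances, which production implementations such as PRIMER or scikit-bio handle via average ranks) computes Clarke's R from a distance matrix over habitat variables and the species-group labels:

```python
from itertools import combinations
from statistics import fmean

def anosim_r(dist, groups):
    """Clarke's R: compares mean rank of between-group vs within-group
    distances; R near 1 means groups occupy clearly distinct habitats."""
    pairs = list(combinations(range(len(groups)), 2))
    order = sorted(range(len(pairs)),
                   key=lambda k: dist[pairs[k][0]][pairs[k][1]])
    rank = [0] * len(pairs)
    for r, k in enumerate(order, start=1):   # assumes distinct distances
        rank[k] = r
    between = [rank[k] for k, (i, j) in enumerate(pairs) if groups[i] != groups[j]]
    within = [rank[k] for k, (i, j) in enumerate(pairs) if groups[i] == groups[j]]
    m = len(pairs)
    return (fmean(between) - fmean(within)) / (m / 2)

# Toy example: two species groups measured on a single habitat axis
pts = [0.0, 1.0, 10.0, 12.0]
dist = [[abs(a - b) for b in pts] for a in pts]
groups = ["A", "A", "B", "B"]
print(anosim_r(dist, groups))   # 1.0: complete group separation
```

Because R works on rank-transformed distances, it pairs naturally with the nMDS ordinations also used in the study, which rely on the same rank information.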

  11. Comprehending childhood bereavement by parental suicide: a critical review of research on outcomes, grief processes, and interventions.

    PubMed

    Hung, Natalie C; Rabin, Laura A

    2009-09-01

    The experience of bereavement by parental suicide is not well understood, as evidenced by the lack of empirically supported interventions for this underserved population. This article reviews quantitative and qualitative research on the psychopathological outcomes and thematic characteristics of childhood and adolescent suicide survivorship and moderating variables such as life stressors, stigma, the manner of communication about the suicide, and the surviving parent's functioning. The authors outline several approaches to intervention and address conceptual and methodological challenges within the field. With the ultimate goal of efficacious intervention, recommendations for future priorities and the use of unconventional research methods are offered.

  12. Status and Evaluation of Microwave Furnace Capabilities at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Lizcano, Maricela; Mackey, Jonathan A.

    2014-01-01

    The microwave (MW) furnace is a HY-Tech Microwave Systems 2 kW, 2.45 GHz single-mode microwave applicator operating in continuous wave (CW) mode with variable power. It is located in Cleveland, Ohio at NASA Glenn Research Center. Until recently, the furnace capabilities had not been fully realized due to an unknown failure that damaged critical furnace components. Although the causes of the problems were unknown, an assessment of the furnace itself indicated that the operational failure may have been partially caused by power quality. This report summarizes the status of the MW furnace and evaluates its capabilities in materials processing.

  13. Human footprint affects US carbon balance more than climate change

    USGS Publications Warehouse

    Bachelet, Dominique; Ferschweiler, Ken; Sheehan, Tim; Baker, Barry; Sleeter, Benjamin M.; Zhu, Zhiliang

    2017-01-01

    The MC2 model projects an overall increase in carbon capture in the conterminous United States during the 21st century, while also simulating a rise in fire that causes much carbon loss. Carbon sequestration in soils is critical to preventing carbon losses from future disturbances, and we show that natural ecosystems store more carbon belowground than managed systems do. Natural and human-caused disturbances affect soil processes that shape ecosystem recovery and competitive interactions between natives, exotics, and climate refugees. Tomorrow's carbon budgets will depend on how land use, natural disturbances, and climate variability interact and affect the balance between carbon capture and release.

  14. Maintaining activity engagement: individual differences in the process of self-regulating motivation.

    PubMed

    Sansone, Carol; Thoman, Dustin B

    2006-12-01

    Typically, models of self-regulation include motivation in terms of goals. Motivation is proposed to differ among individuals as a consequence of the goals they hold as well as how much they value those goals and expect to attain them. We suggest that goal-defined motivation is only one source of motivation critical for sustained engagement. A second source is the motivation that arises from the degree of interest experienced in the process of goal pursuit. Our model integrates both sources of motivation within the goal-striving process and suggests that individuals may actively monitor and regulate them. Conceptualizing motivation in terms of a self-regulatory process provides an organizing framework for understanding how individuals might differ in whether they experience interest while working toward goals, whether they persist without interest, and whether and how they try to create interest. We first present the self-regulation of motivation model and then review research illustrating how the consideration of individual differences at different points in the process allows a better understanding of variability in people's choices, efforts, and persistence over time.

  15. Constituents of Music and Visual-Art Related Pleasure - A Critical Integrative Literature Review.

    PubMed

    Tiihonen, Marianne; Brattico, Elvira; Maksimainen, Johanna; Wikgren, Jan; Saarikallio, Suvi

    2017-01-01

    The present literature review investigated how pleasure induced by music and visual art has been conceptually understood in empirical research over the past 20 years. After an initial selection of abstracts from seven databases (keywords: pleasure, reward, enjoyment, and hedonic), twenty music and eleven visual-art papers were systematically compared. The following questions were addressed: (1) What is the role of the keyword in the research question? (2) Is pleasure considered a result of variation in the perceiver's internal or external attributes? (3) What are the most commonly employed methods and main variables in empirical settings? Based on these questions, our critical integrative analysis aimed to identify which themes and processes emerged as key features for conceptualizing art-induced pleasure. The results demonstrated great variance in how pleasure has been approached: in the music studies, pleasure was often a clear object of investigation, whereas in the visual-art studies the term was often embedded in the context of an aesthetic experience, or used otherwise in a descriptive, indirect sense. Music studies often targeted different emotions, their intensity, or anhedonia. Biographical and background variables and personality traits of the perceiver were often measured. Next to behavioral methods, a common method was brain imaging, which often targeted the reward circuitry of the brain in response to music. Visual-art pleasure was also frequently addressed using brain imaging methods, but the research focused on sensory cortices rather than the reward circuit alone. Compared with music research, visual-art research more frequently investigated pleasure in relation to conscious, cognitive processing, where variations of stimulus features and changes of viewing mode were regarded as explanatory factors of the derived experience.
Although valence is frequently applied in both domains, we conclude that in empirical music research pleasure seems to be part of core affect and hedonic tone modulated by stable personality variables, whereas in visual-art research pleasure is a result of the so-called conceptual act, depending on the strategy chosen to approach art. We encourage the integration of music and visual art into a multi-modal framework to promote a more versatile understanding of pleasure in response to aesthetic artifacts.

  16. Enhancing the Design Process for Complex Space Systems through Early Integration of Risk and Variable-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri; Osburg, Jan

    2005-01-01

    An important enabler of the new national Vision for Space Exploration is the ability to rapidly and efficiently develop optimized concepts for the manifold future space missions that this effort calls for. The design of such complex systems requires a tight integration of all the engineering disciplines involved, in an environment that fosters interaction and collaboration. The research performed under this grant explored areas where the space systems design process can be enhanced: by integrating risk models into the early stages of the design process, and by including rapid-turnaround variable-fidelity tools for key disciplines. Enabling early assessment of mission risk will allow designers to perform trades between risk and design performance during the initial design space exploration. Entry into planetary atmospheres will require an increased emphasis on the critical disciplines of aero- and thermodynamics. This necessitates pulling EDL disciplinary expertise forward into the early stages of the design process. Radiation can have a large potential impact on overall mission designs, in particular for the planned nuclear-powered robotic missions under Project Prometheus and for long-duration manned missions to the Moon, Mars, and beyond under Project Constellation. This requires that radiation and the associated risks and hazards be assessed and mitigated at the earliest stages of the design process. Hence, RPS is another discipline needed to enhance the engineering competencies of conceptual design teams. Researchers collaborated closely with NASA experts in those disciplines, and in overall space systems design, at Langley Research Center and at the Jet Propulsion Laboratory. This report documents the results of this initial effort.

  17. On the nature of motor planning variables during arm pointing movement: Compositeness and speed dependence.

    PubMed

    Vu, Van Hoan; Isableu, Brice; Berret, Bastien

    2016-07-22

    The purpose of this study was to investigate the nature of the variables and rules underlying the planning of unrestrained 3D arm reaching. To identify whether the brain uses kinematic, dynamic, and energetic values in an isolated manner or combines them in a flexible way, we examined the effects of speed variations on the chosen arm trajectories during free arm movements. Within the optimal control framework, we identified which (possibly composite) optimality criterion best accounted for the empirical data. Fifteen participants were asked to perform free-endpoint reaching movements from a specific arm configuration at slow, normal, and fast speeds. Experimental results revealed that prominent features of the observed motor behaviors, such as the chosen reach endpoint and the final arm posture, were significantly speed-dependent. Nevertheless, participants exhibited different arm trajectories and various degrees of speed dependence in their reaching behavior. These inter-individual differences were addressed using a numerical inverse optimal control methodology. Simulation results revealed that a weighted combination of kinematic, energetic, and dynamic cost functions was required to account for all the critical features of the participants' behavior. Furthermore, no evidence was found for a speed-dependent tuning of these weights, suggesting subject-specific but speed-invariant weightings of kinematic, energetic, and dynamic variables during the motor planning of free arm movements. This suggests that the inter-individual differences in arm trajectories and speed dependence were due not only to anthropometric differences but also to critical differences in the composition of the subjective cost function. Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.
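    The weighted-combination idea can be made concrete with a toy sketch. The cost terms below (integrated squared jerk, velocity, and acceleration) are simplified stand-ins for the kinematic, energetic, and dynamic costs considered in the study, not the authors' actual model, and the weight vector plays the role of the subject-specific, speed-invariant weighting recovered by inverse optimal control.

```python
import numpy as np

def composite_cost(traj, dt, w):
    """Weighted composite movement cost for an (n_samples, n_dof) trajectory.
    The three terms are simplified proxies: squared jerk (kinematic),
    squared velocity (energetic), squared acceleration (dynamic)."""
    vel = np.gradient(traj, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    jerk = np.gradient(acc, dt, axis=0)
    terms = np.array([(jerk ** 2).sum(), (vel ** 2).sum(), (acc ** 2).sum()]) * dt
    return float(w @ terms)

def pick_trajectory(candidates, dt, w):
    """Among candidate trajectories, select the one minimizing the composite cost."""
    return int(np.argmin([composite_cost(c, dt, w) for c in candidates]))

# A smooth (minimum-jerk-like) versus a jagged path between the same endpoints
t = np.linspace(0.0, 1.0, 101)
smooth = (10 * t**3 - 15 * t**4 + 6 * t**5).reshape(-1, 1)
jagged = (t + 0.05 * np.sin(20 * np.pi * t)).reshape(-1, 1)
w = np.array([1.0, 0.1, 0.1])    # jerk-dominated weighting (illustrative)
best = pick_trajectory([smooth, jagged], t[1] - t[0], w)   # -> 0, the smooth path
```

Inverse optimal control then runs in the other direction: given observed trajectories, it searches for the weights under which those trajectories are (near-)optimal.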

  18. Middle atmosphere electrodynamics: Report of the workshop on the Role of the Electrodynamics of the Middle Atmosphere on Solar Terrestrial Coupling

    NASA Technical Reports Server (NTRS)

    Maynard, N. C. (Editor)

    1979-01-01

    Significant deficiencies exist in the present understanding of the basic physical processes taking place within the middle atmosphere (the region between the tropopause and the mesopause), and in the knowledge of the variability of many of the primary parameters that regulate Middle Atmosphere Electrodynamics (MAE). Knowledge of the electrical properties, i.e., electric fields, plasma characteristics, conductivity and currents, and the physical processes that govern them is of fundamental importance to the physics of the region. Middle atmosphere electrodynamics may play a critical role in the electrodynamical aspects of solar-terrestrial relations. As a first step, the Workshop on the Role of the Electrodynamics of the Middle Atmosphere on Solar-Terrestrial Coupling was held to review the present status and define recommendations for future MAE research.

  19. A Fluorescent G-quadruplex Sensor for Chemical RNA Copying.

    PubMed

    Giurgiu, Constantin; Wright, Tom; O'Flaherty, Derek; Szostak, Jack

    2018-06-25

    Non-enzymatic RNA replication may have been one of the processes involved in the appearance of life on Earth. Attempts to recreate this process in a laboratory setting have not been successful thus far, highlighting a critical need for prebiotic conditions that increase its rate and yield. Here, we present a highly parallel assay for template-directed RNA synthesis that relies on the intrinsic fluorescence of a 2-aminopurine-modified G-quadruplex. We demonstrate the application of the assay to examine the combined influence of multiple variables, including pH, divalent metal concentrations, and ribonucleotide concentrations, on the copying of RNA sequences. The assay enables a direct survey of physical and chemical conditions, potentially prebiotic, that could enable the chemical replication of RNA. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. A new method to detect transitory signatures and local time/space variability structures in the climate system: the scale-dependent correlation analysis

    NASA Astrophysics Data System (ADS)

    Rodó, Xavier; Rodríguez-Arias, Miquel-Àngel

    2006-10-01

    The study of transitory signals and local variability structures in time and/or space, and of their role as sources of climatic memory, is an important but often neglected topic in climate research despite its obvious importance and extensive coverage in the literature. Transitory signals arise from non-linearities in the climate system, transitory atmosphere-ocean couplings, and other processes that evolve after a critical threshold is crossed. These temporary interactions, which may be intense but short-lived, can be responsible for a large amount of unexplained variability but are normally considered of limited relevance and are often discarded. With most current techniques, this typology of signatures is difficult to isolate because of the low signal-to-noise ratio in midlatitudes and the limited recurrence of the transitory signals within a customary interval of data. A serious problem also arises from the smoothing of local or transitory processes when statistical techniques are applied over the full length of the available data rather than at the scale of the specific variability structure under investigation. Scale-dependent correlation (SDC) analysis is a new statistical method capable of highlighting the presence of transitory processes, understood as temporary, significant lag-dependent autocovariance in a single series or covariance structures between two series. This approach therefore complements families of methods such as wavelet analysis, singular-spectrum analysis, and recurrence plots. Main features of SDC are its high performance for short time series and its ability to characterize phase relationships and thresholds in the bivariate domain. Ultimately, SDC helps track short-lagged relationships among processes that locally or temporarily couple and uncouple.
The use of SDC is illustrated in the present paper by means of synthetic time-series examples of increasing complexity, and it is compared with wavelet analysis in order to provide a well-known reference for its capabilities. A comparison between SDC and companion techniques is also addressed, and results are exemplified for the specific case of some relevant El Niño-Southern Oscillation teleconnections.
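    In its simplest form, the core of an SDC-style analysis reduces to Pearson correlations computed in sliding windows of a fixed size (the "scale") over a range of lags. The sketch below is an illustration of that idea, not the authors' implementation; the synthetic series, window size, and lag convention are all choices made for the example.

```python
import numpy as np

def sdc(x, y, window, max_lag):
    """Windowed, lagged correlation map. r[i, s] is the Pearson correlation
    between a window of x starting at s + max_lag and the window of y
    shifted by lags[i]; hot spots flag transitory, possibly lagged couplings."""
    n_win = len(x) - window - 2 * max_lag + 1
    lags = np.arange(-max_lag, max_lag + 1)
    r = np.empty((len(lags), n_win))
    for i, lag in enumerate(lags):
        for s in range(n_win):
            xi = x[s + max_lag : s + max_lag + window]
            yi = y[s + max_lag + lag : s + max_lag + lag + window]
            r[i, s] = np.corrcoef(xi, yi)[0, 1]
    return lags, r

# Two noise series coupled only over a short segment, with y lagging x by 3
rng = np.random.default_rng(1)
x = rng.standard_normal(200)
y = rng.standard_normal(200)
y[103:143] = x[100:140]                     # the transitory coupling
lags, r = sdc(x, y, window=20, max_lag=5)
best_lag = lags[np.argmax(r.max(axis=1))]   # -> 3: the coupling's lag is recovered
```

A global (whole-series) correlation at any lag would dilute this 40-sample coupling into noise, which is the smoothing problem the abstract describes.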
