Sample records for development sensitivity analysis

  1. Sensitivity Analysis of QSAR Models for Assessing Novel Military Compounds

    DTIC Science & Technology

    2009-01-01

    ERDC TR-09-3, Strategic Environmental Research and Development Program, January 2009. Sensitivity Analysis of QSAR Models for Assessing Novel Military Compounds. Jay L. Clausen, Cold Regions Research and Engineering Laboratory, U.S. Army Engineer Research and Development Center, 72 Lyme Road, Hanover, NH.

  2. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
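    The abstract validates semi-analytical derivatives against finite differences. A minimal sketch of that kind of cross-check, using a hypothetical scalar response in place of the report's CFD solver:

```python
import numpy as np

def response(x):
    # Toy stand-in for an aerodynamic response: f(x) = x0**2 * x1 + sin(x1)
    return x[0] ** 2 * x[1] + np.sin(x[1])

def grad_analytic(x):
    # Hand-derived ("semi-analytical") gradient of the toy response.
    return np.array([2.0 * x[0] * x[1], x[0] ** 2 + np.cos(x[1])])

def grad_finite_difference(f, x, h=1e-6):
    # Central differences: the reference the report compares against.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

x = np.array([1.5, 0.7])
assert np.allclose(grad_analytic(x), grad_finite_difference(response, x), atol=1e-5)
```

    Agreement to finite-difference accuracy is the usual acceptance test before trusting the cheaper analytic derivatives inside an optimizer.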

  3. Applying geologic sensitivity analysis to environmental risk management: The financial implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, D.T.

    The financial risks associated with environmental contamination can be staggering and are often difficult to identify and accurately assess. Geologic sensitivity analysis is gaining recognition as a significant and useful tool that can empower the user with crucial information concerning environmental risk management and brownfield redevelopment. It is particularly useful when (1) evaluating the potential risks associated with redevelopment of historical industrial facilities (brownfields) and (2) planning for future development, especially in areas of rapid development, because the number of potential contaminating sources often increases with an increase in economic development. An examination of the financial implications of geologic sensitivity analysis in southeastern Michigan, drawn from numerous case studies, indicates that the environmental cost of contamination may be 100 to 1,000 times greater at a geologically sensitive location compared to the least sensitive location. Geologic sensitivity analysis has demonstrated that near-surface geology may influence the environmental impact of a contaminated site to a greater extent than the amount and type of industrial development.

  4. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.

    2017-05-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based methods with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices is defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
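    The grouped variance-based indices described here reduce to the classical ratio Var(E[y | group]) / Var(y). A toy Monte Carlo sketch on an assumed linear model (not the Hanford model), with inputs grouped as {x1, x2} versus {x3}:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = 2.0, 1.0, 0.5

def model(x1, x2, x3):
    return a * x1 + b * x2 + c * x3

# First-order index of the GROUP {x1, x2}: Var(E[y | x1, x2]) / Var(y).
# For this linear model with independent N(0,1) inputs the exact value
# is (a**2 + b**2) / (a**2 + b**2 + c**2).
n_outer, n_inner = 4000, 200
cond_means = np.empty(n_outer)
for i in range(n_outer):
    x1, x2 = rng.standard_normal(2)        # fix the grouped inputs
    x3 = rng.standard_normal(n_inner)      # average over the rest
    cond_means[i] = model(x1, x2, x3).mean()

var_y = a**2 + b**2 + c**2                 # known total variance
S_group = cond_means.var() / var_y
exact = (a**2 + b**2) / var_y
assert abs(S_group - exact) < 0.08
```

    Grouping inputs this way is what keeps the number of conditional-variance estimates manageable when the parameter field is high-dimensional.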

  5. A Geostatistics-Informed Hierarchical Sensitivity Analysis Method for Complex Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2017-12-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based methods with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices is defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.

  6. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. The analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system with respect to variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, relative-stability criteria based on the concept of singular values were explored.
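    The singular-value criterion the report builds on measures multiloop relative stability via the minimum singular value of the return-difference matrix I + L(jw). A sketch with a made-up 2x2 loop transfer matrix standing in for the yaw/roll damper system:

```python
import numpy as np

# Hypothetical 2x2 loop transfer matrix L(s); entries are invented stable
# first-order terms, not the business-jet model from the report.
def loop_tf(s):
    return np.array([[2.0 / (s + 1.0), 0.5 / (s + 2.0)],
                     [0.1 / (s + 1.0), 1.0 / (s + 0.5)]])

def min_return_difference_sv(omegas):
    # Minimum singular value of I + L(jw) at each frequency: the multiloop
    # relative-stability measure underlying the report's criteria.
    margins = []
    for w in omegas:
        L = loop_tf(1j * w)
        sv = np.linalg.svd(np.eye(2) + L, compute_uv=False)
        margins.append(sv.min())
    return np.array(margins)

omegas = np.logspace(-2, 2, 200)
margin = min_return_difference_sv(omegas).min()
# A margin bounded away from zero over all frequencies indicates robust
# relative stability of the closed loop.
assert margin > 0.0
```

    Singular-value gradients with respect to individual plant or controller elements then identify which element most erodes this margin.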

  7. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
    Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  8. Development of the High-Order Decoupled Direct Method in Three Dimensions for Particulate Matter: Enabling Advanced Sensitivity Analysis in Air Quality Models

    EPA Science Inventory

    The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity...

  9. Shape design sensitivity analysis and optimization of three dimensional elastic solids using geometric modeling and automatic regridding. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Yao, Tse-Min; Choi, Kyung K.

    1987-01-01

    An automatic regridding method and a three dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three dimensional elastic solids. Numerical implementation of this shape design sensitivity analysis method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. Automatic regridding methods were developed by generating a domain velocity field with the boundary displacement method. Shape design parameterization for three dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbation of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.

  10. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
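    The correlation-coefficient screening described above can be reproduced in miniature; the input distributions and response below are invented stand-ins, not VERA-CS data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 500
# Hypothetical sampled inputs: inlet temperature (K) dominates the figure
# of merit, a second parameter (relative power) contributes weakly.
t_inlet = rng.normal(565.0, 5.0, n)
power = rng.normal(1.0, 0.02, n)
mdnbr = (3.0 - 0.04 * (t_inlet - 565.0)
             - 0.5 * (power - 1.0)
             + rng.normal(0.0, 0.02, n))   # unexplained scatter

# Linear (Pearson) and rank (Spearman) correlations against the output.
r_t, _ = stats.pearsonr(t_inlet, mdnbr)
r_p, _ = stats.pearsonr(power, mdnbr)
rho_t, _ = stats.spearmanr(t_inlet, mdnbr)

# Inlet temperature should dominate both measures, as in the study.
assert abs(r_t) > abs(r_p)
assert abs(rho_t) > 0.9
```

    Comparing Pearson against Spearman (and partial) coefficients, as the study does, guards against mistaking a monotone nonlinear effect for a weak one.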

  11. Preliminary Thermal-Mechanical Sizing of Metallic TPS: Process Development and Sensitivity Studies

    NASA Technical Reports Server (NTRS)

    Poteet, Carl C.; Abu-Khajeel, Hasan; Hsu, Su-Yuen

    2002-01-01

    The purpose of this research was to perform sensitivity studies and develop a process to perform thermal and structural analysis and sizing of the latest Metallic Thermal Protection System (TPS) developed at NASA LaRC (Langley Research Center). Metallic TPS is a key technology for reducing the cost of reusable launch vehicles (RLV), offering the combination of increased durability and competitive weights when compared to other systems. Accurate sizing of metallic TPS requires combined thermal and structural analysis. Initial sensitivity studies were conducted using transient one-dimensional finite element thermal analysis to determine the influence of various TPS and analysis parameters on TPS weight. The thermal analysis model was then used in combination with static deflection and failure mode analysis of the sandwich panel outer surface of the TPS to obtain minimum weight TPS configurations at three vehicle stations on the windward centerline of a representative RLV. The coupled nature of the analysis requires an iterative analysis process, which will be described herein. Findings from the sensitivity analysis are reported, along with TPS designs at the three RLV vehicle stations considered.

  12. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of each uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.

  13. Estimating Sobol Sensitivity Indices Using Correlations

    EPA Science Inventory

    Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
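    For a purely linear model with independent inputs, the first-order Sobol index of each input equals its squared Pearson correlation with the output, which is the spirit of a correlation-based estimate; a sketch under that (assumed) linearity:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = 3.0 * x1 + 1.0 * x2            # toy linear model, independent inputs

# Under linearity and independence, S_i = corr(x_i, y)**2, which for this
# model gives S1 = 9/10 and S2 = 1/10 exactly.
def sobol_via_corr(x, y):
    return np.corrcoef(x, y)[0, 1] ** 2

S1 = sobol_via_corr(x1, y)
S2 = sobol_via_corr(x2, y)
assert abs(S1 - 0.9) < 0.01
assert abs(S2 - 0.1) < 0.01
```

    The correlation shortcut is exact only in this linear setting; Sobol's variance decomposition is the general tool when interactions or nonlinearity matter.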

  14. Netlist Oriented Sensitivity Evaluation (NOSE)

    DTIC Science & Technology

    2017-03-01

    The goal of the Netlist-Oriented Sensitivity Evaluation (NOSE) project was to develop methodologies to assess the sensitivities of alternative chip design netlist implementations. The research is somewhat foundational, devising a methodology for scoring the sensitivity of circuit nodes in a netlist and thus providing the raw data for any meaningful analysis.

  15. Sensitivity analysis of a wing aeroelastic response

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.

    1991-01-01

    A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model, are used.
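    The GSE idea is to assemble the local partial derivatives of each discipline into one linear system for the global derivatives of the coupled solution. A two-discipline scalar sketch (all values illustrative, not from the wing model):

```python
import numpy as np

# Two coupled "disciplines" (a stand-in for aero and structures):
#   y1 = a*x + b*y2   (aero load depends on design x and deflection y2)
#   y2 = c*y1         (deflection depends on load y1)
a, b, c = 2.0, 0.3, 0.5

# Local (partial) sensitivities of each discipline:
dy1_dx, dy1_dy2 = a, b
dy2_dy1 = c

# Global Sensitivity Equations: (I - J_coupling) dY/dx = local partials.
J = np.array([[0.0, dy1_dy2],
              [dy2_dy1, 0.0]])
rhs = np.array([dy1_dx, 0.0])
dY_dx = np.linalg.solve(np.eye(2) - J, rhs)

# Check against the converged fixed point differentiated directly:
# y1 = a*x / (1 - b*c)  =>  dy1/dx = a / (1 - b*c).
assert np.isclose(dY_dx[0], a / (1 - b * c))
assert np.isclose(dY_dx[1], c * a / (1 - b * c))
```

    The same structure scales up when the scalars become stiffness and aerodynamic kernel matrices, which is what the program described here assembles.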

  16. A Sensitivity Analysis of the Rigid Pavement Life-Cycle Cost Analysis Program

    DOT National Transportation Integrated Search

    2000-12-01

    Original Report Date: September 1999. This report describes the sensitivity analysis performed on the Rigid Pavement Life-Cycle Cost Analysis program, a computer program developed by the Center for Transportation Research for the Texas Department of ...

  17. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  18. Grid sensitivity for aerodynamic optimization and flow analysis

    NASA Technical Reports Server (NTRS)

    Sadrehaghighi, I.; Tiwari, S. N.

    1993-01-01

    After reviewing the relevant literature, it is apparent that one aspect of aerodynamic sensitivity analysis, namely grid sensitivity, has not been investigated extensively. The grid sensitivity algorithms in most of these studies are based on structural design models. Such models, although sufficient for preliminary or conceptual design, are not acceptable for detailed design analysis. Careless grid sensitivity evaluations would introduce gradient errors into the sensitivity module, thereby corrupting the overall optimization process. Development of an efficient and reliable grid sensitivity module with special emphasis on aerodynamic applications appears essential. The organization of this study is as follows. The physical and geometric representations of a typical model are derived in chapter 2. The grid generation algorithm and boundary grid distribution are developed in chapter 3. Chapter 4 discusses the theoretical formulation and the aerodynamic sensitivity equation. The method of solution is provided in chapter 5. The results are presented and discussed in chapter 6. Finally, some concluding remarks are provided in chapter 7.

  19. Shape design sensitivity analysis using domain information

    NASA Technical Reports Server (NTRS)

    Seong, Hwal-Gyeong; Choi, Kyung K.

    1985-01-01

    A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.

  20. Reduction of low frequency vibration of truck driver and seating system through system parameter identification, sensitivity analysis and active control

    NASA Astrophysics Data System (ADS)

    Wang, Xu; Bi, Fengrong; Du, Haiping

    2018-05-01

    This paper aims to develop a 5-degree-of-freedom driver and seating system model for optimal vibration control. A new method for identification of the driver seating system parameters from experimental vibration measurements has been developed. A parameter sensitivity analysis has been conducted considering random excitation frequency and system parameter uncertainty. The most and least sensitive system parameters for the transmissibility ratio have been identified. Optimised PID controllers have been developed to reduce the driver's body vibration.
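    Transmissibility-ratio sensitivity of the kind analyzed here can be illustrated on a single-DOF base-excited mass-spring-damper (the paper's model has 5 DOF; all parameter values below are invented):

```python
import numpy as np

# Transmissibility |X/Y| of a mass m on spring k and damper c under base
# excitation at frequency omega (rad/s).
def transmissibility(omega, m, k, c):
    num = k**2 + (c * omega) ** 2
    den = (k - m * omega**2) ** 2 + (c * omega) ** 2
    return np.sqrt(num / den)

def peak_T(m, k, c):
    w = np.linspace(0.1, 50.0, 5000)
    return transmissibility(w, m, k, c).max()

m, k, c = 60.0, 2.0e4, 400.0        # hypothetical seat/occupant values

# Finite-difference sensitivity of the peak transmissibility to damping:
h = 1.0
dT_dc = (peak_T(m, k, c + h) - peak_T(m, k, c - h)) / (2 * h)

assert peak_T(m, k, c) > 1.0        # resonance amplifies the base motion
assert dT_dc < 0.0                  # more damping lowers the peak
```

    Ranking such derivatives across all parameters is how the most and least sensitive parameters for the transmissibility ratio are identified.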

  1. Analysis of the sensitivity properties of a model of vector-borne bubonic plague.

    PubMed

    Buzby, Megan; Neckels, David; Antolin, Michael F; Estep, Donald

    2008-09-06

    Model sensitivity is a key to evaluation of mathematical models in ecology and evolution, especially in complex models with numerous parameters. In this paper, we use some recently developed methods for sensitivity analysis to study the parameter sensitivity of a model of vector-borne bubonic plague in a rodent population proposed by Keeling & Gilligan. The new sensitivity tools are based on a variational analysis involving the adjoint equation. The new approach provides a relatively inexpensive way to obtain derivative information about model output with respect to parameters. We use this approach to determine the sensitivity of a quantity of interest (the force of infection from rats and their fleas to humans) to various model parameters, determine a region over which linearization at a specific parameter reference point is valid, develop a global picture of the output surface, and search for maxima and minima in a given region in the parameter space.
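    The adjoint trick that makes these derivatives inexpensive is easiest to see on a steady linear model, where one extra adjoint solve yields the sensitivity of a scalar output to any number of parameters. A sketch (a generic linear system, not the plague ODE model):

```python
import numpy as np

# For A u = b and scalar output q = g^T u, solve the adjoint system
# A^T lam = g once; then dq/dp = lam^T (db/dp - (dA/dp) u) for EVERY
# parameter p, with no further solves.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
g = np.array([1.0, 0.0])

u = np.linalg.solve(A, b)         # forward solve
lam = np.linalg.solve(A.T, g)     # single adjoint solve

# Sensitivity to a parameter p that scales b (db/dp = b, dA/dp = 0):
dq_dp_adjoint = lam @ b

# Finite-difference check on q(p) = g^T A^{-1} (p*b) at p = 1:
eps = 1e-6
q = lambda p: g @ np.linalg.solve(A, p * b)
dq_dp_fd = (q(1 + eps) - q(1 - eps)) / (2 * eps)
assert np.isclose(dq_dp_adjoint, dq_dp_fd, atol=1e-6)
```

    For time-dependent models like the plague system, the adjoint equation is an ODE integrated backward in time, but the cost advantage over one perturbed solve per parameter is the same.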

  2. Design sensitivity analysis of nonlinear structural response

    NASA Technical Reports Server (NTRS)

    Cardoso, J. B.; Arora, J. S.

    1987-01-01

    A unified theory is described of design sensitivity analysis of linear and nonlinear structures for shape, nonshape and material selection problems. The concepts of reference volume and adjoint structure are used to develop the unified viewpoint. A general formula for design sensitivity analysis is derived. Simple analytical linear and nonlinear examples are used to interpret various terms of the formula and demonstrate its use.

  3. Sensitivity Analysis of Multicriteria Choice to Changes in Intervals of Value Tradeoffs

    NASA Astrophysics Data System (ADS)

    Podinovski, V. V.

    2018-03-01

    An approach is proposed to the sensitivity (stability) analysis of nondominated alternatives with respect to changes in the bounds of the intervals of value tradeoffs, where the alternatives are selected based on interval data on criteria tradeoffs. Computational methods are developed for analyzing the sensitivity of individual nondominated alternatives and of the set of such alternatives as a whole.

  4. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.

  5. Sensitivity analysis for large-scale problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
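    For the free-vibration eigenvalue problems mentioned, the classical sensitivity derivative is d(lam)/dp = v^T (dK/dp - lam dM/dp) v with an M-normalized eigenvector v. A small numerical check of that formula (the matrices are illustrative, not a framed-structure model):

```python
import numpy as np
from scipy.linalg import eigh

# Generalized eigenproblem K v = lam M v for a tiny "structure".
K = np.array([[5.0, -2.0],
              [-2.0, 3.0]])
M = np.eye(2)
lams, vecs = eigh(K, M)            # eigh returns M-normalized eigenvectors
lam0, v0 = lams[0], vecs[:, 0]

# Let the design parameter p scale the (0,0) stiffness entry, so
# dK/dp = K[0,0] * E00 and dM/dp = 0.
dK = np.zeros((2, 2))
dK[0, 0] = K[0, 0]
dlam_analytic = v0 @ dK @ v0       # eigenvalue sensitivity formula

# Finite-difference check by perturbing the stiffness directly:
eps = 1e-6
Kp, Km = K.copy(), K.copy()
Kp[0, 0] *= 1 + eps
Km[0, 0] *= 1 - eps
dlam_fd = (eigh(Kp, M, eigvals_only=True)[0]
           - eigh(Km, M, eigvals_only=True)[0]) / (2 * eps)
assert np.isclose(dlam_analytic, dlam_fd, atol=1e-5)
```

    Because the formula reuses the eigenpair from the baseline analysis, the derivative comes almost for free during reanalysis, which is the efficiency the abstract is after.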

  6. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. 
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.

  7. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, C. S.; Zhang, Hongbin

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.

  8. A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2016-01-01

    Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS) for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches is provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
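The bootstrap idea for confidence levels and ranking reliability can be illustrated with a simple stand-in sensitivity metric (absolute correlation rather than the actual VARS metrics; all data here are synthetic):

```python
import random

def abs_corr(x, y):
    """Absolute Pearson correlation, used here as a stand-in sensitivity metric."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return abs(num / den)

rng = random.Random(42)
n = 400
x1 = [rng.random() for _ in range(n)]   # strongly influential factor
x2 = [rng.random() for _ in range(n)]   # weakly influential factor
y = [2.0 * a + 0.3 * b + rng.gauss(0, 0.2) for a, b in zip(x1, x2)]

boot = []
wins = 0  # how often x1 outranks x2 across bootstrap replicates
for _ in range(500):
    # Resample (input, output) pairs with replacement and recompute the metric
    idx = [rng.randrange(n) for _ in range(n)]
    bx1 = [x1[i] for i in idx]
    bx2 = [x2[i] for i in idx]
    by = [y[i] for i in idx]
    s1, s2 = abs_corr(bx1, by), abs_corr(bx2, by)
    boot.append(s1)
    wins += s1 > s2
boot.sort()
ci = (boot[int(0.025 * 500)], boot[int(0.975 * 500)])  # 95% percentile interval
rank_reliability = wins / 500  # fraction of replicates preserving the ranking
```

A reliability near 1.0 means the inferred factor ranking is stable under sampling variability, which is the quantity STAR-VARS reports for its own metrics.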

  9. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.

  10. Optimization of Parameter Ranges for Composite Tape Winding Process Based on Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Tao; Shi, Yaoyao; He, Xiaodong; Kang, Chao; Deng, Bo; Song, Shibo

    2017-08-01

    This study focuses on the parameter sensitivity of the winding process for composite prepreg tape. Methods for multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis are proposed. A polynomial empirical model of interlaminar shear strength is established by the response surface experimental method. Using this model, the relative sensitivity of key process parameters, including temperature, tension, pressure, and velocity, is calculated, and the single-parameter sensitivity curves are obtained. From the analysis of the sensitivity curves, the stable and unstable ranges of each parameter are identified. Finally, an optimization method for the winding process parameters is developed. The analysis shows that the optimized ranges of the process parameters for interlaminar shear strength are: temperature within [100 °C, 150 °C], tension within [275 N, 387 N], pressure within [800 N, 1500 N], and velocity within [0.2 m/s, 0.4 m/s].

  11. Maternal sensitivity: a concept analysis.

    PubMed

    Shin, Hyunjeong; Park, Young-Joo; Ryu, Hosihn; Seomun, Gyeong-Ae

    2008-11-01

    The aim of this paper is to report a concept analysis of maternal sensitivity. Maternal sensitivity is a broad concept encompassing a variety of interrelated affective and behavioural caregiving attributes. It is used interchangeably with the terms maternal responsiveness or maternal competency, with no consistency of use. There is a need to clarify the concept of maternal sensitivity for research and practice. A search was performed on the CINAHL and Ovid MEDLINE databases using 'maternal sensitivity', 'maternal responsiveness' and 'sensitive mothering' as key words. The searches yielded 54 records for the years 1981-2007. Rodgers' method of evolutionary concept analysis was used to analyse the material. Four critical attributes of maternal sensitivity were identified: (a) dynamic process involving maternal abilities; (b) reciprocal give-and-take with the infant; (c) contingency on the infant's behaviour and (d) quality of maternal behaviours. Maternal identity and infant's needs and cues are antecedents for these attributes. The consequences are infant's comfort, mother-infant attachment and infant development. In addition, three positive affecting factors (social support, maternal-foetal attachment and high self-esteem) and three negative affecting factors (maternal depression, maternal stress and maternal anxiety) were identified. A clear understanding of the concept of maternal sensitivity could be useful for developing ways to enhance maternal sensitivity and to maximize the developmental potential of infants. Knowledge of the attributes of maternal sensitivity identified in this concept analysis may be helpful for constructing measuring items or dimensions.

  12. The application of sensitivity analysis to models of large scale physiological systems

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

    A survey of the literature on sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method is presented for reducing highly complex, nonlinear models to simple linear algebraic models that could be useful for making rapid, first-order calculations of system behavior.
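As a toy illustration of parameter sensitivity analysis on a simple population model (the logistic equation and parameter values here are generic examples, not taken from the survey):

```python
def simulate(r, K, n0=10.0, dt=0.01, steps=1000):
    """Forward-Euler integration of logistic growth dn/dt = r*n*(1 - n/K),
    returning the population at t = steps*dt."""
    n = n0
    for _ in range(steps):
        n += dt * r * n * (1.0 - n / K)
    return n

def sensitivity(param, base, rel_step=1e-4):
    """Central-difference sensitivity of the final population w.r.t. one parameter."""
    h = base[param] * rel_step
    up = dict(base)
    up[param] += h
    dn = dict(base)
    dn[param] -= h
    return (simulate(**up) - simulate(**dn)) / (2 * h)

base = {"r": 0.5, "K": 100.0}   # growth rate and carrying capacity
s_r = sensitivity("r", base)    # analytic value at t=10 is about 53.9
s_K = sensitivity("K", base)    # analytic value at t=10 is about 0.88
```

Comparing such derivatives tells the experimentalist which parameter deserves the tighter measurement: here the output is far more sensitive to the growth rate than to the carrying capacity.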

  13. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the not-missing-at-random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  14. Development of Flight Safety Prediction Methodology for U. S. Naval Safety Center. Revision 1

    DTIC Science & Technology

    1970-02-01

    The methodology developed encompassed functional analysis of the F-4J aircraft, assessment of the importance of safety-sensitive functions, major-function sensitivity assignment, link dependency assignment, and a computer program for sensitivity computation.

  15. Single-molecule detection: applications to ultrasensitive biochemical analysis

    NASA Astrophysics Data System (ADS)

    Castro, Alonso; Shera, E. Brooks

    1995-06-01

    Recent developments in laser-based detection of fluorescent molecules have made possible the implementation of very sensitive techniques for biochemical analysis. We present and discuss experiments applying our recently developed single-molecule detection technique to the analysis of molecules of biological interest. These newly developed methods are capable of detecting and identifying biomolecules at the single-molecule level of sensitivity. In one case, identification is based on measuring fluorescence brightness from single molecules. In another, molecules are classified by determining their electrophoretic velocities.

  16. Sensitivity analysis of add-on price estimate for select silicon wafering technologies

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1982-01-01

    The cost of producing wafers from silicon ingots is a major component of the add-on price of silicon sheet. Economic analyses of the add-on price estimates and their sensitivities are presented for internal-diameter (ID) sawing, multiblade slurry (MBS) sawing, and the fixed-abrasive slicing technique (FAST). Interim price estimation guidelines (IPEG) are used for estimating a process add-on price. Sensitivity analysis of price is performed with respect to cost parameters such as equipment, space, direct labor, materials (blade life), and utilities, and production parameters such as slicing rate, slices per centimeter, and process yield, using a computer program developed specifically for sensitivity analysis with IPEG. The results help identify the important cost parameters and assist in deciding the direction of technology development efforts.
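A sketch of this kind of price-sensitivity study follows; the price formula and cost figures below are invented stand-ins for the IPEG relations, which are not reproduced in the abstract:

```python
def addon_price(equipment, space, labor, materials, utilities,
                slicing_rate, yield_frac):
    """Hypothetical simplified price model: annualized costs divided by good
    output. The 0.3 equipment-amortization factor and 8760 h/yr are illustrative,
    not IPEG's actual coefficients."""
    annual_cost = 0.3 * equipment + space + labor + materials + utilities
    annual_output = slicing_rate * 8760.0 * yield_frac  # good wafers per year
    return annual_cost / annual_output

base = dict(equipment=500_000.0, space=20_000.0, labor=120_000.0,
            materials=80_000.0, utilities=15_000.0,
            slicing_rate=25.0, yield_frac=0.9)

def relative_sensitivity(param, step=0.1):
    """Percent change in price per percent change in one parameter
    (central difference over a +/-10% perturbation)."""
    hi = dict(base)
    hi[param] *= 1 + step
    lo = dict(base)
    lo[param] *= 1 - step
    p0 = addon_price(**base)
    return (addon_price(**hi) - addon_price(**lo)) / (2 * step * p0)

sens = {p: relative_sensitivity(p) for p in base}
ranked = sorted(sens, key=lambda p: abs(sens[p]), reverse=True)
```

For linear cost terms the relative sensitivity equals that term's share of annual cost, while throughput parameters (slicing rate, yield) have elasticity near -1, which is why production parameters usually dominate such rankings.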

  17. Efficient sensitivity analysis method for chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Liao, Haitao

    2016-05-01

    The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time-averaged quantities of chaotic dynamical systems. The key idea is to recast the time-averaged integration term in the form of a differential equation before applying the sensitivity analysis method. An additional constraint-based equation, which forms the augmented equations of motion, is proposed to calculate the time-averaged integration variable, and the sensitivity coefficients are obtained by solving the augmented differential equations. Applying the least squares shadowing formulation to the augmented equations yields an explicit expression for the sensitivity coefficient that depends on the final state of the Lagrange multipliers. An LU factorization technique for calculating the Lagrange multipliers improves both convergence and computational expense. Numerical experiments on a set of problems selected from the literature illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches, and some short impulsive sensitivity coefficients are observed when using the direct differentiation sensitivity analysis method.

  18. Eigenvalue and eigenvector sensitivity and approximate analysis for repeated eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Hou, Gene J. W.; Kenny, Sean P.

    1991-01-01

    A set of computationally efficient equations for eigenvalue and eigenvector sensitivity analysis is derived, and a method for approximate eigenvalue and eigenvector analysis in the presence of repeated eigenvalues is presented. The method developed for approximate analysis involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations of changes in both the eigenvalues and eigenvectors associated with the repeated eigenvalue problem. Examples are given to demonstrate the application of these equations for sensitivity and approximate analysis.

  19. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

    These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application as computed by MCNP6, along with covariance files for the nuclear data, to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6 and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  20. Results of an integrated structure-control law design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1988-01-01

    Next-generation air and space vehicle designs are driven by increased performance requirements, demanding a high level of integration between traditionally separate design disciplines. Interdisciplinary analysis capabilities have been developed, for instance for aeroservoelastic aircraft and large flexible spacecraft control, but the requisite integrated design methods are only beginning to be developed. One integrated design method which has received attention is based on hierarchical problem decomposition, optimization, and design sensitivity analysis. This paper highlights a design sensitivity analysis method for Linear Quadratic Gaussian (LQG) optimal control laws which predicts the change in the optimal control law due to changes in fixed problem parameters using analytical sensitivity equations. Numerical results of a design sensitivity analysis for a realistic aeroservoelastic aircraft example are presented. In this example, the sensitivity of the optimally controlled aircraft's response to various problem-formulation and physical aircraft parameters is determined. These results are used to predict the aircraft's new optimally controlled response if a parameter were to have some other nominal value during the control law design process. The sensitivity results are validated by recomputing the optimal control law for discrete variations in parameters, computing the new actual aircraft response, and comparing it with the predicted response. The results show an improvement in sensitivity accuracy for integrated design purposes over methods which do not include changes in the optimal control law. Use of the analytical LQG sensitivity expressions is also shown to be more efficient than finite difference methods for computing the equivalent sensitivity information.

  1. Experiences on p-Version Time-Discontinuous Galerkin's Method for Nonlinear Heat Transfer Analysis and Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    2004-01-01

    The focus of this research is the development of analysis and sensitivity analysis equations for nonlinear, transient heat transfer problems modeled by a p-version, time-discontinuous finite element approximation. The resulting matrix equation of the state equation is simply of the form A(x)x = c, representing a single-step, time-marching scheme. The Newton-Raphson method is used to solve the nonlinear equation. Examples are first provided to demonstrate the accuracy characteristics of the resultant finite element approximation. A direct differentiation approach is then used to compute the thermal sensitivities of a nonlinear heat transfer problem. The report shows that only minimal coding effort is required to enhance the analysis code with the sensitivity analysis capability.
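The single-step scheme A(x)x = c, solved by Newton-Raphson and then differentiated directly, can be sketched on a scalar analogue (the particular A(x) below is an arbitrary illustration, not the report's thermal model):

```python
def solve(c, x0=1.5, tol=1e-12, max_iter=50):
    """Newton-Raphson for the scalar analogue of A(x) x = c with A(x) = 1 + x**2,
    i.e. residual R(x) = x + x**3 - c."""
    x = x0
    for _ in range(max_iter):
        R = x + x ** 3 - c
        dR = 1.0 + 3.0 * x ** 2      # tangent (Jacobian) of the residual
        x_new = x - R / dR
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def sensitivity(c):
    """Direct differentiation: differentiating R(x(c)) = 0 w.r.t. c gives
    (dR/dx) * dx/dc = 1, so dx/dc = 1 / (1 + 3 x**2) at the converged state.
    Note the same tangent used by Newton is reused, which is why the extra
    coding effort for sensitivities is minimal."""
    x = solve(c)
    return 1.0 / (1.0 + 3.0 * x ** 2)

x = solve(2.0)        # exact root of x + x**3 = 2 is x = 1
s = sensitivity(2.0)  # analytic dx/dc = 0.25
```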

  2. Development and application of optimum sensitivity analysis of structures

    NASA Technical Reports Server (NTRS)

    Barthelemy, J. F. M.; Hallauer, W. L., Jr.

    1984-01-01

    The research focused on developing an algorithm applying optimum sensitivity analysis for multilevel optimization. The research efforts were devoted to assisting NASA Langley's Interdisciplinary Research Office (IRO) in the development of a mature methodology for a multilevel approach to the design of complex (large and multidisciplinary) engineering systems. An effort was undertaken to identify promising multilevel optimization algorithms. In the current reporting period, the computer program generating baseline single-level solutions was completed and tested.

  3. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provide unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  4. Sensitivity analysis and multidisciplinary optimization for aircraft design: Recent advances and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    Optimization by decomposition, complex system sensitivity analysis, and a rapid growth of disciplinary sensitivity analysis are some of the recent developments that hold promise of a quantum jump in the support engineers receive from computers in the quantitative aspects of design. Review of the salient points of these techniques is given and illustrated by examples from aircraft design as a process that combines the best of human intellect and computer power to manipulate data.

  5. Sensitivity analysis of the add-on price estimate for the edge-defined film-fed growth process

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.; Kachare, A. H.

    1981-01-01

    The analysis is in terms of cost parameters and production parameters. The cost parameters include equipment, space, direct labor, materials, and utilities. The production parameters include growth rate, process yield, and duty cycle. A computer program was developed specifically to do the sensitivity analysis.

  6. Development of Solution Algorithm and Sensitivity Analysis for Random Fuzzy Portfolio Selection Model

    NASA Astrophysics Data System (ADS)

    Hasuike, Takashi; Katagiri, Hideki

    2010-10-01

    This paper proposes a portfolio selection problem that considers an investor's subjectivity, together with a sensitivity analysis for changes in that subjectivity. Since the proposed problem is formulated as a random fuzzy programming problem, owing to both randomness and subjectivity represented by fuzzy numbers, it is not well defined. Therefore, by introducing the Sharpe ratio, one of the important performance measures for portfolio models, the main problem is transformed into a standard fuzzy programming problem. Furthermore, using sensitivity analysis for the fuzziness, the analytical optimal portfolio with the sensitivity factor is obtained.
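The Sharpe-ratio step can be illustrated deterministically; the return and covariance numbers below are invented, and the crude grid search stands in for the paper's analytical treatment of fuzziness:

```python
def sharpe(weights, means, cov, rf=0.0):
    """Sharpe ratio = (expected portfolio return - risk-free rate) / volatility."""
    mu = sum(w * m for w, m in zip(weights, means))
    var = sum(wi * wj * cov[i][j]
              for i, wi in enumerate(weights)
              for j, wj in enumerate(weights))
    return (mu - rf) / var ** 0.5

# Two hypothetical assets: expected returns and covariance matrix
means = [0.08, 0.12]
cov = [[0.04, 0.01],
       [0.01, 0.09]]

# Grid search over the weight of asset 0 (long-only, fully invested);
# the tangency-portfolio solution for these numbers is w = 0.6
best_w = max((i / 100 for i in range(101)),
             key=lambda w: sharpe([w, 1 - w], means, cov))
```

In the paper, the means and covariances would themselves carry fuzziness, and the sensitivity analysis tracks how the optimal weights move as that fuzziness changes.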

  7. Multiculturally Sensitive Mental Health Scale (MSMHS): Development, Factor Analysis, Reliability, and Validity

    ERIC Educational Resources Information Center

    Chao, Ruth Chu-Lien; Green, Kathy E.

    2011-01-01

    Effectively and efficiently diagnosing African Americans' mental health has been a chronically unresolved challenge. To meet this challenge we developed a tool to better understand African Americans' mental health: the Multiculturally Sensitive Mental Health Scale (MSMHS). Three studies reporting the development and initial validation of the MSMHS…

  8. A three-dimensional cohesive sediment transport model with data assimilation: Model development, sensitivity analysis and parameter estimation

    NASA Astrophysics Data System (ADS)

    Wang, Daosheng; Cao, Anzhou; Zhang, Jicai; Fan, Daidu; Liu, Yongzhi; Zhang, Yue

    2018-06-01

    Based on the theory of inverse problems, a three-dimensional sigma-coordinate cohesive sediment transport model with the adjoint data assimilation is developed. In this model, the physical processes of cohesive sediment transport, including deposition, erosion and advection-diffusion, are parameterized by corresponding model parameters. These parameters are usually poorly known and have traditionally been assigned empirically. By assimilating observations into the model, the model parameters can be estimated using the adjoint method; meanwhile, the data misfit between model results and observations can be decreased. The model developed in this work contains numerous parameters; therefore, it is necessary to investigate the parameter sensitivity of the model, which is assessed by calculating a relative sensitivity function and the gradient of the cost function with respect to each parameter. The results of parameter sensitivity analysis indicate that the model is sensitive to the initial conditions, inflow open boundary conditions, suspended sediment settling velocity and resuspension rate, while the model is insensitive to horizontal and vertical diffusivity coefficients. A detailed explanation of the pattern of sensitivity analysis is also given. In ideal twin experiments, constant parameters are estimated by assimilating 'pseudo' observations. The results show that the sensitive parameters are estimated more easily than the insensitive parameters. The conclusions of this work can provide guidance for the practical applications of this model to simulate sediment transport in the study area.

  9. Skeletal Mechanism Generation of Surrogate Jet Fuels for Aeropropulsion Modeling

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Jen; Niemeyer, Kyle E.

    2010-05-01

    A novel implementation for the skeletal reduction of large detailed reaction mechanisms using the directed relation graph with error propagation and sensitivity analysis (DRGEPSA) is developed and presented, with skeletal reductions of two important hydrocarbon components, n-heptane and n-decane, relevant to surrogate jet fuel development. DRGEPSA integrates two previously developed methods, directed relation graph-aided sensitivity analysis (DRGASA) and directed relation graph with error propagation (DRGEP): DRGEP is first applied to efficiently remove many unimportant species, and sensitivity analysis then removes further unimportant species, producing an optimally small skeletal mechanism for a given error limit. It is illustrated that the combination of the DRGEP and DRGASA methods allows the DRGEPSA approach to overcome the weaknesses of each previous method, specifically that DRGEP cannot identify all unimportant species and that DRGASA shields unimportant species from removal.
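A toy sketch of DRGEP-style importance propagation follows; the species names and interaction coefficients are made up, and a real mechanism reduction would compute the coefficients from reaction rates:

```python
def drgep_importance(graph, target):
    """Propagate importance from the target species through the directed
    relation graph: R(target -> S) = max over paths of the product of edge
    interaction coefficients (error propagation). A Bellman-Ford-style
    max-product relaxation is fine for small illustrative graphs."""
    R = {s: 0.0 for s in graph}
    R[target] = 1.0
    for _ in range(len(graph)):
        for a in graph:
            for b, w in graph[a].items():
                if R[a] * w > R[b]:
                    R[b] = R[a] * w
    return R

# Hypothetical direct interaction coefficients (not from a real mechanism)
graph = {
    "fuel":  {"rad1": 0.9, "rad2": 0.5},
    "rad1":  {"prod": 0.8, "minor": 0.05},
    "rad2":  {"prod": 0.6},
    "prod":  {},
    "minor": {"trace": 0.5},
    "trace": {},
}

R = drgep_importance(graph, "fuel")
skeletal = {s for s, r in R.items() if r >= 0.1}  # error-threshold cut
```

Species surviving the cut form the skeletal mechanism; in DRGEPSA, the borderline species remaining after this step are then screened by sensitivity analysis.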

  10. Efficient sensitivity analysis and optimization of a helicopter rotor

    NASA Technical Reports Server (NTRS)

    Lim, Joon W.; Chopra, Inderjit

    1989-01-01

    Aeroelastic optimization of a system essentially consists of determining the optimum values of design variables that minimize the objective function and satisfy certain aeroelastic and geometric constraints. The process of aeroelastic optimization analysis is illustrated. To carry out aeroelastic optimization effectively, one needs a reliable analysis procedure to determine the steady response and stability of a rotor system in forward flight. The rotor dynamic analysis used in the present study, developed in-house at the University of Maryland, is based on finite elements in space and time. The analysis consists of two major phases: vehicle trim and rotor steady response (coupled trim analysis), and aeroelastic stability of the blade. For reduction of helicopter vibration, the optimization process requires the sensitivity derivatives of the objective function and the aeroelastic stability constraints. For this, the derivatives of steady response, hub loads, and blade stability roots are calculated using a direct analytical approach. An automated optimization procedure is developed by coupling the rotor dynamic analysis, design sensitivity analysis, and the constrained optimization code CONMIN.

  11. The Model Optimization, Uncertainty, and SEnsitivity analysis (MOUSE) toolbox: overview and application

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  12. A New Framework for Effective and Efficient Global Sensitivity Analysis of Earth and Environmental Systems Models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin

    2015-04-01

    Earth and Environmental Systems (EES) models are essential components of research, development, and decision-making in science and engineering disciplines. With continuous advances in understanding and computing power, such models are becoming more complex with increasingly more factors to be specified (model parameters, forcings, boundary conditions, etc.). To facilitate better understanding of the role and importance of different factors in producing the model responses, the procedure known as 'Sensitivity Analysis' (SA) can be very helpful. Despite the availability of a large body of literature on the development and application of various SA approaches, two issues continue to pose major challenges: (1) Ambiguous Definition of Sensitivity - Different SA methods are based in different philosophies and theoretical definitions of sensitivity, and can result in different, even conflicting, assessments of the underlying sensitivities for a given problem, (2) Computational Cost - The cost of carrying out SA can be large, even excessive, for high-dimensional problems and/or computationally intensive models. In this presentation, we propose a new approach to sensitivity analysis that addresses the dual aspects of 'effectiveness' and 'efficiency'. By effective, we mean achieving an assessment that is both meaningful and clearly reflective of the objective of the analysis (the first challenge above), while by efficiency we mean achieving statistically robust results with minimal computational cost (the second challenge above). Based on this approach, we develop a 'global' sensitivity analysis framework that efficiently generates a newly-defined set of sensitivity indices that characterize a range of important properties of metric 'response surfaces' encountered when performing SA on EES models. 
Further, we show how this framework embraces, and is consistent with, a spectrum of different concepts regarding 'sensitivity', and that commonly-used SA approaches (e.g., Sobol, Morris, etc.) are actually limiting cases of our approach under specific conditions. Multiple case studies are used to demonstrate the value of the new framework. The results show that the new framework provides a fundamental understanding of the underlying sensitivities for any given problem, while requiring orders of magnitude fewer model runs.
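The scale-dependent, variogram-based view of sensitivity underlying this framework can be sketched as follows; this is a simplified illustration of the idea (one factor perturbed at a time on a toy response surface), not the STAR-VARS sampling strategy itself:

```python
import math
import random

def response(x1, x2):
    # Toy response surface: smooth large-scale trend in x1,
    # low-amplitude high-frequency structure in x2
    return x1 ** 2 + 0.1 * math.sin(20 * x2)

def variogram(factor, h, n=2000):
    """Directional variogram gamma(h) = 0.5 * mean squared response change
    for a perturbation h along one factor, the other factor held fixed."""
    rng = random.Random(0)
    total = 0.0
    for _ in range(n):
        x1, x2 = rng.uniform(0, 1 - h), rng.uniform(0, 1 - h)
        if factor == 0:
            d = response(x1 + h, x2) - response(x1, x2)
        else:
            d = response(x1, x2 + h) - response(x1, x2)
        total += d * d
    return 0.5 * total / n

# Sensitivity depends on the scale h at which you look:
g1_small = variogram(0, 0.01)   # factor 1 at small scale
g2_small = variogram(1, 0.01)   # factor 2 at small scale
g1_large = variogram(0, 0.5)    # factor 1 at large scale
g2_large = variogram(1, 0.5)    # factor 2 at large scale
```

For this surface the high-frequency factor dominates at small scales while the trend factor dominates at large scales, which is exactly the ambiguity (derivative-based vs. variance-based answers) that the variogram spectrum is meant to expose.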

  13. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  14. A Very Much Faster and More Sensitive In Situ Stable Isotope Analysis Instrument

    NASA Astrophysics Data System (ADS)

    Coleman, M.; Christensen, L. E.; Kriesel, J. M.; Kelly, J. F.; Moran, J. J.; Vance, S.

    2016-10-01

    We are developing Capillary Absorption Spectrometry (CAS) for H and O stable isotope analyses, giving more than 4 orders of magnitude improved sensitivity, allowing analysis of 5 nanomoles of water, and coupled to laser sampling to free water from hydrated minerals and ice.

  15. Design component method for sensitivity analysis of built-up structures

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Seong, Hwai G.

    1986-01-01

    A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.
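The "adding contributions from each component" idea can be shown with a trivial built-up structure; the component data below are illustrative:

```python
# Each component of the built-up structure contributes independently to the
# system response (here: total weight), so the system design sensitivity is
# the sum of per-component sensitivities.

components = [
    # (name, density kg/m^3, cross-section m^2, length m) -- illustrative bars
    ("spar", 2700.0, 0.002, 1.5),
    ("rib",  2700.0, 0.001, 0.4),
    ("skin", 2700.0, 0.0005, 2.0),
]

def total_weight(comps):
    """System response assembled by summing over components."""
    return sum(rho * area * length for _, rho, area, length in comps)

def dweight_dA(comps, name):
    """Sensitivity of system weight w.r.t. one component's cross-section:
    only that component's term contributes to the sum."""
    return sum(rho * length for n, rho, _, length in comps if n == name)

w = total_weight(components)                  # 11.88 kg
s_spar = dweight_dA(components, "spar")       # 2700 * 1.5 = 4050 kg per m^2
```

A real finite element implementation organizes the computation the same way: each element (component) supplies its own sensitivity term, and the system sensitivity is their assembly.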

  16. Ethical Sensitivity in Nursing Ethical Leadership: A Content Analysis of Iranian Nurses' Experiences

    PubMed Central

    Esmaelzadeh, Fatemeh; Abbaszadeh, Abbas; Borhani, Fariba; Peyrovi, Hamid

    2017-01-01

    Background: Considering that many nursing actions affect other people's health and life, sensitivity to ethics in nursing practice is highly important for ethical leaders as role models. Objective: The study aims to explore ethical sensitivity in ethical nursing leaders in Iran. Method: This was a qualitative study based on conventional content analysis, conducted in 2015. Data were collected using deep, semi-structured interviews with 20 Iranian nurses. Participants were chosen using purposive sampling, and data were analyzed using conventional content analysis. To increase the accuracy and integrity of the data, Lincoln and Guba's criteria were considered. Results: Fourteen sub-categories and five main categories emerged. The main categories were sensitivity to care, sensitivity to errors, sensitivity to communication, sensitivity in decision making, and sensitivity to ethical practice. Conclusion: Ethical sensitivity appears to be a valuable attribute for ethical nurse leaders, having an important effect on various aspects of professional practice and helping the development of ethics in nursing practice. PMID:28584564

  17. Ethanol-nicotine interactions in long-sleep and short-sleep mice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Fiebre, C.M.; Marks, M.J.; Collins, A.C.

The possibility that common genetic factors regulate initial sensitivities to ethanol and nicotine as well as the development of cross-tolerance between these agents was explored using the long-sleep (LS) and short-sleep (SS) mice. The LS mice proved to be more sensitive to an acute challenge with nicotine than were the SS mice. Segregation analysis (F1, F2, backcross) indicated that ethanol sensitivity and nicotine sensitivity segregate together. Acute pretreatment with nicotine did not significantly affect sensitivity to ethanol, but ethanol pretreatment altered nicotine responsiveness. The LS mice develop more tolerance to nicotine and ethanol than do the SS and they also develop more cross-tolerance. These genetically determined differences in initial sensitivities, and tolerance and cross-tolerance development are not readily explained by differences in brain nicotinic receptor numbers.

  18. New Uses for Sensitivity Analysis: How Different Movement Tasks Effect Limb Model Parameter Sensitivity

    NASA Technical Reports Server (NTRS)

    Winters, J. M.; Stark, L.

    1984-01-01

Original results for a newly developed eighth-order nonlinear limb antagonistic muscle model of elbow flexion and extension are presented. A wide variety of sensitivity analysis techniques are used and a systematic protocol is established that shows how the different methods can be used efficiently to complement one another for maximum insight into model sensitivity. It is explicitly shown how the sensitivity of output behaviors to model parameters is a function of the controller input sequence, i.e., of the movement task. When the task is changed (for instance, from an input sequence that results in the usual fast movement task to a slower movement that may also involve external loading, etc.) the set of parameters with high sensitivity will in general also change. Such task-specific use of sensitivity analysis techniques identifies the set of parameters most important for a given task, and even suggests task-specific model reduction possibilities.
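The task-dependence the abstract describes can be illustrated on a much simpler stand-in than the eighth-order model: a hypothetical second-order (single-segment) limb model, with central-difference relative sensitivities of peak displacement computed under a fast pulse input and a slow sustained load. All parameter values and input sequences below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def peak_displacement(b, k, u, m=1.0, dt=1e-3, T=2.0):
    """Integrate m*x'' + b*x' + k*x = u(t) (semi-implicit Euler); return peak |x|."""
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(int(T / dt)):
        a = (u(i * dt) - b * v - k * x) / m
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

def rel_sensitivity(name, params, u, h=1e-4):
    """Central-difference relative sensitivity (dy/dp)*(p/y) of the peak."""
    hi, lo = dict(params), dict(params)
    hi[name] *= 1 + h
    lo[name] *= 1 - h
    y = peak_displacement(params['b'], params['k'], u)
    dy = peak_displacement(hi['b'], hi['k'], u) - peak_displacement(lo['b'], lo['k'], u)
    return dy / (2 * h * y)

params = {'b': 1.0, 'k': 10.0}                            # illustrative values
tasks = {'fast': lambda t: 10.0 if t < 0.05 else 0.0,     # brief pulse
         'slow': lambda t: 2.0}                           # sustained load

results = {name: (rel_sensitivity('b', params, u), rel_sensitivity('k', params, u))
           for name, u in tasks.items()}
for name, (sb, sk) in results.items():
    print(f"{name}: S_b = {sb:+.3f}, S_k = {sk:+.3f}")
```

In this toy model the stiffness sensitivity is markedly larger for the sustained-load task than for the pulse task, echoing the abstract's point that the high-sensitivity parameter set changes with the movement task.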

  19. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE PAGES

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    2016-09-12

Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
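The variance-based ranking the abstract builds on can be sketched with the classical pick-freeze Monte Carlo estimator for first-order Sobol indices, here on the standard Ishigami test function (the distributional extension, DSA, is more involved and not shown):

```python
import numpy as np

def ishigami(X, a=7.0, b=0.1):
    """Standard Ishigami test function with known Sobol indices."""
    return (np.sin(X[:, 0]) + a * np.sin(X[:, 1]) ** 2
            + b * X[:, 2] ** 4 * np.sin(X[:, 0]))

rng = np.random.default_rng(0)
N, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (N, d))   # two independent sample matrices
B = rng.uniform(-np.pi, np.pi, (N, d))
fA, fB = ishigami(A), ishigami(B)
V = np.var(np.concatenate([fA, fB]))     # total output variance

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # "pick-freeze": replace column i
    S.append(np.mean(fB * (ishigami(ABi) - fA)) / V)

print("first-order Sobol indices:", np.round(S, 3))  # analytic: 0.314, 0.442, 0.0
```

Ranking inputs by these indices is exactly the "contribution to output variance" ordering the abstract refers to; DSA then asks how that ranking changes when input variance can only be partially reduced.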

  20. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.

  1. Acceleration and sensitivity analysis of lattice kinetic Monte Carlo simulations using parallel processing and rate constant rescaling

    NASA Astrophysics Data System (ADS)

    Núñez, M.; Robie, T.; Vlachos, D. G.

    2017-10-01

    Kinetic Monte Carlo (KMC) simulation provides insights into catalytic reactions unobtainable with either experiments or mean-field microkinetic models. Sensitivity analysis of KMC models assesses the robustness of the predictions to parametric perturbations and identifies rate determining steps in a chemical reaction network. Stiffness in the chemical reaction network, a ubiquitous feature, demands lengthy run times for KMC models and renders efficient sensitivity analysis based on the likelihood ratio method unusable. We address the challenge of efficiently conducting KMC simulations and performing accurate sensitivity analysis in systems with unknown time scales by employing two acceleration techniques: rate constant rescaling and parallel processing. We develop statistical criteria that ensure sufficient sampling of non-equilibrium steady state conditions. Our approach provides the twofold benefit of accelerating the simulation itself and enabling likelihood ratio sensitivity analysis, which provides further speedup relative to finite difference sensitivity analysis. As a result, the likelihood ratio method can be applied to real chemistry. We apply our methodology to the water-gas shift reaction on Pt(111).
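The likelihood ratio (score-function) idea can be sketched on a deliberately simple stochastic surrogate rather than a KMC lattice: for N ~ Poisson(k), d/dk E[f(N)] = E[f(N) (N/k - 1)], so the derivative is estimated from the same samples used for the mean. Everything below is a toy illustration, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
k, n = 2.0, 200_000
N = rng.poisson(k, n).astype(float)

# Score function of the Poisson likelihood: d/dk log p(N; k) = N/k - 1
score = N / k - 1.0

f = N                          # observable of interest; E[f] = k, so dE[f]/dk = 1 exactly
lr_grad = np.mean(f * score)   # likelihood-ratio estimate of dE[f]/dk

# Finite-difference alternative: needs fresh simulations at perturbed k,
# and its noise grows as the step shrinks.
h = 0.1
fd_grad = (rng.poisson(k + h, n).mean() - rng.poisson(k - h, n).mean()) / (2 * h)

print(lr_grad, fd_grad)  # both estimate 1.0, but LR reuses the nominal samples
```

This reuse of nominal-parameter trajectories is the source of the speedup over finite differences that the abstract mentions; the rescaling and parallelization machinery addresses the separate problem of stiffness.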

  2. Parallel Computing and Model Evaluation for Environmental Systems: An Overview of the Supermuse and Frames Software Technologies

    EPA Science Inventory

    ERD’s Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) is a key to enhancing quality assurance in environmental models and applications. Uncertainty analysis and sensitivity analysis remain critical, though often overlooked steps in the development and e...

  3. Nuclear morphology for the detection of alterations in bronchial cells from lung cancer: an attempt to improve sensitivity and specificity.

    PubMed

    Fafin-Lefevre, Mélanie; Morlais, Fabrice; Guittet, Lydia; Clin, Bénédicte; Launoy, Guy; Galateau-Sallé, Françoise; Plancoulaine, Benoît; Herlin, Paulette; Letourneux, Marc

    2011-08-01

To identify which morphologic or densitometric parameters are modified in cell nuclei from bronchopulmonary cancer based on 18 parameters involving shape, intensity, chromatin, texture, and DNA content and develop a bronchopulmonary cancer screening method relying on analysis of sputum sample cell nuclei. A total of 25 sputum samples from controls and 22 bronchial aspiration samples from patients presenting with bronchopulmonary cancer who were occupationally exposed to carcinogens were used. After Feulgen staining, 18 morphologic and DNA content parameters were measured on cell nuclei, via image cytometry. A method was developed for analyzing distribution quantiles, compared with simply interpreting mean values, to characterize morphologic modifications in cell nuclei. Distribution analysis of parameters enabled us to distinguish 13 of 18 parameters that demonstrated significant differences between controls and cancer cases. These parameters, used alone, enabled us to distinguish two population types, with both sensitivity and specificity > 70%. Three parameters offered 100% sensitivity and specificity. When mean values offered high sensitivity and specificity, comparable or higher sensitivity and specificity values were observed for at least one of the corresponding quantiles. Analysis of modification in morphologic parameters via distribution analysis proved promising for screening bronchopulmonary cancer from sputum.
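The quantile-versus-mean point generalizes beyond cytometry. A hypothetical sketch with synthetic "nuclear area" values, where only a small subpopulation of cancer-sample nuclei is abnormal, shows why upper quantiles separate the groups far better than means (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic "nuclear area" parameter: controls are homogeneous, while in
# cancer samples only ~15% of nuclei are abnormal (hypothetical numbers).
control = rng.normal(10.0, 1.0, n)
cancer = np.where(rng.random(n) < 0.15,
                  rng.normal(15.0, 1.0, n),   # abnormal subpopulation
                  rng.normal(10.0, 1.0, n))

mean_shift = cancer.mean() - control.mean()
q90_shift = np.quantile(cancer, 0.9) - np.quantile(control, 0.9)

print(f"mean shift: {mean_shift:.2f}, 90th-percentile shift: {q90_shift:.2f}")
```

Because the abnormal nuclei sit in the upper tail, the 90th-percentile shift is several times the mean shift, which is the intuition behind comparing distribution quantiles rather than means.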

  4. What Do We Mean By Sensitivity Analysis? The Need For A Comprehensive Characterization Of Sensitivity In Earth System Models

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Gupta, H. V.

    2014-12-01

    Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.

  5. Ethical sensitivity in professional practice: concept analysis.

    PubMed

    Weaver, Kathryn; Morse, Janice; Mitcham, Carl

    2008-06-01

    This paper is a report of a concept analysis of ethical sensitivity. Ethical sensitivity enables nurses and other professionals to respond morally to the suffering and vulnerability of those receiving professional care and services. Because of its significance to nursing and other professional practices, ethical sensitivity deserves more focused analysis. A criteria-based method oriented toward pragmatic utility guided the analysis of 200 papers and books from the fields of nursing, medicine, psychology, dentistry, clinical ethics, theology, education, law, accounting or business, journalism, philosophy, political and social sciences and women's studies. This literature spanned 1970 to 2006 and was sorted by discipline and concept dimensions and examined for concept structure and use across various contexts. The analysis was completed in September 2007. Ethical sensitivity in professional practice develops in contexts of uncertainty, client suffering and vulnerability, and through relationships characterized by receptivity, responsiveness and courage on the part of professionals. Essential attributes of ethical sensitivity are identified as moral perception, affectivity and dividing loyalties. Outcomes include integrity preserving decision-making, comfort and well-being, learning and professional transcendence. Our findings promote ethical sensitivity as a type of practical wisdom that pursues client comfort and professional satisfaction with care delivery. The analysis and resulting model offers an inclusive view of ethical sensitivity that addresses some of the limitations with prior conceptualizations.

  6. Computational aspects of sensitivity calculations in linear transient structural analysis. Ph.D. Thesis - Virginia Polytechnic Inst. and State Univ.

    NASA Technical Reports Server (NTRS)

    Greene, William H.

    1990-01-01

    A study was performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal of the study was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semi-analytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models. In several cases this fixed mode approach resulted in very poor approximations of the stress sensitivities. Almost all of the original modes were required for an accurate sensitivity and for small numbers of modes, the accuracy was extremely poor. To overcome this poor accuracy, two semi-analytical techniques were developed. The first technique accounts for the change in eigenvectors through approximate eigenvector derivatives. The second technique applies the mode acceleration method of transient analysis to the sensitivity calculations. Both result in accurate values of the stress sensitivities with a small number of modes and much lower computational costs than if the vibration modes were recalculated and then used in an overall finite difference method.
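The contrast between overall finite differencing and the semi-analytical approach is easiest to see in a static analogue (the paper's transient, mode-reduced setting is more elaborate). For a hypothetical 3-DOF spring chain with K u = f, the semi-analytical route differentiates the governing equation, K du/dp = -(dK/dp) u, approximating only the coefficient-matrix derivative dK/dp by finite differences:

```python
import numpy as np

def stiffness(k):
    """Assemble the stiffness matrix of a 3-DOF spring chain fixed at one end."""
    k1, k2, k3 = k
    return np.array([[k1 + k2, -k2,      0.0],
                     [-k2,      k2 + k3, -k3],
                     [0.0,     -k3,       k3]])

k = np.array([100.0, 80.0, 60.0])
f = np.array([0.0, 0.0, 1.0])          # unit tip load
u = np.linalg.solve(stiffness(k), f)

# Semi-analytical: differentiate K u = f  ->  K du/dk1 = -(dK/dk1) u,
# with dK/dk1 itself approximated by finite differences of the assembly.
h = 1e-6
dK = (stiffness(k + [h, 0, 0]) - stiffness(k - [h, 0, 0])) / (2 * h)
du_semi = np.linalg.solve(stiffness(k), -dK @ u)

# Overall finite difference: re-solve the whole system at perturbed designs.
hp = 1e-4
du_fd = (np.linalg.solve(stiffness(k + [hp, 0, 0]), f)
         - np.linalg.solve(stiffness(k - [hp, 0, 0]), f)) / (2 * hp)

print(du_semi, du_fd)  # both approximate the analytic value -1/k1**2 per DOF
```

In the paper's transient problems the "overall" route additionally requires deciding whether to reuse the original vibration modes for the perturbed designs, which is exactly where the fixed-mode accuracy problems described above arise.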

  7. Development of a noise annoyance sensitivity scale

    NASA Technical Reports Server (NTRS)

    Bregman, H. L.; Pearson, R. G.

    1972-01-01

Examining the problem of noise pollution from the psychological rather than the engineering view, a test of human sensitivity to noise was developed against the criterion of noise annoyance. Test development evolved from a previous study in which biographical, attitudinal, and personality data were collected on a sample of 166 subjects drawn from the adult community of Raleigh. Analysis revealed that only a small subset of the data collected was predictive of noise annoyance. Item analysis yielded 74 predictive items that composed the preliminary noise sensitivity test. This was administered to a sample of 80 adults who later rated the annoyance value of six sounds (equated in terms of peak sound pressure level) presented in a simulated home, living-room environment. A predictive model involving 20 test items was developed using multiple regression techniques, and an item weighting scheme was evaluated.

  8. [Tourism function zoning of Jinyintan Grassland Scenic Area in Qinghai Province based on ecological sensitivity analysis].

    PubMed

    Zhong, Lin-sheng; Tang, Cheng-cai; Guo, Hua

    2010-07-01

    Based on the statistical data of natural ecology and social economy in Jinyintan Grassland Scenic Area in Qinghai Province in 2008, an evaluation index system for the ecological sensitivity of this area was established from the aspects of protected area rank, vegetation type, slope, and land use type. The ecological sensitivity of the sub-areas with higher tourism value and ecological function in the area was evaluated, and the tourism function zoning of these sub-areas was made by the technology of GIS and according to the analysis of eco-environmental characteristics and ecological sensitivity of each sensitive sub-area. It was suggested that the Jinyintan Grassland Scenic Area could be divided into three ecological sensitivity sub-areas (high, moderate, and low), three tourism functional sub-areas (restricted development ecotourism, moderate development ecotourism, and mass tourism), and six tourism functional sub-areas (wetland protection, primitive ecological sightseeing, agriculture and pasture tourism, grassland tourism, town tourism, and rural tourism).

  9. Examining the accuracy of the infinite order sudden approximation using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Eno, Larry; Rabitz, Herschel

    1981-08-01

A method is developed for assessing the accuracy of scattering observables calculated within the framework of the infinite order sudden (IOS) approximation. In particular, we focus on the energy sudden assumption of the IOS method and our approach involves the determination of the sensitivity of the IOS scattering matrix S(IOS) with respect to a parameter which reintroduces the internal energy operator h0 into the IOS Hamiltonian. This procedure is an example of sensitivity analysis of missing model components (h0 in this case) in the reference Hamiltonian. In contrast to simple first-order perturbation theory a finite result is obtained for the effect of h0 on S(IOS). As an illustration, our method of analysis is applied to integral state-to-state cross sections for the scattering of an atom and rigid rotor. Results are generated within the He+H2 system and a comparison is made between IOS and coupled states cross sections and the corresponding IOS sensitivities. It is found that the sensitivity coefficients are very useful indicators of the accuracy of the IOS results. Finally, further developments and applications are discussed.

  10. Sensitive Amino Acid Composition and Chirality Analysis with the Mars Organic Analyzer (MOA)

    NASA Technical Reports Server (NTRS)

    Skelley, Alison M.; Scherer, James R.; Aubrey, Andrew D.; Grover, William H.; Ivester, Robin H. C.; Ehrenfreund, Pascale; Grunthaner, Frank J.; Bada, Jeffrey L.; Mathies, Richard A.

    2005-01-01

    Detection of life on Mars requires definition of a suitable biomarker and development of sensitive yet compact instrumentation capable of performing in situ analyses. Our studies are focused on amino acid analysis because amino acids are more resistant to decomposition than other biomolecules, and because amino acid chirality is a well-defined biomarker. Amino acid composition and chirality analysis has been previously demonstrated in the lab using microfabricated capillary electrophoresis (CE) chips. To analyze amino acids in the field, we have developed the Mars Organic Analyzer (MOA), a portable analysis system that consists of a compact instrument and a novel multi-layer CE microchip.

  11. Phase 1 of the near term hybrid passenger vehicle development program. Appendix D: Sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Traversi, M.

    1979-01-01

    Data are presented on the sensitivity of: (1) mission analysis results to the boundary values given for number of passenger cars and average annual vehicle miles traveled per car; (2) vehicle characteristics and performance to specifications; and (3) tradeoff study results to the expected parameters.

  12. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  13. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.

  14. A diagnostic model for the detection of sensitization to wheat allergens was developed and validated in bakery workers.

    PubMed

    Suarthana, Eva; Vergouwe, Yvonne; Moons, Karel G; de Monchy, Jan; Grobbee, Diederick; Heederik, Dick; Meijer, Evert

    2010-09-01

To develop and validate a prediction model to detect sensitization to wheat allergens in bakery workers. The prediction model was developed in 867 Dutch bakery workers (development set, prevalence of sensitization 13%) and included questionnaire items (candidate predictors). First, principal component analysis was used to reduce the number of candidate predictors. Then, multivariable logistic regression analysis was used to develop the model. Internal validation and the extent of optimism were assessed with bootstrapping. External validation was studied in 390 independent Dutch bakery workers (validation set, prevalence of sensitization 20%). The prediction model contained the predictors nasoconjunctival symptoms, asthma symptoms, shortness of breath and wheeze, work-related upper and lower respiratory symptoms, and traditional bakery. The model showed good discrimination, with an area under the receiver operating characteristic (ROC) curve of 0.76 (0.75 after internal validation). Application of the model in the validation set gave reasonable discrimination (ROC area = 0.69) and good calibration after a small adjustment of the model intercept. A simple model with questionnaire items only can be used to stratify bakers according to their risk of sensitization to wheat allergens. Its use may increase the cost-effectiveness of (subsequent) medical surveillance.
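The modeling pattern (logistic regression on questionnaire-derived predictors, discrimination summarized by the ROC area) can be sketched on synthetic data with plain NumPy; the predictors, coefficients, and sample size below are invented, not those of the bakery cohort:

```python
import numpy as np

rng = np.random.default_rng(42)
n, d = 2000, 3

# Synthetic questionnaire-derived predictors (e.g. symptom scores) and a
# hypothetical "true" risk model used only to generate labels.
X = rng.normal(size=(n, d))
true_w, true_b = np.array([1.5, 1.0, 0.8]), -1.5
p_true = 1.0 / (1.0 + np.exp(-(X @ true_w + true_b)))
y = (rng.random(n) < p_true).astype(float)

# Fit logistic regression by gradient descent on the log-loss.
w, b = np.zeros(d), 0.0
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.5 * (X.T @ g) / n
    b -= 0.5 * g.mean()

# ROC area via the rank (Mann-Whitney) formula; ties are negligible here.
scores = X @ w + b
ranks = np.empty(n)
ranks[np.argsort(scores)] = np.arange(1, n + 1)
n1 = y.sum()
auc = (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * (n - n1))
print(f"apparent ROC area: {auc:.2f}")
```

Note this is the apparent (training-set) discrimination; the abstract's bootstrap and external-validation steps exist precisely to correct for the optimism in such a figure.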

  15. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-08-07

The sensitivity and uncertainty analysis course will introduce students to k-eff sensitivity data, cross-section uncertainty data, how k-eff sensitivity data and k-eff uncertainty data are generated and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in development of upper subcritical limits.

  16. Global sensitivity analysis in stochastic simulators of uncertain reaction networks.

    PubMed

    Navarro Jimenez, M; Le Maître, O P; Knio, O M

    2016-12-28

    Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability with the uncertain kinetic parameters of the first statistical moments of model predictions. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subset of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of systems.

  17. Global sensitivity analysis in stochastic simulators of uncertain reaction networks

    DOE PAGES

    Navarro Jimenez, M.; Le Maître, O. P.; Knio, O. M.

    2016-12-23

Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability with the uncertain kinetic parameters of the first statistical moments of model predictions. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol’s decomposition of the variance into contributions from arbitrary subset of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. Here, a sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of systems.

  18. Global sensitivity analysis in stochastic simulators of uncertain reaction networks

    NASA Astrophysics Data System (ADS)

    Navarro Jimenez, M.; Le Maître, O. P.; Knio, O. M.

    2016-12-01

    Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability with the uncertain kinetic parameters of the first statistical moments of model predictions. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subset of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of systems.

  19. Parametric Sensitivity Analysis of Oscillatory Delay Systems with an Application to Gene Regulation.

    PubMed

    Ingalls, Brian; Mincheva, Maya; Roussel, Marc R

    2017-07-01

    A parametric sensitivity analysis for periodic solutions of delay-differential equations is developed. Because phase shifts cause the sensitivity coefficients of a periodic orbit to diverge, we focus on sensitivities of the extrema, from which amplitude sensitivities are computed, and of the period. Delay-differential equations are often used to model gene expression networks. In these models, the parametric sensitivities of a particular genotype define the local geometry of the evolutionary landscape. Thus, sensitivities can be used to investigate directions of gradual evolutionary change. An oscillatory protein synthesis model whose properties are modulated by RNA interference is used as an example. This model consists of a set of coupled delay-differential equations involving three delays. Sensitivity analyses are carried out at several operating points. Comments on the evolutionary implications of the results are offered.

  20. Integrating heterogeneous drug sensitivity data from cancer pharmacogenomic studies.

    PubMed

    Pozdeyev, Nikita; Yoo, Minjae; Mackie, Ryan; Schweppe, Rebecca E; Tan, Aik Choon; Haugen, Bryan R

    2016-08-09

    The consistency of in vitro drug sensitivity data is of key importance for cancer pharmacogenomics. Previous attempts to correlate drug sensitivities from the large pharmacogenomics databases, such as the Cancer Cell Line Encyclopedia (CCLE) and the Genomics of Drug Sensitivity in Cancer (GDSC), have produced discordant results. We developed a new drug sensitivity metric, the area under the dose response curve adjusted for the range of tested drug concentrations, which allows integration of heterogeneous drug sensitivity data from the CCLE, the GDSC, and the Cancer Therapeutics Response Portal (CTRP). We show that there is moderate to good agreement of drug sensitivity data for many targeted therapies, particularly kinase inhibitors. The results of this largest cancer cell line drug sensitivity data analysis to date are accessible through the online portal, which serves as a platform for high power pharmacogenomics analysis.
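A range-adjusted AUC of the kind described can be sketched for a hypothetical Hill-type dose-response: normalizing the trapezoidal area by the width of the tested log-concentration range yields a value in [0, 1] that is comparable across panels screening different concentration windows (all numbers illustrative):

```python
import numpy as np

def viability(conc, ic50, hill=1.0):
    """Hill-type dose-response: fraction of cells surviving at each concentration."""
    return 1.0 / (1.0 + (conc / ic50) ** hill)

def adjusted_auc(conc, ic50):
    """Trapezoidal area under the response curve on a log10-concentration axis,
    normalized by the width of the tested range -> value in [0, 1]."""
    x = np.log10(conc)
    y = viability(conc, ic50)
    area = 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(x))
    return area / (x[-1] - x[0])

concs = np.logspace(-3, 1, 9)                    # hypothetical 9-point dilution series
auc_sensitive = adjusted_auc(concs, ic50=0.01)   # low IC50: drug-sensitive line
auc_resistant = adjusted_auc(concs, ic50=5.0)    # high IC50: drug-resistant line
print(f"sensitive: {auc_sensitive:.2f}, resistant: {auc_resistant:.2f}")
```

Lower adjusted AUC means greater sensitivity; because the metric is normalized to the tested range, curves screened over different concentration windows by different studies land on a common scale, which is what makes cross-database integration possible.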

  1. SVDS plume impingement modeling development. Sensitivity analysis supporting level B requirements

    NASA Technical Reports Server (NTRS)

    Chiu, P. B.; Pearson, D. J.; Muhm, P. M.; Schoonmaker, P. B.; Radar, R. J.

    1977-01-01

    A series of sensitivity analyses (trade studies) performed to select features and capabilities to be implemented in the plume impingement model is described. Sensitivity analyses were performed in study areas pertaining to geometry, flowfield, impingement, and dynamical effects. Recommendations based on these analyses are summarized.

  2. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

The three-dimensional quasi-analytical sensitivity analysis and the ancillary driver programs needed to carry out the studies and perform comparisons are developed. The code is essentially contained in one unified package which includes the following: (1) a three-dimensional transonic wing analysis program (ZEBRA); (2) a quasi-analytical portion which determines the matrix elements in the quasi-analytical equations; (3) a method for computing the sensitivity coefficients from the resulting quasi-analytical equations; (4) a package to determine, for comparison purposes, sensitivity coefficients via the finite difference approach; and (5) a graphics package.

  3. An adjoint method of sensitivity analysis for residual vibrations of structures subject to impacts

    NASA Astrophysics Data System (ADS)

    Yan, Kun; Cheng, Gengdong

    2018-03-01

For structures subject to impact loads, reducing residual vibration becomes increasingly important as machines become faster and lighter. An efficient sensitivity analysis of residual vibration with respect to structural or operational parameters is indispensable for gradient-based optimization algorithms that reduce residual vibration in either an active or a passive way. In this paper, an integrated quadratic performance index is used as the measure of the residual vibration, since it measures the residual vibration response globally and its calculation can be simplified greatly with the Lyapunov equation. Several sensitivity analysis approaches for this performance index were previously developed based on the assumption that the initial excitations of residual vibration were given and independent of structural design. Since the excitations resulting from the impact load often depend on structural design, this paper proposes a new, efficient sensitivity analysis method for the residual vibration of structures subject to impacts that accounts for this dependence. The new method is developed by combining two existing methods and using an adjoint variable approach. Three numerical examples demonstrate the accuracy of the proposed method. The numerical results show that the dependence of the initial excitations on the structural design variables may strongly affect the accuracy of the sensitivities.
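The Lyapunov-equation shortcut mentioned above can be sketched for a linear model xdot = A x: the quadratic index J = ∫ x'Qx dt equals x0' P x0, where P solves A'P + PA = -Q, and a design sensitivity that respects the design-dependence of the initial excitation follows by differencing. The two-DOF system and the dependence of x0 on p below are hypothetical stand-ins, not the paper's examples:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def residual_vibration_index(A, Q, x0):
    """J = integral of x'Qx dt for xdot = A x, x(0) = x0, computed via
    the Lyapunov equation A'P + P A = -Q, which gives J = x0' P x0."""
    P = solve_continuous_lyapunov(A.T, -Q)
    return x0 @ P @ x0

def index_sensitivity(p, h=1e-6):
    """Central-difference dJ/dp when both A and the impact-induced
    initial excitation x0 depend on a design parameter p
    (hypothetical model for illustration only)."""
    def J(p):
        A = np.array([[0.0, 1.0], [-p, -0.4]])   # stiffness-like parameter
        x0 = np.array([1.0 / p, 0.0])            # excitation depends on design
        return residual_vibration_index(A, np.eye(2), x0)
    return (J(p + h) - J(p - h)) / (2 * h)
```

Ignoring the x0(p) term, as the earlier approaches the abstract mentions effectively do, would miss part of this derivative.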

  4. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanova, T.; Laville, C.; Dyrda, J.

    2012-07-01

The sensitivities of the k_eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods.

  5. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

A hydrological model consists of multiple process-level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  6. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

A hydrological model consists of multiple process-level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
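The idea of scoring a process by the output variance attributable to its competing models can be illustrated with a Monte Carlo sketch. The output function, the two recharge models, and the weights below are all hypothetical; the sketch reports only the between-model share of total variance, a simplification of the paper's index, which also attributes within-model parametric variance to each process:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical system output: head ~ recharge R over conductivity K.
output = lambda R, K: R / K

# Two competing recharge process models (each drawing its own random
# parameter) with prior model weights; one conductivity model for brevity.
recharge_models = [lambda: rng.normal(1.0, 0.05),
                   lambda: rng.normal(1.5, 0.10)]
weights = np.array([0.5, 0.5])

N = 5000
means, varis = [], []
for draw in recharge_models:
    y = np.array([output(draw(), rng.lognormal(0.0, 0.2)) for _ in range(N)])
    means.append(y.mean())
    varis.append(y.var())
means, varis = np.array(means), np.array(varis)

grand_mean = weights @ means
# Law of total variance: within-model (parametric) + between-model parts
total_var = weights @ varis + weights @ (means - grand_mean) ** 2
# Share of total output variance explained by the recharge-model choice
PS_recharge = (weights @ (means - grand_mean) ** 2) / total_var
```

A value near 0 would say the competing recharge hypotheses barely matter; a value near 1 would say the model choice dominates the output uncertainty.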

  7. Development of a generalized perturbation theory method for sensitivity analysis using continuous-energy Monte Carlo methods

    DOE PAGES

    Perfetti, Christopher M.; Rearden, Bradley T.

    2016-03-01

The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization and reactor safety, and help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Here, several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.
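The reference "direct perturbation" sensitivities mentioned at the end are conceptually simple: rerun the calculation with a cross section nudged up and down and form a relative central difference. A minimal sketch, with a hypothetical `k_eff` callable standing in for a full transport calculation:

```python
def direct_perturbation_sensitivity(k_eff, sigma, delta=0.01):
    """Relative sensitivity S = (dk/k)/(dsigma/sigma) estimated by a
    central difference over runs at perturbed cross sections. k_eff is
    a hypothetical callable standing in for the transport solver."""
    k0 = k_eff(sigma)
    kp = k_eff(sigma * (1 + delta))
    km = k_eff(sigma * (1 - delta))
    return ((kp - km) / k0) / (2 * delta)

# Toy response k ~ sigma**0.3, whose exact relative sensitivity is 0.3:
S = direct_perturbation_sensitivity(lambda s: s ** 0.3, 2.0)
```

The expense of one pair of full runs per cross section is exactly what adjoint-type methods such as CLUTCH and GEAR-MC avoid.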

  8. Stability, performance and sensitivity analysis of I.I.D. jump linear systems

    NASA Astrophysics Data System (ADS)

    Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven

    2018-06-01

This paper presents a symmetric Kronecker product analysis of independent and identically distributed (i.i.d.) jump linear systems, developing stability and performance analysis equations of lower dimension than those currently available for this type of system. In addition, new closed-form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example in which the communication links are allowed to fail randomly.
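The standard Kronecker-product mean-square stability test that the paper's symmetric form refines can be sketched directly. For x_{k+1} = A_{θ(k)} x_k with i.i.d. mode process θ, mean-square stability holds iff the spectral radius of Σ_i p_i (A_i ⊗ A_i) is below one; the paper's symmetric Kronecker algebra reduces the n² dimension of this test to n(n+1)/2. The failure-mode matrices below are illustrative:

```python
import numpy as np

def mean_square_stable(As, ps):
    """I.i.d. jump linear system x_{k+1} = A_{theta(k)} x_k with
    P(theta = i) = ps[i] is mean-square stable iff the spectral radius
    of sum_i ps[i] * kron(A_i, A_i) is < 1 (standard, unsymmetrized
    Kronecker test)."""
    M = sum(p * np.kron(A, A) for p, A in zip(ps, As))
    return np.max(np.abs(np.linalg.eigvals(M))) < 1.0

# A nominal stable mode mixed with an occasionally active failed mode:
A_ok, A_fail = 0.5 * np.eye(2), 1.2 * np.eye(2)
stable = mean_square_stable([A_ok, A_fail], [0.8, 0.2])
```

With the failure probability raised to 0.8 the same pair of modes fails the test, mirroring how link-failure rates enter the flight control example.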

  9. Examining the accuracy of the infinite order sudden approximation using sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eno, L.; Rabitz, H.

    1981-08-15

A method is developed for assessing the accuracy of scattering observables calculated within the framework of the infinite order sudden (IOS) approximation. In particular, we focus on the energy sudden assumption of the IOS method, and our approach involves determining the sensitivity of the IOS scattering matrix S^IOS with respect to a parameter which reintroduces the internal energy operator h_0 into the IOS Hamiltonian. This procedure is an example of sensitivity analysis of missing model components (h_0 in this case) in the reference Hamiltonian. In contrast to simple first-order perturbation theory, a finite result is obtained for the effect of h_0 on S^IOS. As an illustration, our method of analysis is applied to integral state-to-state cross sections for the scattering of an atom and a rigid rotor. Results are generated for the He+H_2 system and a comparison is made between IOS and coupled states cross sections and the corresponding IOS sensitivities. It is found that the sensitivity coefficients are very useful indicators of the accuracy of the IOS results. Finally, further developments and applications are discussed.

  10. Computational Aspects of Sensitivity Calculations in Linear Transient Structural Analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Greene, William H.

    1989-01-01

A study has been performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of the number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first is an overall finite difference method in which the analysis is repeated for perturbed designs. The second is termed semianalytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models.
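The closing requirement, reusing the original design's approximation vectors when analyzing perturbed designs, can be sketched on a reduced model. For brevity the sketch uses a static reduced response in place of the thesis's modal transient reduction, and the 3-DOF spring chain is a hypothetical example:

```python
import numpy as np

def reduced_response(K, f, Phi):
    """Displacement of the reduced model u ~ Phi q, with q solved from
    the reduced stiffness equations (static stand-in for the modal
    transient reduction discussed above)."""
    Kr = Phi.T @ K @ Phi
    return Phi @ np.linalg.solve(Kr, Phi.T @ f)

def overall_fd_sensitivity(K_of_p, f, Phi, p, h=1e-6):
    """Overall finite difference: re-analyze perturbed designs but
    REUSE the approximation vectors Phi computed at the original
    design, as the method requires to stay practical."""
    up = reduced_response(K_of_p(p + h), f, Phi)
    um = reduced_response(K_of_p(p - h), f, Phi)
    return (up - um) / (2 * h)

# Hypothetical 3-DOF spring chain whose stiffness scales with p
K_of_p = lambda p: p * np.array([[2., -1., 0.], [-1., 2., -1.], [0., -1., 1.]])
f = np.array([0., 0., 1.])
Phi = np.linalg.eigh(K_of_p(1.0))[1][:, :2]   # two lowest modes at p = 1
dudp = overall_fd_sensitivity(K_of_p, f, Phi, p=1.0)
```

Because K scales linearly with p here, the reduced displacement behaves as u(p) = u(1)/p, so the finite-difference sensitivity at p = 1 should recover -u(1), which makes the sketch easy to check.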

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Haitao, E-mail: liaoht@cae.ac.cn

The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time-averaged quantities for chaotic dynamical systems. The key idea is to recast the time-averaged integration term in the form of a differential equation before applying the sensitivity analysis method. An additional constraint-based equation, which forms the augmented equations of motion, is proposed to calculate the time-averaged integration variable, and the sensitivity coefficients are obtained by solving the augmented differential equations. Applying the least squares shadowing formulation to the augmented equations results in an explicit expression for the sensitivity coefficient that depends on the final state of the Lagrange multipliers. The LU factorization technique used to calculate the Lagrange multipliers improves both convergence behaviour and computational expense. Numerical experiments on a set of problems selected from the literature illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches, and some short impulsive sensitivity coefficients are observed when using the direct differentiation sensitivity analysis method.

  12. Adjoint-based sensitivity analysis of low-order thermoacoustic networks using a wave-based approach

    NASA Astrophysics Data System (ADS)

    Aguilar, José G.; Magri, Luca; Juniper, Matthew P.

    2017-07-01

    Strict pollutant emission regulations are pushing gas turbine manufacturers to develop devices that operate in lean conditions, with the downside that combustion instabilities are more likely to occur. Methods to predict and control unstable modes inside combustion chambers have been developed in the last decades but, in some cases, they are computationally expensive. Sensitivity analysis aided by adjoint methods provides valuable sensitivity information at a low computational cost. This paper introduces adjoint methods and their application in wave-based low order network models, which are used as industrial tools, to predict and control thermoacoustic oscillations. Two thermoacoustic models of interest are analyzed. First, in the zero Mach number limit, a nonlinear eigenvalue problem is derived, and continuous and discrete adjoint methods are used to obtain the sensitivities of the system to small modifications. Sensitivities to base-state modification and feedback devices are presented. Second, a more general case with non-zero Mach number, a moving flame front and choked outlet, is presented. The influence of the entropy waves on the computed sensitivities is shown.

  13. Installation Restoration General Environmental Technology Development. Task 6. Materials Handling of Explosive Contaminated Soil and Sediment.

    DTIC Science & Technology

    1985-06-01

of chemical analysis and sensitivity testing on material samples. At this time, these samples must be packaged and... preparation at a rate of three samples per hour. One analyst doing both sample preparation and the HPLC analysis can run 16 samples in an 8-hour day. ...study, sensitivity testing was reviewed to enable recommendations for complete analysis of contaminated soils. Materials handling techniques,

  14. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules

    PubMed Central

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A.M.

    2017-01-01

Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, i.e. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. PMID:28077562

  15. Automated metal-free multiple-column nanoLC for improved phosphopeptide analysis sensitivity and throughput

    PubMed Central

    Zhao, Rui; Ding, Shi-Jian; Shen, Yufeng; Camp, David G.; Livesay, Eric A.; Udseth, Harold; Smith, Richard D.

    2009-01-01

We report on the development and characterization of automated metal-free multiple-column nanoLC instrumentation for sensitive, high-throughput mass spectrometric analysis of phosphopeptides. The system implements a multiple-column capillary LC fluidic design developed for high-throughput analysis of peptides (Anal. Chem. 2001, 73, 3011–3021), incorporating modifications to achieve broad and sensitive analysis of phosphopeptides. The integrated nanoLC columns (50 µm i.d. × 30 cm containing 5 µm C18 particles) and the on-line solid phase extraction columns (150 µm i.d. × 4 cm containing 5 µm C18 particles) were connected to automatic switching valves with non-metal chromatographic accessories, with other modifications made to avoid exposing the analyte to any metal surfaces during handling, separation, and electrospray ionization. The nanoLC developed provided a separation peak capacity of ∼250 for phosphopeptides (and ∼400 for normal peptides). A detection limit of 0.4 fmol was obtained when a linear ion trap tandem mass spectrometer (Finnigan LTQ) was coupled to a 50-µm i.d. column of the nanoLC. The separation power and sensitivity provided by the nanoLC-LTQ enabled identification of ∼4600 phosphopeptide candidates from ∼60 µg of COS-7 cell tryptic digest following IMAC enrichment, and ∼520 tyrosine phosphopeptides from ∼2 mg of human T cell digests following phosphotyrosine peptide immunoprecipitation. PMID:19217835

  16. Applying an intelligent model and sensitivity analysis to inspect mass transfer kinetics, shrinkage and crust color changes of deep-fat fried ostrich meat cubes.

    PubMed

    Amiryousefi, Mohammad Reza; Mohebbi, Mohebbat; Khodaiyan, Faramarz

    2014-01-01

The objectives of this study were to use image analysis and an artificial neural network (ANN) to predict mass transfer kinetics as well as the color changes and shrinkage of deep-fat fried ostrich meat cubes. Two generalized feedforward networks were developed separately, using the operating conditions as inputs. The high correlation coefficients between experimental and predicted values indicated proper fitting. Sensitivity analysis of the selected ANNs showed that, among the input variables, moisture content (MC) and fat content (FC) were most sensitive to frying temperature. Similarly, for the second ANN architecture, microwave power density was the most influential variable, having the maximum effect on both shrinkage percentage and color changes. Copyright © 2013 Elsevier Ltd. All rights reserved.
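Sensitivity analysis of a trained network of this kind is often done one input at a time: perturb each input around an operating point and record the relative output change. The sketch below uses a hypothetical linear stand-in for the trained ANN's predict function; any callable would do:

```python
import numpy as np

def input_sensitivities(model, x0, rel_step=0.05):
    """One-at-a-time sensitivity of a trained model around operating
    point x0: relative output change per relative input change.
    model is any callable (e.g. a trained ANN's predict method)."""
    y0 = model(x0)
    sens = []
    for i in range(len(x0)):
        x = x0.copy()
        x[i] *= 1 + rel_step                 # nudge one input only
        sens.append(((model(x) - y0) / y0) / rel_step)
    return np.array(sens)

# Hypothetical fitted response: moisture loss driven mostly by frying
# temperature (x[0]) and only weakly by frying time (x[1]).
model = lambda x: 0.8 * x[0] + 0.1 * x[1]
s = input_sensitivities(model, np.array([180.0, 4.0]))
```

Ranking the entries of `s` reproduces the kind of "most influential variable" conclusion reported in the abstract.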

  17. Harnessing Connectivity in a Large-Scale Small-Molecule Sensitivity Dataset | Office of Cancer Genomics

    Cancer.gov

    Identifying genetic alterations that prime a cancer cell to respond to a particular therapeutic agent can facilitate the development of precision cancer medicines. Cancer cell-line (CCL) profiling of small-molecule sensitivity has emerged as an unbiased method to assess the relationships between genetic or cellular features of CCLs and small-molecule response. Here, we developed annotated cluster multidimensional enrichment analysis to explore the associations between groups of small molecules and groups of CCLs in a new, quantitative sensitivity dataset.

  18. Civil and mechanical engineering applications of sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Komkov, V.

    1985-07-01

    In this largely tutorial presentation, the historical development of optimization theories has been outlined as they applied to mechanical and civil engineering designs and the development of modern sensitivity techniques during the last 20 years has been traced. Some of the difficulties and the progress made in overcoming them have been outlined. Some of the recently developed theoretical methods have been stressed to indicate their importance to computer-aided design technology.

  19. Summary Findings from the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.

  20. Ecological Sensitivity Evaluation of Tourist Region Based on Remote Sensing Image - Taking Chaohu Lake Area as a Case Study

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Li, W. J.; Yu, J.; Wu, C. Z.

    2018-04-01

Remote sensing technology has significant advantages for monitoring and analysing the ecological environment. Using automatic extraction algorithms, information on the environmental resources of a tourist region can be obtained from remote sensing imagery; combined with GIS spatial analysis and landscape pattern analysis, this information can be quantitatively analysed and interpreted. In this study, taking the Chaohu Lake Basin as an example, a Landsat-8 multi-spectral satellite image from October 2015 was used. Integrating the automatic ELM (Extreme Learning Machine) classification results with digital elevation model and slope data, the evaluation factors of human disturbance degree, land use degree, primary productivity, landscape evenness, vegetation coverage, DEM, slope and normalized water body index were used to construct an eco-sensitivity evaluation index based on AHP and overlay analysis. According to the value of the eco-sensitivity evaluation index, using equal-interval reclassification in GIS, the Chaohu Lake area was divided into four grades: very sensitive, sensitive, sub-sensitive and insensitive areas. The eco-sensitivity analysis shows that the very sensitive area was 4577.4378 km2, accounting for about 33.12 %; the sensitive area was 5130.0522 km2, accounting for about 37.12 %; the sub-sensitive area was 3729.9312 km2, accounting for about 26.99 %; and the insensitive area was 382.4399 km2, accounting for about 2.77 %. At the same time, spatial differences in the ecological sensitivity of the Chaohu Lake basin were found. The most sensitive areas were mainly located in areas with high elevation and steep terrain; insensitive areas were mainly distributed in gently sloping platform areas; and the sensitive and sub-sensitive areas were mainly agricultural land and woodland.
Through the eco-sensitivity analysis of the study area, the automatic recognition and analysis techniques for remote sensing imagery are integrated into the ecological analysis and ecological regional planning, which can provide a reliable scientific basis for rational planning and regional sustainable development of the Chaohu Lake tourist area.
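The AHP-weighted overlay and equal-interval reclassification described above reduce to a short raster computation. The factor rasters and weights below are hypothetical placeholders for the study's normalized layers and AHP weights:

```python
import numpy as np

def eco_sensitivity_grades(layers, weights, n_grades=4):
    """AHP-style weighted overlay of normalized factor rasters followed
    by equal-interval reclassification into n_grades classes
    (1 = insensitive ... 4 = very sensitive)."""
    index = sum(w * layer for w, layer in zip(weights, layers))
    lo, hi = index.min(), index.max()
    edges = np.linspace(lo, hi, n_grades + 1)[1:-1]  # interior breakpoints
    return np.digitize(index, edges) + 1, index

# Hypothetical normalized factor rasters (e.g. slope, vegetation
# coverage, human disturbance) on a small grid, with illustrative weights
rng = np.random.default_rng(1)
layers = [rng.random((4, 4)) for _ in range(3)]
grades, index = eco_sensitivity_grades(layers, weights=[0.5, 0.3, 0.2])
```

Area shares per grade, like the percentages quoted in the abstract, then follow from counting cells in each class and multiplying by the cell area.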

  1. Development and evaluation of a microdevice for amino acid biomarker detection and analysis on Mars

    PubMed Central

    Skelley, Alison M.; Scherer, James R.; Aubrey, Andrew D.; Grover, William H.; Ivester, Robin H. C.; Ehrenfreund, Pascale; Grunthaner, Frank J.; Bada, Jeffrey L.; Mathies, Richard A.

    2005-01-01

    The Mars Organic Analyzer (MOA), a microfabricated capillary electrophoresis (CE) instrument for sensitive amino acid biomarker analysis, has been developed and evaluated. The microdevice consists of a four-wafer sandwich combining glass CE separation channels, microfabricated pneumatic membrane valves and pumps, and a nanoliter fluidic network. The portable MOA instrument integrates high voltage CE power supplies, pneumatic controls, and fluorescence detection optics necessary for field operation. The amino acid concentration sensitivities range from micromolar to 0.1 nM, corresponding to part-per-trillion sensitivity. The MOA was first used in the lab to analyze soil extracts from the Atacama Desert, Chile, detecting amino acids ranging from 10–600 parts per billion. Field tests of the MOA in the Panoche Valley, CA, successfully detected amino acids at 70 parts per trillion to 100 parts per billion in jarosite, a sulfate-rich mineral associated with liquid water that was recently detected on Mars. These results demonstrate the feasibility of using the MOA to perform sensitive in situ amino acid biomarker analysis on soil samples representative of a Mars-like environment. PMID:15657130

  2. Development and evaluation of a microdevice for amino acid biomarker detection and analysis on Mars.

    PubMed

    Skelley, Alison M; Scherer, James R; Aubrey, Andrew D; Grover, William H; Ivester, Robin H C; Ehrenfreund, Pascale; Grunthaner, Frank J; Bada, Jeffrey L; Mathies, Richard A

    2005-01-25

    The Mars Organic Analyzer (MOA), a microfabricated capillary electrophoresis (CE) instrument for sensitive amino acid biomarker analysis, has been developed and evaluated. The microdevice consists of a four-wafer sandwich combining glass CE separation channels, microfabricated pneumatic membrane valves and pumps, and a nanoliter fluidic network. The portable MOA instrument integrates high voltage CE power supplies, pneumatic controls, and fluorescence detection optics necessary for field operation. The amino acid concentration sensitivities range from micromolar to 0.1 nM, corresponding to part-per-trillion sensitivity. The MOA was first used in the lab to analyze soil extracts from the Atacama Desert, Chile, detecting amino acids ranging from 10-600 parts per billion. Field tests of the MOA in the Panoche Valley, CA, successfully detected amino acids at 70 parts per trillion to 100 parts per billion in jarosite, a sulfate-rich mineral associated with liquid water that was recently detected on Mars. These results demonstrate the feasibility of using the MOA to perform sensitive in situ amino acid biomarker analysis on soil samples representative of a Mars-like environment.

  3. Sensitivity of surface meteorological analyses to observation networks

    NASA Astrophysics Data System (ADS)

    Tyndall, Daniel Paul

    A computationally efficient variational analysis system for two-dimensional meteorological fields is developed and described. This analysis approach is most efficient when the number of analysis grid points is much larger than the number of available observations, such as for large domain mesoscale analyses. The analysis system is developed using MATLAB software and can take advantage of multiple processors or processor cores. A version of the analysis system has been exported as a platform independent application (i.e., can be run on Windows, Linux, or Macintosh OS X desktop computers without a MATLAB license) with input/output operations handled by commonly available internet software combined with data archives at the University of Utah. The impact of observation networks on the meteorological analyses is assessed by utilizing a percentile ranking of individual observation sensitivity and impact, which is computed by using the adjoint of the variational surface assimilation system. This methodology is demonstrated using a case study of the analysis from 1400 UTC 27 October 2010 over the entire contiguous United States domain. The sensitivity of this approach to the dependence of the background error covariance on observation density is examined. Observation sensitivity and impact provide insight on the influence of observations from heterogeneous observing networks as well as serve as objective metrics for quality control procedures that may help to identify stations with significant siting, reporting, or representativeness issues.
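The adjoint-derived observation sensitivity used for the percentile ranking above has a compact linear-algebra form: with analysis x_a = x_b + K(y - Hx_b) and gain K = BH'(HBH' + R)^-1, the sensitivity of any scalar functional J(x_a) to the observations is dJ/dy = K' dJ/dx_a. A toy sketch on a hypothetical 3-point grid (not the CONUS analysis of the case study):

```python
import numpy as np

def analysis_and_obs_sensitivity(xb, B, H, R, y, dJdxa):
    """Variational-style analysis x_a = x_b + K (y - H x_b) with gain
    K = B H' (H B H' + R)^{-1}, plus the adjoint-derived sensitivity of
    a scalar functional J(x_a) to each observation: dJ/dy = K' dJdxa."""
    S = H @ B @ H.T + R
    K = B @ H.T @ np.linalg.solve(S, np.eye(S.shape[0]))
    xa = xb + K @ (y - H @ xb)
    return xa, K.T @ dJdxa

# Toy 3-point grid with two observations, at grid points 0 and 2
B = np.array([[1.0, 0.5, 0.2], [0.5, 1.0, 0.5], [0.2, 0.5, 1.0]])
H = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
R = 0.25 * np.eye(2)
xb, y = np.zeros(3), np.array([1.0, 0.5])
dJdxa = np.array([0.0, 1.0, 0.0])   # J = analysis value at the middle point
xa, dJdy = analysis_and_obs_sensitivity(xb, B, H, R, y, dJdxa)
```

Ranking the magnitudes of `dJdy` across a real network is exactly the kind of objective metric the dissertation proposes for flagging problem stations.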

  4. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.

  5. High-sensitivity ESCA instrument

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, R.D.; Herglotz, H.K.; Lee, J.D.

    1973-01-01

    A new electron spectroscopy for chemical analysis (ESCA) instrument has been developed to provide high sensitivity and efficient operation for laboratory analysis of composition and chemical bonding in very thin surface layers of solid samples. High sensitivity is achieved by means of the high-intensity, efficient x-ray source described by Davies and Herglotz at the 1968 Denver X-Ray Conference, in combination with the new electron energy analyzer described by Lee at the 1972 Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. A sample chamber designed to provide for rapid introduction and replacement of samples has adequate facilities for various sample treatments and conditioning, followed immediately by ESCA analysis of the sample. Examples of application are presented, demonstrating the sensitivity and resolution achievable with this instrument. Its usefulness in trace surface analysis is shown, and some "chemical shifts" measured by the instrument are compared with those obtained by x-ray spectroscopy.

  6. Benchmark On Sensitivity Calculation (Phase III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanova, Tatiana; Laville, Cedric; Dyrda, James

    2012-01-01

    The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods.

  7. Analysis of the causes of discrepancies in troponin I concentrations as measured by ARCHITECT High-Sensitive Troponin I ST and STACIA CLEIA cTnI.

    PubMed

    Kondo, Takashi; Kobayashi, Daisuke; Mochizuki, Maki; Asanuma, Kouichi; Takahashi, Satoshi

    2017-01-01

    Background: Recently developed reagents for the highly sensitive measurement of cardiac troponin I are useful for early diagnosis of acute coronary syndrome. However, differences in measured values between these new reagents and previously used reagents have not been well studied. In this study, we aimed to compare the values between ARCHITECT High-Sensitive Troponin I ST (newly developed reagents), ARCHITECT Troponin I ST and STACIA CLEIA cardiac troponin I (two previously developed reagent kits). Methods: Gel filtration high-performance liquid chromatography was used to analyse the causes of differences in measured values. Results: The measured values differed between ARCHITECT High-Sensitive Troponin I ST and STACIA CLEIA cardiac troponin I reagents (r = 0.82). Cross-reactivity tests using plasma with added skeletal-muscle troponin I resulted in higher reactivity (2.17-3.03%) for the STACIA CLEIA cardiac troponin I reagents compared with that for the ARCHITECT High-Sensitive Troponin I ST reagents (less than 0.014%). In addition, analysis of three representative samples using gel filtration high-performance liquid chromatography revealed reagent-specific differences in the reactivity against each cardiac troponin I complex; this could explain the differences in values observed for some of the samples. Conclusion: The newly developed ARCHITECT High-Sensitive Troponin I ST reagents were not affected by the presence of skeletal-muscle troponin I in the blood and may be useful for routine examinations.

  8. Boosting Sensitivity in Liquid Chromatography–Fourier Transform Ion Cyclotron Resonance–Tandem Mass Spectrometry for Product Ion Analysis of Monoterpene Indole Alkaloids

    PubMed Central

    Nakabayashi, Ryo; Tsugawa, Hiroshi; Kitajima, Mariko; Takayama, Hiromitsu; Saito, Kazuki

    2015-01-01

    In metabolomics, the analysis of product ions in tandem mass spectrometry (MS/MS) is noteworthy for chemically assigning structural information. However, the development of relevant analytical methods is less advanced. Here, we developed a method to boost sensitivity in liquid chromatography–Fourier transform ion cyclotron resonance–tandem mass spectrometry analysis (MS/MS boost analysis). To verify the MS/MS boost analysis, both quercetin and uniformly labeled 13C quercetin were analyzed, revealing that the product ions originate from the analyzed compounds rather than the instrument, resulting in sensitive product ions. Next, we applied this method to the analysis of monoterpene indole alkaloids (MIAs). The comparative analyses of MIAs having an indole basic skeleton (ajmalicine, catharanthine, hirsuteine, and hirsutine) and an oxindole skeleton (formosanine, isoformosanine, pteropodine, isopteropodine, rhynchophylline, isorhynchophylline, and mitraphylline) identified 86 and 73 common monoisotopic ions, respectively. The comparative analyses of the three pairs of stereoisomers showed more than 170 common monoisotopic ions in each pair. This method was also applied to the targeted analysis of MIAs in Catharanthus roseus and Uncaria rhynchophylla to profile indole and oxindole compounds using the product ions. This analysis is suitable for chemically assigning features of the metabolite groups, which contributes to targeted metabolome analysis. PMID:26734034

  9. Quantification of pressure sensitive adhesive, residual ink, and other colored process contaminants using dye and color image analysis

    Treesearch

    Roy R. Rosenberger; Carl J. Houtman

    2000-01-01

    The USPS Image Analysis (IA) protocol recommends the use of hydrophobic dyes to develop contrast between pressure sensitive adhesive (PSA) particles and cellulosic fibers before using a dirt counter to detect all contaminants that have contrast with the handsheet background. Unless the sample contains no contaminants other than those of interest, two measurement steps...

  10. Receiver operating characteristic analysis of prediction for gastric cancer development using serum pepsinogen and Helicobacter pylori antibody tests.

    PubMed

    Hamashima, Chisato; Sasazuki, Shizuka; Inoue, Manami; Tsugane, Shoichiro

    2017-03-09

    Chronic Helicobacter pylori infection plays a central role in the development of gastric cancer as shown by biological and epidemiological studies. The H. pylori antibody and serum pepsinogen (PG) tests have been anticipated to predict gastric cancer development. We determined the predictive sensitivity and specificity of gastric cancer development using these tests. Receiver operating characteristic analysis was performed, and areas under the curve were estimated. The predictive sensitivity and specificity of gastric cancer development were compared among single tests and combined methods using serum pepsinogen and H. pylori antibody tests. From a large-scale population-based cohort of over 100,000 subjects followed between 1990 and 2004, 497 gastric cancer subjects and 497 matched healthy controls were chosen. The predictive sensitivity and specificity were low in all single tests and combination methods. The highest predictive sensitivity and specificity were obtained for the serum PG I/II ratio. The optimal PG I/II cut-off values were 2.5 and 3.0. At a PG I/II cut-off value of 3.0, the sensitivity was 86.9% and the specificity was 39.8%. Even if three biomarkers were combined, the sensitivity was 97.2% and the specificity was 21.1% when the cut-off values were 3.0 for PG I/II, 70 ng/mL for PG I, and 10.0 U/mL for H. pylori antibody. The predictive accuracy of gastric cancer development was low with the serum pepsinogen and H. pylori antibody tests even if these tests were combined. To adopt these biomarkers for gastric cancer screening, a high specificity is required. When these tests are adopted for gastric cancer screening, they should be carefully interpreted with a clear understanding of their limitations.
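The trade-off between sensitivity and specificity at different cut-off values can be illustrated with a short sketch. The biomarker distributions below are simulated, not the study cohort; a lower PG I/II-like ratio is taken to indicate higher risk.

```python
import numpy as np

# Hedged sketch: sensitivity/specificity of a biomarker-based prediction
# at candidate cut-off values, on simulated (hypothetical) data.
rng = np.random.default_rng(2)
cases    = rng.normal(2.4, 1.0, 500)   # hypothetical PG I/II in cases
controls = rng.normal(3.6, 1.0, 500)   # hypothetical PG I/II in controls

def sens_spec(cutoff):
    sens = np.mean(cases <= cutoff)      # test-positive among cases
    spec = np.mean(controls > cutoff)    # test-negative among controls
    return sens, spec

for cutoff in (2.5, 3.0):
    s, sp = sens_spec(cutoff)
    print(f"cutoff {cutoff}: sensitivity {s:.2f}, specificity {sp:.2f}")
```

Raising the cut-off increases sensitivity while lowering specificity, which is exactly the pattern the abstract reports for PG I/II cut-offs of 2.5 and 3.0.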

  11. Theoretical foundations for finite-time transient stability and sensitivity analysis of power systems

    NASA Astrophysics Data System (ADS)

    Dasgupta, Sambarta

    Transient stability and sensitivity analysis of power systems are problems of enormous academic and practical interest. These classical problems have received renewed interest because of the advancement in sensor technology in the form of phasor measurement units (PMUs). This advancement has provided a unique opportunity for the development of real-time stability monitoring and sensitivity analysis tools. The transient stability problem in power systems is inherently a problem of stability analysis of non-equilibrium dynamics, because for a short time period following a fault or disturbance the system trajectory moves away from the equilibrium point. The real-time stability decision has to be made over this short time period. However, the existing stability definitions, and hence the analysis tools for transient stability, are asymptotic in nature. In this thesis, we develop theoretical foundations for the short-term transient stability analysis of power systems, based on the theory of normally hyperbolic invariant manifolds and finite-time Lyapunov exponents, adopted from the geometric theory of dynamical systems. The theory of normally hyperbolic surfaces allows us to characterize the rate of expansion and contraction of co-dimension one material surfaces in the phase space. The expansion and contraction rates of these material surfaces can be computed in finite time. We prove that these expansion and contraction rates can be used as finite-time transient stability certificates. Furthermore, material surfaces with the maximum expansion and contraction rates are identified with the stability boundaries, which are then used to compute stability margins. We have used this theoretical framework to develop model-based and model-free real-time stability monitoring methods. Both approaches rely on the availability of high-resolution time series data from the PMUs for stability prediction.
The problem of sensitivity analysis of power systems, subjected to changes or uncertainty in load parameters and network topology, is also studied using the theory of normally hyperbolic manifolds. The sensitivity analysis is used for the identification and rank ordering of the critical interactions and parameters in the power network. The sensitivity analysis is carried out both in finite time and asymptotically. One of the distinguishing features of the asymptotic sensitivity analysis is that the asymptotic dynamics of the system is assumed to be a periodic orbit. For the asymptotic sensitivity analysis we employ a combination of tools from ergodic theory and the geometric theory of dynamical systems.
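The finite-time certificates described above rest on finite-time Lyapunov exponents (FTLEs) of the flow map. A minimal sketch, using a toy linear saddle flow where the exponent is known analytically (the largest eigenvalue of the system matrix), is:

```python
import numpy as np

# Hedged sketch: a finite-time Lyapunov exponent computed from the
# finite-difference deformation gradient of a flow map, for a linear
# saddle flow. The vector field is an invented toy, not a power system.
A = np.array([[0.8, 0.0],
              [0.0, -0.5]])

def flow(x0, T, steps=2000):
    """RK4 integration of dx/dt = A x from x0 over time T."""
    x, h = x0.astype(float), T / steps
    for _ in range(steps):
        k1 = A @ x
        k2 = A @ (x + 0.5 * h * k1)
        k3 = A @ (x + 0.5 * h * k2)
        k4 = A @ (x + h * k3)
        x = x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

def ftle(x0, T, eps=1e-6):
    # Deformation gradient F of the flow map by central differences
    F = np.column_stack([(flow(x0 + eps * e, T) - flow(x0 - eps * e, T))
                         / (2 * eps) for e in np.eye(2)])
    C = F.T @ F                        # Cauchy-Green strain tensor
    return np.log(np.max(np.linalg.eigvalsh(C))) / (2 * T)

print(ftle(np.array([1.0, 1.0]), T=2.0))   # ≈ 0.8 for this flow
```

Large positive FTLEs flag trajectories near strongly expanding material surfaces, which is the finite-time analogue of instability used in the thesis.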

  12. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
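The variogram idea at the heart of VARS can be sketched as follows. The test function, scale range, and the simple IVARS-like integral below are illustrative assumptions, not VARS-TOOL's implementation.

```python
import numpy as np

# Hedged sketch: estimate a directional variogram gamma_i(h) for each
# parameter and integrate it across a range of scales (an IVARS-like
# metric). A larger integrated variogram indicates a more sensitive
# parameter for the chosen scale range.
rng = np.random.default_rng(3)

def f(X):                          # toy model: x0 matters more than x1
    return np.sin(2 * np.pi * X[:, 0]) + 0.3 * X[:, 1] ** 2

def directional_variogram(i, h, n=20000):
    X = rng.uniform(0, 1 - h, size=(n, 2))
    Xh = X.copy()
    Xh[:, i] += h                  # perturb only parameter i by lag h
    return 0.5 * np.mean((f(Xh) - f(X)) ** 2)

hs = np.linspace(0.01, 0.3, 30)
ivars = []
for i in range(2):
    gam = np.array([directional_variogram(i, h) for h in hs])
    ivars.append(np.sum(0.5 * (gam[1:] + gam[:-1]) * np.diff(hs)))  # trapezoid
print(ivars)
```

Because the variogram is resolved across lags h, small-scale (derivative-like) and large-scale (variance-like) sensitivity fall out of the same object, which is the "full spectrum" property the abstract refers to.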

  13. Simultaneous Solid Phase Extraction and Derivatization of Aliphatic Primary Amines Prior to Separation and UV-Absorbance Detection

    PubMed Central

    Felhofer, Jessica L.; Scida, Karen; Penick, Mark; Willis, Peter A.; Garcia, Carlos D.

    2013-01-01

    To overcome the problem of poor sensitivity of capillary electrophoresis-UV absorbance for the detection of aliphatic amines, a solid phase extraction and derivatization scheme was developed. This work demonstrates successful coupling of amines to a chromophore immobilized on a solid phase and subsequent cleavage and analysis. Although the analysis of many types of amines is relevant for myriad applications, this paper focuses on the derivatization and separation of amines with environmental relevance. This work aims to provide the foundations for future developments of an integrated sample preparation microreactor capable of performing simultaneous derivatization, preconcentration, and sample cleanup for sensitive analysis of primary amines. PMID:24054648

  14. Biosensing Technologies for Mycobacterium tuberculosis Detection: Status and New Developments

    PubMed Central

    Zhou, Lixia; He, Xiaoxiao; He, Dinggeng; Wang, Kemin; Qin, Dilan

    2011-01-01

    Biosensing technologies promise to improve Mycobacterium tuberculosis (M. tuberculosis) detection and management in clinical diagnosis, food analysis, bioprocess, and environmental monitoring. A variety of portable, rapid, and sensitive biosensors with immediate “on-the-spot” interpretation have been developed for M. tuberculosis detection based on different biological elements recognition systems and basic signal transducer principles. Here, we present a synopsis of current developments of biosensing technologies for M. tuberculosis detection, which are classified on the basis of basic signal transducer principles, including piezoelectric quartz crystal biosensors, electrochemical biosensors, and magnetoelastic biosensors. Special attention is paid to the methods for improving the framework and analytical parameters of the biosensors, including sensitivity and analysis time as well as automation of analysis procedures. Challenges and perspectives of biosensing technologies development for M. tuberculosis detection are also discussed in the final part of this paper. PMID:21437177

  15. Development of a structural optimization capability for the aeroelastic tailoring of composite rotor blades with straight and swept tips

    NASA Technical Reports Server (NTRS)

    Friedmann, P. P.; Venkatesan, C.; Yuan, K.

    1992-01-01

    This paper describes the development of a new structural optimization capability aimed at the aeroelastic tailoring of composite rotor blades with straight and swept tips. The primary objective is to reduce vibration levels in forward flight without diminishing the aeroelastic stability margins of the blade. In the course of this research activity a number of complicated tasks have been addressed: (1) development of a new aeroelastic stability and response analysis; (2) formulation of a new comprehensive sensitivity analysis, which facilitates the generation of the appropriate approximations for the objective and the constraints; (3) physical understanding of the new model and, in particular, determination of its potential for aeroelastic tailoring; and (4) combination of the newly developed analysis capability, the sensitivity derivatives, and the optimizer into a comprehensive optimization capability. The first three tasks have been completed and the fourth task is in progress.

  16. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules.

    PubMed

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A M; Nilsson, Mats

    2017-05-05

    Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques (e.g., droplet PCR) require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. Developmental and hormonal regulation of thermosensitive neuron potential activity in rat brain.

    PubMed

    Belugin, S; Akino, K; Takamura, N; Mine, M; Romanovsky, D; Fedoseev, V; Kubarko, A; Kosaka, M; Yamashita, S

    1999-08-01

    To understand the involvement of thyroid hormone in the postnatal development of hypothalamic thermosensitive neurons, we focused on the analysis of thermosensitive neuronal activity in the preoptic and anterior hypothalamic (PO/AH) regions of developing rats with and without hypothyroidism. In euthyroid rats, the distribution of thermosensitive neurons in PO/AH showed that in 3-week-old rats (46 neurons tested), 19.5% were warm-sensitive and 80.5% were nonsensitive. In 5- to 12-week-old euthyroid rats (122 neurons), 33.6% were warm-sensitive and 66.4% were nonsensitive. In 5- to 12-week-old hypothyroid rats (108 neurons), however, 18.5% were warm-sensitive and 81.5% were nonsensitive. Temperature thresholds of warm-sensitive neurons were lower in 12-week-old euthyroid rats (36.4+/-0.2 degrees C, n = 15, p<0.01) than in 3-week-old and in 5-week-old euthyroid rats (38.5+/-0.5 degrees C, n = 9 and 38.0+/-0.3 degrees C, n = 15, respectively). The temperature thresholds of warm-sensitive neurons in 12-week-old hypothyroid rats (39.5+/-0.3 degrees C, n = 8) were similar to those of warm-sensitive neurons of 3-week-old rats (euthyroid and hypothyroid). In contrast, there was no difference in the thresholds of warm-sensitive neurons between hypothyroid and euthyroid rats at the age of 3-5 weeks. In conclusion, electrophysiological monitoring of thermosensitive neuronal activity demonstrated that thyroid hormone regulates the maturation of warm-sensitive hypothalamic neurons in the developing rat brain.

  18. Insulin sensitivity and diabetic kidney disease in children and adolescents with type 2 diabetes: an observational analysis of data from the today clinical trial

    USDA-ARS?s Scientific Manuscript database

    Diabetic kidney disease is a major cause of premature mortality in type 2 diabetes mellitus (T2DM). Worsening insulin sensitivity independent of glycemic control may contribute to the development of diabetic kidney disease. We investigated the longitudinal association of insulin sensitivity with hyp...

  19. Sensitivity analysis of automatic flight control systems using singular value concepts

    NASA Technical Reports Server (NTRS)

    Herrera-Vaillard, A.; Paduano, J.; Downing, D.

    1985-01-01

    A sensitivity analysis is presented that can be used to judge the impact of vehicle dynamic model variations on the relative stability of multivariable continuous closed-loop control systems. The sensitivity analysis uses and extends the singular-value concept by developing expressions for the gradients of the singular value with respect to variations in the vehicle dynamic model and the controller design. Combined with a priori estimates of the accuracy of the model, the gradients are used to identify the elements in the vehicle dynamic model and controller that could severely impact the system's relative stability. The technique is demonstrated for a yaw/roll damper stability augmentation designed for a business jet.
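The key quantity in such an analysis is the gradient of a singular value with respect to a model parameter. For a simple (non-repeated) largest singular value, the standard result is dσ1/dp = u1ᵀ (dA/dp) v1; the sketch below checks this against a finite difference on an invented matrix, which is not the paper's flight-control model.

```python
import numpy as np

# Hedged sketch: analytic singular-value gradient vs. finite difference
# for A(p) = A0 + p * E, evaluated at p = 0. A0 and E are made up.
rng = np.random.default_rng(4)
A0 = rng.normal(size=(3, 3))
E = rng.normal(size=(3, 3))          # dA/dp

def sigma1(p):
    return np.linalg.svd(A0 + p * E, compute_uv=False)[0]

U, s, Vt = np.linalg.svd(A0)
analytic = U[:, 0] @ E @ Vt[0, :]    # u1^T (dA/dp) v1

h = 1e-6
finite_diff = (sigma1(h) - sigma1(-h)) / (2 * h)
print(analytic, finite_diff)         # the two estimates agree closely
```

In the paper's setting, E would be the derivative of the closed-loop return-difference matrix with respect to a stability-derivative or gain element, and a large gradient flags an element whose uncertainty could erode relative stability.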

  20. [The history of hepatitis B virus-related determination tests and inspection and the measurements of problems in Japan].

    PubMed

    Shibata, Hiroshi

    2013-09-01

    Since hepatitis B virus was identified as a cause of hepatitis, many highly sensitive measurement methods have been developed. Over this development history, there have been many problems in accuracy, sensitivity, and health insurance regulations among different types of kits with different measurement principles. Advanced medical treatments raise the problems of gene mutation and reactivation of HBV, leading to the need for highly sensitive and sophisticated determination. The history of clinical analysis for the detection of HBV is reviewed from the viewpoint of our experiences.

  1. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
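The importance-sampling idea behind the AIS method can be sketched in a simplified, non-adaptive form. The one-dimensional limit state, the sampling density centred at the most probable failure point, and the score-function sensitivity below are illustrative assumptions, not the paper's turbine-blade problem.

```python
import numpy as np

# Hedged sketch: importance-sampling reliability estimation. Sample near
# the failure region, then reweight by the ratio of the true density to
# the sampling density.
rng = np.random.default_rng(5)
beta = 3.0                           # limit state: fail when x > beta

def phi(x, mu=0.0):                  # normal(mu, 1) probability density
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

n = 200000
x = rng.normal(beta, 1.0, n)         # sampling density centred at the MPP
fail = x > beta
w = phi(x) / phi(x, mu=beta)         # importance weights
pf = np.mean(fail * w)               # failure probability estimate

# Score-function sensitivity of pf to the design mean mu (at mu = 0):
# d/dmu E[1_fail] = E[1_fail * (x - mu)], reusing the same weighted samples.
dpf_dmu = np.mean(fail * w * x)
print(pf, dpf_dmu)
```

For this limit state the exact values are Φ(-3) ≈ 1.35e-3 and φ(3) ≈ 4.43e-3; crude Monte Carlo would need orders of magnitude more samples for comparable accuracy, which is the motivation for concentrating samples in the failure domain.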

  2. Disgusted by Sexual Abuse: Exploring the Association Between Disgust Sensitivity and Posttraumatic Stress Symptoms Among Mothers of Sexually Abused Children.

    PubMed

    van Delft, Ivanka; Finkenauer, Catrin; Tybur, Joshua M; Lamers-Winkelman, Francien

    2016-06-01

    Nonoffending mothers of sexually abused children often exhibit high levels of posttraumatic stress (PTS) symptoms. Emerging evidence suggests that trait-like individual differences in sensitivity to disgust play a role in the development of PTS symptoms. One such individual difference, disgust sensitivity, has not been examined as far as we are aware among victims of secondary traumatic stress. The current study examined associations between disgust sensitivity and PTS symptoms among mothers of sexually abused children (N = 72). Mothers completed the Impact of Event Scale-Revised and the Three Domain Disgust Scale (Tybur, Lieberman, & Griskevicius, 2009). More than one third of mothers scored above a suggested cutoff (mean score = 1.5) for high levels of PTS symptoms. Hierarchical linear regression analysis results indicated that sexual disgust sensitivity (β = .39, p = .002) was associated with PTS symptoms (R(2) = .18). An interaction analysis showed that sexual disgust sensitivity was associated with maternal PTS symptoms only when the perpetrator was not biologically related to the child (β = -.32, p = .047; R(2) = .28). Our findings suggested that sexual disgust sensitivity may be a risk factor for developing PTS symptoms among mothers of sexually abused children. Copyright © 2016 International Society for Traumatic Stress Studies.

  3. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  4. Sensitivity Analysis for Coupled Aero-structural Systems

    NASA Technical Reports Server (NTRS)

    Giunta, Anthony A.

    1999-01-01

    A novel method has been developed for calculating gradients of aerodynamic force and moment coefficients for an aeroelastic aircraft model. This method uses the Global Sensitivity Equations (GSE) to account for the aero-structural coupling, and a reduced-order modal analysis approach to condense the coupling bandwidth between the aerodynamic and structural models. Parallel computing is applied to reduce the computational expense of the numerous high fidelity aerodynamic analyses needed for the coupled aero-structural system. Good agreement is obtained between aerodynamic force and moment gradients computed with the GSE/modal analysis approach and the same quantities computed using brute-force, computationally expensive, finite difference approximations. A comparison between the computational expense of the GSE/modal analysis method and a pure finite difference approach is presented. These results show that the GSE/modal analysis approach is the more computationally efficient technique if sensitivity analysis is to be performed for two or more aircraft design parameters.

  5. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  6. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  7. Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.

    PubMed

    Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony

    2005-04-01

    Recently the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into a broader context of the requirements for decision making; to explain the general approach that was taken in its development; and to address each of the issues which have been raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis which can best meet these needs can then be identified. It will be argued that the guidance on dealing with uncertainty and, in particular, the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need fully to characterise decision uncertainty and to inform the research agenda will be identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.
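In practice, probabilistic sensitivity analysis propagates parameter distributions through the decision model by Monte Carlo simulation. The sketch below, with entirely hypothetical cost and QALY distributions, reports the probability that a technology is cost-effective at a willingness-to-pay threshold (one point on a cost-effectiveness acceptability curve).

```python
import numpy as np

# Hedged sketch: sample parameter uncertainty, propagate to incremental
# net monetary benefit, and summarize decision uncertainty. All
# distributions and the threshold are invented for illustration.
rng = np.random.default_rng(6)
n = 100000
d_cost = rng.normal(5000, 1500, n)     # incremental cost (GBP)
d_qaly = rng.normal(0.35, 0.12, n)     # incremental QALYs
wtp = 20000                            # willingness to pay (GBP per QALY)

net_benefit = wtp * d_qaly - d_cost    # incremental net monetary benefit
p_cost_effective = np.mean(net_benefit > 0)
print(p_cost_effective)
```

Reporting `p_cost_effective` across a range of thresholds, rather than a single deterministic ICER, is the characterisation of decision uncertainty that the NICE guidance requires.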

  8. A comparison of computer-assisted detection (CAD) programs for the identification of colorectal polyps: performance and sensitivity analysis, current limitations and practical tips for radiologists.

    PubMed

    Bell, L T O; Gandhi, S

    2018-06-01

    To directly compare the accuracy and speed of analysis of two commercially available computer-assisted detection (CAD) programs in detecting colorectal polyps. In this retrospective single-centre study, patients who had colorectal polyps identified on computed tomography colonography (CTC) and subsequent lower gastrointestinal endoscopy were analysed using two commercially available CAD programs (CAD1 and CAD2). Results were compared against endoscopy to ascertain sensitivity and positive predictive value (PPV) for colorectal polyps. Time taken for CAD analysis was also calculated. CAD1 demonstrated a sensitivity of 89.8%, PPV of 17.6%, and mean analysis time of 125.8 seconds. CAD2 demonstrated a sensitivity of 75.5%, PPV of 44.0%, and mean analysis time of 84.6 seconds. The sensitivity and PPV for colorectal polyps and CAD analysis times can vary widely between current commercially available CAD programs, and there is still room for improvement. Generally, there is a trade-off between sensitivity and PPV, so further developments should aim to optimise both. Information on these factors should be made routinely available, so that an informed choice on their use can be made. This information could also potentially influence the radiologist's use of CAD results. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  9. HPAEC-PAD for oligosaccharide analysis-novel insights into analyte sensitivity and response stability.

    PubMed

    Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra

    2017-12-01

    The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates or as ingredients in food, such as in beverages and infant milk products, demands tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has been developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study giving an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, with a focus on oligosaccharides. The high sensitivity of the PAD comes at the cost of its stability, due to recession of the gold electrode. By an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect over time. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach which takes the inevitable and analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of the PAD response drop, leading to improved data normalization.

  10. Sensitivity Enhancement of FBG-Based Strain Sensor.

    PubMed

    Li, Ruiya; Chen, Yiyang; Tan, Yuegang; Zhou, Zude; Li, Tianliang; Mao, Jian

    2018-05-17

    A novel fiber Bragg grating (FBG)-based strain sensor with high sensitivity is presented in this paper. The proposed FBG-based strain sensor enhances sensitivity by bonding the FBG to a substrate with a lever structure. This configuration mechanically amplifies the strain of the FBG to enhance overall sensitivity. As the configuration has high stiffness, the proposed sensor can achieve a high resonant frequency and a wide dynamic working range. The sensing principle is presented, and the corresponding theoretical model is derived and validated. Experimental results demonstrate that the developed FBG-based strain sensor achieves an enhanced strain sensitivity of 6.2 pm/με, which is consistent with the theoretical analysis. The strain sensitivity of the developed sensor is 5.2 times that of a bare fiber Bragg grating strain sensor. The dynamic characteristics of this sensor are investigated through the finite element method (FEM) and experimental tests. The developed sensor exhibits an excellent strain-sensitivity-enhancing property over a wide frequency range. The proposed high-sensitivity FBG-based strain sensor can be used for small-amplitude micro-strain measurement in harsh industrial environments.
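    The quoted sensitivity figures follow from the standard FBG strain relation Δλ_B ≈ λ_B(1 − p_e)ε scaled by the lever's amplification ratio. A minimal sketch, assuming a 1550 nm grating and an effective photo-elastic coefficient p_e ≈ 0.22 (typical values, not stated in the abstract):

```python
# Hypothetical illustration of FBG strain sensitivity with lever amplification.
# lambda_b_nm, p_e, and the amplification ratio are assumed values, not taken
# from the paper.
def fbg_strain_sensitivity(lambda_b_nm: float, p_e: float, amplification: float = 1.0) -> float:
    """Wavelength shift per microstrain (pm/ue): lambda_B * (1 - p_e) * k * 1e-6."""
    # 1 nm = 1000 pm; one microstrain is eps = 1e-6
    return lambda_b_nm * 1000.0 * (1.0 - p_e) * 1e-6 * amplification

bare = fbg_strain_sensitivity(1550.0, 0.22)           # ~1.2 pm/ue for a bare grating
levered = fbg_strain_sensitivity(1550.0, 0.22, 5.2)   # with an assumed 5.2x lever ratio
print(f"bare: {bare:.2f} pm/ue, levered: {levered:.2f} pm/ue")
```

    With these assumed values the levered sensitivity lands near the reported 6.2 pm/με, consistent with the stated 5.2x enhancement over a bare grating.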

  11. Sensitivity Enhancement of FBG-Based Strain Sensor

    PubMed Central

    Chen, Yiyang; Tan, Yuegang; Zhou, Zude; Mao, Jian

    2018-01-01

    A novel fiber Bragg grating (FBG)-based strain sensor with high sensitivity is presented in this paper. The proposed FBG-based strain sensor enhances sensitivity by bonding the FBG to a substrate with a lever structure. This configuration mechanically amplifies the strain of the FBG to enhance overall sensitivity. As the configuration has high stiffness, the proposed sensor can achieve a high resonant frequency and a wide dynamic working range. The sensing principle is presented, and the corresponding theoretical model is derived and validated. Experimental results demonstrate that the developed FBG-based strain sensor achieves an enhanced strain sensitivity of 6.2 pm/με, which is consistent with the theoretical analysis. The strain sensitivity of the developed sensor is 5.2 times that of a bare fiber Bragg grating strain sensor. The dynamic characteristics of this sensor are investigated through the finite element method (FEM) and experimental tests. The developed sensor exhibits an excellent strain-sensitivity-enhancing property over a wide frequency range. The proposed high-sensitivity FBG-based strain sensor can be used for small-amplitude micro-strain measurement in harsh industrial environments. PMID:29772826

  12. The Sensitivity Analysis for the Flow Past Obstacles Problem with Respect to the Reynolds Number

    PubMed Central

    Ito, Kazufumi; Li, Zhilin; Qiao, Zhonghua

    2013-01-01

    In this paper, a numerical sensitivity analysis with respect to the Reynolds number for the flow past obstacles problem is presented. To carry out such an analysis, at each time step we need to solve the incompressible Navier-Stokes equations on irregular domains twice: once for the primary variables and once for the sensitivity variables with homogeneous boundary conditions. The Navier-Stokes solver is the augmented immersed interface method for Navier-Stokes equations on irregular domains. One of the most important contributions of this paper is that our analysis can predict the critical Reynolds number at which vortex shedding begins to develop in the wake of the obstacle. Some interesting experiments are shown to illustrate how the critical Reynolds number varies with different geometric settings. PMID:24910780

  13. The Sensitivity Analysis for the Flow Past Obstacles Problem with Respect to the Reynolds Number.

    PubMed

    Ito, Kazufumi; Li, Zhilin; Qiao, Zhonghua

    2012-02-01

    In this paper, a numerical sensitivity analysis with respect to the Reynolds number for the flow past obstacles problem is presented. To carry out such an analysis, at each time step we need to solve the incompressible Navier-Stokes equations on irregular domains twice: once for the primary variables and once for the sensitivity variables with homogeneous boundary conditions. The Navier-Stokes solver is the augmented immersed interface method for Navier-Stokes equations on irregular domains. One of the most important contributions of this paper is that our analysis can predict the critical Reynolds number at which vortex shedding begins to develop in the wake of the obstacle. Some interesting experiments are shown to illustrate how the critical Reynolds number varies with different geometric settings.

  14. Critical processes and parameters in the development of accident tolerant fuels drop-in capsule irradiation tests

    DOE PAGES

    Barrett, K. E.; Ellis, K. D.; Glass, C. R.; ...

    2015-12-01

    The goal of the Accident Tolerant Fuel (ATF) program is to develop the next generation of Light Water Reactor (LWR) fuels with improved performance, reliability, and safety characteristics during normal operations and accident conditions and with reduced waste generation. An irradiation test series has been defined to assess the performance of proposed ATF concepts under normal LWR operating conditions. The Phase I ATF irradiation test series is planned to be performed as a series of drop-in capsule tests to be irradiated in the Advanced Test Reactor (ATR) operated by the Idaho National Laboratory (INL). Design, analysis, and fabrication processes for ATR drop-in capsule experiment preparation are presented in this paper to demonstrate the importance of special design considerations, parameter sensitivity analysis, and precise fabrication and inspection techniques for the innovative materials used in ATF experiment assemblies. A Taylor Series Method sensitivity analysis approach was used to identify the most critical variables in cladding and rodlet stress, temperature, and pressure calculations for design analyses. The results showed that internal rodlet pressure calculations are most sensitive to the fission gas release rate uncertainty, while temperature calculations are most sensitive to cladding I.D. and O.D. dimensional uncertainty. The analysis showed that stress calculations are most sensitive to rodlet internal pressure uncertainties; however, the results also indicated that uncertainties in the inside radius, outside radius, and internal pressure were all magnified as they propagate through the stress equation. This study demonstrates the importance for ATF concept development teams to provide the fabricators with as much information as possible about the material properties and behavior observed in prototype testing, mock-up fabrication and assembly, and chemical and mechanical testing of the materials that may have been performed in the concept development phase. Special handling, machining, welding, and inspection requirements for the materials, if known, should also be communicated to the experiment fabrication and inspection team.
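    The Taylor Series Method referred to above propagates input uncertainties through first-order partial derivatives: σ_f² = Σᵢ (∂f/∂xᵢ)² σᵢ². A hedged sketch on a thin-wall hoop-stress stand-in model, σ = P·r/t; the actual ATF stress equations and uncertainty values are not given in the abstract, and the nominal numbers below are illustrative only:

```python
import math

# First-order Taylor Series Method uncertainty propagation, sketched on a
# thin-wall hoop-stress stand-in model (sigma = P*r/t). The real ATF stress
# equations and input uncertainties are assumptions for illustration.
def hoop_stress(P, r, t):
    return P * r / t

def tsm_uncertainty(f, x, dx, h=1e-6):
    """Return (sigma_f, variance shares): sigma_f^2 = sum_i (df/dx_i)^2 dx_i^2,
    with central-difference partial derivatives."""
    var, contrib = 0.0, []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h * x[i]
        xm[i] -= h * x[i]
        dfdx = (f(*xp) - f(*xm)) / (2.0 * h * x[i])
        term = (dfdx * dx[i]) ** 2
        contrib.append(term)
        var += term
    return math.sqrt(var), [c / var for c in contrib]

# Assumed nominals and uncertainties: P = 5 +/- 0.5 MPa, r = 4.75 +/- 0.05 mm,
# t = 0.57 +/- 0.02 mm
sigma_u, shares = tsm_uncertainty(hoop_stress, [5.0, 4.75, 0.57], [0.5, 0.05, 0.02])
print(sigma_u, shares)  # the shares show which input dominates the stress uncertainty
```

    With these assumed inputs the internal-pressure term dominates the variance, mirroring the abstract's finding that stress calculations are most sensitive to internal pressure uncertainty.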

  15. Highly sensitive index of sympathetic activity based on time-frequency spectral analysis of electrodermal activity.

    PubMed

    Posada-Quintero, Hugo F; Florian, John P; Orjuela-Cañón, Álvaro D; Chon, Ki H

    2016-09-01

    Time-domain indices of electrodermal activity (EDA) have been used as a marker of sympathetic tone. However, they often show high variation between subjects and low consistency, which has precluded their general use as a marker of sympathetic tone. To examine whether power spectral density analysis of EDA can provide more consistent results, we recently performed a variety of sympathetic tone-evoking experiments (43). We found a significant increase in the spectral power in the frequency range of 0.045 to 0.25 Hz when sympathetic tone-evoking stimuli were induced. The sympathetic tone assessed by the power spectral density of EDA was found to have lower variation and more sensitivity for certain, but not all, stimuli compared with the time-domain analysis of EDA. We surmise that this lack of sensitivity in certain sympathetic tone-inducing conditions with time-invariant spectral analysis of EDA may lie in its inability to characterize time-varying dynamics of the sympathetic tone. To overcome the disadvantages of time-domain and time-invariant power spectral indices of EDA, we developed a highly sensitive index of sympathetic tone based on time-frequency analysis of EDA signals. Its efficacy was tested using experiments designed to elicit sympathetic dynamics. Twelve subjects underwent four tests known to elicit sympathetic tone arousal: cold pressor, tilt table, stand test, and the Stroop task. We hypothesized that a more sensitive measure of sympathetic control can be developed using time-varying spectral analysis. Variable frequency complex demodulation, a recently developed technique for time-frequency analysis, was used to obtain spectral amplitudes associated with EDA. We found that the time-varying spectral frequency band 0.08-0.24 Hz was most responsive to stimulation. Spectral power for frequencies higher than 0.24 Hz was determined to be unrelated to the sympathetic dynamics because it comprised less than 5% of the total power. The mean value of the time-varying spectral amplitudes in the frequency band 0.08-0.24 Hz was used as the index of sympathetic tone, termed TVSymp. TVSymp was found to be overall the most sensitive to the stimuli, as evidenced by a low coefficient of variation (0.54) and higher consistency (intra-class correlation, 0.96), sensitivity (Youden's index > 0.75), and area under the receiver operating characteristic (ROC) curve (>0.8, accuracy > 0.88) compared with time-domain and time-invariant spectral indices, including heart rate variability. Copyright © 2016 the American Physiological Society.

  16. Analyzing reflective narratives to assess the ethical reasoning of pediatric residents.

    PubMed

    Moon, Margaret; Taylor, Holly A; McDonald, Erin L; Hughes, Mark T; Beach, Mary Catherine; Carrese, Joseph A

    2013-01-01

    A limiting factor in ethics education in medical training has been difficulty in assessing competence in ethics. This study was conducted to test the concept that content analysis of pediatric residents' personal reflections about ethics experiences can identify changes in ethical sensitivity and reasoning over time. Analysis of written narratives focused on two of our ethics curriculum's goals: 1) to raise sensitivity to ethical issues in everyday clinical practice and 2) to enhance critical reflection on personal and professional values as they affect patient care. Content analysis of written reflections was guided by a tool developed to identify and assess the level of ethical reasoning in eight domains determined to be important aspects of ethical competence. Based on the assessment of narratives written at two times (12 to 16 months apart) during their training, residents showed significant progress in two specific domains: use of professional values and use of personal values. Residents did not show decline in ethical reasoning in any domain. This study demonstrates that content analysis of personal narratives may provide a useful method for assessing developing ethical sensitivity and reasoning.

  17. [Analysis and experimental verification of sensitivity and SNR of laser warning receiver].

    PubMed

    Zhang, Ji-Long; Wang, Ming; Tian, Er-Ming; Li, Xiao; Wang, Zhi-Bin; Zhang, Yue

    2009-01-01

    To counter the increasingly serious threat from hostile lasers in modern warfare, research on laser warning technology and systems is urgently needed; sensitivity and signal-to-noise ratio (SNR) are two important performance parameters of a laser warning system. In the present paper, based on signal statistical detection theory, a method for calculating the sensitivity and SNR of a coherent-detection laser warning receiver (LWR) is proposed. First, the probability distributions of the laser signal and receiver noise were analyzed. Second, based on threshold detection theory and the Neyman-Pearson criterion, the signal current equation was established by introducing a detection probability factor and a false alarm rate factor; then, the mathematical expressions for sensitivity and SNR were deduced. Finally, using this method, the sensitivity and SNR of the sinusoidal grating laser warning receiver developed by our group were analyzed. The theoretical calculations and experimental results indicate that the SNR analysis method is feasible and can be used in performance analysis of LWRs.
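    The threshold-detection reasoning in this abstract can be illustrated with the textbook Gaussian Neyman-Pearson model: fix the false-alarm rate, derive the normalized detection threshold, then compute the SNR needed for a target detection probability. This is a generic sketch, not the paper's coherent-detection statistics:

```python
import math

# Neyman-Pearson detection of a known-amplitude signal in Gaussian noise.
# Pfa = Q(gamma/sigma) fixes the threshold gamma; Pd = Q((gamma - A)/sigma)
# then gives the required amplitude A/sigma = Qinv(Pfa) - Qinv(Pd).
def q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def q_inv(p, lo=-10.0, hi=10.0):
    """Invert Q by bisection (Q is strictly decreasing)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if q(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

pfa, pd = 1e-6, 0.99
threshold = q_inv(pfa)                      # threshold in units of noise sigma
amp_over_sigma = q_inv(pfa) - q_inv(pd)     # required signal amplitude / noise sigma
snr_db = 20.0 * math.log10(amp_over_sigma)
print(f"threshold = {threshold:.2f} sigma, required SNR = {snr_db:.1f} dB")
```

    The false-alarm and detection probabilities above are illustrative design targets, not values from the paper.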

  18. Statistical sensitivity analysis of a simple nuclear waste repository model

    NASA Astrophysics Data System (ADS)

    Ronen, Y.; Lucius, J. L.; Blow, E. M.

    1980-06-01

    A preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository is presented. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and their radius. Other important parameters were those related to salt properties at a point of interest in the repository.

  19. Modeling and Analysis of a Combined Stress-Vibration Fiber Bragg Grating Sensor

    PubMed Central

    Yao, Kun; Lin, Qijing; Jiang, Zhuangde; Zhao, Na; Tian, Bian; Shi, Peng; Peng, Gang-Ding

    2018-01-01

    A combined stress-vibration sensor was developed to measure stress and vibration simultaneously based on fiber Bragg grating (FBG) technology. The sensor is composed of two FBGs and a specially designed stainless steel plate. The two FBGs sense vibration and stress, and the sensor realizes temperature compensation by itself. The stainless steel plate significantly increases the sensitivity of the vibration measurement. Theoretical analysis and the Finite Element Method (FEM) were used to analyze the sensor’s working mechanism. As demonstrated by the analysis, the sensor has a working range of 0–6000 Hz for vibration sensing and 0–100 MPa for stress sensing, respectively. The corresponding vibration sensitivity is 0.46 pm/g, and the resulting stress sensitivity is 5.94 pm/MPa, while the nonlinearity error for vibration and stress measurement is 0.77% and 1.02%, respectively. Compared to general FBGs, the vibration sensitivity of this sensor is 26.2 times higher. Therefore, the developed sensor can be used to concurrently detect vibration and stress. As the sensor has a height of 1 mm and a weight of 1.15 g, it is well suited to miniaturization and integration. PMID:29494544

  20. Modeling and Analysis of a Combined Stress-Vibration Fiber Bragg Grating Sensor.

    PubMed

    Yao, Kun; Lin, Qijing; Jiang, Zhuangde; Zhao, Na; Tian, Bian; Shi, Peng; Peng, Gang-Ding

    2018-03-01

    A combined stress-vibration sensor was developed to measure stress and vibration simultaneously based on fiber Bragg grating (FBG) technology. The sensor is composed of two FBGs and a specially designed stainless steel plate. The two FBGs sense vibration and stress, and the sensor realizes temperature compensation by itself. The stainless steel plate significantly increases the sensitivity of the vibration measurement. Theoretical analysis and the Finite Element Method (FEM) were used to analyze the sensor's working mechanism. As demonstrated by the analysis, the sensor has a working range of 0-6000 Hz for vibration sensing and 0-100 MPa for stress sensing, respectively. The corresponding vibration sensitivity is 0.46 pm/g, and the resulting stress sensitivity is 5.94 pm/MPa, while the nonlinearity error for vibration and stress measurement is 0.77% and 1.02%, respectively. Compared to general FBGs, the vibration sensitivity of this sensor is 26.2 times higher. Therefore, the developed sensor can be used to concurrently detect vibration and stress. As the sensor has a height of 1 mm and a weight of 1.15 g, it is well suited to miniaturization and integration.

  1. Design sensitivity analysis and optimization tool (DSO) for sizing design applications

    NASA Technical Reports Server (NTRS)

    Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa

    1992-01-01

    The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.

  2. A sensitivity analysis of "Forests on the Edge: Housing Development on America's Private Forests."

    Treesearch

    Eric M. White; Ralph J. Alig; Lisa G. Mahal; David M. Theobald

    2009-01-01

    The original Forests on the Edge report (FOTE 1) indicated that 44.2 million acres of private forest land was projected to experience substantial increases in residential development in the coming decades. In this study, we examined the sensitivity of the FOTE 1 results to four factors: (1) use of updated private land and forest cover spatial data and a revised model...

  3. Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.

    2007-01-01

    To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
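    The tolerance notation described above lends itself to a simple parser-plus-sampler: find every "value +/- tol" token in an input line and replace it with a random draw for each Monte Carlo realization. A minimal sketch; the field names and the uniform sampling distribution are assumptions, not the paper's specifics:

```python
import random
import re

# Hedged sketch: parse "v +/- t" tokens from an input-file line and sample
# them for a Monte Carlo sensitivity run. Field names are hypothetical.
TOL = re.compile(r"(-?\d+(?:\.\d+)?)\s*\+/-\s*(\d+(?:\.\d+)?)")

def sample_line(line: str, rng: random.Random) -> str:
    """Replace every 'v +/- t' token with a draw from U(v - t, v + t)."""
    def draw(m):
        v, t = float(m.group(1)), float(m.group(2))
        return f"{rng.uniform(v - t, v + t):.6g}"
    return TOL.sub(draw, line)

rng = random.Random(0)
line = "wall_temp = 5.25 +/- 0.01   emissivity = 0.80 +/- 0.05"
for _ in range(3):
    print(sample_line(line, rng))  # each realization feeds one simulation run
```

    Because the substitution works on raw text, the approach is independent of any particular input-file grammar, which is the point the abstract makes.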

  4. Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines

    NASA Astrophysics Data System (ADS)

    Massa, Luca

    A computational tool is developed for the time-accurate sensitivity analysis of the stage performance of hot gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high-temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated in the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial-flow turbines, is developed. A sensitivity analysis capability is added to the flow solver, enabling it to accurately evaluate the derivatives of time-varying output functions. The complex Taylor's series expansion (CTSE) technique is reviewed, and two formulations of it are used to demonstrate the accuracy and time dependency of the differentiation process. The results are compared with finite difference (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high-fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are proposed and applied. Selective differentiation of the method for solving the nonlinear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time-dependent sensitivity derivatives are computed in run times comparable to those required by the FD approach.
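    The CTSE evaluates derivatives via a complex-step perturbation, f'(x) ≈ Im[f(x + ih)]/h, which involves no subtraction of nearly equal quantities and so avoids the cancellation error of finite differences. A minimal sketch on a standard test function, not the turbine solver:

```python
import cmath

# Complex-step (CTSE) derivative versus a forward finite difference, on a
# simple analytic test function standing in for the flow solver output.
def f(x):
    return cmath.exp(x) / cmath.sqrt(cmath.sin(x) ** 3 + cmath.cos(x) ** 3)

def complex_step(f, x, h=1e-30):
    # No subtractive cancellation, so h can be tiny: near machine accuracy.
    return f(complex(x, h)).imag / h

def forward_diff(f, x, h=1e-8):
    # Accuracy limited by the trade-off between truncation and round-off error.
    return (f(x + h).real - f(x).real) / h

x0 = 1.5
print(complex_step(f, x0), forward_diff(f, x0))
```

    The step size in the complex method can be driven far below what any real-valued difference scheme tolerates, which is why the abstract reports the CTSE as more accurate than FD.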

  5. Chronic pesticide poisoning from persistent low-dose exposures in Ecuadorean floriculture workers: toward validating a low-cost test battery.

    PubMed

    Breilh, Jaime; Pagliccia, Nino; Yassi, Annalee

    2012-01-01

    Chronic pesticide poisoning is difficult to detect. We sought to develop a low-cost test battery for settings such as Ecuador's floriculture industry. First we had to develop a case definition; as with all occupational diseases, a case had to have both a sufficient effective dose and associated health effects. For the former, using canonical discriminant analysis, we found that adding measures of protection and overall environmental stressors to occupational category and duration of exposure was useful. For the latter, factor analysis suggested three distinct manifestations of pesticide poisoning. We then determined the sensitivity and specificity of various combinations of symptoms and simple neurotoxicity tests from the Pentox questionnaire, and found that doing so increased sensitivity and specificity compared to use of acetylcholinesterase alone--the current screening standard. While sensitivity and specificity varied with different case definitions, our results support the development of a low-cost test battery for screening in such settings.

  6. Application of global sensitivity analysis methods to Takagi-Sugeno-Kang rainfall-runoff fuzzy models

    NASA Astrophysics Data System (ADS)

    Jacquin, A. P.; Shamseldin, A. Y.

    2009-04-01

    This study analyses the sensitivity of the parameters of Takagi-Sugeno-Kang rainfall-runoff fuzzy models previously developed by the authors. These models can be classified in two types, where the first type is intended to account for the effect of changes in catchment wetness and the second type incorporates seasonality as a source of non-linearity in the rainfall-runoff relationship. The sensitivity analysis is performed using two global sensitivity analysis methods, namely Regional Sensitivity Analysis (RSA) and Sobol's Variance Decomposition (SVD). In general, the RSA method has the disadvantage of not being able to detect sensitivities arising from parameter interactions. By contrast, the SVD method is suitable for analysing models where the model response surface is expected to be affected by interactions at a local scale and/or local optima, such as the rainfall-runoff fuzzy models analysed in this study. Data from six catchments of different geographical locations and sizes are used in the sensitivity analysis. The sensitivity of the model parameters is analysed in terms of two measures of goodness of fit, assessing the model performance from different points of view. These measures are the Nash-Sutcliffe criterion and the index of volumetric fit. The results of the study show that the sensitivity of the model parameters depends on both the type of non-linear effect (i.e. changes in catchment wetness or seasonality) that dominates the catchment's rainfall-runoff relationship and the measure used to assess the model performance. Acknowledgements: This research was supported by FONDECYT, Research Grant 11070130. We would also like to express our gratitude to Prof. Kieran M. O'Connor from the National University of Ireland, Galway, for providing the data used in this study.
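    Sobol's Variance Decomposition attributes shares of the output variance to each input; a first-order index can be estimated with a pick-freeze (Saltelli-type) scheme. A hedged sketch on a toy linear model standing in for the rainfall-runoff fuzzy models, which are not reproduced here:

```python
import random

# Pick-freeze estimator of first-order Sobol indices on a toy additive model
# with independent U(0,1) inputs. For a linear model the analytic indices are
# S_i = c_i^2 / sum_k c_k^2, which lets the estimate be checked.
def model(x):
    return 4.0 * x[0] + 1.0 * x[1] + 0.1 * x[2]  # analytic S ~ (0.941, 0.059, 0.0006)

def sobol_first_order(model, dim, n=20000, seed=1):
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(a) for a in A]
    yB = [model(b) for b in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    S = []
    for i in range(dim):
        # Evaluate on A with its i-th column taken from B (x_i shared with B).
        yABi = [model(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        Vi = sum(yb * (yab - ya) for yb, yab, ya in zip(yB, yABi, yA)) / n
        S.append(Vi / var)
    return S

S = sobol_first_order(model, 3)
print(S)  # x[0] should dominate the variance
```

    In practice the model evaluations dominate the cost (n·(dim + 2) runs), which is why variance-based methods are reserved for models cheap enough to run many thousands of times.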

  7. Enhanced electrochemical nanoring electrode for analysis of cytosol in single cells.

    PubMed

    Zhuang, Lihong; Zuo, Huanzhen; Wu, Zengqiang; Wang, Yu; Fang, Danjun; Jiang, Dechen

    2014-12-02

    A microelectrode array has been applied to single cell analysis with relatively high throughput; however, the cells were typically cultured on the microelectrodes under cell-size microwell traps, making it difficult to functionalize the electrode surface for higher detection sensitivity. Here, nanoring electrodes embedded under the microwell traps were fabricated to isolate the electrode surface from the cell support, so that the electrode surface can be modified to obtain enhanced electrochemical sensitivity for single cell analysis. Moreover, the nanometer-sized electrode permitted faster diffusion of the analyte to the surface for an additional improvement in sensitivity, which was evidenced by electrochemical characterization and simulation. To demonstrate the concept of the functionalized nanoring electrode for single cell analysis, the electrode surface was deposited with Prussian blue to detect intracellular hydrogen peroxide in single cells. Currents of hundreds of picoamperes were observed on our functionalized nanoring electrode, exhibiting the enhanced electrochemical sensitivity. The success in achieving a functionalized nanoring electrode will benefit the development of high-throughput single cell electrochemical analysis.

  8. Language evaluation protocol for children aged 2 months to 23 months: analysis of sensitivity and specificity.

    PubMed

    Labanca, Ludimila; Alves, Cláudia Regina Lindgren; Bragança, Lidia Lourenço Cunha; Dorim, Diego Dias Ramos; Alvim, Cristina Gonçalves; Lemos, Stela Maris Aguiar

    2015-01-01

    To establish cutoff points for the analysis of the Behavior Observation Form (BOF) of children aged 2 to 23 months and to evaluate sensitivity and specificity by age group and domain (Emission, Reception, and Cognitive Aspects of Language). The sample consisted of 752 children who underwent the BOF. Each child was classified as having appropriate language development for the age or having possible risk of language impairment. Performance Indicators (PI) were calculated in each domain as well as the overall PI in all domains. The values for sensitivity and specificity were also calculated. The cutoff points for possible risk of language impairment for each domain and each age group were obtained using the receiver operating characteristic curve. The results of the study revealed that one-third of the assessed children are at risk of language impairment in the first two years of life. The analysis of the BOF showed high sensitivity (>90%) in all categories and in all age groups; however, the chance of false-positive results was higher than 20% in the majority of aspects evaluated. It was possible to establish the cutoff points for all categories and age groups with good correlation between sensitivity and specificity, except for the age group of 2 to 6 months. This study provides important contributions to the discussion on the evaluation of the language development of children younger than 2 years.
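    Cutoff selection from a receiver operating characteristic curve is commonly done by maximizing Youden's index (sensitivity + specificity − 1) over candidate thresholds. A minimal sketch on made-up scores, not the BOF data:

```python
# Choosing a screening cutoff by maximizing Youden's index over candidate
# scores. The scores and labels below are invented for illustration; label 1
# means "at risk", and lower scores mean worse performance.
def youden_cutoff(scores, labels):
    """Return (best_cutoff, sensitivity, specificity); flag when score <= cutoff."""
    best = (None, 0.0, 0.0, -1.0)
    pos = sum(labels)
    neg = len(labels) - pos
    for c in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s <= c and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s > c and y == 0)
        sens, spec = tp / pos, tn / neg
        j = sens + spec - 1.0
        if j > best[3]:
            best = (c, sens, spec, j)
    return best[:3]

scores = [3, 4, 5, 6, 7, 8, 9, 10, 5, 6, 9, 10]
labels = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0]
print(youden_cutoff(scores, labels))
```

    Sweeping every observed score traces the ROC curve point by point; the Youden-optimal threshold is the point farthest above the chance diagonal, balancing sensitivity against the false-positive rate.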

  9. Pain sensitivity profiles in patients with advanced knee osteoarthritis

    PubMed Central

    Frey-Law, Laura A.; Bohr, Nicole L.; Sluka, Kathleen A.; Herr, Keela; Clark, Charles R.; Noiseux, Nicolas O.; Callaghan, John J; Zimmerman, M Bridget; Rakel, Barbara A.

    2016-01-01

    The development of patient profiles to subgroup individuals on a variety of variables has gained attention as a potential means to better inform clinical decision-making. Patterns of pain sensitivity response specific to quantitative sensory testing (QST) modality have been demonstrated in healthy subjects. It has not been determined whether these patterns persist in a knee osteoarthritis population. In a sample of 218 participants, 19 QST measures, along with pain, psychological factors, self-reported function, and quality of life, were assessed prior to total knee arthroplasty. Component analysis was used to identify commonalities across the 19 QST assessments and produce standardized pain sensitivity factors. Cluster analysis then grouped individuals that exhibited similar patterns of standardized pain sensitivity component scores. The QST resulted in four pain sensitivity components: heat, punctate, temporal summation, and pressure. Cluster analysis resulted in five pain sensitivity profiles: a “low pressure pain” group, an “average pain” group, and three “high pain” sensitivity groups who were sensitive to different modalities (punctate, heat, and temporal summation). Pain and function differed between pain sensitivity profiles, along with sex distribution; however, no differences in OA grade, medication use, or psychological traits were found. Residualizing the QST data by age and sex resulted in similar components and pain sensitivity profiles. Further, these profiles are surprisingly similar to those reported in healthy populations, suggesting that individual differences in pain sensitivity are a robust finding even in an older population with significant disease. PMID:27152688
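
    A minimal sketch of the cluster-on-standardized-scores idea described above, using toy data and a plain k-means in place of the study's component and cluster analysis; every number here is invented.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min(np.stack([np.linalg.norm(X - c, axis=1) for c in centers]), axis=0)
        centers.append(X[int(d.argmax())])    # next center: farthest point so far
    centers = np.array(centers)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)          # assign each subject to a profile
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(2)
# Two synthetic "pain sensitivity" profiles across 4 toy QST measures.
low = rng.normal(-1.0, 0.3, size=(40, 4))
high = rng.normal(+1.0, 0.3, size=(40, 4))
X = np.vstack([low, high])
X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each measure
labels, centers = kmeans(X, k=2)
```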

  10. Sensitivity analysis of the add-on price estimate for the silicon web growth process

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1981-01-01

    The web growth process, a silicon-sheet technology option developed for the flat plate solar array (FSA) project, was examined. Base case data for the technical and cost parameters are projected for the technical and commercial readiness phases of the FSA project. The process add-on price is analyzed using the base case data for cost parameters such as equipment, space, direct labor, materials, and utilities, and for production parameters such as growth rate and run length, with a computer program developed specifically to perform the sensitivity analysis with improved price estimation. Silicon price, sheet thickness, and cell efficiency are also discussed.
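
    A one-at-a-time perturbation of the kind described can be sketched as below; the price formula and base-case numbers are invented placeholders, not the study's cost model.

```python
# Toy add-on price model: total cost per unit spread over relative throughput.
base_cost = {"equipment": 2.0, "labor": 1.5, "materials": 1.0, "utilities": 0.5}
base_growth = 1.0   # relative throughput from growth rate and run length

def addon_price(cost, growth):
    return sum(cost.values()) / growth

sens = {}
for name in base_cost:
    bumped = dict(base_cost)
    bumped[name] *= 1.10                      # +10% one parameter at a time
    sens[name] = addon_price(bumped, base_growth) - addon_price(base_cost, base_growth)
# Higher throughput lowers the price, so its sensitivity is negative.
sens["growth"] = addon_price(base_cost, base_growth * 1.10) - addon_price(base_cost, base_growth)
```

    Ranking `sens` by magnitude shows which parameters dominate the price estimate, which is the point of such an analysis.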

  11. Modeling, design, packing and experimental analysis of liquid-phase shear-horizontal surface acoustic wave sensors

    NASA Astrophysics Data System (ADS)

    Pollard, Thomas B

    Recent advances in microbiology, computational capabilities, and microelectromechanical-system fabrication techniques permit modeling, design, and fabrication of low-cost, miniature, sensitive, and selective liquid-phase sensors and lab-on-a-chip systems. Such devices are expected to replace expensive, time-consuming, and bulky laboratory-based testing equipment. Potential applications for such devices include: fluid characterization for material science and industry; chemical analysis in medicine and pharmacology; study of biological processes; food analysis; chemical kinetics analysis; and environmental monitoring. When combined with liquid-phase packaging, sensors based on surface-acoustic-wave (SAW) technology are considered strong candidates. For this reason, such devices are the focus of this work, with emphasis placed on device modeling and packaging for liquid-phase operation. Regarding modeling, topics considered include mode excitation efficiency of transducers; mode sensitivity based on guiding structure materials/geometries; and use of new piezoelectric materials. On packaging, topics considered include package interfacing with SAW devices, and minimization of packaging effects on device performance. In this work, novel numerical models are theoretically developed and implemented to study propagation and transduction characteristics of sensor designs using wave/constitutive equations, Green's functions, and boundary/finite element methods. Using developed simulation tools that consider the finite thickness of all device electrodes, transduction efficiency for SAW transducers with neighboring uniform or periodic guiding electrodes is reported for the first time. Results indicate finite electrode thickness strongly affects efficiency. Using dense electrodes, efficiency is shown to approach 92% and 100% for uniform and periodic electrode guiding, respectively, yielding improved sensor detection limits.
A numerical sensitivity analysis is presented targeting viscosity using uniform-electrode and shear-horizontal mode configurations on potassium-niobate, langasite, and quartz substrates. Optimum configurations are determined yielding maximum sensitivity. Results show mode propagation-loss and sensitivity to viscosity are correlated by a factor independent of substrate material. The analysis is useful for designing devices meeting sensitivity and signal level requirements. A novel, rapid and precise microfluidic chamber alignment/bonding method was developed for SAW platforms. The package is shown to have little effect on device performance and permits simple macrofluidic interfacing. Lastly, prototypes were designed, fabricated, and tested for viscosity and biosensor applications; results show ability to detect as low as 1% glycerol in water and surface-bound DNA crosslinking.

  12. Spacecraft design sensitivity for a disaster warning satellite system

    NASA Technical Reports Server (NTRS)

    Maloy, J. E.; Provencher, C. E.; Leroy, B. E.; Braley, R. C.; Shumaker, H. A.

    1977-01-01

    A disaster warning satellite (DWS) is described for warning the general public of impending natural catastrophes. The concept is responsive to NOAA requirements and maximizes the use of ATS-6 technology. Upon completion of concept development, the study was extended to establishing the sensitivity of the DWS spacecraft power, weight, and cost to variations in both warning and conventional communications functions. The results of this sensitivity analysis are presented.

  13. Sensitivity Analysis of the Static Aeroelastic Response of a Wing

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.

    1993-01-01

    A technique to obtain the sensitivity of the static aeroelastic response of a three dimensional wing model is designed and implemented. The formulation is quite general and accepts any aerodynamic and structural analysis capability. A program to combine the discipline level, or local, sensitivities into global sensitivity derivatives is developed. A variety of representations of the wing pressure field are developed and tested to determine the most accurate and efficient scheme for representing the field outside of the aerodynamic code. Chebyshev polynomials are used to globally fit the pressure field. This approach had some difficulties in representing local variations in the field, so a variety of local interpolation polynomial pressure representations are also implemented. These panel based representations use a constant pressure value, a bilinearly interpolated value, or a biquadratically interpolated value. The interpolation polynomial approaches do an excellent job of reducing the numerical problems of the global approach for comparable computational effort. Regardless of the pressure representation used, sensitivity and response results with excellent accuracy have been produced for large integrated quantities such as wing tip deflection and trim angle of attack. The sensitivities of such things as individual generalized displacements have been found with fair accuracy. In general, accuracy is found to be proportional to the size of the derivatives relative to the quantity itself.
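
    The global-versus-local representation trade-off described above can be illustrated in one dimension: a global Chebyshev fit struggles with a sharp local feature that piecewise-linear interpolation tracks easily. The profile, grid, and polynomial degree here are invented for illustration.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 101)
pressure = np.tanh(25.0 * x)          # shock-like local variation in the "field"

# Global fit: one degree-8 Chebyshev series over the whole interval.
cheb = np.polynomial.Chebyshev.fit(x, pressure, deg=8)

# Evaluate both representations on a finer grid and compare to the truth.
xq = np.linspace(-1.0, 1.0, 401)
truth = np.tanh(25.0 * xq)
err_global = np.max(np.abs(cheb(xq) - truth))
err_local = np.max(np.abs(np.interp(xq, x, pressure) - truth))
```

    On this profile the local interpolant's worst-case error falls well below the global fit's, mirroring the difficulty the global approach had with local variations in the pressure field.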

  14. An Equal Employment Opportunity Sensitivity Workshop

    ERIC Educational Resources Information Center

    Patten, Thomas H., Jr.; Dorey, Lester E.

    1972-01-01

    The equal employment opportunity sensitivity workshop seems to be a useful training device for getting an organization started on developing black and white change agents. This is a report on the establishment of such a workshop at the U.S. Army Tank Automotive Command (TACOM); it includes charts of the design, characteristics, and analysis of results of the program…

  15. USEPA EXAMPLE EXIT LEVEL ANALYSIS RESULTS

    EPA Science Inventory

    Developed by NERL/ERD for the Office of Solid Waste, the enclosed product provides an example uncertainty analysis (UA) and initial process-based sensitivity analysis (SA) of hazardous waste "exit" concentrations for 7 chemicals and metals using the 3MRA Version 1.0 Modeling Syst...

  16. Sobol′ sensitivity analysis of NAPL-contaminated aquifer remediation process based on multiple surrogates

    NASA Astrophysics Data System (ADS)

    Luo, Jiannan; Lu, Wenxi

    2014-06-01

    Sobol′ sensitivity analyses based on different surrogates were performed on a trichloroethylene (TCE)-contaminated aquifer to assess the sensitivity of the design variables of remediation duration, surfactant concentration, and injection rates at four wells to remediation efficiency. First, the surrogate models of a multi-phase flow simulation model were constructed by applying radial basis function artificial neural network (RBFANN) and Kriging methods, and the two models were then compared. Based on the developed surrogate models, the Sobol′ method was used to calculate the sensitivity indices of the design variables which affect the remediation efficiency. The coefficient of determination (R2) and the mean square error (MSE) of these two surrogate models demonstrated that both models had acceptable approximation accuracy; furthermore, the approximation accuracy of the Kriging model was slightly better than that of the RBFANN model. Sobol′ sensitivity analysis results demonstrated that the remediation duration was the most important variable influencing remediation efficiency, followed by rates of injection at wells 1 and 3, while rates of injection at wells 2 and 4 and the surfactant concentration had negligible influence on remediation efficiency. In addition, high-order sensitivity indices were all smaller than 0.01, which indicates that interaction effects of these six factors were practically insignificant. The proposed Sobol′ sensitivity analysis based on surrogates is an effective tool for calculating sensitivity indices, because it shows the relative contribution of the design variables (individuals and interactions) to the output performance variability with a limited number of runs of a computationally expensive simulation model. The sensitivity analysis results lay a foundation for optimization of the groundwater remediation process.
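
    The first-order Sobol′ indices described above can be estimated with a pick-freeze scheme. The sketch below runs it on a toy additive stand-in for the surrogate model, with invented coefficients (a dominant "duration" input, one influential "rate", one inert input).

```python
import numpy as np

def sobol_first_order(f, d, n=20000, seed=0):
    """Pick-freeze estimate of first-order Sobol' indices for f on [0,1]^d."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = f(A), f(B)
    var = fA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # A with column i taken from B
        # f(B) and f(ABi) share only input i, isolating its variance share
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

# Toy additive model: input 0 dominates, input 1 matters a little, input 2 is inert.
f = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1] + 0.0 * X[:, 2]
S = sobol_first_order(f, d=3)        # analytic values: 16/17, 1/17, 0
```

    As in the study, the cheap surrogate stands in for the expensive simulator, so the thousands of model evaluations the estimator needs stay affordable.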

  17. Sensitivity and Uncertainty Analysis of the GFR MOX Fuel Subassembly

    NASA Astrophysics Data System (ADS)

    Lüley, J.; Vrban, B.; Čerba, Š.; Haščík, J.; Nečas, V.; Pelloni, S.

    2014-04-01

    We performed sensitivity and uncertainty analysis as well as benchmark similarity assessment of the MOX fuel subassembly designed for the Gas-Cooled Fast Reactor (GFR) as a representative material of the core. Material composition was defined for each assembly ring separately allowing us to decompose the sensitivities not only for isotopes and reactions but also for spatial regions. This approach was confirmed by direct perturbation calculations for chosen materials and isotopes. Similarity assessment identified only ten partly comparable benchmark experiments that can be utilized in the field of GFR development. Based on the determined uncertainties, we also identified main contributors to the calculation bias.
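
    The "direct perturbation" confirmation mentioned above is, generically, a finite-difference re-run of the model with one input nudged, compared against the computed sensitivity coefficient. A toy response function stands in for the reactor model here; all numbers are invented.

```python
import numpy as np

def response(x):
    """Stand-in for a model response (e.g., a multiplication factor)."""
    return x[0]**2 + 3.0 * x[1]

x0 = np.array([1.0, 2.0])
analytic = np.array([2.0 * x0[0], 3.0])   # hand-derived sensitivity coefficients

# Direct perturbation: central finite difference, one input at a time.
h = 1e-6
fd = np.array([(response(x0 + h * e) - response(x0 - h * e)) / (2.0 * h)
               for e in np.eye(2)])
```

    Agreement between `fd` and `analytic` is the check the abstract describes, applied there per material and isotope.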

  18. Photocleavable DNA barcode-antibody conjugates allow sensitive and multiplexed protein analysis in single cells.

    PubMed

    Agasti, Sarit S; Liong, Monty; Peterson, Vanessa M; Lee, Hakho; Weissleder, Ralph

    2012-11-14

    DNA barcoding is an attractive technology, as it allows sensitive and multiplexed target analysis. However, DNA barcoding of cellular proteins remains challenging, primarily because barcode amplification and readout techniques are often incompatible with the cellular microenvironment. Here we describe the development and validation of a photocleavable DNA barcode-antibody conjugate method for rapid, quantitative, and multiplexed detection of proteins in single live cells. Following target binding, this method allows DNA barcodes to be photoreleased in solution, enabling easy isolation, amplification, and readout. As a proof of principle, we demonstrate sensitive and multiplexed detection of protein biomarkers in a variety of cancer cells.

  19. Design sensitivity analysis of rotorcraft airframe structures for vibration reduction

    NASA Technical Reports Server (NTRS)

    Murthy, T. Sreekanta

    1987-01-01

    Optimization of rotorcraft structures for vibration reduction was studied. The objective of this study is to develop practical computational procedures for structural optimization of airframes subject to steady-state vibration response constraints. One of the key elements of any such computational procedure is design sensitivity analysis. A method for design sensitivity analysis of airframes under vibration response constraints is presented. The mathematical formulation of the method and its implementation as a new solution sequence in MSC/NASTRAN are described. The results of the application of the method to a simple finite element stick model of the AH-1G helicopter airframe are presented and discussed. Selection of design variables that are most likely to bring about changes in the response at specified locations in the airframe is based on consideration of forced response strain energy. Sensitivity coefficients are determined for the selected design variable set. Constraints on the natural frequencies are also included in addition to the constraints on the steady-state response. Sensitivity coefficients for these constraints are determined. Results of the analysis and insights gained in applying the method to the airframe model are discussed. The general nature of future work to be conducted is described.

  20. Limiting similarity and niche theory for structured populations.

    PubMed

    Szilágyi, András; Meszéna, Géza

    2009-05-07

    We develop the theory of limiting similarity and niche for structured populations with a finite number of individual states (i-states). In line with a previously published theory for unstructured populations, the niche of a species is specified by the impact and sensitivity niche vectors. They describe the population's impact on and sensitivity towards the variables involved in the population regulation. Robust coexistence requires sufficient segregation of the impact, as well as of the sensitivity niche vectors. The connection between the population-level impact and sensitivity and the impact/sensitivity of the specific i-states is developed. Each i-state contributes to the impact of the population in proportion to its frequency in the population. Sensitivity of the population is composed of the sensitivity of the rates of demographic transitions, weighted by the frequency and by the reproductive value of the initial and final i-states of the transition, respectively. Coexistence in a multi-patch environment is studied. This analysis is interpreted as spatial niche segregation.
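
    The frequency and reproductive-value weights described above come from the dominant right and left eigenvectors of a population projection matrix. A toy two-stage matrix (invented numbers) makes the bookkeeping concrete:

```python
import numpy as np

# Toy 2-stage projection matrix: juveniles (stage 0) and adults (stage 1).
A = np.array([[0.0, 1.5],    # adult fecundity
              [0.5, 0.8]])   # juvenile survival, adult survival

vals, right = np.linalg.eig(A)
i = np.argmax(vals.real)
growth = vals.real[i]                          # asymptotic growth rate
stable = np.abs(right[:, i].real)
stable /= stable.sum()                         # stable i-state frequencies

valsT, left = np.linalg.eig(A.T)               # left eigenvectors of A
j = np.argmax(valsT.real)
repro = np.abs(left[:, j].real)
repro /= repro[0]                              # reproductive values, v_0 = 1
```

    `stable` supplies the frequency weights for the impact vector, and `repro` the reproductive-value weights for the sensitivity vector, in the sense the abstract describes.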

  1. Development of computer-aided design system of elastic sensitive elements of automatic metering devices

    NASA Astrophysics Data System (ADS)

    Kalinkina, M. E.; Kozlov, A. S.; Labkovskaia, R. I.; Pirozhnikova, O. I.; Tkalich, V. L.; Shmakov, N. A.

    2018-05-01

    The object of research is the element base of devices for control and automation systems, including annular elastic sensitive elements, methods of their modeling, calculation algorithms, and software complexes for automating their design processes. The article is devoted to the development of a computer-aided design system for the elastic sensitive elements used in weight- and force-measuring automation devices. Based on mathematical modeling of deformation processes in a solid, as well as the results of static and dynamic analysis, the calculation of the elastic elements is given using the capabilities of modern software systems based on numerical simulation. In the simulation, the model was discretized with a hexahedral finite-element mesh with a maximum element size not exceeding 2.5 mm. The results of the modal and dynamic analyses are presented in this article.

  2. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    Continuing studies associated with the development of the quasi-analytical (QA) sensitivity method for three-dimensional transonic flow about wings are presented. Furthermore, initial results using the quasi-analytical approach were obtained and compared to those computed using the finite difference (FD) approach. The basic goals achieved were: (1) carrying out various debugging operations pertaining to the quasi-analytical method; (2) addition of section design variables to the sensitivity equation in the form of multiple right-hand sides; (3) reconfiguring the analysis/sensitivity package in order to facilitate the execution of analysis/FD/QA test cases; and (4) enhancing the display of output data to allow careful examination of the results and to permit various comparisons of sensitivity derivatives obtained using the FD/QA methods to be conducted easily and quickly. In addition to discussing the above goals, the results of executing subcritical and supercritical test cases are presented.

  3. Stochastic sensitivity measure for mistuned high-performance turbines

    NASA Technical Reports Server (NTRS)

    Murthy, Durbha V.; Pierre, Christophe

    1992-01-01

    A stochastic measure of sensitivity is developed in order to predict the effects of small random blade mistuning on the dynamic aeroelastic response of turbomachinery blade assemblies. This sensitivity measure is based solely on the nominal system design (i.e., on tuned system information), which makes it extremely easy and inexpensive to calculate. The measure has the potential to become a valuable design tool that will enable designers to evaluate mistuning effects at a preliminary design stage and thus assess the need for a full mistuned rotor analysis. The predictive capability of the sensitivity measure is illustrated by examining the effects of mistuning on the aeroelastic modes of the first stage of the oxidizer turbopump in the Space Shuttle Main Engine. Results from a full analysis of mistuned systems confirm that the simple stochastic sensitivity measure consistently predicts the drastic changes due to mistuning and the localization of aeroelastic vibration to a few blades.

  4. Application of advanced multidisciplinary analysis and optimization methods to vehicle design synthesis

    NASA Technical Reports Server (NTRS)

    Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.

  5. Highly sensitive protein detection by biospecific AFM-based fishing with pulsed electrical stimulation.

    PubMed

    Pleshakova, Tatyana O; Malsagova, Kristina A; Kaysheva, Anna L; Kopylov, Arthur T; Tatur, Vadim Yu; Ziborov, Vadim S; Kanashenko, Sergey L; Galiullin, Rafael A; Ivanov, Yuri D

    2017-08-01

    We report here the highly sensitive detection of protein in solution at concentrations from 10⁻¹⁵ to 10⁻¹⁸ M using a combination of atomic force microscopy (AFM) and mass spectrometry. Biospecific detection of biotinylated bovine serum albumin was carried out by fishing out the protein onto the surface of AFM chips with immobilized avidin, which determined the specificity of the analysis. Electrical stimulation was applied to enhance the fishing efficiency. A high sensitivity of detection was achieved by application of nanosecond electric pulses to highly oriented pyrolytic graphite placed under the AFM chip. A peristaltic pump-based flow system, which is widely used in routine bioanalytical assays, was employed throughout the analysis. These results hold promise for the development of highly sensitive protein detection methods using nanosensor devices.

  6. Space transportation architecture: Reliability sensitivities

    NASA Technical Reports Server (NTRS)

    Williams, A. M.

    1992-01-01

    A sensitivity analysis is given of the benefits and drawbacks associated with a proposed Earth-to-orbit vehicle architecture. The architecture represents a fleet of six vehicles (two existing, four proposed) that would be responsible for performing various missions as mandated by NASA and the U.S. Air Force. Each vehicle has a prescribed flight rate per year for a period of 31 years. By exposing this fleet of vehicles to a probabilistic environment in which the fleet experiences failures, downtimes, setbacks, etc., the analysis determines the resiliency and costs associated with the fleet for specific vehicle/subsystem reliabilities. The resources required were actual observed data on the failures and downtimes associated with existing vehicles, data based on engineering judgement for proposed vehicles, and the development of a sensitivity analysis program.
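
    The probabilistic fleet exercise described above can be caricatured as a Monte Carlo loop: each scheduled flight either flies or is forfeited while a failed vehicle stands down. The failure rate and downtime below are invented, not the study's data.

```python
import numpy as np

def simulate_fleet(p_fail, flights_per_year, years, downtime_flights, seed=0):
    """Count flights flown vs. lost to failure-induced downtime."""
    rng = np.random.default_rng(seed)
    flown, lost, backlog = 0, 0, 0
    for _ in range(years * flights_per_year):
        if backlog > 0:          # vehicle standing down after a failure
            backlog -= 1
            lost += 1
            continue
        flown += 1
        if rng.random() < p_fail:
            backlog = downtime_flights
    return flown, lost

# Invented reliability numbers for one vehicle in the fleet.
flown, lost = simulate_fleet(p_fail=0.05, flights_per_year=12,
                             years=31, downtime_flights=6)
```

    Sweeping `p_fail` over a range of subsystem reliabilities and re-running the loop gives the resiliency-versus-reliability sensitivity the abstract refers to.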

  7. A fluorescent graphitic carbon nitride nanosheet biosensor for highly sensitive, label-free detection of alkaline phosphatase.

    PubMed

    Xiang, Mei-Hao; Liu, Jin-Wen; Li, Na; Tang, Hao; Yu, Ru-Qin; Jiang, Jian-Hui

    2016-02-28

    Graphitic C3N4 (g-C3N4) nanosheets provide an attractive option for bioprobes and bioimaging applications. Utilizing highly fluorescent and water-dispersible ultrathin g-C3N4 nanosheets, a highly sensitive, selective and label-free biosensor has been developed for alkaline phosphatase (ALP) detection for the first time. The developed approach utilizes a natural substrate of ALP in biological systems and thus affords very high catalytic efficiency. This novel biosensor is demonstrated to enable quantitative analysis of ALP in a wide range from 0.1 to 1000 U L(-1) with a low detection limit of 0.08 U L(-1), which is among the most sensitive assays for ALP. It is expected that the developed method may provide a low-cost, convenient, rapid and highly sensitive platform for ALP-based clinical diagnostics and biomedical applications.

  8. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
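
    The nonlinear block Gauss-Seidel fixed point described above alternates single-discipline solves until the coupled state converges. A scalar toy problem (both "disciplines" invented) shows the structure:

```python
# "Fluid" block: aerodynamic load as a function of structural displacement.
def fluid(u):
    return 1.0 / (1.0 + u**2)

# "Structure" block: displacement as a function of aerodynamic load.
def structure(load):
    return 0.5 * load

u = 0.0
for it in range(100):
    load = fluid(u)              # solve fluid with the latest structure state
    u_new = structure(load)      # solve structure with the fresh fluid state
    if abs(u_new - u) < 1e-12:   # fixed point of the coupled system reached
        u = u_new
        break
    u = u_new
```

    For strongly coupled cases the same loop converges where a single staggered pass would not; the sensitivity analysis in the work reuses the same fixed-point structure on the linearized system.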

  9. Algorithm for automatic analysis of electro-oculographic data

    PubMed Central

    2013-01-01

    Background: Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. Methods: The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. Results: The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. Conclusion: The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement. PMID:24160372

  10. Harnessing Connectivity in a Large-Scale Small-Molecule Sensitivity Dataset.

    PubMed

    Seashore-Ludlow, Brinton; Rees, Matthew G; Cheah, Jaime H; Cokol, Murat; Price, Edmund V; Coletti, Matthew E; Jones, Victor; Bodycombe, Nicole E; Soule, Christian K; Gould, Joshua; Alexander, Benjamin; Li, Ava; Montgomery, Philip; Wawer, Mathias J; Kuru, Nurdan; Kotz, Joanne D; Hon, C Suk-Yee; Munoz, Benito; Liefeld, Ted; Dančík, Vlado; Bittker, Joshua A; Palmer, Michelle; Bradner, James E; Shamji, Alykhan F; Clemons, Paul A; Schreiber, Stuart L

    2015-11-01

    Identifying genetic alterations that prime a cancer cell to respond to a particular therapeutic agent can facilitate the development of precision cancer medicines. Cancer cell-line (CCL) profiling of small-molecule sensitivity has emerged as an unbiased method to assess the relationships between genetic or cellular features of CCLs and small-molecule response. Here, we developed annotated cluster multidimensional enrichment analysis to explore the associations between groups of small molecules and groups of CCLs in a new, quantitative sensitivity dataset. This analysis reveals insights into small-molecule mechanisms of action, and genomic features that associate with CCL response to small-molecule treatment. We are able to recapitulate known relationships between FDA-approved therapies and cancer dependencies and to uncover new relationships, including for KRAS-mutant cancers and neuroblastoma. To enable the cancer community to explore these data, and to generate novel hypotheses, we created an updated version of the Cancer Therapeutic Response Portal (CTRP v2). We present the largest CCL sensitivity dataset yet available, and an analysis method integrating information from multiple CCLs and multiple small molecules to identify CCL response predictors robustly. We updated the CTRP to enable the cancer research community to leverage these data and analyses. ©2015 American Association for Cancer Research.

  11. Peptidylation for the determination of low-molecular-weight compounds by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Tang, Feng; Cen, Si-Ying; He, Huan; Liu, Yi; Yuan, Bi-Feng; Feng, Yu-Qi

    2016-05-23

    Determination of low-molecular-weight compounds by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) has been a great challenge in the analytical research field. Here we developed a universal peptide-based derivatization (peptidylation) strategy for the sensitive analysis of low-molecular-weight compounds by MALDI-TOF-MS. Upon peptidylation, the molecular weights of target analytes increase, thus avoiding serious matrix ion interference in the low-molecular-weight region in MALDI-TOF-MS. Since peptides typically exhibit good signal response during MALDI-TOF-MS analysis, peptidylation endows high detection sensitivities of low-molecular-weight analytes. As a proof-of-concept, we analyzed low-molecular-weight compounds of aldehydes and thiols by the developed peptidylation strategy. Our results showed that aldehydes and thiols can be readily determined upon peptidylation, thus realizing the sensitive and efficient determination of low-molecular-weight compounds by MALDI-TOF-MS. Moreover, target analytes also can be unambiguously detected in biological samples using the peptidylation strategy. The established peptidylation strategy is a universal strategy and can be extended to the sensitive analysis of various low-molecular-weight compounds by MALDI-TOF-MS, which may be potentially used in areas such as metabolomics.

  12. Algorithm for automatic analysis of electro-oculographic data.

    PubMed

    Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti

    2013-10-25

    Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement.
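
    The amplitude-threshold idea at the heart of the algorithm can be sketched on a synthetic trace: fast, large deflections (saccade- or blink-like steps) cross a velocity threshold that slow drift never reaches. The sampling rate, threshold, and signal below are all invented, not the paper's calibration.

```python
import numpy as np

def detect_events(signal, fs, thresh):
    """Onset indices where |sample-to-sample velocity| first exceeds thresh."""
    velocity = np.diff(signal) * fs                 # amplitude units per second
    above = np.abs(velocity) > thresh
    # keep only rising edges so each event is counted once
    onsets = np.flatnonzero(above & ~np.r_[False, above[:-1]])
    return onsets

fs = 500.0                                          # Hz
t = np.arange(0, 2.0, 1.0 / fs)
signal = 0.05 * np.sin(2 * np.pi * 0.5 * t)         # slow drift
signal[300:] += 1.0                                 # saccade-like step at 0.6 s
signal[700:] -= 1.0                                 # return saccade at 1.4 s
onsets = detect_events(signal, fs, thresh=50.0)
```

    The paper's auto-calibration would replace the fixed `thresh` with a value estimated from features of each recording.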

  13. Noninvasive and cost-effective trapping method for monitoring sensitive mammal populations

    Treesearch

    Stephanie E. Trapp; Elizabeth A. Flaherty

    2017-01-01

    Noninvasive sampling methods provide a means to monitor endangered, threatened, or sensitive species or populations while making efficient use of personnel time and effort. We developed a monitoring protocol that utilizes single-capture hair snares and analysis of morphological features of hair for evaluating populations. During 2015, we used the West Virginia...

  14. 'Coxiella burnetii' Vaccine Development: Lipopolysaccharide Structural Analysis

    DTIC Science & Technology

    1991-02-20

    Analytical instrumentation and methodology are presented for the determination of endotoxin-related structures at much improved sensitivity and specificity. The report covers endotoxin characterization by SFC and Coxiella burnetii LPS characterization; the reports produced during the period, and their applications, are listed.

  15. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

    Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited because of the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection of various food contaminants in complex matrices. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  16. Improvement of the cost-benefit analysis algorithm for high-rise construction projects

    NASA Astrophysics Data System (ADS)

    Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir

    2018-03-01

    The specific nature of high-rise investment projects, which entail long-term construction and high risks, implies a need to improve the standard algorithm of cost-benefit analysis; such an improved algorithm is described in this article. To develop it, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, and others. This comprehensive approach helped adapt the original algorithm to feasibility assessment in high-rise construction. The authors assembled the cost-benefit analysis algorithm for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms substantially extend the standard cost-benefit analysis of investment projects, namely: the "Project analysis scenario" flowchart, which improves the quality and reliability of forecasting reports; the main stages of cash flow adjustment based on risk mapping, which sharpen cost-benefit analysis given the broad range of risks in high-rise construction; and the analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, which improves flexibility in the implementation of high-rise projects.
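
    The dynamic (discounted) cost-benefit core of such an algorithm can be sketched in a few lines. The cash-flow figures and the one-at-a-time perturbation scheme below are illustrative assumptions for demonstration, not the article's algorithm.

```python
def npv(cash_flows, wacc):
    """Net present value of a cash-flow stream; index 0 is 'now',
    discounting at the weighted average cost of capital (WACC)."""
    return sum(cf / (1.0 + wacc) ** t for t, cf in enumerate(cash_flows))

def sensitivity(base_flows, wacc, bump=0.10):
    """One-at-a-time sensitivity: relative change in NPV per +10%
    change in each input (here the discount rate and the final,
    revenue-bearing cash flow of a hypothetical project)."""
    base = npv(base_flows, wacc)
    out = {}
    # Discount-rate sensitivity
    out["wacc"] = (npv(base_flows, wacc * (1 + bump)) - base) / abs(base)
    # Revenue sensitivity: perturb the last (revenue) year
    bumped = base_flows[:-1] + [base_flows[-1] * (1 + bump)]
    out["revenue"] = (npv(bumped, wacc) - base) / abs(base)
    return base, out
```

    Ranking the resulting relative changes identifies the "critical ratios" the article refers to, i.e. the variables to which the project's viability is most sensitive.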

  17. Sensitivity Analysis of Fatigue Crack Growth Model for API Steels in Gaseous Hydrogen.

    PubMed

    Amaro, Robert L; Rustagi, Neha; Drexler, Elizabeth S; Slifka, Andrew J

    2014-01-01

    A model to predict fatigue crack growth of API pipeline steels in high pressure gaseous hydrogen has been developed and is presented elsewhere. The model currently has several parameters that must be calibrated for each pipeline steel of interest. This work provides a sensitivity analysis of the model parameters in order to provide (a) insight to the underlying mathematical and mechanistic aspects of the model, and (b) guidance for model calibration of other API steels.
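
    Since the abstract does not reproduce the model itself, the sketch below uses a Paris-law crack-growth rate as a stand-in to illustrate the kind of normalized local parameter sensitivity such an analysis computes. The model, parameter values, and central-difference scheme are all assumptions for illustration.

```python
def normalized_sensitivity(model, params, key, rel_step=1e-3):
    """Central-difference estimate of the normalized local sensitivity
    S = (p / f) * df/dp, so parameters of very different scales
    become directly comparable."""
    p0 = params[key]
    up, dn = dict(params), dict(params)
    up[key] = p0 * (1 + rel_step)
    dn[key] = p0 * (1 - rel_step)
    f0 = model(params)
    dfdp = (model(up) - model(dn)) / (2 * p0 * rel_step)
    return p0 * dfdp / f0

# Stand-in model: Paris-law crack growth rate da/dN = C * dK**m
paris = lambda p: p["C"] * p["dK"] ** p["m"]
```

    For the Paris form, the sensitivity to C is exactly 1 (the model is linear in C), while the sensitivity to the exponent m is m·ln(ΔK), which is why exponent-like parameters usually dominate a calibration.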

  18. [Evaluation of land resources carrying capacity of development zone based on planning environment impact assessment].

    PubMed

    Fu, Shi-Feng; Zhang, Ping; Jiang, Jin-Long

    2012-02-01

    Assessment of land resources carrying capacity is the key point of planning environment impact assessment and the main basis for determining whether a plan can be implemented. Using the spatial analysis functions of a Geographic Information System, and selecting altitude, slope, land use type, distance from resident land, distance from main traffic roads, and distance from environmentally sensitive areas as the sensitive factors, a comprehensive assessment of the ecological sensitivity and its spatial distribution in Zhangzhou Merchants Economic and Technological Development Zone, Fujian Province of East China was conducted, and the assessment results were combined with the planned land-use layout diagram for ecological suitability analysis. In the Development Zone, 84.0% of resident land, 93.1% of industrial land, 86.0% of traffic land, and 76.0% of other constructive land in the plan were located in insensitive and gently sensitive areas; thus, implementing the land use plan would generally have little impact on the ecological environment, and the land resources in the planning area were able to meet the land use demand. The assessment of population carrying capacity, with ecological land as the limiting factor, indicated that if the highly sensitive area and 60% of the moderately sensitive area are treated as ecological land, the population within the Zone under the plan could reach 240000, with an available land area per capita of 134.0 m2. Such a planned population scale is appropriate according to the related standards for constructive land.

  19. Sensitivity Analysis for Steady State Groundwater Flow Using Adjoint Operators

    NASA Astrophysics Data System (ADS)

    Sykes, J. F.; Wilson, J. L.; Andrews, R. W.

    1985-03-01

    Adjoint sensitivity theory is currently being considered as a potential method for calculating the sensitivity of nuclear waste repository performance measures to the parameters of the system. For groundwater flow systems, performance measures of interest include piezometric heads in the vicinity of a waste site, velocities or travel time in aquifers, and mass discharge to biosphere points. The parameters include recharge-discharge rates, prescribed boundary heads or fluxes, formation thicknesses, and hydraulic conductivities. The derivative of a performance measure with respect to the system parameters is usually taken as a measure of sensitivity. To calculate sensitivities, adjoint sensitivity equations are formulated from the equations describing the primary problem. Solving the primary problem and the adjoint sensitivity problem together yields all of the required derivatives and hence the related sensitivity coefficients. In this study, adjoint sensitivity theory is developed for the equations of two-dimensional steady state flow in a confined aquifer. Both the primary flow equation and the adjoint sensitivity equation are solved using the Galerkin finite element method. The developed computer code is used to investigate the regional flow parameters of the Leadville Formation of the Paradox Basin in Utah. The results illustrate the sensitivity of calculated local heads to the boundary conditions. In contrast, velocity-related local performance measures are more sensitive to hydraulic conductivities.
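
    The adjoint recipe in this record admits a compact discrete illustration: for a discretized steady-flow system A(p) h = b with performance measure J = c·h, one adjoint solve Aᵀλ = c gives dJ/dp for every parameter without extra forward solves. The two-cell toy system below, in which a conductivity-like parameter p scales the whole operator, is an assumption for illustration, not the Leadville Formation model.

```python
import numpy as np

def adjoint_sensitivity(A0, b, c, p):
    """Sensitivity of J = c.h for the steady-state system (p*A0) h = b.

    Since dA/dp = A0 here (p scales the operator), the adjoint formula
        A^T lambda = c,   dJ/dp = -lambda . (A0 @ h)
    needs one primary solve and one adjoint solve in total.
    """
    A = p * A0
    h = np.linalg.solve(A, b)        # primary (forward) problem
    lam = np.linalg.solve(A.T, c)    # adjoint problem
    return float(-lam @ (A0 @ h))

# Toy two-cell confined-aquifer stencil, source in cell 1,
# performance measure J = head in cell 2.
A0 = np.array([[ 2.0, -1.0],
               [-1.0,  2.0]])
b = np.array([1.0, 0.0])
c = np.array([0.0, 1.0])
```

    For this toy system J(p) = J0/p, so dJ/dp = -J0/p² can be checked analytically; the same machinery scales to finite element discretizations with many parameters, which is the point of the adjoint approach.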

  20. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description, from both a user's and a programmer's perspective, of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are handled by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost applying different kinds of model regularization or re-selecting/weighting the inverted dataset, without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows users to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, is well documented and is freely available under the terms of the GNU General Public License (http://www.rub.de/aski).

  1. Skeletal mechanism generation for surrogate fuels using directed relation graph with error propagation and sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niemeyer, Kyle E.; Sung, Chih-Jen; Raju, Mandhapati P.

    2010-09-15

    A novel implementation for the skeletal reduction of large detailed reaction mechanisms using the directed relation graph with error propagation and sensitivity analysis (DRGEPSA) is developed and presented with examples for three hydrocarbon components, n-heptane, iso-octane, and n-decane, relevant to surrogate fuel development. DRGEPSA integrates two previously developed methods, directed relation graph-aided sensitivity analysis (DRGASA) and directed relation graph with error propagation (DRGEP), by first applying DRGEP to efficiently remove many unimportant species prior to sensitivity analysis, which further removes unimportant species, producing an optimally small skeletal mechanism for a given error limit. It is illustrated that the combination of the DRGEP and DRGASA methods allows the DRGEPSA approach to overcome the weaknesses of each, specifically that DRGEP cannot identify all unimportant species and that DRGASA shields unimportant species from removal. Skeletal mechanisms for n-heptane and iso-octane generated using the DRGEP, DRGASA, and DRGEPSA methods are presented and compared to illustrate the improvement of DRGEPSA. From a detailed reaction mechanism for n-alkanes covering n-octane to n-hexadecane with 2115 species and 8157 reactions, two skeletal mechanisms for n-decane generated using DRGEPSA, one covering a comprehensive range of temperature, pressure, and equivalence ratio conditions for autoignition and the other limited to high temperatures, are presented and validated. The comprehensive skeletal mechanism consists of 202 species and 846 reactions and the high-temperature skeletal mechanism consists of 51 species and 256 reactions. Both mechanisms are further demonstrated to reproduce well the results of the detailed mechanism in perfectly-stirred reactor and laminar flame simulations over a wide range of conditions. The comprehensive and high-temperature n-decane skeletal mechanisms are included as supplementary material with this article.
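
    The error-propagation step of DRGEP can be sketched compactly: each species receives an overall interaction coefficient equal to the maximum, over all graph paths from a target species, of the product of direct interaction coefficients along the path, and species below a threshold are removed. The tiny graph and threshold below are illustrative assumptions; a real mechanism has thousands of species and the direct coefficients come from reaction rate data.

```python
import heapq

def drgep_coefficients(graph, targets):
    """Overall interaction coefficient R[T,S] for each species S:
    the maximum over all paths from a target of the product of
    direct interaction coefficients along the path. Implemented as
    a max-product variant of Dijkstra's algorithm."""
    R = {t: 1.0 for t in targets}
    heap = [(-1.0, t) for t in targets]
    while heap:
        negr, s = heapq.heappop(heap)
        r = -negr
        if r < R.get(s, 0.0):
            continue                      # stale queue entry
        for nbr, w in graph.get(s, {}).items():
            rn = r * w
            if rn > R.get(nbr, 0.0):
                R[nbr] = rn
                heapq.heappush(heap, (-rn, nbr))
    return R

def skeletal_species(graph, targets, eps):
    """Keep only species whose overall coefficient meets the error limit."""
    R = drgep_coefficients(graph, targets)
    return {s for s in graph if R.get(s, 0.0) >= eps}
```

    In DRGEPSA, the species that survive this pass are then screened by sensitivity analysis (DRGASA), which is where the remaining unimportant species are caught.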

  2. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission designed to provide surface reflectance at the high spectral, spatial, and temporal resolutions, from a geostationary orbit, necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to global and regional NOx emissions and to improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from it for seven observation scenarios and three instrument systems.

  3. Spectral characterization of biophysical characteristics in a boreal forest - Relationship between Thematic Mapper band reflectance and leaf area index for Aspen

    NASA Technical Reports Server (NTRS)

    Badhwar, G. D.; Macdonald, R. B.; Hall, F. G.; Carnes, J. G.

    1986-01-01

    Results from analysis of a data set of simultaneous measurements of Thematic Mapper band reflectance and leaf area index are presented. The measurements were made over pure stands of Aspen in the Superior National Forest of northern Minnesota. The analysis indicates that the reflectance may be sensitive to the leaf area index of the Aspen early in the season. The sensitivity disappears as the season progresses. Based on the results of model calculations, an explanation for the observed relationship is developed. The model calculations indicate that the sensitivity of the reflectance to the Aspen overstory depends on the amount of understory present.

  4. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  5. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  6. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  7. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  8. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  9. Hyperspectral data analysis procedures with reduced sensitivity to noise

    NASA Technical Reports Server (NTRS)

    Landgrebe, David A.

    1993-01-01

    Multispectral sensor systems have steadily improved over the years in their ability to deliver increased spectral detail. With the advent of hyperspectral sensors, including imaging spectrometers, this technology is taking a large leap forward, providing the possibility of delivering much more detailed information. However, this direction of development has drawn even more attention to the matter of noise and other deleterious effects in the data, because relaxing the fundamental limitations that spectral detail places on information collection raises the limitations presented by noise to even greater importance. Much current effort in remote sensing research is thus devoted to adjusting the data to mitigate the effects of noise and other deleterious effects. A parallel approach to the problem is to look for analysis approaches and procedures that have reduced sensitivity to such effects. We discuss some of the fundamental principles that define analysis algorithm characteristics providing such reduced sensitivity. One such analysis procedure, including an example analysis of a data set, is described, illustrating this effect.

  10. Open pit mining profit maximization considering selling stage and waste rehabilitation cost

    NASA Astrophysics Data System (ADS)

    Muttaqin, B. I. A.; Rosyidi, C. N.

    2017-11-01

    In open pit mining activities, determination of the cut-off grade is crucial for the company, since the cut-off grade affects how much profit the mining company will earn. In this study, we developed a cut-off grade determination model for the open pit mining industry considering the cost of mining, waste removal (rehabilitation) cost, processing cost, fixed cost, and selling stage cost. The main goal of this study is to develop a model of cut-off grade determination that yields the maximum total profit. Second, we examine the sensitivity of the model to changes in its cost components. The optimization results show that the models can help mining company managers determine the optimal cut-off grade and also estimate how much profit the mining company can earn. To illustrate the application of the models, a numerical example and a set of sensitivity analyses are presented. From the results of the sensitivity analysis, we conclude that changes in the sales price greatly affect the optimal cut-off value and the total profit.
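
    The trade-off the record describes, between processing low-grade ore at a loss and sending it to waste at rehabilitation cost, can be sketched with a toy grade-tonnage model. The block list, prices, and cost figures below are hypothetical illustrations, not the study's model or data.

```python
def total_profit(cutoff, blocks, price, mining_cost, processing_cost,
                 rehab_cost, selling_cost, fixed_cost):
    """Profit for a given cut-off grade over (tonnes, grade) blocks:
    blocks at or above the cut-off are mined and processed; blocks
    below it are mined as waste and incur rehabilitation cost."""
    profit = -fixed_cost
    for tonnes, grade in blocks:
        if grade >= cutoff:
            metal = tonnes * grade
            profit += metal * (price - selling_cost)      # revenue net of selling stage
            profit -= tonnes * (mining_cost + processing_cost)
        else:
            profit -= tonnes * (mining_cost + rehab_cost) # waste handling
    return profit

def optimal_cutoff(candidates, blocks, **costs):
    """Pick the candidate cut-off grade maximizing total profit."""
    return max(candidates, key=lambda g: total_profit(g, blocks, **costs))
```

    Re-running `optimal_cutoff` with a perturbed `price` or cost component reproduces, in miniature, the study's sensitivity analysis of the optimal cut-off to its cost components.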

  11. Regionalising MUSLE factors for application to a data-scarce catchment

    NASA Astrophysics Data System (ADS)

    Gwapedza, David; Slaughter, Andrew; Hughes, Denis; Mantel, Sukhmani

    2018-04-01

    The estimation of soil loss and sediment transport is important for effective management of catchments. A model for semi-arid catchments in southern Africa has been developed; however, simplification of the model parameters and further testing are required. Soil loss is calculated through the Modified Universal Soil Loss Equation (MUSLE). The aims of the current study were to: (1) regionalise the MUSLE erodibility factors and; (2) perform a sensitivity analysis and validate the soil loss outputs against independently-estimated measures. The regionalisation was developed using Geographic Information Systems (GIS) coverages. The model was applied to a high erosion semi-arid region in the Eastern Cape, South Africa. Sensitivity analysis indicated model outputs to be more sensitive to the vegetation cover factor. The simulated soil loss estimates of 40 t ha-1 yr-1 were within the range of estimates by previous studies. The outcome of the present research is a framework for parameter estimation for the MUSLE through regionalisation. This is part of the ongoing development of a model which can estimate soil loss and sediment delivery at broad spatial and temporal scales.
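
    For context, the MUSLE referred to above is commonly written in the Williams (1975) form; the sketch below encodes that standard form, with hypothetical factor values, and is not the regionalised parameterisation developed in this study.

```python
def musle_sediment_yield(runoff_volume, peak_discharge, K, LS, C, P):
    """Event sediment yield (tonnes) from the Modified Universal Soil
    Loss Equation in its common Williams (1975) form:

        SY = 11.8 * (Q * q_p)**0.56 * K * LS * C * P

    Q is runoff volume (m^3), q_p peak discharge (m^3/s); K (soil
    erodibility), LS (slope length/steepness), C (cover) and P
    (practice) are the USLE-style factors, i.e. the quantities this
    study regionalises from GIS coverages.
    """
    return 11.8 * (runoff_volume * peak_discharge) ** 0.56 * K * LS * C * P
```

    Because the yield is linear in C, a given relative error in the cover factor translates one-for-one into the yield, consistent with the study's finding that outputs are most sensitive to the vegetation cover factor.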

  12. Knock-in mice harboring a Ca(2+) desensitizing mutation in cardiac troponin C develop early onset dilated cardiomyopathy.

    PubMed

    McConnell, Bradley K; Singh, Sonal; Fan, Qiying; Hernandez, Adriana; Portillo, Jesus P; Reiser, Peter J; Tikunova, Svetlana B

    2015-01-01

    The physiological consequences of aberrant Ca(2+) binding and exchange with cardiac myofilaments are not clearly understood. In order to examine the effect of decreasing Ca(2+) sensitivity of cTnC on cardiac function, we generated knock-in mice carrying a D73N mutation (not known to be associated with heart disease in human patients) in cTnC. The D73N mutation was engineered into the regulatory N-domain of cTnC in order to reduce Ca(2+) sensitivity of reconstituted thin filaments by increasing the rate of Ca(2+) dissociation. In addition, the D73N mutation drastically blunted the extent of Ca(2+) desensitization of reconstituted thin filaments induced by cTnI pseudo-phosphorylation. Compared to wild-type mice, heterozygous knock-in mice carrying the D73N mutation exhibited a substantially decreased Ca(2+) sensitivity of force development in skinned ventricular trabeculae. Kaplan-Meier survival analysis revealed that median survival time for knock-in mice was 12 weeks. Echocardiographic analysis revealed that knock-in mice exhibited increased left ventricular dimensions with thinner walls. Echocardiographic analysis also revealed that measures of systolic function, such as ejection fraction (EF) and fractional shortening (FS), were dramatically reduced in knock-in mice. In addition, knock-in mice displayed electrophysiological abnormalities, namely prolonged QRS and QT intervals. Furthermore, ventricular myocytes isolated from knock-in mice did not respond to β-adrenergic stimulation. Thus, knock-in mice developed pathological features similar to those observed in human patients with dilated cardiomyopathy (DCM). In conclusion, our results suggest that decreasing Ca(2+) sensitivity of the regulatory N-domain of cTnC is sufficient to trigger the development of DCM.

  13. Compliance and stress sensitivity of spur gear teeth

    NASA Technical Reports Server (NTRS)

    Cornell, R. W.

    1983-01-01

    The magnitude and variation of tooth pair compliance with load position affect the dynamics and loading significantly, and the tooth root stressing per load varies significantly with load position. Therefore, the recently developed time-history, interactive, closed-form solution for the dynamic tooth loads of both low and high contact ratio spur gears was expanded to include improved and simplified methods for calculating the compliance and stress sensitivity of three involute tooth forms as a function of load position. The compliance analysis incorporates an improved fillet/foundation treatment. The stress sensitivity analysis is a modified version of the Heywood method, with an improvement in the magnitude and location of the peak stress in the fillet. These improved compliance and stress sensitivity analyses are presented along with their evaluation against test, finite element, and analytic transformation results, which showed good agreement.

  14. Sensitive and comprehensive analysis of O-glycosylation in biotherapeutics: a case study of novel erythropoiesis stimulating protein.

    PubMed

    Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo

    2017-09-01

    Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with the drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assess biotherapeutic quality and to establish the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on the N-glycans, owing to the absence of analytical tools that liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized. Details such as O-glycan structure and the O-acetyl-modification site were obtained from tandem MS. This method may be applied to QC and batch analysis not only of rhEPOs but also of other biotherapeutics bearing multiple O-glycosylations.

  15. Convergence Estimates for Multidisciplinary Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Arian, Eyal

    1997-01-01

    A quantitative analysis of coupling between systems of equations is introduced. This analysis is then applied to problems in multidisciplinary analysis, sensitivity, and optimization. For the sensitivity and optimization problems both multidisciplinary and single discipline feasibility schemes are considered. In all these cases a "convergence factor" is estimated in terms of the Jacobians and Hessians of the system, thus it can also be approximated by existing disciplinary analysis and optimization codes. The convergence factor is identified with the measure for the "coupling" between the disciplines in the system. Applications to algorithm development are discussed. Demonstration of the convergence estimates and numerical results are given for a system composed of two non-linear algebraic equations, and for a system composed of two PDEs modeling aeroelasticity.
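
    The "convergence factor" idea in this record can be illustrated in the scalar case: for the coupled system x = f(y), y = g(x) solved by alternating (block Gauss-Seidel) iteration, the contraction rate near the fixed point is governed by the product of the cross-Jacobians. The toy functions below are assumptions for illustration, not the paper's aeroelastic system.

```python
import math

def coupling_factor(dfdy, dgdx):
    """Convergence factor for alternating iteration on x = f(y),
    y = g(x): the magnitude of the product of the cross-derivatives
    at the fixed point (spectral radius in the block/matrix case).
    A value below 1 means the fixed-point iteration contracts."""
    return abs(dfdy * dgdx)

def gauss_seidel(f, g, x0=0.0, y0=0.0, iters=50):
    """Alternate the two 'disciplinary analyses' until (near) convergence."""
    x, y = x0, y0
    for _ in range(iters):
        x = f(y)
        y = g(x)
    return x, y
```

    Estimating this factor from existing disciplinary Jacobians, as the record notes, predicts how fast a multidisciplinary analysis loop will converge without running it to completion.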

  16. Heat and mass transport during microwave heating of mashed potato in domestic oven--model development, validation, and sensitivity analysis.

    PubMed

    Chen, Jiajia; Pitchai, Krishnamoorthy; Birla, Sohan; Negahban, Mehrdad; Jones, David; Subbiah, Jeyamkondan

    2014-10-01

    A 3-dimensional finite-element model coupling electromagnetics and heat and mass transfer was developed to understand the interactions between the microwaves and fresh mashed potato in a 500 mL tray. The model was validated by performing heating of mashed potato from 25 °C on a rotating turntable in a microwave oven, rated at 1200 W, for 3 min. The simulated spatial temperature profiles on the top and bottom layer of the mashed potato showed similar hot and cold spots when compared to the thermal images acquired by an infrared camera. Transient temperature profiles at 6 locations collected by fiber-optic sensors showed good agreement with predicted results, with the root mean square error ranging from 1.6 to 11.7 °C. The predicted total moisture loss matched well with the observed result. Several input parameters, such as the evaporation rate constant, the intrinsic permeability of water and gas, and the diffusion coefficient of water and gas, are not readily available for mashed potato, and they cannot be easily measured experimentally. Reported values for raw potato were used as baseline values. A sensitivity analysis of these input parameters on the temperature profiles and the total moisture loss was evaluated by changing the baseline values to their 10% and 1000%. The sensitivity analysis showed that the gas diffusion coefficient, intrinsic water permeability, and the evaporation rate constant greatly influenced the predicted temperature and total moisture loss, while the intrinsic gas permeability and the water diffusion coefficient had little influence. This model can be used by the food product developers to understand microwave heating of food products spatially and temporally. This tool will allow food product developers to design food package systems that would heat more uniformly in various microwave ovens. 
The sensitivity analysis of this study will help us determine the most significant parameters that need to be measured accurately for reliable model prediction. © 2014 Institute of Food Technologists®

  17. A sensitivity analysis method for the body segment inertial parameters based on ground reaction and joint moment regressor matrices.

    PubMed

    Futamure, Sumire; Bonnet, Vincent; Dumas, Raphael; Venture, Gentiane

    2017-11-07

    This paper presents a method allowing a simple and efficient sensitivity analysis of the dynamics parameters of complex whole-body human models. The proposed method is based on the ground reaction and joint moment regressor matrices, developed initially in robotics system identification theory, which appear in the equations of motion of the human body. The regressor matrices are linear with respect to the segment inertial parameters, allowing the use of simple sensitivity analysis methods. The sensitivity analysis method was applied to gait dynamics and kinematics data of nine subjects, using a 15-segment 3D model of the locomotor apparatus. According to the proposed sensitivity indices, 76 of the mechanical model's 150 segment inertial parameters were considered not influential for gait. The main findings were that the segment masses were influential and that, with the exception of the trunk, the moments of inertia were not influential for the computation of the ground reaction forces and moments and the joint moments. The same method also shows numerically that at least 90% of the lower-limb joint moments during the stance phase can be estimated from force-plate and kinematics data alone, without knowing any of the segment inertial parameters. Copyright © 2017 Elsevier Ltd. All rights reserved.
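
    The linearity the record exploits can be shown directly: if the predicted moments are tau = Y @ phi, the sensitivity of tau to each inertial parameter phi_j is simply column j of the regressor Y. The tiny regressor and the norm-ratio index below are illustrative assumptions, not the paper's exact indices or data.

```python
import numpy as np

def inertial_param_sensitivity(Y, phi):
    """Relative contribution of each inertial parameter phi_j to the
    predicted joint moments tau = Y @ phi. Because the model is linear
    in phi, d(tau)/d(phi_j) is column j of Y; we report the size of
    each column's contribution relative to the full prediction:
        s_j = ||Y[:, j] * phi_j|| / ||tau||."""
    tau = Y @ phi
    denom = np.linalg.norm(tau)
    return np.array([np.linalg.norm(Y[:, j] * phi[j]) / denom
                     for j in range(len(phi))])
```

    Parameters with small s_j, like most moments of inertia in the study, can be fixed at nominal values with little effect on the computed moments.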

  18. Global Sensitivity Analysis with Small Sample Sizes: Ordinary Least Squares Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Michael J.; Liu, Wei; Sivaramakrishnan, Raghu

    2016-12-21

    A new version of global sensitivity analysis is developed in this paper. This new version, coupled with tools from statistics, machine learning, and optimization, can devise small sample sizes that allow for the accurate ordering of sensitivity coefficients for the first 10-30 most sensitive chemical reactions in complex chemical-kinetic mechanisms, and is particularly useful for studying the chemistry in realistic devices. A key part of the paper is the calibration of these small samples. Because these small sample sizes are developed for use in realistic combustion devices, the calibration is done over the ranges of conditions in such devices, with a test case being the operating conditions of a compression ignition engine studied earlier. Compression ignition engines operate under low-temperature combustion conditions with quite complicated chemistry, making this calibration difficult and leading to the possibility of false positives and false negatives in the ordering of the reactions. An important aspect of the paper is therefore showing how to handle the trade-off between false positives and false negatives using ideas from the multiobjective optimization literature. The combination of the new global sensitivity method and the calibration yields sample sizes approximately a factor of 10 smaller than were available with our previous algorithm.
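
    The core OLS idea can be illustrated on a toy model: regress sampled outputs on the sampled inputs and rank parameters by the magnitude of their regression coefficients. The sample size, sampling distribution, and linear surrogate below are illustrative assumptions; the paper's contribution is the calibration and false-positive/false-negative handling that make very small samples reliable, which is not shown here.

```python
import numpy as np

def ols_sensitivity_ranking(model, n_params, n_samples=200, seed=0):
    """Rank parameter sensitivities from a random sample via ordinary
    least squares: regress model output on the inputs and order the
    parameters by |regression coefficient| (largest first)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, size=(n_samples, n_params))
    y = np.array([model(x) for x in X])
    A = np.column_stack([np.ones(n_samples), X])     # intercept + inputs
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    sens = np.abs(coef[1:])                          # drop the intercept
    return np.argsort(sens)[::-1], sens
```

    For ordering the top few most sensitive reactions, only the ranking of the coefficients needs to be stable, which is why far fewer samples suffice than for estimating the coefficients themselves accurately.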

  19. Optical modeling of waveguide coupled TES detectors towards the SAFARI instrument for SPICA

    NASA Astrophysics Data System (ADS)

    Trappe, N.; Bracken, C.; Doherty, S.; Gao, J. R.; Glowacka, D.; Goldie, D.; Griffin, D.; Hijmering, R.; Jackson, B.; Khosropanah, P.; Mauskopf, P.; Morozov, D.; Murphy, A.; O'Sullivan, C.; Ridder, M.; Withington, S.

    2012-09-01

    The next generation of space missions targeting far-infrared (FIR) wavelengths will require large-format arrays of extremely sensitive detectors. Transition Edge Sensor (TES) array technology is being developed for future FIR space applications such as the SAFARI instrument for SPICA, where low noise and high sensitivity are required to achieve ambitious science goals. In this paper we describe a modal analysis of multi-moded horn antennas feeding integrating cavities that house TES detectors with superconducting film absorbers. In high-sensitivity TES detector technology, the ability to control the electromagnetic and thermo-mechanical environment of the detector is critical. Simulating and understanding the optical behaviour of such detectors at FIR wavelengths is difficult and requires development of existing analysis tools. The proposed modal approach offers a computationally efficient technique to describe the partially coherent response of the full pixel in terms of optical efficiency and power leakage between pixels. Initial work carried out as part of an ESA technical research project on optical analysis is described, and a prototype SAFARI pixel design is analyzed in which the optical coupling between the incoming field and the pixel, comprising the horn, a cavity with an air gap, and a thin absorber layer, is fully included in the model to allow a comprehensive optical characterization. The modal approach described is based on the mode-matching technique, in which the horn and cavity are described in the traditional way, while a technique to include the absorber was developed. Radiation leakage between pixels is also included, making this a powerful analysis tool.

  20. Parameterization, sensitivity analysis, and inversion: an investigation using groundwater modeling of the surface-mined Tivoli-Guidonia basin (Metropolitan City of Rome, Italy)

    NASA Astrophysics Data System (ADS)

    La Vigna, Francesco; Hill, Mary C.; Rossetto, Rudy; Mazza, Roberto

    2016-09-01

    With respect to model parameterization and sensitivity analysis, this work uses a practical example to suggest that methods that start with simple models and use computationally frugal model analysis methods remain valuable in any toolbox of model development methods. In this work, groundwater model calibration starts with a simple parameterization that evolves into a moderately complex model. The model is developed for a water management study of the Tivoli-Guidonia basin (Rome, Italy), where surface mining has been conducted in conjunction with substantial dewatering. The approach to model development used in this work employs repeated analysis using sensitivity and inverse methods, including use of a new observation-stacked parameter importance graph. The methods are highly parallelizable and require few model runs, which makes the repeated analyses and attendant insights possible. The success of a model development design can be measured by the insights attained and the demonstrated model accuracy relevant to predictions. Two example insights were obtained: (1) a long-held belief that, except for a few distinct fractures, the travertine is homogeneous was found to be inadequate; and (2) the dewatering pumping rate is more critical to model accuracy than expected. The latter insight motivated additional data collection and improved pumpage estimates. Validation tests using three other recharge and pumpage conditions suggest good accuracy for the predictions considered. The model was used to evaluate management scenarios and showed that similar dewatering results could be achieved using 20% less pumped water, though this would require installing newly positioned wells and cooperation between mine owners.

  1. Digital Correlation Microwave Polarimetry: Analysis and Demonstration

    NASA Technical Reports Server (NTRS)

    Piepmeier, J. R.; Gasiewski, A. J.; Krebs, Carolyn A. (Technical Monitor)

    2000-01-01

    The design, analysis, and demonstration of a digital-correlation microwave polarimeter for use in earth remote sensing is presented. We begin with an analysis of three-level digital correlation and develop the correlator transfer function and radiometric sensitivity. A fifth-order polynomial regression is derived for inverting the digital correlation coefficient into the analog statistic. In addition, the effects of quantizer threshold asymmetry and hysteresis are discussed. A two-look unpolarized calibration scheme is developed for identifying correlation offsets. The developed theory and calibration method are verified using a 10.7 GHz and a 37.0 GHz polarimeter. The polarimeters are based upon 1-GS/s three-level digital correlators and measure the first three Stokes parameters. Through experiment, the radiometric sensitivity is shown to approach the theoretical values derived earlier in the paper, and the two-look unpolarized calibration method compares successfully with results from a polarimetric scheme. Finally, sample data from an aircraft experiment demonstrate that the polarimeter is highly useful for ocean wind-vector measurement.
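    The three-level correlation step described above can be illustrated with a toy simulation; the threshold, correlation value, and sample length are arbitrary assumptions, and the fifth-order polynomial inversion the paper derives is only noted, not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)

def three_level(x, v_t):
    """Three-level (-1, 0, +1) quantizer with threshold v_t."""
    return np.where(x > v_t, 1, np.where(x < -v_t, -1, 0))

# Two Gaussian noise channels with a known analog correlation rho.
rho, n = 0.3, 200_000
z1 = rng.normal(size=n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal(size=n)

# Digital correlation coefficient from the quantized streams. It is a biased
# (attenuated) estimate of rho; the paper inverts it into the analog
# statistic with a fifth-order polynomial regression (not reproduced here).
q1, q2 = three_level(z1, 0.6), three_level(z2, 0.6)
rho_d = np.mean(q1 * q2) / np.sqrt(np.mean(q1**2) * np.mean(q2**2))
```

    The attenuation visible here (the digital coefficient falls below the analog one) is exactly why an inversion step into the analog statistic is required.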

  2. Seeded amplification of chronic wasting disease prions in nasal brushings and recto-anal mucosal associated lymphoid tissues from elk by real time quaking-induced conversion

    USGS Publications Warehouse

    Haley, Nicholas J.; Siepker, Chris; Hoon-Hanks, Laura L.; Mitchell, Gordon; Walter, W. David; Manca, Matteo; Monello, Ryan J.; Powers, Jenny G.; Wild, Margaret A.; Hoover, Edward A.; Caughey, Byron; Richt, Jürgen A.; Fenwick, B.W.

    2016-01-01

    Chronic wasting disease (CWD), a transmissible spongiform encephalopathy of cervids, was first documented nearly 50 years ago in Colorado and Wyoming and has since been detected across North America and the Republic of Korea. The expansion of this disease makes the development of sensitive diagnostic assays and antemortem sampling techniques crucial for the mitigation of its spread; this is especially true in cases of relocation/reintroduction or prevalence studies of large or protected herds, where depopulation may be contraindicated. This study evaluated the sensitivity of the real-time quaking-induced conversion (RT-QuIC) assay of recto-anal mucosa-associated lymphoid tissue (RAMALT) biopsy specimens and nasal brushings collected antemortem. These findings were compared to results of immunohistochemistry (IHC) analysis of ante- and postmortem samples. RAMALT samples were collected from populations of farmed and free-ranging Rocky Mountain elk (Cervus elaphus nelsoni; n = 323), and nasal brush samples were collected from a subpopulation of these animals (n = 205). We hypothesized that the sensitivity of RT-QuIC would be comparable to that of IHC analysis of RAMALT and would correspond to that of IHC analysis of postmortem tissues. We found RAMALT sensitivity (77.3%) to be highly concordant between RT-QuIC and IHC analysis. Sensitivity was lower when testing nasal brushings (34%), though both RAMALT and nasal brush test sensitivities were dependent on both the PRNP genotype and disease progression as determined by the obex score. These data suggest that RT-QuIC, like IHC analysis, is a relatively sensitive assay for detection of CWD prions in RAMALT biopsy specimens and, with further investigation, has potential for large-scale and rapid automated testing of antemortem samples for CWD.

  3. An approach to measure parameter sensitivity in watershed ...

    EPA Pesticide Factsheets

    Hydrologic responses vary spatially and temporally according to watershed characteristics. In this study, the hydrologic models that we developed earlier for the Little Miami River (LMR) and Las Vegas Wash (LVW) watersheds were used for detailed sensitivity analyses. To compare the relative sensitivities of the hydrologic parameters of these two models, we used the Normalized Root Mean Square Error (NRMSE). By combining the NRMSE index with flow duration curve analysis, we derived an approach to measure parameter sensitivities under different flow regimes. Results show that the parameters related to groundwater are highly sensitive in the LMR watershed, whereas the LVW watershed is primarily sensitive to near-surface and impervious parameters. High and medium flows are impacted by most of the parameters, while the low-flow regime is highly sensitive to groundwater-related parameters. Moreover, our approach proved useful in facilitating model development and calibration. This journal article describes hydrological modeling of the effects of climate change and land-use change on stream hydrology, and elucidates the importance of hydrological model construction in generating valid modeling results.
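    A minimal sketch of the NRMSE-plus-flow-duration-curve idea, with synthetic flows and arbitrary regime cutoffs standing in for the actual watershed data:

```python
import numpy as np

def nrmse(obs, sim):
    """Normalized root mean square error between observed and simulated flows."""
    return np.sqrt(np.mean((obs - sim) ** 2)) / (np.max(obs) - np.min(obs))

rng = np.random.default_rng(3)
obs = np.sort(rng.lognormal(mean=1.0, sigma=0.8, size=365))[::-1]  # daily flows, descending
sim = obs * (1 + 0.05 * rng.normal(size=obs.size))                 # perturbed model run

# Flow duration curve: exceedance probability partitions the flow regimes
# (cutoffs at 10% and 70% are arbitrary illustrative choices).
p_exceed = (np.arange(obs.size) + 1) / (obs.size + 1)
high = p_exceed < 0.10
low = p_exceed > 0.70
medium = ~(high | low)
regime_nrmse = {name: nrmse(obs[m], sim[m])
                for name, m in [("high", high), ("medium", medium), ("low", low)]}
```

    Repeating this for each perturbed parameter gives a per-regime sensitivity ranking of the kind the study reports.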

  4. Studies on Early Allergic Sensitization in the Lithuanian Birth Cohort

    PubMed Central

    Dubakiene, Ruta; Rudzeviciene, Odilija; Butiene, Indre; Sezaite, Indre; Petronyte, Malvina; Vaicekauskaite, Dalia; Zvirbliene, Aurelija

    2012-01-01

    Cohort studies are of great importance in defining the mechanism responsible for the development of allergy-associated diseases, such as atopic dermatitis, allergic asthma, and allergic rhinoconjunctivitis. Although these disorders share genetic and environmental risk factors, it is still under debate whether they are linked or develop sequentially along an atopic pathway. The current study aimed to determine the pattern of allergic sensitization in the Lithuanian birth cohort “Alergemol” (n = 1558), established as part of the multicenter European birth cohort “EuroPrevall”. Early sensitization to food allergens in the “Alergemol” birth cohort was analysed. The analysis revealed 1.3% and 2.8% of symptomatic-sensitized subjects at 6 and 12 months of age, respectively. The sensitization pattern in response to different allergens in the group of infants with food allergy symptoms was studied using allergological methods in vivo and in vitro. The impact of maternal and environmental risk factors on the early development of food allergy at 6 and 12 months of age was evaluated. Our data showed that maternal diet, diseases, the use of antibiotics, and tobacco smoke during pregnancy had no significant impact on early sensitization to food allergens. However, infants of atopic mothers were significantly more often sensitized to egg compared with infants of nonatopic mothers. PMID:22606067

  5. Ion-Sensitive Field-Effect Transistor for Biological Sensing

    PubMed Central

    Lee, Chang-Soo; Kim, Sang Kyu; Kim, Moonil

    2009-01-01

    In recent years there has been great progress in applying FET-type biosensors for highly sensitive biological detection. Among them, the ISFET (ion-sensitive field-effect transistor) is one of the most intriguing approaches in electrical biosensing technology. Here, we review some of the main advances in this field over the past few years, explore its application prospects, and discuss the main issues, approaches, and challenges, with the aim of stimulating a broader interest in developing ISFET-based biosensors and extending their applications for reliable and sensitive analysis of various biomolecules such as DNA, proteins, enzymes, and cells. PMID:22423205

  6. On the sensitivity of complex, internally coupled systems

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    A method is presented for computing sensitivity derivatives with respect to independent (input) variables for complex, internally coupled systems, while avoiding the cost and inaccuracy of finite differencing performed on the entire system analysis. The method entails two alternative algorithms: the first is based on the classical implicit function theorem formulated on residuals of governing equations, and the second develops the system sensitivity equations in a new form using the partial (local) sensitivity derivatives of the output with respect to the input of each part of the system. A few application examples are presented to illustrate the discussion.
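    The second algorithm, assembling total system sensitivities from the local partial derivatives of each part, can be illustrated with a two-subsystem toy problem; the functions and partials below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Invented two-subsystem coupled model: y1 = f1(x, y2), y2 = f2(x, y1).
def f1(x, y2): return 0.5 * x + 0.2 * y2
def f2(x, y1): return -0.3 * x + 0.4 * y1

# Local (partial) sensitivities of each subsystem, known analytically here.
df1_dx, df1_dy2 = 0.5, 0.2
df2_dx, df2_dy1 = -0.3, 0.4

# System sensitivity equations: (I - L) dy/dx = df/dx, where L holds the
# cross-coupling partials. One linear solve replaces finite differencing
# of the entire coupled analysis.
L = np.array([[0.0, df1_dy2],
              [df2_dy1, 0.0]])
rhs = np.array([df1_dx, df2_dx])
dy_dx = np.linalg.solve(np.eye(2) - L, rhs)

# Cross-check against finite differencing of the converged coupled solution.
def solve_coupled(x, iters=200):
    y1 = y2 = 0.0
    for _ in range(iters):
        y1, y2 = f1(x, y2), f2(x, y1)
    return np.array([y1, y2])

h = 1e-6
fd = (solve_coupled(1.0 + h) - solve_coupled(1.0 - h)) / (2 * h)
```

    The linear solve reproduces the finite-difference result while requiring only the cheap local partials, which is the cost and accuracy advantage the abstract describes.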

  7. Behavioral metabolomics analysis identifies novel neurochemical signatures in methamphetamine sensitization

    PubMed Central

    Adkins, Daniel E.; McClay, Joseph L.; Vunck, Sarah A.; Batman, Angela M.; Vann, Robert E.; Clark, Shaunna L.; Souza, Renan P.; Crowley, James J.; Sullivan, Patrick F.; van den Oord, Edwin J.C.G.; Beardsley, Patrick M.

    2014-01-01

    Behavioral sensitization has been widely studied in animal models and is theorized to reflect neural modifications associated with human psychostimulant addiction. While the mesolimbic dopaminergic pathway is known to play a role, the neurochemical mechanisms underlying behavioral sensitization remain incompletely understood. In the present study, we conducted the first metabolomics analysis to globally characterize neurochemical differences associated with behavioral sensitization. Methamphetamine (MA)-induced sensitization measures were generated by statistically modeling longitudinal activity data for eight inbred strains of mice. Subsequent to behavioral testing, nontargeted liquid and gas chromatography-mass spectrometry profiling was performed on 48 brain samples, yielding 301 metabolite levels per sample after quality control. Association testing between metabolite levels and three primary dimensions of behavioral sensitization (total distance, stereotypy and margin time) showed four robust, significant associations at a stringent metabolome-wide significance threshold (false discovery rate < 0.05). Results implicated homocarnosine, a dipeptide of GABA and histidine, in total distance sensitization, GABA metabolite 4-guanidinobutanoate and pantothenate in stereotypy sensitization, and myo-inositol in margin time sensitization. Secondary analyses indicated that these associations were independent of concurrent methamphetamine levels and, with the exception of the myo-inositol association, suggest a mechanism whereby strain-based genetic variation produces specific baseline neurochemical differences that substantially influence the magnitude of MA-induced sensitization. These findings demonstrate the utility of mouse metabolomics for identifying novel biomarkers, and developing more comprehensive neurochemical models, of psychostimulant sensitization. PMID:24034544

  8. Optimal cure cycle design for autoclave processing of thick composites laminates: A feasibility study

    NASA Technical Reports Server (NTRS)

    Hou, Jean W.

    1985-01-01

    The thermal analysis and the calculation of thermal sensitivity of a cure cycle in autoclave processing of thick composite laminates were studied. A finite element program for the thermal analysis and for calculating design derivatives of the temperature distribution and the degree of cure was developed and verified. Direct differentiation was found to be the best approach for the thermal design sensitivity analysis. In addition, the direct differentiation approach provided time histories of the design derivatives, which are of great value to cure cycle designers. The direct differentiation approach will be used for further study, i.e., optimal cure cycle design.

  9. Integrated multidisciplinary design optimization using discrete sensitivity analysis for geometrically complex aeroelastic configurations

    NASA Astrophysics Data System (ADS)

    Newman, James Charles, III

    1997-10-01

    The first two steps in the development of an integrated multidisciplinary design optimization procedure capable of analyzing the nonlinear fluid flow about geometrically complex aeroelastic configurations have been accomplished in the present work. For the first step, a three-dimensional unstructured grid approach to aerodynamic shape sensitivity analysis and design optimization has been developed. The advantage of unstructured grids, when compared with a structured-grid approach, is their inherent ability to discretize irregularly shaped domains with greater efficiency and less effort. Hence, this approach is ideally suited for geometrically complex configurations of practical interest. In this work the time-dependent, nonlinear Euler equations are solved using an upwind, cell-centered, finite-volume scheme. The discrete, linearized systems which result from this scheme are solved iteratively by a preconditioned conjugate-gradient-like algorithm known as GMRES for the two-dimensional cases and a Gauss-Seidel algorithm for the three-dimensional; at steady-state, similar procedures are used to solve the accompanying linear aerodynamic sensitivity equations in incremental iterative form. As shown, this particular form of the sensitivity equation makes large-scale gradient-based aerodynamic optimization possible by taking advantage of memory efficient methods to construct exact Jacobian matrix-vector products. Various surface parameterization techniques have been employed in the current study to control the shape of the design surface. Once this surface has been deformed, the interior volume of the unstructured grid is adapted by considering the mesh as a system of interconnected tension springs. Grid sensitivities are obtained by differentiating the surface parameterization and the grid adaptation algorithms with ADIFOR, an advanced automatic-differentiation software tool. 
To demonstrate the ability of this procedure to analyze and design complex configurations of practical interest, the sensitivity analysis and shape optimization have been performed for several two- and three-dimensional cases. In two dimensions, an initially symmetric NACA-0012 airfoil and a high-lift multielement airfoil were examined. For the three-dimensional configurations, an initially rectangular wing with uniform NACA-0012 cross-sections was optimized; in addition, a complete Boeing 747-200 aircraft was studied. Furthermore, the current study also examines the effect of inconsistency in the order of spatial accuracy between the nonlinear fluid and linear shape sensitivity equations. The second step was to develop a computationally efficient, high-fidelity, integrated static aeroelastic analysis procedure. To accomplish this, a structural analysis code was coupled with the aforementioned unstructured-grid aerodynamic analysis solver. The use of an unstructured grid scheme for the aerodynamic analysis enhances the interaction compatibility with the wing structure. The structural analysis utilizes finite elements to model the wing so that accurate structural deflections may be obtained. In the current work, parameters have been introduced to control the interaction of the computational fluid dynamics and structural analyses; these control parameters permit extremely efficient static aeroelastic computations. To demonstrate and evaluate this procedure, static aeroelastic analysis results for a flexible wing in low subsonic, high subsonic (subcritical), transonic (supercritical), and supersonic flow conditions are presented.

  10. Predicting chemically-induced skin reactions. Part II: QSAR models of skin permeability and the relationships between skin permeability and skin sensitization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alves, Vinicius M. (Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599); Muratov, Eugene

    Skin permeability is widely considered to be mechanistically implicated in chemically-induced skin sensitization. Although many chemicals have been identified as skin sensitizers, there have been very few reports analyzing the relationships between molecular structure and skin permeability of sensitizers and non-sensitizers. The goals of this study were to: (i) compile, curate, and integrate the largest publicly available dataset of chemicals studied for their skin permeability; (ii) develop and rigorously validate QSAR models to predict skin permeability; and (iii) explore the complex relationships between skin sensitization and skin permeability. Based on the largest publicly available dataset compiled in this study, we found no overall correlation between skin permeability and skin sensitization. In addition, the cross-species correlation coefficient between human and rodent permeability data was found to be as low as R² = 0.44. Human skin permeability models based on the random forest method have been developed and validated using an OECD-compliant QSAR modeling workflow. Their external accuracy was high (Q²ext = 0.73 for the 63% of external compounds inside the applicability domain). The extended analysis using both experimentally-measured and QSAR-imputed data still confirmed the absence of any overall concordance between skin permeability and skin sensitization. This observation suggests that chemical modifications that affect skin permeability should not be presumed a priori to modulate the sensitization potential of chemicals. The models reported herein, as well as those developed in the companion paper on skin sensitization, suggest that it may be possible to rationally design compounds with the desired high skin permeability but low sensitization potential. Highlights: • The largest publicly available skin permeability dataset was compiled. • Predictive QSAR models were developed for skin permeability. • No concordance between skin sensitization and skin permeability was found. • Structural rules for optimizing sensitization and penetration were established.

  11. Fast and sensitive optical toxicity bioassay based on dual wavelength analysis of bacterial ferricyanide reduction kinetics.

    PubMed

    Pujol-Vila, F; Vigués, N; Díaz-González, M; Muñoz-Berbel, X; Mas, J

    2015-05-15

    Global urban and industrial growth, with the associated environmental contamination, is promoting the development of rapid and inexpensive general toxicity methods. Current microbial methodologies for general toxicity determination rely on either bioluminescent bacteria and a specific medium solution (i.e. Microtox(®)) or low-sensitivity, diffusion-limited protocols (i.e. amperometric microbial respirometry). In this work, a fast and sensitive optical toxicity bioassay based on dual-wavelength analysis of bacterial ferricyanide reduction kinetics is presented, using Escherichia coli as a bacterial model. Ferricyanide reduction kinetic analysis (variation of ferricyanide absorption with time), much more sensitive than single absorbance measurements, allowed for direct and fast toxicity determination without pre-incubation steps (assay time = 10 min) while minimizing biomass interference. Dual-wavelength analysis at 405 nm (ferricyanide and biomass) and 550 nm (biomass) allowed for ferricyanide monitoring without interference from biomass scattering. In addition, refractive index (RI) matching with saccharose reduced bacterial light scattering by around 50%, expanding the analytical linear range in the determination of absorbing molecules. With this method, different toxicants such as metals and organic compounds were analyzed with good sensitivities. Half-maximal effective concentrations (EC50) obtained after the 10 min bioassay, 2.9, 1.0, 0.7 and 18.3 mg L(-1) for copper, zinc, acetic acid and 2-phenylethanol respectively, were in agreement with previously reported values for longer bioassays (around 60 min). This method represents a promising alternative for fast and sensitive water toxicity monitoring, opening the possibility of quick in situ analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
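    As a rough illustration of how an EC50 is read off a concentration-effect series like those reported above; the data points below are synthetic, generated from an assumed Hill curve, not the paper's measurements:

```python
import numpy as np

# Synthetic concentration-effect series from an assumed Hill curve with a
# "true" EC50 of 2.9 mg/L (the copper value reported above) and slope 1.2.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])    # toxicant, mg/L
ec50_true, hill = 2.9, 1.2
effect = conc**hill / (ec50_true**hill + conc**hill)  # 0 = no effect, 1 = full

# EC50 estimate: log-linear interpolation to the 50% effect level.
ec50 = 10 ** np.interp(0.5, effect, np.log10(conc))
```

    In practice the effect values would come from the measured inhibition of ferricyanide reduction kinetics at each concentration, and a full nonlinear fit would replace the simple interpolation.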

  12. Systematic review: Comparison of Xpert MTB/RIF, LAMP and SAT methods for the diagnosis of pulmonary tuberculosis.

    PubMed

    Yan, Liping; Xiao, Heping; Zhang, Qing

    2016-01-01

    Technological advances in nucleic acid amplification have led to breakthroughs in the early detection of pulmonary tuberculosis (PTB) compared to traditional sputum smear tests. The sensitivity and specificity of loop-mediated isothermal amplification (LAMP), simultaneous amplification testing (SAT), and Xpert MTB/RIF for the diagnosis of pulmonary tuberculosis were evaluated. A critical review of previous studies of LAMP, SAT, and Xpert MTB/RIF for the diagnosis of pulmonary tuberculosis that used laboratory culturing as the reference method was carried out together with a meta-analysis. In 25 previous studies, the pooled sensitivity and specificity of the diagnosis of tuberculosis were 93% and 94% for LAMP, 96% and 88% for SAT, and 89% and 98% for Xpert MTB/RIF. The I(2) values for the pooled data were >80%, indicating significant heterogeneity. In the smear-positive subgroup analysis of LAMP, the sensitivity increased from 93% to 98% (I(2) = 2.6%), and specificity was 68% (I(2) = 38.4%). In the HIV-infected subgroup analysis of Xpert MTB/RIF, the pooled sensitivity and specificity were 79% (I(2) = 72.9%) and 99% (I(2) = 64.4%). In the HIV-negative subgroup analysis of Xpert MTB/RIF, the pooled sensitivity and specificity were 72% (I(2) = 49.6%) and 99% (I(2) = 64.5%). LAMP, SAT and Xpert MTB/RIF had comparably high levels of sensitivity and specificity for the diagnosis of tuberculosis. The diagnostic sensitivity and specificity of the three methods were similar, with LAMP being highly sensitive for the diagnosis of smear-positive PTB. The cost-effectiveness of LAMP and SAT makes them particularly suitable tests for diagnosing PTB in developing countries. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity) and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed that EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating that the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086

  14. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on the sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  15. Sensitive Multi-Species Emissions Monitoring: Infrared Laser-Based Detection of Trace-Level Contaminants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steill, Jeffrey D.; Huang, Haifeng; Hoops, Alexandra A.

    This report summarizes our development of spectroscopic chemical analysis techniques and spectral modeling for trace-gas measurements of highly-regulated, low-concentration species present in flue gas emissions from utility coal boilers, such as HCl under conditions of high humidity. Detailed spectral modeling of HCl and other important combustion and atmospheric species such as H2O, CO2, N2O, NO2, SO2, and CH4 demonstrates that IR-laser spectroscopy is a sensitive multi-component analysis strategy. Experimental measurements from techniques based on IR-laser spectroscopy are presented that demonstrate sub-ppm sensitivity levels to these species. Photoacoustic infrared spectroscopy is used to detect and quantify HCl at ppm levels with extremely high signal-to-noise even under conditions of high relative humidity. Additionally, cavity ring-down IR spectroscopy is used to achieve extremely high sensitivity to combustion trace gases in this spectral region; ppm-level CH4 detection is one demonstrated example. The importance of spectral resolution to the sensitivity of a trace-gas measurement is examined by spectral modeling in the mid- and near-IR, and efforts to improve measurement resolution through novel instrument development are described. While previous project reports focused on the benefits and complexities of the dual-etalon cavity ring-down infrared spectrometer, here details are given on the steps taken to implement this unique and potentially revolutionary instrument. This report also illustrates and critiques the general strategy of IR-laser photodetection of trace gases, leading to the conclusion that mid-IR laser spectroscopy techniques provide a promising basis for further instrument development and implementation that will enable cost-effective, sensitive detection of multiple key contaminant species simultaneously.

  16. Timing of food introduction and development of food sensitization in a prospective birth cohort.

    PubMed

    Tran, Maxwell M; Lefebvre, Diana L; Dai, David; Dharma, Christoffer; Subbarao, Padmaja; Lou, Wendy; Azad, Meghan B; Becker, Allan B; Mandhane, Piush J; Turvey, Stuart E; Sears, Malcolm R

    2017-08-01

    The effect of infant feeding practices on the development of food allergy remains controversial. We examined the relationship between timing and patterns of food introduction and sensitization to foods at age 1 year in the Canadian Healthy Infant Longitudinal Development (CHILD) birth cohort study. Nutrition questionnaire data prospectively collected at age 3, 6, 12, 18, and 24 months were used to determine timing of introduction of cow's milk products, egg, and peanut. At age 1 year, infants underwent skin prick testing to cow's milk, egg white, and peanut. Logistic regression models were fitted to assess the impact of timing of food exposures on sensitization outcomes, and latent class analysis was used to study patterns of food introduction within the cohort. Among 2124 children with sufficient data, delaying introduction of cow's milk products, egg, and peanut beyond the first year of life significantly increased the odds of sensitization to that food (cow's milk adjOR 3.69, 95% CI 1.37-9.08; egg adjOR 1.89, 95% CI 1.25-2.80; peanut adjOR 1.76, 95% CI 1.07-3.01). Latent class analysis produced a three-class model: early, usual, and delayed introduction. A pattern of delayed introduction, characterized by avoidance of egg and peanut during the first year of life, increased the odds of sensitization to any of the three tested foods (adjOR 1.78, 95% CI 1.26-2.49). Avoidance of potentially allergenic foods during the first year of life significantly increased the odds of sensitization to the corresponding foods. © 2017 The Authors. Pediatric Allergy and Immunology Published by John Wiley & Sons Ltd.
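    The odds-ratio reasoning in the abstract can be illustrated with an unadjusted 2×2-table calculation; the counts below are invented, and the study's actual estimates came from adjusted logistic regression models:

```python
import numpy as np

# Invented 2x2 table: egg sensitization at age 1 year by timing of egg
# introduction (delayed beyond 12 months vs. within the first year).
#                   [sensitized, not sensitized]
delayed = np.array([30, 270])
early = np.array([40, 1784])

# Unadjusted odds ratio and Woolf 95% confidence interval on its log.
odds_ratio = (delayed[0] / delayed[1]) / (early[0] / early[1])
se_log_or = np.sqrt((1.0 / np.concatenate([delayed, early])).sum())
ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
```

    An interval whose lower bound exceeds 1 indicates significantly increased odds of sensitization with delayed introduction, which is the direction of effect the study reports.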

  17. Sensitivity Analysis in Engineering

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M. (Compiler); Haftka, Raphael T. (Compiler)

    1987-01-01

    The symposium proceedings presented focused primarily on sensitivity analysis of structural response. However, the first session, entitled, General and Multidisciplinary Sensitivity, focused on areas such as physics, chemistry, controls, and aerodynamics. The other four sessions were concerned with the sensitivity of structural systems modeled by finite elements. Session 2 dealt with Static Sensitivity Analysis and Applications; Session 3 with Eigenproblem Sensitivity Methods; Session 4 with Transient Sensitivity Analysis; and Session 5 with Shape Sensitivity Analysis.

  18. Functionalized Gold Nanoparticles for the Detection of C-Reactive Protein

    PubMed Central

    António, Maria

    2018-01-01

    C-reactive protein (CRP) is a very important biomarker of infection and inflammation for a number of diseases. Routine CRP measurements with high sensitivity and reliability are highly relevant to the assessment of states of inflammation and the efficacy of treatment intervention, and require the development of very sensitive, selective, fast, robust and reproducible assays. Gold nanoparticles (Au NPs) are distinguished for their unique electrical and optical properties and the ability to conjugate with biomolecules. Au NP-based probes have attracted considerable attention in the last decade in the analysis of biological samples due to their simplicity, high sensitivity and selectivity. Thus, this article aims to be a critical and constructive analysis of the literature of the last three years regarding the advances made in the development of bioanalytical assays based on gold nanoparticles for the in vitro detection and quantification of C-reactive protein from biological samples. Current methods for Au NP synthesis and the strategies for surface modification aiming at selectivity towards CRP are highlighted. PMID:29597295

  19. Behavior sensitivities for control augmented structures

    NASA Technical Reports Server (NTRS)

    Manning, R. A.; Lust, R. V.; Schmit, L. A.

    1987-01-01

    During the past few years it has been recognized that combining passive structural design methods with active control techniques offers the prospect of being able to find substantially improved designs. These developments have stimulated interest in augmenting structural synthesis by adding active control system design variables to those usually considered in structural optimization. An essential step in extending the approximation concepts approach to control augmented structural synthesis is the development of a behavior sensitivity analysis capability for determining rates of change of dynamic response quantities with respect to changes in structural and control system design variables. Behavior sensitivity information is also useful for man-machine interactive design as well as in the context of system identification studies. Behavior sensitivity formulations for both steady state and transient response are presented and the quality of the resulting derivative information is evaluated.

  20. SCALE 6.2 Continuous-Energy TSUNAMI-3D Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2015-01-01

    The TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation) capabilities within the SCALE code system make use of sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different systems, quantifying computational biases, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved ease of use and fidelity, and the desire to extend TSUNAMI analysis to advanced applications, have motivated the development of a SCALE 6.2 module for calculating sensitivity coefficients using three-dimensional (3D) continuous-energy (CE) Monte Carlo methods: CE TSUNAMI-3D. This paper provides an overview of the theory, implementation, and capabilities of the CE TSUNAMI-3D sensitivity analysis methods. CE TSUNAMI contains two methods for calculating sensitivity coefficients in eigenvalue sensitivity applications: (1) the Iterated Fission Probability (IFP) method and (2) the Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Track length importance CHaracterization (CLUTCH) method. This work also presents the GEneralized Adjoint Response in Monte Carlo method (GEAR-MC), a first-of-its-kind approach for calculating adjoint-weighted, generalized response sensitivity coefficients (such as flux responses or reaction rate ratios) in CE Monte Carlo applications. The accuracy and efficiency of the CE TSUNAMI-3D eigenvalue sensitivity methods are assessed from a user perspective in a companion publication, and the accuracy and features of the CE TSUNAMI-3D GEAR-MC methods are detailed in this paper.
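    Conceptually, the sensitivity coefficients TSUNAMI computes are relative derivatives of a response with respect to input data (dk/k per dσ/σ). A minimal numerical sketch of that normalized form, using a toy response rather than any SCALE model:

```python
def relative_sensitivity(f, x0, rel_step=1e-4):
    """Normalized sensitivity coefficient S = (df/f)/(dx/x), i.e. the
    fractional change in the response per fractional change in the
    input, estimated with a central finite difference."""
    h = x0 * rel_step
    df = f(x0 + h) - f(x0 - h)
    return (df / f(x0)) / (2.0 * h / x0)

# Toy response: f(x) = 2 * x**0.3 has exact logarithmic derivative 0.3
S = relative_sensitivity(lambda x: 2.0 * x ** 0.3, 5.0)
```

    Adjoint methods such as IFP and CLUTCH obtain these coefficients without the repeated perturbed runs a finite-difference estimate would require.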

  1. Space shuttle orbiter digital data processing system timing sensitivity analysis OFT ascent phase

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Becker, D. A.

    1977-01-01

    Dynamic loads were investigated to provide simulation and analysis of the space shuttle orbiter digital data processing system (DDPS). Segments of the orbital flight test (OFT) configuration were modeled utilizing the information management system interpretive model (IMSIM) in a computerized simulation of the OFT hardware and software workload. System requirements for simulation of the OFT configuration were defined, and sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and these sensitivity analyses, a test design was developed for adapting, parameterizing, and executing IMSIM, using varying load and stress conditions for model execution. Analyses of the computer simulation runs are documented, including results, conclusions, and recommendations for DDPS improvements.

  2. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic uncertainty and sensitivity analysis approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  3. RuO2 pH Sensor with Super-Glue-Inspired Reference Electrode

    PubMed Central

    Wajrak, Magdalena; Alameh, Kamal

    2017-01-01

    A pH-sensitive RuO2 electrode coated in a commercial cyanoacrylate adhesive typically exhibits very low pH sensitivity and can be paired with a RuO2 working electrode as a differential-type pH sensor. However, such sensors display poor performance in real sample matrices. A pH sensor employing a RuO2 pH-sensitive working electrode and a SiO2-PVB junction-modified RuO2 reference electrode is developed as an alternative high-performance solution. This sensor exhibits performance similar to that of a commercial glass pH sensor in some common sample matrices; in particular, an excellent pH sensitivity of 55.7 mV/pH, a hysteresis as low as 2.7 mV, and a drift below 2.2 mV/h. The developed sensor structure opens the way towards the development of a simple, cost-effective, and robust pH sensor for pH analysis in various sample matrices. PMID:28878182

  4. RuO₂ pH Sensor with Super-Glue-Inspired Reference Electrode.

    PubMed

    Lonsdale, Wade; Wajrak, Magdalena; Alameh, Kamal

    2017-09-06

    A pH-sensitive RuO₂ electrode coated in a commercial cyanoacrylate adhesive typically exhibits very low pH sensitivity and can be paired with a RuO₂ working electrode as a differential-type pH sensor. However, such sensors display poor performance in real sample matrices. A pH sensor employing a RuO₂ pH-sensitive working electrode and a SiO₂-PVB junction-modified RuO₂ reference electrode is developed as an alternative high-performance solution. This sensor exhibits performance similar to that of a commercial glass pH sensor in some common sample matrices; in particular, an excellent pH sensitivity of 55.7 mV/pH, a hysteresis as low as 2.7 mV, and a drift below 2.2 mV/h. The developed sensor structure opens the way towards the development of a simple, cost-effective, and robust pH sensor for pH analysis in various sample matrices.

  5. A PROBABILISTIC ARSENIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CHROMATED COPPER ARSENATE (CCA)-TREATED PLAYSETS AND DECKS: PART 2 SENSITIVITY AND UNCERTAINTY ANALYSIS

    EPA Science Inventory

    A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two part paper. This Part 2 paper discusses sensitivity and uncertainty analyses conducted to assess the key m...

  6. Sensitivity of Combustion-Acoustic Instabilities to Boundary Conditions for Premixed Gas Turbine Combustors

    NASA Technical Reports Server (NTRS)

    Darling, Douglas; Radhakrishnan, Krishnan; Oyediran, Ayo

    1995-01-01

    Premixed combustors, which are being considered for low NOx engines, are susceptible to instabilities due to feedback between pressure perturbations and combustion. This feedback can cause damaging mechanical vibrations of the system as well as degrade the emissions characteristics and combustion efficiency. In a lean combustor instabilities can also lead to blowout. A model was developed to perform linear combustion-acoustic stability analysis using detailed chemical kinetic mechanisms. The Lewis Kinetics and Sensitivity Analysis Code, LSENS, was used to calculate the sensitivities of the heat release rate to perturbations in density and temperature. In the present work, an assumption was made that the mean flow velocity was small relative to the speed of sound. Results of this model showed the regions of growth of perturbations to be most sensitive to the reflectivity of the boundary when reflectivities were close to unity.

  7. UTI diagnosis and antibiogram using Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Kastanos, Evdokia; Kyriakides, Alexandros; Hadjigeorgiou, Katerina; Pitris, Constantinos

    2009-07-01

    Urinary tract infection diagnosis and antibiogram require a 48-hour waiting period using conventional methods. This results in ineffective treatments, increased costs and, most importantly, increased resistance to antibiotics. In this work, a novel method for classifying bacteria and determining their sensitivity to an antibiotic using Raman spectroscopy is described. Raman spectra of three species of gram-negative Enterobacteria, most commonly responsible for urinary tract infections, were collected. The study included 25 samples each of E. coli, Klebsiella p., and Proteus spp. A novel algorithm based on spectral ratios followed by discriminant analysis resulted in classification with over 94% accuracy. Sensitivity and specificity for the three types of bacteria ranged from 88% to 100%. For the development of an antibiogram, bacterial samples were treated with the antibiotic ciprofloxacin, to which they were all sensitive. Sensitivity to the antibiotic was evident from analysis of the Raman signatures of bacteria treated or not treated with this antibiotic as early as two hours after exposure. This technique can lead to the development of new technology for urinary tract infection diagnosis and antibiogram with same-day results, bypassing urine cultures and avoiding all undesirable consequences of current practice.
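    As an illustration of the ratio-plus-discriminant idea, the sketch below forms band-intensity ratios and assigns a spectrum to the nearest class centroid. The band indices, centroid values, and nearest-centroid rule are hypothetical stand-ins, not the authors' algorithm:

```python
def ratio_features(spectrum, bands):
    """Ratios of band intensities used as classification features."""
    return [spectrum[i] / spectrum[j] for i, j in bands]

def nearest_centroid(feat, centroids):
    """Minimal stand-in for the discriminant step: assign the feature
    vector to the class with the closest mean (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda c: dist(feat, centroids[c]))

bands = [(0, 2), (1, 2)]                       # hypothetical band pairs
centroids = {'E. coli': [1.2, 0.8],            # hypothetical class means
             'Proteus': [0.6, 1.4]}
label = nearest_centroid(ratio_features([0.66, 0.44, 0.55], bands),
                         centroids)
```

    Ratios of band intensities cancel overall signal-strength variation between acquisitions, which is one reason ratio features suit spectral classification.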

  8. CORSSTOL: Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity

    NASA Technical Reports Server (NTRS)

    Finckenor, J.; Bevill, M.

    1995-01-01

    Cylinder Optimization of Rings, Skin, and Stringers with Tolerance (CORSSTOL) sensitivity is a design optimization program incorporating a method to examine the effects of user-provided manufacturing tolerances on weight and failure. CORSSTOL gives designers a tool to determine tolerances based on need. This is a decisive way to choose the best design among several manufacturing methods with differing capabilities and costs. CORSSTOL initially optimizes a stringer-stiffened cylinder for weight without tolerances. The skin and stringer geometry are varied, subject to stress and buckling constraints. Then the same analysis and optimization routines are used to minimize the maximum material condition weight subject to the least favorable combination of tolerances. The adjusted optimum dimensions are provided with the weight and constraint sensitivities of each design variable. The designer can immediately identify critical tolerances. The safety of parts made out of tolerance can also be determined. During design and development of weight-critical systems, design/analysis tools that provide product-oriented results are of vital significance. The development of this program and methodology provides designers with an effective cost- and weight-saving design tool. The tolerance sensitivity method can be applied to any system defined by a set of deterministic equations.

  9. Urinary tract infection diagnosis and response to antibiotics using Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Kastanos, Evdokia; Kyriakides, Alexandros; Hadjigeorgiou, Katerina; Pitris, Constantinos

    2009-02-01

    Urinary tract infection diagnosis and antibiogram require a 48 hour waiting period using conventional methods. This results in ineffective treatments, increased costs and most importantly in increased resistance to antibiotics. In this work, a novel method for classifying bacteria and determining their sensitivity to an antibiotic using Raman spectroscopy is described. Raman spectra of three species of gram negative Enterobacteria, most commonly responsible for urinary tract infections, were collected. The study included 25 samples each of E.coli, Klebsiella p. and Proteus spp. A novel algorithm based on spectral ratios followed by discriminant analysis resulted in classification with over 94% accuracy. Sensitivity and specificity for the three types of bacteria ranged from 88-100%. For the development of an antibiogram, bacterial samples were treated with the antibiotic ciprofloxacin to which they were all sensitive. Sensitivity to the antibiotic was evident after analysis of the Raman signatures of bacteria treated or not treated with this antibiotic as early as two hours after exposure. This technique can lead to the development of new technology for urinary tract infection diagnosis and antibiogram with same day results, bypassing urine cultures and avoiding all undesirable consequences of current practice.

  10. DGSA: A Matlab toolbox for distance-based generalized sensitivity analysis of geoscientific computer experiments

    NASA Astrophysics Data System (ADS)

    Park, Jihoon; Yang, Guang; Satija, Addy; Scheidt, Céline; Caers, Jef

    2016-12-01

    Sensitivity analysis plays an important role in geoscientific computer experiments, whether for forecasting, data assimilation or model calibration. In this paper we focus on an extension of a method of regionalized sensitivity analysis (RSA) to applications typical in the Earth Sciences. Such applications involve the building of large complex spatial models, the application of computationally extensive forward modeling codes and the integration of heterogeneous sources of model uncertainty. The aim of this paper is to be practical: (1) provide a Matlab code, (2) provide novel visualization methods to aid users in getting a better understanding of the sensitivity, (3) provide a method based on kernel principal component analysis (KPCA) and self-organizing maps (SOM) to account for spatial uncertainty typical in Earth Science applications, and (4) provide an illustration on a real field case where the above-mentioned complexities present themselves. We present methods that extend the original RSA method in several ways. First, we present the calculation of conditional effects, defined as the sensitivity of a parameter given a level of another parameter. Second, we show how this conditional effect can be used to choose nominal values or ranges for fixing insensitive parameters while minimally affecting uncertainty in the response. Third, we develop a method based on KPCA and SOM to assign a rank to spatial models in order to calculate the sensitivity to spatial variability in the models. A large oil/gas reservoir case is used as an illustration of these ideas.
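    The original RSA idea that the paper extends can be sketched as follows: Monte Carlo runs are split into "behavioral" and "non-behavioral" sets according to the response, and a parameter is deemed sensitive when its distribution differs clearly between the two sets, here measured with a two-sample Kolmogorov-Smirnov statistic on toy data:

```python
import random

def ks_statistic(xs, ys):
    """Two-sample Kolmogorov-Smirnov statistic (maximum CDF gap)."""
    xs, ys = sorted(xs), sorted(ys)
    grid = sorted(set(xs + ys))
    def cdf(sample, v):
        return sum(1 for s in sample if s <= v) / len(sample)
    return max(abs(cdf(xs, v) - cdf(ys, v)) for v in grid)

random.seed(0)
# Toy experiment: the response depends strongly on p1, not at all on p2.
runs = [(random.random(), random.random()) for _ in range(500)]
behav = [r for r in runs if r[0] > 0.6]    # "behavioral" runs
non = [r for r in runs if r[0] <= 0.6]     # "non-behavioral" runs
d1 = ks_statistic([r[0] for r in behav], [r[0] for r in non])
d2 = ks_statistic([r[1] for r in behav], [r[1] for r in non])
```

    A large gap (d1) flags a sensitive parameter; a small one (d2) an insensitive parameter. The paper's distance-based generalization replaces the scalar response split with distances between model outputs.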

  11. Additional EIPC Study Analysis: Interim Report on High Priority Topics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, Stanton W

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC), among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phases 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 13 topics was developed for further analysis; this paper discusses the first five.

  12. Documentation for a Structural Optimization Procedure Developed Using the Engineering Analysis Language (EAL)

    NASA Technical Reports Server (NTRS)

    Martin, Carl J., Jr.

    1996-01-01

    This report describes a structural optimization procedure developed for use with the Engineering Analysis Language (EAL) finite element analysis system. The procedure is written primarily in the EAL command language. Three external processors, written in FORTRAN, generate equivalent stiffnesses and evaluate stress and local buckling constraints for the sections. Several built-up structural sections were coded into the design procedures. These structural sections were selected for use in aircraft design but are suitable for other applications. Sensitivity calculations use the semi-analytic method, and an extensive effort has been made to increase execution speed and reduce storage requirements. An approximate sensitivity update method is also included, which can significantly reduce computational time. The optimization is performed by an implementation of the MINOS V5.4 linear programming routine in a sequential linear programming procedure.

  13. Pseudotargeted MS Method for the Sensitive Analysis of Protein Phosphorylation in Protein Complexes.

    PubMed

    Lyu, Jiawen; Wang, Yan; Mao, Jiawei; Yao, Yating; Wang, Shujuan; Zheng, Yong; Ye, Mingliang

    2018-05-15

    In this study, we presented an enrichment-free approach for the sensitive analysis of protein phosphorylation in minute amounts of samples, such as purified protein complexes. This method takes advantage of the high sensitivity of parallel reaction monitoring (PRM). Specifically, low confident phosphopeptides identified from the data-dependent acquisition (DDA) data set were used to build a pseudotargeted list for PRM analysis to allow the identification of additional phosphopeptides with high confidence. The development of this targeted approach is very easy as the same sample and the same LC-system were used for the discovery and the targeted analysis phases. No sample fractionation or enrichment was required for the discovery phase which allowed this method to analyze minute amount of sample. We applied this pseudotargeted MS method to quantitatively examine phosphopeptides in affinity purified endogenous Shc1 protein complexes at four temporal stages of EGF signaling and identified 82 phospho-sites. To our knowledge, this is the highest number of phospho-sites identified from the protein complexes. This pseudotargeted MS method is highly sensitive in the identification of low abundance phosphopeptides and could be a powerful tool to study phosphorylation-regulated assembly of protein complex.

  14. Land quality, sustainable development and environmental degradation in agricultural districts: A computational approach based on entropy indexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zambon, Ilaria, E-mail: ilaria.zambon@unitus.it; Colantoni, Andrea; Carlucci, Margherita

    Land Degradation (LD) in socio-environmental systems negatively impacts sustainable development paths. This study proposes a framework for LD evaluation based on indicators of diversification in the spatial distribution of sensitive land. We hypothesize that conditions for spatial heterogeneity in a composite index of land sensitivity are more frequently associated with areas prone to LD than spatial homogeneity. Spatial heterogeneity is supposed to be associated with degraded areas that act as hotspots for future degradation processes. A diachronic analysis (1960–2010) was performed at the Italian agricultural district scale to identify environmental factors associated with spatial heterogeneity in the degree of land sensitivity to degradation based on the Environmentally Sensitive Area Index (ESAI). In 1960, diversification in the level of land sensitivity, measured using two common indexes of entropy (Shannon's diversity and Pielou's evenness), increased significantly with the ESAI, indicating a high level of land sensitivity to degradation. In 2010, the surface area classified as "critical" to LD was highest in districts with diversification in the spatial distribution of ESAI values, confirming the hypothesis formulated above. Entropy indexes, based on their observed alignment with the concept of LD, constitute a valuable base to inform mitigation strategies against desertification. Highlights: spatial heterogeneity is supposed to be associated with degraded areas; entropy indexes can inform mitigation strategies against desertification; the study assesses spatial diversification in the degree of land sensitivity to degradation; Mediterranean rural areas show evident diversity in agricultural systems; a diachronic analysis was carried out at the Italian agricultural district scale.
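    The two entropy indexes named above have simple closed forms: Shannon's diversity H' = -Σ p_i ln p_i and Pielou's evenness J = H'/ln S, where S is the number of classes. A short sketch on hypothetical class-share data, not the paper's ESAI data:

```python
import math

def shannon_diversity(counts):
    """Shannon's H' = -sum(p_i * ln p_i) over non-empty classes."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

def pielou_evenness(counts):
    """Pielou's J = H' / ln(S); J = 1 means a perfectly even spread."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)

# Hypothetical shares of area per land-sensitivity class in a district
even = [25, 25, 25, 25]   # evenly spread sensitivity classes
skew = [85, 5, 5, 5]      # one class dominates
```

    A district where one sensitivity class dominates scores lower evenness than one with classes evenly represented, which is how the indexes capture spatial diversification.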

  15. New non-chemically amplified molecular resist design with switchable sensitivity for multi-lithography applications and nanopatterning

    NASA Astrophysics Data System (ADS)

    Thakur, Neha; Guruprasad Reddy, Pulikanti; Nandi, Santu; Yogesh, Midathala; Sharma, Satinder K.; Pradeep, Chullikkattil P.; Ghosh, Subrata; Gonsalves, Kenneth E.

    2017-12-01

    The development of new photoresist materials for multi-lithography applications is a crucial but challenging task for the semiconductor industry. During the last few decades, given the need for new resists to meet the requirements of semiconductor industries, several research groups have developed different resist materials for specific lithography applications. In this context, we have successfully synthesized a new molecular non-chemically amplified resist (n-CAR) (C3) based on the functionalization of an aromatic hydroxyl core (4,4′-(9H-fluorene-9,9-diyl)diphenol) with radiation-sensitive sulfonium triflates for various lithography applications. While micron-scale features have been developed using i-line (365 nm) and DUVL (254 nm) exposure tools, electron beam studies on C3 thin films enabled us to pattern 20 nm line features with L/3S (line/space) characteristics on the silicon substrate. The sensitivity and contrast were calculated from the contrast curve analysis as 280 µC cm-2 and 0.025, respectively. Being an important parameter for any newly developed resist, the line edge roughness (LER) of 30 nm (L/5S) features was calculated, using the SUMMIT metrology package, to be 3.66 ± 0.3 nm and found to be within the acceptable range. AFM analysis further confirmed the 20 nm line width with smooth pattern walls. No deformation of patterned features was observed during AFM analysis, which indicates good adhesion between the patterned resist and the silicon substrate.

  16. Sensitivity Analysis of OECD Benchmark Tests in BISON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Gamble, Kyle; Schmidt, Rodney C.

    2015-09-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on sensitivity analysis of a fuels performance benchmark problem. The benchmark problem was defined by the Uncertainty Analysis in Modeling working group of the Nuclear Science Committee, part of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD). The benchmark problem involved steady-state behavior of a fuel pin in a Pressurized Water Reactor (PWR). The problem was created in the BISON fuels performance code. Dakota was used to generate and analyze 300 samples of 17 input parameters defining core boundary conditions, manufacturing tolerances, and fuel properties. There were 24 responses of interest, including fuel centerline temperatures at a variety of locations and burnup levels, fission gas released, axial elongation of the fuel pin, etc. Pearson and Spearman correlation coefficients and Sobol' variance-based indices were used to perform the sensitivity analysis. This report summarizes the process and presents results from this study.
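    Pearson and Spearman coefficients measure linear and rank (monotone) input-output association, respectively, which is why sensitivity studies often report both. A minimal sketch on toy data, not the BISON samples:

```python
import math

def pearson(xs, ys):
    """Pearson linear correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def spearman(xs, ys):
    """Rank correlation: Pearson applied to ranks (no tie handling)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(xs), ranks(ys))

# Monotone but non-linear input-output relation
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [xi ** 3 for xi in x]
```

    For this cubic relation Spearman is exactly 1 while Pearson falls short of 1, illustrating why rank correlation is preferred when responses are monotone but non-linear.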

  17. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region

    PubMed Central

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-01

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytic Hierarchy Process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. Elaborating the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity enables analysis of the atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for picking out key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management. PMID:29342852

  18. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region.

    PubMed

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-13

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytic Hierarchy Process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. Elaborating the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity enables analysis of the atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for picking out key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management.

  19. A framework for sensitivity analysis of decision trees.

    PubMed

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
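    The core computation such a framework perturbs, the expected value of a decision tree, can be sketched as follows. The tree shape, payoffs, and probabilities below are hypothetical:

```python
def expected_value(tree):
    """Evaluate a tree given as nested tuples:
    ('c', [(p, subtree), ...]) is a chance node, ('d', [subtrees])
    a decision node (max over options), and a bare float a payoff."""
    if isinstance(tree, (int, float)):
        return tree
    kind, branches = tree
    if kind == 'c':
        return sum(p * expected_value(sub) for p, sub in branches)
    return max(expected_value(sub) for sub in branches)

# Hypothetical problem: a risky action vs. a sure payoff of 50
def tree(p_success):
    return ('d', [('c', [(p_success, 100.0), (1 - p_success, 0.0)]),
                  50.0])

base = expected_value(tree(0.6))   # risky option preferred
pess = expected_value(tree(0.4))   # pessimistic perturbation flips it
```

    Sensitivity analysis in this setting asks how far the probabilities can be perturbed, here from 0.6 down toward 0.4, before the expected-value-maximizing strategy changes.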

  20. Analysis of Fluorotelomer Alcohols in Soils: Optimization of Extraction and Chromatography

    EPA Science Inventory

    This article describes the development of an analytical method for the determination of fluorotelomer alcohols (FTOHs) in soil. The sensitive and selective determination of the telomer alcohols was performed by extraction with methyl tert-butyl ether (MTBE) and analysis of the ext...

  1. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models in which a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty to the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the input data, and its effect on model results has been evaluated by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of successfully employing MERLIN-Expo in integrated, high-tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
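    The Monte Carlo propagation step can be sketched generically: sample each uncertain input from its distribution, run the model, and summarize the spread of the output. The model and the distributions below are hypothetical stand-ins, not MERLIN-Expo's equations:

```python
import math
import random
import statistics

def body_burden(half_life_y, body_weight_kg, intake_ng_day):
    """Stand-in steady-state body-burden model (hypothetical):
    intake divided by first-order elimination times body weight."""
    elimination = math.log(2) / half_life_y   # 1/years
    return intake_ng_day / (elimination * body_weight_kg)

random.seed(1)
samples = []
for _ in range(2000):
    hl = random.lognormvariate(math.log(5.0), 0.3)  # half-life, years
    bw = random.gauss(70.0, 10.0)                   # body weight, kg
    ik = random.uniform(0.5, 1.5)                   # intake, ng/day
    samples.append(body_burden(hl, bw, ik))
mean = statistics.mean(samples)
sd = statistics.stdev(samples)
```

    Screening (Morris), regression, and variance-based (EFAST) techniques then apportion the output variance among the sampled inputs.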

  2. Diagnostic staging laparoscopy in gastric cancer treatment: A cost-effectiveness analysis.

    PubMed

    Li, Kevin; Cannon, John G D; Jiang, Sam Y; Sambare, Tanmaya D; Owens, Douglas K; Bendavid, Eran; Poultsides, George A

    2018-05-01

    Accurate preoperative staging helps avert morbidity, mortality, and cost associated with non-therapeutic laparotomy in gastric cancer (GC) patients. Diagnostic staging laparoscopy (DSL) can detect metastases with high sensitivity, but its cost-effectiveness has not been previously studied. We developed a decision analysis model to assess the cost-effectiveness of preoperative DSL in GC workup. Analysis was based on a hypothetical cohort of GC patients in the U.S. for whom initial imaging shows no metastases. The cost-effectiveness of DSL was measured as cost per quality-adjusted life-year (QALY) gained. Drivers of cost-effectiveness were assessed in sensitivity analysis. Preoperative DSL required an investment of $107 012 per QALY. In sensitivity analysis, DSL became cost-effective at a threshold of $100 000/QALY when the probability of occult metastases exceeded 31.5% or when test sensitivity for metastases exceeded 86.3%. The likelihood of cost-effectiveness increased from 46% to 93% when both parameters were set at maximum reported values. The cost-effectiveness of DSL for GC patients is highly dependent on patient and test characteristics, and is more likely when DSL is used selectively where procedure yield is high, such as for locally advanced disease or in detecting peritoneal and superficial versus deep liver lesions. © 2017 Wiley Periodicals, Inc.
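The cost-effectiveness logic above reduces to a threshold comparison. The sketch below uses hypothetical cost and QALY figures (not the study's cohort data) to illustrate how an incremental cost-effectiveness ratio is judged against a $100,000/QALY willingness-to-pay threshold:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

def cost_effective(icer_value, willingness_to_pay=100_000):
    """An intervention is conventionally cost-effective when its ICER
    falls at or below the willingness-to-pay threshold."""
    return icer_value <= willingness_to_pay

# Hypothetical inputs (not the study's actual cohort figures)
ratio = icer(cost_new=45_000, cost_old=34_298, qaly_new=5.10, qaly_old=5.00)
print(ratio, cost_effective(ratio))  # a ratio above 100,000/QALY fails the threshold
```

Sensitivity analysis in such a model amounts to recomputing the ratio while sweeping inputs (e.g., probability of occult metastases, test sensitivity) and reporting where the cost-effectiveness verdict flips.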

  3. A sensitivity analysis for a thermomechanical model of the Antarctic ice sheet and ice shelves

    NASA Astrophysics Data System (ADS)

    Baratelli, F.; Castellani, G.; Vassena, C.; Giudici, M.

    2012-04-01

The outcomes of an ice sheet model depend on a number of parameters and physical quantities which are often estimated with large uncertainty because of the lack of sufficient experimental measurements in such remote environments. Therefore, efforts to improve the accuracy of the predictions of ice sheet models by including more physical processes and interactions with the atmosphere, hydrosphere and lithosphere can be undermined by the inaccuracy of the fundamental input data. A sensitivity analysis can help identify the input data that most affect the different predictions of the model. In this context, a finite difference thermomechanical ice sheet model based on the Shallow-Ice Approximation (SIA) and on the Shallow-Shelf Approximation (SSA) has been developed and applied to simulate the evolution of the Antarctic ice sheet and ice shelves over the last 200 000 years. The sensitivity analysis of the model outcomes (e.g., the volume of the ice sheet and of the ice shelves, the basal melt rate of the ice sheet, the mean velocity of the Ross and Ronne-Filchner ice shelves, the wet area at the base of the ice sheet) with respect to the model parameters (e.g., the basal sliding coefficient, the geothermal heat flux, the present-day surface accumulation and temperature, the mean ice shelf viscosity, the melt rate at the base of the ice shelves) has been performed by computing three synthetic numerical indices: two local sensitivity indices and a global sensitivity index. Local sensitivity indices imply a linearization of the model and neglect both non-linear and joint effects of the parameters. The global variance-based sensitivity index, instead, takes into account the complete variability of the input parameters but is usually computed with a Monte Carlo approach, which is computationally very demanding for non-linear complex models.
Therefore, the global sensitivity index has been computed using a second-order expansion of the model outputs in a neighborhood of the reference parameter values. The comparison of the three sensitivity indices showed that approximating the non-linear model with a second-order expansion is sufficient to reveal some differences between the local and the global indices. As a general result, the sensitivity analysis showed that most of the model outcomes are mainly sensitive to the present-day surface temperature and accumulation, which, in principle, can be measured more easily (e.g., with remote sensing techniques) than the other input parameters considered. On the other hand, the parameters to which the model is least sensitive are the basal sliding coefficient and the mean ice shelf viscosity.
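A local sensitivity index of the kind described, obtained by linearizing the model around reference parameter values, can be illustrated with finite differences. The power-law surrogate and reference values below are hypothetical stand-ins, not the ice sheet model itself:

```python
import numpy as np

def local_sensitivity(model, p_ref, rel_step=0.01):
    """Normalized local sensitivity indices S_i = (p_i / y) * dy/dp_i,
    estimated with central finite differences around reference values."""
    p_ref = np.asarray(p_ref, dtype=float)
    y_ref = model(p_ref)
    indices = np.empty(p_ref.size)
    for i in range(p_ref.size):
        h = rel_step * p_ref[i]
        p_plus, p_minus = p_ref.copy(), p_ref.copy()
        p_plus[i] += h
        p_minus[i] -= h
        dy_dp = (model(p_plus) - model(p_minus)) / (2 * h)
        indices[i] = p_ref[i] / y_ref * dy_dp
    return indices

# Hypothetical surrogate: output grows with p0, shrinks with p1 and p2
model = lambda p: p[0] ** 2 / (p[1] * np.sqrt(p[2]))
S = local_sensitivity(model, p_ref=[2.0, 1.0, 4.0])
print(S)  # for a power law, normalized indices recover the exponents: ~[2, -1, -0.5]
```

Because the index is a linearization, it misses exactly the non-linear and joint effects the abstract mentions, which is why the global (variance-based or expansion-based) index is needed as a complement.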

  4. Approximate analysis for repeated eigenvalue problems with applications to controls-structure integrated design

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Hou, Gene J. W.

    1994-01-01

    A method for eigenvalue and eigenvector approximate analysis for the case of repeated eigenvalues with distinct first derivatives is presented. The approximate analysis method developed involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations to changes in the eigenvalues and the eigenvectors associated with the repeated eigenvalue problem. This work also presents a numerical technique that facilitates the definition of an eigenvector derivative for the case of repeated eigenvalues with repeated eigenvalue derivatives (of all orders). Examples are given which demonstrate the application of such equations for sensitivity and approximate analysis. Emphasis is placed on the application of sensitivity analysis to large-scale structural and controls-structures optimization problems.
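For the repeated-eigenvalue case discussed above, a standard first-order result for symmetric matrices is that the derivatives of a repeated eigenvalue are the eigenvalues of the parameter-derivative matrix projected onto the invariant subspace. A minimal sketch with a contrived 2x2 example (the function and tolerance are assumptions, not the authors' reparameterization procedure):

```python
import numpy as np

def repeated_eigenvalue_derivatives(A, dA, tol=1e-8):
    """First-order derivatives of a repeated eigenvalue of a symmetric
    matrix A with respect to one parameter, given dA = dA/dp.
    For an eigenvalue of multiplicity m with eigenvector basis V,
    the m derivatives are the eigenvalues of V.T @ dA @ V."""
    w, V = np.linalg.eigh(A)
    lam = w[0]                           # examine the smallest eigenvalue
    cluster = np.abs(w - lam) < tol      # columns sharing that eigenvalue
    Vr = V[:, cluster]
    return np.linalg.eigvalsh(Vr.T @ dA @ Vr)

# The 2x2 identity has eigenvalue 1 with multiplicity 2; perturb with diag(p, 2p)
A = np.eye(2)
dA = np.diag([1.0, 2.0])
print(repeated_eigenvalue_derivatives(A, dA))  # the two distinct derivatives: [1. 2.]
```

Note the derivatives are distinct even though the eigenvalue is repeated; this is precisely the "repeated eigenvalues with distinct first derivatives" situation the paper's approximate analysis targets.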

  5. Fluorescence-labeled methylation-sensitive amplified fragment length polymorphism (FL-MS-AFLP) analysis for quantitative determination of DNA methylation and demethylation status.

    PubMed

    Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko

    2008-04-01

    The PCR-based DNA fingerprinting method called the methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and fluorescence-detecting electrophoresis apparatus to the existing method of MS-AFLP analysis. The FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of the differences in methylation level of blood DNA of gastric cancer patients and evaluation of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.

  6. Development, sensitivity and uncertainty analysis of LASH model

    USDA-ARS?s Scientific Manuscript database

Many hydrologic models have been developed to help manage natural resources all over the world. Nevertheless, most models are highly complex in terms of database requirements and involve many calibration parameters. This has made them difficult to apply in watersheds ...

  7. Patient safety and systematic reviews: finding papers indexed in MEDLINE, EMBASE and CINAHL.

    PubMed

    Tanon, A A; Champagne, F; Contandriopoulos, A-P; Pomey, M-P; Vadeboncoeur, A; Nguyen, H

    2010-10-01

    To develop search strategies for identifying papers on patient safety in MEDLINE, EMBASE and CINAHL. Six journals were electronically searched for papers on patient safety published between 2000 and 2006. Identified papers were divided into two gold standards: one to build and the other to validate the search strategies. Candidate terms for strategy construction were identified using a word frequency analysis of titles, abstracts and keywords used to index the papers in the databases. Searches were run for each one of the selected terms independently in every database. Sensitivity, precision and specificity were calculated for each candidate term. Terms with sensitivity greater than 10% were combined to form the final strategies. The search strategies developed were run against the validation gold standard to assess their performance. A final step in the validation process was to compare the performance of each strategy to those of other strategies found in the literature. We developed strategies for all three databases that were highly sensitive (range 95%-100%), precise (range 40%-60%) and balanced (the product of sensitivity and precision being in the range of 30%-40%). The strategies were very specific and outperformed those found in the literature. The strategies we developed can meet the needs of users aiming to maximise either sensitivity or precision, or seeking a reasonable compromise between sensitivity and precision, when searching for papers on patient safety in MEDLINE, EMBASE or CINAHL.
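The performance measures used to rank candidate terms follow directly from the retrieval counts against the gold standard. A small sketch with hypothetical counts (not the study's validation figures):

```python
def filter_performance(tp, fp, fn, tn):
    """Retrieval metrics for a bibliographic search strategy:
    tp = gold-standard papers retrieved, fp = irrelevant papers retrieved,
    fn = gold-standard papers missed, tn = irrelevant papers not retrieved."""
    sensitivity = tp / (tp + fn)      # recall against the gold standard
    precision = tp / (tp + fp)        # fraction of retrieved papers that are relevant
    specificity = tn / (tn + fp)      # fraction of irrelevant papers excluded
    return sensitivity, precision, specificity

# Hypothetical counts for one candidate strategy
sens, prec, spec = filter_performance(tp=95, fp=95, fn=5, tn=9805)
print(f"sensitivity={sens:.0%} precision={prec:.0%} specificity={spec:.1%}")
```

Combining terms with OR raises sensitivity at the cost of precision, which is why the authors report both a sensitivity-maximizing strategy and a balanced one (the product of sensitivity and precision).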

  8. Sensitivity Analysis of an ENteric Immunity SImulator (ENISI)-Based Model of Immune Responses to Helicobacter pylori Infection

    PubMed Central

    Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav

    2015-01-01

    Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290

  9. Sensitivity Analysis of an ENteric Immunity SImulator (ENISI)-Based Model of Immune Responses to Helicobacter pylori Infection.

    PubMed

    Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav

    2015-01-01

    Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close "neighborhood" of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa.

  10. Global Sensitivity Analysis for Process Identification under Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.

    2015-12-01

Environmental systems consist of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, process identification has been based on deterministic process conceptualization that uses a single model to represent a process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring model uncertainty in process identification may bias the identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to Sobol sensitivity analysis for identifying important parameters, our new method evaluates the change in variance when a process is fixed at its different conceptualizations. The variance accounts for both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general and can be applied to a wide range of environmental problems.
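The variance-change reasoning above parallels the classical first-order Sobol index, which can be estimated with a pick-freeze Monte Carlo scheme. A sketch on a hypothetical additive test model (not the groundwater model of the study; the estimator form and sample size are assumptions):

```python
import numpy as np

def first_order_sobol(model, n_params, i, n=200_000, seed=0):
    """First-order Sobol index S_i via the pick-freeze estimator:
    S_i = Cov(f(A), f(B with column i taken from A)) / Var(f(A))."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, n_params))
    B = rng.uniform(size=(n, n_params))
    AB = B.copy()
    AB[:, i] = A[:, i]                  # "freeze" factor i at its A values
    yA, yAB = model(A), model(AB)
    return np.cov(yA, yAB)[0, 1] / np.var(yA, ddof=1)

# Additive test model on uniform inputs: analytic indices are 16/21, 4/21, 1/21
model = lambda X: 4 * X[:, 0] + 2 * X[:, 1] + X[:, 2]
S0 = first_order_sobol(model, n_params=3, i=0)
print(round(S0, 2))  # ~0.76 (analytic value 16/21 ~ 0.762)
```

The study's extension replaces "freezing a parameter" with "freezing a process at one of its alternative conceptualizations", so the resulting variance decomposition spans both parametric and model uncertainty.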

  11. Systematic parameter estimation and sensitivity analysis using a multidimensional PEMFC model coupled with DAKOTA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chao Yang; Luo, Gang; Jiang, Fangming

    2010-05-01

Current computational models for proton exchange membrane fuel cells (PEMFCs) include a large number of parameters such as boundary conditions, material properties, and numerous parameters used in sub-models for membrane transport, two-phase flow and electrochemistry. In order to successfully use a computational PEMFC model in design and optimization, it is important to identify critical parameters under a wide variety of operating conditions, such as relative humidity, current load, temperature, etc. Moreover, when experimental data are available in the form of polarization curves or local distributions of current and reactant/product species (e.g., O2, H2O concentrations), critical parameters can be estimated in order to enable the model to better fit the data. Sensitivity analysis and parameter estimation are typically performed using manual adjustment of parameters, which is also common in parameter studies. We present work to demonstrate a systematic approach based on a widely available toolkit developed at Sandia called DAKOTA that supports many kinds of design studies, such as sensitivity analysis as well as optimization and uncertainty quantification. In the present work, we couple a multidimensional PEMFC model (which is being developed, tested and later validated in a joint effort by a team from Penn State Univ. and Sandia National Laboratories) with DAKOTA through the mapping of model parameters to system responses. Using this interface, we demonstrate the efficiency of performing simple parameter studies as well as identifying critical parameters using sensitivity analysis. Finally, we show examples of optimization and parameter estimation using the automated capability in DAKOTA.
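The space-filling designs a toolkit like DAKOTA generates for such parameter studies can be imitated with a small Latin hypercube sampler. A sketch with hypothetical parameter bounds and a stand-in response function (not the actual PEMFC model or DAKOTA's API):

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=0):
    """Simple Latin hypercube sample over box bounds [(lo, hi), ...]:
    each dimension is split into n_samples strata, with exactly one
    point per stratum, shuffled independently per dimension."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
    u = (strata + rng.uniform(size=(n_samples, d))) / n_samples  # jitter within strata
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Hypothetical two-parameter study: map (porosity, relative humidity) to a response
design = latin_hypercube(8, bounds=[(0.3, 0.6), (0.5, 1.0)])
response = lambda p: p[:, 0] * np.exp(p[:, 1])   # stand-in for the cell model
print(design.shape, response(design).shape)
```

The "mapping of model parameters to system responses" mentioned in the abstract is exactly this kind of loop: evaluate the model at each design point, then feed the (parameters, responses) table to sensitivity analysis or optimization.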

  12. Exploring the Relationship between Reward and Punishment Sensitivity and Gambling Disorder in a Clinical Sample: A Path Modeling Analysis.

    PubMed

    Jiménez-Murcia, Susana; Fernández-Aranda, Fernando; Mestre-Bach, Gemma; Granero, Roser; Tárrega, Salomé; Torrubia, Rafael; Aymamí, Neus; Gómez-Peña, Mónica; Soriano-Mas, Carles; Steward, Trevor; Moragas, Laura; Baño, Marta; Del Pino-Gutiérrez, Amparo; Menchón, José M

    2017-06-01

Most individuals will gamble during their lifetime, yet only a select few will develop gambling disorder. Gray's Reinforcement Sensitivity Theory holds promise for providing insight into gambling disorder etiology and symptomatology, as it posits that neurobiological differences in reward and punishment sensitivity play a crucial role in determining an individual's affect and motives. The aim of the study was to assess a mediational pathway, which included patients' sex, personality traits, reward and punishment sensitivity, and gambling-severity variables. The Sensitivity to Punishment and Sensitivity to Reward Questionnaire, the South Oaks Gambling Screen, the Symptom Checklist-Revised, and the Temperament and Character Inventory-Revised were administered to a sample of gambling disorder outpatients (N = 831), diagnosed according to DSM-5 criteria, attending a specialized outpatient unit. Sociodemographic variables were also recorded. A structural equation model found that both reward and punishment sensitivity were positively and directly associated with increased gambling severity, sociodemographic variables, and certain personality traits, while also revealing a complex mediational role for these dimensions. To this end, our findings suggest that the Sensitivity to Punishment and Sensitivity to Reward Questionnaire could be a useful tool for gaining a better understanding of different gambling disorder phenotypes and developing tailored interventions.

  13. Assessment of energy and economic performance of office building models: a case study

    NASA Astrophysics Data System (ADS)

    Song, X. Y.; Ye, C. T.; Li, H. S.; Wang, X. L.; Ma, W. B.

    2016-08-01

Building energy consumption accounts for more than 37.3% of total energy consumption in China, while energy-saving buildings make up just 5% of the stock. In this paper, in order to assess the energy-saving potential, an office building in Southern China was selected as a test case for studying energy consumption characteristics. The base building model was developed in the TRNSYS software and validated against data recorded during six days of field work in August-September 2013. Sensitivity analysis was conducted for the energy performance of building envelope retrofitting; five envelope parameters were analyzed to assess the thermal responses. Results indicated that the key sensitivity factors were the heat-transfer coefficient of the exterior walls (U-wall), the infiltration rate and the shading coefficient (SC), whose combined sensitivity factor was about 89.32%. In addition, the results were evaluated in terms of energy and economic analysis. The sensitivity analysis was validated against important results of previous studies. On the other hand, the cost-effective method improved the efficiency of investment management in building energy.

  14. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, that consider carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporated a comprehensive R package—Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using the cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.

  15. LSENS: A General Chemical Kinetics and Sensitivity Analysis Code for homogeneous gas-phase reactions. Part 3: Illustrative test problems

    NASA Technical Reports Server (NTRS)

    Bittker, David A.; Radhakrishnan, Krishnan

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 3 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 3 explains the kinetics and kinetics-plus-sensitivity analysis problems supplied with LSENS and presents sample results. These problems illustrate the various capabilities of, and reaction models that can be solved by, the code and may provide a convenient starting point for the user to construct the problem data file required to execute LSENS. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions.
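The sensitivity coefficients LSENS computes, derivatives of the dependent variables with respect to rate coefficient parameters, can be illustrated on a single irreversible reaction. This is a minimal sketch using explicit Euler integration and finite differences, not LSENS's own algorithms; the rate constant and step sizes are arbitrary assumptions:

```python
def integrate_first_order(k, a0=1.0, t_end=2.0, dt=1e-4):
    """Explicit-Euler integration of a single irreversible reaction A -> B
    with rate constant k: da/dt = -k * a."""
    a = a0
    for _ in range(int(t_end / dt)):
        a += dt * (-k * a)
    return a

def rate_sensitivity(k, rel_step=1e-4):
    """Sensitivity coefficient d[A]/dk at t_end via central differences,
    the kind of quantity a kinetics-plus-sensitivity code reports."""
    h = rel_step * k
    return (integrate_first_order(k + h) - integrate_first_order(k - h)) / (2 * h)

k = 0.8
s = rate_sensitivity(k)
print(s)  # analytic value is -t * a0 * exp(-k*t) = -2 * e**-1.6, about -0.404
```

Production codes like LSENS instead integrate the sensitivity equations alongside the kinetics, which scales far better than finite differences when there are many reactions and parameters.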

  16. Polymorphisms of three genes (ACE, AGT and CYP11B2) in the renin-angiotensin-aldosterone system are not associated with blood pressure salt sensitivity: A systematic meta-analysis.

    PubMed

    Sun, Jiahong; Zhao, Min; Miao, Song; Xi, Bo

    2016-01-01

Many studies have suggested that polymorphisms of three key genes (ACE, AGT and CYP11B2) in the renin-angiotensin-aldosterone system (RAAS) play important roles in the development of blood pressure (BP) salt sensitivity, but they have yielded inconsistent results. Thus, we performed a meta-analysis to clarify the association. PubMed and Embase databases were searched for eligible published articles. Fixed- or random-effect models were used to pool odds ratios and 95% confidence intervals based on whether there was significant heterogeneity between studies. In total, seven studies [237 salt-sensitive (SS) cases and 251 salt-resistant (SR) controls] for ACE gene I/D polymorphism, three studies (130 SS cases and 221 SR controls) for AGT gene M235T polymorphism and three studies (113 SS cases and 218 SR controls) for CYP11B2 gene C344T polymorphism were included in this meta-analysis. The results showed no significant association between these three RAAS polymorphisms and BP salt sensitivity under three genetic models (all p > 0.05). The meta-analysis suggested that three polymorphisms (ACE gene I/D, AGT gene M235T, CYP11B2 gene C344T) in the RAAS have no significant effect on BP salt sensitivity.
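The fixed-effect pooling step of such a meta-analysis is a short calculation. A sketch with hypothetical study data on the log odds ratio scale (illustrative numbers, not the studies pooled in this meta-analysis):

```python
import math

def pool_fixed_effect(log_ors, variances):
    """Inverse-variance fixed-effect pooling of study log odds ratios,
    as used when between-study heterogeneity is low. Returns the pooled
    OR and its 95% confidence interval on the OR scale."""
    weights = [1 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return math.exp(pooled), tuple(math.exp(c) for c in ci)

# Hypothetical three-study example: log ORs with their variances
or_pooled, (lo, hi) = pool_fixed_effect([0.10, -0.05, 0.20], [0.04, 0.09, 0.06])
print(round(or_pooled, 2), round(lo, 2), round(hi, 2))
```

A confidence interval that straddles OR = 1 (as here) corresponds to the "no significant association" verdict the abstract reports; random-effects pooling widens the interval further when heterogeneity is present.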

  17. Differential metabolome analysis of field-grown maize kernels in response to drought stress

    USDA-ARS?s Scientific Manuscript database

    Drought stress constrains maize kernel development and can exacerbate aflatoxin contamination. In order to identify drought responsive metabolites and explore pathways involved in kernel responses, a metabolomics analysis was conducted on kernels from a drought tolerant line, Lo964, and a sensitive ...

  18. Wavelet analysis enables system-independent texture analysis of optical coherence tomography images.

    PubMed

    Lingley-Papadopoulos, Colleen A; Loew, Murray H; Zara, Jason M

    2009-01-01

    Texture analysis for tissue characterization is a current area of optical coherence tomography (OCT) research. We discuss some of the differences between OCT systems and the effects those differences have on the resulting images and subsequent image analysis. In addition, as an example, two algorithms for the automatic recognition of bladder cancer are compared: one that was developed on a single system with no consideration for system differences, and one that was developed to address the issues associated with system differences. The first algorithm had a sensitivity of 73% and specificity of 69% when tested using leave-one-out cross-validation on data taken from a single system. When tested on images from another system with a different central wavelength, however, the method classified all images as cancerous regardless of the true pathology. By contrast, with the use of wavelet analysis and the removal of system-dependent features, the second algorithm reported sensitivity and specificity values of 87 and 58%, respectively, when trained on images taken with one imaging system and tested on images taken with another.

  19. Wavelet analysis enables system-independent texture analysis of optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Lingley-Papadopoulos, Colleen A.; Loew, Murray H.; Zara, Jason M.

    2009-07-01

    Texture analysis for tissue characterization is a current area of optical coherence tomography (OCT) research. We discuss some of the differences between OCT systems and the effects those differences have on the resulting images and subsequent image analysis. In addition, as an example, two algorithms for the automatic recognition of bladder cancer are compared: one that was developed on a single system with no consideration for system differences, and one that was developed to address the issues associated with system differences. The first algorithm had a sensitivity of 73% and specificity of 69% when tested using leave-one-out cross-validation on data taken from a single system. When tested on images from another system with a different central wavelength, however, the method classified all images as cancerous regardless of the true pathology. By contrast, with the use of wavelet analysis and the removal of system-dependent features, the second algorithm reported sensitivity and specificity values of 87 and 58%, respectively, when trained on images taken with one imaging system and tested on images taken with another.

  20. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  1. Development of a High-Content Image Analysis Method for Quantifying Synaptic Contacts in Rodent Primary Neuronal Cultures

    EPA Science Inventory

    Development of the nervous system occurs through a series of critical processes, each of which may be sensitive to disruption by environmental contaminants. In vitro culture of neurons can be used to model these processes and evaluate the potential of chemicals to act as develop...

  2. On Learning Cluster Coefficient of Private Networks

    PubMed Central

    Wang, Yue; Wu, Xintao; Zhu, Jun; Xiang, Yang

    2013-01-01

Enabling accurate analysis of social network data while preserving differential privacy has been challenging since graph features such as clustering coefficient or modularity often have high sensitivity, which is different from traditional aggregate functions (e.g., count and sum) on tabular data. In this paper, we treat a graph statistic as a function f and develop a divide-and-conquer approach to enforce differential privacy. The basic procedure of this approach is to first decompose the target computation f into several less complex unit computations f1, …, fm connected by basic mathematical operations (e.g., addition, subtraction, multiplication, division), then perturb the output of each fi with Laplace noise derived from its own sensitivity value and the distributed privacy threshold εi, and finally combine those perturbed fi as the perturbed output of computation f. We examine how various operations affect the accuracy of complex computations. When unit computations have large global sensitivity values, we enforce the differential privacy by calibrating noise based on the smooth sensitivity, rather than the global sensitivity. By doing this, we achieve the strict differential privacy guarantee with smaller magnitude noise. We illustrate our approach by using the clustering coefficient, which is a popular statistic used in social network analysis. Empirical evaluations on five real social networks and various synthetic graphs generated from three random graph models show that the developed divide-and-conquer approach outperforms the direct approach. PMID:24429843
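The Laplace perturbation step of the divide-and-conquer procedure can be sketched directly from its description. The unit computations, per-part sensitivities, and privacy budgets below are hypothetical; combining by addition is the simplest of the operations the paper considers:

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release value + Laplace(sensitivity / epsilon) noise, the standard
    mechanism for epsilon-differential privacy."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(scale=sensitivity / epsilon)

def private_sum(parts, sensitivities, epsilons, rng=None):
    """Divide-and-conquer sketch: perturb each unit computation f_i with
    noise calibrated to its own sensitivity and privacy share, then combine.
    For addition the budgets compose: total epsilon = sum(epsilons)."""
    rng = rng or np.random.default_rng()
    return sum(laplace_mechanism(p, s, e, rng)
               for p, s, e in zip(parts, sensitivities, epsilons))

# Hypothetical unit-computation outputs with per-part sensitivities and budgets
rng = np.random.default_rng(7)
noisy = private_sum(parts=[120.0, 45.0], sensitivities=[1.0, 2.0],
                    epsilons=[0.5, 0.5], rng=rng)
print(noisy)  # close to the true 165, perturbed by Laplace noise
```

Division and multiplication propagate noise much less gracefully than addition, which is the accuracy question the paper studies; smooth sensitivity replaces the global sensitivity in the scale when the latter is too large.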

  3. Retrieval of overviews of systematic reviews in MEDLINE was improved by the development of an objectively derived and validated search strategy.

    PubMed

    Lunny, Carole; McKenzie, Joanne E; McDonald, Steve

    2016-06-01

    Locating overviews of systematic reviews is difficult because of an absence of appropriate indexing terms and inconsistent terminology used to describe overviews. Our objective was to develop a validated search strategy to retrieve overviews in MEDLINE. We derived a test set of overviews from the references of two method articles on overviews. Two population sets were used to identify discriminating terms, that is, terms that appear frequently in the test set but infrequently in two population sets of references found in MEDLINE. We used text mining to conduct a frequency analysis of terms appearing in the titles and abstracts. Candidate terms were combined and tested in MEDLINE in various permutations, and the performance of strategies measured using sensitivity and precision. Two search strategies were developed: a sensitivity-maximizing strategy, achieving 93% sensitivity (95% confidence interval [CI]: 87, 96) and 7% precision (95% CI: 6, 8), and a sensitivity-and-precision-maximizing strategy, achieving 66% sensitivity (95% CI: 58, 74) and 21% precision (95% CI: 17, 25). The developed search strategies enable users to more efficiently identify overviews of reviews compared to current strategies. Consistent language in describing overviews would aid in their identification, as would a specific MEDLINE Publication Type. Copyright © 2015 Elsevier Inc. All rights reserved.
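
    Scoring a candidate strategy reduces to two set operations; a minimal sketch with invented record IDs whose counts happen to mirror the 93%/7% figures quoted above:

```python
def sensitivity_precision(retrieved, relevant):
    # Sensitivity (recall): share of relevant records the strategy retrieves.
    # Precision: share of retrieved records that are relevant.
    tp = len(retrieved & relevant)
    return tp / len(relevant), tp / len(retrieved)

# Hypothetical MEDLINE record IDs: 100 known overviews, and a broad strategy
# that catches 93 of them plus 1200 irrelevant hits.
relevant = set(range(100))
retrieved = set(range(93)) | set(range(1000, 2200))
sens, prec = sensitivity_precision(retrieved, relevant)
print(f"sensitivity={sens:.2f}, precision={prec:.2f}")
```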

  4. Assessment of cognitive safety in clinical drug development

    PubMed Central

    Roiser, Jonathan P.; Nathan, Pradeep J.; Mander, Adrian P.; Adusei, Gabriel; Zavitz, Kenton H.; Blackwell, Andrew D.

    2016-01-01

    Cognitive impairment is increasingly recognised as an important potential adverse effect of medication. However, many drug development programmes do not incorporate sensitive cognitive measurements. Here, we review the rationale for cognitive safety assessment, and explain several basic methodological principles for measuring cognition during clinical drug development, including study design and statistical analysis, from Phase I through to postmarketing. The crucial issue of how cognition should be assessed is emphasized, especially the sensitivity of measurement. We also consider how best to interpret the magnitude of any identified effects, including comparison with benchmarks. We conclude by discussing strategies for the effective communication of cognitive risks. PMID:26610416

  5. Evaluation of an existing screening tool for psoriatic arthritis in people with psoriasis and the development of a new instrument: the Psoriasis Epidemiology Screening Tool (PEST) questionnaire.

    PubMed

    Ibrahim, G H; Buch, M H; Lawson, C; Waxman, R; Helliwell, P S

    2009-01-01

    To evaluate an existing tool (the Swedish modification of the Psoriasis Assessment Questionnaire) and to develop a new instrument to screen for psoriatic arthritis in people with psoriasis. The starting point was a community-based survey of people with psoriasis using questionnaires developed from the literature. Selected respondents were examined, and additional known cases of psoriatic arthritis were included in the analysis. The new instrument was developed using univariate statistics and a logistic regression model, comparing people with and without psoriatic arthritis. The instruments were compared using receiver operating characteristic (ROC) curve analysis. 168 questionnaires were returned (response rate 27%) and 93 people attended for examination (55% of questionnaire respondents). Of these 93, twelve were newly diagnosed with psoriatic arthritis during this study; they were supplemented by 21 people with known psoriatic arthritis. Just five questions were found to be significant predictors of psoriatic arthritis in this population. Sensitivity and specificity were 0.92 and 0.78 respectively, an improvement on the Alenius tool (sensitivity and specificity, 0.63 and 0.72 respectively). A new screening tool for identifying people with psoriatic arthritis has been developed. Five simple questions demonstrated good sensitivity and specificity in this population, but further validation is required.

  6. Predicting chemically-induced skin reactions. Part II: QSAR models of skin permeability and the relationships between skin permeability and skin sensitization

    PubMed Central

    Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander

    2015-01-01

    Skin permeability is widely considered to be mechanistically implicated in chemically-induced skin sensitization. Although many chemicals have been identified as skin sensitizers, there have been very few reports analyzing the relationships between molecular structure and skin permeability of sensitizers and non-sensitizers. The goals of this study were to: (i) compile, curate, and integrate the largest publicly available dataset of chemicals studied for their skin permeability; (ii) develop and rigorously validate QSAR models to predict skin permeability; and (iii) explore the complex relationships between skin sensitization and skin permeability. Based on the largest publicly available dataset compiled in this study, we found no overall correlation between skin permeability and skin sensitization. In addition, cross-species correlation coefficient between human and rodent permeability data was found to be as low as R2=0.44. Human skin permeability models based on the random forest method have been developed and validated using OECD-compliant QSAR modeling workflow. Their external accuracy was high (Q2ext = 0.73 for 63% of external compounds inside the applicability domain). The extended analysis using both experimentally-measured and QSAR-imputed data still confirmed the absence of any overall concordance between skin permeability and skin sensitization. This observation suggests that chemical modifications that affect skin permeability should not be presumed a priori to modulate the sensitization potential of chemicals. The models reported herein as well as those developed in the companion paper on skin sensitization suggest that it may be possible to rationally design compounds with the desired high skin permeability but low sensitization potential. PMID:25560673
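
    The cross-species comparison boils down to a squared Pearson correlation; a stdlib-only sketch with hypothetical log Kp values (the numbers below are invented, not the study's data):

```python
def r_squared(x, y):
    # Squared Pearson correlation between two permeability series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical log Kp values (cm/h) for the same chemicals in both species
human  = [-2.1, -1.5, -3.0, -2.4, -1.8, -2.9, -2.2, -1.2]
rodent = [-1.9, -2.6, -2.8, -1.7, -1.6, -2.5, -3.1, -1.4]
print(round(r_squared(human, rodent), 2))
```

    A low R² like the 0.44 reported above is why rodent permeability data cannot simply stand in for human data when building the QSAR models.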

  7. Towards a Completely Implantable, Light-Sensitive Intraocular Retinal Prosthesis

    DTIC Science & Technology

    2001-10-25

    electronic retinal prosthesis is under development to treat retinitis pigmentosa and age-related macular degeneration, two presently incurable... "Preservation of the inner retina in retinitis pigmentosa. A morphometric analysis," Arch Ophthalmol, vol. 115, no. 4, pp. 511-515, Apr. 1997... Towards a completely implantable, light-sensitive intraocular retinal prosthesis. M.S. Humayun, J.D. Weiland, B. Justus, C. Merrit, J. Whalen, D

  8. Modeling and Error Analysis of a Superconducting Gravity Gradiometer.

    DTIC Science & Technology

    1979-08-01

    fundamental limit to instrument sensitivity is the thermal noise of the sensor. For the gradiometer design outlined above, the best sensitivity... Mapoles at Stanford. Chapter IV determines the relation between dynamic range, the sensor Q, and the thermal noise of the cryogenic accelerometer. An... C.1 Accelerometer Optimization (1) Development and optimization of the loaded diaphragm sensor. (2) Determination of the optimal values of the

  9. Goal-oriented sensitivity analysis for lattice kinetic Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Georgios, E-mail: garab@math.uoc.gr; Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003; Katsoulakis, Markos A., E-mail: markos@math.umass.edu

    2014-03-28

    In this paper we propose a new class of coupling methods for the sensitivity analysis of high-dimensional stochastic systems, and in particular for lattice kinetic Monte Carlo (KMC). Sensitivity analysis for stochastic systems is typically based on approximating continuous derivatives with respect to model parameters by the mean value of samples from a finite difference scheme. Instead of using independent samples, the proposed algorithm reduces the variance of the estimator by developing a strongly correlated ("coupled") stochastic process for both the perturbed and unperturbed stochastic processes, defined in a common state space. The novelty of our construction is that the new coupled process depends on the targeted observables, e.g., coverage, Hamiltonian, spatial correlations, surface roughness, etc.; hence we refer to the proposed method as goal-oriented sensitivity analysis. In particular, the rates of the coupled continuous-time Markov chain are obtained as solutions to a goal-oriented optimization problem, depending on the observable of interest, by minimizing the corresponding variance functional. We show that this functional can be used as a diagnostic tool for the design and evaluation of different classes of couplings. Furthermore, the resulting KMC sensitivity algorithm has an easy implementation based on the philosophy of the Bortz-Kalos-Lebowitz algorithm, where events are divided into classes depending on level sets of the observable of interest. Finally, we demonstrate in several examples, including adsorption, desorption, and diffusion kinetic Monte Carlo, that for the same confidence interval and observable the proposed goal-oriented algorithm can be two orders of magnitude faster than existing coupling algorithms for spatial KMC such as the Common Random Number approach. We also provide a complete implementation of the proposed sensitivity analysis algorithms, including various spatial KMC examples, in supplementary MATLAB source code.
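
    The variance-reduction idea, estimating the same finite difference from a correlated pair of trajectories instead of independent ones, can be illustrated with the Common Random Number coupling that the paper benchmarks against, applied to a toy jump process (everything below is a simplified stand-in, not lattice KMC):

```python
import random

def simulate(theta, rng):
    # Toy jump process: 100 steps, an event fires with probability theta;
    # the observable is the event count (a stand-in for e.g. coverage).
    return sum(1 for _ in range(100) if rng.random() < theta)

def fd_sensitivity(theta, h, n, coupled):
    # Finite-difference estimator of d E[X]/d theta from n sample pairs.
    # coupled=True drives both runs with the same random stream (CRN),
    # so most of the noise cancels in the difference.
    diffs = []
    for i in range(n):
        if coupled:
            r1, r2 = random.Random(i), random.Random(i)
        else:
            r1, r2 = random.Random(2 * i), random.Random(2 * i + 1)
        diffs.append((simulate(theta + h, r2) - simulate(theta, r1)) / h)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean, var

mean_ind, var_ind = fd_sensitivity(0.3, 0.05, 200, coupled=False)
mean_crn, var_crn = fd_sensitivity(0.3, 0.05, 200, coupled=True)
print(f"independent var={var_ind:.0f}, coupled var={var_crn:.0f}")
```

    The goal-oriented couplings of the paper go further than CRN by tailoring the coupled rates to the observable, but the mechanism of variance reduction is the same: make the two trajectories move together so their difference is nearly deterministic.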

  10. A Small Range Six-Axis Accelerometer Designed with High Sensitivity DCB Elastic Element

    PubMed Central

    Sun, Zhibo; Liu, Jinhao; Yu, Chunzhan; Zheng, Yili

    2016-01-01

    This paper describes a small range six-axis accelerometer (the measurement range of the sensor is ±g) with a high sensitivity DCB (Double Cantilever Beam) elastic element. The sensor is developed on the basis of a parallel mechanism because of its reliability. The accuracy of such sensors is governed by their sensitivity characteristics. To improve the sensitivity, a DCB structure is applied as the elastic element. Through dynamic analysis, the dynamic model of the accelerometer is established using the Lagrange equation, and the mass matrix and stiffness matrix are obtained by a partial derivative calculation and a conservative congruence transformation, respectively. By simplifying the structure of the accelerometer, a model of the free vibration is achieved, and the parameters of the sensor are designed based on the model. Through stiffness analysis of the DCB structure, the deflection curve of the beam is calculated. Compared with the result obtained using a finite element analysis simulation in ANSYS Workbench, the coincidence rate of the maximum deflection is 89.0% along the x-axis, 88.3% along the y-axis and 87.5% along the z-axis. Through strain analysis of the DCB elastic element, the sensitivity of the beam is obtained. According to the experimental results, the accuracy of the theoretical analysis is 90.4% along the x-axis, 74.9% along the y-axis and 78.9% along the z-axis. The measurement errors of the linear accelerations ax, ay and az in the experiments are 2.6%, 0.6% and 1.31%, respectively. The experiments prove that the accelerometer with the DCB elastic element offers excellent sensitivity and precision. PMID:27657089

  11. Domain decomposition for aerodynamic and aeroacoustic analyses, and optimization

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1995-01-01

    The overarching theme was domain decomposition, which was intended to improve the numerical solution technique for the partial differential equations at hand; in the present study, those that governed either the fluid flow, the aeroacoustic wave propagation, or the sensitivity analysis for a gradient-based optimization. The role of the domain decomposition extended beyond the original impetus of discretizing geometrically complex regions or writing modular software for distributed-hardware computers. It induced function-space decompositions and operator decompositions that offered the valuable property of near independence of operator evaluation tasks. The objectives centered on the extensions and implementations of methodologies either previously developed or concurrently being developed: (1) aerodynamic sensitivity analysis with domain decomposition (SADD); (2) computational aeroacoustics of cavities; and (3) dynamic, multibody computational fluid dynamics using unstructured meshes.

  12. Sensitive magnetic sensors without cooling in biomedical engineering.

    PubMed

    Nowak, H; Strähmel, E; Giessler, F; Rinneberg, G; Haueisen, J

    2003-01-01

    Magnetic field sensors are used in many fields of technology. In the past few years a large variety of magnetic field sensors has been established and their performance has improved enormously. In this review article, recent developments in sensitive magnetic field sensors (resolution better than 1 nT) are presented and examined with regard to their parameters, mainly from the standpoint of application in biomedical engineering. A comparison of the commercially available sensitive magnetic field sensors shows current and prospective ranges of application.

  13. Sensitive Determination of Onco-metabolites of D- and L-2-hydroxyglutarate Enantiomers by Chiral Derivatization Combined with Liquid Chromatography/Mass Spectrometry Analysis

    PubMed Central

    Cheng, Qing-Yun; Xiong, Jun; Huang, Wei; Ma, Qin; Ci, Weimin; Feng, Yu-Qi; Yuan, Bi-Feng

    2015-01-01

    2-Hydroxyglutarate (2HG) is a potent competitor of α-ketoglutarate (α-KG) and can inhibit multiple α-KG dependent dioxygenases that function in epigenetic modification. The accumulation of 2HG contributes to an elevated risk of malignant tumors. 2HG carries an asymmetric carbon atom in its backbone, and differentiation between D-2-hydroxyglutarate (D-2HG) and L-2-hydroxyglutarate (L-2HG) is crucially important for accurate diagnosis of 2HG-related diseases. Here we developed a strategy of chiral derivatization combined with liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) analysis for highly sensitive determination of the D-2HG and L-2HG enantiomers. N-(p-toluenesulfonyl)-L-phenylalanyl chloride (TSPC) was used to derivatize 2HG. The diastereomers formed by TSPC labeling efficiently improve the chromatographic separation of D-2HG and L-2HG, and derivatization by TSPC also markedly increased the detection sensitivities, by 291-fold and 346-fold for D-2HG and L-2HG, respectively. Using the developed method, we measured the contents of D-2HG and L-2HG in clear cell renal cell carcinoma (ccRCC) tissues and observed 12.9-fold and 29.8-fold increases of D-2HG and L-2HG, respectively, in human ccRCC tissues compared to adjacent normal tissues. The developed chiral derivatization combined with LC-ESI-MS/MS analysis offers sensitive determination of the D-2HG and L-2HG enantiomers, which benefits the precise diagnosis of 2HG-related metabolic diseases. PMID:26458332

  14. Analysis of the NAEG model of transuranic radionuclide transport and dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kercher, J.R.; Anspaugh, L.R.

    We analyze the model for estimating the dose from /sup 239/Pu developed for the Nevada Applied Ecology Group (NAEG) by using sensitivity analysis and uncertainty analysis. Sensitivity analysis results suggest that the air pathway is the critical pathway for the organs receiving the highest dose. Soil concentration and the factors controlling air concentration are the most important parameters. The only organ whose dose is sensitive to parameters in the ingestion pathway is the GI tract, which receives 95% of its dose via ingestion; the air pathway accounts for 100% of the dose to the lung, upper respiratory tract, and thoracic lymph nodes. Leafy vegetable ingestion accounts for 70% of the dose from the ingestion pathway regardless of organ, peeled vegetables 20%, accidental soil ingestion 5%, ingestion of beef liver 4%, and beef muscle 1%. Only a handful of model parameters control the dose for any one organ; the number of important parameters is usually less than 10. Uncertainty analysis indicates that choosing a uniform distribution for the input parameters produces a lognormal distribution of the dose. The ratio of the square root of the variance to the mean is three times greater for the doses than it is for the individual parameters. As found by the sensitivity analysis, the uncertainty analysis suggests that only a few parameters control the dose for each organ. All organs have similar distributions and variance-to-mean ratios except for the lymph nodes. 16 references, 9 figures, 13 tables.
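
    The uncertainty-analysis observation (uniform inputs propagated through a multiplicative pathway model yield a roughly lognormal dose whose relative spread exceeds that of any single parameter) can be illustrated with a stdlib Monte Carlo sketch; the four-factor model and its ranges below are hypothetical, not the NAEG model's parameters.

```python
import math
import random
import statistics

def dose(c_soil, resusp, breathing, dcf):
    # Toy multiplicative pathway: soil concentration x resuspension factor
    # x breathing rate x dose-conversion factor (all normalized to 1)
    return c_soil * resusp * breathing * dcf

random.seed(1)
samples = [dose(*(random.uniform(0.5, 1.5) for _ in range(4)))
           for _ in range(5000)]

cv_input = (1 / math.sqrt(12)) / 1.0   # CV of one U(0.5, 1.5) input
cv_dose = statistics.stdev(samples) / statistics.fmean(samples)
print(f"input CV={cv_input:.2f}, dose CV={cv_dose:.2f}")
```

    The product of several symmetric inputs is skewed toward a lognormal shape, and its coefficient of variation is inflated relative to the inputs', qualitatively mirroring the roughly threefold inflation reported above.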

  15. Optimal design of an electro-hydraulic valve for heavy-duty vehicle clutch actuator with certain constraints

    NASA Astrophysics Data System (ADS)

    Meng, Fei; Shi, Peng; Karimi, Hamid Reza; Zhang, Hui

    2016-02-01

    The main objective of this paper is to investigate the sensitivity analysis and optimal design of a proportional solenoid valve (PSV) operated pressure reducing valve (PRV) for heavy-duty automatic transmission clutch actuators. The nonlinear electro-hydraulic valve model is developed based on fluid dynamics. In order to implement the sensitivity analysis and optimization for the PRV, the PSV model is validated by comparing the results with data obtained from a real test-bench. The sensitivity of the PSV pressure response with regard to the structural parameters is investigated by using Sobol's method. Finally, simulations and experimental investigations are performed on the optimized prototype and the results reveal that the dynamical characteristics of the valve have been improved in comparison with the original valve.
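
    Sobol's method, which the study uses to rank the structural parameters, decomposes output variance into per-parameter shares; below is a pure-Python pick-freeze sketch on a hypothetical valve response (the model and its coefficients are invented, not the PSV model).

```python
import random

def valve_response(x1, x2, x3):
    # Hypothetical response: x1 dominates, x3 is nearly inert
    return 4.0 * x1 + 1.0 * x2 + 0.1 * x3

def sobol_first_order(model, dim, n=20000, seed=0):
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(*a) for a in A]
    yB = [model(*b) for b in B]
    f0 = sum(yA) / n
    var = sum(y * y for y in yA) / n - f0 * f0
    S = []
    for i in range(dim):
        # "Pick-freeze": matrix B with column i copied from A
        yABi = [model(*(b[:i] + [a[i]] + b[i + 1:])) for a, b in zip(A, B)]
        # First-order index S_i = Cov(yA, yABi) / Var(y)
        Vi = sum(ya * (yab - yb) for ya, yab, yb in zip(yA, yABi, yB)) / n
        S.append(Vi / var)
    return S

S = sobol_first_order(valve_response, 3)
print([round(s, 2) for s in S])  # x1 should dominate
```

    For an additive model the first-order indices sum to about 1; interactions between parameters would show up as a shortfall in that sum.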

  16. GC/FT-IR ANALYSIS OF THE THERMALLY LABILE COMPOUND TRIS (2,3-DIBROMOPROPYL) PHOSPHATE

    EPA Science Inventory

    A fast and convenient GC method has been developed for a compound [tris(2,3-dibromopropyl)phosphate] that poses a difficult analytical problem for both GC (thermal instability/low volatility) and LC (not amenable to commonly available, sensitive detectors) analysis. This method em...

  17. Analysis techniques for multivariate root loci. [a tool in linear control systems

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

    Analysis and techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.

  18. Analysis of ecologically relevant pharmaceuticals in wastewater and surface water using selective solid phase extraction and UPLC/MS/MS

    EPA Science Inventory

    A rapid and sensitive method has been developed for the analysis of 48 human prescription active pharmaceutical ingredients (APIs) and 6 metabolites of interest, utilizing selective solid-phase extraction (SPE) and ultra performance liquid chromatography in combination with tripl...

  19. Monoallelic mutation analysis (MAMA) for identifying germline mutations.

    PubMed

    Papadopoulos, N; Leach, F S; Kinzler, K W; Vogelstein, B

    1995-09-01

    Dissection of germline mutations in a sensitive and specific manner presents a continuing challenge. In dominantly inherited diseases, mutations occur in only one allele and are often masked by the normal allele. Here we report the development of a sensitive and specific diagnostic strategy based on somatic cell hybridization termed MAMA (monoallelic mutation analysis). We have demonstrated the utility of this strategy in two different hereditary colorectal cancer syndromes, one caused by a defective tumour suppressor gene on chromosome 5 (familial adenomatous polyposis, FAP) and the other caused by a defective mismatch repair gene on chromosome 2 (hereditary non-polyposis colorectal cancer, HNPCC).

  20. Program Helps To Determine Chemical-Reaction Mechanisms

    NASA Technical Reports Server (NTRS)

    Bittker, D. A.; Radhakrishnan, K.

    1995-01-01

    General Chemical Kinetics and Sensitivity Analysis (LSENS) computer code developed for use in solving complex, homogeneous, gas-phase, chemical-kinetics problems. Provides efficient and accurate chemical-kinetics computations and sensitivity analysis for a variety of problems, including problems involving nonisothermal conditions. Incorporates mathematical models for static system, steady one-dimensional inviscid flow, reaction behind incident shock wave (with boundary-layer correction), and perfectly stirred reactor. Computations of equilibrium properties performed for following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. Written in FORTRAN 77 with exception of NAMELIST extensions used for input.

  1. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    PubMed Central

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289

  2. Nanomaterials for Electrochemical Immunosensing

    PubMed Central

    Pan, Mingfei; Gu, Ying; Yun, Yaguang; Li, Min; Jin, Xincui; Wang, Shuo

    2017-01-01

    Electrochemical immunosensors resulting from a combination of the traditional immunoassay approach with modern biosensors and electrochemical analysis constitute a current research hotspot. They exhibit both the high selectivity characteristics of immunoassays and the high sensitivity of electrochemical analysis, along with other merits such as small volume, convenience, low cost, simple preparation, and real-time on-line detection, and have been widely used in the fields of environmental monitoring, medical clinical trials and food analysis. Notably, the rapid development of nanotechnology and the wide application of nanomaterials have provided new opportunities for the development of high-performance electrochemical immunosensors. Various nanomaterials with different properties can effectively solve issues such as the immobilization of biological recognition molecules, enrichment and concentration of trace analytes, and signal detection and amplification to further enhance the stability and sensitivity of the electrochemical immunoassay procedure. This review introduces the working principles and development of electrochemical immunosensors based on different signals, along with new achievements and progress related to electrochemical immunosensors in various fields. The importance of various types of nanomaterials for improving the performance of electrochemical immunosensor is also reviewed to provide a theoretical basis and guidance for the further development and application of nanomaterials in electrochemical immunosensors. PMID:28475158

  3. Cognitive capital, equity and child-sensitive social protection in Asia and the Pacific.

    PubMed

    Samson, Michael; Fajth, Gaspar; François, Daphne

    2016-01-01

    Promoting child development and welfare delivers human rights and builds sustainable economies through investment in 'cognitive capital'. This analysis looks at conditions that support optimal brain development in childhood and highlights how social protection promotes these conditions and strengthens the achievement of the Sustainable Development Goals (SDGs) in Asia and the Pacific. Embracing child-sensitive social protection offers multiple benefits. The region has been a leader in global poverty reduction, but the underlying pattern of economic growth exacerbates inequality and is increasingly unsustainable. The strategy of channelling low-skilled rural labour to industrial jobs left millions of children behind with limited opportunities for development. Building child-sensitive social protection and investing better in children's cognitive capacity could check these trends and trigger powerful long-term human capital development, enabling labour productivity to grow faster than populations age. While governments are investing more in social protection, the region's spending remains low by international comparison. Investment is particularly inadequate where it yields the highest returns: during the first 1000 days of life. Five steps are recommended for moving forward: (1) build cognitive capital by adjusting the region's development paradigms to better reflect the economic and social returns from investing in children; (2) better understand and track child poverty and vulnerability; (3) progressively build universal, child-sensitive systems that strengthen comprehensive interventions within life-cycle frameworks; (4) mobilise national resources for early childhood investments and child-sensitive social protection; and (5) leverage the SDGs and other channels of national and international collaboration.

  4. Use of a Smartphone as a Colorimetric Analyzer in Paper-based Devices for Sensitive and Selective Determination of Mercury in Water Samples.

    PubMed

    Jarujamrus, Purim; Meelapsom, Rattapol; Pencharee, Somkid; Obma, Apinya; Amatatongchai, Maliwan; Ditcharoen, Nadh; Chairam, Sanoe; Tamuang, Suparb

    2018-01-01

    A smartphone application, called CAnal, was developed as a colorimetric analyzer in paper-based devices for sensitive and selective determination of mercury(II) in water samples. Measurement on the double layer of a microfluidic paper-based analytical device (μPAD), fabricated by an alkyl ketene dimer (AKD) inkjet-printing technique with a special design and doped with unmodified silver nanoparticles (AgNPs) in the detection zones, was performed by monitoring the gray intensity in the blue channel of the AgNPs, which disintegrate when exposed to mercury(II) on the μPAD. Under the optimized conditions, the developed approach showed high sensitivity, a low limit of detection (0.003 mg/L, calculated as 3 × SD of the blank divided by the slope of the calibration curve), small sample volume uptake (2 × 2 μL), and short analysis time. The linear range of this technique was 0.01 to 10 mg/L (r2 = 0.993). Furthermore, practical analysis of various water samples demonstrated acceptable performance in agreement with data from cold vapor atomic absorption spectrophotometry (CV-AAS), a conventional method. The proposed technique allows rapid, simple (instant report of the final mercury(II) concentration via the smartphone display), sensitive, selective, and on-site analysis with high sample throughput (48 samples/h, n = 3) of trace mercury(II) in water samples, and is suitable for end users unskilled in analyzing mercury(II) in water samples.
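
    The quoted detection limit follows the standard 3·SD(blank)/slope convention; a minimal sketch with invented gray-intensity readings (the calibration data below are placeholders, not the paper's measurements):

```python
import statistics

def linear_fit(x, y):
    # Least-squares slope and intercept for the calibration curve
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical gray-intensity readings vs. Hg(II) standards (mg/L)
conc   = [0.01, 0.1, 0.5, 1.0, 5.0, 10.0]
signal = [0.9, 8.8, 44.5, 90.2, 448.0, 901.0]
blanks = [0.10, 0.14, 0.08, 0.12, 0.09, 0.11, 0.13, 0.10, 0.12, 0.11]

slope, intercept = linear_fit(conc, signal)
lod = 3 * statistics.stdev(blanks) / slope   # LOD = 3 * SD(blank) / slope
print(f"slope={slope:.1f}, LOD={lod:.4f} mg/L")
```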

  5. Novel approach based on one-tube nested PCR and a lateral flow strip for highly sensitive diagnosis of tuberculous meningitis.

    PubMed

    Sun, Yajuan; Chen, Jiajun; Li, Jia; Xu, Yawei; Jin, Hui; Xu, Na; Yin, Rui; Hu, Guohua

    2017-01-01

    Rapid and sensitive detection of Mycobacterium tuberculosis (M. Tb) in cerebrospinal fluid is crucial in the diagnosis of tuberculous meningitis (TBM), but conventional diagnostic technologies have limited sensitivity and specificity or are time-consuming. In this work, a novel, highly sensitive molecular diagnostic method, one-tube nested PCR-lateral flow strip test (OTNPCR-LFST), was developed for detecting M. tuberculosis. This one-tube nested PCR maintains the sensitivity of conventional two-step nested PCR and reduces both the chance of cross-contamination and the time required for analysis. The PCR product was detected by a lateral flow strip assay, which provided a basis for migration of the test to a point-of-care (POC) microfluidic format. The developed assay had an improved sensitivity compared with traditional PCR, and the limit of detection was up to 1 fg DNA isolated from M. tuberculosis. The assay was also specific for M. tuberculosis, and no cross-reactions were found in other non-target bacteria. The application of this technique to clinical samples was successfully evaluated, and OTNPCR-LFST showed 89% overall sensitivity and 100% specificity for TBM patients. This one-tube nested PCR-lateral flow strip assay is useful for detecting M. tuberculosis in TBM due to its rapidity, high sensitivity and simple manipulation.

  6. Finite Element Model Calibration Approach for Ares I-X

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Buehrle, Ralph D.; Templeton, Justin D.; Gaspar, James L.; Lazor, Daniel R.; Parks, Russell A.; Bartolotta, Paul A.

    2010-01-01

    Ares I-X is a pathfinder vehicle concept under development by NASA to demonstrate a new class of launch vehicles. Although this vehicle is essentially a shell of what the Ares I vehicle will be, efforts are underway to model and calibrate the analytical models before its maiden flight. Work reported in this document will summarize the model calibration approach used including uncertainty quantification of vehicle responses and the use of non-conventional boundary conditions during component testing. Since finite element modeling is the primary modeling tool, the calibration process uses these models, often developed by different groups, to assess model deficiencies and to update parameters to reconcile test with predictions. Data for two major component tests and the flight vehicle are presented along with the calibration results. For calibration, sensitivity analysis is conducted using Analysis of Variance (ANOVA). To reduce the computational burden associated with ANOVA calculations, response surface models are used in lieu of computationally intensive finite element solutions. From the sensitivity studies, parameter importance is assessed as a function of frequency. In addition, the work presents an approach to evaluate the probability that a parameter set exists to reconcile test with analysis. Comparisons of pretest predictions of frequency response uncertainty bounds with measured data, results from the variance-based sensitivity analysis, and results from component test models with calibrated boundary stiffness models are all presented.

  7. Finite Element Model Calibration Approach for Ares I-X

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Buehrle, Ralph D.; Templeton, Justin D.; Lazor, Daniel R.; Gaspar, James L.; Parks, Russel A.; Bartolotta, Paul A.

    2010-01-01

    Ares I-X is a pathfinder vehicle concept under development by NASA to demonstrate a new class of launch vehicles. Although this vehicle is essentially a shell of what the Ares I vehicle will be, efforts are underway to model and calibrate the analytical models before its maiden flight. Work reported in this document will summarize the model calibration approach used including uncertainty quantification of vehicle responses and the use of nonconventional boundary conditions during component testing. Since finite element modeling is the primary modeling tool, the calibration process uses these models, often developed by different groups, to assess model deficiencies and to update parameters to reconcile test with predictions. Data for two major component tests and the flight vehicle are presented along with the calibration results. For calibration, sensitivity analysis is conducted using Analysis of Variance (ANOVA). To reduce the computational burden associated with ANOVA calculations, response surface models are used in lieu of computationally intensive finite element solutions. From the sensitivity studies, parameter importance is assessed as a function of frequency. In addition, the work presents an approach to evaluate the probability that a parameter set exists to reconcile test with analysis. Comparisons of pre-test predictions of frequency response uncertainty bounds with measured data, results from the variance-based sensitivity analysis, and results from component test models with calibrated boundary stiffness models are all presented.
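
    The ANOVA-based ranking described above can be miniaturized: evaluate a cheap response-surface stand-in on a factor grid and compare each factor's main-effect variance with the total variance. Both the response function and the factor interpretations below are hypothetical, not the Ares I-X models.

```python
import statistics

def response(k, m):
    # Cheap response-surface stand-in for an FE frequency response,
    # with hypothetical factors: boundary stiffness k and joint mass m
    return 10.0 + 3.0 * k + 0.2 * m

levels = [-1.0, 0.0, 1.0]
samples = [(k, m, response(k, m)) for k in levels for m in levels]
grand_var = statistics.pvariance([y for _, _, y in samples])

def main_effect_share(index):
    # ANOVA main effect: variance of the per-level means over total variance
    means = [statistics.fmean(row[2] for row in samples if row[index] == lvl)
             for lvl in levels]
    return statistics.pvariance(means) / grand_var

print(round(main_effect_share(0), 3), round(main_effect_share(1), 3))
```

    Replacing the FE solver with such a fitted surface is exactly what makes thousands of ANOVA evaluations affordable; here the surface is an invented additive function, so the two main-effect shares account for essentially all of the variance.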

  8. Novel quantitative analysis of autofluorescence images for oral cancer screening.

    PubMed

    Huang, Tze-Ta; Huang, Jehn-Shyun; Wang, Yen-Yun; Chen, Ken-Chung; Wong, Tung-Yiu; Chen, Yi-Chun; Wu, Che-Wei; Chan, Leong-Perng; Lin, Yi-Chu; Kao, Yu-Hsun; Nioka, Shoko; Yuan, Shyng-Shiou F; Chung, Pau-Choo

    2017-05-01

    VELscope® was developed to inspect oral mucosa autofluorescence. However, its accuracy depends heavily on the examining physician's experience. This study was aimed at developing a novel quantitative analysis of autofluorescence images for oral cancer screening. Patients with either oral cancer or precancerous lesions and a control group with normal oral mucosa were enrolled in this study. White-light images and VELscope® autofluorescence images of the lesions were taken with a digital camera. The lesion in the image was chosen as the region of interest (ROI). The average intensity and heterogeneity of the ROI were calculated. A quadratic discriminant analysis (QDA) was utilized to compute boundaries based on sensitivity and specificity. In total, 47 oral cancer lesions, 54 precancerous lesions, and 39 normal oral mucosa controls were analyzed. A boundary with a specificity of 0.923 and a sensitivity of 0.979 between the oral cancer lesions and normal oral mucosae was validated. The oral cancer and precancerous lesions could also be differentiated from normal oral mucosae with a specificity of 0.923 and a sensitivity of 0.970. The novel quantitative analysis of the intensity and heterogeneity of VELscope® autofluorescence images used in this study, in combination with a QDA classifier, can be used to differentiate oral cancer and precancerous lesions from normal oral mucosae. Copyright © 2017 Elsevier Ltd. All rights reserved.
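
    A minimal sketch of the classification step, assuming synthetic feature distributions (the intensity and heterogeneity values below are invented for illustration, not taken from the study): QDA fits one Gaussian per class, each with its own covariance, so the boundary between the two log-densities is quadratic.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Hypothetical two-feature ROIs (mean intensity, heterogeneity): cancerous
# tissue shows loss of autofluorescence and higher heterogeneity.
normal = np.column_stack([rng.normal(180, 15, n), rng.normal(10, 3, n)])
cancer = np.column_stack([rng.normal(120, 20, n), rng.normal(25, 6, n)])

def gaussian_logpdf(X, mean, cov):
    # Log-density of a multivariate normal, evaluated row-wise.
    d = X - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (np.einsum('ij,jk,ik->i', d, inv, d) + logdet
                   + X.shape[1] * np.log(2 * np.pi))

# QDA: estimate a separate mean and covariance per class.
classes = {0: normal, 1: cancer}
params = {c: (Xc.mean(0), np.cov(Xc, rowvar=False)) for c, Xc in classes.items()}

def qda_predict(X):
    scores = np.column_stack([gaussian_logpdf(X, *params[c]) for c in (0, 1)])
    return scores.argmax(1)

sensitivity = (qda_predict(cancer) == 1).mean()   # true-positive rate
specificity = (qda_predict(normal) == 0).mean()   # true-negative rate
```
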

  9. Influence of ECG sampling rate in fetal heart rate variability analysis.

    PubMed

    De Jonckheere, J; Garabedian, C; Charlier, P; Champion, C; Servan-Schreiber, E; Storme, L; Debarge, V; Jeanne, M; Logier, R

    2017-07-01

    Fetal hypoxia results in fetal blood acidosis (pH < 7.10). In such a situation, the fetus develops several adaptation mechanisms regulated by the autonomic nervous system. Many studies have demonstrated significant changes in heart rate variability in hypoxic fetuses, so fetal heart rate variability analysis could be of considerable help in predicting fetal hypoxia. Commonly used fetal heart rate variability analysis methods have been shown to be sensitive to the ECG signal sampling rate. Indeed, a low sampling rate can introduce timing errors into heart beat detection, which alter the heart rate variability estimation. In this paper, we introduce an original fetal heart rate variability analysis method. We hypothesize that this method is less sensitive to changes in ECG sampling frequency than common heart rate variability analysis methods. We then compared the results of this new heart rate variability analysis method at two different sampling frequencies (250 Hz and 1000 Hz).
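
    The sampling-rate sensitivity the authors describe can be illustrated by quantizing simulated beat times to the ECG sampling grid. The beat statistics below are invented, and RMSSD stands in for whichever variability index is used; a finite sampling rate limits R-peak timing to the nearest sample, adding jitter to every detected interval.

```python
import numpy as np

rng = np.random.default_rng(2)
rr_true = rng.normal(0.43, 0.01, 500)   # RR intervals, s (~140 bpm, illustrative)
beats = np.cumsum(rr_true)              # R-peak times

def rmssd(rr):
    # Root mean square of successive RR-interval differences.
    return np.sqrt(np.mean(np.diff(rr) ** 2))

def rmssd_at_fs(beat_times, fs):
    # Detection on an fs-sampled ECG quantizes each R-peak time to the
    # sampling grid, adding up to +/- 0.5/fs of jitter per beat.
    quantized = np.round(beat_times * fs) / fs
    return rmssd(np.diff(quantized))

ref = rmssd(np.diff(beats))                      # jitter-free reference
err_250 = abs(rmssd_at_fs(beats, 250) - ref)     # coarse grid: more jitter
err_1000 = abs(rmssd_at_fs(beats, 1000) - ref)   # fine grid: less jitter
```
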

  10. GIS coupled Multiple Criteria based Decision Support for Classification of Urban Coastal Areas in India

    NASA Astrophysics Data System (ADS)

    Dhiman, R.; Kalbar, P.; Inamdar, A. B.

    2017-12-01

    Coastal area classification in India is a challenge for federal and state government agencies due to a fragile institutional framework, unclear directions in the implementation of coastal regulations, and violations occurring at both private and government levels. This work is an attempt to improve the objectivity of existing classification methods so as to synergize ecological systems and socioeconomic development in coastal cities. We developed a Geographic Information System coupled Multi-Criteria Decision Making (GIS-MCDM) approach to classify urban coastal areas, in which utility functions transform the coastal features into quantitative membership values after assessing the sensitivity of the urban coastal ecosystem. Furthermore, these membership values for coastal features are applied in different weighting schemes to derive a Coastal Area Index (CAI), which classifies the coastal areas into four distinct categories, viz. 1) No Development Zone, 2) Highly Sensitive Zone, 3) Moderately Sensitive Zone and 4) Low Sensitive Zone, based on the sensitivity of the urban coastal ecosystem. Mumbai, a coastal megacity in India, is used as a case study for demonstration of the proposed method. Finally, an uncertainty analysis using a Monte Carlo approach is carried out to validate the sensitivity of the CAI under multiple specific scenarios. Results of the CAI method show a clear demarcation of coastal areas in the GIS environment based on ecological sensitivity. The CAI provides better decision support for federal and state level agencies to classify urban coastal areas according to the regional requirements for coastal resources, considering resilience and sustainable development. The CAI method will strengthen the existing institutional framework for decision making in the classification of urban coastal areas where the most effective coastal management options can be proposed.
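
    A minimal sketch of the CAI aggregation and zoning step under assumed inputs: the feature names, membership values, weights, and zone thresholds below are all illustrative, since the record does not publish them.

```python
# Hypothetical membership values (0-1) for the coastal features of one grid
# cell, and illustrative weights summing to 1. None of these names or
# numbers come from the paper.
features = {"mangrove": 0.9, "nesting_site": 0.7, "population": 0.4, "built_up": 0.2}
weights  = {"mangrove": 0.4, "nesting_site": 0.3, "population": 0.2, "built_up": 0.1}

def coastal_area_index(memberships, w):
    # Weighted linear aggregation of membership values.
    assert abs(sum(w.values()) - 1.0) < 1e-9
    return sum(memberships[k] * w[k] for k in memberships)

def classify(cai):
    # Illustrative thresholds for the four zones named in the abstract.
    if cai >= 0.75:
        return "No Development Zone"
    if cai >= 0.50:
        return "Highly Sensitive Zone"
    if cai >= 0.25:
        return "Moderately Sensitive Zone"
    return "Low Sensitive Zone"

cai = coastal_area_index(features, weights)
zone = classify(cai)
```

    The Monte Carlo uncertainty analysis mentioned in the abstract would then resample the weights (and possibly the memberships) and check how often each cell's zone assignment changes.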

  11. CAFNA®, coded aperture fast neutron analysis for contraband detection: Preliminary results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L.; Lanza, R.C.

    1999-12-01

    The authors have developed a near-field coded aperture imaging system for use with fast neutron techniques as a tool for the detection of contraband and hidden explosives through nuclear elemental analysis. The technique relies on the prompt gamma rays produced by fast neutron interactions with the object being examined. The position of the nuclear elements is determined by the location of the gamma emitters. For existing fast neutron techniques: in Pulsed Fast Neutron Analysis (PFNA), neutrons are used with very low efficiency; in Fast Neutron Analysis (FNA), the sensitivity for detection of the signature gamma rays is very low. For the Coded Aperture Fast Neutron Analysis (CAFNA®) the authors have developed, the efficiency for both using the probing fast neutrons and detecting the prompt gamma rays is high. For a probed volume of n³ volume elements (voxels) in a cube of n resolution elements on a side, they can compare the sensitivity with other neutron probing techniques. As compared to PFNA, the improvement in neutron utilization is n², where the total number of voxels in the object being examined is n³. Compared to FNA, the improvement in gamma-ray imaging is proportional to the total open area of the coded aperture plane; a typical value is n²/2, where n² is the number of total detector resolution elements or the number of pixels in an object layer. It should be noted that the actual signal-to-noise ratio of a system also depends on the nature and distribution of background events, and this comparison may somewhat reduce the effective sensitivity of CAFNA. They have performed analysis, Monte Carlo simulations, and preliminary experiments using low- and high-energy gamma-ray sources. The results show that a high-sensitivity 3-D contraband imaging and detection system can be realized using CAFNA.
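
    The quoted efficiency gains are simple functions of the per-side resolution n; evaluating them for an illustrative n = 64 (not a value from the record) makes the scale of the improvement concrete.

```python
# Efficiency gains from the abstract, evaluated for an illustrative resolution.
n = 64                       # resolution elements per side (assumed value)
voxels = n ** 3              # probed volume resolved into n^3 voxels
gain_vs_pfna = n ** 2        # neutron-utilization gain of CAFNA over PFNA
gain_vs_fna = n ** 2 // 2    # gamma-imaging gain over FNA (~half-open aperture)
```
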

  12. Retracted: Association of ACE I/D gene polymorphism with T2DN susceptibility and the risk of T2DM developing into T2DN in a Caucasian population.

    PubMed

    Liu, Guohui; Zhou, Tian-Biao; Jiang, Zongpei; Zheng, Dongwen

    2015-03-01

    The association of the angiotensin-converting enzyme (ACE) insertion/deletion (I/D) gene polymorphism with type-2 diabetic nephropathy (T2DN) susceptibility and the risk of type-2 diabetes mellitus (T2DM) developing into T2DN in Caucasian populations is still controversial. A meta-analysis was performed to evaluate the association of the ACE I/D gene polymorphism with T2DN susceptibility and the risk of T2DM developing into T2DN in Caucasian populations. A predefined literature search and selection of eligible relevant studies were performed to collect data from electronic databases. Sixteen articles were identified for the analysis of the association of the ACE I/D gene polymorphism with T2DN susceptibility and the risk of T2DM developing into T2DN in Caucasian populations. The ACE I/D gene polymorphism was not associated with T2DN susceptibility or with the risk of patients with T2DM developing T2DN in Caucasian populations. A sensitivity analysis according to case sample size (<100 vs. ≥100) was also performed, and the results were similar to those of the overall analysis. The ACE I/D gene polymorphism was not associated with T2DN susceptibility or with the risk of patients with T2DM developing T2DN in Caucasian populations. However, more studies should be performed in the future. © The Author(s) 2014.

  13. Functional analyses of cotton (Gossypium hirsutum L.) immature fiber (im) mutant reveal that fiber cell wall development is associated with sensitivity to stress.

    USDA-ARS?s Scientific Manuscript database

    Background: Cotton fiber maturity refers the degree of fiber cell wall development and is an important factor for determining commercial value of cotton. The molecular mechanism regulating the fiber cell wall development has not been well characterized. Microscopic image analysis of the cross-sect...

  14. Autonomous Aerobraking: Thermal Analysis and Response Surface Development

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Thornblom, Mark N.

    2011-01-01

    A high-fidelity thermal model of the Mars Reconnaissance Orbiter was developed for use in an autonomous aerobraking simulation study. Response surface equations were derived from the high-fidelity thermal model and integrated into the autonomous aerobraking simulation software. The high-fidelity thermal model was developed using the Thermal Desktop software and used in all phases of the analysis. The exclusive use of Thermal Desktop represented a change from previously developed aerobraking thermal analysis methodologies. Comparisons were made between the Thermal Desktop solutions and those developed for the previous aerobraking thermal analyses performed on the Mars Reconnaissance Orbiter during aerobraking operations. A variable-sensitivity screening study was performed to reduce the number of variables carried in the response surface equations. Thermal analysis and response surface equation development were performed for autonomous aerobraking missions at Mars and Venus.

  15. Sensitivity analysis and approximation methods for general eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Murthy, D. V.; Haftka, R. T.

    1986-01-01

    Optimization of dynamic systems involving complex non-Hermitian matrices is often computationally expensive. Major contributors to the computational expense are the sensitivity analysis and the reanalysis of a modified design. The present work seeks to alleviate this computational burden by identifying efficient sensitivity analysis and approximate reanalysis methods. For the algebraic eigenvalue problem involving non-Hermitian matrices, algorithms for sensitivity analysis and approximate reanalysis are classified, compared, and evaluated for efficiency and accuracy. Proper eigenvector normalization is discussed. An improved method for calculating derivatives of eigenvectors is proposed, based on a more rational normalization condition and taking advantage of matrix sparsity. Important numerical aspects of this method are also discussed. To alleviate the cost of reanalysis, various approximation methods for eigenvalues are proposed and evaluated. Linear and quadratic approximations are based directly on the Taylor series. Several approximation methods are developed based on the generalized Rayleigh quotient for the eigenvalue problem. Approximation methods based on the trace theorem give high accuracy without needing any derivatives. Operation counts for the computation of the approximations are given. General recommendations are made for the selection of an appropriate approximation technique as a function of the matrix size, the number of design variables, the number of eigenvalues of interest, and the number of design points at which an approximation is sought.
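
    The generalized Rayleigh quotient reanalysis mentioned above can be sketched as follows: the baseline right and left eigenvectors of a (here, random) non-Hermitian matrix are reused to approximate an eigenvalue of the perturbed design without a new eigen-solve. The matrices are illustrative, not from any structural model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
A = rng.standard_normal((n, n))           # baseline non-Hermitian matrix (toy)
dA = 1e-3 * rng.standard_normal((n, n))   # small design modification

# Baseline eigen-solution: dominant right eigenvector x, and the matching
# left eigenvector y (an eigenvector of A^T for the same eigenvalue).
w, V = np.linalg.eig(A)
k = np.argmax(w.real)
x = V[:, k]
wl, U = np.linalg.eig(A.T)
y = U[:, np.argmin(np.abs(wl - w[k]))]

# Generalized Rayleigh quotient reanalysis: reuse baseline eigenvectors to
# approximate the modified eigenvalue; the error is second order in dA.
lam_approx = (y @ (A + dA) @ x) / (y @ x)

# Exact reanalysis, for comparison only.
we = np.linalg.eig(A + dA)[0]
lam_exact = we[np.argmin(np.abs(we - w[k]))]
```
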

  16. Receiver operating characteristic analysis of age-related changes in lineup performance.

    PubMed

    Humphries, Joyce E; Flowe, Heather D

    2015-04-01

    In the basic face memory literature, support has been found for the late maturation hypothesis, which holds that face recognition ability is not fully developed until at least adolescence. Support for the late maturation hypothesis in the criminal lineup identification literature, however, has been equivocal because of the analytic approach that has been used to examine age-related changes in identification performance. Recently, receiver operating characteristic (ROC) analysis was applied for the first time in the adult eyewitness memory literature to examine whether memory sensitivity differs across different types of lineup tests. ROC analysis allows for the separation of memory sensitivity from response bias in the analysis of recognition data. Here, we have made the first ROC-based comparison of adults' and children's (5- and 6-year-olds and 9- and 10-year-olds) memory performance on lineups by reanalyzing data from Humphries, Holliday, and Flowe (2012). In line with the late maturation hypothesis, memory sensitivity was significantly greater for adults than for young children. Memory sensitivity for older children was similar to that for adults. The results indicate that the late maturation hypothesis can be generalized to account for age-related performance differences on an eyewitness memory task. The implications for developmental eyewitness memory research are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.
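
    A sketch of the ROC computation, assuming hypothetical confidence-rated counts (the numbers below are invented, not Humphries et al.'s data): sweeping the confidence criterion from strict to liberal and cumulating hit and false-alarm rates traces the ROC, and its area indexes memory sensitivity independently of where the response criterion sits.

```python
import numpy as np

# Invented counts of suspect identifications at high/medium/low confidence,
# for target-present (hits) and target-absent (false alarms) lineups.
hits_adults = np.array([40, 25, 15])
fas_adults = np.array([5, 10, 15])
hits_children = np.array([25, 20, 15])
fas_children = np.array([10, 15, 20])
n_present = n_absent = 100

def roc_auc(hits, fas, n_pres, n_abs):
    # Cumulate rates from the strictest criterion outward, then integrate
    # the ROC with the trapezoid rule.
    h = np.concatenate([[0.0], np.cumsum(hits) / n_pres, [1.0]])
    f = np.concatenate([[0.0], np.cumsum(fas) / n_abs, [1.0]])
    return float(np.sum((f[1:] - f[:-1]) * (h[1:] + h[:-1]) / 2))

auc_adults = roc_auc(hits_adults, fas_adults, n_present, n_absent)
auc_children = roc_auc(hits_children, fas_children, n_present, n_absent)
```

    With these invented counts the adult curve dominates the child curve at every criterion, which is what a genuine sensitivity difference, as opposed to a mere bias shift, looks like.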

  17. Vertically Aligned Nitrogen-Doped Carbon Nanotube Carpet Electrodes: Highly Sensitive Interfaces for the Analysis of Serum from Patients with Inflammatory Bowel Disease.

    PubMed

    Wang, Qian; Subramanian, Palaniappan; Schechter, Alex; Teblum, Eti; Yemini, Reut; Nessim, Gilbert Daniel; Vasilescu, Alina; Li, Musen; Boukherroub, Rabah; Szunerits, Sabine

    2016-04-20

    The number of patients suffering from inflammatory bowel disease (IBD) is increasing worldwide. The development of noninvasive tests that are rapid, sensitive, specific, and simple would help prevent patient discomfort and diagnostic delay and would ease follow-up of the status of the disease. Herein, we demonstrate the utility of vertically aligned nitrogen-doped carbon nanotube (VA-NCNT) electrodes for the sensitive electrochemical detection in serum of lysozyme, a protein that is up-regulated in IBD. To achieve selective lysozyme detection, biotinylated lysozyme aptamers were covalently immobilized onto the VA-NCNTs. Detection of lysozyme in serum was achieved by measuring the decrease in the peak current of the Fe(CN)6(3-/4-) redox couple by differential pulse voltammetry upon addition of the analyte. We achieved a detection limit as low as 100 fM with a linear range up to 7 pM, in line with the demands for determining lysozyme levels in patients suffering from IBD. We attained sensitive detection of the biomarker in clinical samples from healthy individuals and patients suffering from IBD and compared the results to a classical turbidimetric assay. The results clearly indicate that the newly developed sensor allows for a reliable and efficient analysis of lysozyme in serum.

  18. Visual Resource Analysis for Solar Energy Zones in the San Luis Valley

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, Robert; Abplanalp, Jennifer M.; Zvolanek, Emily

    This report summarizes the results of a study conducted by Argonne National Laboratory’s (Argonne’s) Environmental Science Division for the U.S. Department of the Interior Bureau of Land Management (BLM). The study analyzed the regional effects of potential visual impacts of solar energy development on three BLM-designated solar energy zones (SEZs) in the San Luis Valley (SLV) in Colorado and, based on the analysis, made recommendations for or against regional compensatory mitigation to compensate residents and other stakeholders for the potential visual impacts of the SEZs. The analysis was conducted as part of the solar regional mitigation strategy (SRMS) task conducted by BLM Colorado with assistance from Argonne. Two separate analyses were performed. The first analysis, referred to as the VSA Analysis, analyzed the potential visual impacts of solar energy development in the SEZs on nearby visually sensitive areas (VSAs) and, based on the impact analyses, made recommendations for or against regional compensatory mitigation. VSAs are locations for which some type of visual sensitivity has been identified, either because the location is an area of high scenic value or because it is a location from which people view the surrounding landscape and attach some level of importance or sensitivity to what is seen from the location. The VSA Analysis included BLM-administered lands both in Colorado and in the Taos FO in New Mexico. The second analysis, referred to as the SEZ Analysis, used BLM visual resource inventory (VRI) and other data on visual resources in the former Saguache and La Jara Field Offices (FOs), now contained within the San Luis Valley FO (SLFO), to determine whether the changes in scenic values that would result from the development of utility-scale solar energy facilities in the SEZs would affect the quality and quantity of valued scenic resources in the SLV region as a whole. If the regional effects were judged to be significant, regional compensatory mitigation was recommended. VRI data were not available for the Taos FO, so it was not included in the SEZ Analysis; the SEZ Analysis includes BLM-administered lands in Colorado only.

  19. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome.

    PubMed

    Lalonde, Michel; Wells, R Glenn; Birnie, David; Ruddy, Terrence D; Wassenaar, Richard

    2014-07-01

    Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its potential for predicting CRT outcome, as valuable information may be lost by assuming that time-activity curves (TACs) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. In total, 49 patients (27 of ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricle wall to produce 568 TACs from the SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster sizes and scores were used as measures of dyssynchrony and to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT outcome in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal-chance ROC AUC = 0.50), with an optimal operating point of 71% sensitivity and 60% specificity. Cluster analysis results were similar to SPECT RNA phase analysis (ROC AUC = 0.78, p = 0.73 vs. cluster AUC; sensitivity/specificity = 59%/89%) and PET scar size analysis (ROC AUC = 0.73, p = 1.0 vs. cluster AUC; sensitivity/specificity = 76%/67%). A SPECT RNA cluster analysis algorithm was developed for the prediction of CRT outcome. Cluster analysis produced results equivalent to those obtained from Fourier and scar analyses.
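
    The clustering idea can be sketched with synthetic time-activity curves: a minimal K-means implementation groups in-phase and phase-delayed TACs, with cluster sizes serving as a crude dyssynchrony measure. The curve shapes, counts, and noise level below are invented, not patient data.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 32)   # one cardiac cycle, 32 time bins

def make_tacs(phase, count):
    # Synthetic TACs: cosine cycles with a wall-motion phase delay plus noise.
    return (np.cos(2 * np.pi * (t[None, :] - phase))
            + 0.1 * rng.standard_normal((count, t.size)))

# 40 synchronous wall samples and 20 phase-delayed (dyssynchronous) samples.
tacs = np.vstack([make_tacs(0.0, 40), make_tacs(0.2, 20)])

def kmeans(X, k, iters=50, seed=0):
    # Minimal K-means: assign to the nearest center, recompute means, repeat.
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

labels = kmeans(tacs, 2)
sizes = np.bincount(labels, minlength=2)   # cluster sizes (dyssynchrony measure)
```
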

  20. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lalonde, Michel, E-mail: mlalonde15@rogers.com; Wassenaar, Richard; Wells, R. Glenn

    2014-07-15

    Purpose: Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its potential for predicting CRT outcome, as valuable information may be lost by assuming that time-activity curves (TACs) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. Methods: In total, 49 patients (27 of ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricle wall to produce 568 TACs from the SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster sizes and scores were used as measures of dyssynchrony and to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Results: Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT outcome in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal-chance ROC AUC = 0.50), with an optimal operating point of 71% sensitivity and 60% specificity. Cluster analysis results were similar to SPECT RNA phase analysis (ROC AUC = 0.78, p = 0.73 vs. cluster AUC; sensitivity/specificity = 59%/89%) and PET scar size analysis (ROC AUC = 0.73, p = 1.0 vs. cluster AUC; sensitivity/specificity = 76%/67%). Conclusions: A SPECT RNA cluster analysis algorithm was developed for the prediction of CRT outcome. Cluster analysis produced results equivalent to those obtained from Fourier and scar analyses.

  1. A GC-MS method for the detection and quantitation of ten major drugs of abuse in human hair samples.

    PubMed

    Orfanidis, A; Mastrogianni, O; Koukou, A; Psarros, G; Gika, H; Theodoridis, G; Raikos, N

    2017-03-15

    A sensitive analytical method has been developed to identify and quantify major drugs of abuse (DOA), namely morphine, codeine, 6-monoacetylmorphine, cocaine, ecgonine methyl ester, benzoylecgonine, amphetamine, methamphetamine, methylenedioxymethamphetamine and methylenedioxyamphetamine, in human hair. Hair samples were extracted with methanol under ultrasonication at 50°C after a three-step rinsing process to remove external contamination and dirt from the hair. Derivatization with BSTFA was selected in order to increase the detection sensitivity of the GC/MS analysis. Optimization of the derivatization parameters was based on experiments for the selection of derivatization time, temperature and volume of derivatizing agent. Validation of the method included evaluation of linearity, which ranged from 2 to 350 ng/mg of hair mean concentration for all DOA, and evaluation of sensitivity, accuracy, precision and repeatability. Limits of detection ranged from 0.05 to 0.46 ng/mg of hair. The developed method was applied to the analysis of hair samples obtained from three human subjects, which were found positive for cocaine and opiates. Published by Elsevier B.V.
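
    The calibration and detection-limit arithmetic behind such a validation can be sketched with invented concentration-response pairs (the study's own LODs will not be reproduced by these numbers): fit a calibration line, estimate LOD and LOQ from the regression residuals in the ICH style, and back-calculate an unknown.

```python
import numpy as np

# Invented calibration points for one analyte: concentration (ng/mg of hair)
# vs instrument response (e.g., peak-area ratio, arbitrary units).
conc = np.array([2.0, 10.0, 50.0, 100.0, 200.0, 350.0])
resp = np.array([0.041, 0.20, 1.02, 1.99, 4.05, 7.01])

# Ordinary least-squares calibration line.
slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
resid_sd = np.sqrt(np.sum((resp - pred) ** 2) / (len(conc) - 2))

# ICH-style regression estimates of detection and quantitation limits.
lod = 3.3 * resid_sd / slope
loq = 10.0 * resid_sd / slope

# Back-calculate an unknown sample from its measured response.
unknown = (0.85 - intercept) / slope
```
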

  2. Genome-wide identification of wheat (Triticum aestivum) expansins and expansin expression analysis in cold-tolerant and cold-sensitive wheat cultivars

    PubMed Central

    Zhang, Jun-Feng; Xu, Yong-Qing; Dong, Jia-Min; Peng, Li-Na; Feng, Xu; Wang, Xu; Li, Fei; Miao, Yu; Yao, Shu-Kuan; Zhao, Qiao-Qin; Feng, Shan-Shan; Hu, Bao-Zhong

    2018-01-01

    Plant expansins are proteins involved in cell wall loosening, plant growth, and development, as well as in response to plant diseases and other stresses. In this study, we identified 128 expansin coding sequences from the wheat (Triticum aestivum) genome. These sequences belong to 45 homoeologous copies of TaEXPs, including 26 TaEXPAs, 15 TaEXPBs and four TaEXLAs. No TaEXLB was identified. Gene expression and sub-expression profiles revealed that most of the TaEXPs were expressed either only in root tissues or in multiple organs. Real-time qPCR analysis showed that many TaEXPs were differentially expressed in four different tissues of the two wheat cultivars—the cold-sensitive ‘Chinese Spring (CS)’ and the cold-tolerant ‘Dongnongdongmai 1 (D1)’ cultivars. Our results suggest that the differential expression of TaEXPs could be related to low-temperature tolerance or sensitivity of different wheat cultivars. Our study expands our knowledge on wheat expansins and sheds new light on the functions of expansins in plant development and stress response. PMID:29596529

  3. Development of a highly sensitive and specific ELISA method for the determination of l-corydalmine in SD rats with monoclonal antibody.

    PubMed

    Zhang, Hongwei; Gao, Lan; Shu, Menglin; Liu, Jihua; Yu, Boyang

    2018-01-15

    l-Corydalmine (l-CDL) is a potent analgesic constituent of the traditional Chinese medicine Rhizoma Corydalis. However, the pharmacokinetic profile and tissue distribution of l-CDL in vivo are still unknown. Therefore, it is necessary to establish a simple and sensitive method to detect l-CDL, which will be helpful for studying its distribution and pharmacokinetics. To determine this compound in biological samples, a monoclonal antibody (mAb) against l-CDL was produced and a fast and highly sensitive indirect competitive enzyme-linked immunosorbent assay (icELISA) was developed in this study. The icELISA was applied to determine l-CDL in biological samples. The limit of detection (LOD) of the method was 0.015 ng/mL with a linear range of 1-1000 ng/mL (R² = 0.9912). The intra- and inter-day precisions were below 15% and the recoveries were within 80-117%. Finally, the developed immunoassay was successfully applied to the analysis of the distribution of l-CDL in SD rats. In conclusion, the icELISA based on the anti-l-CDL mAb can be considered a highly sensitive and rapid method for the determination of l-CDL in biological samples. The ELISA approach may provide a valuable tool for the analysis of small molecules in biological samples. Copyright © 2017. Published by Elsevier B.V.
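
    A competitive-ELISA calibration of this kind is typically modeled with a four-parameter logistic (4PL) curve; the sketch below defines such a curve and its inverse for back-calculating concentrations. All parameter values are illustrative, not the assay's.

```python
import numpy as np

# Illustrative 4PL parameters for a competitive icELISA (assumed values):
A, D = 1.8, 0.05    # absorbance at zero / saturating l-CDL concentration
C, B = 30.0, 1.1    # mid-point concentration (ng/mL) and slope factor

def four_pl(x):
    # Competitive format: signal decreases as analyte concentration rises.
    return D + (A - D) / (1.0 + (x / C) ** B)

def inverse_four_pl(y):
    # Back-calculate concentration from a measured absorbance.
    return C * ((A - D) / (y - D) - 1.0) ** (1.0 / B)

standards = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0, 1000.0])
signals = four_pl(standards)             # simulated standard curve
recovered = inverse_four_pl(signals)     # round-trip check
```

    In practice the four parameters are fitted to the measured standards by nonlinear least squares, and unknowns are read off through the same inverse.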

  4. Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations

    NASA Technical Reports Server (NTRS)

    Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.

    2017-01-01

    A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted through integrating the sensitivity components from each discipline of the coupled system. Numerical results verify accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. Complex-variable implementation of sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of adjoint formulations for FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
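
    The complex-variable approach used for the structural sensitivities is the complex-step derivative: for a real-analytic response, df/dx ≈ Im f(x + ih)/h with no subtractive cancellation, so h can be taken extremely small and the result is accurate to machine precision. The response function below is a toy stand-in, not a DYMORE quantity.

```python
import numpy as np

def complex_step(f, x, h=1e-30):
    # Complex-step derivative: no subtraction of nearly equal numbers,
    # so the step size h can be far below finite-difference limits.
    return np.imag(f(x + 1j * h)) / h

def response(x):
    # Toy stand-in for a structural response (hypothetical).
    return np.exp(x) * np.sin(x) / (1.0 + x ** 2)

x0 = 0.7
d_cs = complex_step(response, x0)

# Central finite difference for comparison (accuracy limited by cancellation).
eps = 1e-6
d_fd = (response(x0 + eps) - response(x0 - eps)) / (2 * eps)
```
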

  5. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
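
    The Morris screening step used above can be sketched as follows: elementary effects from random one-at-a-time trajectories are averaged in absolute value (the mu* statistic) to rank parameter influence. The three-parameter toy model below is invented, not the stroke model.

```python
import numpy as np

def morris_mu_star(f, k, r=20, delta=0.25, seed=0):
    # Morris screening: r random trajectories, each perturbing one of the k
    # inputs at a time by delta; mu* is the mean absolute elementary effect.
    rng = np.random.default_rng(seed)
    ee = np.zeros((r, k))
    for i in range(r):
        x = rng.uniform(0.0, 1.0 - delta, k)   # base point in the unit cube
        y = f(x)
        for j in rng.permutation(k):           # one-at-a-time steps
            x_new = x.copy()
            x_new[j] += delta
            y_new = f(x_new)
            ee[i, j] = (y_new - y) / delta
            x, y = x_new, y_new
    return np.abs(ee).mean(axis=0)

# Toy model with one dominant, one weak, and one inert input (hypothetical).
model = lambda x: 10.0 * x[0] + x[1] ** 2 + 0.0 * x[2]
mu_star = morris_mu_star(model, k=3)
```

    In the study's workflow, parameters with large mu* go to calibration or further data collection, while the inert ones are fixed at their best-guess values.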

  6. Molecular imprinting ratiometric fluorescence sensor for highly selective and sensitive detection of phycocyanin.

    PubMed

    Wang, Xiaoyan; Yu, Jialuo; Kang, Qi; Shen, Dazhong; Li, Jinhua; Chen, Lingxin

    2016-03-15

    A facile strategy was developed to prepare a molecularly imprinted ratiometric fluorescence sensor for highly selective and sensitive detection of phycocyanin (PC), based on fluorescence resonance energy transfer (FRET), via a sol-gel polymerization process using nitrobenzoxadiazole (NBD) as the fluorescent signal source. The ratio of the two fluorescence emission peak intensities of NBD and PC was used to determine the concentration of PC, which effectively reduces background interference and fluctuations arising from varying conditions. As a result, the sensor achieved high sensitivity, with a low detection limit of 0.14 nM within 6 min, and excellent recognition specificity for PC over its analogues, with a high imprinting factor of 9.1. Furthermore, the sensor attained high recoveries in the range of 93.8-110.2% at three spiking levels of PC, with precisions below 4.7%, in seawater and lake water samples. The developed sensor strategy demonstrated simplicity, reliability, rapidity, high selectivity and high sensitivity, proving to be a feasible way to develop highly efficient fluorescence sensors that are potentially applicable to ultratrace analysis in complicated matrices. Copyright © 2015 Elsevier B.V. All rights reserved.
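A ratiometric readout of this kind reduces to a linear calibration of concentration against the two-peak intensity ratio. The sketch below uses ordinary least squares on hypothetical calibration points, not the paper's actual data.

```python
def ratio_calibration(standards):
    """Least-squares line through (concentration, I_analyte / I_reference)
    pairs; returns (slope, intercept)."""
    n = len(standards)
    sx = sum(c for c, _ in standards)
    sy = sum(r for _, r in standards)
    sxx = sum(c * c for c, _ in standards)
    sxy = sum(c * r for c, r in standards)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

def concentration(ratio, slope, intercept):
    """Invert the calibration line for an unknown sample's ratio."""
    return (ratio - intercept) / slope

# hypothetical calibration: intensity ratio rises linearly with conc. (nM)
cal = [(0.0, 0.10), (1.0, 0.30), (2.0, 0.50), (4.0, 0.90)]
m, b = ratio_calibration(cal)
c = concentration(0.70, m, b)  # → 3.0 nM
```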

  7. High Sensitivity, Wearable, Piezoresistive Pressure Sensors Based on Irregular Microhump Structures and Its Applications in Body Motion Sensing.

    PubMed

    Wang, Zongrong; Wang, Shan; Zeng, Jifang; Ren, Xiaochen; Chee, Adrian J Y; Yiu, Billy Y S; Chung, Wai Choi; Yang, Yong; Yu, Alfred C H; Roberts, Robert C; Tsang, Anderson C O; Chow, Kwok Wing; Chan, Paddy K L

    2016-07-01

    A pressure sensor based on irregular microhump patterns has been proposed and developed. The devices show high sensitivity and a broad operating pressure regime compared with regular micropattern devices. Finite element analysis (FEA) is used to confirm the sensing mechanism and predict the performance of the pressure sensor based on the microhump structures. Silicon carbide sandpaper is employed as the mold to develop polydimethylsiloxane (PDMS) microhump patterns of various sizes. The active layer of the piezoresistive pressure sensor is developed by spin coating PSS on top of the patterned PDMS. The devices show an averaged sensitivity as high as 851 kPa(-1), a broad operating pressure range (20 kPa), low operating power (100 nW), and fast response speed (6.7 kHz). Owing to their flexibility, the devices are applied to human body motion sensing and radial artery pulse monitoring. These flexible, high-sensitivity devices show great potential for the next generation of smart sensors for robotics, real-time health monitoring, and biomedical applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Socioeconomic Stratification and Its Influences on Talent Development: Some Interdisciplinary Perspectives.

    ERIC Educational Resources Information Center

    Ambrose, Don

    2002-01-01

    In this analysis, socioeconomic barriers to talent development are explored from the vantage points of major thinkers and recent research findings in context-sensitive disciplines such as economics, sociology, and ethical philosophy. Insights drawn from these perspectives provide the basis for recommendations for educators of the gifted. (Contains…

  9. Development of a sensitivity and uncertainty analysis tool in R for parametrization of the APEX model

    USDA-ARS?s Scientific Manuscript database

    Hydrologic models are used to simulate the responses of agricultural systems to different inputs and management strategies, identifying alternative management practices to cope with future climate and/or geophysical changes. The Agricultural Policy/Environmental eXtender (APEX) is a model develope...

  10. A sensitive and innovative detection method for rapid C-reactive proteins analysis based on a micro-fluxgate sensor system

    PubMed Central

    Yang, Zhen; Zhi, Shaotao; Feng, Zhu; Lei, Chong; Zhou, Yong

    2018-01-01

    A sensitive and innovative assay system based on a micro-MEMS-fluxgate sensor and immunomagnetic bead labels was developed for the rapid analysis of C-reactive proteins (CRP). The fluxgate sensor presented in this study was fabricated through standard micro-electro-mechanical system technology. A multi-loop magnetic core made of Fe-based amorphous ribbon was employed as the sensing element, and 3-D solenoid copper coils were used to control the sensing core. Antibody-conjugated immunomagnetic microbeads were strategically utilized as signal tags to label the CRP via the specific conjugation of CRP to polyclonal CRP antibodies. Separate Au film substrates were applied as immunoplatforms to immobilize the CRP-bead labels through classical sandwich assays. Detection and quantification of CRP at different concentrations were implemented by detecting the stray field of the CRP-labeled magnetic beads using the newly developed micro-fluxgate sensor. The resulting system exhibited the required sensitivity, stability, reproducibility, and selectivity. A detection limit as low as 0.002 μg/mL CRP with a linear range from 0.002 μg/mL to 10 μg/mL was achieved, suggesting that the proposed biosystem possesses high sensitivity. In addition to the extremely low detection limit, the proposed method is easy to manipulate and has a quick response time: the response time of our sensor was less than 5 s, and the entire detection period for CRP analysis can be completed in less than 30 min. Given this detection performance and other advantages such as miniaturization, excellent stability and specificity, the proposed biosensor can be considered a potential candidate for the rapid analysis of CRP, especially for point-of-care platforms. PMID:29601593

  11. Prediction of Chemical Respiratory and Contact Sensitizers by OX40L Expression in Dendritic Cells Using a Novel 3D Coculture System.

    PubMed

    Mizoguchi, Izuru; Ohashi, Mio; Chiba, Yukino; Hasegawa, Hideaki; Xu, Mingli; Owaki, Toshiyuki; Yoshimoto, Takayuki

    2017-01-01

    The use of animal models in chemical safety testing will be significantly limited due to the recent introduction of the 3Rs principle of animal experimentation in research. Although several in vitro assays to predict the sensitizing potential of chemicals have been developed, these methods cannot distinguish chemical respiratory sensitizers and skin sensitizers. In the present study, we describe a novel in vitro assay that can discriminate respiratory sensitizers from chemical skin sensitizers by taking advantage of the fundamental difference between their modes of action, namely the development of the T helper 2 immune response, which is critically important for respiratory sensitization. First, we established a novel three-dimensional (3D) coculture system of human upper airway epithelium using a commercially available scaffold. It consists of human airway epithelial cell line BEAS-2B, immature dendritic cells (DCs) derived from human peripheral blood CD14 + monocytes, and human lung fibroblast cell line MRC-5. Respective cells were first cultured in individual scaffolds and subsequently assembled into a 3D multi-cell tissue model to more closely mimic the in vivo situation. Then, three typical chemicals that are known respiratory sensitizers (ortho-phthaldialdehyde, hexamethylene diisocyanate, and trimellitic anhydride) and skin sensitizers (oxazolone, formaldehyde, and dinitrochlorobenzene) were added individually to the 3D coculture system. Immunohistochemical analysis revealed that DCs do not migrate into other scaffolds under the experimental conditions. Therefore, the 3D structure was disassembled and real-time reverse transcriptase-PCR analysis was performed in individual scaffolds to analyze the expression levels of molecules critical for Th2 differentiation such as OX40 ligand (OX40L), interleukin (IL)-4, IL-10, IL-33, and thymic stromal lymphopoietin. 
Both classes of sensitizers similarly augmented expression of DC maturation markers (e.g., CD86), but among the molecules examined, OX40L expression in DCs was most consistently and significantly enhanced by respiratory sensitizers compared with skin sensitizers. Thus, we have established a 3D coculture system mimicking the upper airway epithelium that may be successfully applied to discriminate chemical respiratory sensitizers from skin sensitizers by measuring the critical molecule for Th2 differentiation, OX40L, in DCs.

  12. Analysis of the Elodea nuttallii transcriptome in response to mercury and cadmium pollution: development of sensitive tools for rapid ecotoxicological testing.

    PubMed

    Regier, Nicole; Baerlocher, Loïc; Münsterkötter, Martin; Farinelli, Laurent; Cosio, Claudia

    2013-08-06

    Toxic metals polluting aquatic ecosystems are taken up by inhabitants and accumulate in the food web, affecting species at all trophic levels. It is therefore important to have good tools to assess the level of risk posed by toxic metals in the environment. Macrophytes are promising organisms for the identification of metal-responsive biomarkers but are still underrepresented in ecotoxicology. In the present study, we used next-generation sequencing to investigate the transcriptomic response of Elodea nuttallii exposed to elevated concentrations of Hg and Cd. We de novo assembled more than 60 000 contigs, of which 170 were regulated dose-dependently by Hg and 212 by Cd. Functional analysis showed that these genes were notably related to energy and metal homeostasis. Expression analysis of a subset of genes using nCounter showed that the gene expression pattern could assess toxic metal exposure in complex environmental samples and was more sensitive than other end points (e.g., bioaccumulation, photosynthesis). In conclusion, we demonstrate the feasibility of using gene expression signatures for the assessment of environmental contamination with an organism lacking prior genetic information. This is of interest to ecotoxicology in a wider sense given the possibility of developing specific and sensitive bioassays.

  13. Sensitivity analysis of 1-D dynamical model for basin analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, S.

    1987-01-01

    Geological processes related to petroleum generation, migration and accumulation are very complicated in terms of the time and variables involved, and it is very difficult to simulate these processes by laboratory experiments. For this reason, many mathematical/computer models have been developed to simulate these geological processes based on geological, geophysical and geochemical principles. The sensitivity analysis in this study is a comprehensive examination of how geological, geophysical and geochemical parameters influence the reconstruction of geohistory, thermal history and hydrocarbon generation history using the 1-D fluid flow/compaction model developed by the Basin Modeling Group at the University of South Carolina. This study shows the effects of commonly used parameters such as depth, age, lithology, porosity, permeability, unconformity (eroded thickness and erosion time), temperature at the sediment surface, bottom hole temperature, present-day heat flow, thermal gradient, thermal conductivity, and kerogen type and content on the evolution of formation thickness, porosity, permeability and pressure with time and depth; heat flow with time; temperature with time and depth; vitrinite reflectance (Ro) and TTI with time and depth; the oil window in terms of time and depth; and the amount of hydrocarbons generated with time and depth. Lithology, present-day heat flow and thermal conductivity are the most sensitive parameters in the reconstruction of temperature history.
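The TTI mentioned above is Lopatin's time-temperature index of thermal maturity. A minimal sketch of its computation, using the standard r = 2 convention with n = 0 for the 100-110 °C interval and a hypothetical burial history:

```python
def lopatin_tti(history):
    """Lopatin time-temperature index: TTI = sum over burial-history steps
    of 2**n * dt, where n indexes the 10 degC interval the step's mean
    temperature falls in (n = 0 for 100-110 degC) and dt is the time spent
    there in millions of years."""
    tti = 0.0
    for temp_c, dt_my in history:
        n = (temp_c - 100.0) // 10.0  # interval index; negative below 100 C
        tti += (2.0 ** n) * dt_my
    return tti

# hypothetical burial history: (mean temperature in degC, duration in My)
tti = lopatin_tti([(80.0, 20.0), (110.0, 10.0), (130.0, 5.0)])
```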

  14. Analysis of Lidar Remote Sensing Concepts

    NASA Technical Reports Server (NTRS)

    Spiers, Gary D.

    1999-01-01

    Line-of-sight velocity and measurement position sensitivity analyses for an orbiting coherent Doppler lidar are developed and applied to two lidars: one with a nadir angle of 30 deg. in a 300 km altitude, 58 deg. inclination orbit, and a second with a 45 deg. nadir angle in an 833 km altitude, 89 deg. inclination orbit. The influence of orbit-related effects on the backscatter sensitivity of a coherent Doppler lidar is also discussed. Draft performance estimates, error budgets and payload accommodation requirements for the SPARCLE (Space Readiness Coherent Lidar) instrument were also developed and documented.

  15. [Determination of triterpenoic acids in fruits of Ziziphus jujuba using HPLC-MS with polymeric ODS column].

    PubMed

    Zhang, Yong; Zhou, An; Xie, Xiao-Mei

    2013-03-01

    A simple and sensitive method has been developed to simultaneously determine betulinic acid, oleanolic acid and ursolic acid in the fruits of Ziziphus jujuba from different regions by HPLC-MS. The HPLC assay was performed on a PAH polymeric C18 bonded stationary phase column with a mobile phase of acetonitrile-water (90:10) and negative ESI detection. The developed approach features short chromatographic separation times, high sensitivity and good reliability, meeting the requirements for rapid analysis of large batches of Z. jujuba fruits from different habitats.

  16. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time-consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
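As one concrete illustration of the surrogate idea, the sketch below builds an inverse-distance-weighted interpolant from a handful of "expensive" samples. This is a deliberately simple stand-in, not the kriging or polynomial response surfaces the paper itself surveys, and the sampled function is hypothetical.

```python
def idw_surrogate(samples, power=2.0):
    """Build a cheap surrogate from high-fidelity samples using
    inverse-distance weighting: predictions are distance-weighted averages
    of the sampled responses."""
    def predict(x):
        num = den = 0.0
        for xs, ys in samples:
            d = sum((a - b) ** 2 for a, b in zip(x, xs)) ** (power / 2.0)
            if d == 0.0:
                return ys  # interpolates the sampled points exactly
            num += ys / d
            den += 1.0 / d
        return num / den
    return predict

# hypothetical "expensive" evaluations of f(x, y) = x + y at four corners
pts = [((0.0, 0.0), 0.0), ((1.0, 0.0), 1.0),
       ((0.0, 1.0), 1.0), ((1.0, 1.0), 2.0)]
f_hat = idw_surrogate(pts)
```

Once fitted, `f_hat` can be evaluated thousands of times for sensitivity or optimization studies at negligible cost, which is the core appeal of SBAO.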

  17. Analysis of glycosaminoglycan-derived disaccharides by capillary electrophoresis using laser-induced fluorescence detection

    PubMed Central

    Chang, Yuqing; Yang, Bo; Zhao, Xue; Linhardt, Robert J.

    2012-01-01

    A quantitative and highly sensitive method for the analysis of glycosaminoglycan (GAG)-derived disaccharides is presented that relies on capillary electrophoresis (CE) with laser-induced fluorescence (LIF) detection. This method enables complete separation of seventeen GAG-derived disaccharides in a single run. Unsaturated disaccharides were derivatized with 2-aminoacridone (AMAC) to improve sensitivity. The limit of detection was at the attomole level, about 100-fold more sensitive than traditional CE-ultraviolet detection. A CE separation timetable was developed to achieve complete resolution and shorten analysis time. The RSDs of migration time and peak area at both low and high concentrations of unsaturated disaccharides were less than 2.7% and 3.2%, respectively, demonstrating that this is a reproducible method. The analysis was successfully applied to cultured Chinese hamster ovary cell samples for determination of GAG disaccharides. The current method simplifies GAG extraction steps and reduces the inaccuracy in calculating ratios of heparin/heparan sulfate to chondroitin sulfate/dermatan sulfate that results from separate analyses of a single sample. PMID:22609076

  18. Sum over Histories Representation for Kinetic Sensitivity Analysis: How Chemical Pathways Change When Reaction Rate Coefficients Are Varied

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Shirong; Davis, Michael J.; Skodje, Rex T.

    2015-11-12

    The sensitivity of kinetic observables is analyzed using a newly developed sum over histories representation of chemical kinetics. In the sum over histories representation, the concentrations of the chemical species are decomposed into the sum of probabilities for chemical pathways that follow molecules from reactants to products or intermediates. Unlike static flux methods for reaction path analysis, the sum over histories approach includes the explicit time dependence of the pathway probabilities. Using the sum over histories representation, the sensitivity of an observable with respect to a kinetic parameter such as a rate coefficient is then analyzed in terms of how that parameter affects the chemical pathway probabilities. The method is illustrated for species concentration target functions in H2 combustion, where the rate coefficients are allowed to vary over their associated uncertainty ranges. It is found that large sensitivities are often associated with rate-limiting steps along important chemical pathways or with reactions that control the branching of reactive flux.
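In the long-time limit, a pathway probability in a first-order network reduces to a product of branching ratios at each species along the path. The sketch below shows only this static limit; it omits the explicit time dependence that distinguishes the sum-over-histories method, and the network and rate constants are hypothetical.

```python
def pathway_probability(rates, path):
    """Probability that a molecule follows a given sequence of species in a
    first-order reaction network: at each step, the branching ratio is
    k(a -> b) divided by the total rate out of species a."""
    p = 1.0
    for a, b in zip(path, path[1:]):
        total_out = sum(k for (src, _), k in rates.items() if src == a)
        p *= rates[(a, b)] / total_out
    return p

# hypothetical network: A -> B (k=3), A -> C (k=1), B -> D (k=2), B -> C (k=2)
rates = {("A", "B"): 3.0, ("A", "C"): 1.0, ("B", "D"): 2.0, ("B", "C"): 2.0}
p_ABD = pathway_probability(rates, ["A", "B", "D"])  # 0.75 * 0.5 = 0.375
```

Perturbing one rate coefficient and recomputing the pathway probabilities is then a direct way to see which steps are rate-limiting for a given pathway.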

  19. Application of positron annihilation lineshape analysis to fatigue damage and thermal embrittlement for nuclear plant materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uchida, M.; Ohta, Y.; Nakamura, N.

    1995-08-01

    Positron annihilation (PA) lineshape analysis is sensitive for detecting microstructural defects such as vacancies and dislocations. The authors are developing a portable system and applying this technique to nuclear power plant material evaluations: fatigue damage in type 316 stainless steel and SA508 low alloy steel, and thermal embrittlement in duplex stainless steel. The PA technique was found to be sensitive in the early fatigue life (up to 10%), but showed little sensitivity in later stages of the fatigue life in both type 316 stainless steel and SA508 ferritic steel. Type 316 steel showed a higher PA sensitivity than SA508 since the initial SA508 microstructure already contained a high dislocation density in the as-received state. The PA parameter increased as a function of aging time in CF8M samples aged at 350 C and 400 C, but did not change much in CF8 samples.

  20. Sensitivity analysis of reactive ecological dynamics.

    PubMed

    Verdy, Ariane; Caswell, Hal

    2008-08-01

    Ecological systems with asymptotically stable equilibria may exhibit significant transient dynamics following perturbations. In some cases, these transient dynamics include the possibility of excursions away from the equilibrium before the eventual return; systems that exhibit such amplification of perturbations are called reactive. Reactivity is a common property of ecological systems, and the amplification can be large and long-lasting. The transient response of a reactive ecosystem depends on the parameters of the underlying model. To investigate this dependence, we develop sensitivity analyses for indices of transient dynamics (reactivity, the amplification envelope, and the optimal perturbation) in both continuous- and discrete-time models written in matrix form. The sensitivity calculations require expressions, some of them new, for the derivatives of equilibria, eigenvalues, singular values, and singular vectors, obtained using matrix calculus. Sensitivity analysis provides a quantitative framework for investigating the mechanisms leading to transient growth. We apply the methodology to a predator-prey model and a size-structured food web model. The results suggest predator-driven and prey-driven mechanisms for transient amplification resulting from multispecies interactions.
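For a linearized system dx/dt = Ax, reactivity is the largest eigenvalue of the symmetric part (A + A^T)/2: a positive value means some perturbations are amplified initially even when the equilibrium is asymptotically stable. A minimal 2x2 sketch with a hypothetical Jacobian:

```python
import math

def reactivity_2x2(A):
    """Reactivity of dx/dt = A x for a 2x2 Jacobian: the largest eigenvalue
    of the symmetric (Hermitian) part H = (A + A^T) / 2, computed from the
    trace and determinant of H."""
    h11 = A[0][0]
    h22 = A[1][1]
    h12 = 0.5 * (A[0][1] + A[1][0])
    tr = h11 + h22
    det = h11 * h22 - h12 * h12
    return tr / 2.0 + math.sqrt(max(tr * tr / 4.0 - det, 0.0))

# hypothetical stable Jacobian: eigenvalues -1 and -2 (triangular matrix),
# yet the strong off-diagonal coupling makes the system reactive
A = [[-1.0, 5.0], [0.0, -2.0]]
r = reactivity_2x2(A)  # > 0 despite asymptotic stability
```

Differentiating this quantity with respect to model parameters is the kind of calculation the matrix-calculus sensitivity formulas in the paper generalize to arbitrary dimension.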

  1. A review of promising new immunoassay technology for monitoring forest herbicides

    Treesearch

    Charles K. McMahon

    1993-01-01

    Rising costs of classical instrumental methods of chemical analysis, coupled with an increasing need for environmental monitoring, have led to the development of highly sensitive, low-cost immunochemical methods of analysis for the detection of environmental contaminants. These methods, known simply as immunoassays, are chemical assays that use antibodies as reagents. A...

  2. A meta-analysis of confocal laser endomicroscopy for the detection of neoplasia in patients with Barrett's esophagus.

    PubMed

    Xiong, Yi-Quan; Ma, Shu-Juan; Zhou, Jun-Hua; Zhong, Xue-Shan; Chen, Qing

    2016-06-01

    Barrett's esophagus (BE) is considered the most important risk factor for development of esophageal adenocarcinoma. Confocal laser endomicroscopy (CLE) is a recently developed technique used to diagnose neoplasia in BE. This meta-analysis was performed to assess the accuracy of CLE for diagnosis of neoplasia in BE. We searched EMBASE, PubMed, the Cochrane Library, and Web of Science to identify relevant studies published in English up to June 27, 2015. The quality of included studies was assessed using QUADAS-2. Per-patient and per-lesion pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio with 95% confidence intervals (CIs) were calculated. In total, 14 studies were included in the final analysis, covering 789 patients with 4047 lesions. Seven studies were included in the per-patient analysis; pooled sensitivity and specificity were 89% (95% CI: 0.82-0.94) and 83% (95% CI: 0.78-0.86), respectively. Ten studies were included in the per-lesion analysis; compared with the per-patient analysis, pooled sensitivity declined to 77% (95% CI: 0.73-0.81) and specificity increased to 89% (95% CI: 0.87-0.90). Subgroup analysis showed that probe-based CLE (pCLE) was superior to endoscope-based CLE (eCLE) in pooled specificity [91.4% (95% CI: 89.7-92.9) vs 86.1% (95% CI: 84.3-87.8)] and in the area under the summary ROC curve (0.885 vs 0.762). Confocal laser endomicroscopy is a valid method to accurately differentiate neoplasms from non-neoplasms in BE. It can be applied to BE surveillance and early diagnosis of esophageal adenocarcinoma. © 2015 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
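Pooled sensitivities of the kind reported here are commonly obtained by inverse-variance weighting on the logit scale. The sketch below is a fixed-effect version with hypothetical study counts; published diagnostic meta-analyses typically use random-effects or bivariate models instead.

```python
import math

def pooled_logit(props):
    """Fixed-effect inverse-variance pooling of proportions on the logit
    scale; props is a list of (events, total) pairs, one per study."""
    num = den = 0.0
    for x, n in props:
        # 0.5 continuity correction guards against x == 0 or x == n
        p = (x + 0.5) / (n + 1.0)
        logit = math.log(p / (1.0 - p))
        var = 1.0 / (x + 0.5) + 1.0 / (n - x + 0.5)
        w = 1.0 / var
        num += w * logit
        den += w
    pooled = num / den
    return 1.0 / (1.0 + math.exp(-pooled))  # back-transform to a proportion

# hypothetical per-study sensitivities: (true positives, diseased patients)
pooled_sens = pooled_logit([(45, 50), (38, 40), (27, 30)])
```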

  3. Recent advances in chemiluminescence detection coupled with capillary electrophoresis and microchip capillary electrophoresis.

    PubMed

    Liu, Yuxuan; Huang, Xiangyi; Ren, Jicun

    2016-01-01

    CE is an ideal analytical method for extremely volume-limited biological microenvironments. However, the small injection volume makes it a challenge to achieve highly sensitive detection. Chemiluminescence (CL) detection is characterized by low background and excellent sensitivity because no light source is required. The coupling of CL with CE and MCE has become a powerful analytical method. So far, this method has been widely applied to chemical analysis, bioassays, drug analysis, and environmental analysis. In this review, we first introduce some developments in CE-CL and MCE-CL systems, and then emphasize applications from the last 10 years. Finally, we discuss future prospects. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Development of an automated scanning monochromator for sensitivity calibration of the MUSTANG instrument

    NASA Astrophysics Data System (ADS)

    Rivers, Thane D.

    1992-06-01

    An Automated Scanning Monochromator was developed using an Acton Research Corporation (ARC) monochromator, an Ealing photomultiplier tube, and a Macintosh PC in conjunction with LabVIEW software. The LabVIEW Virtual Instrument written to operate the ARC monochromator is a mouse-driven, user-friendly program developed for automated spectral data measurements. The resolution and sensitivity of the Automated Scanning Monochromator System were determined experimentally. The automated monochromator was then used for spectral measurements of a platinum lamp. Additionally, the reflectivity curve of a BaSO4-coated screen was measured. The reflectivity measurements indicate a large discrepancy with expected results; further analysis of the reflectivity experiment is required for conclusive results.

  5. Space station integrated wall design and penetration damage control

    NASA Technical Reports Server (NTRS)

    Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.

    1987-01-01

    The analysis code BUMPER executes a numerical solution to the problem of calculating the probability of no penetration (PNP) of a spacecraft subject to man-made orbital debris or meteoroid impact. The code was developed on a DEC VAX 11/780 computer running the Virtual Memory System (VMS) operating system and is written in FORTRAN 77 with no VAX extensions. To help illustrate the steps involved, a single sample analysis is performed. The example used is the space station reference configuration. The finite element model (FEM) of this configuration is relatively complex but demonstrates many BUMPER features. The computer tools and guidelines for constructing a FEM of the space station under consideration are described, along with the methods used to analyze the sensitivity of PNP to variations in design. Ways are suggested for developing contour plots of the sensitivity study data. Additional BUMPER analysis examples are provided, including FEMs, command inputs, and data outputs. The mathematical theory used as the basis for the code is described, illustrating the data flow within the analysis.
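Under the usual Poisson impact model, PNP follows directly from the expected number of penetrating hits accumulated over the exposed surfaces. This is a minimal sketch of that relationship, not BUMPER's actual finite-element computation, and the flux and area values are hypothetical.

```python
import math

def pnp(surfaces, years):
    """Probability of no penetration under a Poisson impact model:
    PNP = exp(-sum(flux * area) * t), where flux is the expected number of
    penetrating impacts per m^2 per year for each surface element."""
    expected_hits = sum(flux * area for flux, area in surfaces) * years
    return math.exp(-expected_hits)

# hypothetical elements: (penetrating flux per m^2-yr, exposed area in m^2)
p = pnp([(1e-5, 200.0), (4e-5, 50.0)], years=10.0)  # exp(-0.04) ~ 0.961
```

A design sensitivity study then amounts to perturbing per-element fluxes (e.g., through shielding changes) and observing the change in PNP.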

  6. Pulsed quantum cascade laser-based cavity ring-down spectroscopy for ammonia detection in breath.

    PubMed

    Manne, Jagadeeshwari; Sukhorukov, Oleksandr; Jäger, Wolfgang; Tulip, John

    2006-12-20

    Breath analysis can be a valuable, noninvasive tool for the clinical diagnosis of a number of pathological conditions. The detection of ammonia in exhaled breath is of particular interest for it has been linked to kidney malfunction and peptic ulcers. Pulsed cavity ringdown spectroscopy in the mid-IR region has developed into a sensitive analytical technique for trace gas analysis. A gas analyzer based on a pulsed mid-IR quantum cascade laser operating near 970 cm(-1) has been developed for the detection of ammonia levels in breath. We report a sensitivity of approximately 50 parts per billion with a 20 s time resolution for ammonia detection in breath with this system. The challenges and possible solutions for the quantification of ammonia in human breath by the described technique are discussed.
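In cavity ring-down spectroscopy, the absorption coefficient follows from the ring-down times measured with and without the absorber, alpha = (1/c)(1/tau - 1/tau0); concentration then follows from the known absorption cross section. The ring-down times below are hypothetical, not values from this instrument.

```python
def crds_absorption(tau_s, tau0_s):
    """Absorption coefficient (per cm) from cavity ring-down times with
    (tau) and without (tau0) the absorber: alpha = (1/c)(1/tau - 1/tau0)."""
    c_cm_per_s = 2.99792458e10  # speed of light in cm/s
    return (1.0 / tau_s - 1.0 / tau0_s) / c_cm_per_s

# hypothetical ring-down times: 1.8 us with analyte vs 2.0 us empty cavity
alpha = crds_absorption(1.8e-6, 2.0e-6)  # absorption coefficient in cm^-1
```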

  7. Micro X-ray diffraction analysis of thin films using grazing-exit conditions.

    PubMed

    Noma, T; Iida, A

    1998-05-01

    An X-ray diffraction technique using a hard X-ray microbeam for thin-film analysis has been developed. To optimize the spatial resolution and the surface sensitivity, the X-ray microbeam strikes the sample surface at a large glancing angle while the diffracted X-ray signal is detected with a small (grazing) exit angle. Kirkpatrick-Baez optics developed at the Photon Factory were used, in combination with a multilayer monochromator, for focusing X-rays. The focused beam size was about 10 x 10 micro m. X-ray diffraction patterns of Pd, Pt and their layered structure were measured. Using a small exit angle, the signal-to-background ratio was improved due to a shallow escape depth. Under the grazing-exit condition, the refraction effect of diffracted X-rays was observed, indicating the possibility of surface sensitivity.

  8. [Study on the automatic parameters identification of water pipe network model].

    PubMed

    Jia, Hai-Feng; Zhao, Qi-Feng

    2010-01-01

    Based on an analysis of problems in the development and application of water pipe network models, the automatic identification of model parameters is regarded as a key bottleneck for the model's application in water supply enterprises. A methodology for automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed. The kernel algorithms of the automatic parameter identification are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte Carlo Sampling) is used for automatic identification of parameters; the detailed technical route based on RSA and MCS is presented. A module for automatic identification of water pipe network model parameters is developed. Finally, a typical water pipe network was selected as a case study on automatic model parameter identification, and satisfactory results were achieved.
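The RSA step can be sketched as a Monte Carlo behavioral/non-behavioral split scored by a Kolmogorov-Smirnov distance between the two marginal parameter distributions. The model, bounds, and behavioral criterion below are hypothetical stand-ins for a pipe-network simulation.

```python
import random

def rsa_ks(model, bounds, behavioral, n=1000, seed=0):
    """Regionalized sensitivity analysis sketch: sample the parameter
    space, split runs into behavioral / non-behavioral, and score each
    parameter by the two-sample Kolmogorov-Smirnov distance between the
    groups' marginal distributions (larger = more sensitive)."""
    random.seed(seed)
    samples = [[random.uniform(lo, hi) for lo, hi in bounds]
               for _ in range(n)]
    flags = [behavioral(model(s)) for s in samples]
    good = [s for s, ok in zip(samples, flags) if ok]
    bad = [s for s, ok in zip(samples, flags) if not ok]
    scores = []
    for i in range(len(bounds)):
        a = sorted(s[i] for s in good)
        b = sorted(s[i] for s in bad)
        ks = max(abs(sum(x <= g for x in a) / len(a)
                     - sum(x <= g for x in b) / len(b))
                 for g in a + b)
        scores.append(ks)
    return scores

# hypothetical model: the output depends on p0 only; p1 is a dummy
ks_scores = rsa_ks(lambda p: p[0], [(0.0, 1.0), (0.0, 1.0)],
                   lambda y: y > 0.7)
```

Parameters with large KS scores (here p0) would be carried forward into the Monte Carlo identification step; the dummy parameter scores near zero.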

  9. Highly Sensitive Ratiometric Fluorescent Sensor for Trinitrotoluene Based on the Inner Filter Effect between Gold Nanoparticles and Fluorescent Nanoparticles.

    PubMed

    Lu, Hongzhi; Quan, Shuai; Xu, Shoufang

    2017-11-08

    In this work, we developed a simple and sensitive ratiometric fluorescent assay for sensing trinitrotoluene (TNT) based on the inner filter effect (IFE) between gold nanoparticles (AuNPs) and ratiometric fluorescent nanoparticles (RFNs), designed by hybridizing green-emissive carbon dots (CDs) and red-emissive quantum dots (QDs) into a silica sphere as a fluorophore pair. Dispersed AuNPs act as a powerful absorber that quenches the CDs, while aggregated AuNPs quench the QDs in IFE-based fluorescent assays, owing to the complementary overlap between the absorption spectrum of the AuNPs and the emission spectra of the RFNs. Because TNT induces the aggregation of AuNPs, the addition of TNT quenches the fluorescence of the QDs while recovering the fluorescence of the CDs, making ratiometric fluorescent detection of TNT feasible. The present IFE-based ratiometric fluorescent sensor can detect TNT ranging from 0.1 to 270 nM, with a detection limit of 0.029 nM. In addition, the developed method was successfully applied to determine TNT in water and soil samples, with satisfactory recoveries ranging from 95 to 103% and precision below 4.5%. The simple sensing approach proposed here could improve the sensitivity of colorimetric analysis by converting ultraviolet analysis to ratiometric fluorescent analysis, and promote the development of dual-mode detection systems.

  10. CUSUM-Logistic Regression analysis for the rapid detection of errors in clinical laboratory test results.

    PubMed

    Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T

    2016-02-01

    The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in the time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient-based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which yielded correlation coefficients between the predicted and measured results of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week, and the time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20, with a mean sensitivity of 97%, considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that used only the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
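The daily CUSUM tally can be sketched as a one-sided tabular CUSUM over per-result error scores. The reference value, alarm threshold, and score stream below are hypothetical, not the paper's fitted values.

```python
def cusum_alarm(scores, k=0.5, h=5.0):
    """One-sided tabular CUSUM: accumulate excursions of a per-result
    error score above reference value k; alarm when the running sum
    crosses threshold h. Returns the 1-based index of the first alarm,
    or None if no alarm occurs."""
    s = 0.0
    for i, x in enumerate(scores, start=1):
        s = max(0.0, s + x - k)  # resets toward zero while in control
        if s > h:
            return i
    return None

# hypothetical stream: in-control scores ~0, then a sustained shift to 1.5
stream = [0.1, 0.0, 0.2, 0.1] + [1.5] * 10
alarm_at = cusum_alarm(stream)  # → 10 (six shifted results accumulate > h)
```

The average run length reported in the abstract corresponds to the expected gap between an error's onset and this first alarm.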

  11. Are quantitative sensitivity analysis methods always reliable?

    NASA Astrophysics Data System (ADS)

    Huang, X.

    2016-12-01

    Physical parameterizations developed to represent subgrid-scale physical processes include various uncertain parameters, leading to large uncertainties in today's Earth System Models (ESMs). Sensitivity analysis (SA) is an efficient approach to quantitatively determine how the uncertainty of the evaluation metric can be apportioned to each parameter. SA can also identify the most influential parameters and thereby reduce the dimensionality of the parametric space. In previous studies, SA approaches such as Sobol' and Fourier amplitude sensitivity testing (FAST) divide the parameters into sensitive and insensitive groups; the first group is retained while the other is eliminated from further study. However, these approaches ignore the loss of the interactive effects between the retained parameters and the eliminated ones, which are also part of the total sensitivity indices. As a result, traditional SA approaches and tools may identify the wrong sensitive parameters. In this study, we propose a dynamic global sensitivity analysis method (DGSAM), which iteratively removes the least important parameter until only two parameters remain. We use CLM-CASA, a global terrestrial model, as an example to verify our findings, with sample sizes ranging from 7,000 to 280,000. The results show that DGSAM identifies more influential parameters, which is confirmed by parameter calibration experiments using four popular optimization methods. For example, optimization using the top three parameters selected by DGSAM achieved a 10% improvement over Sobol', and the computational cost of calibration was reduced to 1/6 of the original. In the future, it will be necessary to explore alternative SA methods that emphasize parameter interactions.
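    The iterative-removal idea can be sketched on a toy problem. The snippet below stands in a three-parameter analytic model for CLM-CASA and uses Jansen's estimator of Sobol' total-effect indices; the paper's exact DGSAM procedure and sample sizes may differ, so this is only an illustrative sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def model(x):
        # Toy model: x0 and x1 interact, x2 has only a weak main effect,
        # so total-effect indices are the right lens for screening.
        return x[:, 0] * x[:, 1] + 4.0 * x[:, 0] ** 2 + 0.05 * x[:, 2]

    def total_indices(f, d, n=4096):
        # Jansen's estimator: ST_i = E[(f(A) - f(AB_i))^2] / (2 Var(Y)).
        A = rng.uniform(-1, 1, size=(n, d))
        B = rng.uniform(-1, 1, size=(n, d))
        fA = f(A)
        var = fA.var()
        st = np.empty(d)
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]          # replace column i with the B sample
            st[i] = np.mean((fA - f(ABi)) ** 2) / (2.0 * var)
        return st

    def restricted(f, active, d_full):
        # Fix eliminated parameters at a nominal value (0 here).
        def g(x):
            full = np.zeros((x.shape[0], d_full))
            full[:, active] = x
            return f(full)
        return g

    # DGSAM-like loop: drop the parameter with the smallest total-effect
    # index, then re-estimate on the reduced space, until two remain.
    d = 3
    active, removed = list(range(d)), []
    while len(active) > 2:
        st = total_indices(restricted(model, active, d), len(active))
        removed.append(active.pop(int(np.argmin(st))))
    print(removed, active)
    ```

    Re-estimating the indices after each removal is what distinguishes this loop from a one-shot screening: interactions among the surviving parameters are re-apportioned at every step.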

  12. Parameterization of the InVEST Crop Pollination Model to spatially predict abundance of wild blueberry (Vaccinium angustifolium Aiton) native bee pollinators in Maine, USA

    USGS Publications Warehouse

    Groff, Shannon C.; Loftin, Cynthia S.; Drummond, Frank; Bushmann, Sara; McGill, Brian J.

    2016-01-01

    Non-native honeybees historically have been managed for crop pollination; however, recent population declines draw attention to pollination services provided by native bees. We applied the InVEST Crop Pollination model, developed to predict native bee abundance from habitat resources, in Maine's wild blueberry crop landscape. We evaluated model performance with parameters informed by four approaches: 1) expert opinion; 2) sensitivity analysis; 3) sensitivity-analysis-informed model optimization; and 4) simulated annealing (uninformed) model optimization. Uninformed optimization improved model performance by 29% compared to the expert-opinion-informed model, while sensitivity-analysis-informed optimization improved model performance by 54%. This suggests that expert opinion may not yield the best parameter values for the InVEST model. The proportion of deciduous/mixed forest within 2000 m of a blueberry field also reliably predicted native bee abundance in blueberry fields; however, the InVEST model provides an efficient tool to estimate bee abundance beyond the field perimeter.

  13. Advances in ultrasensitive mass spectrometry of organic molecules.

    PubMed

    Kandiah, Mathivathani; Urban, Pawel L

    2013-06-21

    Ultrasensitive mass spectrometric analysis of organic molecules is important for various branches of chemistry, and other fields including physics, earth and environmental sciences, archaeology, biomedicine, and materials science. It finds applications, as an enabling tool, in systems biology, biological imaging, clinical analysis, and forensics. Although there are a number of technical obstacles associated with the analysis of samples by mass spectrometry at the ultratrace level (for example, analyte losses during sample preparation, insufficient sensitivity, and ion suppression), several noteworthy developments have been made over the years. They include sensitive ion sources, loss-free interfaces, ion optics components, efficient mass analyzers and detectors, as well as "smart" sample preparation strategies. Some of the mass spectrometric methods published to date achieve sensitivity several orders of magnitude higher than that of alternative approaches. Femto- and attomole level limits of detection are nowadays common, while zepto- and yoctomole level limits of detection have also been reported. We envision that ultrasensitive mass spectrometric assays will soon contribute to new discoveries in bioscience and other areas.

  14. A sensitive continuum analysis method for gamma ray spectra

    NASA Technical Reports Server (NTRS)

    Thakur, Alakh N.; Arnold, James R.

    1993-01-01

    In this work we examine ways to improve the sensitivity of the analysis procedure for gamma ray spectra with respect to small differences in the continuum (Compton) spectra. The method developed is applied to analyze gamma ray spectra obtained from planetary mapping by the Mars Observer spacecraft launched in September 1992. Calculated Mars simulation spectra and actual thick target bombardment spectra have been taken as test cases. The principle of the method rests on the extraction of continuum information from Fourier transforms of the spectra. We study how a better estimate of the spectrum from larger regions of the Mars surface will improve the analysis for smaller regions with poorer statistics. Estimation of signal within the continuum is done in the frequency domain which enables efficient and sensitive discrimination of subtle differences between two spectra. The process is compared to other methods for the extraction of information from the continuum. Finally we explore briefly the possible uses of this technique in other applications of continuum spectra.
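    The frequency-domain separation idea can be illustrated on a synthetic spectrum: narrow photopeaks occupy high spatial frequencies while the Compton continuum is smooth, so low-pass filtering the Fourier transform yields a continuum estimate. The spectrum shape, peak positions, and cutoff below are invented for illustration; this is not the Mars Observer analysis pipeline itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic gamma-ray spectrum: a smooth continuum plus a few narrow
    # photopeaks and Poisson counting noise (all parameters hypothetical).
    ch = np.arange(1024)
    continuum = 500.0 * np.exp(-ch / 400.0)
    peaks = sum(a * np.exp(-0.5 * ((ch - c) / 3.0) ** 2)
                for a, c in [(300, 200), (200, 520), (150, 800)])
    spectrum = rng.poisson(continuum + peaks).astype(float)

    # Frequency-domain separation: keep only the lowest harmonics of the
    # real FFT; narrow peaks and noise live at higher frequencies.
    F = np.fft.rfft(spectrum)
    cutoff = 10                    # number of harmonics kept (a tuning choice)
    F[cutoff:] = 0.0
    continuum_est = np.fft.irfft(F, n=ch.size)

    # Mean absolute error in a peak-free region of the spectrum.
    err = np.abs(continuum_est - continuum)[50:150].mean()
    print(err)
    ```

    A hard cutoff introduces some ringing near sharp peaks and at the array edges (the FFT assumes periodicity), which is why practical implementations taper the filter; even so, the low-pass estimate tracks the true continuum closely in peak-free regions.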

  15. Analysis of imazaquin in soybeans by solid-phase extraction and high-performance liquid chromatography.

    PubMed

    Guo, C; Hu, J-Y; Chen, X-Y; Li, J-Z

    2008-02-01

    An analytical method for the determination of imazaquin residues in soybeans was developed. The liquid/liquid partition and strong anion-exchange solid-phase extraction procedures provide effective cleanup, removing most sample matrix interferences. By optimizing the pH of the water/acetonitrile mobile phase with phosphoric acid, using a C-18 reverse-phase chromatographic column, and employing ultraviolet detection, excellent peak resolution was achieved. The combined cleanup and chromatographic steps reported herein were sensitive and reliable for determining imazaquin residues in soybean samples. The method is characterized by recovery >88.4%, precision <6.7% CV, and a sensitivity of 0.005 ppm, in agreement with directives for method validation in residue analysis. Imazaquin residues in soybeans were further confirmed by high-performance liquid chromatography-mass spectrometry (LC-MS). The proposed method was successfully applied to the analysis of imazaquin residues in soybean samples grown in an experimental field after treatment with an imazaquin formulation.

  16. Advances in on-chip photodetection for applications in miniaturized genetic analysis systems

    NASA Astrophysics Data System (ADS)

    Namasivayam, Vijay; Lin, Rongsheng; Johnson, Brian; Brahmasandra, Sundaresh; Razzacki, Zafar; Burke, David T.; Burns, Mark A.

    2004-01-01

    Microfabrication techniques have become increasingly popular in the development of next generation DNA analysis devices. Improved on-chip fluorescence detection systems may have applications in developing portable hand-held instruments for point-of-care diagnostics. Miniaturization of fluorescence detection involves construction of ultra-sensitive photodetectors that can be integrated onto a fluidic platform combined with the appropriate optical emission filters. We have previously demonstrated integration of PIN photodiodes onto a microfabricated electrophoresis channel for separation and detection of DNA fragments. In this work, we present an improved detector structure that uses a PINN+ photodiode with an on-chip interference filter and a robust liquid barrier layer. This new design yields high sensitivity (detection limit of 0.9 ng µl-1 of DNA), low noise (S/N ~ 100/1) and enhanced quantum efficiencies (>80%) over the entire visible spectrum. Applications of these photodiodes in various areas of DNA analysis such as microreactions (PCR), separations (electrophoresis) and microfluidics (drop sensing) are presented.

  17. Additional EIPC Study Analysis. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, Stanton W; Gotham, Douglas J.; Luciani, Ralph L.

    Between 2010 and 2012 the Eastern Interconnection Planning Collaborative (EIPC) conducted a major long-term resource and transmission study of the Eastern Interconnection (EI). With guidance from a Stakeholder Steering Committee (SSC) that included representatives from the Eastern Interconnection States Planning Council (EISPC) among others, the project was conducted in two phases. Phase 1 involved a long-term capacity expansion analysis that involved creation of eight major futures plus 72 sensitivities. Three scenarios were selected for more extensive transmission-focused evaluation in Phase 2. Five power flow analyses, nine production cost model runs (including six sensitivities), and three capital cost estimations were developed during this second phase. The results from Phase 1 and 2 provided a wealth of data that could be examined further to address energy-related questions. A list of 14 topics was developed for further analysis. This paper brings together the earlier interim reports of the first 13 topics plus one additional topic into a single final report.

  18. Flow analysis and design optimization methods for nozzle-afterbody of a hypersonic vehicle

    NASA Technical Reports Server (NTRS)

    Baysal, O.

    1992-01-01

    This report summarizes the methods developed for the aerodynamic analysis and the shape optimization of the nozzle-afterbody section of a hypersonic vehicle. Initially, exhaust gases were assumed to be air. Internal-external flows around a single scramjet module were analyzed by solving the 3D Navier-Stokes equations. Then, exhaust gases were simulated by a cold mixture of Freon and Ar. Two different models were used to compute these multispecies flows as they mixed with the hypersonic airflow. Surface and off-surface properties were successfully compared with the experimental data. The Aerodynamic Design Optimization with Sensitivity analysis was then developed. Pre- and postoptimization sensitivity coefficients were derived and used in this quasi-analytical method. These coefficients were also used to predict inexpensively the flow field around a changed shape when the flow field of an unchanged shape was given. Starting with totally arbitrary initial afterbody shapes, independent computations were converged to the same optimum shape, which rendered the maximum axial thrust.

  19. Flow analysis and design optimization methods for nozzle afterbody of a hypersonic vehicle

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1991-01-01

    This report summarizes the methods developed for the aerodynamic analysis and the shape optimization of the nozzle-afterbody section of a hypersonic vehicle. Initially, exhaust gases were assumed to be air. Internal-external flows around a single scramjet module were analyzed by solving the three-dimensional Navier-Stokes equations. Then, exhaust gases were simulated by a cold mixture of Freon and Argon. Two different models were used to compute these multispecies flows as they mixed with the hypersonic airflow. Surface and off-surface properties were successfully compared with the experimental data. In the second phase of this project, the Aerodynamic Design Optimization with Sensitivity analysis (ADOS) was developed. Pre- and post-optimization sensitivity coefficients were derived and used in this quasi-analytical method. These coefficients were also used to predict inexpensively the flow field around a changed shape when the flow field of an unchanged shape was given. Starting with totally arbitrary initial afterbody shapes, independent computations were converged to the same optimum shape, which rendered the maximum axial thrust.

  20. A sensitivity analysis on seismic tomography data with respect to CO2 saturation of a CO2 geological sequestration field

    NASA Astrophysics Data System (ADS)

    Park, Chanho; Nguyen, Phung K. T.; Nam, Myung Jin; Kim, Jongwook

    2013-04-01

    Monitoring CO2 migration and storage in geological formations is important not only for the stability of geological sequestration of CO2 but also for efficient management of CO2 injection. In particular, geophysical methods allow in situ observation of CO2 to assess potential leakage, to improve reservoir description, and to monitor the development of geologic discontinuities (i.e., faults, cracks, joints, etc.). Geophysical monitoring can be based on wireline logging for well-scale monitoring (high resolution, narrow area of investigation) or on surface surveys for basin-scale monitoring (low resolution, wide area of investigation). Crosswell tomography, meanwhile, enables reservoir-scale monitoring that bridges the resolution gap between well logs and surface measurements. This study focuses on reservoir-scale monitoring based on crosswell seismic tomography, aiming to describe details of the reservoir structure and to monitor the migration of reservoir fluids (water and CO2). We first perform a sensitivity analysis on crosswell seismic tomography data with respect to CO2 saturation. For this analysis, rock physics models (RPMs) are constructed by calculating the density and the P- and S-wave velocities of a virtual CO2 injection reservoir. Because the seismic velocity of the reservoir changes appreciably with CO2 saturation only when the saturation is below about 20%, and is insensitive to further changes above that level, the sensitivity analysis concentrates on saturations below 20%. For precise simulation of seismic tomography responses for the constructed RPMs, we developed a time-domain 2D elastic modeling code based on the finite difference method with a staggered grid, employing a convolutional perfectly matched layer boundary condition. We further compare the sensitivities of seismic tomography and surface measurements for the RPMs to analyze the resolution difference between them. Moreover, assuming a reservoir situation similar to the CO2 storage site in Nagaoka, Japan, we generate time-lapse tomographic data sets for the corresponding CO2 injection process and make a preliminary interpretation of the data sets.
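    The strong low-saturation sensitivity described above can be reproduced with a standard Gassmann fluid-substitution calculation under Reuss (uniform) fluid mixing, which is a common way to build such rock physics models. The rock and fluid moduli below are generic illustrative values, not those of the study's RPMs.

    ```python
    import math

    def vp_brine_co2(s_co2, phi=0.25):
        """P-wave velocity (~km/s) of a hypothetical sandstone at CO2
        saturation s_co2, via Gassmann substitution with Reuss fluid mixing.
        Moduli in GPa, densities in g/cc; all values illustrative."""
        k_min, k_dry, mu = 37.0, 5.0, 4.0
        k_brine, k_co2 = 2.6, 0.06
        rho_min, rho_brine, rho_co2 = 2.65, 1.03, 0.6
        # Reuss (harmonic) average: a little soft CO2 drops k_fl sharply.
        k_fl = 1.0 / ((1 - s_co2) / k_brine + s_co2 / k_co2)
        num = (1 - k_dry / k_min) ** 2
        den = phi / k_fl + (1 - phi) / k_min - k_dry / k_min ** 2
        k_sat = k_dry + num / den            # Gassmann saturated modulus
        rho = (1 - phi) * rho_min + phi * ((1 - s_co2) * rho_brine
                                           + s_co2 * rho_co2)
        return math.sqrt((k_sat + 4.0 * mu / 3.0) / rho)

    # Velocity drop over 0-20% saturation vs the same-width interval above it.
    drop_low = vp_brine_co2(0.0) - vp_brine_co2(0.2)
    drop_high = vp_brine_co2(0.2) - vp_brine_co2(0.4)
    print(drop_low > 5 * drop_high)
    ```

    The harmonic fluid average is what produces the behavior the abstract exploits: the first increments of CO2 collapse the fluid modulus, so velocity is highly sensitive below roughly 20% saturation and nearly flat beyond it.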

  1. Sensitivity and Nonlinearity of Thermoacoustic Oscillations

    NASA Astrophysics Data System (ADS)

    Juniper, Matthew P.; Sujith, R. I.

    2018-01-01

    Nine decades of rocket engine and gas turbine development have shown that thermoacoustic oscillations are difficult to predict but can usually be eliminated with relatively small ad hoc design changes. These changes can, however, be ruinously expensive to devise. This review explains why linear and nonlinear thermoacoustic behavior is so sensitive to parameters such as operating point, fuel composition, and injector geometry. It shows how nonperiodic behavior arises in experiments and simulations and discusses how fluctuations in thermoacoustic systems with turbulent reacting flow, which are usually filtered or averaged out as noise, can reveal useful information. Finally, it proposes tools to exploit this sensitivity in the future: adjoint-based sensitivity analysis to optimize passive control designs and complex systems theory to warn of impending thermoacoustic oscillations and to identify the most sensitive elements of a thermoacoustic system.

  2. A closure test for time-specific capture-recapture data

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1999-01-01

    The assumption of demographic closure in the analysis of capture-recapture data under closed-population models is of fundamental importance. Yet, little progress has been made in the development of omnibus tests of the closure assumption. We present a closure test for time-specific data that, in principle, tests the null hypothesis of closed-population model M(t) against the open-population Jolly-Seber model as a specific alternative. This test is chi-square, and can be decomposed into informative components that can be interpreted to determine the nature of closure violations. The test is most sensitive to permanent emigration and least sensitive to temporary emigration, and is of intermediate sensitivity to permanent or temporary immigration. This test is a versatile tool for testing the assumption of demographic closure in the analysis of capture-recapture data.

  3. Identification of heat-sensitive QTL derived from common wild rice (Oryza rufipogon Griff.).

    PubMed

    Lei, Dongyang; Tan, Lubin; Liu, Fengxia; Chen, Liyun; Sun, Chuanqing

    2013-03-01

    Understanding the responses of rice plants to heat stress is a challenging, yet crucial, endeavor. A set of introgression lines was previously developed using an advanced backcrossing strategy that involved the elite indica cultivar Teqing as the recipient and an accession of common wild rice (Oryza rufipogon Griff.) as the donor. In this study, we evaluated the responses of 90 of these previously developed introgression lines to heat stress. Five quantitative trait loci (QTLs) related to heat response were detected. The phenotypic variances explained by these QTLs ranged from 6.83% to 14.63%, and O. rufipogon-derived alleles at one locus reduced sensitivity to heat. A heat-sensitive introgression line, YIL106, was identified and characterized. Genotypic analysis demonstrated that YIL106 contained four introgressed segments derived from O. rufipogon and two QTLs (qHTS1-1 and qHTS3) related to heat response. Physiological tests, including measurements of chlorophyll content, electrolyte leakage, malondialdehyde content, and soluble sugar content, were consistent with the heat sensitivity observed in YIL106. Ultrastructural analysis of YIL106 mesophyll cells showed that they were severely damaged following heat stress. This suggests that modification of the cell membrane system is a primary response to heat stress in plants. Identification and characterization of the heat-sensitive line YIL106 may facilitate the isolation of genes associated with the response of rice plants to heat stress. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Efficient Analysis of Complex Structures

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.

    2000-01-01

    The various accomplishments achieved during this project are: (1) A survey of Neural Network (NN) applications using the MATLAB NN Toolbox in structural engineering, especially on equivalent continuum models (Appendix A). (2) Application of NNs and GAs to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B). (3) Development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars and ribs; calculation of numerous test cases and comparison with measurements or FEA results (Appendix C). (4) Basic work on using second-order sensitivities to simulate wing modal response, discussion of sensitivity evaluation approaches, and some results (Appendix D). (5) Establishment of a general methodology for simulating modal responses by direct application of NNs and by sensitivity techniques, in a design space composed of a number of design points; comparison is made through examples using these two methods (Appendix E). (6) Establishment of a general methodology for efficient analysis of complex wing structures by indirect application of NNs: the NN-aided Equivalent Plate Analysis. Training of the neural networks for this purpose in several cases of design spaces, applicable to the actual design of complex wings (Appendix F).

  5. Initial Results: An Ultra-Low-Background Germanium Crystal Array

    DTIC Science & Technology

    2010-09-01

    data (focused on γ-γ coincidence signatures) (Smith et al., 2004) and the Multi-Isotope Coincidence Analysis code (MICA) (Warren et al., 2006). ... The follow-on "CASCADES" project aims to develop a multicoincidence data-analysis package and make robust fission-product demonstration measurements ... sensitivity. This effort is focused on improving gamma analysis capabilities for nuclear detonation detection (NDD) applications, e.g., nuclear treaty

  6. Experimental elaboration and analysis of dye-sensitized TiO2 solar cells (DSSC) dyed by natural dyes and conductive polymers

    NASA Astrophysics Data System (ADS)

    KałuŻyński, P.; Maciak, E.; Herzog, T.; Wójcik, M.

    2016-09-01

    In this paper we propose a low-cost, easily fabricated, fully working dye-sensitized solar cell module made using different sensitizing dyes (various anthocyanins and P3HT) to broaden the absorption spectrum, transparent conducting substrates (vacuum-sputtered chromium and gold), a nanometer-sized TiO2 film, an iodide- and methyl viologen dichloride-based electrolyte, and a counter electrode (vacuum-sputtered platinum or carbon). Moreover, several fabrication technologies and optimization processes were developed to increase energy efficiency and are presented in this paper.

  7. Simple Electrolyzer Model Development for High-Temperature Electrolysis System Analysis Using Solid Oxide Electrolysis Cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JaeHwa Koh; DuckJoo Yoon; Chang H. Oh

    2010-07-01

    An electrolyzer model for the analysis of a hydrogen-production system using a solid oxide electrolysis cell (SOEC) has been developed, and the effects of the principal parameters have been estimated through sensitivity studies based on the developed model. The main parameters considered are current density, area-specific resistance, temperature, pressure, and the molar fractions and flow rates at the inlet and outlet. Finally, a simple model of a high-temperature hydrogen-production system using the solid oxide electrolysis cell integrated with very-high-temperature reactors is evaluated.
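    A minimal sketch of such an electrolyzer model is shown below, assuming a fixed reversible voltage and a purely ohmic area-specific-resistance (ASR) term; the report's actual model, which would compute the Nernst voltage from temperature and gas compositions, is not reproduced here and all numeric values are placeholders.

    ```python
    F = 96485.0  # Faraday constant, C/mol

    def electrolyzer(j, asr, v_rev=0.98, area=100.0):
        """Minimal SOEC sketch: cell voltage from an ohmic ASR term,
        hydrogen rate from Faraday's law.

        j: current density [A/cm^2]; asr: [ohm*cm^2]; area: [cm^2].
        v_rev is a placeholder reversible voltage; a fuller model would
        derive it from the Nernst equation at the operating temperature.
        """
        v_cell = v_rev + j * asr
        i_total = j * area                    # total current, A
        h2_rate = i_total / (2.0 * F)         # mol H2 per second (2 e- per H2)
        power = v_cell * i_total              # electrical power, W
        return v_cell, h2_rate, power

    # One-at-a-time sensitivity of stack power to ASR, the kind of
    # parameter sweep the abstract describes (+10% perturbation).
    base = electrolyzer(0.5, 1.2)
    pert = electrolyzer(0.5, 1.2 * 1.1)
    sens = (pert[2] - base[2]) / base[2] / 0.1   # relative sensitivity
    print(round(sens, 3))
    ```

    Perturbing each parameter in turn and normalizing the change in an output (here, power per unit of hydrogen would be the natural efficiency metric) gives the sensitivity ranking that such studies report.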

  8. Photoacoustic Spectroscopy Analysis of Traditional Chinese Medicine

    NASA Astrophysics Data System (ADS)

    Chen, Lu; Zhao, Bin-xing; Xiao, Hong-tao; Tong, Rong-sheng; Gao, Chun-ming

    2013-09-01

    Chinese medicine is a historic cultural legacy of China. It has made a significant contribution to medicine and healthcare for generations, and the development of Chinese herbal medicine analysis is emphasized by the Chinese pharmaceutical industry. This study carried out the experimental analysis of ten kinds of Chinese herbal powders, including Fritillaria powder, based on the photoacoustic spectroscopy (PAS) method. First, a photoacoustic spectroscopy system was designed and constructed; in particular, a highly sensitive solid photoacoustic cell was built. Second, the experimental setup was verified through the characteristic emission spectrum of the light source, obtained by using carbon as a sample in the photoacoustic cell. Finally, with the photoacoustic spectroscopy analysis of Fritillaria and the other herbs completed, the specificity of the Chinese herbal medicine analysis was verified. This study shows that PAS can provide a valid, highly sensitive analytical method for the specificity analysis of Chinese herbal medicine without sample preparation or damage.

  9. Dynamic analysis of process reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas{trademark} gasification process is used to illustrate the utility of this approach. PyGas{trademark} is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas{trademark} gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  10. Biochemical component identification by plasmonic improved whispering gallery mode optical resonance based sensor

    NASA Astrophysics Data System (ADS)

    Saetchnikov, Vladimir A.; Tcherniavskaia, Elina A.; Saetchnikov, Anton V.; Schweiger, Gustav; Ostendorf, Andreas

    2014-05-01

    Experimental data are presented on the detection and identification of a variety of biochemical agents, such as proteins, microelements, and antibiotics of different generations, in both single- and multi-component solutions over a wide concentration range, analyzed via the light-scattering parameters of a whispering gallery mode optical resonance based sensor. Multiplexing over parameters and components was realized using a developed fluidic sensor cell, with dielectric microspheres fixed in an adhesive layer, together with data processing. Biochemical component identification was performed by the developed network analysis techniques. The approach is shown to be applicable to both single-agent and multi-component biochemical analysis. A novel technique based on optical resonance in microring structures, plasmon resonance, and identification tools has been developed. To improve the sensitivity of the microring structures, the adhesive-fixed microspheres were pretreated with a gold nanoparticle solution; another technique used thin gold films deposited on the substrate below the adhesive. Both biomolecule and nanoparticle injections caused considerable changes in the optical resonance spectra, and plasmonic gold layers of optimized thickness also improved the parameters of the optical resonance spectra. Biochemical component identification was likewise performed by the developed network analysis techniques for both single- and multi-component solutions. The advantages of plasmon-enhanced optical microcavity resonance combined with multiparameter identification tools are thus used to develop a new platform for an ultrasensitive label-free biomedical sensor.

  11. Commercial test kits for detection of Lyme borreliosis: a meta-analysis of test accuracy

    PubMed Central

    Cook, Michael J; Puri, Basant K

    2016-01-01

    The clinical diagnosis of Lyme borreliosis can be supported by various test methodologies; test kits are available from many manufacturers. Literature searches were carried out to identify studies that reported characteristics of the test kits. Of 50 searched studies, 18 were included where the tests were commercially available and samples were proven to be positive using serology testing, evidence of an erythema migrans rash, and/or culture. Additional requirements were a test specificity of ≥85% and publication in the last 20 years. The weighted mean sensitivity for all tests and for all samples was 59.5%. Individual study means varied from 30.6% to 86.2%. Sensitivity for each test technology varied from 62.4% for Western blot kits, and 62.3% for enzyme-linked immunosorbent assay tests, to 53.9% for synthetic C6 peptide ELISA tests and 53.7% when the two-tier methodology was used. Test sensitivity increased as dissemination of the pathogen affected different organs; however, the absence of data on the time from infection to serological testing and the lack of standard definitions for “early” and “late” disease prevented analysis of test sensitivity versus time of infection. The lack of standardization of the definitions of disease stage and the possibility of retrospective selection bias prevented clear evaluation of test sensitivity by “stage”. The sensitivity for samples classified as acute disease was 35.4%, with a corresponding sensitivity of 64.5% for samples from patients defined as convalescent. Regression analysis demonstrated an improvement of 4% in test sensitivity over the 20-year study period. The studies did not provide data to indicate the sensitivity of tests used in a clinical setting since the effect of recent use of antibiotics or steroids or other factors affecting antibody response was not factored in. The tests were developed for only specific Borrelia species; sensitivities for other species could not be calculated. PMID:27920571
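    The sample-size-weighted mean sensitivity reported above can be computed directly from per-study counts, as the sketch below shows. The counts are purely hypothetical placeholders, not the data of the 18 included studies.

    ```python
    # Hypothetical per-study results: (true positives detected, positive samples).
    studies = [(52, 170), (31, 47), (120, 180), (45, 150), (88, 110)]

    tp = sum(d for d, _ in studies)
    n = sum(s for _, s in studies)
    weighted_mean = tp / n                 # sample-size-weighted mean sensitivity
    per_study = [d / s for d, s in studies]
    print(round(100 * weighted_mean, 1), round(100 * min(per_study), 1))
    ```

    Weighting by sample size rather than averaging the per-study sensitivities keeps small studies from dominating the pooled estimate, which is why the pooled figure can sit well away from the unweighted mean of the individual studies.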

  12. Helping teachers conduct sex education in secondary schools in Thailand: overcoming culturally sensitive barriers to sex education.

    PubMed

    Thammaraksa, Pimrat; Powwattana, Arpaporn; Lagampan, Sunee; Thaingtham, Weena

    2014-06-01

    The purpose of this quasi-experimental study was to evaluate the effects of Culturally Sensitive Sex Education Skill Development, a teacher-led sex education program in secondary schools in Thailand. Two public secondary schools in the suburban areas of Bangkok were randomly selected. One was designated as the experimental school and the other as the comparison school. Ninety grade seven and eight teachers, 45 from each school, were selected to participate in the study. Self-efficacy theory and a culturally appropriate basis were applied to develop the program, which included 4 weeks of intervention and 2 weeks of follow-up. Primary outcomes were attitudes toward sex education, perceived self-efficacy, and sex education skills. Statistical analysis included independent and paired t-tests and repeated-measures one-way analysis of variance. At the end of the intervention and during the follow-up period, the intervention group had significantly higher mean scores for attitudes toward sex education, perceived self-efficacy, and sex education skills than before the program (p < .001) and than the comparison group (p < .001). The results showed that Culturally Sensitive Sex Education Skill Development could enhance attitudes and sex education self-efficacy to promote the implementation of sex education among teachers. Copyright © 2014. Published by Elsevier B.V.

  13. Applicability of low-melting-point microcrystalline wax to develop temperature-sensitive formulations.

    PubMed

    Matsumoto, Kohei; Kimura, Shin-Ichiro; Iwao, Yasunori; Itai, Shigeru

    2017-10-30

    Low-melting-point substances are widely used to develop temperature-sensitive formulations. In this study, we focused on microcrystalline wax (MCW) as a low-melting-point substance. We evaluated the drug release behavior of wax matrix (WM) particles prepared with various MCWs under various temperature conditions. WM particles containing acetaminophen were prepared using a spray congealing technique. In the dissolution test at 37°C, WM particles containing low-melting-point MCWs whose melting onset is approximately 40°C (Hi-Mic-1045 or 1070) released some drug initially, followed by only minimal further release. On the other hand, in the dissolution tests at 20 and 25°C for WM particles containing Hi-Mic-1045, and at 20, 25, and 30°C for those containing Hi-Mic-1070, both WM particle types showed faster drug release than at 37°C. The characteristic suppression of drug release at 37°C by WM particles containing low-melting-point MCWs was attributed to MCW melting, as evidenced by differential scanning calorimetry analysis and powder X-ray diffraction analysis. Taken together, low-melting-point MCWs may be applicable to the development of implantable temperature-sensitive formulations in which drug release is accelerated by cooling the administration site. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Robust Optimization and Sensitivity Analysis with Multi-Objective Genetic Algorithms: Single- and Multi-Disciplinary Applications

    DTIC Science & Technology

    2007-01-01

    multi-disciplinary optimization with uncertainty. Robust optimization and sensitivity analysis is usually used when an optimization model has...formulation is introduced in Section 2.3. We briefly discuss several definitions used in the sensitivity analysis in Section 2.4. Following in...2.5. 2.4 SENSITIVITY ANALYSIS In this section, we discuss several definitions used in Chapter 5 for Multi-Objective Sensitivity Analysis . Inner

  15. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  16. CFD and Aeroelastic Analysis of the MEXICO Wind Turbine

    NASA Astrophysics Data System (ADS)

    Carrión, M.; Woodgate, M.; Steijl, R.; Barakos, G.; Gómez-Iradi, S.; Munduate, X.

    2014-12-01

    This paper presents an aerodynamic and aeroelastic analysis of the MEXICO wind turbine, using the compressible HMB solver of Liverpool. The aeroelasticity of the blade, as well as the effect of a low-Mach scheme, was studied for the zero-yaw 15 m/s wind case with steady-state computations. The wake developed behind the rotor was also extracted and compared with the experimental data, using the compressible solver and a low-Mach scheme. It was found that the loads were not sensitive to Mach number effects, although the low-Mach scheme improved the wake predictions. The sensitivity of the results to the blade structural properties was also highlighted.

  17. Sensitivity analysis of discrete structural systems: A survey

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.

    1984-01-01

    Methods for calculating sensitivity derivatives for discrete structural systems are surveyed, primarily covering literature published during the past two decades. Methods are described for calculating derivatives of static displacements and stresses, eigenvalues and eigenvectors, transient structural response, and derivatives of optimum structural designs with respect to problem parameters. The survey is focused on publications addressed to structural analysis, but also includes a number of methods developed in nonstructural fields such as electronics, controls, and physical chemistry which are directly applicable to structural problems. Most notable among the nonstructural-based methods are the adjoint variable technique from control theory, and the Green's function and FAST methods from physical chemistry.

  18. Implementation of structural response sensitivity calculations in a large-scale finite-element analysis system

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Rogers, J. L., Jr.

    1982-01-01

    The methodology used to implement structural sensitivity calculations into a major, general-purpose finite-element analysis system (SPAR) is described. This implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of SPAR are also discussed.

  19. Quantile regression in the presence of monotone missingness with sensitivity analysis

    PubMed Central

    Liu, Minzhao; Daniels, Michael J.; Perri, Michael G.

    2016-01-01

    In this paper, we develop methods for longitudinal quantile regression when there is monotone missingness. In particular, we propose pattern mixture models with a constraint that provides a straightforward interpretation of the marginal quantile regression parameters. Our approach allows sensitivity analysis, which is an essential component of inference for incomplete data. To facilitate computation of the likelihood, we propose a novel way to obtain analytic forms for the required integrals. We conduct simulations to examine the robustness of our approach to modeling assumptions and compare its performance to competing approaches. The model is applied to data from a recent clinical trial on weight management. PMID:26041008
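    As background to the quantile regression machinery used above (the record's pattern mixture models are more elaborate): a quantile estimate minimizes Koenker's check (pinball) loss, and for an intercept-only model the minimizer is the sample quantile. A sketch with invented data, found here by brute-force grid search rather than the linear-programming solvers used in practice:

    ```python
    def check_loss(tau, residuals):
        """Koenker's check (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
        return sum(u * (tau - (1 if u < 0 else 0)) for u in residuals)

    data = [2.0, 3.5, 5.0, 7.5, 11.0]  # invented observations
    tau = 0.5  # the median

    # Brute-force search over candidate intercepts on a 0.1 grid:
    # the sample tau-quantile minimizes the total check loss.
    candidates = [x / 10 for x in range(0, 151)]
    best = min(candidates, key=lambda q: check_loss(tau, [y - q for y in data]))
    print(best)  # -> 5.0 (the sample median)
    ```

    Changing tau to 0.25 or 0.75 shifts the minimizer to the corresponding sample quantile, which is what makes the loss the building block of quantile regression.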

  20. Comparison between two methodologies for urban drainage decision aid.

    PubMed

    Moura, P M; Baptista, M B; Barraud, S

    2006-01-01

    The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology, developed in Brazil, is based on performance-cost analysis; the second is ELECTRE III. Both methodologies were applied to a case study, and sensitivity and robustness analyses were then carried out. These analyses demonstrate that the two methodologies give equivalent results and present low sensitivity and high robustness. These results show that the Brazilian methodology is consistent and can be used safely to select a good solution, or a small set of good solutions, that could be compared with more detailed methods afterwards.

  1. Direct Study of Parenting: A Serendipitous Outcome in a Course on Adult Development

    ERIC Educational Resources Information Center

    Williams, Robert B.

    2015-01-01

    This paper describes the activities of a course on adult development. The course was intended to sensitize participants to the theories and reality of adulthood and aging by introducing them to selected literature on adult development and to the preparation of case records and mastery of activities that permit an analysis of the adult's world. The…

  2. Setting the stage for equity-sensitive monitoring of the maternal and child health Millennium Development Goals.

    PubMed Central

    Wirth, Meg E.; Balk, Deborah; Delamonica, Enrique; Storeygard, Adam; Sacks, Emma; Minujin, Alberto

    2006-01-01

    OBJECTIVE: This analysis seeks to set the stage for equity-sensitive monitoring of the health-related Millennium Development Goals (MDGs). METHODS: We use data from international household-level surveys (Demographic and Health Surveys (DHS) and Multiple Indicator Cluster Surveys (MICS)) to demonstrate that establishing an equity baseline is necessary and feasible, even in low-income and data-poor countries. We assess data from six countries using 11 health indicators and six social stratifiers. Simple bivariate stratification is complemented by simultaneous stratification to expose the compound effect of multiple forms of vulnerability. FINDINGS: The data reveal that inequities are complex and interactive: inferences cannot be drawn about the nature or extent of inequities in health outcomes from a single stratifier or indicator. CONCLUSION: The MDGs and other development initiatives must become more comprehensive and explicit in their analysis and tracking of inequities. The design of policies to narrow health gaps must take into account country-specific inequities. PMID:16878225

  3. GIS least-cost analysis approach for siting gas pipeline ROWs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P.J.; Wilkey, P.L.

    1994-09-01

    Geographic information system (GIS) applications for the siting and monitoring of gas pipeline rights-of-way (ROWs) were developed for areas near Rio Vista, California. The data layers developed for this project represent geographic features such as land cover, elevation, aspect, slope, soils, hydrography, transportation corridors, endangered species habitats, wetlands, and public land surveys. A geographic information system was used to develop and store spatial data from several sources; to manipulate spatial data to evaluate environmental and engineering issues associated with the siting, permitting, construction, maintenance, and monitoring of gas pipeline ROWs; and to graphically display analysis results. Examples of these applications include (1) determination of environmentally sensitive areas, such as endangered species habitat, wetlands, and areas of highly erosive soils; (2) evaluation of engineering constraints, including shallow depth to bedrock, major hydrographic features, and shallow water table; (3) classification of satellite imagery for land use/land cover that will affect ROWs; and (4) identification of alternative ROW corridors that avoid environmentally sensitive areas or areas with severe engineering constraints.

  4. Sensitivity to Structure in the Speech Signal by Children with Speech Sound Disorder and Reading Disability

    ERIC Educational Resources Information Center

    Johnson, Erin Phinney; Pennington, Bruce F.; Lowenstein, Joanna H.; Nittrouer, Susan

    2011-01-01


  5. Understanding aerosol-cloud interactions in the development of orographic cumulus congestus during IPHEx

    NASA Astrophysics Data System (ADS)

    Barros, A. P.; Duan, Y.

    2017-12-01

    A new cloud parcel model (CPM) including activation, condensation, collision-coalescence, and lateral entrainment processes is presented here to investigate aerosol-cloud interactions (ACI) in cumulus development prior to rainfall onset. The CPM was employed along with ground-based radar and surface aerosol measurements to predict the vertical structure of cloud formation at early stages, and was evaluated against airborne observations of cloud microphysics and thermodynamic conditions during the Integrated Precipitation and Hydrology Experiment (IPHEx) over the Southern Appalachian Mountains. Further, the CPM was applied to explore the space of ACI physical parameters controlling cumulus congestus growth that is not available from measurements, and to examine via sensitivity analysis how variations in aerosol properties and microphysical processes influence the evolution and thermodynamic state of clouds over complex terrain. Modeling results indicate that simulated spectra with a low value of the condensation coefficient (0.01) are in good agreement with IPHEx aircraft observations around the same altitude. This is in contrast with the high values reported in previous studies assuming adiabatic conditions. Entrainment is shown to govern the vertical development of clouds and the change of droplet numbers with height, and the sensitivity analysis suggests a trade-off between entrainment strength and the condensation process. Simulated cloud droplet number concentration (CDNC) also exhibits high sensitivity to variations in initial aerosol concentration at cloud base, but weak sensitivity to aerosol hygroscopicity. Exploratory multiple-parcel simulations capture realistic time scales of the vertical development of cumulus congestus (deeper clouds and faster droplet growth). These findings provide new insights into the determinant factors of mid-day cumulus congestus formation, which can explain a large fraction of warm-season rainfall in mountainous regions.

  6. Improved Diffuse Fluorescence Flow Cytometer Prototype for High Sensitivity Detection of Rare Circulating Cells In Vivo

    NASA Astrophysics Data System (ADS)

    Pestana, Noah Benjamin

    Accurate quantification of circulating cell populations is important in many areas of pre-clinical and clinical biomedical research, for example, in the study of cancer metastasis or the immune response following tissue and organ transplants. Normally this is done "ex vivo" by drawing and purifying a small volume of blood and then analyzing it with flow cytometry, hemocytometry, or microfluidic devices, but the sensitivity of these techniques is poor and the process of handling samples has been shown to affect cell viability and behavior. More recently, "in vivo flow cytometry" (IVFC) techniques have been developed in which fluorescently-labeled cells flowing in a small blood vessel in the ear or retina are analyzed, but the sensitivity is generally poor due to the small sampling volume. To address this, our group recently developed a method known as "Diffuse Fluorescence Flow Cytometry" (DFFC) that allows detection and counting of rare circulating cells with diffuse photons, offering extremely high single-cell counting sensitivity. In this thesis, an improved DFFC prototype was designed and validated. The chief improvements were three-fold: i) improved optical collection efficiency, ii) improved detection electronics, and iii) development of a method to mitigate motion artifacts during in vivo measurements. In combination, these improvements yielded an overall instrument detection sensitivity better than 1 cell/mL in vivo, which is the most sensitive IVFC system reported to date. Second, the development and validation of a low-cost microfluidic device reader for analysis of ocular fluids is described. We demonstrate that this device has equivalent or better sensitivity and accuracy compared to a fluorescence microscope, but at an order-of-magnitude lower cost with simplified operation. Future improvements to both instruments are also discussed.

  7. Using microRNA profiling in urine samples to develop a non-invasive test for bladder cancer.

    PubMed

    Mengual, Lourdes; Lozano, Juan José; Ingelmo-Torres, Mercedes; Gazquez, Cristina; Ribal, María José; Alcaraz, Antonio

    2013-12-01

    Current standard methods used to detect and monitor bladder urothelial cell carcinoma (UCC) are invasive or have low sensitivity. The incorporation into clinical practice of a non-invasive tool for UCC assessment would enormously improve patients' quality of life and outcomes. This study aimed to examine microRNA (miRNA) expression profiles in the urine of UCC patients in order to develop a non-invasive, accurate, and reliable tool to diagnose and provide information on the aggressiveness of the tumor. We performed a global miRNA expression profiling analysis of the urinary cells from 40 UCC patients and controls using the TaqMan Human MicroRNA Array, followed by validation of 22 selected potentially diagnostic and prognostic miRNAs in a separate cohort of 277 samples using a miRCURY LNA qPCR system. miRNA-based signatures were developed by multivariate logistic regression analysis and internally cross-validated. In the initial cohort of patients, we identified 40 and 30 aberrantly expressed miRNAs in UCC compared with control urines and in high-grade compared with low-grade tumors, respectively. Quantification of 22 key miRNAs in an independent cohort resulted in the identification of a six-miRNA diagnostic signature with a sensitivity of 84.8% and specificity of 86.5% (AUC = 0.92) and a two-miRNA prognostic model with a sensitivity of 84.95% and a specificity of 74.14% (AUC = 0.83). Internal cross-validation analysis confirmed the accuracy rates of both models, reinforcing the strength of our findings. Although the data need to be externally validated, miRNA analysis in urine appears to be a valuable tool for the non-invasive assessment of UCC. Copyright © 2013 UICC.
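    The performance figures reported in this record (sensitivity, specificity, AUC) can all be computed directly from classifier scores and true labels. A self-contained Python sketch on invented signature scores (not the study's data), using the rank interpretation of AUC: the probability that a randomly chosen positive case outscores a randomly chosen negative one:

    ```python
    def sens_spec(scores, labels, threshold):
        """Sensitivity and specificity at a score threshold (label 1 = disease)."""
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= threshold)
        fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < threshold)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < threshold)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= threshold)
        return tp / (tp + fn), tn / (tn + fp)

    def auc(scores, labels):
        """AUC as P(random positive outscores random negative), ties count half."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Invented signature scores: higher score = more likely cancer (label 1)
    scores = [0.9, 0.8, 0.7, 0.35, 0.6, 0.4, 0.3, 0.2]
    labels = [1, 1, 1, 1, 0, 0, 0, 0]
    se, sp = sens_spec(scores, labels, threshold=0.5)
    print(se, sp, auc(scores, labels))
    ```

    Sweeping the threshold and plotting sensitivity against 1 - specificity traces the ROC curve whose area is the AUC.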

  8. Development and Sensitivity Analysis of a Frost Risk model based primarily on freely distributed Earth Observation data

    NASA Astrophysics Data System (ADS)

    Louka, Panagiota; Petropoulos, George; Papanikolaou, Ioannis

    2015-04-01

    The ability to map the spatiotemporal distribution of extreme climatic conditions, such as frost, is a significant tool in successful agricultural management and decision making. Nowadays, with the development of Earth Observation (EO) technology, it is possible to obtain accurate, timely, and cost-effective information on the spatiotemporal distribution of frost conditions, particularly over large and otherwise inaccessible areas. The present study aimed at developing and evaluating a frost risk prediction model, exploiting primarily EO data from the MODIS and ASTER sensors and ancillary ground observation data. For the evaluation of the model, a region in north-western Greece was selected as the test site and a detailed sensitivity analysis was implemented. The agreement between the model predictions and the observed (remotely sensed) frost frequency obtained by the MODIS sensor was evaluated thoroughly. Detailed comparisons of the model predictions were also performed against reference frost ground observations acquired from the Greek Agricultural Insurance Organization (ELGA) over a 10-year period (2000-2010). Overall, the results evidenced the ability of the model to reproduce frost conditions reasonably well, following largely explainable patterns with respect to the characteristics of the study site and local weather. Implementation of the proposed frost risk model is based primarily on satellite imagery analysis provided nowadays globally at no cost. It is also straightforward and computationally inexpensive, requiring much less effort than, for example, field surveying. Finally, the method is adaptable for potential integration with other high-resolution data available from both commercial and non-commercial vendors. Keywords: Sensitivity analysis, frost risk mapping, GIS, remote sensing, MODIS, Greece

  9. Analysis of Bisphenol A, Alkylphenols, and Alkylphenol Ethoxylates in NIST SRM 2585 and Indoor House Dust by Gas Chromatography-Tandem Mass Spectrometry (GC/MS/MS).

    PubMed

    Fan, Xinghua; Kubwabo, Cariton; Wu, Fang; Rasmussen, Pat E

    2018-06-26

    Background: Ingestion of house dust has been demonstrated to be an important exposure pathway to several contaminants in young children. These compounds include bisphenol A (BPA), alkylphenols (APs), and alkylphenol ethoxylates (APEOs). Analysis of these compounds in house dust is challenging because of the complex composition of the sample matrix. Objective: The objective was to develop a simple and sensitive method to measure BPA, APs, and APEOs in indoor house dust. Methods: An integrated method that involved solvent extraction using sonication, sample cleanup by solid-phase extraction, derivatization by 2,2,2-trifluoro-N-methyl-N-(trimethylsilyl)acetamide, and analysis by GC coupled with tandem MS was developed for the simultaneous determination of BPA, APs, and APEOs in NIST Standard Reference Material (SRM) 2585 (Organic contaminants in house dust) and in settled house dust samples. Results: Target analytes included BPA, 4-tert-octylphenol (OP), OP monoethoxylate, OP diethoxylate, 4-n-nonylphenol (4nNP), 4nNP monoethoxylate (4nNP1EO), branched nonylphenol (NP), NP monoethoxylate, NP diethoxylate, NP triethoxylate, and NP tetraethoxylate. The method was sensitive, with method detection limits ranging from 0.05 to 5.1 μg/g and average recoveries between 82 and 115%. All target analytes were detected in SRM 2585 and house dust except 4nNP and 4nNP1EO. Conclusions: The method is simple and fast, with high sensitivity and good reproducibility. It is applicable to the analysis of the target analytes in similar matrixes, such as sediments, soil, and biosolids. Highlights: Values measured in SRM 2585 will be useful for future research in method development and method comparison.

  10. Novel approach based on one-tube nested PCR and a lateral flow strip for highly sensitive diagnosis of tuberculous meningitis

    PubMed Central

    Sun, Yajuan; Chen, Jiajun; Li, Jia; Xu, Yawei; Jin, Hui; Xu, Na; Yin, Rui

    2017-01-01

    Rapid and sensitive detection of Mycobacterium tuberculosis (M. Tb) in cerebrospinal fluid is crucial in the diagnosis of tuberculous meningitis (TBM), but conventional diagnostic technologies have limited sensitivity and specificity or are time-consuming. In this work, a novel, highly sensitive molecular diagnostic method, the one-tube nested PCR-lateral flow strip test (OTNPCR-LFST), was developed for detecting M. tuberculosis. This one-tube nested PCR maintains the sensitivity of conventional two-step nested PCR while reducing both the chance of cross-contamination and the time required for analysis. The PCR product was detected by a lateral flow strip assay, which provides a basis for migration of the test to a point-of-care (POC) microfluidic format. The developed assay had improved sensitivity compared with traditional PCR, with a limit of detection as low as 1 fg of DNA isolated from M. tuberculosis. The assay was also specific for M. tuberculosis, and no cross-reactions were found with other, non-target bacteria. The application of this technique to clinical samples was successfully evaluated, and OTNPCR-LFST showed 89% overall sensitivity and 100% specificity for TBM patients. This one-tube nested PCR-lateral flow strip assay is useful for detecting M. tuberculosis in TBM due to its rapidity, high sensitivity, and simple manipulation. PMID:29084241

  11. Combining 'Bottom-Up' and 'Top-Down' Methods to Assess Ethnic Difference in Clearance: Bitopertin as an Example.

    PubMed

    Feng, Sheng; Shi, Jun; Parrott, Neil; Hu, Pei; Weber, Cornelia; Martin-Facklam, Meret; Saito, Tomohisa; Peck, Richard

    2016-07-01

    We propose a strategy for studying ethnopharmacology by conducting sequential physiologically based pharmacokinetic (PBPK) prediction (a 'bottom-up' approach) and population pharmacokinetic (popPK) confirmation (a 'top-down' approach), or in reverse order, depending on whether the purpose is ethnic effect assessment for a new molecular entity under development or a tool for ethnic sensitivity prediction for a given pathway. The strategy is exemplified with bitopertin. A PBPK model was built using Simcyp(®) to simulate the pharmacokinetics of bitopertin and to predict the ethnic sensitivity in clearance, given pharmacokinetic data in just one ethnicity. Subsequently, a popPK model was built using NONMEM(®) to assess the effect of ethnicity on clearance, using human data from multiple ethnic groups. A comparison was made to confirm the PBPK-based ethnic sensitivity prediction, using the results of the popPK analysis. PBPK modelling predicted that the bitopertin geometric mean clearance values after 20 mg oral administration in Caucasians would be 1.32-fold and 1.27-fold higher than the values in Chinese and Japanese, respectively. The ratios of typical clearance in Caucasians to the values in Chinese and Japanese estimated by popPK analysis were 1.20 and 1.17, respectively. The popPK analysis results were similar to the PBPK modelling results. As a general framework, we propose that PBPK modelling should be considered to predict ethnic sensitivity of pharmacokinetics prior to any human data and/or with data in only one ethnicity. In some cases, this will be sufficient to guide initial dose selection in different ethnicities. After clinical trials in different ethnicities, popPK analysis can be used to confirm ethnic differences and to support dose justification and labelling. PBPK modelling prediction and popPK analysis confirmation can complement each other to assess ethnic differences in pharmacokinetics at different drug development stages.

  12. Microfluidic single-cell whole-transcriptome sequencing.

    PubMed

    Streets, Aaron M; Zhang, Xiannian; Cao, Chen; Pang, Yuhong; Wu, Xinglong; Xiong, Liang; Yang, Lu; Fu, Yusi; Zhao, Liang; Tang, Fuchou; Huang, Yanyi

    2014-05-13

    Single-cell whole-transcriptome analysis is a powerful tool for quantifying gene expression heterogeneity in populations of cells. Many techniques have thus recently been developed to perform transcriptome sequencing (RNA-Seq) on individual cells. To probe subtle biological variation between samples with limiting amounts of RNA, more precise and sensitive methods are still required. We adapted a previously developed strategy for single-cell RNA-Seq that has shown promise for superior sensitivity and implemented the chemistry in a microfluidic platform for single-cell whole-transcriptome analysis. In this approach, single cells are captured and lysed in a microfluidic device, where mRNAs with poly(A) tails are reverse-transcribed into cDNA. Double-stranded cDNA is then collected and sequenced on a next-generation sequencing platform. We prepared 94 libraries consisting of single mouse embryonic cells and technical replicates of extracted RNA and thoroughly characterized the performance of this technology. Microfluidic implementation increased mRNA detection sensitivity and improved measurement precision compared with tube-based protocols. With 0.2 M reads per cell, we were able to reconstruct a majority of the bulk transcriptome with 10 single cells. We also quantified variation between and within different types of mouse embryonic cells and found that the enhanced measurement precision, detection sensitivity, and experimental throughput aided the distinction between biological variability and technical noise. With this work, we validated the advantages of an early approach to single-cell RNA-Seq and showed that the benefits of combining microfluidic technology with high-throughput sequencing will be valuable for large-scale efforts in single-cell transcriptome analysis.

  13. General Outcome Measures for Verbal Operants

    ERIC Educational Resources Information Center

    Kubina, Richard M., Jr.; Wolfe, Pamela; Kostewicz, Douglas E.

    2009-01-01

    A general outcome measure (GOM) can be used to show progress towards a long-term goal. GOMs should sample domains of behavior across ages, be sensitive to change over time, be inexpensive and easy to use, and facilitate decision making. Skinner's (1957) analysis of verbal behavior may benefit from the development of GOMs. To develop GOMs, we…

  14. Tool development to assess work-related neck and upper limb musculoskeletal disorders among female garment workers in Sri Lanka.

    PubMed

    Amarasinghe, Nirmalie Champika; De AlwisSenevirathne, Rohini

    2016-10-17

    Musculoskeletal disorders (MSDs) have been identified as a predisposing factor for lower productivity, but no validated tool had been developed to assess them in the Sri Lankan context. The aim was to develop a validated tool to assess neck and upper limb MSDs. The study comprised three components: item selection, item reduction using principal component analysis, and validation. A tentative self-administered questionnaire was developed, translated, and pre-tested. Four important domains - neck, shoulder, elbow, and wrist - were identified through principal component analysis. The prevalence of any MSD was 38.1%, and the prevalence of neck, shoulder, elbow, and wrist MSDs was 12.85%, 13.71%, 12%, and 13.71%, respectively. Content and criterion validity of the tool were assessed. Separate ROC curves were produced, and the sensitivity and specificity for the neck (83.1%, 71.7%), shoulder (97.6%, 91.9%), elbow (98.2%, 87.2%), and wrist (97.6%, 94.9%) were determined. Cronbach's alpha and the test-retest correlation coefficient were above 0.7. The tool has high sensitivity, specificity, internal consistency, and test-retest reliability.
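    The internal-consistency criterion in this record (Cronbach's alpha above 0.7) comes from a simple variance decomposition: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A Python sketch with invented questionnaire ratings (not the study's data):

    ```python
    from statistics import pvariance

    def cronbach_alpha(items):
        """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(total scores))."""
        k = len(items)
        totals = [sum(resp) for resp in zip(*items)]  # total score per respondent
        item_var = sum(pvariance(item) for item in items)
        return k / (k - 1) * (1 - item_var / pvariance(totals))

    # Invented responses: 3 questionnaire items, each rated by 5 workers
    items = [
        [3, 4, 3, 5, 4],
        [3, 5, 4, 5, 4],
        [2, 4, 3, 5, 3],
    ]
    print(round(cronbach_alpha(items), 2))
    ```

    When the items move together (as in these invented ratings), total-score variance dominates the summed item variances and alpha approaches 1; uncorrelated items drive it toward 0.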

  15. Indel analysis by droplet digital PCR: a sensitive method for DNA mixture detection and chimerism analysis.

    PubMed

    Santurtún, Ana; Riancho, José A; Arozamena, Jana; López-Duarte, Mónica; Zarrabeitia, María T

    2017-01-01

    Several methods have been developed to determine genetic profiles from mixed samples and to perform chimerism analysis in transplanted patients. The aim of this study was to explore the effectiveness of droplet digital PCR (ddPCR) for mixed chimerism detection (a mixture of genetic profiles resulting from allogeneic hematopoietic stem cell transplantation (HSCT)). We analyzed 25 DNA samples from patients who had undergone HSCT and compared the performance of ddPCR with two established methods for chimerism detection, based on Indel and STR analysis, respectively. Additionally, eight artificial DNA mixtures were created to evaluate the sensitivity of ddPCR. Our results show that the chimerism percentages estimated by the analysis of a single Indel using ddPCR were very similar to those calculated by the amplification of 15 STRs (r² = 0.970) and to the results obtained by the amplification of 38 Indels (r² = 0.975). Moreover, the amplification of a single Indel by ddPCR was sensitive enough to detect a minor DNA contributor comprising as little as 0.5% of the sample. We conclude that ddPCR can be a powerful tool for determining genetic profiles in forensic mixtures and for clinical chimerism analysis when traditional techniques are not sensitive enough.
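    ddPCR quantification in general rests on Poisson statistics: because a droplet can hold more than one template copy, the mean copies per droplet is lambda = -ln(fraction of negative droplets), and a chimerism fraction can be taken as the ratio of a marker-specific concentration to a reference-assay concentration. A hedged sketch of this standard correction with invented droplet counts (the reference assay and counts are illustrative assumptions, not the paper's pipeline):

    ```python
    import math

    def copies_per_partition(total_droplets, positive_droplets):
        """Poisson-corrected mean copies per droplet: lambda = -ln(neg fraction)."""
        neg_fraction = (total_droplets - positive_droplets) / total_droplets
        return -math.log(neg_fraction)

    # Invented droplet counts for a hypothetical recipient-specific Indel assay
    total = 15000
    recipient_positive = 150     # droplets positive for the recipient-specific Indel
    reference_positive = 14000   # droplets positive for a reference assay (total DNA)

    lam_recipient = copies_per_partition(total, recipient_positive)
    lam_reference = copies_per_partition(total, reference_positive)
    chimerism_pct = 100 * lam_recipient / lam_reference
    print(f"{chimerism_pct:.2f}%")
    ```

    The Poisson correction matters most at high positive-droplet fractions, where raw counts would badly underestimate the reference concentration.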

  16. Mesoporous structured MIPs@CDs fluorescence sensor for highly sensitive detection of TNT.

    PubMed

    Xu, Shoufang; Lu, Hongzhi

    2016-11-15

    A facile strategy was developed to prepare a mesoporous structured molecularly imprinted polymer-capped carbon dot (M-MIPs@CDs) fluorescence sensor for the highly sensitive and selective determination of TNT. The strategy, using amino-CDs directly as the "functional monomer" for imprinting, simplifies the imprinting process and provides good accessibility of the recognition sites. The as-prepared M-MIPs@CDs sensor, using periodic mesoporous silica as the imprinting matrix and amino-CDs directly as the "functional monomer", exhibited excellent selectivity and sensitivity toward TNT, with a detection limit of 17 nM. The sensor could be recycled 10 times without an obvious decrease in efficiency. The feasibility of the developed method for real samples was successfully evaluated through the analysis of TNT in soil and water samples, with satisfactory recoveries of 88.6-95.7%. The method proposed in this work is a convenient and practical way to prepare highly sensitive and selective fluorescence MIPs@CDs sensors. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Uncovering psychosocial needs: perspectives of Australian child and family health nurses in a sustained home visiting trial.

    PubMed

    Kardamanidis, Katina; Kemp, Lynn; Schmied, Virginia

    2009-08-01

    The first Australian trial of sustained nurse home visiting provided an opportunity to explore nurses' understanding of the situations that support mothers of infants to disclose personal and sensitive psychosocial information. Using a qualitative descriptive design, semi-structured interviews were conducted and transcripts were analysed drawing upon aspects of Smith's interpretative phenomenological analysis. Five themes pertaining to the experience of relationship building to foster disclosure of sensitive information emerged: (1) building trust is an ongoing process of giving and giving in return, (2) being 'actively passive' to develop trust, (3) the client is in control of the trust-relationship, (4) the association between disclosure of sensitive issues and a trust-relationship, and (5) empowerment over disclosure. This study provides a deeper understanding of how child and family health nurses develop relationships that lead women to entrust the nurse with personal, sensitive information, and may inform the practice of psychosocial needs assessment in other contexts.

  18. Thalamic functional connectivity predicts seizure laterality in individual TLE patients: application of a biomarker development strategy.

    PubMed

    Barron, Daniel S; Fox, Peter T; Pardoe, Heath; Lancaster, Jack; Price, Larry R; Blackmon, Karen; Berry, Kristen; Cavazos, Jose E; Kuzniecky, Ruben; Devinsky, Orrin; Thesen, Thomas

    2015-01-01

    Noninvasive markers of brain function could yield biomarkers in many neurological disorders. Disease models constrained by coordinate-based meta-analysis are likely to increase this yield. Here, we evaluate a thalamic model of temporal lobe epilepsy that we proposed in a coordinate-based meta-analysis and extended in a diffusion tractography study of an independent patient population. Specifically, we evaluated whether thalamic functional connectivity (resting-state fMRI-BOLD) with temporal lobe areas can predict seizure onset laterality, as established with intracranial EEG. Twenty-four lesional and non-lesional temporal lobe epilepsy patients were studied. No significant differences in functional connection strength in patient and control groups were observed with Mann-Whitney Tests (corrected for multiple comparisons). Notwithstanding the lack of group differences, individual patient difference scores (from control mean connection strength) successfully predicted seizure onset zone as shown in ROC curves: discriminant analysis (two-dimensional) predicted seizure onset zone with 85% sensitivity and 91% specificity; logistic regression (four-dimensional) achieved 86% sensitivity and 100% specificity. The strongest markers in both analyses were left thalamo-hippocampal and right thalamo-entorhinal cortex functional connection strength. Thus, this study shows that thalamic functional connections are sensitive and specific markers of seizure onset laterality in individual temporal lobe epilepsy patients. This study also advances an overall strategy for the programmatic development of neuroimaging biomarkers in clinical and genetic populations: a disease model informed by coordinate-based meta-analysis was used to anatomically constrain individual patient analyses.
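
    For illustration, the threshold-based sensitivity/specificity computation behind such an ROC analysis can be sketched in a few lines. The connectivity "difference scores" below are simulated, not patient data, and the one-dimensional classifier is a simplification of the discriminant and logistic analyses in the study:

```python
import random

# Hypothetical illustration: classify seizure-onset side from a single
# functional-connectivity "difference score" (patient minus control mean),
# then report sensitivity/specificity at each threshold, as in an ROC sweep.
random.seed(0)

# Simulated difference scores: left-onset patients tend to show larger
# left thalamo-hippocampal deviations (values here are invented).
left_onset = [random.gauss(1.0, 0.5) for _ in range(12)]
right_onset = [random.gauss(0.0, 0.5) for _ in range(12)]

def sens_spec(threshold):
    tp = sum(1 for s in left_onset if s > threshold)    # left correctly flagged
    tn = sum(1 for s in right_onset if s <= threshold)  # right correctly passed
    return tp / len(left_onset), tn / len(right_onset)

# Sweep candidate thresholds; keep the one maximizing Youden's J = sens + spec - 1
best = max((s + p - 1, t) for t in [x / 10 for x in range(-10, 21)]
           for s, p in [sens_spec(t)])
print("best Youden's J = %.2f at threshold %.1f" % best)
```

In practice the study combines two or four such connection strengths, but the threshold sweep and the sensitivity/specificity bookkeeping are the same.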

  19. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified; then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
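
    The sequential selection loop described above can be sketched as follows. This is a toy stand-in, assuming an invented linear model with four epistemic variables e1..e4 and a crude main-effect variance estimate, not the NASA-LUQC model or the paper's Bayesian machinery:

```python
import random
import statistics

random.seed(1)

# Invented epistemic variables, each known only to lie in an interval.
intervals = {"e1": (0.0, 4.0), "e2": (0.0, 2.0), "e3": (0.0, 1.0), "e4": (0.0, 0.5)}

def model(e):
    # Stand-in system response; coefficients are arbitrary.
    return e["e1"] + 0.5 * e["e2"] + 0.2 * e["e3"] + 0.1 * e["e4"]

def importance(name, groups=50, per_group=40):
    # Crude main-effect estimate: variance of the conditional mean of the
    # output when `name` is held fixed at random values.
    means = []
    for _ in range(groups):
        fixed = random.uniform(*intervals[name])
        ys = []
        for _ in range(per_group):
            e = {k: random.uniform(*v) for k, v in intervals.items()}
            e[name] = fixed
            ys.append(model(e))
        means.append(statistics.mean(ys))
    return statistics.variance(means)

order = []
remaining = set(intervals)
while remaining:
    best = max(remaining, key=importance)   # most influential unrefined variable
    order.append(best)
    remaining.remove(best)
    lo, hi = intervals[best]
    intervals[best] = (lo, lo + (hi - lo) / 2)   # "refine": halve its interval
print("refinement order:", order)
```

The key point mirrored here is that the importance ranking is recomputed after each refinement, so later selections reflect the already-narrowed intervals rather than the original all-at-once ranking.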

  20. Chemical sensors for breath gas analysis: the latest developments at the Breath Analysis Summit 2013.

    PubMed

    Tisch, Ulrike; Haick, Hossam

    2014-06-01

    Profiling the body chemistry by means of volatile organic compounds (VOCs) in the breath opens exciting new avenues in medical diagnostics. Gas sensors could provide ideal platforms for realizing portable, hand-held breath testing devices in the near future. This review summarizes the latest developments and applications in the field of chemical sensors for diagnostic breath testing that were presented at the Breath Analysis Summit 2013 in Wallerfangen, Germany. Considerable progress has been made towards clinically applicable breath testing devices, especially by utilizing chemo-sensitive nanomaterials. Examples of several specialized breath testing applications are presented that are based either on stand-alone nanomaterial-based sensors that are highly sensitive and specific to individual breath compounds over others, or on combinations of several highly specific sensors, or on experimental nanomaterial-based sensor arrays. Other interesting approaches include the adaptation of a commercially available MOx-based sensor array to indirect breath testing applications using a sample pre-concentration method, and the development of compact integrated GC-sensor systems. The recent trend towards device integration has led to the development of fully integrated prototypes of point-of-care devices. We describe and compare the performance of several prototypes that are based on different sensing technologies and evaluate their potential as low-cost and readily available next-generation medical devices.

  1. Highly sensitive measurement of whole blood chromium by inductively coupled plasma mass spectrometry.

    PubMed

    Cieslak, Wendy; Pap, Kathleen; Bunch, Dustin R; Reineks, Edmunds; Jackson, Raymond; Steinle, Roxanne; Wang, Sihe

    2013-02-01

    Chromium (Cr), a trace metal element, is implicated in diabetes and cardiovascular disease. A hypochromic state has been associated with poor blood glucose control and unfavorable lipid metabolism. Sensitive and accurate measurement of blood chromium is therefore important for assessing chromium nutritional status. However, interferents in biological matrices and contamination make sensitive analysis challenging. The primary goal of this study was to develop a highly sensitive method for quantification of total Cr in whole blood by inductively coupled plasma mass spectrometry (ICP-MS) and to validate the reference interval in a local healthy population. The method was developed on an ICP-MS with a collision/reaction cell. Interference was minimized using both kinetic energy discrimination between the quadrupole and hexapole and a selective collision gas (helium). The reference interval was validated in whole blood samples (n=51) collected in trace-element-free EDTA tubes from healthy adults (12 males, 39 females), aged 19-64 years (mean 38.8±12.6), after a minimum of 8 h fasting. Blood samples were aliquoted into cryogenic vials and stored at -70 °C until analysis. The assay was linear from 3.42 to 1446.59 nmol/L with an accuracy of 87.7 to 99.8%. The high sensitivity was achieved by minimizing interference through selective kinetic energy discrimination and selective collision using helium. The reference interval for total Cr using a non-parametric method was verified to be 3.92 to 7.48 nmol/L. This validated ICP-MS methodology is highly sensitive and selective for measuring total Cr in whole blood. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  2. Using global sensitivity analysis of demographic models for ecological impact assessment.

    PubMed

    Aiello-Lammens, Matthew E; Akçakaya, H Resit

    2017-02-01

    Population viability analysis (PVA) is widely used to assess population-level impacts of environmental changes on species. When combined with sensitivity analysis, PVA yields insights into the effects of parameter and model structure uncertainty. This helps researchers prioritize efforts for further data collection so that model improvements are efficient and helps managers prioritize conservation and management actions. Usually, sensitivity is analyzed by varying one input parameter at a time and observing the influence that variation has over model outcomes. This approach does not account for interactions among parameters. Global sensitivity analysis (GSA) overcomes this limitation by varying several model inputs simultaneously. Then, regression techniques allow measuring the importance of input-parameter uncertainties. In many conservation applications, the goal of demographic modeling is to assess how different scenarios of impact or management cause changes in a population. This is challenging because the uncertainty of input-parameter values can be confounded with the effect of impacts and management actions. We developed a GSA method that separates model outcome uncertainty resulting from parameter uncertainty from that resulting from projected ecological impacts or simulated management actions, effectively separating the 2 main questions that sensitivity analysis asks. We applied this method to assess the effects of predicted sea-level rise on Snowy Plover (Charadrius nivosus). A relatively small number of replicate models (approximately 100) resulted in consistent measures of variable importance when not trying to separate the effects of ecological impacts from parameter uncertainty. However, many more replicate models (approximately 500) were required to separate these effects. These differences are important to consider when using demographic models to estimate ecological impacts of management actions. © 2016 Society for Conservation Biology.

  3. Perturbation Selection and Local Influence Analysis for Nonlinear Structural Equation Model

    ERIC Educational Resources Information Center

    Chen, Fei; Zhu, Hong-Tu; Lee, Sik-Yum

    2009-01-01

    Local influence analysis is an important statistical method for studying the sensitivity of a proposed model to model inputs. One of its important issues is related to the appropriate choice of a perturbation vector. In this paper, we develop a general method to select an appropriate perturbation vector and a second-order local influence measure…

  4. Satellite services system analysis study. Volume 2: Satellite and services user model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing, events, and sensitivity analysis are included. Selection of reference satellites is also discussed.

  5. Renewable Energy Deployment in Colorado and the West: A Modeling Sensitivity and GIS Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrows, Clayton; Mai, Trieu; Haase, Scott

    2016-03-01

    The Resource Planning Model is a capacity expansion model designed for a regional power system, such as a utility service territory, state, or balancing authority. We apply a geospatial analysis to Resource Planning Model renewable energy capacity expansion results to understand the likelihood of renewable development on various lands within Colorado.

  6. DEVELOPING SYNTACTIC CONTROL IN SEVENTH GRADE WRITING THROUGH AUDIO-LINGUAL DRILL ON TRANSFORMATIONS.

    ERIC Educational Resources Information Center

    GRIFFIN, WILLIAM J.

    An analysis of "T-units" (the minimal terminable syntactic units allowed by the grammar of English), as found in children's writing, is a more sensitive measure of growth of syntactic skill than traditional criteria. Hunt's 1965 comparative analysis of classroom writing of fourth-, eighth-, and 12th-grade children, and of magazine…

  7. Competitive amplification of differentially melting amplicons (CADMA) enables sensitive and direct detection of all mutation types by high-resolution melting analysis.

    PubMed

    Kristensen, Lasse S; Andersen, Gitte B; Hager, Henrik; Hansen, Lise Lotte

    2012-01-01

    Sensitive and specific mutation detection is of particular importance in cancer diagnostics, prognostics, and individualized patient treatment. However, the majority of molecular methodologies that have been developed with the aim of increasing the sensitivity of mutation testing have drawbacks in terms of specificity, convenience, or costs. Here, we have established a new method, Competitive Amplification of Differentially Melting Amplicons (CADMA), which allows very sensitive and specific detection of all mutation types. The principle of the method is to amplify wild-type and mutated sequences simultaneously using a three-primer system. A mutation-specific primer is designed to introduce melting-temperature-decreasing mutations in the resulting mutated amplicon, while a second, overlapping primer is designed to amplify both wild-type and mutated sequences. When these are combined with a third, common primer, very sensitive mutation detection becomes possible using high-resolution melting (HRM) as the detection platform. The introduction of melting-temperature-decreasing mutations in the mutated amplicon also allows further mutation enrichment by fast coamplification at lower denaturation temperature PCR (COLD-PCR). As proof of concept, we have designed CADMA assays for clinically relevant BRAF, EGFR, KRAS, and PIK3CA mutations, which are sensitive to between 0.025% and 0.25% mutated alleles in a wild-type background. In conclusion, CADMA enables highly sensitive and specific mutation detection by HRM analysis. © 2011 Wiley Periodicals, Inc.

  8. Using Dynamic Sensitivity Analysis to Assess Testability

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey; Morell, Larry; Miller, Keith

    1990-01-01

    This paper discusses sensitivity analysis and its relationship to random black box testing. Sensitivity analysis estimates the impact that a programming fault at a particular location would have on the program's input/output behavior. Locations that are relatively "insensitive" to faults can render random black box testing unlikely to uncover programming faults. Therefore, sensitivity analysis gives new insight when interpreting random black box testing results. Although sensitivity analysis is computationally intensive, it requires no oracle and no human intervention.
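
    The idea can be illustrated with a toy sketch (the program and fault locations below are hypothetical, not from the paper): inject a perturbation at one location over many random inputs and measure how often the output changes. Locations where it rarely changes are the "insensitive" ones that random black box testing is unlikely to expose:

```python
import random

random.seed(7)

def program(x, fault_at=None):
    a = x * 2
    if fault_at == "A":
        a += 1                    # simulated fault at location A
    b = a % 3
    if fault_at == "B" and b == 0:
        b = 1                     # simulated fault at location B (rare path)
    return b

def sensitivity(location, trials=10_000):
    # Fraction of random inputs on which the injected fault alters the output.
    changed = sum(program(x) != program(x, location)
                  for x in (random.randint(0, 999) for _ in range(trials)))
    return changed / trials

print("location A:", sensitivity("A"))   # fault always propagates to the output
print("location B:", sensitivity("B"))   # fault is masked on most inputs
```

Here location B's fault only fires when `b == 0` (about a third of inputs), so a black box test suite of random inputs has a correspondingly reduced chance of revealing it; no oracle is needed to compute the estimate itself.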

  9. Local influence for generalized linear models with missing covariates.

    PubMed

    Shi, Xiaoyan; Zhu, Hongtu; Ibrahim, Joseph G

    2009-12-01

    In the analysis of missing data, sensitivity analyses are commonly used to check the sensitivity of the parameters of interest with respect to the missing data mechanism and other distributional and modeling assumptions. In this article, we formally develop a general local influence method to carry out sensitivity analyses of minor perturbations to generalized linear models in the presence of missing covariate data. We examine two types of perturbation schemes (the single-case and global perturbation schemes) for perturbing various assumptions in this setting. We show that the metric tensor of a perturbation manifold provides useful information for selecting an appropriate perturbation. We also develop several local influence measures to identify influential points and test model misspecification. Simulation studies are conducted to evaluate our methods, and real datasets are analyzed to illustrate the use of our local influence measures.

  10. Development of a Dual Plasma Desorption/Ionization System for the Noncontact and Highly Sensitive Analysis of Surface Adhesive Compounds

    PubMed Central

    Aida, Mari; Iwai, Takahiro; Okamoto, Yuki; Kohno, Satoshi; Kakegawa, Ken; Miyahara, Hidekazu; Seto, Yasuo; Okino, Akitoshi

    2017-01-01

    We developed a dual plasma desorption/ionization system using two plasmas for the semi-invasive analysis of compounds on heat-sensitive substrates such as skin. The first plasma was used for the desorption of the surface compounds, whereas the second was used for the ionization of the desorbed compounds. Using two plasmas, each process can be optimized individually. A successful analysis of phenyl salicylate and 2-isopropylpyridine was achieved using the developed system. Furthermore, we showed that it was possible to detect the mass signals derived from a sample even at a distance 50 times greater than the distance from the position at which the samples were detached. In addition, to increase the intensity of the mass signal, 0%–0.02% (v/v) of hydrogen gas was added to the base gas generated in the ionizing plasma. We found that by optimizing the gas flow rate through the addition of a small amount of hydrogen gas, it was possible to obtain a mass signal intensity 45–824 times greater than that obtained without the addition of hydrogen gas. PMID:29234573

  11. Influence of enrichment broths on multiplex PCR detection of total coliform bacteria, Escherichia coli and Clostridium perfringens, in spiked water samples.

    PubMed

    Worakhunpiset, S; Tharnpoophasiam, P

    2009-07-01

    Although a multiplex PCR amplification condition for the simultaneous detection of total coliform bacteria, Escherichia coli and Clostridium perfringens in water samples has been developed, high sensitivity is obtained only when amplifying purified DNA; sensitivity is low when the assay is applied to spiked water samples. Enrichment broth culture prior to PCR analysis increases the sensitivity of the test, but the specific nature of the enrichment broth can affect the PCR results. Three enrichment broths (lactose broth, reinforced clostridial medium and fluid thioglycollate broth) were compared for their influence on the sensitivity of, and time required for, the multiplex PCR assay. Fluid thioglycollate broth was the most effective, with the shortest enrichment time and lowest detection limit.

  12. Computational, Experimental and Engineering Foundations of Ionic Channels as Miniaturized Sensors, Devices and Systems

    DTIC Science & Technology

    2003-10-01

    made in an ensemble of channels of unknown orientation and number, preventing quantitative analysis. • Currents have been compared among continuum PNP…microfluidic) analysis of ion channels to obtain fundamental insights into the selectivity, conductivity, and sensitivity of ion channels [19], [6… 1.1 Develop fast and efficient simulators for steady-state analysis of the continuum model for extraction of I-V curves. 1.2 Create

  13. A hydrogeologic framework for characterizing summer streamflow sensitivity to climate warming in the Pacific Northwest, USA

    NASA Astrophysics Data System (ADS)

    Safeeq, M.; Grant, G. E.; Lewis, S. L.; Kramer, M. G.; Staab, B.

    2014-09-01

    Summer streamflows in the Pacific Northwest are largely derived from melting snow and groundwater discharge. As the climate warms, diminishing snowpack and earlier snowmelt will cause reductions in summer streamflow. Most regional-scale assessments of climate change impacts on streamflow use downscaled temperature and precipitation projections from general circulation models (GCMs) coupled with large-scale hydrologic models. Here we develop and apply an analytical hydrogeologic framework for characterizing summer streamflow sensitivity to a change in the timing and magnitude of recharge in a spatially explicit fashion. In particular, we incorporate the role of deep groundwater, which large-scale hydrologic models generally fail to capture, into streamflow sensitivity assessments. We validate our analytical streamflow sensitivities against two empirical measures of sensitivity derived using historical observations of temperature, precipitation, and streamflow from 217 watersheds. In general, empirically and analytically derived streamflow sensitivity values correspond. Although the selected watersheds cover a range of hydrologic regimes (e.g., rain-dominated, mixture of rain and snow, and snow-dominated), sensitivity validation was primarily driven by the snow-dominated watersheds, which are subjected to a wider range of change in recharge timing and magnitude as a result of increased temperature. Overall, two patterns emerge from this analysis: first, areas with high streamflow sensitivity also have higher summer streamflows as compared to low-sensitivity areas. Second, the level of sensitivity and spatial extent of highly sensitive areas diminishes over time as the summer progresses. 
Results of this analysis point to a robust, practical, and scalable approach that can help assess risk at the landscape scale, complement the downscaling approach, be applied to any climate scenario of interest, and provide a framework to assist land and water managers in adapting to an uncertain and potentially challenging future.

  14. Surface enhanced Raman spectroscopy for urinary tract infection diagnosis and antibiogram

    NASA Astrophysics Data System (ADS)

    Kastanos, Evdokia; Hadjigeorgiou, Katerina; Kyriakides, Alexandros; Pitris, Constantinos

    2010-02-01

    Urinary tract infection diagnosis and antibiogram require a minimum of 48 hours using standard laboratory practice. This long waiting period contributes to an increase in recurrent infections, rising health care costs, and a growing number of bacterial strains developing resistance to antibiotics. In this work, Surface Enhanced Raman Spectroscopy (SERS) was used as a novel method for classifying bacteria and determining their antibiogram. Five species of bacteria were classified with > 90% accuracy using their SERS spectra and a classification algorithm involving novel feature extraction and discriminant analysis. Antibiotic resistance or sensitivity was determined after just a two-hour exposure of bacteria to ciprofloxacin (sensitive) and amoxicillin (resistant) and analysis of their SERS spectra. These results can become the basis for the development of a novel method that would provide same day diagnosis and selection of the most appropriate antibiotic for most effective treatment of a urinary tract infection.

  15. Visible and Extended Near-Infrared Multispectral Imaging for Skin Cancer Diagnosis

    PubMed Central

    Rey-Barroso, Laura; Burgos-Fernández, Francisco J.; Delpueyo, Xana; Ares, Miguel; Malvehy, Josep; Puig, Susana

    2018-01-01

    With the goal of diagnosing skin cancer in an early and noninvasive way, an extended near infrared multispectral imaging system based on an InGaAs sensor with sensitivity from 995 nm to 1613 nm was built to evaluate deeper skin layers thanks to the higher penetration of photons at these wavelengths. The outcomes of this device were combined with those of a previously developed multispectral system that works in the visible and near infrared range (414 nm–995 nm). Both provide spectral and spatial information from skin lesions. A classification method to discriminate between melanomas and nevi was developed based on the analysis of first-order statistics descriptors, principal component analysis, and support vector machine tools. The system provided a sensitivity of 78.6% and a specificity of 84.6%, the latter one being improved with respect to that offered by silicon sensors. PMID:29734747

  16. Quantitative targeted proteomic analysis of potential markers of tyrosine kinase inhibitor (TKI) sensitivity in EGFR mutated lung adenocarcinoma.

    PubMed

    Awasthi, Shivangi; Maity, Tapan; Oyler, Benjamin L; Qi, Yue; Zhang, Xu; Goodlett, David R; Guha, Udayan

    2018-04-13

    Lung cancer causes the highest mortality among all cancers. Patients harboring kinase domain mutations in the epidermal growth factor receptor (EGFR) respond to EGFR tyrosine kinase inhibitors (TKIs), however, acquired resistance always develops. Moreover, 30-40% of patients with EGFR mutations exhibit primary resistance. Hence, there is an unmet need for additional biomarkers of TKI sensitivity that complement EGFR mutation testing and predict treatment response. We previously identified phosphopeptides whose phosphorylation is inhibited upon treatment with EGFR TKIs, erlotinib and afatinib in TKI sensitive cells, but not in resistant cells. These phosphosites are potential biomarkers of TKI sensitivity. Here, we sought to develop modified immuno-multiple reaction monitoring (immuno-MRM)-based quantitation assays for select phosphosites including EGFR-pY1197, pY1172, pY998, AHNAK-pY160, pY715, DAPP1-pY139, CAV1-pY14, INPPL1-pY1135, NEDD9-pY164, NF1-pY2579, and STAT5A-pY694. These sites were significantly hypophosphorylated by erlotinib and a 3rd generation EGFR TKI, osimertinib, in TKI-sensitive H3255 cells, which harbor the TKI-sensitizing EGFR L858R mutation. However, in H1975 cells, which harbor the TKI-resistant EGFR L858R/T790M mutant, osimertinib, but not erlotinib, could significantly inhibit phosphorylation of EGFR-pY-1197, STAT5A-pY694 and CAV1-pY14, suggesting these sites also predict response in TKI-resistant cells. We could further validate EGFR-pY-1197 as a biomarker of TKI sensitivity by developing a calibration curve-based modified immuno-MRM assay. In this report, we have shown the development and optimization of MRM assays coupled with global phosphotyrosine enrichment (modified immuno-MRM) for a list of 11 phosphotyrosine peptides. Our optimized assays identified the targets reproducibly in biological samples with good selectivity. 
We also developed and characterized quantitation methods to determine endogenous abundance of these targets and correlated the results of the relative quantification with amounts estimated from the calibration curves. This approach represents a way to validate and verify biomarker candidates discovered from large-scale global phospho-proteomics analysis. The application of these modified immuno-MRM assays in lung adenocarcinoma cells provides proof-of-concept for the feasibility of clinical applications. These assays may be used in prospective clinical studies of EGFR TKI treatment of EGFR mutant lung cancer to correlate treatment response and other clinical endpoints. Copyright © 2018. Published by Elsevier B.V.

  17. Simultaneous determination of flubendiamide and its metabolite desiodo flubendiamide residues in cabbage, tomato and pigeon pea by HPLC.

    PubMed

    Paramasivam, M; Banerjee, Hemanta

    2011-10-01

    A sensitive and simple method for the simultaneous analysis of flubendiamide and its metabolite desiodo flubendiamide in cabbage, tomato and pigeon pea has been developed. The residues were extracted with the QuEChERS method followed by dispersive solid-phase extraction with a primary secondary amine sorbent to remove co-extractives, prior to analysis by HPLC coupled with a UV-Vis detector. The recoveries of flubendiamide and desiodo flubendiamide ranged from 85.1 to 98.5% and 85.9 to 97.1%, respectively, with relative standard deviations (RSD) of less than 5% and a sensitivity of 0.01 μg g(-1). The method offers a less expensive and safer alternative to the existing residue analysis methods for vegetables. © Springer Science+Business Media, LLC 2011
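
    As a side note, recovery and RSD figures like those quoted above are computed in a few lines; this is a minimal sketch, and the spike level and replicate values below are invented, not the paper's data:

```python
import statistics

spiked = 0.50                              # µg/g added to a blank sample (invented)
measured = [0.46, 0.44, 0.47, 0.45, 0.46]  # hypothetical replicate results

recoveries = [100 * m / spiked for m in measured]   # percent recovered per replicate
mean_rec = statistics.mean(recoveries)
rsd = 100 * statistics.stdev(recoveries) / mean_rec # relative standard deviation

print(f"mean recovery = {mean_rec:.1f}%, RSD = {rsd:.1f}%")
```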

  18. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, much effort has been devoted to improving MDLC-based strategies, including "top-down" and "bottom-up" approaches, to enable highly sensitive qualitative and quantitative analysis of proteins and to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification have also been developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes recent advances in such techniques and their applications in the qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. A fully battery-powered inexpensive spectrophotometric system for high-sensitivity point-of-care analysis on a microfluidic chip

    PubMed Central

    Dou, Maowei; Lopez, Juan; Rios, Misael; Garcia, Oscar; Xiao, Chuan; Eastman, Michael

    2016-01-01

    A cost-effective battery-powered spectrophotometric system (BASS) was developed for quantitative point-of-care (POC) analysis on a microfluidic chip. Using methylene blue as a model analyte, we first compared the performance of the BASS with a commercial spectrophotometric system, and then applied the BASS to loop-mediated isothermal amplification (LAMP) detection and subsequent quantitative nucleic acid analysis, which exhibited a limit of detection comparable to that of the Nanodrop. Compared to the commercial spectrophotometric system, our spectrophotometric system costs less, consumes fewer reagents, and has higher detection sensitivity. Most importantly, it does not rely on external power supplies. All these features make our spectrophotometric system highly suitable for a variety of POC analyses, such as field detection. PMID:27143408

  20. Analysis of air-, moisture- and solvent-sensitive chemical compounds by mass spectrometry using an inert atmospheric pressure solids analysis probe.

    PubMed

    Mosely, Jackie A; Stokes, Peter; Parker, David; Dyer, Philip W; Messinis, Antonis M

    2018-02-01

    A novel method has been developed that enables chemical compounds to be transferred from an inert atmosphere glove box into the atmospheric pressure ion source of a mass spectrometer whilst retaining a controlled chemical environment. This innovative method is simple and cheap to implement on some commercially available mass spectrometers. We have termed this approach inert atmospheric pressure solids analysis probe (iASAP) and demonstrate the benefit of this methodology for two air-/moisture-sensitive chemical compounds whose characterisation by mass spectrometry is now possible and easily achieved. The simplicity of the design means that moving between iASAP and standard ASAP is straightforward and quick, providing a highly flexible platform with rapid sample turnaround.

  1. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
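
    The bootstrap-within-Monte-Carlo idea can be sketched compactly. All numbers below are invented, not the paper's trial data or costs: rather than assuming a theoretical distribution for an uncertain input (here, the eradication rate), each Monte Carlo iteration resamples the observed per-patient outcomes with replacement:

```python
import random
import statistics

random.seed(3)

trial_outcomes = [1] * 85 + [0] * 15     # hypothetical per-patient eradication data
cost_treat, cost_failure = 200, 900      # hypothetical costs per strategy arm

def one_iteration():
    # Bootstrap: resample the trial with replacement, then propagate the
    # resampled eradication rate through the decision model.
    boot = [random.choice(trial_outcomes) for _ in trial_outcomes]
    erad = sum(boot) / len(boot)
    return cost_treat + (1 - erad) * cost_failure   # expected cost per patient

costs = [one_iteration() for _ in range(5000)]
q = statistics.quantiles(costs, n=40)    # cut points at 2.5%, 5%, ..., 97.5%
print(f"mean cost = {statistics.mean(costs):.0f}, "
      f"95% interval = ({q[0]:.0f}, {q[-1]:.0f})")
```

The resulting distribution of model outputs is the probabilistic sensitivity analysis: instead of a single base-case cost, one reports a mean and an uncertainty interval that reflect the sampling variability of the trial itself, with no parametric distributional assumption.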

  2. Sensitivity of combustion and ignition characteristics of the solid-fuel charge of the microelectromechanical system of a microthruster to macrokinetic and design parameters

    NASA Astrophysics Data System (ADS)

    Futko, S. I.; Ermolaeva, E. M.; Dobrego, K. V.; Bondarenko, V. P.; Dolgii, L. N.

    2012-07-01

    We have developed a sensitivity analysis permitting effective estimation of the change in the impulse responses of a microthruster and in the ignition characteristics of the solid-fuel charge caused by variation of the basic macrokinetic parameters of the mixed fuel and the design parameters of the microthruster's combustion chamber. On the basis of the proposed sensitivity analysis, we have estimated the spread of the propulsive force and impulse, as well as of the induction period and self-ignition temperature, depending on the macrokinetic parameters of combustion (pre-exponential factor, activation energy, density, and heat content) of the solid-fuel charge of the microthruster. The obtained results can be used for rapid and effective estimation of the spread of goal functions to provide stable physicochemical characteristics and impulse responses of solid-fuel mixtures in making and using microthrusters.
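    The spread estimates described above rest on local sensitivity coefficients with respect to the macrokinetic parameters. As a hedged illustration (the Arrhenius-type delay model, parameter values, and temperature below are invented, not the authors' charge model), dimensionless sensitivities d ln τ / d ln p can be computed by central differences:

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def ignition_delay(A, E, T=600.0):
    """Toy Arrhenius-type induction period: tau = exp(E/(R*T)) / A."""
    return np.exp(E / (R * T)) / A

def log_sensitivity(f, params, key, rel=1e-3):
    """Dimensionless sensitivity d(ln f)/d(ln p_key) by central differences."""
    up, dn = dict(params), dict(params)
    up[key] *= 1.0 + rel
    dn[key] *= 1.0 - rel
    return (np.log(f(**up)) - np.log(f(**dn))) / (np.log(up[key]) - np.log(dn[key]))

params = {"A": 1.0e9, "E": 1.0e5}  # pre-exponential factor, activation energy
s_A = log_sensitivity(ignition_delay, params, "A")  # tau scales as 1/A, so -1
s_E = log_sensitivity(ignition_delay, params, "E")  # E/(R*T), about 20 here
```

    Multiplying such coefficients by the relative spread of each macrokinetic parameter gives a first-order estimate of the spread in the induction period, which is the kind of rapid estimation the abstract refers to.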

  3. Make or buy decision model with multi-stage manufacturing process and supplier imperfect quality

    NASA Astrophysics Data System (ADS)

    Pratama, Mega Aria; Rosyidi, Cucuk Nur

    2017-11-01

    This research develops a make-or-buy decision model that considers supplier imperfect quality. The model can help companies decide whether to make or buy a component at the best quality and least cost in a multistage manufacturing process. Imperfect quality is one of the cost components minimized in the model; a component of imperfect quality is not necessarily defective, as it can still be reworked and used in assembly. This research also provides a numerical example and a sensitivity analysis to show how the model works. We solved the numerical problem by simulation with the aid of Crystal Ball. The sensitivity analysis shows that the percentage of imperfect components generally does not affect the model significantly, and the model is not sensitive to changes in these parameters, because the imperfect-quality costs are small relative to the overall total cost.
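    A minimal Monte Carlo version of such a make-or-buy comparison can be sketched as follows; the cost figures, rework cost, and imperfect-quality distribution are invented for illustration (the paper's own experiments used Crystal Ball).

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000  # Monte Carlo trials

# Illustrative parameters -- not taken from the paper.
unit_make = 8.0        # in-house unit production cost
unit_buy = 7.0         # supplier unit price
rework_cost = 1.5      # cost to rework one imperfect (non-defective) unit
p_imperfect = rng.beta(2, 38, N)  # uncertain imperfect fraction, mean ~5%

# Imperfect units are reworked rather than scrapped, as in the model.
cost_make = np.full(N, unit_make)
cost_buy = unit_buy + p_imperfect * rework_cost

buy_share = np.mean(cost_buy < cost_make)  # fraction of trials favouring "buy"
```

    Sweeping `p_imperfect` or `rework_cost` over a range and re-running is exactly the kind of sensitivity analysis the abstract reports: because the rework term is small relative to total unit cost, the decision barely moves.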

  4. Detector Development for the MARE Neutrino Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galeazzi, M.; Bogorin, D.; Molina, R.

    2009-12-16

    The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of 187Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized iridium transition edge sensors with high reproducibility and uniformity for such a large scale experiment. We have also started a full scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full scale simulation.

  5. Characterization of image heterogeneity using 2D Minkowski functionals increases the sensitivity of detection of a targeted MRI contrast agent.

    PubMed

    Canuto, Holly C; McLachlan, Charles; Kettunen, Mikko I; Velic, Marko; Krishnan, Anant S; Neves, Andre' A; de Backer, Maaike; Hu, D-E; Hobson, Michael P; Brindle, Kevin M

    2009-05-01

    A targeted Gd(3+)-based contrast agent has been developed that detects tumor cell death by binding to the phosphatidylserine (PS) exposed on the plasma membrane of dying cells. Although this agent has been used to detect tumor cell death in vivo, the differences in signal intensity between treated and untreated tumors were relatively small. As cell death is often spatially heterogeneous within tumors, we investigated whether an image analysis technique that parameterizes heterogeneity could be used to increase the sensitivity of detection of this targeted contrast agent. Two-dimensional (2D) Minkowski functionals (MFs) provided an automated and reliable method for parameterization of image heterogeneity, which does not require prior assumptions about the number of regions or features in the image, and were shown to increase the sensitivity of detection of the contrast agent as compared to simple signal intensity analysis. (c) 2009 Wiley-Liss, Inc.
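    The three 2D Minkowski functionals of a binary (thresholded) image are its area, perimeter, and Euler characteristic. A generic way to compute them, not the authors' implementation, is to treat pixels as closed unit squares and count faces, edges, and vertices of the resulting cell complex:

```python
import numpy as np

def minkowski_2d(img):
    """Return (area, perimeter, Euler characteristic) of a binary image.

    Pixels are closed unit squares; the Euler characteristic is
    chi = V - E + F over the induced cell complex.
    """
    img = np.asarray(img, dtype=bool)
    padded = np.pad(img, 1)  # zero border simplifies neighbour logic

    F = int(img.sum())  # faces = foreground pixels

    # Edges touched by at least one foreground pixel (union over the two
    # cells sharing each edge, including edges on the image border).
    E = int((padded[1:, :] | padded[:-1, :]).sum()
            + (padded[:, 1:] | padded[:, :-1]).sum())

    # Vertices touched by at least one of their four surrounding pixels.
    V = int((padded[1:, 1:] | padded[1:, :-1]
             | padded[:-1, 1:] | padded[:-1, :-1]).sum())

    # Perimeter: unit edges with exactly one foreground side.
    perimeter = int((padded[1:, :] ^ padded[:-1, :]).sum()
                    + (padded[:, 1:] ^ padded[:, :-1]).sum())
    return F, perimeter, V - E + F
```

    A single foreground pixel gives (1, 4, 1); a 3×3 square with a one-pixel hole has Euler characteristic 0. It is this sensitivity of the functionals to connected regions and holes that makes them useful as heterogeneity parameters.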

  6. Structural development and web service based sensitivity analysis of the Biome-BGC MuSo model

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Balogh, János; Churkina, Galina; Haszpra, László; Horváth, Ferenc; Ittzés, Péter; Ittzés, Dóra; Ma, Shaoxiu; Nagy, Zoltán; Pintér, Krisztina; Barcza, Zoltán

    2014-05-01

    Studying the greenhouse gas exchange, mainly the carbon dioxide sink and source character of ecosystems, is still a highly relevant research topic in biogeochemistry. During the past few years research has focused on managed ecosystems, because human intervention plays an important role in shaping the land surface through agricultural management, land use change, and other practices. In spite of considerable developments, current biogeochemical models still carry substantial uncertainty in quantifying the greenhouse gas exchange processes of managed ecosystems. It is therefore an important task to develop and test process-based biogeochemical models. Biome-BGC is a widely used, popular biogeochemical model that simulates the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of terrestrial ecosystems. Biome-BGC was originally developed by the Numerical Terradynamic Simulation Group (NTSG) of the University of Montana (http://www.ntsg.umt.edu/project/biome-bgc), and several other researchers have used and modified it since. Our research group extended Biome-BGC version 4.1.1 to substantially improve the ability of the model to simulate the carbon and water cycles of real managed ecosystems. The modifications included structural improvements of the model (e.g., implementation of a multilayer soil module, drought-related plant senescence, and improved model phenology). Besides these improvements, management modules and annually varying options were introduced and implemented (mowing, grazing, planting, harvest, ploughing, fertilizer application, and forest thinning). Dynamic (annually varying) whole-plant mortality was also enabled in the model to support more realistic simulation of forest stand development and natural disturbances. In the most recent model version, separate pools have been defined for fruit. 
The model version that contains all former and new developments is referred to as Biome-BGC MuSo (Biome-BGC with multi-soil layer). Within the frame of the BioVeL project (http://www.biovel.eu), an open source, domain-independent scientific workflow management system (http://www.taverna.org.uk) is used to support 'in silico' experimentation and easy applicability of different models, including Biome-BGC MuSo. Workflows can be built upon functionally linked sets of web services, such as retrieval of meteorological datasets and other parameters; preparation of single-run or spatial-run model simulations; desktop-grid-based Monte Carlo experiments with parallel processing; and model sensitivity analysis. The newly developed, Monte Carlo based sensitivity analysis is described in this study, and results are presented on differences in sensitivity between the original and the developed Biome-BGC model.

  7. Theoretical considerations of some nonlinear aspects of hypersonic panel flutter

    NASA Technical Reports Server (NTRS)

    Mcintosh, S. C., Jr.

    1974-01-01

    A research project to analyze the effects of hypersonic nonlinear aerodynamic loading on panel flutter is reported. The test equipment and procedures for conducting the tests are explained. The effects of aerodynamic nonlinearities on stability were evaluated by determining constant-initial-energy amplitude-sensitive stability boundaries and comparing them with the corresponding linear stability boundaries. An attempt to develop an alternative method of analysis for systems in which amplitude-sensitive instability is possible is presented.

  8. Sensitivity analysis of machine-learning models of hydrologic time series

    NASA Astrophysics Data System (ADS)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.
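    Because the trained model is a black box, its forcing-response sensitivities can be probed numerically: perturb one forcing series, hold the others fixed, and difference the responses. The snippet below illustrates the idea with a toy surrogate standing in for an MWA-ANN; the functional form and forcing series are assumptions, not the study's models or data.

```python
import numpy as np

# Stand-in for a trained MWA-ANN: maps rainfall r and groundwater use g
# to a hydrologic response (e.g., lake level). Purely illustrative.
def model(r, g):
    return 0.8 * np.tanh(r) - 0.3 * g

def forcing_sensitivity(f, r, g, eps=1e-4):
    """Central-difference sensitivity of the response to each forcing."""
    s_r = (f(r + eps, g) - f(r - eps, g)) / (2.0 * eps)
    s_g = (f(r, g + eps) - f(r, g - eps)) / (2.0 * eps)
    return s_r, s_g

t = np.linspace(0.0, 10.0, 200)       # time axis
rain = 1.0 + 0.5 * np.sin(t)          # synthetic rainfall forcing
use = 0.5 + 0.4 * (t > 5.0)           # pumping steps up mid-record
s_rain, s_use = forcing_sensitivity(model, rain, use)
# s_rain and s_use are themselves time series, so temporal variation in
# sensitivity (e.g., during droughts) falls out of the same computation.
```

    The same perturb-and-difference recipe applies unchanged to any trained model, which is what makes it useful for interrogating otherwise opaque machine-learning predictors.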

  9. Immuno-magnetic beads-based extraction-capillary zone electrophoresis-deep UV laser-induced fluorescence analysis of erythropoietin.

    PubMed

    Wang, Heye; Dou, Peng; Lü, Chenchen; Liu, Zhen

    2012-07-13

    Erythropoietin (EPO) is an important glycoprotein hormone. Recombinant human EPO (rhEPO) is an important therapeutic drug and can also be used as a doping agent in sports. The analysis of EPO glycoforms in the pharmaceutical and sports areas greatly challenges analytical scientists in several respects, among which sensitive detection and effective, facile sample preparation are two essential issues. Herein, we investigated new possibilities for these two aspects. Deep UV laser-induced fluorescence detection (deep UV-LIF) was established to detect the intrinsic fluorescence of EPO, while an immuno-magnetic beads-based extraction (IMBE) was developed to specifically extract EPO glycoforms. Combined with capillary zone electrophoresis (CZE), CZE-deep UV-LIF allows high-resolution glycoform profiling with improved sensitivity. The detection sensitivity was improved by one order of magnitude as compared with UV absorbance detection. An additional advantage is that the original glycoform distribution can be completely preserved because no fluorescent labeling is needed. By combining IMBE with CZE-deep UV-LIF, the overall detection sensitivity was 1.5 × 10⁻⁸ mol/L, enhanced by two orders of magnitude relative to conventional CZE with UV absorbance detection. It is applicable to the analysis of pharmaceutical preparations of EPO, but the sensitivity is insufficient for the anti-doping analysis of EPO in blood and urine. IMBE can be a straightforward and effective approach for sample preparation. However, antibodies with high specificity are the key for application to urine samples because some urinary proteins can severely interfere with the immuno-extraction. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. 
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
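    One common way to turn such Monte Carlo input/output samples into a sensitivity ranking, in the spirit of the stepwise regression mentioned above, is via standardized regression coefficients. The parameter names, surrogate response, and magnitudes below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000  # Monte Carlo realizations

# Hypothetical uncertain transport parameters (names illustrative).
porosity = rng.uniform(0.05, 0.30, n)
aperture = rng.lognormal(-7.0, 0.5, n)      # fracture aperture, m
diffusion = rng.lognormal(-22.0, 1.0, n)    # matrix diffusion coefficient
sorption = rng.uniform(0.0, 5.0, n)         # sorption coefficient

# Toy surrogate for the output plume volume (diffusion deliberately inert,
# so the ranking should place it last).
plume = (1.0 / porosity + 1.0e6 * aperture - 0.2 * sorption
         + rng.normal(0.0, 0.5, n))

X = np.column_stack([porosity, aperture, diffusion, sorption])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize inputs
ys = (plume - plume.mean()) / plume.std()        # standardize output
beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)   # standardized coefficients
ranking = np.argsort(-np.abs(beta))              # most influential first
```

    Here |beta| plays the role of a sensitivity index; the contingency-table and classification-tree analyses used in the study capture non-monotonic effects that this linear screen would miss.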

  11. Gravitropism: interaction of sensitivity modulation and effector redistribution

    NASA Technical Reports Server (NTRS)

    Evans, M. L.

    1991-01-01

    Our increasing capabilities for quantitative hormone analysis and automated high resolution growth studies have allowed a reassessment of the classical Cholodny-Went hypothesis of gravitropism. According to this hypothesis, gravity induces redistribution of auxin toward the lower side of the organ and this causes the growth asymmetry that leads to reorientation. Arguments against the Cholodny-Went hypothesis that were based primarily on concerns over the timing and magnitude of the development of hormone asymmetry are countered by recent evidence that such asymmetry develops early and is sufficiently large to account for curvature. Thus, it appears that the Cholodny-Went hypothesis is fundamentally valid. However, recent comparative studies of the kinetics of curvature and the timing of the development of hormone asymmetry indicate that this hypothesis alone cannot account for the intricacies of the gravitropic response. It appears that time-dependent gravity-induced changes in hormone sensitivity as well as changes in sensitivity of the gravity receptor play important roles in the response.

  12. Gravitropism: Interaction of Sensitivity Modulation and Effector Redistribution 1

    PubMed Central

    Evans, Michael L.

    1991-01-01

    Our increasing capabilities for quantitative hormone analysis and automated high resolution growth studies have allowed a reassessment of the classical Cholodny-Went hypothesis of gravitropism. According to this hypothesis, gravity induces redistribution of auxin toward the lower side of the organ and this causes the growth asymmetry that leads to reorientation. Arguments against the Cholodny-Went hypothesis that were based primarily on concerns over the timing and magnitude of the development of hormone asymmetry are countered by recent evidence that such asymmetry develops early and is sufficiently large to account for curvature. Thus, it appears that the Cholodny-Went hypothesis is fundamentally valid. However, recent comparative studies of the kinetics of curvature and the timing of the development of hormone asymmetry indicate that this hypothesis alone cannot account for the intricacies of the gravitropic response. It appears that time-dependent gravity-induced changes in hormone sensitivity as well as changes in sensitivity of the gravity receptor play important roles in the response. PMID:11537485

  13. Mapping Seabird Sensitivity to Offshore Wind Farms

    PubMed Central

    Bradbury, Gareth; Trinder, Mark; Furness, Bob; Banks, Alex N.; Caldow, Richard W. G.; Hume, Duncan

    2014-01-01

    We present a Geographic Information System (GIS) tool, SeaMaST (Seabird Mapping and Sensitivity Tool), to provide evidence on the use of sea areas by seabirds and inshore waterbirds in English territorial waters, mapping their relative sensitivity to offshore wind farms. SeaMaST is a freely available evidence source for use by all connected to the offshore wind industry and will assist statutory agencies in assessing potential risks to seabird populations from planned developments. Data were compiled from offshore boat and aerial observer surveys spanning the period 1979–2012. The data were analysed using distance analysis and Density Surface Modelling to produce predicted bird densities across a grid covering English territorial waters at a resolution of 3 km×3 km. Coefficients of Variation were estimated for each grid cell density, as an indication of confidence in predictions. Offshore wind farm sensitivity scores were compiled for seabird species using English territorial waters. The comparative risks to each species of collision with turbines and displacement from operational turbines were reviewed and scored separately, and the scores were multiplied by the bird density estimates to produce relative sensitivity maps. The sensitivity maps reflected well the amassed distributions of the most sensitive species. SeaMaST is an important new tool for assessing potential impacts on seabird populations from offshore development at a time when multiple large areas of development are proposed which overlap with many seabird species’ ranges. It will inform marine spatial planning as well as identifying priority areas of sea usage by marine birds. Example SeaMaST outputs are presented. PMID:25210739

  14. Mapping seabird sensitivity to offshore wind farms.

    PubMed

    Bradbury, Gareth; Trinder, Mark; Furness, Bob; Banks, Alex N; Caldow, Richard W G; Hume, Duncan

    2014-01-01

    We present a Geographic Information System (GIS) tool, SeaMaST (Seabird Mapping and Sensitivity Tool), to provide evidence on the use of sea areas by seabirds and inshore waterbirds in English territorial waters, mapping their relative sensitivity to offshore wind farms. SeaMaST is a freely available evidence source for use by all connected to the offshore wind industry and will assist statutory agencies in assessing potential risks to seabird populations from planned developments. Data were compiled from offshore boat and aerial observer surveys spanning the period 1979-2012. The data were analysed using distance analysis and Density Surface Modelling to produce predicted bird densities across a grid covering English territorial waters at a resolution of 3 km×3 km. Coefficients of Variation were estimated for each grid cell density, as an indication of confidence in predictions. Offshore wind farm sensitivity scores were compiled for seabird species using English territorial waters. The comparative risks to each species of collision with turbines and displacement from operational turbines were reviewed and scored separately, and the scores were multiplied by the bird density estimates to produce relative sensitivity maps. The sensitivity maps reflected well the amassed distributions of the most sensitive species. SeaMaST is an important new tool for assessing potential impacts on seabird populations from offshore development at a time when multiple large areas of development are proposed which overlap with many seabird species' ranges. It will inform marine spatial planning as well as identifying priority areas of sea usage by marine birds. Example SeaMaST outputs are presented.
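    The core map algebra described here, modelled density per grid cell multiplied by a species-level sensitivity score, is simple to reproduce. The grid size, species, densities, and scores below are placeholders, not SeaMaST data:

```python
import numpy as np

rng = np.random.default_rng(3)
shape = (50, 60)  # illustrative grid of 3 km x 3 km cells

# Modelled densities (birds/km^2) and wind-farm risk scores per species;
# all values are made up for illustration.
density = {
    "gannet": rng.gamma(2.0, 1.0, shape),
    "guillemot": rng.gamma(1.5, 2.0, shape),
}
collision_score = {"gannet": 3.0, "guillemot": 1.0}

# Relative sensitivity map per species: density x risk score.
sensitivity = {sp: density[sp] * collision_score[sp] for sp in density}

# One possible summary layer: cell-wise maximum across species,
# flagging the cells that are riskiest for any one species.
combined = np.maximum.reduce(list(sensitivity.values()))
```

    In the published tool, collision and displacement are scored separately per species and each score multiplied by the density surface; the cell-wise combination step shown here is one possible way to summarize across species.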

  15. Atmospheric model development in support of SEASAT. Volume 1: Summary of findings

    NASA Technical Reports Server (NTRS)

    Kesel, P. G.

    1977-01-01

    Atmospheric analysis and prediction models of varying (grid) resolution were developed. The models were tested using real observational data for the purpose of assessing the impact of grid resolution on short range numerical weather prediction. The discretionary model procedures were examined so that the computational viability of SEASAT data might be enhanced during the conduct of (future) sensitivity tests. The analysis effort covers: (1) examining the procedures for allowing data to influence the analysis; (2) examining the effects of varying the weights in the analysis procedure; (3) testing and implementing procedures for solving the minimization equation in an optimal way; (4) describing the impact of grid resolution on analysis; and (5) devising and implementing numerous practical solutions to analysis problems, generally.

  16. Development of two highly sensitive immunoassays for detection of copper ions and a suite of relevant immunochemicals.

    PubMed

    Zhao, Hongwei; Nan, Tiegui; Tan, Guiyu; Gao, Wei; Cao, Zhen; Sun, Shuo; Li, Zhaohu; Li, Qing X; Wang, Baomin

    2011-09-19

    Availability of highly sensitive assays for metal ions can help monitor and manage environmental and food contamination. In the present study, a monoclonal antibody against copper(II)-ethylenediaminetetraacetic acid was used to develop two sensitive ELISAs for Cu(II) analysis. Cobalt(II)-EDTA-BSA was the coating antigen in a heterologous indirect competitive ELISA (hicELISA), whereas Co(II)-EDTA-BSA-horseradish peroxidase (HRP) was the enzyme tracer in a heterologous direct competitive ELISA (hdcELISA). Both ELISAs were validated for detecting the content of Cu(II) in environmental waters. The ELISA data agreed well with those from graphite furnace atomic absorption spectroscopy. The methods for developing the Cu(II) hicELISA and hdcELISA are potentially applicable to developing ELISAs for other metals. Chelator-protein complexes such as EDTA-BSA and EDTA-BSA-HRP can form a suite of metal complexes having consistent hapten density, location, and orientation on the conjugates, differing only in the metal core, which can be used as ideal reagents to investigate the relationship between assay sensitivity and antibody affinities for the haptens and the analytes. The strategy of conjugating a haptenated protein directly with HRP can reduce the loss of HRP activity during the conjugation reaction and is thus applicable to the development of ELISAs for small molecules. Copyright © 2011. Published by Elsevier B.V.

  17. Development of a standardized battery of performance tests for the assessment of noise stress effects

    NASA Technical Reports Server (NTRS)

    Theologus, G. C.; Wheaton, G. R.; Mirabella, A.; Brahlek, R. E.

    1973-01-01

    A set of 36 relatively independent categories of human performance were identified. These categories encompass human performance in the cognitive, perceptual, and psychomotor areas, and include diagnostic measures and sensitive performance metrics. Then a prototype standardized test battery was constructed, and research was conducted to obtain information on the sensitivity of the tests to stress, the sensitivity of selected categories of performance degradation, the time course of stress effects on each of the selected tests, and the learning curves associated with each test. A research project utilizing a three factor partially repeated analysis of covariance design was conducted in which 60 male subjects were exposed to variations in noise level and quality during performance testing. Effects of randomly intermittent noise on performance of the reaction time tests were observed, but most of the other performance tests showed consistent stability. The results of 14 analyses of covariance of the data taken from the performance of the 60 subjects on the prototype standardized test battery provided information which will enable the final development and test of a standardized test battery and the associated development of differential sensitivity metrics and diagnostic classificatory system.

  18. Landscape Analysis of Nutrition-sensitive Agriculture Policy Development in Senegal.

    PubMed

    Lachat, Carl; Nago, Eunice; Ka, Abdoulaye; Vermeylen, Harm; Fanzo, Jessica; Mahy, Lina; Wüstefeld, Marzella; Kolsteren, Patrick

    2015-06-01

    Unlocking the agricultural potential of Africa offers a genuine opportunity to address malnutrition and drive development of the continent. Using Senegal as a case study, we sought to identify gaps and opportunities to strengthen agricultural policies with nutrition-sensitive approaches. We carried out a systematic analysis of 13 policy documents related to food production, agriculture, food security, or nutrition. Next, we collected data during a participatory analysis with 32 national stakeholders and in-depth interviews with 15 national experts from technical directorates of the different ministries that deal with agriculture and food production. The current agricultural context has various elements that are considered to enhance its nutrition sensitivity. On average, 8.3 of the 17 Food and Agriculture Organization guiding principles for agriculture programming for nutrition were included in the policies reviewed. Ensuring food security and increasing dietary diversity were considered to be the principal objectives of agricultural policies. Although there was considerable agreement that agriculture can contribute to nutrition, current agricultural programs generally do not target communities on the basis of their nutritional vulnerability. Agricultural programs were reported to have specific components to target female beneficiaries but were generally not used as delivery platforms for nutritional interventions. The findings of this study indicate the need for a coherent policy environment across the food system that aligns recommendations at the national level with local action on the ground. In addition, specific activities are needed to develop a shared understanding of nutrition and public health nutrition within the agricultural community in Senegal. © The Author(s) 2015.

  19. High-throughput determination of vancomycin in human plasma by a cost-effective system of two-dimensional liquid chromatography.

    PubMed

    Sheng, Yanghao; Zhou, Boting

    2017-05-26

    Therapeutic drug monitoring (TDM) is one of the most important services of clinical laboratories. Two main techniques are commonly used: the immunoassay and chromatography method. We have developed a cost-effective system of two-dimensional liquid chromatography with ultraviolet detection (2D-LC-UV) for high-throughput determination of vancomycin in human plasma that combines the automation and low start-up costs of the immunoassay with the high selectivity and sensitivity of liquid chromatography coupled with mass spectrometric detection without incurring their disadvantages, achieving high cost-effectiveness. This 2D-LC system offers a large volume injection to provide sufficient sensitivity and uses simulated gradient peak compression technology to control peak broadening and to improve peak shape. A middle column was added to reduce the analysis cycle time and make it suitable for high-throughput routine clinical assays. The analysis cycle time was 4 min and the peak width was 0.8 min. Compared with other chromatographic methods that have been developed, the analysis cycle time and peak width for vancomycin were reduced significantly. The lower limit of quantification was 0.20 μg/mL for vancomycin, which is the same as certain LC-MS/MS methods that have been recently developed and validated. The method is rapid, automated, and low-cost and has high selectivity and sensitivity for the quantification of vancomycin in human plasma, thus making it well-suited for use in hospital clinical laboratories. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Sensitive gas analysis system on a microchip and application for on-site monitoring of NH3 in a clean room.

    PubMed

    Hiki, Shinichiro; Mawatari, Kazuma; Aota, Arata; Saito, Maki; Kitamori, Takehiko

    2011-06-15

    A portable, highly sensitive, and continuous ammonia gas monitoring system was developed with a microfluidic chip. The system consists of a main unit, a gas pumping unit, and a computer that serves as an operation console. The system measures 45 cm wide × 30 cm deep × 30 cm high, making it truly portable. A highly efficient and stable extraction method was developed by utilizing an annular gas/liquid laminar flow. In addition, a stable gas/liquid separation method with a PTFE membrane was developed by arranging the fluidic network in three dimensions to achieve almost zero dead volume at the gas/liquid extraction part. The extraction rate was almost 100% at a liquid flow rate of 3.5 μL/min and a gas flow rate of 100 mL/min (contact time of ~15 ms), and the concentration factor was 200-fold, calculated from the NH3 concentrations (w/w) in the gas and liquid phases. Stable phase separation and detection were sustained for more than 3 weeks in automated operation, which was sufficient for the monitoring application. The lower limit of detection, calculated from a signal-to-noise ratio of 3, was 84 ppt, which shows good detectability for NH3 analysis. We believe that our system is a very powerful tool for gas analysis due to the advantages of portable size, high sensitivity, and continuous monitoring, and it is particularly useful in the semiconductor field.

  1. Skin sensitizers differentially regulate signaling pathways in MUTZ-3 cells in relation to their individual potency

    PubMed Central

    2014-01-01

    Background Owing to recent European legislation banning animal tests for safety assessment within the cosmetic industry, the development of in vitro alternatives for assessment of skin sensitization is highly prioritized. To date, proposed in vitro assays are mainly based on single biomarkers, which so far have not been able to classify and stratify chemicals into subgroups related to risk or potency. Methods Recently, we presented the Genomic Allergen Rapid Detection (GARD) assay for assessment of chemical sensitizers. In this paper, we show how the genome-wide readout of GARD can be expanded and used to identify differentially regulated pathways relating to individual chemical sensitizers. We investigated the mechanisms of action of a range of skin sensitizers through pathway identification, pathway classification and transcription factor analysis, and related these to the reactive mechanisms and potency of the sensitizing agents. Results By transcriptional profiling of chemically stimulated MUTZ-3 cells, 33 canonical pathways intimately involved in sensitization to chemical substances were identified. The results showed that metabolic processes, cell cycling and oxidative stress responses are the key events activated during skin sensitization, and that these functions are engaged differently depending on the reactivity mechanism of the sensitizing agent. Furthermore, the results indicate that the chemical reactivity groups gradually engage more pathways, and more molecules in each pathway, with increasing sensitizing potency of the chemical used for stimulation. Also, with increasing potency, a switch in gene regulation from up- to downregulation was seen both in genes involved in metabolic functions and in cell cycling. These pathway patterns were clearly reflected in the regulatory elements identified as driving these processes, of which 33 have been proposed for further analysis.
Conclusions This study demonstrates that functional analysis of biomarkers identified from our genomics study of human MUTZ-3 cells can be used to assess sensitizing potency of chemicals in vitro, by the identification of key cellular events, such as metabolic and cell cycling pathways. PMID:24517095

  2. Cognitive capital, equity and child-sensitive social protection in Asia and the Pacific

    PubMed Central

    Samson, Michael; Fajth, Gaspar; François, Daphne

    2016-01-01

    Promoting child development and welfare delivers human rights and builds sustainable economies through investment in ‘cognitive capital’. This analysis looks at conditions that support optimal brain development in childhood and highlights how social protection promotes these conditions and strengthens the achievement of the Sustainable Development Goals (SDGs) in Asia and the Pacific. Embracing child-sensitive social protection offers multiple benefits. The region has been a leader in global poverty reduction but the underlying pattern of economic growth exacerbates inequality and is increasingly unsustainable. The strategy of channelling low-skilled rural labour to industrial jobs left millions of children behind with limited opportunities for development. Building child-sensitive social protection and investing better in children's cognitive capacity could check these trends and trigger powerful long-term human capital development—enabling labour productivity to grow faster than populations age. While governments are investing more in social protection, the region's spending remains low by international comparison. Investment is particularly inadequate where it yields the highest returns: during the first 1000 days of life. Five steps are recommended for moving forward: (1) building cognitive capital by adjusting the region's development paradigms to better reflect the economic and social returns from investing in children; (2) understanding and tracking child poverty and vulnerability better; (3) progressively building universal, child-sensitive systems that strengthen comprehensive interventions within life cycle frameworks; (4) mobilising national resources for early childhood investments and child-sensitive social protection; and (5) leveraging the SDGs and other channels of national and international collaboration. PMID:28588990

  3. Integrated Droplet-Based Microextraction with ESI-MS for Removal of Matrix Interference in Single-Cell Analysis.

    PubMed

    Zhang, Xiao-Chao; Wei, Zhen-Wei; Gong, Xiao-Yun; Si, Xing-Yu; Zhao, Yao-Yao; Yang, Cheng-Dui; Zhang, Si-Chun; Zhang, Xin-Rong

    2016-04-29

    Integrating droplet-based microfluidics with mass spectrometry is essential for high-throughput, multiplexed analysis of single cells. Nevertheless, matrix effects such as interference from culture medium and intracellular components influence the sensitivity and accuracy of results in single-cell analysis. To resolve this problem, we developed a method that integrates droplet-based microextraction with single-cell mass spectrometry. A specific extraction solvent was used to selectively obtain the intracellular components of interest and remove interference from other components. Using this method, UDP-GlcNAc, GSH, GSSG, AMP, ADP and ATP were successfully detected in single MCF-7 cells. We also applied the method to study changes in unicellular metabolites during dysfunctional oxidative phosphorylation. The method not only achieves matrix-free, selective and sensitive detection of metabolites in single cells, but also enables reliable, high-throughput single-cell analysis.

  4. Measurements of 55Fe activity in activated steel samples with GEMPix

    NASA Astrophysics Data System (ADS)

    Curioni, A.; Dinar, N.; La Torre, F. P.; Leidner, J.; Murtas, F.; Puddu, S.; Silari, M.

    2017-03-01

    In this paper we present a novel method, based on the recently developed GEMPix detector, to measure the 55Fe content in samples of metallic material activated during operation of CERN accelerators and experimental facilities. The GEMPix, a gas detector with highly pixelated read-out, has been obtained by coupling a triple Gas Electron Multiplier (GEM) to a quad Timepix ASIC. Sample preparation, measurements performed on 45 samples and data analysis are described. The calibration factor (counts per second per unit specific activity) has been obtained from measurements of the 55Fe activity determined by radiochemical analysis of the same samples. The detection limit and the sensitivity with respect to the current Swiss exemption limit are calculated. Comparison with radiochemical analysis shows inconsistent sensitivity for only two samples, most likely due to underestimated uncertainties in the GEMPix analysis. An operative test phase of this technique is already planned at CERN.
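
The chain from count rate to activity described above can be sketched as follows. This is an illustrative outline, not the authors' analysis: the calibration-factor and background values are invented, and Currie's formula is a standard stand-in for whatever detection-limit estimator the paper actually used.

```python
import math

def specific_activity(net_cps, cal_factor):
    """Specific activity [Bq/g] from a net count rate [counts/s],
    given a calibration factor [counts/s per Bq/g]."""
    return net_cps / cal_factor

def currie_detection_limit(background_counts):
    """Currie's detection limit L_D (in counts) for a paired measurement."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

cal = 0.05    # hypothetical calibration factor, cps per Bq/g
bkg = 400.0   # hypothetical background counts in the counting window
print(specific_activity(0.8, cal))        # Bq/g for a 0.8 cps net rate
print(currie_detection_limit(bkg))        # minimum detectable counts
```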

  5. Magnetic particles as powerful purification tool for high sensitive mass spectrometric screening procedures.

    PubMed

    Peter, Jochen F; Otto, Angela M

    2010-02-01

    The effective isolation and purification of proteins from biological fluids is the most crucial step for successful protein analysis when only minute amounts are available. While conventional purification methods such as dialysis, ultrafiltration or protein precipitation often lead to a marked loss of protein, SPE with small-sized particles is a powerful alternative. The implementation of particles with superparamagnetic cores facilitates their handling and allows the use of particles in the nanometer to low micrometer range. Due to their small diameters, magnetic particles are advantageous for increasing sensitivity in subsequent MS analysis or gel electrophoresis. In recent years, different types of magnetic particles have been developed for specific protein purification purposes, followed by analysis or screening procedures using MS or SDS gel electrophoresis. In this review, the use of magnetic particles for different applications, such as the extraction and analysis of DNA/RNA, peptides and proteins, is described.

  6. [Individual responses of arterial pressure to geomagnetic activity in practically healthy subjects].

    PubMed

    Zenchenko, T A; Dimitrova, S; Stoilova, I; Breus, T K

    2009-01-01

    Dynamic analysis of arterial blood pressure (AP) in relation to the Earth's magnetic field perturbations was performed in 77 practically healthy volunteers (staff of the Bulgarian Academy of Sciences). Almost half of them proved magneto-sensitive, i.e. experienced AP elevation with increased geomagnetic activity. The probability of developing magnetic sensitivity was independent of age and gender but increased in volunteers with even mild cardiovascular pathology. These subjects complained of worsened health upon a rise in geomagnetic activity. However, some volunteers reported deteriorated well-being without AP elevation, which means that AP measurement alone may be insufficient for reliable monitoring of magnetic sensitivity.

  7. Aptamer-based microspheres for highly sensitive protein detection using fluorescently-labeled DNA nanostructures.

    PubMed

    Han, Daehoon; Hong, Jinkee; Kim, Hyun Cheol; Sung, Jong Hwan; Lee, Jong Bum

    2013-11-01

    Many highly sensitive protein detection techniques have been developed and have played an important role in the analysis of proteins. Herein, we report a novel technique that can detect proteins sensitively and effectively using aptamer-based DNA nanostructures. Thrombin was used as the target protein, and an aptamer was used to capture fluorescent dye-labeled DNA nanobarcodes or thrombin on a microsphere. The captured DNA nanobarcodes were then displaced through the thrombin-aptamer interaction. The detection ability of this approach was confirmed by flow cytometry at different concentrations of thrombin. Our detection method has great potential for rapid and simple protein detection with a variety of aptamers.

  8. Aerodynamic Shape Sensitivity Analysis and Design Optimization of Complex Configurations Using Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Newman, James C., III; Barnwell, Richard W.

    1997-01-01

    A three-dimensional unstructured grid approach to aerodynamic shape sensitivity analysis and design optimization has been developed and is extended to model geometrically complex configurations. The advantage of unstructured grids (when compared with a structured-grid approach) is their inherent ability to discretize irregularly shaped domains with greater efficiency and less effort. Hence, this approach is ideally suited for geometrically complex configurations of practical interest. In this work the nonlinear Euler equations are solved using an upwind, cell-centered, finite-volume scheme. The discrete, linearized systems which result from this scheme are solved iteratively by a preconditioned conjugate-gradient-like algorithm known as GMRES for the two-dimensional geometry and by a Gauss-Seidel algorithm for the three-dimensional geometry; similar procedures are used to solve the accompanying linear aerodynamic sensitivity equations in incremental iterative form. As shown, this particular form of the sensitivity equation makes large-scale gradient-based aerodynamic optimization possible by taking advantage of memory-efficient methods to construct exact Jacobian matrix-vector products. Simple parameterization techniques are utilized for demonstrative purposes. Once the surface has been deformed, the unstructured grid is adapted by considering the mesh as a system of interconnected springs. Grid sensitivities are obtained by differentiating the surface parameterization and the grid adaptation algorithms with ADIFOR (an advanced automatic-differentiation software tool). To demonstrate the ability of this procedure to analyze and design complex configurations of practical interest, the sensitivity analysis and shape optimization have been performed for a two-dimensional high-lift multielement airfoil and for a three-dimensional Boeing 747-200 aircraft.
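
The sensitivity equation mentioned above has the generic discrete form (dR/dq)(dq/dx) = -(dR/dx): once the nonlinear residual R(q, x) = 0 is solved for the state q, the state sensitivity follows from one linear solve with the same Jacobian. A toy sketch, with an invented two-equation "residual" standing in for the discretized Euler equations:

```python
import numpy as np

def residual(q, x):
    """Invented nonlinear residual R(q, x); a stand-in for a flow solver."""
    return np.array([q[0]**2 + q[1] - x, q[0] + q[1]**2 - 2.0 * x])

def solve_state(x, iters=50):
    """Drive R(q, x) to zero with Newton's method."""
    q = np.array([1.0, 1.0])
    for _ in range(iters):
        J = np.array([[2*q[0], 1.0], [1.0, 2*q[1]]])   # dR/dq
        q -= np.linalg.solve(J, residual(q, x))
    return q

x = 3.0
q = solve_state(x)
J = np.array([[2*q[0], 1.0], [1.0, 2*q[1]]])   # dR/dq at the converged state
dRdx = np.array([-1.0, -2.0])                  # dR/dx for this residual
dqdx = np.linalg.solve(J, -dRdx)               # analytic state sensitivity
fd = (solve_state(x + 1e-6) - q) / 1e-6        # finite-difference check
print(np.allclose(dqdx, fd, atol=1e-4))        # → True
```

In the paper the linear systems are solved iteratively (GMRES / Gauss-Seidel) rather than by a direct solve, but the structure of the sensitivity computation is the same.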

  9. Sensitivity Analysis on Remote Sensing Evapotranspiration Algorithm of Surface Energy Balance for Land

    NASA Astrophysics Data System (ADS)

    Wang, J.; Samms, T.; Meier, C.; Simmons, L.; Miller, D.; Bathke, D.

    2005-12-01

    Spatial evapotranspiration (ET) is usually estimated with the Surface Energy Balance Algorithm for Land (SEBAL). The average accuracy of the algorithm is 85% on a daily basis and 95% on a seasonal basis. However, the accuracy varies from 67% to 95% for instantaneous ET estimates and, as reported in 18 studies, from 70% to 98% for 1- to 10-day ET estimates. There is a need to understand the sensitivity of the ET calculation with respect to the algorithm's variables and equations. With an increased understanding, information can be developed to improve the algorithm and to better identify the key variables and equations. A Modified Surface Energy Balance Algorithm for Land (MSEBAL) was developed and validated with data from a pecan orchard and an alfalfa field. The MSEBAL uses ground reflectance and temperature data from ASTER sensors along with humidity, wind speed, and solar radiation data from a local weather station, and it outputs hourly and daily ET at 90 m by 90 m resolution. A sensitivity analysis of the ET calculation was conducted for MSEBAL. To observe the sensitivity of the calculation to a particular variable, the value of that variable was changed while the other variables were held constant. The key variables and equations to which the ET calculation is most sensitive were determined in this study. http://weather.nmsu.edu/pecans/SEBALFolder/San%20Francisco%20AGU%20meeting/ASensitivityAnalysisonMSE
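
The one-at-a-time procedure described above can be sketched as follows; the ET expression and input values are hypothetical stand-ins for the MSEBAL variables, used only to show the perturb-and-rank pattern:

```python
def toy_et(net_radiation, soil_heat_flux, sensible_heat, latent_heat_coef):
    """Hypothetical ET-like quantity from a simple energy-balance residual."""
    return latent_heat_coef * (net_radiation - soil_heat_flux - sensible_heat)

base = {"net_radiation": 500.0, "soil_heat_flux": 50.0,
        "sensible_heat": 150.0, "latent_heat_coef": 0.035}
et0 = toy_et(**base)

sensitivity = {}
for name in base:                      # perturb each input by +10%, one at a time
    perturbed = dict(base)
    perturbed[name] *= 1.10
    sensitivity[name] = (toy_et(**perturbed) - et0) / et0   # relative change

for name, s in sorted(sensitivity.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {s:+.2%}")         # rank inputs by influence on the output
```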

  10. Trace metal mapping by laser-induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaiser, Jozef; Novotny, Dr. Karel; Hrdlicka, A

    2012-01-01

    Laser-Induced Breakdown Spectroscopy (LIBS) is a sensitive optical technique capable of fast multi-elemental analysis of solid, gaseous and liquid samples. The potential of lasers for spectrochemical analysis was explored shortly after the laser's invention; however, the massive development of LIBS is connected with the availability of powerful pulsed laser sources. Since the late 1980s, LIBS has featured prominently on the analytical atomic spectroscopy scene, and its applications continue to be developed. Here we review the utilization of LIBS for trace element mapping in different matrices. The main emphasis is on trace metal mapping in biological samples.

  11. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part II. Model validation and sensitivity analysis

    USDA-ARS?s Scientific Manuscript database

    A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...

  12. Intermediate Maneuver Induced Rollover Simulation (IMIRS) and Sensitivity Analysis. Final Report

    DOT National Transportation Integrated Search

    1991-02-01

    This report describes the development of the Intermediate Maneuver Induced Rollover Simulation (IMIRS) which can be used to investigate the phenomenon of maneuver induced rollover of light vehicles. The IMIRS represents an enhancement of the existing...

  13. Nonlinear mathematical modeling and sensitivity analysis of hydraulic drive unit

    NASA Astrophysics Data System (ADS)

    Kong, Xiangdong; Yu, Bin; Quan, Lingxiao; Ba, Kaixian; Wu, Liujie

    2015-09-01

    Previous sensitivity analysis research is not sufficiently accurate and has limited reference value, because the mathematical models used are relatively simple, changes in load and in the initial displacement of the piston are ignored, and experimental verification is often not conducted. In view of these deficiencies, a nonlinear mathematical model is established in this paper, incorporating the dynamic characteristics of the servo valve, the nonlinear pressure-flow characteristics, the initial displacement of the servo cylinder piston, and friction nonlinearity. The transfer function block diagram is built for closed-loop position control of the hydraulic drive unit, along with the state equations. By deriving the time-varying coefficient matrix and the time-varying free-term matrix of the sensitivity equations, the expressions of the sensitivity equations based on the nonlinear mathematical model are obtained. According to the structural parameters of the hydraulic drive unit, the working parameters, the fluid transmission characteristics, and measured friction-velocity curves, simulation analysis of the hydraulic drive unit is completed on the MATLAB/Simulink platform with displacement steps of 2 mm, 5 mm, and 10 mm. The simulation results indicate that the developed nonlinear mathematical model is adequate, as shown by comparing experimental and simulated step-response curves under different constant loads. The sensitivity function time-history curves of seventeen parameters are then obtained from the state vector time-history curves of the step response. The maximum displacement variation percentage and the sum of the absolute displacement variations over the sampling time are both taken as sensitivity indexes. These index values are calculated and shown in histograms under different working conditions, and their patterns are analyzed.
    The sensitivity index values of four measurable parameters (supply pressure, proportional gain, initial position of the servo cylinder piston, and load force) are then verified experimentally on a hydraulic drive unit test platform; the experiments show that the sensitivity analysis results obtained through simulation approximate the test results. This research reveals the sensitivity characteristics of each parameter of the hydraulic drive unit and identifies the main and secondary performance-affecting parameters under different working conditions, providing a theoretical foundation for control compensation and structural optimization of the hydraulic drive unit.
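
The two sensitivity indexes can be sketched as below, on invented step-response samples. Note one assumption: the "variation percentage" is taken here relative to the commanded step, which the abstract does not state explicitly.

```python
def sensitivity_indexes(nominal, perturbed, step_mm):
    """Maximum displacement-variation percentage and the sum of absolute
    displacement variations over the sampling window."""
    variations = [p - n for n, p in zip(nominal, perturbed)]
    max_pct = max(abs(v) for v in variations) / step_mm * 100.0
    abs_sum = sum(abs(v) for v in variations)
    return max_pct, abs_sum

# invented displacement histories [mm] for a 2 mm step command:
nominal   = [0.0, 1.2, 1.9, 2.1, 2.0, 2.0]   # nominal parameter set
perturbed = [0.0, 1.3, 2.1, 2.2, 2.0, 2.0]   # one parameter varied
max_pct, abs_sum = sensitivity_indexes(nominal, perturbed, step_mm=2.0)
print(max_pct, abs_sum)
```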

  14. Ultrahigh-Sensitivity Piezoresistive Pressure Sensors for Detection of Tiny Pressure.

    PubMed

    Li, Hongwei; Wu, Kunjie; Xu, Zeyang; Wang, Zhongwu; Meng, Yancheng; Li, Liqiang

    2018-06-20

    High-sensitivity pressure sensors are crucial for ultrasensitive touch technology and E-skin, especially in the tiny-pressure range below 100 Pa. However, it is highly challenging to substantially promote sensitivity beyond the current level of several to 200 kPa⁻¹ and to improve the detection limit below 0.1 Pa, both of which are significant for the development of pressure sensors toward ultrasensitive and highly precise detection. Here, we develop an efficient strategy to greatly improve the sensitivity to nearly 2000 kPa⁻¹ using a short-channel coplanar device structure and a sharp microstructure, which is systematically proposed for the first time and rationalized by mathematical calculation and analysis. Significantly, benefiting from the ultrahigh sensitivity, the detection limit is improved to be as small as 0.075 Pa. The sensitivity and detection limit are both superior to current levels and far surpass the function of human skin. Furthermore, the sensor shows a fast response time (50 μs), excellent reproducibility and stability, and low power consumption. Remarkably, the sensor shows excellent detection capacity in the tiny-pressure range, including light-emitting diode switching with a pressure of 7 Pa, ringtone (2-20 Pa) recognition, and an ultrasensitive (0.1 Pa) electronic glove. This work represents performance and strategic progress in the field of pressure sensing.

  15. Rapid and sensitive detection of mink circovirus by recombinase polymerase amplification.

    PubMed

    Ge, Junwei; Shi, Yunjia; Cui, Xingyang; Gu, Shanshan; Zhao, Lili; Chen, Hongyan

    2018-06-01

    To date, the pathogenic role of mink circovirus (MiCV) remains unclear, and its prevalence and economic importance are unknown. A rapid and sensitive molecular diagnosis is therefore necessary for disease management and epidemiological surveillance; however, only PCR methods can currently identify MiCV infection. In this study, we developed a nested PCR and established a novel recombinase polymerase amplification (RPA) assay for MiCV detection. Sensitivity analysis showed that the detection limit of both the nested PCR and the RPA assay was 10¹ copies/reaction, making them more sensitive than conventional PCR, which has a detection limit of 10⁵ copies/reaction. The RPA assay had no cross-reactivity with other related viral pathogens, and amplification was completed in less than 20 min with a simple device. Further assessment of clinical samples showed that the two assays were accurate in identifying samples that were positive or negative by conventional PCR. The detection rate of MiCV by the RPA assay in clinical samples was 38.09%, which was 97% consistent with that of the nested PCR. The developed nested PCR is a highly sensitive tool for practical use, and the RPA assay is a simple, sensitive, and potentially useful alternative for rapid and accurate MiCV diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Time-Resolved Fluorescent Immunochromatography of Aflatoxin B1 in Soybean Sauce: A Rapid and Sensitive Quantitative Analysis.

    PubMed

    Wang, Du; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Zhang, Wen

    2016-07-14

    Rapid and quantitative sensing of aflatoxin B1 with high sensitivity and specificity has drawn increasing attention in studies of soybean sauce. A sensitive and rapid quantitative immunochromatographic sensing method, combining the advantages of time-resolved fluorescent sensing and immunochromatography, was developed for the detection of aflatoxin B1. The dynamic range of the competitive, portable immunoassay was 0.3-10.0 µg·kg⁻¹, with a limit of detection (LOD) of 0.1 µg·kg⁻¹ and recoveries of 87.2%-114.3%, within 10 min. The results showed good correlation (R² > 0.99) between the time-resolved fluorescent immunochromatographic strip test and high-performance liquid chromatography (HPLC). Analysis of soybean sauce samples using the strip test revealed that 64.2% of samples contained aflatoxin B1 at levels ranging from 0.31 to 12.5 µg·kg⁻¹. The strip test is a rapid, sensitive, quantitative, and cost-effective on-site screening technique for food safety analysis.

  17. Analysis of DNA Cytosine Methylation Patterns Using Methylation-Sensitive Amplification Polymorphism (MSAP).

    PubMed

    Guevara, María Ángeles; de María, Nuria; Sáez-Laguna, Enrique; Vélez, María Dolores; Cervera, María Teresa; Cabezas, José Antonio

    2017-01-01

    Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is the methylation-sensitive amplified polymorphism (MSAP) technique, a modification of amplified fragment length polymorphism (AFLP). It has been used to study the methylation of anonymous CCGG sequences in different fungi, plants, and animal species. The main modification in this technique resides in the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as the frequent-cutter restriction enzyme. For each sample, MSAP analysis is performed on both EcoRI/HpaII- and EcoRI/MspI-digested samples. A comparative analysis of the EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms: (1) methylation-insensitive polymorphisms, which show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples, and (2) methylation-sensitive polymorphisms, which are associated with amplified fragments that differ in presence, absence, or intensity between the EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol for this technique and discusses modifications that can be applied to adjust the technology to different species of interest.
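
The comparative pattern analysis described above can be sketched as a small classifier over presence/absence scores. The scoring data are hypothetical, and intensity differences (which MSAP also uses) are ignored for simplicity:

```python
def classify_fragment(hpaii, mspi):
    """Classify one amplified fragment from its presence (1) / absence (0)
    across samples in the EcoRI/HpaII and EcoRI/MspI digests (same order)."""
    if any(h != m for h, m in zip(hpaii, mspi)):
        return "methylation-sensitive"     # the two digests disagree
    if len(set(hpaii)) > 1:                # digests agree, but samples differ
        return "methylation-insensitive"
    return "monomorphic"                   # identical everywhere: no polymorphism

# hypothetical scoring of three fragments across four samples
print(classify_fragment([1, 1, 0, 1], [1, 1, 1, 1]))  # → methylation-sensitive
print(classify_fragment([1, 0, 1, 0], [1, 0, 1, 0]))  # → methylation-insensitive
print(classify_fragment([1, 1, 1, 1], [1, 1, 1, 1]))  # → monomorphic
```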

  18. LSENS: A General Chemical Kinetics and Sensitivity Analysis Code for homogeneous gas-phase reactions. Part 1: Theory and numerical solution procedures

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 1 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 1 derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved. The accuracy and efficiency of LSENS are examined by means of various test problems, and comparisons with other methods and codes are presented. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as a static system; steady, one-dimensional, inviscid flow; reaction behind an incident shock wave, including boundary-layer correction; and a perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions.
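
The sensitivity coefficients with respect to initial values that LSENS computes for static problems can be illustrated on a one-reaction toy problem (this sketch is not LSENS itself): for dy/dt = -k·y, the coefficient s = ∂y/∂y₀ satisfies its own companion ODE, ds/dt = -k·s with s(0) = 1, integrated here alongside the state with a simple explicit Euler scheme.

```python
import math

def integrate(y0, k=2.0, dt=1e-4, t_end=1.0):
    """Integrate the state y and its sensitivity s = dy/dy0 together."""
    y, s = y0, 1.0
    for _ in range(int(t_end / dt)):
        y += dt * (-k * y)     # state equation dy/dt = -k*y
        s += dt * (-k * s)     # sensitivity equation (same Jacobian, -k)
    return y, s

y, s = integrate(1.5)
# for this linear problem the analytic sensitivity is exp(-k*t):
print(abs(s - math.exp(-2.0)) < 1e-3)   # → True
```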

  19. Bead-based competitive fluorescence immunoassay for sensitive and rapid diagnosis of cyanotoxin risk in drinking water.

    PubMed

    Yu, Hye-Weon; Jang, Am; Kim, Lan Hee; Kim, Sung-Jo; Kim, In S

    2011-09-15

    Due to the increased occurrence of cyanobacterial blooms and their toxins in drinking water sources, effective management based on a sensitive and rapid analytical method is in high demand to secure safe water sources and protect environmental and human health. Here, a competitive fluorescence immunoassay of microcystin-LR (MCYST-LR) is developed in an attempt to improve the sensitivity, analysis time, and ease of manipulation of the analysis. To this end, a bead-based suspension assay was introduced with two major sensing elements: an antibody-conjugated quantum dot (QD) detection probe and an antigen-immobilized magnetic bead (MB) competitor. The assay comprises three steps: the competitive immunological reaction of QD detection probes against analytes and MB competitors, magnetic separation and washing, and the optical signal generation of the QDs. The fluorescence intensity was found to be inversely proportional to the MCYST-LR concentration. Under optimized conditions, the proposed assay performed well for the identification and quantitative analysis of MCYST-LR within 30 min, over the range 0.42-25 μg/L and with a limit of detection of 0.03 μg/L. It is thus expected that this enhanced assay can contribute to sensitive and rapid diagnosis of cyanotoxin risk in drinking water and to effective management procedures.
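
The inverse signal-concentration readout of a competitive assay like this one can be sketched as a calibration round trip: signal falls as analyte rises, so concentration is recovered by inverting a fitted calibration curve. The log-linear response and its parameters below are invented for illustration and are not the paper's fitted curve.

```python
import math

def signal(conc_ugL, a=1000.0, b=250.0):
    """Hypothetical competitive-assay response: intensity drops with log(conc)."""
    return a - b * math.log10(conc_ugL)

def concentration(intensity, a=1000.0, b=250.0):
    """Invert the calibration curve to estimate concentration [µg/L]."""
    return 10 ** ((a - intensity) / b)

c = 5.0
print(abs(concentration(signal(c)) - c) < 1e-9)   # round-trip check → True
```

In practice a four-parameter logistic fit is the more common choice for immunoassay calibration; the log-linear form is used here only to keep the inversion explicit.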

  20. Assessment of statistic analysis in non-radioisotopic local lymph node assay (non-RI-LLNA) with alpha-hexylcinnamic aldehyde as an example.

    PubMed

    Takeyoshi, Masahiro; Sawaki, Masakuni; Yamasaki, Kanji; Kimber, Ian

    2003-09-30

    The murine local lymph node assay (LLNA) is used for the identification of chemicals that have the potential to cause skin sensitization. However, it requires specific facilities and handling procedures to accommodate a radioisotopic (RI) endpoint. We have developed a non-radioisotopic (non-RI) LLNA endpoint based on BrdU incorporation that avoids the use of radioisotopes. Although this alternative method appears viable in principle, it is somewhat less sensitive than the standard assay. In this study, we report investigations into whether statistical analysis can improve the sensitivity of a non-RI LLNA procedure, using alpha-hexylcinnamic aldehyde (HCA) in two separate experiments. The alternative non-RI method required HCA concentrations of greater than 25% to elicit a positive response based on the criterion for classification as a skin sensitizer in the standard LLNA. Nevertheless, dose responses to HCA in the alternative method were consistent in both experiments, and we examined whether an endpoint based on the statistical significance of induced changes in LNC turnover, rather than a stimulation index (SI) of 3 or greater, might provide additional sensitivity. The results demonstrate that, with HCA at least, significant responses were recorded in each of two experiments following exposure of mice to 25% HCA. These data suggest that this approach may be more satisfactory, at least when BrdU incorporation is measured. Taken together, the data suggest that a modified LLNA in which BrdU is used in place of radioisotope incorporation shows some promise, but that in its present form, even with the use of a statistical endpoint, it lacks some of the sensitivity of the standard method. The challenge is to develop strategies for further refinement of this approach.
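
The two endpoints compared in this study can be sketched side by side on invented proliferation data, chosen so that the response misses the SI ≥ 3 cutoff yet is statistically significant, which is the situation the authors examine. Welch's t test stands in here for whatever statistical test was actually used.

```python
from statistics import mean
from math import sqrt

control = [1200.0, 1350.0, 1100.0, 1250.0]   # invented BrdU signals, vehicle
treated = [2900.0, 3100.0, 2700.0, 3000.0]   # invented BrdU signals, 25% HCA

si = mean(treated) / mean(control)           # stimulation index
print(si >= 3.0)                             # SI criterion: False here

def welch_t(a, b):
    """Welch's t statistic for an unequal-variance two-sample comparison."""
    va = sum((x - mean(a)) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mean(b)) ** 2 for x in b) / (len(b) - 1)
    return (mean(b) - mean(a)) / sqrt(va / len(a) + vb / len(b))

# compare against the two-sided 5% critical value (about 2.57 at ~5 df)
print(welch_t(control, treated) > 2.57)      # statistical endpoint: True here
```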

  1. Filaggrin mutations increase allergic airway disease in childhood and adolescence through interactions with eczema and aeroallergen sensitization.

    PubMed

    Chan, Adrian; Terry, William; Zhang, Hongmei; Karmaus, Wilfried; Ewart, Susan; Holloway, John W; Roberts, Graham; Kurukulaaratchy, Ramesh; Arshad, Syed Hasan

    2018-02-01

    Filaggrin loss-of-function (FLG-LOF) mutations are an established genetic cause of eczema. These mutations have subsequently been reported to increase the risk of aeroallergen sensitization and allergic airway disease. However, it is unclear whether FLG variants require both eczema and aeroallergen sensitization to influence long-term airway disease outcomes. We examined the effects of FLG-LOF mutations on allergic airway disease outcomes, with eczema and aeroallergen sensitization as intermediate variables, using the Isle of Wight birth cohort. Study participants were evaluated at ages 1, 2, 4, 10 and 18 years to ascertain the development of allergic diseases (eczema, asthma and allergic rhinitis) and aeroallergen sensitization (determined by skin prick tests). FLG-LOF mutations were genotyped in 1150 subjects. To understand the complex associations between FLG mutations, the intermediate variables (eczema and aeroallergen sensitization) and airway disease, path analysis was performed. There were significant total effects of FLG-LOF mutations on both asthma and allergic rhinitis at all ages, as well as on aeroallergen sensitization up to age 10. In the filaggrin-asthma analysis, a direct effect of FLG-LOF mutations was observed on early childhood eczema (ages 1 and 2 years) (relative risk (RR) 2.01, 95% CI: 1.74-2.31, P < .001), and all significant indirect pathways to asthma outcomes passed through eczema at these ages. In contrast, in the filaggrin-rhinitis model, FLG-LOF mutations exerted significant direct effects on early eczema as well as on rhinitis at 10 years (RR 1.99; 95% CI: 1.72-2.29, P = .002). FLG-LOF mutations are a significant risk factor for later childhood asthma and rhinitis. However, the pathway to asthma runs only through early childhood eczema, whereas a direct effect was observed for childhood rhinitis. © 2017 John Wiley & Sons Ltd.

  2. Shuttle cryogenic supply system optimization study. Volume 1: Management supply, sections 1 - 3

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An analysis of the cryogenic supply system for use on space shuttle vehicles was conducted. The major outputs of the analysis are: (1) evaluations of subsystem and integrated system concepts, (2) selection of representative designs, (3) parametric data and sensitivity studies, (4) evaluation of cryogenic cooling in environmental control subsystems, and (5) development of a mathematical model.

  3. Single-cell transcriptomics uncovers distinct molecular signatures of stem cells in chronic myeloid leukemia.

    PubMed

    Giustacchini, Alice; Thongjuea, Supat; Barkas, Nikolaos; Woll, Petter S; Povinelli, Benjamin J; Booth, Christopher A G; Sopp, Paul; Norfo, Ruggiero; Rodriguez-Meira, Alba; Ashley, Neil; Jamieson, Lauren; Vyas, Paresh; Anderson, Kristina; Segerstolpe, Åsa; Qian, Hong; Olsson-Strömberg, Ulla; Mustjoki, Satu; Sandberg, Rickard; Jacobsen, Sten Eirik W; Mead, Adam J

    2017-06-01

    Recent advances in single-cell transcriptomics are ideally placed to unravel intratumoral heterogeneity and selective resistance of cancer stem cell (SC) subpopulations to molecularly targeted cancer therapies. However, current single-cell RNA-sequencing approaches lack the sensitivity required to reliably detect somatic mutations. We developed a method that combines high-sensitivity mutation detection with whole-transcriptome analysis of the same single cell. We applied this technique to analyze more than 2,000 SCs from patients with chronic myeloid leukemia (CML) throughout the disease course, revealing heterogeneity of CML-SCs, including the identification of a subgroup of CML-SCs with a distinct molecular signature that selectively persisted during prolonged therapy. Analysis of nonleukemic SCs from patients with CML also provided new insights into cell-extrinsic disruption of hematopoiesis in CML associated with clinical outcome. Furthermore, we used this single-cell approach to identify a blast-crisis-specific SC population, which was also present in a subclone of CML-SCs during the chronic phase in a patient who subsequently developed blast crisis. This approach, which might be broadly applied to any malignancy, illustrates how single-cell analysis can identify subpopulations of therapy-resistant SCs that are not apparent through cell-population analysis.

  4. Recent Advances in Multidisciplinary Analysis and Optimization, part 3

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: aircraft design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  5. Recent Advances in Multidisciplinary Analysis and Optimization, part 2

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  6. Recent Advances in Multidisciplinary Analysis and Optimization, part 1

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  7. Development of Chiral LC-MS Methods for small Molecules and Their Applications in the Analysis of Enantiomeric Composition and Pharmacokinetic Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desai, Meera Jay

    The purpose of this research was to develop sensitive LC-MS methods for enantiomeric separation and detection, and then apply these methods to the determination of enantiomeric composition and to the study of the pharmacokinetic and pharmacodynamic properties of a chiral nutraceutical. Our first study evaluated the use of reverse phase and polar organic modes for chiral LC-API/MS method development. Reverse phase methods containing high water content were found to decrease ionization efficiency in electrospray, while polar organic methods offered good compatibility and low limits of detection with ESI. The use of lower flow rates dramatically increased the sensitivity by an order of magnitude. Additionally, for rapid chiral screening, the coupled Chirobiotic column afforded great applicability for LC-MS method development. Our second study continued chiral LC-MS method development, in this case for the normal phase mode. Ethoxynonafluorobutane (ENFB), a fluorocarbon with low flammability and no flashpoint, was used as a substitute for hexane/heptane mobile phases in LC-APCI/MS. Comparable chromatographic resolutions and selectivities were found using ENFB-substituted mobile phase systems, although peak efficiencies were significantly diminished. Limits of detection were either comparable or better for ENFB-MS over heptane-PDA detection. The miscibility of ENFB with a variety of commonly used organic modifiers provided flexibility in method development. For APCI, lower flow rates did not increase sensitivity as significantly as was previously found for ESI-MS detection. The chiral analysis of native amino acids was evaluated using both APCI and ESI sources. For free amino acids and small peptides, APCI was found to have better sensitivity than ESI at high flow rates. For larger peptides, however, sensitivity was greatly improved with the use of electrospray. Additionally, sensitivity was enhanced with the use of non-volatile additives. This optimized method was then used to enantiomerically separate all 19 native amino acids simultaneously in less than 20 minutes, making it suitable for complex biological analysis. The previously developed amino acid method was then used to enantiomerically separate theanine, a free amino acid found in tea leaves. Native theanine was found to have lower limits of detection and better sensitivity than derivatized theanine samples. The native theanine method was then used to determine the enantiomeric composition of six commercially available L-theanine products. Five of the six samples were found to be racemic mixtures of D- and L-theanine. Concern over the efficacy of these theanine products led to our final study, evaluating the pharmacokinetics and pharmacodynamics of theanine in rats using LC-ESI/MS. Rats were administered D-, L-, and DL-theanine both orally and intraperitoneally. Oral administration data demonstrated that intestinal absorption of L-theanine was greater than that of D-theanine, while i.p. data showed equal plasma uptake of both isomers. This suggested a possible competitive binding effect with respect to gut absorption. Additionally, it was found that regardless of administration route, the presence of the other enantiomer always decreased the overall theanine plasma concentration, indicating that D- and L-theanine exhibit competitive binding with respect to urinary reabsorption as well. The large quantities of D-theanine detected in the urine suggested that D-theanine was eliminated with minimal metabolism, while L-theanine was preferentially reabsorbed and metabolized to ethylamine. Clearly, the metabolic fate of racemic theanine and its individual enantiomers differs considerably, placing into doubt the utility of the commercial theanine products.

  8. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.

    PubMed

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
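The GSA step described above can be illustrated with a minimal, self-contained sketch (toy model and invented coefficients, not the paper's agent-based model or parameter values): a crude double-loop Monte Carlo estimate of Sobol' first-order indices for a stand-in "efficiency" function whose inputs are loosely named after the sensitive parameters the study reports.

```python
import random

# Toy stand-in for a stochastic hydrolysis model (hypothetical function,
# not the paper's ABM); inputs are named after the sensitive parameters
# the study reports, each scaled to Uniform(0, 1).
def efficiency(half_life, exo_activity, composition):
    return 3.0 * half_life + 2.0 * exo_activity + 1.0 * composition

def first_order_index(which, n_outer=200, n_inner=200, seed=1):
    """Crude double-loop Monte Carlo estimate of the Sobol' first-order
    index S_i = Var_{x_i}(E[Y | x_i]) / Var(Y)."""
    rng = random.Random(seed)
    cond_means, all_y = [], []
    for _ in range(n_outer):
        fixed = rng.random()                      # freeze input `which`
        ys = []
        for _ in range(n_inner):
            x = [rng.random(), rng.random(), rng.random()]
            x[which] = fixed
            ys.append(efficiency(*x))
        cond_means.append(sum(ys) / n_inner)
        all_y.extend(ys)

    def var(v):
        m = sum(v) / len(v)
        return sum((u - m) ** 2 for u in v) / len(v)

    return var(cond_means) / var(all_y)

s = [first_order_index(i) for i in range(3)]
# For this linear model the exact indices are 9/14, 4/14, 1/14, so the
# estimates should rank half_life > exo_activity > composition.
```

Production GSA codes use far more efficient estimators (e.g. Saltelli sampling), but the double loop makes the variance-of-conditional-mean definition explicit.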

  9. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis

    PubMed Central

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement. PMID:24830736

  10. Investigating and understanding fouling in a planar setup using ultrasonic methods.

    PubMed

    Wallhäusser, E; Hussein, M A; Becker, T

    2012-09-01

    Fouling is an unwanted deposit on heat transfer surfaces and occurs regularly in foodstuff heat exchangers. Fouling causes high costs because heat exchangers must be cleaned and cleaning success cannot easily be monitored; the cleaning cycles used in the foodstuff industry are therefore usually too long. In this paper, a setup is described with which it is possible, first, to produce dairy protein fouling similar to that found in industrial heat exchangers and, second, to detect the presence and absence of such fouling using an ultrasonic measuring method. The developed setup resembles a planar heat exchanger in which fouling can be produced and cleaned reproducibly. Fouling presence, absence, and cleaning progress can be monitored using an ultrasonic detection unit. The setup is described theoretically in terms of electrical and mechanical lumped circuits, from which the wave equation and the transfer function are derived for a sensitivity analysis. The sensitivity analysis identified the influencing quantities and showed that fouling is measurable. First experimental results are also compared with the results of the sensitivity analysis.

  11. Comparison of Two Global Sensitivity Analysis Methods for Hydrologic Modeling over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Hameed, M.; Demirel, M. C.; Moradkhani, H.

    2015-12-01

    The Global Sensitivity Analysis (GSA) approach helps identify the relative influence of model parameters and inputs and thus provides essential information about model performance. In this study, the effects of the Sacramento Soil Moisture Accounting (SAC-SMA) model parameters, forcing data, and initial conditions are analysed using two GSA methods: Sobol' and the Fourier Amplitude Sensitivity Test (FAST). The simulations are carried out over five sub-basins within the Columbia River Basin (CRB) for three different periods: one year, four years, and seven years. Four factors are considered and evaluated with the two sensitivity analysis methods: the simulation length, the parameter ranges, the model initial conditions, and the reliability of the GSA methods themselves. The reliability of the sensitivity analysis results is compared based on (1) the agreement between the two methods (Sobol' and FAST) in highlighting the same parameters or inputs as the most influential and (2) how consistently the methods rank these sensitive parameters under the same conditions (sub-basins and simulation length). The results show coherence between the Sobol' and FAST sensitivity analysis methods. Additionally, the FAST method is found to be sufficient for evaluating the main effects of the model parameters and inputs. Another conclusion of this study is that the smaller the parameter or initial-condition ranges, the more consistent and coherent the results of the two sensitivity analysis methods.
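For a purely additive model, the first-order indices that both Sobol' and FAST estimate have a closed form, which makes a handy sanity check for method agreement. A minimal sketch (the SAC-SMA parameter names are used only as labels; the coefficients are invented):

```python
# For an additive model Y = sum(a_i * x_i) with independent
# x_i ~ Uniform(0, 1), the Sobol' first-order indices have the closed
# form S_i = a_i^2 / sum(a_j^2), and they sum to 1 (no interactions).
# Coefficients below are invented; the names are SAC-SMA parameter
# labels used purely for illustration.
coeffs = {"UZTWM": 4.0, "LZTWM": 2.0, "PCTIM": 1.0}

total = sum(a * a for a in coeffs.values())
s1 = {name: a * a / total for name, a in coeffs.items()}

ranking = sorted(s1, key=s1.get, reverse=True)
print(s1)       # UZTWM ~0.762, LZTWM ~0.190, PCTIM ~0.048
print(ranking)  # ['UZTWM', 'LZTWM', 'PCTIM']
```

Any pair of GSA methods applied to such a model should reproduce both the values and, more robustly, the ranking, which is the agreement criterion the study uses.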

  12. Sample pooling for real-time PCR detection and virulence determination of the footrot pathogen Dichelobacter nodosus.

    PubMed

    Frosth, Sara; König, Ulrika; Nyman, Ann-Kristin; Aspán, Anna

    2017-09-01

    Dichelobacter nodosus is the principal cause of ovine footrot, and strain virulence is an important factor in disease severity. Therefore, detection and virulence determination of D. nodosus is important for proper diagnosis of the disease. Today this is possible with real-time PCR analysis. Analysis of large numbers of samples is costly and laborious; therefore, pooling of individual samples is common in surveillance programs. However, pooling can reduce the sensitivity of the method. The aim of this study was to develop a pooling method for real-time PCR analysis that would allow sensitive detection and simultaneous virulence determination of D. nodosus. A total of 225 sheep from 17 flocks were sampled using ESwabs within the Swedish Footrot Control Program in 2014. Samples were first analysed individually and then in pools of five by real-time PCR assays targeting the 16S rRNA and aprV2/B2 genes of D. nodosus. Each pool consisted of four D. nodosus-negative samples and one positive sample with varying amounts of the bacterium. In the individual analysis, 61 (27.1%) samples were positive in the 16S rRNA and aprV2/B2 PCR assays and 164 (72.9%) samples were negative. All samples positive in the aprV2/B2 PCR assay were of the aprB2 variant. The pooled analysis showed that all 41 pools were likewise positive for D. nodosus 16S rRNA and the aprB2 variant. The diagnostic sensitivity for pooled and individual samples was therefore similar. Our method includes concentration of the bacteria before DNA extraction, which may account for the maintenance of diagnostic sensitivity. Diagnostic sensitivity in the real-time PCR assays of the pooled samples was comparable to that obtained for individually analysed samples. Even subclinical infections could be detected in the pooled PCR samples, which is important for control of the disease. This method may therefore be implemented in footrot control programs, where it can replace analysis of individual samples.
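The dilution cost of pooling can be sanity-checked with back-of-the-envelope PCR arithmetic (the Ct values and cutoff below are assumptions for illustration, not the study's data):

```python
import math

# A 1-in-5 pool dilutes one positive swab with four negatives, cutting
# the template 5-fold; for an ideal PCR that doubles product each
# cycle, the threshold cycle (Ct) is delayed by log2(5) cycles.
dilution = 5
delta_ct = math.log2(dilution)

individual_ct = 30.0                 # assumed Ct of the sample tested alone
pool_ct = individual_ct + delta_ct   # expected Ct of the 1-in-5 pool
cutoff = 40.0                        # assumed assay positivity cutoff

print(f"expected Ct shift: {delta_ct:.2f} cycles")          # ~2.32
print(f"pooled Ct: {pool_ct:.1f} (detected: {pool_ct < cutoff})")
```

The study's concentration step before DNA extraction works in the opposite direction to this dilution penalty, which is consistent with the reported lack of sensitivity loss in pools.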

  13. Redesign of a Variable-Gain Output Feedback Longitudinal Controller Flown on the High-Alpha Research Vehicle (HARV)

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.

    1998-01-01

    This paper describes a redesigned longitudinal controller that flew on the High-Alpha Research Vehicle (HARV) during calendar years (CY) 1995 and 1996. Linear models are developed for both the modified controller and a baseline controller that was flown in CY 1994. The modified controller was developed with three gain sets for flight evaluation, and several linear analysis results are shown comparing the gain sets. A Neal-Smith flying qualities analysis shows that performance for the low- and medium-gain sets is near the level 1 boundary, depending upon the bandwidth assumed, whereas the high-gain set indicates a sensitivity problem. A newly developed high-alpha Bode envelope criterion indicates that the control system gains may be slightly high, even for the low-gain set. A large motion-base simulator in the United Kingdom was used to evaluate the various controllers. Desired performance, which appeared to be satisfactory for flight, was generally met with both the low- and medium-gain sets. Both the high-gain set and the baseline controller were very sensitive, and it was easy to generate pilot-induced oscillation (PIO) in some of the target-tracking maneuvers. Flight target-tracking results varied from level 1 to level 3 and from no sensitivity to PIO. These results were related to pilot technique and whether actuator rate saturation was encountered.

  14. Feedback‐amplified electrochemical dual‐plate boron‐doped diamond microtrench detector for flow injection analysis

    PubMed Central

    Lewis, Grace E. M.; Gross, Andrew J.; Kasprzyk‐Hordern, Barbara; Lubben, Anneke T.

    2015-01-01

    An electrochemical flow cell with a boron‐doped diamond dual‐plate microtrench electrode has been developed and demonstrated for hydroquinone flow injection electroanalysis in phosphate buffer at pH 7. Using the electrochemical generator‐collector feedback detector improves the sensitivity by one order of magnitude (when compared to a single working electrode detector). The diffusion process is switched from an analyte-consuming "external" process to an analyte-regenerating "internal" process, with benefits in selectivity and sensitivity. PMID:25735831

  15. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype in an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.
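The key-handling pattern described above, where the file sits encrypted on disk and the key never leaves the card, can be sketched in a few lines. This is a simplified stand-in: the `SimulatedSmartCard` class and its hash-counter XOR keystream are illustrative inventions, not the prototype's actual 3DES implementation or API, and a real design would also use per-file IVs rather than a fixed keystream.

```python
import hashlib
import secrets

class SimulatedSmartCard:
    """Holds the secret key; the host never reads it directly.
    The cipher here is a hash-counter keystream XOR -- a stand-in for
    the 3DES used by the actual prototype, not an implementation of it."""

    def __init__(self):
        self._key = secrets.token_bytes(24)   # 3DES-sized key (24 bytes)

    def _keystream(self, n: int) -> bytes:
        out = bytearray()
        counter = 0
        while len(out) < n:
            out += hashlib.sha256(
                self._key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(out[:n])

    def transform(self, data: bytes) -> bytes:
        # XOR with the keystream: the same call encrypts and decrypts.
        return bytes(a ^ b for a, b in zip(data, self._keystream(len(data))))

card = SimulatedSmartCard()
plaintext = b"proprietary engine deck: fan map v7"   # hypothetical file
ciphertext = card.transform(plaintext)    # before simulation: encrypt
restored = card.transform(ciphertext)     # authorized run: decrypt
```

The point of the pattern is that `ciphertext` is all that ever rests on disk; losing the card (the only holder of `_key`) renders the file unreadable.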

  16. Modeling Effects of Annealing on Coal Char Reactivity to O2 and CO2, Based on Preparation Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy; Bhat, Sham; Marcy, Peter

    Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive computational fluid dynamics (CFD) simulations are valuable tools in evaluating and deploying oxyfuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive combustor simulations require physically realistic submodels with low computational requirements. A recent sensitivity analysis of a detailed char conversion model (Char Conversion Kinetics (CCK)) found thermal annealing to be an extremely sensitive submodel. In the present work, further analysis of the previous annealing model revealed significant disagreement with numerous datasets from experiments performed after that annealing model was developed. The annealing model was accordingly extended to reflect the experimentally observed reactivity loss due to thermal annealing for a variety of coals under diverse char preparation conditions. The model extension was informed by a Bayesian calibration analysis. In addition, since oxyfuel conditions include extraordinarily high levels of CO2, the development of a first CO2 reactivity loss model due to annealing is presented.

  17. Modeling Effects of Annealing on Coal Char Reactivity to O2 and CO2, Based on Preparation Conditions

    DOE PAGES

    Holland, Troy; Bhat, Sham; Marcy, Peter; ...

    2017-08-25

    Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive computational fluid dynamics (CFD) simulations are valuable tools in evaluating and deploying oxyfuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive combustor simulations require physically realistic submodels with low computational requirements. A recent sensitivity analysis of a detailed char conversion model (Char Conversion Kinetics (CCK)) found thermal annealing to be an extremely sensitive submodel. In the present work, further analysis of the previous annealing model revealed significant disagreement with numerous datasets from experiments performed after that annealing model was developed. The annealing model was accordingly extended to reflect the experimentally observed reactivity loss due to thermal annealing for a variety of coals under diverse char preparation conditions. The model extension was informed by a Bayesian calibration analysis. In addition, since oxyfuel conditions include extraordinarily high levels of CO2, the development of a first CO2 reactivity loss model due to annealing is presented.

  18. Development of a headspace GC/MS analysis for carbonyl compounds (aldehydes and ketones) in household products after derivatization with o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine.

    PubMed

    Sugaya, Naeko; Sakurai, Katsumi; Nakagawa, Tomoo; Onda, Nobuhiko; Onodera, Sukeo; Morita, Masatoshi; Tezuka, Masakatsu

    2004-05-01

    Carbonyl compounds (aldehydes and ketones) are suspected to be among the chemical compounds responsible for Sick Building Syndrome and Multiple Chemical Sensitivities. A headspace gas chromatography/mass spectrometry (GC/MS) analysis for these compounds was developed using derivatization into volatile derivatives with o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine (PFBOA). For GC/MS detection, two ionization modes, electron impact ionization (EI) and negative chemical ionization (NCI), were compared; the NCI mode proved better because of its higher selectivity and sensitivity. This headspace GC/MS method (NCI mode) was then employed for the analysis of aldehydes and ketones in household materials (fiber products, adhesives, and printed materials). Formaldehyde was detected in the range of N.D. (not detected) to 39 microg/g; acetaldehyde, N.D. to 4.1 microg/g; propionaldehyde, N.D. to 1.0 microg/g; n-butyraldehyde, N.D. to 0.10 microg/g; and acetone, N.D. to 3.1 microg/g in the samples analyzed.

  19. Phase sensitive spectral domain interferometry for label free biomolecular interaction analysis and biosensing applications

    NASA Astrophysics Data System (ADS)

    Chirvi, Sajal

    Biomolecular interaction analysis (BIA) plays a vital role in a wide variety of fields, including biomedical research, the pharmaceutical industry, medical diagnostics, and the biotechnology industry. The study and quantification of interactions between natural biomolecules (proteins, enzymes, DNA) and artificially synthesized molecules (drugs) is routinely done using various labeled and label-free BIA techniques. Labeled BIA techniques (chemiluminescence, fluorescence, radioactive labeling) suffer from steric hindrance of labels at the interaction site, the difficulty of attaching labels to molecules, and the higher cost and time of assay development. Label-free techniques with real-time detection capabilities have demonstrated advantages over traditional labeled techniques. The gold standard for label-free BIA is surface plasmon resonance (SPR), which detects and quantifies changes in the refractive index of the ligand-analyte complex with high sensitivity. Although SPR is a highly sensitive BIA technique, it requires custom-made sensor chips and is not well suited to the highly multiplexed BIA required in high-throughput applications; moreover, its implementation on various biosensing platforms is limited. In this research work, spectral domain phase sensitive interferometry (SD-PSI) has been developed for label-free BIA and biosensing applications to address the limitations of SPR and other label-free techniques. One distinct advantage of SD-PSI over other label-free techniques is that it does not require custom-fabricated biosensor substrates: laboratory-grade, off-the-shelf glass or plastic substrates of suitable thickness with proper surface functionalization are used as biosensor chips. SD-PSI is tested on four separate BIA and biosensing platforms: a multi-well plate, a flow cell, a fiber probe with integrated optics, and a fiber-tip biosensor. A sensitivity of 33 ng/ml for anti-IgG is achieved using the multi-well platform. The principle of coherence multiplexing for multi-channel label-free biosensing applications is introduced. Simultaneous interrogation of multiple biosensors is achievable with a single spectral domain phase sensitive interferometer by coding the individual sensograms in coherence-multiplexed channels. Experimental results are presented demonstrating multiplexed quantitative biomolecular interaction analysis of antibodies binding to antigen-coated, functionalized biosensor chip surfaces on the different platforms.
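The quantity such a phase-sensitive detector tracks is the interferometric phase of the optical path length. A generic textbook relation (the numbers are assumed for illustration, not measured values from this work):

```python
import math

# Double-pass reflection geometry: a change in optical path length
# (refractive index x thickness) of delta_nd shifts the detected phase
# by delta_phi = (4 * pi / wavelength) * delta_nd.
wavelength_nm = 840.0    # assumed source center wavelength
delta_nd_nm = 0.05       # assumed optical-path change from a binding layer

delta_phi = 4.0 * math.pi / wavelength_nm * delta_nd_nm   # radians
print(f"phase shift: {delta_phi * 1e3:.2f} mrad")         # ~0.75 mrad
```

Sub-milliradian phase resolution is what lets spectral-domain interferometric readouts register the sub-nanometer optical-thickness changes produced by biomolecular binding.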

  20. Moving beyond the mother-child dyad: exploring the link between maternal sensitivity and siblings' attachment styles.

    PubMed

    Kennedy, Mark; Betts, Lucy R; Underwood, Jean D M

    2014-01-01

    Attachment theory asserts that secure attachment representations are developed through sensitive and consistent caregiving. If sensitive caregiving is a constant characteristic of the parent, then siblings should have concordant attachment classifications. The authors explored maternal attachment quality assessed by the Attachment Q-Set, maternal sensitivity, and specific mother-child interactions between siblings. Hour-long observations took place in the homes of 9 preschool sibling pairs and their immediate caregivers. The interactions were analyzed using a modified version of Bales' Small Group Analysis. The results reveal attachment discordance in a third of sibling pairs. While maternal sensitivity was higher with older siblings and mothers displayed more positive emotions when interacting with their younger siblings, attachment quality was not associated with birth order. Therefore, a shift toward a more contextual, family-based perspective of attachment is recommended to further understand how attachment strategies are created and maintained within the child's everyday context.

  1. Economic Evaluation of First-Line Treatments for Metastatic Renal Cell Carcinoma: A Cost-Effectiveness Analysis in A Health Resource–Limited Setting

    PubMed Central

    Wu, Bin; Dong, Baijun; Xu, Yuejuan; Zhang, Qiang; Shen, Jinfang; Chen, Huafeng; Xue, Wei

    2012-01-01

    Background: To estimate, from the perspective of the Chinese healthcare system, the economic outcomes of five different first-line strategies among patients with metastatic renal cell carcinoma (mRCC). Methods and Findings: A decision-analytic model was developed to simulate the lifetime disease course associated with renal cell carcinoma. The health and economic outcomes of five first-line strategies (interferon-alfa, interleukin-2, interleukin-2 plus interferon-alfa, sunitinib and bevacizumab plus interferon-alfa) were estimated and assessed by indirect comparison. The clinical and utility data were taken from published studies. The cost data were estimated from local charge data and current Chinese practices. Sensitivity analyses were used to explore the impact of uncertainty regarding the results. The impact of the sunitinib patient assistant program (SPAP) was evaluated via scenario analysis. The base-case analysis showed that the sunitinib strategy yielded the maximum health benefits: 2.71 life years and 1.40 quality-adjusted life-years (QALY). The marginal cost-effectiveness (cost per additional QALY) gained via the sunitinib strategy compared with the conventional strategy was $220,384 (without SPAP, interleukin-2 plus interferon-alfa and bevacizumab plus interferon-alfa were dominated) and $16,993 (with SPAP, interferon-alfa, interleukin-2 plus interferon-alfa and bevacizumab plus interferon-alfa were dominated). In general, the results were sensitive to the hazard ratio of progression-free survival. The probabilistic sensitivity analysis demonstrated that the sunitinib strategy with SPAP was the most cost-effective approach when the willingness-to-pay threshold was over $16,000. Conclusions: Our analysis suggests that traditional cytokine therapy is the cost-effective option in the Chinese healthcare setting. In some relatively developed regions, sunitinib with SPAP may be a favorable cost-effective alternative for mRCC. PMID:22412884

  2. Economic evaluation of first-line treatments for metastatic renal cell carcinoma: a cost-effectiveness analysis in a health resource-limited setting.

    PubMed

    Wu, Bin; Dong, Baijun; Xu, Yuejuan; Zhang, Qiang; Shen, Jinfang; Chen, Huafeng; Xue, Wei

    2012-01-01

    To estimate, from the perspective of the Chinese healthcare system, the economic outcomes of five different first-line strategies among patients with metastatic renal cell carcinoma (mRCC). A decision-analytic model was developed to simulate the lifetime disease course associated with renal cell carcinoma. The health and economic outcomes of five first-line strategies (interferon-alfa, interleukin-2, interleukin-2 plus interferon-alfa, sunitinib and bevacizumab plus interferon-alfa) were estimated and assessed by indirect comparison. The clinical and utility data were taken from published studies. The cost data were estimated from local charge data and current Chinese practices. Sensitivity analyses were used to explore the impact of uncertainty regarding the results. The impact of the sunitinib patient assistant program (SPAP) was evaluated via scenario analysis. The base-case analysis showed that the sunitinib strategy yielded the maximum health benefits: 2.71 life years and 1.40 quality-adjusted life-years (QALY). The marginal cost-effectiveness (cost per additional QALY) gained via the sunitinib strategy compared with the conventional strategy was $220,384 (without SPAP, interleukin-2 plus interferon-alfa and bevacizumab plus interferon-alfa were dominated) and $16,993 (with SPAP, interferon-alfa, interleukin-2 plus interferon-alfa and bevacizumab plus interferon-alfa were dominated). In general, the results were sensitive to the hazard ratio of progression-free survival. The probabilistic sensitivity analysis demonstrated that the sunitinib strategy with SPAP was the most cost-effective approach when the willingness-to-pay threshold was over $16,000. Our analysis suggests that traditional cytokine therapy is the cost-effective option in the Chinese healthcare setting. In some relatively developed regions, sunitinib with SPAP may be a favorable cost-effective alternative for mRCC.
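The model's headline quantity, cost per additional QALY, reduces to a one-line incremental cost-effectiveness ratio (ICER). A minimal sketch with illustrative inputs, not the paper's figures:

```python
# ICER = (cost_new - cost_old) / (QALY_new - QALY_old), compared
# against a willingness-to-pay threshold. All inputs are hypothetical.
def icer(cost_new, qaly_new, cost_old, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)

value = icer(cost_new=60_000, qaly_new=1.40, cost_old=9_000, qaly_old=0.90)
threshold = 16_000   # willingness-to-pay per QALY, as in the abstract

print(f"${value:,.0f} per additional QALY")   # $102,000 per additional QALY
print("cost-effective" if value <= threshold else "not cost-effective")
```

A strategy is "dominated", as several are in the abstract, when another strategy is both cheaper and yields more QALYs, so no ICER comparison against the threshold is needed.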

  3. Pilot study of a program delivered within the regular service system in Germany: effect of a short-term attachment-based intervention on maternal sensitivity in mothers at risk for child abuse and neglect.

    PubMed

    Pillhofer, Melanie; Spangler, Gottfried; Bovenschen, Ina; Kuenster, Anne K; Gabler, Sandra; Fallon, Barbara; Fegert, Joerg M; Ziegenhain, Ute

    2015-04-01

This pilot study examined the effectiveness of a short-term attachment-based intervention, the Ulm Model, in a German population at risk for child abuse and neglect. The intervention used home visits and video feedback to promote maternal sensitivity, and was implemented by trained staff within the health care and youth welfare systems. Mothers in the control group (n=33) received standard services only, while those in the intervention group (n=63) additionally received the Ulm Model intervention. The outcomes measured were maternal sensitivity, as assessed by the CARE-Index at pre-intervention, after the last session, and at about 6 and 12 months of age; and infant socio-emotional development, as assessed by the ET6-6 development test at about 6 and 12 months of age. The moderating effects on treatment outcomes of two variables were examined: risk for child abuse (moderate vs. high) and type of maternal attachment representation (secure vs. insecure). Among participants at moderate risk for child abuse, no differences were found between the intervention group and control group in either maternal sensitivity or infant development. Among those considered high risk, mothers in the intervention group showed a significant increase in maternal sensitivity from pre- to post-intervention; however, no group differences were seen at follow-up. There were some indications that infants of mothers in the intervention group showed better emotional development. The variable of maternal attachment representation was not a significant moderator for the intervention effect, but post hoc analysis indicated that the mean sensitivity of secure mothers was significantly higher at the 6-month follow-up. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Sensitivity Analysis in Sequential Decision Models.

    PubMed

    Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet

    2017-02-01

Sequential decision problems, which are commonly solved using Markov decision processes (MDPs), are frequently encountered in medical decision making. Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against the uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically in the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in that policy for a given willingness-to-pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
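
    The multivariate confidence estimate described above (propagate joint parameter uncertainty through the MDP solution and count how often the base-case optimal policy remains optimal) can be sketched on a toy problem. The two-state treatment MDP, the uncertain cost distribution, and all numbers below are illustrative assumptions, not the authors' case study:

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """P[a] is an (nS, nS) transition matrix, R[a] an (nS,) reward vector.
    Returns the greedy optimal policy (one action index per state)."""
    nA, nS = len(P), P[0].shape[0]
    V = np.zeros(nS)
    while True:
        Q = np.array([R[a] + gamma * P[a] @ V for a in range(nA)])
        V_new = Q.max(axis=0)
        if np.abs(V_new - V).max() < tol:
            return Q.argmax(axis=0)
        V = V_new

# Two states (healthy, sick), two actions (wait, treat); treating improves
# the transition probabilities but carries an uncertain cost.
P = [np.array([[0.90, 0.10], [0.20, 0.80]]),   # wait
     np.array([[0.95, 0.05], [0.60, 0.40]])]   # treat

def rewards(cost):
    # Hypothetical rewards: +1 healthy, -1 sick; treating costs `cost`
    # but adds a benefit of 3.0 when applied to the sick state.
    return [np.array([1.0, -1.0]), np.array([1.0 - cost, -1.0 - cost + 3.0])]

base_cost = 2.0
base_policy = value_iteration(P, rewards(base_cost))

# Probabilistic multivariate step: sample the uncertain parameter and
# measure how often the base-case optimal policy survives.
rng = np.random.default_rng(0)
n_samples = 500
agree = sum(np.array_equal(value_iteration(P, rewards(c)), base_policy)
            for c in rng.normal(base_cost, 0.5, n_samples))
confidence = agree / n_samples   # fraction of samples agreeing with base case
```

    Sweeping the acceptance fraction over a range of willingness-to-accept values would trace out a policy-acceptability-curve-style summary.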

  5. Critique and sensitivity analysis of the compensation function used in the LMS Hudson River striped bass models. Environmental Sciences Division publication No. 944

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Winkle, W.; Christensen, S.W.; Kauffman, G.

    1976-12-01

The description and justification for the compensation function developed and used by Lawler, Matusky and Skelly Engineers (LMS) (under contract to Consolidated Edison Company of New York) in their Hudson River striped bass models are presented. A sensitivity analysis of this compensation function is reported, based on computer runs with a modified version of the LMS completely mixed (spatially homogeneous) model. Two types of sensitivity analysis were performed: a parametric study involving at least five levels for each of the three parameters in the compensation function, and a study of the form of the compensation function itself, involving comparison of the LMS function with functions having no compensation at standing crops either less than or greater than the equilibrium standing crops. For the range of parameter values used in this study, estimates of percent reduction are least sensitive to changes in YS, the equilibrium standing crop, and most sensitive to changes in KXO, the minimum mortality rate coefficient. Eliminating compensation at standing crops either less than or greater than the equilibrium standing crops results in higher estimates of percent reduction. For all values of KXO and for values of YS and KX at and above the baseline values, eliminating compensation at standing crops less than the equilibrium standing crops results in a greater increase in percent reduction than eliminating compensation at standing crops greater than the equilibrium standing crops.

  6. Evaluation of high throughput gene expression platforms using a genomic biomarker signature for prediction of skin sensitization.

    PubMed

    Forreryd, Andy; Johansson, Henrik; Albrekt, Ann-Sofie; Lindstedt, Malin

    2014-05-16

Allergic contact dermatitis (ACD) develops upon exposure to certain chemical compounds termed skin sensitizers. To reduce the occurrence of skin sensitizers, chemicals are regularly screened for their capacity to induce sensitization. The recently developed Genomic Allergen Rapid Detection (GARD) assay is an in vitro alternative to animal testing for identification of skin sensitizers, classifying chemicals by evaluating transcriptional levels of a genomic biomarker signature. During assay development and biomarker identification, genome-wide expression analysis was applied using microarrays covering approximately 30,000 transcripts. However, the microarray platform suffers from drawbacks in terms of low sample throughput, high cost per sample and time-consuming protocols, and is a limiting factor for adaptation of GARD into a routine assay for screening of potential sensitizers. With the aim of simplifying assay procedures, improving technical parameters and increasing sample throughput, we assessed the performance of three high throughput gene expression platforms (nCounter®, BioMark HD™ and OpenArray®) and correlated their performance metrics against our previously generated microarray data. We measured the levels of 30 transcripts from the GARD biomarker signature across 48 samples. Detection sensitivity, reproducibility, correlations and overall structure of gene expression measurements were compared across platforms. Gene expression data from all of the evaluated platforms could be used to classify most of the sensitizers from non-sensitizers in the GARD assay. Results also showed high data quality and acceptable reproducibility for all platforms, but only medium to poor correlations of expression measurements across platforms. In addition, the evaluated platforms were superior to the microarray platform in terms of cost efficiency, simplicity of protocols and sample throughput.
We evaluated the performance of three non-array based platforms using a limited set of transcripts from the GARD biomarker signature. We demonstrated that it was possible to achieve acceptable discriminatory power, in terms of separation between sensitizers and non-sensitizers in the GARD assay, while reducing assay costs, simplifying assay procedures and increasing sample throughput by using an alternative platform, providing a first step towards preparing GARD for formal validation and adapting the assay for industrial screening of potential sensitizers.

  7. A sensitivity analysis comparing two procedures for adjusting as-measured spectra to reference conditions

    DOT National Transportation Integrated Search

    2002-04-01

The Society of Automotive Engineers (SAE) Aerospace Recommended Practice (ARP) No. 866A (866A), and a procedure utilizing pure-tone absorption equations developed in support of the International Organization for Standardization's (ISO) 9613-...

  8. Control of Wheel/Rail Noise and Vibration

    DOT National Transportation Integrated Search

    1982-04-01

    An analytical model of the generation of wheel/rail noise has been developed and validated through an extensive series of field tests carried out at the Transportation Test Center using the State of the Art Car. A sensitivity analysis has been perfor...

  9. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, SOBOL's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080

  10. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, SOBOL's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
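
    One of the global methods named above, the partial rank correlation coefficient (PRCC), is straightforward to sketch: rank-transform the inputs and the output, remove the linear effect of the other ranked inputs from each input and from the output, and correlate the residuals. This is a minimal illustration, not SBML-SAT's implementation; the toy model and sample size are assumptions:

```python
import numpy as np

def _ranks(a):
    # Simple rank transform (assumes no ties, which holds for continuous samples).
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(1, len(a) + 1)
    return r

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with output y."""
    Xr = np.column_stack([_ranks(X[:, j]) for j in range(X.shape[1])])
    yr = _ranks(y)
    n, k = Xr.shape
    out = np.empty(k)
    for j in range(k):
        A = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
        # Residuals after removing the linear effect of the other ranked inputs.
        rx = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
        ry = yr - A @ np.linalg.lstsq(A, yr, rcond=None)[0]
        out[j] = np.corrcoef(rx, ry)[0, 1]
    return out

# Toy nonlinear model: x0 dominates, x1 is weak, x2 is inert.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(500, 3))
y = np.exp(3 * X[:, 0]) + 0.1 * X[:, 1]
s = prcc(X, y)   # s[0] near 1, s[2] near 0
```

    Because PRCC works on ranks, it captures monotone nonlinear relationships (such as the exponential term here) that a plain linear correlation would understate.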

  11. Basic research for the geodynamics program

    NASA Technical Reports Server (NTRS)

    1991-01-01

The mathematical models of space very long baseline interferometry (VLBI) observables suitable for least squares covariance analysis were derived and estimability problems inherent in the space VLBI system were explored, including a detailed rank defect analysis and sensitivity analysis. An important aim is to carry out a comparative analysis of the mathematical models of the ground-based VLBI and space VLBI observables in order to describe the background in detail. Computer programs were developed in order to check the relations, assess errors, and analyze sensitivity. In order to investigate the estimability of different geodetic and geodynamic parameters from the space VLBI observables, the mathematical models for time delay and time delay rate observables of space VLBI were analytically derived along with the partial derivatives with respect to the parameters. Rank defect analysis was carried out both by analytical and numerical testing of linear dependencies between the columns of the normal matrix thus formed. Definite conclusions were formed about the rank defects in the system.

  12. IATA for skin sensitization potential – 1 out of 2 or 2 out of 3? ...

    EPA Pesticide Factsheets

    To meet EU regulatory requirements and to avoid or minimize animal testing, there is a need for non-animal methods to assess skin sensitization potential. Given the complexity of the skin sensitization endpoint, there is an expectation that integrated testing and assessment approaches (IATA) will need to be developed which rely on assays representing key events in the pathway. Three non-animal assays have been formally validated: the direct peptide reactivity assay (DPRA), the KeratinoSensTM assay and the h-CLAT assay. At the same time, there have been many efforts to develop IATA with the “2 out of 3” approach attracting much attention whereby a chemical is classified on the basis of the majority outcome. A set of 271 chemicals with mouse, human and non-animal sensitization test data was evaluated to compare the predictive performances of the 3 individual non-animal assays, their binary combinations and the ‘2 out of 3’ approach. The analysis revealed that the most predictive approach was to use both the DPRA and h-CLAT: 1. Perform DPRA – if positive, classify as a sensitizer; 2. If negative, perform h-CLAT – a positive outcome denotes a sensitizer, a negative, a non-sensitizer. With this approach, 83% (LLNA) and 93% (human) of the non-sensitizer predictions were correct, in contrast to the ‘2 out of 3’ approach which had 69% (LLNA) and 79% (human) of non-sensitizer predictions correct. The views expressed are those of the authors and do not ne
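
    The sequential strategy described in the abstract (DPRA first; h-CLAT only when the DPRA is negative) reduces to a short decision function. The function name and boolean interface below are hypothetical, used only to make the decision logic explicit:

```python
def classify_skin_sensitizer(dpra_positive, hclat_positive=None):
    """Sequential testing strategy from the abstract:
    1. Perform DPRA; if positive, classify as a sensitizer.
    2. If DPRA is negative, perform h-CLAT; a positive outcome denotes a
       sensitizer, a negative outcome a non-sensitizer."""
    if dpra_positive:
        return "sensitizer"
    if hclat_positive is None:
        raise ValueError("DPRA was negative: an h-CLAT result is required")
    return "sensitizer" if hclat_positive else "non-sensitizer"
```

    Note that, unlike the "2 out of 3" majority rule, the second assay is only run when the first is negative, which is part of what the abstract credits for the improved non-sensitizer predictions.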

  13. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models of different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  14. Processing postmortem specimens with C18-carboxypropylbetaine and analysis by PCR to develop an antemortem test for Mycobacterium avium infections in ducks.

    PubMed

    Thornton, C G; Cranfield, M R; MacLellan, K M; Brink, T L; Strandberg, J D; Carlin, E A; Torrelles, J B; Maslow, J N; Hasson, J L; Heyl, D M; Sarro, S J; Chatterjee, D; Passen, S

    1999-03-01

    Mycobacterium avium is the causative agent of the avian mycobacteriosis commonly known as avian tuberculosis (ATB). This infection causes disseminated disease, is difficult to diagnose, and is of serious concern because it causes significant mortality in birds. A new method was developed for processing specimens for an antemortem screening test for ATB. This novel method uses the zwitterionic detergent C18-carboxypropylbetaine (CB-18). Blood, bone marrow, bursa, and fecal specimens from 28 ducks and swabs of 20 lesions were processed with CB-18 for analysis by smear, culture, and polymerase chain reaction (PCR). Postmortem examination confirmed nine of these birds as either positive or highly suspect for disseminated disease. The sensitivities of smear, culture, and PCR, relative to postmortem analysis and independent of specimen type, were 44.4%, 88.9%, and 100%, respectively, and the specificities were 84.2%, 57.9%, and 15.8%, respectively. Reductions in specificity were due primarily to results among fecal specimens. However, these results were clustered among a subset of birds, suggesting that these tests actually identified birds in early stages of the disease. Restriction fragment length polymorphism mapping identified one strain of M. avium (serotype 1) that was isolated from lesions, bursa, bone marrow, blood, and feces of all but three of the culture-positive birds. In birds with confirmed disease, blood had the lowest sensitivity and the highest specificity by all diagnostic methods. Swabs of lesions provided the highest sensitivity by smear and culture (33.3% and 77.8%, respectively), whereas fecal specimens had the highest sensitivity by PCR (77.8%). The results of this study indicate that processing fecal specimens with CB-18, followed by PCR analysis, may provide a valuable first step for monitoring the presence of ATB in birds.

  15. Parametric sensitivity analysis of an agro-economic model of management of irrigation water

    NASA Astrophysics Data System (ADS)

    El Ouadi, Ihssan; Ouazar, Driss; El Menyari, Younesse

    2015-04-01

The current work aims to build an analysis and decision support tool for policy options concerning the optimal allocation of water resources, while allowing better reflection on the valuation of water by the agricultural sector in particular. To this end, a model disaggregated by farm type was developed for the rural town of Ait Ben Yacoub, located in eastern Morocco. This model integrates economic, agronomic and hydraulic data and simulates agricultural gross margin in this area, taking into consideration changes in public policy and climatic conditions as well as the competition for collective resources. To identify the model input parameters that most influence the results, a parametric sensitivity analysis was performed using the "One-Factor-At-A-Time" approach within the "Screening Designs" method. Preliminary results of this analysis show that, among the 10 parameters analyzed, 6 significantly affect the objective function of the model; in order of influence, these are: i) coefficient of crop yield response to water, ii) average daily gain in weight of livestock, iii) exchange of livestock reproduction, iv) maximum yield of crops, v) supply of irrigation water and vi) precipitation. These 6 parameters have sensitivity indexes ranging between 0.22 and 1.28. These results indicate high uncertainty in these parameters, which can dramatically skew the model results, and underscore the need to pay particular attention to their estimates. Keywords: water, agriculture, modeling, optimal allocation, parametric sensitivity analysis, Screening Designs, One-Factor-At-A-Time, agricultural policy, climate change.
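
    The One-Factor-At-A-Time idea is simple to sketch: perturb one input at a time from a baseline and report the relative change in the objective per relative change in the input. The toy margin model and baseline values below are illustrative assumptions, not the Ait Ben Yacoub model:

```python
import numpy as np

def oat_sensitivity(model, x0, rel_step=0.1):
    """Relative one-factor-at-a-time index per parameter:
    (relative change in output) / (relative change in input)."""
    y0 = model(x0)
    idx = np.empty(len(x0))
    for j in range(len(x0)):
        x = x0.copy()
        x[j] = x0[j] * (1 + rel_step)      # perturb one factor at a time
        idx[j] = ((model(x) - y0) / y0) / rel_step
    return idx

# Toy 'gross margin' model: quadratic in the first factor, linear in the
# second, and only weakly dependent on the third.
margin = lambda p: p[0] ** 2 * p[1] + 0.01 * p[2]
s = oat_sensitivity(margin, np.array([2.0, 3.0, 5.0]))
# s is approximately [2.09, 1.00, 0.004]: the quadratic factor is about
# twice as influential as the linear one, and the third is negligible.
```

    Ranking parameters by such indexes is exactly the screening step described above; factors below a chosen threshold would be fixed at nominal values in subsequent analyses.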

  16. Maternal Responsiveness and Sensitivity Re-Considered: Some Is More

    PubMed Central

    Bornstein, Marc H.; Manian, Nanmathi

    2013-01-01

    Is it always or necessarily the case that common and important parenting practices are better insofar as they occur more often, or worse because they occur less often? Perhaps, less is more, or some is more. To address this question, we studied mothers’ microcoded contingent responsiveness to their infants (M = 5.4 months, SD = 0.2) in relation to independent global judgments of the same mothers’ parenting sensitivity. In a community sample of 335 European American dyads, videorecorded infant and maternal behaviors were timed microanalytically throughout an extended home observation; separately and independently, global maternal sensitivity was rated macroanalytically. Sequential analysis and spline regression showed that, as maternal contingent responsiveness increased, judged maternal sensitivity increased to significance on the contingency continuum, after which mothers who were even more contingent were judged less sensitive. Just significant levels of maternal responsiveness are deemed optimally sensitive. Implications of these findings for typical and atypical parenting, child development, and intervention science are discussed. PMID:24229542

  17. Monte Carlo sensitivity analysis of land surface parameters using the Variable Infiltration Capacity model

    NASA Astrophysics Data System (ADS)

    Demaria, Eleonora M.; Nijssen, Bart; Wagener, Thorsten

    2007-06-01

    Current land surface models use increasingly complex descriptions of the processes that they represent. Increase in complexity is accompanied by an increase in the number of model parameters, many of which cannot be measured directly at large spatial scales. A Monte Carlo framework was used to evaluate the sensitivity and identifiability of ten parameters controlling surface and subsurface runoff generation in the Variable Infiltration Capacity model (VIC). Using the Monte Carlo Analysis Toolbox (MCAT), parameter sensitivities were studied for four U.S. watersheds along a hydroclimatic gradient, based on a 20-year data set developed for the Model Parameter Estimation Experiment (MOPEX). Results showed that simulated streamflows are sensitive to three parameters when evaluated with different objective functions. Sensitivity of the infiltration parameter (b) and the drainage parameter (exp) were strongly related to the hydroclimatic gradient. The placement of vegetation roots played an important role in the sensitivity of model simulations to the thickness of the second soil layer (thick2). Overparameterization was found in the base flow formulation indicating that a simplified version could be implemented. Parameter sensitivity was more strongly dictated by climatic gradients than by changes in soil properties. Results showed how a complex model can be reduced to a more parsimonious form, leading to a more identifiable model with an increased chance of successful regionalization to ungauged basins. Although parameter sensitivities are strictly valid for VIC, this model is representative of a wider class of macroscale hydrological models. Consequently, the results and methodology will have applicability to other hydrological models.

  18. Comparison between Surrogate Indexes of Insulin Sensitivity/Resistance and Hyperinsulinemic Euglycemic Glucose Clamps in Rhesus Monkeys

    PubMed Central

    Lee, Ho-Won; Muniyappa, Ranganath; Yan, Xu; Yue, Lilly Q.; Linden, Ellen H.; Chen, Hui; Hansen, Barbara C.

    2011-01-01

    The euglycemic glucose clamp is the reference method for assessing insulin sensitivity in humans and animals. However, clamps are ill-suited for large studies because of extensive requirements for cost, time, labor, and technical expertise. Simple surrogate indexes of insulin sensitivity/resistance including quantitative insulin-sensitivity check index (QUICKI) and homeostasis model assessment (HOMA) have been developed and validated in humans. However, validation studies of QUICKI and HOMA in both rats and mice suggest that differences in metabolic physiology between rodents and humans limit their value in rodents. Rhesus monkeys are a species more similar to humans than rodents. Therefore, in the present study, we evaluated data from 199 glucose clamp studies obtained from a large cohort of 86 monkeys with a broad range of insulin sensitivity. Data were used to evaluate simple surrogate indexes of insulin sensitivity/resistance (QUICKI, HOMA, Log HOMA, 1/HOMA, and 1/Fasting insulin) with respect to linear regression, predictive accuracy using a calibration model, and diagnostic performance using receiver operating characteristic. Most surrogates had modest linear correlations with SIClamp (r ≈ 0.4–0.64) with comparable correlation coefficients. Predictive accuracy determined by calibration model analysis demonstrated better predictive accuracy of QUICKI than HOMA and Log HOMA. Receiver operating characteristic analysis showed equivalent sensitivity and specificity of most surrogate indexes to detect insulin resistance. Thus, unlike in rodents but similar to humans, surrogate indexes of insulin sensitivity/resistance including QUICKI and log HOMA may be reasonable to use in large studies of rhesus monkeys where it may be impractical to conduct glucose clamp studies. PMID:21209021
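
    The two surrogate indexes evaluated here have simple closed forms: HOMA-IR = fasting glucose (mg/dL) × fasting insulin (µU/mL) / 405, and QUICKI = 1 / (log10 fasting insulin + log10 fasting glucose) in the same units. A small sketch (the example values are illustrative, not data from the monkey cohort):

```python
import math

def homa_ir(glucose_mg_dl, insulin_uU_ml):
    """HOMA-IR; the constant 405 applies when glucose is in mg/dL
    and insulin in uU/mL (22.5 for glucose in mmol/L)."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

def quicki(glucose_mg_dl, insulin_uU_ml):
    """QUICKI = 1 / (log10(fasting insulin) + log10(fasting glucose))."""
    return 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mg_dl))

# Fasting glucose 90 mg/dL and insulin 8 uU/mL:
homa_ir(90, 8)   # ~1.78
quicki(90, 8)    # ~0.35
```

    Note the inverse relationship: higher HOMA values indicate more insulin resistance, while higher QUICKI values indicate more insulin sensitivity, which is why the study also examines Log HOMA and 1/HOMA.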

  19. Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation

    NASA Astrophysics Data System (ADS)

    Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten

    2015-04-01

Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero, but in a sampling-based framework they regularly take non-zero values. However, little guidance is available for these two steps in environmental modelling. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing level of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the value of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the value of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical values of the sample sizes that are reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
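
    The bootstrap-based convergence criterion can be illustrated generically: resample the model evaluations, recompute the sensitivity statistic, and treat the width of the resulting confidence interval as the convergence measure, growing the sample until the width falls below a tolerance. The toy model, the correlation-based "index", and the sample sizes below are assumptions for illustration, not the paper's Hymod/HBV/SWAT setup:

```python
import numpy as np

def bootstrap_ci_width(data, stat, n_boot=500, alpha=0.05, seed=0):
    """Width of the bootstrap (1 - alpha) confidence interval for a statistic."""
    rng = np.random.default_rng(seed)
    n = len(data)
    vals = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
    lo, hi = np.quantile(vals, [alpha / 2.0, 1.0 - alpha / 2.0])
    return hi - lo

# Toy 'sensitivity index': correlation between one input and the output.
rng = np.random.default_rng(2)
x = rng.uniform(size=5000)
y = x ** 2 + 0.1 * rng.normal(size=5000)
data = np.column_stack([x, y])
index = lambda d: np.corrcoef(d[:, 0], d[:, 1])[0, 1]

# The CI narrows roughly as 1/sqrt(n); the estimate can be declared
# 'converged' once the width drops below a chosen tolerance.
w_small = bootstrap_ci_width(data[:200], index)
w_large = bootstrap_ci_width(data, index)      # narrower than w_small
```

    The same resampling machinery applies to ranking and screening convergence: instead of the index value, the resampled statistic becomes the parameter ordering or the set of parameters below the screening threshold.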

  20. Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.

    PubMed

    Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S

    1994-01-01

    The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.

  1. CXTFIT/Excel A modular adaptable code for parameter estimation, sensitivity analysis and uncertainty analysis for laboratory or field tracer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Guoping; Mayes, Melanie; Parker, Jack C

    2010-01-01

We implemented the widely used CXTFIT code in Excel to provide flexibility and added sensitivity and uncertainty analysis functions to improve transport parameter estimation and to facilitate model discrimination for multi-tracer experiments on structured soils. Analytical solutions for one-dimensional equilibrium and nonequilibrium convection dispersion equations were coded as VBA functions so that they could be used as ordinary math functions in Excel for forward predictions. Macros with user-friendly interfaces were developed for optimization, sensitivity analysis, uncertainty analysis, error propagation, response surface calculation, and Monte Carlo analysis. As a result, any parameter with transformations (e.g., dimensionless, log-transformed, species-dependent reactions, etc.) could be estimated with uncertainty and sensitivity quantification for multiple tracer data at multiple locations and times. Prior information and observation errors could be incorporated into the weighted nonlinear least squares method with a penalty function. Users are able to change selected parameter values and view the results via embedded graphics, resulting in a flexible tool applicable to modeling transport processes and to teaching students about parameter estimation. The code was verified by comparing to a number of benchmarks with CXTFIT 2.0. It was applied to improve parameter estimation for four typical tracer experiment data sets in the literature using multi-model evaluation and comparison. Additional examples were included to illustrate the flexibilities and advantages of CXTFIT/Excel. The VBA macros were designed for general purpose and could be used for any parameter estimation/model calibration when the forward solution is implemented in Excel. A step-by-step tutorial, example Excel files and the code are provided as supplemental material.
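
    As one example of the closed-form solutions CXTFIT-style codes evaluate, the classic Ogata-Banks solution of the one-dimensional equilibrium convection-dispersion equation for a continuous source can be coded directly. This is a sketch, not the CXTFIT/Excel VBA code itself, and the parameter values are illustrative:

```python
import math

def cde_concentration(x, t, v, D, c0=1.0):
    """Ogata-Banks analytical solution of the 1D equilibrium
    convection-dispersion equation for a continuous source of
    concentration c0 at x = 0 (x: distance, t: time, v: pore-water
    velocity, D: dispersion coefficient)."""
    denom = 2.0 * math.sqrt(D * t)
    a = (x - v * t) / denom
    b = (x + v * t) / denom
    return 0.5 * c0 * (math.erfc(a) + math.exp(v * x / D) * math.erfc(b))

# Breakthrough curve at x = 10 cm for v = 1 cm/h, D = 0.5 cm^2/h:
early = cde_concentration(10.0, 1.0, 1.0, 0.5)   # front has not yet arrived
mid = cde_concentration(10.0, 10.0, 1.0, 0.5)    # front passing the point
late = cde_concentration(10.0, 50.0, 1.0, 0.5)   # fully broken through, ~c0
```

    Fitting v and D to observed breakthrough data by nonlinear least squares on this forward function is the core of the parameter estimation workflow the abstract describes.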

  2. SEM analysis of ionizing radiation effects in linear integrated circuits. [Scanning Electron Microscope]

    NASA Technical Reports Server (NTRS)

    Stanley, A. G.; Gauthier, M. K.

    1977-01-01

    A successful diagnostic technique was developed using a scanning electron microscope (SEM) as a precision tool to determine ionization effects in integrated circuits. Previous SEM methods irradiated the entire semiconductor chip or major areas of it. Such large-area exposure methods do not reveal exactly which components are sensitive to radiation. To locate these sensitive components, a new method was developed that consists of successively irradiating selected components on the device chip with equal electron doses of 10^6 rad(Si), while the whole device is subjected to representative bias conditions. A suitable device parameter was measured in situ after each successive irradiation with the beam off.

  3. Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    USGS Publications Warehouse

    Rakovec, O.; Hill, Mary C.; Clark, M.P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.

    2014-01-01

    This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based “local” methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative “bucket-style” hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
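    The core DELSA quantity can be sketched briefly: at each sampled parameter set, a derivative-based first-order index is computed from local gradients and prior parameter variances, and the distribution of these local indices across the parameter space is the result of interest. A minimal Python sketch, using a toy two-parameter model and assumed prior variances rather than the paper's hydrologic models:

```python
import random

def delsa_first_order(f, theta, prior_var, h=1e-6):
    """Local first-order DELSA-style indices at one parameter set:
    S_j = (df/dtheta_j)^2 * var_j / sum_k (df/dtheta_k)^2 * var_k."""
    base = f(*theta)
    contrib = []
    for j, v in enumerate(prior_var):
        up = list(theta)
        up[j] += h
        g = (f(*up) - base) / h          # forward-difference derivative
        contrib.append(g * g * v)
    total = sum(contrib)
    return [c / total for c in contrib]

def toy_model(k, s):
    # hypothetical stand-in for a hydrologic model output
    return 10.0 * k ** s

random.seed(1)
prior_var = [0.02, 0.05]                 # assumed prior parameter variances
samples = [(random.uniform(0.1, 0.9), random.uniform(0.5, 2.0))
           for _ in range(100)]
# the distribution of local indices across parameter space is the point
indices = [delsa_first_order(toy_model, list(p), prior_var) for p in samples]
```

    Each row of `indices` sums to one; scanning how the leading index changes from sample to sample mirrors the paper's observation that parameter importance varies through parameter space.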

  4. LSENS, A General Chemical Kinetics and Sensitivity Analysis Code for Homogeneous Gas-Phase Reactions. Part 2; Code Description and Usage

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; Bittker, David A.

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part II of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part II describes the code, how to modify it, and its usage, including preparation of the problem data file required to execute LSENS. Code usage is illustrated by several example problems, which further explain preparation of the problem data file and show how to obtain desired accuracy in the computed results. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions. Part I (NASA RP-1328) derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved by LSENS. Part III (NASA RP-1330) explains the kinetics and kinetics-plus-sensitivity-analysis problems supplied with LSENS and presents sample results.
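    The kinetics-plus-sensitivity computation described above can be illustrated on a single reaction: differentiating the rate equation with respect to a rate coefficient yields a companion ODE for the sensitivity coefficient, which is integrated alongside the state. A minimal explicit-Euler sketch with made-up numbers (LSENS itself uses far more capable stiff solvers for whole mechanisms):

```python
import math

# For a single reaction dy/dt = -k*y, the sensitivity coefficient
# s = dy/dk obeys the companion ODE ds/dt = -y - k*s (differentiate the
# rate equation with respect to k).  Explicit Euler, made-up numbers:
k, y0, dt, T = 0.5, 2.0, 1e-4, 4.0
y, s = y0, 0.0
for _ in range(int(round(T / dt))):
    y, s = y + dt * (-k * y), s + dt * (-y - k * s)

# analytic check: y = y0*exp(-k*t)  =>  dy/dk = -t*y0*exp(-k*t)
analytic = -T * y0 * math.exp(-k * T)
```

    The integrated sensitivity coefficient agrees with the analytic derivative to the order of the step size, which is the consistency check one would apply before trusting a larger mechanism.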

  5. Developing a methodology for the inverse estimation of root architectural parameters from field based sampling schemes

    NASA Astrophysics Data System (ADS)

    Morandage, Shehan; Schnepf, Andrea; Vanderborght, Jan; Javaux, Mathieu; Leitner, Daniel; Laloy, Eric; Vereecken, Harry

    2017-04-01

    Root traits are increasingly important in the breeding of new crop varieties. For example, longer and fewer lateral roots are suggested to improve drought resistance of wheat; thus, detailed root architectural parameters are important. However, classical field sampling of roots only provides aggregated information such as root length density (coring), root counts per area (trenches), or root arrival curves at certain depths (rhizotubes). We investigate the possibility of obtaining information about the root system architecture of plants from classical field-based root sampling schemes, using sensitivity analysis and inverse parameter estimation. This methodology was developed in a virtual experiment in which a root architectural model, parameterized for winter wheat, was used to simulate root system development in a field. This information provided the ground truth, which is normally unknown in a real field experiment. The three sampling schemes (coring, trenching, and rhizotubes) were virtually applied and the aggregated information computed. The Morris OAT global sensitivity analysis method was then performed to determine the most sensitive parameters of the root architecture model for the three different sampling methods. The estimated means and standard deviations of the elementary effects of 37 parameters were evaluated. Upper and lower bounds of the parameters were obtained from literature and published data on winter wheat root architectural parameters. Root length density profiles from coring, arrival curve characteristics observed in rhizotubes, and root counts in grids of the trench profile method were evaluated statistically to investigate the influence of each parameter using five different error functions. Number of branches, insertion angle, inter-nodal distance, and elongation rate are the most sensitive parameters, and parameter sensitivity varies slightly with depth. Most parameters and their interactions with the other parameters show highly nonlinear effects on the model output. The most sensitive parameters will be subject to inverse estimation from the virtual field sampling data using the DREAMzs algorithm. The estimated parameters can then be compared with the ground truth to determine the suitability of the sampling schemes for identifying specific traits or parameters of the root growth model.
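    The Morris OAT method used above can be sketched compactly: elementary effects are finite differences taken along one parameter at a time from random base points, summarized by the mean of their absolute values (importance) and their standard deviation (nonlinearity and interactions). A minimal Python sketch on a toy linear function, not the root architecture model:

```python
import random

def morris_oat(f, bounds, r=20, delta=0.1, seed=0):
    """Morris one-at-a-time elementary effects.  For r random base points,
    each parameter is perturbed in turn by 'delta' (in unit-scaled space)
    and EE_j = (f(x + delta*e_j) - f(x)) / delta is recorded.  Returns the
    mean of |EE| (importance) and std of EE (nonlinearity/interactions)."""
    rng = random.Random(seed)
    k = len(bounds)
    ee = [[] for _ in range(k)]
    scale = lambda uu: [lo + ui * (hi - lo) for ui, (lo, hi) in zip(uu, bounds)]
    for _ in range(r):
        u = [rng.uniform(0.0, 1.0 - delta) for _ in range(k)]
        fx = f(scale(u))
        for j in range(k):
            u2 = list(u)
            u2[j] += delta
            ee[j].append((f(scale(u2)) - fx) / delta)
    mu_star = [sum(abs(v) for v in col) / r for col in ee]
    means = [sum(col) / r for col in ee]
    sigma = [(sum((v - m) ** 2 for v in col) / (r - 1)) ** 0.5
             for col, m in zip(ee, means)]
    return mu_star, sigma

# toy check on a linear function: mu* recovers each slope, sigma is ~0
mu_star, sigma = morris_oat(lambda x: 2.0 * x[0] + 0.5 * x[1], [(0, 1), (0, 1)])
```

    For a purely linear model the elementary effects are constant, so sigma vanishes; nonzero sigma is what flags the nonlinear, interacting parameters reported in the abstract.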

  6. A preliminary result of three-dimensional microarray technology to gene analysis with endoscopic ultrasound-guided fine-needle aspiration specimens and pancreatic juices

    PubMed Central

    2010-01-01

    Background: Analysis of gene expression and gene mutation may add information beyond ordinary pathological tissue diagnosis. Since samples obtained endoscopically are very small, more sensitive technologies for gene analysis are needed. We investigated whether gene expression and gene mutation analysis with a newly developed ultra-sensitive three-dimensional (3D) microarray is possible using small samples from endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA) specimens and pancreatic juices. Methods: Small samples from 17 EUS-FNA specimens and 16 pancreatic juices were obtained. After nucleic acid extraction, the samples were amplified with labeling and analyzed by the 3D microarray. Results: The analyzable rate with the microarray was 46% (6/13) for EUS-FNA specimens stored in RNAlater®, and RNA degradation was observed in all frozen-storage samples. For pancreatic juices, the analyzable rate was 67% (4/6) for frozen-storage samples and 20% (2/10) for RNAlater® storage. EUS-FNA specimens were classified into cancer and non-cancer by gene expression analysis, and K-ras codon 12 mutations were also detected using the 3D microarray. Conclusions: Gene analysis from small samples obtained endoscopically was possible with the newly developed 3D microarray technology. High-quality RNA from EUS-FNA samples was obtained and remained in good condition only when using the RNA stabilizer. In contrast, high-quality RNA from pancreatic juice samples was obtained only with frozen storage without the RNA stabilizer. PMID:20416107

  7. Liver fat, visceral adiposity, and sleep disturbances contribute to the development of insulin resistance and glucose intolerance in nondiabetic dialysis patients.

    PubMed

    Sakkas, Giorgos K; Karatzaferi, Christina; Zintzaras, Elias; Giannaki, Christoforos D; Liakopoulos, Vassilios; Lavdas, Eleftherios; Damani, Eleni; Liakos, Nikos; Fezoulidis, Ioannis; Koutedakis, Yiannis; Stefanidis, Ioannis

    2008-12-01

    Hemodialysis patients exhibit insulin resistance (IR) in target organs such as liver, muscles, and adipose tissue. The aim of this study was to identify contributors to IR and to develop a model for predicting glucose intolerance in nondiabetic hemodialysis patients. After a 2-h, 75-g oral glucose tolerance test (OGTT), 34 hemodialysis patients were divided into groups with normal (NGT) and impaired glucose tolerance (IGT). Indices of insulin sensitivity were derived from OGTT data. Measurements included liver and muscle fat infiltration and central adiposity by computed tomography scans, body composition by dual-energy X-ray absorptiometry, sleep quality by full polysomnography, and functional capacity and quality of life (QoL) by a battery of exercise tests and questionnaires. Cut-off points, as well as sensitivity and specificity calculations, were based on IR (insulin sensitivity index by Matsuda) using a receiver operating characteristic (ROC) curve analysis. Fifteen patients were assigned to the IGT group, and 19 subjects to the NGT group. Intrahepatic fat content and visceral adiposity were significantly higher in the IGT group. IR indices strongly correlated with sleep disturbances, visceral adiposity, functional capacity, and QoL. Visceral adiposity, O2 desaturation during sleep, intrahepatic fat content, and QoL score fitted into the model for predicting glucose intolerance. A ROC curve analysis identified an intrahepatic fat content of >3.97% (sensitivity, 100%; specificity, 35.7%) as the best cutoff point for predicting IR. Visceral and intrahepatic fat content, as well as QoL and sleep, seemed to be involved at some point in the development of glucose intolerance in hemodialysis patients. Means of reducing fat depots in the liver and splanchnic area might prove promising in combating IR and cardiovascular risk in hemodialysis patients.
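    The ROC cutoff selection used above can be sketched as a scan over candidate thresholds. The sketch below picks the cutoff maximizing Youden's J (one common definition of "best cutoff"; the study may have used a different criterion), with entirely hypothetical marker values rather than the study's data:

```python
def best_cutoff(values, labels):
    """Pick the cutoff on a continuous marker (e.g. intrahepatic fat %)
    that maximizes Youden's J = sensitivity + specificity - 1, scanning
    each observed value as a candidate 'positive if >= cutoff' rule."""
    best = None
    for c in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= c and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < c and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < c and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= c and y == 0)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1.0
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

# hypothetical fat-content values with IR status (1 = insulin resistant)
fat = [1.2, 2.5, 3.0, 4.1, 4.5, 5.2, 6.8, 8.0]
ir  = [0,   0,   0,   1,   0,   1,   1,   1]
j, cutoff, sens, spec = best_cutoff(fat, ir)
```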

  8. A prospective microbiome-wide association study of food sensitization and food allergy in early childhood.

    PubMed

    Savage, Jessica H; Lee-Sarwar, Kathleen A; Sordillo, Joanne; Bunyavanich, Supinda; Zhou, Yanjiao; O'Connor, George; Sandel, Megan; Bacharier, Leonard B; Zeiger, Robert; Sodergren, Erica; Weinstock, George M; Gold, Diane R; Weiss, Scott T; Litonjua, Augusto A

    2018-01-01

    Alterations in the intestinal microbiome are prospectively associated with the development of asthma; less is known regarding the role of microbiome alterations in food allergy development. Intestinal microbiome samples were collected at age 3-6 months in children participating in the follow-up phase of an interventional trial of high-dose vitamin D given during pregnancy. At age 3, sensitization to foods (milk, egg, peanut, soy, wheat, walnut) was assessed. Food allergy was defined as caretaker report of healthcare provider-diagnosed allergy to the above foods prior to age 3 with evidence of IgE sensitization. Analysis was performed using Phyloseq and DESeq2; P-values were adjusted for multiple comparisons. Complete data were available for 225 children; there were 87 cases of food sensitization and 14 cases of food allergy. Microbial diversity measures did not differ between food sensitization and food allergy cases and controls. The genera Haemophilus (log 2 fold change -2.15, P=.003), Dialister (log 2 fold change -2.22, P=.009), Dorea (log 2 fold change -1.65, P=.02), and Clostridium (log 2 fold change -1.47, P=.002) were underrepresented among subjects with food sensitization. The genera Citrobacter (log 2 fold change -3.41, P=.03), Oscillospira (log 2 fold change -2.80, P=.03), Lactococcus (log 2 fold change -3.19, P=.05), and Dorea (log 2 fold change -3.00, P=.05) were underrepresented among subjects with food allergy. The temporal association between bacterial colonization and food sensitization and allergy suggests that the microbiome may have a causal role in the development of food allergy. Our findings have therapeutic implications for the prevention and treatment of food allergy. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.

  9. Soil Moisture Project Evaluation Workshop

    NASA Technical Reports Server (NTRS)

    Gilbert, R. H. (Editor)

    1980-01-01

    Approaches planned or being developed for measuring and modeling soil moisture parameters are discussed. Topics cover analysis of the spatial variability of soil moisture as a function of terrain; the value of soil moisture information in developing stream flow data; energy/scene interactions; applications of satellite data; verifying soil water budget models; soil water profile/soil temperature profile models; soil moisture sensitivity analysis; combinations of the thermal model and microwave; determining planetary roughness and field roughness; how a crust or soil layer affects microwave return; truck radar; and truck/aircraft radar comparison.

  10. SPS market analysis

    NASA Astrophysics Data System (ADS)

    Goff, H. C.

    1980-05-01

    A market analysis task included personal interviews by GE personnel and supplemental mail surveys to acquire statistical data and to identify and measure attitudes, reactions and intentions of prospective small solar thermal power systems (SPS) users. Over 500 firms were contacted, including three ownership classes of electric utilities, industrial firms in the top SIC codes for energy consumption, and design engineering firms. A market demand model was developed which utilizes the data base developed by personal interviews and surveys, and projected energy price and consumption data to perform sensitivity analyses and estimate potential markets for SPS.

  11. A ppb level sensitive sensor for atmospheric methane detection

    NASA Astrophysics Data System (ADS)

    Xia, Jinbao; Zhu, Feng; Zhang, Sasa; Kolomenskii, Alexandre; Schuessler, Hans

    2017-11-01

    A high-sensitivity sensor combining a multipass cell and wavelength modulation spectroscopy in the near-infrared spectral region was designed and implemented for trace gas detection. The effective path length of the multipass cell was about 290 meters. The developed spectroscopic technique demonstrates improved sensitivity to methane in ambient air and a relatively short detection time compared to previously reported sensors. Home-built electronics and software were employed for diode laser frequency modulation, signal lock-in detection, and processing. A dual-beam scheme and a balanced photodetector were implemented to suppress the intensity modulation and noise for better detection sensitivity. The performance of the sensor was evaluated in a series of measurements ranging from three hours to two days. The average methane concentration measured in ambient air was 2.01 ppm with a relative error of ±2.5%. With Allan deviation analysis, it was found that a methane detection limit of 1.2 ppb was achieved in 650 s. The developed sensor is compact and portable, and thus well suited for field measurements of methane and other trace gases.
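    The Allan deviation analysis mentioned above can be sketched as follows: the series is block-averaged over a window of τ samples and the deviation is computed from successive block means; for white noise it falls as 1/√τ until drift takes over, which is how the 1.2 ppb limit at 650 s is read off. A minimal Python sketch on a synthetic trace (the 2.01 ppm level is taken from the abstract; the ±0.01 ppm ripple is invented):

```python
def allan_deviation(samples, tau):
    """Non-overlapping Allan deviation at an averaging window of 'tau'
    samples: sigma^2(tau) = mean((ybar_{i+1} - ybar_i)^2) / 2."""
    m = len(samples) // tau
    means = [sum(samples[i * tau:(i + 1) * tau]) / tau for i in range(m)]
    diffs = [(b - a) ** 2 for a, b in zip(means, means[1:])]
    return (sum(diffs) / (2 * len(diffs))) ** 0.5

# synthetic 2.01 ppm trace with an invented +/-0.01 ppm alternating ripple;
# averaging over pairs of samples removes the ripple entirely
trace = [2.01 + (0.01 if i % 2 == 0 else -0.01) for i in range(1000)]
adev1 = allan_deviation(trace, 1)   # ripple fully visible at tau = 1
adev2 = allan_deviation(trace, 2)   # ripple averages out at tau = 2
```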

  12. Comparative analysis of microbial fuel cell based biosensors developed with a mixed culture and Shewanella loihica PV-4 and underlying biological mechanism.

    PubMed

    Yi, Yue; Xie, Beizhen; Zhao, Ting; Liu, Hong

    2018-06-13

    Microbial fuel cell based biosensors (MFC-biosensors) utilize anode biofilms as biological recognition elements to monitor biochemical oxygen demand (BOD) and biotoxicity. However, the relatively poor sensitivity constrains the application of MFC-biosensors. To address this limitation, this study provided a systematic comparison of sensitivity between the MFC-biosensors constructed with two inocula. Higher biomass density and viability were both observed in the anode biofilm of the mixed culture MFC, which resulted in better sensitivity for BOD assessment. Compared with using mixed culture as inoculum, the anode biofilm developed with Shewanella loihica PV-4 presented lower content of extracellular polymeric substances and poorer ability to secrete protein under toxic shocks. Moreover, the looser structure in the S. loihica PV-4 biofilm further facilitated its susceptibilities to toxic agents. Therefore, the MFC-biosensor with a pure culture of S. loihica PV-4 delivered higher sensitivity for biotoxicity monitoring. This study proposed a new perspective to enhance sensor performance. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Comparative DNA microarray analysis of human monocyte derived dendritic cells and MUTZ-3 cells exposed to the moderate skin sensitizer cinnamaldehyde

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Python, Francois; Goebel, Carsten; Aeby, Pierre

    2009-09-15

    The number of studies involved in the development of in vitro skin sensitization tests has increased since the adoption of the EU 7th amendment to the cosmetics directive proposing to ban animal testing for cosmetic ingredients by 2013. Several studies have recently demonstrated that sensitizers induce a relevant up-regulation of activation markers such as CD86, CD54, IL-8 or IL-1β in human myeloid cell lines (e.g., U937, MUTZ-3, THP-1) or in human peripheral blood monocyte-derived dendritic cells (PBMDCs). The present study aimed at the identification of new dendritic cell activation markers in order to further improve the in vitro evaluation of the sensitizing potential of chemicals. We have compared the gene expression profiles of PBMDCs and the human cell line MUTZ-3 after a 24-h exposure to the moderate sensitizer cinnamaldehyde. A list of 80 genes modulated in both cell types was obtained and a set of candidate marker genes was selected for further analysis. Cells were exposed to selected sensitizers and non-sensitizers for 24 h and gene expression was analyzed by quantitative real-time reverse transcriptase-polymerase chain reaction. Results indicated that PIR, TRIM16 and two Nrf2-regulated genes, CES1 and NQO1, are modulated by most sensitizers. Up-regulation of these genes could also be observed in our recently published DC-activation test with U937 cells. Due to their role in DC activation, these new genes may help to further refine the in vitro approaches for the screening of the sensitizing properties of a chemical.

  14. Simulation-based sensitivity analysis for non-ignorably missing data.

    PubMed

    Yin, Peng; Shi, Jian Q

    2017-01-01

    Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) may depend on assumptions or parameters (input) about the missing data, i.e. the missing-data mechanism. We call models subject to this uncertainty sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define simple and interpretable statistical quantities to assess the sensitivity models and support evidence-based analysis. In this paper we propose a novel approach to investigating the plausibility of each missing-data mechanism model assumption, by comparing the simulated datasets from various MNAR models with the observed data non-parametrically, using the K-nearest-neighbour distances. Some asymptotic theory is also provided. A key step of this method is to apply a plausibility evaluation system to each sensitivity parameter, selecting plausible values and rejecting unlikely ones, instead of considering all proposed values of the sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
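    The plausibility-evaluation idea can be sketched with a deliberately simple one-dimensional censoring mechanism: simulate the observed part of the data under each candidate sensitivity parameter, score the discrepancy to the actual observations with a (symmetrized) nearest-neighbour distance, and keep the candidates with small discrepancy. All numbers below are illustrative, not the paper's models:

```python
def nn_distance(a, b):
    """Mean nearest-neighbour distance from each point of 'a' to set 'b'."""
    return sum(min(abs(x - y) for y in b) for x in a) / len(a)

def discrepancy(obs, sim):
    # symmetrized, so a simulated set that merely contains the observed
    # points does not score a spurious zero
    return 0.5 * (nn_distance(obs, sim) + nn_distance(sim, obs))

# latent complete data 0..99; an MNAR mechanism censors values above a
# threshold delta, so only y <= delta is observed (truth: delta = 50)
full = list(range(100))
observed = [y for y in full if y <= 50]

candidates = [30, 50, 70]                 # proposed sensitivity parameters
scores = {d: discrepancy(observed, [y for y in full if y <= d])
          for d in candidates}
best = min(scores, key=scores.get)        # most plausible candidate
```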

  15. Space system operations and support cost analysis using Markov chains

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.

    1990-01-01

    This paper evaluates the use of the Markov chain process in probabilistic life-cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
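    The Markov chain machinery involved can be sketched via the standard absorbing-chain result: with Q the transient-state transition block, the fundamental matrix N = (I − Q)⁻¹ gives the expected visits to each transient state, from which expected life and expected operations-and-support cost follow. The states, probabilities, and costs below are made up for illustration, not taken from the paper:

```python
# transient states: 0 = operational, 1 = in maintenance; retirement absorbs
Q = [[0.90, 0.05],   # operational  -> operational / maintenance
     [0.50, 0.40]]   # maintenance  -> operational / maintenance
cost = [1.0, 5.0]    # relative cost per period in each transient state

# fundamental matrix N = (I - Q)^-1: expected visits to each transient
# state before absorption; invert the 2x2 by the adjugate formula
a, b = 1.0 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1.0 - Q[1][1]
det = a * d - b * c
N = [[d / det, -b / det],
     [-c / det, a / det]]

expected_life = sum(N[0])   # periods until retirement, starting operational
expected_cost = sum(n * k for n, k in zip(N[0], cost))
```

    Perturbing the entries of Q or the cost vector and recomputing is exactly the kind of input-uncertainty sensitivity analysis the abstract describes.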

  16. Risk Analysis of a Fuel Storage Terminal Using HAZOP and FTA

    PubMed Central

    Baixauli-Pérez, Mª Piedad

    2017-01-01

    The size and complexity of industrial chemical plants, together with the nature of the products handled, mean that analysis and control of the risks involved are required. This paper presents a methodology for risk analysis in the chemical and allied industries that is based on a combination of HAZard and OPerability analysis (HAZOP) and a quantitative analysis of the most relevant risks through the development of fault trees, i.e., fault tree analysis (FTA). Results from FTA allow prioritizing the preventive and corrective measures to minimize the probability of failure. A case study is analyzed; it consists of the terminal for unloading chemical and petroleum products and the fuel storage facilities of two companies in the port of Valencia (Spain). The HAZOP analysis shows that the loading and unloading areas are the most sensitive areas of the plant and that the most significant danger is a fuel spill. The FTA analysis indicates that the most likely event is a fuel spill in the tank truck loading area. A sensitivity analysis of the FTA results shows the importance of the human factor in all sequences of the possible accidents, so improving the training of plant staff should be mandatory. PMID:28665325
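    The quantitative FTA step can be sketched with independent basic events combined through AND/OR gates. The basic-event probabilities and tree structure below are hypothetical, not taken from the case study, but they illustrate why a human-factor barrier failure dominates the spill sequence:

```python
def or_gate(*p):
    """P(at least one of independent events)."""
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):
    """P(all of independent events)."""
    out = 1.0
    for pi in p:
        out *= pi
    return out

# hypothetical basic events for a "spill during tank truck loading" top event
p_hose_rupture  = 1e-3
p_overfill      = 5e-4
p_alarm_fails   = 1e-2    # instrument barrier fails
p_operator_miss = 5e-2    # human factor: operator fails to react in time

# top event: hose rupture OR (overfill AND alarm fails AND operator misses)
p_top = or_gate(p_hose_rupture,
                and_gate(p_overfill, p_alarm_fails, p_operator_miss))
```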

  17. Risk Analysis of a Fuel Storage Terminal Using HAZOP and FTA.

    PubMed

    Fuentes-Bargues, José Luis; González-Cruz, Mª Carmen; González-Gaya, Cristina; Baixauli-Pérez, Mª Piedad

    2017-06-30

    The size and complexity of industrial chemical plants, together with the nature of the products handled, mean that analysis and control of the risks involved are required. This paper presents a methodology for risk analysis in the chemical and allied industries that is based on a combination of HAZard and OPerability analysis (HAZOP) and a quantitative analysis of the most relevant risks through the development of fault trees, i.e., fault tree analysis (FTA). Results from FTA allow prioritizing the preventive and corrective measures to minimize the probability of failure. A case study is analyzed; it consists of the terminal for unloading chemical and petroleum products and the fuel storage facilities of two companies in the port of Valencia (Spain). The HAZOP analysis shows that the loading and unloading areas are the most sensitive areas of the plant and that the most significant danger is a fuel spill. The FTA analysis indicates that the most likely event is a fuel spill in the tank truck loading area. A sensitivity analysis of the FTA results shows the importance of the human factor in all sequences of the possible accidents, so improving the training of plant staff should be mandatory.

  18. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    PubMed

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

    We studied which weather variables the consequence analysis of chemical leaks is sensitive to on the user side of offsite consequence analysis (OCA) tools. We used the OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was more sensitive to air temperature than to wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more influential in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, when using the ALOHA tool instead of the KORA tool in rural conditions, users should take care that input errors in the weather variables, above all atmospheric stability, do not distort the impact distance.
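    The dummy regression analysis mentioned above reduces, for a single categorical predictor, to comparing group means against a baseline class. The sketch below dummy-codes a hypothetical atmospheric-stability class against illustrative impact distances (these numbers are not KORA or ALOHA output):

```python
# illustrative impact distances (km) by atmospheric stability class
data = [("D", 1.2), ("D", 1.4), ("E", 2.1), ("E", 2.3), ("F", 3.6), ("F", 3.8)]
baseline = "D"

classes = sorted({c for c, _ in data})
mean = {c: sum(y for cc, y in data if cc == c) /
           sum(1 for cc, _ in data if cc == c) for c in classes}

# with one categorical predictor, OLS on dummy variables reduces to
# group-mean differences from the baseline class; large coefficients
# mean the output is sensitive to that class
intercept = mean[baseline]
coef = {c: mean[c] - intercept for c in classes if c != baseline}
```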

  19. A two-step sensitivity analysis for hydrological signatures in Jinhua River Basin, East China

    NASA Astrophysics Data System (ADS)

    Pan, S.; Fu, G.; Chiang, Y. M.; Xu, Y. P.

    2016-12-01

    Owing to model complexity and the large number of parameters, calibration and sensitivity analysis are difficult processes for distributed hydrological models. In this study, a two-step sensitivity analysis approach is proposed for analyzing the hydrological signatures in Jinhua River Basin, East China, using the Distributed Hydrology-Soil-Vegetation Model (DHSVM). A rough sensitivity analysis is first conducted via Analysis of Variance to obtain preliminary influential parameters, reducing their number from eighty-three to sixteen. Afterwards, the sixteen parameters are further analyzed with a variance-based global sensitivity analysis, i.e., Sobol's method, to achieve robust sensitivity rankings and parameter contributions. Parallel computing is applied to reduce the computational burden of the variance-based sensitivity analysis. The results reveal that only a few model parameters are significantly sensitive, including the rain LAI multiplier, lateral conductivity, porosity, field capacity, wilting point of clay loam, understory monthly LAI, understory minimum resistance, and root zone depths of croplands. Finally, several hydrological signatures are used to investigate the performance of DHSVM. Results show that high values of the efficiency criteria did not guarantee good reproduction of the hydrological signatures. For most samples from the Sobol' analysis, water yield was simulated very well, but the lowest and maximum annual daily runoffs were underestimated and most seven-day minimum runoffs were overestimated; nevertheless, a number of samples still reproduced these three signatures well. Analysis of peak flow shows that small and medium floods are simulated well, while large floods are slightly underestimated. This work supports further multi-objective calibration of the DHSVM model and indicates where to improve the reliability and credibility of model simulations.
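    The variance-based second step can be sketched with a Saltelli-style first-order Sobol' estimator on a toy model (a stand-in; each real DHSVM evaluation is far costlier, which is why the parallel computing mentioned above is needed):

```python
import random

def model(x1, x2):
    # toy stand-in for a DHSVM run; x1 deliberately dominates
    # (analytic first-order indices are 0.9 and 0.1 for uniform inputs)
    return 3.0 * x1 + 1.0 * x2

random.seed(0)
N = 50000
A = [[random.random(), random.random()] for _ in range(N)]
B = [[random.random(), random.random()] for _ in range(N)]
fA = [model(*p) for p in A]
fB = [model(*p) for p in B]

mu = sum(fA + fB) / (2 * N)
var = sum((v - mu) ** 2 for v in fA + fB) / (2 * N)

S1 = []
for i in range(2):
    fABi = []
    for a, b in zip(A, B):
        x = list(a)
        x[i] = b[i]                      # A with column i taken from B
        fABi.append(model(*x))
    # Saltelli-style first-order estimator
    S1.append(sum(fb * (fab - fa)
                  for fb, fa, fab in zip(fB, fA, fABi)) / N / var)
```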

  20. An Evaluation of Transplacental Carcinogenesis for Human ...

    EPA Pesticide Factsheets

    Risk assessments take into account the sensitivity of the postnatal period to carcinogens through the application of age-dependent adjustment factors (ADAFs) (Barton et al. 2005). The prenatal period is also recognized to be sensitive but is typically not included in risk assessments (NRC, 2009). An analysis by California OEHHA (2008) contrasted prenatal, postnatal, and adult sensitivity to 23 different carcinogens across 37 studies. That analysis found a wide range of transplacental sensitivity, with some agents nearly 100-fold more potent in utero than in adults while others had an in utero/adult ratio below 1 (less potent than adult-only exposure). Five carcinogens had more modest ratios to adult potency in both pre- and postnatal testing (vinyl chloride, ethylnitroso biuret, 3-methylcholanthrene, urethane, diethylnitrosamine; 3-10 fold). Only one chemical showed a pre- vs. postnatal divergence (butylnitrosourea, prenatal < adult). Based upon this limited set of genotoxic carcinogens, it appears that the prenatal period often has a sensitivity that approximates what has been found for the postnatal period, and the maternal system does not offer substantial protection against transplacental carcinogenesis in most cases. This suggests that the system of ADAFs developed for postnatal exposure may be considered for prenatal exposures as well. An alternative approach may be to calculate cancer risk for the period of pregnancy rather than blend this risk into the calculation of lifetime risk.
