Sample records for key sensitivity analysis

  1. Logistic Map for Cancellable Biometrics

    NASA Astrophysics Data System (ADS)

    Supriya, V. G., Dr; Manjunatha, Ramachandra, Dr

    2017-08-01

    This paper presents the design and implementation of a secure biometric template protection system that transforms the biometric template using binary chaotic signals and three different key streams to obtain another form of the template. Its efficiency is demonstrated experimentally, and its security is investigated through analyses including key space analysis, information entropy, and key sensitivity analysis.
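
    The key sensitivity analysis named in this entry rests on a standard property of chaotic maps: an arbitrarily small change in the key (here, the initial condition of the logistic map) produces an entirely different keystream after a short transient. A minimal Python sketch; the parameter r = 3.99, the burn-in length, and the threshold bit extraction are illustrative choices, not taken from the paper:

```python
def logistic_stream(x0, r=3.99, n=64, burn_in=100):
    """Iterate the logistic map x -> r*x*(1-x), discard a burn-in
    transient, then emit a binary keystream by thresholding at 0.5."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1 - x)
    bits = []
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

k1 = logistic_stream(0.3)
k2 = logistic_stream(0.300000000001)  # key perturbed by only 1e-12
diff = sum(a != b for a, b in zip(k1, k2))
print(f"{diff}/{len(k1)} keystream bits differ")  # typically about half
```

    A cipher built on such a stream passes a key sensitivity test exactly when this divergence occurs: decrypting with a near-miss key yields noise rather than the original template.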

  2. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Treesearch

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...

  3. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, Greg; Wohlwend, Jen

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  4. Dynamical Model of Drug Accumulation in Bacteria: Sensitivity Analysis and Experimentally Testable Predictions

    DOE PAGES

    Vesselinova, Neda; Alexandrov, Boian; Wall, Michael E.

    2016-11-08

    We present a dynamical model of drug accumulation in bacteria. The model captures key features in experimental time courses on ofloxacin accumulation: initial uptake; two-phase response; and long-term acclimation. In combination with experimental data, the model provides estimates of import and export rates in each phase, the time of entry into the second phase, and the decrease of internal drug during acclimation. Global sensitivity analysis, local sensitivity analysis, and Bayesian sensitivity analysis of the model provide information about the robustness of these estimates, and about the relative importance of different parameters in determining the features of the accumulation time courses in three different bacterial species: Escherichia coli, Staphylococcus aureus, and Pseudomonas aeruginosa. The results lead to experimentally testable predictions of the effects of membrane permeability, drug efflux and trapping (e.g., by DNA binding) on drug accumulation. A key prediction is that a sudden increase in ofloxacin accumulation in both E. coli and S. aureus is accompanied by a decrease in membrane permeability.

  5. Dynamical Model of Drug Accumulation in Bacteria: Sensitivity Analysis and Experimentally Testable Predictions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinova, Neda; Alexandrov, Boian; Wall, Michael E.

    We present a dynamical model of drug accumulation in bacteria. The model captures key features in experimental time courses on ofloxacin accumulation: initial uptake; two-phase response; and long-term acclimation. In combination with experimental data, the model provides estimates of import and export rates in each phase, the time of entry into the second phase, and the decrease of internal drug during acclimation. Global sensitivity analysis, local sensitivity analysis, and Bayesian sensitivity analysis of the model provide information about the robustness of these estimates, and about the relative importance of different parameters in determining the features of the accumulation time courses in three different bacterial species: Escherichia coli, Staphylococcus aureus, and Pseudomonas aeruginosa. The results lead to experimentally testable predictions of the effects of membrane permeability, drug efflux and trapping (e.g., by DNA binding) on drug accumulation. A key prediction is that a sudden increase in ofloxacin accumulation in both E. coli and S. aureus is accompanied by a decrease in membrane permeability.

  6. Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map

    PubMed Central

    2014-01-01

    We present a novel image encryption algorithm using a Chebyshev polynomial for permutation and substitution and a Duffing map for substitution. Comprehensive security analysis has been performed on the designed scheme using key space analysis, visual testing, histogram analysis, information entropy calculation, correlation coefficient analysis, differential analysis, key sensitivity test, and speed test. The study demonstrates that the proposed image encryption algorithm offers a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970
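
    Of the security tests listed, information entropy calculation has the most direct operational form: the Shannon entropy of the cipher image's pixel histogram, which should approach 8 bits for an 8-bit image. A sketch with synthetic data (illustrative only, not the paper's scheme):

```python
import math
import random
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy in bits per symbol of a sequence of hashable symbols."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A well-encrypted 8-bit image should be near-uniform over 0..255,
# giving entropy just under the ideal 8 bits per pixel.
random.seed(1)
cipher_like = [random.randrange(256) for _ in range(100_000)]
print(round(shannon_entropy(cipher_like), 3))  # just under 8.0
```

    A plaintext image, whose histogram is far from uniform, scores well below 8; the gap between the two is what the entropy test measures.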

  7. Parallel Computing and Model Evaluation for Environmental Systems: An Overview of the Supermuse and Frames Software Technologies

    EPA Science Inventory

    ERD’s Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) is a key to enhancing quality assurance in environmental models and applications. Uncertainty analysis and sensitivity analysis remain critical, though often overlooked, steps in the development and e...

  8. Formal Analysis of Key Integrity in PKCS#11

    NASA Astrophysics Data System (ADS)

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.

  9. A new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Yu-Xia; Song, Xiao-Na

    2013-01-01

    We propose a new image encryption algorithm on the basis of the fractional-order hyperchaotic Lorenz system. While in the process of generating a key stream, the system parameters and the derivative order are embedded in the proposed algorithm to enhance the security. Such an algorithm is detailed in terms of security analyses, including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. The experimental results demonstrate that the proposed image encryption scheme has the advantages of large key space and high security for practical image encryption.
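
    Correlation analysis, one of the security analyses listed here, checks that the strong correlation between adjacent pixels in a natural image is destroyed by encryption. A self-contained Pearson-correlation sketch on synthetic pixel rows (the smooth "plain" signal and random "cipher" row are illustrative, not the paper's test images):

```python
import math
import random

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Natural images vary smoothly, so horizontal neighbours correlate strongly;
# a good cipher image brings that correlation near zero.
random.seed(0)
plain = [int(128 + 100 * math.sin(i / 50)) for i in range(10_000)]  # smooth row
cipher = [random.randrange(256) for _ in range(10_000)]             # cipher-like row
print(round(correlation(plain[:-1], plain[1:]), 3))   # very close to 1
print(round(correlation(cipher[:-1], cipher[1:]), 3)) # close to 0
```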

  10. Optimization of Parameter Ranges for Composite Tape Winding Process Based on Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Tao; Shi, Yaoyao; He, Xiaodong; Kang, Chao; Deng, Bo; Song, Shibo

    2017-08-01

    This study focuses on the parameter sensitivity of the winding process for composite prepreg tape. Methods of multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis are proposed. A polynomial empirical model of interlaminar shear strength is established by the response surface experimental method. Using this model, the relative sensitivity of key process parameters, including temperature, tension, pressure and velocity, is calculated, and single-parameter sensitivity curves are obtained. From the analysis of the sensitivity curves, the stability and instability ranges of each parameter are identified. Finally, an optimization method for the winding process parameters is developed. The analysis shows that the optimized ranges of the process parameters for interlaminar shear strength are: temperature within [100 °C, 150 °C], tension within [275 N, 387 N], pressure within [800 N, 1500 N], and velocity within [0.2 m/s, 0.4 m/s].

  11. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080

  12. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
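
    Of the global methods SBML-SAT implements, Sobol's method rests on a variance decomposition: the first-order index of parameter X_i is S_i = Var(E[Y|X_i]) / Var(Y). A crude but self-contained estimator bins Monte Carlo samples on X_i; this sketch illustrates the quantity being estimated, not SBML-SAT's actual algorithm, and the test model is invented:

```python
import random

def first_order_index(xs, ys, bins=20):
    """Estimate the Sobol first-order index S_i = Var(E[Y|X_i]) / Var(Y)
    by binning samples of X_i and averaging Y within each bin."""
    n = len(ys)
    mean_y = sum(ys) / n
    var_y = sum((y - mean_y) ** 2 for y in ys) / n
    lo, hi = min(xs), max(xs)
    buckets = [[] for _ in range(bins)]
    for x, y in zip(xs, ys):
        k = min(bins - 1, int((x - lo) / (hi - lo) * bins))
        buckets[k].append(y)
    var_cond = sum(len(b) * (sum(b) / len(b) - mean_y) ** 2 for b in buckets if b) / n
    return var_cond / var_y

random.seed(42)
n = 20_000
x1 = [random.uniform(-1, 1) for _ in range(n)]
x2 = [random.uniform(-1, 1) for _ in range(n)]
y = [4 * a + b for a, b in zip(x1, x2)]  # output dominated by x1

print(round(first_order_index(x1, y), 2))  # analytically 16/17 ≈ 0.94
print(round(first_order_index(x2, y), 2))  # analytically 1/17 ≈ 0.06
```

    The indices of all parameters sum to 1 for this purely additive model; interaction effects would show up as a shortfall, which is what higher-order and total-order Sobol indices capture.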

  13. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region

    PubMed Central

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-01

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytical hierarchy process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. Elaborating the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity enables analysis of the atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for identifying key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management. PMID:29342852

  14. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region.

    PubMed

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-13

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytical hierarchy process (AHP), fuzzy synthesis evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. Elaborating the relationships between atmospheric environment vulnerability and the indices of exposure, sensitivity, and adaptive capacity enables analysis of the atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in exposure, sensitivity, and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for identifying key control regions and recognizing vulnerable indicators for study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management.

  15. Image encryption based on a delayed fractional-order chaotic logistic system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na

    2012-05-01

    A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.

  16. Identifying key sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment.

    PubMed

    Sweetapple, Christine; Fu, Guangtao; Butler, David

    2013-09-01

    This study investigates sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment, through the use of local and global sensitivity analysis tools, and contributes to an in-depth understanding of wastewater treatment modelling by revealing critical parameters and parameter interactions. One-factor-at-a-time sensitivity analysis is used to screen model parameters and identify those with significant individual effects on three performance indicators: total greenhouse gas emissions, effluent quality and operational cost. Sobol's method enables identification of parameters with significant higher order effects and of particular parameter pairs to which model outputs are sensitive. Use of a variance-based global sensitivity analysis tool to investigate parameter interactions enables identification of important parameters not revealed in one-factor-at-a-time sensitivity analysis. These interaction effects have not been considered in previous studies and thus provide a better understanding of wastewater treatment plant model characterisation. It was found that uncertainty in modelled nitrous oxide emissions is the primary contributor to uncertainty in total greenhouse gas emissions, due largely to the interaction effects of three nitrogen conversion modelling parameters. The higher order effects of these parameters are also shown to be a key source of uncertainty in effluent quality.
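
    The one-factor-at-a-time screening step described above perturbs each parameter alone from a baseline and records the change in each performance indicator. A sketch with a deliberately toy emissions model: the function `ghg`, its parameter names, and the baseline values are invented for illustration; only the 100-year global warming potentials (265 for N2O, 28 for CH4) are standard IPCC AR5 values.

```python
def oat_screening(model, baseline, deltas):
    """One-factor-at-a-time screening: perturb each parameter alone
    and record the relative change in model output."""
    base_out = model(baseline)
    effects = {}
    for name, delta in deltas.items():
        perturbed = dict(baseline, **{name: baseline[name] + delta})
        effects[name] = (model(perturbed) - base_out) / base_out
    return effects

def ghg(p):
    """Toy CO2-equivalent emissions: N2O from nitrogen conversion plus CH4,
    weighted by 100-year global warming potentials (265 and 28)."""
    return p["n2o_factor"] * p["nitrogen_load"] * 265 + p["ch4_rate"] * 28

baseline = {"n2o_factor": 0.01, "nitrogen_load": 100.0, "ch4_rate": 2.0}
deltas = {k: 0.1 * v for k, v in baseline.items()}  # +10% on each parameter
for name, effect in sorted(oat_screening(ghg, baseline, deltas).items()):
    print(f"{name}: {effect:+.3f}")
```

    Even this toy screening reproduces the abstract's qualitative point: the nitrous-oxide parameters dominate total emissions, so their uncertainty matters most. What OAT cannot see, by construction, are the parameter interactions that Sobol's method exposes.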

  17. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  18. Extending Raman's reach: enabling applications via greater sensitivity and speed

    NASA Astrophysics Data System (ADS)

    Creasey, David; Sullivan, Mike; Paul, Chris; Rathmell, Cicely

    2018-02-01

    Over the last decade, miniature fiber optic spectrometers have greatly expanded the ability of Raman spectroscopy to tackle practical applications in the field, from mobile pharmaceutical ID to hazardous material assessment in remote locations. There remains a gap, however, between the typical diode array spectrometer and their more sensitive benchtop analogs. High sensitivity, cooled Raman spectrometers have the potential to narrow that gap by providing greater sensitivity, better SNR, and faster measurement times. In this paper, we'll look at the key factors in the design of high sensitivity miniature Raman spectrometers and their associated accessories, as well as the key metric for direct comparison of these systems - limit of detection. With the availability of our high sensitivity Raman systems operating at wavelengths from the UV to NIR, many applications are now becoming practical in the field, from trace level detection to analysis of complex biological samples.

  19. Multiobjective Sensitivity Analysis Of Sediment And Nitrogen Processes With A Watershed Model

    EPA Science Inventory

    This paper presents a computational analysis for evaluating critical non-point-source sediment and nutrient (specifically nitrogen) processes and management actions at the watershed scale. In the analysis, model parameters that bear key uncertainties were presumed to reflect the ...

  20. twzPEA: A Topology and Working Zone Based Pathway Enrichment Analysis Framework

    USDA-ARS?s Scientific Manuscript database

    Sensitive detection of involvement and adaptation of key signaling, regulatory, and metabolic pathways holds the key to deciphering molecular mechanisms such as those in the biomass-to-biofuel conversion process in yeast. Typical gene set enrichment analyses often do not use topology information in...

  1. Simulation-based sensitivity analysis for non-ignorably missing data.

    PubMed

    Yin, Peng; Shi, Jian Q

    2017-01-01

    Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) depend on assumptions or parameters (input) about the missing data, i.e. the missing-data mechanism; we refer to models subject to this uncertainty as sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define simple and interpretable statistical quantities to assess the sensitivity models and make evidence-based analysis. We propose a novel approach in this paper to investigate the plausibility of each missing-data mechanism model assumption, by comparing the simulated datasets from various MNAR models with the observed data non-parametrically, using the K-nearest-neighbour distances. Some asymptotic theory has also been provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, to select plausible values and reject unlikely ones, instead of considering all proposed values of sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
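
    The K-nearest-neighbour comparison can be sketched as follows: simulate a dataset under each candidate missing-data mechanism, then score it by the mean distance from each observed point to its k-th nearest simulated neighbour, with smaller distances indicating a more plausible mechanism. The 1-D Gaussian data below is purely illustrative; the paper's setting is more general:

```python
import random

def knn_distance(observed, simulated, k=3):
    """Mean distance from each observed point to its k-th nearest
    neighbour in the simulated dataset (1-D Euclidean case)."""
    total = 0.0
    for o in observed:
        dists = sorted(abs(o - s) for s in simulated)
        total += dists[k - 1]
    return total / len(observed)

random.seed(7)
observed = [random.gauss(0.0, 1.0) for _ in range(200)]
good_sim = [random.gauss(0.0, 1.0) for _ in range(200)]  # plausible mechanism
bad_sim = [random.gauss(3.0, 1.0) for _ in range(200)]   # implausible mechanism
print(knn_distance(observed, good_sim) < knn_distance(observed, bad_sim))  # True
```

    Ranking candidate MNAR models by such a score is what lets the method reject unlikely sensitivity-parameter values rather than sweeping over all of them.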

  2. Integrating heterogeneous drug sensitivity data from cancer pharmacogenomic studies.

    PubMed

    Pozdeyev, Nikita; Yoo, Minjae; Mackie, Ryan; Schweppe, Rebecca E; Tan, Aik Choon; Haugen, Bryan R

    2016-08-09

    The consistency of in vitro drug sensitivity data is of key importance for cancer pharmacogenomics. Previous attempts to correlate drug sensitivities from the large pharmacogenomics databases, such as the Cancer Cell Line Encyclopedia (CCLE) and the Genomics of Drug Sensitivity in Cancer (GDSC), have produced discordant results. We developed a new drug sensitivity metric, the area under the dose response curve adjusted for the range of tested drug concentrations, which allows integration of heterogeneous drug sensitivity data from the CCLE, the GDSC, and the Cancer Therapeutics Response Portal (CTRP). We show that there is moderate to good agreement of drug sensitivity data for many targeted therapies, particularly kinase inhibitors. The results of this largest cancer cell line drug sensitivity data analysis to date are accessible through the online portal, which serves as a platform for high power pharmacogenomics analysis.
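
    The range-adjusted metric described here can be sketched as the trapezoidal area under the viability curve on a log-concentration axis, divided by the width of the tested concentration range; the normalisation is what makes studies with different dose ranges comparable. The conventions below (viability scaled to 1, log10 axis, invented dose-response values) are assumptions for illustration, not the paper's exact definition:

```python
import math

def adjusted_auc(concentrations, viability):
    """Trapezoidal area under the dose-response curve on a log10
    concentration axis, normalised by the tested concentration range:
    ~1 = resistant (no response), ~0 = fully sensitive."""
    logc = [math.log10(c) for c in concentrations]
    area = 0.0
    for i in range(len(logc) - 1):
        area += 0.5 * (viability[i] + viability[i + 1]) * (logc[i + 1] - logc[i])
    return area / (logc[-1] - logc[0])

conc = [0.001, 0.01, 0.1, 1.0, 10.0]      # tested concentrations (e.g. µM)
resistant = [1.0, 1.0, 0.98, 0.95, 0.93]  # little response across the range
sensitive = [1.0, 0.8, 0.4, 0.1, 0.02]    # strong dose-dependent kill
print(round(adjusted_auc(conc, resistant), 2))  # 0.97
print(round(adjusted_auc(conc, sensitive), 2))  # 0.45
```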

  3. Analysis of the sensitivity properties of a model of vector-borne bubonic plague.

    PubMed

    Buzby, Megan; Neckels, David; Antolin, Michael F; Estep, Donald

    2008-09-06

    Model sensitivity is a key to evaluation of mathematical models in ecology and evolution, especially in complex models with numerous parameters. In this paper, we use some recently developed methods for sensitivity analysis to study the parameter sensitivity of a model of vector-borne bubonic plague in a rodent population proposed by Keeling & Gilligan. The new sensitivity tools are based on a variational analysis involving the adjoint equation. The new approach provides a relatively inexpensive way to obtain derivative information about model output with respect to parameters. We use this approach to determine the sensitivity of a quantity of interest (the force of infection from rats and their fleas to humans) to various model parameters, determine a region over which linearization at a specific parameter reference point is valid, develop a global picture of the output surface, and search for maxima and minima in a given region in the parameter space.

  4. Evaluation of a Stratified National Breast Screening Program in the United Kingdom: An Early Model-Based Cost-Effectiveness Analysis.

    PubMed

    Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D Gareth; Astley, Sue; Payne, Katherine

    2017-09-01

    To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1, risk 2, masking [supplemental screening for women with higher breast density], and masking and risk 1) compared with the current UK NBSP and no screening. The model assumed a lifetime horizon, the health service perspective to identify costs (£, 2015), and measured consequences in quality-adjusted life-years (QALYs). Multiple data sources were used: systematic reviews of effectiveness and utility, published studies reporting costs, and cohort studies embedded in existing NBSPs. Model parameter uncertainty was assessed using probabilistic sensitivity analysis and one-way sensitivity analysis. The base-case analysis, supported by probabilistic sensitivity analysis, suggested that the risk-stratified NBSPs (risk 1 and risk 2) were relatively cost-effective when compared with the current UK NBSP, with incremental cost-effectiveness ratios of £16,689 per QALY and £23,924 per QALY, respectively. Stratified NBSPs including masking approaches (supplemental screening for women with higher breast density) were not cost-effective alternatives, with incremental cost-effectiveness ratios of £212,947 per QALY (masking) and £75,254 per QALY (risk 1 and masking). When compared with no screening, all stratified NBSPs could be considered cost-effective. Key drivers of cost-effectiveness were discount rate, natural history model parameters, mammographic sensitivity, and biopsy rates for recalled cases. A key assumption was that the risk model used in the stratification process was perfectly calibrated to the population. This early model-based cost-effectiveness analysis provides indicative evidence for decision makers to understand the key drivers of costs and QALYs for exemplar stratified NBSP.

  5. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.

    PubMed

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.

  6. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis

    PubMed Central

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement. PMID:24830736

  7. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. 
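
    The Morris method used in the screening step estimates, for each parameter, the distribution of "elementary effects": finite-difference slopes taken at many random points in the unit-scaled parameter space. The mean absolute effect (mu*) then ranks parameter influence cheaply, before expensive calibration. A simplified one-step-per-point sketch (the full method uses trajectories through a grid, and the toy model here is invented):

```python
import random

def morris_mu_star(model, bounds, r=30, delta=0.1, seed=3):
    """Simplified Morris screening: at r random base points, step each
    parameter by delta in unit-scaled space and record the elementary
    effect; return mu* (mean absolute effect) per parameter."""
    rng = random.Random(seed)
    names = list(bounds)
    effects = {name: [] for name in names}
    for _ in range(r):
        u = {n: rng.uniform(0.0, 1.0 - delta) for n in names}
        point = {n: bounds[n][0] + u[n] * (bounds[n][1] - bounds[n][0]) for n in names}
        y0 = model(point)
        for n in names:
            stepped = dict(point)
            stepped[n] = bounds[n][0] + (u[n] + delta) * (bounds[n][1] - bounds[n][0])
            effects[n].append((model(stepped) - y0) / delta)
    return {n: sum(abs(e) for e in es) / len(es) for n, es in effects.items()}

bounds = {"a": (0.0, 1.0), "b": (0.0, 1.0), "c": (0.0, 1.0)}
mu_star = morris_mu_star(lambda p: 10 * p["a"] + p["b"] * p["c"], bounds)
print(max(mu_star, key=mu_star.get))  # a
```

    Parameters with small mu* can be fixed at best-guess values, which is exactly how the stroke model reduced 60 uncertain parameters to 29 needing calibration.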

  8. Oscillator Noise Analysis

    NASA Astrophysics Data System (ADS)

    Demir, Alper

    2005-08-01

    Oscillators are key components of many kinds of systems, particularly electronic and opto-electronic systems. Undesired perturbations, i.e. noise, that exist in practical systems adversely affect the spectral and timing properties of the signals generated by oscillators resulting in phase noise and timing jitter. These are key performance limiting factors, being major contributors to bit-error-rate (BER) of RF and optical communication systems, and creating synchronization problems in clocked and sampled-data electronic systems. In noise analysis for oscillators, the key is figuring out how the various disturbances and noise sources in the oscillator end up as phase fluctuations. In doing so, one first computes transfer functions from the noise sources to the oscillator phase, or the sensitivity of the oscillator phase to these noise sources. In this paper, we first provide a discussion explaining the origins and the proper definition of this transfer or sensitivity function, followed by a critical review of the various numerical techniques for its computation that have been proposed by various authors over the past fifteen years.

  9. Meta-analysis for explaining the variance in public transport demand elasticities in Europe

    DOT National Transportation Integrated Search

    1998-01-01

    Results from past studies on transport demand elasticities show a large variance. This paper assesses key factors that influence the sensitivity of public transport users to transport costs in Europe, by carrying out a comparative analysis of the dif...

  10. Preliminary Thermal-Mechanical Sizing of Metallic TPS: Process Development and Sensitivity Studies

    NASA Technical Reports Server (NTRS)

    Poteet, Carl C.; Abu-Khajeel, Hasan; Hsu, Su-Yuen

    2002-01-01

    The purpose of this research was to perform sensitivity studies and develop a process to perform thermal and structural analysis and sizing of the latest Metallic Thermal Protection System (TPS) developed at NASA LaRC (Langley Research Center). Metallic TPS is a key technology for reducing the cost of reusable launch vehicles (RLV), offering the combination of increased durability and competitive weights when compared to other systems. Accurate sizing of metallic TPS requires combined thermal and structural analysis. Initial sensitivity studies were conducted using transient one-dimensional finite element thermal analysis to determine the influence of various TPS and analysis parameters on TPS weight. The thermal analysis model was then used in combination with static deflection and failure mode analysis of the sandwich panel outer surface of the TPS to obtain minimum weight TPS configurations at three vehicle stations on the windward centerline of a representative RLV. The coupled nature of the analysis requires an iterative analysis process, which will be described herein. Findings from the sensitivity analysis are reported, along with TPS designs at the three RLV vehicle stations considered.
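    A minimal version of the transient one-dimensional thermal analysis at the core of such a sizing loop might look like the following explicit finite-difference sketch. All material properties, temperatures, and the adiabatic back-face boundary are hypothetical placeholders, not the LaRC model; only the sizing trend (thicker insulation, lower back-face temperature) is the point.

```python
import numpy as np

def backface_peak(thickness, k=0.05, rho=100.0, cp=800.0,
                  T_hot=1000.0, T0=20.0, t_end=600.0, n=21):
    """Peak back-face temperature of an insulation slab with a fixed hot-side
    temperature and an adiabatic back face (explicit FTCS scheme)."""
    dx = thickness / (n - 1)
    alpha = k / (rho * cp)                  # thermal diffusivity
    dt = 0.4 * dx * dx / alpha              # stability: dt <= 0.5 dx^2/alpha
    T = np.full(n, T0)
    T[0] = T_hot
    peak = T0
    for _ in range(int(t_end / dt)):
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
        Tn[-1] = Tn[-2]                     # adiabatic back face
        T = Tn
        peak = max(peak, T[-1])
    return peak

# thicker insulation -> lower peak back-face temperature
print(backface_peak(0.02), backface_peak(0.04))
```

    A sizing loop of the kind described would wrap a root-finder around this function to find the minimum thickness keeping the peak below a structural temperature limit.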

  11. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
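    The core idea of concentrating samples near the failure domain can be illustrated with a fixed (non-adaptive) importance density; the paper's AIS method refines the sampling domain adaptively, which this sketch does not attempt. Assumed for illustration: a scalar standard-normal variable and limit state g(u) = beta - u, so the exact failure probability is Phi(-beta).

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
beta = 3.0                      # failure when u > beta
n = 20000

# sample from a normal shifted to the most probable failure point u* = beta
u = rng.normal(beta, 1.0, n)
fail = u > beta
# importance weights: phi(u) / phi(u - beta), with phi the standard normal pdf
w = np.exp(-0.5*u**2 + 0.5*(u - beta)**2)
pf_is = np.mean(fail * w)

pf_exact = 0.5 * (1.0 - erf(beta / sqrt(2.0)))   # Phi(-beta)
print(pf_is, pf_exact)
```

    Crude Monte Carlo would need on the order of 1/Phi(-3) (roughly 740) samples per observed failure; the shifted density lands about half its samples in the failure region, which is why importance sampling is attractive for small failure probabilities.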

  12. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decision for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) conduct tradeoff and sensitivity analysis; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.

  13. An Evaluation of the Potential for Shifting of Freight from Truck to Rail and Its Impacts on Energy Use and GHG Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Yan; Vyas, Anant D.; Guo, Zhaomiao

    This report summarizes our evaluation of the potential energy-use and GHG-emissions reduction achieved by shifting freight from truck to rail under a most-likely scenario. A sensitivity analysis is also included. The sensitivity analysis shows changes in energy use and GHG emissions when key parameters are varied. The major contribution and distinction from previous studies is that this study considers the rail level of service (LOS) and commodity movements at the origin-destination (O-D) level. In addition, this study considers the fragility and time sensitivity of each commodity type.

  14. A PROBABILISTIC ARSENIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CHROMATED COPPER ARSENATE (CCA)-TREATED PLAYSETS AND DECKS: PART 2 SENSITIVITY AND UNCERTAINTY ANALYSIS

    EPA Science Inventory

    A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two part paper. This Part 2 paper discusses sensitivity and uncertainty analyses conducted to assess the key m...

  15. Efficient sensitivity analysis method for chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Liao, Haitao

    2016-05-01

    The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time-averaged quantities for chaotic dynamical systems. The key idea is to recast the time-averaged integration term in the form of a differential equation before applying the sensitivity analysis method. An additional constraint-based equation, which forms the augmented equations of motion, is proposed to calculate the time-averaged integration variable, and the sensitivity coefficients are obtained by solving the augmented differential equations. Applying the least squares shadowing formulation to the augmented equations results in an explicit expression for the sensitivity coefficient which depends on the final state of the Lagrange multipliers. Using LU factorization to calculate the Lagrange multipliers improves both convergence behavior and computational expense. Numerical experiments on a set of problems selected from the literature are presented to illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches, and some short impulsive sensitivity coefficients are observed when using the direct differentiation sensitivity analysis method.
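    The recasting step can be demonstrated on a non-chaotic toy problem (the paper's methods target chaotic systems; this only illustrates the augmentation). A time-averaged quantity is appended as an extra state a with da/dt equal to the integrand, and direct differentiation adds tangent states; the scalar ODE dx/dt = -p x + 1 and the hand-rolled RK4 integrator are illustrative assumptions.

```python
import numpy as np

def rhs(s, p):
    x, v, a, b = s
    return np.array([
        -p*x + 1.0,    # model: dx/dt = -p x + 1
        -p*v - x,      # tangent of the model wrt p (v = dx/dp)
        x,             # da/dt = x, so a(T)/T is the time average of x
        v,             # db/dt = v, so b(T)/T is d(time average)/dp
    ])

def integrate(p, T=10.0, n=2000):
    s, h = np.zeros(4), T / n
    for _ in range(n):                       # classic RK4
        k1 = rhs(s, p)
        k2 = rhs(s + 0.5*h*k1, p)
        k3 = rhs(s + 0.5*h*k2, p)
        k4 = rhs(s + h*k3, p)
        s = s + (h/6.0)*(k1 + 2*k2 + 2*k3 + k4)
    return s[2]/T, s[3]/T                    # time average and its sensitivity

p, eps = 2.0, 1e-4
avg, davg = integrate(p)
fd = (integrate(p + eps)[0] - integrate(p - eps)[0]) / (2*eps)
print(avg, davg, fd)
```

    The direct-differentiation sensitivity of the time average agrees with a central finite difference; for chaotic systems the tangent equation diverges and the shadowing machinery of the paper becomes necessary.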

  16. Sensitivity assessment of freshwater macroinvertebrates to pesticides using biological traits.

    PubMed

    Ippolito, A; Todeschini, R; Vighi, M

    2012-03-01

    Assessing the sensitivity of different species to chemicals is one of the key points in predicting the effects of toxic compounds in the environment. Trait-based prediction methods have proved to be extremely efficient for assessing the sensitivity of macroinvertebrates toward compounds with non-specific toxicity (narcotics). Nevertheless, predicting the sensitivity of organisms toward compounds with specific toxicity is much more complex, since it depends on the mode of action of the chemical. The aim of this work was to predict the sensitivity of several freshwater macroinvertebrates toward three classes of plant protection products: organophosphates, carbamates and pyrethroids. Two databases were built: one with sensitivity data (retrieved, evaluated and selected from the U.S. Environmental Protection Agency ECOTOX database) and the other with biological traits. Aside from the "traditional" traits usually considered in ecological analysis (e.g., body size, respiration technique, feeding habits), multivariate analysis was used to relate the sensitivity of organisms to other characteristics which may be involved in the process of intoxication. Results confirmed that, besides traditional biological traits related to uptake capability (e.g., body size and body shape), some traits related to particular metabolic characteristics or patterns have good predictive capacity for sensitivity to these kinds of toxic substances. For example, behavioral complexity, assumed to be an indicator of nervous system complexity, proved to be an important predictor of sensitivity towards these compounds. These results confirm the need for more complex traits to predict effects of highly specific substances. One key point for achieving a complete mechanistic understanding of the process is the choice of traits, whose role in the discrimination of sensitivity should be clearly interpretable, and not only statistically significant.

  17. Sensitive and Specific Target Sequences Selected from Retrotransposons of Schistosoma japonicum for the Diagnosis of Schistosomiasis

    PubMed Central

    Xu, Jing; Zhu, Xing-Quan; Wang, Sheng-Yue; Xia, Chao-Ming

    2012-01-01

    Background Schistosomiasis japonica is a serious debilitating and sometimes fatal disease. Accurate diagnostic tests play a key role in patient management and control of the disease. However, currently available diagnostic methods are not ideal, and the detection of the parasite DNA in blood samples has turned out to be one of the most promising tools for the diagnosis of schistosomiasis. In our previous investigations, a 230-bp sequence from the highly repetitive retrotransposon SjR2 was identified and it showed high sensitivity and specificity for detecting Schistosoma japonicum DNA in the sera of rabbit model and patients. Recently, 29 retrotransposons were found in S. japonicum genome by our group. The present study highlighted the key factors for selecting a new perspective sensitive target DNA sequence for the diagnosis of schistosomiasis, which can serve as example for other parasitic pathogens. Methodology/Principal Findings In this study, we demonstrated that the key factors based on the bioinformatic analysis for selecting target sequence are the higher genome proportion, repetitive complete copies and partial copies, and active ESTs than the others in the chromosome genome. New primers based on 25 novel retrotransposons and SjR2 were designed and their sensitivity and specificity for detecting S. japonicum DNA were compared. The results showed that a new 303-bp sequence from non-long terminal repeat (LTR) retrotransposon (SjCHGCS19) had high sensitivity and specificity. The 303-bp target sequence was amplified from the sera of rabbit model at 3 d post-infection by nested-PCR and it became negative at 17 weeks post-treatment. Furthermore, the percentage sensitivity of the nested-PCR was 97.67% in 43 serum samples of S. japonicum-infected patients. Conclusions/Significance Our findings highlighted the key factors based on the bioinformatic analysis for selecting target sequence from S. 
japonicum genome, which provide a basis for establishing powerful molecular diagnostic techniques that can be used for monitoring early infection and therapy efficacy to support schistosomiasis control programs. PMID:22479661

  18. Maternal sensitivity: a concept analysis.

    PubMed

    Shin, Hyunjeong; Park, Young-Joo; Ryu, Hosihn; Seomun, Gyeong-Ae

    2008-11-01

    The aim of this paper is to report a concept analysis of maternal sensitivity. Maternal sensitivity is a broad concept encompassing a variety of interrelated affective and behavioural caregiving attributes. It is used interchangeably with the terms maternal responsiveness or maternal competency, with no consistency of use. There is a need to clarify the concept of maternal sensitivity for research and practice. A search was performed on the CINAHL and Ovid MEDLINE databases using 'maternal sensitivity', 'maternal responsiveness' and 'sensitive mothering' as key words. The searches yielded 54 records for the years 1981-2007. Rodgers' method of evolutionary concept analysis was used to analyse the material. Four critical attributes of maternal sensitivity were identified: (a) dynamic process involving maternal abilities; (b) reciprocal give-and-take with the infant; (c) contingency on the infant's behaviour and (d) quality of maternal behaviours. Maternal identity and infant's needs and cues are antecedents for these attributes. The consequences are infant's comfort, mother-infant attachment and infant development. In addition, three positive affecting factors (social support, maternal-foetal attachment and high self-esteem) and three negative affecting factors (maternal depression, maternal stress and maternal anxiety) were identified. A clear understanding of the concept of maternal sensitivity could be useful for developing ways to enhance maternal sensitivity and to maximize the developmental potential of infants. Knowledge of the attributes of maternal sensitivity identified in this concept analysis may be helpful for constructing measuring items or dimensions.

  19. Statistical sensitivity analysis of a simple nuclear waste repository model

    NASA Astrophysics Data System (ADS)

    Ronen, Y.; Lucius, J. L.; Blow, E. M.

    1980-06-01

    This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to the test case of determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The parameters found to be important for the sample problem were the thermal loading, the distance between spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.

  20. A methodology for global-sensitivity analysis of time-dependent outputs in systems biology modelling.

    PubMed

    Sumner, T; Shephard, E; Bogle, I D L

    2012-09-07

    One of the main challenges in the development of mathematical and computational models of biological systems is the precise estimation of parameter values. Understanding the effects of uncertainties in parameter values on model behaviour is crucial to the successful use of these models. Global sensitivity analysis (SA) can be used to quantify the variability in model predictions resulting from the uncertainty in multiple parameters and to shed light on the biological mechanisms driving system behaviour. We present a new methodology for global SA in systems biology which is computationally efficient and can be used to identify the key parameters and their interactions which drive the dynamic behaviour of a complex biological model. The approach combines functional principal component analysis with established global SA techniques. The methodology is applied to a model of the insulin signalling pathway, defects of which are a major cause of type 2 diabetes and a number of key features of the system are identified.
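    The general shape of such an approach, reducing time-course outputs with a principal component decomposition and then attributing variation in the leading score to parameters, can be sketched as follows. This is not the authors' methodology: the decaying-exponential "time courses", the parameter ranges, and the use of a plain correlation in place of an established global SA technique are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
t = np.linspace(0.0, 5.0, 50)

# hypothetical model output: y(t) = A * exp(-k t); k drives the shape, A only scales it
A = rng.uniform(0.9, 1.1, n)
k = rng.uniform(0.5, 1.5, n)
Y = A[:, None] * np.exp(-k[:, None] * t[None, :])

# functional PCA via SVD of the mean-centred time courses
Yc = Y - Y.mean(axis=0)
U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
score1 = U[:, 0] * S[0]            # score on the first principal time-course

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

print(abs(corr(k, score1)), abs(corr(A, score1)))
```

    The first principal component summarizes each whole time course in one number, and its scores correlate far more strongly with the rate parameter k than with the amplitude A, identifying k as the driver of the dynamic behaviour.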

  1. Study of the pathogenesis of Ebola fever in laboratory animals with different sensitivity to this virus.

    PubMed

    Chepurnov, A A; Dadaeva, A A; Kolesnikov, S I

    2001-12-01

    Pathophysiological parameters were compared in animals with different sensitivity to Ebola virus following infection with this virus. Analysis of the results showed differences in the immune reactions underlying the difference between Ebola-sensitive and Ebola-resistant animals. No neutrophil activation in response to Ebola virus injection was noted in Ebola-sensitive animals. Phagocytic activity of neutrophils in these animals inversely correlated with animal sensitivity to Ebola virus. Animal susceptibility to Ebola virus directly correlated with the decrease in the number of circulating T and B cells. We conclude that the immune system plays the key role in animal susceptibility and resistance to Ebola virus.

  2. Skin sensitizers differentially regulate signaling pathways in MUTZ-3 cells in relation to their individual potency

    PubMed Central

    2014-01-01

    Background Due to the recent European legislations posing a ban of animal tests for safety assessment within the cosmetic industry, development of in vitro alternatives for assessment of skin sensitization is highly prioritized. To date, proposed in vitro assays are mainly based on single biomarkers, which so far have not been able to classify and stratify chemicals into subgroups, related to risk or potency. Methods Recently, we presented the Genomic Allergen Rapid Detection (GARD) assay for assessment of chemical sensitizers. In this paper, we show how the genome wide readout of GARD can be expanded and used to identify differentially regulated pathways relating to individual chemical sensitizers. In this study, we investigated the mechanisms of action of a range of skin sensitizers through pathway identification, pathway classification and transcription factor analysis and related this to the reactive mechanisms and potency of the sensitizing agents. Results By transcriptional profiling of chemically stimulated MUTZ-3 cells, 33 canonical pathways intimately involved in sensitization to chemical substances were identified. The results showed that metabolic processes, cell cycling and oxidative stress responses are the key events activated during skin sensitization, and that these functions are engaged differently depending on the reactivity mechanisms of the sensitizing agent. Furthermore, the results indicate that the chemical reactivity groups seem to gradually engage more pathways and more molecules in each pathway with increasing sensitizing potency of the chemical used for stimulation. Also, a switch in gene regulation from up to down regulation, with increasing potency, was seen both in genes involved in metabolic functions and cell cycling. These observed pathway patterns were clearly reflected in the regulatory elements identified to drive these processes, where 33 regulatory elements have been proposed for further analysis. 
Conclusions This study demonstrates that functional analysis of biomarkers identified from our genomics study of human MUTZ-3 cells can be used to assess sensitizing potency of chemicals in vitro, by the identification of key cellular events, such as metabolic and cell cycling pathways. PMID:24517095

  3. Lunar Exploration Architecture Level Key Drivers and Sensitivities

    NASA Technical Reports Server (NTRS)

    Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher

    2009-01-01

    Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible while still capturing those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.

  4. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility, using different grouping strategies for uncertainty components. The variance-based sensitivity analysis is thus improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented in a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components.
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
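    The variance-based (Sobol-type) indices that such frameworks generalize can be estimated with a standard pick-freeze Monte Carlo scheme. The sketch below uses the well-known Ishigami test function rather than a groundwater model, so the estimates can be checked against its known first-order indices (about 0.31, 0.44, and 0).

```python
import numpy as np

def ishigami(X, a=7.0, b=0.1):
    return np.sin(X[:, 0]) + a*np.sin(X[:, 1])**2 + b*X[:, 2]**4*np.sin(X[:, 0])

rng = np.random.default_rng(0)
n, d = 50_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = fA.var()

S1 = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]                                  # "freeze" input i at A's values
    S1.append(np.mean(fA * (ishigami(ABi) - fB)) / var)  # Saltelli-style estimator
print([round(float(s), 2) for s in S1])
```

    Each first-order index is the fraction of output variance explained by one input alone; grouping several inputs before the pick-freeze step yields the grouped indices in the spirit of the abstract's flexible uncertainty-component combinations.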

  5. Global Sensitivity Applied to Dynamic Combined Finite Discrete Element Methods for Fracture Simulation

    NASA Astrophysics Data System (ADS)

    Godinez, H. C.; Rougier, E.; Osthus, D.; Srinivasan, G.

    2017-12-01

    Fracture propagation plays a key role in a number of applications of interest to the scientific community. From dynamic fracture processes like spall and fragmentation in metals to gas flow through static fractures in rock and the subsurface, the dynamics of fracture propagation are important to various engineering and scientific disciplines. In this work we apply a global sensitivity analysis to the Hybrid Optimization Software Suite (HOSS), a multi-physics software tool based on the combined finite-discrete element method that is used to describe material deformation and failure (i.e., fracture and fragmentation) under a number of user-prescribed boundary conditions. We explore the sensitivity of HOSS to various model parameters that influence how fractures propagate through a material of interest. These parameters control the softening curve that the model relies on to determine fractures within each element of the mesh, as well as other internal parameters that influence fracture behavior. The sensitivity method we apply is the Fourier Amplitude Sensitivity Test (FAST), a global sensitivity method used to explore how each parameter influences the modeled fracture and to determine the key model parameters that have the most impact on the model. We present several sensitivity experiments for different combinations of model parameters and compare against experimental data for verification.
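    A minimal implementation of FAST itself (not of HOSS) is sketched below: every input is driven along a search curve at its own integer frequency, and the share of output variance found at the harmonics of that frequency estimates the input's first-order index. The two-input linear model and the frequency set {11, 35} are illustrative choices; for y = 2*x1 + x2 with both inputs on [0, 1], the variance shares are 4/5 and 1/5.

```python
import numpy as np

def fast_indices(model, omegas, n=4097, M=4):
    """Classic FAST: all inputs driven along a search curve; each input's
    first-order index is read off the harmonics of its assigned frequency."""
    s = np.pi * (2.0*np.arange(n) + 1.0 - n) / n                # n points in (-pi, pi)
    X = 0.5 + np.arcsin(np.sin(np.outer(omegas, s))) / np.pi    # inputs in [0, 1]
    y = model(X)
    y = y - y.mean()
    S = []
    for w in omegas:
        Di = 0.0
        for p in range(1, M + 1):                # partial variance at harmonics p*w
            Ac = np.mean(y * np.cos(p*w*s))
            Bc = np.mean(y * np.sin(p*w*s))
            Di += 2.0 * (Ac*Ac + Bc*Bc)
        S.append(Di / np.var(y))
    return S

# hypothetical 2-input model: variance share of x1 is 4/5, of x2 is 1/5
S = fast_indices(lambda X: 2.0*X[0] + X[1], np.array([11, 35]))
print([round(v, 2) for v in S])
```

    The frequencies must be chosen so their first few harmonics do not collide (11, 22, 33, 44 versus 35, 70, 105, 140 here); published frequency sets exist for larger parameter counts.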

  6. Design sensitivity analysis of rotorcraft airframe structures for vibration reduction

    NASA Technical Reports Server (NTRS)

    Murthy, T. Sreekanta

    1987-01-01

    Optimization of rotorcraft structures for vibration reduction was studied. The objective of this study is to develop practical computational procedures for structural optimization of airframes subject to steady-state vibration response constraints. One of the key elements of any such computational procedure is design sensitivity analysis. A method for design sensitivity analysis of airframes under vibration response constraints is presented. The mathematical formulation of the method and its implementation as a new solution sequence in MSC/NASTRAN are described. The results of the application of the method to a simple finite element stick model of the AH-1G helicopter airframe are presented and discussed. Selection of design variables that are most likely to bring about changes in the response at specified locations in the airframe is based on consideration of forced response strain energy. Sensitivity coefficients are determined for the selected design variable set. Constraints on the natural frequencies are also included in addition to the constraints on the steady-state response. Sensitivity coefficients for these constraints are determined. Results of the analysis and insights gained in applying the method to the airframe model are discussed. The general nature of future work to be conducted is described.
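    A central computation behind such frequency-constraint sensitivity coefficients is the modal eigenvalue derivative: for mass-normalized modes, d(lambda_i)/dp = phi_i^T (dK/dp) phi_i, with the mass matrix taken as the identity in this sketch. The small random stiffness model below is hypothetical, not the NASTRAN solution sequence; the analytic derivative is simply checked against a finite difference.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
Q = rng.normal(size=(n, n))
K0 = Q @ Q.T + n * np.eye(n)           # baseline stiffness (symmetric positive definite)
dK = np.diag(np.arange(1.0, n + 1))    # stiffness derivative wrt design variable p

def eigs(p):
    lam, phi = np.linalg.eigh(K0 + p * dK)   # M = I, so modes are mass-normalized
    return lam, phi

p0, eps = 0.3, 1e-6
lam, phi = eigs(p0)
# analytic eigenvalue sensitivities: phi_i^T dK phi_i
analytic = np.array([phi[:, i] @ dK @ phi[:, i] for i in range(n)])
fd = (eigs(p0 + eps)[0] - eigs(p0 - eps)[0]) / (2 * eps)
print(np.max(np.abs(analytic - fd)))
```

    Natural-frequency sensitivities follow by the chain rule from lambda = omega^2; response sensitivities additionally require mode-shape derivatives or direct differentiation of the dynamic stiffness, which the paper's NASTRAN implementation handles.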

  7. Analysis of the economic impact of the national unified carbon trading market mechanism: the case of Hebei Province

    NASA Astrophysics Data System (ADS)

    Sun, Yuxing

    2018-05-01

    In this paper, a grey prediction model is used to forecast carbon emissions in Hebei Province, and an impact analysis model based on TermCo2 is established. We also review the CGE literature and examine scenario construction, the selection of key parameters, and sensitivity analysis of application scenarios, for industry reference.
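    A minimal GM(1,1) grey prediction model of the kind mentioned above can be sketched as follows; the emissions series is fabricated for illustration, not Hebei data.

```python
import numpy as np

def gm11_forecast(x, horizon):
    """Minimal GM(1,1) grey model: fit on series x, forecast `horizon` steps ahead."""
    x = np.asarray(x, float)
    x1 = np.cumsum(x)                        # accumulated generating operation (AGO)
    z = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]   # least-squares fit of a, b
    k = np.arange(len(x) + horizon)
    x1_hat = (x[0] - b/a) * np.exp(-a*k) + b/a        # AGO-domain solution
    return np.diff(x1_hat, prepend=0.0)[len(x):]      # restore original series (IAGO)

x = 100.0 * 1.05 ** np.arange(8)             # hypothetical emissions series
print(gm11_forecast(x, 2).round(1))
```

    GM(1,1) is designed for short, smooth, near-exponential series, which is why it is a common choice for provincial emissions forecasting with limited data.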

  8. Sensitivity and uncertainty analysis for Abreu & Johnson numerical vapor intrusion model.

    PubMed

    Ma, Jie; Yan, Guangxu; Li, Haiyan; Guo, Shaohui

    2016-03-05

    This study conducted one-at-a-time (OAT) sensitivity and uncertainty analysis for a numerical vapor intrusion model for nine input parameters, including soil porosity, soil moisture, soil air permeability, aerobic biodegradation rate, building depressurization, crack width, floor thickness, building volume, and indoor air exchange rate. Simulations were performed for three soil types (clay, silt, and sand), two source depths (3 and 8 m), and two source concentrations (1 and 400 g/m³). Model sensitivity and uncertainty for shallow and high-concentration vapor sources (3 m and 400 g/m³) are much smaller than for deep and low-concentration sources (8 m and 1 g/m³). For high-concentration sources, soil air permeability, indoor air exchange rate, and building depressurization (for highly permeable soil like sand) are key contributors to model output uncertainty. For low-concentration sources, soil porosity, soil moisture, aerobic biodegradation rate and soil gas permeability are key contributors to model output uncertainty. Another important finding is that impacts of aerobic biodegradation on vapor intrusion potential of petroleum hydrocarbons are negligible when vapor source concentration is high, because of insufficient oxygen supply that limits aerobic biodegradation activities. Copyright © 2015 Elsevier B.V. All rights reserved.
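    The OAT pattern used in the study can be sketched generically. The attenuation expression below is a made-up stand-in, not the Abreu & Johnson model; only the perturb-one-parameter-at-a-time bookkeeping is the point.

```python
def indoor_conc(p):
    """Hypothetical toy attenuation expression (NOT the Abreu & Johnson model),
    used only to show the one-at-a-time (OAT) usage pattern."""
    return p["source"] * p["perm"] / (p["perm"] + p["ach"] * p["depress"])

base = {"source": 1.0, "perm": 1e-7, "ach": 0.5, "depress": 4e-7}

spans = {}
for name in base:
    lo, hi = dict(base), dict(base)
    lo[name] *= 0.8                 # perturb one parameter +/-20%,
    hi[name] *= 1.2                 # holding all others at the base case
    spans[name] = abs(indoor_conc(hi) - indoor_conc(lo)) / indoor_conc(base)
print({k: round(v, 3) for k, v in spans.items()})
```

    The normalized span ranks parameters by local influence; because each parameter is varied alone, OAT misses interactions, which is why the study pairs it with uncertainty analysis.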

  9. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    NASA Astrophysics Data System (ADS)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
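    The central point, that brute-force impacts equal source contributions only when the concentration-emission relationship is linear, can be seen in a two-source toy; the quadratic "chemistry" is a purely illustrative stand-in.

```python
def linear(e1, e2):
    return 3.0*e1 + 1.0*e2            # concentration responds linearly to emissions

def nonlinear(e1, e2):
    return (e1 + e2)**2               # hypothetical nonlinear chemistry stand-in

results = {}
for model in (linear, nonlinear):
    total = model(10.0, 5.0)
    imp1 = total - model(0.0, 5.0)    # brute-force "impact" of switching off source 1
    imp2 = total - model(10.0, 0.0)
    results[model.__name__] = (total, imp1 + imp2)
print(results)
```

    In the linear case the two impacts sum exactly to the total concentration, so they double as contributions; in the nonlinear case their sum overshoots the total, so a sensitivity run cannot be read as an apportionment.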

  10. Asymmetry of perceived key movement in chorale sequences: converging evidence from a probe-tone analysis.

    PubMed

    Cuddy, L L; Thompson, W F

    1992-01-01

    In a probe-tone experiment, two groups of listeners--one trained, the other untrained, in traditional music theory--rated the goodness of fit of each of the 12 notes of the chromatic scale to four-voice harmonic sequences. Sequences were 12 simplified excerpts from Bach chorales, 4 nonmodulating, and 8 modulating. Modulations occurred either one or two steps in either the clockwise or the counterclockwise direction on the cycle of fifths. A consistent pattern of probe-tone ratings was obtained for each sequence, with no significant differences between listener groups. Two methods of analysis (Fourier analysis and regression analysis) revealed a directional asymmetry in the perceived key movement conveyed by modulating sequences. For a given modulation distance, modulations in the counterclockwise direction effected a clearer shift in tonal organization toward the final key than did clockwise modulations. The nature of the directional asymmetry was consistent with results reported for identification and rating of key change in the sequences (Thompson & Cuddy, 1989a). Further, according to the multiple-regression analysis, probe-tone ratings did not merely reflect the distribution of tones in the sequence. Rather, ratings were sensitive to the temporal structure of the tonal organization in the sequence.

  11. Assessing the sensitivity of bovine tuberculosis surveillance in Canada's cattle population, 2009-2013.

    PubMed

    El Allaki, Farouk; Harrington, Noel; Howden, Krista

    2016-11-01

    The objectives of this study were (1) to estimate the annual sensitivity of Canada's bTB surveillance system and its three components (slaughter surveillance, export testing and disease investigation) using a scenario tree modelling approach, and (2) to identify key model parameters that influence the estimates of surveillance system sensitivity (SSSe). To achieve these objectives, we designed stochastic scenario tree models for the three surveillance system components included in the analysis. Demographic data, slaughter data, export testing data, and disease investigation data from 2009 to 2013 were extracted for input into the scenario trees. Sensitivity analysis was conducted to identify the parameters most influential on SSSe estimates. The median annual SSSe estimates generated from the study were very high, ranging from 0.95 (95% probability interval [PI]: 0.88-0.98) to 0.97 (95% PI: 0.93-0.99). Median annual sensitivity estimates for the slaughter surveillance component ranged from 0.95 (95% PI: 0.88-0.98) to 0.97 (95% PI: 0.93-0.99). This shows slaughter surveillance to be the major contributor to overall surveillance system sensitivity, with a high probability of detecting M. bovis infection if present at a prevalence of 0.00028% or greater during the study period. The export testing and disease investigation components had extremely low component sensitivity estimates; the maximum median sensitivity estimates were 0.02 (95% PI: 0.014-0.023) and 0.0061 (95% PI: 0.0056-0.0066), respectively. The three most influential input parameters on the model's output (SSSe) were the probability of a granuloma being detected at slaughter inspection, the probability of a granuloma being present in older animals (≥12 months of age), and the probability of a granuloma sample being submitted to the laboratory.
Additional studies are required to reduce the levels of uncertainty and variability associated with these three parameters influencing the surveillance system sensitivity. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
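As a hedged illustration of how component sensitivities could roll up into an overall system estimate, the sketch below assumes independent components and reuses the median values quoted above; the study itself used stochastic scenario tree models, so this shortcut is only indicative.

```python
# Hedged sketch (assumed independence, not the study's scenario tree models):
# combining component sensitivities into an overall surveillance system
# sensitivity as the probability that at least one component detects infection.

def system_sensitivity(component_ses):
    """1 minus the probability that every component misses the infection."""
    p_miss = 1.0
    for se in component_ses:
        p_miss *= 1.0 - se
    return 1.0 - p_miss

# Median component estimates quoted above: slaughter, export, investigation.
sse = system_sensitivity([0.95, 0.02, 0.0061])
print(round(sse, 3))  # ~0.951: slaughter surveillance dominates
```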

  12. Subscales to the Taylor Manifest Anxiety Scale in Three Chronically Ill Populations.

    ERIC Educational Resources Information Center

    Moore, Peter N.; And Others

    1984-01-01

    Examines factors of anxiety in the Taylor Manifest Anxiety Scale in 150 asthma, tuberculosis, and chronic pain patients. Key cluster analysis revealed five clusters: restlessness, embarrassment, sensitivity, physiological anxiety, and self-confidence. Embarrassment is fairly dependent on the other factors. (JAC)

  13. Four-dimensional key design in amplitude, phase, polarization and distance for optical encryption based on polarization digital holography and QR code.

    PubMed

    Lin, Chao; Shen, Xueju; Li, Baochen

    2014-08-25

    We demonstrate that all parameters of an optical lightwave can be simultaneously designed as keys in a security system. This multi-dimensional key property can significantly enlarge the key space and further enhance the security level of the system. Single-shot off-axis digital holography with orthogonally polarized reference waves is employed to record the polarization state of the object wave. Two polarization holograms are calculated and fabricated, then arranged in the reference arms to generate random amplitude and phase distributions, respectively. Upon reconstruction, the original information, represented as a QR code, can be retrieved noise-free using Fresnel diffraction with the decryption keys. Numerical simulation results for this cryptosystem are presented. An analysis of the key sensitivity and fault tolerance properties is also provided.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Haitao, E-mail: liaoht@cae.ac.cn

    The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time averaged quantities for chaotic dynamical systems. The key idea is to recast the time averaged integration term in the form of a differential equation before applying the sensitivity analysis method. An additional constraint-based equation which forms the augmented equations of motion is proposed to calculate the time averaged integration variable, and the sensitivity coefficients are obtained by solving the augmented differential equations. The application of the least squares shadowing formulation to the augmented equations results in an explicit expression for the sensitivity coefficient which is dependent on the final state of the Lagrange multipliers. The LU factorization technique used to calculate the Lagrange multipliers leads to better performance with respect to both convergence and computational expense. Numerical experiments on a set of problems selected from the literature are presented to illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches, and some short impulsive sensitivity coefficients are observed when using the direct differentiation sensitivity analysis method.

  15. Sensitivity of the reference evapotranspiration to key climatic variables during the growing season in the Ejina oasis northwest China.

    PubMed

    Hou, Lan-Gong; Zou, Song-Bing; Xiao, Hong-Lang; Yang, Yong-Gang

    2013-01-01

    The standardized FAO56 Penman-Monteith model, which has been regarded as the most reasonable method in both humid and arid climatic conditions, provides reference evapotranspiration (ETo) estimates for planning and efficient use of agricultural water resources. Sensitivity analysis is important for understanding the relative importance of climatic variables to the variation of reference evapotranspiration. In this study, a non-dimensional relative sensitivity coefficient was employed to predict responses of ETo to perturbations of four climatic variables in the Ejina oasis, northwest China. A 20-year historical dataset of daily air temperature, wind speed, relative humidity and daily sunshine duration in the Ejina oasis was used in the analysis. Results show that daily sensitivity coefficients exhibited large fluctuations during the growing season, and that shortwave radiation was in general the most sensitive variable for the Ejina oasis, followed by air temperature, wind speed and relative humidity. According to this study, the response of ETo to perturbations of air temperature, wind speed, relative humidity and shortwave radiation can be reliably predicted from their sensitivity coefficients.
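A minimal sketch of the non-dimensional relative sensitivity coefficient, S_v = (dETo/dv) * (v/ETo), estimated by a central finite difference. The `eto` function below is a hypothetical stand-in, not the FAO56 Penman-Monteith equation.

```python
# A minimal sketch (not the paper's code) of the non-dimensional relative
# sensitivity coefficient S_v = (dETo/dv) * (v / ETo), estimated with a
# central finite difference. The `eto` function is a hypothetical stand-in,
# not the FAO56 Penman-Monteith equation.

def relative_sensitivity(f, params, name, h=1e-6):
    """Central-difference estimate of S_v for parameter `name`."""
    base = f(**params)
    v = params[name]
    hi = dict(params, **{name: v * (1 + h)})
    lo = dict(params, **{name: v * (1 - h)})
    deriv = (f(**hi) - f(**lo)) / (2 * v * h)
    return deriv * v / base

def eto(rs, t, u2, rh):
    # Toy linearized response: shortwave radiation, temperature, wind, humidity.
    return 0.4 * rs + 0.05 * t + 0.3 * u2 * (1 - rh)

params = {"rs": 20.0, "t": 25.0, "u2": 2.0, "rh": 0.5}
coeffs = {k: relative_sensitivity(eto, params, k) for k in params}
most_sensitive = max(coeffs, key=lambda k: abs(coeffs[k]))
print(most_sensitive)  # rs: shortwave radiation dominates in this toy setup
```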

  16. Global sensitivity analysis for UNSATCHEM simulations of crop production with degraded waters

    USDA-ARS?s Scientific Manuscript database

    One strategy for maintaining irrigated agricultural productivity in the face of diminishing resource availability is to make greater use of marginal quality waters and lands. A key to sustaining systems using degraded irrigation waters is salinity management. Advanced simulation models and decision ...

  17. Key Policy Makers' Awareness of Tobacco Taxation Effectiveness through a Sensitization Program.

    PubMed

    Heydari, Gholamreza; Ebn Ahmady, Arezoo; Lando, Harry A; Chamyani, Fahimeh; Masjedi, Mohammadreza; Shadmehr, Mohammad B; Fadaizadeh, Lida

    2015-12-01

    The implementation of five of the six WHO MPOWER measures in Iran is satisfactory; the only notable shortcoming is the lack of tobacco taxation increases. This study was designed to increase key policy makers' awareness of tobacco taxation effectiveness through a sensitization program in Iran. This analytical, semi-experimental study in 2014 included 110 tobacco control key policy makers, who were trained and received educational materials on the importance of tobacco taxation. A valid and reliable questionnaire was completed before and three months after the intervention. Data were analyzed using means (SD), t-tests and analysis of variance. The mean (SD) scores at pre- and post-test were 2.7 ± 3 and 8.8 ± 1 out of 10, respectively. Paired t-tests demonstrated a significant difference between pre- and post-test knowledge scores. Increasing knowledge and promoting favorable attitudes among policy makers can lead to greater attention, which could in turn change tobacco taxation policies.

  18. Systems Analysis of the Hydrogen Transition with HyTrans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leiby, Paul Newsome; Greene, David L; Bowman, David Charles

    2007-01-01

    The U.S. Federal government is carefully considering the merits and long-term prospects of hydrogen-fueled vehicles. NAS (1) has called for the careful application of systems analysis tools to structure the complex assessment required. Others, raising cautionary notes, question whether a consistent and plausible transition to hydrogen light-duty vehicles can be identified (2) and whether that transition would, on balance, be environmentally preferred. Modeling the market transition to hydrogen-powered vehicles is an inherently complex process, encompassing hydrogen production, delivery and retailing, vehicle manufacturing, and vehicle choice and use. We describe the integration of key technological and market factors in a dynamic transition model, HyTrans. The usefulness of HyTrans and its predictions depends on three key factors: (1) the validity of the economic theories that underpin the model, (2) the authenticity with which the key processes are represented, and (3) the accuracy of specific parameter values used in the process representations. This paper summarizes the theoretical basis of HyTrans and highlights the implications of key parameter specifications with sensitivity analysis.

  19. Nonnegative constraint analysis of key fluorophores within human breast cancer using native fluorescence spectroscopy excited by selective wavelength of 300 nm

    NASA Astrophysics Data System (ADS)

    Pu, Yang; Sordillo, Laura A.; Alfano, Robert R.

    2015-03-01

    Native fluorescence spectroscopy plays an important role in cancer discrimination. It is widely acknowledged that the emission spectrum of tissue is a superposition of the spectra of various salient fluorophores. In this study, the native fluorescence spectra of human cancerous and normal breast tissues excited at a selected wavelength of 300 nm are used to investigate the key building-block fluorophores: tryptophan and reduced nicotinamide adenine dinucleotide (NADH). The basis spectra describing these key fluorophores' contributions to the tissue emission spectra are obtained by nonnegative constraint analysis. The emission spectra of human cancerous and normal tissue samples are projected onto the fluorophore spectral subspace. Since previous studies indicate that tryptophan and NADH are key fluorophores related to tumor evolution, it is essential to extract their information from tissue fluorescence while discarding the redundancy. To evaluate the efficacy of this approach for cancer detection, a linear discriminant analysis (LDA) classifier is used to estimate the sensitivity and specificity. This research demonstrates that native fluorescence spectroscopy measurements are effective for detecting changes in fluorophore composition in tissues due to the development of cancer.

  20. Sensitivity Analysis on Remote Sensing Evapotranspiration Algorithm of Surface Energy Balance for Land

    NASA Astrophysics Data System (ADS)

    Wang, J.; Samms, T.; Meier, C.; Simmons, L.; Miller, D.; Bathke, D.

    2005-12-01

    Spatial evapotranspiration (ET) is usually estimated with the Surface Energy Balance Algorithm for Land. The average accuracy of the algorithm is 85% on a daily basis and 95% on a seasonal basis. However, the accuracy of the algorithm varies from 67% to 95% for instantaneous ET estimates and, as reported in 18 studies, from 70% to 98% for 1 to 10-day ET estimates. There is a need to understand the sensitivity of the ET calculation with respect to the algorithm's variables and equations. With an increased understanding, information can be developed to improve the algorithm and to better identify the key variables and equations. A Modified Surface Energy Balance Algorithm for Land (MSEBAL) was developed and validated with data from a pecan orchard and an alfalfa field. The MSEBAL uses ground reflectance and temperature data from ASTER sensors along with humidity, wind speed, and solar radiation data from a local weather station. MSEBAL outputs hourly and daily ET at 90 m by 90 m resolution. A sensitivity analysis was conducted on the MSEBAL ET calculation. To observe the sensitivity of the calculation to a particular variable, the value of that variable was changed while holding the other variables constant. The key variables and equations to which the ET calculation is most sensitive were determined in this study.

  1. Impervious surfaces and sewer pipe effects on stormwater runoff temperature

    NASA Astrophysics Data System (ADS)

    Sabouri, F.; Gharabaghi, B.; Mahboubi, A. A.; McBean, E. A.

    2013-10-01

    The warming effect of impervious surfaces in urban catchment areas and the cooling effect of underground storm sewer pipes on stormwater runoff temperature are assessed. Four urban residential catchment areas in the cities of Guelph and Kitchener, Ontario, Canada were evaluated using a combination of runoff monitoring and modelling. The stormwater level and water temperature were monitored at 10 min intervals at the inlets of the stormwater management ponds for three summers (2009, 2010 and 2011). The warming effect of the ponds was also studied but is discussed in detail in a separate paper. An artificial neural network (ANN) model for stormwater temperature was trained and validated using the monitoring data. Stormwater runoff temperature was most sensitive to the event mean temperature of the rainfall (EMTR), with a normalized sensitivity coefficient (Sn) of 1.257. Subsequent levels of sensitivity corresponded to the longest sewer pipe length (LPL), maximum rainfall intensity (MI), percent impervious cover (IMP), rainfall depth (R), initial asphalt temperature (AspT), pipe network density (PND), and rainfall duration (D), respectively. Percent impervious cover of the catchment area (IMP) was the key parameter representing the warming effect of the paved surfaces; sensitivity analysis showed that an IMP increase from 20% to 50% resulted in a runoff temperature increase of 3 °C. The longest storm sewer pipe length (LPL) and the storm sewer pipe network density (PND) are the two key parameters that control the cooling effect of the underground sewer system; sensitivity analysis showed that an LPL increase from 345 to 966 m resulted in a runoff temperature drop of 2.5 °C.

  2. Space system operations and support cost analysis using Markov chains

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.

    1990-01-01

    This paper evaluates the use of Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
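The Markov chain approach sketched above can be illustrated with an absorbing chain whose states and numbers are invented here: the fundamental matrix N = (I - Q)^-1 gives expected visits to each transient state, from which an expected operations-and-support cost and expected life follow.

```python
# Hedged sketch of the Markov chain idea: states, transition probabilities and
# per-visit costs are invented. For an absorbing chain, the fundamental matrix
# N = (I - Q)^-1 gives expected visits to each transient state, from which an
# expected operations-and-support cost and expected life follow.

# Transient states: 0 = mission operations, 1 = refurbishment.
# The remaining probability mass in each row absorbs into "retired".
Q = [[0.90, 0.08],
     [0.70, 0.20]]
visit_cost = [1.0, 5.0]  # notional cost per visit, e.g. $M

# Invert the 2x2 matrix (I - Q) by hand to stay dependency-free.
m = [[1 - Q[0][0], -Q[0][1]],
     [-Q[1][0], 1 - Q[1][1]]]
det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
n_row0 = [m[1][1] / det, -m[0][1] / det]  # first row of N: start in state 0

expected_cost = sum(v * c for v, c in zip(n_row0, visit_cost))
expected_life = sum(n_row0)  # expected total transient-state visits
print(round(expected_cost, 2), round(expected_life, 2))  # 50.0 36.67
```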

  3. Assessment of energy and economic performance of office building models: a case study

    NASA Astrophysics Data System (ADS)

    Song, X. Y.; Ye, C. T.; Li, H. S.; Wang, X. L.; Ma, W. B.

    2016-08-01

    Building energy consumption accounts for more than 37.3% of total energy consumption in China, while the proportion of energy-saving buildings is just 5%. In this paper, to assess the energy-saving potential, an office building in Southern China was selected as a test case for energy consumption characteristics. The base building model was developed in TRNSYS software and validated against data recorded over six days of field work in August-September 2013. Sensitivity analysis was conducted on the energy performance of building envelope retrofitting; five envelope parameters were analyzed to assess the thermal responses. Results indicated that the key sensitivity factors were the heat-transfer coefficient of exterior walls (U-wall), the infiltration rate and the shading coefficient (SC), which together accounted for about 89.32% of the total sensitivity. In addition, the results were evaluated in terms of energy and economic analysis. The sensitivity analysis was validated against important results of previous studies. Furthermore, the cost-effectiveness method improved the efficiency of investment management in building energy.

  4. Biochemical analysis of force-sensitive responses using a large-scale cell stretch device.

    PubMed

    Renner, Derrick J; Ewald, Makena L; Kim, Timothy; Yamada, Soichiro

    2017-09-03

    Physical force has emerged as a key regulator of tissue homeostasis, and plays an important role in embryogenesis, tissue regeneration, and disease progression. Currently, the details of protein interactions under elevated physical stress are largely missing, therefore, preventing the fundamental, molecular understanding of mechano-transduction. This is in part due to the difficulty isolating large quantities of cell lysates exposed to force-bearing conditions for biochemical analysis. We designed a simple, easy-to-fabricate, large-scale cell stretch device for the analysis of force-sensitive cell responses. Using proximal biotinylation (BioID) analysis or phospho-specific antibodies, we detected force-sensitive biochemical changes in cells exposed to prolonged cyclic substrate stretch. For example, using promiscuous biotin ligase BirA* tagged α-catenin, the biotinylation of myosin IIA increased with stretch, suggesting the close proximity of myosin IIA to α-catenin under a force bearing condition. Furthermore, using phospho-specific antibodies, Akt phosphorylation was reduced upon stretch while Src phosphorylation was unchanged. Interestingly, phosphorylation of GSK3β, a downstream effector of Akt pathway, was also reduced with stretch, while the phosphorylation of other Akt effectors was unchanged. These data suggest that the Akt-GSK3β pathway is force-sensitive. This simple cell stretch device enables biochemical analysis of force-sensitive responses and has potential to uncover molecules underlying mechano-transduction.

  5. DIAGNOSTIC STUDY ON FINE PARTICULATE MATTER PREDICTIONS OF CMAQ IN THE SOUTHEASTERN U.S.

    EPA Science Inventory

    In this study, the authors use the process analysis tool embedded in CMAQ to examine major processes that govern the fate of key pollutants, identify the most influential processes that contribute to model errors, and guide the diagnostic and sensitivity studies aimed at improvin...

  6. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    NASA Astrophysics Data System (ADS)

    Bennett, Katrina E.; Urrego Blanco, Jorge R.; Jonko, Alexandra; Bohn, Theodore J.; Atchley, Adam L.; Urban, Nathan M.; Middleton, Richard S.

    2018-01-01

    The Colorado River Basin is a fundamentally important river for society, ecology, and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent, and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. We combine global sensitivity analysis with a space-filling Latin Hypercube Sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach. We find that snow-dominated regions are much more sensitive to uncertainties in VIC parameters. Although baseflow and runoff changes respond to parameters used in previous sensitivity studies, we discover new key parameter sensitivities. For instance, changes in runoff and evapotranspiration are sensitive to albedo, while changes in snow water equivalent are sensitive to canopy fraction and Leaf Area Index (LAI) in the VIC model. It is critical for improved modeling to narrow uncertainty in these parameters through improved observations and field studies. This is important because LAI and albedo are anticipated to change under future climate and narrowing uncertainty is paramount to advance our application of models such as VIC for water resource management.
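The space-filling Latin Hypercube Sampling mentioned above can be sketched as follows; the parameter names and ranges are invented for illustration and are not the VIC model's actual bounds.

```python
# A small sketch of space-filling Latin Hypercube Sampling: each parameter's
# range is split into n strata, each stratum is sampled exactly once, and the
# strata are shuffled independently per parameter. Parameter names and ranges
# are invented for illustration; they are not the VIC model's actual bounds.

import random

random.seed(42)

def latin_hypercube(n, bounds):
    """Return n samples over {name: (lo, hi)}, one per stratum per parameter."""
    columns = {}
    for name, (lo, hi) in bounds.items():
        strata = list(range(n))
        random.shuffle(strata)  # decouple stratum order across parameters
        columns[name] = [lo + (s + random.random()) / n * (hi - lo)
                         for s in strata]
    return [{name: columns[name][i] for name in bounds} for i in range(n)]

bounds = {"albedo": (0.1, 0.9), "lai": (0.5, 6.0)}
samples = latin_hypercube(10, bounds)
# Each parameter's 10 values land in 10 distinct deciles of its range.
```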

  7. Chaotic Image Encryption Algorithm Based on Bit Permutation and Dynamic DNA Encoding.

    PubMed

    Zhang, Xuncai; Han, Feng; Niu, Ying

    2017-01-01

    Exploiting the sensitivity of chaos to initial conditions and its pseudorandomness, combined with the inherent and unique information-processing ability of the DNA molecule's spatial configurations, a novel image encryption algorithm based on bit permutation and dynamic DNA encoding is proposed here. The algorithm first uses Keccak to calculate the hash value of a given DNA sequence as the initial value of a chaotic map; second, it uses a chaotic sequence to scramble the image pixel locations, and a butterfly network is used to implement the bit permutation. Then, the image is dynamically coded into a DNA matrix, and an algebraic operation is performed with the DNA sequence to realize the substitution of the pixels, which further improves the security of the encryption. Finally, the confusion and diffusion properties of the algorithm are further enhanced by an operation on the DNA sequence and ciphertext feedback. The results of the experiment and security analysis show that the algorithm not only has a large key space and strong sensitivity to the key but can also effectively resist attacks such as statistical analysis and exhaustive analysis.
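The key sensitivity claimed above can be demonstrated with a minimal sketch (not the paper's algorithm): a keystream derived from the chaotic logistic map changes completely when the initial condition shifts by only 1e-10.

```python
# Minimal sketch (not the paper's algorithm) of keystream encryption with the
# logistic map x -> r*x*(1-x), showing key sensitivity: shifting the initial
# condition by 1e-10 produces a completely different keystream.

def keystream(x0, r=3.99, n=16, burn=100):
    """Iterate the logistic map, discard a transient, quantize to bytes."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

plain = b"sixteen byte msg"
k_good = keystream(0.3141592653)
k_bad = keystream(0.3141592653 + 1e-10)  # almost the same key

cipher = bytes(p ^ k for p, k in zip(plain, k_good))
print(bytes(c ^ k for c, k in zip(cipher, k_good)) == plain)  # True
print(k_good == k_bad)  # False: chaos amplifies the 1e-10 difference
```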

  8. Chaotic Image Encryption Algorithm Based on Bit Permutation and Dynamic DNA Encoding

    PubMed Central

    2017-01-01

    Exploiting the sensitivity of chaos to initial conditions and its pseudorandomness, combined with the inherent and unique information-processing ability of the DNA molecule's spatial configurations, a novel image encryption algorithm based on bit permutation and dynamic DNA encoding is proposed here. The algorithm first uses Keccak to calculate the hash value of a given DNA sequence as the initial value of a chaotic map; second, it uses a chaotic sequence to scramble the image pixel locations, and a butterfly network is used to implement the bit permutation. Then, the image is dynamically coded into a DNA matrix, and an algebraic operation is performed with the DNA sequence to realize the substitution of the pixels, which further improves the security of the encryption. Finally, the confusion and diffusion properties of the algorithm are further enhanced by an operation on the DNA sequence and ciphertext feedback. The results of the experiment and security analysis show that the algorithm not only has a large key space and strong sensitivity to the key but can also effectively resist attacks such as statistical analysis and exhaustive analysis. PMID:28912802

  9. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
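One ingredient of such an index can be sketched with the law of total variance, using invented numbers: output variance splits into a within-model (parametric) part and a between-model part, and the between-model share measures how much the choice of competing process model matters. This is a simplified toy, not the paper's full formulation.

```python
# A toy, invented-numbers version of the idea: by the law of total variance,
# output variance splits into a within-model (parametric) part and a
# between-model part; the between-model share measures how much the choice
# of competing process model matters. This is a simplified ingredient of the
# paper's index, not its full formulation.

models = [
    {"weight": 0.5, "mean": 100.0, "var": 25.0},  # recharge hypothesis A
    {"weight": 0.5, "mean": 120.0, "var": 25.0},  # recharge hypothesis B
]

grand_mean = sum(m["weight"] * m["mean"] for m in models)
within = sum(m["weight"] * m["var"] for m in models)          # parametric part
between = sum(m["weight"] * (m["mean"] - grand_mean) ** 2
              for m in models)                                # model-choice part
share = between / (within + between)
print(share)  # 100 / (25 + 100) = 0.8
```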

  10. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  11. Constitutive BAK activation as a determinant of drug sensitivity in malignant lymphohematopoietic cells

    PubMed Central

    Dai, Haiming; Ding, Husheng; Meng, X. Wei; Peterson, Kevin L.; Schneider, Paula A.; Karp, Judith E.; Kaufmann, Scott H.

    2015-01-01

    Mitochondrial outer membrane permeabilization (MOMP), a key step in the intrinsic apoptotic pathway, is incompletely understood. Current models emphasize the role of BH3-only BCL2 family members in BAX and BAK activation. Here we demonstrate concentration-dependent BAK autoactivation under cell-free conditions and provide evidence that this autoactivation plays a key role in regulating the intrinsic apoptotic pathway in intact cells. In particular, we show that up to 80% of BAK (but not BAX) in lymphohematopoietic cell lines is oligomerized and bound to anti-apoptotic BCL2 family members in the absence of exogenous death stimuli. The extent of this constitutive BAK oligomerization is diminished by BAK knockdown and unaffected by BIM or PUMA down-regulation. Further analysis indicates that sensitivity of cells to BH3 mimetics reflects the identity of the anti-apoptotic proteins to which BAK is constitutively bound, with extensive BCLXL•BAK complexes predicting navitoclax sensitivity, and extensive MCL1•BAK complexes predicting A1210477 sensitivity. Moreover, high BAK expression correlates with sensitivity of clinical acute myelogenous leukemia to chemotherapy, whereas low BAK levels correlate with resistance and relapse. Collectively, these results inform current understanding of MOMP and provide new insight into the ability of BH3 mimetics to induce apoptosis without directly activating BAX or BAK. PMID:26494789

  12. Dynamic analysis of process reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas{trademark} gasification process is used to illustrate the utility of this approach. PyGas{trademark} is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas{trademark} gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  13. Exploring the impact of forcing error characteristics on physically based snow simulations within a global sensitivity analysis framework

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Lundquist, J. D.; Clark, M. P.

    2015-07-01

    Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology and which error characteristics are most important. Here we employ global sensitivity analysis to explore how (1) different error types (i.e., bias, random errors), (2) different error probability distributions, and (3) different error magnitudes influence physically based simulations of four snow variables (snow water equivalent, ablation rates, snow disappearance, and sublimation). We use the Sobol' global sensitivity analysis, which is typically used for model parameters but adapted here for testing model sensitivity to coexisting errors in all forcings. We quantify the Utah Energy Balance model's sensitivity to forcing errors with 1 840 000 Monte Carlo simulations across four sites and five different scenarios. Model outputs were (1) consistently more sensitive to forcing biases than random errors, (2) generally less sensitive to forcing error distributions, and (3) critically sensitive to different forcings depending on the relative magnitude of errors. For typical error magnitudes found in areas with drifting snow, precipitation bias was the most important factor for snow water equivalent, ablation rates, and snow disappearance timing, but other forcings had a more dominant impact when precipitation uncertainty was due solely to gauge undercatch. Additionally, the relative importance of forcing errors depended on the model output of interest. Sensitivity analysis can reveal which forcing error characteristics matter most for hydrologic modeling.
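The variance-based (Sobol-style) first-order index, Var(E[Y|X_i])/Var(Y), can be sketched with a toy estimator that bins on one input at a time; the "snow model" and error magnitudes below are invented, chosen so that a precipitation bias dominates a temperature noise term, echoing the bias-versus-random-error finding.

```python
# A toy sketch of a variance-based (Sobol-style) first-order index,
# Var(E[Y|X_i]) / Var(Y), estimated by binning on one input at a time.
# The "snow model" and error magnitudes are invented so that a precipitation
# bias dominates a temperature noise term, echoing the bias-vs-random finding.

import random

random.seed(0)

def swe(precip_bias, temp_noise):
    """Hypothetical snow water equivalent response to two forcing errors."""
    return 100.0 + 40.0 * precip_bias + 5.0 * temp_noise

n = 20000
xs = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]
ys = [swe(b, t) for b, t in xs]
mean_y = sum(ys) / n
var_y = sum((y - mean_y) ** 2 for y in ys) / n

def first_order(idx, bins=20):
    """Variance of per-bin means approximates Var(E[Y | X_idx])."""
    groups = [[] for _ in range(bins)]
    for x, y in zip(xs, ys):
        b = min(int((x[idx] + 1) / 2 * bins), bins - 1)
        groups[b].append(y)
    stats = [(sum(g) / len(g), len(g) / n) for g in groups if g]
    gm = sum(m * w for m, w in stats)
    return sum(w * (m - gm) ** 2 for m, w in stats) / var_y

s_bias, s_noise = first_order(0), first_order(1)
print(s_bias > s_noise)  # True: the bias term explains most of the variance
```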

  14. Sensitivity of projected long-term CO2 emissions across the Shared Socioeconomic Pathways

    NASA Astrophysics Data System (ADS)

    Marangoni, G.; Tavoni, M.; Bosetti, V.; Borgonovo, E.; Capros, P.; Fricko, O.; Gernaat, D. E. H. J.; Guivarch, C.; Havlik, P.; Huppmann, D.; Johnson, N.; Karkatsoulis, P.; Keppo, I.; Krey, V.; Ó Broin, E.; Price, J.; van Vuuren, D. P.

    2017-01-01

    Scenarios showing future greenhouse gas emissions are needed to estimate climate impacts and the mitigation efforts required for climate stabilization. Recently, the Shared Socioeconomic Pathways (SSPs) have been introduced to describe alternative social, economic and technical narratives, spanning a wide range of plausible futures in terms of challenges to mitigation and adaptation. Thus far the key drivers of the uncertainty in emissions projections have not been robustly disentangled. Here we assess the sensitivities of future CO2 emissions to key drivers characterizing the SSPs. We use six state-of-the-art integrated assessment models with different structural characteristics, and study the impact of five families of parameters, related to population, income, energy efficiency, fossil fuel availability, and low-carbon energy technology development. A recently developed sensitivity analysis algorithm allows us to parsimoniously compute both the direct and interaction effects of each of these drivers on cumulative emissions. The study reveals that the SSP assumptions about energy intensity and economic growth are the most important determinants of future CO2 emissions from energy combustion, both with and without a climate policy. Interaction terms between parameters are shown to be important determinants of the total sensitivities.
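The distinction between direct and interaction effects can be made concrete with a two-level factorial design on a toy multiplicative emissions function (emissions = GDP x energy intensity). The function and factor levels below are illustrative assumptions, not the SSP marker scenarios or the sensitivity algorithm used in the paper.

```python
from itertools import product

# Toy emissions model: multiplicative drivers interact by construction
def emissions(gdp, intensity):
    return gdp * intensity

low_high = {"gdp": (100.0, 200.0), "intensity": (0.4, 0.8)}

# Evaluate the full 2x2 design, coding each factor as -1 (low) / +1 (high)
runs = []
for g_code, i_code in product((-1, 1), repeat=2):
    gdp = low_high["gdp"][(g_code + 1) // 2]
    inten = low_high["intensity"][(i_code + 1) // 2]
    runs.append((g_code, i_code, emissions(gdp, inten)))

# Standard factorial contrasts: average of (+) runs minus average of (-) runs
main_gdp = sum(y * g for g, _, y in runs) / 2
main_int = sum(y * i for _, i, y in runs) / 2
interaction = sum(y * g * i for g, i, y in runs) / 2
```

Because the drivers multiply, the interaction contrast is nonzero: part of the emissions response cannot be attributed to either driver alone, which is the effect the paper's sensitivity algorithm separates out.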

  15. A sensitivity analysis for missing outcomes due to truncation by death under the matched-pairs design.

    PubMed

    Imai, Kosuke; Jiang, Zhichao

    2018-04-29

    The matched-pairs design enables researchers to efficiently infer causal effects from randomized experiments. In this paper, we exploit the key feature of the matched-pairs design and develop a sensitivity analysis for missing outcomes due to truncation by death, in which the outcomes of interest (e.g., quality of life measures) are not even well defined for some units (e.g., deceased patients). Our key idea is that if 2 nearly identical observations are paired prior to the randomization of the treatment, the missingness of one unit's outcome is informative about the potential missingness of the other unit's outcome under an alternative treatment condition. We consider the average treatment effect among always-observed pairs (ATOP) whose units exhibit no missing outcome regardless of their treatment status. The naive estimator based on available pairs is unbiased for the ATOP if 2 units of the same pair are identical in terms of their missingness patterns. The proposed sensitivity analysis characterizes how the bounds of the ATOP widen as the degree of the within-pair similarity decreases. We further extend the methodology to the matched-pairs design in observational studies. Our simulation studies show that informative bounds can be obtained under some scenarios when the proportion of missing data is not too large. The proposed methodology is also applied to the randomized evaluation of the Mexican universal health insurance program. An open-source software package is available for implementing the proposed research. Copyright © 2018 John Wiley & Sons, Ltd.

  16. Key Reliability Drivers of Liquid Propulsion Engines and A Reliability Model for Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Zhao-Feng; Fint, Jeffry A.; Kuck, Frederick M.

    2005-01-01

This paper addresses the in-flight reliability of a liquid propulsion engine system for a launch vehicle. We first establish a comprehensive list of system and subsystem reliability drivers for any liquid propulsion engine system. We then build a reliability model to parametrically analyze the impact of some reliability parameters, and present sensitivity analysis results for a selected subset of the key reliability drivers using the model. Reliability drivers identified include: number of engines for the liquid propulsion stage, single-engine total reliability, engine operation duration, engine thrust size, reusability, engine de-rating or up-rating, engine-out design (including engine-out switching reliability, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction), propellant-specific hazards, engine start and cutoff transient hazards, engine combustion cycles, vehicle and engine interface and interaction hazards, engine health management system, engine modification, engine ground-start hold-down with launch commit criteria, engine altitude start (1 in. start), multiple altitude restart (less than 1 restart), component, subsystem and system design, manufacturing/ground operation support/pre- and post-flight checkouts and inspection, and extensiveness of the development program. We present sensitivity analysis results for the following subset of the drivers: number of engines for the propulsion stage, single-engine total reliability, engine operation duration, engine de-rating or up-rating requirements, engine-out design, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction, and engine health management system implementation (basic redlines and more advanced health management systems).
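A hedged illustration of how engine count, engine-out capability and catastrophic fraction interact: in a simple binomial sketch (not the authors' model), stage reliability is the probability that at most the tolerated number of engines fail, with every failure contained (non-catastrophic). All parameter values are placeholders.

```python
from math import comb

def stage_reliability(n_engines, engine_rel, engine_out=0, cat_fraction=0.2):
    """Toy binomial model of stage reliability.

    engine_out: number of benign engine failures the stage can tolerate.
    cat_fraction: fraction of engine failures that are catastrophic
    (an uncontained failure dooms the stage regardless of redundancy).
    """
    p_fail = 1.0 - engine_rel
    p_benign_fail = p_fail * (1.0 - cat_fraction)  # contained failure only
    total = 0.0
    for k in range(engine_out + 1):
        total += comb(n_engines, k) * p_benign_fail**k * engine_rel**(n_engines - k)
    return total

# Four engines at 0.99 reliability: engine-out capability raises stage reliability
no_engine_out = stage_reliability(4, 0.99, engine_out=0)
one_engine_out = stage_reliability(4, 0.99, engine_out=1)
```

With no engine-out capability the stage reliability collapses to engine_rel**n, so the sketch reproduces the baseline case exactly and shows the redundancy benefit shrinking as cat_fraction grows.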

  17. Simulating visibility under reduced acuity and contrast sensitivity.

    PubMed

    Thompson, William B; Legge, Gordon E; Kersten, Daniel J; Shakespeare, Robert A; Lei, Quan

    2017-04-01

    Architects and lighting designers have difficulty designing spaces that are accessible to those with low vision, since the complex nature of most architectural spaces requires a site-specific analysis of the visibility of mobility hazards and key landmarks needed for navigation. We describe a method that can be utilized in the architectural design process for simulating the effects of reduced acuity and contrast on visibility. The key contribution is the development of a way to parameterize the simulation using standard clinical measures of acuity and contrast sensitivity. While these measures are known to be imperfect predictors of visual function, they provide a way of characterizing general levels of visual performance that is familiar to both those working in low vision and our target end-users in the architectural and lighting-design communities. We validate the simulation using a letter-recognition task.

  18. Simulating Visibility Under Reduced Acuity and Contrast Sensitivity

    PubMed Central

    Thompson, William B.; Legge, Gordon E.; Kersten, Daniel J.; Shakespeare, Robert A.; Lei, Quan

    2017-01-01

    Architects and lighting designers have difficulty designing spaces that are accessible to those with low vision, since the complex nature of most architectural spaces requires a site-specific analysis of the visibility of mobility hazards and key landmarks needed for navigation. We describe a method that can be utilized in the architectural design process for simulating the effects of reduced acuity and contrast on visibility. The key contribution is the development of a way to parameterize the simulation using standard clinical measures of acuity and contrast sensitivity. While these measures are known to be imperfect predictors of visual function, they provide a way of characterizing general levels of visual performance that is familiar to both those working in low vision and our target end-users in the architectural and lighting design communities. We validate the simulation using a letter recognition task. PMID:28375328

  19. Mapping and analysis of phosphorylation sites: a quick guide for cell biologists

    PubMed Central

    Dephoure, Noah; Gould, Kathleen L.; Gygi, Steven P.; Kellogg, Douglas R.

    2013-01-01

    A mechanistic understanding of signaling networks requires identification and analysis of phosphorylation sites. Mass spectrometry offers a rapid and highly sensitive approach to mapping phosphorylation sites. However, mass spectrometry has significant limitations that must be considered when planning to carry out phosphorylation-site mapping. Here we provide an overview of key information that should be taken into consideration before beginning phosphorylation-site analysis, as well as a step-by-step guide for carrying out successful experiments. PMID:23447708

  20. Bi-directional exchange of ammonia in a pine forest ecosystem - a model sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Moravek, Alexander; Hrdina, Amy; Murphy, Jennifer

    2016-04-01

    Ammonia (NH3) is a key component in the global nitrogen cycle and of great importance for atmospheric chemistry, neutralizing atmospheric acids and leading to the formation of aerosol particles. For understanding the role of NH3 in both natural and anthropogenically influenced environments, the knowledge of processes regulating its exchange between ecosystems and the atmosphere is essential. A two-layer canopy compensation point model is used to evaluate the NH3 exchange in a pine forest in the Colorado Rocky Mountains. The net flux comprises the NH3 exchange of leaf stomata, its deposition to leaf cuticles and exchange with the forest ground. As key parameters the model uses in-canopy NH3 mixing ratios as well as leaf and soil emission potentials measured at the site in summer 2015. A sensitivity analysis is performed to evaluate the major exchange pathways as well as the model's constraints. In addition, the NH3 exchange is examined for an extended range of environmental conditions, such as droughts or varying concentrations of atmospheric pollutants, in order to investigate their influence on the overall net exchange.
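The canopy compensation point idea can be sketched as a resistor network: the canopy air concentration is the conductance-weighted average of the atmospheric concentration and the stomatal, cuticular and ground-layer compensation points. This is a simplified single-node sketch of the two-layer approach; all resistances and compensation points below are placeholder values, not the measured site parameters.

```python
def canopy_compensation_point(chi_a, chi_s, chi_g, r_a, r_s, r_w, r_g):
    """Solve the steady-state canopy air concentration chi_c (ug m-3).

    Pathways exchanging with the canopy air space:
      atmosphere (chi_a, aerodynamic resistance r_a),
      stomata (chi_s, r_s), cuticle (perfect sink at 0, r_w), ground (chi_g, r_g).
    Mass balance: the fluxes into the canopy air space sum to zero.
    """
    g_a, g_s, g_w, g_g = 1/r_a, 1/r_s, 1/r_w, 1/r_g
    chi_c = (g_a*chi_a + g_s*chi_s + g_w*0.0 + g_g*chi_g) / (g_a + g_s + g_w + g_g)
    # Net ecosystem flux: positive = emission to the atmosphere
    f_net = (chi_c - chi_a) * g_a
    return chi_c, f_net

# Placeholder values: high soil emission potential, modest ambient NH3
chi_c, f_net = canopy_compensation_point(
    chi_a=1.0, chi_s=0.5, chi_g=5.0, r_a=50.0, r_s=200.0, r_w=100.0, r_g=150.0)
```

With a strongly emitting ground layer the canopy air concentration sits above ambient and the net flux is upward; increasing r_w (drier cuticles) or chi_s (higher leaf emission potential) shifts the balance, which is the kind of sensitivity the model analysis explores.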

  1. Exploring variations in functional connectivity of the resting state default mode network in mild traumatic brain injury.

    PubMed

    Nathan, Dominic E; Oakes, Terrence R; Yeh, Ping Hong; French, Louis M; Harper, Jamie F; Liu, Wei; Wolfowitz, Rachel D; Wang, Bin Quan; Graner, John L; Riedy, Gerard

    2015-03-01

    A definitive diagnosis of mild traumatic brain injury (mTBI) is difficult due to the absence of biomarkers in standard clinical imaging. The brain is a complex network of interconnected neurons and subtle changes can modulate key networks of cognitive function. The resting state default mode network (DMN) has been shown to be sensitive to changes induced by pathology. This study seeks to determine whether quantitative measures of the DMN are sensitive in distinguishing mTBI subjects. Resting state functional magnetic resonance imaging data were obtained for healthy (n=12) and mTBI subjects (n=15). DMN maps were computed using dual-regression Independent Component Analysis (ICA). A goodness-of-fit (GOF) index was calculated to assess the degree of spatial specificity and sensitivity between healthy controls and mTBI subjects. DMN regions and neuropsychological assessments were examined to identify potential relationships. The resting state DMN maps indicate an increase in spatial coactivity in mTBI subjects within key regions of the DMN. Significant coactivity within the cerebellum and supplementary motor areas of mTBI subjects were also observed. This has not been previously reported in seed-based resting state network analysis. The GOF suggested the presence of high variability within the mTBI subject group, with poor sensitivity and specificity. The neuropsychological data showed correlations between areas of coactivity within the resting state network in the brain with a number of measures of emotion and cognitive functioning. The poor performance of the GOF highlights the key challenge associated with mTBI injury: the high variability in injury mechanisms and subsequent recovery. However, the quantification of the DMN using dual-regression ICA has potential to distinguish mTBI from healthy subjects, and provide information on the relationship of aspects of cognitive and emotional functioning with their potential neural correlates.
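A goodness-of-fit index of the kind described is commonly computed as the mean component weight inside a network template minus the mean weight outside it. The sketch below uses invented synthetic maps, not the study's imaging data, to show why a DMN-like component scores high and an unrelated component scores near zero.

```python
import numpy as np

def goodness_of_fit(component_map, template_mask):
    """Mean weight inside the template minus mean weight outside it."""
    inside = component_map[template_mask]
    outside = component_map[~template_mask]
    return inside.mean() - outside.mean()

rng = np.random.default_rng(1)
template = np.zeros(1000, dtype=bool)
template[:100] = True                                # "DMN" voxels

good = rng.normal(0.0, 1.0, 1000)
good[:100] += 2.0                                    # component loading on the DMN
poor = rng.normal(0.0, 1.0, 1000)                    # unrelated component

gof_good = goodness_of_fit(good, template)
gof_poor = goodness_of_fit(poor, template)
```

High within-group variability, as reported for the mTBI subjects, would show up here as a wide spread of GOF values across subjects rather than a clean separation of the two groups.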

  2. Technical needs assessment: UWMC's sensitivity analysis guides decision-making.

    PubMed

    Alotis, Michael

    2003-01-01

In today's healthcare market, it is critical for provider institutions to offer the latest and best technological services while remaining fiscally sound. In academic practices, like the University of Washington Medical Center (UWMC), there are the added responsibilities of teaching and research that require a high-tech environment to thrive. These conditions and needs require extensive analysis of not only what equipment to buy, but also when and how it should be acquired. In an organization like the UWMC, which has strategically positioned itself for growth, it is useful to build a sensitivity analysis based on the strategic plan. A common forecasting tool, the sensitivity analysis lays out existing and projected business operations with volume assumptions displayed in layers. Each layer of current and projected activity is plotted over time and placed against a background depicting the capacity of the key modality. Key elements of a sensitivity analysis include necessity, economic assessment, performance, compatibility, reliability, service and training. There are two major triggers that cause us to consider the purchase of new imaging equipment and that determine how to evaluate the equipment we buy. One trigger revolves around our ability to serve patients by seeing them on a timely basis. If we find a significant gap between demand and our capacity to meet it, or anticipate a greater increased demand based upon trends, we begin to consider enhancing that capacity. A second trigger is the release of a breakthrough or substantially improved technology that will clearly have a positive impact on clinical efficacy and efficiency, thereby benefiting the patient. Especially in radiology departments, where many technologies require large expenditures, it is no longer acceptable simply to spend on new and improved technologies. It is necessary to justify them as a strong investment in clinical management and efficacy. There is pressure to provide "proof" at the department level and beyond. By applying sensitivity analysis and other forecasting methods, we are able to spend our resources judiciously in order to get the equipment we need when we need it. This helps ensure that we have efficacious, efficient systems--and enough of them--so that our patients are examined on a timely basis and our clinics run smoothly. It also goes a long way toward making certain that the best equipment is available to our clinicians, researchers, students and patients alike.

  3. Polymorphisms of three genes (ACE, AGT and CYP11B2) in the renin-angiotensin-aldosterone system are not associated with blood pressure salt sensitivity: A systematic meta-analysis.

    PubMed

    Sun, Jiahong; Zhao, Min; Miao, Song; Xi, Bo

    2016-01-01

    Many studies have suggested that polymorphisms of three key genes (ACE, AGT and CYP11B2) in the renin-angiotensin-aldosterone system (RAAS) play important roles in the development of blood pressure (BP) salt sensitivity, but they have revealed inconsistent results. Thus, we performed a meta-analysis to clarify the association. PubMed and Embase databases were searched for eligible published articles. Fixed- or random-effect models were used to pool odds ratios and 95% confidence intervals based on whether there was significant heterogeneity between studies. In total, seven studies [237 salt-sensitive (SS) cases and 251 salt-resistant (SR) controls] for ACE gene I/D polymorphism, three studies (130 SS cases and 221 SR controls) for AGT gene M235T polymorphism and three studies (113 SS cases and 218 SR controls) for CYP11B2 gene C344T polymorphism were included in this meta-analysis. The results showed that there was no significant association between polymorphisms of these three polymorphisms in the RAAS and BP salt sensitivity under three genetic models (all p > 0.05). The meta-analysis suggested that three polymorphisms (ACE gene I/D, AGT gene M235T, CYP11B2 gene C344T) in the RAAS have no significant effect on BP salt sensitivity.
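The fixed- versus random-effect choice described above follows a standard recipe: pool log odds ratios by inverse-variance weights and assess heterogeneity with Cochran's Q and I². A generic sketch with made-up study values (not the seven included studies):

```python
import math

def pool_fixed(log_ors, ses):
    """Inverse-variance fixed-effect pooling of log odds ratios."""
    w = [1.0 / se**2 for se in ses]
    pooled = sum(wi * lo for wi, lo in zip(w, log_ors)) / sum(w)
    se_pooled = 1.0 / math.sqrt(sum(w))
    # Cochran's Q and I^2 quantify between-study heterogeneity;
    # large I^2 would motivate a random-effect model instead
    q = sum(wi * (lo - pooled)**2 for wi, lo in zip(w, log_ors))
    df = len(log_ors) - 1
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci, q, i2

# Hypothetical per-study ORs near 1 (no association, as the meta-analysis found)
log_ors = [math.log(x) for x in (1.10, 0.95, 1.05)]
ses = [0.30, 0.25, 0.40]
or_pooled, ci, q, i2 = pool_fixed(log_ors, ses)
```

A 95% confidence interval that crosses 1, as here, corresponds to the paper's "no significant association" conclusion.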

  4. Sensitivity analysis of navy aviation readiness based sparing model

    DTIC Science & Technology

    2017-09-01

variability. (See Figure 4, a research design flowchart.) Figure 4 lays out the four steps of the methodology, starting in the upper left-hand...as a function of changes in key inputs. We develop NAVARM Experimental Designs (NED), a computational tool created by applying a state-of-the-art...experimental design to the NAVARM model. Statistical analysis of the resulting data identifies the most influential cost factors. Those are, in order of

  5. Sensitivity Analysis of an ENteric Immunity SImulator (ENISI)-Based Model of Immune Responses to Helicobacter pylori Infection

    PubMed Central

    Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav

    2015-01-01

    Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290

  6. Sensitivity Analysis of an ENteric Immunity SImulator (ENISI)-Based Model of Immune Responses to Helicobacter pylori Infection.

    PubMed

    Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav

    2015-01-01

    Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close "neighborhood" of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa.
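The local, one-parameter-at-a-time analysis that the authors contrast with their global approach amounts to finite-difference derivatives around a nominal parameter setting. A generic sketch (the model function is a placeholder, not ENISI):

```python
def local_sensitivity(model, nominal, rel_step=0.01):
    """One-at-a-time normalized sensitivities (elasticities) around `nominal`."""
    base = model(nominal)
    sens = []
    for i, p in enumerate(nominal):
        perturbed = list(nominal)
        perturbed[i] = p * (1.0 + rel_step)          # nudge one parameter only
        dy = model(perturbed) - base
        sens.append(dy / (base * rel_step))          # % output change per % input change
    return sens

# Placeholder response: "bacterial load" rises with a colonization rate
# and falls with an immune clearance rate
model = lambda p: p[0] ** 2 / p[1]
elasticities = local_sensitivity(model, [2.0, 4.0])
```

For this function the exact elasticities are +2 and -1. The limitation the abstract points out is visible here: the answer is tied to the chosen nominal point and ignores interactions, which is why the authors move to a global design.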

  7. [Spatial heterogeneity and classified control of agricultural non-point source pollution in Huaihe River Basin].

    PubMed

    Zhou, Liang; Xu, Jian-Gang; Sun, Dong-Qi; Ni, Tian-Hua

    2013-02-01

Agricultural non-point source pollution is an important driver of river deterioration, and identifying and intensively controlling the key source areas is the most effective approach to its control. This study adopts the inventory method to analyze four kinds of pollution sources (livestock breeding, rural life, farmland cultivation and aquaculture) and their emission intensities of chemical oxygen demand (COD), total nitrogen (TN) and total phosphorus (TP) in 173 counties (cities, districts) in the Huaihe River Basin. The paper mainly addresses the identification of sensitive areas of non-point source pollution, the key pollution sources and their spatial distribution characteristics through cluster analysis, sensitivity evaluation and spatial analysis, carried out with a geographic information system (GIS) and SPSS. The results show that the COD, TN and TP emissions of agricultural non-point sources in the Huaihe River Basin in 2009 were 206.74 x 10^4 t, 66.49 x 10^4 t and 8.74 x 10^4 t, respectively; the emission intensities were 7.69, 2.47 and 0.32 t/hm^2; and the proportions of COD, TN and TP emissions were 73%, 24% and 3%. The major pollution sources of COD, TN and TP were livestock breeding and rural life. The sensitive areas and priority pollution control areas of non-point source pollution are sub-basins of the upper branches of the Huaihe River, such as the Shahe, Yinghe, Beiru, Jialu and Qingyi Rivers, and livestock breeding is the key pollution source in those priority control areas. Finally, the paper concludes that rural life has the highest pollution contribution rate, while comprehensive pollution is the type that is hardest to control.

  8. Recognition of Bread Key Odorants by Using Polymer Coated QCMs

    NASA Astrophysics Data System (ADS)

    Nakai, Takashi; Kouno, Shinji; Hiruma, Naoya; Shuzo, Masaki; Delaunay, Jean-Jacques; Yamada, Ichiro

Polyisobutylene (PIB) polymer and methylphenylsiloxane (25%) diphenylsiloxane (75%) copolymer (OV25) were coated on Quartz Crystal Microbalance (QCM) sensors and used to recognize bread key odorants. Representative key roasty odorants of bread were taken as 3-acetylpyridine and benzaldehyde, and representative key fatty odorants were hexanal and (E)-2-nonenal. Both the OV25- and PIB-coated QCM sensors could detect concentrations as low as 0.9 ppm of 3-acetylpyridine and 1.2 ppm of (E)-2-nonenal. The sensitivity of the OV25-coated QCM to 3-acetylpyridine was about 1000 times higher than its sensitivity to ethanol, the major interfering compound in bread key odorant analysis. Further, the OV25-coated QCM response was 5-6 times larger than that of the PIB-coated QCM when exposed to roasty odorants and 2-3 times larger when exposed to fatty odorants. This difference in sensitivity between the OV25- and PIB-coated QCMs made it possible to discriminate roasty from fatty odorants, as evidenced by the odor recognition map plotting the frequency shifts of the OV25-coated QCM against those of the PIB-coated QCM. In conclusion, we found that the combination of an OV25-coated QCM and a PIB-coated QCM successfully discriminated roasty odorants from fatty odorants at the ppm level.
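QCM frequency shifts like those above are conventionally related to adsorbed mass through the Sauerbrey equation. A sketch with standard quartz constants; the crystal frequency and mass loading are illustrative assumptions, not values from the paper:

```python
RHO_Q = 2648.0    # quartz density, kg/m^3
MU_Q = 2.947e10   # quartz shear modulus, Pa

def sauerbrey_shift(f0_hz, delta_mass_kg, area_m2):
    """Frequency shift (Hz) of an AT-cut QCM for a thin, rigid mass load."""
    return -2.0 * f0_hz**2 * delta_mass_kg / (area_m2 * (RHO_Q * MU_Q) ** 0.5)

# A 5 MHz crystal loaded with 1 ug spread over 1 cm^2
shift = sauerbrey_shift(5e6, 1e-9, 1e-4)
```

For a 5 MHz crystal this gives the familiar sensitivity of roughly 57 Hz per ug/cm^2, which is why ppm-level odorant uptake into the polymer films produces measurable shifts.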

  9. Pushing the limits: Quantification of chromophores in real-world paper samples by GC-ECD and EI-GC-MS.

    PubMed

    Schedl, A; Zweckmair, T; Kikul, F; Bacher, M; Rosenau, T; Potthast, A

    2018-03-01

Widening the methodology of chromophore analysis in pulp and paper science, a sensitive gas-chromatographic approach with electron-capture detection is presented and applied to model samples and real-world historic paper material. Trifluoroacetic anhydride was used for derivatization of the chromophore target compounds. The derivative formation was confirmed by NMR and accurate mass analysis. The method successfully detects and quantifies hydroxyquinones, which are key chromophores in cellulosic matrices. The analytical figures of merit appeared to be in an acceptable range, with an LOD down to approx. 60 ng/g for each key chromophore, which allows for their successful detection in historic sample material. Copyright © 2017 Elsevier B.V. All rights reserved.
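Detection limits of the kind reported (LOD down to approx. 60 ng/g) are commonly estimated from a calibration line as 3.3 times the residual standard deviation over the slope. A generic sketch with invented calibration points, not the paper's data:

```python
import numpy as np

def lod_from_calibration(conc, signal):
    """LOD = 3.3 * s_residual / slope from a linear calibration curve."""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s_res = np.std(residuals, ddof=2)  # two fitted parameters
    return 3.3 * s_res / slope

conc = np.array([50.0, 100.0, 200.0, 400.0, 800.0])  # ng/g (hypothetical)
signal = np.array([12.1, 24.5, 48.2, 97.9, 195.4])   # detector response (hypothetical)
lod = lod_from_calibration(conc, signal)
```

The factor 3.3 corresponds to the ICH-style signal-to-noise criterion; a factor of 10 in the same expression would give the limit of quantification.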

  10. Reducing the overlay metrology sensitivity to perturbations of the measurement stack

    NASA Astrophysics Data System (ADS)

    Zhou, Yue; Park, DeNeil; Gutjahr, Karsten; Gottipati, Abhishek; Vuong, Tam; Bae, Sung Yong; Stokes, Nicholas; Jiang, Aiqin; Hsu, Po Ya; O'Mahony, Mark; Donini, Andrea; Visser, Bart; de Ruiter, Chris; Grzela, Grzegorz; van der Laan, Hans; Jak, Martin; Izikson, Pavel; Morgan, Stephen

    2017-03-01

    Overlay metrology setup today faces a continuously changing landscape of process steps. During Diffraction Based Overlay (DBO) metrology setup, many different metrology target designs are evaluated in order to cover the full process window. The standard method for overlay metrology setup consists of single-wafer optimization in which the performance of all available metrology targets is evaluated. Without the availability of external reference data or multiwafer measurements it is hard to predict the metrology accuracy and robustness against process variations which naturally occur from wafer-to-wafer and lot-to-lot. In this paper, the capabilities of the Holistic Metrology Qualification (HMQ) setup flow are outlined, in particular with respect to overlay metrology accuracy and process robustness. The significance of robustness and its impact on overlay measurements is discussed using multiple examples. Measurement differences caused by slight stack variations across the target area, called grating imbalance, are shown to cause significant errors in the overlay calculation in case the recipe and target have not been selected properly. To this point, an overlay sensitivity check on perturbations of the measurement stack is presented for improvement of the overlay metrology setup flow. An extensive analysis on Key Performance Indicators (KPIs) from HMQ recipe optimization is performed on µDBO measurements of product wafers. The key parameters describing the sensitivity to perturbations of the measurement stack are based on an intra-target analysis. Using advanced image analysis, which is only possible for image plane detection of μDBO instead of pupil plane detection of DBO, the process robustness performance of a recipe can be determined. Intra-target analysis can be applied for a wide range of applications, independent of layers and devices.
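The core DBO calculation that grating imbalance perturbs can be sketched as follows. With two targets biased by +d and -d and a linear relation A = K(OV + bias) between measured intensity asymmetry and overlay, the self-referencing estimate OV = d(A+ + A-)/(A+ - A-) cancels the unknown stack-dependent factor K. The values below are illustrative assumptions:

```python
def dbo_overlay(asym_plus, asym_minus, bias_nm):
    """Self-referencing DBO overlay from the asymmetries of two biased targets.

    Assumes A = K * (OV + bias); the unknown proportionality K cancels.
    """
    return bias_nm * (asym_plus + asym_minus) / (asym_plus - asym_minus)

# Ideal stack: K = 0.1 per nm, true overlay 2 nm, target bias 20 nm
K, true_ov, d = 0.1, 2.0, 20.0
a_plus, a_minus = K * (true_ov + d), K * (true_ov - d)
ov = dbo_overlay(a_plus, a_minus, d)

# Grating imbalance: a stack variation perturbs one asymmetry by 5%,
# breaking the cancellation and biasing the overlay estimate
ov_biased = dbo_overlay(1.05 * a_plus, a_minus, d)
```

The perturbed case shows why a recipe/target combination that is sensitive to small stack variations produces significant overlay errors, which is what the intra-target robustness KPIs are designed to flag.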

  11. Secure chaotic transmission of electrocardiography signals with acousto-optic modulation under profiled beam propagation.

    PubMed

    Almehmadi, Fares S; Chatterjee, Monish R

    2015-01-10

    Electrocardiography (ECG) signals are used for both medical purposes and identifying individuals. It is often necessary to encrypt this highly sensitive information before it is transmitted over any channel. A closed-loop acousto-optic hybrid device acting as a chaotic modulator is applied to ECG signals to achieve this encryption. Recently improved modeling of this approach using profiled optical beams has shown it to be very sensitive to key parameters that characterize the encryption and decryption process, exhibiting its potential for secure transmission of analog and digital signals. Here the encryption and decryption is demonstrated for ECG signals, both analog and digital versions, illustrating strong encryption without significant distortion. Performance analysis pertinent to both analog and digital transmission of the ECG waveform is also carried out using output signal-to-noise, signal-to-distortion, and bit-error-rate measures relative to the key parameters and presence of channel noise in the system.
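As a generic illustration of chaotic masking and key sensitivity (a logistic-map stand-in, not the authors' acousto-optic hybrid device), a keystream generated from a shared key can mask a sampled waveform, and decryption with a key off by one part in ten million fails:

```python
def logistic_keystream(x0, r, n):
    """Chaotic keystream from the logistic map x <- r*x*(1-x)."""
    ks, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        ks.append(x)
    return ks

def mask(signal, key):
    x0, r = key
    return [s + k for s, k in zip(signal, logistic_keystream(x0, r, len(signal)))]

def unmask(cipher, key):
    x0, r = key
    return [c - k for c, k in zip(cipher, logistic_keystream(x0, r, len(cipher)))]

# A toy sampled "ECG" segment (40 samples, hypothetical values)
ecg = [0.0, 0.2, 1.0, -0.4, 0.1] * 8
key = (0.3141592, 3.99)
cipher = mask(ecg, key)
recovered = unmask(cipher, key)
wrong = unmask(cipher, (0.3141593, 3.99))  # initial condition off by 1e-7
```

Because nearby chaotic orbits diverge exponentially, the slightly wrong key produces a badly corrupted waveform within a few dozen samples, mirroring the key-parameter sensitivity the paper reports for its encryption/decryption process.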

  12. Using Linked Electronic Health Records to Estimate Healthcare Costs: Key Challenges and Opportunities.

    PubMed

    Asaria, Miqdad; Grasic, Katja; Walker, Simon

    2016-02-01

    This paper discusses key challenges and opportunities that arise when using linked electronic health records (EHR) in health economics and outcomes research (HEOR), with a particular focus on estimating healthcare costs. These challenges and opportunities are framed in the context of a case study modelling the costs of stable coronary artery disease in England. The challenges and opportunities discussed fall broadly into the categories of (1) handling and organising data of this size and sensitivity; (2) extracting clinical endpoints from datasets that have not been designed and collected with such endpoints in mind; and (3) the principles and practice of costing resource use from routinely collected data. We find that there are a number of new challenges and opportunities that arise when working with EHR compared with more traditional sources of data for HEOR. These call for greater clinician involvement and intelligent use of sensitivity analysis.

  13. A highly sensitive and accurate gene expression analysis by sequencing ("bead-seq") for a single cell.

    PubMed

    Matsunaga, Hiroko; Goto, Mari; Arikawa, Koji; Shirai, Masataka; Tsunoda, Hiroyuki; Huang, Huan; Kambara, Hideki

    2015-02-15

    Analyses of gene expressions in single cells are important for understanding detailed biological phenomena. Here, a highly sensitive and accurate method by sequencing (called "bead-seq") to obtain a whole gene expression profile for a single cell is proposed. A key feature of the method is to use a complementary DNA (cDNA) library on magnetic beads, which enables adding washing steps to remove residual reagents in a sample preparation process. By adding the washing steps, the next steps can be carried out under the optimal conditions without losing cDNAs. Error sources were carefully evaluated to conclude that the first several steps were the key steps. It is demonstrated that bead-seq is superior to the conventional methods for single-cell gene expression analyses in terms of reproducibility, quantitative accuracy, and biases caused during sample preparation and sequencing processes. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinthavali, Madhu Sudhan; Wang, Zhiqiang

This paper presents a detailed parametric sensitivity analysis for a wireless power transfer (WPT) system in an electric vehicle application. Specifically, several key parameters for sensitivity analysis of a series-parallel (SP) WPT system are derived first based on an analytical modeling approach, including the equivalent input impedance, active/reactive power, and DC voltage gain. Based on the derivation, the impact of primary-side compensation capacitance, coupling coefficient, transformer leakage inductance, and different load conditions on the DC voltage gain curve and power curve are studied and analyzed. It is shown that the desired power can be achieved by changing just the frequency or the voltage, depending on the design value of the coupling coefficient. However, in some cases both have to be modified in order to achieve the required power transfer.
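The quantities the paper derives (equivalent input impedance, power, output voltage of a series-parallel network) can be sketched with complex phasor arithmetic. This is a simplified lossless circuit sketch, and all component values below are arbitrary placeholders, not the paper's design:

```python
import math

def sp_wpt(v_in, f, L1, L2, k, C1, C2, R_load):
    """Series-compensated primary, parallel-compensated secondary WPT link.

    Returns (equivalent input impedance, real input power, |output voltage|).
    """
    w = 2 * math.pi * f
    M = k * math.sqrt(L1 * L2)                 # mutual inductance
    # Secondary loop: L2 in series with (C2 parallel R_load)
    z_par = 1.0 / (1.0 / R_load + 1j * w * C2)
    z_sec = 1j * w * L2 + z_par
    # Impedance reflected into the primary through the mutual coupling
    z_ref = (w * M) ** 2 / z_sec
    z_in = 1j * w * L1 + 1.0 / (1j * w * C1) + z_ref
    i1 = v_in / z_in
    p_in = (v_in * i1.conjugate()).real        # real power drawn from the source
    i2 = 1j * w * M * i1 / z_sec               # induced secondary-loop current
    v_out = abs(i2 * z_par)
    return z_in, p_in, v_out

# Placeholder design: both sides tuned near 85 kHz, loose coupling k = 0.2
z_in, p_in, v_out = sp_wpt(v_in=100.0, f=85e3, L1=120e-6, L2=120e-6,
                           k=0.2, C1=29.2e-9, C2=29.2e-9, R_load=10.0)
```

Sweeping f, k or R_load in this sketch reproduces the qualitative behavior the paper analyzes: the power and voltage-gain curves shift with coupling, so the operating frequency, the input voltage, or both must be adjusted to hit a power target.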

  15. Constitutive BAK activation as a determinant of drug sensitivity in malignant lymphohematopoietic cells.

    PubMed

    Dai, Haiming; Ding, Husheng; Meng, X Wei; Peterson, Kevin L; Schneider, Paula A; Karp, Judith E; Kaufmann, Scott H

    2015-10-15

    Mitochondrial outer membrane permeabilization (MOMP), a key step in the intrinsic apoptotic pathway, is incompletely understood. Current models emphasize the role of BH3-only BCL2 family members in BAX and BAK activation. Here we demonstrate concentration-dependent BAK autoactivation under cell-free conditions and provide evidence that this autoactivation plays a key role in regulating the intrinsic apoptotic pathway in intact cells. In particular, we show that up to 80% of BAK (but not BAX) in lymphohematopoietic cell lines is oligomerized and bound to anti-apoptotic BCL2 family members in the absence of exogenous death stimuli. The extent of this constitutive BAK oligomerization is diminished by BAK knockdown and unaffected by BIM or PUMA down-regulation. Further analysis indicates that sensitivity of cells to BH3 mimetics reflects the identity of the anti-apoptotic proteins to which BAK is constitutively bound, with extensive BCLXL•BAK complexes predicting navitoclax sensitivity, and extensive MCL1•BAK complexes predicting A1210477 sensitivity. Moreover, high BAK expression correlates with sensitivity of clinical acute myelogenous leukemia to chemotherapy, whereas low BAK levels correlate with resistance and relapse. Collectively, these results inform current understanding of MOMP and provide new insight into the ability of BH3 mimetics to induce apoptosis without directly activating BAX or BAK. © 2015 Dai et al.; Published by Cold Spring Harbor Laboratory Press.

  16. Subject-specific finite element modelling of the human foot complex during walking: sensitivity analysis of material properties, boundary and loading conditions.

    PubMed

    Akrami, Mohammad; Qian, Zhihui; Zou, Zhemin; Howard, David; Nester, Chris J; Ren, Lei

    2018-04-01

    The objective of this study was to develop and validate a subject-specific framework for modelling the human foot. This was achieved by integrating medical image-based finite element modelling, individualised multi-body musculoskeletal modelling and 3D gait measurements. A 3D ankle-foot finite element model comprising all major foot structures was constructed based on MRI of one individual. A multi-body musculoskeletal model and 3D gait measurements for the same subject were used to define loading and boundary conditions. Sensitivity analyses were used to investigate the effects of key modelling parameters on model predictions. Prediction errors of average and peak plantar pressures were below 10% in all ten plantar regions at five key gait events with only one exception (lateral heel, in early stance, error of 14.44%). The sensitivity analyses results suggest that predictions of peak plantar pressures are moderately sensitive to material properties, ground reaction forces and muscle forces, and significantly sensitive to foot orientation. The maximum region-specific percentage change ratios (peak stress percentage change over parameter percentage change) were 1.935-2.258 for ground reaction forces, 1.528-2.727 for plantar flexor muscles and 4.84-11.37 for foot orientations. This strongly suggests that loading and boundary conditions need to be very carefully defined based on personalised measurement data.
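The region-specific percentage change ratio reported above (peak stress percentage change over parameter percentage change) is simple to compute; the values below are hypothetical, chosen only to illustrate the calculation.

```python
def percent_change_ratio(baseline_output, perturbed_output,
                         baseline_param, perturbed_param):
    """Sensitivity ratio: % change in peak stress per % change in a parameter."""
    d_out = (perturbed_output - baseline_output) / baseline_output * 100
    d_par = (perturbed_param - baseline_param) / baseline_param * 100
    return d_out / d_par

# hypothetical: a 10% increase in a muscle force raised peak stress by 20%
ratio = percent_change_ratio(110.0, 132.0, 100.0, 110.0)  # → 2.0
```

A ratio well above 1, as found here for foot orientation (4.84-11.37), flags an input that must be defined from personalised measurement data.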

  17. A Pilot Study on Developing a Standardized and Sensitive School Violence Risk Assessment with Manual Annotation.

    PubMed

    Barzman, Drew H; Ni, Yizhao; Griffey, Marcus; Patel, Bianca; Warren, Ashaki; Latessa, Edward; Sorter, Michael

    2017-09-01

School violence has increased over the past decade, and innovative, sensitive, and standardized approaches to assessing school violence risk are needed. In this feasibility study, we piloted a standardized, sensitive, and rapid school violence risk approach based on manual annotation. Manual annotation is the process of analyzing a student's transcribed interview to extract information (e.g., key words) relevant to school violence risk levels and associated with students' behaviors, attitudes, feelings, use of technology (social media and video games), and other activities. We first implemented school violence risk assessments, interviewing each student and parent separately at the school or the hospital to complete our novel school safety scales. We completed 25 risk assessments, yielding 25 transcribed interviews of 12-18 year olds from 15 schools in Ohio and Kentucky. We then analyzed structured professional judgments, language, and patterns associated with school violence risk levels using manual annotation and statistical methodology. To analyze the student interviews, we began developing an annotation guideline for extracting key information associated with students' behaviors, attitudes, feelings, use of technology, and other activities. Statistical analysis was applied to associate the significant categories with students' risk levels and to identify key factors that will help in developing action steps to reduce risk. In a future study, we plan to recruit more subjects in order to fully develop the manual annotation, resulting in a more standardized and sensitive approach to school violence assessment.
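The key-word counting step of manual annotation can be sketched as below. The categories and lexicon are hypothetical examples, not the study's actual annotation guideline.

```python
def annotate(transcript: str, lexicon: dict) -> dict:
    """Count lexicon hits per risk-relevant category in a transcribed interview."""
    words = transcript.lower().split()
    return {category: sum(words.count(w) for w in keywords)
            for category, keywords in lexicon.items()}

# hypothetical two-category lexicon
lexicon = {"technology": ["games", "online"], "aggression": ["violent", "fight"]}
counts = annotate("I play violent games online", lexicon)  # → {"technology": 2, "aggression": 1}
```

Category counts like these could then feed the statistical association with assessed risk levels described in the record.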

  18. A more secure parallel keyed hash function based on chaotic neural network

    NASA Astrophysics Data System (ADS)

    Huang, Zhongquan

    2011-08-01

Although various hash functions based on chaos or chaotic neural networks have been proposed, most of them cannot work efficiently in a parallel computing environment. Recently, an algorithm for parallel keyed hash function construction based on a chaotic neural network was proposed [13]. However, this scheme has a strict limitation: its secret keys must be nonces. In other words, if a key is used more than once, a potential security flaw arises. In this paper, we analyze the cause of this vulnerability in detail and then propose corresponding enhancements that remove the limitation on the secret keys. Theoretical analysis and computer simulation indicate that the modified hash function is more secure and practical than the original one, while keeping its parallel merit and satisfying the other performance requirements of a hash function, such as good statistical properties, high message and key sensitivity, and strong collision resistance.
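The key sensitivity property mentioned above can be illustrated with a toy keyed hash driven by the logistic map. This sketch is not the parallel chaotic-neural-network construction from the paper and is not cryptographically secure; it only shows how chaotic divergence makes the digest sensitive to tiny key changes.

```python
def chaotic_hash(message: bytes, key: float, rounds: int = 64) -> str:
    """Toy keyed hash driven by the logistic map x <- 4x(1-x).
    Key sensitivity comes from chaotic divergence of nearby trajectories."""
    x = key                                   # secret key: a float in (0, 1)
    digest = 0
    for byte in message:
        x = (x + byte / 256.0) % 1.0 or 0.5   # fold the message byte into the state
        for _ in range(rounds):
            x = 4.0 * x * (1.0 - x)           # iterate the chaotic map
        digest = (digest * 257 + int(x * 2 ** 32)) % 2 ** 128
    return f"{digest:032x}"
```

A key perturbation of 1e-10 is amplified by roughly a factor of two per iteration, so the two digests decorrelate completely.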

  19. An Energy Efficient Mutual Authentication and Key Agreement Scheme Preserving Anonymity for Wireless Sensor Networks.

    PubMed

    Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian

    2016-06-08

WSNs (wireless sensor networks) are nowadays viewed as a vital portion of the IoT (Internet of Things). Security is a significant issue in WSNs, especially in resource-constrained environments. AKA (authentication and key agreement) enhances the security of WSNs against adversaries attempting to obtain sensitive sensor data. Various AKA schemes have been developed for verifying the legitimate users of a WSN. We first scrutinize Amin and Biswas's recent scheme and demonstrate major security loopholes in their work. Next, we propose a lightweight AKA scheme using smart-card-based symmetric key cryptography, which is resilient against well-known security attacks. Furthermore, we prove under BAN (Burrows, Abadi and Needham) logic that the scheme securely accomplishes mutual handshake and session key agreement between the participants involved. Moreover, formal security analysis and simulations are conducted using AVISPA (Automated Validation of Internet Security Protocols and Applications) to show that our scheme is secure against active and passive attacks. Additionally, performance analysis shows that our proposed scheme is secure and efficient for resource-constrained WSNs.
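The general flavor of such a symmetric-key mutual authentication handshake can be sketched as below. This is a generic nonce-exchange pattern, not the specific scheme proposed in the paper; the identities, field separator, and key sizes are illustrative assumptions.

```python
import hashlib
import hmac
import os

def tag(key: bytes, *parts: bytes) -> bytes:
    """MAC over the concatenated protocol fields."""
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

def handshake(key_user: bytes, key_sensor: bytes):
    """Mutual authentication via nonce exchange; each side proves knowledge
    of the shared key. Returns a session key, or None on failure."""
    n_u, n_s = os.urandom(16), os.urandom(16)          # nonces, sent in the clear
    proof_u = tag(key_user, b"user", n_u, n_s)         # user -> sensor
    if not hmac.compare_digest(proof_u, tag(key_sensor, b"user", n_u, n_s)):
        return None                                    # sensor rejects the user
    proof_s = tag(key_sensor, b"sensor", n_s, n_u)     # sensor -> user
    if not hmac.compare_digest(proof_s, tag(key_user, b"sensor", n_s, n_u)):
        return None                                    # user rejects the sensor
    # both sides can now derive the same session key from the exchanged nonces
    return tag(key_user, b"session", n_u, n_s)
```

Fresh nonces on both sides prevent straight replay, and `hmac.compare_digest` avoids timing leaks during verification.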

  20. An Energy Efficient Mutual Authentication and Key Agreement Scheme Preserving Anonymity for Wireless Sensor Networks

    PubMed Central

    Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian

    2016-01-01

WSNs (wireless sensor networks) are nowadays viewed as a vital portion of the IoT (Internet of Things). Security is a significant issue in WSNs, especially in resource-constrained environments. AKA (authentication and key agreement) enhances the security of WSNs against adversaries attempting to obtain sensitive sensor data. Various AKA schemes have been developed for verifying the legitimate users of a WSN. We first scrutinize Amin and Biswas's recent scheme and demonstrate major security loopholes in their work. Next, we propose a lightweight AKA scheme using smart-card-based symmetric key cryptography, which is resilient against well-known security attacks. Furthermore, we prove under BAN (Burrows, Abadi and Needham) logic that the scheme securely accomplishes mutual handshake and session key agreement between the participants involved. Moreover, formal security analysis and simulations are conducted using AVISPA (Automated Validation of Internet Security Protocols and Applications) to show that our scheme is secure against active and passive attacks. Additionally, performance analysis shows that our proposed scheme is secure and efficient for resource-constrained WSNs. PMID:27338382

  1. Tight finite-key analysis for quantum cryptography

    PubMed Central

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-01

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies. PMID:22252558

  2. Tight finite-key analysis for quantum cryptography.

    PubMed

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-17

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies.

  3. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  4. Analysis of multiple cell upset sensitivity in bulk CMOS SRAM after neutron irradiation

    NASA Astrophysics Data System (ADS)

    Pan, Xiaoyu; Guo, Hongxia; Luo, Yinhong; Zhang, Fengqi; Ding, Lili

    2018-03-01

In our previous studies, we showed that neutron irradiation can decrease the single event latch-up (SEL) sensitivity of CMOS SRAM. Because parasitic bipolar amplification is one of the key contributors to multiple cell upset (MCU), this led us to study the impact of neutron irradiation on the SRAM's MCU sensitivity. After the neutron experiment, we tested the devices' function and electrical parameters. We then used heavy ion irradiation to examine the changes in the devices' MCU sensitivity pre- and post-neutron-irradiation. Unfortunately, neutron irradiation makes the MCU phenomenon worse. Finally, we used electrostatic discharge (ESD) testing to interpret the experimental results and found that, for the 90 nm process, changes in the WPM region dominate over changes in the parasitic bipolar amplification.

  5. Neurolysin Knockout Mice Generation and Initial Phenotype Characterization*

    PubMed Central

    Cavalcanti, Diogo M. L. P.; Castro, Leandro M.; Rosa Neto, José C.; Seelaender, Marilia; Neves, Rodrigo X.; Oliveira, Vitor; Forti, Fábio L.; Iwai, Leo K.; Gozzo, Fabio C.; Todiras, Mihail; Schadock, Ines; Barros, Carlos C.; Bader, Michael; Ferro, Emer S.

    2014-01-01

The oligopeptidase neurolysin (EC 3.4.24.16; Nln) was first identified in rat brain synaptic membranes and shown to participate ubiquitously in the catabolism of bioactive peptides such as neurotensin and bradykinin. Recently, it was suggested that Nln reduction could improve insulin sensitivity. Here, we have shown that Nln KO mice have increased glucose tolerance, insulin sensitivity, and gluconeogenesis. KO mice have increased liver mRNA for several genes related to gluconeogenesis. Isotopic-label semiquantitative peptidomic analysis suggests an increase in specific intracellular peptides in gastrocnemius and epididymal adipose tissue, which are likely involved in the increased glucose tolerance and insulin sensitivity in the KO mice. These results suggest the exciting new possibility that Nln is a key enzyme for energy metabolism and could be a novel therapeutic target to improve glucose uptake and insulin sensitivity. PMID:24719317

  6. Cost analysis of open radical cystectomy versus robot-assisted radical cystectomy.

    PubMed

    Bansal, Sukhchain S; Dogra, Tara; Smith, Peter W; Amran, Maisarah; Auluck, Ishna; Bhambra, Maninder; Sura, Manraj S; Rowe, Edward; Koupparis, Anthony

    2018-03-01

To perform a cost analysis comparing the cost of robot-assisted radical cystectomy (RARC) with open RC (ORC) in a UK tertiary referral centre and to identify the key cost drivers. Data on hospital length of stay (LOS), operative time (OT), transfusion rate and volume, and complication rate were obtained from a prospectively updated institutional database for patients undergoing RARC or ORC. A cost decision tree model was created. Sensitivity analysis was performed to find key drivers of overall cost and to find breakeven points with ORC. Monte Carlo analysis was performed to quantify the variability in the dataset. One RARC procedure costs £12 449.87, or £12 106.12 if the robot was donated via charitable funds. In comparison, one ORC procedure costs £10 474.54. RARC is 18.9% more expensive than ORC. The key cost drivers were OT, LOS, and the number of cases performed per annum. High ongoing equipment costs remain a large barrier to the cost of RARC falling. However, minimal improvements in patient quality of life would be required to offset this difference. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.
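The caseload dependence identified above (per-case cost falls as annual volume rises, because fixed equipment costs are spread over more procedures) can be sketched with a simple breakeven calculation. The fixed and variable cost figures below are hypothetical; only the ORC per-case cost is taken from the record.

```python
import math

def per_case_cost(fixed_annual: float, variable_per_case: float, cases: int) -> float:
    """Per-procedure cost once fixed equipment costs are spread over annual volume."""
    return fixed_annual / cases + variable_per_case

def breakeven_cases(fixed_annual: float, variable_rarc: float, orc_cost: float):
    """Smallest annual caseload at which RARC matches the ORC per-case cost."""
    if variable_rarc >= orc_cost:
        return None                      # RARC can never break even
    return math.ceil(fixed_annual / (orc_cost - variable_rarc))

# hypothetical: £140k/yr fixed robot costs, £9k variable RARC cost, ORC at £10,474.54
n = breakeven_cases(140_000, 9_000, 10_474.54)  # → 95 cases per annum
```

Sweeping the inputs in this way is the essence of the sensitivity analysis the record describes for identifying key cost drivers.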

  7. Quantitation of cytokine mRNA expression as an endpoint for prediction and diagnosis of xenobiotic-induced hypersensitivity reactions.

    PubMed

    Gaspard, I; Kerdine, S; Pallardy, M; Lebrec, H

    1999-09-01

    Xenobiotic-induced hypersensitivity reactions are immune-mediated effects that involve specific antibodies and/or effector and regulatory T lymphocytes. Cytokines are key mediators of such responses and must be considered as possible endpoints for predicting sensitizing potency of drugs and chemicals, as well as for helping diagnosis of allergy. Detecting cytokine production at the protein level has been shown to not be always sensitive enough. This paper describes three examples of the utilization of semiquantitative or competitive reverse transcription polymerase chain reaction analysis of interleukin-4, interferon gamma, and interleukin-1beta mRNAs as endpoints for assessing T-cell or dendritic cell responses to sensitizing drugs (beta-lactam antibiotics) or chemicals (dinitrochlorobenzene). Copyright 1999 Academic Press.

  8. @NWTC Newsletter: Summer 2014 | Wind | NREL

    Science.gov Websites

    Newsletter highlights include: Boosting Wind Plant Power Output by 4%-5% through Coordinated Turbine Control; Wind Farm Wake Models (Part 2); New Framework Transforms FAST Wind Turbine Modeling Tool (Fact Sheet); and Sensitivity Analysis of Wind Plant Performance to Key Turbine Design Parameters: A Systems Engineering Approach.

  9. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  10. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
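The quantitative standardization step described above commonly uses a stable-isotope-labeled internal standard; a minimal sketch of ratio-based quantitation follows (function name, field names, and units are illustrative assumptions).

```python
def quantify_fmol(area_light: float, area_heavy: float, spiked_heavy_fmol: float) -> float:
    """Isotope-dilution quantitation: the analyte ('light') amount follows from
    the light/heavy peak-area ratio and the known spiked standard amount."""
    return area_light / area_heavy * spiked_heavy_fmol

# analyte peak twice the area of 50 fmol of labeled standard → 100 fmol analyte
amount = quantify_fmol(2.0e6, 1.0e6, 50.0)  # → 100.0
```

Because the labeled standard co-elutes and fragments like the analyte, the ratio cancels much of the run-to-run variation, which is what buys MRM its precision.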

  11. Application of neural networks and sensitivity analysis to improved prediction of trauma survival.

    PubMed

    Hunter, A; Kennedy, L; Henry, J; Ferguson, I

    2000-05-01

    The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
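The idea of ranking candidate input variables by their effect on model output can be sketched generically. This is a simple substitution-style screen over a toy model, not the paper's novel sensitivity technique; the model and samples are illustrative.

```python
import statistics

def sensitivity_ranking(model, samples, n_inputs):
    """Rank inputs by mean absolute output change when each one is
    replaced by its sample mean (a substitution-style sensitivity screen)."""
    means = [statistics.fmean(s[i] for s in samples) for i in range(n_inputs)]
    scores = []
    for i in range(n_inputs):
        deltas = []
        for s in samples:
            clamped = list(s)
            clamped[i] = means[i]                 # neutralize input i
            deltas.append(abs(model(s) - model(clamped)))
        scores.append(statistics.fmean(deltas))
    return sorted(range(n_inputs), key=lambda i: -scores[i])

# toy "model": output depends strongly on x0, weakly on x1, not at all on x2
model = lambda x: 5 * x[0] + 0.5 * x[1]
samples = [(i % 7, (i * 3) % 5, i % 2) for i in range(50)]
ranking = sensitivity_ranking(model, samples, 3)  # → [0, 1, 2]
```

Inputs at the bottom of the ranking are candidates for removal when pruning a predictive model's input set.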

  12. Consumer-phase Salmonella enterica serovar enteritidis risk assessment for egg-containing food products.

    PubMed

    Mokhtari, Amirhossein; Moore, Christina M; Yang, Hong; Jaykus, Lee-Ann; Morales, Roberta; Cates, Sheryl C; Cowen, Peter

    2006-06-01

    We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors on salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories ranked from most to least risk of illness were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.
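Sobol's variance-based method mentioned above can be sketched with a plain Monte Carlo "pick-and-freeze" estimator for first-order indices. The estimator form, toy model, and sample size below are illustrative assumptions, not the study's implementation.

```python
import random

def sobol_first_order(f, n_vars, n=20000, seed=1):
    """Monte Carlo pick-and-freeze estimate of first-order Sobol indices
    for inputs independently uniform on (0, 1)."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_vars)] for _ in range(n)]
    B = [[rng.random() for _ in range(n_vars)] for _ in range(n)]
    fA, fB = [f(a) for a in A], [f(b) for b in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    indices = []
    for i in range(n_vars):
        # A with its i-th column swapped in from B
        fABi = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        Si = sum(y_b * (y_abi - y_a)
                 for y_b, y_abi, y_a in zip(fB, fABi, fA)) / (n * var)
        indices.append(Si)
    return indices

# toy model: x0 carries ~16/17 of the output variance, x1 the rest
S = sobol_first_order(lambda x: 4 * x[0] + x[1], 2)
```

Unlike one-at-a-time perturbation, these indices remain meaningful for nonlinear, interacting models, which is why the record favors Sobol's method.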

  13. Toxicogenomics and cancer risk assessment: a framework for key event analysis and dose-response assessment for nongenotoxic carcinogens.

    PubMed

    Bercu, Joel P; Jolly, Robert A; Flagella, Kelly M; Baker, Thomas K; Romero, Pedro; Stevens, James L

    2010-12-01

    In order to determine a threshold for nongenotoxic carcinogens, the traditional risk assessment approach has been to identify a mode of action (MOA) with a nonlinear dose-response. The dose-response for one or more key event(s) linked to the MOA for carcinogenicity allows a point of departure (POD) to be selected from the most sensitive effect dose or no-effect dose. However, this can be challenging because multiple MOAs and key events may exist for carcinogenicity and oftentimes extensive research is required to elucidate the MOA. In the present study, a microarray analysis was conducted to determine if a POD could be identified following short-term oral rat exposure with two nongenotoxic rodent carcinogens, fenofibrate and methapyrilene, using a benchmark dose analysis of genes aggregated in Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and Gene Ontology (GO) biological processes, which likely encompass key event(s) for carcinogenicity. The gene expression response for fenofibrate given to rats for 2days was consistent with its MOA and known key events linked to PPARα activation. The temporal response from daily dosing with methapyrilene demonstrated biological complexity with waves of pathways/biological processes occurring over 1, 3, and 7days; nonetheless, the benchmark dose values were consistent over time. When comparing the dose-response of toxicogenomic data to tumorigenesis or precursor events, the toxicogenomics POD was slightly below any effect level. Our results suggest that toxicogenomic analysis using short-term studies can be used to identify a threshold for nongenotoxic carcinogens based on evaluation of potential key event(s) which then can be used within a risk assessment framework. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Performance Analysis of Direct-Sequence Code-Division Multiple-Access Communications with Asymmetric Quadrature Phase-Shift-Keying Modulation

    NASA Technical Reports Server (NTRS)

    Wang, C.-W.; Stark, W.

    2005-01-01

    This article considers a quaternary direct-sequence code-division multiple-access (DS-CDMA) communication system with asymmetric quadrature phase-shift-keying (AQPSK) modulation for unequal error protection (UEP) capability. Both time synchronous and asynchronous cases are investigated. An expression for the probability distribution of the multiple-access interference is derived. The exact bit-error performance and the approximate performance using a Gaussian approximation and random signature sequences are evaluated by extending the techniques used for uniform quadrature phase-shift-keying (QPSK) and binary phase-shift-keying (BPSK) DS-CDMA systems. Finally, a general system model with unequal user power and the near-far problem is considered and analyzed. The results show that, for a system with UEP capability, the less protected data bits are more sensitive to the near-far effect that occurs in a multiple-access environment than are the more protected bits.
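The unequal error protection effect can be illustrated with the standard Gaussian approximation for asynchronous DS-CDMA, in which multiple-access interference is treated as additional Gaussian noise. The AQPSK power split is modeled here by a single angle θ; this is a simplification for illustration, not the article's exact analysis.

```python
import math

def qfunc(x: float) -> float:
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def aqpsk_cdma_ber(eb_n0_db: float, n_users: int, proc_gain: int, theta_deg: float):
    """Per-branch BER under the standard Gaussian approximation.
    theta < 45 deg puts more power on the I branch (the more protected bits)."""
    eb_n0 = 10 ** (eb_n0_db / 10)
    # MAI variance term (K-1)/(3N) plus thermal noise term
    snr_eff = 1 / ((n_users - 1) / (3 * proc_gain) + 1 / (2 * eb_n0))
    th = math.radians(theta_deg)
    scale_i, scale_q = math.sqrt(2) * math.cos(th), math.sqrt(2) * math.sin(th)
    return qfunc(scale_i * math.sqrt(snr_eff)), qfunc(scale_q * math.sqrt(snr_eff))
```

Increasing `n_users` shrinks `snr_eff`, and the Q-branch BER degrades faster than the I-branch BER, mirroring the record's finding that less protected bits are more sensitive to the near-far/multiple-access environment.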

  15. The art of spacecraft design: A multidisciplinary challenge

    NASA Technical Reports Server (NTRS)

    Abdi, F.; Ide, H.; Levine, M.; Austel, L.

    1989-01-01

Actual design turn-around time has become shorter due to the optimization techniques introduced into the design process. What, how, and when to use these optimization techniques may be the key factor for future aircraft engineering operations. Another important aspect of this technique is that complex physical phenomena can be modeled by a simple mathematical equation. The new, powerful multilevel methodology reduces time-consuming analysis significantly while maintaining the coupling effects. This simultaneous analysis method stems from the implicit function theorem and system sensitivity derivatives with respect to input variables. Use of Taylor series expansion and finite differencing for sensitivity derivatives in each discipline makes this approach well suited to screening dominant variables from nondominant ones. In this study, current computational fluid dynamics (CFD) aerodynamic and sensitivity derivative/optimization techniques are applied to a simple cone-type forebody of a high-speed vehicle configuration to understand basic aerodynamic/structure interaction under hypersonic flight conditions.
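The finite-differencing step for sensitivity derivatives can be sketched as below. Normalizing to logarithmic sensitivities makes variables with different units comparable, which supports the dominant-variable screening described above; the toy function is illustrative.

```python
def fd_sensitivities(f, x0, rel_step=1e-3):
    """Forward finite-difference derivatives of f at x0, normalized to
    logarithmic sensitivities d(ln f)/d(ln x_i) for cross-variable comparison."""
    y0 = f(x0)
    sens = []
    for i, xi in enumerate(x0):
        h = rel_step * (abs(xi) or 1.0)      # step scaled to the variable
        xp = list(x0)
        xp[i] = xi + h
        dfdx = (f(xp) - y0) / h              # forward difference
        sens.append(dfdx * xi / y0)
    return sens

# screening: large |sensitivity| -> dominant variable; small -> candidate to freeze
f = lambda x: x[0] ** 2 * x[1] ** 0.1
s = fd_sensitivities(f, [2.0, 3.0])          # ≈ [2.0, 0.1]
```

Here the first variable would be retained as dominant and the second could be frozen, shrinking the coupled multidisciplinary problem.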

  16. Quantifying Hydro-biogeochemical Model Sensitivity in Assessment of Climate Change Effect on Hyporheic Zone Processes

    NASA Astrophysics Data System (ADS)

    Song, X.; Chen, X.; Dai, H.; Hammond, G. E.; Song, H. S.; Stegen, J.

    2016-12-01

The hyporheic zone is an active region for biogeochemical processes such as carbon and nitrogen cycling, where groundwater and surface water mix and interact with distinct biogeochemical and thermal properties. The biogeochemical dynamics within the hyporheic zone are driven by both river water and groundwater hydraulic dynamics, which are directly affected by climate change scenarios. In addition, the hydraulic and thermal properties of local sediments and microbial and chemical processes play important roles in biogeochemical dynamics. Thus, a comprehensive understanding of the biogeochemical processes in the hyporheic zone requires a coupled thermo-hydro-biogeochemical model. As multiple uncertainty sources are involved in the integrated model, it is important to identify its key modules and parameters through sensitivity analysis. In this study, we develop a 2D cross-section model of the hyporheic zone at the DOE Hanford site adjacent to the Columbia River and use this model to quantify module and parametric sensitivity in the assessment of climate change. To achieve this purpose, we 1) develop a facies-based groundwater flow and heat transfer model that incorporates facies geometry and heterogeneity characterized from a field data set, 2) derive multiple reaction networks/pathways from batch experiments with in-situ samples and integrate temperature-dependent reactive transport modules into the flow model, 3) assign multiple climate change scenarios to the coupled model by analyzing historical river stage data, and 4) apply a variance-based global sensitivity analysis to quantify scenario, module, and parameter uncertainty at each level of the hierarchy. The objectives of the research are to 1) identify the key controlling factors of the coupled thermo-hydro-biogeochemical model in the assessment of climate change, and 2) quantify carbon consumption under different climate change scenarios in the hyporheic zone.

  17. How do Changes in Hydro-Climate Conditions Alter the Risk of Infection With Fasciolosis?

    NASA Astrophysics Data System (ADS)

    Beltrame, L.; Dunne, T.; Rose, H.; Walker, J.; Morgan, E.; Vickerman, P.; Wagener, T.

    2017-12-01

Fasciolosis is a widespread parasitic disease of livestock and is emerging as a major zoonosis. Since the parasite and its intermediate host live and develop in the environment, risk of infection is directly affected by climatic-environmental conditions. Changes in disease prevalence, seasonality and distribution have been reported in recent years and attributed to altered temperature and rainfall patterns, raising concerns about the effects of climate change in the future. It is therefore urgent to understand, in a quantitative way, how changes in climate-environmental drivers may alter the dynamics of disease risk, to guide parasite control strategies and interventions in the coming decades. In previous work, we developed and tested a novel mechanistic hydro-epidemiological model for fasciolosis, which explicitly represents the parasite life-cycle in connection with key environmental processes, allowing it to capture the impact of previously unseen conditions. In this study, we use the new mechanistic model to assess the sensitivity of infection rates to changes in climate-environmental factors. This is challenging because the processes underlying disease transmission are complex and interacting, and may have contrasting effects on the parasite life-cycle stages. To this end, we set up a sensitivity analysis framework to investigate in a structured way which factors play a key role in controlling the magnitude, timing and spread of infection, and how the sensitivity of disease risk varies in time and space. Moreover, we define synthetic scenarios to explore the space of possible variability of the hydro-climate drivers and investigate conditions that lead to critical levels of infection. The study shows how the new model combined with the sensitivity analysis framework can support decision-making, providing useful information for disease management.

  18. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak

    2007-01-01

    The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.

  19. Role of Sectoral Transformation in the Evolution of Water Management Norms in Agricultural Catchments: A Sociohydrologic Modeling Analysis

    NASA Astrophysics Data System (ADS)

    Roobavannan, M.; Kandasamy, J.; Pande, S.; Vigneswaran, S.; Sivapalan, M.

    2017-10-01

    This study is focused on the water-agriculture-environment nexus as it played out in the Murrumbidgee River Basin, eastern Australia, and how the coevolution of society and water management actually transpired. Over 100 years of agricultural development, the Murrumbidgee Basin experienced a "pendulum swing" in water allocation: water initially reserved exclusively for agricultural production was later reallocated back to the environment. In this paper, we hypothesize that in the competition for water between economic livelihood and environmental wellbeing, economic diversification was the key to swinging community sentiment in favor of environmental protection, triggering policy action that resulted in more water allocation to the environment. To test this hypothesis, we developed a sociohydrology model linking the dynamics of the whole economy (both agriculture and industry, the latter composed of manufacturing and services) to the community's sensitivity toward the environment. Changing community sensitivity influenced how water was allocated and governed and how the agricultural sector grew relative to the industrial sector. In this way, we show that economic diversification played a key role in influencing the community's values and preferences with respect to the environment and economic growth. Without diversification, model simulations show that the community would not have been sufficiently sensitive and willing to act to restore the environment, highlighting the key role of sectoral transformation in achieving the goal of sustainable agricultural development.

  20. Health economic assessment: a methodological primer.

    PubMed

    Simoens, Steven

    2009-12-01

    This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for readers who wish to gain a more advanced understanding of health economic assessments.
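
    The core calculations named in the abstract (discounting, incremental analysis, one-way sensitivity analysis) can be sketched in a few lines. All numbers below are purely illustrative assumptions, not figures from the article:

```python
# Hypothetical sketch of three building blocks of an economic evaluation:
# discounting a cost/effect stream, computing an ICER, and a one-way
# sensitivity analysis over the discount rate.

def discount(stream, rate):
    """Present value of a yearly stream; year 0 is undiscounted."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream))

def icer(cost_new, eff_new, cost_old, eff_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (eff_new - eff_old)

# Base case: 3-year horizon, 3% discount rate (illustrative values)
costs_new, costs_old = [12000, 2000, 2000], [5000, 4000, 4000]
qalys_new, qalys_old = [0.80, 0.82, 0.82], [0.70, 0.70, 0.70]

base = icer(discount(costs_new, 0.03), discount(qalys_new, 0.03),
            discount(costs_old, 0.03), discount(qalys_old, 0.03))

# One-way sensitivity analysis: vary the discount rate alone
for rate in (0.0, 0.03, 0.05):
    print(rate, round(icer(discount(costs_new, rate), discount(qalys_new, rate),
                           discount(costs_old, rate), discount(qalys_old, rate))))
```

In a full evaluation the same one-at-a-time variation would be repeated for each uncertain input, which is the logic behind the tornado diagrams commonly reported alongside such analyses.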

  1. Health Economic Assessment: A Methodological Primer

    PubMed Central

    Simoens, Steven

    2009-01-01

    This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for readers who wish to gain a more advanced understanding of health economic assessments. PMID:20049237

  2. Vemurafenib resistance signature by proteome analysis offers new strategies and rational therapeutic concepts.

    PubMed

    Paulitschke, Verena; Berger, Walter; Paulitschke, Philipp; Hofstätter, Elisabeth; Knapp, Bernhard; Dingelmaier-Hovorka, Ruth; Födinger, Dagmar; Jäger, Walter; Szekeres, Thomas; Meshcheryakova, Anastasia; Bileck, Andrea; Pirker, Christine; Pehamberger, Hubert; Gerner, Christopher; Kunstfeld, Rainer

    2015-03-01

    The FDA-approved BRAF inhibitor vemurafenib achieves outstanding clinical response rates in patients with melanoma, but early resistance is common. Understanding the pathologic mechanisms of drug resistance and identification of effective therapeutic alternatives are key scientific challenges in the melanoma setting. Using proteomic techniques, including shotgun analysis and 2D-gel electrophoresis, we identified a comprehensive signature of the vemurafenib-resistant M24met cell line in comparison with the vemurafenib-sensitive A375 melanoma cell line. The resistant cells were characterized by loss of differentiation, induction of transformation, enhanced expression of the lysosomal compartment, increased potential for metastasis, migration, adherence and Ca2+ ion binding, enhanced expression of the MAPK pathway and extracellular matrix proteins, and epithelial-mesenchymal transformation. The main features were verified by shotgun analysis with Q Exactive Orbitrap MS, electron microscopy, lysosomal staining, Western blotting, and an adherence assay in a VM-1 melanoma cell line with acquired vemurafenib resistance. On the basis of the resistance profile, we were able to successfully predict that a novel resveratrol-derived COX-2 inhibitor, M8, would be active against the vemurafenib-resistant but not the vemurafenib-sensitive melanoma cells. Using high-throughput methods for cell line and drug characterization may thus offer a new way to identify key features of vemurafenib resistance, facilitating the design of effective rational therapeutic alternatives. ©2015 American Association for Cancer Research.

  3. A New Color Image Encryption Scheme Using CML and a Fractional-Order Chaotic System

    PubMed Central

    Wu, Xiangjun; Li, Yang; Kurths, Jürgen

    2015-01-01

    Chaos-based image cryptosystems have been widely investigated in recent years to provide real-time encryption and transmission. In this paper, a novel color image encryption algorithm using coupled map lattices (CML) and a fractional-order chaotic system is proposed to enhance the security and robustness of encryption algorithms with a permutation-diffusion structure. To make the encryption procedure more confusing and complex, an image division-shuffling process is put forward, in which the plain-image is first divided into four sub-images and the positions of the pixels in the whole image are then shuffled. A 280-bit external secret key is employed to generate the initial conditions and parameters of the two chaotic systems. Key space analysis, various statistical analyses, information entropy analysis, differential analysis and key sensitivity analysis are carried out to test the security of the new image encryption algorithm. The cryptosystem speed is analyzed and tested as well. Experimental results confirm that, in comparison to other image encryption schemes, the new algorithm has higher security and is fast enough for practical image encryption. Moreover, an extensive tolerance analysis of some common image processing operations, such as noise addition, cropping, JPEG compression, rotation, brightening and darkening, has been performed on the proposed image encryption technique. The corresponding results reveal that the proposed method has good robustness against these image processing operations and geometric attacks. PMID:25826602
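
    A key sensitivity analysis of the kind mentioned above is commonly performed by encrypting the same image under two keys differing in a single bit and measuring the Number of Pixel Change Rate (NPCR) between the two cipher-images. The sketch below uses a plain logistic-map stream cipher as a stand-in for the paper's CML/fractional-order system; the key schedule is a hypothetical illustration:

```python
import numpy as np

def keystream(key_bits, n):
    # Derive a logistic-map keystream from a binary key string. This key
    # schedule is a hypothetical illustration, not the paper's construction.
    x = (int(key_bits, 2) % (10**9) + 1) / (10**9 + 2)  # initial condition in (0, 1)
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = 3.99 * x * (1.0 - x)                        # chaotic logistic map
        out[i] = int(x * 256) % 256
    return out

def encrypt(img, key_bits):
    # XOR stream cipher: applying it twice with the same key decrypts
    return img ^ keystream(key_bits, img.size).reshape(img.shape)

def npcr(c1, c2):
    # Number of Pixel Change Rate between two cipher-images, in percent
    return 100.0 * float(np.mean(c1 != c2))

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
key = "1" * 280
key_flipped = key[:-1] + "0"        # flip a single bit of the 280-bit key
c1, c2 = encrypt(img, key), encrypt(img, key_flipped)
print(npcr(c1, c2))                 # high (near 99%) for a key-sensitive cipher
```

Because XOR is an involution, the same routine decrypts; a one-bit key change producing an almost completely different cipher-image is exactly the behavior a key sensitivity analysis is meant to confirm.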

  4. Incorporating resource protection constraints in an analysis of landscape fuel-treatment effectiveness in the northern Sierra Nevada, CA, USA

    Treesearch

    Christopher B. Dow; Brandon M. Collins; Scott L. Stephens

    2016-01-01

    Finding novel ways to plan and implement landscape-level forest treatments that protect sensitive wildlife and other key ecosystem components, while also reducing the risk of large-scale, high-severity fires, can prove to be difficult. We examined alternative approaches to landscape-scale fuel-treatment design for the same landscape. These approaches included two...

  5. A Novel Image Encryption Algorithm Based on DNA Subsequence Operation

    PubMed Central

    Zhang, Qiang; Xue, Xianglian; Wei, Xiaopeng

    2012-01-01

    We present a novel image encryption algorithm based on DNA subsequence operations. Unlike traditional DNA encryption methods, our algorithm does not use complex biological operations; it combines the idea of DNA subsequence operations (such as elongation, truncation and deletion) with the logistic chaotic map to scramble the locations and values of the image's pixels. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large secret-key space and strong sensitivity to the secret key, and is able to resist exhaustive and statistical attacks. PMID:23093912
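
    A rough sketch of two ingredients named in the abstract, DNA encoding of pixel bytes and a logistic-map-driven scramble of pixel positions, is given below. The paper's actual subsequence operations (elongation, truncation, deletion) are not reproduced, and the base-assignment rule is just one common choice:

```python
import numpy as np

BASES = "ACGT"  # one common rule mapping 2-bit groups to DNA bases

def to_dna(byte):
    # Encode one byte as four bases (most significant pair first)
    return "".join(BASES[(byte >> s) & 3] for s in (6, 4, 2, 0))

def from_dna(seq):
    b = 0
    for ch in seq:
        b = (b << 2) | BASES.index(ch)
    return b

def logistic_permutation(n, x0, r=3.99):
    # Pixel-position scramble driven by a logistic map: iterate the map,
    # then sort the trajectory to obtain a pseudo-random permutation
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return np.argsort(xs)

img = np.arange(16, dtype=np.uint8)            # tiny stand-in "image"
perm = logistic_permutation(img.size, x0=0.3456)
scrambled = img[perm]
unscrambled = np.empty_like(scrambled)
unscrambled[perm] = scrambled                  # inverse permutation restores the image
```

The permutation is fully determined by the secret initial condition `x0`, which is why a receiver holding the key can invert the scramble exactly.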

  6. Predictors of Outcome in Traumatic Brain Injury: New Insight Using Receiver Operating Curve Indices and Bayesian Network Analysis.

    PubMed

    Zador, Zsolt; Sperrin, Matthew; King, Andrew T

    2016-01-01

    Traumatic brain injury remains a global health problem. Understanding the relative importance of outcome predictors helps optimize our treatment strategies by informing assessment protocols, clinical decisions and trial designs. In this study we establish an importance ranking for outcome predictors based on receiver operating indices to identify key predictors of outcome and create simple predictive models. We then explore the associations between key outcome predictors using Bayesian networks to gain further insight into predictor importance. We analyzed the Corticosteroid Randomisation After Significant Head injury (CRASH) trial database of 10,008 patients and included patients for whom demographics, injury characteristics, computed tomography (CT) findings and Glasgow Coma Scale (GCS) were recorded (a total of 13 predictors, which would be available to clinicians within a few hours following the injury; 6945 patients). Predictions of clinical outcome (death or severe disability at 6 months) were performed using logistic regression models with 5-fold cross-validation. Predictive performance was measured using standardized partial area (pAUC) under the receiver operating curve (ROC), and the DeLong test was used for comparisons. Variable importance ranking was based on pAUC targeted at specificity (pAUCSP) and sensitivity (pAUCSE) intervals of 90-100%. Probabilistic associations were depicted using Bayesian networks. Complete AUC analysis showed very good predictive power (AUC = 0.8237, 95% CI: 0.8138-0.8336) for the complete model. Specificity-focused importance ranking highlighted age, pupillary and motor responses, obliteration of the basal cisterns/3rd ventricle, and midline shift. Interestingly, when targeting model sensitivity, the highest-ranking variables were age, severe extracranial injury, verbal response, hematoma on CT and motor response. Simplified models, which included only these key predictors, had similar performance (pAUCSP = 0.6523, 95% CI: 0.6402-0.6641 and pAUCSE = 0.6332, 95% CI: 0.62-0.6477) compared to the complete models (pAUCSP = 0.6664, 95% CI: 0.6543-0.679, pAUCSE = 0.6436, 95% CI: 0.6289-0.6585; DeLong p-values 0.1165 and 0.3448, respectively). Bayesian networks showed that the predictors that did not feature in the simplified models were associated with those that did. We demonstrate that importance-based variable selection allows simplified predictive models to be created while maintaining prediction accuracy. Variable selection targeting specificity confirmed key components of clinical assessment in TBI, whereas sensitivity-based ranking suggested extracranial injury as one of the important predictors. These results help refine our approach to head injury assessment, decision-making and outcome prediction targeted at model sensitivity and specificity. Bayesian networks proved to be a comprehensive tool for depicting probabilistic associations for key predictors, giving insight into why the simplified model maintained accuracy.
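
    The specificity-targeted pAUC used above can be computed by integrating the empirical ROC curve only over the low false-positive-rate region and standardizing the result. A minimal NumPy sketch (not the authors' code) follows:

```python
import numpy as np

def partial_auc_specificity(y_true, scores, min_specificity=0.9):
    # Empirical ROC curve from predicted scores (larger score = more positive)
    order = np.argsort(-scores)
    y = y_true[order]
    tpr = np.cumsum(y) / y.sum()
    fpr = np.cumsum(1 - y) / (1 - y).sum()
    fpr = np.concatenate([[0.0], fpr])
    tpr = np.concatenate([[0.0], tpr])
    # Integrate TPR only over FPR in [0, 1 - min_specificity]
    max_fpr = 1.0 - min_specificity
    stop = np.searchsorted(fpr, max_fpr, side="right")
    x = np.concatenate([fpr[:stop], [max_fpr]])
    yv = np.concatenate([tpr[:stop], [np.interp(max_fpr, fpr, tpr)]])
    pauc = float(np.sum(np.diff(x) * (yv[1:] + yv[:-1]) / 2))  # trapezoid rule
    # McClish standardization: 0.5 = chance level, 1.0 = perfect on this band
    min_area = 0.5 * max_fpr ** 2
    max_area = max_fpr
    return 0.5 * (1.0 + (pauc - min_area) / (max_area - min_area))
```

A perfect classifier scores 1.0 on the 90-100% specificity band, while chance-level discrimination sits near 0.5, which is why standardized pAUC values such as those reported above can be compared across models.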

  7. Immuno-magnetic beads-based extraction-capillary zone electrophoresis-deep UV laser-induced fluorescence analysis of erythropoietin.

    PubMed

    Wang, Heye; Dou, Peng; Lü, Chenchen; Liu, Zhen

    2012-07-13

    Erythropoietin (EPO) is an important glycoprotein hormone. Recombinant human EPO (rhEPO) is an important therapeutic drug and can also be used as a doping agent in sports. The analysis of EPO glycoforms in the pharmaceutical and sports areas greatly challenges analytical scientists in several respects, among which sensitive detection and effective, facile sample preparation are two essential issues. Herein, we investigated new possibilities for these two aspects. Deep UV laser-induced fluorescence detection (deep UV-LIF) was established to detect the intrinsic fluorescence of EPO, while an immuno-magnetic beads-based extraction (IMBE) was developed to specifically extract EPO glycoforms. Combined with capillary zone electrophoresis (CZE), CZE-deep UV-LIF allows high-resolution glycoform profiling with improved sensitivity. The detection sensitivity was improved by one order of magnitude as compared with UV absorbance detection. An additional advantage is that the original glycoform distribution is completely preserved because no fluorescent labeling is needed. By combining IMBE with CZE-deep UV-LIF, the overall detection sensitivity was 1.5 × 10⁻⁸ mol/L, enhanced by two orders of magnitude relative to conventional CZE with UV absorbance detection. This is applicable to the analysis of pharmaceutical preparations of EPO, but the sensitivity is insufficient for the anti-doping analysis of EPO in blood and urine. IMBE is a straightforward and effective approach to sample preparation. However, antibodies with high specificity are the key to application to urine samples because some urinary proteins can severely interfere with the immuno-extraction. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. MoO3/nano-Si heterostructure based highly sensitive and acetone selective sensor prototype: a key to non-invasive detection of diabetes.

    PubMed

    Dwivedi, Priyanka; Dhanekar, Saakshi; Das, Samaresh

    2018-07-06

    This paper presents the development of an extremely sensitive and selective acetone sensor prototype which can be used as a platform for non-invasive diabetes detection through exhaled human breath. The miniaturized sensors were produced in high yield with the use of standard microfabrication processes. The sensors were based on a heterostructure composed of MoO3 and nano-porous silicon (NPS). Acetone selectivity, an enhanced sensor response and a 0.5 ppm detection limit were observed upon introduction of MoO3 on the NPS. The sensors were found to be repeatable and stable for almost 1 year, as tested under humid conditions at room temperature. It was inferred that the interface resistance of MoO3 and NPS played a key role in the sensing mechanism. With the use of breath analysis and lab-on-chip devices, medical diagnosis procedures can be simplified, providing solutions for point-of-care testing.

  9. Global Sensitivity Analysis with Small Sample Sizes: Ordinary Least Squares Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Michael J.; Liu, Wei; Sivaramakrishnan, Raghu

    2016-12-21

    A new version of global sensitivity analysis is developed in this paper. This new version, coupled with tools from statistics, machine learning, and optimization, can devise small sample sizes that allow for the accurate ordering of sensitivity coefficients for the first 10-30 most sensitive chemical reactions in complex chemical-kinetic mechanisms, and is particularly useful for studying the chemistry in realistic devices. A key part of the paper is the calibration of these small samples. Because the small sample sizes are developed for use in realistic combustion devices, the calibration is done over the ranges of conditions in such devices, with a test case being the operating conditions of a compression ignition engine studied earlier. Compression ignition engines operate under low-temperature combustion conditions with quite complicated chemistry, making this calibration difficult and leading to the possibility of false positives and false negatives in the ordering of the reactions. An important aspect of the paper is therefore showing how to handle the trade-off between false positives and false negatives using ideas from the multiobjective optimization literature. The combination of the new global sensitivity method and the calibration yields sample sizes approximately a factor of 10 smaller than were available with our previous algorithm.
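
    Although the paper's algorithm is more elaborate, the basic idea of ordering sensitivity coefficients from an ordinary-least-squares fit to sampled model outputs can be sketched as follows, with a hypothetical three-input toy model standing in for a chemical-kinetic mechanism:

```python
import numpy as np

# Rank inputs by standardized regression coefficients (SRCs) from an OLS
# fit to sampled model outputs. The linear "model" below is illustrative.

def model(X):
    # Input 1 dominates, input 0 matters, input 2 is near noise level
    return 5.0 * X[:, 1] + 2.0 * X[:, 0] + 0.1 * X[:, 2]

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(200, 3))    # deliberately small sample
y = model(X)

A = np.column_stack([np.ones(len(X)), X])   # OLS design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
src = coef[1:] * X.std(axis=0) / y.std()    # standardized regression coefficients

ranking = np.argsort(-np.abs(src))          # most sensitive input first
print(ranking)                              # -> [1 0 2]
```

For strongly nonlinear responses the SRC ordering can mislead, which is one motivation for the calibration and false-positive/false-negative trade-off analysis described in the abstract.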

  10. Orbital transfer vehicle concept definition and system analysis study, 1986. Volume 9: Study extension results

    NASA Technical Reports Server (NTRS)

    Kofal, Allen E.

    1987-01-01

    The purpose of this extension to the OTV Concept Definition and Systems Analysis Study was to improve the definition of the OTV Program that will be most beneficial to the nation in the 1995 to 2010 timeframe. The implications of the defined mission and defined launch vehicle are investigated. The key mission requirements identified for the Space Transportation Architecture Study (STAS) were established and reflect a need for early capability and more ambitious capability growth. The key technical objectives and related issues addressed are summarized. The analyses of selected areas including aerobrake design, proximity operations, and the balance of EVA and IVA operations used in the support of the OTV at the space-base were enhanced. Sensitivity studies were conducted to establish how the OTV program should be tailored to meet changing circumstances.

  11. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models converting precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
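
    The idea of a process sensitivity index that pools model and parameter uncertainty can be illustrated numerically: sample a process realization (a model choice plus that model's parameters), propagate it through a toy output function, and estimate the process's first-order variance contribution from conditional means. This is a schematic stand-in for the paper's derivation, not its actual index:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20000

def sample_recharge():
    # A "process" with two candidate models (equal prior weight), each
    # carrying its own random parameter
    if rng.random() < 0.5:
        return rng.uniform(0.8, 1.2)       # model 1: bounded multiplier
    return 1.5 + 0.1 * rng.normal()        # model 2: offset with noise

def sample_geology():
    return rng.lognormal(mean=0.0, sigma=0.3)

recharge = np.array([sample_recharge() for _ in range(N)])
geology = np.array([sample_geology() for _ in range(N)])
output = recharge * geology                # toy model output

# First-order sensitivity of the recharge process (model + parameters
# pooled), estimated as the variance of bin-conditional means
edges = np.quantile(recharge, np.linspace(0.0, 1.0, 51))
idx = np.clip(np.digitize(recharge, edges) - 1, 0, 49)
cond_means = np.array([output[idx == k].mean() for k in range(50)])
weights = np.array([(idx == k).mean() for k in range(50)])
S_recharge = np.sum(weights * (cond_means - output.mean()) ** 2) / output.var()
print(round(float(S_recharge), 2))
```

Because the model choice is sampled along with its parameters, the resulting index attributes variance to the process as a whole, mirroring the paper's point that parametric-only sensitivity analysis understates process importance.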

  12. Utility and limitations of a peptide reactivity assay to predict fragrance allergens in vitro.

    PubMed

    Natsch, A; Gfeller, H; Rothaupt, M; Ellis, G

    2007-10-01

    A key step in the skin sensitization process is the formation of a covalent adduct between the skin sensitizer and endogenous proteins and/or peptides in the skin. A published peptide depletion assay was used to relate the in vitro reactivity of fragrance molecules to LLNA data. Using the classical assay, 22 of 28 tested moderate to strong sensitizers were positive. The prediction of weak sensitizers proved to be more difficult with only 50% of weak sensitizers giving a positive response, but for some compounds this could also be due to false-positive results from the LLNA. LC-MS analysis yielded the expected mass of the peptide adducts in several cases, whereas in other cases putative oxidation reactions led to adducts of unexpected molecular weight. Several moderately sensitizing aldehydes were correctly predicted by the depletion assay, but no adducts were found and the depletion appears to be due to an oxidation of the parent peptide catalyzed by the test compound. Finally, alternative test peptides derived from a physiological reactive protein with enhanced sensitivity for weak Michael acceptors were found, further increasing the sensitivity of the assay.

  13. Speed skills: measuring the visual speed analyzing properties of primate MT neurons.

    PubMed

    Perrone, J A; Thiele, A

    2001-05-01

    Knowing the direction and speed of moving objects is often critical for survival. However, it is poorly understood how cortical neurons process the speed of image movement. Here we tested MT neurons using moving sine-wave gratings of different spatial and temporal frequencies, and mapped out the neurons' spatiotemporal frequency response profiles. The maps typically had oriented ridges of peak sensitivity as expected for speed-tuned neurons. The preferred speed estimate, derived from the orientation of the maps, corresponded well to the preferred speed when moving bars were presented. Thus, our data demonstrate that MT neurons are truly sensitive to the object speed. These findings indicate that MT is not only a key structure in the analysis of direction of motion and depth perception, but also in the analysis of object speed.

  14. Uncertainty analysis of the simulations of effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota

    USGS Publications Warehouse

    Wesolowski, Edwin A.

    1996-01-01

    Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for ice-free conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions. To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen. The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwater-source specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwater-source and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14. During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specific-conductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are point-source total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient. The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
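
    The two uncertainty methods named above, Monte Carlo simulation and first-order error analysis, can be contrasted on a toy relation of the same flavor. The formula (loosely inspired by Streeter-Phelps-type dissolved-oxygen relations) and all input statistics below are hypothetical, not taken from the Red River models:

```python
import numpy as np

def deficit(L0, kd, ka):
    # Toy dissolved-oxygen-deficit relation (illustrative only)
    return L0 * kd / (ka - kd)

mu = dict(L0=10.0, kd=0.3, ka=0.8)   # hypothetical input means
sd = dict(L0=1.0, kd=0.03, ka=0.08)  # and standard deviations

# First-order error analysis: Var(Y) ~ sum_i (dY/dx_i)^2 * sd_i^2,
# with derivatives taken numerically at the means
base = deficit(**mu)
var_fo = 0.0
for name in mu:
    h = 1e-6 * mu[name]
    bumped = dict(mu)
    bumped[name] += h
    dYdx = (deficit(**bumped) - base) / h
    var_fo += (dYdx * sd[name]) ** 2

# Monte Carlo simulation: sample the inputs, propagate, take the sample spread
rng = np.random.default_rng(3)
n = 100000
samples = deficit(rng.normal(mu["L0"], sd["L0"], n),
                  rng.normal(mu["kd"], sd["kd"], n),
                  rng.normal(mu["ka"], sd["ka"], n))
print(np.sqrt(var_fo), samples.std())   # the two estimates agree closely here
```

The two estimates agree when the model is nearly linear over the input ranges; for strongly nonlinear responses the Monte Carlo spread grows relative to the first-order value, which is one reason studies such as this one report both.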

  15. Correlation between experimental human and murine skin sensitization induction thresholds.

    PubMed

    Api, Anne Marie; Basketter, David; Lalko, Jon

    2015-01-01

    Quantitative risk assessment for skin sensitization is directed towards the determination of levels of exposure to known sensitizing substances that will avoid the induction of contact allergy in humans. A key component of this work is the predictive identification of relative skin sensitizing potency, normally achieved by measurement of the threshold (the "EC3" value) in the local lymph node assay (LLNA). In an extended series of studies, the accuracy of this murine induction threshold as a predictor of the absence of a sensitizing effect has been verified by conducting human repeated insult patch tests (HRIPT). Murine and human thresholds for a diverse set of 57 fragrance chemicals spanning approximately four orders of magnitude of variation in potency have been compared. The results confirm that there is a useful correlation, with the LLNA EC3 value helping particularly to identify stronger sensitizers. Good correlation (within half an order of magnitude) was seen for three-quarters of the dataset. The analysis also helps to identify potential outlier types of (fragrance) chemistry, exemplified by hexyl and benzyl salicylates (an over-prediction) and trans-2-hexenal (an under-prediction).

  16. Modelling the effect of heterogeneity of shedding on the within herd Coxiella burnetii spread and identification of key parameters by sensitivity analysis.

    PubMed

    Courcoul, Aurélie; Monod, Hervé; Nielen, Mirjam; Klinkenberg, Don; Hogerwerf, Lenny; Beaudeau, François; Vergu, Elisabeta

    2011-09-07

    Coxiella burnetii is the bacterium responsible for Q fever, a worldwide zoonosis. Ruminants, especially cattle, are recognized as the most important source of human infections. Although a great heterogeneity between shedder cows has been described, no previous studies have determined which features such as shedding route and duration or the quantity of bacteria shed have the strongest impact on the environmental contamination and thus on the zoonotic risk. Our objective was to identify key parameters whose variation highly influences C. burnetii spread within a dairy cattle herd, especially those related to the heterogeneity of shedding. To compare the impact of epidemiological parameters on different dynamical aspects of C. burnetii infection, we performed a sensitivity analysis on an original stochastic model describing the bacterium spread and representing the individual variability of the shedding duration, routes and intensity as well as herd demography. This sensitivity analysis consisted of a principal component analysis followed by an ANOVA. Our findings show that the most influential parameters are the probability distribution governing the levels of shedding, especially in vaginal mucus and faeces, the characteristics of the bacterium in the environment (i.e. its survival and the fraction of bacteria shed reaching the environment), and some physiological parameters related to the intermittency of shedding (transition probability from a non-shedding infected state to a shedding state) or to the transition from one type of shedder to another one (transition probability from a seronegative shedding state to a seropositive shedding state). Our study is crucial for the understanding of the dynamics of C. burnetii infection and optimization of control measures. 
Indeed, as control measures should impact the parameters influencing the bacterium spread most, our model can now be used to assess the effectiveness of different control strategies of Q fever within dairy cattle herds. Copyright © 2011 Elsevier Ltd. All rights reserved.
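
    The sensitivity analysis scheme described above, a principal component analysis followed by an ANOVA, can be sketched on a toy time-series model: reduce the simulated outputs to principal-component scores, then measure each parameter's main effect as the between-level share of variance in those scores. Everything below (the model, parameter levels, and replicate counts) is illustrative, not the C. burnetii model:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 30)
levels = [0.5, 1.0, 1.5]
runs, design = [], []
for a in levels:
    for b in levels:
        for _ in range(20):                          # stochastic replicates
            noise = 0.05 * rng.normal(size=t.size)
            runs.append(a * np.exp(-b * t) + noise)  # toy "epidemic curve"
            design.append((a, b))
Y = np.array(runs)
design = np.array(design)

# PCA via SVD on the centered outputs; keep the first component's scores
Yc = Y - Y.mean(axis=0)
U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
pc1 = Yc @ Vt[0]

# One-way ANOVA main effect per parameter: between-level sum of squares
# of the PC1 scores divided by the total sum of squares
def main_effect(scores, factor):
    grand = scores.mean()
    ss_between = sum(scores[factor == v].size * (scores[factor == v].mean() - grand) ** 2
                     for v in np.unique(factor))
    return ss_between / ((scores - grand) ** 2).sum()

for j, name in enumerate(["a", "b"]):
    print(name, round(main_effect(pc1, design[:, j]), 2))
```

In this toy setup the amplitude parameter dominates the first component; repeating the ANOVA on further components, as the study's PCA-plus-ANOVA framework does, attributes the remaining output variance among the parameters.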

  17. Application of the adverse outcome pathway (AOP) concept to structure the available in vivo and in vitro mechanistic data for allergic sensitization to food proteins.

    PubMed

    van Bilsen, Jolanda H M; Sienkiewicz-Szłapka, Edyta; Lozano-Ojalvo, Daniel; Willemsen, Linette E M; Antunes, Celia M; Molina, Elena; Smit, Joost J; Wróblewska, Barbara; Wichers, Harry J; Knol, Edward F; Ladics, Gregory S; Pieters, Raymond H H; Denery-Papini, Sandra; Vissers, Yvonne M; Bavaro, Simona L; Larré, Colette; Verhoeckx, Kitty C M; Roggen, Erwin L

    2017-01-01

    The introduction of whole new foods in a population may lead to sensitization and food allergy. This constitutes a potential public health problem and a challenge to risk assessors and managers, as the existing understanding of the pathophysiological processes and the currently available biological tools for predicting the risk of food allergy development and the severity of the reaction are not sufficient. There is a substantial body of in vivo and in vitro data describing molecular and cellular events potentially involved in food sensitization. However, these events have not been organized in a sequence of related events that is plausible to result in sensitization and useful for challenging current hypotheses. The aim of this manuscript was to collect and structure the current mechanistic understanding of sensitization induction to food proteins by applying the concept of the adverse outcome pathway (AOP). The proposed AOP for food sensitization is based on information on molecular and cellular mechanisms and pathways shown to be involved in sensitization by food and food proteins, and uses the AOPs for chemical skin sensitization and respiratory sensitization induction as templates. Available mechanistic data on protein respiratory sensitization were included to fill gaps in the understanding of how proteins may affect cells, cell-cell interactions and tissue homeostasis. Analysis revealed several key events (KE) and biomarkers that may have potential use in testing and assessment of proteins for their sensitizing potential. The application of the AOP concept to structure mechanistic in vivo and in vitro knowledge has made it possible to identify a number of methods, each addressing a specific KE, that provide information about the food allergenic potential of new proteins. When applied in the context of an integrated strategy, these methods may reduce, if not replace, current animal testing approaches. 
The proposed AOP will be shared at the www.aopwiki.org platform to expand the mechanistic data, improve the confidence in each of the proposed KE and key event relations (KERs), and allow for the identification of new, or refinement of established KE and KERs.

  18. Characterization of non-polar aromatic hydrocarbons in crude oil using atmospheric pressure laser ionization and Fourier transform ion cyclotron resonance mass spectrometry (APLI FT-ICR MS).

    PubMed

    Schrader, Wolfgang; Panda, Saroj K; Brockmann, Klaus J; Benter, Thorsten

    2008-07-01

    We report on the successful application of the recently introduced atmospheric pressure laser ionization (APLI) method as a novel tool for the analysis of crude oil and its components. Using Fourier transform ion cyclotron resonance mass spectrometry, unambiguous determination of key compounds in this complex matrix with unprecedented sensitivity is presented.

  19. A 3-Year Study of Predictive Factors for Positive and Negative Appendicectomies.

    PubMed

    Chang, Dwayne T S; Maluda, Melissa; Lee, Lisa; Premaratne, Chandrasiri; Khamhing, Srisongham

    2018-03-06

    Early and accurate identification or exclusion of acute appendicitis is key to avoiding the morbidity of delayed treatment for true appendicitis or an unnecessary appendicectomy, respectively. We aim (i) to identify potential predictive factors for positive and negative appendicectomies; and (ii) to analyse the use of ultrasound scans (US) and computed tomography (CT) scans for acute appendicitis. All appendicectomies that took place at our hospital from the 1st of January 2013 to the 31st of December 2015 were retrospectively recorded. Test results of potential predictive factors of acute appendicitis were recorded. Statistical analysis was performed using the Fisher exact test, logistic regression analysis, and calculation of sensitivity, specificity, and positive and negative predictive values. A total of 208 patients were included in this study; 184 had histologically proven acute appendicitis. The other 24 patients had either non-appendicitis pathology or a normal appendix. Logistic regression analysis showed statistically significant associations between appendicitis and white cell count, neutrophil count, C-reactive protein, and bilirubin. Neutrophil count was the test with the highest sensitivity and negative predictive value, whereas bilirubin was the test with the highest specificity and positive predictive value (PPV). US and CT scans had high sensitivity and PPV for diagnosing appendicitis. No single test was sufficient on its own to diagnose or exclude acute appendicitis. Combining tests with high sensitivity (abnormal neutrophil count, and US and CT scans) and high specificity (raised bilirubin) may predict acute appendicitis more accurately.
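
The screening metrics used in this study follow directly from a 2x2 diagnostic table. A minimal sketch, with hypothetical cell counts that are illustrative only and not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 diagnostic table."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among non-diseased
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for a single predictor (e.g. a raised neutrophil count):
m = diagnostic_metrics(tp=170, fp=6, fn=14, tn=18)
```

A test with high sensitivity (few false negatives) is useful for exclusion; one with high specificity (few false positives) is useful for confirmation, which is why the study recommends combining them.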

  20. Monte Carlo sensitivity analysis of unknown parameters in hazardous materials transportation risk assessment.

    PubMed

    Pet-Armacost, J J; Sepulveda, J; Sakude, M

    1999-12-01

    The US Department of Transportation was interested in the risks associated with transporting Hydrazine in tanks with and without relief devices. Hydrazine is both highly toxic and flammable, as well as corrosive. Consequently, there was a conflict as to whether a relief device should be used or not. Data were not available on the impact of relief devices on release probabilities or the impact of Hydrazine on the likelihood of fires and explosions. In this paper, a Monte Carlo sensitivity analysis of the unknown parameters was used to assess the risks associated with highway transport of Hydrazine. To help determine whether or not relief devices should be used, fault trees and event trees were used to model the sequences of events that could lead to adverse consequences during transport of Hydrazine. The event probabilities in the event trees were derived as functions of the parameters whose effects were not known. The impacts of these parameters on the risk of toxic exposures, fires, and explosions were analyzed through a Monte Carlo sensitivity analysis and analyzed statistically through an analysis of variance. The analysis allowed the determination of which of the unknown parameters had a significant impact on the risks. It also provided the necessary support to a critical transportation decision even though the values of several key parameters were not known.
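
The core of the approach (propagate uniform uncertainty in the unknown event probabilities through the event tree, then see which parameter drives the risk) can be sketched as follows. The event tree is reduced here to a single release-then-ignition branch, and all parameter ranges are hypothetical:

```python
import random

random.seed(1)  # reproducible sampling

def fire_risk(p_release, p_ignite):
    # Reduced event tree: a fire requires a release followed by ignition.
    return p_release * p_ignite

# Unknown parameters drawn from wide uniform ranges (hypothetical bounds).
samples = [(random.uniform(0.01, 0.2), random.uniform(0.05, 0.5))
           for _ in range(5000)]
risks = [fire_risk(pr, pi) for pr, pi in samples]

def corr(xs, ys):
    """Pearson correlation, used here as a simple sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

sens_release = corr([s[0] for s in samples], risks)
sens_ignite = corr([s[1] for s in samples], risks)
```

The paper itself attributed the Monte Carlo output variance with an analysis of variance rather than plain correlations; this sketch only shows the sampling-and-attribution loop.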

  1. A Methodological Review of US Budget-Impact Models for New Drugs.

    PubMed

    Mauskopf, Josephine; Earnshaw, Stephanie

    2016-11-01

    A budget-impact analysis is required by many jurisdictions when adding a new drug to the formulary. However, previous reviews have indicated that adherence to methodological guidelines is variable. In this methodological review, we assess the extent to which US budget-impact analyses for new drugs use recommended practices. We describe recommended practice for seven key elements in the design of a budget-impact analysis. Targeted literature searches for US studies reporting estimates of the budget impact of a new drug were performed, and we prepared a summary of how each study addressed the seven key elements. The primary finding from this review is that recommended practice is not followed in many budget-impact analyses. For example, we found that growth in the treated population size and/or changes in disease-related costs expected during the model time horizon for more effective treatments were not included in several analyses for chronic conditions. In addition, the majority of the models did not capture all drug-related costs. Finally, for most studies, one-way sensitivity and scenario analyses were very limited, and the ranges used in one-way sensitivity analyses were frequently arbitrary percentages rather than being data driven. The conclusions from our review are that changes in population size, disease severity mix, and/or disease-related costs should be properly accounted for to avoid over- or underestimating the budget impact. Since each budget holder might have different perspectives and different values for many of the input parameters, it is also critical for published budget-impact analyses to include extensive sensitivity and scenario analyses based on realistic input values.
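
The review's recommendation of data-driven one-way sensitivity analyses can be made concrete with a small sketch. The budget-impact formula, parameter names, and low/high ranges below are all hypothetical:

```python
def budget_impact(pop, uptake, cost_new, cost_old, offset):
    """Annual budget impact of switching a fraction `uptake` of `pop`
    eligible patients to the new drug, net of disease-related cost offsets."""
    return pop * uptake * (cost_new - cost_old - offset)

base = dict(pop=10_000, uptake=0.2, cost_new=5_000, cost_old=3_000, offset=500)

# One-way sensitivity analysis: vary one parameter at a time over a
# data-driven (here: made-up) low/high range, holding the rest at base case.
ranges = {"uptake": (0.1, 0.3), "offset": (200, 800)}
tornado = {p: (budget_impact(**{**base, p: lo}),
               budget_impact(**{**base, p: hi}))
           for p, (lo, hi) in ranges.items()}
```

The resulting low/high pairs are what a tornado diagram plots; ranges sourced from data, rather than arbitrary plus-or-minus percentages, are what the review calls for.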

  2. An Image Encryption Algorithm Utilizing Julia Sets and Hilbert Curves

    PubMed Central

    Sun, Yuanyuan; Chen, Lina; Xu, Rudan; Kong, Ruiqing

    2014-01-01

    Image encryption is an important and effective technique to protect image security. In this paper, a novel image encryption algorithm combining Julia sets and Hilbert curves is proposed. The algorithm utilizes Julia sets’ parameters to generate a random sequence as the initial keys, and obtains the final encryption keys by scrambling the initial keys along a Hilbert curve. The final cipher image is obtained by modulo arithmetic and a diffusion operation. The method needs only a few parameters for key generation, which greatly reduces the required storage space. Moreover, because of the properties of Julia sets, such as their infiniteness and chaotic characteristics, the keys are highly sensitive even to a tiny perturbation. The experimental results indicate that the algorithm has a large key space, good statistical properties, high key sensitivity, and effective resistance to the chosen-plaintext attack. PMID:24404181
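
The key-sensitivity property claimed for chaos-based ciphers is easy to demonstrate with a toy stream cipher. The logistic map below merely stands in for the paper's Julia-set key generation, and the whole construction is illustrative rather than the published algorithm:

```python
def logistic_keystream(x0, n, r=3.99):
    """Keystream bytes from a chaotic logistic-map orbit (toy stand-in
    for the Julia-set key generation described in the abstract)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

def encrypt(data, x0):
    # XOR stream cipher; applying it twice with the same key decrypts.
    return bytes(b ^ k for b, k in zip(data, logistic_keystream(x0, len(data))))

img = bytes(range(256)) * 4          # stand-in for image pixel data
c1 = encrypt(img, 0.600000000)
c2 = encrypt(img, 0.600000001)       # key perturbed by only 1e-9
diff_fraction = sum(a != b for a, b in zip(c1, c2)) / len(c1)
```

After a short transient the two orbits decorrelate completely, so almost every ciphertext byte differs: this is the "high sensitivity even to a tiny perturbation" that key sensitivity analysis measures.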

  3. Bi-directional gene set enrichment and canonical correlation analysis identify key diet-sensitive pathways and biomarkers of metabolic syndrome.

    PubMed

    Morine, Melissa J; McMonagle, Jolene; Toomey, Sinead; Reynolds, Clare M; Moloney, Aidan P; Gormley, Isobel C; Gaora, Peadar O; Roche, Helen M

    2010-10-07

    Currently, a number of bioinformatics methods are available to generate appropriate lists of genes from a microarray experiment. While these lists represent an accurate primary analysis of the data, fewer options exist to contextualise those lists. The development and validation of such methods is crucial to the wider application of microarray technology in the clinical setting. Two key challenges in clinical bioinformatics involve appropriate statistical modelling of dynamic transcriptomic changes, and extraction of clinically relevant meaning from very large datasets. Here, we apply an approach to gene set enrichment analysis that allows for detection of bi-directional enrichment within a gene set. Furthermore, we apply canonical correlation analysis and Fisher's exact test, using plasma marker data with known clinical relevance to aid identification of the most important gene and pathway changes in our transcriptomic dataset. After a 28-day dietary intervention with high-CLA beef, a range of plasma markers indicated a marked improvement in the metabolic health of genetically obese mice. Tissue transcriptomic profiles indicated that the effects were most dramatic in liver (1270 genes significantly changed; p < 0.05), followed by muscle (601 genes) and adipose (16 genes). Results from modified GSEA showed that the high-CLA beef diet affected diverse biological processes across the three tissues, and that the majority of pathway changes reached significance only with the bi-directional test. Combining the liver tissue microarray results with plasma marker data revealed 110 CLA-sensitive genes showing strong canonical correlation with one or more plasma markers of metabolic health, and 9 significantly overrepresented pathways among this set; each of these pathways was also significantly changed by the high-CLA diet. 
Closer inspection of two of these pathways--selenoamino acid metabolism and steroid biosynthesis--illustrated clear diet-sensitive changes in constituent genes, as well as strong correlations between gene expression and plasma markers of metabolic syndrome independent of the dietary effect. Bi-directional gene set enrichment analysis more accurately reflects dynamic regulatory behaviour in biochemical pathways, and as such highlighted biologically relevant changes that were not detected using a traditional approach. In such cases where transcriptomic response to treatment is exceptionally large, canonical correlation analysis in conjunction with Fisher's exact test highlights the subset of pathways showing strongest correlation with the clinical markers of interest. In this case, we have identified selenoamino acid metabolism and steroid biosynthesis as key pathways mediating the observed relationship between metabolic health and high-CLA beef. These results indicate that this type of analysis has the potential to generate novel transcriptome-based biomarkers of disease.
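
Of the statistical ingredients above, Fisher's exact test for pathway overrepresentation is the simplest to sketch. The gene counts below are hypothetical, and the function computes the one-sided hypergeometric tail directly:

```python
from math import comb

def enrichment_pvalue(k, n, K, N):
    """One-sided Fisher's exact (hypergeometric tail) p-value for seeing
    at least k of n selected genes fall inside a pathway of K genes,
    out of N genes on the array."""
    tail = sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1))
    return tail / comb(N, n)

# Hypothetical numbers: 8 of 110 marker-correlated genes land in a
# 40-gene pathway on a 10,000-gene array.
p = enrichment_pvalue(k=8, n=110, K=40, N=10_000)
```

With an expected overlap below one gene, an observed overlap of eight gives a vanishingly small p-value, which is the sense in which a pathway is called significantly overrepresented.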

  4. Bi-directional gene set enrichment and canonical correlation analysis identify key diet-sensitive pathways and biomarkers of metabolic syndrome

    PubMed Central

    2010-01-01

    Background Currently, a number of bioinformatics methods are available to generate appropriate lists of genes from a microarray experiment. While these lists represent an accurate primary analysis of the data, fewer options exist to contextualise those lists. The development and validation of such methods is crucial to the wider application of microarray technology in the clinical setting. Two key challenges in clinical bioinformatics involve appropriate statistical modelling of dynamic transcriptomic changes, and extraction of clinically relevant meaning from very large datasets. Results Here, we apply an approach to gene set enrichment analysis that allows for detection of bi-directional enrichment within a gene set. Furthermore, we apply canonical correlation analysis and Fisher's exact test, using plasma marker data with known clinical relevance to aid identification of the most important gene and pathway changes in our transcriptomic dataset. After a 28-day dietary intervention with high-CLA beef, a range of plasma markers indicated a marked improvement in the metabolic health of genetically obese mice. Tissue transcriptomic profiles indicated that the effects were most dramatic in liver (1270 genes significantly changed; p < 0.05), followed by muscle (601 genes) and adipose (16 genes). Results from modified GSEA showed that the high-CLA beef diet affected diverse biological processes across the three tissues, and that the majority of pathway changes reached significance only with the bi-directional test. Combining the liver tissue microarray results with plasma marker data revealed 110 CLA-sensitive genes showing strong canonical correlation with one or more plasma markers of metabolic health, and 9 significantly overrepresented pathways among this set; each of these pathways was also significantly changed by the high-CLA diet. 
Closer inspection of two of these pathways - selenoamino acid metabolism and steroid biosynthesis - illustrated clear diet-sensitive changes in constituent genes, as well as strong correlations between gene expression and plasma markers of metabolic syndrome independent of the dietary effect. Conclusion Bi-directional gene set enrichment analysis more accurately reflects dynamic regulatory behaviour in biochemical pathways, and as such highlighted biologically relevant changes that were not detected using a traditional approach. In such cases where transcriptomic response to treatment is exceptionally large, canonical correlation analysis in conjunction with Fisher's exact test highlights the subset of pathways showing strongest correlation with the clinical markers of interest. In this case, we have identified selenoamino acid metabolism and steroid biosynthesis as key pathways mediating the observed relationship between metabolic health and high-CLA beef. These results indicate that this type of analysis has the potential to generate novel transcriptome-based biomarkers of disease. PMID:20929581

  5. An optical fiber spool for laser stabilization with reduced acceleration sensitivity to 10⁻¹²/g

    NASA Astrophysics Data System (ADS)

    Hu, Yong-Qi; Dong, Jing; Huang, Jun-Chao; Li, Tang; Liu, Liang

    2015-10-01

    Environmental vibration causes mechanical deformation in optical fibers, which induces excess frequency noise in fiber-stabilized lasers. To solve this problem, we propose an ultralow-acceleration-sensitivity fiber spool with a symmetrically mounted structure. By numerical analysis with the finite element method, we obtain the optimal geometry parameters of the spool, with which the horizontal and vertical acceleration sensitivities can be reduced to 3.25 × 10⁻¹²/g and 5.38 × 10⁻¹²/g, respectively. Moreover, the structure is insensitive to variations in the geometry parameters, which minimizes the influence of numerical simulation error and manufacturing tolerance. Project supported by the National Natural Science Foundation of China (Grant Nos. 11034008 and 11274324) and the Key Research Program of the Chinese Academy of Sciences (Grant No. KJZD-EW-W02).

  6. Optical signal monitoring in phase modulated optical fiber transmission systems

    NASA Astrophysics Data System (ADS)

    Zhao, Jian

    Optical performance monitoring (OPM) is one of the essential functions for future high-speed optical networks. Among the parameters to be monitored, chromatic dispersion (CD) is especially important since it has a significant impact on overall system performance. In this thesis, effective CD monitoring approaches for phase-shift keying (PSK) based optical transmission systems are investigated. A number of monitoring schemes based on radio frequency (RF) spectrum analysis and delay-tap sampling are proposed and their performance evaluated. A method for dispersion monitoring of differential phase-shift keying (DPSK) signals based on RF power detection is studied. The RF power spectrum is found to increase with CD and decrease with polarization mode dispersion (PMD). The spectral power density dependence on CD is studied theoretically and then verified through simulations and experiments. The monitoring sensitivity for nonreturn-to-zero differential phase-shift keying (NRZ-DPSK) and return-to-zero differential phase-shift keying (RZ-DPSK) based systems can reach 80ps/nm/dB and 34ps/nm/dB respectively. The scheme enables the monitoring of differential group delay (DGD) and CD simultaneously. The monitoring sensitivity of CD and DGD can reach 56.7ps/nm/dB and 3.1ps/dB using a bandpass filter. The effects of optical signal-to-noise ratio (OSNR), DGD, fiber nonlinearity and chirp on the monitoring results are investigated. Two RF pilot tones are employed for CD monitoring of DPSK signals. Specially selected pilot tone frequencies enable good monitoring sensitivity with minimum influence on the received signals. A dynamic range exceeding 35dB and a monitoring sensitivity up to 9.5ps/nm/dB are achieved. An asynchronous sampling technique is also employed for CD monitoring: a signed CD monitoring method for 10Gb/s NRZ-DPSK and RZ-DPSK systems using asynchronous delay-tap sampling is studied. 
The demodulated signals suffer asymmetric waveform distortion if there is a phase error (Δφ) in the delay interferometer (DI) and residual CD is present. Using delay-tap sampling, the scatter plots reflect this signal distortion through their asymmetric characteristics. A distance ratio (DR) is defined to represent the change of the scatter plots, which is directly related to the accumulated CD. The monitoring range can be up to +/-400ps/nm and +/-720ps/nm for 10Gb/s NRZ-DPSK and RZ-DPSK signals with a 45° phase error in the DI. The monitoring sensitivity reaches +/-8ps/nm and CD polarity discrimination is realized. It is found that the signal degradation is related to the increase of the absolute value of the CD or phase mismatch. The effect of the polarity of the phase error on CD monitoring is also analyzed. The shoulder location depends on the sign of the product DLΔφ: if DLΔφ > 0 the shoulder appears on the trailing edge, whereas if DLΔφ < 0 it appears on the leading edge. The analysis shows that the phase error is equivalent to a frequency offset of the optical source, so signed frequency offset monitoring is also demonstrated. The monitoring results show that the monitoring range can reach +/-2.2GHz and the monitoring sensitivity is around 27MHz. The effect of nonlinearity, OSNR and the bandwidth of the lowpass filter on the proposed monitoring method has also been studied. Signed CD monitoring for a 100Gb/s carrier-suppressed return-to-zero differential quadrature phase-shift keying (CSRZ-DQPSK) system based on the delay-tap sampling technique is demonstrated. The monitoring range and monitoring resolution can go up to +/-32ps/nm and +/-8ps/nm, respectively. A signed CD and optical carrier wavelength monitoring scheme using a cross-correlation method for an on-off keying (OOK) wavelength division multiplexing (WDM) system is proposed and demonstrated. 
CD monitoring sensitivity is high and can be less than 10% of the bit period. Wavelength monitoring is implemented using the proposed approach. The monitoring results show that the sensitivity can reach up to 1.37ps/GHz.

  7. Evaluation of peak-picking algorithms for protein mass spectrometry.

    PubMed

    Bauer, Chris; Cramer, Rainer; Schuchhardt, Johannes

    2011-01-01

    Peak picking is an early key step in MS data analysis. We compare three commonly used approaches to peak picking and discuss their merits by means of statistical analysis. Methods investigated encompass signal-to-noise ratio, continuous wavelet transform, and a correlation-based approach using a Gaussian template. Functionality of the three methods is illustrated and discussed in a practical context using a mass spectral data set created with MALDI-TOF technology. Sensitivity and specificity are investigated using a manually defined reference set of peaks. As an additional criterion, the robustness of the three methods is assessed by a perturbation analysis and illustrated using ROC curves.
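
Of the three approaches compared, the signal-to-noise criterion is the simplest to sketch. The threshold, the median-based noise estimate, and the toy spectrum below are illustrative choices, not the paper's exact implementation:

```python
def pick_peaks(intensities, snr_threshold=3.0):
    """Naive S/N peak picking: keep local maxima whose intensity exceeds
    snr_threshold times a median-based noise estimate."""
    noise = sorted(intensities)[len(intensities) // 2] or 1e-12
    return [i for i in range(1, len(intensities) - 1)
            if intensities[i] > intensities[i - 1]
            and intensities[i] >= intensities[i + 1]
            and intensities[i] / noise >= snr_threshold]

spectrum = [1, 1, 2, 9, 2, 1, 1, 8, 1, 1, 2, 1]
peaks = pick_peaks(spectrum)
```

Raising the threshold trades sensitivity for specificity, which is exactly the ROC-curve trade-off the paper evaluates against a manually defined reference peak set.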

  8. The economics of protecting tiger populations: Linking household behavior to poaching and prey depletion

    USGS Publications Warehouse

    Damania, R.; Stringer, R.; Karanth, K.U.; Stith, B.

    2003-01-01

    The tiger (Panthera tigris) is classified as endangered and populations continue to decline. This paper presents a formal economic analysis of the two most imminent threats to the survival of wild tigers: poaching tigers and hunting their prey. A model is developed to examine interactions between tigers and farm households living in and around tiger habitats. The analysis extends the existing literature on tiger demography, incorporating predator-prey interactions and exploring the sensitivity of tiger populations to key economic parameters. The analysis aims to contribute to policy debates on how best to protect one of the world's most endangered wild cats.

  9. Alternatives for discounting in the analysis of noninferiority trials.

    PubMed

    Snapinn, Steven M

    2004-05-01

    Determining the efficacy of an experimental therapy relative to placebo on the basis of an active-control noninferiority trial requires reference to historical placebo-controlled trials. The validity of the resulting comparison depends on two key assumptions: assay sensitivity and constancy. Since the truth of these assumptions cannot be verified, it seems logical to raise the standard of evidence required to declare efficacy; this concept is referred to as discounting. It is not often recognized that two common design and analysis approaches, setting a noninferiority margin and requiring preservation of a fraction of the standard therapy's effect, are forms of discounting. The noninferiority margin is a particularly poor approach, since its degree of discounting depends on an irrelevant factor. Preservation of effect is more reasonable, but it addresses only the constancy assumption, not the issue of assay sensitivity. Gaining consensus on the most appropriate approach to the design and analysis of noninferiority trials will require a common understanding of the concept of discounting.
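
The effect-preservation form of discounting mentioned above has a common formalization: require the lower confidence bound of the test-minus-control difference to exceed minus (1 - f) times the historical control effect. A minimal sketch with hypothetical numbers:

```python
def preserves_effect(ci_lower_diff, historical_effect, fraction=0.5):
    """Effect-preservation criterion: the new therapy retains at least
    `fraction` of the standard therapy's historical effect if the lower
    confidence bound of (test - control) exceeds -(1 - fraction) * effect.
    All inputs are on the same scale, with positive values favouring treatment."""
    return ci_lower_diff > -(1.0 - fraction) * historical_effect

# Hypothetical trial: historical control-vs-placebo effect of 4.0 units,
# observed lower CI bound of the test-minus-control difference of -1.0.
ok = preserves_effect(ci_lower_diff=-1.0, historical_effect=4.0)
```

As the abstract notes, this criterion discounts for the constancy assumption but not for assay sensitivity.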

  10. Application of IATA - A case study in evaluating the global and local performance of a Bayesian Network model for Skin Sensitization

    EPA Science Inventory

    Since the publication of the Adverse Outcome Pathway (AOP) for skin sensitization, there have been many efforts to develop systematic approaches to integrate the information generated from different key events for decision making. The types of information characterizing key event...

  11. Immersion lithography defectivity analysis at DUV inspection wavelength

    NASA Astrophysics Data System (ADS)

    Golan, E.; Meshulach, D.; Raccah, N.; Yeo, J. Ho.; Dassa, O.; Brandl, S.; Schwarz, C.; Pierson, B.; Montgomery, W.

    2007-03-01

    Significant effort has been directed in recent years towards the realization of immersion lithography at 193nm wavelength. Immersion lithography is likely a key enabling technology for the production of critical layers for 45nm and 32nm design rule (DR) devices. In spite of the significant progress in immersion lithography technology, several key technology issues remain, the most critical being defects induced by the immersion lithography process. The benefits in optical resolution and depth of focus made possible by immersion lithography are well understood. Yet, these benefits cannot come at the expense of increased defect counts and decreased production yield. Understanding the impact of the immersion lithography process parameters on wafer defect formation and defect counts, together with the ability to monitor, control and minimize the defect counts down to acceptable levels, is imperative for the successful introduction of immersion lithography for production of advanced DRs. In this report, we present experimental results of immersion lithography defectivity analysis focused on topcoat layer thickness parameters and resist bake temperatures. Wafers were exposed on the 1150i-α-immersion scanner and 1200B Scanner (ASML); defect inspection was performed using a DUV inspection tool (UVision™, Applied Materials). Higher sensitivity was demonstrated at DUV through detection of small defects not detected at the visible wavelength, indicating the potential sensitivity benefits of DUV inspection for this layer. The analysis indicates that certain types of defects are associated with different immersion process parameters. This type of analysis at DUV wavelengths enables the optimization of immersion lithography processes, and thus the qualification of immersion processes for volume production.

  12. Implementation of Point-of-Care Diagnostics in Rural Primary Healthcare Clinics in South Africa: Perspectives of Key Stakeholders.

    PubMed

    Mashamba-Thompson, Tivani P; Jama, Ngcwalisa A; Sartorius, Benn; Drain, Paul K; Thompson, Rowan M

    2017-01-08

    Key stakeholders' involvement is crucial to the sustainability of quality point-of-care (POC) diagnostics services in low-and-middle income countries. The aim of this study was to explore key stakeholder perceptions on the implementation of POC diagnostics in rural primary healthcare (PHC) clinics in South Africa. We conducted a qualitative study encompassing in-depth interviews with multiple key stakeholders of POC diagnostic services for rural and resource-limited PHC clinics. Interviews were digitally recorded and transcribed verbatim prior to thematic content analysis. Thematic content analysis was conducted using themes guided by the World Health Organisation (WHO) quality-ASSURED (Affordable, Sensitive, Specific, User friendly, Rapid and to enable treatment at first visit and Robust, Equipment free and Delivered to those who need it) criteria for POC diagnostic services in resource-limited settings. 11 key stakeholders participated in the study. All stakeholders perceived the main advantage of POC diagnostics as enabling access to healthcare for rural patients. Stakeholders perceived the current POC diagnostic services to have an ability to meet patients' needs, but recommended further improvement of the following areas: research on cost-effectiveness; improved quality management systems; development of affordable POC diagnostic and clinic-based monitoring and evaluation. Key stakeholders of POC diagnostics in rural PHC clinics in South Africa highlighted the need to assess affordability and ensure quality assurance of current services before adopting new POC diagnostics and scaling up current POC diagnostics.

  13. Implementation of Point-of-Care Diagnostics in Rural Primary Healthcare Clinics in South Africa: Perspectives of Key Stakeholders

    PubMed Central

    Mashamba-Thompson, Tivani P.; Jama, Ngcwalisa A.; Sartorius, Benn; Drain, Paul K.; Thompson, Rowan M.

    2017-01-01

    Introduction: Key stakeholders’ involvement is crucial to the sustainability of quality point-of-care (POC) diagnostics services in low-and-middle income countries. The aim of this study was to explore key stakeholder perceptions on the implementation of POC diagnostics in rural primary healthcare (PHC) clinics in South Africa. Method: We conducted a qualitative study encompassing in-depth interviews with multiple key stakeholders of POC diagnostic services for rural and resource-limited PHC clinics. Interviews were digitally recorded and transcribed verbatim prior to thematic content analysis. Thematic content analysis was conducted using themes guided by the World Health Organisation (WHO) quality-ASSURED (Affordable, Sensitive, Specific, User friendly, Rapid and to enable treatment at first visit and Robust, Equipment free and Delivered to those who need it) criteria for POC diagnostic services in resource-limited settings. Results: 11 key stakeholders participated in the study. All stakeholders perceived the main advantage of POC diagnostics as enabling access to healthcare for rural patients. Stakeholders perceived the current POC diagnostic services to have an ability to meet patients’ needs, but recommended further improvement of the following areas: research on cost-effectiveness; improved quality management systems; development of affordable POC diagnostic and clinic-based monitoring and evaluation. Conclusions: Key stakeholders of POC diagnostics in rural PHC clinics in South Africa highlighted the need to assess affordability and ensure quality assurance of current services before adopting new POC diagnostics and scaling up current POC diagnostics. PMID:28075337

  14. A structured framework for assessing sensitivity to missing data assumptions in longitudinal clinical trials.

    PubMed

    Mallinckrodt, C H; Lin, Q; Molenberghs, M

    2013-01-01

    The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result to those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern mixture models and shared parameter models) and to MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that after dropout the trajectory of drug-treated patients was that of placebo-treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was -2.79 (p = .013). In placebo multiple imputation, the result was -2.17. Results from the other sensitivity analyses ranged from -2.21 to -3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from missing not at random data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework of using a worst reasonable case result, based on a controlled imputation approach with transparent and debatable assumptions and supplemented by a series of plausible alternative models under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. An array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction from two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method to a known-public-key attack is also provided.
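
    The truncation principle can be illustrated with a minimal Jones-matrix sketch (a generic Python illustration of polarizer transmittance on an elliptical polarization state, not the authors' cryptosystem; the function name and parameters are hypothetical): a linear polarizer projects the Jones vector onto its transmission axis, and the measured intensity is the squared magnitude of that projection.

```python
import math

def polarizer_intensity(ex, ey, theta):
    """Intensity transmitted when the Jones vector (ex, ey), possibly
    complex-valued (elliptical polarization), passes a linear polarizer
    oriented at angle theta (radians).

    The polarizer projects the field onto its axis:
    E_out = ex*cos(theta) + ey*sin(theta), and I = |E_out|**2.
    """
    e_out = ex * math.cos(theta) + ey * math.sin(theta)
    return abs(e_out) ** 2
```

    For a purely linear input this reduces to Malus's law (I = cos^2 theta), while circular polarization transmits a constant fraction 1/2 at every polarizer angle, which is the sense in which a polarizer array "truncates" the spatially varying elliptical distribution to scalar intensities.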

  16. A sensitivity analysis of a surface energy balance model to LAI (Leaf Area Index)

    NASA Astrophysics Data System (ADS)

    Maltese, A.; Cannarozzo, M.; Capodici, F.; La Loggia, G.; Santangelo, T.

    2008-10-01

    The LAI is a key parameter in hydrological processes, especially in physically based distributed models. It is a critical ecosystem attribute, since physiological processes such as photosynthesis, transpiration and evaporation depend on it. The diffusion of water vapor, momentum, heat and light through the canopy is regulated by the distribution and density of the leaves, branches, twigs and stems. The LAI influences the sensible heat flux H in single-source surface energy balance models through the calculation of the roughness length and of the displacement height. The aerodynamic resistance between the soil and the within-canopy source height is a function of the LAI through the roughness length. This research carried out a sensitivity analysis of some of the most important parameters of surface energy balance models to the time variation of the LAI, in order to take into account the effects of LAI variation over the phenological period. Finally, empirically retrieved relationships between field spectroradiometric data and field LAI measured with a light-sensitive instrument are presented for a cereal field.

  17. Image encryption with chaotic map and Arnold transform in the gyrator transform domains

    NASA Astrophysics Data System (ADS)

    Sang, Jun; Luo, Hongling; Zhao, Jun; Alam, Mohammad S.; Cai, Bin

    2017-05-01

    An image encryption method combining a chaotic map and the Arnold transform in the gyrator transform domains was proposed. Firstly, the original secret image is XOR-ed with a random binary sequence generated by a logistic map. Then, the gyrator transform is performed. Finally, the amplitude and phase of the gyrator transform are permuted by the Arnold transform. The decryption procedure is the inverse operation of encryption. The secret keys used in the proposed method include the control parameter and the initial value of the logistic map, the rotation angle of the gyrator transform, and the transform number of the Arnold transform. Therefore, the key space is large, while the key data volume is small. Numerical simulation was conducted to demonstrate the effectiveness of the proposed method, and the security analysis was performed in terms of the histogram of the encrypted image, the sensitivity to the secret keys, decryption upon ciphertext loss, and resistance to the chosen-plaintext attack.
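
    Two of the generic building blocks named above, the logistic-map keystream XOR and the Arnold pixel permutation, can be sketched in a few lines of Python. This is an illustrative sketch only: the gyrator transform step is omitted, and the parameter values (r = 3.99, x0 = 0.4, the byte quantization) are hypothetical, not taken from the paper.

```python
def logistic_keystream(r, x0, n):
    """Generate n keystream bytes by iterating the logistic map x -> r*x*(1-x)."""
    x = x0
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)  # quantize chaotic orbit to a byte
    return out

def xor_bytes(data, key):
    """XOR plaintext bytes with the keystream; applying it twice decrypts."""
    return [d ^ k for d, k in zip(data, key)]

def arnold(img, iterations=1):
    """Arnold transform on an N x N image: (x, y) -> (x + y, x + 2y) mod N,
    a bijective, area-preserving pixel scramble (the 'cat map')."""
    n = len(img)
    for _ in range(iterations):
        new = [[0] * n for _ in range(n)]
        for y in range(n):
            for x in range(n):
                new[(x + 2 * y) % n][(x + y) % n] = img[y][x]
        img = new
    return img
```

    Because the map is bijective, repeated application eventually returns the original image (the period depends on N), which is why the iteration count can serve as part of the key.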

  18. Modeling and projection of dengue fever cases in Guangzhou based on variation of weather factors.

    PubMed

    Li, Chenlu; Wang, Xiaofeng; Wu, Xiaoxu; Liu, Jianing; Ji, Duoying; Du, Juan

    2017-12-15

    Dengue fever is one of the most serious vector-borne infectious diseases, especially in Guangzhou, China. Dengue viruses and their vector Aedes albopictus are sensitive to climate change, primarily in relation to weather factors. Previous research has mainly focused on identifying the relationship between climate factors and dengue cases, or on developing dengue case models with some non-climate factors. However, there has been little research addressing the modeling and projection of dengue cases solely from the perspective of climate change. This study considered this topic using long time series data (1998-2014). First, sensitive weather factors were identified through meta-analysis that included literature review screening, lagged analysis, and collinearity analysis. Key factors were determined to be monthly average temperature at a lag of two months, and monthly average relative humidity and monthly average precipitation at lags of three months. Second, time series Poisson analysis was used with the generalized additive model approach to develop a dengue model based on the key weather factors for January 1998 to December 2012. Data from January 2013 to July 2014 were used to validate that the model was reliable and reasonable. Finally, future weather data (January 2020 to December 2070) were input into the model to project the occurrence of dengue cases under different climate scenarios (RCP 2.6 and RCP 8.5). Longer time series analysis and scientifically selected weather variables were used to develop the dengue model to ensure reliability. The projections suggested that seasonal disease control (especially in summer and fall) and mitigation of greenhouse gas emissions could help reduce the incidence of dengue fever. This study aims to provide a scientific and theoretical basis for the prevention and control of dengue fever in Guangzhou. Copyright © 2017 Elsevier B.V. All rights reserved.
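
    The lag structure described above (temperature at lag 2; humidity and precipitation at lag 3) amounts to a simple alignment of monthly series before model fitting. A minimal Python sketch of that alignment step follows; the function name and example data are hypothetical, and the actual study fitted a Poisson generalized additive model on the resulting pairs.

```python
def lagged_features(cases, temp, humidity, precip, lag_t=2, lag_h=3, lag_p=3):
    """Pair each month's dengue case count with weather predictors at the
    stated lags: temperature lagged lag_t months, relative humidity and
    precipitation lagged lag_h and lag_p months. Returns (y, X) rows
    usable by any Poisson regression / GAM fitting routine."""
    start = max(lag_t, lag_h, lag_p)  # first month with all lags available
    y, X = [], []
    for t in range(start, len(cases)):
        y.append(cases[t])
        X.append([temp[t - lag_t], humidity[t - lag_h], precip[t - lag_p]])
    return y, X
```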

  19. Data mining methods in the prediction of Dementia: A real-data comparison of the accuracy, sensitivity and specificity of linear discriminant analysis, logistic regression, neural networks, support vector machines, classification trees and random forests

    PubMed Central

    2011-01-01

    Background Dementia and cognitive impairment associated with aging are a major medical and social concern. Neuropsychological testing is a key element in the diagnostic procedures of Mild Cognitive Impairment (MCI), but at present has limited value in the prediction of progression to dementia. We advance the hypothesis that newer statistical classification methods derived from data mining and machine learning, such as Neural Networks, Support Vector Machines and Random Forests, can improve the accuracy, sensitivity and specificity of predictions obtained from neuropsychological testing. Seven non-parametric classifiers derived from data mining methods (Multilayer Perceptron Neural Networks, Radial Basis Function Neural Networks, Support Vector Machines, CART, CHAID and QUEST Classification Trees, and Random Forests) were compared to three traditional classifiers (Linear Discriminant Analysis, Quadratic Discriminant Analysis and Logistic Regression) in terms of overall classification accuracy, specificity, sensitivity, area under the ROC curve and Press' Q. Model predictors were 10 neuropsychological tests currently used in the diagnosis of dementia. Statistical distributions of classification parameters obtained from a 5-fold cross-validation were compared using Friedman's nonparametric test. Results Press' Q test showed that all classifiers performed better than chance alone (p < 0.05). Support Vector Machines showed the largest overall classification accuracy (median (Me) = 0.76) and area under the ROC curve (Me = 0.90). However, this method showed high specificity (Me = 1.0) but low sensitivity (Me = 0.3). Random Forests ranked second in overall accuracy (Me = 0.73), with high area under the ROC curve (Me = 0.73), specificity (Me = 0.73) and sensitivity (Me = 0.64). Linear Discriminant Analysis also showed acceptable overall accuracy (Me = 0.66), with acceptable area under the ROC curve (Me = 0.72), specificity (Me = 0.66) and sensitivity (Me = 0.64). The remaining classifiers showed overall classification accuracy above a median value of 0.63, but for most, sensitivity was around or even lower than a median value of 0.5. Conclusions When taking into account sensitivity, specificity and overall classification accuracy, Random Forests and Linear Discriminant Analysis rank first among all the classifiers tested in prediction of dementia using several neuropsychological tests. These methods may be used to improve the accuracy, sensitivity and specificity of dementia predictions from neuropsychological testing. PMID:21849043
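
    The three headline metrics compared above come straight from the confusion matrix. A small self-contained Python sketch (generic, not the study's code; the function name is hypothetical) makes the definitions explicit:

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Overall accuracy, sensitivity (true-positive rate on the class of
    interest, e.g. progression to dementia) and specificity (true-negative
    rate), computed from paired true/predicted labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn) if tp + fn else 0.0,
        "specificity": tn / (tn + fp) if tn + fp else 0.0,
    }
```

    The SVM result above (specificity 1.0, sensitivity 0.3) illustrates why accuracy alone is not enough: a classifier can look accurate while missing most true progressors.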

  20. Expendable vs reusable propulsion systems cost sensitivity

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph W.; Dodd, Glenn R.

    1989-01-01

    One of the key trade studies that must be considered when studying any new space transportation hardware is whether to go reusable or expendable. An analysis is presented here for such a trade relative to a proposed Liquid Rocket Booster (LRB) being studied at MSFC. The assumptions or inputs to the trade were developed and integrated into a model that compares the life-cycle costs of both a reusable LRB and an expendable LRB. Sensitivities were run by varying the input variables to see their effect on total cost. In addition, a Monte Carlo simulation was run to determine the amount of cost risk that may be involved in a decision to reuse or expend.
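
    The structure of such a cost-risk Monte Carlo can be sketched in a few lines of Python. This is a toy illustration of the approach only: the cost equations are deliberately simple, and every dollar figure, range, and variable name below is hypothetical, not from the MSFC study.

```python
import random

def lcc_expendable(n_flights, unit_cost):
    """Life-cycle cost when every flight expends one booster."""
    return n_flights * unit_cost

def lcc_reusable(n_flights, unit_cost, flights_per_unit, refurb_cost):
    """Life-cycle cost with reusable boosters: buy enough units to cover
    the manifest, plus per-flight refurbishment."""
    units = -(-n_flights // flights_per_unit)  # ceiling division
    return units * unit_cost + n_flights * refurb_cost

def prob_reuse_cheaper(n_trials=5000, seed=1):
    """Monte Carlo cost-risk sweep: sample uncertain inputs and report the
    fraction of trials in which the reusable option is cheaper."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_trials):
        flights = rng.randint(20, 60)
        exp_unit = rng.uniform(30.0, 50.0)     # $M, expendable booster (hypothetical)
        reuse_unit = rng.uniform(60.0, 100.0)  # $M, reusable booster (hypothetical)
        refurb = rng.uniform(2.0, 10.0)        # $M per flight refurbishment
        reuses = rng.randint(5, 20)            # flights per reusable unit
        if lcc_reusable(flights, reuse_unit, reuses, refurb) < lcc_expendable(flights, exp_unit):
            wins += 1
    return wins / n_trials
```

    The output fraction is one way to express "cost risk": how robust the reuse decision is to simultaneous uncertainty in flight rate, unit costs, refurbishment cost, and achievable reuses.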

  1. Global Sensitivity Analysis of OnGuard Models Identifies Key Hubs for Transport Interaction in Stomatal Dynamics

    PubMed Central

    Vialet-Chabrand, Silvere; Griffiths, Howard

    2017-01-01

    The physical requirement for charge to balance across biological membranes means that the transmembrane transport of each ionic species is interrelated, and manipulating solute flux through any one transporter will affect other transporters at the same membrane, often with unforeseen consequences. The OnGuard systems modeling platform has helped to resolve the mechanics of stomatal movements, uncovering previously unexpected behaviors of stomata. To date, however, the manual approach to exploring model parameter space has captured little formal information about the emergent connections between parameters that define the most interesting properties of the system as a whole. Here, we introduce global sensitivity analysis to identify interacting parameters affecting a number of outputs commonly accessed in experiments in Arabidopsis (Arabidopsis thaliana). The analysis highlights synergies between transporters affecting the balance between Ca2+ sequestration and Ca2+ release pathways, notably those associated with internal Ca2+ stores and their turnover. Other, unexpected synergies appear, including with the plasma membrane anion channels and H+-ATPase and with the tonoplast TPK K+ channel. These emergent synergies, and the core hubs of interaction that they define, identify subsets of transporters associated with free cytosolic Ca2+ concentration that represent key targets to enhance plant performance in the future. They also highlight the importance of interactions between the voltage regulation of the plasma membrane and tonoplast in coordinating transport between the different cellular compartments. PMID:28432256
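
    The quantity at the heart of such an analysis is the first-order (Sobol-style) sensitivity index, S_i = Var(E[Y | x_i]) / Var(Y). The brute-force estimator below is a generic two-input Python sketch for illustration only; it is not the OnGuard model, and the sample sizes and toy model in the test are hypothetical.

```python
import random
import statistics

def first_order_index(model, which, n_outer=200, n_inner=200, seed=0):
    """Brute-force first-order sensitivity index for a model of two
    independent Uniform(0, 1) inputs:
        S_i = Var( E[Y | x_i] ) / Var(Y).
    The outer loop fixes input `which`; the inner loop averages the output
    over the other input, giving one conditional mean per fixed value."""
    rng = random.Random(seed)
    cond_means, all_y = [], []
    for _ in range(n_outer):
        xi = rng.random()
        ys = []
        for _ in range(n_inner):
            xo = rng.random()
            args = (xi, xo) if which == 0 else (xo, xi)
            ys.append(model(*args))
        cond_means.append(statistics.fmean(ys))
        all_y.extend(ys)
    return statistics.pvariance(cond_means) / statistics.pvariance(all_y)
```

    Production global sensitivity analyses typically use more efficient sampling (e.g. Saltelli designs) and also report total-order indices, which capture the interaction "hubs" the study emphasizes.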

  2. Crowdsourcing and Automated Retinal Image Analysis for Diabetic Retinopathy.

    PubMed

    Mudie, Lucy I; Wang, Xueyang; Friedman, David S; Brady, Christopher J

    2017-09-23

    As the number of people with diabetic retinopathy (DR) in the USA is expected to increase threefold by 2050, the need to reduce health care costs associated with screening for this treatable disease is ever present. Crowdsourcing and automated retinal image analysis (ARIA) are two areas where new technology has been applied to reduce costs in screening for DR. This paper reviews the current literature surrounding these new technologies. Crowdsourcing has high sensitivity for normal vs abnormal images; however, when multiple categories for severity of DR are added, specificity is reduced. ARIAs have higher sensitivity and specificity, and some commercial ARIA programs are already in use. Deep learning enhanced ARIAs appear to offer even more improvement in ARIA grading accuracy. The utilization of crowdsourcing and ARIAs may be a key to reducing the time and cost burden of processing images from DR screening.

  3. Sensitivity analysis on the effect of key parameters on the performance of parabolic trough solar collectors

    NASA Astrophysics Data System (ADS)

    Muhlen, Luis S. W.; Najafi, Behzad; Rinaldi, Fabio; Marchesi, Renzo

    2014-04-01

    Solar troughs are amongst the most commonly used technologies for collecting solar thermal energy, and any attempt to increase the performance of these systems is welcome. In the present study a parabolic solar trough is simulated using a one-dimensional finite element model in which the energy balances for the fluid, the absorber and the envelope are performed in each element. The developed model is then validated using the available experimental data. A sensitivity analysis is performed in the next step in order to study the effect of changing the type of working fluid and the corresponding Reynolds number on the overall performance of the system. The potential improvement from adding a shield on the upper half of the annulus and from enhancing the convection coefficient of the heat transfer fluid is also studied.

  4. Electrochemical sensor for rutin detection based on Au nanoparticle-loaded helical carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Yang, Haitang; Li, Bingyue; Cui, Rongjing; Xing, Ruimin; Liu, Shanhu

    2017-10-01

    The key step in the fabrication of highly active electrochemical sensors is finding multifunctional nanocomposites to serve as electrode-modifying materials. In this study, gold nanoparticle-decorated helical carbon nanotube nanocomposites (AuNPs-HCNTs) were fabricated for rutin detection, exploiting the chemical stability of the AuNPs together with the superior conductivity and unique 3D helical structure of the helical carbon nanotubes. Results showed that the prepared nanocomposites exhibited superior electrocatalytic activity towards rutin due to the synergetic effects of AuNPs and HCNTs. Under the optimized conditions, the developed sensor exhibited a linear response range from 0.1 to 31 μmol/L for rutin with a low detection limit of 81 nmol/L. The proposed method might offer a possibility for electrochemical analysis of rutin in Chinese medicine analysis or serum monitoring owing to its low cost, simplicity, high sensitivity, good stability, and few interferences from common coexisting ions in real samples.
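
    Detection limits like the 81 nmol/L figure above are conventionally estimated from the calibration line as 3·σ(blank)/slope. The Python sketch below shows that generic calculation (an illustrative convention, not the authors' procedure; all numbers in the example are hypothetical):

```python
import statistics

def limit_of_detection(conc, signal, blank_signals):
    """LOD = 3 * sigma_blank / slope, with the slope taken from an
    ordinary least-squares line through the calibration points."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(signal) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
             / sum((x - mx) ** 2 for x in conc))
    return 3 * statistics.stdev(blank_signals) / slope
```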

  5. [The role of endotracheal aspirate culture in the diagnosis of ventilator-associated pneumonia: a meta analysis].

    PubMed

    Wang, Fei; He, Bei

    2013-01-01

    To investigate the role of endotracheal aspirate (EA) culture in the diagnosis and antibiotic management of ventilator-associated pneumonia (VAP). We searched the CNKI, Wanfang, PubMed and EMBASE databases for publications from January 1990 to December 2011 to find relevant literature on VAP microbiological diagnostic techniques, including EA and bronchoalveolar lavage fluid (BALF) culture. The following key words were used: ventilator-associated pneumonia, diagnosis and adult. Meta-analysis was performed and the sensitivity and specificity of EA for VAP diagnosis were calculated. Our literature search identified 1665 potential articles, 8 of which fulfilled our selection criteria, including 561 patients with paired cultures. Using BALF quantitative culture as the reference standard, the sensitivity and specificity of EA were 72% and 71%. When considering quantitative culture of EA only, the sensitivity and specificity improved to 90% and 65%, while the positive and negative predictive values were 68% and 89% respectively. However, the sensitivity and specificity of semi-quantitative culture of EA were only 50% and 80%, with a positive predictive value of 77% and a negative predictive value of 58% respectively. EA culture had relatively poor sensitivity and specificity, although quantitative culture of EA could improve the sensitivity. Initiating therapy on the basis of EA quantitative culture may still result in excessive antibiotic usage. Our data suggested that EA could provide some information for clinical decisions but could not replace BALF quantitative culture in VAP diagnosis.

  6. The Maillard Reaction Reduced the Sensitization of Tropomyosin and Arginine Kinase from Scylla paramamosain, Simultaneously.

    PubMed

    Han, Xin-Yu; Yang, Huang; Rao, Shi-Tao; Liu, Guang-Yu; Hu, Meng-Jun; Zeng, Bin-Chang; Cao, Min-Jie; Liu, Guang-Ming

    2018-03-21

    The Maillard reaction was used to reduce the sensitization of tropomyosin (TM) and arginine kinase (AK) from Scylla paramamosain, and the mechanism of the attenuated sensitization was investigated. In the present study, the Maillard reaction conditions were optimized as heating at 100 °C for 60 min (pH 8.5) with arabinose. A low level of allergenicity in mice was shown by the levels of allergen-specific antibodies, and more Th1 and fewer Th2 cytokines and associated transcription factors were produced with the Maillard-reacted allergen (mAllergen). The tolerance potency in mice was demonstrated by the increased ratio of Th1/Th2 cytokines. Moreover, mass spectrometry analysis showed that some key amino acids of the IgE-binding epitopes (K112, R125 and R133 of TM; K33, K118 and R202 of AK) were modified by the Maillard reaction. The Maillard reaction with arabinose thus reduced the sensitization of TM and AK, which may be due to the masking of these epitopes.

  7. Factors influencing antibiotic prescribing habits and use of sensitivity testing amongst veterinarians in Europe.

    PubMed

    De Briyne, N; Atkinson, J; Pokludová, L; Borriello, S P; Price, S

    2013-11-16

    The Heads of Medicines Agencies and the Federation of Veterinarians of Europe undertook a survey to gain a better insight into the decision-making process of veterinarians in Europe when deciding which antibiotics to prescribe. The survey was completed by 3004 practitioners from 25 European countries. Analysis was carried out at the level of different types of practitioner (food-producing (FP) animals, companion animals, equines) and by country for Belgium, the Czech Republic, France, Germany, Spain, Sweden and the UK. Responses indicate that no single information source is universally considered critical, though training, published literature and experience were the most important. The factors recorded as most strongly influencing prescribing behaviour were sensitivity tests, own experience, the risk of antibiotic resistance developing and ease of administration. Most practitioners usually take responsible use warnings into account. Antibiotic sensitivity testing is usually performed where a treatment failure has occurred. Significant differences were observed in the frequency of sensitivity testing between types of practitioner and between countries. The responses indicate a need to improve sensitivity tests and services, with the availability of rapid and cheaper testing being key factors.

  8. Implementation Strategies for Gender-Sensitive Public Health Practice: A European Workshop.

    PubMed

    Oertelt-Prigione, Sabine; Dalibert, Lucie; Verdonk, Petra; Stutz, Elisabeth Zemp; Klinge, Ineke

    2017-11-01

    Providing a robust scientific background for the focus on gender-sensitive public health and a systematic approach to its implementation. Within the FP7-EUGenMed project ( http://eugenmed.eu ) a workshop on sex and gender in public health was convened on February 2-3, 2015. The experts participated in moderated discussion rounds to (1) assemble available knowledge and (2) identify structural influences on practice implementation. The findings were summarized and analyzed in iterative rounds to define overarching strategies and principles. The participants discussed the rationale for implementing gender-sensitive public health and identified priorities and key stakeholders to engage in the process. Communication strategies and specific promotion strategies with distinct stakeholders were defined. A comprehensive list of gender-sensitive practices was established using the recently published taxonomy of the Expert Recommendations for Implementing Change (ERIC) project as a blueprint. A clearly defined implementation strategy should be mandated for all new projects in the field of gender-sensitive public health. Our tool can support researchers and practitioners with the analysis of current and past research as well as with the planning of new projects.

  9. Factors influencing antibiotic prescribing habits and use of sensitivity testing amongst veterinarians in Europe

    PubMed Central

    De Briyne, N.; Atkinson, J.; Pokludová, L.; Borriello, S. P.; Price, S.

    2013-01-01

    The Heads of Medicines Agencies and the Federation of Veterinarians of Europe undertook a survey to gain a better insight into the decision-making process of veterinarians in Europe when deciding which antibiotics to prescribe. The survey was completed by 3004 practitioners from 25 European countries. Analysis was carried out at the level of different types of practitioner (food-producing (FP) animals, companion animals, equines) and by country for Belgium, the Czech Republic, France, Germany, Spain, Sweden and the UK. Responses indicate that no single information source is universally considered critical, though training, published literature and experience were the most important. The factors recorded as most strongly influencing prescribing behaviour were sensitivity tests, own experience, the risk of antibiotic resistance developing and ease of administration. Most practitioners usually take responsible use warnings into account. Antibiotic sensitivity testing is usually performed where a treatment failure has occurred. Significant differences were observed in the frequency of sensitivity testing between types of practitioner and between countries. The responses indicate a need to improve sensitivity tests and services, with the availability of rapid and cheaper testing being key factors. PMID:24068699

  10. Use of a scenario-neutral approach to identify the key hydro-meteorological attributes that impact runoff from a natural catchment

    NASA Astrophysics Data System (ADS)

    Guo, Danlu; Westra, Seth; Maier, Holger R.

    2017-11-01

    Scenario-neutral approaches are being used increasingly for assessing the potential impact of climate change on water resource systems, as these approaches allow the performance of these systems to be evaluated independently of climate change projections. However, practical implementations of these approaches are still scarce, with a key limitation being the difficulty of generating a range of plausible future time series of hydro-meteorological data. In this study we apply a recently developed inverse stochastic generation approach to support the scenario-neutral analysis, and thus identify the key hydro-meteorological variables to which the system is most sensitive. The stochastic generator simulates synthetic hydro-meteorological time series that represent plausible future changes in (1) the average, extremes and seasonal patterns of rainfall; and (2) the average values of temperature (Ta), relative humidity (RH) and wind speed (uz) as variables that drive PET. These hydro-meteorological time series are then fed through a conceptual rainfall-runoff model to simulate the potential changes in runoff as a function of changes in the hydro-meteorological variables, and runoff sensitivity is assessed with both correlation and Sobol' sensitivity analyses. The method was applied to a case study catchment in South Australia, and the results showed that the most important hydro-meteorological attributes for runoff were winter rainfall followed by the annual average rainfall, while the PET-related meteorological variables had comparatively little impact. The high importance of winter rainfall can be related to the winter-dominated nature of both the rainfall and runoff regimes in this catchment. 
The approach illustrated in this study can greatly enhance our understanding of the key hydro-meteorological attributes and processes that are likely to drive catchment runoff under a changing climate, thus enabling the design of tailored climate impact assessments to specific water resource systems.

  11. The mode of sensitization and its influence on allograft outcomes in highly sensitized kidney transplant recipients.

    PubMed

    Redfield, Robert R; Scalea, Joseph R; Zens, Tiffany J; Mandelbrot, Didier A; Leverson, Glen; Kaufman, Dixon B; Djamali, Arjang

    2016-10-01

    We sought to determine whether the mode of sensitization in highly sensitized patients contributed to kidney allograft survival. An analysis of the United Network for Organ Sharing dataset involving all kidney transplants between 1997 and 2014 was undertaken. Highly sensitized adult kidney transplant recipients [panel reactive antibody (PRA) ≥98%] were compared with adult, primary non-sensitized and re-transplant recipients. Kaplan-Meier survival analyses were used to determine allograft survival rates. Cox proportional hazards regression analyses were conducted to determine the association of graft loss with key predictors. Fifty-three percent of highly sensitized patients transplanted were re-transplants. Pregnancy and transfusion were the only sensitizing event in 20 and 5%, respectively. The 10-year actuarial graft survival for highly sensitized recipients was 43.9% compared with 52.4% for non-sensitized patients, P < 0.001. The combination of being highly sensitized by either pregnancy or blood transfusion increased the risk of graft loss by 23% [hazard ratio (HR) 1.230, confidence interval (CI) 1.150-1.315, P < 0.001], and the combination of being highly sensitized from a prior transplant increased the risk of graft loss by 58.1% (HR 1.581, CI 1.473-1.698, P < 0.001). The mode of sensitization predicts graft survival in highly sensitized kidney transplant recipients (PRA ≥98%). Patients who are highly sensitized from re-transplants have inferior graft survival compared with patients who are highly sensitized from other modes of sensitization. © The Author 2016. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
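
    The actuarial graft survival figures quoted above come from the Kaplan-Meier method. A compact, generic Python sketch of that estimator follows (an illustration of the standard method, not the UNOS analysis code; the tiny dataset in the test is hypothetical):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator. times[i] is follow-up time and
    events[i] is 1 for graft loss, 0 for censoring. At each distinct event
    time t, survival is multiplied by (1 - deaths_t / n_at_risk_t).
    Returns the (time, survival) steps of the curve."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        j = i
        deaths = 0
        while j < len(data) and data[j][0] == t:  # group ties at time t
            deaths += data[j][1]
            j += 1
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= j - i  # events and censorings both leave the risk set
        i = j
    return curve
```

    The hazard ratios in the abstract then come from Cox regression, which models how covariates (here, the mode of sensitization) scale the hazard underlying this curve.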

  12. Robotic Versus Open Renal Transplantation in Obese Patients: Protocol for a Cost-Benefit Markov Model Analysis

    PubMed Central

    Puttarajappa, Chethan; Wijkstrom, Martin; Ganoza, Armando; Lopez, Roberto; Tevar, Amit

    2018-01-01

    Background Recent studies have reported a significant decrease in wound problems and hospital stay in obese patients undergoing renal transplantation by robotic-assisted minimally invasive techniques with no difference in graft function. Objective Due to the lack of cost-benefit studies on the use of robotic-assisted renal transplantation versus open surgical procedure, the primary aim of our study is to develop a Markov model to analyze the cost-benefit of robotic surgery versus open traditional surgery in obese patients in need of a renal transplant. Methods Electronic searches will be conducted to identify studies comparing open renal transplantation versus robotic-assisted renal transplantation. Costs associated with the two surgical techniques will incorporate the expenses of the resources used for the operations. A decision analysis model will be developed to simulate a randomized controlled trial comparing three interventional arms: (1) continuation of renal replacement therapy for patients who are considered non-suitable candidates for renal transplantation due to obesity, (2) transplant recipients undergoing open transplant surgery, and (3) transplant patients undergoing robotic-assisted renal transplantation. TreeAge Pro 2017 R1 (TreeAge Software, Williamstown, MA, USA) will be used to create a Markov model and microsimulation will be used to compare costs and benefits for the two competing surgical interventions. Results The model will simulate a randomized controlled trial of adult obese patients affected by end-stage renal disease undergoing renal transplantation. The absorbing state of the model will be patients' death from any cause.
    By choosing death as the absorbing state, we will be able to simulate the population of renal transplant recipients from the day of their randomization to transplant surgery or continuation on renal replacement therapy to their death and perform sensitivity analysis around patients' age at the time of randomization to determine if age is a critical variable for cost-benefit analysis or cost-effectiveness analysis comparing renal replacement therapy, robotic-assisted surgery or open renal transplant surgery. After running the model, one of the three competing strategies will emerge as the most cost-beneficial or cost-effective under common circumstances. To assess the robustness of the results of the model, a multivariable probabilistic sensitivity analysis will be performed by modifying the mean values and confidence intervals of key parameters with the main intent of assessing if the winning strategy is sensitive to rigorous and plausible variations of those values. Conclusions After running the model, one of the three competing strategies will emerge as the most cost-beneficial or cost-effective under common circumstances. To assess the robustness of the results of the model, a multivariable probabilistic sensitivity analysis will be performed by modifying the mean values and confidence intervals of key parameters with the main intent of assessing if the winning strategy is sensitive to rigorous and plausible variations of those values. PMID:29519780
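
    The core mechanics of a Markov model with an absorbing death state can be sketched in a few lines of Python. The protocol above uses patient-level microsimulation in TreeAge; the deterministic cohort-style version below is a minimal illustration of the same idea, and the transition probabilities and per-cycle costs in the test are hypothetical.

```python
def markov_cohort_cost(transition, state_costs, start_state=0, horizon=600):
    """Deterministic cohort evaluation of a Markov model: propagate the
    state-occupancy distribution cycle by cycle, accumulating expected
    cost. transition[i][j] is the per-cycle probability of moving from
    state i to state j; the last state is death (absorbing, zero cost)."""
    n = len(transition)
    dist = [0.0] * n
    dist[start_state] = 1.0
    total = 0.0
    for _ in range(horizon):
        total += sum(d * c for d, c in zip(dist, state_costs))
        dist = [sum(dist[i] * transition[i][j] for i in range(n))
                for j in range(n)]
    return total
```

    Running such a model once per strategy arm, and re-running it while varying the transition probabilities and costs, is exactly the expected-cost and probabilistic-sensitivity machinery the protocol describes.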

  13. Analysis strategies for high-resolution UHF-fMRI data.

    PubMed

    Polimeni, Jonathan R; Renvall, Ville; Zaretskaya, Natalia; Fischl, Bruce

    2018-03-01

    Functional MRI (fMRI) benefits from both increased sensitivity and specificity with increasing magnetic field strength, making it a key application for Ultra-High Field (UHF) MRI scanners. Most UHF-fMRI studies utilize the dramatic increases in sensitivity and specificity to acquire high-resolution data reaching sub-millimeter scales, which enable new classes of experiments to probe the functional organization of the human brain. This review article surveys advanced data analysis strategies developed for high-resolution fMRI at UHF. These include strategies designed to mitigate distortion and artifacts associated with higher fields in ways that attempt to preserve spatial resolution of the fMRI data, as well as recently introduced analysis techniques that are enabled by these extremely high-resolution data. Particular focus is placed on anatomically-informed analyses, including cortical surface-based analysis, which are powerful techniques that can guide each step of the analysis from preprocessing to statistical analysis to interpretation and visualization. New intracortical analysis techniques for laminar and columnar fMRI are also reviewed and discussed. Prospects for single-subject individualized analyses are also presented and discussed. Altogether, there are both specific challenges and opportunities presented by UHF-fMRI, and the use of proper analysis strategies can help these valuable data reach their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Sensitivity-Informed De Novo Programming for Many-Objective Water Portfolio Planning Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Kasprzyk, J. R.; Reed, P. M.; Kirsch, B. R.; Characklis, G. W.

    2009-12-01

    Risk-based water supply management presents severe cognitive, computational, and social challenges to planning in a changing world. Decision aiding frameworks must confront the cognitive biases implicit to risk, the severe uncertainties associated with long term planning horizons, and the consequent ambiguities that shape how we define and solve water resources planning and management problems. This paper proposes and demonstrates a new interactive framework for sensitivity-informed de novo programming. The theoretical focus of our many-objective de novo programming is to promote learning and evolving problem formulations to enhance risk-based decision making. We have demonstrated our proposed de novo programming framework using a case study for a single city’s water supply in the Lower Rio Grande Valley (LRGV) in Texas. Key decisions in this case study include the purchase of permanent rights to reservoir inflows and anticipatory thresholds for acquiring transfers of water through optioning and spot leases. A 10-year Monte Carlo simulation driven by historical data is used to provide performance metrics for the supply portfolios. The three major components of our methodology are Sobol' global sensitivity analysis, many-objective evolutionary optimization, and interactive tradeoff visualization. The interplay between these components allows us to evaluate alternative design metrics, their decision variable controls and the consequent system vulnerabilities. Our LRGV case study measures water supply portfolios’ efficiency, reliability, and utilization of transfers in the water supply market. The sensitivity analysis is used interactively over interannual, annual, and monthly time scales to indicate how the problem controls change as a function of the timescale of interest. These results have then been used to improve our exploration and understanding of LRGV costs, vulnerabilities, and the water portfolios’ critical reliability constraints.
These results demonstrate how we can adaptively improve the value and robustness of our problem formulations by evolving our definition of optimality to discover key tradeoffs.
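    Sobol' global sensitivity analysis, the first of the three methodology components listed above, can be hand-rolled for first-order indices with a Saltelli-style estimator. The linear test function is an assumed stand-in (not the LRGV portfolio model), chosen because its exact indices are a_i^2 / sum(a_j^2).

```python
import numpy as np

def sobol_first_order(f, d, n, rng):
    """First-order Sobol' indices via the Saltelli estimator."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]           # A with column i taken from B
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

# Assumed test model: linear, so S_i = a_i^2 / sum(a_j^2) exactly
a = np.array([4.0, 2.0, 1.0])
f = lambda x: x @ a
S = sobol_first_order(f, d=3, n=100_000, rng=np.random.default_rng(1))
print(S)  # approximately [0.762, 0.190, 0.048]
```
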

  15. Sensitivity studies of pediatric material properties on juvenile lumbar spine responses using finite element analysis.

    PubMed

    Jebaseelan, D Davidson; Jebaraj, C; Yoganandan, Narayan; Rajasekaran, S; Kanna, Rishi M

    2012-05-01

    The objective of the study was to determine the sensitivity of material properties of the juvenile spine to its external and internal responses using a finite element model under compression, and flexion-extension bending moments. The methodology included exercising the 8-year-old juvenile lumbar spine using parametric procedures. The model included the vertebral centrum, growth plates, laminae, pedicles, transverse processes and spinous processes; disc annulus and nucleus; and various ligaments. The sensitivity analysis was conducted by varying the modulus of elasticity for various components. The first simulation was done using mean material properties. Additional simulations were done for each component corresponding to low and high material property variations. External displacement/rotation and internal stress-strain responses were determined under compression and flexion-extension bending. Results indicated that, under compression, disc properties were more sensitive than bone properties, implying an elevated role of the disc under this mode. Under flexion-extension moments, ligament properties were more dominant than the other components, suggesting that various ligaments of the juvenile spine play a key role in modulating bending behaviors. Changes in the growth plate stress associated with ligament properties explained the importance of the growth plate in the pediatric spine with potential implications in progressive deformities.

  16. System implications of aperture-shade design for the SIRTF Observatory

    NASA Technical Reports Server (NTRS)

    Lee, J. H.; Brooks, W. F.; Maa, S.

    1987-01-01

    The 1-m-aperture Space Infrared Telescope Facility (SIRTF) will operate with a sensitivity limited only by the zodiacal background. This sensitivity requirement places severe restrictions on the amount of stray light which can reach the focal plane from off-axis sources such as the sun or earth limb. In addition, radiation from these sources can degrade the lifetime of the telescope and instrument cryogenic system, which is now planned for two years before the first servicing. Since the aperture of the telescope represents a break in the telescope insulation system and is effectively the first element in the optical train, the aperture shade is a key system component. The mass, length, and temperature of the shade should be minimized to reduce system cost while maximizing the telescope lifetime and stray light performance. The independent geometric parameters that characterize an asymmetrical shade for a 600 km, 28 deg orbit were identified, and the system sensitivity to the three most important shade parameters was explored. Despite the higher heat loads compared to previously studied polar orbit missions, the analysis determined that passive radiators of a reasonable size are sufficient to meet the system requirements. An optimized design for the SIRTF mission, based on the sensitivity analysis, is proposed.

  17. The effects of abdominal lipectomy in metabolic syndrome components and insulin sensitivity in females: A systematic review and meta-analysis.

    PubMed

    Seretis, Konstantinos; Goulis, Dimitrios G; Koliakos, Georgios; Demiri, Efterpi

    2015-12-01

    Adipose tissue is an endocrine organ, which is implicated in the pathogenesis of obesity, metabolic syndrome and diabetes. Lipectomy offers a unique opportunity to permanently reduce the absolute number of fat cells, though its functional role remains unclear. This systematic review and meta-analysis aims to assess the effect of abdominal lipectomy on metabolic syndrome components and insulin sensitivity in women. A predetermined protocol, established according to the Cochrane Handbook's recommendations, was used. An electronic search in MEDLINE, Scopus, the Cochrane Library and CENTRAL electronic databases was conducted from inception to May 14, 2015. This search was supplemented by a review of reference lists of potentially eligible studies and a manual search of key journals in the field of plastic surgery. Eligible studies were prospective studies with ≥1 month of follow-up that included females only who underwent abdominal lipectomy and reported on parameters of metabolic syndrome and insulin sensitivity. The systematic review included 11 studies with a total of 271 individuals. Conflicting results were revealed, though most studies showed no significant metabolic effects after lipectomy. The meta-analysis included 4 studies with 140 subjects. No significant changes were revealed between lipectomy and control groups. This meta-analysis provides evidence that abdominal lipectomy in females does not significantly affect the components of metabolic syndrome and insulin sensitivity. Further high-quality studies are needed to elucidate the potential metabolic effects of abdominal lipectomy. Systematic review registration PROSPERO CRD42015017564 (www.crd.york.ac.uk/PROSPERO). Copyright © 2015 Elsevier Inc. All rights reserved.

  18. "A Bayesian sensitivity analysis to evaluate the impact of unmeasured confounding with external data: a real world comparative effectiveness study in osteoporosis".

    PubMed

    Zhang, Xiang; Faries, Douglas E; Boytsov, Natalie; Stamey, James D; Seaman, John W

    2016-09-01

    Observational studies are frequently used to assess the effectiveness of medical interventions in routine clinical practice. However, the use of observational data for comparative effectiveness is challenged by selection bias and the potential of unmeasured confounding. This is especially problematic for analyses using a health care administrative database, in which key clinical measures are often not available. This paper provides an approach to conducting a sensitivity analysis to investigate the impact of unmeasured confounding in observational studies. In a real-world osteoporosis comparative effectiveness study, the bone mineral density (BMD) score, an important predictor of fracture risk and a factor in the selection of osteoporosis treatments, is unavailable in the database, and the lack of baseline BMD could potentially lead to significant selection bias. We implemented Bayesian twin-regression models, which simultaneously model both the observed outcome and the unobserved unmeasured confounder, using information from external sources. A sensitivity analysis was also conducted to assess the robustness of our conclusions to changes in such external data. The use of Bayesian modeling in this study suggests that the lack of baseline BMD did have a strong impact on the analysis, reversing the direction of the estimated effect (odds ratio of fracture incidence at 24 months: 0.40 vs. 1.36, with/without adjusting for unmeasured baseline BMD). The Bayesian twin-regression models provide a flexible sensitivity analysis tool to quantitatively assess the impact of unmeasured confounding in observational studies. Copyright © 2016 John Wiley & Sons, Ltd.

  19. An Electrochemical NO2 Sensor Based on Ionic Liquid: Influence of the Morphology of the Polymer Electrolyte on Sensor Sensitivity

    PubMed Central

    Kuberský, Petr; Altšmíd, Jakub; Hamáček, Aleš; Nešpůrek, Stanislav; Zmeškal, Oldřich

    2015-01-01

    A systematic study was carried out to investigate the effect of ionic liquid in solid polymer electrolyte (SPE) and its layer morphology on the characteristics of an electrochemical amperometric nitrogen dioxide sensor. Five different ionic liquids were immobilized into a solid polymer electrolyte and key sensor parameters (sensitivity, response/recovery times, hysteresis and limit of detection) were characterized. The study revealed that the sensor based on 1-ethyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide ([EMIM][N(Tf)2]) showed the best sensitivity, fast response/recovery times, and low sensor response hysteresis. The working electrode, deposited from water-based carbon nanotube ink, was prepared by aerosol-jet printing technology. It was observed that the thermal treatment and crystallinity of poly(vinylidene fluoride) (PVDF) in the solid polymer electrolyte influenced the sensitivity. Image analysis of the morphology of the SPE layer based on [EMIM][N(Tf)2] ionic liquid treated under different conditions suggests that the sensor sensitivity strongly depends on the fractal dimension of PVDF spherical objects in SPE. Their deformation, e.g., due to crowding, leads to a decrease in sensor sensitivity. PMID:26569248

  20. The Impact of Parametric Uncertainties on Biogeochemistry in the E3SM Land Model

    NASA Astrophysics Data System (ADS)

    Ricciuto, Daniel; Sargsyan, Khachik; Thornton, Peter

    2018-02-01

    We conduct a global sensitivity analysis (GSA) of the Energy Exascale Earth System Model (E3SM) land model (ELM) to calculate the sensitivity of five key carbon cycle outputs to 68 model parameters. This GSA is conducted by first constructing a Polynomial Chaos (PC) surrogate via a new Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth, leading to a sparse, high-dimensional PC surrogate with 3,000 model evaluations. The PC surrogate allows efficient extraction of GSA information, leading to further dimensionality reduction. The GSA is performed at 96 FLUXNET sites covering multiple plant functional types (PFTs) and climate conditions. About 20 of the model parameters are identified as sensitive, with the rest being relatively insensitive across all outputs and PFTs. These sensitivities are dependent on PFT, and are relatively consistent among sites within the same PFT. The five model outputs have a majority of their highly sensitive parameters in common. A common subset of sensitive parameters is also shared among PFTs, but some parameters are specific to certain types (e.g., deciduous phenology). The relative importance of these parameters shifts significantly among PFTs and with climatic variables such as mean annual temperature.
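    The PC-surrogate route to GSA can be illustrated in miniature: fit an orthonormal polynomial basis by least squares, then read each input's variance share directly off the squared coefficients. The two-input linear model and degree-2 additive Legendre basis below are illustrative assumptions, far smaller than ELM's 68 parameters, and the sketch omits the compressive-sensing basis selection of WIBCS.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000
x = rng.uniform(-1, 1, (n, 2))          # inputs uniform on [-1, 1]
y = 2.0 * x[:, 0] + 0.5 * x[:, 1]       # assumed true model

# Orthonormal Legendre basis (degree <= 2, no cross terms) for U(-1, 1) inputs
def basis(x):
    p1 = np.sqrt(3.0) * x                       # normalized P1
    p2 = np.sqrt(5.0) * (3.0 * x**2 - 1) / 2    # normalized P2
    return np.column_stack([np.ones(len(x)), p1[:, 0], p2[:, 0], p1[:, 1], p2[:, 1]])

Phi = basis(x)
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)     # PC coefficients by least squares

# Variance decomposition: total variance = sum of squared non-constant coefficients;
# each input's first-order index is the share contributed by its own basis terms.
var_total = np.sum(c[1:] ** 2)
S1 = (c[1] ** 2 + c[2] ** 2) / var_total
S2 = (c[3] ** 2 + c[4] ** 2) / var_total
print(S1, S2)  # approximately 0.941 and 0.059
```
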

  1. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents the sensitivity and uncertainty analysis of an integrated urban drainage model which includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified in order to include the micropollutant assessment (namely, sulfamethoxazole - SMX). The model also takes into account the interactions between the three components of the system: sewer system (SS), wastewater treatment plant (WWTP) and receiving water body (RWB). The analysis has been applied to an experimental catchment near Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by different uncertainty combinations of the sub-systems (i.e., SS, WWTP and RWB), have been considered, applying the Extended-FAST method for the sensitivity analysis in order to select the key factors affecting the RWB quality and to design a reliable/useful experimental campaign. Results have demonstrated that sensitivity analysis is a powerful tool for increasing operator confidence in the modelling results. The approach adopted here can be used for blocking some non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS have been found to be the most relevant factors affecting the SMX modeling in the RWB when all model factors (scenario 1) or model factors of the SS (scenarios 2 and 3) are varied. If only the factors related to the WWTP are changed (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced (up to 95% of the total variance for SSMX,max) by the aerobic sorption coefficient. A progressive uncertainty reduction from upstream to downstream was found for the soluble fraction of SMX in the RWB.

  2. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high concentrations in measurements, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
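    The UA/SA pairing applied to SHERPA can be mimicked on a toy emission-to-concentration relation (an assumed placeholder, not SHERPA's actual formulation): Monte Carlo propagation quantifies the output uncertainty, and rank correlations attribute it to the inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5_000

# Assumed uncertain inputs of a toy air-quality model
emission   = rng.normal(100.0, 20.0, n)   # strong driver
dispersion = rng.normal(1.0, 0.05, n)     # weak driver
background = rng.normal(5.0, 0.5, n)      # weak driver

concentration = emission / dispersion + background  # toy model, not SHERPA

# Uncertainty analysis: quantify the output spread
lo, med, hi = np.percentile(concentration, [5, 50, 95])
print(f"median {med:.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")

# Sensitivity analysis: rank inputs by absolute rank correlation with the output
def rankcorr(x, y):
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

for name, xi in [("emission", emission), ("dispersion", dispersion),
                 ("background", background)]:
    print(f"{name}: |rho| = {abs(rankcorr(xi, concentration)):.2f}")
```
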

  3. Fundamentals and practice for ultrasensitive laser-induced fluorescence detection in microanalytical systems.

    PubMed

    Johnson, Mitchell E; Landers, James P

    2004-11-01

    Laser-induced fluorescence is an extremely sensitive method for detection in chemical separations. In addition, it is well-suited to detection in small volumes, and as such is widely used for capillary electrophoresis and microchip-based separations. This review explores the detailed instrumental conditions required for sub-zeptomole, sub-picomolar detection limits. The key to achieving the best sensitivity is to use an excitation and emission volume that is matched to the separation system and that, simultaneously, will keep scattering and luminescence background to a minimum. We discuss how this is accomplished with confocal detection, 90-degree on-capillary detection, and sheath-flow detection. It is shown that each of these methods has its advantages and disadvantages, but that all can be used to produce extremely sensitive detectors for capillary- or microchip-based separations. Analysis of these capabilities allows prediction of the optimal means of achieving ultrasensitive detection on microchips.

  4. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype on an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.

  5. Simple Sensitivity Analysis for Orion GNC

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

    The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were significant factors for the success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.

  6. PharmacoGx: an R package for analysis of large pharmacogenomic datasets.

    PubMed

    Smirnov, Petr; Safikhani, Zhaleh; El-Hachem, Nehme; Wang, Dong; She, Adrian; Olsen, Catharina; Freeman, Mark; Selby, Heather; Gendoo, Deena M A; Grossmann, Patrick; Beck, Andrew H; Aerts, Hugo J W L; Lupien, Mathieu; Goldenberg, Anna; Haibe-Kains, Benjamin

    2016-04-15

    Pharmacogenomics holds great promise for the development of biomarkers of drug response and the design of new therapeutic options, which are key challenges in precision medicine. However, such data are scattered and lack standards for efficient access and analysis, consequently preventing the realization of the full potential of pharmacogenomics. To address these issues, we implemented PharmacoGx, an easy-to-use, open source package for integrative analysis of multiple pharmacogenomic datasets. We demonstrate the utility of our package in comparing large drug sensitivity datasets, such as the Genomics of Drug Sensitivity in Cancer and the Cancer Cell Line Encyclopedia. Moreover, we show how to use our package to easily perform Connectivity Map analysis. With increasing availability of drug-related data, our package will open new avenues of research for meta-analysis of pharmacogenomic data. PharmacoGx is implemented in R and can be easily installed on any system. The package is available from CRAN and its source code is available from GitHub. bhaibeka@uhnresearch.ca or benjamin.haibe.kains@utoronto.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. Treatment strategies for pelvic organ prolapse: a cost-effectiveness analysis.

    PubMed

    Hullfish, Kathie L; Trowbridge, Elisa R; Stukenborg, George J

    2011-05-01

    To compare the relative cost effectiveness of treatment decision alternatives for post-hysterectomy pelvic organ prolapse (POP). A Markov decision analysis model was used to assess and compare the relative cost effectiveness of expectant management, use of a pessary, and surgery for obtaining months of quality-adjusted life over 1 year. Sensitivity analysis was conducted to determine whether the results depended on specific estimates of patient utilities for pessary use, probabilities for complications and other events, and estimated costs. Only two treatment alternatives were found to be efficient choices: initial pessary use and vaginal reconstructive surgery (VRS). Pessary use (including patients that eventually transitioned to surgery) achieved 10.4 quality-adjusted months, at a cost of $10,000 per patient, while VRS obtained 11.4 quality-adjusted months, at $15,000 per patient. Sensitivity analysis demonstrated that these baseline results depended on several key estimates in the model. This analysis indicates that pessary use and VRS are the most cost-effective treatment alternatives for treating post-hysterectomy vaginal prolapse. Additional research is needed to standardize POP outcomes and complications, so that healthcare providers can best utilize cost information in balancing the risks and benefits of their treatment decisions.
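    The two efficient strategies reported above imply a single incremental cost-effectiveness ratio, computable directly from the abstract's own figures:

```python
# Figures from the abstract: pessary use achieved 10.4 quality-adjusted months
# at $10,000 per patient; vaginal reconstructive surgery (VRS) achieved 11.4
# quality-adjusted months at $15,000 per patient.
pessary = {"qam": 10.4, "cost": 10_000}
vrs     = {"qam": 11.4, "cost": 15_000}

# Incremental cost-effectiveness ratio of VRS relative to pessary use
icer = (vrs["cost"] - pessary["cost"]) / (vrs["qam"] - pessary["qam"])
print(f"ICER of VRS vs. pessary: ${icer:,.0f} per quality-adjusted month")
```

    At any willingness-to-pay above this ratio, VRS is the preferred efficient choice; below it, initial pessary use is.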

  8. Improved analysis of Monascus pigments based on their pH-sensitive UV-Vis absorption and reactivity properties.

    PubMed

    Shi, Kan; Chen, Gong; Pistolozzi, Marco; Xia, Fenggeng; Wu, Zhenqiang

    2016-09-01

    Monascus pigments, a mixture of azaphilones mainly composed of red, orange and yellow pigments, are usually prepared in aqueous ethanol and analysed by ultraviolet-visible (UV-Vis) spectroscopy. The pH of aqueous ethanol used during sample preparation and analysis has never been considered a key parameter to control; however, this study shows that the UV-Vis spectra and colour characteristics of the six major pigments are strongly influenced by the pH of the solvent employed. In addition, the increase of solvent pH results in a remarkable increase of the amination reaction of orange pigments with amino compounds, and at higher pH (≥ 6.0) a significant amount of orange pigment derivatives rapidly form. The consequent impact of these pH-sensitive properties on pigment analysis is further discussed. Based on the presented results, we propose that the sample preparation and analysis of Monascus pigments should be uniformly performed at low pH (≤ 2.5) to avoid variations of UV-Vis spectra and the creation of artefacts due to the occurrence of amination reactions, and ensure an accurate analysis that truly reflects pigment characteristics in the samples.

  9. Parameterization of aquatic ecosystem functioning and its natural variation: Hierarchical Bayesian modelling of plankton food web dynamics

    NASA Astrophysics Data System (ADS)

    Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede

    2017-10-01

    Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.
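    The three Bayesian formulations compared above differ only in how dataset-level parameters are tied together. With a known observation variance the contrast has a closed form: the hierarchical (partial-pooling) estimate is a precision-weighted compromise between each dataset's own mean (separate analysis) and the grand mean (global analysis). The synthetic datasets and moment-based variance estimate below are illustrative assumptions, not the plankton food web model.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 1.0                                      # known within-dataset sd (assumed)
data = [rng.normal(m, sigma, 20) for m in (2.0, 3.0, 4.0)]  # synthetic datasets

separate = np.array([d.mean() for d in data])    # 2) independent variation
pooled = np.concatenate(data).mean()             # 1) no variation
grand = separate.mean()

# Between-dataset variance by method of moments (floored at a tiny positive value)
se2 = sigma**2 / 20
tau2 = max(separate.var(ddof=1) - se2, 1e-9)

# 3) hierarchical: precision-weighted shrinkage of each dataset mean toward the grand mean
w = tau2 / (tau2 + se2)
hierarchical = w * separate + (1 - w) * grand

print(separate.round(2), round(pooled, 2), hierarchical.round(2))
```

    The shrinkage weight w falls as the between-dataset spread shrinks, smoothly interpolating between the separate and global analyses.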

  10. Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers

    NASA Technical Reports Server (NTRS)

    Kenny, Sean (Technical Monitor); Wertz, Julie

    2002-01-01

    As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can be used to aid the designer in making high level architecture decisions. Once these key components have been identified, two main approaches to improving a system using these components exist: add redundancy or improve the reliability of the component. In reality, the most effective approach to almost any system will be some combination of these two approaches, in varying orders of magnitude for each component. Therefore, this research tries to answer the question of how to divide funds, between adding redundancy and improving the reliability of components, to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics together into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.
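    The redundancy-versus-reliability-improvement tradeoff at the heart of this question can be made concrete for a single component: n parallel spares of reliability r give 1 - (1 - r)^n, which can be weighed against spending the same budget raising r itself. The reliability and budget figures below are assumptions for illustration.

```python
def parallel(r: float, n: int) -> float:
    """Reliability of n identical components in parallel (any one suffices)."""
    return 1.0 - (1.0 - r) ** n

r = 0.90
# Option A: add one redundant unit at reliability r
redundant = parallel(r, 2)
# Option B: spend the same (assumed) budget improving the single unit to 0.97
improved = 0.97

print(f"redundancy: {redundant:.4f}, improvement: {improved:.4f}")
```

    In a real SSI analysis this comparison would run over every key component and fold in the mass and cost penalties of each spare, as in the coupled performance metric described above.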

  11. Performance of visual inspection with acetic acid for cervical cancer screening: a qualitative summary of evidence to date.

    PubMed

    Gaffikin, Lynne; Lauterbach, Margo; Blumenthal, Paul D

    2003-08-01

    Developing countries often lack the necessary resources to use the Papanicolaou (Pap) smear as a screening tool for cervical abnormalities. Because the burden of cervical cancer is highest in such low-resource settings, alternative techniques have been sought. Recently, interest in visual inspection with acetic acid (VIA) has increased. Numerous studies have been conducted on its accuracy and its ability to detect cervical lesions when compared with other techniques, both conventional and nonconventional. This review summarizes key findings from the literature to provide researchers and policymakers with an up-to-date summary on VIA. PubMed was used to identify relevant journal articles published between 1982 and 2002. Key words were cervical cancer screening, visual inspection, VIA (visual inspection with acetic acid), DVI (direct visual inspection), AAT (acetic acid test), and cervicoscopy. Studies were eligible for review only if they involved analysis of primary VIA data (ie, not review articles); studies involving magnification devices were excluded. Fifteen studies were reviewed in total; key results were extracted and a summary analysis was performed for sensitivity and specificity parameters. When reported, sensitivity ranged between 66% and 96% and specificity between 64% and 98%. Authors comparing VIA with cytology noted that the overall usefulness of VIA compares favorably with that of the Pap test. The reported findings reviewed here suggest that VIA has the potential to be a cervical cancer screening tool, especially in low resource settings. Obstetricians & Gynecologists, Family Physicians. After completion of this article, the reader will be able to describe how visual inspection of the cervix for cervical cancer screening (VIA) is performed, to summarize the current literature on VIA, and to list potential advantages of VIA.
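    The sensitivity and specificity ranges summarized above come from standard 2x2 screening tables; a minimal helper (with hypothetical counts, not data from any reviewed study) shows the computation:

```python
def screen_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity and specificity from a 2x2 screening table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among the diseased
        "specificity": tn / (tn + fp),  # true negatives among the healthy
    }

# Hypothetical counts for a VIA-versus-reference-standard comparison
m = screen_metrics(tp=80, fp=90, fn=20, tn=810)
print(m)  # {'sensitivity': 0.8, 'specificity': 0.9}
```
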

  12. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

    Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia, in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be ineffective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify the parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, a minimum of 1050 samples is needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient, but less accurate and robust, than quantitative ones.
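    The Sobol' method mentioned above estimates sensitivity indices from model variance. A minimal sketch of the first-order (Saltelli pick-freeze) estimator, applied to a hypothetical two-parameter linear model rather than to SAC-SMA:

    ```python
    import numpy as np

    def sobol_first_order(model, d, n=100_000, seed=0):
        """First-order Sobol' indices via the Saltelli pick-freeze estimator,
        for a model with d independent U(0,1) inputs."""
        rng = np.random.default_rng(seed)
        A, B = rng.random((n, d)), rng.random((n, d))
        fA, fB = model(A), model(B)
        var = np.var(np.concatenate([fA, fB]))
        S = np.empty(d)
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]          # freeze all inputs except input i
            S[i] = np.mean(fB * (model(ABi) - fA)) / var
        return S

    # Toy linear model: f(x) = 2*x1 + x2 has analytic S1 = 0.8, S2 = 0.2
    f = lambda X: 2 * X[:, 0] + X[:, 1]
    S = sobol_first_order(f, 2)
    ```

    For the linear toy model the estimates converge to the analytic values; for a real hydrologic model, `model` would wrap a simulator run and the required sample count grows, as the abstract's 1050-sample figure indicates.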

  13. Identification of key outcome measures when using the instrumented timed up and go and/or posturography for fall screening.

    PubMed

    Sample, Renee Beach; Kinney, Allison L; Jackson, Kurt; Diestelkamp, Wiebke; Bigelow, Kimberly Edginton

    2017-09-01

    The Timed Up and Go (TUG) has been commonly used for fall risk assessment. The instrumented Timed Up and Go (iTUG) adds wearable sensors to capture sub-movements and may be more sensitive. Posturography assessments have also been used for determining fall risk. This study used stepwise logistic regression models to identify key outcome measures for the iTUG and posturography protocols. The effectiveness of the models containing these measures in differentiating fallers from non-fallers was then compared for each of: iTUG total time duration only, iTUG, posturography, and combined iTUG and posturography assessments. One hundred and fifty older adults participated in this study. The iTUG measures were calculated utilizing APDM Inc.'s Mobility Lab software. Traditional and non-linear posturography measures were calculated from center of pressure during quiet standing. The key outcome measures incorporated in the iTUG assessment model (sit-to-stand lean angle and height) resulted in a model sensitivity of 48.1% and a maximum re-scaled R2 value of 0.19. This was a higher sensitivity, indicating better differentiation, compared to the model only including total time duration (the outcome of the traditional TUG), which had a sensitivity of 18.2%. When the key outcome measures of the iTUG and the posturography assessments were combined into a single model, the sensitivity was approximately the same as for the iTUG model alone. Overall, the findings of this study support that the iTUG demonstrates greater sensitivity than the total time duration alone, but that carrying out both iTUG and posturography does not greatly improve sensitivity when used as a fall risk screening tool. Copyright © 2017 Elsevier B.V. All rights reserved.
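    Model sensitivity as reported above (the fraction of true fallers correctly flagged) and its companion specificity are computed from a fitted model's predicted probabilities against the observed labels. A generic sketch with invented labels and probabilities, not the study's data:

    ```python
    def sens_spec(y_true, y_prob, threshold=0.5):
        """Sensitivity (true-positive rate) and specificity (true-negative
        rate) of a probabilistic classifier at a given decision threshold."""
        tp = fn = tn = fp = 0
        for y, p in zip(y_true, y_prob):
            pred = 1 if p >= threshold else 0
            if y == 1:
                tp += pred
                fn += 1 - pred
            else:
                tn += 1 - pred
                fp += pred
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical faller labels (1 = faller) and model probabilities
    y = [1, 1, 1, 0, 0, 0]
    p = [0.9, 0.4, 0.6, 0.2, 0.7, 0.1]
    se, sp = sens_spec(y, p)   # se = 2/3, sp = 2/3
    ```

    Lowering the threshold trades specificity for sensitivity, which is why the abstract's sensitivity figures are only comparable at a common operating point.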

  14. Modeling bronchial circulation with application to soluble gas exchange: description and sensitivity analysis.

    PubMed

    Bui, T D; Dabdub, D; George, S C

    1998-06-01

    The steady-state exchange of inert gases across an in situ canine trachea has recently been shown to be limited equally by diffusion and perfusion over a wide range (0.01-350) of blood solubilities (betablood; ml . ml-1 . atm-1). Hence, we hypothesize that the exchange of ethanol (betablood = 1,756 at 37 degrees C) in the airways depends on the blood flow rate from the bronchial circulation. To test this hypothesis, the dynamics of the bronchial circulation were incorporated into an existing model that describes the simultaneous exchange of heat, water, and a soluble gas in the airways. A detailed sensitivity analysis of key model parameters was performed by using the method of Latin hypercube sampling. The model accurately predicted a previously reported experimental exhalation profile of ethanol (R2 = 0.991) as well as the end-exhalation airstream temperature (34.6 degrees C). The model predicts that 27, 29, and 44% of exhaled ethanol in a single exhalation are derived from the tissues of the mucosa and submucosa, the bronchial circulation, and the tissue exterior to the submucosa (which would include the pulmonary circulation), respectively. Although the concentration of ethanol in the bronchial capillary decreased during inspiration, the three key model outputs (end-exhaled ethanol concentration, the slope of phase III, and end-exhaled temperature) were all statistically insensitive (P > 0.05) to the parameters describing the bronchial circulation. In contrast, the model outputs were all sensitive (P < 0.05) to the thickness of tissue separating the core body conditions from the bronchial smooth muscle. We conclude that both the bronchial circulation and the pulmonary circulation impact soluble gas exchange when the entire conducting airway tree is considered.
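    The Latin hypercube sampling used for the sensitivity analysis above stratifies each parameter's range so that every stratum is sampled exactly once. A minimal numpy sketch on the unit hypercube (a real study would rescale each column to its parameter's range):

    ```python
    import numpy as np

    def latin_hypercube(n, d, rng=None):
        """Draw n Latin hypercube samples in [0,1]^d: each dimension is
        split into n equal strata and each stratum holds exactly one point."""
        rng = rng or np.random.default_rng(0)
        # one uniformly-placed point per stratum, stratum k covering [k/n, (k+1)/n)
        u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
        for j in range(d):
            rng.shuffle(u[:, j])   # decouple the strata across dimensions
        return u

    X = latin_hypercube(10, 3)
    ```

    Compared with plain Monte Carlo, the stratification guarantees coverage of each parameter's full range even at small sample sizes.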

  15. Sensitivity analysis of monthly reference crop evapotranspiration trends in Iran: a qualitative approach

    NASA Astrophysics Data System (ADS)

    Mosaedi, Abolfazl; Ghabaei Sough, Mohammad; Sadeghi, Sayed-Hossein; Mooshakhian, Yousof; Bannayan, Mohammad

    2017-05-01

    The main objective of this study was to analyze the sensitivity of the monthly reference crop evapotranspiration (ETo) trends to key climatic factors (maximum and minimum air temperature (Tmax and Tmin), relative humidity (RH), sunshine hours (tsun), and wind speed (U2)) in Iran by applying a qualitative detrended method, rather than the historical mathematical approach. Meteorological data for the period 1963-2007 from five synoptic stations with different climatic characteristics, including Mashhad (mountains), Tabriz (mountains), Tehran (semi-desert), Anzali (coastal wet), and Shiraz (semi-mountains), were used to address this objective. The Mann-Kendall test was employed to assess the trends of ETo and the climatic variables. The results indicated a significant increasing trend of the monthly ETo for Mashhad and Tabriz for most of the year, while the opposite conclusion was drawn for Tehran, Anzali, and Shiraz. Based on the detrended method, RH and U2 were the two main variables enhancing the negative ETo trends at the Tehran and Anzali stations, whereas U2 and temperature were responsible for this observation in Shiraz. On the other hand, the main meteorological variables affecting the significant positive trend of ETo were RH and tsun in Tabriz and Tmin, RH, and U2 in Mashhad. Although a relative agreement was observed in terms of identifying one of the first two key climatic variables affecting the ETo trend, the qualitative and the quantitative sensitivity analysis solutions never coincided. Further research is needed to evaluate this interesting finding for other geographic locations, and also to search for the major causes of this discrepancy.

  16. Influence of the quality of intraoperative fluoroscopic images on the spatial positioning accuracy of a CAOS system.

    PubMed

    Wang, Junqiang; Wang, Yu; Zhu, Gang; Chen, Xiangqian; Zhao, Xiangrui; Qiao, Huiting; Fan, Yubo

    2018-06-01

    Spatial positioning accuracy is a key issue in a computer-assisted orthopaedic surgery (CAOS) system. Since intraoperative fluoroscopic images are one of the most important input data to the CAOS system, the quality of these images should have a significant influence on the accuracy of the CAOS system, but the regularities and mechanisms of this influence have yet to be studied. Two typical spatial positioning methods - a C-arm calibration-based method and a bi-planar positioning method - are used to study the influence of different image quality parameters, such as resolution, distortion, contrast and signal-to-noise ratio, on positioning accuracy. The error propagation rules of image error in the different spatial positioning methods are analyzed by the Monte Carlo method. Correlation analysis showed that resolution and distortion had a significant influence on spatial positioning accuracy. In addition, the C-arm calibration-based method was more sensitive to image distortion, while the bi-planar positioning method was more susceptible to image resolution. The image contrast and signal-to-noise ratio had no significant influence on the spatial positioning accuracy. The result of the Monte Carlo analysis proved that, generally, the bi-planar positioning method was more sensitive to image quality than the C-arm calibration-based method. The quality of intraoperative fluoroscopic images is a key issue in the spatial positioning accuracy of a CAOS system. Although the two typical positioning methods have very similar mathematical principles, they showed different sensitivities to different image quality parameters. The result of this research may help to create a realistic standard for intraoperative fluoroscopic images for CAOS systems. Copyright © 2018 John Wiley & Sons, Ltd.

  17. Extensions and applications of a second-order landsurface parameterization

    NASA Technical Reports Server (NTRS)

    Andreou, S. A.; Eagleson, P. S.

    1983-01-01

    Extensions and applications of a second-order land surface parameterization, proposed by Andreou and Eagleson, are developed. Procedures for evaluating the near-surface storage depth used in one-cell land surface parameterizations are suggested and tested by using the model. A sensitivity analysis with respect to the key soil parameters is performed. A case study involving comparison with an "exact" numerical model and another simplified parameterization, under very dry climatic conditions and for two different soil types, is also incorporated.

  18. Quantifying Cell Fate Decisions for Differentiation and Reprogramming of a Human Stem Cell Network: Landscape and Biological Paths

    PubMed Central

    Li, Chunhe; Wang, Jin

    2013-01-01

    Cellular reprogramming has recently been intensively studied experimentally. We developed a global potential landscape and kinetic path framework to explore a human stem cell developmental network composed of 52 genes. We uncovered the underlying landscape for the stem cell network, with two basins of attraction representing the stem and differentiated cell states, and quantified and exhibited the high-dimensional biological paths for the differentiation and reprogramming processes connecting the stem cell state and the differentiated cell state. Both the landscape and the non-equilibrium curl flux jointly determine the dynamics of cell differentiation. The flux causes the kinetic paths to deviate from the steepest-descent gradient path, and the corresponding differentiation and reprogramming paths are irreversible. Quantification of the paths allows us to find out how differentiation and reprogramming occur and which important states they pass through. We show that the developmental process proceeds as movement from the stem cell basin of attraction to the differentiation basin of attraction. The landscape topography, characterized by the barrier heights and transition rates, quantitatively determines the global stability and kinetic speed of the cell fate decision process for development. Through a global sensitivity analysis, we provide specific predictions for the effects of key genes and regulation connections on the cellular differentiation or reprogramming process. Key links from the sensitivity analysis and the biological paths can be used to guide differentiation designs or reprogramming tactics. PMID:23935477

  19. Chemical Exchange Saturation Transfer in Chemical Reactions: A Mechanistic Tool for NMR Detection and Characterization of Transient Intermediates.

    PubMed

    Lokesh, N; Seegerer, Andreas; Hioe, Johnny; Gschwind, Ruth M

    2018-02-07

    The low sensitivity of NMR and the presence of transient key intermediates below the detection limit are the central problems in studying reaction mechanisms by NMR. Sensitivity can be enhanced by hyperpolarization techniques such as dynamic nuclear polarization or the incorporation/interaction of special hyperpolarized molecules. However, all of these techniques require special equipment, are restricted to selective reactions, or undesirably influence the reaction pathways. Here, we apply the chemical exchange saturation transfer (CEST) technique for the first time to detect and characterize by NMR previously unobserved transient reaction intermediates in organocatalysis. The higher sensitivity of CEST and the chemical equilibria present in the reaction pathway are exploited to access population and kinetics information on low-populated intermediates. The potential of the method is demonstrated on the proline-catalyzed enamine formation for unprecedented in situ detection of a DPU-stabilized zwitterionic iminium species, the elusive key intermediate between enamine and oxazolidinones. The quantitative analysis of CEST data at 250 K revealed a population ratio [Z-iminium]/[exo-oxazolidinone] of 0.02, a relative free energy of +8.1 kJ/mol (calculated +7.3 kJ/mol), and a free energy barrier of +45.9 kJ/mol (ΔG⧧ calc. (268 K) = +42.2 kJ/mol) for Z-iminium → exo-oxazolidinone. The findings underpin the participation of the iminium ion in the enamine formation pathway, corroborating our earlier theoretical prediction and aiding better understanding of the mechanism. The reliability of CEST is validated using 1D EXSY build-up techniques at low temperature (213 K). The CEST method thus serves as a new tool for mechanistic investigations in organocatalysis to access key information, such as chemical shifts, populations, and reaction kinetics of intermediates below the standard NMR detection limit.

  20. Modeling screening, prevention, and delaying of Alzheimer's disease: an early-stage decision analytic model

    PubMed Central

    2010-01-01

    Background Alzheimer's Disease (AD) affects a growing proportion of the population each year. Novel therapies on the horizon may slow the progress of AD symptoms and avoid cases altogether. Initiating treatment for the underlying pathology of AD would ideally be based on biomarker screening tools identifying pre-symptomatic individuals. Early-stage modeling provides estimates of potential outcomes and informs policy development. Methods A time-to-event (TTE) simulation provided estimates of screening asymptomatic patients in the general population age ≥55 and treatment impact on the number of patients reaching AD. Patients were followed from AD screen until all-cause death. Baseline sensitivity and specificity were 0.87 and 0.78, with treatment on positive screen. Treatment slowed progression by 50%. Events were scheduled using literature-based age-dependent incidences of AD and death. Results The base case results indicated increased AD free years (AD-FYs) through delays in onset and a reduction of 20 AD cases per 1000 screened individuals. Patients completely avoiding AD accounted for 61% of the incremental AD-FYs gained. Total years of treatment per 1000 screened patients was 2,611. The number-needed-to-screen was 51 and the number-needed-to-treat was 12 to avoid one case of AD. One-way sensitivity analysis indicated that duration of screening sensitivity and rescreen interval impact AD-FYs the most. A two-way sensitivity analysis found that for a test with an extended duration of sensitivity (15 years) the number of AD cases avoided was 6,000-7,000 cases for a test with higher sensitivity and specificity (0.90,0.90). Conclusions This study yielded valuable parameter range estimates at an early stage in the study of screening for AD. Analysis identified duration of screening sensitivity as a key variable that may be unavailable from clinical trials. PMID:20433705

  1. Modeling screening, prevention, and delaying of Alzheimer's disease: an early-stage decision analytic model.

    PubMed

    Furiak, Nicolas M; Klein, Robert W; Kahle-Wrobleski, Kristin; Siemers, Eric R; Sarpong, Eric; Klein, Timothy M

    2010-04-30

    Alzheimer's Disease (AD) affects a growing proportion of the population each year. Novel therapies on the horizon may slow the progress of AD symptoms and avoid cases altogether. Initiating treatment for the underlying pathology of AD would ideally be based on biomarker screening tools identifying pre-symptomatic individuals. Early-stage modeling provides estimates of potential outcomes and informs policy development. A time-to-event (TTE) simulation provided estimates of screening asymptomatic patients in the general population age ≥55 and treatment impact on the number of patients reaching AD. Patients were followed from AD screen until all-cause death. Baseline sensitivity and specificity were 0.87 and 0.78, with treatment on positive screen. Treatment slowed progression by 50%. Events were scheduled using literature-based age-dependent incidences of AD and death. The base case results indicated increased AD free years (AD-FYs) through delays in onset and a reduction of 20 AD cases per 1000 screened individuals. Patients completely avoiding AD accounted for 61% of the incremental AD-FYs gained. Total years of treatment per 1000 screened patients was 2,611. The number-needed-to-screen was 51 and the number-needed-to-treat was 12 to avoid one case of AD. One-way sensitivity analysis indicated that duration of screening sensitivity and rescreen interval impact AD-FYs the most. A two-way sensitivity analysis found that for a test with an extended duration of sensitivity (15 years) the number of AD cases avoided was 6,000-7,000 cases for a test with higher sensitivity and specificity (0.90, 0.90). This study yielded valuable parameter range estimates at an early stage in the study of screening for AD. Analysis identified duration of screening sensitivity as a key variable that may be unavailable from clinical trials.
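    The number-needed-to-screen figure above follows from the absolute risk reduction (cases avoided per person screened). A sketch of the arithmetic using the round numbers quoted in the abstract; with 20 cases avoided per 1000 screened this gives 50, so the reported 51 presumably reflects the simulation's unrounded outputs rather than these headline figures:

    ```python
    from math import ceil

    def number_needed_to_screen(cases_avoided, n_screened):
        """Number needed to screen = 1 / absolute risk reduction,
        conventionally rounded up to a whole person."""
        arr = cases_avoided / n_screened
        return ceil(1 / arr)

    nns = number_needed_to_screen(20, 1000)   # 1 / 0.02 -> 50
    ```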

  2. A systematic study on drug-response associated genes using baseline gene expressions of the Cancer Cell Line Encyclopedia

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoming; Yang, Jiasheng; Zhang, Yi; Fang, Yun; Wang, Fayou; Wang, Jun; Zheng, Xiaoqi; Yang, Jialiang

    2016-03-01

    We have studied drug-response associated (DRA) gene expressions by applying a systems biology framework to the Cancer Cell Line Encyclopedia data. More than 4,000 genes are inferred to be DRA for at least one drug, while the number of DRA genes for each drug varies dramatically, from almost 0 to 1,226. Functional enrichment analysis shows that the DRA genes are significantly enriched in genes associated with the cell cycle and the plasma membrane. Moreover, there might be two patterns of DRA genes between genders. DRA genes are significantly shared between males and females for most drugs, while very few DRA genes tend to be shared between the two genders for a few drugs targeting sex-specific cancers (e.g., PD-0332991 for breast cancer and ovarian cancer). Our analyses also show substantial differences in DRA genes between young and old samples, suggesting the necessity of considering age effects for personalized medicine in cancers. Lastly, differential module and key driver analyses confirm cell cycle related modules as the top differential ones for drug sensitivity. The analyses also reveal the role of TSPO, TP53, and many other immune or cell cycle related genes as important key drivers for DRA network modules. These key drivers provide new drug targets to improve the sensitivity of cancer therapy.

  3. The sensitivity of the ESA DELTA model

    NASA Astrophysics Data System (ADS)

    Martin, C.; Walker, R.; Klinkrad, H.

    Long-term debris environment models play a vital role in furthering our understanding of the future debris environment, and in aiding the determination of a strategy to preserve the Earth orbital environment for future use. By their very nature these models have to make certain assumptions to enable informative future projections to be made. Examples of these assumptions include the projection of future traffic, including launch and explosion rates, and the methodology used to simulate break-up events. To ensure a sound basis for future projections, and consequently for assessing the effectiveness of various mitigation measures, it is essential that the sensitivity of these models to variations in key assumptions is examined. The DELTA (Debris Environment Long Term Analysis) model, developed by QinetiQ for the European Space Agency, allows the future projection of the debris environment throughout Earth orbit. Extensive analyses with this model have been performed under the auspices of the ESA Space Debris Mitigation Handbook and following the recent upgrade of the model to DELTA 3.0. This paper draws on these analyses to present the sensitivity of the DELTA model to changes in key model parameters and assumptions. Specifically the paper will address the variation in future traffic rates, including the deployment of satellite constellations, and the variation in the break-up model and criteria used to simulate future explosion and collision events.

  4. Development of Responder Definitions for Fibromyalgia Clinical Trials

    PubMed Central

    Arnold, Lesley M.; Williams, David A.; Hudson, James I.; Martin, Susan A.; Clauw, Daniel J.; Crofford, Leslie J.; Wang, Fujun; Emir, Birol; Lai, Chinglin; Zablocki, Rong; Mease, Philip J.

    2011-01-01

    Objective To develop responder definitions for fibromyalgia clinical trials using key symptom and functional domains. Methods 24 candidate responder definitions were developed by expert consensus and evaluated in 12 randomized, placebo-controlled fibromyalgia trials of 4 medications. For each definition, treatment effects of the medication compared with placebo were analyzed using the Cochran-Mantel-Haenszel test or Chi Square test. A meta-analysis of the pooled results for the 4 medications established risk ratios to determine the definitions that best favored medication over placebo. Results Two definitions performed best in the analyses. Both definitions included ≥ 30% reduction in pain and ≥ 10% improvement in physical function. They differed in that one (FM30 short version) included ≥ 30% improvement in sleep or fatigue, and the other (FM30 long version) required ≥ 30% improvement in 2 of the following symptoms: sleep, fatigue, depression, anxiety, or cognition. In the analysis of both versions, the response rate was ≥ 15% for each medication and significantly greater than placebo. The risk ratio favoring drug over placebo (95% CI) in the pooled analysis for the FM30 short version was 1.50 (1.24, 1.82), P ≤ 0.0001; the FM30 long version was 1.60 (1.31, 1.96), P ≤ 0.00001. Conclusion Among the 24 responder definitions tested, 2 were identified as most sensitive in identifying response to treatment. The identification of responder definitions for fibromyalgia clinical trials that include assessments of key symptom and functional domains may improve the sensitivity of clinical trials to identify meaningful improvements, leading to improved management of fibromyalgia. PMID:21953205
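    The pooled risk ratios and confidence intervals above take the standard form for responder analyses: the ratio of response proportions with a Wald interval on the log scale. A sketch of the computation with invented counts, chosen only to reproduce a ratio near 1.5, not taken from the trials:

    ```python
    from math import exp, log, sqrt

    def risk_ratio_ci(a, n1, b, n2, z=1.96):
        """Risk ratio of responders (a/n1 treated vs b/n2 placebo) with a
        Wald 95% confidence interval computed on the log scale."""
        rr = (a / n1) / (b / n2)
        se = sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)   # SE of log(rr)
        lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
        return rr, lo, hi

    # Hypothetical counts: 90/300 respond on drug vs 60/300 on placebo
    rr, lo, hi = risk_ratio_ci(90, 300, 60, 300)   # rr = 1.5
    ```

    A lower confidence bound above 1.0, as in the abstract's pooled results, is what establishes that the definition favors drug over placebo.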

  5. [Evaluation of land resources carrying capacity of development zone based on planning environment impact assessment].

    PubMed

    Fu, Shi-Feng; Zhang, Ping; Jiang, Jin-Long

    2012-02-01

    Assessment of land resources carrying capacity is the key point of planning environment impact assessment and the main foundation for determining whether the planning could be implemented or not. With the help of the spatial analysis function of a Geographic Information System, and selecting altitude, slope, land use type, distance from resident land, distance from main traffic roads, and distance from environmentally sensitive areas as the sensitive factors, a comprehensive assessment of the ecological sensitivity and its spatial distribution in Zhangzhou Merchants Economic and Technological Development Zone, Fujian Province of East China was conducted, and the assessment results were combined with the planning land layout diagram for the ecological suitability analysis. In the Development Zone, 84.0% of resident land, 93.1% of industrial land, 86.0% of traffic land, and 76.0% of other constructive lands in planning were located in insensitive and gently sensitive areas; thus, the implementation of the land use planning generally had little impact on the ecological environment, and the land resources in the planning area were able to meet the land use demand. The assessment of the population carrying capacity with ecological land as the limiting factor indicated that, in considering the highly sensitive area and 60% of the moderately sensitive area as ecological land, the population within the Zone in the planning could reach 240,000, and the available land area per capita could be 134.0 m2. Such a planned population scale is appropriate, according to the related standards of constructive land.

  6. A statistical approach to nuclear fuel design and performance

    NASA Astrophysics Data System (ADS)

    Cunning, Travis Andrew

    As CANDU fuel failures can have significant economic and operational consequences on the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case---a hypothesized 80% reactor outlet header break loss of coolant accident. Using a Monte Carlo technique for input generation, 10^5 independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimension-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance, with an average sensitivity index of 48.93% on key output quantities. Pellet grain size and dish depth are also significant contributors, at 31.53% and 13.46%, respectively. A traditional limit of operating envelope case is also evaluated. This case produces output values that exceed the maximum values observed during the 10^5 Monte Carlo trials for all output quantities of interest. In many cases the difference between the predictions of the two methods is very prominent, and the highly conservative nature of the deterministic approach is demonstrated. A reliability analysis of CANDU fuel manufacturing parametric data, specifically pertaining to the quantification of fuel performance margins, has not been conducted previously. Key Words: CANDU, nuclear fuel, Cameco, fuel manufacturing, fuel modelling, fuel performance, fuel reliability, ELESTRES, ELOCA, dimensional reduction methods, global sensitivity analysis, deterministic safety analysis, probabilistic safety analysis.
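    The contrast drawn above between Monte Carlo sampling of manufacturing distributions and the limit-of-operating-envelope case can be sketched with a stand-in surrogate model. The ELESTRES/ELOCA codes and the real tolerance bands are not public, so the response function, parameter names, and all numbers below are invented; the point is only that bounding every input simultaneously bounds a monotone output:

    ```python
    import numpy as np

    def fuel_response(density, grain_size, dish_depth):
        """Stand-in monotone surrogate for a fuel-performance output;
        the coefficients are arbitrary illustrative values."""
        return 3.0 * density + 1.5 * grain_size + 0.5 * dish_depth

    rng = np.random.default_rng(1)
    n = 100_000
    # Manufacturing parameters sampled within (invented) tolerance bands
    density = rng.uniform(10.4, 10.7, n)
    grain   = rng.uniform(5.0, 12.0, n)
    dish    = rng.uniform(0.1, 0.5, n)
    mc_out  = fuel_response(density, grain, dish)

    # Limit-of-operating-envelope: every parameter at its bounding value
    loe_out = fuel_response(10.7, 12.0, 0.5)
    ```

    The envelope value necessarily dominates every sampled trial, illustrating why the deterministic approach is conservative relative to the sampled output distribution.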

  7. Securing Digital Audio using Complex Quadratic Map

    NASA Astrophysics Data System (ADS)

    Suryadi, MT; Satria Gunawan, Tjandra; Satria, Yudi

    2018-03-01

    In this digital era, exchanging data is common and easy to do, which makes it vulnerable to attack and manipulation by unauthorized parties. One data type that is vulnerable to attack is digital audio. We therefore need a data-securing method that is both robust and fast. One method that meets all of these criteria is securing the data using a chaos function. The chaos function used in this research is the complex quadratic map (CQM). For certain parameter values, the key stream generated by the CQM function passes all 15 NIST tests, meaning that the key stream generated using this CQM is proven to be random. In addition, samples of the encrypted digital audio, when tested using a goodness-of-fit test, are shown to be uniform, so securing digital audio using this method is not vulnerable to frequency-analysis attack. The key space is very large, about 8.1×10^31 possible keys, and the key sensitivity is very small, about 10^-10, so this method is also not vulnerable to brute-force attack. Finally, the processing speed for both the encryption and decryption processes is on average about 450 times faster than the digital audio's duration.
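    A keystream cipher of the kind described can be sketched by iterating the complex quadratic map z → z² + c and XOR-ing bytes derived from the orbit with the audio data. The constant c, the key-to-seed mapping, the byte extraction, and the bound-keeping fold below are illustrative assumptions, not the paper's parameters:

    ```python
    def cqm_keystream(key, n, c=-0.745 + 0.113j):
        """Byte stream from iterating z -> z*z + c (complex quadratic map).
        The seed z0 is derived from an integer key; illustrative only."""
        z = complex((key % 1000) / 1e4, ((key // 1000) % 1000) / 1e4)
        stream = []
        for _ in range(n):
            z = z * z + c
            if abs(z) > 2.0:        # fold escaping orbits back (sketch-only safeguard)
                z = z / abs(z)
            stream.append(int(abs(z) * 1e6) % 256)
        return bytes(stream)

    def xor_crypt(data, key):
        """XOR with the CQM keystream; applying it twice restores the data."""
        return bytes(d ^ k for d, k in zip(data, cqm_keystream(key, len(data))))

    msg = b"digital audio frame"
    ct = xor_crypt(msg, 123456789)
    pt = xor_crypt(ct, 123456789)
    ```

    Because XOR is its own inverse, encryption and decryption are the same operation, which is part of why such schemes can be fast; the security claims in the abstract rest on the statistical properties of the keystream, not on this toy construction.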

  8. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    An oil pipeline network is one of the most important facilities for energy transportation, but accidents on such a network may result in serious disasters. Analysis models for these accidents have been established mainly on the basis of three methods: event trees, accident simulation, and Bayesian networks. Among these, the Bayesian network is well suited to probabilistic analysis, but existing models do not consider all the important influencing factors, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of an oil pipeline network based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
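    A Bayesian-network accident probability of the kind described reduces to enumerating parent states against conditional probability tables. A toy sketch with invented node names and probabilities (a leak indicator and an environmental condition as parents of an accident node), not the paper's network:

    ```python
    # All probabilities below are invented for illustration.
    P_leak = 0.05    # P(leak)
    P_harsh = 0.30   # P(harsh environment)
    # CPT: P(accident | leak, harsh environment)
    P_acc = {(True, True): 0.60, (True, False): 0.30,
             (False, True): 0.02, (False, False): 0.005}

    def p_accident():
        """Marginal P(accident) by enumerating the parent states."""
        total = 0.0
        for leak in (True, False):
            for harsh in (True, False):
                p_parents = (P_leak if leak else 1 - P_leak) * \
                            (P_harsh if harsh else 1 - P_harsh)
                total += p_parents * P_acc[(leak, harsh)]
        return total

    p = p_accident()
    ```

    Sensitivity analysis on such a network amounts to perturbing one table entry (say P_leak) and observing the change in the marginal, which is the kind of analysis the model above is intended to support.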

  9. Integrated Decision Strategies for Skin Sensitization Hazard

    EPA Science Inventory

    One of the top priorities of the Interagency Coordinating Committee for the Validation of Alternative Methods (ICCVAM) is the identification and evaluation of non-animal alternatives for skin sensitization testing. Although skin sensitization is a complex process, the key biologi...

  10. NUMERICAL FLOW AND TRANSPORT SIMULATIONS SUPPORTING THE SALTSTONE FACILITY PERFORMANCE ASSESSMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G.

    2009-02-28

    The Saltstone Disposal Facility Performance Assessment (PA) is being revised to incorporate requirements of Section 3116 of the Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005 (NDAA), and updated data and understanding of vault performance since the 1992 PA (Cook and Fowler 1992) and related Special Analyses. A hybrid approach was chosen for modeling contaminant transport from vaults and future disposal cells to exposure points. A higher resolution, largely deterministic, analysis is performed on a best-estimate Base Case scenario using the PORFLOW numerical analysis code. A few additional sensitivity cases are simulated to examine alternative scenarios and parameter settings. Stochastic analysis is performed on a simpler representation of the SDF system using the GoldSim code to estimate uncertainty and sensitivity about the Base Case. This report describes development of PORFLOW models supporting the SDF PA, and presents sample results to illustrate model behaviors and define impacts relative to key facility performance objectives. The SDF PA document, when issued, should be consulted for a comprehensive presentation of results.

  11. Isothermal Amplification Methods for the Detection of Nucleic Acids in Microfluidic Devices

    PubMed Central

    Zanoli, Laura Maria; Spoto, Giuseppe

    2012-01-01

    Diagnostic tools for biomolecular detection need to fulfill specific requirements in terms of sensitivity, selectivity and high-throughput in order to widen their applicability and to minimize the cost of the assay. The nucleic acid amplification is a key step in DNA detection assays. It contributes to improving the assay sensitivity by enabling the detection of a limited number of target molecules. The use of microfluidic devices to miniaturize amplification protocols reduces the required sample volume and the analysis times and offers new possibilities for the process automation and integration in one single device. The vast majority of miniaturized systems for nucleic acid analysis exploit the polymerase chain reaction (PCR) amplification method, which requires repeated cycles of three or two temperature-dependent steps during the amplification of the nucleic acid target sequence. In contrast, low temperature isothermal amplification methods have no need for thermal cycling thus requiring simplified microfluidic device features. Here, the use of miniaturized analysis systems using isothermal amplification reactions for the nucleic acid amplification will be discussed. PMID:25587397

  12. Negative electrospray ionization on porous supporting tips for mass spectrometric analysis: electrostatic charging effect on detection sensitivity and its application to explosive detection.

    PubMed

    Wong, Melody Yee-Man; Man, Sin-Heng; Che, Chi-Ming; Lau, Kai-Chung; Ng, Kwan-Ming

    2014-03-21

    Thanks to their simplicity and easy manipulation, porous substrate-based ESI-MS techniques have been widely applied to the direct analysis of different types of samples in positive ion mode. However, studies and applications of this technique in negative ion mode are sparse. A key challenge could be the ease of electrical discharge on supporting tips upon the application of negative voltage. The aim of this study is to investigate the effect of supporting materials, including polyester, polyethylene and wood, on the detection sensitivity of a porous substrate-based negative ESI-MS technique. Using nitrobenzene derivatives and nitrophenol derivatives as the target analytes, it was found that hydrophobic materials (i.e., polyethylene and polyester), with a higher tendency to accumulate negative charge, could enhance the detection sensitivity towards nitrobenzene derivatives via electron-capture ionization, whereas compounds with electron affinities lower than the cut-off value (1.13 eV) were not detected. Nitrophenol derivatives with pKa smaller than 9.0 could be detected in the form of deprotonated ions, whereas polar materials (i.e., wood), which might undergo competitive deprotonation with the analytes, could suppress the detection sensitivity. With the investigation of the material effects on the detection sensitivity, the porous substrate-based negative ESI-MS method was developed and applied to the direct detection of two commonly encountered explosives in complex samples.

  13. Effect of Metformin on Plasma Fibrinogen Concentrations: A Systematic Review and Meta-Analysis of Randomized Placebo-Controlled Trials.

    PubMed

    Simental-Mendia, Luis E; Pirro, Matteo; Atkin, Stephen L; Banach, Maciej; Mikhailidis, Dimitri P; Sahebkar, Amirhossein

    2018-01-01

    Fibrinogen is a key mediator of thrombosis and it has been implicated in the pathogenesis of atherosclerosis. Because metformin has shown a potential protective effect on different atherothrombotic risk factors, we assessed in this meta-analysis its effect on plasma fibrinogen concentrations. A systematic review and meta-analysis was carried out to identify randomized placebo-controlled trials evaluating the effect of metformin administration on fibrinogen levels. The search included PubMed-Medline, Scopus, ISI Web of Knowledge and Google Scholar databases (by June 2, 2017) and quality of studies was performed according to Cochrane criteria. Quantitative data synthesis was conducted using a random-effects model and sensitivity analysis by the leave-one-out method. Meta-regression analysis was performed to assess the modifiers of treatment response. Meta-analysis of data from 9 randomized placebo-controlled clinical trials with 2302 patients comprising 10 treatment arms did not suggest a significant change in plasma fibrinogen concentrations following metformin therapy (WMD: -0.25 g/L, 95% CI: -0.53, 0.04, p = 0.092). The effect size was robust in the leave-one-out sensitivity analysis and remained non-significant after omission of each single study from the meta-analysis. No significant effect of metformin on plasma fibrinogen concentrations was demonstrated in the current meta-analysis. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
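    The pooling and sensitivity procedure described above (random-effects model plus leave-one-out) can be sketched in a few lines; the per-study weighted mean differences and variances below are hypothetical placeholders, not the trial data from this meta-analysis:

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and 95% CI."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                          # fixed-effect weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)    # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                # between-study variance
    w_re = 1.0 / (variances + tau2)              # random-effects weights
    theta = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return theta, (theta - 1.96 * se, theta + 1.96 * se)

def leave_one_out(effects, variances):
    """Re-pool omitting each study in turn (sensitivity analysis)."""
    return [random_effects_pool(np.delete(effects, i), np.delete(variances, i))[0]
            for i in range(len(effects))]

# Hypothetical per-study mean differences in fibrinogen (g/L) and variances.
wmd = [-0.4, -0.1, -0.3, 0.1, -0.2]
var = [0.04, 0.02, 0.05, 0.03, 0.02]
pooled, ci = random_effects_pool(wmd, var)
loo = leave_one_out(wmd, var)
```

    Dropping each study in turn and re-pooling, as in the abstract's leave-one-out analysis, shows whether any single trial drives the pooled estimate.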

  14. Sensitivity analysis of Monju using ERANOS with JENDL-4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

    This paper deals with sensitivity analysis using JENDL-4.0 nuclear data applied to the Monju reactor. In 2010 the Japan Atomic Energy Agency (JAEA) released a new set of nuclear data: JENDL-4.0. This new evaluation is expected to contain improved data on actinides and covariance matrices. Covariance matrices are a key point in quantification of uncertainties due to basic nuclear data. For sensitivity analysis, the well-established ERANOS [1] code was chosen because of its integrated modules that allow users to perform a sensitivity analysis of complex reactor geometries. A JENDL-4.0 cross-section library is not available for ERANOS. Therefore a cross-section library had to be made from the original nuclear data set, available as ENDF formatted files. This is achieved by using the following codes: NJOY, CALENDF, MERGE and GECCO, in order to create a library for the ECCO cell code (part of ERANOS). In order to verify the accuracy of the new ECCO library, two benchmark experiments have been analyzed: the MZA and MZB cores of the MOZART program measured at the ZEBRA facility in the UK. These were chosen due to their similarity to the Monju core. Using the JENDL-4.0 ECCO library we have analyzed the criticality of Monju during the restart in 2010. We have obtained good agreement with the measured criticality. Perturbation calculations have been performed between JENDL-3.3 and JENDL-4.0 based models. The isotopes ²³⁹Pu, ²³⁸U, ²⁴¹Am and ²⁴¹Pu account for a major part of the observed differences. (authors)

  15. Efficient molecular screening of Lynch syndrome by specific 3' promoter methylation of the MLH1 or BRAF mutation in colorectal cancer with high-frequency microsatellite instability.

    PubMed

    Nakagawa, Hitoshi; Nagasaka, Takeshi; Cullings, Harry M; Notohara, Kenji; Hoshijima, Naoko; Young, Joanne; Lynch, Henry T; Tanaka, Noriaki; Matsubara, Nagahide

    2009-06-01

    It is sometimes difficult to diagnose Lynch syndrome by the simple but strict clinical criteria, or even by definitive genetic testing for causative germline mutations of mismatch repair genes. Thus, a practical and efficient screening strategy to select patients highly likely to have Lynch syndrome is exceedingly desirable. We performed a comprehensive study to evaluate the methylation status of the whole MLH1 promoter region by direct bisulfite sequencing of the entire MLH1 promoter region in Lynch and non-Lynch colorectal cancers (CRCs). We then established a convenient assay to detect methylation in key CpG islands responsible for the silencing of MLH1 expression. We studied the methylation status of MLH1 as well as the CpG island methylator phenotype (CIMP) and immunohistochemical analysis of mismatch repair proteins in 16 cases of Lynch CRC and 19 cases of sporadic CRCs with high-frequency microsatellite instability (MSI-H). Sensitivity to detect Lynch syndrome by MLH1 (CCAAT) methylation was 88% and the specificity was 84%. The positive likelihood ratio (PLR) was 5.5 and the negative likelihood ratio (NLR) was 0.15. Sensitivity by mutational analysis of BRAF was 100%, specificity was 84%, PLR was 6.3 and NLR was zero. By CIMP analysis, sensitivity was 88%, specificity was 79%, PLR was 4.2, and NLR was 0.16. BRAF mutation or MLH1 methylation analysis combined with MSI testing could be a good alternative to screen Lynch syndrome patients in a cost-effective manner. Although the assay for CIMP status also showed acceptable sensitivity and specificity, it may not be practical because of its rather complicated assay procedure.
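    The reported screening statistics follow directly from a 2x2 table. The sketch below uses counts consistent with the reported percentages (14/16 Lynch cases methylation-positive, 16/19 sporadic MSI-H cases negative), which are an assumption rather than the paper's exact table:

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)        # true positives among diseased
    spec = tn / (tn + fp)        # true negatives among non-diseased
    plr = sens / (1 - spec)      # positive likelihood ratio
    nlr = (1 - sens) / spec      # negative likelihood ratio
    return sens, spec, plr, nlr

# Assumed counts for MLH1 (CCAAT) methylation as the screening test:
sens, spec, plr, nlr = screening_metrics(tp=14, fn=2, tn=16, fp=3)
print(round(sens, 2), round(spec, 2), round(plr, 1), round(nlr, 2))
# → 0.88 0.84 5.5 0.15
```

    These counts reproduce the abstract's MLH1 methylation figures (sensitivity 88%, specificity 84%, PLR 5.5, NLR 0.15), which supports reading the reported ratios as derived from the 16 Lynch and 19 sporadic MSI-H cases.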

  16. Classification of Bitter Orange Essential Oils According to Fruit Ripening Stage by Untargeted Chemical Profiling and Machine Learning.

    PubMed

    Taghadomi-Saberi, Saeedeh; Mas Garcia, Sílvia; Allah Masoumi, Amin; Sadeghi, Morteza; Marco, Santiago

    2018-06-13

    The quality and composition of bitter orange essential oils (EOs) strongly depend on the ripening stage of the citrus fruit: the concentration of volatile compounds, and consequently the organoleptic perception, varies. While this can be detected by trained humans, we propose an objective approach for assessing the ripening stage of bitter oranges from the volatile composition of their EO. The method is based on the combined use of headspace gas chromatography–mass spectrometry (HS-GC-MS) and artificial neural networks (ANN) for predictive modeling. Data obtained from the HS-GC-MS analysis were preprocessed to select relevant peaks in the total ion chromatogram as input features for the ANN. Results showed that key volatile compounds have enough predictive power to accurately classify the EOs according to their ripening stage for different applications. A sensitivity analysis detected the key compounds for identifying the ripening stage. This study provides a novel strategy for the quality control of bitter orange EO without subjective methods.

  17. Simulation Studies of Satellite Laser CO2 Mission Concepts

    NASA Technical Reports Server (NTRS)

    Kawa, Stephan Randy; Mao, J.; Abshire, J. B.; Collatz, G. J.; Sun X.; Weaver, C. J.

    2011-01-01

    Results of mission simulation studies are presented for a laser-based atmospheric CO2 sounder. The simulations are based on real-time carbon cycle process modeling and data analysis. The mission concept corresponds to ASCENDS as recommended by the US National Academy of Sciences Decadal Survey. Compared to passive sensors, active (lidar) sensing of CO2 from space has several potentially significant advantages that hold promise to advance CO2 measurement capability in the next decade. Although the precision and accuracy requirements remain at unprecedented levels of stringency, analysis of possible instrument technology indicates that such sensors are more than feasible. Radiative transfer model calculations, an instrument model with representative errors, and a simple retrieval approach complete the cycle from "nature" run to "pseudodata" CO2. Several mission and instrument configuration options are examined, and the sensitivity to key design variables is shown. Examples are also shown of how the resulting pseudo-measurements might be used to address key carbon cycle science questions.

  18. Sensitivity of Rayleigh wave ellipticity and implications for surface wave inversion

    NASA Astrophysics Data System (ADS)

    Cercato, Michele

    2018-04-01

    The use of Rayleigh wave ellipticity has gained increasing popularity in recent years for investigating earth structures, especially for near-surface soil characterization. In spite of its widespread application, the sensitivity of the ellipticity function to the soil structure has rarely been explored in a comprehensive and systematic manner. To this end, a new analytical method is presented for computing the sensitivity of Rayleigh wave ellipticity with respect to the structural parameters of a layered elastic half-space. This method takes advantage of the minor decomposition of the surface wave eigenproblem and is numerically stable at high frequency. This numerical procedure allowed the sensitivity to be retrieved for typical near-surface and crustal geological scenarios, pointing out the key parameters for ellipticity interpretation under different circumstances. On this basis, a thorough analysis is performed to assess how ellipticity data can efficiently complement surface wave dispersion information in a joint inversion algorithm. The results of synthetic and real-world examples are illustrated to analyse quantitatively the diagnostic potential of the ellipticity data with respect to the soil structure, focusing on the possible sources of misinterpretation in data inversion.

  19. Economic analysis of uricase production under uncertainty: Contrast of chromatographic purification and aqueous two-phase extraction (with and without PEG recycle).

    PubMed

    Torres-Acosta, Mario A; Aguilar-Yáñez, José M; Rito-Palomares, Marco; Titchener-Hooker, Nigel J

    2016-01-01

    Uricase, an enzyme absent in humans, catalyses the breakdown of uric acid, the key molecule leading to gout, into allantoin. It has been produced as a PEGylated pharmaceutical where the purification is performed through three sequential chromatographic columns. More recently an aqueous two-phase system (ATPS) was reported that could recover Uricase with high yield and purity. Although the use of ATPS can decrease cost and time, it also generates a large amount of waste; the ability to recycle key components of ATPS is therefore of interest. Economic modelling is a powerful tool that allows the bioprocess engineer to compare possible outcomes and find areas where further research or optimization might be required, without recourse to extensive experiments and time. This research provides an economic analysis, using the commercial software BioSolve, of the strategies for Uricase production: chromatographic and ATPS, and includes a third bioprocess that uses material recycling. The key parameters that affect the process the most were located via a sensitivity analysis and evaluated with a Monte Carlo analysis. Results show that ATPS is far less expensive than chromatography, but that there is a region where the costs of production of the two bioprocesses overlap. Furthermore, recycling does not impact the cost of production. This study serves to provide a framework for the economic analysis of Uricase production using alternative techniques. © 2015 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
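    A Monte Carlo comparison of two purification routes, as performed in this study with BioSolve, can be mimicked with a toy cost-of-goods model; all distributions and figures below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Hypothetical cost drivers; every figure here is an illustrative assumption.
titre   = rng.normal(1.0, 0.15, N)        # product titre, g/L
yield_c = rng.uniform(0.55, 0.75, N)      # recovery yield, chromatography
yield_a = rng.uniform(0.80, 0.95, N)      # recovery yield, ATPS
resin   = rng.normal(5000, 800, N)        # chromatography consumables, $/batch
peg     = rng.normal(1200, 300, N)        # ATPS phase-forming chemicals, $/batch
base    = 8000                            # shared upstream cost, $/batch

grams = np.clip(titre, 0.5, None) * 100   # product per 100 L batch, g
cog_chrom = (base + resin) / (grams * yield_c)   # cost of goods, $/g
cog_atps  = (base + peg)  / (grams * yield_a)

# Fraction of ATPS draws falling above the cheapest chromatography draws,
# i.e. the region where the two cost distributions overlap.
overlap = np.mean(cog_atps > np.min(cog_chrom))
```

    Under these assumptions the ATPS route is cheaper on average, yet the two cost distributions overlap, mirroring the qualitative finding reported above.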

  20. Economic analysis of uricase production under uncertainty: Contrast of chromatographic purification and aqueous two‐phase extraction (with and without PEG recycle)

    PubMed Central

    Torres‐Acosta, Mario A.; Aguilar‐Yáñez, José M.; Rito‐Palomares, Marco

    2015-01-01

    Uricase, an enzyme absent in humans, catalyses the breakdown of uric acid, the key molecule leading to gout, into allantoin. It has been produced as a PEGylated pharmaceutical where the purification is performed through three sequential chromatographic columns. More recently an aqueous two-phase system (ATPS) was reported that could recover Uricase with high yield and purity. Although the use of ATPS can decrease cost and time, it also generates a large amount of waste; the ability to recycle key components of ATPS is therefore of interest. Economic modelling is a powerful tool that allows the bioprocess engineer to compare possible outcomes and find areas where further research or optimization might be required, without recourse to extensive experiments and time. This research provides an economic analysis, using the commercial software BioSolve, of the strategies for Uricase production: chromatographic and ATPS, and includes a third bioprocess that uses material recycling. The key parameters that affect the process the most were located via a sensitivity analysis and evaluated with a Monte Carlo analysis. Results show that ATPS is far less expensive than chromatography, but that there is a region where the costs of production of the two bioprocesses overlap. Furthermore, recycling does not impact the cost of production. This study serves to provide a framework for the economic analysis of Uricase production using alternative techniques. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 32:126–133, 2016 PMID:26561271

  1. Multiobjective robust design of the double wishbone suspension system based on particle swarm optimization.

    PubMed

    Cheng, Xianfu; Lin, Yuqun

    2014-01-01

    The performance of the suspension system is one of the most important factors in vehicle design. For the double wishbone suspension system, conventional deterministic optimization does not consider any deviations of design parameters, so design sensitivity analysis and robust design optimization are proposed. In this study, the design parameters of the robust optimization are the positions of the key points, and the random factors are the uncertainties in manufacturing. A simplified model of the double wishbone suspension is established in the software ADAMS. Sensitivity analysis is utilized to determine the main design variables. Then, the simulation experiment is arranged and a Latin hypercube design is adopted to find the initial points. The Kriging model is employed for fitting the mean and variance of the quality characteristics according to the simulation results. Further, a particle swarm optimization (PSO) method is applied, and a trade-off between the mean and deviation of performance is made to solve the robust optimization problem of the double wishbone suspension system.
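    The robust-optimization step described above can be illustrated with a minimal particle swarm optimiser applied to a mean-plus-deviation objective; the objective here is a toy stand-in for the Kriging-fitted mean and variance, not the suspension model itself:

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser (global-best topology)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    d = len(lo)
    x = rng.uniform(lo, hi, (n_particles, d))      # positions
    v = np.zeros((n_particles, d))                 # velocities
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pval)]                     # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[np.argmin(pval)]
    return g, pval.min()

def robust_obj(p):
    """Toy robust objective: predicted mean plus a penalty on variability."""
    mu = (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2   # stand-in for Kriging mean
    sigma = 0.1 * abs(p[0] * p[1])               # stand-in for Kriging std
    return mu + 3.0 * sigma

best, val = pso(robust_obj, bounds=[(-5, 5), (-5, 5)])
```

    Weighting the deviation term trades nominal performance against robustness, which is the essence of the mean/deviation trade-off the abstract describes.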

  2. Probing the pH sensitivity of R-phycoerythrin: investigations of active conformational and functional variation.

    PubMed

    Liu, Lu-Ning; Su, Hai-Nan; Yan, Shi-Gan; Shao, Si-Mi; Xie, Bin-Bin; Chen, Xiu-Lan; Zhang, Xi-Ying; Zhou, Bai-Cheng; Zhang, Yu-Zhong

    2009-07-01

    Crystal structures of phycobiliproteins have provided valuable information regarding the conformations and amino acid organizations of peptides and chromophores, and enable us to investigate their structural and functional relationships with respect to environmental variations. In this work, we explored the pH-induced conformational and functional dynamics of R-phycoerythrin (R-PE) by means of absorption, fluorescence and circular dichroism spectra, together with analysis of its crystal structure. R-PE presents stronger functional stability than structural stability over the pH range of 3.5-10. Beyond this range, pronounced functional and structural changes occur. Crystal structure analysis shows that the tertiary structure of R-PE is fixed by several key anchoring points of the protein. With this specific association, the fundamental structure of R-PE is stabilized to present physiological spectroscopic properties, while local variations in protein peptides are also allowed in response to environmental disturbances. The functional stability and relative structural sensitivity of R-PE allow environmental adaptation.

  3. Economic evaluation of algae biodiesel based on meta-analyses

    NASA Astrophysics Data System (ADS)

    Zhang, Yongli; Liu, Xiaowei; White, Mark A.; Colosi, Lisa M.

    2017-08-01

    The objective of this study is to elucidate the economic viability of algae-to-energy systems at a large scale, by developing a meta-analysis of five previously published economic evaluations of systems producing algae biodiesel. Data from the original studies were harmonised into a standardised framework using financial and technical assumptions. Results suggest that the selling price of algae biodiesel under the base case would be $5.00-10.31/gal, higher than the selected benchmarks: $3.77/gal for petroleum diesel, and $4.21/gal for commercial biodiesel (B100) from conventional vegetable oil or animal fat. However, the projected selling price of algal biodiesel ($2.76-4.92/gal), following anticipated improvements, would be competitive. A scenario-based sensitivity analysis reveals that the price of algae biodiesel is most sensitive to algae biomass productivity, algae oil content, and algae cultivation cost. This indicates that improvements in the yield, quality, and cost of algae feedstock could be the key factors in making algae-derived biodiesel economically viable.

  4. An analysis of parameter sensitivities of preference-inspired co-evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Mansor, Maszatul M.; Purshouse, Robin C.; Fleming, Peter J.

    2015-10-01

    Many-objective optimisation problems remain challenging for many state-of-the-art multi-objective evolutionary algorithms. Preference-inspired co-evolutionary algorithms (PICEAs) which co-evolve the usual population of candidate solutions with a family of decision-maker preferences during the search have been demonstrated to be effective on such problems. However, it is unknown whether PICEAs are robust with respect to the parameter settings. This study aims to address this question. First, a global sensitivity analysis method - the Sobol' variance decomposition method - is employed to determine the relative importance of the parameters controlling the performance of PICEAs. Experimental results show that the performance of PICEAs is controlled for the most part by the number of function evaluations. Next, we investigate the effect of key parameters identified from the Sobol' test and the genetic operators employed in PICEAs. Experimental results show improved performance of the PICEAs as more preferences are co-evolved. Additionally, some suggestions for genetic operator settings are provided for non-expert users.
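    The Sobol' variance decomposition used in this parameter study can be illustrated with a pick-freeze (Saltelli-type) estimator of first-order indices; the toy model below is not a PICEA, but its exact indices are known (S = (0.8, 0.2)), so the estimator can be checked:

```python
import numpy as np

def sobol_first_order(f, d, n=100_000, seed=0):
    """First-order Sobol' indices via the Saltelli pick-freeze estimator,
    for inputs independently uniform on [0, 1]."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((2, n, d))                 # two independent sample matrices
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))       # total output variance
    s = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # replace only column i
        s[i] = np.mean(fB * (f(ABi) - fA)) / var
    return s

# Toy model 2*x1 + x2: Var = 5/12, analytic indices S = (0.8, 0.2).
f = lambda x: 2 * x[:, 0] + x[:, 1]
s = sobol_first_order(f, d=2)
```

    First-order indices apportion output variance to individual parameters, which is how a Sobol' test ranks, for example, the number of function evaluations above other algorithm settings.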

  5. Electrophysiological Correlates of Individual Differences in Perception of Audiovisual Temporal Asynchrony

    PubMed Central

    Kaganovich, Natalya; Schumaker, Jennifer

    2016-01-01

    Sensitivity to the temporal relationship between auditory and visual stimuli is key to efficient audiovisual integration. However, even adults vary greatly in their ability to detect audiovisual temporal asynchrony. What underlies this variability is currently unknown. We recorded event-related potentials (ERPs) while participants performed a simultaneity judgment task on a range of audiovisual (AV) and visual-auditory (VA) stimulus onset asynchronies (SOAs) and compared ERP responses in good and poor performers to the 200 ms SOA, which showed the largest individual variability in the number of synchronous perceptions. Analysis of ERPs to the VA200 stimulus yielded no significant results. However, those individuals who were more sensitive to the AV200 SOA had significantly more positive voltage between 210 and 270 ms following the sound onset. In a follow-up analysis, we showed that the mean voltage within this window predicted approximately 36% of variability in sensitivity to AV temporal asynchrony in a larger group of participants. The relationship between the ERP measure in the 210-270 ms window and accuracy on the simultaneity judgment task also held for two other AV SOAs with significant individual variability - 100 and 300 ms. Because the identified window was time-locked to the onset of sound in the AV stimulus, we conclude that sensitivity to AV temporal asynchrony is shaped to a large extent by the efficiency in the neural encoding of sound onsets. PMID:27094850

  6. Smelling the Diagnosis: The Electronic Nose as Diagnostic Tool in Inflammatory Arthritis. A Case-Reference Study.

    PubMed

    Brekelmans, Marjolein P; Fens, Niki; Brinkman, Paul; Bos, Lieuwe D; Sterk, Peter J; Tak, Paul P; Gerlag, Daniëlle M

    2016-01-01

    To investigate whether exhaled breath analysis using an electronic nose can identify differences between inflammatory joint diseases and healthy controls. In a cross-sectional study, the exhaled breath of 21 rheumatoid arthritis (RA) and 18 psoriatic arthritis (PsA) patients with active disease was compared to 21 healthy controls using an electronic nose (Cyranose 320; Smiths Detection, Pasadena, CA, USA). Breathprints were analyzed with principal component analysis, discriminant analysis, and area under curve (AUC) of receiver operating characteristics (ROC) curves. Volatile organic compounds (VOCs) were identified by gas chromatography and mass spectrometry (GC-MS), and relationships between breathprints and markers of disease activity were explored. Breathprints of RA patients could be distinguished from controls with an accuracy of 71% (AUC 0.75, 95% CI 0.60-0.90, sensitivity 76%, specificity 67%). Breathprints from PsA patients were separated from controls with 69% accuracy (AUC 0.77, 95% CI 0.61-0.92, sensitivity 72%, specificity 71%). Distinction between exhaled breath of RA and PsA patients exhibited an accuracy of 69% (AUC 0.72, 95% CI 0.55-0.89, sensitivity 71%, specificity 72%). There was a positive correlation in RA patients of exhaled breathprints with disease activity score (DAS28) and number of painful joints. GC-MS identified seven key VOCs that significantly differed between the groups. Exhaled breath analysis by an electronic nose may play a role in differential diagnosis of inflammatory joint diseases. Data from this study warrant external validation.
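    The analysis chain named in this abstract (dimensionality reduction, discriminant analysis, ROC AUC) can be sketched with scikit-learn; the synthetic "breathprints" and the class-dependent sensor shift below are assumptions for illustration, not the study's data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic breathprints: 60 patients vs 60 controls, 16 sensor features,
# with a class-dependent shift on the first 4 sensors.
X = rng.normal(size=(120, 16))
y = np.repeat([0, 1], 60)
X[y == 1, :4] += 1.5

# Principal component reduction followed by a linear discriminant,
# scored with cross-validated probabilities and ROC AUC.
model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
prob = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
auc = roc_auc_score(y, prob)
```

    Cross-validated probabilities avoid the optimistic bias of scoring the model on its own training data, which matters when, as here, the headline result is a classification accuracy.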

  7. Establishing the Capability of a 1D SVAT Modelling Scheme in Predicting Key Biophysical Vegetation Characterisation Parameters

    NASA Astrophysics Data System (ADS)

    Ireland, Gareth; Petropoulos, George P.; Carlson, Toby N.; Purdy, Sarah

    2015-04-01

    Sensitivity analysis (SA) is an integral and important validation check of a computer simulation model before it is used to perform any kind of analysis. Here, we present the results of a SA performed on the SimSphere Soil Vegetation Atmosphere Transfer (SVAT) model utilising a cutting-edge and robust Global Sensitivity Analysis (GSA) approach, based on the use of the Gaussian Emulation Machine for Sensitivity Analysis (GEM-SA) tool. The sensitivity of the following model outputs was evaluated: the ambient CO2 concentration and the rate of CO2 uptake by the plant, the ambient O3 concentration, the flux of O3 from the air to the plant/soil boundary, and the flux of O3 taken up by the plant alone. The most sensitive model inputs for the majority of model outputs were related to the structural properties of vegetation, namely the Leaf Area Index, Fractional Vegetation Cover, Cuticle Resistance and Vegetation Height. The external CO2 in the leaf and the O3 concentration in the air input parameters also exhibited significant influence on model outputs. This work presents a very important step towards an all-inclusive evaluation of SimSphere. Indeed, results from this study contribute decisively towards establishing its capability as a useful teaching and research tool in modelling Earth's land surface interactions. This is of considerable importance in the light of the rapidly expanding use of this model worldwide, which also includes research conducted by various Space Agencies examining its synergistic use with Earth Observation data towards the development of operational products at a global scale. This research was supported by the European Commission Marie Curie Re-Integration Grant "TRANSFORM-EO". SimSphere is currently maintained and freely distributed by the Department of Geography and Earth Sciences at Aberystwyth University (http://www.aber.ac.uk/simsphere).
Keywords: CO2 flux, ambient CO2, O3 flux, SimSphere, Gaussian process emulators, BACCO GEM-SA, TRANSFORM-EO.

  8. The Impact of Parametric Uncertainties on Biogeochemistry in the E3SM Land Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricciuto, Daniel; Sargsyan, Khachik; Thornton, Peter

    We conduct a global sensitivity analysis (GSA) of the Energy Exascale Earth System Model (E3SM) land model (ELM) to calculate the sensitivity of five key carbon cycle outputs to 68 model parameters. This GSA is conducted by first constructing a Polynomial Chaos (PC) surrogate via a new Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth, leading to a sparse, high-dimensional PC surrogate with 3,000 model evaluations. The PC surrogate allows efficient extraction of GSA information, leading to further dimensionality reduction. The GSA is performed at 96 FLUXNET sites covering multiple plant functional types (PFTs) and climate conditions. About 20 of the model parameters are identified as sensitive, with the rest being relatively insensitive across all outputs and PFTs. These sensitivities are dependent on PFT, and are relatively consistent among sites within the same PFT. The five model outputs have a majority of their highly sensitive parameters in common. A common subset of sensitive parameters is also shared among PFTs, but some parameters are specific to certain types (e.g., deciduous phenology). In conclusion, the relative importance of these parameters shifts significantly among PFTs and with climatic variables such as mean annual temperature.

  9. The Impact of Parametric Uncertainties on Biogeochemistry in the E3SM Land Model

    DOE PAGES

    Ricciuto, Daniel; Sargsyan, Khachik; Thornton, Peter

    2018-02-27

    We conduct a global sensitivity analysis (GSA) of the Energy Exascale Earth System Model (E3SM) land model (ELM) to calculate the sensitivity of five key carbon cycle outputs to 68 model parameters. This GSA is conducted by first constructing a Polynomial Chaos (PC) surrogate via a new Weighted Iterative Bayesian Compressive Sensing (WIBCS) algorithm for adaptive basis growth, leading to a sparse, high-dimensional PC surrogate with 3,000 model evaluations. The PC surrogate allows efficient extraction of GSA information, leading to further dimensionality reduction. The GSA is performed at 96 FLUXNET sites covering multiple plant functional types (PFTs) and climate conditions. About 20 of the model parameters are identified as sensitive, with the rest being relatively insensitive across all outputs and PFTs. These sensitivities are dependent on PFT, and are relatively consistent among sites within the same PFT. The five model outputs have a majority of their highly sensitive parameters in common. A common subset of sensitive parameters is also shared among PFTs, but some parameters are specific to certain types (e.g., deciduous phenology). In conclusion, the relative importance of these parameters shifts significantly among PFTs and with climatic variables such as mean annual temperature.

  10. Sensitivity analysis in a life cycle assessment of an aged red wine production from Catalonia, Spain.

    PubMed

    Meneses, M; Torres, C M; Castells, F

    2016-08-15

    Sustainability in agriculture and food processing is an issue of clearly growing interest, especially for products where consumers have particular awareness of the environmental profile. This is the case for the wine industry, which depends on grape production, winemaking and bottling. Viticulture, and agricultural production generally, is also significantly affected by climate variations. The aim of this article is to determine the environmental load of an aged red wine from a winery in Catalonia, Spain, over its entire life cycle, including sensitivity analysis of the main parameters related to cultivation, vinification and bottling. The life cycle assessment (LCA) methodology is used for the environmental analysis. In a first step, life cycle inventory (LCI) data were collected by questionnaires and interviews with the winemaker; all data are actual operating data and all the stages involved in the production have been taken into account (viticulture, vinification, bottling and the disposal subsystem). Data were then used to determine the environmental profile by a life cycle impact assessment using the ReCiPe method. Annual variability in environmental performance stresses the importance of including timeline analysis in the wine sector. For this reason, the study is accompanied by a sensitivity analysis carried out by a Monte Carlo simulation that takes into account the uncertainty and variability of the parameters used. In this manner, the results are presented with confidence intervals to provide a wider view of the environmental issues derived from the activities of the studied wine estate, regardless of the eventualities of a specific harvesting year. Since the beverage packaging has an important influence in this case, a dataset for the production of green glass was adapted to reflect the actual recycling situation in Spain.
Furthermore, a hypothetical variation of the glass-recycling rate in the glass production completes this article, as a key variable of the sensitivity analysis, in order to show the potential reduction of total greenhouse gas emissions. It was found that in almost all categories the production of the glass bottles has the highest environmental impact (10%-80% depending on the impact category), followed by the viticulture stage, i.e. the agricultural activities (17%-84% depending on the impact category). The vinification step, i.e. the winemaking itself, has an almost negligible effect on the overall load (1%-5%). The sensitivity analysis showed that the results do not differ by more than ±4% from the expected values, except for the water depletion indicator. With the variation of the recycling rate, it could be shown that an increase in the rate from 60% to 85% allows for a reduction of 102 g CO2eq. per bottle (-11.1%). The results show that glass production causes the highest environmental load. The key parameters that determine the impact are the recycling rate and the bottle weight. Glass container deposit legislation might be a promising way to enhance glass recycling. Lightweight bottles and alternative packaging should also be considered. Copyright © 2016 Elsevier B.V. All rights reserved.
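    The effect of the glass-recycling rate on per-bottle emissions can be illustrated with a simple linear sketch; the emission factors and masses below are assumptions chosen for illustration and do not reproduce the study's inventory data:

```python
# Illustrative linear model of per-bottle greenhouse gas emissions as a
# function of the glass recycling rate r. All constants are assumptions.
EF_VIRGIN   = 1.10   # kg CO2eq per kg glass, virgin production (assumed)
EF_RECYCLED = 0.55   # kg CO2eq per kg glass, from cullet (assumed)
GLASS_KG    = 0.60   # bottle mass, kg (assumed)
OTHER_KG    = 0.45   # rest of the life cycle, kg CO2eq per bottle (assumed)

def bottle_ghg(r):
    """Total kg CO2eq per bottle at recycling rate r (0..1)."""
    glass = GLASS_KG * ((1 - r) * EF_VIRGIN + r * EF_RECYCLED)
    return glass + OTHER_KG

saving = bottle_ghg(0.60) - bottle_ghg(0.85)   # kg CO2eq saved per bottle
rel = saving / bottle_ghg(0.60)                # relative reduction
```

    With these assumed factors, raising the recycling rate from 60% to 85% saves on the order of 0.08 kg CO2eq per bottle (roughly 9%), the same order of magnitude as the 102 g (11.1%) reduction reported in the study.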

  11. Multivariate models for skin sensitization hazard and potency

    EPA Science Inventory

    One of the top priorities being addressed by ICCVAM is the identification and validation of non-animal alternatives for skin sensitization testing. Although skin sensitization is a complex process, the key biological events have been well characterized in an adverse outcome pathw...

  12. Bayesian analysis of heterogeneous treatment effects for patient-centered outcomes research.

    PubMed

    Henderson, Nicholas C; Louis, Thomas A; Wang, Chenguang; Varadhan, Ravi

    2016-01-01

    Evaluation of heterogeneity of treatment effect (HTE) is an essential aspect of personalized medicine and patient-centered outcomes research. Our goal in this article is to promote the use of Bayesian methods for subgroup analysis and to lower the barriers to their implementation by describing the ways in which the companion software beanz can facilitate these types of analyses. To advance this goal, we describe several key Bayesian models for investigating HTE and outline the ways in which they are well-suited to address many of the commonly cited challenges in the study of HTE. Topics highlighted include shrinkage estimation, model choice, sensitivity analysis, and posterior predictive checking. A case study is presented in which we demonstrate the use of the methods discussed.
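The shrinkage estimation highlighted above can be illustrated with a minimal normal-normal partial-pooling sketch. The subgroup estimates, standard errors and between-subgroup SD `tau` are hypothetical, and the model is far simpler than what beanz fits:

```python
import statistics

# Hypothetical subgroup treatment-effect estimates and standard errors.
effects = {"age<65": 0.40, "age>=65": 0.10, "male": 0.35, "female": 0.15}
ses     = {"age<65": 0.15, "age>=65": 0.15, "male": 0.20, "female": 0.20}

def shrink(effects, ses, tau=0.1):
    """Normal-normal partial pooling: each subgroup estimate is pulled
    toward the overall mean, with the weight set by its standard error
    and the between-subgroup SD `tau` (assumed known for simplicity)."""
    mu = statistics.fmean(effects.values())   # crude overall mean
    post = {}
    for g, y in effects.items():
        w = tau**2 / (tau**2 + ses[g]**2)     # shrinkage factor in [0, 1]
        post[g] = w * y + (1 - w) * mu
    return post

posterior = shrink(effects, ses)
```

Noisier subgroups (larger standard errors) are shrunk harder toward the overall mean, which is the guard against over-interpreting extreme subgroup results.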

  13. Key metabolites in tissue extracts of Elliptio complanata identified using 1H nuclear magnetic resonance spectroscopy

    PubMed Central

    Hurley-Sanders, Jennifer L.; Levine, Jay F.; Nelson, Stacy A. C.; Law, J. M.; Showers, William J.; Stoskopf, Michael K.

    2015-01-01

    We used 1H nuclear magnetic resonance spectroscopy to describe key metabolites of the polar metabolome of the freshwater mussel, Elliptio complanata. Principal components analysis documented variability across tissue types and river of origin in mussels collected from two rivers in North Carolina (USA). Muscle, digestive gland, mantle and gill tissues yielded identifiable but overlapping metabolic profiles. Variation in digestive gland metabolic profiles between the two mussel collection sites was characterized by differences in mono- and disaccharides. Variation in mantle tissue metabolomes appeared to be associated with sex. Nuclear magnetic resonance spectroscopy is a sensitive means to detect metabolites in the tissues of E. complanata and holds promise as a tool for the investigation of freshwater mussel health and physiology. PMID:27293708

  14. Analysis of parameter estimation and optimization application of ant colony algorithm in vehicle routing problem

    NASA Astrophysics Data System (ADS)

    Xu, Quan-Li; Cao, Yu-Wei; Yang, Kun

    2018-03-01

    Ant Colony Optimization (ACO) is among the most widely used artificial intelligence algorithms at present. This study introduces the principle and mathematical model of the ACO algorithm for solving the Vehicle Routing Problem (VRP) and designs a vehicle routing optimization model based on ACO; a vehicle routing optimization simulation system was then developed in the C++ programming language, and sensitivity analysis, estimation and improvement of the three key parameters of ACO were carried out. The results indicate that the ACO algorithm designed in this paper can efficiently solve the rational planning and optimization of VRP, that different values of the key parameters have a significant influence on the performance and optimization effects of the algorithm, and that the improved algorithm is less prone to premature local convergence and has good robustness.
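A minimal ACO sketch makes the role of the three key parameters concrete. The coordinates below are toy data, and a single-vehicle round trip is a simplification of a full VRP; `alpha`, `beta` and `rho` are the parameters whose sensitivity such a study would probe:

```python
import math
import random

# Toy coordinates for a single-vehicle routing instance (hypothetical data).
CITIES = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3), (2, 8)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def aco_tour(alpha=1.0, beta=2.0, rho=0.5, n_ants=10, n_iter=50, seed=1):
    """Minimal Ant Colony Optimization for a round trip over CITIES.
    alpha: pheromone weight, beta: heuristic (1/distance) weight,
    rho: evaporation rate -- the three key parameters analysed."""
    rng = random.Random(seed)
    n = len(CITIES)
    tau = [[1.0] * n for _ in range(n)]            # pheromone matrix
    eta = [[0 if i == j else 1 / dist(CITIES[i], CITIES[j])
            for j in range(n)] for i in range(n)]  # heuristic visibility
    best_tour, best_len = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                weights = [tau[i][j]**alpha * eta[i][j]**beta for j in cand]
                tour.append(rng.choices(cand, weights)[0])
            length = sum(dist(CITIES[tour[k]], CITIES[tour[(k + 1) % n]])
                         for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporate, then deposit pheromone in proportion to tour quality.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - rho)
        for tour, length in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += 1 / length
                tau[b][a] += 1 / length
    return best_tour, best_len

tour, length = aco_tour()
```

Rerunning `aco_tour` over a grid of `alpha`, `beta`, `rho` values and comparing best tour lengths is the essence of the parameter sensitivity analysis the paper describes.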

  15. Study of aircraft in intraurban transportation systems, volume 1

    NASA Technical Reports Server (NTRS)

    Stout, E. G.; Kesling, P. H.; Matteson, H. C.; Sherwood, D. E.; Tuck, W. R., Jr.; Vaughn, L. A.

    1971-01-01

    An analysis of an effective short range, high density commuter transportation system for intraurban use is presented. The seven county Detroit, Michigan, metropolitan area was chosen as the scenario for the analysis. The study consisted of an analysis and forecast of the Detroit market through 1985, a parametric analysis of appropriate short haul aircraft concepts and associated ground systems, and a preliminary overall economic analysis of a simplified total system designed to evaluate the candidate vehicles and select the most promising VTOL and STOL aircraft. Data are also included on the impact of advanced technology on the system, the sensitivity of mission performance to changes in aircraft characteristics and system operations, and identification of key problem areas that may be improved by additional research. The approach, logic, and computer models used are adaptable to other intraurban or interurban areas.

  16. A sensitivity analysis of process design parameters, commodity prices and robustness on the economics of odour abatement technologies.

    PubMed

    Estrada, José M; Kraakman, N J R Bart; Lebrero, Raquel; Muñoz, Raúl

    2012-01-01

    The sensitivity of the economics of the five most commonly applied odour abatement technologies (biofiltration, biotrickling filtration, activated carbon adsorption, chemical scrubbing and a hybrid technology consisting of a biotrickling filter coupled with carbon adsorption) towards design parameters and commodity prices was evaluated. In addition, the influence of geographical location on the Net Present Value calculated over a 20-year lifespan (NPV20) of each technology, and its robustness towards typical process fluctuations and operational upsets, were assessed. This comparative analysis showed that biological techniques present lower operating costs (up to 6 times) and lower sensitivity than their physical/chemical counterparts, with the packing material being the key parameter affecting their operating costs (40-50% of the total operating costs). The use of recycled or partially treated water (e.g. secondary effluent in wastewater treatment plants) offers an opportunity to significantly reduce costs in biological techniques. Physical/chemical technologies present a high sensitivity towards H2S concentration, which is an important drawback given the fluctuating nature of malodorous emissions. The geographical analysis evidenced high NPV20 variations around the world for all the technologies evaluated, but despite the differences in wage and price levels, biofiltration and biotrickling filtration are always the most cost-efficient alternatives (NPV20). When robustness is considered as relevant as the overall costs (NPV20) in an economic evaluation, the hybrid technology moves up next to biotrickling filtration among the preferred technologies. Copyright © 2012 Elsevier Inc. All rights reserved.
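The NPV20 comparison can be sketched as a simple discounted-cost calculation. The capital and operating costs and the discount rate below are hypothetical, chosen only to illustrate why low operating costs favour biological techniques over a 20-year horizon:

```python
def npv(capex, annual_opex, rate=0.05, years=20):
    """Net present cost of owning a treatment unit over `years`
    (costs as positive numbers; illustrative rate, not the paper's)."""
    return capex + sum(annual_opex / (1 + rate) ** t
                       for t in range(1, years + 1))

# Hypothetical cost profiles: the biological unit costs more up front
# but far less to run than the physical/chemical one.
biofilter = npv(capex=100_000, annual_opex=8_000)
scrubber  = npv(capex=60_000,  annual_opex=30_000)
print(f"biofilter NPV20: {biofilter:,.0f}  scrubber NPV20: {scrubber:,.0f}")
```

Under these assumptions the higher capital cost of the biological option is repaid several times over by its lower operating costs, which is the qualitative pattern the analysis reports.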

  17. Separating foliar physiology from morphology reveals the relative roles of vertically structured transpiration factors within red maple crowns and limitations of larger scale models

    PubMed Central

    Bauerle, William L.; Bowden, Joseph D.

    2011-01-01

    A spatially explicit mechanistic model, MAESTRA, was used to separate key parameters affecting transpiration to provide insights into the most influential parameters for accurate predictions of within-crown and within-canopy transpiration. Once validated among Acer rubrum L. genotypes, model responses to different parameterization scenarios were scaled up to stand transpiration (expressed per unit leaf area) to assess how transpiration might be affected by the spatial distribution of foliage properties. For example, when physiological differences were accounted for, differences in leaf width among A. rubrum L. genotypes resulted in a 25% difference in transpiration. An in silico within-canopy sensitivity analysis was conducted over the range of genotype parameter variation observed and under different climate forcing conditions. The analysis revealed that seven of 16 leaf traits had a ≥5% impact on transpiration predictions. Under sparse foliage conditions, comparisons of the present findings with previous studies were in agreement that parameters such as the maximum Rubisco-limited rate of photosynthesis can explain ∼20% of the variability in predicted transpiration. However, the spatial analysis shows how such parameters can decrease or change in importance below the uppermost canopy layer. Alternatively, model sensitivity to leaf width and minimum stomatal conductance was continuous along a vertical canopy depth profile. Foremost, transpiration sensitivity to an observed range of morphological and physiological parameters is examined and the spatial sensitivity of transpiration model predictions to vertical variations in microclimate and foliage density is identified to reduce the uncertainty of current transpiration predictions. PMID:21617246

  18. Sensitivity of emergent sociohydrologic dynamics to internal system properties and external sociopolitical factors: Implications for water management

    NASA Astrophysics Data System (ADS)

    Elshafei, Y.; Tonts, M.; Sivapalan, M.; Hipsey, M. R.

    2016-06-01

    It is increasingly acknowledged that effective management of water resources requires a holistic understanding of the coevolving dynamics inherent in the coupled human-hydrology system. One of the fundamental information gaps concerns the sensitivity of coupled system feedbacks to various endogenous system properties and exogenous societal contexts. This paper takes a previously calibrated sociohydrology model and applies an idealized implementation, in order to: (i) explore the sensitivity of emergent dynamics resulting from bidirectional feedbacks to assumptions regarding (a) internal system properties that control the internal dynamics of the coupled system and (b) the external sociopolitical context; and (ii) interpret the results within the context of water resource management decision making. The analysis investigates feedback behavior in three ways, (a) via a global sensitivity analysis on key parameters and assessment of relevant model outputs, (b) through a comparative analysis based on hypothetical placement of the catchment along various points on the international sociopolitical gradient, and (c) by assessing the effects of various direct management intervention scenarios. Results indicate the presence of optimum windows that might offer the greatest positive impact per unit of management effort. Results further advocate management tools that encourage an adaptive learning, community-based approach with respect to water management, which are found to enhance centralized policy measures. This paper demonstrates that it is possible to use a place-based sociohydrology model to make abstractions as to the dynamics of bidirectional feedback behavior, and provide insights as to the efficacy of water management tools under different circumstances.

  19. Cost analysis of school-based intermittent screening and treatment of malaria in Kenya

    PubMed Central

    2011-01-01

    Background The control of malaria in schools is receiving increasing attention, but there remains currently no consensus as to the optimal intervention strategy. This paper analyses the costs of intermittent screening and treatment (IST) of malaria in schools, implemented as part of a cluster-randomized controlled trial on the Kenyan coast. Methods Financial and economic costs were estimated using an ingredients approach whereby all resources required in the delivery of IST are quantified and valued. Sensitivity analysis was conducted to investigate how programme variation affects costs and to identify potential cost savings in the future implementation of IST. Results The estimated financial cost of IST per child screened is US$ 6.61 (economic cost US$ 6.24). Key contributors to cost were salary costs (36%) and malaria rapid diagnostic tests (RDT) (22%). Almost half (47%) of the intervention cost comprises redeployment of existing resources including health worker time and use of hospital vehicles. Sensitivity analysis identified changes to intervention delivery that can reduce programme costs by 40%, including use of alternative RDTs and removal of supervised treatment. Cost-effectiveness is also likely to be highly sensitive to the proportion of children found to be RDT-positive. Conclusion In the current context, school-based IST is a relatively expensive malaria intervention, but reducing the complexity of delivery can result in considerable savings in the cost of intervention. (Costs are reported in US$ 2010). PMID:21933376

  20. Simple Sensitivity Analysis for Orion Guidance Navigation and Control

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

    The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, covering everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, indicate where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors interact dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were significant factors for the success of various requirements. Examples are shown in this paper, along with a summary and physics discussion of the EFT-1 driving factors that the tool found.
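One of the sensitivity measures mentioned, estimating success probability as a function of a dispersed input, can be sketched by splitting Monte Carlo runs on each input and comparing requirement success rates. The toy `simulate` function and its coefficients are invented for illustration and bear no relation to the actual Orion models:

```python
import random

def simulate(mass, thrust):
    """Toy stand-in for a flight simulation: touchdown miss distance
    grows with mass and shrinks with thrust (hypothetical physics)."""
    return 2.0 * mass - 1.5 * thrust + random.gauss(0, 0.2)

def success_prob_split(n=5000, limit=1.0, seed=7):
    """For each dispersed input, compare the requirement success rate
    in its lower vs upper half -- a large gap flags a critical factor."""
    random.seed(seed)
    runs = []
    for _ in range(n):
        m, t = random.uniform(0.8, 1.2), random.uniform(0.8, 1.2)
        runs.append((m, t, simulate(m, t) <= limit))
    out = {}
    for idx, name in ((0, "mass"), (1, "thrust")):
        lo = [ok for *x, ok in runs if x[idx] < 1.0]
        hi = [ok for *x, ok in runs if x[idx] >= 1.0]
        out[name] = (sum(lo) / len(lo), sum(hi) / len(hi))
    return out

split = success_prob_split()
print(split)
```

In this toy model, low-mass runs and high-thrust runs satisfy the requirement more often, so both inputs would surface as critical factors; the real tool extends this idea to hundreds of inputs and to pairs of inputs.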

  1. Reliability of rapid reporting of cancers in New Hampshire.

    PubMed

    Celaya, Maria O; Riddle, Bruce L; Cherala, Sai S; Armenti, Karla R; Rees, Judy R

    2010-01-01

    The New Hampshire State Cancer Registry (NHSCR) has a 2-phase reporting system. An abbreviated, "rapid" report of cancer diagnosis or treatment is due to the central registry within 45 days of diagnosis and a more detailed, definitive report is due within 180 days. Rapid reports are used for various research studies, but researchers who contact patients are warned that the rapid reports may contain inaccuracies. This study aimed to assess the reliability of rapid cancer reports. For diagnosis years 2000-2004, we compared the rapid and definitive reports submitted to NHSCR. We calculated the sensitivity and positive predictive value of rapid reports; the reliability of key data items overall and for major sites; and the time between diagnosis and submission of the report. Rapid reports identified incident cancer cases with a sensitivity of 88.5%. The overall accuracy of key data items was high. The accuracy of primary sites identified by rapid reports was high generally but lower for ovarian and unknown primaries. A subset analysis showed that 47% of cancers were reported within 90 days of diagnosis. Rapid reports submitted to NHSCR are generally of high quality and present a useful opportunity for research investigations in New Hampshire.
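The sensitivity and positive predictive value reported above are simple ratios over matched rapid and definitive reports. A sketch, with illustrative counts (chosen so that sensitivity matches the reported 88.5%; the false-positive count is invented):

```python
def sensitivity_ppv(true_pos, false_neg, false_pos):
    """Sensitivity: share of definitive cancer cases that a rapid report
    identified; PPV: share of rapid reports later confirmed as cancer."""
    sens = true_pos / (true_pos + false_neg)
    ppv = true_pos / (true_pos + false_pos)
    return sens, ppv

# Illustrative counts only (not NHSCR's actual tallies):
sens, ppv = sensitivity_ppv(true_pos=885, false_neg=115, false_pos=50)
print(f"sensitivity {sens:.1%}, PPV {ppv:.1%}")
```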

  2. A novel algorithm for thermal image encryption.

    PubMed

    Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen

    2018-04-16

    Thermal images play a vital role at nuclear plants, power stations, forensic labs, biological research facilities, and petroleum extraction sites, so their security is very important. Image data has some unique features, such as intensity, contrast, homogeneity, entropy and correlation among pixels, which make image encryption trickier than other kinds of encryption; conventional image encryption schemes normally find these features hard to handle. Cryptographers have therefore paid attention to attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems, and recently proposed image encryption techniques increasingly depend on the application of chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and S8 symmetric-group permutation-based substitution boxes. First, the parameters of the chaotic Chebyshev map are chosen as a secret key to confuse the plain image. Then, the plaintext image is encrypted by the method generated from the substitution boxes and the Chebyshev map. By this process, we obtain a ciphertext image that is thoroughly confused and diffused. The outcomes of well-known experiments, key sensitivity tests and statistical analysis confirm that the proposed algorithm offers a safe and efficient approach for real-time image encryption.
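The use of a chaotic map as a keystream source can be sketched with the Chebyshev map x → cos(k·arccos(x)). This toy XOR cipher omits the substitution-box stage of the proposed algorithm, but it illustrates the key sensitivity the abstract mentions:

```python
import math

def chebyshev_keystream(x0, k, n):
    """Iterate the Chebyshev map x -> cos(k * arccos(x)) on [-1, 1] and
    quantise each state to a byte. x0 and k act as the secret key."""
    x, out = x0, []
    for _ in range(n):
        x = math.cos(k * math.acos(x))
        out.append(int((x + 1) / 2 * 255) & 0xFF)
    return out

def xor_cipher(data, x0, k):
    """Encrypt/decrypt by XOR with the chaotic keystream (involutive)."""
    return bytes(b ^ s for b, s in
                 zip(data, chebyshev_keystream(x0, k, len(data))))

plain = bytes(range(64))              # stand-in for thermal-image pixels
cipher = xor_cipher(plain, 0.3, 4.0)
# Decryption is the same XOR with the same key:
assert xor_cipher(cipher, 0.3, 4.0) == plain
# Key sensitivity: a 1e-10 perturbation of x0 yields a different ciphertext.
assert xor_cipher(plain, 0.3 + 1e-10, 4.0) != cipher
```

Because the map's trajectories diverge exponentially, even a tiny change in `x0` or `k` produces an unrelated keystream, which is exactly what a key sensitivity test measures.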

  3. Two-probe versus van der Pauw method in studying the piezoresistivity of single-wall carbon nanotube thin films

    NASA Astrophysics Data System (ADS)

    Yao, Yanbo; Duan, Xiaoshuang; Luo, Jiangjiang; Liu, Tao

    2017-11-01

    The use of the van der Pauw (VDP) method for characterizing and evaluating the piezoresistive behavior of carbon-nanomaterial-enabled piezoresistive sensors has not been systematically studied. Using single-wall carbon nanotube (SWCNT) thin films as a model system, herein we report a coupled electrical-mechanical experimental study, in conjunction with a multiphysics finite element simulation and an analytic analysis, to compare the two-probe and VDP testing configurations for evaluating the piezoresistive behavior of such sensors. The key features of the sample-aspect-ratio-dependent piezoresistive sensitivity, or gauge factor, were identified for the VDP testing configuration. It was found that the VDP test configuration offers consistently higher piezoresistive sensitivity than the two-probe testing method.

  4. Mathematical 3D modelling and sensitivity analysis of multipolar radiofrequency ablation in the spine.

    PubMed

    Matschek, Janine; Bullinger, Eric; von Haeseler, Friedrich; Skalej, Martin; Findeisen, Rolf

    2017-02-01

    Radiofrequency ablation is a valuable tool in the treatment of many diseases, especially cancer. However, controlled heating up to apoptosis of the desired target tissue in complex situations, e.g. in the spine, is challenging and requires experienced interventionalists. For such challenging situations, a mathematical model of radiofrequency ablation makes it possible to understand, improve and optimise the outcome of the medical therapy. The main contribution of this work is the derivation of a tailored, yet expandable, mathematical model for the simulation, analysis, planning and control of radiofrequency ablation in complex situations. The dynamic model consists of partial differential equations that describe the potential and temperature distribution during the intervention. To account for multipolar operation, time-dependent boundary conditions are introduced. Spatially distributed parameters, such as tissue conductivity and blood perfusion, allow the complex 3D environment representing the diverse tissue types involved in the spine to be described. To identify the key parameters affecting the prediction quality of the model, the influence of the parameters on the temperature distribution is investigated via a sensitivity analysis. Simulations underpin the quality of the derived model and the analysis approach. The proposed modelling and analysis schemes set the basis for intervention planning, state and parameter estimation, and control. Copyright © 2016. Published by Elsevier Inc.

  5. From LCAs to simplified models: a generic methodology applied to wind power electricity.

    PubMed

    Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle

    2013-02-05

    This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influent parameters. Greenhouse Gas (GHG) performances have been plotted as a function of these key parameters identified. Using these curves, GHG performances of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decisions makers, providing them a robust but simple support tool for assessing the environmental performance of energy systems.
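The kind of simplified model described, GHG performance as a function of load factor and lifetime, can be sketched as follows. The embodied-emissions total and rated power are hypothetical placeholders, not the study's fitted values:

```python
# Hypothetical totals for one medium onshore turbine -- illustrative only.
TOTAL_EMBODIED_KG = 1_500_000   # life-cycle emissions, kg CO2eq
RATED_KW = 2000                 # rated power, kW

def ghg_per_kwh(load_factor, lifetime_years):
    """Simplified model: embodied emissions spread over lifetime output,
    so the footprint scales inversely with the two key parameters the
    global sensitivity analysis identified."""
    kwh = RATED_KW * load_factor * 8760 * lifetime_years
    return TOTAL_EMBODIED_KG * 1000 / kwh   # g CO2eq per kWh

base = ghg_per_kwh(0.24, 20)
print(f"{base:.1f} g CO2eq/kWh")
```

A windier site (higher load factor) or a longer-lived turbine both lower the per-kWh footprint, which is why curves over these two parameters can replace a full LCA for screening purposes.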

  6. Variability in surface infrared reflectance of thirteen nitrile rubber gloves at key wavelengths for analysis of captan.

    PubMed

    Phalen, R N; Que Hee, Shane S

    2007-02-01

    The aim of this study was to investigate the surface variability of 13 powder-free, unlined, and unsupported nitrile rubber gloves using attenuated total reflection Fourier transform infrared (ATR-FT-IR) spectrophotometry at key wavelengths for analysis of captan contamination. The within-glove, within-lot, and between-lot variability was measured at 740, 1124, 1252, and 1735 cm⁻¹, the characteristic captan reflectance minima wavelengths. Three glove brands were assessed after conditioning overnight at relative humidity (RH) values ranging from 2 ± 1 to 87 ± 4% and temperatures ranging from -8.6 ± 0.7 to 59.2 ± 0.9 °C. For all gloves, 1735 cm⁻¹ provided the lowest background absorbance and greatest potential sensitivity for captan analysis on the outer glove surface: absorbances ranged from 0.0074 ± 0.0005 (Microflex) to 0.0195 ± 0.0024 (SafeSkin); average within-glove coefficients of variation (CV) ranged from 2.7% (Best, range 0.9-5.3%) to 10% (SafeSkin, 1.2-17%); within-glove CVs greater than 10% were for one brand (SafeSkin); within-lot CVs ranged from 2.8% (Best N-Dex) to 28% (SafeSkin Blue); and between-lot variation was statistically significant (p ≤ 0.05) for all but two SafeSkin lots. The RH had variable effects dependent on wavelength, being minimal at 1735, 1252, and 1124 cm⁻¹ and highest at 3430 cm⁻¹ (O-H stretch region). There was no significant effect of temperature conditioning. Substantial within-glove, within-lot, and between-lot variability was observed. Thus, surface analysis using ATR-FT-IR must treat glove brands and lots as different. ATR-FT-IR proved to be a useful real-time analytical tool for measuring glove variability, detecting surface humidity effects, and choosing selective and sensitive wavelengths for analysis of nonvolatile surface contaminants.

  7. Effects of quench rate and natural ageing on the age hardening behaviour of aluminium alloy AA6060

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strobel, Katharina, E-mail: katharina.strobel@aol.com; Lay, Matthew D.H., E-mail: mlay@fbrice.com; Easton, Mark A., E-mail: mark.easton@rmit.edu.au

    Quench sensitivity in Al–Mg–Si alloys has been largely attributed to solute loss at heterogeneous nucleation sites, primarily dispersoids, during slow cooling after extrusion. As such, the number density of dispersoids and the solute type and concentration are considered to be the key variables for quench sensitivity. In this study, quench sensitivity and the influence of natural ageing in a lean Al–Mg–Si alloy, AA6060, which contains few dispersoids, have been investigated by hardness measurement, thermal analysis, transmission electron microscopy (TEM) and positron annihilation lifetime spectroscopy (PALS). It is shown that the quench sensitivity in this alloy is associated with the degree of supersaturation of vacancies after cooling. Due to vacancy annihilation and clustering during natural ageing, the quench sensitivity is more pronounced after a short natural ageing time (30 min) compared to a longer natural ageing time (24 h). Therefore, prolonged natural ageing not only leads to an increase in hardness, but can also have a positive effect on the quench sensitivity of lean Al–Mg–Si alloys. - Highlights: • Significant quench sensitivity observed in AA6060 alloy after 30 min natural ageing • Prolonged natural ageing increased hardness and reduced QS. • Low dispersoid density leads to insignificant QS from non-hardening precipitates. • Vacancy supersaturation identified as a contributor to QS.

  8. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  9. Utilizing a one-dimensional multispecies model to simulate the nutrient reduction and biomass structure in two types of H2-based membrane-aeration biofilm reactors (H2-MBfR): model development and parametric analysis.

    PubMed

    Wang, Zuowei; Xia, Siqing; Xu, Xiaoyin; Wang, Chenhui

    2016-02-01

    In this study, a one-dimensional multispecies model (ODMSM) was utilized to simulate NO3(-)-N and ClO4(-) reduction performance in two kinds of H2-based membrane-aeration biofilm reactors (H2-MBfR) under different operating conditions (e.g., NO3(-)-N/ClO4(-) loading rates, H2 partial pressure, etc.). Before the simulation process, we conducted a sensitivity analysis of key parameters that would fluctuate under different environmental conditions; we then used the experimental data to calibrate the more sensitive parameters μ1 and μ2 (the maximum specific growth rates of denitrification bacteria and perchlorate reduction bacteria) in the two H2-MBfRs. The difference between the two key parameters' values in the two types of reactors may result from the different carbon sources fed to the reactors. From the simulation results of six different operating conditions (four in H2-MBfR 1 and two in H2-MBfR 2), the applicability of the model was confirmed, and the variation of the removal tendency under different operating conditions could be well simulated. In addition, the rationality of operating parameters (H2 partial pressure, etc.) could be judged, especially under high nutrient loading rates. To a certain degree, the model can provide theoretical guidance for determining the operating parameters under specific conditions in practical application.

  10. Development of real-time PCR method for the detection and the quantification of a new endogenous reference gene in sugar beet "Beta vulgaris L.": GMO application.

    PubMed

    Chaouachi, Maher; Alaya, Akram; Ali, Imen Ben Haj; Hafsa, Ahmed Ben; Nabi, Nesrine; Bérard, Aurélie; Romaniuk, Marcel; Skhiri, Fethia; Saïd, Khaled

    2013-01-01

    KEY MESSAGE: Here, we describe a newly developed quantitative real-time PCR method for the detection and quantification of a new specific endogenous reference gene used in GMO analysis. The key requirement of this study was the identification of a new reference gene, suitable for quantification of genetically modified sugar beet, that differentiates the four genomic sections of the sugar beet (Beta vulgaris L.) (Beta, Corrollinae, Nanae and Procumbentes). A specific qualitative polymerase chain reaction (PCR) assay was designed to detect sugar beet by amplifying a region of the adenylate transporter (ant) gene only from species of genomic section I of the genus Beta (cultivated and wild relatives), while showing negative PCR results for 7 species of the 3 other sections, 8 related species and 20 non-sugar beet plants. The sensitivity of the assay was 15 haploid genome copies (HGC). A quantitative real-time polymerase chain reaction (QRT-PCR) assay was also developed, with high linearity (R² > 0.994) over sugar beet standard concentrations ranging from 20,000 to 10 HGC of sugar beet DNA per PCR. The QRT-PCR assay described in this study was specific and more sensitive for sugar beet quantification than the validated test previously reported by the European Reference Laboratory. This assay is suitable for GMO quantification in routine analysis from a wide variety of matrices.

  11. Thermal analysis of a conceptual design for a 250 We GPHS/FPSE space power system

    NASA Technical Reports Server (NTRS)

    Mccomas, Thomas J.; Dugan, Edward T.

    1991-01-01

    A thermal analysis has been performed for a 250-We space nuclear power system which combines the US Department of Energy's general purpose heat source (GPHS) modules with a state-of-the-art free-piston Stirling engine (FPSE). The focus of the analysis is on the temperature of the iridium fuel clad within the GPHS modules. The thermal analysis results indicate fuel clad temperatures slightly higher than the design goal temperature of 1573 K. The results are considered favorable because of the numerous conservative assumptions used. To demonstrate the effects of this conservatism, a brief sensitivity analysis is performed in which a few of the key system parameters are varied to determine their effect on the fuel clad temperatures. It is shown that analysis of a more detailed thermal model should yield fuel clad temperatures below 1573 K.

  12. Statistical analysis and handling of missing data in cluster randomized trials: a systematic review.

    PubMed

    Fiero, Mallorie H; Huang, Shuang; Oren, Eyal; Bell, Melanie L

    2016-02-09

    Cluster randomized trials (CRTs) randomize participants in groups, rather than as individuals, and are key tools used to assess interventions in health research where treatment contamination is likely or where individual randomization is not feasible. Two potential major pitfalls exist regarding CRTs, namely handling missing data and not accounting for clustering in the primary analysis. The aim of this review was to evaluate approaches for handling missing data and statistical analysis with respect to the primary outcome in CRTs. We systematically searched for CRTs published between August 2013 and July 2014 using PubMed, Web of Science, and PsycINFO. For each trial, two independent reviewers assessed the extent of the missing data and the method(s) used for handling missing data in the primary and sensitivity analyses. We evaluated the primary analysis and determined whether it was at the cluster or individual level. Of the 86 included CRTs, 80 (93%) trials reported some missing outcome data. Of those reporting missing data, the median percentage of individuals with a missing outcome was 19% (range 0.5 to 90%). The most common way to handle missing data in the primary analysis was complete case analysis (44, 55%), whereas 18 (22%) used mixed models, six (8%) used single imputation, four (5%) used unweighted generalized estimating equations, and two (2%) used multiple imputation. Fourteen (16%) trials reported a sensitivity analysis for missing data, but most assumed the same missing data mechanism as in the primary analysis. Overall, 67 (78%) trials accounted for clustering in the primary analysis. High rates of missing outcome data are present in the majority of CRTs, yet handling missing data in practice remains suboptimal. Researchers and applied statisticians should carry out appropriate missing data methods, which are valid under plausible assumptions, in order to increase statistical power in trials and reduce the possibility of bias. 
Sensitivity analysis should be performed, with weakened assumptions regarding the missing data mechanism to explore the robustness of results reported in the primary analysis.
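    The recommended sensitivity analysis with weakened assumptions can be illustrated with a toy delta-adjustment (tipping-point) approach: missing outcomes are imputed from the observed distribution and then shifted by a parameter delta that departs from the missing-at-random assumption. The data and the `delta_adjusted_mean` helper below are hypothetical, not from any reviewed trial; a minimal sketch assuming a continuous outcome in one arm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trial arm: 80 observed outcomes plus 20 missing outcomes.
observed = rng.normal(loc=1.0, scale=2.0, size=80)
n_missing = 20

def delta_adjusted_mean(observed, n_missing, delta, n_imputations=200, rng=rng):
    """Impute missing outcomes by resampling the observed values, then shift
    each imputed value by `delta` (delta = 0 mimics a MAR-style analysis;
    delta < 0 assumes dropouts did worse). Returns the pooled arm mean."""
    means = []
    for _ in range(n_imputations):
        imputed = rng.choice(observed, size=n_missing, replace=True) + delta
        means.append(np.concatenate([observed, imputed]).mean())
    return float(np.mean(means))

# Tipping-point scan: how strong must the departure from MAR be to change
# the estimated arm mean materially?
for delta in (0.0, -1.0, -2.0):
    print(delta, round(delta_adjusted_mean(observed, n_missing, delta), 3))
```

With 20% missing, each unit of delta shifts the pooled mean by about 0.2, which makes the dependence of the conclusion on the missingness assumption explicit.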

  13. Women survivors of child sexual abuse. How can health professionals promote healing?

    PubMed Central

    Schachter, Candice L.; Radomsky, Nellie A.; Stalker, Carol A.; Teram, Eli

    2004-01-01

    OBJECTIVE: To explore how health professionals can practise in ways sensitive to adult women survivors of child sexual abuse. DESIGN: Qualitative semistructured in-depth interviews. SETTING: Small and midsize cities in Ontario and Saskatchewan. PARTICIPANTS: Twenty-seven women survivors of childhood sexual abuse. METHODS: Respondents were asked about their experiences with physical therapists and other health professionals and asked how practice could be sensitive to their needs as survivors. A grounded-theory approach was used. After independent analyses, researchers achieved consensus on the main themes. Findings were checked with participants, other survivors, and mental health professionals. MAIN FINDINGS: A crucial theme was the need to feel safe when consulting any health professional. Participants described specific ways for clinicians to facilitate the feeling of safety. Disclosure of abuse history was another key theme; analysis revealed no one "right way" to inquire about it. CONCLUSION: Women survivors of child sexual abuse want safe, accepting environments and sensitive, informed health professionals with whom to work in partnership on all their health concerns. PMID:15318678

  14. ASME B89.4.19 Performance Evaluation Tests and Geometric Misalignments in Laser Trackers

    PubMed Central

    Muralikrishnan, B.; Sawyer, D.; Blackburn, C.; Phillips, S.; Borchardt, B.; Estler, W. T.

    2009-01-01

    Small and unintended offsets, tilts, and eccentricity of the mechanical and optical components in laser trackers introduce systematic errors in the measured spherical coordinates (angles and range readings) and possibly in the calculated lengths of reference artifacts. It is desirable that the tests described in the ASME B89.4.19 Standard [1] be sensitive to these geometric misalignments so that any resulting systematic errors are identified during performance evaluation. In this paper, we present some analysis, using error models and numerical simulation, of the sensitivity of the length measurement system tests and two-face system tests in the B89.4.19 Standard to misalignments in laser trackers. We highlight key attributes of the testing strategy adopted in the Standard and propose new length measurement system tests that demonstrate improved sensitivity to some misalignments. Experimental results with a tracker that is not properly error corrected for the effects of the misalignments validate claims regarding the proposed new length tests. PMID:27504211

  15. Multidisciplinary optimization of controlled space structures with global sensitivity equations

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; James, Benjamin B.; Graves, Philip C.; Woodard, Stanley E.

    1991-01-01

    A new method for the preliminary design of controlled space structures is presented. The method coordinates standard finite element structural analysis, multivariable controls, and nonlinear programming codes and allows simultaneous optimization of the structures and control systems of a spacecraft. Global sensitivity equations are a key feature of this method. The preliminary design of a generic geostationary platform is used to demonstrate the multidisciplinary optimization method. Fifteen design variables are used to optimize truss member sizes and feedback gain values. The goal is to reduce the total mass of the structure and the vibration control system while satisfying constraints on vibration decay rate. Incorporating the nonnegligible mass of actuators causes an essential coupling between structural design variables and control design variables. The solution of the demonstration problem is an important step toward a comprehensive preliminary design capability for structures and control systems. Use of global sensitivity equations helps solve optimization problems that have a large number of design variables and a high degree of coupling between disciplines.
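    The global sensitivity equations referred to above can be sketched for a two-discipline coupled system: each discipline supplies partial derivatives computed with the other discipline's output held fixed, and the total (coupling-aware) derivatives are recovered by solving a small linear system. The coefficients below are hypothetical, chosen only so the answer is easy to verify by hand:

```python
import numpy as np

# Hypothetical linear coupled system:
#   y1 = 2*x + 0.3*y2   (e.g., structural response depends on control output)
#   y2 = -x  + 0.5*y1   (e.g., control effort depends on structural response)
# Discipline-level PARTIAL derivatives, holding the other output fixed:
df1_dx, df1_dy2 = 2.0, 0.3
df2_dx, df2_dy1 = -1.0, 0.5

# Global sensitivity equations: solve for the TOTAL derivatives dy/dx,
# which account for the inter-disciplinary coupling.
A = np.array([[1.0, -df1_dy2],
              [-df2_dy1, 1.0]])
b = np.array([df1_dx, df2_dx])
dy_dx = np.linalg.solve(A, b)
print(dy_dx)  # total derivatives [dy1/dx, dy2/dx]
```

Substituting the second equation into the first gives y1 = 2x and y2 = 0, so the solver should return total derivatives of 2 and 0; in a real structures-and-controls problem the partials would come from the finite element and control analyses.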

  16. The future viability of algae-derived biodiesel under economic and technical uncertainties.

    PubMed

    Brownbridge, George; Azadi, Pooya; Smallbone, Andrew; Bhave, Amit; Taylor, Benjamin; Kraft, Markus

    2014-01-01

    This study presents a techno-economic assessment of algae-derived biodiesel under economic and technical uncertainties associated with the development of algal biorefineries. A global sensitivity analysis was performed using a High Dimensional Model Representation (HDMR) method. It was found that, considering reasonable ranges over which each parameter can vary, the sensitivity of the biodiesel production cost to the key input parameters decreases in the following order: algae oil content > algae annual productivity per unit area > plant production capacity > carbon price increase rate. It was also found that the Return on Investment (ROI) is highly sensitive to the algae oil content, and to a lesser extent to the algae annual productivity, crude oil price and price increase rate, plant production capacity, and carbon price increase rate. For a large scale plant (100,000 tonnes of biodiesel per year) the production cost of biodiesel is likely to be £0.8-1.6 per kg. Copyright © 2013 Elsevier Ltd. All rights reserved.
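    A crude variance-based analogue of the HDMR ranking above can be sketched by estimating first-order sensitivity indices, Var(E[y|x_i])/Var(y), with Monte Carlo sampling and binning. The cost function and parameter ranges below are hypothetical stand-ins, not the authors' techno-economic model:

```python
import numpy as np

rng = np.random.default_rng(1)

def biodiesel_cost(oil_content, productivity, capacity):
    """Hypothetical stand-in for the techno-economic model: cost falls with
    oil content and productivity, with mild economies of scale in capacity."""
    return 2.0 / (oil_content * productivity) + 0.5 / np.sqrt(capacity)

n = 100_000
X = {
    "oil_content":  rng.uniform(0.15, 0.5, n),
    "productivity": rng.uniform(0.7, 1.3, n),
    "capacity":     rng.uniform(0.5, 2.0, n),
}
y = biodiesel_cost(X["oil_content"], X["productivity"], X["capacity"])

def first_order_index(x, y, bins=50):
    """Crude first-order sensitivity index: Var(E[y|x]) / Var(y),
    estimated by binning x into quantiles and averaging y within bins."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return float(cond_means.var() / y.var())

ranking = {k: first_order_index(v, y) for k, v in X.items()}
print(sorted(ranking, key=ranking.get, reverse=True))
# → ['oil_content', 'productivity', 'capacity']
```

With these (hypothetical) ranges the oil-content index dominates, reproducing the qualitative ordering reported in the abstract; HDMR itself builds an explicit component-function expansion rather than binning, but the interpretation of the indices is the same.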

  17. Sensitive determination of endogenous hydroxyl radical in live cell by a BODIPY based fluorescent probe.

    PubMed

    Lei, Kepeng; Sun, Mingtai; Du, Libo; Zhang, Xiaojie; Yu, Huan; Wang, Suhua; Hayat, Tasawar; Alsaedi, Ahmed

    2017-08-01

    Sensitive and selective fluorescent probes for hydroxyl radical analysis are of significance because hydroxyl radical plays key roles in many physiological and pathological processes. In this work, a novel organic fluorescent molecular probe, OHP, for hydroxyl radical is synthesized by a two-step route. The probe employs 4-bora-3a,4a-diaza-s-indacene (difluoroboron dipyrromethene, BODIPY) as the fluorophore and possesses a relatively high fluorescence quantum yield (77.14%). Hydroxyl radical reacts rapidly with the probe and quenches its fluorescence with a good linear relationship (R² = 0.9967). The limit of detection is determined to be as low as 11 nM. In addition, it has been demonstrated that the probe has good stability against pH and light illumination, low cytotoxicity, and high biocompatibility. Cell culture experiments show that the probe OHP is sensitive and selective for imaging and tracking endogenous hydroxyl radical in live cells. Copyright © 2017 Elsevier B.V. All rights reserved.
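    A detection limit like the 11 nM figure above is commonly derived from a linear calibration as 3 × (standard deviation of blank replicates) / |slope|. The calibration data and blank standard deviation below are hypothetical, purely to illustrate the convention:

```python
import numpy as np

# Hypothetical calibration of a quench-based probe: fluorescence intensity
# decreases linearly with hydroxyl-radical concentration (a.u. vs nM).
conc = np.array([0.0, 50.0, 100.0, 150.0, 200.0])        # nM
signal = np.array([1000.0, 905.0, 812.0, 718.0, 621.0])  # a.u.

slope, intercept = np.polyfit(conc, signal, 1)

# Common LOD convention: 3 * (standard deviation of blanks) / |slope|.
blank_sd = 7.0  # a.u., from hypothetical repeated blank measurements
lod_nM = 3.0 * blank_sd / abs(slope)
print(round(lod_nM, 1), "nM")  # → 11.1 nM
```

The slope of a quenching probe is negative, hence the absolute value; tightening the blank noise or steepening the response both lower the computed LOD.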

  18. Thermal biology mediates responses of amphibians and reptiles to habitat modification.

    PubMed

    Nowakowski, A Justin; Watling, James I; Thompson, Michelle E; Brusch, George A; Catenazzi, Alessandro; Whitfield, Steven M; Kurz, David J; Suárez-Mayorga, Ángela; Aponte-Gutiérrez, Andrés; Donnelly, Maureen A; Todd, Brian D

    2018-03-01

    Human activities often replace native forests with warmer, modified habitats that represent novel thermal environments for biodiversity. Reducing biodiversity loss hinges upon identifying which species are most sensitive to the environmental conditions that result from habitat modification. Drawing on case studies and a meta-analysis, we examined whether observed and modelled thermal traits, including heat tolerances, variation in body temperatures, and evaporative water loss, explained variation in sensitivity of ectotherms to habitat modification. Low heat tolerances of lizards and amphibians and high evaporative water loss of amphibians were associated with increased sensitivity to habitat modification, often explaining more variation than non-thermal traits. Heat tolerances alone explained 24-66% (mean = 38%) of the variation in species responses, and these trends were largely consistent across geographic locations and spatial scales. As habitat modification alters local microclimates, the thermal biology of species will likely play a key role in the reassembly of terrestrial communities. © 2018 John Wiley & Sons Ltd/CNRS.

  19. Sensitive Multi-Species Emissions Monitoring: Infrared Laser-Based Detection of Trace-Level Contaminants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steill, Jeffrey D.; Huang, Haifeng; Hoops, Alexandra A.

    This report summarizes our development of spectroscopic chemical analysis techniques and spectral modeling for trace-gas measurements of highly-regulated low-concentration species present in flue gas emissions from utility coal boilers, such as HCl under conditions of high humidity. Detailed spectral modeling of the spectroscopy of HCl and other important combustion and atmospheric species such as H2O, CO2, N2O, NO2, SO2, and CH4 demonstrates that IR-laser spectroscopy is a sensitive multi-component analysis strategy. Experimental measurements from techniques based on IR laser spectroscopy are presented that demonstrate sub-ppm sensitivity levels to these species. Photoacoustic infrared spectroscopy is used to detect and quantify HCl at ppm levels with extremely high signal-to-noise even under conditions of high relative humidity. Additionally, cavity ring-down IR spectroscopy is used to achieve an extremely high sensitivity to combustion trace gases in this spectral region; ppm-level CH4 is one demonstrated example. The importance of spectral resolution in the sensitivity of a trace-gas measurement is examined by spectral modeling in the mid- and near-IR, and efforts to improve measurement resolution through novel instrument development are described. While previous project reports focused on benefits and complexities of the dual-etalon cavity ring-down infrared spectrometer, here details on steps taken to implement this unique and potentially revolutionary instrument are described. This report also illustrates and critiques the general strategy of IR-laser photodetection of trace gases, leading to the conclusion that mid-IR laser spectroscopy techniques provide a promising basis for further instrument development and implementation that will enable cost-effective sensitive detection of multiple key contaminant species simultaneously.

  20. Preoperative localization strategies for primary hyperparathyroidism: an economic analysis.

    PubMed

    Lubitz, Carrie C; Stephen, Antonia E; Hodin, Richard A; Pandharipande, Pari

    2012-12-01

    Strategies for localizing parathyroid pathology preoperatively vary in cost and accuracy. Our purpose was to compute and compare comprehensive costs associated with common localization strategies. A decision-analytic model was developed to evaluate comprehensive, short-term costs of parathyroid localization strategies for patients with primary hyperparathyroidism. Eight strategies were compared. Probabilities of accurate localization were extracted from the literature, and costs associated with each strategy were based on 2011 Medicare reimbursement schedules. Differential cost considerations included outpatient versus inpatient surgeries, operative time, and costs of imaging. Sensitivity analyses were performed to determine the effects of variability in key model parameters upon model results. Ultrasound (US) followed by 4D-CT was the least expensive strategy ($5,901), followed by US alone ($6,028), and 4D-CT alone ($6,110). Strategies including sestamibi (SM) were more expensive, with associated expenditures of up to $6,329 for contemporaneous US and SM. Four-gland, bilateral neck exploration (BNE) was the most expensive strategy ($6,824). Differences in cost were dependent upon differences in the sensitivity of each strategy for detecting single-gland disease, which determined the proportion of patients able to undergo outpatient minimally invasive parathyroidectomy. In sensitivity analysis, US alone was preferred over US followed by 4D-CT only when both the sensitivity of US alone for detecting an adenoma was ≥94%, and the sensitivity of 4D-CT following negative US was ≤39%. 4D-CT alone was the least costly strategy when US sensitivity was ≤31%. Among commonly used strategies for preoperative localization of parathyroid pathology, US followed by selective 4D-CT is the least expensive.

  1. M13 Bacteriophage Based Protein Sensors

    NASA Astrophysics Data System (ADS)

    Lee, Ju Hun

    Despite significant progress in biotechnology and biosensing, early detection and disease diagnosis remain critical issues for improving patient survival rates and well-being. Many of the typical detection schemes currently used suffer from low sensitivity and accuracy and are also time-consuming and expensive to run. In addition, multiplexed detection remains difficult to achieve. Therefore, advanced approaches for reliable, simple, and highly sensitive quantitative analysis of multiple markers in solution are still in demand. In recent years, research has primarily focused on improving two key components of biosensors: the bio-recognition agent (bio-receptor) and the transducer. Particular bio-receptors that have been used include antibodies, aptamers, molecularly imprinted polymers, and small affinity peptides. In terms of transducing agents, nanomaterials have been considered attractive candidates due to their inherent nanoscale size, durability, and unique chemical and physical properties. The key focus of this thesis is the design of a protein detection and identification system based on chemically engineered M13 bacteriophage coupled with nanomaterials. The first chapter provides an introduction to biosensors and M13 bacteriophage in general, where the advantages of each are discussed. In chapter 2, an efficient and enzyme-free sensor built from modified M13 bacteriophage is demonstrated to generate highly sensitive colorimetric signals from gold nanocrystals. In chapter 3, DNA-conjugated M13 phages were used to enable facile and rapid detection of antigens in solution while also providing modalities for identification. Lastly, high DNA loadings per phage were achieved via hydrazone chemistry, and these were applied in conjunction with Raman-active DNA-gold/silver core/shell nanoparticles toward highly sensitive SERS sensing.

  2. Sensitivity Analysis and Parameter Estimation for a Reactive Transport Model of Uranium Bioremediation

    NASA Astrophysics Data System (ADS)

    Meyer, P. D.; Yabusaki, S.; Curtis, G. P.; Ye, M.; Fang, Y.

    2011-12-01

    A three-dimensional, variably-saturated flow and multicomponent biogeochemical reactive transport model of uranium bioremediation was used to generate synthetic data. The 3-D model was based on a field experiment at the U.S. Dept. of Energy Rifle Integrated Field Research Challenge site that used acetate biostimulation of indigenous metal reducing bacteria to catalyze the conversion of aqueous uranium in the +6 oxidation state to immobile solid-associated uranium in the +4 oxidation state. A key assumption in past modeling studies at this site was that a comprehensive reaction network could be developed largely through one-dimensional modeling. Sensitivity analyses and parameter estimation were completed for a 1-D reactive transport model abstracted from the 3-D model to test this assumption, to identify parameters with the greatest potential to contribute to model predictive uncertainty, and to evaluate model structure and data limitations. Results showed that sensitivities of key biogeochemical concentrations varied in space and time, that model nonlinearities and/or parameter interactions have a significant impact on calculated sensitivities, and that the complexity of the model's representation of processes affecting Fe(II) in the system may make it difficult to correctly attribute observed Fe(II) behavior to modeled processes. Non-uniformity of the 3-D simulated groundwater flux and averaging of the 3-D synthetic data for use as calibration targets in the 1-D modeling resulted in systematic errors in the 1-D model parameter estimates and outputs. This occurred despite using the same reaction network for 1-D modeling as used in the data-generating 3-D model. Predictive uncertainty of the 1-D model appeared to be significantly underestimated by linear parameter uncertainty estimates.

  3. Salt stress induces differential regulation of the phenylpropanoid pathway in Olea europaea cultivars Frantoio (salt-tolerant) and Leccino (salt-sensitive).

    PubMed

    Rossi, Lorenzo; Borghi, Monica; Francini, Alessandra; Lin, Xiuli; Xie, De-Yu; Sebastiani, Luca

    2016-10-01

    Olive tree (Olea europaea L.) is an important crop in the Mediterranean Basin, where drought and salinity are two of the main factors affecting plant productivity. Although several studies have reported different responses of various olive tree cultivars to salt stress, the mechanisms that convey tolerance and sensitivity remain largely unknown. To investigate this issue, potted olive plants of the Leccino (salt-sensitive) and Frantoio (salt-tolerant) cultivars were grown in a phytotron chamber and treated with 0, 60 and 120 mM NaCl. After forty days of treatment, growth analysis was performed and the concentration of sodium in root, stem and leaves was measured by atomic absorption spectroscopy. Phenolic compounds were extracted using methanol, hydrolyzed with butanol-HCl, and quercetin and kaempferol were quantified via high performance liquid-chromatography-electrospray-mass spectrometry (HPLC-ESI-MS) and HPLC-q-Time of Flight-MS analyses. In addition, the transcript levels of five key genes of the phenylpropanoid pathway were measured by quantitative Real-Time PCR. The results of this study corroborate previous observations, which showed that Frantoio and Leccino differ in allocating sodium in root and leaves. This study also revealed that phenolic compounds remain stable or are strongly depleted under long-term treatment with sodium in Leccino, even though a strong up-regulation of key genes of the phenylpropanoid pathway was observed. Frantoio, instead, showed a less intense up-regulation of the phenylpropanoid genes but overall higher content of phenolic compounds. These data suggest that Frantoio copes with the toxicity imposed by elevated sodium not only through mechanisms of Na+ exclusion, but also by promptly allocating effective and adequate antioxidant compounds to more sensitive organs. Copyright © 2016 Elsevier GmbH. All rights reserved.

  4. Active and passive shielding design optimization and technical solutions for deep sensitivity hard x-ray focusing telescopes

    NASA Astrophysics Data System (ADS)

    Malaguti, G.; Pareschi, G.; Ferrando, P.; Caroli, E.; Di Cocco, G.; Foschini, L.; Basso, S.; Del Sordo, S.; Fiore, F.; Bonati, A.; Lesci, G.; Poulsen, J. M.; Monzani, F.; Stevoli, A.; Negri, B.

    2005-08-01

    The 10-100 keV region of the electromagnetic spectrum contains the potential for a dramatic improvement in our understanding of a number of key problems in high energy astrophysics. A deep inspection of the universe in this band is, on the other hand, still lacking because of the demanding sensitivity (a fraction of a μCrab in the 20-40 keV band for a 1 Ms integration time) and imaging (≈ 15" angular resolution) requirements. The mission ideas currently being proposed are based on long focal length, grazing incidence, multi-layer optics, coupled with focal plane detectors with spatial resolution of a few hundred μm. The required large focal lengths, ranging between 8 and 50 m, can be realized by means of extendable optical benches (as foreseen e.g. for the HEXITSAT, NEXT and NuSTAR missions) or formation flight scenarios (e.g. Simbol-X and XEUS). While the final telescope design will require a detailed trade-off analysis between all the relevant parameters (focal length, plate scale value, angular resolution, field of view, detector size, and sensitivity degradation due to detector dead area and telescope vignetting), extreme attention must be dedicated to background minimization. In this respect, key issues are represented by the passive baffling system, which in the case of large focal lengths requires particular design assessments, and by the active/passive shielding geometries and materials. In this work, the results of a study of the expected background for a hard X-ray telescope are presented, and the implications for the required sensitivity, together with possible implementation design concepts for active and passive shielding in the framework of future satellite missions, are discussed.

  5. Final report of key comparison AFRIMETS.AUV.A-K5: primary pressure calibration of LS1P microphones according to IEC 61094-2, over the frequency range 2 Hz to 10 kHz.

    NASA Astrophysics Data System (ADS)

    Nel, R.; Avison, J.; Harris, P.; Blabla, M.; Hämäläinen, J.

    2017-01-01

    The degrees of equivalence of the AFRIMETS.AUV.A-K5 regional key comparison are reported here as the final report. The scope of the comparison covered the complex pressure sensitivities of two LS1P microphones over the frequency range 2 Hz to 10 kHz in accordance with IEC 61094-2: 2009. Four national metrology institutes from two different regional metrology organisations participated in the comparison. Two LS1P microphones were circulated simultaneously to all the participants in a circular configuration. The sensitivity of one of the microphones shifted, and all results associated with this microphone were subsequently excluded from further analysis and linking. The AFRIMETS.AUV.A-K5 comparison results were linked to the CCAUV.A-K5 comparison results via dual participation in the CCAUV.A-K5 and AFRIMETS.AUV.A-K5 comparisons. The degrees of equivalence, linked to the CCAUV.A-K5 comparison, were calculated for all participants of this comparison. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCAUV, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  6. Intestinal microbial variation may predict early acute rejection after liver transplantation in rats.

    PubMed

    Ren, Zhigang; Jiang, Jianwen; Lu, Haifeng; Chen, Xinhua; He, Yong; Zhang, Hua; Xie, Haiyang; Wang, Weilin; Zheng, Shusen; Zhou, Lin

    2014-10-27

    Acute rejection (AR) remains a life-threatening complication after orthotopic liver transplantation (OLT), and there are few diagnostic biomarkers available clinically for AR. This study aims to identify the intestinal microbial profile and explore its potential application as a biomarker for AR after OLT. OLT models in rats were established. Hepatic graft histology, ultrastructure, function, and intestinal barrier function were tested. Ileocecal contents were collected for intestinal microbial analysis. The hepatic graft suffered from ischemia-reperfusion (I/R) injury on day 1, initial AR on day 3, and severe AR on day 7 after OLT. Real-time quantitative polymerase chain reaction results showed that Faecalibacterium prausnitzii and Lactobacillus were decreased, whereas Clostridium bolteae was increased during AR. Notably, cluster analysis of denaturing gradient gel electrophoresis (DGGE) profiles showed that the 7AR and 3AR groups clustered together with 73.4% similarity, suggesting that the intestinal microbiota was more sensitive than hepatic function in responding to AR. Microbial diversity and species richness were decreased during AR. Phylogenetic tree analysis showed that most of the decreased key bacteria belonged to the phylum Firmicutes, whereas the increased key bacteria belonged to the phylum Bacteroidetes. Moreover, intestinal microvilli loss and tight junction damage were noted, and intestinal barrier dysfunction during AR presented as a decrease of fecal secretory immunoglobulin A (sIgA) and increases in blood bacteremia, endotoxin, and tumor necrosis factor-α. We dynamically detail the intestinal microbial characteristics and find a high sensitivity of microbial change during AR after OLT, suggesting that intestinal microbial variation may predict AR in the early phase and become an adjunct therapeutic target to ameliorate rejection after OLT.

  7. Editor's Highlight: Application of Gene Set Enrichment Analysis for Identification of Chemically Induced, Biologically Relevant Transcriptomic Networks and Potential Utilization in Human Health Risk Assessment.

    PubMed

    Dean, Jeffry L; Zhao, Q Jay; Lambert, Jason C; Hawkins, Belinda S; Thomas, Russell S; Wesselkamper, Scott C

    2017-05-01

    The rate of new chemical development in commerce combined with a paucity of toxicity data for legacy chemicals presents a unique challenge for human health risk assessment. There is a clear need to develop new technologies and incorporate novel data streams to more efficiently inform derivation of toxicity values. One avenue of exploitation lies in the field of transcriptomics and the application of gene expression analysis to characterize biological responses to chemical exposures. In this context, gene set enrichment analysis (GSEA) was employed to evaluate tissue-specific, dose-response gene expression data generated following exposure to multiple chemicals for various durations. Patterns of transcriptional enrichment were evident across time and with increasing dose, and coordinated enrichment plausibly linked to the etiology of the biological responses was observed. GSEA was able to capture both transient and sustained transcriptional enrichment events facilitating differentiation between adaptive versus longer term molecular responses. When combined with benchmark dose (BMD) modeling of gene expression data from key drivers of biological enrichment, GSEA facilitated characterization of dose ranges required for enrichment of biologically relevant molecular signaling pathways, and promoted comparison of the activation dose ranges required for individual pathways. Median transcriptional BMD values were calculated for the most sensitive enriched pathway as well as the overall median BMD value for key gene members of significantly enriched pathways, and both were observed to be good estimates of the most sensitive apical endpoint BMD value. Together, these efforts support the application of GSEA to qualitative and quantitative human health risk assessment. Published by Oxford University Press on behalf of the Society of Toxicology 2017. This work is written by US Government employees and is in the public domain in the US.

  8. Intestinal Microbial Variation May Predict Early Acute Rejection after Liver Transplantation in Rats

    PubMed Central

    Ren, Zhigang; Jiang, Jianwen; Lu, Haifeng; Chen, Xinhua; He, Yong; Zhang, Hua; Xie, Haiyang; Wang, Weilin; Zheng, Shusen; Zhou, Lin

    2014-01-01

    Background Acute rejection (AR) remains a life-threatening complication after orthotopic liver transplantation (OLT), and there are few diagnostic biomarkers available clinically for AR. This study aims to identify the intestinal microbial profile and explore its potential application as a biomarker for AR after OLT. Methods OLT models in rats were established. Hepatic graft histology, ultrastructure, function, and intestinal barrier function were tested. Ileocecal contents were collected for intestinal microbial analysis. Results The hepatic graft suffered from ischemia-reperfusion (I/R) injury on day 1, initial AR on day 3, and severe AR on day 7 after OLT. Real-time quantitative polymerase chain reaction results showed that Faecalibacterium prausnitzii and Lactobacillus were decreased, whereas Clostridium bolteae was increased during AR. Notably, cluster analysis of denaturing gradient gel electrophoresis (DGGE) profiles showed that the 7AR and 3AR groups clustered together with 73.4% similarity, suggesting that the intestinal microbiota was more sensitive than hepatic function in responding to AR. Microbial diversity and species richness were decreased during AR. Phylogenetic tree analysis showed that most of the decreased key bacteria belonged to the phylum Firmicutes, whereas the increased key bacteria belonged to the phylum Bacteroidetes. Moreover, intestinal microvilli loss and tight junction damage were noted, and intestinal barrier dysfunction during AR presented as a decrease of fecal secretory immunoglobulin A (sIgA) and increases in blood bacteremia, endotoxin, and tumor necrosis factor-α. Conclusion We dynamically detail the intestinal microbial characteristics and find a high sensitivity of microbial change during AR after OLT, suggesting that intestinal microbial variation may predict AR in the early phase and become an adjunct therapeutic target to ameliorate rejection after OLT. PMID:25321166

  9. Renal Mass Biopsy to Guide Treatment Decisions for Small Incidental Renal Tumors: A Cost-effectiveness Analysis

    PubMed Central

    Gervais, Debra A.; Hartman, Rebecca I.; Harisinghani, Mukesh G.; Feldman, Adam S.; Mueller, Peter R.; Gazelle, G. Scott

    2010-01-01

    Purpose: To evaluate the effectiveness, cost, and cost-effectiveness of using renal mass biopsy to guide treatment decisions for small incidentally detected renal tumors. Materials and Methods: A decision-analytic Markov model was developed to estimate life expectancy and lifetime costs for patients with small (≤4-cm) renal tumors. Two strategies were compared: renal mass biopsy to triage patients to surgery or imaging surveillance and empiric nephron-sparing surgery. The model incorporated biopsy performance, the probability of track seeding with malignant cells, the prevalence and growth of benign and malignant tumors, treatment effectiveness and costs, and patient outcomes. An incremental cost-effectiveness analysis was performed to identify strategy preference under a willingness-to-pay threshold of $75 000 per quality-adjusted life-year (QALY). Effects of changes in key parameters on strategy preference were evaluated in sensitivity analysis. Results: Under base-case assumptions, the biopsy strategy yielded a minimally greater quality-adjusted life expectancy (4 days) than did empiric surgery at a lower lifetime cost ($3466), dominating surgery from a cost-effectiveness perspective. Over the majority of parameter ranges tested in one-way sensitivity analysis, the biopsy strategy dominated surgery or was cost-effective relative to surgery based on a $75 000-per-QALY willingness-to-pay threshold. In two-way sensitivity analysis, surgery yielded greater life expectancy when the prevalence of malignancy and propensity for biopsy-negative cancers to metastasize were both higher than expected or when the sensitivity and specificity of biopsy were both lower than expected. Conclusion: The use of biopsy to guide treatment decisions for small incidentally detected renal tumors is cost-effective and can prevent unnecessary surgery in many cases. © RSNA, 2010 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.10092013/-/DC1 PMID:20720070
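    Dominance under a willingness-to-pay threshold, as reported above, can be checked via the incremental net monetary benefit, iNMB = WTP × ΔQALY − Δcost. The sketch below plugs in the increments quoted in the abstract (about 4 quality-adjusted days gained at $3,466 lower lifetime cost); the calculation itself is standard, but treating these two numbers as the full comparison is a simplification:

```python
WTP = 75_000.0  # willingness-to-pay threshold, $ per QALY

# Increments of biopsy-first vs empiric surgery, as reported in the abstract.
delta_qaly = 4.0 / 365.0  # ~4 extra quality-adjusted days, in QALYs
delta_cost = -3466.0      # lower lifetime cost, $

# Incremental net monetary benefit. A strategy that is both cheaper and more
# effective (delta_cost < 0, delta_qaly > 0) dominates, and no ICER is
# defined; iNMB > 0 still summarizes the preference at the chosen WTP.
inmb = WTP * delta_qaly - delta_cost
print(round(inmb))  # → 4288
```

A positive iNMB at the $75,000-per-QALY threshold reproduces the paper's base-case conclusion that biopsy dominates empiric surgery.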

  10. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure to overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. 
Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.

  11. Key Elements of a Family Intervention for Schizophrenia: A Qualitative Analysis of an RCT.

    PubMed

    Grácio, Jaime; Gonçalves-Pereira, Manuel; Leff, Julian

    2018-03-01

    Schizophrenia is a complex biopsychosocial condition in which expressed emotion in family members is a robust predictor of relapse. Not surprisingly, family interventions are remarkably effective and thus recommended in current treatment guidelines. Their key elements seem to be common therapeutic factors, followed by education and coping skills training. However, few studies have explored these key elements and the process of the intervention itself. We conducted a qualitative and quantitative analysis of the records from a pioneering family intervention trial addressing expressed emotion, published by Leff and colleagues four decades ago. Records were coded into categories and the data were explored using descriptive statistics. This was complemented by a narrative evaluation using an inductive approach based on emotional markers and markers of change. The most used strategies in the intervention were addressing needs, followed by coping skills enhancement, advice, and emotional support. Dealing with overinvolvement and reframing were the next most frequent. Single-family home sessions seemed to augment the therapeutic work conducted in family groups. Overall, the intervention seemed to promote cognitive and emotional change in the participants, and therapists were sensitive to the emotional trajectory of each subject. On the basis of our findings, we developed a longitudinal framework for better understanding the process of this treatment approach. © 2016 Family Process Institute.

  12. Development of Sensitivity to Audiovisual Temporal Asynchrony during Midchildhood

    ERIC Educational Resources Information Center

    Kaganovich, Natalya

    2016-01-01

    Temporal proximity is one of the key factors determining whether events in different modalities are integrated into a unified percept. Sensitivity to audiovisual temporal asynchrony has been studied in adults in great detail. However, how such sensitivity matures during childhood is poorly understood. We examined perception of audiovisual temporal…

  13. Fricke-gel dosimeter: overview of Xylenol Orange chemical behavior

    NASA Astrophysics Data System (ADS)

    Liosi, G. M.; Dondi, D.; Vander Griend, D. A.; Lazzaroni, S.; D'Agostino, G.; Mariani, M.

    2017-11-01

    The complexation between Xylenol Orange (XO) and Fe3+ ions plays a key role in Fricke-gel dosimeters for the determination of the absorbed dose via UV-vis analysis. In this study, the effect of XO and the acidity of the solution on the complexation mechanism was investigated. Moreover, starting from the results of complexation titration and Equilibrium Restricted Factor Analysis, four XO-Fe3+ complexes were identified to contribute to the absorption spectra. Based on the acquired knowledge, a new [Fe3+] vs dose calibration method is proposed. The preliminary results show a significant improvement of the sensitivity and dose threshold with respect to the commonly used Abs vs dose calibration method.

  14. Observer-Pattern Modeling and Slow-Scale Bifurcation Analysis of Two-Stage Boost Inverters

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Wan, Xiaojin; Li, Weijie; Ding, Honghui; Yi, Chuanzhi

    2017-06-01

    This paper deals with modeling and bifurcation analysis of two-stage Boost inverters. Since the effect of the nonlinear interactions between source-stage converter and load-stage inverter causes the “hidden” second-harmonic current at the input of the downstream H-bridge inverter, an observer-pattern modeling method is proposed by removing time variance originating from both fundamental frequency and hidden second harmonics in the derived averaged equations. Based on the proposed observer-pattern model, the underlying mechanism of slow-scale instability behavior is uncovered with the help of eigenvalue analysis method. Then eigenvalue sensitivity analysis is used to select some key system parameters of two-stage Boost inverter, and some behavior boundaries are given to provide some design-oriented information for optimizing the circuit. Finally, these theoretical results are verified by numerical simulations and circuit experiment.

  15. Parameter screening: the use of a dummy parameter to identify non-influential parameters in a global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Khorashadi Zadeh, Farkhondeh; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2017-04-01

    Parameter estimation is a major concern in hydrological modeling, which may limit the use of complex simulators with a large number of parameters. To support the selection of parameters to include in or exclude from the calibration process, Global Sensitivity Analysis (GSA) is widely applied in modeling practices. Based on the results of GSA, the influential and the non-influential parameters are identified (i.e., parameter screening). Nevertheless, the choice of the screening threshold below which parameters are considered non-influential is a critical issue, which has recently received more attention in GSA literature. In theory, the sensitivity index of a non-influential parameter has a value of zero. However, since numerical approximations, rather than analytical solutions, are utilized in GSA methods to calculate the sensitivity indices, small but non-zero values may be obtained for the indices of non-influential parameters. In order to assess the threshold that identifies non-influential parameters in GSA methods, we propose to calculate the sensitivity index of a "dummy parameter". This dummy parameter has no influence on the model output, but will have a non-zero sensitivity index, representing the error due to the numerical approximation. Hence, the parameters whose indices are above the sensitivity index of the dummy parameter can be classified as influential, whereas the parameters whose indices are below this index are within the range of the numerical error and should be considered as non-influential. To demonstrate the effectiveness of the proposed "dummy parameter approach", 26 parameters of a Soil and Water Assessment Tool (SWAT) model are selected to be analyzed and screened, using the variance-based Sobol' and moment-independent PAWN methods. The sensitivity index of the dummy parameter is calculated from sampled data, without changing the model equations.
Moreover, the calculation does not even require additional model evaluations for the Sobol' method. A formal statistical test validates these parameter screening results. Based on the dummy parameter screening, 11 model parameters are identified as influential. Therefore, it can be concluded that the "dummy parameter approach" can facilitate the parameter screening process and provide guidance for GSA users to define a screening threshold, with only limited additional resources. Key words: Parameter screening, Global sensitivity analysis, Dummy parameter, Variance-based method, Moment-independent method
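
A minimal numpy sketch of the dummy-parameter idea: a toy two-parameter model plus one unused "dummy" column, screened with a simplified Saltelli-style first-order Sobol' estimator (not SWAT, not PAWN; the model and sample sizes are illustrative). The dummy's index estimates the numerical-noise floor that serves as the screening threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy model: the output depends on x0 and x1 only; x2 is the dummy
    # parameter and is deliberately ignored.
    return np.sin(x[:, 0]) + 0.3 * x[:, 1] ** 2

d = 3          # two real parameters + one dummy
n = 100_000    # base sample size
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))

fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S = []
for i in range(d):
    AB = A.copy()
    AB[:, i] = B[:, i]          # matrix A with column i taken from B
    # Saltelli-style first-order index estimator
    S.append(np.mean(fB * (model(AB) - fA)) / var)

# Parameters whose index exceeds the dummy's index are deemed influential;
# the rest fall within numerical error and are screened out.
threshold = S[-1]
influential = [i for i in range(d - 1) if S[i] > threshold]
```

In a real screening exercise the dummy index would itself carry sampling noise, so a confidence interval (e.g. from bootstrapping) on the threshold is advisable, as the abstract's formal statistical test suggests.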

  16. Context sensitive solutions (CSS) online training course development.

    DOT National Transportation Integrated Search

    2009-04-01

    A key to the successful implementation of Context Sensitive Solutions (CSS) for Illinois transportation projects is the active and informed participation of Illinois Department of Transportation's (IDOT) stakeholders. Essential to this particip...

  17. Self-homodyne free-space optical communication system based on orthogonally polarized binary phase shift keying.

    PubMed

    Cai, Guangyu; Sun, Jianfeng; Li, Guangyuan; Zhang, Guo; Xu, Mengmeng; Zhang, Bo; Yue, Chaolei; Liu, Liren

    2016-06-10

    A self-homodyne laser communication system based on orthogonally polarized binary phase shift keying is demonstrated. The working principles of this method and the structure of a transceiver are described using theoretical calculations. Moreover, the signal-to-noise ratio, sensitivity, and bit error rate are analyzed for the amplifier-noise-limited case. The reported experiment validates the feasibility of the proposed method and demonstrates its advantageous sensitivity as a self-homodyne communication system.

  18. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
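
The test-performance measures discussed in this review all reduce to ratios over a 2×2 table of counts; a minimal sketch (the function name and example counts are ours, not from the article):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Test-performance measures from a 2x2 table of counts
    (true/false positives and true/false negatives)."""
    sensitivity = tp / (tp + fn)               # true positive rate
    specificity = tn / (tn + fp)               # true negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
    return {"sensitivity": sensitivity, "specificity": specificity,
            "accuracy": accuracy, "LR+": lr_pos, "LR-": lr_neg}

# Illustrative counts: 90 true positives, 20 false positives,
# 10 false negatives, 80 true negatives.
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
```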

  19. Behavioral modeling of VCSELs for high-speed optical interconnects

    NASA Astrophysics Data System (ADS)

    Szczerba, Krzysztof; Kocot, Chris

    2018-02-01

    Transition from on-off keying to 4-level pulse amplitude modulation (PAM) in VCSEL-based optical interconnects allows for an increase of data rates, at the cost of a 4.8 dB sensitivity penalty. The resulting strained link budget creates a need for accurate VCSEL models for driver integrated circuit (IC) design and system level simulations. Rate-equation-based equivalent circuit models are convenient for the IC design, but system level analysis requires computationally efficient closed-form behavioral models based on Volterra series and neural networks. In this paper, we present and compare these models.

  20. A Sensitivity Analysis of the BRL Message Processing Model (BRLMPM) Data Inputs.

    DTIC Science & Technology

    1982-12-01


  1. Regional surface soil heat flux estimate from multiple remote sensing data in a temperate and semiarid basin

    NASA Astrophysics Data System (ADS)

    Li, Nana; Jia, Li; Lu, Jing; Menenti, Massimo; Zhou, Jie

    2017-01-01

    Estimation of the regional surface soil heat flux (G0) is very important for large-scale land surface process modeling. However, most regional G0 estimation methods are based on an empirical relationship between G0 and the net radiation flux. A physical model based on harmonic analysis was improved (referred to as the "HM model") and applied over the Heihe River Basin, northwest China, with multiple remote sensing data, e.g., FY-2C, AMSR-E, and MODIS, and soil map data. A sensitivity analysis of the model was performed as well. The results show that the improved model describes the variation of G0 well. Land surface temperature (LST) and thermal inertia (Γ) are the two key input variables of the HM model. Compared with in situ G0, there are some differences, mainly due to the differences between remotely sensed LST and in situ LST. The sensitivity analysis shows that errors from -7 to -0.5 K in LST amplitude and from -300 to 300 J m-2 K-1 s-0.5 in Γ will cause about 20% errors, which are acceptable for G0 estimation.
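
The harmonic approach to G0 can be sketched for a homogeneous half-space: each Fourier harmonic of the surface temperature contributes a flux term scaled by the thermal inertia Γ, with amplitude proportional to sqrt(nω) and a π/4 phase lead. This is the generic half-space harmonic solution, not the authors' improved HM model, and all input values below are illustrative.

```python
import numpy as np

def soil_heat_flux(t, gamma, amps, phases, omega):
    """Harmonic-method surface soil heat flux G0 (W m^-2).

    For a homogeneous half-space whose surface temperature follows the
    Fourier series sum_n amps[n-1] * sin(n*omega*t + phases[n-1]), each
    harmonic contributes gamma * amps[n-1] * sqrt(n*omega) with a pi/4
    phase lead; gamma is the soil thermal inertia (J m^-2 K^-1 s^-0.5).
    """
    amps, phases = np.asarray(amps), np.asarray(phases)
    n = np.arange(1, len(amps) + 1)
    return gamma * np.sum(amps * np.sqrt(n * omega)
                          * np.sin(n * omega * t + phases + np.pi / 4))

omega = 2 * np.pi / 86400       # diurnal angular frequency (rad/s)
gamma = 800.0                   # illustrative thermal inertia
# Single 10-K diurnal harmonic, evaluated where its G0 contribution peaks.
g0 = soil_heat_flux(t=(np.pi / 4) / omega, gamma=gamma,
                    amps=[10.0], phases=[0.0], omega=omega)
```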

  2. Validation and Parameter Sensitivity Tests for Reconstructing Swell Field Based on an Ensemble Kalman Filter

    PubMed Central

    Wang, Xuan; Tandeo, Pierre; Fablet, Ronan; Husson, Romain; Guan, Lei; Chen, Ge

    2016-01-01

    The swell propagation model built on geometric optics is known to work well when simulating swells radiated from a distant storm. Based on this simple approximation, satellites have acquired a large number of samples of basin-traversing swells induced by fierce storms situated in mid-latitudes. How to routinely reconstruct swell fields from these irregularly sampled spaceborne observations using known swell propagation principles requires further examination. In this study, we apply 3-h interval pseudo SAR observations in the ensemble Kalman filter (EnKF) to reconstruct a swell field in an ocean basin, and compare it with buoy swell partitions and polynomial regression results. As validated against in situ measurements, EnKF works well in terms of spatial–temporal consistency in far-field swell propagation scenarios. Using this framework, we further address the influence of EnKF parameters, and perform a sensitivity analysis to evaluate estimations made under different sets of parameters. Such analysis is of key interest with respect to future multiple-source routinely recorded swell field data. Satellite-derived swell data can serve as a valuable complementary dataset to in situ or wave re-analysis datasets. PMID:27898005
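
The analysis step at the heart of such a reconstruction can be sketched in a few lines. This is a generic stochastic (perturbed-observations) EnKF update on a toy two-variable state, not the authors' SAR configuration; all dimensions and numbers are illustrative.

```python
import numpy as np

def enkf_update(X, H, y, R, rng):
    """One stochastic EnKF analysis step.

    X : (n, N) forecast ensemble (n state variables, N members)
    H : (m, n) linear observation operator
    y : (m,) observation vector
    R : (m, m) observation-error covariance
    """
    N = X.shape[1]
    Xp = X - X.mean(axis=1, keepdims=True)      # ensemble anomalies
    HXp = H @ Xp
    PHt = Xp @ HXp.T / (N - 1)                  # cross-covariance P H^T
    S = HXp @ HXp.T / (N - 1) + R               # innovation covariance
    K = PHt @ np.linalg.inv(S)                  # Kalman gain
    # Perturbed observations, one independent draw per member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, N).T
    return X + K @ (Y - H @ X)                  # analysis ensemble

rng = np.random.default_rng(1)
# Toy two-variable "swell" state [height (m), period (s)]; prior off target.
X = rng.normal([2.0, 10.0], [0.5, 1.0], (200, 2)).T   # 200 members
H = np.array([[1.0, 0.0]])                            # only height observed
y = np.array([3.0])
R = np.array([[0.05]])
Xa = enkf_update(X, H, y, R, rng)
```

After the update, the ensemble mean of the observed variable is pulled toward the observation in proportion to the prior-vs-observation error variances, while unobserved variables move only through their sampled covariance with the observed one.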

  3. IATA for skin sensitization potential – 1 out of 2 or 2 out of 3? ...

    EPA Pesticide Factsheets

    To meet EU regulatory requirements and to avoid or minimize animal testing, there is a need for non-animal methods to assess skin sensitization potential. Given the complexity of the skin sensitization endpoint, there is an expectation that integrated testing and assessment approaches (IATA) will need to be developed which rely on assays representing key events in the pathway. Three non-animal assays have been formally validated: the direct peptide reactivity assay (DPRA), the KeratinoSensTM assay and the h-CLAT assay. At the same time, there have been many efforts to develop IATA with the “2 out of 3” approach attracting much attention whereby a chemical is classified on the basis of the majority outcome. A set of 271 chemicals with mouse, human and non-animal sensitization test data was evaluated to compare the predictive performances of the 3 individual non-animal assays, their binary combinations and the ‘2 out of 3’ approach. The analysis revealed that the most predictive approach was to use both the DPRA and h-CLAT: 1. Perform DPRA – if positive, classify as a sensitizer; 2. If negative, perform h-CLAT – a positive outcome denotes a sensitizer, a negative, a non-sensitizer. With this approach, 83% (LLNA) and 93% (human) of the non-sensitizer predictions were correct, in contrast to the ‘2 out of 3’ approach which had 69% (LLNA) and 79% (human) of non-sensitizer predictions correct. The views expressed are those of the authors and do not ne
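
The two decision rules compared in the abstract are easy to state precisely; a sketch with boolean assay outcomes (the helper names are ours, not from the source):

```python
def tiered_dpra_hclat(dpra_positive, hclat_positive):
    """Sequential strategy described above: run DPRA first; if positive,
    classify as a sensitizer, otherwise let h-CLAT decide."""
    if dpra_positive:
        return "sensitizer"
    return "sensitizer" if hclat_positive else "non-sensitizer"

def two_out_of_three(dpra_positive, keratinosens_positive, hclat_positive):
    """'2 out of 3' majority vote over DPRA, KeratinoSens and h-CLAT."""
    votes = sum([dpra_positive, keratinosens_positive, hclat_positive])
    return "sensitizer" if votes >= 2 else "non-sensitizer"
```

Note the practical difference: the tiered rule never runs h-CLAT on DPRA-positive chemicals, while the majority vote always needs at least two assay results.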

  4. Extinction, survival or recovery of large predatory fishes

    PubMed Central

    Myers, Ransom A.; Worm, Boris

    2005-01-01

    Large predatory fishes have long played an important role in marine ecosystems and fisheries. Overexploitation, however, is gradually diminishing this role. Recent estimates indicate that exploitation has depleted large predatory fish communities worldwide by at least 90% over the past 50–100 years. We demonstrate that these declines are general, independent of methodology, and even higher for sensitive species such as sharks. We also attempt to predict the future prospects of large predatory fishes. (i) An analysis of maximum reproductive rates predicts the collapse and extinction of sensitive species under current levels of fishing mortality. Sensitive species occur in marine habitats worldwide and have to be considered in most management situations. (ii) We show that to ensure the survival of sensitive species in the northwest Atlantic fishing mortality has to be reduced by 40–80%. (iii) We show that rapid recovery of community biomass and diversity usually occurs when fishing mortality is reduced. However, recovery is more variable for single species, often because of the influence of species interactions. We conclude that management of multi-species fisheries needs to be tailored to the most sensitive, rather than the more robust species. This requires reductions in fishing effort, reduction in bycatch mortality and protection of key areas to initiate recovery of severely depleted communities. PMID:15713586

  5. Extinction, survival or recovery of large predatory fishes.

    PubMed

    Myers, Ransom A; Worm, Boris

    2005-01-29

    Large predatory fishes have long played an important role in marine ecosystems and fisheries. Overexploitation, however, is gradually diminishing this role. Recent estimates indicate that exploitation has depleted large predatory fish communities worldwide by at least 90% over the past 50-100 years. We demonstrate that these declines are general, independent of methodology, and even higher for sensitive species such as sharks. We also attempt to predict the future prospects of large predatory fishes. (i) An analysis of maximum reproductive rates predicts the collapse and extinction of sensitive species under current levels of fishing mortality. Sensitive species occur in marine habitats worldwide and have to be considered in most management situations. (ii) We show that to ensure the survival of sensitive species in the northwest Atlantic fishing mortality has to be reduced by 40-80%. (iii) We show that rapid recovery of community biomass and diversity usually occurs when fishing mortality is reduced. However, recovery is more variable for single species, often because of the influence of species interactions. We conclude that management of multi-species fisheries needs to be tailored to the most sensitive, rather than the more robust species. This requires reductions in fishing effort, reduction in bycatch mortality and protection of key areas to initiate recovery of severely depleted communities.

  6. Impact of the time scale of model sensitivity response on coupled model parameter estimation

    NASA Astrophysics Data System (ADS)

    Liu, Chang; Zhang, Shaoqing; Li, Shan; Liu, Zhengyu

    2017-11-01

    That a model has sensitivity responses to parameter uncertainties is a key concept in implementing model parameter estimation using filtering theory and methodology. Depending on the nature of associated physics and characteristic variability of the fluid in a coupled system, the response time scales of a model to parameters can be different, from hourly to decadal. Unlike state estimation, where the update frequency is usually linked with observational frequency, the update frequency for parameter estimation must be associated with the time scale of the model sensitivity response to the parameter being estimated. Here, with a simple coupled model, the impact of model sensitivity response time scales on coupled model parameter estimation is studied. The model includes characteristic synoptic to decadal scales by coupling a long-term varying deep ocean with a slow-varying upper ocean forced by a chaotic atmosphere. Results show that, using the update frequency determined by the model sensitivity response time scale, both the reliability and quality of parameter estimation can be improved significantly, and thus the estimated parameters make the model more consistent with the observation. These simple model results provide a guideline for when real observations are used to optimize the parameters in a coupled general circulation model for improving climate analysis and prediction initialization.

  7. ‘They will be afraid to touch you’: LGBTI people and sex workers' experiences of accessing healthcare in Zimbabwe—an in-depth qualitative study

    PubMed Central

    Hunt, Jennifer; Bristowe, Katherine; Chidyamatare, Sybille; Harding, Richard

    2017-01-01

    Objectives To examine experiences of key populations (lesbian, gay, bisexual, trans and intersex (LGBTI) people, men who have sex with men (MSM) and sex workers) in Zimbabwe regarding access to, and experiences of, healthcare. Design Qualitative study using in-depth interviews and focus groups, with thematic analysis. Participants Sixty individuals from key populations in Zimbabwe. Setting Participants were recruited from four locations (Harare, Bulawayo, Mutare, Beitbridge/Masvingo). Results Participants described considerable unmet needs and barriers to accessing basic healthcare due to discrimination regarding key population status, exacerbated by the sociopolitical/legal environment. Three main themes emerged: (1) key populations' illnesses were caused by their behaviour; (2) equal access to healthcare is conditional on key populations conforming to ‘sexual norms’ and (3) perceptions that healthcare workers were ill-informed about key populations, and that professionals' personal attitudes affected care delivery. Participants felt unable to discuss their key population status with healthcare workers. Their healthcare needs were expected to be met almost entirely by their own communities. Conclusions This is one of very few studies of healthcare access beyond HIV for key populations in Africa. Discrimination towards key populations discourages early diagnosis, limits access to healthcare/treatment and increases risk of transmission of infectious diseases. Key populations experience unnecessary suffering from untreated conditions, exclusion from healthcare and extreme psychological distress. Education is needed to reduce stigma and enhance sensitive clinical interviewing skills. Clinical and public health implications of discrimination in healthcare must be addressed through evidence-based interventions for professionals, particularly in contexts with sociopolitical/legal barriers to equality. PMID:28589012

  8. Cost-effectiveness of unicondylar versus total knee arthroplasty: a Markov model analysis.

    PubMed

    Peersman, Geert; Jak, Wouter; Vandenlangenbergh, Tom; Jans, Christophe; Cartier, Philippe; Fennema, Peter

    2014-01-01

    Unicondylar knee arthroplasty (UKA) is believed to lead to less morbidity and enhanced functional outcomes when compared with total knee arthroplasty (TKA). Conversely, UKA is also associated with a higher revision risk than TKA. In order to further clarify the key differences between these separate procedures, the current study assessing the cost-effectiveness of UKA versus TKA was undertaken. A state-transition Markov model was developed to compare the cost-effectiveness of UKA versus TKA for unicondylar osteoarthritis using a Belgian payer's perspective. The model was designed to include the possibility of two revision procedures. Model estimates were obtained through literature review and revision rates were based on registry data. Threshold analysis and probabilistic sensitivity analysis were performed to assess the model's robustness. UKA was associated with a cost reduction of €2,807 and a utility gain of 0.04 quality-adjusted life years in comparison with TKA. Analysis determined that the model is sensitive to clinical effectiveness, and that a marginal reduction in the clinical performance of UKA would lead to TKA being the more cost-effective solution. UKA yields clear advantages in terms of costs and marginal advantages in terms of health effects, in comparison with TKA. © 2014 Elsevier B.V. All rights reserved.
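
A state-transition Markov cohort model of this kind can be sketched as follows. The structure (a post-primary state, up to two revisions, death) mirrors the description above, but every transition probability, cost, and utility below is a made-up placeholder, not one of the study's registry-based inputs.

```python
import numpy as np

def run_markov(p_revise1, p_revise2, cost_surgery, utility,
               horizon=20, disc=0.03):
    """Discounted cost and QALYs for one strategy over an annual-cycle
    cohort model with states [primary, revision1, revision2, dead]."""
    p_die = 0.02  # assumed background annual mortality
    T = np.array([  # row-stochastic transition matrix
        [1 - p_revise1 - p_die, p_revise1,             0.0,       p_die],
        [0.0,                   1 - p_revise2 - p_die, p_revise2, p_die],
        [0.0,                   0.0,                   1 - p_die, p_die],
        [0.0,                   0.0,                   0.0,       1.0],
    ])
    dist = np.array([1.0, 0.0, 0.0, 0.0])  # cohort starts post-primary
    cost, qaly = cost_surgery, 0.0         # index procedure paid up front
    for year in range(1, horizon + 1):
        newly_revised = dist[0] * p_revise1 + dist[1] * p_revise2
        dist = dist @ T
        d = (1 + disc) ** -year
        cost += newly_revised * cost_surgery * d  # each revision re-incurs surgery cost
        qaly += (1 - dist[3]) * utility * d       # utility accrues to survivors
    return cost, qaly

# Hypothetical inputs: UKA cheaper with slightly higher utility but a
# higher annual revision risk than TKA.
cost_uka, qaly_uka = run_markov(0.015, 0.03, 7000.0, 0.80)
cost_tka, qaly_tka = run_markov(0.008, 0.03, 9000.0, 0.78)
```

With these placeholder inputs UKA dominates (lower cost, more QALYs); the study's point about sensitivity to clinical effectiveness corresponds to shrinking the utility gap until the ranking flips.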

  9. Missing data in trial‐based cost‐effectiveness analysis: An incomplete journey

    PubMed Central

    Gomes, Manuel; Carpenter, James R.

    2018-01-01

    SUMMARY Cost‐effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantive challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial‐based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty‐two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost‐effectiveness data was 63% (interquartile range: 47%–81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing‐at‐random assumption. Further improvements are needed to address missing data in cost‐effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing‐at‐random assumption. PMID:29573044

  10. A preliminary study applying decision analysis to the treatment of caries in primary teeth.

    PubMed

    Tamošiūnas, Vytautas; Kay, Elizabeth; Craven, Rebecca

    2013-01-01

    To determine an optimal treatment strategy for carious deciduous teeth. Manchester Dental Hospital. Decision analysis. The likelihoods of each of the sequelae of caries in deciduous teeth were determined from the literature. The utility of the outcomes from non-treatment and treatment was then measured in 100 parents of children with caries, using a visual analogue scale. Decision analysis was performed which weighted the value of each potential outcome by the probability of its occurrence. A decision tree "fold-back" and sensitivity analysis then determined which treatment strategies, under which circumstances, offered the maximum expected utilities. The decision to leave a carious deciduous tooth unrestored attracted a maximum utility of 76.65, and the overall expected utility for the decision "restore" was 73.27. The decisions to restore or not to restore carious deciduous teeth are therefore of almost equal value. The decision is, however, highly sensitive to the utility value assigned to the advent of pain by the patient. There is no clear advantage to be gained by restoring deciduous teeth if patients' evaluations of outcomes are taken into account. Avoidance of pain and avoidance of procedures which are viewed as unpleasant by parents should be key determinants of clinical decision making about carious deciduous teeth.
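
The fold-back step weights each outcome utility by its probability and keeps the strategy with the larger expected utility. A tiny illustration with made-up probabilities and utilities (not the study's values):

```python
def expected_utility(branches):
    """Fold back one strategy: probability-weighted mean of its outcome
    utilities. branches is a list of (probability, utility) pairs."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * u for p, u in branches)

# Illustrative branches: e.g. stays symptom-free vs develops pain.
leave = expected_utility([(0.7, 90.0), (0.3, 40.0)])     # no restoration
restore = expected_utility([(0.85, 80.0), (0.15, 40.0)])  # restoration
best = "leave" if leave > restore else "restore"
```

A one-way sensitivity analysis then re-runs this fold-back while varying one input (e.g. the pain utility) to find the value at which `best` switches.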

  11. Costs and Effects of a Telephonic Diabetes Self-Management Support Intervention Using Health Educators

    PubMed Central

    Schechter, Clyde B.; Walker, Elizabeth A.; Ortega, Felix M.; Chamany, Shadi; Silver, Lynn D.

    2015-01-01

    Background Self-management is crucial to successful glycemic control in patients with diabetes, yet it requires patients to initiate and sustain complicated behavioral changes. Support programs can improve glycemic control, but may be expensive to implement. We report here an analysis of the costs of a successful telephone-based self-management support program delivered by lay health educators utilizing a municipal health department A1c registry, and relate them to near-term effectiveness. Methods Costs of implementation were assessed by micro-costing of all resources used. Per-capita costs and cost-effectiveness ratios from the perspective of the service provider are estimated for net A1c reduction, and percentages of patients achieving A1c reductions of 0.5 and 1.0 percentage points. One-way sensitivity analyses of key cost elements, and a Monte Carlo sensitivity analysis are reported. Results The telephone intervention was provided to 443 people at a net cost of $187.61 each. Each percentage point of net A1c reduction was achieved at a cost of $464.41. Labor costs were the largest component of costs, and cost-effectiveness was most sensitive to the wages paid to the health educators. Conclusions Effective telephone-based self-management support for people in poor diabetes control can be delivered by health educators at moderate cost relative to the gains achieved. The costs of doing so are most sensitive to the prevailing wage for the health educators. PMID:26750743

  12. Costs and effects of a telephonic diabetes self-management support intervention using health educators.

    PubMed

    Schechter, Clyde B; Walker, Elizabeth A; Ortega, Felix M; Chamany, Shadi; Silver, Lynn D

    2016-03-01

    Self-management is crucial to successful glycemic control in patients with diabetes, yet it requires patients to initiate and sustain complicated behavioral changes. Support programs can improve glycemic control, but may be expensive to implement. We report here an analysis of the costs of a successful telephone-based self-management support program delivered by lay health educators utilizing a municipal health department A1c registry, and relate them to near-term effectiveness. Costs of implementation were assessed by micro-costing of all resources used. Per-capita costs and cost-effectiveness ratios from the perspective of the service provider are estimated for net A1c reduction, and percentages of patients achieving A1c reductions of 0.5 and 1.0 percentage points. One-way sensitivity analyses of key cost elements, and a Monte Carlo sensitivity analysis are reported. The telephone intervention was provided to 443 people at a net cost of $187.61 each. Each percentage point of net A1c reduction was achieved at a cost of $464.41. Labor costs were the largest component of costs, and cost-effectiveness was most sensitive to the wages paid to the health educators. Effective telephone-based self-management support for people in poor diabetes control can be delivered by health educators at moderate cost relative to the gains achieved. The costs of doing so are most sensitive to the prevailing wage for the health educators. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. CLARREO Cornerstone of the Earth Observing System: Measuring Decadal Change Through Accurate Emitted Infrared and Reflected Solar Spectra and Radio Occultation

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    2010-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is one of four Tier 1 missions recommended by the recent NRC Decadal Survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to provide accurate, broadly acknowledged climate records that are used to enable validated long-term climate projections that become the foundation for informed decisions on mitigation and adaptation policies that address the effects of climate change on society. The CLARREO mission accomplishes this critical objective through rigorous SI traceable decadal change observations that are sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. These same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. For the first time CLARREO will make highly accurate, global, SI-traceable decadal change observations sensitive to the most critical, but least understood, climate forcings, responses, and feedbacks. The CLARREO breakthrough is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. The required accuracy levels are determined so that climate trend signals can be detected against a background of naturally occurring variability. Climate system natural variability therefore determines what level of accuracy is overkill, and what level is critical to obtain. In this sense, the CLARREO mission requirements are considered optimal from a science value perspective. The accuracy for decadal change traceability to SI standards includes uncertainties associated with instrument calibration, satellite orbit sampling, and analysis methods. 
Unlike most space missions, the CLARREO requirements are driven not by the instantaneous accuracy of the measurements, but by accuracy in the large time/space scale averages that are key to understanding decadal changes.

  14. Adaptation of cone pigments found in green rods for scotopic vision through a single amino acid mutation

    PubMed Central

    Kojima, Keiichi; Matsutani, Yuki; Yamashita, Takahiro; Yanagawa, Masataka; Imamoto, Yasushi; Yamano, Yumiko; Wada, Akimori; Hisatomi, Osamu; Nishikawa, Kanto; Sakurai, Keisuke; Shichida, Yoshinori

    2017-01-01

Most vertebrate retinas contain a single type of rod for scotopic vision and multiple types of cones for photopic and color vision. The retinas of certain amphibian species uniquely contain two types of rods: red rods, which express rhodopsin, and green rods, which express a blue-sensitive cone pigment (M1/SWS2 group). Spontaneous activation of rhodopsin induced by thermal isomerization of the retinal chromophore has been suggested to contribute to the rod's background noise, which limits the visual threshold for scotopic vision. Therefore, rhodopsin must exhibit a low thermal isomerization rate compared with cone visual pigments to adapt to scotopic conditions. In this study, we determined whether amphibian blue-sensitive cone pigments in green rods exhibit low thermal isomerization rates to act as rhodopsin-like pigments for scotopic vision. Anura blue-sensitive cone pigments exhibit low thermal isomerization rates similar to rhodopsin, whereas Urodela pigments exhibit high rates like other vertebrate cone pigments present in cones. Furthermore, by mutational analysis, we identified a key amino acid residue, Thr47, that is responsible for the low thermal isomerization rates of Anura blue-sensitive cone pigments. These results strongly suggest that, through this mutation, anurans acquired special blue-sensitive cone pigments in their green rods, which could form the molecular basis for scotopic color vision with normal red rods containing green-sensitive rhodopsin. PMID:28484015

  15. Risk-sensitive reinforcement learning.

    PubMed

    Shen, Yun; Tobia, Michael J; Sommer, Tobias; Obermayer, Klaus

    2014-07-01

We derive a family of risk-sensitive reinforcement learning methods for agents who face sequential decision-making tasks in uncertain environments. By applying a utility function to the temporal difference (TD) error, nonlinear transformations are effectively applied not only to the received rewards but also to the true transition probabilities of the underlying Markov decision process. When appropriate utility functions are chosen, the agents' behaviors express key features of human behavior as predicted by prospect theory (Kahneman & Tversky, 1979), for example, different risk preferences for gains and losses, as well as the shape of subjective probability curves. We derive a risk-sensitive Q-learning algorithm, which is necessary for modeling human behavior when transition probabilities are unknown, and prove its convergence. As a proof of principle for the applicability of the new framework, we apply it to quantify human behavior in a sequential investment task. We find that the risk-sensitive variant provides a significantly better fit to the behavioral data and that it leads to an interpretation of the subject's responses that is indeed consistent with prospect theory. The analysis of simultaneously measured fMRI signals shows a significant correlation of the risk-sensitive TD error with BOLD signal change in the ventral striatum. In addition, we find a significant correlation of the risk-sensitive Q-values with neural activity in the striatum, cingulate cortex, and insula that is not present if standard Q-values are used.
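The core mechanism described above, applying a utility function to the TD error rather than to the rewards themselves, can be sketched in a few lines. This is an editorial reconstruction, not the authors' implementation: the piecewise-linear, loss-averse utility (parameter `lam`) and the toy `env_step` interface are assumptions chosen to echo prospect theory's asymmetric treatment of gains and losses.

```python
import random

def utility(delta, lam=2.0):
    # Prospect-theory-style utility applied to the TD error:
    # losses (negative delta) are weighted more heavily when lam > 1.
    return delta if delta >= 0 else lam * delta

def risk_sensitive_q_learning(env_step, n_states, n_actions,
                              episodes=200, alpha=0.1, gamma=0.95,
                              epsilon=0.1):
    # env_step(s, a) -> (next_state, reward, done); epsilon-greedy control.
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            a = (random.randrange(n_actions) if random.random() < epsilon
                 else max(range(n_actions), key=lambda a: Q[s][a]))
            s2, r, done = env_step(s, a)
            target = r if done else r + gamma * max(Q[s2])
            delta = target - Q[s][a]           # standard TD error
            Q[s][a] += alpha * utility(delta)  # utility-transformed update
            s = s2
    return Q
```

With `lam > 1`, negative TD errors pull Q-values down faster than positive errors push them up, so the learned policy is biased away from risky actions.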

  16. [Assessment of the validity and utility of the Beijing questionnaire as a tool to evaluate for obstructive sleep apnea hypopnea syndrome].

    PubMed

    Wang, X T; Gao, L M; Xu, W; Ding, X

    2016-10-20

    Objective: To test the Beijing questionnaire as a means of identifying patients with obstructive sleep apnea hypopnea syndrome (OSAHS). Method: The Beijing questionnaire is designed as an exploratory tool consisting of 11 questions for patients with obstructive sleep apnea hypopnea, targeting key symptoms including snoring, apneas, daytime sleepiness, hypertension and overweight. Questionnaires were given to 1336 female participants aged ≥40 years living in communities and to 198 adult male subjects visiting clinics. After factor analysis, reliability checking and internal consistency testing, 59 female and 198 male subjects underwent sleep studies. Correlation analysis was performed between Beijing questionnaire scores and the apnea-hypopnea index from in-laboratory polysomnography. Receiver operating characteristic curves were constructed to determine optimal sensitivity and specificity. Twenty-four male subjects were recorded in the sleep laboratory again after surgery. Result: Factor analysis reduced the 11 questions of the scale to four common factors, as designed: snoring, apneas, other symptoms and risk factors. Cronbach's α coefficient of the scale reached 0.7. There was an acceptable level of test-retest reliability (r=0.619, P<0.01). The apnea-hypopnea indices were significantly correlated with Beijing questionnaire scores (P<0.01). For women, a Beijing questionnaire score of 19.5 provided a sensitivity of 74.3% and a specificity of 62.5%; for men, a score of 22.5 provided a sensitivity of 90.9% and a specificity of 54.5%. Postoperative Beijing questionnaire scores changed with the apnea-hypopnea indices. Conclusion: This questionnaire has good validity and reliability and appears to be sensitive to clinical change. Copyright© by the Editorial Department of Journal of Clinical Otorhinolaryngology Head and Neck Surgery.
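The cutoff-selection step reported above, a score threshold traded off against sensitivity and specificity, can be illustrated with a minimal sketch. The helper names and the use of Youden's J to pick the threshold are editorial assumptions; the paper does not state which criterion produced the 19.5 and 22.5 cutoffs.

```python
def sens_spec(scores, labels, cutoff):
    # labels: 1 = disease, 0 = no disease; higher score = more likely disease.
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= cutoff)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < cutoff)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < cutoff)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, labels):
    # Maximize Youden's J = sensitivity + specificity - 1 over observed scores.
    candidates = sorted(set(scores))
    return max(candidates,
               key=lambda c: sum(sens_spec(scores, labels, c)) - 1)
```

Sweeping every observed score as a candidate cutoff is exactly the discrete form of tracing a receiver operating characteristic curve.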

  17. ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.

    PubMed

    Evans, Stephanie; Alden, Kieran; Cucurull-Sanchez, Lourdes; Larminie, Christopher; Coles, Mark C; Kullberg, Marika C; Timmis, Jon

    2017-02-01

    A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.
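ASPASIA's key idea, re-initialising a model at its steady state before applying an intervention, can be sketched with a toy one-variable model. This is not ASPASIA's API: the production/decay model, parameter names, and Euler integration below are illustrative stand-ins for an SBML solver run.

```python
def simulate(k, d, x0=0.0, t_end=50.0, dt=0.01):
    # Forward-Euler integration of dx/dt = k - d*x, a minimal stand-in
    # for running an SBML reaction model to equilibrium.
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (k - d * x)
    return x

# Phase 1: run the unperturbed model to (near) steady state.
baseline = simulate(k=1.0, d=0.5)  # analytic steady state is k/d = 2.0

# Phase 2: ASPASIA-style intervention: restart from the reached steady
# state with a perturbed parameter, not from the original initial condition.
intervened = simulate(k=2.0, d=0.5, x0=baseline)  # new steady state k/d = 4.0
```

Starting phase 2 from `baseline` rather than from `x0=0` is what makes the intervention's effect independent of transient start-up behaviour, which is the calibration difficulty the abstract describes.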

  18. Thermodynamics-based Metabolite Sensitivity Analysis in metabolic networks.

    PubMed

    Kiparissides, A; Hatzimanikatis, V

    2017-01-01

    The increasing availability of large metabolomics datasets enhances the need for computational methodologies that can organize the data in a way that can lead to the inference of meaningful relationships. Knowledge of the metabolic state of a cell and how it responds to various stimuli and extracellular conditions can offer significant insight into regulatory functions and how to manipulate them. Constraint-based methods, such as Flux Balance Analysis (FBA) and Thermodynamics-based Flux Analysis (TFA), are commonly used to estimate the flow of metabolites through genome-wide metabolic networks, making it possible to identify the ranges of flux values that are consistent with the studied physiological and thermodynamic conditions. However, unless key intracellular fluxes and metabolite concentrations are known, constraint-based models lead to underdetermined problem formulations. This lack of information propagates as uncertainty in the estimation of fluxes and basic reaction properties such as the determination of reaction directionalities. Therefore, knowledge of which metabolites, if measured, would contribute the most to reducing this uncertainty can significantly improve our ability to define the internal state of the cell. In the present work we combine constraint-based modeling, Design of Experiments (DoE) and Global Sensitivity Analysis (GSA) into the Thermodynamics-based Metabolite Sensitivity Analysis (TMSA) method. TMSA ranks the metabolites comprising a metabolic network based on their ability to constrain the gamut of possible solutions to a limited, thermodynamically consistent set of internal states. TMSA is modular and can be applied to a single reaction, a metabolic pathway or an entire metabolic network. This is, to our knowledge, the first attempt to use metabolic modeling to provide a significance ranking of metabolites to guide experimental measurements. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  19. A Predictive Model to Estimate Cost Savings of a Novel Diagnostic Blood Panel for Diagnosis of Diarrhea-predominant Irritable Bowel Syndrome.

    PubMed

    Pimentel, Mark; Purdy, Chris; Magar, Raf; Rezaie, Ali

    2016-07-01

    A high incidence of irritable bowel syndrome (IBS) is associated with significant medical costs. Diarrhea-predominant IBS (IBS-D) is diagnosed on the basis of clinical presentation and diagnostic test results and procedures that exclude other conditions. This study was conducted to estimate the potential cost savings of a novel IBS diagnostic blood panel that tests for the presence of antibodies to cytolethal distending toxin B and anti-vinculin associated with IBS-D. A cost-minimization (CM) decision tree model was used to compare the costs of a novel IBS diagnostic blood panel pathway versus an exclusionary diagnostic pathway (ie, standard of care). The probability that patients proceed to treatment was modeled as a function of sensitivity, specificity, and likelihood ratios of the individual biomarker tests. One-way sensitivity analyses were performed for key variables, and a break-even analysis was performed for the pretest probability of IBS-D. Budget impact analysis of the CM model was extrapolated to a health plan with 1 million covered lives. The CM model (base-case) predicted $509 cost savings for the novel IBS diagnostic blood panel versus the exclusionary diagnostic pathway because of the avoidance of downstream testing (eg, colonoscopy, computed tomography scans). Sensitivity analysis indicated that an increase in both positive likelihood ratios modestly increased cost savings. Break-even analysis estimated that the pretest probability of disease would need to be 0.451 to attain cost neutrality. The budget impact analysis predicted a cost savings of $3,634,006 ($0.30 per member per month). The novel IBS diagnostic blood panel may yield significant cost savings by allowing patients to proceed to treatment earlier, thereby avoiding unnecessary testing. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
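The abstract models the decision to proceed to treatment as a function of sensitivity, specificity, and likelihood ratios. The standard relationships can be sketched directly; the numeric inputs below are hypothetical illustrations, not the panel's actual operating characteristics.

```python
def positive_likelihood_ratio(sensitivity, specificity):
    # LR+ = P(test+ | disease) / P(test+ | no disease)
    return sensitivity / (1.0 - specificity)

def post_test_probability(pretest_p, lr):
    # Convert probability -> odds, apply the likelihood ratio, convert back:
    # post-test odds = pre-test odds * LR.
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Hypothetical example: 90% sensitivity, 80% specificity, 50% pretest probability.
lr_plus = positive_likelihood_ratio(0.9, 0.8)
p_after = post_test_probability(0.5, lr_plus)
```

This odds-ratio update is what links the panel's test characteristics to the pretest probability examined in the break-even analysis.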

  20. MRI-Guided Focused Ultrasound Surgery for Uterine Fibroid Treatment: A Cost-Effectiveness Analysis

    PubMed Central

    Kong, Chung Y.; Omer, Zehra B.; Pandharipande, Pari V.; Swan, J. Shannon; Srouji, Serene; Gazelle, G. Scott; Fennessy, Fiona M.

    2015-01-01

    Objective To evaluate the cost-effectiveness of a treatment strategy for symptomatic uterine fibroids that employs Magnetic Resonance guided Focused Ultrasound (MRgFUS) as a first-line therapy relative to uterine artery embolization (UAE) or abdominal hysterectomy (HYST). Materials and Methods We developed a decision-analytic model to compare the cost-effectiveness of three treatment strategies: MRgFUS, UAE and HYST. Short and long-term utilities specific to each treatment were incorporated, allowing us to account for differences in quality of life across the strategies considered. Lifetime costs and quality-adjusted life-years (QALYs) were calculated for each strategy. An incremental cost-effectiveness analysis was performed, using a societal willingness-to-pay (WTP) threshold of $50,000 per QALY to designate a strategy as cost-effective. Sensitivity analysis was performed on all key model parameters. Results In the base-case analysis, in which treatment for symptomatic fibroids started at age 40, UAE was the most effective and expensive strategy (22.81 QALYs, $22,164), followed by MRgFUS (22.80 QALYs, $19,796) and HYST (22.60 QALYs, $13,291). MRgFUS was cost-effective relative to HYST, with an associated incremental cost-effectiveness ratio (ICER) of $33,110/QALY. MRgFUS was also cost-effective relative to UAE – the ICER of UAE relative to MRgFUS ($270,057) far exceeded the WTP threshold of $50,000/QALY. In sensitivity analysis, results were robust to changes in most parameters, but were sensitive to changes in probabilities of recurrence and symptom relief following certain procedures, and quality of life associated with symptomatic fibroids. Conclusions MRgFUS is cost-effective relative to both UAE and hysterectomy for the treatment of women with symptomatic fibroids. PMID:25055272
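The ICER arithmetic behind these comparisons can be sketched directly from the base-case numbers quoted above. Small differences from the reported ICERs ($33,110/QALY and $270,057/QALY) arise because the costs and QALYs in the abstract are rounded.

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    # Incremental cost-effectiveness ratio: extra dollars per extra QALY
    # of the new strategy relative to the reference strategy.
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

WTP = 50_000  # societal willingness-to-pay threshold ($/QALY)

# Base-case values from the abstract (rounded): cost, QALYs per strategy.
mrgfus_vs_hyst = icer(19_796, 22.80, 13_291, 22.60)  # below WTP: cost-effective
uae_vs_mrgfus = icer(22_164, 22.81, 19_796, 22.80)   # far above WTP
```

A strategy is deemed cost-effective when its ICER against the next-cheapest non-dominated option falls below the willingness-to-pay threshold, which is why MRgFUS is favored over both comparators here.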

  1. Central GLP-2 enhances hepatic insulin sensitivity via activating PI3K signaling in POMC neurons

    USDA-ARS?s Scientific Manuscript database

    Glucagon-like peptides (GLP-1/GLP-2) are coproduced and highlighted as key modulators to improve glucose homeostasis and insulin sensitivity after bariatric surgery. However, it is unknown if CNS GLP-2 plays any physiological role in the control of glucose homeostasis and insulin sensitivity. We sho...

  2. Microstructure-Sensitive Modeling of High Cycle Fatigue (Preprint)

    DTIC Science & Technology

    2009-03-01

    Keywords: microplasticity, microstructure-sensitive modeling, high cycle fatigue, fatigue variability. Air Force Research Laboratory, Wright-Patterson Air Force Base, Ohio 45433. …cyclic microplasticity plays a key role in modeling fatigue resistance. Unlike effective properties such as elastic stiffness, fatigue is…

  3. Crossmodal Semantic Priming by Naturalistic Sounds and Spoken Words Enhances Visual Sensitivity

    ERIC Educational Resources Information Center

    Chen, Yi-Chuan; Spence, Charles

    2011-01-01

    We propose a multisensory framework based on Glaser and Glaser's (1989) general reading-naming interference model to account for the semantic priming effect by naturalistic sounds and spoken words on visual picture sensitivity. Four experiments were designed to investigate two key issues: First, can auditory stimuli enhance visual sensitivity when…

  4. Evaluation of Uncertainty and Sensitivity in Environmental Modeling at a Radioactive Waste Management Site

    NASA Astrophysics Data System (ADS)

    Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.

    2002-05-01

    Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. 
For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in complex, high-dimensional ecosystem models such as the RWMS model, the dynamics of the system can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, non-linearities, and non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used for global sensitivity analysis in these high-dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.
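Variance-based global measures such as those estimated by FAST rank inputs by the fraction of output variance each explains, S_i = Var(E[Y|X_i]) / Var(Y). A crude binning estimator on a toy model (an editorial illustration, unrelated to the RWMS model or to FAST's actual Fourier machinery) looks like this:

```python
import numpy as np

def first_order_indices(X, y, bins=20):
    # Estimate S_i = Var(E[y | X_i]) / Var(y) by binning each input and
    # averaging y within bins: a crude stand-in for FAST/Sobol estimators.
    total_var = y.var()
    out = []
    for i in range(X.shape[1]):
        edges = np.quantile(X[:, i], np.linspace(0, 1, bins + 1))
        idx = np.clip(np.searchsorted(edges, X[:, i], side="right") - 1,
                      0, bins - 1)
        means = np.array([y[idx == b].mean() for b in range(bins)])
        counts = np.array([(idx == b).sum() for b in range(bins)])
        var_cond = np.average((means - y.mean()) ** 2, weights=counts)
        out.append(var_cond / total_var)
    return out

# Toy model: y depends strongly on x1, weakly (non-linearly) on x2, barely on x3.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20_000, 3))
y = 4 * X[:, 0] + X[:, 1] ** 2 + 0.1 * X[:, 2]
S = first_order_indices(X, y)  # expect S[0] >> S[1] > S[2]
```

Because the estimator only conditions on one input at a time, it captures non-linear but not interaction effects, which is exactly the limitation that motivates higher-order and total-effect indices in full global sensitivity analyses.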

  5. The Key Role of Pain Catastrophizing in the Disability of Patients with Acute Back Pain.

    PubMed

    Ramírez-Maestre, C; Esteve, R; Ruiz-Párraga, G; Gómez-Pérez, L; López-Martínez, A E

    2017-04-01

    This study investigated the role of anxiety sensitivity, resilience, pain catastrophizing, depression, pain fear-avoidance beliefs, and pain intensity in patients with acute back pain-related disability. Two hundred and thirty-two patients with acute back pain completed questionnaires on anxiety sensitivity, resilience, pain catastrophizing, fear-avoidance beliefs, depression, pain intensity, and disability. A structural equation modelling analysis revealed that anxiety sensitivity was associated with pain catastrophizing, and resilience was associated with lower levels of depression. Pain catastrophizing was positively associated with fear-avoidance beliefs and pain intensity. Depression was associated with fear-avoidance beliefs, but was not associated with pain intensity. Finally, catastrophizing, fear-avoidance beliefs, and pain intensity were positively and significantly associated with acute back pain-related disability. Although fear-avoidance beliefs and pain intensity were associated with disability, the results showed that pain catastrophizing was a central variable in the pain experience and had significant direct associations with disability when pain was acute. Anxiety sensitivity appeared to be an important antecedent of catastrophizing, whereas the influence of resilience on the acute back pain experience was limited to its relationship with depression.

  6. Estimating the neutrally buoyant energy density of a Rankine-cycle/fuel-cell underwater propulsion system

    NASA Astrophysics Data System (ADS)

    Waters, Daniel F.; Cadou, Christopher P.

    2014-02-01

    A unique requirement of underwater vehicles' power/energy systems is that they remain neutrally buoyant over the course of a mission. Previous work published in the Journal of Power Sources reported gross, as opposed to neutrally buoyant, energy densities of an integrated solid oxide fuel cell/Rankine-cycle power system based on the exothermic reaction of aluminum with seawater. This paper corrects this shortcoming by presenting a model for estimating system mass and using it to update the key findings of the original paper in the context of the neutral buoyancy requirement. It also presents an expanded sensitivity analysis to illustrate the influence of various design and modeling assumptions. While energy density is very sensitive to turbine efficiency (sensitivity coefficient in excess of 0.60), it is relatively insensitive to all other major design parameters (sensitivity coefficients < 0.15) like compressor efficiency, inlet water temperature, scaling methodology, etc. The neutral buoyancy requirement introduces a significant (∼15%) energy density penalty but overall the system still appears to offer five- to eight-fold improvements in energy density (i.e., vehicle range/endurance) over present battery-based technologies.
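The sensitivity coefficients quoted above (e.g. >0.60 for turbine efficiency) are normalized derivatives: the fractional change in energy density per fractional change in a parameter. A finite-difference sketch follows, using a deliberately invented toy model rather than the paper's actual system model; the parameter names and the 500 Wh/kg-scale constant are assumptions.

```python
def sensitivity_coefficient(f, params, name, rel_step=0.01):
    # Normalized sensitivity S = (p / f) * (df / dp), estimated with a
    # central difference at +/- rel_step fractional perturbation.
    up, down = dict(params), dict(params)
    up[name] *= 1 + rel_step
    down[name] *= 1 - rel_step
    return (f(up) - f(down)) / (2 * rel_step * f(params))

# Hypothetical toy energy-density model (NOT the paper's): output scales
# linearly with turbine efficiency and only weakly with inlet temperature.
def energy_density(p):
    return 500.0 * p["turbine_eff"] * (1 + 0.1 * (p["inlet_temp"] - 300) / 300)
```

For the toy model, the coefficient for `turbine_eff` is exactly 1 (purely multiplicative dependence), while `inlet_temp` comes out near 0.1, mirroring the paper's contrast between dominant and minor parameters.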

  7. Site-specific climate analysis elucidates revegetation challenges for post-mining landscapes in eastern Australia

    NASA Astrophysics Data System (ADS)

    Audet, P.; Arnold, S.; Lechner, A. M.; Baumgartl, T.

    2013-10-01

    In eastern Australia, the availability of water is critical for the successful rehabilitation of post-mining landscapes, and the climatic characteristics of this diverse geographical region are closely defined by factors such as erratic rainfall and periods of drought and flooding. Despite this, specific metrics of climate patterning are seldom incorporated into the initial design of current post-mining land rehabilitation strategies. Our study proposes that a few common rainfall parameters can be combined and rated using arbitrary rainfall thresholds to characterise bioregional climate sensitivity relevant to the rehabilitation of these landscapes. This approach included assessments of annual rainfall depth, average recurrence interval of prolonged low-intensity rainfall, average recurrence intervals of short or prolonged high-intensity events, median period without rain (or water-deficit) and the standard deviation of this period, in order to address climatic factors such as total water availability, seasonality and intensity, which were selected as potential proxies of both short- and long-term biological sensitivity to climate within the context of post-disturbance ecological development and recovery. Following our survey of available climate data, we derived site "climate sensitivity" indexes and compared the performance of 9 ongoing mine sites: Weipa, Mt. Isa and Cloncurry, Eromanga, Kidston, the Bowen Basin (Curragh), Tarong, North Stradbroke Island, and the Newnes Plateau. The sites were then ranked from most to least sensitive and compared with natural bioregional patterns of vegetation density using mean NDVI. It was determined that regular rainfall and relatively short periods of water-deficit were key characteristics of sites having less sensitivity to climate, as found among the relatively more temperate inland mining locations, whereas high rainfall variability, frequently occurring high-intensity events, and (or) prolonged seasonal drought were primary indicators of sites having greater sensitivity to climate, as found among the semi-arid central-inland sites. Overall, the manner in which these climatic factors are identified and ultimately addressed by land managers and rehabilitation practitioners could be a key determinant of achievable success at given locations during the planning stages of rehabilitation design.

  8. Sensitivity Analysis as a Tool to assess Energy-Water Nexus in India

    NASA Astrophysics Data System (ADS)

    Priyanka, P.; Banerjee, R.

    2017-12-01

    Rapid urbanization, population growth and related structural changes within the economy of a developing country act as stressors on energy and water demand, forming a well-established energy-water nexus. The energy-water nexus has been thoroughly studied at various spatial scales (city, river basin and national) to guide different stakeholders in the sustainable management of energy and water. However, temporal dimensions of the energy-water nexus at the national level have not been thoroughly investigated because of the unavailability of relevant time-series data. In this study we investigated the energy-water nexus at the national level using environmentally-extended input-output tables for the Indian economy (2004-2013) as provided by the EORA database. Perturbation-based sensitivity analysis is proposed to highlight the critical nodes of interaction among economic sectors, which are further linked to the synergistic effects of energy and water consumption. Technology changes (interpreted as changes in the values of nodes) result in modification of the interactions among economic sectors, and synergy is affected through direct as well as indirect effects. Indirect effects are not easily understood through preliminary examination of the data, hence sensitivity analysis within an input-output framework is important to understand them. Furthermore, time-series data help develop an understanding of the dynamics of these synergistic effects. We identified the key sectors and technology changes for the Indian economy, providing better decision support for policy makers on the sustainable use of energy-water resources in India.
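Within an input-output framework, a perturbation-based sensitivity of total output to a single technical coefficient can be sketched as below. The 3-sector coefficient matrix and final-demand vector are invented for illustration (the study itself uses the multi-sector EORA tables), and the finite-difference perturbation is an assumed stand-in for whatever scheme the authors employ.

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix A (a_ij = input from
# sector i needed per unit output of sector j) and final demand d.
A = np.array([[0.10, 0.30, 0.05],
              [0.20, 0.05, 0.25],
              [0.05, 0.10, 0.10]])
d = np.array([100.0, 150.0, 80.0])

def total_output(A, d):
    # Leontief solution x = (I - A)^-1 d, solved without forming the inverse.
    return np.linalg.solve(np.eye(len(d)) - A, d)

def perturbation_sensitivity(A, d, i, j, eps=1e-4):
    # Change in economy-wide total output per unit change in coefficient a_ij.
    Ap = A.copy()
    Ap[i, j] += eps
    return (total_output(Ap, d).sum() - total_output(A, d).sum()) / eps
```

Ranking the coefficients by this sensitivity is one way to identify the "critical nodes" of inter-sector interaction the abstract refers to, since a large value means a small technology change in that cell propagates strongly through direct and indirect linkages.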

  9. Fecal microbiota manipulation prevents dysbiosis and alcohol-induced liver injury in mice.

    PubMed

    Ferrere, Gladys; Wrzosek, Laura; Cailleux, Frédéric; Turpin, Williams; Puchois, Virginie; Spatz, Madeleine; Ciocan, Dragos; Rainteau, Dominique; Humbert, Lydie; Hugot, Cindy; Gaudin, Françoise; Noordine, Marie-Louise; Robert, Véronique; Berrebi, Dominique; Thomas, Muriel; Naveau, Sylvie; Perlemuter, Gabriel; Cassard, Anne-Marie

    2017-04-01

    Alcoholic liver disease (ALD) is a leading cause of liver failure and mortality. In humans, severe alcoholic hepatitis is associated with key changes to intestinal microbiota (IM), which influences individual sensitivity to develop advanced ALD. We used the different susceptibility to ALD observed in two distinct animal facilities to test the efficiency of two complementary strategies (fecal microbiota transplantation and prebiotic treatment) to reverse dysbiosis and prevent ALD. Mice were fed alcohol in two distinct animal facilities with a Lieber-DeCarli diet. Fecal microbiota transplantation was performed with fresh feces from alcohol-resistant donor mice to alcohol-sensitive receiver mice three times a week. Another group of mice received pectin during the entire alcohol consumption period. Ethanol induced steatosis and liver inflammation, which were associated with disruption of gut homeostasis, in alcohol-sensitive, but not alcohol-resistant, mice. IM analysis showed that the proportion of Bacteroides was specifically lower in alcohol-sensitive mice (p<0.05). Principal coordinate analysis showed that the IM of sensitive and resistant mice clustered differently. We targeted IM using two different strategies to prevent alcohol-induced liver lesions: (1) pectin treatment, which induced major modifications of the IM, (2) fecal microbiota transplantation, which resulted in an IM very close to that of resistant donor mice in the sensitive recipient mice. Both methods prevented steatosis, liver inflammation, and restored gut homeostasis. Manipulation of IM can prevent alcohol-induced liver injury. The IM should be considered as a new therapeutic target in ALD. Sensitivity to alcoholic liver disease (ALD) is driven by intestinal microbiota in alcohol-fed mice. Treatment of mice with alcohol-induced liver lesions by fecal transplant from alcohol-fed mice resistant to ALD or with prebiotic (pectin) prevents ALD.
These findings open new possibilities for treatment of human ALD through intestinal microbiota manipulation. Copyright © 2016 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  10. Connectome sensitivity or specificity: which is more important?

    PubMed

    Zalesky, Andrew; Fornito, Alex; Cocchi, Luca; Gollo, Leonardo L; van den Heuvel, Martijn P; Breakspear, Michael

    2016-11-15

    Connectomes with high sensitivity and high specificity are unattainable with current axonal fiber reconstruction methods, particularly at the macro-scale afforded by magnetic resonance imaging. Tensor-guided deterministic tractography yields sparse connectomes that are incomplete and contain false negatives (FNs), whereas probabilistic methods steered by crossing-fiber models yield dense connectomes, often with low specificity due to false positives (FPs). Densely reconstructed probabilistic connectomes are typically thresholded to improve specificity at the cost of a reduction in sensitivity. What is the optimal tradeoff between connectome sensitivity and specificity? We show empirically and theoretically that specificity is paramount. Our evaluations of the impact of FPs and FNs on empirical connectomes indicate that specificity is at least twice as important as sensitivity when estimating key properties of brain networks, including topological measures of network clustering, network efficiency and network modularity. Our asymptotic analysis of small-world networks with idealized modular structure reveals that as the number of nodes grows, specificity becomes exactly twice as important as sensitivity to the estimation of the clustering coefficient. For the estimation of network efficiency, the relative importance of specificity grows linearly with the number of nodes. The greater importance of specificity is due to FPs occurring more prevalently between network modules rather than within them. These spurious inter-modular connections have a dramatic impact on network topology. We argue that efforts to maximize the sensitivity of connectome reconstruction should be realigned with the need to map brain networks with high specificity. Copyright © 2016 Elsevier Inc. All rights reserved.
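The paper's central claim, that false positives damage topological estimates more than false negatives, can be reproduced in miniature on an idealized modular graph (two disconnected cliques; this construction and the equal edge budgets are editorial choices, not the authors' data):

```python
import numpy as np

def transitivity(A):
    # Global clustering: closed ordered triplets / all ordered triplets,
    # computed from the adjacency matrix A.
    A2 = A @ A
    closed = np.trace(A2 @ A)           # 6 x number of triangles
    triplets = A2.sum() - np.trace(A2)  # ordered paths of length 2
    return closed / triplets

def two_cliques(n):
    # Two disconnected n-node cliques: an idealized modular network.
    A = np.zeros((2 * n, 2 * n))
    for m in (0, n):
        A[m:m + n, m:m + n] = 1.0
    np.fill_diagonal(A, 0.0)
    return A

n = 10
base = two_cliques(n)

# False positives: n spurious inter-module edges (a perfect matching).
fp = base.copy()
for i in range(n):
    fp[i, n + i] = fp[n + i, i] = 1.0

# False negatives: delete the same number of genuine intra-module edges.
fn = base.copy()
for m in (0, n):
    for i in range(0, n, 2):
        fn[m + i, m + i + 1] = fn[m + i + 1, m + i] = 0.0
```

Here ten spurious inter-module edges lower the clustering coefficient more than deleting ten genuine intra-module edges does, because each false positive creates many open triplets that close no triangles, echoing the paper's finding that inter-modular false positives dominate the topological error.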

  11. KEY COMPARISON: Report on the Regional Comparison COOMET.AUV.A-K3

    NASA Astrophysics Data System (ADS)

    Barrera-Figueroa, Salvador; Nielsen, Lars; Rasmussen, Knud

    2007-01-01

    COOMET.AUV.A-K3 is a Regional Comparison that supplements the Key Comparison CCAUV.A-K3 organized by the CCAUV. The participating NMIs are GUM (Poland), INM (Romania), VNIIFTRI (Russia) and DP-NDI 'Systema' (Ukraine). The role of Pilot laboratory was undertaken by DPLADFM (Denmark). The measurements took place between May 2005 and February 2006. The time schedule was organized in a single star configuration. Initially, two LS2aP microphones were circulated. However, a sudden change in the sensitivity of one of them forced the inclusion of an additional microphone. Nevertheless, the analysis was performed on all microphones involved. This report includes the measurement results from the participants, information about their calibration methods, and the analysis leading to the assignment of degrees of equivalence and the link to the CCAUV.A-K3. The final report has been peer-reviewed and approved for publication by the CCAUV, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).

  12. What can eye movements tell us about Symbol Digit substitution by patients with schizophrenia?

    PubMed

    Elahipanah, Ava; Christensen, Bruce K; Reingold, Eyal M

    2011-04-01

    Substitution tests are sensitive to cognitive impairment and reliably discriminate patients with schizophrenia from healthy individuals better than most other neuropsychological instruments. However, due to their multifaceted nature, substitution test scores cannot pinpoint the specific cognitive deficits that lead to poor performance. The current study investigated eye movements during performance on a substitution test in order to better understand what aspect of substitution test performance underlies schizophrenia-related impairment. Twenty-five patients with schizophrenia and 25 healthy individuals performed a computerized version of the Symbol Digit Modalities Test while their eye movements were monitored. As expected, patients achieved lower overall performance scores. Moreover, analysis of participants' eye movements revealed that patients spent more time searching for the target symbol every time they visited the key area. Patients also made more visits to the key area for each response that they made. Regression analysis suggested that patients' impaired performance on substitution tasks is primarily related to a less efficient visual search and, secondarily, to impaired memory. Copyright © 2010 Elsevier B.V. All rights reserved.

  13. Comparing methods for analysis of biomedical hyperspectral image data

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas J.; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter F.; Annamdevula, Naga S.; Rich, Thomas C.

    2017-02-01

    Over the past 2 decades, hyperspectral imaging technologies have been adapted to address the need for molecule-specific identification in the biomedical imaging field. Applications have ranged from single-cell microscopy to whole-animal in vivo imaging and from basic research to clinical systems. Enabling this growth has been the availability of faster, more effective hyperspectral filtering technologies and more sensitive detectors. Hence, the potential for growth of biomedical hyperspectral imaging is high, and many hyperspectral imaging options are already commercially available. However, despite the growth in hyperspectral technologies for biomedical imaging, little work has been done to aid users of hyperspectral imaging instruments in selecting appropriate analysis algorithms. Here, we present an approach for comparing the effectiveness of spectral analysis algorithms by combining experimental image data with a theoretical "what if" scenario. This approach allows us to quantify several key outcomes that characterize a hyperspectral imaging study: linearity of sensitivity, positive detection cut-off slope, dynamic range, and false positive events. We present results of using this approach for comparing the effectiveness of several common spectral analysis algorithms for detecting weak fluorescent protein emission in the midst of strong tissue autofluorescence. Results indicate that this approach should be applicable to a very wide range of applications, allowing a quantitative assessment of the effectiveness of the combined biology, hardware, and computational analysis for detecting a specific molecular signature.
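    As a sketch of the kind of algorithm being compared, linear least-squares spectral unmixing (one common analysis approach; the endmember spectra, abundances and noise level below are hypothetical) can be used to check the "linearity of sensitivity" outcome for a weak fluorophore over a strong autofluorescence background:

```python
import numpy as np

# Hypothetical endmember spectra on a 40-band wavelength axis:
# a narrow GFP-like emission and a broad autofluorescence background.
wl = np.linspace(480, 600, 40)
gfp = np.exp(-0.5 * ((wl - 510) / 12.0) ** 2)
autofl = np.exp(-0.5 * ((wl - 540) / 45.0) ** 2)
E = np.column_stack([gfp, autofl])          # endmember matrix (bands x 2)

rng = np.random.default_rng(0)
true_abund = np.linspace(0.0, 0.2, 11)      # weak label, strong background
est = []
for a in true_abund:
    pixel = a * gfp + 1.0 * autofl + rng.normal(0, 1e-3, wl.size)
    coef, *_ = np.linalg.lstsq(E, pixel, rcond=None)
    est.append(coef[0])                     # estimated label abundance
est = np.array(est)

# Linearity of sensitivity: slope of estimated vs. true abundance
slope = np.polyfit(true_abund, est, 1)[0]
print(round(slope, 3))
```

A slope near 1 over the tested range indicates a linear detection response; repeating this for several unmixing algorithms gives the kind of head-to-head comparison the paper describes.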

  14. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    Accurate simulation of interior aerodynamic noise is an important problem in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor in controlling car interior aerodynamic noise at high frequency and high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are applied to the model to obtain valid car interior noise results. The model is a solid foundation for further optimization of car interior noise control. After the SEA analysis identifies the subsystems whose power contributions to car interior noise are most sensitive, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound-insulation and damping characteristics. Vehicle testing results show that the interior acoustic performance can be improved by using the detailed SEA model, which comprises more than 80 subsystems, together with the unsteady aerodynamic pressure calculation on body surfaces and improved sound-insulation/damping materials. A reduction of more than 2 dB is achieved at center frequencies above 800 Hz in the spectrum. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and sensitivity analysis of acoustic contributions.

  15. Effect of increasing body condition on key regulators of fat metabolism in subcutaneous adipose tissue depot and circulation of nonlactating dairy cows.

    PubMed

    Locher, L; Häussler, S; Laubenthal, L; Singh, S P; Winkler, J; Kinoshita, A; Kenéz, Á; Rehage, J; Huber, K; Sauerwein, H; Dänicke, S

    2015-02-01

    In response to negative energy balance, overconditioned cows mobilize more body fat than thin cows and subsequently are prone to develop metabolic disorders. Changes in adipose tissue (AT) metabolism are barely investigated in overconditioned cows. Therefore, the objective was to investigate the effect of increasing body condition on key regulator proteins of fat metabolism in the subcutaneous AT and circulation of dairy cows. Nonlactating, nonpregnant dairy cows (n=8) investigated in the current study served as a model to elucidate the changes in the course of overcondition independent of physiological changes related to gestation, parturition, and lactation. Cows were fed diets with increasing portions of concentrate during the first 6 wk of the experiment until 60% was reached, which was maintained for 9 wk. Biopsy samples from AT of the subcutaneous tailhead region were collected every 8 wk, whereas blood was sampled monthly. Within the experimental period cows had an average BW gain of 243±33.3 kg. Leptin and insulin concentrations were increased until wk 12. Based on serum concentrations of glucose, insulin, and nonesterified fatty acids, surrogate indices for insulin sensitivity were calculated. High-concentrate feeding led to a decreased quantitative insulin sensitivity check index and homeostasis model assessment due to high insulin and glucose concentrations, indicating decreased insulin sensitivity. Adiponectin, an adipokine promoting insulin sensitivity, decreased in subcutaneous AT but remained unchanged in the circulation. The high-concentrate diet affected key enzymes of AT metabolism such as AMP-activated protein kinase and hormone-sensitive lipase, both expressed as the proportion of phosphorylated protein to total protein, as well as fatty acid synthase. 
The extent of phosphorylation of AMP-activated protein kinase and the protein expression of fatty acid synthase were inversely regulated throughout the experimental period, whereas the extent of phosphorylation of hormone-sensitive lipase consistently decreased with the high-concentrate diet. Overcondition in nonpregnant, nonlactating dairy cows changed the expression of key regulator proteins of AT metabolism and circulation, accompanied by impaired insulin sensitivity, which might increase the risk for metabolic disorders. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
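    The surrogate insulin sensitivity indices mentioned above have standard closed forms; a small sketch follows (the formula forms are the commonly used ones, and the example concentrations are illustrative assumptions, with glucose in mg/dL, insulin in µU/mL and NEFA in mmol/L):

```python
import math

def quicki(glucose_mg_dl, insulin_uU_ml):
    """QUICKI = 1 / (log10 glucose + log10 insulin)."""
    return 1.0 / (math.log10(glucose_mg_dl) + math.log10(insulin_uU_ml))

def rquicki(glucose_mg_dl, insulin_uU_ml, nefa_mmol_l):
    """Revised QUICKI used for cattle: adds a log10 NEFA term."""
    return 1.0 / (math.log10(glucose_mg_dl) + math.log10(insulin_uU_ml)
                  + math.log10(nefa_mmol_l))

def homa(glucose_mg_dl, insulin_uU_ml):
    """HOMA index; 405 converts glucose in mg/dL (use 22.5 for mmol/L)."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

# Illustrative (hypothetical) values before vs. during high-concentrate feeding
before = quicki(60.0, 8.0)
during = quicki(70.0, 20.0)
print(before > during)   # True: rising insulin and glucose lower QUICKI
```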

  16. HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauerdick, Lothar

    At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.

  17. Polarization variations in installed fibers and their influence on quantum key distribution systems.

    PubMed

    Ding, Yu-Yang; Chen, Hua; Wang, Shuang; He, De-Yong; Yin, Zhen-Qiang; Chen, Wei; Zhou, Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2017-10-30

    Polarization variations in installed fibers are complex and volatile, and can severely affect the performance of polarization-sensitive quantum key distribution (QKD) systems. Based on recorded data on the polarization variations of different installed fibers, we establish an analytical methodology to quantitatively evaluate the influence of polarization variations on polarization-sensitive QKD systems. Using the increase in quantum bit error rate induced by polarization variations as a key criterion, we propose two parameters - polarization drift time and required tracking speed - to characterize polarization variations. For field buried and aerial fibers of different lengths, we quantitatively evaluate the influence of polarization variations, and also provide requirements and suggestions for the polarization basis alignment modules of QKD systems deployed in different kinds of fiber.
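    For polarization-encoded QKD, the extra QBER from a basis misalignment of angle θ is commonly modeled as sin²θ. Under that assumption (the drift rate and QBER budget below are hypothetical, not the paper's field data), the drift-time parameter can be sketched as:

```python
import math

def qber_from_misalignment(theta_deg):
    """Extra QBER contributed by a polarization rotation of theta (sin^2 model)."""
    return math.sin(math.radians(theta_deg)) ** 2

def drift_time(rate_deg_per_s, qber_budget):
    """Seconds until rotation at a constant rate consumes the QBER budget."""
    theta = math.degrees(math.asin(math.sqrt(qber_budget)))
    return theta / rate_deg_per_s

# Hypothetical figures: a fiber drifting at 0.1 deg/s with a 1% QBER budget
print(round(qber_from_misalignment(10.0), 4))   # 0.0302
print(round(drift_time(0.1, 0.01), 1))          # 57.4
```

The inverse of the drift time at the tolerated misalignment angle gives the required tracking speed for the basis alignment module.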

  18. Methods of recording and analysing cough sounds.

    PubMed

    Subburaj, S; Parvez, L; Rajagopalan, T G

    1996-01-01

    Efforts have been directed to evolve a computerized system for acquisition and multi-dimensional analysis of the cough sound. The system consists of a PC-AT486 computer with an ADC board having 12 bit resolution. The audio cough sound is acquired using a sensitive miniature microphone at a sampling rate of 8 kHz in the computer and simultaneously recorded in real time using a digital audio tape recorder which also serves as a back up. Analysis of the cough sound is done in time and frequency domains using the digitized data which provide numerical values for key parameters like cough counts, bouts, their intensity and latency. In addition, the duration of each event and cough patterns provide a unique tool which allows objective evaluation of antitussive and expectorant drugs. Both on-line and off-line checks ensure error-free performance over long periods of time. The entire system has been evaluated for sensitivity, accuracy, precision and reliability. Successful use of this system in clinical studies has established what perhaps is the first integrated approach for the objective evaluation of cough.

  19. A new hyperchaotic map and its application for image encryption

    NASA Astrophysics Data System (ADS)

    Natiq, Hayder; Al-Saidi, N. M. G.; Said, M. R. M.; Kilicman, Adem

    2018-01-01

    Based on the one-dimensional Sine map and the two-dimensional Hénon map, a new two-dimensional Sine-Hénon alteration model (2D-SHAM) is hereby proposed. Basic dynamic characteristics of 2D-SHAM are studied through the following aspects: equilibria, Jacobian eigenvalues, trajectory, bifurcation diagram, Lyapunov exponents and a sensitivity-dependence test. The complexity of 2D-SHAM is investigated using the Sample Entropy algorithm. Simulation results show that 2D-SHAM is hyperchaotic overall, with high complexity and high sensitivity to its initial values and control parameters. To investigate its performance in terms of security, a new 2D-SHAM-based image encryption algorithm (SHAM-IEA) is also proposed. In this algorithm, the essential requirements of confusion and diffusion are accomplished, and the stochastic 2D-SHAM is used to enhance the security of the encrypted image. The stochastic 2D-SHAM generates random values, hence SHAM-IEA can produce different encrypted images even with the same secret key. Experimental results and security analysis show that SHAM-IEA has strong capability to withstand statistical analysis, differential attacks, and chosen-plaintext and chosen-ciphertext attacks.
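    The exact 2D-SHAM equations are given in the paper; as a stand-in, the classical Hénon map (one of the two parent maps) illustrates the same sensitivity-dependence test, perturbing the initial value by 1e-10 and watching nearby orbits diverge:

```python
def henon(x, y, a=1.4, b=0.3):
    """Classical Henon map, used here as a stand-in for 2D-SHAM."""
    return 1.0 - a * x * x + y, b * x

def orbit_x(x0, y0, n):
    """First n x-coordinates of the orbit started at (x0, y0)."""
    xs, (x, y) = [], (x0, y0)
    for _ in range(n):
        x, y = henon(x, y)
        xs.append(x)
    return xs

a_orbit = orbit_x(0.1, 0.1, 80)
b_orbit = orbit_x(0.1 + 1e-10, 0.1, 80)   # tiny perturbation of x0

early_gap = max(abs(p - q) for p, q in zip(a_orbit[:10], b_orbit[:10]))
late_gap = max(abs(p - q) for p, q in zip(a_orbit[50:], b_orbit[50:]))
print(early_gap < 1e-6, late_gap > 1e-3)   # still close early, decorrelated later
```

The exponential growth of an initially negligible gap is the signature that the sensitivity-dependence test looks for; for a hyperchaotic map like 2D-SHAM there are two positive Lyapunov exponents driving it.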

  20. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in such studies: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation p_unc is the best choice for the model performance evaluation when a conservative approach is adopted.
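    Factor-to-source assignment by profile similarity can be sketched with a Pearson correlation between a factor profile and candidate source profiles (the species fractions below are invented for illustration; the actual tool uses curated database profiles and additional indicators):

```python
import math

def pearson(u, v):
    """Pearson correlation between two equal-length profiles."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u) *
                    sum((b - mv) ** 2 for b in v))
    return num / den

# Hypothetical chemical profiles (mass fractions of a few species)
factor = [0.40, 0.30, 0.20, 0.10]
sources = {
    "traffic": [0.42, 0.28, 0.18, 0.12],
    "biomass": [0.05, 0.15, 0.35, 0.45],
}
best = max(sources, key=lambda s: pearson(factor, sources[s]))
print(best)   # traffic
```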

  1. The I.A.G./A.I.G. SEDIBUD (Sediment Budgets in Cold Environments) Program (2005 - 2017): Key activities and outcomes

    NASA Astrophysics Data System (ADS)

    Beylich, Achim A.

    2017-04-01

    Amplified climate change and the ecological sensitivity of high-latitude and high-altitude cold climate environments have been highlighted as a key global environmental issue. Projected climate change in largely undisturbed cold regions is expected to alter melt-season duration and intensity, along with the number of extreme rainfall events, total annual precipitation and the balance between snowfall and rainfall. Similarly, changes to the thermal balance are expected to reduce the extent of permafrost and seasonal ground frost and increase active-layer depths. These combined effects will undoubtedly change Earth surface environments in cold regions and will alter the fluxes of sediments, solutes and nutrients. However, the absence of quantitative data and coordinated analysis to understand the sensitivity of the Earth surface environment is acute in cold regions. Contemporary cold climate environments generally provide the opportunity to identify solute and sedimentary systems where anthropogenic impacts are still less important than the effects of climate change. Accordingly, it is still possible to develop a library of baseline fluvial yields and sedimentary budgets before the natural environment is completely transformed. The SEDIBUD (Sediment Budgets in Cold Environments) Program, building on the European Science Foundation (ESF) Network SEDIFLUX (Sedimentary Source-to-Sink Fluxes in Cold Environments, since 2004), was formed in 2005 as a new Program (Working Group) of the International Association of Geomorphologists (I.A.G./A.I.G.) to address this still existing key knowledge gap. SEDIBUD (2005-2017) currently has about 400 members worldwide, and the Steering Committee of this international program is composed of eleven scientists from ten different countries. The central research objective of this global program is to assess and model contemporary sedimentary fluxes in cold climates, with emphasis on both particulate and dissolved components. 
Research carried out at 56 defined SEDIBUD key test sites (selected catchment systems) varies by scientific program, logistics and available resources, but typically represent interdisciplinary collaborations of geomorphologists, hydrologists, ecologists, permafrost scientists and glaciologists with different levels of detail. SEDIBUD has developed a key set of primary research data requirements intended to incorporate results from these varied projects and allow quantitative analysis across the program. Defined SEDIBUD key test sites provide field data on annual climatic conditions, total discharge and particulate and dissolved fluxes and yields as well as information on other relevant denudational Earth surface processes. A number of selected key test sites are providing high-resolution data on climatic conditions, runoff and solute and sedimentary fluxes and yields, which - in addition to the annual data - contribute to the SEDIBUD metadata database. To support these coordinated efforts, the SEDIFLUX manual and a set of framework papers and book chapters have been produced to establish the integrative approach and common methods and data standards. Comparable field-datasets from different SEDIBUD key test sites are analyzed and integrated to address key research questions of the SEDIBUD program as defined in the SEDIBUD working group objective. A key SEDIBUD synthesis book was published in 2016 by the group and a synthesis key paper is currently in preparation. Detailed information on all SEDIBUD activities, outcomes and published products is found at http://www.geomorph.org/sedibud-working-group/.

  2. Design Considerations for a New Terminal Area Arrival Scheduler

    NASA Technical Reports Server (NTRS)

    Thipphavong, Jane; Mulfinger, Daniel

    2010-01-01

    Design of a terminal area arrival scheduler depends on the interrelationship between throughput, delay and controller intervention. The main contribution of this paper is an analysis of the above interdependence for several stochastic behaviors of expected system performance distributions in the aircraft's time of arrival at the meter fix and runway. Results of this analysis serve to guide the scheduler design choices for key control variables. Two types of variables are analyzed: separation buffers and terminal delay margins. The choice of these decision variables was tested using sensitivity analysis. The analysis suggests that it is best to set the separation buffer at the meter fix to its minimum and adjust the runway buffer to attain the desired system performance. Delay margin was found to have the least effect. These results help characterize the variables most influential in the scheduling operations of terminal area arrivals.

  3. Effect of extreme data loss on heart rate signals quantified by entropy analysis

    NASA Astrophysics Data System (ADS)

    Li, Yu; Wang, Jun; Li, Jin; Liu, Dazhao

    2015-02-01

    The phenomenon of data loss always occurs in the analysis of large databases, and maintaining the stability of analysis results in the event of data loss is very important. In this paper, we used a segmentation approach to generate a synthetic signal from which data are randomly wiped according to the Gaussian distribution and the exponential distribution of the original signal. The logistic map is then used for verification. Finally, two methods of measuring entropy, base-scale entropy and approximate entropy, are comparatively analyzed. Our results show the following: (1) Two key parameters, the percentage and the average length of removed data segments, can change the sequence complexity according to logistic map testing. (2) The calculation results have preferable stability for base-scale entropy analysis, which is not sensitive to data loss. (3) The loss percentage of HRV signals should be controlled below p = 30%, which can provide useful information in clinical applications.
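    A compact version of the experiment, approximate entropy of a logistic-map signal before and after random segment removal, might look like the sketch below (the entropy parameters, segment length and loss percentage are illustrative choices, not the paper's exact settings):

```python
import math, random

def apen(x, m=2, r_frac=0.2):
    """Approximate entropy ApEn(m, r) with r = r_frac * std; self-matches included."""
    n = len(x)
    mean = sum(x) / n
    sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    r = r_frac * sd
    def phi(mm):
        tpl = [x[i:i + mm] for i in range(n - mm + 1)]
        total = 0.0
        for t in tpl:
            c = sum(1 for u in tpl
                    if max(abs(a - b) for a, b in zip(t, u)) <= r)
            total += math.log(c / len(tpl))
        return total / len(tpl)
    return phi(m) - phi(m + 1)

def logistic(n, mu=4.0, x0=0.3):
    """Chaotic logistic-map test signal."""
    xs, x = [], x0
    for _ in range(n):
        x = mu * x * (1 - x)
        xs.append(x)
    return xs

def drop_segments(x, pct, seg_len, rng):
    """Randomly wipe roughly pct of the data in segments of seg_len samples."""
    x = list(x)
    for _ in range(int(len(x) * pct / seg_len)):
        i = rng.randrange(0, len(x) - seg_len)
        del x[i:i + seg_len]
    return x

rng = random.Random(1)
sig = logistic(400)
full = apen(sig)
lossy = apen(drop_segments(sig, 0.30, 10, rng))   # 30% loss in 10-sample segments
print(round(full, 2), round(lossy, 2))
```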

  4. Direct Analysis in Real Time-Mass Spectrometry for the Rapid Detection of Metabolites of Aconite Alkaloids in Intestinal Bacteria

    NASA Astrophysics Data System (ADS)

    Li, Xue; Hou, Guangyue; Xing, Junpeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2014-12-01

    In the present work, direct analysis in real time ionization combined with multi-stage tandem mass spectrometry (DART-MSn) was used to investigate the metabolic profile of aconite alkaloids in rat intestinal bacteria. A total of 36 metabolites from three aconite alkaloids were identified by using DART-MSn, and the feasibility of quantitative analysis of these analytes was examined. Key parameters of the DART ion source, such as helium gas temperature and pressure, the source-to-MS distance, and the speed of the autosampler, were optimized to achieve high sensitivity, enhance reproducibility, and reduce the occurrence of fragmentation. The instrument analysis time for one sample can be less than 10 s for this method. Compared with ESI-MS and UPLC-MS, the DART-MS is more efficient for directly detecting metabolic samples, and has the advantage of being a simple, high-speed, high-throughput method.

  5. Economic Efficiency and Investment Timing for Dual Water Systems

    NASA Astrophysics Data System (ADS)

    Leconte, Robert; Hughes, Trevor C.; Narayanan, Rangesan

    1987-10-01

    A general methodology to evaluate the economic feasibility of dual water systems is presented. In a first step, a static analysis (evaluation at a single point in time) is developed. The analysis requires the evaluation of consumers' and producer's surpluses from water use and the capital cost of the dual (outdoor) system. The analysis is then extended to a dynamic approach where the water demand increases with time (as a result of a population increase) and where the dual system is allowed to expand. The model determines whether construction of a dual system represents a net benefit, and if so, what is the best time to initiate the system (corresponding to maximization of social welfare). Conditions under which an analytic solution is possible are discussed and results of an application are summarized (including sensitivity to different parameters). The analysis allows identification of key parameters influencing attractiveness of dual water systems.
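    The dynamic timing question can be sketched as a discrete search for the start year that maximizes discounted net benefit (the capital cost, surplus level, growth rate and discount rate below are invented for illustration, not the paper's data):

```python
def npv_of_start_year(t_start, horizon=50, disc=0.05):
    """Discounted net benefit of building the dual system in year t_start.
    Annual surplus grows with population; all numbers are illustrative."""
    capital = 100.0
    npv = -capital / (1.0 + disc) ** t_start
    for t in range(t_start, horizon):
        surplus = 3.0 * (1.03 ** t)   # consumers' + producer's surplus, 3%/yr growth
        npv += surplus / (1.0 + disc) ** t
    return npv

# Best time to initiate the system: interior optimum balancing deferred
# capital cost against forgone surplus from delay
best = max(range(30), key=npv_of_start_year)
print(best)
```

Delaying is worthwhile while the annuity value of the deferred capital exceeds the surplus forgone in the delayed year; the optimum falls where the growing demand first tips that balance.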

  6. Direct analysis in real time-mass spectrometry for the rapid detection of metabolites of aconite alkaloids in intestinal bacteria.

    PubMed

    Li, Xue; Hou, Guangyue; Xing, Junpeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2014-12-01

    In the present work, direct analysis in real time ionization combined with multi-stage tandem mass spectrometry (DART-MSn) was used to investigate the metabolic profile of aconite alkaloids in rat intestinal bacteria. A total of 36 metabolites from three aconite alkaloids were identified by using DART-MSn, and the feasibility of quantitative analysis of these analytes was examined. Key parameters of the DART ion source, such as helium gas temperature and pressure, the source-to-MS distance, and the speed of the autosampler, were optimized to achieve high sensitivity, enhance reproducibility, and reduce the occurrence of fragmentation. The instrument analysis time for one sample can be less than 10 s for this method. Compared with ESI-MS and UPLC-MS, the DART-MS is more efficient for directly detecting metabolic samples, and has the advantage of being a simple, high-speed, high-throughput method.

  7. The cytokine-dependent MUTZ-3 cell line as an in vitro model for the screening of contact sensitizers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azam, Philippe; Peiffer, Jean-Luc; Chamousset, Delphine

    2006-04-01

    Langerhans cells (LC) are key mediators of contact allergenicity in the skin. However, no in vitro methods exist which are based on the activation process of LC to predict the sensitization potential of chemicals. In this study, we have evaluated the performances of MUTZ-3, a cytokine-dependent human monocytic cell line, in its response to sensitizers. First, we compared undifferentiated MUTZ-3 cells with several standard human cells such as THP-1, KG-1, HL-60, K-562, and U-937 in their response to the strong sensitizer DNCB and the irritant SDS by monitoring the expression levels of HLA-DR, CD54, and CD86 by flow cytometry. Only MUTZ-3 and THP-1 cells show a strong and specific response to sensitizer, while other cell lines showed very variable responses. Then, we tested MUTZ-3 cells against a wider panel of sensitizers and irritants on a broader spectrum of cell surface markers (HLA-DR, CD40, CD54, CD80, CD86, B7-H1, B7-H2, B7-DC). Of these markers, CD86 proved to be the most reliable since it detected all sensitizers, including benzocaine, a classical false negative in local lymph node assay (LLNA), but not irritants. We confirmed the MUTZ-3 response to DNCB by real-time PCR analysis. Taken together, our data suggest that undifferentiated MUTZ-3 cells may represent a valuable in vitro model for the screening of potential sensitizers.

  8. A Cost-Effectiveness Analysis of Clopidogrel for Patients with Non-ST-Segment Elevation Acute Coronary Syndrome in China.

    PubMed

    Cui, Ming; Tu, Chen Chen; Chen, Er Zhen; Wang, Xiao Li; Tan, Seng Chuen; Chen, Can

    2016-09-01

    There are a number of economic evaluation studies of clopidogrel for patients with non-ST-segment elevation acute coronary syndrome (NSTEACS) published from the perspective of multiple countries in recent years. However, relevant research is quite limited in China. We aimed to estimate the long-term cost effectiveness for up to 1-year treatment with clopidogrel plus acetylsalicylic acid (ASA) versus ASA alone for NSTEACS from the public payer perspective in China. This analysis used a Markov model to simulate a cohort of patients for quality-adjusted life years (QALYs) gained and incremental cost over a lifetime horizon. Based on the primary event rates, adherence rate, and mortality derived from the CURE trial, hazard functions obtained from published literature were used to extrapolate the overall survival to the lifetime horizon. Resource utilization, hospitalization, medication costs, and utility values were estimated from official reports, published literature, and analysis of patient-level insurance data in China. To assess the impact of parameter uncertainty on cost-effectiveness results, one-way sensitivity analyses were undertaken for key parameters, and probabilistic sensitivity analysis (PSA) was conducted using Monte Carlo simulation. The therapy of clopidogrel plus ASA is a cost-effective option in comparison with ASA alone for the treatment of NSTEACS in China, leading to 0.0548 life years (LYs) and 0.0518 QALYs gained per patient. From the public payer perspective in China, clopidogrel plus ASA is associated with an incremental cost of 43,340 China Yuan (CNY) per QALY gained and 41,030 CNY per LY gained (discounting at 3.5% per year). PSA results demonstrated that 88% of simulations were below the cost-effectiveness threshold of 150,721 CNY per QALY gained. Based on the one-way sensitivity analysis, results are most sensitive to the price of clopidogrel, but remain well below this threshold. 
This analysis suggests that treatment with clopidogrel plus ASA for up to 1 year for patients with NSTEACS is cost effective in the local context of China from a public payer's perspective. Funding: Sanofi China.
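    The structure of such a Markov cost-utility model can be sketched as follows; the states, transition probabilities, costs and utilities below are illustrative placeholders, not the CURE-derived inputs used in the study:

```python
def run_cohort(p_event, cost_drug_year1, n_cycles=40, disc=0.035):
    """Three-state annual-cycle cohort model: well -> post-event -> dead.
    All transition probabilities, costs and utilities are hypothetical."""
    p_death_well, p_death_post = 0.02, 0.06
    u_well, u_post = 0.85, 0.70
    c_event = 30000.0                      # hypothetical cost per event (CNY)
    well, post = 1.0, 0.0
    qalys = costs = 0.0
    for t in range(n_cycles):
        d = 1.0 / (1.0 + disc) ** t        # 3.5%/yr discount factor
        qalys += d * (well * u_well + post * u_post)
        if t == 0:                         # therapy given for the first year only
            costs += d * well * cost_drug_year1
        new_events = well * p_event
        costs += d * new_events * c_event
        well, post = (well * (1.0 - p_event - p_death_well),
                      post * (1.0 - p_death_post) + new_events)
    return qalys, costs

# Clopidogrel+ASA (lower event rate, higher drug cost) vs. ASA alone
q1, c1 = run_cohort(p_event=0.045, cost_drug_year1=6000.0)
q0, c0 = run_cohort(p_event=0.055, cost_drug_year1=300.0)
icer = (c1 - c0) / (q1 - q0)               # incremental cost per QALY gained
print(round(icer))
```

One-way sensitivity analysis then amounts to re-running this with one input varied at a time, and PSA to sampling all inputs from distributions.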

  9. Resonance analysis of a high temperature piezoelectric disc for sensitivity characterization.

    PubMed

    Bilgunde, Prathamesh N; Bond, Leonard J

    2018-07-01

    Ultrasonic transducers for high temperature (200 °C+) applications are a key enabling technology for advanced nuclear power systems and in a range of chemical and petro-chemical industries. Design, fabrication and optimization of such transducers using piezoelectric materials remains a challenge. In this work, experimental data-based analysis is performed to investigate the fundamental causal factors for the resonance characteristics of a piezoelectric disc at elevated temperatures. The effect of all ten temperature-dependent piezoelectric constants (ε33, ε11, d33, d31, d15, s11, s12, s13, s33, s44) is studied numerically on both the radial and thickness mode resonances of a piezoelectric disc. A sensitivity index is defined to quantify the effect of each of the temperature-dependent coefficients on the resonance modes of the modified lead zirconate titanate disc. The temperature dependence of s33 showed the highest sensitivity towards the thickness resonance mode, followed by ε33, s11, s13, s12, d31, d33, s44, ε11, and d15 in decreasing order of sensitivity index. For the radial resonance modes, the temperature dependence of ε33 showed the highest sensitivity index, followed by the s11, s12 and d31 coefficients. This numerical study demonstrates that the magnitude of d33 is not the sole factor that affects the resonance characteristics of the piezoelectric disc at high temperatures. There appears to be a complex interplay between the various temperature-dependent piezoelectric coefficients that causes the reduction in thickness mode resonance frequencies, which is found to be in agreement with the experimental data at elevated temperature. Copyright © 2018 Elsevier B.V. All rights reserved.
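    A normalized finite-difference sensitivity index of the kind described can be sketched as follows; the one-parameter thickness-mode formula and the nominal values are illustrative stand-ins, not the paper's finite-element model or material data:

```python
import math

def sensitivity_index(f, params, name, delta=0.05):
    """Normalized finite-difference sensitivity: (df / f) / (dp / p)."""
    base = f(params)
    bumped = dict(params)
    bumped[name] *= 1.0 + delta
    return ((f(bumped) - base) / base) / delta

# Toy thickness-mode model, f ~ (1 / 2t) * sqrt(1 / (rho * s33)):
# an illustrative stand-in for the disc's thickness resonance.
def f_thickness(p):
    return 1.0 / (2.0 * p["t"]) * math.sqrt(1.0 / (p["rho"] * p["s33"]))

nominal = {"t": 1e-3, "rho": 7500.0, "s33": 1.9e-11}
s = sensitivity_index(f_thickness, nominal, "s33")
print(round(s, 2))   # -0.48: a softer (larger) s33 lowers the resonance
```

Evaluating such an index for each temperature-dependent coefficient in turn, with the others held at their nominal values, yields the kind of ranking reported above.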

  10. Cost effectiveness of OptiMal® rapid diagnostic test for malaria in remote areas of the Amazon Region, Brazil

    PubMed Central

    2010-01-01

    Background In areas with limited structure in place for microscopy diagnosis, rapid diagnostic tests (RDT) have been demonstrated to be effective. Method The cost-effectiveness of the OptiMal® test and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon was from the start of fever until the diagnostic results provided to the patient, and the temporal reference was the year 2006. The results were expressed in costs per adequately diagnosed case in 2006 U.S. dollars. Sensitivity analysis was performed considering key model parameters. Results In the base case scenario, considering 92% and 95% sensitivity for thick smear microscopy to Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$549.9 per adequately diagnosed case. In the sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion Microscopy is more cost-effective than OptiMal® in these remote areas if high accuracy of microscopy is maintained in the field. Decisions regarding use of rapid tests for diagnosis of malaria in these areas depend on current microscopy accuracy in the field. PMID:20937094

  11. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    USGS Publications Warehouse

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, the R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics). 
The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty analysis.
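
The core pattern behind R-SWAT-FME, wrapping a simulation model as a plain function of its parameters and handing it to a generic calibration routine, can be sketched without SWAT or R. The toy linear-reservoir model, the synthetic observations, and the brute-force parameter scan below are stand-ins for SWAT and FME's optimizers, shown in Python purely for illustration.

```python
# Minimal sketch of wrap-and-calibrate: expose the model as a function of
# its parameters, then minimize the misfit to observations. The model and
# the data here are toy stand-ins, not part of the original framework.

def model(k, inflow):
    """Toy linear reservoir: each step releases a fraction k of storage."""
    storage, outflow = 0.0, []
    for q_in in inflow:
        storage += q_in
        q_out = k * storage
        storage -= q_out
        outflow.append(q_out)
    return outflow

def sse(k, inflow, observed):
    """Sum of squared errors between simulated and observed outflow."""
    return sum((s - o) ** 2 for s, o in zip(model(k, inflow), observed))

inflow = [5.0, 3.0, 0.0, 0.0, 2.0]
observed = model(0.3, inflow)  # synthetic "observations" with true k = 0.3

# Brute-force parameter scan standing in for FME's optimizers.
best_k = min((round(k * 0.01, 2) for k in range(1, 100)),
             key=lambda k: sse(k, inflow, observed))
print(best_k)  # recovers k = 0.3
```

In the real framework, `model` would invoke the compiled SWAT executable or DLL and parse its output files; the optimizer-facing interface stays the same.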

  12. The management of patients with T1 adenocarcinoma of the low rectum: a decision analysis.

    PubMed

    Johnston, Calvin F; Tomlinson, George; Temple, Larissa K; Baxter, Nancy N

    2013-04-01

    Decision making for patients with T1 adenocarcinoma of the low rectum, when treatment options are limited to a transanal local excision or abdominoperineal resection, is challenging. The aim of this study was to develop a contemporary decision analysis to assist patients and clinicians in balancing the goals of maximizing life expectancy and quality of life in this situation. We constructed a Markov-type microsimulation in open-source software. Recurrence rates and quality-of-life parameters were elicited by systematic literature reviews. Sensitivity analyses were performed on key model parameters. Our base case for analysis was a 65-year-old man with low-lying T1N0 rectal cancer. We determined the sensitivity of our model for sex, age up to 80, and T stage. The main outcome measured was quality-adjusted life-years. In the base case, selecting transanal local excision over abdominoperineal resection resulted in a loss of 0.53 years of life expectancy but a gain of 0.97 quality-adjusted life-years. One-way sensitivity analysis demonstrated a health state utility value threshold for permanent colostomy of 0.93. This value ranged from 0.88 to 1.0 based on tumor recurrence risk. There were no other model sensitivities. Some model parameter estimates were based on weak data. In our model, transanal local excision was found to be the preferable approach for most patients. An abdominoperineal resection has a 3.5% longer life expectancy, but this advantage is lost when the quality-of-life reduction reported by stoma patients is weighed in. The minority group in whom abdominoperineal resection is preferred are those who are unwilling to sacrifice 7% of their life expectancy to avoid a permanent stoma. This is estimated to be approximately 25% of all patients. The threshold increases to 12% of life expectancy in high-risk tumors. No other factors are found to be relevant to the decision.
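
The stoma-utility threshold in such an analysis is the quality-of-life weight at which the longer life expectancy of abdominoperineal resection exactly offsets living with a permanent colostomy. The two-state simplification below, with a hypothetical life expectancy, omits recurrence risk and the other Markov states, so its break-even utility only approximates the 0.93 threshold reported by the full microsimulation.

```python
# Stylized version of the trade-off: APR buys 3.5% extra life expectancy
# (per the abstract) but with a permanent stoma, captured by a utility
# weight. Life expectancy value is hypothetical.
def qalys(life_years, utility):
    return life_years * utility

le_tle = 17.0            # hypothetical life expectancy after local excision
le_apr = le_tle * 1.035  # APR: 3.5% longer life expectancy

# Utility threshold at which APR-with-stoma and local excision break even:
threshold = le_tle / le_apr
print(round(threshold, 3))  # 0.966 under this simplification
```

Patients whose colostomy utility lies below the threshold prefer local excision; the full model's lower threshold of 0.93 reflects the additional life expectancy lost to recurrence after local excision.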

  13. Evidence for aversive withdrawal response to own errors.

    PubMed

    Hochman, Eldad Yitzhak; Milman, Valery; Tal, Liron

    2017-10-01

    A recent model suggests that error detection gives rise to defensive motivation, prompting protective behavior. Models of active avoidance behavior predict that it should grow larger with threat imminence and avoidance. We hypothesized that in a task requiring left or right key strikes, error detection would drive an avoidance reflex manifested by rapid withdrawal of the erring finger, growing larger with threat imminence and avoidance. In experiment 1, three groups differing in error-related threat imminence and avoidance performed a flanker task requiring left or right force-sensitive key strikes. As predicted, errors were followed by rapid force release, which grew faster with threat imminence and with the opportunity to evade threat. In experiment 2, we established a link between error key release time (KRT) and the subjective sense of inner threat. In a simultaneous multiple regression analysis of three error-related compensatory mechanisms (error KRT, flanker effect, error correction RT), only error KRT was significantly associated with increased compulsive checking tendencies. We propose that error response withdrawal reflects an error-withdrawal reflex. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Two decades of demography reveals that seed and seedling transitions limit population persistence in a translocated shrub

    PubMed Central

    Gross, C. L.; Mackay, D.

    2014-01-01

    Background and Aims Olearia flocktoniae is an endangered shrub that was passively translocated from its natural ecosystem, where it has since gone extinct. This study aimed to identify the vital-rate sensitivities that determine population persistence in human-created areas. Methods Population colonization, longevity and extinction were investigated over 20 years using 133 populations. Seed-bank longevity was determined from germination trials of seeds exhumed from extinct and extant sites via a 10-year glasshouse trial and by in situ sowing experiments. From 27 populations, 98 cohorts were followed and matrix models of transitions from seeds to adults were used to evaluate the intrinsic rate of population growth against disturbance histories. Ten populations (38 cohorts) with different disturbance histories were used to evaluate sensitivities in vital rates. Key Results Most populations had few individuals (∼30) and were transient (<5 years above ground). The intrinsic population growth rate was rarely >1 and all but two populations were extinct at year 20. Seeds were short-lived in situ. Although >1000 seeds per plant were produced annually in most populations, sensitivity analysis showed that the transition to the seed bank and the transition from the seed bank to seedlings are key vulnerabilities in the life-cycle. Conclusions Seedling establishment is promoted by recent disturbance. Increasing the number of disturbance events in populations, even severe disturbances that almost extirpate populations, significantly increases longer-term population persistence. Only populations that were disturbed annually survived the full 20 years of the study. The results show that translocated populations of O. flocktoniae will fail to persist without active management. PMID:24844983

  15. Foundations to the unified psycho-cognitive engine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis; Bier, Asmeret Brooke; Backus, George A.

    This document outlines the key features of the SNL psychological engine. The engine is designed to be a generic representation of cognitive entities interacting among themselves and with the external world. It combines the most accepted theories of behavioral psychology with those of behavioral economics to produce a unified simulation of human response from stimuli through executed behavior. The engine explicitly recognizes emotive and reasoned contributions to behavior and simulates the dynamics associated with cue processing, learning, and choice selection. Most importantly, the model parameterization can come from available media or survey information, as well as subject-matter-expert information. The framework design allows the use of uncertainty quantification and sensitivity analysis to manage confidence in using the analysis results for intervention decisions.

  16. Report on the Development of the Advanced Encryption Standard (AES).

    PubMed

    Nechvatal, J; Barker, E; Bassham, L; Burr, W; Dworkin, M; Foti, J; Roback, E

    2001-01-01

    In 1997, the National Institute of Standards and Technology (NIST) initiated a process to select a symmetric-key encryption algorithm to be used to protect sensitive (unclassified) Federal information in furtherance of NIST's statutory responsibilities. In 1998, NIST announced the acceptance of 15 candidate algorithms and requested the assistance of the cryptographic research community in analyzing the candidates. This analysis included an initial examination of the security and efficiency characteristics of each algorithm. NIST reviewed the results of this preliminary research and selected MARS, RC6™, Rijndael, Serpent and Twofish as finalists. Having reviewed further public analysis of the finalists, NIST decided to propose Rijndael as the Advanced Encryption Standard (AES). The research results and rationale for this selection are documented in this report.

  17. Artifact Correction in Temperature-Dependent Attenuated Total Reflection Infrared (ATR-IR) Spectra.

    PubMed

    Sobieski, Brian; Chase, Bruce; Noda, Isao; Rabolt, John

    2017-08-01

    A spectral processing method was developed and tested for analyzing temperature-dependent attenuated total reflection infrared (ATR-IR) spectra of aliphatic polyesters. Spectra of a bio-based, biodegradable polymer, 3.9 mol% 3HHx poly[(R)-3-hydroxybutyrate-co-(R)-3-hydroxyhexanoate] (PHBHx), were corrected prior to analysis using two-dimensional correlation spectroscopy (2D-COS). Removal of the temperature variation of the diamond absorbance, correction of the baseline, ATR correction, and appropriate normalization were key to generating more reliable data. Both the processing steps and their order were important. A comparison to differential scanning calorimetry (DSC) analysis indicated that the normalization method should be chosen with caution to avoid unintentional trends and distortions of the crystalline-sensitive bands.

  18. Interdependency of Reactive Oxygen Species generating and scavenging system in salt sensitive and salt tolerant cultivars of rice.

    PubMed

    Kaur, Navdeep; Dhawan, Manish; Sharma, Isha; Pati, Pratap Kumar

    2016-06-10

    Salinity stress is a major constraint on global rice production, and serious efforts are therefore being undertaken to decipher remedial strategies. Comparative analysis of the differential responses of salt-sensitive and salt-tolerant lines is a judicious approach for obtaining essential clues towards understanding the acquisition of salinity tolerance in rice. However, adaptation to salt stress is a fairly complex process and operates through different mechanisms. Among the various mechanisms involved, reactive oxygen species (ROS)-mediated salinity tolerance is believed to be critical, as it evokes a cascade of responses related to stress tolerance. Against this background, the present paper for the first time evaluates the ROS-generating and ROS-scavenging systems in tandem in both salt-sensitive and salt-tolerant cultivars of rice to gain better insight into salinity stress adaptation. Comparative analysis of ROS indicates a higher level of hydrogen peroxide (H2O2) and a lower level of superoxide ions (O(2-)) in the salt-tolerant cultivars as compared to the salt-sensitive ones. The specific activity of the ROS-generating enzyme NADPH oxidase was also found to be higher in the tolerant cultivars. Further, the activities of various enzymes involved in the enzymatic and non-enzymatic antioxidant defence systems were mostly higher in tolerant cultivars. Transcript-level analysis of the antioxidant enzymes was in alignment with the enzymatic activities. Other stress markers, such as proline, were higher in tolerant varieties, whereas the levels of malondialdehyde (MDA) equivalents and chlorophyll content were higher in sensitive ones. The present study showed significant differences in the levels of ROS production and antioxidant enzyme activities between sensitive and tolerant cultivars, suggesting their possible role in providing natural salt tolerance to selected cultivars of rice. 
Our study demonstrates that the cellular machinery for ROS production and the scavenging system work in an interdependent manner to confer better salt stress adaptation in rice. The present work further highlights that the elevated level of H2O2, considered a key determinant for conferring salt stress tolerance in rice, might have originated through an alternative route involving the photocatalytic activity of chlorophyll.

  19. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    NASA Astrophysics Data System (ADS)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

    A mathematical reliability model for the high temperature and high pressure multi-stage decompression control valve (HMDCV) was established based on stress-strength interference theory, and a temperature correction coefficient was introduced to revise the material fatigue limit at high temperature. The reliability of key high-risk components and the fatigue sensitivity curve of each component were calculated and analyzed by combining the fatigue life analysis of the control valve with reliability theory. The proportional contribution of each component to fatigue failure of the control valve system was obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life expectancy of the main pressure parts meets the technical requirements, and that the valve body and the sleeve have an obvious influence on control system reliability; stress concentration in key parts of the control valve can be reduced during the design process by improving the structure.
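
In stress-strength interference theory, with normally distributed strength S and stress s, reliability is R = Φ((μ_S − μ_s) / sqrt(σ_S² + σ_s²)). The sketch below illustrates that formula with a temperature correction coefficient derating the fatigue limit; all numeric values are illustrative assumptions, not values from the paper.

```python
# Stress-strength interference reliability with normal strength and stress.
# k_temp derates the fatigue limit at high temperature. Numbers below are
# illustrative assumptions only.
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def reliability(mu_strength, sd_strength, mu_stress, sd_stress, k_temp=1.0):
    mu_s = mu_strength * k_temp  # temperature-corrected fatigue limit
    z = (mu_s - mu_stress) / math.sqrt(sd_strength**2 + sd_stress**2)
    return normal_cdf(z)

# Illustrative: a 400 MPa fatigue limit derated 15% at operating temperature.
print(round(reliability(400.0, 30.0, 250.0, 20.0, k_temp=0.85), 4))
```

Sweeping `k_temp` or the stress distribution of one component at a time yields the kind of fatigue sensitivity curve the abstract describes.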

  20. A secure and efficient authentication and key agreement scheme based on ECC for telecare medicine information systems.

    PubMed

    Xu, Xin; Zhu, Ping; Wen, Qiaoyan; Jin, Zhengping; Zhang, Hua; He, Lian

    2014-01-01

    In the field of telecare medicine information systems, recent research has focused on delivering more convenient and secure healthcare services for patients. To protect sensitive information, various mechanisms such as access control have been proposed to safeguard patient privacy in these systems. However, existing schemes suffer from certain security defects and high computational cost, making them unsuitable for the telecare medicine information system. In this paper, based on elliptic curve cryptography, we propose a secure and efficient two-factor mutual authentication and key agreement scheme that reduces the computational cost. The scheme provides patient anonymity by employing dynamic identities. Security analysis and performance evaluation show that, compared with other related protocols, our scheme resists several well-known attacks and performs better in the telecare medicine information system.
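
The key-agreement core of ECC-based schemes like this one is elliptic-curve Diffie-Hellman: each party multiplies the other's public point by its own private scalar and both arrive at the same shared point. The toy sketch below uses the tiny curve y² = x³ + 2x + 2 over F_17 with assumed base point (5, 1) for illustration only; a real telecare scheme uses standardized curves and adds the mutual-authentication layer described in the paper.

```python
# Toy elliptic-curve Diffie-Hellman over a tiny illustrative curve.
P, A = 17, 2                      # field prime and curve coefficient a
G = (5, 1)                        # base point on y^2 = x^3 + 2x + 2 (mod 17)

def point_add(p, q):
    """Affine point addition; None represents the point at infinity."""
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p == q:                    # doubling slope
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:                         # chord slope
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def scalar_mult(k, p):
    """Double-and-add scalar multiplication."""
    result = None
    while k:
        if k & 1:
            result = point_add(result, p)
        p = point_add(p, p)
        k >>= 1
    return result

alice_priv, bob_priv = 7, 11      # toy private keys
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)
shared_a = scalar_mult(alice_priv, bob_pub)
shared_b = scalar_mult(bob_priv, alice_pub)
assert shared_a == shared_b       # both sides derive the same shared point
```

The three-argument `pow(x, -1, P)` computes a modular inverse (Python 3.8+). The shared point would then feed a key-derivation function, with dynamic identities layered on top for anonymity.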

  1. Innovative Tools and Technology for Analysis of Single Cells and Cell-Cell Interaction.

    PubMed

    Konry, Tania; Sarkar, Saheli; Sabhachandani, Pooja; Cohen, Noa

    2016-07-11

    Heterogeneity in single-cell responses and intercellular interactions results from complex regulation of cell-intrinsic and environmental factors. Single-cell analysis allows not only detection of individual cellular characteristics but also correlation of genetic content with phenotypic traits in the same cell. Technological advances in micro- and nanofabrication have benefited single-cell analysis by allowing precise control of the localized microenvironment, cell manipulation, and sensitive detection capabilities. Additionally, microscale techniques permit rapid, high-throughput, multiparametric screening that has become essential for -omics research. This review highlights innovative applications of microscale platforms in genetic, proteomic, and metabolic detection in single cells; cell sorting strategies; and heterotypic cell-cell interaction. We discuss key design aspects of single-cell localization and isolation in microfluidic systems, dynamic and endpoint analyses, and approaches that integrate highly multiplexed detection of various intracellular species.

  2. Lithography hotspot discovery at 70nm DRAM 300mm fab: process window qualification using design base binning

    NASA Astrophysics Data System (ADS)

    Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh

    2008-11-01

    Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) PWQ Layout, to obtain the best sensitivity; (b) Design-Based Binning, for pattern-repeater analysis; and (c) Intelligent Sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.

  3. Nanoparticle separation with a miniaturized asymmetrical flow field-flow fractionation cartridge

    PubMed Central

    Müller, David; Cattaneo, Stefano; Meier, Florian; Welz, Roland; de Mello, Andrew J.

    2015-01-01

    Asymmetrical Flow Field-Flow Fractionation (AF4) is a separation technique applicable to particles over a wide size range. Despite the many advantages of AF4, its adoption in routine particle analysis is somewhat limited by the large footprint of currently available separation cartridges, extended analysis times and significant solvent consumption. To address these issues, we describe the fabrication and characterization of miniaturized AF4 cartridges. Key features of the down-scaled platform include simplified cartridge and reagent handling, reduced analysis costs and higher throughput capacities. The separation performance of the miniaturized cartridge is assessed using certified gold and silver nanoparticle standards. Analysis of gold nanoparticle populations indicates shorter analysis times and increased sensitivity compared to conventional AF4 separation schemes. Moreover, nanoparticulate titanium dioxide populations exhibiting broad size distributions are analyzed in a rapid and efficient manner. Finally, the repeatability and reproducibility of the miniaturized platform are investigated with respect to analysis time and separation efficiency. PMID:26258119

  4. Nanoparticle separation with a miniaturized asymmetrical flow field-flow fractionation cartridge

    NASA Astrophysics Data System (ADS)

    Müller, David; Cattaneo, Stefano; Meier, Florian; Welz, Roland; deMello, Andrew

    2015-07-01

    Asymmetrical Flow Field-Flow Fractionation (AF4) is a separation technique applicable to particles over a wide size range. Despite the many advantages of AF4, its adoption in routine particle analysis is somewhat limited by the large footprint of currently available separation cartridges, extended analysis times and significant solvent consumption. To address these issues, we describe the fabrication and characterization of miniaturized AF4 cartridges. Key features of the scale-down platform include simplified cartridge and reagent handling, reduced analysis costs and higher throughput capacities. The separation performance of the miniaturized cartridge is assessed using certified gold and silver nanoparticle standards. Analysis of gold nanoparticle populations indicates shorter analysis times and increased sensitivity compared to conventional AF4 separation schemes. Moreover, nanoparticulate titanium dioxide populations exhibiting broad size distributions are analyzed in a rapid and efficient manner. Finally, the repeatability and reproducibility of the miniaturized platform are investigated with respect to analysis time and separation efficiency.

  5. Sensitive detection of major food allergens in breast milk: first gateway for allergenic contact during breastfeeding.

    PubMed

    Pastor-Vargas, C; Maroto, A S; Díaz-Perales, A; Villaba, M; Casillas Diaz, N; Vivanco, F; Cuesta-Herranz, J

    2015-08-01

    Food allergy is recognized as a major public health issue, especially in early childhood. It has been hypothesized that early sensitization to food allergens may be due to their ingestion as components dissolved in milk during breastfeeding, which would explain reactions to foods that have never been eaten before. Thus, the aim of this work was to detect the presence of food allergens in breast milk by microarray technology. We produced a homemade microarray with antibodies raised against major food allergens. The antibody microarray was incubated with breast milk from 14 women collected at Fundación Jiménez Díaz Hospital. In this way, we demonstrated the presence of major food allergens in breast milk. The analysis of allergens present in breast milk could be a useful tool in allergy prevention and could provide key data on the role of breastfeeding in tolerance induction or sensitization in children. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. BSA-coated nanoparticles for improved SERS-based intracellular pH sensing.

    PubMed

    Zheng, Xiao-Shan; Hu, Pei; Cui, Yan; Zong, Cheng; Feng, Jia-Min; Wang, Xin; Ren, Bin

    2014-12-16

    Sensing the pH of the local microenvironment is key to understanding many biological processes. As a noninvasive and highly sensitive technique, surface-enhanced Raman spectroscopy (SERS) has attracted considerable interest for detecting the local pH of live cells. We herein develop a facile way to prepare an Au-(4-MPy)-BSA (AMB) pH nanosensor, with 4-mercaptopyridine (4-MPy) as the pH-sensing molecule. Modifying the nanoparticles with BSA not only provides a sensitive response to pH changes over the range of pH 4.0 to 9.0 but also confers good biocompatibility, stability, and reliability in various solutions (including solutions of high ionic strength or complex composition, such as cell culture medium), both in the aggregated state and after long-term storage. The AMB pH nanosensor shows great advantages for reliable intracellular pH analysis; it has been successfully used to monitor the pH distribution of live cells and can address the grand challenges in SERS-based pH sensing for practical biological applications.

  7. Differential Sensitivity of Target Genes to Translational Repression by miR-17~92

    PubMed Central

    Jin, Hyun Yong; Oda, Hiroyo; Chen, Pengda; Kang, Seung Goo; Valentine, Elizabeth; Liao, Lujian; Zhang, Yaoyang; Gonzalez-Martin, Alicia; Shepherd, Jovan; Head, Steven R.; Kim, Pyeung-Hyeun; Fu, Guo; Liu, Wen-Hsien; Han, Jiahuai

    2017-01-01

    MicroRNAs (miRNAs) are thought to exert their functions by modulating the expression of hundreds of target genes, each to a small degree, but it remains unclear how small changes in hundreds of target genes are translated into the specific function of a miRNA. Here, we conducted an integrated analysis of the transcriptome and translatome of primary B cells from mutant mice expressing miR-17~92 at three different levels to address this issue. We found that target genes exhibit differential sensitivity to miRNA suppression and that only a small fraction of target genes is actually suppressed by a given concentration of miRNA under physiological conditions. Transgenic expression and deletion of the same miRNA gene regulate largely distinct sets of target genes. miR-17~92 controls target gene expression mainly through translational repression, and the 5’UTR plays an important role in regulating target gene sensitivity to miRNA suppression. These findings provide molecular insights into a model in which miRNAs exert their specific functions through a small number of key target genes. PMID:28241004

  8. Protein Oxidation: Key to Bacterial Desiccation Resistance?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredrickson, Jim K.; Li, Shu-Mei W.; Gaidamakova, E.

    For extremely ionizing-radiation-resistant bacteria, survival has been attributed to protection of proteins from oxidative damage during irradiation, with the result that repair systems survive and function with far greater efficiency during recovery than in sensitive bacteria. Here we examined the relationship between the survival of dry-climate soil bacteria and the level of cellular protein oxidation induced by desiccation. Bacteria were isolated from surface soils of the shrub-steppe of the U.S. Department of Energy’s Hanford Site in Washington state. A total of 63 isolates were used for phylogenetic analysis. The majority of isolates were closely related to members of the genus Deinococcus, with Chelatococcus, Methylobacterium and Bosea also among the genera identified. Desiccation-resistant isolates accumulated high intracellular manganese and low iron concentrations compared to sensitive bacteria. In vivo, proteins of desiccation-resistant bacteria were protected from the oxidative modifications that introduce carbonyl groups in sensitive bacteria during drying. We present the case that the survival of bacteria that inhabit dry-climate soils is highly dependent on mechanisms that limit protein oxidation during dehydration.

  9. Evidence for the Induction of Key Components of the NOTCH Signaling Pathway via Deltamethrin and Azamethiphos Treatment in the Sea Louse Caligus rogercresseyi

    PubMed Central

    Boltaña, Sebastian; Chávez-Mardones, Jaqueline; Valenzuela-Muñoz, Valentina; Gallardo-Escárate, Cristian

    2016-01-01

    The extensive use of organophosphates and pyrethroids in the aquaculture industry has negatively impacted parasite sensitivity to the delousing effects of these antiparasitics, especially among sea lice species. The NOTCH signaling pathway is a positive regulator of ABC transporter subfamily C expression and plays a key role in the generation and modulation of pesticide resistance. However, little is known about the molecular mechanisms behind pesticide resistance, partly due to the lack of genomic and molecular information on the processes involved in the resistance mechanism of sea lice. Next-generation sequencing technologies provide an opportunity for rapid and cost-effective generation of genome-scale data. The present study, through RNA-seq analysis, determined that the sea louse Caligus rogercresseyi (C. rogercresseyi) specifically responds to the delousing drugs azamethiphos and deltamethrin at the transcriptomic level by differentially activating mRNA of the NOTCH signaling pathway and of ABC genes. These results suggest that frequent antiparasitic application may increase the activity of inhibitory mRNA components, thereby promoting inhibitory NOTCH output and conditions for increased resistance to delousing drugs. Moreover, data analysis underscored that key functions of NOTCH/ABC components were regulated during distinct phases of the drug response, thus indicating resistance modifications in C. rogercresseyi resulting from the frequent use of organophosphates and pyrethroids. PMID:27187362

  10. Evidence for the Induction of Key Components of the NOTCH Signaling Pathway via Deltamethrin and Azamethiphos Treatment in the Sea Louse Caligus rogercresseyi.

    PubMed

    Boltaña, Sebastian; Chávez-Mardones, Jaqueline; Valenzuela-Muñoz, Valentina; Gallardo-Escárate, Cristian

    2016-05-12

    The extensive use of organophosphates and pyrethroids in the aquaculture industry has negatively impacted parasite sensitivity to the delousing effects of these antiparasitics, especially among sea lice species. The NOTCH signaling pathway is a positive regulator of ABC transporter subfamily C expression and plays a key role in the generation and modulation of pesticide resistance. However, little is known about the molecular mechanisms behind pesticide resistance, partly due to the lack of genomic and molecular information on the processes involved in the resistance mechanism of sea lice. Next-generation sequencing technologies provide an opportunity for rapid and cost-effective generation of genome-scale data. The present study, through RNA-seq analysis, determined that the sea louse Caligus rogercresseyi (C. rogercresseyi) specifically responds to the delousing drugs azamethiphos and deltamethrin at the transcriptomic level by differentially activating mRNA of the NOTCH signaling pathway and of ABC genes. These results suggest that frequent antiparasitic application may increase the activity of inhibitory mRNA components, thereby promoting inhibitory NOTCH output and conditions for increased resistance to delousing drugs. Moreover, data analysis underscored that key functions of NOTCH/ABC components were regulated during distinct phases of the drug response, thus indicating resistance modifications in C. rogercresseyi resulting from the frequent use of organophosphates and pyrethroids.

  11. LC-MS-based characterization of the peptide reactivity of chemicals to improve the in vitro prediction of the skin sensitization potential.

    PubMed

    Natsch, Andreas; Gfeller, Hans

    2008-12-01

    A key step in the skin sensitization process is the formation of a covalent adduct between skin sensitizers and endogenous proteins and/or peptides in the skin. Based on this mechanistic understanding, there is a renewed interest in in vitro assays to determine the reactivity of chemicals toward peptides in order to predict their sensitization potential. A standardized peptide reactivity assay yielded a promising predictivity. This published assay is based on high-performance liquid chromatography with ultraviolet detection to quantify peptide depletion after incubation with test chemicals. We had observed that peptide depletion may be due to either adduct formation or peptide oxidation. Here we report a modified assay based on both liquid chromatography-mass spectrometry (LC-MS) analysis and detection of free thiol groups. This approach allows simultaneous determination of (1) peptide depletion, (2) peptide oxidation (dimerization), (3) adduct formation, and (4) thiol reactivity and thus generates a more detailed characterization of the reactivity of a molecule. Highly reactive molecules are further discriminated with a kinetic measure. The assay was validated on 80 chemicals. Peptide depletion could accurately be quantified both with LC-MS detection and depletion of thiol groups. The majority of the moderate/strong/extreme sensitizers formed detectable peptide adducts, but many sensitizers were also able to catalyze peptide oxidation. Whereas adduct formation was only observed for sensitizers, this oxidation reaction was also observed for two nonsensitizing fragrance aldehydes, indicating that peptide depletion might not always be regarded as sufficient evidence for rating a chemical as a sensitizer. Thus, this modified assay gives a more informed view of the peptide reactivity of chemicals to better predict their sensitization potential.
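
The primary readout of such reactivity assays, percent peptide depletion, is a simple ratio of chromatographic peak areas with and without the test chemical. The sketch below shows that calculation; the peak areas and the reactivity cutoff are hypothetical illustrations, not the published assay's values.

```python
# Percent peptide depletion from LC-MS peak areas; the control is the
# peptide incubated without test chemical. All numbers are hypothetical.
def percent_depletion(area_control, area_sample):
    return 100.0 * (1.0 - area_sample / area_control)

dep = percent_depletion(area_control=1.0e6, area_sample=3.5e5)
# A depletion cutoff (hypothetical here) would flag the chemical as reactive:
is_reactive = dep > 20.0
print(round(dep, 1), is_reactive)
```

As the abstract notes, depletion alone conflates adduct formation with oxidation, which is why the modified assay adds MS adduct detection and thiol quantification.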

  12. Instrumentation and Performance Analysis Plans for the HIFiRE Flight 2 Experiment

    NASA Technical Reports Server (NTRS)

    Gruber, Mark; Barhorst, Todd; Jackson, Kevin; Eklund, Dean; Hass, Neal; Storch, Andrea M.; Liu, Jiwen

    2009-01-01

    Supersonic combustion performance of a bi-component gaseous hydrocarbon fuel mixture is one of the primary aspects under investigation in the HIFiRE Flight 2 experiment. In-flight instrumentation and post-test analyses will be two key elements used to determine the combustion performance. Pre-flight computational fluid dynamics (CFD) analyses provide valuable information that can be used to optimize the placement of a constrained set of wall pressure instrumentation in the experiment. The simulations also allow pre-flight assessments of performance sensitivities leading to estimates of overall uncertainty in the determination of combustion efficiency. Based on the pre-flight CFD results, 128 wall pressure sensors have been located throughout the isolator/combustor flowpath to minimize the error in determining the wall pressure force at Mach 8 flight conditions. Also, sensitivity analyses show that mass capture and combustor exit stream thrust are the two primary contributors to uncertainty in combustion efficiency.

  13. Tunable wavefront coded imaging system based on detachable phase mask: Mathematical analysis, optimization and underlying applications

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Wei, Jingxuan

    2014-09-01

    The key to the concept of tunable wavefront coding lies in detachable phase masks. Ojeda-Castaneda et al. (Progress in Electronics Research Symposium Proceedings, Cambridge, USA, July 5-8, 2010) described a typical design in which two components with cosinusoidal phase variation operate together to make defocus sensitivity tunable. The present study proposes an improved design and makes three contributions: (1) A mathematical derivation based on the stationary phase method explains why the detachable phase mask of Ojeda-Castaneda et al. tunes the defocus sensitivity. (2) The mathematical derivations show that the effective bandwidth of the wavefront coded imaging system is also tunable by making each component of the detachable phase mask move asymmetrically. An improved Fisher information-based optimization procedure was also designed to ascertain the optimal mask parameters corresponding to a specific bandwidth. (3) Possible applications of the tunable bandwidth are demonstrated by simulated imaging.

  14. Assessment of Rho GTPase signaling during neurite outgrowth.

    PubMed

    Feltrin, Daniel; Pertz, Olivier

    2012-01-01

    Rho GTPases are key regulators of the cytoskeleton during the process of neurite outgrowth. Based on overexpression of dominant-positive and negative Rho GTPase constructs, the classic view is that Rac1 and Cdc42 are important for neurite elongation whereas RhoA regulates neurite retraction in response to collapsing agents. However, recent work has suggested a much finer control of spatiotemporal Rho GTPase signaling in this process. Understanding this complexity level necessitates a panel of more sensitive tools than previously used. Here, we discuss a novel assay that enables the biochemical fractionation of the neurite from the soma of differentiating N1E-115 neuronal-like cells. This allows for spatiotemporal characterization of a large number of protein components, interactions, and post-translational modifications using classic biochemical and also proteomics approaches. We also provide protocols for siRNA-mediated knockdown of genes and sensitive assays that allow quantitative analysis of the neurite outgrowth process.

  15. Tip-enhanced Raman scattering microscopy: Recent advance in tip production

    NASA Astrophysics Data System (ADS)

    Fujita, Yasuhiko; Walke, Peter; De Feyter, Steven; Uji-i, Hiroshi

    2016-08-01

    Tip-enhanced Raman scattering (TERS) microscopy is a technique that combines the chemical sensitivity of Raman spectroscopy with the resolving power of scanning probe microscopy. The key component of any TERS setup is a plasmonically-active noble metal tip, which serves to couple far-field incident radiation with the near-field. Thus, the design and implementation of reproducible probes are crucial for the continued development of TERS as a tool for nanoscopic analysis. Here we discuss conventional methods for the fabrication of TERS-ready tips, highlighting the problems therein, as well as detailing more recent developments to improve reproducibility. In addition, the idea of remote-excitation TERS is expanded upon, whereby TERS sensitivity is further improved by using propagating surface plasmons to separate the incident radiation from the tip apex, and we describe how this can be incorporated into the fabrication process.

  16. Integrated model for pricing, delivery time setting, and scheduling in make-to-order environments

    NASA Astrophysics Data System (ADS)

    Garmdare, Hamid Sattari; Lotfi, M. M.; Honarvar, Mahboobeh

    2018-03-01

    In make-to-order environments, which work only in response to customers' orders, a manufacturer seeking to maximize profit should offer the best price and delivery time for an order, considering the existing capacity and the customer's sensitivity to both factors. In this paper, an integrated approach to pricing, delivery time setting, and scheduling of newly arriving orders is proposed, based on the existing capacity and the orders already accepted in the system. In the problem, the acquired market demand depends on the price and delivery time of both the manufacturer and its competitors. A mixed-integer non-linear programming model is presented for the problem. After conversion to a pure non-linear model, it is validated through a case study. The efficiency of the proposed model is confirmed by comparing it to both the literature and current practice. Finally, a sensitivity analysis of the key parameters is carried out.

  17. Method for quick thermal tolerancing of optical systems

    NASA Astrophysics Data System (ADS)

    Werschnik, J.; Uhlendorf, K.

    2016-09-01

    Optical systems for lithography (projection lenses), inspection (micro-objectives) or laser material processing usually have tight specifications regarding focus and wave-front stability. The same is true of the field-dependent properties. Projection lenses in particular have tight specifications on field curvature, magnification and distortion. Unwanted heating from either internal or external sources leads to undesired changes in the above properties. In this work we show an elegant and fast method to analyze the thermal sensitivity using ZEMAX. The key point of this method is to use the thermal changes of the lens data from the multi-configuration editor as the starting point for a (standard) tolerance analysis. Knowing the sensitivity, we can either define requirements on the environment or use it to systematically improve the thermal behavior of the lens. We demonstrate this method on a typical projection lens for which we minimized the thermal field curvature.

  18. Clinical potential of proteomics in the diagnosis of ovarian cancer.

    PubMed

    Ardekani, Ali M; Liotta, Lance A; Petricoin, Emanuel F

    2002-07-01

    The need for specific and sensitive markers of ovarian cancer is critical. Finding a sensitive and specific test for its detection would have an important public health impact. Currently, there are no effective screening options available for patients with ovarian cancer. CA-125, the most widely used biomarker for ovarian cancer, does not have a high positive predictive value and is only effective when used in combination with other diagnostic tests. However, pathologic changes taking place within the ovary may be reflected in biomarker patterns in the serum. The combination of mass spectra generated by new proteomic technologies, such as surface-enhanced laser desorption ionization time-of-flight (SELDI-TOF), with artificial-intelligence-based informatic algorithms has been used to discover a small set of key protein values and to discriminate normal individuals from ovarian cancer patients. Serum proteomic pattern analysis might ultimately be applied in medical screening clinics as a supplement to the diagnostic work-up and evaluation.

  19. AmpliVar: mutation detection in high-throughput sequence from amplicon-based libraries.

    PubMed

    Hsu, Arthur L; Kondrashova, Olga; Lunke, Sebastian; Love, Clare J; Meldrum, Cliff; Marquis-Nicholson, Renate; Corboy, Greg; Pham, Kym; Wakefield, Matthew; Waring, Paul M; Taylor, Graham R

    2015-04-01

    Conventional means of identifying variants in high-throughput sequencing align each read against a reference sequence, and then call variants at each position. Here, we demonstrate an orthogonal means of identifying sequence variation by grouping the reads as amplicons prior to any alignment. We used AmpliVar to make key-value hashes of sequence reads and group reads as individual amplicons using a table of flanking sequences. Low-abundance reads were removed according to a selectable threshold, and reads above this threshold were aligned as groups, rather than as individual reads, permitting the use of sensitive alignment tools. We show that this approach is more sensitive, more specific, and more computationally efficient than comparable methods for the analysis of amplicon-based high-throughput sequencing data. The method can be extended to enable alignment-free confirmation of variants seen in hybridization capture target-enrichment data. © 2015 WILEY PERIODICALS, INC.
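
    The core idea — hash reads into amplicon groups via their flanking sequences before any alignment, then drop low-abundance sequences — can be sketched as follows (the flanking table, names, and threshold are illustrative assumptions, not AmpliVar's actual data structures):

```python
from collections import Counter, defaultdict

# Hypothetical flanking-sequence table: amplicon name -> (5' flank, 3' flank).
FLANKS = {
    "AMP1": ("ACGT", "TTAG"),
    "AMP2": ("GGCA", "CCAT"),
}

def group_reads(reads, flanks, min_count=2):
    """Group reads into amplicons by their flanking sequences and
    discard read sequences seen fewer than min_count times (noise)."""
    groups = defaultdict(Counter)
    for read in reads:
        for name, (left, right) in flanks.items():
            if read.startswith(left) and read.endswith(right):
                groups[name][read] += 1
                break
    return {name: {seq: n for seq, n in counts.items() if n >= min_count}
            for name, counts in groups.items()}
```

    Only the surviving grouped sequences would then be aligned, once per group rather than once per read, which is where the computational saving described in the abstract comes from.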

  20. Using microRNA profiling in urine samples to develop a non-invasive test for bladder cancer.

    PubMed

    Mengual, Lourdes; Lozano, Juan José; Ingelmo-Torres, Mercedes; Gazquez, Cristina; Ribal, María José; Alcaraz, Antonio

    2013-12-01

    Current standard methods used to detect and monitor bladder urothelial cell carcinoma (UCC) are invasive or have low sensitivity. The incorporation into clinical practice of a non-invasive tool for UCC assessment would enormously improve patients' quality of life and outcome. This study aimed to examine the microRNA (miRNA) expression profiles in urines of UCC patients in order to develop a non-invasive accurate and reliable tool to diagnose and provide information on the aggressiveness of the tumor. We performed a global miRNA expression profiling analysis of the urinary cells from 40 UCC patients and controls using TaqMan Human MicroRNA Array followed by validation of 22 selected potentially diagnostic and prognostic miRNAs in a separate cohort of 277 samples using a miRCURY LNA qPCR system. miRNA-based signatures were developed by multivariate logistic regression analysis and internally cross-validated. In the initial cohort of patients, we identified 40 and 30 aberrantly expressed miRNA in UCC compared with control urines and in high compared with low grade tumors, respectively. Quantification of 22 key miRNAs in an independent cohort resulted in the identification of a six miRNA diagnostic signature with a sensitivity of 84.8% and specificity of 86.5% (AUC = 0.92) and a two miRNA prognostic model with a sensitivity of 84.95% and a specificity of 74.14% (AUC = 0.83). Internal cross-validation analysis confirmed the accuracy rates of both models, reinforcing the strength of our findings. Although the data needs to be externally validated, miRNA analysis in urine appears to be a valuable tool for the non-invasive assessment of UCC. Copyright © 2013 UICC.

  1. Sensitive determination of thiols in wine samples by a stable isotope-coded derivatization reagent d0/d4-acridone-10-ethyl-N-maleimide coupled with high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry analysis.

    PubMed

    Lv, Zhengxian; You, Jinmao; Lu, Shuaimin; Sun, Weidi; Ji, Zhongyin; Sun, Zhiwei; Song, Cuihua; Chen, Guang; Li, Guoliang; Hu, Na; Zhou, Wu; Suo, Yourui

    2017-03-31

    As key aroma compounds, varietal thiols are crucial odorants responsible for the flavor of wines. Quantitative analysis of thiols can provide crucial information on the aroma profiles of different wine styles. In this study, a rapid and sensitive method for the simultaneous determination of six thiols in wine using d0/d4-acridone-10-ethyl-N-maleimide (d0/d4-AENM) as a stable isotope-coded derivatization reagent (SICD) by high-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (HPLC-ESI-MS/MS) has been developed. Quantification of thiols was performed by using d4-AENM-labeled thiols as the internal standards (IS), followed by stable isotope dilution HPLC-ESI-MS/MS analysis. The AENM derivatization combined with multiple reaction monitoring (MRM) not only allowed trace analysis of thiols due to the extremely high sensitivity, but also efficiently corrected for the matrix effects during HPLC-MS/MS and for the fluctuation in MS/MS signal intensity due to the instrument. The obtained internal standard calibration curves for the six thiols were linear over the range of 25-10,000 pmol/L (R2 ≥ 0.9961). Detection limits (LODs) for most of the analytes were below 6.3 pmol/L. The proposed method was successfully applied to the simultaneous determination of the six thiols in wine samples with precisions ≤3.5% and recoveries ≥78.1%. In conclusion, the developed method is expected to be a promising tool for the detection of trace thiols in wine and in other complex matrices. Copyright © 2017 Elsevier B.V. All rights reserved.
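
    Stable-isotope-dilution quantification of this kind reduces to a linear fit of the analyte/internal-standard peak-area ratio against concentration, inverted for unknowns. A generic sketch under that assumption (all numbers illustrative, not the paper's data):

```python
def fit_calibration(concs, ratios):
    """Ordinary least-squares line through (concentration,
    analyte/IS peak-area ratio) calibration points."""
    n = len(concs)
    mx, my = sum(concs) / n, sum(ratios) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, ratios))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ratio, slope, intercept):
    """Invert the calibration line: peak-area ratio -> concentration."""
    return (ratio - intercept) / slope
```

    Because the isotope-labeled internal standard co-elutes and ionizes like the analyte, the ratio largely cancels matrix effects and instrument drift, which is why the calibration can stay linear over such a wide range.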

  2. The approach to detection and application of the company’s technological competences to form a business-model

    NASA Astrophysics Data System (ADS)

    Chursin, A. A.; Kashirin, A. I.; Strenalyuk, V. V.; Semenov, A. S.; Ostrovskaya, A. A.; Kokuytseva, T. V.

    2018-02-01

    The most important condition for increasing the competitiveness of a business is the formation, retention, and development of the organization's key competences, which reflect its competitive advantage. This problem is especially urgent for high-tech industries, which are the most sensitive to all kinds of changes and innovations. The ways of applying a company's technological competences to form a business model, and the proper form of competence description and analysis, are considered using the example of the company "Teplolux". This form is recommended for use in IT solutions for competence databases.

  3. Raman Life Detection Instrument Development for Icy Worlds

    NASA Technical Reports Server (NTRS)

    Thomson, Seamus; Allen, A'Lester; Gutierrez, Daniel; Quinn, Richard C.; Chen, Bin; Koehne, Jessica E.

    2017-01-01

    The objective of this project is to develop a compact, high-sensitivity Raman sensor for detection of life signatures in a flow cell configuration to enable bio-exploration and life detection during future missions to our Solar System's icy worlds. The specific project objectives are the following: 1) Develop a Raman spectroscopy liquid analysis sensor for biosignatures; 2) Demonstrate applicability toward a future Enceladus or other icy worlds mission; 3) Establish key parameters for integration with the ARC Sample Processor for Life on Icy Worlds (SPLIce); 4) Position ARC for a successful response to upcoming Enceladus or other icy world mission instrument opportunities.

  4. Hyperbolic Rendezvous at Mars: Risk Assessments and Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    Jedrey, Ricky; Landau, Damon; Whitley, Ryan

    2015-01-01

    Given the current interest in the use of flyby trajectories for human Mars exploration, a key requirement is the capability to execute hyperbolic rendezvous. Hyperbolic rendezvous is used to transport crew from a Mars-centered orbit to a transiting Earth-bound habitat that performs a flyby. Representative cases are taken from potential future missions of this type, and a thorough sensitivity analysis of the hyperbolic rendezvous phase is performed. This includes early engine cutoff, missed burn times, and burn misalignment. A finite burn engine model is applied that assumes the hyperbolic rendezvous phase is done with at least two burns.

  5. Influence of Oxidation in Starting Material Sn on Electric Transport Properties of SnSe Single Crystals

    NASA Astrophysics Data System (ADS)

    Yamashita, Aichi; Ogiso, Osamu; Matsumoto, Ryo; Tanaka, Masashi; Hara, Hiroshi; Tanaka, Hiromi; Takeya, Hiroyuki; Lee, Chul-Ho; Takano, Yoshihiko

    2018-06-01

    We found that the electronic transport properties of SnSe single crystals are sensitively affected by oxidation of the raw Sn. Semiconducting SnSe single crystals were obtained by using Sn in grain form as the starting material, while powder Sn resulted in metallic SnSe. X-ray photoelectron spectroscopy analysis revealed that the surfaces of the raw Sn were oxidized, with a lower oxidized volume fraction in grain Sn. This indicates that the amount of oxygen in the raw Sn is the key factor for the electronic transport properties of SnSe.

  6. Scientific guidelines for preservation of samples collected from Mars

    NASA Technical Reports Server (NTRS)

    Gooding, James L. (Editor)

    1990-01-01

    The maximum scientific value of Martian geologic and atmospheric samples is retained when the samples are preserved in the conditions that applied prior to their collection. Any sample degradation equates to loss of information. Based on a detailed review of the pertinent scientific literature, and advice from experts in planetary sample analysis, numerical values are recommended for key parameters in the environmental control of collected samples with respect to material contamination, temperature, head-space gas pressure, ionizing radiation, magnetic fields, and acceleration/shock. Parametric values recommended for the most sensitive geologic samples should also be adequate to preserve any biogenic compounds or exobiological relics.

  7. Pharmacogenomics of Cisplatin Sensitivity in Non-small Cell Lung Cancer

    PubMed Central

    Rose, Maimon C.; Kostyanovskaya, Elina; Huang, R. Stephanie

    2014-01-01

    Cisplatin, a platinum-based chemotherapeutic drug, has been used for over 30 years in a wide variety of cancers with varying degrees of success. In particular, cisplatin has been used to treat late stage non-small cell lung cancer (NSCLC) as the standard of care. However, therapeutic outcomes vary from patient to patient. Considerable efforts have been invested to identify biomarkers that can be used to predict cisplatin sensitivity in NSCLC. Here we reviewed current evidence for cisplatin sensitivity biomarkers in NSCLC. We focused on several key pathways, including nucleotide excision repair, drug transport and metabolism. Both expression and germline DNA variation were evaluated in these key pathways. Current evidence suggests that cisplatin-based treatment could be improved by the use of these biomarkers. PMID:25449594

  8. High sensitivity optical molecular imaging system

    NASA Astrophysics Data System (ADS)

    An, Yu; Yuan, Gao; Huang, Chao; Jiang, Shixin; Zhang, Peng; Wang, Kun; Tian, Jie

    2018-02-01

    Optical Molecular Imaging (OMI) has the advantages of high sensitivity, low cost and ease of use. By labeling the regions of interest with fluorescent or bioluminescent probes, OMI can noninvasively obtain the distribution of the probes in vivo, which plays a key role in cancer research, pharmacokinetics and other biological studies. In preclinical and clinical applications, imaging depth, resolution and sensitivity are the key factors for researchers using OMI. In this paper, we report a high-sensitivity optical molecular imaging system developed by our group, which improves the imaging depth in phantoms to nearly 5 cm, with high resolution at 2 cm depth and high image sensitivity. To validate the performance of the system, specially designed phantom experiments and a weak-light detection experiment were implemented. The results show that, in cooperation with a high-performance electron-multiplying charge-coupled device (EMCCD) camera, a precisely designed light-path system and highly efficient imaging techniques, our OMI system can simultaneously collect the light signals generated by fluorescence molecular imaging, bioluminescence imaging, Cherenkov luminescence and other optical imaging modalities, and observe the internal distribution of light-emitting agents quickly and accurately.

  9. Identification of sensitive parameters in the modeling of SVOC reemission processes from soil to atmosphere.

    PubMed

    Loizeau, Vincent; Ciffroy, Philippe; Roustan, Yelva; Musson-Genon, Luc

    2014-09-15

    Semi-volatile organic compounds (SVOCs) are subject to long-range atmospheric transport because of successive transport-deposition-reemission processes. Several experimental data sets available in the literature suggest that soil is a non-negligible contributor of SVOCs to the atmosphere. Coupling soil and atmosphere in integrated models and simulating reemission processes can therefore be essential for estimating atmospheric concentrations of several pollutants. However, the sources of uncertainty and variability are multiple (soil properties, meteorological conditions, chemical-specific parameters) and can significantly influence the determination of reemissions. In order to identify the key parameters in reemission modeling and their effect on global modeling uncertainty, we conducted a sensitivity analysis targeted on the 'reemission' output variable. Different parameters were tested, including soil properties, partition coefficients and meteorological conditions. We performed EFAST sensitivity analysis for four chemicals (benzo-a-pyrene, hexachlorobenzene, PCB-28 and lindane) and different spatial scenarios (regional and continental scales). Partition coefficients between air, solid and water phases are influential, depending on the precision of the data and the global behavior of the chemical. Reemissions showed lower sensitivity to the soil parameters (soil organic matter and water contents at field capacity and wilting point). A mapping of these parameters at a regional scale is sufficient to correctly estimate reemissions when compared to other sources of uncertainty. Copyright © 2014 Elsevier B.V. All rights reserved.
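
    EFAST estimates the variance-based first-order index Si = Var(E[Y|Xi]) / Var(Y) through Fourier sampling; the same quantity can be approximated with plain Monte Carlo and binning, which makes the idea concrete. A toy sketch for a hypothetical two-input model (not the actual reemission model of the study):

```python
import random

def first_order_indices(model, n_bins=20, n=20000, seed=0):
    """Crude variance-based first-order sensitivity indices for a
    two-input model with independent U(0,1) inputs, estimated as
    Var(E[Y|Xi]) / Var(Y) via binning. EFAST targets the same
    quantity through a more efficient Fourier sampling scheme."""
    rng = random.Random(seed)
    xs = [(rng.random(), rng.random()) for _ in range(n)]
    ys = [model(x1, x2) for x1, x2 in xs]
    mean = sum(ys) / n
    var_y = sum((y - mean) ** 2 for y in ys) / n
    indices = []
    for dim in range(2):
        bins = [[] for _ in range(n_bins)]
        for x, y in zip(xs, ys):
            bins[min(int(x[dim] * n_bins), n_bins - 1)].append(y)
        cond_means = [sum(b) / len(b) for b in bins if b]
        var_cond = sum((m - mean) ** 2 for m in cond_means) / len(cond_means)
        indices.append(var_cond / var_y)
    return indices
```

    For Y = 3*x1 + x2 with independent U(0,1) inputs, the analytic indices are 0.9 and 0.1, and this estimator recovers them to within a few percent.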

  10. Evaluation of dried blood spot samples for screening of hepatitis C and human immunodeficiency virus in a real-world setting.

    PubMed

    Vázquez-Morón, Sonia; Ryan, Pablo; Ardizone-Jiménez, Beatriz; Martín, Dolores; Troya, Jesus; Cuevas, Guillermo; Valencia, Jorge; Jimenez-Sousa, María A; Avellón, Ana; Resino, Salvador

    2018-01-30

    Both hepatitis C virus (HCV) infection and human immunodeficiency virus (HIV) infection are underdiagnosed, particularly in low-income countries and in difficult-to-access populations. Our aim was to develop and evaluate a methodology for the detection of HCV and HIV infection based on capillary dry blood spot (DBS) samples taken under real-world conditions. We carried out a cross-sectional study of 139 individuals (31 healthy controls, 68 HCV-monoinfected patients, and 40 HCV/HIV-coinfected patients). ELISA was used for anti-HCV and anti-HIV antibody detection; and SYBR Green RT-PCR was used for HCV-RNA detection. The HIV serological analysis revealed 100% sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV). The HCV serological analysis revealed a sensitivity of 92.6%, specificity of 100%, PPV of 100%, and NPV of 79.5%. Finally, the HCV-RNA detection test revealed a detection limit of 5 copies/µl with an efficiency of 100% and sensitivity of 99.1%, specificity of 100%, PPV of 100%, and NPV of 96.9%. In conclusion, our methodology was able to detect both HCV infection and HIV infection from the same DBS sample with good diagnostic performance. Screening for HCV and HIV using DBS might be a key strategy in the implementation of national programs for the control of both infections.
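
    All four reported figures derive from a single 2x2 confusion matrix. A minimal sketch (the example counts are illustrative, chosen only to mimic the HCV serology pattern above, not the study's actual tallies):

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV (as fractions) from
    true/false positive and negative counts of a diagnostic test."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }
```

    For instance, diagnostic_performance(63, 0, 5, 31) yields a sensitivity of about 92.6% with perfect specificity and PPV, the shape of result reported for the HCV antibody test above.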

  11. Metabolomic Analysis of Key Central Carbon Metabolism Carboxylic Acids as Their 3-Nitrophenylhydrazones by UPLC/ESI-MS

    PubMed Central

    Han, Jun; Gagnon, Susannah; Eckle, Tobias; Borchers, Christoph H.

    2014-01-01

    Multiple hydroxy-, keto-, di-, and tri-carboxylic acids are among the cellular metabolites of central carbon metabolism (CCM). Sensitive and reliable analysis of these carboxylates is important for many biological and cell engineering studies. In this work, we examined 3-nitrophenylhydrazine as a derivatizing reagent and optimized the reaction conditions for the measurement of ten CCM-related carboxylic compounds, including glycolate, lactate, malate, fumarate, succinate, citrate, isocitrate, pyruvate, oxaloacetate, and α-ketoglutarate, as their 3-nitrophenylhydrazones using LC/MS with electrospray ionization. With the derivatization protocol which we have developed, and using negative-ion multiple reaction monitoring on a triple-quadrupole instrument, all of the carboxylates showed good linearity within a dynamic range of ca. 200 to more than 2000. The on-column limits of detection and quantitation were from high femtomoles to low picomoles. The analytical accuracies for eight of the ten analytes were determined to be between 89.5% and 114.8% (CV ≤ 7.4%, n = 6). Using a quadrupole time-of-flight instrument, the isotopic distribution patterns of these carboxylates, extracted from a 13C-labeled mouse heart, were successfully determined by UPLC/MS with full-mass detection, indicating the possible utility of this analytical method for metabolic flux analysis. In summary, this work demonstrates an efficient chemical derivatization LC/MS method for metabolomic analysis of these key CCM intermediates in a biological matrix. PMID:23580203

  12. Long term load forecasting accuracy in electric utility integrated resource planning

    DOE PAGES

    Carvallo, Juan Pablo; Larsen, Peter H.; Sanstad, Alan H.; ...

    2018-05-23

    Forecasts of electricity consumption and peak demand over time horizons of one or two decades are a key element in electric utilities' meeting their core objective and obligation to ensure reliable and affordable electricity supplies for their customers while complying with a range of energy and environmental regulations and policies. These forecasts are an important input to integrated resource planning (IRP) processes involving utilities, regulators, and other stakeholders. Despite their importance, however, there has been little analysis of long-term utility load forecasting accuracy. We conduct a retrospective analysis of the long-term load forecasts of twelve Western U.S. electric utilities in the mid-2000s and find that most overestimated both energy consumption and peak demand growth. A key reason for this was the use of assumptions that led to an overestimation of economic growth. We find that the complexity of forecast methods and the accuracy of these forecasts are mildly correlated. In addition, sensitivity and risk analysis of load growth and its implications for capacity expansion were not well integrated with subsequent implementation. Finally, we review changes in the utilities' load forecasting methods over the subsequent decade, and discuss the policy implications of long-term load forecast inaccuracy and its underlying causes.
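
    The retrospective evaluation described above reduces to comparing each forecast with the realized value; a signed percentage error makes over- versus underestimation explicit. A minimal sketch (numbers illustrative, not the study's data):

```python
def mean_percentage_error(forecasts, actuals):
    """Signed mean percentage error: positive means the forecasts
    overestimated realized load, negative means they underestimated."""
    errors = [100.0 * (f - a) / a for f, a in zip(forecasts, actuals)]
    return sum(errors) / len(errors)
```

    A utility that forecast 110 and 120 TWh against realized loads of 100 and 100 TWh overestimated by 15% on average, the direction of bias the study reports for most of the twelve utilities.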

  13. Anisotropy enhanced X-ray scattering from solvated transition metal complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biasin, Elisa; van Driel, Tim B.; Levi, Gianluca

    Time-resolved X-ray scattering patterns from photoexcited molecules in solution are in many cases anisotropic at the ultrafast time scales accessible at X-ray free-electron lasers (XFELs). This anisotropy arises from the interaction of a linearly polarized UV–Vis pump laser pulse with the sample, which induces anisotropic structural changes that can be captured by femtosecond X-ray pulses. In this work, a method for quantitative analysis of the anisotropic scattering signal arising from an ensemble of molecules is described, and it is demonstrated how its use can enhance the structural sensitivity of the time-resolved X-ray scattering experiment. This method is applied to time-resolved X-ray scattering patterns measured upon photoexcitation of a solvated di-platinum complex at an XFEL, and the key parameters involved are explored. It is shown that a combined analysis of the anisotropic and isotropic difference scattering signals in this experiment allows a more precise determination of the main photoinduced structural change in the solute, i.e. the change in Pt–Pt bond length, and yields more information on the excitation channels than the analysis of the isotropic scattering only. Finally, it is discussed how the anisotropic transient response of the solvent can enable the determination of key experimental parameters such as the instrument response function.


  16. Analytical Chemistry: A retrospective view on some current trends.

    PubMed

    Niessner, Reinhard

    2018-04-01

    In a retrospective view, some current trends in Analytical Chemistry are outlined and connected to work published more than a hundred years ago in the same field. For example, gravimetric microanalysis after specific precipitation, once the sole basis for chemical analysis, has been transformed into a mass-sensitive transducer in combination with compound-specific receptors. Molecular spectroscopy, long dominated by classical absorption/emission techniques for detecting elements or molecules, is experiencing a shift towards Raman spectroscopy, which allows analysis of a multitude of additional features. Chemical sensors are now used to perform a vast number of analytical measurements. Especially paper-based devices (dipsticks, microfluidic pads) are celebrating a revival, as they can potentially revolutionize medicine in the developing world. Industry 4.0 will lead to a further increase in sensor applications. Preceding separation and enrichment of analytes from complicated matrices remains the backbone of a successful analysis, despite increasing attempts to avoid clean-up. Continuous separation techniques will become a key element for 24/7 production of goods with certified quality. Attempts to obtain instantaneous and specific chemical information by optical or electrical transduction will need highly selective receptors in large quantities. Further understanding of ligand-receptor complex structures is the key to successful generation of artificial bio-inspired receptors. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Diagnostic value of urinary tissue inhibitor of metalloproteinase-2 and insulin-like growth factor binding protein 7 for acute kidney injury: a meta-analysis.

    PubMed

    Jia, Hui-Miao; Huang, Li-Feng; Zheng, Yue; Li, Wen-Xiong

    2017-03-25

    Tissue inhibitor of metalloproteinase-2 (TIMP-2) and insulin-like growth factor binding protein 7 (IGFBP7), inducers of G1 cell-cycle arrest, are two recently discovered, promising biomarkers for early diagnosis of acute kidney injury (AKI). To obtain a more robust performance measurement, the present meta-analysis was performed, pooling existing studies. Literature in the MEDLINE (via PubMed), Ovid, Embase, and Cochrane Library databases was systematically searched from inception to 12 October 2016. Studies that met the set inclusion and exclusion criteria were identified by two independent investigators. The diagnostic value of urinary [TIMP-2] × [IGFBP7] for AKI was evaluated by pooled sensitivity, specificity, likelihood ratio (LR), diagnostic odds ratio (DOR), and summary receiver operating characteristic (SROC) curve analyses. The causes of heterogeneity were explored by sensitivity and subgroup analyses. A total of nine published and eligible studies assessing 1886 cases were included in this meta-analysis. Early diagnostic value of urinary [TIMP-2] × [IGFBP7] for AKI was assessed using a random-effects model. Pooled sensitivity and specificity with corresponding 95% CIs were 0.83 (95% CI 0.79-0.87, heterogeneity I² = 68.8%) and 0.55 (95% CI 0.52-0.57, I² = 92.9%), respectively. Pooled positive LR, negative LR, and DOR were 2.37 (95% CI 1.87-2.99, I² = 82.6%), 0.30 (95% CI 0.21-0.41, I² = 43.4%), and 9.92 (95% CI 6.09-16.18, I² = 38.5%), respectively. The AUC estimated by SROC was 0.846 (SE 0.027) with a Q* value of 0.777 (SE 0.026). Sensitivity analysis indicated that one study significantly affected the stability of the pooled results. Subgroup analysis showed that population setting and AKI threshold were the key factors causing heterogeneity in pooled sensitivity and specificity. On the basis of recent evidence, urinary [TIMP-2] × [IGFBP7] is an effective predictive factor of AKI. PROSPERO registration number: CRD42016051186. Registered on 10 November 2016.
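The pooled sensitivity, specificity, likelihood ratios, and DOR reported above are all derived from per-study 2 × 2 tables. As a hedged sketch on hypothetical counts (the meta-analysis itself used a random-effects model; simply summing tables, as below, is only a naive illustration of the quantities involved):

```python
# Diagnostic accuracy metrics from 2x2 tables (TP, FP, FN, TN), with a naive
# pooled estimate obtained by summing tables. Hypothetical counts; the actual
# meta-analysis used a random-effects model rather than this simple pooling.

def diagnostic_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)                 # sensitivity
    spec = tn / (tn + fp)                 # specificity
    lr_pos = sens / (1 - spec)            # positive likelihood ratio
    lr_neg = (1 - sens) / spec            # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg, lr_pos / lr_neg  # last item: DOR

studies = [(40, 30, 8, 60), (55, 25, 10, 70), (33, 20, 7, 45)]  # (TP, FP, FN, TN)

pooled = [sum(col) for col in zip(*studies)]          # element-wise sum of tables
sens, spec, lr_pos, lr_neg, dor = diagnostic_metrics(*pooled)
print(f"pooled sensitivity={sens:.2f}, specificity={spec:.2f}, DOR={dor:.1f}")
```

Note that the DOR computed this way equals (TP × TN)/(FP × FN) on the summed table, a useful internal consistency check.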

  18. Highly hydrogen-sensitive thermal desorption spectroscopy system for quantitative analysis of low hydrogen concentration (~1 × 10¹⁶ atoms/cm³) in thin-film samples

    NASA Astrophysics Data System (ADS)

    Hanna, Taku; Hiramatsu, Hidenori; Sakaguchi, Isao; Hosono, Hideo

    2017-05-01

    We developed a highly hydrogen-sensitive thermal desorption spectroscopy (HHS-TDS) system to detect and quantitatively analyze low hydrogen concentrations in thin films. The system was connected to an in situ sample-transfer chamber system, manipulators, and an rf magnetron sputtering thin-film deposition chamber under an ultra-high-vacuum (UHV) atmosphere of ~10⁻⁸ Pa. The following key requirements were proposed in developing the HHS-TDS: (i) a low hydrogen residual partial pressure, (ii) a low hydrogen exhaust velocity, and (iii) minimization of hydrogen thermal desorption except from the bulk region of the thin films. To satisfy these requirements, appropriate materials and components were selected, and the system was constructed to extract the maximum performance from each component. Consequently, ~2000 times higher sensitivity to hydrogen than that of a commercially available UHV-TDS system was achieved using H⁺-implanted Si samples. Quantitative analysis of an amorphous oxide semiconductor InGaZnO4 thin film (1 cm × 1 cm × 1 μm thickness, hydrogen concentration of 4.5 × 10¹⁷ atoms/cm³) was demonstrated using the HHS-TDS system. This concentration level cannot be detected using UHV-TDS or secondary ion mass spectroscopy (SIMS) systems. The hydrogen detection limit of the HHS-TDS system was estimated to be ~1 × 10¹⁶ atoms/cm³, which implies ~2 orders of magnitude higher sensitivity than that of SIMS and resonance nuclear reaction systems (~10¹⁸ atoms/cm³).

  19. Detailed Functional and Proteomic Characterization of Fludarabine Resistance in Mantle Cell Lymphoma Cells

    PubMed Central

    Lorkova, Lucie; Scigelova, Michaela; Arrey, Tabiwang Ndipanquang; Vit, Ondrej; Pospisilova, Jana; Doktorova, Eliska; Klanova, Magdalena; Alam, Mahmudul; Vockova, Petra; Maswabi, Bokang

    2015-01-01

    Mantle cell lymphoma (MCL) is a chronically relapsing aggressive type of B-cell non-Hodgkin lymphoma considered incurable by currently used treatment approaches. Fludarabine is a purine analog still widely used clinically in the therapy of relapsed MCL. Molecular mechanisms of fludarabine resistance have not, however, been studied in the setting of MCL so far. We therefore derived fludarabine-resistant MCL cells (Mino/FR) and performed their detailed functional and proteomic characterization compared to the original fludarabine-sensitive cells (Mino). We demonstrated that Mino/FR were highly cross-resistant to other antinucleosides (cytarabine, cladribine, gemcitabine) and to an inhibitor of Bruton tyrosine kinase (BTK), ibrutinib. Sensitivity to other types of anti-lymphoma agents was altered only mildly (methotrexate, doxorubicin, bortezomib) or remained unaffected (cisplatin, bendamustine). The detailed proteomic analysis of Mino/FR compared to Mino cells unveiled over 300 differentially expressed proteins. Mino/FR were characterized by the marked downregulation of deoxycytidine kinase (dCK) and BTK (thus explaining the observed cross-resistance to antinucleosides and ibrutinib), but also by the upregulation of several enzymes of de novo nucleotide synthesis, as well as the upregulation of numerous proteins of DNA repair and replication. The significant upregulation of the key antiapoptotic protein Bcl-2 in Mino/FR cells was associated with the markedly increased sensitivity of the fludarabine-resistant MCL cells to the Bcl-2-specific inhibitor ABT199 compared to fludarabine-sensitive cells. Our data thus demonstrate that a detailed molecular analysis of drug-resistant tumor cells can indeed open a way to personalized therapy of resistant malignancies. PMID:26285204

  20. Robust ECC-based authenticated key agreement scheme with privacy protection for Telecare medicine information systems.

    PubMed

    Zhang, Liping; Zhu, Shaohui

    2015-05-01

    To protect the transmission of sensitive medical data, a secure and efficient authenticated key agreement scheme should be deployed when the healthcare delivery session is established via Telecare Medicine Information Systems (TMIS) over the unsecure public network. Recently, Islam and Khan proposed an authenticated key agreement scheme using elliptic curve cryptography for TMIS. They claimed that their proposed scheme is provably secure against various attacks in the random oracle model and enjoys some good properties such as user anonymity. In this paper, however, we point out that any legal but malicious patient can reveal other users' identities. Consequently, their scheme suffers from server spoofing and off-line password guessing attacks. Moreover, if the malicious patient registers at the same time as other users, she can further launch impersonation, man-in-the-middle, modification, replay, and strong replay attacks successfully. To eliminate these weaknesses, we propose an improved ECC-based authenticated key agreement scheme. Security analysis demonstrates that the proposed scheme can resist various attacks and enables the patient to enjoy the remote healthcare services with privacy protection. Through the performance evaluation, we show that the proposed scheme achieves a desired balance between security and performance in comparison with other related schemes.
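The scheme reviewed above builds on elliptic-curve Diffie-Hellman-style key agreement. As a conceptual sketch only, the toy finite-field Diffie-Hellman exchange below (with deliberately tiny, insecure textbook parameters, and none of the authentication the paper adds) shows how two parties derive a common session secret:

```python
# Toy finite-field Diffie-Hellman exchange illustrating shared-secret
# derivation. The actual TMIS scheme uses elliptic-curve groups and adds
# authentication; these tiny textbook parameters (p=23, g=5) are insecure
# and purely illustrative.
import secrets

p, g = 23, 5

def keypair():
    priv = secrets.randbelow(p - 2) + 1    # private exponent in [1, p-2]
    return priv, pow(g, priv, p)           # (private key, public value)

a_priv, a_pub = keypair()                  # e.g. the patient
b_priv, b_pub = keypair()                  # e.g. the TMIS server

shared_a = pow(b_pub, a_priv, p)           # patient's view of the secret
shared_b = pow(a_pub, b_priv, p)           # server's view of the secret
assert shared_a == shared_b                # both sides derive the same key material
print("shared secret:", shared_a)
```

In a real deployment the public values would be elliptic-curve points over a standardized curve, and the exchange would be bound to the parties' identities to stop exactly the impersonation and man-in-the-middle attacks discussed above.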

  1. Missing data in trial-based cost-effectiveness analysis: An incomplete journey.

    PubMed

    Leurent, Baptiste; Gomes, Manuel; Carpenter, James R

    2018-06-01

    Cost-effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantive challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial-based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty-two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost-effectiveness data was 63% (interquartile range: 47%-81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing-at-random assumption. Further improvements are needed to address missing data in cost-effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing-at-random assumption. © 2018 The Authors Health Economics published by John Wiley & Sons Ltd.
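The contrast between complete-case analysis and imputation discussed above can be made concrete with a toy example. The sketch below, on hypothetical cost data, uses single mean imputation, which preserves the point estimate but artificially shrinks the variance; that understated uncertainty is one reason proper multiple imputation, drawing several plausible values per missing observation, is preferred:

```python
# Complete-case analysis vs. single mean imputation on hypothetical trial
# cost data (None = missing follow-up). Mean imputation keeps the mean but
# shrinks the variance, understating uncertainty; multiple imputation
# addresses this by drawing several plausible values instead.
from statistics import mean, stdev

costs = [1200, None, 950, 3100, None, 780, 1600]

observed = [c for c in costs if c is not None]
fill = mean(observed)
imputed = [c if c is not None else fill for c in costs]

print(f"complete case: n={len(observed)}, mean={mean(observed):.0f}, sd={stdev(observed):.0f}")
print(f"mean-imputed:  n={len(imputed)}, mean={mean(imputed):.0f}, sd={stdev(imputed):.0f}")
```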

  2. Identifying drought response of semi-arid aeolian systems using near-surface luminescence profiles and changepoint analysis, Nebraska Sandhills.

    NASA Astrophysics Data System (ADS)

    Buckland, Catherine; Bailey, Richard; Thomas, David

    2017-04-01

    Two billion people living in drylands are affected by land degradation. Sediment erosion by wind and water removes fertile soil and destabilises landscapes. Vegetation disturbance is a key driver of dryland erosion caused by both natural and human forcings: drought, fire, land use, grazing pressure. A quantified understanding of vegetation cover sensitivities and resultant surface change to forcing factors is needed if the vegetation and landscape responses to future climate change and human pressure are to be better predicted. Using quartz luminescence dating and statistical changepoint analysis (Killick & Eckley, 2014) this study demonstrates the ability to identify step-changes in depositional age of near-surface sediments. Lx/Tx luminescence profiles coupled with statistical analysis show the use of near-surface sediments in providing a high-resolution record of recent system response and aeolian system thresholds. This research determines how the environment has recorded and retained sedimentary evidence of drought response and land use disturbances over the last two hundred years across both individual landforms and the wider Nebraska Sandhills. Identifying surface deposition and comparing with records of climate, fire and land use changes allows us to assess the sensitivity and stability of the surface sediment to a range of forcing factors. Killick, R. and Eckley, I.A. (2014) "changepoint: An R Package for Changepoint Analysis." Journal of Statistical Software, (58) 1-19.
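The statistical changepoint analysis cited above can be illustrated with a minimal single-changepoint search: choose the split that minimizes the within-segment sum of squared errors. This is only a sketch of the idea; the changepoint R package of Killick & Eckley implements far more general penalized, multi-changepoint algorithms. The profile values below are hypothetical:

```python
# Minimal single-changepoint search on a 1-D series: choose the split index
# minimizing the within-segment sum of squared errors. A sketch of the idea
# behind the cited changepoint methods, on hypothetical Lx/Tx-like values.

def sse(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def single_changepoint(series):
    best_k, best_cost = None, float("inf")
    for k in range(1, len(series)):        # candidate split before index k
        cost = sse(series[:k]) + sse(series[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

profile = [1.0, 1.1, 0.9, 1.0, 1.1, 3.0, 3.2, 2.9, 3.1]
print("changepoint at index", single_changepoint(profile))   # step occurs at index 5
```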

  3. Label free quantitative proteomics analysis on the cisplatin resistance in ovarian cancer cells.

    PubMed

    Wang, F; Zhu, Y; Fang, S; Li, S; Liu, S

    2017-05-20

    Quantitative proteomics has made great progress in recent years, and label-free quantitative proteomics analysis based on mass spectrometry is widely used. Using this technique, we determined the differentially expressed proteins in the cisplatin-sensitive ovarian cancer cells COC1 and cisplatin-resistant cells COC1/DDP before and after the application of cisplatin. Using GO analysis, we classified those proteins into different subgroups based on their cellular component, biological process, and molecular function. We also used KEGG pathway analysis to determine the key signal pathways that those proteins were involved in. There are 710 differentially expressed proteins between COC1 and COC1/DDP cells, 783 between COC1 and COC1/DDP cells treated with cisplatin, 917 between the COC1/DDP cells and COC1/DDP cells treated with LaCl3, and 775 between COC1/DDP cells treated with cisplatin and COC1/DDP cells treated with cisplatin and LaCl3. Among the 411 differentially expressed proteins shared between cisplatin-sensitive COC1 cells and cisplatin-resistant COC1/DDP cells before and after cisplatin treatment, 14% were localized on the cell membrane. According to the KEGG results, differentially expressed proteins were classified into 21 groups. The most abundant proteins were involved in the spliceosome. This study lays a foundation for deciphering the mechanism of drug resistance in ovarian tumors.

  4. Co-acting gene networks predict TRAIL responsiveness of tumour cells with high accuracy.

    PubMed

    O'Reilly, Paul; Ortutay, Csaba; Gernon, Grainne; O'Connell, Enda; Seoighe, Cathal; Boyce, Susan; Serrano, Luis; Szegezdi, Eva

    2014-12-19

    Identification of differentially expressed genes from transcriptomic studies is one of the most common mechanisms to identify tumor biomarkers. This approach, however, is not well suited to identify interactions between genes whose protein products potentially influence each other, which limits its power to identify the molecular wiring of tumour cells dictating response to a drug. Because signal transduction pathways are not linear and are highly interlinked, the biological response they drive may be better described by the relative amounts of their components and their functional relationships than by their individual, absolute expression. Gene expression microarray data for 109 tumor cell lines with known sensitivity to the death ligand cytokine tumor necrosis factor-related apoptosis-inducing ligand (TRAIL) were used to identify genes with potential functional relationships determining responsiveness to TRAIL-induced apoptosis. The machine learning technique Random Forest in the statistical environment "R" with backward elimination was used to identify the key predictors of TRAIL sensitivity, and differentially expressed genes were identified using the software GeneSpring. Gene co-regulation and statistical interaction were assessed with q-order partial correlation analysis and non-rejection rate. Biological (functional) interactions amongst the co-acting genes were studied with Ingenuity network analysis. Prediction accuracy was assessed by calculating the area under the receiver operating characteristic curve using an independent dataset. We show that the gene panel identified could predict TRAIL-sensitivity with a very high degree of sensitivity and specificity (AUC = 0.84). The genes in the panel are co-regulated and at least 40% of them functionally interact in signal transduction pathways that regulate cell death and cell survival, cellular differentiation and morphogenesis.
Importantly, only 12% of the TRAIL-predictor genes were differentially expressed highlighting the importance of functional interactions in predicting the biological response. The advantage of co-acting gene clusters is that this analysis does not depend on differential expression and is able to incorporate direct- and indirect gene interactions as well as tissue- and cell-specific characteristics. This approach (1) identified a descriptor of TRAIL sensitivity which performs significantly better as a predictor of TRAIL sensitivity than any previously reported gene signatures, (2) identified potential novel regulators of TRAIL-responsiveness and (3) provided a systematic view highlighting fundamental differences between the molecular wiring of sensitive and resistant cell types.
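An AUC of 0.84, as reported above, has a direct probabilistic reading: it is the chance that a randomly chosen sensitive cell line receives a higher predicted score than a randomly chosen resistant one. A minimal rank-based AUC on hypothetical scores (not the study's data):

```python
# Rank-based AUC: the probability that a random positive outscores a random
# negative (ties count half). Scores and labels below are hypothetical.

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.45, 0.7, 0.4, 0.2, 0.6, 0.35, 0.1]   # model outputs
labels = [1,   0,    1,   1,   0,   1,   0,    0]     # 1 = TRAIL-sensitive
print(f"AUC = {auc(scores, labels):.3f}")
```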

  5. Synovial Fluid α-Defensin as a Biomarker for Peri-Prosthetic Joint Infection: A Systematic Review and Meta-Analysis.

    PubMed

    Li, Bin; Chen, Fei; Liu, Yi; Xu, Guokang

    Total joint arthroplasty (TJA) has been one of the most beneficial interventions for treating patients suffering from joint disorders. However, peri-prosthetic joint infection (PJI) is a serious complication that often accompanies TJA, and the diagnosis of PJI remains difficult. Questions remain regarding whether certain biomarkers can be valuable in the diagnosis of PJI. We conducted our systematic review by searching PubMed, Embase, Web of Science, the Cochrane Library, and Science Direct with the key words "periprosthetic joint infection," "synovial fluid," and "α-defensin." Studies that provided sufficient data to construct 2 × 2 contingency tables were chosen based on inclusion and exclusion criteria. The quality of included studies was assessed according to the revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) criteria. The pooled sensitivity, specificity, and diagnostic odds ratio (DOR) were calculated for the included studies. The summary receiver operating characteristic (SROC) curve and the area under the summary receiver operating characteristic curve (AUSROC) were used to evaluate the overall diagnostic performance. Eight studies were included in this systematic review; of these, four articles were included in the meta-analysis, covering a total of 421 participants. The pooled sensitivity, specificity, and DOR were 0.98 (95% confidence interval [CI]: 0.94-1.00), 0.97 (95% CI: 0.95-0.99), and 1095.49 (95% CI: 283.68-4230.45), respectively. The AUSROC was 0.9949 (standard error [SE] 0.0095). Synovial fluid α-defensin is a biomarker of high sensitivity and specificity for the diagnosis of PJI.

  6. Spectral Sensitivity of the ctenid spider Cupiennius salei Keys

    PubMed Central

    Zopf, Lydia M.; Schmid, Axel; Fredman, David; Eriksson, Bo Joakim

    2014-01-01

    The spectral sensitivity of adult male Cupiennius salei Keys, a nocturnal hunting spider, was studied in a behavioural test. As known from earlier behavioural tests, C. salei walks towards a black target presented in front of a white background. In this study a black target (size 42 × 70 cm) was presented in a white arena illuminated by monochromatic light in the range of 365 to 695 nm using 19 monochromatic filters (HW in the range of 6-10 nm). In the first trial, the transmission of the optical filters was between 40% and 80%. In a second trial the transmission was reduced to 5%, using a neutral density filter. At the high intensity the spiders showed a spectral sensitivity in the range from 380 to 670 nm. In the second trial the animals only showed directed walks if the illumination was in the range of 449 to 599 nm, indicating a lower sensitivity at the margins of the spectral range. In previous intracellular recordings, the measured spectral sensitivity was between 320 and 620 nm. Interestingly, these results do not completely match the behaviourally tested spectral sensitivity of the photoreceptors, where the sensitivity range is shifted to longer wavelengths. In order to investigate the molecular background of spectral sensitivity, we searched for opsin genes in C. salei. We found three visual opsins that correspond to the UV and middle-to-long-wavelength-sensitive opsins described for jumping spiders. PMID:23948480

  7. Carbon and water flux responses to physiology by environment interactions: a sensitivity analysis of variation in climate on photosynthetic and stomatal parameters

    NASA Astrophysics Data System (ADS)

    Bauerle, William L.; Daniels, Alex B.; Barnard, David M.

    2014-05-01

    Sensitivity of carbon uptake and water use estimates to changes in physiology was determined with a coupled photosynthesis and stomatal conductance (gs) model, linked to canopy microclimate with a spatially explicit scheme (MAESTRA). The sensitivity analyses were conducted over the range of intraspecific physiology parameter variation observed for Acer rubrum L. and temperate hardwood C3 vegetation across the following climate conditions: carbon dioxide concentration 200-700 ppm, photosynthetically active radiation 50-2,000 μmol m⁻² s⁻¹, air temperature 5-40 °C, relative humidity 5-95%, and wind speed at the top of the canopy 1-10 m s⁻¹. Five key physiological inputs [quantum yield of electron transport (α), minimum stomatal conductance (g0), stomatal sensitivity to the marginal water cost of carbon gain (g1), maximum rate of electron transport (Jmax), and maximum carboxylation rate of Rubisco (Vcmax)] changed carbon and water flux estimates ≥15% in response to climate gradients; variation in α, Jmax, and Vcmax input resulted in up to ~50 and 82% intraspecific and C3 photosynthesis estimate output differences, respectively. Transpiration estimates were affected up to ~46 and 147% by differences in intraspecific and C3 g1 and g0 values, two parameters previously overlooked in modeling land-atmosphere carbon and water exchange. We show that a variable environment, within a canopy or along a climate gradient, changes the spatial parameter effects of g0, g1, α, Jmax, and Vcmax in photosynthesis-gs models. Since variation in physiology parameter input effects is dependent on climate, this approach can be used to assess the geographical importance of key physiology model inputs when estimating large-scale carbon and water exchange.
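A one-at-a-time perturbation of the kind used in such sensitivity analyses can be sketched on a toy light-response model. The hyperbolic stand-in below is not the coupled MAESTRA scheme; parameter names and values are hypothetical, and the +15% bump simply probes how strongly each input moves the output:

```python
# One-at-a-time sensitivity sketch: perturb each parameter of a toy
# light-response model by +15% and report the percent change in output.
# This stand-in model and its values are hypothetical, not MAESTRA.

def photosynthesis(par, alpha, a_max):
    """Toy hyperbolic light response: A = a_max*alpha*PAR / (alpha*PAR + a_max)."""
    return a_max * alpha * par / (alpha * par + a_max)

base = {"alpha": 0.05, "a_max": 20.0}      # hypothetical base parameter values
par = 1000.0                               # fixed light level for the sweep
a0 = photosynthesis(par, **base)

for name in base:
    bumped = dict(base, **{name: base[name] * 1.15})
    a1 = photosynthesis(par, **bumped)
    print(f"{name}: +15% input -> {100 * (a1 - a0) / a0:+.1f}% output change")
```

At high light the toy model is more sensitive to a_max than to alpha, illustrating how the importance of each parameter depends on the environmental conditions, as the abstract argues.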

  8. Qualitative analysis of tackifier resins in pressure sensitive adhesives using direct analysis in real time time-of-flight mass spectrometry.

    PubMed

    Mess, Aylin; Vietzke, Jens-Peter; Rapp, Claudius; Francke, Wittko

    2011-10-01

    Tackifier resins play an important role as additives in pressure sensitive adhesives (PSAs) to modulate their desired properties. With dependence on their origin and processing, tackifier resins can be multicomponent mixtures. Once they have been incorporated in a polymer matrix, conventional chemical analysis of tackifiers usually tends to be challenging because a suitable sample pretreatment and/or separation is necessary and all characteristic components have to be detected for an unequivocal identification of the resin additive. Nevertheless, a reliable analysis of tackifiers is essential for product quality and safety reasons. A promising approach for the examination of tackifier resins in PSAs is the novel direct analysis in real time mass spectrometry (DART-MS) technique, which enables screening analysis without time-consuming sample preparation. In the present work, four key classes of tackifier resins were studied (rosin, terpene phenolic, polyterpene, and hydrocarbon resins). Their corresponding complex mass spectra were interpreted and used as reference spectra for subsequent analyses. These data were used to analyze tackifier additives in synthetic rubber and acrylic adhesive matrices. To prove the efficiency of the developed method, complete PSA products containing two or three different tackifiers were analyzed. The tackifier resins were successfully identified, while measurement and interpretation took less than 10 min per sample. Determination of resin additives in PSAs can be performed down to 0.1% (w/w, limit of detection) using the three most abundant signals for each tackifier. In summary, DART-MS is a rapid and efficient screening method for the analysis of various tackifiers in PSAs.

  9. Intercomprehension: A Portal to Teachers' Intercultural Sensitivity

    ERIC Educational Resources Information Center

    Pinho, Ana Sofia

    2015-01-01

    The development of opportunities for teachers' professional development in plurilingual and intercultural education is a key issue in language teacher education and "intercomprehension" (IC) can provide a potential portal for the development of teachers' intercultural sensitivity. Particularly relevant to this is the creation of powerful…

  10. A high-efficiency HPGe coincidence system for environmental analysis.

    PubMed

    Britton, R; Davies, A V; Burnett, J L; Jackson, M J

    2015-08-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is supported by a network of certified laboratories which must meet certain sensitivity requirements for CTBT-relevant radionuclides. At the UK CTBT Radionuclide Laboratory (GBL15), a high-efficiency, dual-detector gamma spectroscopy system has been developed to improve the sensitivity of measurements for treaty compliance, greatly reducing the time required for each sample. Utilising list-mode acquisition, each sample can be counted once and processed multiple times to further improve sensitivity. For the 8 key radionuclides considered, minimum detectable activities (MDAs) were improved by up to 37% in standard mode (when compared to a typical CTBT detector system), with the acquisition time required to achieve the CTBT sensitivity requirements reduced from 6 days to only 3. When utilising the system in coincidence mode, the MDA for ⁶⁰Co in a high-activity source was improved by a factor of 34 when compared to a standard CTBT detector, and a factor of 17 when compared to the dual-detector system operating in standard mode. These MDA improvements will allow the accurate and timely quantification of radionuclides that decay via both singular and cascade γ emission, greatly enhancing the effectiveness of CTBT laboratories. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
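MDA figures like those above are conventionally computed from a Currie-style detection limit divided by efficiency, emission probability, and counting time. The sketch below uses the common approximation L_D ≈ 2.71 + 4.65·√B; the ⁶⁰Co-like inputs are hypothetical, not the laboratory's actual calibration:

```python
# Currie-style minimum detectable activity (MDA) for a gamma line, using the
# common approximation L_D = 2.71 + 4.65*sqrt(B). Inputs are hypothetical.
import math

def currie_mda(background_counts, efficiency, live_time_s, gamma_intensity):
    """MDA in Bq for a peak with the given background counts in the peak region."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)   # detection limit, counts
    return l_d / (efficiency * gamma_intensity * live_time_s)

t = 3 * 24 * 3600                       # 3 days of counting, in seconds
mda = currie_mda(400, 0.05, t, 0.9985)  # hypothetical 60Co-like measurement
print(f"MDA = {mda:.2e} Bq")
```

The formula makes the trade-offs in the abstract visible: halving the background under the peak (as coincidence gating does) or doubling the live time both lower the MDA.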

  11. Children with a history of SLI show reduced sensitivity to audiovisual temporal asynchrony: an ERP study.

    PubMed

    Kaganovich, Natalya; Schumaker, Jennifer; Leonard, Laurence B; Gustafson, Dana; Macias, Danielle

    2014-08-01

    The authors examined whether school-age children with a history of specific language impairment (H-SLI), their peers with typical development (TD), and adults differ in sensitivity to audiovisual temporal asynchrony and whether such difference stems from the sensory encoding of audiovisual information. Fifteen H-SLI children, 15 TD children, and 15 adults judged whether a flashed explosion-shaped figure and a 2-kHz pure tone occurred simultaneously. The stimuli were presented at 0-, 100-, 200-, 300-, 400-, and 500-ms temporal offsets. This task was combined with EEG recordings. H-SLI children were profoundly less sensitive to temporal separations between auditory and visual modalities compared with their TD peers. Those H-SLI children who performed better at simultaneity judgment also had higher language aptitude. TD children were less accurate than adults, revealing a remarkably prolonged developmental course of the audiovisual temporal discrimination. Analysis of early event-related potential components suggested that poor sensory encoding was not a key factor in H-SLI children's reduced sensitivity to audiovisual asynchrony. Audiovisual temporal discrimination is impaired in H-SLI children and is still immature during mid-childhood in TD children. The present findings highlight the need for further evaluation of the role of atypical audiovisual processing in the development of SLI.

  12. Children with a history of SLI show reduced sensitivity to audiovisual temporal asynchrony: An ERP Study

    PubMed Central

    Kaganovich, Natalya; Schumaker, Jennifer; Leonard, Laurence B.; Gustafson, Dana; Macias, Danielle

    2014-01-01

    Purpose We examined whether school-age children with a history of SLI (H-SLI), their typically developing (TD) peers, and adults differ in sensitivity to audiovisual temporal asynchrony and whether such difference stems from the sensory encoding of audiovisual information. Method 15 H-SLI children, 15 TD children, and 15 adults judged whether a flashed explosion-shaped figure and a 2 kHz pure tone occurred simultaneously. The stimuli were presented at 0, 100, 200, 300, 400, and 500 ms temporal offsets. This task was combined with EEG recordings. Results H-SLI children were profoundly less sensitive to temporal separations between auditory and visual modalities compared to their TD peers. Those H-SLI children who performed better at simultaneity judgment also had higher language aptitude. TD children were less accurate than adults, revealing a remarkably prolonged developmental course of the audiovisual temporal discrimination. Analysis of early ERP components suggested that poor sensory encoding was not a key factor in H-SLI children’s reduced sensitivity to audiovisual asynchrony. Conclusions Audiovisual temporal discrimination is impaired in H-SLI children and is still immature during mid-childhood in TD children. The present findings highlight the need for further evaluation of the role of atypical audiovisual processing in the development of SLI. PMID:24686922

  13. A novel immunochromatographic electrochemical biosensor for highly sensitive and selective detection of trichloropyridinol, a biomarker of exposure to chlorpyrifos.

    PubMed

    Wang, Limin; Lu, Donglai; Wang, Jun; Du, Dan; Zou, Zhexiang; Wang, Hua; Smith, Jordan N; Timchalk, Charles; Liu, Fengquan; Lin, Yuehe

    2011-02-15

    We present a novel portable immunochromatographic electrochemical biosensor (IEB) for simple, rapid, and sensitive biomonitoring of trichloropyridinol (TCP), a metabolite biomarker of exposure to organophosphorus insecticides. Our new approach takes advantage of an immunochromatographic test strip for a rapid competitive immunoreaction and a disposable screen-printed carbon electrode for rapid and sensitive electrochemical analysis of the captured HRP label. Several key experimental parameters (e.g. immunoreaction time, the amount of HRP-labeled TCP, the concentration of the substrate for electrochemical measurements, and the blocking agents for the nitrocellulose membrane) were optimized to achieve high sensitivity, selectivity and stability. Under optimal conditions, the IEB demonstrated a wide linear range (0.1-100 ng/ml) with a detection limit as low as 0.1 ng/ml TCP. Furthermore, the IEB has been successfully applied for biomonitoring of TCP in rat plasma samples with in vivo exposure to organophosphorus insecticides such as chlorpyrifos-oxon (CPF-oxon). The IEB thus opens up new pathways for designing a simple, rapid, clinically accurate, and quantitative tool for TCP detection, and holds great promise for in-field screening of metabolite biomarkers, e.g. TCP, in humans exposed to organophosphorus insecticides. Copyright © 2010 Elsevier B.V. All rights reserved.
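Detection limits like the 0.1 ng/ml figure above are typically derived from a calibration fit and blank noise. A hedged sketch with hypothetical calibration points and a 3σ criterion (not the paper's actual data):

```python
# Linear calibration and a 3-sigma detection limit. Calibration points and
# blank noise are hypothetical; the paper reports a 0.1-100 ng/ml range.

def linfit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    slope = num / den
    return slope, my - slope * mx          # (slope, intercept)

conc   = [0.1, 1.0, 10.0, 50.0, 100.0]     # standard concentrations (ng/ml)
signal = [0.12, 1.05, 10.4, 51.2, 102.0]   # electrochemical response (a.u.)
slope, intercept = linfit(conc, signal)

sd_blank = 0.03                            # noise of replicate blanks (a.u.)
lod = 3 * sd_blank / slope                 # 3-sigma limit of detection
print(f"slope = {slope:.3f}, LOD = {lod:.3f} ng/ml")
```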

  14. Cutaneous exposure to clinically-relevant pigeon pea (Cajanus cajan) proteins promote TH2-dependent sensitization and IgE-mediated anaphylaxis in Balb/c mice.

    PubMed

    Kumar Gupta, Rinkesh; Kumar, Sandeep; Gupta, Kriti; Sharma, Akanksha; Roy, Ruchi; Kumar Verma, Alok; Chaudhari, Bhushan P; Das, Mukul; Ahmad Ansari, Irfan; Dwivedi, Premendra D

    2016-11-01

    Epicutaneous (EC) sensitization to food allergens may occur when the skin has been lightly damaged. The study here tested whether cutaneous exposure to pigeon pea protein(s) may cause allergic sensitization. BALB/c mice were either orally gavaged or epicutaneously sensitized by repeated application of pigeon pea crude protein extract (CPE) on undamaged areas of skin without any adjuvant; afterwards, both groups were orally challenged with the pigeon pea CPE. Anaphylactic symptoms along with measures of body temperature, MCPT-1, TSLP, pigeon pea-specific IgE and IgG1, myeloperoxidase (MPO) activity, TH2 cytokines, TH2 transcription factors (TFs) and filaggrin expression were determined. Mast cell staining, eosinophil levels and histopathological analysis of the skin and intestines were also performed. In the epicutaneously-sensitized mice, elevated levels of specific IgE and IgG1, as well as of MCPT-1, TSLP, TH2 cytokines and TFs, higher anaphylactic scores and histological changes in the skin and intestine were indicative of sensitization ability via both routes in the pigeon pea CPE-treated hosts. Elevated levels of mast cells were observed in both the skin and intestine; increased levels of eosinophils and MPO activity were noted only in the skin. Decreased levels of filaggrin in skin may have played a key role in the skin barrier dysfunction, increasing the chances of sensitization. Therefore, the experimental data support the hypothesis that in addition to oral exposure, skin exposure to food allergens can promote TH2-dependent sensitization, IgE-mediated anaphylaxis and intestinal changes after oral challenge. Based on this, an avoidance of cutaneous exposures to allergens might prevent development of food anaphylaxis.

  15. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool in supporting risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance, in quantitative risk analyses, of characterizing epistemic uncertainty, that is, outright lack of knowledge about the true risk relationships being quantified. This article describes and quantitatively illustrates the sensitivity of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, the shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and fuller recognition of the range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
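
    The epistemic sensitivity the article describes can be illustrated with the log-linear concentration-response form commonly used for PM2.5 health impact functions and a one-at-a-time sweep over the risk coefficient. This is a minimal sketch: the baseline mortality, exposure change, and coefficient values are invented for illustration, not taken from BenMAP or the article.

```python
# Sketch of a BenMAP-style log-linear health impact function with a
# one-at-a-time sweep over the epistemic risk coefficient beta. Baseline
# mortality, exposure change, and the beta values are invented for
# illustration; they are not from BenMAP or the article.

import math

def excess_deaths(beta, delta_pm, baseline_deaths):
    """Attributable deaths = y0 * (1 - exp(-beta * dC))."""
    return baseline_deaths * (1.0 - math.exp(-beta * delta_pm))

baseline_deaths = 50_000   # hypothetical annual baseline deaths in study area
delta_pm = 2.0             # hypothetical PM2.5 reduction, ug/m3

# Spread of coefficients standing in for the epidemiological literature range
for label, beta in [("low", 0.001), ("central", 0.006), ("high", 0.015)]:
    deaths = excess_deaths(beta, delta_pm, baseline_deaths)
    print(f"{label:8s} beta={beta}: {deaths:8.1f} attributable deaths")
```

    Even in this toy setting, the estimate scales almost linearly with the chosen coefficient, which is the kind of dependence the article argues users should be prompted to explore.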

  16. Highly-sensitive NO, NO2, and NH3 measurements with an open-multipass cell based on mid-infrared wavelength modulation spectroscopy

    NASA Astrophysics Data System (ADS)

    Chen, Xiang; Yang, Chen-Guang; Hu, Mai; Shen, Jian-Kang; Niu, Er-Chao; Xu, Zhen-Yu; Fan, Xue-Li; Wei, Min; Yao, Lu; He, Ya-Bai; Liu, Jian-Guo; Kan, Rui-Feng

    2018-04-01

    Abstract not available. Project supported by the National Key Scientific Instrument and Equipment Development, China (Grant No. 2014YQ060537) and the National Key Research and Development Program, China (Grant No. 2016YFC0201103).

  17. Identification of weather variables sensitive to dysentery in disease-affected county of China.

    PubMed

    Liu, Jianing; Wu, Xiaoxu; Li, Chenlu; Xu, Bing; Hu, Luojia; Chen, Jin; Dai, Shuang

    2017-01-01

    Climate change mainly refers to long-term changes in weather variables, and it has a significant impact on the sustainability and spread of infectious diseases. Among the three leading infectious diseases in China, dysentery is exclusively sensitive to climate change. Previous research on weather variables and dysentery has mainly focused on determining the correlation between dysentery incidence and weather variables; the contribution of each variable to dysentery incidence has rarely been clarified. We therefore chose a typical dysentery-epidemic county as the study area. Based on dysentery incidence data, weather variables (monthly mean temperature, precipitation, wind speed, relative humidity, absolute humidity, maximum temperature, and minimum temperature) and lagged analysis, we used principal component analysis (PCA) and classification and regression trees (CART) to examine the relationships between dysentery incidence and weather variables. PCA showed that temperature, precipitation, and humidity played a key role in determining the transmission of dysentery. Based on the PCA results, we further selected minimum temperature, precipitation, and relative humidity, and used CART to clarify the contribution of each of these three weather variables to dysentery incidence. We found that when minimum temperature was high, the incidence of dysentery was high if relative humidity or precipitation was also high. We compared our results with other studies on dysentery incidence and meteorological factors in China and abroad, and found good agreement. Some differences remain, for three reasons: not all key weather variables were identified; local factors modify climate conditions; and human factors also affect dysentery incidence. This study aims to inform potential early warnings for dysentery transmission as climate change proceeds, and to provide a theoretical basis for the control and prevention of dysentery. Copyright © 2016 Elsevier B.V. All rights reserved.
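
    The CART step described above selects, at each node, the weather variable and threshold that minimize the summed squared error of the two child nodes. A minimal sketch of that splitting rule on invented data; `tmin`, `humidity`, and `incidence` are hypothetical names, not the study's dataset:

```python
# Minimal sketch of the CART regression-tree splitting rule: pick the
# variable/threshold pair that minimizes the summed squared error (SSE) of
# the two child nodes. Variable names and data are hypothetical, not the
# study's records.

def sse(values):
    """Sum of squared deviations from the mean (0 for an empty node)."""
    if not values:
        return 0.0
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def best_split(rows, target, features):
    """Exhaustive search over features and observed thresholds."""
    best = None
    for feat in features:
        for t in sorted({r[feat] for r in rows})[:-1]:  # max leaves a child empty
            left = [r[target] for r in rows if r[feat] <= t]
            right = [r[target] for r in rows if r[feat] > t]
            score = sse(left) + sse(right)
            if best is None or score < best[2]:
                best = (feat, t, score)
    return best

# Hypothetical monthly records: minimum temperature (C), humidity (%), cases
data = [
    {"tmin": 2, "humidity": 40, "incidence": 1},
    {"tmin": 5, "humidity": 45, "incidence": 2},
    {"tmin": 18, "humidity": 60, "incidence": 8},
    {"tmin": 22, "humidity": 80, "incidence": 15},
    {"tmin": 25, "humidity": 85, "incidence": 16},
]
print(best_split(data, "incidence", ["tmin", "humidity"]))
```

    On this toy data the search picks a minimum-temperature split, echoing the study's finding that high incidence follows high minimum temperature; a full tree would recurse on each child node.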

  18. Cost-effectiveness of dihydroartemisinin-piperaquine compared with artemether-lumefantrine for treating uncomplicated malaria in children at a district hospital in Tanzania.

    PubMed

    Mori, Amani T; Ngalesoni, Frida; Norheim, Ole F; Robberstad, Bjarne

    2014-09-15

    Dihydroartemisinin-piperaquine (DhP) is highly recommended for the treatment of uncomplicated malaria. This study aims to compare the costs, health benefits and cost-effectiveness of DhP and artemether-lumefantrine (AL), alongside "do-nothing" as a baseline comparator, in order to consider the appropriateness of DhP as a first-line anti-malarial drug for children in Tanzania. A cost-effectiveness analysis was performed using a Markov decision model, from a provider's perspective. The study used cost data from Tanzania and secondary effectiveness data from a review of articles from sub-Saharan Africa. Probabilistic sensitivity analysis was used to incorporate uncertainties in the model parameters. In addition, sensitivity analyses were used to test plausible variations of key parameters, and key assumptions were tested in scenario analyses. The model predicts that DhP is more cost-effective than AL, with an incremental cost-effectiveness ratio (ICER) of US$ 12.40 per DALY averted. This result relies on the assumption that compliance to treatment with DhP is higher than that with AL due to its relatively simple once-a-day dosage regimen. When compliance was assumed to be identical for the two drugs, AL was more cost-effective than DhP, with an ICER of US$ 12.54 per DALY averted. DhP is, however, slightly more likely than AL to be cost-effective at a willingness-to-pay threshold of US$ 150 per DALY averted. Dihydroartemisinin-piperaquine is a very cost-effective anti-malarial drug. The findings support its use as an alternative first-line drug for the treatment of uncomplicated malaria in children in Tanzania and other sub-Saharan African countries with similar healthcare infrastructures and epidemiology of malaria.
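
    The headline comparison reduces to the ICER formula: incremental cost over incremental effect. A sketch with hypothetical per-patient costs and DALYs averted; these numbers are stand-ins, not the study's Markov model inputs:

```python
# The incremental cost-effectiveness ratio (ICER): extra cost per extra DALY
# averted of a new strategy versus a reference. The per-patient costs and
# DALYs below are hypothetical, not the study's model inputs.

def icer(cost_new, effect_new, cost_ref, effect_ref):
    """Incremental cost over incremental effect (DALYs averted)."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Hypothetical expected values per child treated: (cost US$, DALYs averted)
al = (1.80, 0.140)    # artemether-lumefantrine
dhp = (2.30, 0.180)   # dihydroartemisinin-piperaquine

icer_dhp_vs_al = icer(dhp[0], dhp[1], al[0], al[1])
print(f"ICER of DhP vs AL: US$ {icer_dhp_vs_al:.2f} per DALY averted")
```

    A strategy is then judged cost-effective when its ICER falls below the chosen willingness-to-pay threshold, such as the US$ 150 per DALY averted used in the study.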

  19. Analyzing key constraints to biogas production from crop residues and manure in the EU—A spatially explicit model

    PubMed Central

    Persson, U. Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates’ biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures’ carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility of overcoming the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent). PMID:28141827

  20. SOS2 Promotes Salt Tolerance in Part by Interacting with the Vacuolar H+-ATPase and Upregulating Its Transport Activity

    PubMed Central

    Batelli, Giorgia; Verslues, Paul E.; Agius, Fernanda; Qiu, Quansheng; Fujii, Hiroaki; Pan, Songqin; Schumaker, Karen S.; Grillo, Stefania; Zhu, Jian-Kang

    2007-01-01

    The salt overly sensitive (SOS) pathway is critical for plant salt stress tolerance and has a key role in regulating ion transport under salt stress. To further investigate salt tolerance factors regulated by the SOS pathway, we expressed an N-terminal fusion of the improved tandem affinity purification tag to SOS2 (NTAP-SOS2) in sos2-2 mutant plants. Expression of NTAP-SOS2 rescued the salt tolerance defect of sos2-2 plants, indicating that the fusion protein was functional in vivo. Tandem affinity purification of NTAP-SOS2-containing protein complexes and subsequent liquid chromatography-tandem mass spectrometry analysis indicated that subunits A, B, C, E, and G of the peripheral cytoplasmic domain of the vacuolar H+-ATPase (V-ATPase) were present in a SOS2-containing protein complex. Parallel purification of samples from control and salt-stressed NTAP-SOS2/sos2-2 plants demonstrated that each of these V-ATPase subunits was more abundant in NTAP-SOS2 complexes isolated from salt-stressed plants, suggesting that the interaction may be enhanced by salt stress. Yeast two-hybrid analysis showed that SOS2 interacted directly with V-ATPase regulatory subunits B1 and B2. The importance of the SOS2 interaction with the V-ATPase was shown at the cellular level by reduced H+ transport activity of tonoplast vesicles isolated from sos2-2 cells relative to vesicles from wild-type cells. In addition, seedlings of the det3 mutant, which has reduced V-ATPase activity, were found to be severely salt sensitive. Our results suggest that regulation of V-ATPase activity is an additional key function of SOS2 in coordinating changes in ion transport during salt stress and in promoting salt tolerance. PMID:17875927

  1. Chiral analysis of bambuterol, its intermediate and active drug in human plasma by liquid chromatography-tandem mass spectrometry: Application to a pharmacokinetic study.

    PubMed

    Zhou, Ting; Liu, Shan; Zhao, Ting; Zeng, Jing; He, Mingzhi; Xu, Beining; Qu, Shanshan; Xu, Ling; Tan, Wen

    2015-08-01

    A sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method has been developed for simultaneous chiral analysis of the antiasthma drug bambuterol, its key intermediate monocarbamate bambuterol and its active drug terbutaline in human plasma. All samples were extracted with ethyl acetate and separated on an Astec Chirobiotic T column under isocratic elution with a mobile phase consisting of methanol and water with the addition of 20 mM ammonium acetate and 0.005% (v/v) formic acid at 0.6 mL/min. The analytes were detected by a Xevo TQ-S tandem mass spectrometer with positive electrospray ionization in multiple reaction monitoring mode. The established method has high sensitivity, with lower limits of quantification of 25.00 pg/mL for bambuterol enantiomers and 50.00 pg/mL for monocarbamate bambuterol and terbutaline enantiomers. The calibration curves were linear in the range of 25.00-2500 pg/mL for bambuterol enantiomers and 50.00-5000 pg/mL for monocarbamate bambuterol and terbutaline enantiomers. The intra- and inter-day precisions were <12.4%. All the analytes were separated in 18.0 min. For the first time, the validated method was successfully applied to an enantioselective pharmacokinetic study of rac-bambuterol in 8 healthy volunteers. The results show that this chiral LC-MS/MS assay provides a suitable and robust method for enantioselectivity and interaction studies of the prodrug bambuterol, the key intermediate monocarbamate bambuterol and its active drug terbutaline in humans. Copyright © 2015 Elsevier B.V. All rights reserved.
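
    The quantification step behind such an assay rests on a linear calibration curve fitted by least squares, from which unknowns are back-calculated. A sketch over the 25-2500 pg/mL bambuterol range quoted above; the response values are synthetic, not assay data:

```python
# Fit a linear calibration curve (instrument response vs. concentration) by
# ordinary least squares and back-calculate an unknown. Responses are
# synthetic; only the 25-2500 pg/mL range is taken from the abstract.

def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

conc = [25.0, 100.0, 500.0, 1000.0, 2500.0]   # standards, pg/mL
resp = [0.051, 0.202, 1.001, 1.998, 5.003]    # synthetic peak-area ratios

a, b = linear_fit(conc, resp)
unknown_resp = 0.75
unknown_conc = (unknown_resp - a) / b          # back-calculated pg/mL
print(f"back-calculated concentration: {unknown_conc:.1f} pg/mL")
```

    In a validated method this back-calculation is only trusted inside the calibrated range; values below the lower limit of quantification are reported as such rather than extrapolated.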

  2. Analyzing key constraints to biogas production from crop residues and manure in the EU-A spatially explicit model.

    PubMed

    Einarsson, Rasmus; Persson, U Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates' biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures' carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility of overcoming the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent).
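
    One of the binding constraints identified above, the carbon-to-nitrogen (C:N) ratio of the substrate mixture, can be illustrated by computing the C:N of a residue-manure blend. The elemental contents and the 20-30 digestibility window below are illustrative assumptions, not the model's parameters:

```python
# One constraint from the abstract: the carbon-to-nitrogen (C:N) ratio of
# the substrate mixture. mix_cn blends a straw-like residue with manure;
# elemental contents (% of dry matter) and the 20-30 digestibility window
# are illustrative assumptions, not the model's parameters.

def mix_cn(f_manure, c_res=45.0, n_res=0.56, c_man=36.0, n_man=2.4):
    """C:N of a mass-weighted blend: total carbon over total nitrogen."""
    c = (1 - f_manure) * c_res + f_manure * c_man
    n = (1 - f_manure) * n_res + f_manure * n_man
    return c / n

# Straw alone is far too carbon-rich for stable anaerobic digestion.
print(f"straw-only C:N = {mix_cn(0.0):.0f}")

# Scan manure mass fractions for blends inside the assumed 20-30 window.
feasible = [f / 100 for f in range(0, 101, 5) if 20 <= mix_cn(f / 100) <= 30]
print(f"feasible manure fraction: {feasible[0]:.2f} to {feasible[-1]:.2f}")
```

    Note that the blend's C:N is the ratio of total C to total N, not a linear average of the two substrates' ratios, which is why co-digestion shifts the mixture into the feasible window faster than a naive blend of ratios would suggest.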

  3. Small sensitivity to temperature variations of Si-photonic Mach-Zehnder interferometer using Si and SiN waveguides

    NASA Astrophysics Data System (ADS)

    Hiraki, Tatsurou; Fukuda, Hiroshi; Yamada, Koji; Yamamoto, Tsuyoshi

    2015-03-01

    We demonstrated small sensitivity to temperature variations in a delay-line Mach-Zehnder interferometer (DL MZI) on a Si photonics platform. The key technique is to balance the thermo-optic effect in the two arms by using waveguides made of different materials. With silicon and silicon nitride waveguides, the fabricated DL MZI with a free-spectral range of ~40 GHz showed a wavelength shift of -2.8 pm/K with temperature variations, which is 24 times smaller than that of the conventional Si-waveguide DL MZI. We also demonstrated the decoding of 40-Gbit/s differential phase-shift keying signals to on-off keying signals at various temperatures. The tolerable temperature variation for an acceptable power penalty was significantly improved due to the small wavelength shift.

  4. Key populations and human rights in the context of HIV services rendition in Ghana.

    PubMed

    Laar, Amos; DeBruin, Debra

    2017-08-02

    In line with its half-century-old penal code, Ghana currently criminalizes and penalizes the behaviors of some key populations - populations deemed to be at higher risk of acquiring or transmitting Human Immunodeficiency Virus (HIV). Men who have sex with men (MSM) and sex workers (SWs) fit this categorization. This paper provides an analysis of how the enactment and implementation of rights-limiting laws not only limit rights but also amplify risk and vulnerability to HIV in key and general populations. The paper derives from a project that assessed the ethics sensitivity of key documents guiding Ghana's response to its HIV epidemic. The assessment was guided by leading frameworks from public health ethics and relevant articles from the international bill of rights. Ghana's response to her HIV epidemic does not adequately address the rights and needs of key populations. Even though the national response has achieved some public health successes, palpable efforts to address rights issues remain nascent. Ghana's guiding documents for the HIV response include no advocacy for decriminalization, depenalization or harm reduction approaches for these key populations. The impact of rights-restricting codes on the nation's HIV epidemic is real: criminalization impedes key populations' access to HIV prevention and treatment services. Given that they are bridging populations, whatever affects Ghanaian key populations directly affects the general population indirectly. The right to the highest attainable standard of health, without qualification, is generally acknowledged as a fundamental human right. Unfortunately, this right currently eludes Ghanaian SWs and MSM. The paper endorses decriminalization as a means of promoting this right. In the face of opposition to decriminalization, the paper proposes specific harm reduction strategies to promote health and uplift the diminished rights of key populations.
Thus the authors call on Ghana to remove impediments to public health services provision to these populations. Doing so will require political will and sufficient planning toward prioritizing HIV prevention, care and treatment programming for key populations.

  5. Nonlinear bias analysis and correction of microwave temperature sounder observations for FY-3C meteorological satellite

    NASA Astrophysics Data System (ADS)

    Hu, Taiyang; Lv, Rongchuan; Jin, Xu; Li, Hao; Chen, Wenxin

    2018-01-01

    The nonlinear bias analysis and correction of receiving channels in the Chinese FY-3C meteorological satellite Microwave Temperature Sounder (MWTS) is a key technology for the assimilation of satellite radiance data. The thermal-vacuum chamber calibration data acquired from the MWTS can be analyzed to evaluate instrument performance, including radiometric temperature sensitivity, channel nonlinearity and calibration accuracy. In particular, the nonlinearity parameters due to imperfect square-law detectors are calculated from calibration data and further used to correct the nonlinear bias contributions of the microwave receiving channels. Based upon the operational principles and thermal-vacuum chamber calibration procedures of the MWTS, this paper focuses on nonlinear bias analysis and correction methods for improving the calibration accuracy of this instrument onboard the FY-3C meteorological satellite, from the perspective of theoretical and experimental studies. Furthermore, a series of original results are presented to demonstrate the feasibility and significance of the methods.
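
    A common way to handle an imperfect square-law detector is a two-point blackbody calibration plus a quadratic term that vanishes at both reference points. The sketch below is a generic illustration of that scheme, not the MWTS algorithm; the counts, temperatures, and nonlinearity parameter `u` are invented:

```python
# Generic sketch: two-point blackbody calibration with a quadratic
# nonlinearity correction for an imperfect square-law detector. The
# correction vanishes at both reference points by construction. This is an
# illustration of the scheme, not the MWTS algorithm; numbers are made up.

def calibrate(counts, c_cold, t_cold, c_hot, t_hot, u=0.0):
    """Linear two-point calibration plus a u-scaled quadratic term
    (u = 0 recovers an ideal linear radiometer)."""
    slope = (t_hot - t_cold) / (c_hot - c_cold)
    t_lin = t_cold + slope * (counts - c_cold)
    # Quadratic correction: zero at both the cold and hot reference counts.
    t_nl = u * slope ** 2 * (counts - c_cold) * (counts - c_hot)
    return t_lin + t_nl

c_cold, t_cold = 12000, 80.0   # cold-target counts / brightness temp (K)
c_hot, t_hot = 26000, 300.0    # hot-target counts / brightness temp (K)

mid = (c_cold + c_hot) // 2
linear = calibrate(mid, c_cold, t_cold, c_hot, t_hot, u=0.0)
corrected = calibrate(mid, c_cold, t_cold, c_hot, t_hot, u=-2e-4)
print(f"mid-scene: linear {linear:.2f} K, corrected {corrected:.2f} K")
```

    The nonlinear bias is largest for scene temperatures midway between the two calibration targets and zero at the targets themselves, which is why it must be characterized in thermal-vacuum testing rather than from the two-point calibration alone.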

  6. Kinetic Study of Acetone-Butanol-Ethanol Fermentation in Continuous Culture

    PubMed Central

    Buehler, Edward A.; Mesbah, Ali

    2016-01-01

    Acetone-butanol-ethanol (ABE) fermentation by clostridia has shown promise for industrial-scale production of biobutanol. However, the continuous ABE fermentation suffers from low product yield, titer, and productivity. Systems analysis of the continuous ABE fermentation will offer insights into its metabolic pathway as well as into optimal fermentation design and operation. For the ABE fermentation in continuous Clostridium acetobutylicum culture, this paper presents a kinetic model that includes the effects of key metabolic intermediates and enzymes as well as culture pH, product inhibition, and glucose inhibition. The kinetic model is used for elucidating the behavior of the ABE fermentation under the conditions that are most relevant to continuous cultures. To this end, dynamic sensitivity analysis is performed to systematically investigate the effects of culture conditions, reaction kinetics, and enzymes on the dynamics of the ABE production pathway. The analysis provides guidance for future metabolic engineering and fermentation optimization studies. PMID:27486663
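
    The idea of dynamic sensitivity analysis, asking how a model output trajectory responds to a kinetic parameter, can be sketched with a toy Monod-type batch model and a central finite difference. The model structure, parameters, and yields below are illustrative assumptions, not the paper's kinetic model:

```python
# Toy illustration of dynamic sensitivity analysis: integrate a Monod-type
# batch fermentation with forward Euler and estimate d(titer)/d(mu_max) by
# central finite differences. Model structure, parameters and yields are
# illustrative, not the paper's kinetic model.

def simulate(mu_max, ks=2.0, yield_p=0.3, dt=0.01, t_end=10.0):
    """Final product titer (g/L) of a toy batch fermentation."""
    x, s, p = 0.1, 50.0, 0.0           # biomass, glucose, product (g/L)
    for _ in range(int(t_end / dt)):
        mu = mu_max * s / (ks + s)     # Monod growth kinetics
        dx = mu * x                    # biomass growth rate
        ds = -dx / 0.4                 # substrate uptake (assumed yield 0.4)
        dp = yield_p * -ds             # product tied to substrate uptake
        x += dx * dt
        s = max(s + ds * dt, 0.0)
        p += dp * dt
    return p

h = 1e-4
sens = (simulate(0.3 + h) - simulate(0.3 - h)) / (2 * h)
print(f"final titer {simulate(0.3):.2f} g/L, d(titer)/d(mu_max) = {sens:.1f}")
```

    Repeating this perturbation for each parameter at each time point yields the sensitivity trajectories that rank which kinetic constants and culture conditions most influence the production pathway.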

  7. Sensitivity Analysis in Engineering

    NASA Technical Reports Server (NTRS)

    Adelman, Howard M. (Compiler); Haftka, Raphael T. (Compiler)

    1987-01-01

    The symposium proceedings presented here focus primarily on sensitivity analysis of structural response. However, the first session, entitled "General and Multidisciplinary Sensitivity," covered areas such as physics, chemistry, controls, and aerodynamics. The other four sessions were concerned with the sensitivity of structural systems modeled by finite elements. Session 2 dealt with Static Sensitivity Analysis and Applications; Session 3 with Eigenproblem Sensitivity Methods; Session 4 with Transient Sensitivity Analysis; and Session 5 with Shape Sensitivity Analysis.

  8. ENSO activity during the last climate cycle using Individual Foraminifera Analysis

    NASA Astrophysics Data System (ADS)

    Leduc, G.; Vidal, L.; Thirumalai, K.

    2017-12-01

    The El Niño / Southern Oscillation (ENSO) is the principal mode of interannual climate variability and affects key climate parameters such as low-latitude rainfall variability. Recent climate modeling experiments tend to suggest an increase in the frequency of both El Niño and La Niña events in the future, but these results remain model-dependent and need to be validated by paleodata-model comparisons. Fossil corals indicate that ENSO variance during the 20th century was particularly high compared to other time periods of the Holocene. Beyond the Holocene, however, little is known about past ENSO changes, which makes it difficult to test the paleoclimate model simulations used to study ENSO sensitivity to various types of forcings. We have expanded an Individual Foraminifera Analysis (IFA) dataset using the thermocline-dwelling N. dutertrei in a marine core collected in the Panama Basin (Leduc et al., 2009); IFA has proven to be a skillful way to reconstruct ENSO (Thirumalai et al., 2013). Our new IFA dataset comprehensively covers the Holocene, allowing us to verify how the IFA method compares with ENSO reconstructions using corals. The dataset then extends back in time to Marine Isotope Stage (MIS) 6, with a special focus on the last deglaciation and Termination II (MIS 5/6) time windows, which are key periods for testing the sensitivity of ENSO to ice volume and orbital parameters. The new dataset confirms variable ENSO activity during the Holocene and weaker activity during the LGM than during the Holocene, as a recent isotope-enabled climate model simulation of the LGM suggests (Zhu et al., 2017). This pattern is reproduced for Termination II. Leduc, G., L. Vidal, O. Cartapanis, and E. Bard (2009), Modes of eastern equatorial Pacific thermocline variability: Implications for ENSO dynamics over the last glacial period, Paleoceanography, 24, PA3202, doi:10.1029/2008PA001701. Thirumalai, K., J. W. Partin, C. S. Jackson, and T. M. Quinn (2013), Statistical constraints on El Niño Southern Oscillation reconstructions using individual foraminifera: A sensitivity analysis, Paleoceanography, 28, 401-412, doi:10.1002/palo.20037. Zhu, J., et al. (2017), Reduced ENSO variability at the LGM revealed by an isotope-enabled Earth system model, Geophys. Res. Lett., 44, 6984-6992, doi:10.1002/2017GL073406.
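
    The IFA logic is statistical: each shell records roughly one month, so the spread of many individual measurements reflects seasonal plus interannual variance. A sketch in the spirit of Thirumalai et al. (2013), drawing individuals from synthetic monthly series with and without an ENSO-like band; everything here, including the series, is synthetic:

```python
# The IFA idea in miniature (cf. Thirumalai et al., 2013): each foraminifer
# records a single month, so the spread of many individual measurements
# tracks seasonal-plus-interannual variance. Both "SST" series below are
# entirely synthetic; no proxy calibration is implied.

import math
import random

random.seed(42)

def monthly_series(years, enso_amp):
    """Seasonal cycle + crude 4-year ENSO-like oscillation + noise."""
    out = []
    for m in range(years * 12):
        seasonal = 1.0 * math.sin(2 * math.pi * m / 12)
        enso = enso_amp * math.sin(2 * math.pi * m / 48)
        out.append(25.0 + seasonal + enso + random.gauss(0, 0.2))
    return out

def ifa_spread(series, n_individuals=60):
    """Sample standard deviation of n randomly picked 'individuals'."""
    picks = [random.choice(series) for _ in range(n_individuals)]
    mean = sum(picks) / len(picks)
    return (sum((p - mean) ** 2 for p in picks) / (len(picks) - 1)) ** 0.5

quiet = ifa_spread(monthly_series(100, enso_amp=0.0))
active = ifa_spread(monthly_series(100, enso_amp=1.5))
print(f"IFA spread: quiet {quiet:.2f}, ENSO-active {active:.2f}")
```

    Because only a finite number of individuals is picked, repeated resampling of this kind is what sets the statistical detection limits on reconstructed ENSO variance changes.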

  9. Distributional Cost-Effectiveness Analysis

    PubMed Central

    Asaria, Miqdad; Griffin, Susan; Cookson, Richard

    2015-01-01

    Distributional cost-effectiveness analysis (DCEA) is a framework for incorporating health inequality concerns into the economic evaluation of health sector interventions. In this tutorial, we describe the technical details of how to conduct DCEA, using an illustrative example comparing alternative ways of implementing the National Health Service (NHS) Bowel Cancer Screening Programme (BCSP). The 2 key stages in DCEA are 1) modeling social distributions of health associated with different interventions, and 2) evaluating social distributions of health with respect to the dual objectives of improving total population health and reducing unfair health inequality. As well as describing the technical methods used, we also identify the data requirements and the social value judgments that have to be made. Finally, we demonstrate the use of sensitivity analyses to explore the impacts of alternative modeling assumptions and social value judgments. PMID:25908564
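
    The two DCEA stages can be sketched by scoring health distributions with an Atkinson-type equally-distributed-equivalent (EDE) measure, where the inequality-aversion parameter encodes the social value judgment the tutorial discusses. The quintile values and the epsilon values are hypothetical:

```python
# Sketch of the two DCEA stages: (1) social distributions of health per
# strategy, (2) evaluation with an Atkinson-type equally-distributed-
# equivalent (EDE) level of health. Quintile values and the inequality-
# aversion parameter are hypothetical.

def atkinson_ede(health, epsilon):
    """EDE health: the equal level society would value the same as the
    actual unequal distribution. epsilon = 0 reduces to the plain mean."""
    n = len(health)
    if epsilon == 0:
        return sum(health) / n
    return (sum(h ** (1 - epsilon) for h in health) / n) ** (1 / (1 - epsilon))

# Hypothetical quality-adjusted life expectancy by deprivation quintile
targeted = [64.5, 66.5, 68.5, 71.2, 74.1]    # gains skewed to deprived groups
untargeted = [63.0, 66.0, 69.0, 72.0, 75.0]  # uniform gain everywhere

for eps in (0.0, 1.5):
    print(eps, round(atkinson_ede(targeted, eps), 3),
          round(atkinson_ede(untargeted, eps), 3))
```

    With epsilon = 0 the plain mean slightly favors the untargeted strategy; with epsilon = 1.5 the inequality-reducing targeted strategy ranks higher, illustrating how the social value judgment can drive the result.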

  10. Report on the Development of the Advanced Encryption Standard (AES)

    PubMed Central

    Nechvatal, James; Barker, Elaine; Bassham, Lawrence; Burr, William; Dworkin, Morris; Foti, James; Roback, Edward

    2001-01-01

    In 1997, the National Institute of Standards and Technology (NIST) initiated a process to select a symmetric-key encryption algorithm to be used to protect sensitive (unclassified) Federal information in furtherance of NIST’s statutory responsibilities. In 1998, NIST announced the acceptance of 15 candidate algorithms and requested the assistance of the cryptographic research community in analyzing the candidates. This analysis included an initial examination of the security and efficiency characteristics for each algorithm. NIST reviewed the results of this preliminary research and selected MARS, RC6™, Rijndael, Serpent and Twofish as finalists. Having reviewed further public analysis of the finalists, NIST has decided to propose Rijndael as the Advanced Encryption Standard (AES). The research results and rationale for this selection are documented in this report. PMID:27500035

  11. Direct atmospheric pressure chemical ionisation ion trap mass spectrometry for aroma analysis: Speed, sensitivity and resolution of isobaric compounds

    NASA Astrophysics Data System (ADS)

    Jublot, Lionel; Linforth, Robert S. T.; Taylor, Andrew J.

    2005-06-01

    Atmospheric pressure chemical ionisation (APCI) sources were developed for real-time analysis of volatile release from foods using an ion trap (IT) mass spectrometer (MS). Key objectives were spectral simplicity (minimal fragmentation), response time and signal-to-noise ratio. The benefits of APCI-IT-MS were assessed by comparing its performance for in vivo and headspace analyses with that obtained using APCI coupled to a quadrupole mass analyser. Using MS-MS, direct APCI-IT-MS was able to differentiate mixtures of some C6 and terpene isobaric aroma compounds. Resolution could be achieved for some compounds by monitoring specific secondary ions. Direct resolution was also achieved for two of the three isobaric compounds released from chocolate over time as the sample was eaten.

  12. Hepatic changes in metabolic gene expression in old ghrelin and ghrelin receptor knockout mice

    USDA-ARS?s Scientific Manuscript database

    Ghrelin knockout (GKO) and ghrelin receptor (growth hormone secretagogue receptor) knockout (GHSRKO) mice exhibit enhanced insulin sensitivity, but the mechanism is unclear. Insulin sensitivity declines with age and is inversely associated with accumulation of lipid in liver, a key glucoregulatory ...

  13. Understanding Canadian Family Policy: Intents and Models.

    ERIC Educational Resources Information Center

    Kieren, Dianne K.

    1991-01-01

    Introduces some of the key concepts and issues that face Canadian policymakers as they attempt to provide relevant and sensitive actions in support of families and individuals in families. Reviews the Quebec model, considered to be excellent for the development of humanly sensitive family policy. (Author/JOW)

  14. Phenylpropanoids are key players in the antioxidant defense to ozone of European ash, Fraxinus excelsior.

    PubMed

    Cotrozzi, Lorenzo; Campanella, Alessandra; Pellegrini, Elisa; Lorenzini, Giacomo; Nali, Cristina; Paoletti, Elena

    2018-03-01

    Physiological and biochemical responses to ozone (O3) (150 ppb, 8 h per day, for 35 consecutive days) of two Italian provenances (Piedmont and Tuscany) of Fraxinus excelsior L. were evaluated, with special attention to the role of phenylpropanoids. Our results indicate (i) high O3 sensitivity, especially of the Piedmont provenance (in terms of visible injury, water status, and photosynthetic apparatus); (ii) although the intra-specific sensitivity to O3 differs between provenances (mainly due to different stomatal behaviors, since only Tuscany plants partially avoided uptake of the pollutant gas), both provenances showed detoxification and defense mechanisms; (iii) the crucial participation of phenylpropanoids, with a key role played by flavonoids (especially quercitrin): among this class of metabolites, isoquercitrin is the principal player in the lower O3 sensitivity of Tuscany plants, together with lignins; (iv) although coumarins (typical compounds of Fraxinus) were severely depressed by O3, isofraxidin was triggered, suggesting a key role in reactive oxygen species (ROS) detoxification, as was trans-chalcone. Furthermore, the different behavior of verbascoside and oleuropein between provenances leads us to speculate on their influence in the tentative repair or acclimation shown by Piedmont plants at the end of the exposure. Finally, the intra-specific O3 sensitivity may also be due to de novo peaks triggered by O3 that have not yet been associated with specific chemicals.

  15. Available nitrogen is the key factor influencing soil microbial functional gene diversity in tropical rainforest.

    PubMed

    Cong, Jing; Liu, Xueduan; Lu, Hui; Xu, Han; Li, Yide; Deng, Ye; Li, Diqiang; Zhang, Yuguang

    2015-08-20

    Tropical rainforests harbor over 50% of all known plant and animal species and provide a variety of key resources and ecosystem services to humans, largely mediated by the metabolic activities of soil microbial communities. A deep analysis of soil microbial communities and their roles in ecological processes would improve our understanding of biogeochemical elemental cycles. However, soil microbial functional gene diversity in tropical rainforests and its causative factors remain unclear. GeoChip, which contains almost all of the key functional genes related to biogeochemical cycles, can be used as a specific and sensitive tool for studying microbial gene diversity and metabolic potential. In this study, soil microbial functional gene diversity in a tropical rainforest was analyzed using GeoChip technology. Gene categories detected in the tropical rainforest soils were related to different biogeochemical processes, such as carbon (C), nitrogen (N) and phosphorus (P) cycling. The detected genes related to C and P cycling derived mostly from cultured bacteria. C degradation gene categories for substrates ranging from labile to recalcitrant C were all detected, and gene abundances in many recalcitrant C degradation categories differed significantly (P < 0.05) among the three sampling sites. The relative abundance of detected genes related to N cycling also differed significantly (P < 0.05) and derived mostly from uncultured bacteria. Gene categories related to ammonification had a high relative abundance. Both canonical correspondence analysis and multivariate regression tree analysis showed that soil available N was the factor most strongly correlated with soil microbial functional gene structure. Overall, high microbial functional gene diversity and distinct microbial metabolic potential for different biogeochemical processes appear to exist in tropical rainforest soils. Soil available N could be the key factor shaping soil microbial functional gene structure and metabolic potential.

  16. Integrating satellite actual evapotranspiration patterns into distributed model parametrization and evaluation for a mesoscale catchment

    NASA Astrophysics Data System (ADS)

    Demirel, M. C.; Mai, J.; Stisen, S.; Mendiguren González, G.; Koch, J.; Samaniego, L. E.

    2016-12-01

    Distributed hydrologic models are traditionally calibrated and evaluated against observations of streamflow. Spatially distributed remote sensing observations offer a great opportunity to enhance spatial model calibration schemes. To that end, it is important to identify, before satellite-based calibration, the model parameters that can change simulated spatial patterns. Our study rests on two main pillars: first, we use spatial sensitivity analysis to identify the key parameters controlling the spatial distribution of actual evapotranspiration (AET). Second, we investigate the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale Hydrologic Model (mHM). This distributed model was selected because it allows the spatial distribution of key soil parameters to change through calibration of pedo-transfer function (PTF) parameters and includes options for using fully distributed daily Leaf Area Index (LAI) directly as input. In addition, the simulated AET can be estimated at a spatial resolution suitable for comparison with the spatial patterns observed in MODIS data. We introduce a new dynamic scaling function employing remotely sensed vegetation to downscale coarse reference evapotranspiration. In total, 17 of the 47 mHM parameters are identified using both sequential screening and Latin hypercube one-at-a-time sampling methods. The spatial patterns are found to be sensitive to the vegetation parameters, whereas streamflow dynamics are sensitive to the PTF parameters. The results of multi-objective model calibration show that calibrating mHM against observed streamflow alone improves the streamflow simulations but does not reduce the spatial errors in AET. We will further examine the results of model calibration using only multiple spatial objective functions measuring the association between observed and simulated AET maps, and another case including spatial and streamflow metrics together.
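    The one-at-a-time screening mentioned in this record can be sketched in a few lines. The toy response function and the parameter names below are invented for illustration; they are not mHM's actual parameters or outputs.

    ```python
    import math

    # Hypothetical toy response standing in for a model output metric
    # (e.g., a spatial AET pattern statistic); mHM itself is not used here.
    def model(params):
        k_soil, k_veg, k_geo = params
        return math.exp(-k_soil) + 2.0 * k_veg ** 2 + 0.1 * k_geo

    def oat_effects(model, base, step=0.01):
        """One-at-a-time screening: perturb each parameter in turn from a
        base point and record the absolute change in the model output."""
        y0 = model(base)
        effects = []
        for i in range(len(base)):
            p = list(base)
            p[i] += step
            effects.append(abs(model(p) - y0))
        return effects

    base = [0.5, 1.0, 3.0]
    effects = oat_effects(model, base)
    # Rank parameters by effect magnitude; the largest is the most sensitive.
    ranking = sorted(range(len(base)), key=lambda i: -effects[i])
    ```

    A full screening would repeat this from many base points (e.g., a Latin hypercube sample) and average the effects, which is the combination of methods the abstract describes.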

  17. MutScan: fast detection and visualization of target mutations by scanning FASTQ data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Wen, Tiexiang; Li, Hong; Xu, Mingyan; Gu, Jia

    2018-01-22

    Some types of clinical genetic tests, such as cancer testing using circulating tumor DNA (ctDNA), require sensitive detection of known target mutations. However, conventional next-generation sequencing (NGS) data analysis pipelines typically involve multiple filtering steps, which may cause missed detection of key mutations with low frequencies. Variant validation is also indicated for key mutations detected by bioinformatics pipelines. Typically, this process is executed using alignment visualization tools such as IGV or GenomeBrowse. However, these tools are too heavyweight and therefore unsuitable for validating mutations in ultra-deep sequencing data. We developed MutScan to address the problems of sensitive detection and efficient validation of target mutations. MutScan involves highly optimized string-searching algorithms, which scan input FASTQ files to grab all reads that support target mutations. The collected supporting reads for each target mutation are piled up and visualized using web technologies such as HTML and JavaScript. Algorithms such as rolling hash and Bloom filters are applied to accelerate scanning, making MutScan very fast at detecting and visualizing target mutations. MutScan is a tool for the detection and visualization of target mutations by directly scanning FASTQ raw data. Compared to conventional pipelines, it offers very high performance, executing about 20 times faster, and maximal sensitivity, since it can grab mutations with even a single supporting read. MutScan visualizes detected mutations by generating interactive pile-ups using web technologies. These can serve to validate target mutations, thus avoiding false positives. Furthermore, MutScan can visualize all mutation records in a VCF file as HTML pages for cloud-friendly VCF validation. MutScan is an open source tool available at GitHub: https://github.com/OpenGene/MutScan.
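    The rolling-hash scanning idea can be illustrated with a minimal Rabin-Karp search over reads. The function name and the read strings below are invented examples, not MutScan's actual implementation or data.

    ```python
    # Minimal Rabin-Karp (rolling hash) sketch of scanning reads for a
    # target mutation context; a hash match is confirmed by string compare.
    BASE, MOD = 4, (1 << 61) - 1
    ENC = {"A": 0, "C": 1, "G": 2, "T": 3}

    def find_supporting_reads(reads, target):
        """Return indices of reads containing the target sequence."""
        m = len(target)
        t_hash = 0
        for ch in target:
            t_hash = (t_hash * BASE + ENC[ch]) % MOD
        high = pow(BASE, m - 1, MOD)  # weight of the outgoing base
        hits = []
        for idx, read in enumerate(reads):
            if len(read) < m:
                continue
            h = 0
            for ch in read[:m]:
                h = (h * BASE + ENC[ch]) % MOD
            for i in range(len(read) - m + 1):
                if h == t_hash and read[i:i + m] == target:
                    hits.append(idx)
                    break
                if i + m < len(read):
                    # Roll the hash: drop read[i], append read[i + m].
                    h = ((h - ENC[read[i]] * high) * BASE + ENC[read[i + m]]) % MOD
        return hits

    reads = ["ACGTACGTGG", "TTTTCCCCGG", "GGACGTACTT"]
    hits = find_supporting_reads(reads, "ACGTAC")
    ```

    A real scanner would additionally pre-filter reads with a Bloom filter and search both strands, as the abstract notes, but the rolling-hash core is the same.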

  18. An examination of anticipated g-jitter on Space Station and its effects on materials processes

    NASA Technical Reports Server (NTRS)

    Nelson, Emily S.

    1994-01-01

    This study is concerned with the effects of g-jitter, the residual acceleration aboard spacecraft, on selected classes of materials processes. In particular, the anticipated acceleration environment aboard Space Station Freedom (SSF) and its potential effects are analyzed, but the topic is covered with sufficient generality to apply to other processes and other vehicles as well. Key findings of this study include: The present acceleration specifications for SSF are inadequate to assure a quality low-g environment. The local g-vector orientation is an extremely sensitive parameter for certain key processes, but cannot be controlled to within the desired tolerance. Therefore, less emphasis should be placed upon achieving tight control of SSF attitude, and more emphasis should be focused on reducing the overall magnitude of the g-jitter. Melt-based crystal growth may not be successfully processed in the relatively noisy environment of a large inhabited space structure. Growth from vapor or from solution appears more favorable. A smaller space structure and/or a free flyer can provide better alternatives in terms of g-jitter considerations. A high priority (including budgetary) should be given to coordinated efforts among researchers, SSF designers, and equipment contractors to develop practical experiment-specific sensitivity requirements. Combined focused numerical simulations and experiments with well-resolved acceleration measurements should be vigorously pursued to develop reliable experiment-specific sensitivity data. Appendices provide an extensive cross-referenced bibliography, a discussion of the merits of g-jitter analysis techniques, definitions of relevant nondimensional quantities, and a brief description of available accelerometry hardware.

  19. Polyamine analog TBP inhibits proliferation of human K562 chronic myelogenous leukemia cells by inducing apoptosis

    PubMed Central

    WANG, QING; WANG, YAN-LIN; WANG, KAI; YANG, JIAN-LIN; CAO, CHUN-YU

    2015-01-01

    The aim of the present study was to investigate the effects of the novel polyamine analog tetrabutyl propanediamine (TBP) on the growth of K562 chronic myelogenous leukemia (CML) cells and the underlying mechanism of these effects. MTT was used for the analysis of cell proliferation and flow cytometry was performed to analyze cell cycle distribution. DNA fragmentation analysis and Annexin V/propidium iodide double staining were used to identify apoptotic cells. The activity of the key enzymes in polyamine catabolism was detected using chemiluminescence. TBP can induce apoptosis and significantly inhibit K562 cell proliferation in a time- and dose-dependent manner. TBP treatment significantly induced the enzyme activity of spermine oxidase and acetylpolyamine oxidase in K562 cells, and also enhanced the inhibitory effect of the antitumor drug doxorubicin on K562 cell proliferation. As a novel polyamine analog, TBP significantly inhibited proliferation and induced apoptosis in K562 cells by upregulating the activity of the key enzymes in the polyamine catabolic pathways. TBP also increased the sensitivity of the K562 cells to the antitumor drug doxorubicin. These data indicate an important potential value of TBP for clinical therapy of human CML. PMID:25435975

  20. Unraveling protein catalysis through neutron diffraction

    NASA Astrophysics Data System (ADS)

    Myles, Dean

    Neutron scattering and diffraction are exquisitely sensitive to the location, concentration and dynamics of hydrogen atoms in materials and provide a powerful tool for the characterization of structure-function and interfacial relationships in biological systems. Modern neutron scattering facilities offer access to a sophisticated, non-destructive suite of instruments for biophysical characterization that provide spatial and dynamic information spanning from Angstroms to microns and from picoseconds to microseconds, respectively. Applications range from atomic-resolution analysis of individual hydrogen atoms in enzymes, through to multi-scale analysis of hierarchical structures and assemblies in biological complexes, membranes and in living cells. Here we describe how the precise location of protein and water hydrogen atoms using neutron diffraction provides a more complete description of the atomic and electronic structures of proteins, enabling key questions concerning enzyme reaction mechanisms, molecular recognition and binding and protein-water interactions to be addressed. Current work is focused on understanding how molecular structure and dynamics control function in photosynthetic, cell signaling and DNA repair proteins. We will highlight recent studies that provide detailed understanding of the physiochemical mechanisms through which proteins recognize ligands and catalyze reactions, and help to define and understand the key principles involved.

  1. Sensitivity analysis of key components in large-scale hydroeconomic models

    NASA Astrophysics Data System (ADS)

    Medellin-Azuara, J.; Connell, C. R.; Lund, J. R.; Howitt, R. E.

    2008-12-01

    This paper explores the likely impact of different estimation methods for key components of hydro-economic models, such as hydrology and economic costs or benefits, using the CALVIN hydro-economic optimization model for water supply in California. We perform our analysis using two climate scenarios: historical and warm-dry. The components compared were perturbed hydrology using six versus eighteen basins, highly elastic urban water demands, and different valuations of agricultural water scarcity. Results indicate that large-scale hydro-economic models are often rather robust to a variety of estimation methods for ancillary models and components. Increasing the level of detail in the hydrologic representation of this system might not greatly affect overall estimates of climate effects and adaptations for California's water supply. More price-responsive urban water demands will have a limited role in allocating water optimally among competing uses. Different estimation methods for the economic value of water and scarcity in agriculture may influence economically optimal water allocation; however, land conversion patterns may have a stronger influence on this allocation. Overall, optimization results of large-scale hydro-economic models remain useful for a wide range of assumptions in eliciting promising water management alternatives.

  2. Hydrophobic Residues near the Bilin Chromophore-Binding Pocket Modulate Spectral Tuning of Insert-Cys Subfamily Cyanobacteriochromes

    PubMed Central

    Cho, Sung Mi; Jeoung, Sae Chae; Song, Ji-Young; Song, Ji-Joon; Park, Youn-Il

    2017-01-01

    Cyanobacteriochromes (CBCRs) are a subfamily of phytochrome photoreceptors found exclusively in photosynthetic cyanobacteria. Four CBCRs containing a second Cys in the insert region (insert-Cys) have been identified from the nonheterocystous cyanobacterium Microcoleus B353 (Mbr3854g4 and Mbl3738g2) and the nitrogen fixing, heterocystous cyanobacterium Nostoc punctiforme (NpF2164g3 and NpR1597g2). These insert-Cys CBCRs can sense light in the near-UV to orange range, but key residues responsible for tuning their colour sensitivity have not been reported. In the present study, near-UV/Green (UG) photosensors Mbr3854g4 (UG1) and Mbl3738g2 (UG2) were chosen for further spectroscopic analysis of their spectral sensitivity and tuning. Consistent with most dual-Cys CBCRs, both UGs formed a second thioether linkage to the phycocyanobilin (PCB) chromophore via the insert-Cys. This bond is subject to breakage and relinkage during forward and reverse photoconversions. Variations in residues equivalent to Phe that are in close contact with the PCB chromophore D-ring in canonical red/green CBCRs are responsible for tuning the light absorption peaks of both dark and photoproducts. This is the first time these key residues that govern light absorption in insert-Cys family CBCRs have been identified and characterised. PMID:28094296

  3. Relative stability of DNA as a generic criterion for promoter prediction: whole genome annotation of microbial genomes with varying nucleotide base composition.

    PubMed

    Rangannan, Vetriselvi; Bansal, Manju

    2009-12-01

    The rapid increase in genome sequence information has necessitated the annotation of functional elements, particularly those occurring in non-coding regions, in the genomic context. The promoter region is the key regulatory region that enables a gene to be transcribed or repressed, but it is difficult to determine experimentally. Hence, in silico identification of promoters is crucial in order to guide experimental work and to pinpoint the key region that controls the transcription initiation of a gene. In this analysis, we demonstrate that while promoter regions are in general less stable than their flanking regions, their average free energy varies depending on the GC composition of the flanking genomic sequence. We have therefore obtained a set of free energy threshold values for genomic DNA with varying GC content and used them as generic criteria for predicting promoter regions in several microbial genomes, using the in-house developed tool PromPredict. On applying it to predict promoter regions corresponding to the 1144 and 612 experimentally validated TSSs in E. coli (50.8% GC) and B. subtilis (43.5% GC), sensitivities of 99% and 95% and precision values of 58% and 60%, respectively, were achieved. For the limited data set of 81 TSSs available for M. tuberculosis (65.6% GC), a sensitivity of 100% and a precision of 49% were obtained.
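    The sensitivity and precision figures quoted in this record follow the standard definitions, sketched here with illustrative counts (not the paper's actual E. coli tallies):

    ```python
    # Sensitivity (recall) and precision from confusion-matrix counts:
    #   tp = validated promoters recovered by the predictor
    #   fn = validated promoters missed
    #   fp = predictions with no validated promoter
    def sensitivity_precision(tp, fn, fp):
        sensitivity = tp / (tp + fn)
        precision = tp / (tp + fp)
        return sensitivity, precision

    # e.g. 99 of 100 validated TSSs recovered, at the cost of 70 false
    # positives (hypothetical numbers chosen to mimic a 99% / ~59% split).
    sens, prec = sensitivity_precision(tp=99, fn=1, fp=70)
    ```

    High sensitivity with moderate precision, as reported for PromPredict, corresponds to a permissive threshold: few true promoters are missed, but extra regions are flagged.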

  4. Testing the metabolic theory of ecology with marine bacteria: different temperature sensitivity of major phylogenetic groups during the spring phytoplankton bloom.

    PubMed

    Arandia-Gorostidi, Nestor; Huete-Stauffer, Tamara Megan; Alonso-Sáez, Laura; G Morán, Xosé Anxelu

    2017-11-01

    Although temperature is a key driver of bacterioplankton metabolism, the effect of ocean warming on different bacterial phylogenetic groups remains unclear. Here, we conducted monthly short-term incubations with natural coastal bacterial communities over an annual cycle to test the effect of experimental temperature on the growth rates and carrying capacities of four phylogenetic groups: SAR11, Rhodobacteraceae, Gammaproteobacteria and Bacteroidetes. SAR11 was the most abundant group year-round, as analysed by CARD-FISH, with maximum abundances in summer, while the other taxa peaked in spring. All groups, including SAR11, showed high temperature sensitivity of growth rates and/or carrying capacities in spring, under phytoplankton bloom or post-bloom conditions. In that season, Rhodobacteraceae showed the strongest temperature response in growth rates, estimated here as activation energy (E, 1.43 eV), suggesting an advantage in outcompeting other groups under warmer conditions. In summer, E values were generally lower than 0.65 eV, the value predicted by the Metabolic Theory of Ecology (MTE). Contrary to MTE predictions, carrying capacity tended to increase with warming for all bacterial groups. Our analysis confirms that resource availability is key when addressing the temperature response of heterotrophic bacterioplankton. We further show that even under nutrient-sufficient conditions, warming differentially affected distinct bacterioplankton taxa. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.
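    The activation energies quoted in this record follow the Arrhenius/Boltzmann form used in MTE, rate proportional to exp(-E / (kB * T)). A minimal sketch of estimating E from rate-temperature pairs, using synthetic data rather than the study's measurements:

    ```python
    import math

    KB_EV = 8.617333262e-5  # Boltzmann constant in eV/K

    def activation_energy(temps_c, rates):
        """Estimate E (eV) as the least-squares slope of ln(rate)
        against -1/(kB*T), per the Arrhenius form rate ~ exp(-E/(kB*T))."""
        xs = [-1.0 / (KB_EV * (t + 273.15)) for t in temps_c]
        ys = [math.log(r) for r in rates]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den  # slope equals E in eV

    # Synthetic rates generated with E = 0.65 eV (the MTE prediction),
    # so the fit should recover that value.
    temps = [10.0, 15.0, 20.0, 25.0]
    rates = [math.exp(-0.65 / (KB_EV * (t + 273.15))) for t in temps]
    E = activation_energy(temps, rates)
    ```

    With noisy field data the same regression applies; E values above 0.65 eV, like the 1.43 eV reported for Rhodobacteraceae, indicate a stronger-than-predicted temperature response.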

  5. Transcription Factor Activities Enhance Markers of Drug Sensitivity in Cancer.

    PubMed

    Garcia-Alonso, Luz; Iorio, Francesco; Matchan, Angela; Fonseca, Nuno; Jaaks, Patricia; Peat, Gareth; Pignatelli, Miguel; Falcone, Fiammetta; Benes, Cyril H; Dunham, Ian; Bignell, Graham; McDade, Simon S; Garnett, Mathew J; Saez-Rodriguez, Julio

    2018-02-01

    Transcriptional dysregulation induced by aberrant transcription factors (TFs) is a key feature of cancer, but its global influence on drug sensitivity has not been examined. Here, we infer the transcriptional activity of 127 TFs through analysis of RNA-seq gene expression data newly generated for 448 cancer cell lines, combined with publicly available datasets, to survey a total of 1,056 cancer cell lines and 9,250 primary tumors. Predicted TF activities are supported by their agreement with independent shRNA essentiality profiles and homozygous gene deletions, and recapitulate mutant-specific mechanisms of transcriptional dysregulation in cancer. By analyzing cell line responses to 265 compounds, we uncovered numerous TFs whose activity interacts with anticancer drugs. Importantly, combining existing pharmacogenomic markers with TF activities often improves the stratification of cell lines in response to drug treatment. Our results, which can be queried freely at dorothea.opentargets.io, offer a broad foundation for discovering opportunities to refine personalized cancer therapies. Significance: Systematic analysis of transcriptional dysregulation in cancer cell lines and patient tumor specimens offers a publicly searchable foundation for discovering new opportunities to refine personalized cancer therapies. Cancer Res; 78(3); 769-80. ©2017 American Association for Cancer Research.

  6. A comprehensive prediction and evaluation method of pilot workload

    PubMed Central

    Feng, Chuanyan; Wanyan, Xiaoru; Yang, Kun; Zhuang, Damin; Wu, Xu

    2018-01-01

    BACKGROUND: The prediction and evaluation of pilot workload is a key problem in the human-factors airworthiness of the cockpit. OBJECTIVE: A pilot traffic pattern task was designed in a flight simulation environment in order to carry out pilot workload prediction and improve the evaluation method. METHODS: Predictions of typical flight subtasks and dynamic workloads (cruise, approach, and landing) were built up based on multiple resource theory, and favorable validity was achieved through correlation analysis between sensitive physiological data and the predicted values. RESULTS: Statistical analysis indicated that eye movement indices (fixation frequency, mean fixation time, saccade frequency, mean saccade time, and mean pupil diameter), electrocardiogram indices (mean normal-to-normal interval and the ratio between low frequency and the sum of low and high frequency), and electrodermal activity indices (mean tonic and mean phasic) were all sensitive to the typical workloads of subjects. CONCLUSION: A multinomial logistic regression model based on a combination of physiological indices (fixation frequency, mean normal-to-normal interval, the ratio between low frequency and the sum of low and high frequency, and mean tonic) was constructed, and the discrimination accuracy was comparatively high, at 84.85%. PMID:29710742

  7. Effects of Weather on Caloric and Nutritive Intake in India

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Babiarz, K.; Goldhaber-Fiebert, J.; Lobell, D. B.

    2012-12-01

    Many studies have investigated effects of weather on production of key food crops, largely motivated by a desire to anticipate impacts of climate change. However, health outcomes are most directly affected by food consumption, not production. Consumption changes will not necessarily follow production changes, primarily because people can adjust their diets away from foods that are most negatively affected. To more directly evaluate the effects of weather on nutrition, we analyzed reported household expenditure and consumption data from 20 rounds of the National Sample Survey (NSS) of India along with aggregated weather data of the two main agricultural seasons, kharif and rabi. Per capita intake of calories, protein, fats, and micronutrients were calculated from reported data at the household level, and then aggregated to district level for comparison with weather data. Regression analysis revealed significant negative effects of increased temperatures on calorie consumption in rural areas, with lower sensitivities in urban areas. We also found a higher sensitivity of protein and fat consumption to weather than for calories, which likely reflects the ability of households to switch to cheaper sources of calories in lean times. The results of this analysis will be useful for assessing the overall health burdens associated with climate change in India.

  8. In-depth analysis of accidental oil spills from tankers in the context of global spill trends from all sources.

    PubMed

    Burgherr, Peter

    2007-02-09

    This study gives a global overview of accidental oil spills from all sources (≥700 t) for the period 1970-2004, followed by a detailed examination of trends in accidental tanker spills. The present analysis of the number and volume of tanker spills includes temporal and spatial spill trends, aspects of spill size distribution, and trends in key factors (i.e., flag state, hull type, tanker age, accident cause and sensitivity of location). Results show that the total number and volume of tanker spills have decreased significantly since the 1970s, in contrast to increases in maritime transport of oil and to popular perceptions following recent catastrophic events. However, many spills still occur in ecologically sensitive locations because the major maritime transport routes often cross the boundaries of Large Marine Ecosystems, but the substantially lower total spill volume is an important contribution to potentially reducing overall ecosystem impacts. In summary, the improvements achieved in the past decades have been the result of a set of initiatives and regulations implemented by governments, international organizations and the shipping industry.

  9. A comprehensive prediction and evaluation method of pilot workload.

    PubMed

    Feng, Chuanyan; Wanyan, Xiaoru; Yang, Kun; Zhuang, Damin; Wu, Xu

    2018-01-01

    The prediction and evaluation of pilot workload is a key problem in the human-factors airworthiness of the cockpit. A pilot traffic pattern task was designed in a flight simulation environment in order to carry out pilot workload prediction and improve the evaluation method. Predictions of typical flight subtasks and dynamic workloads (cruise, approach, and landing) were built up based on multiple resource theory, and favorable validity was achieved through correlation analysis between sensitive physiological data and the predicted values. Statistical analysis indicated that eye movement indices (fixation frequency, mean fixation time, saccade frequency, mean saccade time, and mean pupil diameter), electrocardiogram indices (mean normal-to-normal interval and the ratio between low frequency and the sum of low and high frequency), and electrodermal activity indices (mean tonic and mean phasic) were all sensitive to the typical workloads of subjects. A multinomial logistic regression model based on a combination of physiological indices (fixation frequency, mean normal-to-normal interval, the ratio between low frequency and the sum of low and high frequency, and mean tonic) was constructed, and the discrimination accuracy was comparatively high, at 84.85%.

  10. Uncertainty Quantification and Assessment of CO2 Leakage in Groundwater Aquifers

    NASA Astrophysics Data System (ADS)

    Carroll, S.; Mansoor, K.; Sun, Y.; Jones, E.

    2011-12-01

    The complexity of subsurface aquifers and the geochemical reactions that control drinking water compositions complicate our ability to estimate the impact of leaking CO2 on groundwater quality. We combined lithologic field data from the High Plains Aquifer, numerical simulations, and uncertainty quantification analysis to assess the role of aquifer heterogeneity and physical transport on the extent of the CO2-impacted plume over a 100-year period. The High Plains aquifer is a major aquifer over much of the central United States, where CO2 may be sequestered in depleted oil and gas reservoirs or deep saline formations. Input parameters considered included aquifer heterogeneity, permeability, porosity, regional groundwater flow, CO2 and TDS leakage rates over time, and the number of leakage source points. Sensitivity analysis suggests that variations in sand and clay permeability, correlation lengths, van Genuchten parameters, and CO2 leakage rate have the greatest impact on the impacted volume and the maximum distance from the leak source. A key finding is that the relative sensitivity of the parameters changes over the 100-year period. Reduced-order models developed from regression of the numerical simulations show that the volume of the CO2-impacted aquifer increases over time, with two orders of magnitude of variance.

  11. Current Technologies of Electrochemical Immunosensors: Perspective on Signal Amplification.

    PubMed

    Cho, Il-Hoon; Lee, Jongsung; Kim, Jiyeon; Kang, Min-Soo; Paik, Jean Kyung; Ku, Seockmo; Cho, Hyun-Mo; Irudayaraj, Joseph; Kim, Dong-Hyung

    2018-01-12

    An electrochemical immunosensor employs antibodies as capture and detection means to produce electrical charges for the quantitative analysis of target molecules. This sensor type can be utilized as a miniaturized device for point-of-care testing (POCT). Achieving high analytical sensitivity has been one of the key issues in developing this type of biosensor system. Many modern nanotechnology efforts have enabled the development of innovative electrochemical biosensors with high sensitivity by employing various nanomaterials that facilitate the electron transfer and carrying capacity of signal tracers, in combination with surface modification and bioconjugation techniques. In this review, we introduce novel nanomaterials (e.g., carbon nanotubes, graphene, indium tin oxide, nanowires and metallic nanoparticles) for constructing high-performance electrodes. We also describe how to increase the number of signal tracers by employing nanomaterials as carriers and by forming polymeric enzyme complexes associated with redox cycling for signal amplification. The pros and cons of each method are considered throughout this review. We expect that these signal-enhancement strategies will be applied to the next versions of lateral-flow paper chromatography and microfluidic immunosensors, which are considered the most practical POCT biosensor platforms.

  12. Functional analysis of two sterol regulatory element binding proteins in Penicillium digitatum

    PubMed Central

    Ruan, Ruoxin; Wang, Mingshuang; Liu, Xin; Sun, Xuepeng; Chung, Kuang-Ren

    2017-01-01

    The sterol regulatory element binding proteins (SREBPs) are key regulators of sterol homeostasis in most fungi. In the citrus postharvest pathogen Penicillium digitatum, the SREBP homolog is required for fungicide resistance and regulation of CYP51 expression. In this study, we identified another SREBP transcription factor, PdSreB, in P. digitatum, and the biological functions of both SREBPs were characterized and compared. Inactivation of PdsreA, PdsreB or both genes in P. digitatum reduced ergosterol content and increased sensitivity to sterol 14-α-demethylation inhibitors (DMIs) and cobalt chloride. Fungal strains impaired in PdsreA, but not PdsreB, showed increased sensitivity to tridemorph and the iron chelator 2,2'-dipyridyl. Virulence assays on citrus fruit revealed that fungal strains impaired in PdsreA, PdsreB or both induced maceration lesions similar to those induced by the wild type. However, the ΔPdsreA, ΔPdsreB and double mutant strains rarely produced aerial mycelia on infected citrus fruit peels. RNA-Seq analysis showed the broad regulatory functions of both SREBPs in biosynthesis, transmembrane transport and stress responses. Our results provide new insights into the conserved and differentiated regulatory functions of SREBP homologs in plant pathogenic fungi. PMID:28467453

  13. Current Technologies of Electrochemical Immunosensors: Perspective on Signal Amplification

    PubMed Central

    Cho, Il-Hoon; Kim, Jiyeon; Kang, Min-soo; Paik, Jean Kyung; Ku, Seockmo; Cho, Hyun-Mo; Irudayaraj, Joseph; Kim, Dong-Hyung

    2018-01-01

    An electrochemical immunosensor employs antibodies as capture and detection means to produce electrical charges for the quantitative analysis of target molecules. This sensor type can be utilized as a miniaturized device for point-of-care testing (POCT). Achieving high analytical sensitivity has been one of the key issues in developing this type of biosensor system. Many modern nanotechnology efforts have enabled the development of innovative electrochemical biosensors with high sensitivity by employing various nanomaterials that facilitate the electron transfer and carrying capacity of signal tracers, in combination with surface modification and bioconjugation techniques. In this review, we introduce novel nanomaterials (e.g., carbon nanotubes, graphene, indium tin oxide, nanowires and metallic nanoparticles) for constructing high-performance electrodes. We also describe how to increase the number of signal tracers by employing nanomaterials as carriers and by forming polymeric enzyme complexes associated with redox cycling for signal amplification. The pros and cons of each method are considered throughout this review. We expect that these signal-enhancement strategies will be applied to the next versions of lateral-flow paper chromatography and microfluidic immunosensors, which are considered the most practical POCT biosensor platforms. PMID:29329274

  14. High-throughput quantum cascade laser (QCL) spectral histopathology: a practical approach towards clinical translation.

    PubMed

    Pilling, Michael J; Henderson, Alex; Bird, Benjamin; Brown, Mick D; Clarke, Noel W; Gardner, Peter

    2016-06-23

    Infrared microscopy has become one of the key techniques in the biomedical research field for interrogating tissue. In partnership with multivariate analysis and machine learning techniques, it has become widely accepted as a method that can distinguish between normal and cancerous tissue with both high sensitivity and high specificity. While spectral histopathology (SHP) is highly promising for improved clinical diagnosis, several practical barriers currently exist that need to be addressed before successful implementation in the clinic. Sample throughput and speed of acquisition are key barriers, driven by the high volume of samples awaiting histopathological examination. FTIR chemical imaging utilising focal plane array (FPA) technology is currently the state of the art for infrared chemical imaging, and recent advances have dramatically reduced acquisition times. Despite this, infrared microscopy measurements on a tissue microarray (TMA), often encompassing several million spectra, take several hours to acquire. The problem lies with the vast quantities of data that FTIR collects; each pixel in a chemical image is derived from a full infrared spectrum, itself composed of thousands of individual data points. Furthermore, data management is quickly becoming a barrier to clinical translation and poses the question of how to store these incessantly growing data sets. Recently, doubts have been raised as to whether the full spectral range is actually required for accurate disease diagnosis using SHP; these studies suggest that once spectral biomarkers have been predetermined, it may be possible to diagnose disease from a limited number of discrete spectral features. In the current study, we explore the possibility of utilising discrete frequency chemical imaging to acquire high-throughput, high-resolution chemical images. Utilising a quantum cascade laser imaging microscope with discrete frequency collection at key diagnostic wavelengths, we demonstrate that we can diagnose prostate cancer with high sensitivity and specificity. Finally, we extend the study to a large patient dataset utilising tissue microarrays, and show that high sensitivity and specificity can be achieved using high-throughput, rapid data collection, thereby paving the way for practical implementation in the clinic.

  15. FT-IR-cPAS—New Photoacoustic Measurement Technique for Analysis of Hot Gases: A Case Study on VOCs

    PubMed Central

    Hirschmann, Christian Bernd; Koivikko, Niina Susanna; Raittila, Jussi; Tenhunen, Jussi; Ojala, Satu; Rahkamaa-Tolonen, Katariina; Marbach, Ralf; Hirschmann, Sarah; Keiski, Riitta Liisa

    2011-01-01

    This article describes a new photoacoustic FT-IR system capable of operating at elevated temperatures. The key hardware component is an optical-readout cantilever microphone that can operate at up to 200 °C. All parts in contact with the sample gas, including the photoacoustic cell, were placed in a heated oven. The sensitivity of the photoacoustic system was tested by measuring 18 different VOCs. At 100 ppm gas concentration, the univariate signal-to-noise ratios (1σ, measurement time 25.5 min, at the highest peak, optical resolution 8 cm−1) of the spectra varied from 19 for o-xylene up to 329 for butyl acetate. The sensitivity can be improved by multivariate analysis over broad wavelength ranges, which effectively co-adds the univariate sensitivities achievable at individual wavelengths. The multivariate limit of detection (3σ, 8.5 min, full useful wavelength range), i.e., the best possible inverse analytical sensitivity achievable at optimum calibration, was calculated using the SBC method and varied from 2.60 ppm for dichloromethane to 0.33 ppm for butyl acetate. Depending on the shape of the spectra, which often contain only a few sharp peaks, the multivariate analysis improved the analytical sensitivity by 2.2 to 9.2 times compared with the univariate case. Selectivity and multi-component capability were tested with an SBC calibration including 5 VOCs and water. The average cross-selectivities turned out to be less than 2%, and the resulting inverse analytical sensitivities of the 5 interfering VOCs were increased by a maximum factor of 2.2 compared with the single-component sensitivities. Water subtraction using SBC recovered the true analyte concentration with a coefficient of variation of 3%, although the sample spectra (methyl ethyl ketone, 200 ppm) contained water from 1,400 to 100,000 ppm and only one water spectrum (10,000 ppm) was used for the subtraction. The developed device shows a significant improvement over the current state-of-the-art measurement methods used in industrial VOC measurements. PMID:22163900
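    The figures of merit quoted above follow directly from the noise level and the calibration slope. A minimal sketch of how a univariate SNR and a kσ limit of detection relate; the numbers below are illustrative only, not the paper's raw data:

```python
def univariate_snr(signal_peak, noise_sigma):
    """Univariate signal-to-noise ratio (1-sigma) at the highest peak."""
    return signal_peak / noise_sigma

def inverse_analytical_sensitivity_lod(noise_sigma, sensitivity, k=3.0):
    """k-sigma limit of detection: the concentration whose signal is k times
    the noise. `sensitivity` is the calibration slope (signal per ppm)."""
    return k * noise_sigma / sensitivity

# Illustrative: a peak of 329 noise units at 100 ppm gives SNR = 329 and a
# univariate 3-sigma LOD just under 1 ppm.
snr = univariate_snr(signal_peak=329.0, noise_sigma=1.0)
lod = inverse_analytical_sensitivity_lod(noise_sigma=1.0, sensitivity=329.0 / 100.0)
print(snr, round(lod, 2))
```

    The multivariate LOD improves on this univariate figure because the regression effectively co-adds the slopes at many wavelengths.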

  16. Anxiety Sensitivity: A Missing Piece to the Agoraphobia-without-Panic Puzzle

    ERIC Educational Resources Information Center

    Hayward, Chris; Wilson, Kimberly A.

    2007-01-01

    This article reviews the controversy surrounding the diagnosis of agoraphobia without panic attacks and proposes a key role for anxiety sensitivity in explaining agoraphobic avoidance among those who have never experienced panic. Although rare in clinical samples, agoraphobia without panic is commonly observed in population-based surveys,…

  17. Teaching Prevention on Sensitive Topics: Key Elements and Pedagogical Techniques

    ERIC Educational Resources Information Center

    Russell, Beth S.; Soysa, Champika K.; Wagoner, Marc J.; Dawson, Lori

    2008-01-01

    This paper presents a set of topical and pedagogical considerations for instructors teaching material on sensitive topics with either the primary or secondary aim of addressing prevention. Prevention can be approached as an effort to create changes in an individual's attitudes/beliefs, knowledge, and behavior. Following this framework, classroom…

  18. Radical Sensitivity Is the Key to Understanding Chinese Character Acquisition in Children

    ERIC Educational Resources Information Center

    Tong, Xiuhong; Tong, Xiuli; McBride, Catherine

    2017-01-01

    This study investigated Chinese children's development of sensitivity to positional (orthographic), phonological, and semantic cues of radicals in encoding novel Chinese characters. A newly designed picture-novel character mapping task, along with nonverbal reasoning ability, vocabulary, and Chinese character recognition were administered to 198…

  19. Validation of a particle tracking analysis method for the size determination of nano- and microparticles

    NASA Astrophysics Data System (ADS)

    Kestens, Vikram; Bozatzidis, Vassili; De Temmerman, Pieter-Jan; Ramaye, Yannic; Roebben, Gert

    2017-08-01

    Particle tracking analysis (PTA) is an emerging technique suitable for size analysis of particles with external dimensions in the nano- and sub-micrometre range. Only limited attempts have so far been made to investigate and quantify the performance of the PTA method for particle size analysis. This article presents the results of a validation study in which selected colloidal silica and polystyrene latex reference materials with particle sizes in the range of 20 nm to 200 nm were analysed with NS500 and LM10-HSBF NanoSight instruments and video analysis software NTA 2.3 and NTA 3.0. Key performance characteristics such as working range, linearity, limit of detection, limit of quantification, sensitivity, robustness, precision and trueness were examined according to recommendations proposed by EURACHEM. A model for measurement uncertainty estimation following the principles described in ISO/IEC Guide 98-3 was used to quantify random and systematic variations. For the nominal 50 nm and 100 nm polystyrene and the nominal 80 nm silica reference materials, the relative expanded measurement uncertainties for the three measurands of interest, namely the mode, median and arithmetic mean of the number-weighted particle size distribution, varied from about 10% to 12%. For the nominal 50 nm polystyrene material, the relative expanded uncertainty of the arithmetic mean of the particle size distribution increased to 18%, owing to the presence of agglomerates. Of the two software versions, NTA 3.0 proved superior in terms of sensitivity and resolution.
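    The ISO/IEC Guide 98-3 approach referenced above combines random and systematic relative standard uncertainties in quadrature before expanding with a coverage factor. A minimal sketch with hypothetical component values (not the study's actual uncertainty budget):

```python
import math

def relative_expanded_uncertainty(u_random, u_systematic, k=2.0):
    """Combine relative standard uncertainties in quadrature and expand with
    coverage factor k (k = 2 gives roughly 95 % coverage)."""
    return k * math.sqrt(u_random**2 + u_systematic**2)

# Hypothetical inputs: 3 % repeatability and 4 % trueness/calibration
# uncertainty combine to a 10 % relative expanded uncertainty.
U = relative_expanded_uncertainty(0.03, 0.04)
print(f"{U:.0%}")  # 10%
```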

  20. Health economics analysis of insulin aspart vs. regular human insulin in type 2 diabetes patients, based on observational real life evidence from general practices in Germany.

    PubMed

    Liebl, A; Seitz, L; Palmer, A J

    2014-10-01

    A retrospective analysis of German general practice data demonstrated that insulin aspart (IA) was associated with a significantly reduced incidence of macrovascular events (MVE: stroke, myocardial infarction, peripheral vascular disease or coronary heart disease) vs. regular human insulin (RHI) in type 2 diabetes patients. The economic implications, balanced against potential improvements in quality-adjusted life years (QALYs) resulting from lower risks of complications with IA in this setting, have not yet been explored. A decision analysis model was developed utilizing 3-year initial MVE rates for each comparator, combined with published German-specific insulin and MVE costs and health utilities, to calculate the number needed to treat (NNT) to avoid any MVE, incremental costs and QALYs gained/person for IA vs. RHI. A 3-year time horizon and a German third-party payer perspective were used. Probabilistic sensitivity analysis was performed, sampling from distributions of key parameters, along with additional deterministic sensitivity analyses. The NNT over a 3-year period to avoid any MVE was 8 patients for IA vs. RHI. Due to lower MVE rates, IA dominated RHI, with 0.020 QALYs gained (95% confidence interval: 0.014-0.025) and cost savings of EUR 1,556 (1,062-2,076)/person for IA vs. RHI over the 3-year time horizon. Sensitivity analysis revealed that IA would remain cost saving overall even if the cost/unit of IA were double that of RHI. From a health economics perspective, IA was the superior alternative for the insulin treatment of type 2 diabetes, with a lower incidence of MVE translating to improved QALYs and lower costs vs. RHI within a 3-year time horizon. © J. A. Barth Verlag in Georg Thieme Verlag KG Stuttgart · New York.
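    The NNT and dominance logic described above can be sketched as follows. The event rates are hypothetical values chosen only to reproduce an NNT of 8; the cost and QALY deltas mirror the abstract's point estimates:

```python
import math

def number_needed_to_treat(rate_control, rate_treatment):
    """NNT = 1 / absolute risk reduction, rounded up to whole patients."""
    arr = rate_control - rate_treatment
    return math.ceil(1.0 / arr)

def is_dominant(delta_cost, delta_qaly):
    """A strategy dominates its comparator when it both saves money
    (negative incremental cost) and gains QALYs."""
    return delta_cost < 0 and delta_qaly > 0

# Hypothetical 3-year MVE rates (25% vs 12.5%) yielding NNT = 8, as reported.
nnt = number_needed_to_treat(rate_control=0.25, rate_treatment=0.125)
print(nnt, is_dominant(delta_cost=-1556.0, delta_qaly=0.020))
```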

  1. Quantitative image analysis of immunohistochemical stains using a CMYK color model

    PubMed Central

    Pham, Nhu-An; Morrison, Andrew; Schwock, Joerg; Aviel-Ronen, Sarit; Iakovlev, Vladimir; Tsao, Ming-Sound; Ho, James; Hedley, David W

    2007-01-01

    Background Computer image analysis techniques have decreased the effects of observer bias and increased the sensitivity and throughput of immunohistochemistry (IHC) as a tissue-based procedure for the evaluation of diseases. Methods We adapted a Cyan/Magenta/Yellow/Key (CMYK) color model for automated computer image analysis to quantify IHC stains in hematoxylin-counterstained histological sections. Results The spectral characteristics of the chromogens AEC, DAB and NovaRed, as well as the counterstain hematoxylin, were first determined using CMYK, Red/Green/Blue (RGB), normalized RGB and Hue/Saturation/Lightness (HSL) color models. The contrast of chromogen intensities on a 0–255 scale (24-bit image file), as well as against the hematoxylin counterstain, was greatest using the Yellow channel of the CMYK color model, suggesting improved sensitivity for IHC evaluation compared with other color models. An increase in activated STAT3 levels due to growth factor stimulation, quantified using Yellow channel image analysis, was associated with an increase detected by Western blotting. Two clinical image data sets were used to compare the Yellow channel automated method with observer-dependent methods. First, quantification of the DAB-labeled carbonic anhydrase IX hypoxia marker in 414 sections obtained from 138 biopsies of cervical carcinoma showed a strong association between Yellow channel and positive color selection results. Second, a linear relationship was also demonstrated between Yellow intensity and visual scoring for NovaRed-labeled epidermal growth factor receptor in 256 non-small cell lung cancer biopsies. Conclusion The Yellow channel image analysis method based on a CMYK color model is independent of observer bias in threshold and positive color selection, applicable to different chromogens, tolerant of hematoxylin, sensitive to small changes in IHC intensity, and amenable to simple automation. These characteristics are advantageous for both basic and clinical research in the unbiased, reproducible and high-throughput evaluation of IHC intensity. PMID:17326824
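    The Yellow-channel quantification rests on the standard device-independent RGB-to-CMYK conversion. A minimal sketch; the pixel values are hypothetical, and the paper's exact conversion pipeline may differ:

```python
import numpy as np

def yellow_channel(rgb):
    """Yellow channel of a CMYK conversion from 8-bit RGB values.

    Standard conversion: K = 1 - max(R', G', B'); Y = (1 - B' - K) / (1 - K),
    with primes denoting values scaled to [0, 1]. Returned on a 0-255 scale,
    matching the 24-bit analysis described in the abstract.
    """
    rgb = np.asarray(rgb, dtype=np.float64) / 255.0
    k = 1.0 - rgb.max(axis=-1)
    denom = np.where(k < 1.0, 1.0 - k, 1.0)  # avoid divide-by-zero on black
    y = (1.0 - rgb[..., 2] - k) / denom
    return np.round(y * 255.0).astype(np.uint8)

# A brown, DAB-like pixel (hypothetical value) scores high in Yellow;
# a blue, hematoxylin-like pixel scores low.
print(yellow_channel([[140, 90, 30], [70, 80, 160]]))
```

    This separation is what lets the chromogen signal be thresholded largely independently of the blue counterstain.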

  2. An Animal Model of Trichloroethylene-Induced Skin Sensitization in BALB/c Mice.

    PubMed

    Wang, Hui; Zhang, Jia-xiang; Li, Shu-long; Wang, Feng; Zha, Wan-sheng; Shen, Tong; Wu, Changhao; Zhu, Qi-xing

    2015-01-01

    Trichloroethylene (TCE) is a major occupational hazard and environmental contaminant that can cause multisystem disorders in the form of occupational medicamentosa-like dermatitis. Development of dermatitis involves several proinflammatory cytokines, but their role in TCE-mediated dermatitis has not been examined in a well-defined experimental model. In addition, few animal models of TCE sensitization are available, and the current guinea pig model has apparent limitations. This study aimed to establish a model of TCE-induced skin sensitization in BALB/c mice and to examine the role of several key inflammatory cytokines in TCE sensitization. The sensitization rate of the dorsal-painted group was 38.3%. Skin edema and erythema occurred in the TCE-sensitized groups, as in the 2,4-dinitrochlorobenzene (DNCB) positive control. The TCE sensitization-positive (dermatitis [+]) group exhibited increased epidermal thickness, inflammatory cell infiltration, swelling, and necrosis in the dermis and around hair follicles, but the ear-painted group did not show these histological changes. The concentrations of serum proinflammatory cytokines, including tumor necrosis factor (TNF)-α, interferon (IFN)-γ, and interleukin (IL)-2, were significantly increased in the 24-, 48-, and 72-hour dermatitis [+] groups treated with TCE and peaked at 72 hours. Deposition of TNF-α, IFN-γ, and IL-2 in the skin tissue was also revealed by immunohistochemistry. We have established a new animal model of skin sensitization induced by repeated TCE stimulation, and we provide the first evidence that key proinflammatory cytokines, including TNF-α, IFN-γ, and IL-2, play an important role in the process of TCE sensitization. © The Author(s) 2015.

  3. Final Technical Report to DOE for the Award DE-SC0004601

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jizhong

    Understanding the responses, adaptations and feedback mechanisms of biological communities to climate change is critical for projecting the future state of the earth and climate systems. Although a significant amount of knowledge is available on the feedback responses of aboveground communities to climate change, little is known about the responses of belowground microbial communities, owing to the challenges in analyzing soil microbial community structure. Thus, the overall goal of this study was to provide a system-level, predictive, mechanistic understanding of the temperature sensitivity of soil carbon (C) decomposition under climate warming, using cutting-edge integrated metagenomic technologies. Toward this goal, the following four objectives were pursued: (i) to determine the phylogenetic composition and metabolic diversity of microbial communities in temperate grassland and tundra ecosystems; (ii) to delineate the responses of microbial community structure, functions and activities to climate change in these ecosystems; (iii) to determine the temperature sensitivity of microbial respiration in soils with different mixtures of labile versus recalcitrant C, and the underlying microbiological basis for the temperature sensitivity of these pools; and (iv) to synthesize all experimental data to reveal microbial control of ecosystem carbon processes in response to climate change. We achieved our goals for all four objectives. First, we determined the phylogenetic composition and metabolic diversity of microbial communities in the temperate grassland and tundra ecosystems. For this objective, we developed a novel phasing amplicon sequencing (PAS) approach for MiSeq sequencing of amplicons, which has been used for sequencing various phylogenetic and functional genes related to ecosystem functioning. A comprehensive functional gene array (GeoChip 5.0) was also developed and used for soil microbial community analysis in this study. In addition, shotgun metagenome sequencing, along with the above approaches, was used to characterize the phylogenetic and functional diversity, composition, and structure of soil microbial communities in both ecosystems. Second, we determined the response of soil microbial communities to climate warming in both the temperate grassland and tundra ecosystems using various methods. Our major findings are: (i) microorganisms responded very rapidly to climate warming in the vulnerable AK tundra ecosystem; (ii) climate warming significantly shifted the metabolic diversity, composition and structure of microbial communities, and key metabolic pathways related to carbon turnover, such as cellulose degradation (~13%) and CO2 production (~10%), and to nitrogen cycling, including denitrification (~12%), were enriched by warming; and (iii) warming also altered the expression patterns of microbial functional genes important to ecosystem functioning and stability, as shown by GeoChip and metatranscriptomic analyses of soil microbial communities at the OK site. Third, we analyzed the temperature sensitivity of C decomposition for both AK and OK soils through laboratory incubations. Key results include: (i) in Alaska tundra soils, after one year of incubation, CT in the top 15 cm could be as high as 25% and 15% of the initial soil C content at 25°C and 15°C, respectively; and (ii) analysis of 456 incubated soil samples with 16S rRNA gene sequencing, ITS sequencing and GeoChip hybridization showed that warming shifted the phylogenetic and functional diversity, composition, structure and metabolic potential of soil microbial communities, and that at different stages of incubation, key populations and functional genes changed significantly along with soil substrate changes. Functional gene diversity and functional genes for degrading labile C components decreased over the incubation as labile C components became exhausted, while genes related to degrading recalcitrant C increased. These molecular data will be used directly for modeling. Fourth, we developed novel approaches to integrate and model the experimental data to understand microbial control of ecosystem C processes in response to climate change. We compared different methods of calculating Q10 for estimating temperature sensitivity, and also developed new approaches for Q10 calculation and molecular ecological network analysis. Using these newly developed approaches, our results indicated that Q10 increased with the recalcitrance of C pools, suggesting that longer incubation studies are needed to assess the temperature sensitivity of slower C pools, especially at low temperature regimes. This project has been highly productive, resulting in 42 papers published or in press, 4 submitted, and 13 in preparation.
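    The Q10 temperature-sensitivity metric discussed above has a simple closed form: Q10 = (R2/R1)^(10/(T2-T1)). A minimal sketch with hypothetical respiration rates at the study's two incubation temperatures:

```python
def q10(rate_low, rate_high, t_low, t_high):
    """Temperature sensitivity Q10: the factor by which a rate increases
    for every 10 degC rise, inferred from rates at two temperatures.

        Q10 = (R2 / R1) ** (10 / (T2 - T1))
    """
    return (rate_high / rate_low) ** (10.0 / (t_high - t_low))

# Hypothetical respiration rates at the 15 and 25 degC incubations: a
# doubling over this 10 degC span corresponds to Q10 = 2.
print(round(q10(rate_low=1.5, rate_high=3.0, t_low=15.0, t_high=25.0), 2))
```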

  4. Validation and subsequent development of the DEREK skin sensitization rulebase by analysis of the BgVV list of contact allergens.

    PubMed

    Barratt, M D; Langowski, J J

    1999-01-01

    The DEREK knowledge-based computer system contains a subset of approximately 50 rules describing chemical substructures (toxophores) responsible for skin sensitization. This rulebase, based originally on Unilever historical in-house guinea pig maximization test data, has been subject to extensive validation and is undergoing refinement as the next stage of its development. As part of an ongoing program of validation and testing, the predictive ability of the sensitization rule set has been assessed by processing the structures of the 84 chemical substances in the list of contact allergens issued by the BgVV (German Federal Institute for Health Protection of Consumers). This list of chemicals is important because the biological data for each of the chemicals have been carefully scrutinized and peer reviewed, a key consideration in an area of toxicology in which much unreliable and potentially misleading data have been published. The existing DEREK rulebase for skin sensitization identified toxophores for skin sensitization in the structures of 71 out of the 84 chemicals (85%). The exercise highlighted areas of chemistry where further development of the rulebase was required, either by extension of the scope of existing rules or by generation of new rules where a sound mechanistic rationale for the biological activity could be established. Chemicals likely to be acting as photoallergens were identified, and new rules for photoallergenicity have subsequently been written. At the end of the exercise, the refined rulebase was able to identify toxophores for skin sensitization for 82 of the 84 chemicals in the BgVV list.

  5. Native Mass Spectrometry in Fragment-Based Drug Discovery.

    PubMed

    Pedro, Liliana; Quinn, Ronald J

    2016-07-28

    The advent of native mass spectrometry (MS) in 1990 led to the development of new mass spectrometry instrumentation and methodologies for the analysis of noncovalent protein-ligand complexes. Native MS has matured to become a fast, simple, highly sensitive and automatable technique with well-established utility for fragment-based drug discovery (FBDD). Native MS has the capability to directly detect weak ligand binding to proteins, to determine stoichiometry, relative or absolute binding affinities and specificities. Native MS can be used to delineate ligand-binding sites, to elucidate mechanisms of cooperativity and to study the thermodynamics of binding. This review highlights key attributes of native MS for FBDD campaigns.

  6. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  7. Operations Optimization of Nuclear Hybrid Energy Systems

    DOE PAGES

    Chen, Jun; Garcia, Humberto E.; Kim, Jong Suk; ...

    2016-08-01

    Nuclear hybrid energy systems (NHES) have been proposed as an effective element for incorporating high penetrations of clean energy. This paper focuses on the operations optimization of two specific NHES configurations to address the variability arising from various markets and renewable generation. Both analytical and numerical approaches are used to obtain the optimization solutions. Key economic figures of merit are evaluated under optimized and constant operations to demonstrate the benefit of the optimization, which also suggests the economic viability of the considered NHES under the proposed operations optimizer. Finally, sensitivity analysis on commodity price is conducted for a better understanding of the considered NHES.

  8. Operations Optimization of Nuclear Hybrid Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jun; Garcia, Humberto E.; Kim, Jong Suk

    Nuclear hybrid energy systems (NHES) have been proposed as an effective element for incorporating high penetrations of clean energy. This paper focuses on the operations optimization of two specific NHES configurations to address the variability arising from various markets and renewable generation. Both analytical and numerical approaches are used to obtain the optimization solutions. Key economic figures of merit are evaluated under optimized and constant operations to demonstrate the benefit of the optimization, which also suggests the economic viability of the considered NHES under the proposed operations optimizer. Finally, sensitivity analysis on commodity price is conducted for a better understanding of the considered NHES.

  9. Approach to spatial information security based on digital certificate

    NASA Astrophysics Data System (ADS)

    Cong, Shengri; Zhang, Kai; Chen, Baowen

    2005-11-01

    With the development of online applications of geographic information systems (GIS) and spatial information services, spatial information security is becoming increasingly important. This work introduces digital certificates and authorization schemes into GIS to protect crucial spatial information, combining the techniques of role-based access control (RBAC), the public key infrastructure (PKI) and the privilege management infrastructure (PMI). We investigated the spatial information granularity suited for sensitivity marking, and a digital certificate model that fits the needs of GIS security, based on a semantic analysis of spatial information. The approach implements secure, flexible, fine-grained data access in GIS on the basis of widely adopted public key technologies.
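    The RBAC element of such a scheme can be sketched as a dominance check of a role's clearance against a layer's sensitivity marking. All role, layer and level names below are hypothetical; in the described system these attributes would be carried in X.509 attribute certificates issued via PKI/PMI rather than in in-memory tables:

```python
# Hypothetical sensitivity ordering and assignments (illustrative only).
LEVELS = {"public": 0, "internal": 1, "restricted": 2}

ROLE_CLEARANCE = {"viewer": "public", "planner": "internal", "admin": "restricted"}

LAYER_SENSITIVITY = {
    "base_map": "public",
    "utility_lines": "internal",
    "defense_sites": "restricted",
}

def can_read(role, layer):
    """Grant read access when the role's clearance dominates the layer's
    sensitivity marking (a simple lattice comparison)."""
    return LEVELS[ROLE_CLEARANCE[role]] >= LEVELS[LAYER_SENSITIVITY[layer]]

print(can_read("planner", "utility_lines"), can_read("viewer", "defense_sites"))
```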

  10. Characteristics and sensitivity analysis of multiple-time-resolved source patterns of PM2.5 with real time data using Multilinear Engine 2

    NASA Astrophysics Data System (ADS)

    Peng, Xing; Shi, Guo-Liang; Gao, Jian; Liu, Jia-Yuan; HuangFu, Yan-Qi; Ma, Tong; Wang, Hai-Ting; Zhang, Yue-Chong; Wang, Han; Li, Hui; Ivey, Cesunica E.; Feng, Yin-Chang

    2016-08-01

    With real-time-resolved data on particulate matter (PM) and its chemical species, understanding source patterns and chemical characteristics is critical for establishing PM controls. In this work, PM2.5 and chemical species were measured by corresponding online instruments with 1-h time resolution in Beijing. The Multilinear Engine 2 (ME2) model was applied to explore the sources, and four sources (vehicle emission, crustal dust, secondary formation and coal combustion) were identified. To investigate the sensitivity of source contributions and chemical characteristics to time resolution, ME2 was run at four time resolutions (1-h, 2-h, 4-h, and 8-h). Crustal dust and coal combustion displayed large variation across the four runs, with contributions ranging from 6.7 to 10.4 μg m-3 and from 6.4 to 12.2 μg m-3, respectively. The contributions of vehicle emission and secondary formation ranged from 7.5 to 10.5 and from 14.7 to 16.7 μg m-3, respectively. The sensitivity analyses were conducted using principal component analysis plots (PCA-plot), the coefficient of divergence (CD), the average absolute error (AAE) and correlation coefficients. Across the four time-resolution runs, the source contributions and profiles of crustal dust and coal combustion were less stable than those of the other source categories, possibly owing to the lack of key markers for these sources (e.g., Si, Al). On the other hand, the time series of source contributions for vehicle emission and crustal dust were more sensitive to time resolution. These findings can improve our knowledge of source contributions and chemical characteristics at different time resolutions.
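    The coefficient of divergence (CD) used in the sensitivity analysis is a standard self-normalizing distance between two profiles (0 for identical profiles, approaching 1 for divergent ones). A minimal sketch with hypothetical contribution pairs, not the study's actual profile vectors:

```python
import math

def coefficient_of_divergence(x, y):
    """Coefficient of divergence between two profiles of equal length p:

        CD = sqrt( (1/p) * sum_i ((x_i - y_i) / (x_i + y_i))**2 )
    """
    p = len(x)
    return math.sqrt(sum(((a - b) / (a + b)) ** 2 for a, b in zip(x, y)) / p)

# Hypothetical source contributions (ug m-3) resolved at two time resolutions:
print(round(coefficient_of_divergence([6.7, 12.2], [10.4, 6.4]), 3))
```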

  11. Link between epigenomic alterations and genome-wide aberrant transcriptional response to allergen in dendritic cells conveying maternal asthma risk.

    PubMed

    Mikhaylova, Lyudmila; Zhang, Yiming; Kobzik, Lester; Fedulov, Alexey V

    2013-01-01

    We investigated the link between epigenome-wide methylation aberrations at birth and genomic transcriptional changes upon allergen sensitization that occur in neonatal dendritic cells (DCs) due to maternal asthma. We previously demonstrated that neonates of asthmatic mothers are born with a functional skew in splenic DCs that is apparent even in allergen-naive pups and can convey allergy responses to normal recipients. However, minimal-to-no transcriptional or phenotypic changes were found to explain this alteration. Here we provide an in-depth analysis of genome-wide DNA methylation profiles and RNA transcriptional (microarray) profiles before and after allergen sensitization. We identified differentially methylated and differentially expressed loci and performed manually curated matching of the methylation status of key regulatory sequences (promoters and CpG islands) to the expression of their respective transcripts before and after sensitization. We found that while allergen-naive DCs from asthma-at-risk neonates show minimal transcriptional change compared with controls, the methylation changes are extensive. Substantial transcriptional change only becomes evident upon allergen sensitization, when it occurs in multiple genes with pre-existing epigenetic alterations. We demonstrate that maternal asthma leads to both hyper- and hypomethylation in neonatal DCs, and that both types of events at various loci significantly overlap with transcriptional responses to allergen. Pathway analysis indicates that approximately half of the differentially expressed and differentially methylated genes interact directly in known networks involved in allergy and asthma processes. We conclude that congenital epigenetic changes in DCs are strongly linked to altered transcriptional responses to allergen and to the early-life origin of asthma. The findings are consistent with the emerging paradigm that asthma is a disease with underlying epigenetic changes.

  12. Cost-effectiveness analysis of antiviral treatment in the management of seasonal influenza A: point-of-care rapid test versus clinical judgment.

    PubMed

    Nshimyumukiza, Léon; Douville, Xavier; Fournier, Diane; Duplantie, Julie; Daher, Rana K; Charlebois, Isabelle; Longtin, Jean; Papenburg, Jesse; Guay, Maryse; Boissinot, Maurice; Bergeron, Michel G; Boudreau, Denis; Gagné, Christian; Rousseau, François; Reinharz, Daniel

    2016-03-01

    A point-of-care rapid test (POCRT) may enable early and targeted use of antiviral drugs for the management of influenza A infection. Our objectives were: (i) to determine whether antiviral treatment based on a POCRT for influenza A is cost-effective and (ii) to determine the thresholds of key test parameters (sensitivity, specificity and cost) at which a POCRT-based strategy appears cost-effective. A hybrid "susceptible, infected, recovered" (SIR) compartmental transmission and Markov decision-analytic model was used to simulate the cost-effectiveness of antiviral treatment based on a POCRT for influenza A from a societal perspective. Input parameters were retrieved from peer-reviewed studies and government databases. The outcome considered was the incremental cost per life-year saved over one seasonal influenza season. In the base-case analysis, antiviral treatment based on the POCRT saved 2 lives/100,000 person-years and cost $7600 less than empirical antiviral treatment based on clinical judgment alone, demonstrating that the POCRT-based strategy is dominant. In one- and two-way sensitivity analyses, results were sensitive to POCRT accuracy and cost, to vaccination coverage, and to the prevalence of influenza A. In probabilistic sensitivity analyses, the POCRT strategy was cost-effective in 66% of cases at a commonly accepted threshold of $50,000 per life-year saved. Influenza antiviral treatment based on a POCRT could be cost-effective under specific conditions of test performance, price and disease prevalence. © 2015 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.
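    The transmission component of such a hybrid model can be sketched as a discrete-time SIR system. The parameters below are illustrative, not the paper's calibrated values, and the Markov cost/QALY layer is omitted:

```python
def sir_epidemic(beta=0.3, gamma=0.1, n=100_000, i0=10, days=200):
    """Daily-step SIR simulation: beta is the transmission rate, gamma the
    recovery rate (1/gamma = mean infectious period in days). Returns final
    compartment sizes and the peak number of simultaneously infected."""
    s, i, r = float(n - i0), float(i0), 0.0
    peak = i
    for _ in range(days):
        new_inf = beta * s * i / n  # new infections this day
        new_rec = gamma * i         # recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = sir_epidemic()
print(f"attack rate {(1 - s / 100_000):.0%}, peak prevalence {peak:.0f}")
```

    In the full model, daily infections from such a trajectory would feed the decision-analytic arm, where test sensitivity/specificity determine who receives antivirals and at what cost.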

  13. Effects of gaboxadol on the expression of cocaine sensitization in rats.

    PubMed

    Silverman, Nora Siegal; Popp, Susanna; Vialou, Vincent; Astafurov, Konstantin; Nestler, Eric J; Dow-Edwards, Diana

    2016-04-01

    Behavioral sensitization to psychostimulants is associated with changes in dopamine (DA), glutamate, and GABA within the mesocorticolimbic and nigrostriatal DA systems. Because GABAA receptors are highly expressed within these systems, we examined the role of these receptors containing a δ subunit in cocaine behavioral sensitization. Experiment 1 examined the effects of gaboxadol (GBX, also known as THIP [4,5,6,7-tetrahydro-isoxazolo[5,4-c]pyridin-3-ol]), a selective δ-GABAA receptor agonist, on the locomotor responses to acute cocaine. GBX at 1.25 mg/kg produced locomotor depression in female rats alone. We then examined the effects of GBX on the expression of cocaine-induced locomotion and stereotypy in female and male rats treated with 5 days of cocaine (15 mg/kg) followed by cocaine challenge 7 days later. We administered systemic (Experiment 2) or intranucleus accumbens (intra-NAC; Experiment 3) injections of GBX (0, 1.25, 2.5, 5, or 10 mg/kg subcutaneously, or 1 μmol/L or 1 mM intra-NAC, respectively) prior to cocaine challenge (10 mg/kg). In our experiments, females were robustly sensitized to cocaine at a low dose, whereas males did not show such sensitization, limiting comparisons between the 2 sexes. Sensitized females showed a biphasic response to low (1.25 mg/kg and 1 μmol/L) and high (10 mg/kg and 1 mM) dose GBX whereas nonsensitized males showed this pattern only following intra-NAC injection. Immunohistochemical analysis of the NAC revealed that females have more δ-containing GABAA receptors than do males and that, following chronic cocaine injections, this difference persisted (Experiment 4). Together, our results support the notion of the key role of extrasynaptic GABAA δ-subunit containing receptors in cocaine sensitization. (c) 2016 APA, all rights reserved.

  14. Skeletal muscle phosphatidylcholine and phosphatidylethanolamine are related to insulin sensitivity and respond to acute exercise in humans.

    PubMed

    Newsom, Sean A; Brozinick, Joseph T; Kiseljak-Vassiliades, Katja; Strauss, Allison N; Bacon, Samantha D; Kerege, Anna A; Bui, Hai Hoang; Sanders, Phil; Siddall, Parker; Wei, Tao; Thomas, Melissa; Kuo, Ming Shang; Nemkov, Travis; D'Alessandro, Angelo; Hansen, Kirk C; Perreault, Leigh; Bergman, Bryan C

    2016-06-01

    Several recent reports indicate that the balance of skeletal muscle phosphatidylcholine (PC) and phosphatidylethanolamine (PE) is a key determinant of muscle contractile function and metabolism. The purpose of this study was to determine relationships between skeletal muscle PC, PE and insulin sensitivity, and whether PC and PE are dynamically regulated in response to acute exercise in humans. Insulin sensitivity was measured via intravenous glucose tolerance in sedentary obese adults (OB; n = 14), individuals with type 2 diabetes (T2D; n = 15), and endurance-trained athletes (ATH; n = 15). Vastus lateralis muscle biopsies were obtained at rest, immediately after 90 min of cycle ergometry at 50% maximal oxygen consumption (V̇o2 max), and 2-h postexercise (recovery). Skeletal muscle PC and PE were measured via infusion-based mass spectrometry/mass spectrometry analysis. ATH had greater levels of muscle PC and PE compared with OB and T2D (P < 0.05), with total PC and PE positively relating to insulin sensitivity (both P < 0.05). Skeletal muscle PC:PE ratio was elevated in T2D compared with OB and ATH (P < 0.05), tended to be elevated in OB vs. ATH (P = 0.07), and was inversely related to insulin sensitivity among the entire cohort (r = -0.43, P = 0.01). Muscle PC and PE were altered by exercise, particularly after 2 h of recovery, in a highly group-specific manner. However, muscle PC:PE ratio remained unchanged in all groups. In summary, total muscle PC and PE are positively related to insulin sensitivity while PC:PE ratio is inversely related to insulin sensitivity in humans. A single session of exercise significantly alters skeletal muscle PC and PE levels, but not PC:PE ratio. Copyright © 2016 the American Physiological Society.

  15. Skeletal muscle phosphatidylcholine and phosphatidylethanolamine are related to insulin sensitivity and respond to acute exercise in humans

    PubMed Central

    Newsom, Sean A.; Brozinick, Joseph T.; Kiseljak-Vassiliades, Katja; Strauss, Allison N.; Bacon, Samantha D.; Kerege, Anna A.; Bui, Hai Hoang; Sanders, Phil; Siddall, Parker; Wei, Tao; Thomas, Melissa; Kuo, Ming Shang; Nemkov, Travis; D'Alessandro, Angelo; Hansen, Kirk C.; Perreault, Leigh

    2016-01-01

    Several recent reports indicate that the balance of skeletal muscle phosphatidylcholine (PC) and phosphatidylethanolamine (PE) is a key determinant of muscle contractile function and metabolism. The purpose of this study was to determine relationships between skeletal muscle PC, PE and insulin sensitivity, and whether PC and PE are dynamically regulated in response to acute exercise in humans. Insulin sensitivity was measured via intravenous glucose tolerance in sedentary obese adults (OB; n = 14), individuals with type 2 diabetes (T2D; n = 15), and endurance-trained athletes (ATH; n = 15). Vastus lateralis muscle biopsies were obtained at rest, immediately after 90 min of cycle ergometry at 50% maximal oxygen consumption (V̇o2 max), and 2-h postexercise (recovery). Skeletal muscle PC and PE were measured via infusion-based mass spectrometry/mass spectrometry analysis. ATH had greater levels of muscle PC and PE compared with OB and T2D (P < 0.05), with total PC and PE positively relating to insulin sensitivity (both P < 0.05). Skeletal muscle PC:PE ratio was elevated in T2D compared with OB and ATH (P < 0.05), tended to be elevated in OB vs. ATH (P = 0.07), and was inversely related to insulin sensitivity among the entire cohort (r = −0.43, P = 0.01). Muscle PC and PE were altered by exercise, particularly after 2 h of recovery, in a highly group-specific manner. However, muscle PC:PE ratio remained unchanged in all groups. In summary, total muscle PC and PE are positively related to insulin sensitivity while PC:PE ratio is inversely related to insulin sensitivity in humans. A single session of exercise significantly alters skeletal muscle PC and PE levels, but not PC:PE ratio. PMID:27032901

  16. TPV Power Source Using Infrared-Sensitive Cells with Commercially Available Radiant Tube Burner

    NASA Astrophysics Data System (ADS)

    Fraas, Lewis; Minkin, Leonid; Hui, She; Avery, James; Howells, Christopher

    2004-11-01

    Over the last several years, JX Crystals has invented and systematically developed the key components for thermophotovoltaic systems. These key components include GaSb infrared sensitive cells, high power density shingle circuits, dielectric filters, and hydrocarbon-fueled radiant tube burners. Most recently, we invented and demonstrated an antireflection (AR)-coated tungsten IR emitter which, when integrated with the other key components, should make TPV systems with efficiencies over 10% practical. However, the use of the AR tungsten emitter requires an oxygen-free hermetic seal enclosure. During a 2003 Small Business Innovative Research (SBIR) Phase I contract, we integrated a tungsten emitter foil and a commercial SiC radiant tube burner within an emitter thermos and successfully demonstrated its operation at high temperature. We also designed a complete stand-alone 500 W TPV generator. During the upcoming SBIR Phase II, we plan to implement this design in hardware.

  17. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    NASA Astrophysics Data System (ADS)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty due to the limited experimental data available and to a poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which are of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species and 784 reactions is demonstrated. The resulting reduced skeletal model of 37--38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model.
The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling -- High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected mostly by first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates involving the vinyl radical, C2H3, can drastically reduce the uncertainty of predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate within its uncertainty bounds. Therefore, this study demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
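
    The variance shares reported by the HDMR decomposition (e.g., 10% of the extinction-strain-rate variance from second-order effects) are Sobol'-type sensitivity indices. A minimal first-order estimator on a toy additive model, using the classic pick-freeze scheme rather than the paper's RS-HDMR machinery, can be sketched as:

```python
import numpy as np

def sobol_first_order(model, n_inputs, n_samples=100_000, seed=0):
    """Estimate first-order Sobol' indices S_i = Var(E[Y|X_i]) / Var(Y)
    with a pick-freeze (Saltelli-type) Monte Carlo estimator, inputs
    assumed independent uniform on [0, 1]."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_inputs))
    B = rng.random((n_samples, n_inputs))
    yA = model(A)
    var_y = yA.var()
    indices = []
    for i in range(n_inputs):
        ABi = B.copy()
        ABi[:, i] = A[:, i]          # freeze input i from A, resample the rest
        yABi = model(ABi)
        # Covariance between runs sharing only X_i estimates Var(E[Y|X_i])
        s_i = (yA * yABi).mean() - yA.mean() * yABi.mean()
        indices.append(s_i / var_y)
    return indices

# Toy additive model y = 2*x1 + x2: exact shares are S1 = 0.8, S2 = 0.2
s1, s2 = sobol_first_order(lambda x: 2.0 * x[:, 0] + x[:, 1], n_inputs=2)
```

    For a purely additive model the first-order indices sum to one; a shortfall from one is the signature of the interaction (second-order) effects the abstract quantifies.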

  18. Anxiety sensitivity as a predictor of broad dimensions of psychopathology after cognitive behavioral therapy for panic disorder.

    PubMed

    Ino, Keiko; Ogawa, Sei; Kondo, Masaki; Imai, Risa; Ii, Toshitaka; Furukawa, Toshi A; Akechi, Tatsuo

    2017-01-01

    Panic disorder (PD) is a common disease and presents with broad dimensions of psychopathology. Cognitive behavioral therapy (CBT) is known to improve these broad dimensions of psychopathology in addition to PD symptoms. However, little is known about the predictors of treatment response in comorbid psychiatric symptoms after CBT for PD. Recent studies suggest that anxiety sensitivity (AS) may be a key vulnerability for PD. This study aimed to examine AS as a predictor of broad dimensions of psychopathology after CBT for PD. In total, 118 patients with PD were treated with manualized group CBT. We used multiple regression analysis to examine the associations between 3 Anxiety Sensitivity Index (ASI) factors (physical concerns, mental incapacitation concerns, and social concerns) at baseline and the subscales of the Symptom Checklist-90 Revised (SCL-90-R) at endpoint. Low levels of social concerns at baseline predicted low levels on 5 SCL-90-R subscales after CBT: interpersonal sensitivity, depression, hostility, paranoid ideation, and psychosis. High levels of mental incapacitation concerns significantly predicted low levels on 3 SCL-90-R subscales after treatment: interpersonal sensitivity, hostility, and paranoid ideation. Physical concerns at baseline did not predict broad dimensions of psychopathology. This study suggested that the social concerns and mental incapacitation concerns subscales of the ASI at baseline predicted several dimensions of psychopathology after CBT for PD. To improve comorbid psychopathology, it may be useful to direct more attention to these ASI subscales.
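
    The multiple-regression setup (an SCL-90-R subscale score regressed on the three baseline ASI factors) can be sketched with synthetic data; the sample size matches the abstract but the planted coefficients and the data are invented:

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares for y ~ b0 + b1*x1 + ... + bk*xk,
    returning the intercept followed by the slope coefficients."""
    A = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

rng = np.random.default_rng(42)
asi = rng.random((118, 3))                      # physical, mental, social concerns
scl = 1.0 + asi @ np.array([0.0, -0.5, 0.8])    # planted, noiseless effects
coef = fit_linear(asi, scl)                     # recovers [1.0, 0.0, -0.5, 0.8]
```

    In the study itself, a near-zero coefficient (like the planted physical-concerns term here) corresponds to a baseline factor that does not predict the outcome subscale.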

  19. Digital PCR methods improve detection sensitivity and measurement precision of low abundance mtDNA deletions.

    PubMed

    Belmonte, Frances R; Martin, James L; Frescura, Kristin; Damas, Joana; Pereira, Filipe; Tarnopolsky, Mark A; Kaufman, Brett A

    2016-04-28

    Mitochondrial DNA (mtDNA) mutations are a common cause of primary mitochondrial disorders, and have also been implicated in a broad collection of conditions, including aging, neurodegeneration, and cancer. Prevalent among these pathogenic variants are mtDNA deletions, which show a strong bias for the loss of sequence in the major arc between, but not including, the heavy and light strand origins of replication. Because individual mtDNA deletions can accumulate focally, occur with multiple mixed breakpoints, and in the presence of normal mtDNA sequences, methods that detect broad-spectrum mutations with enhanced sensitivity and limited costs have both research and clinical applications. In this study, we evaluated semi-quantitative and digital PCR-based methods of mtDNA deletion detection using double-stranded reference templates or biological samples. Our aim was to describe key experimental assay parameters that will enable the analysis of low levels or small differences in mtDNA deletion load during disease progression, with limited false-positive detection. We determined that the digital PCR method significantly improved mtDNA deletion detection sensitivity through absolute quantitation, improved precision and reduced assay standard error.
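
    The absolute quantitation that digital PCR provides rests on Poisson statistics of partitioning; the standard correction below is general dPCR practice, not a formula taken from this study:

```python
import math

def dpcr_copies(positive, total):
    """Absolute target copies in a digital PCR run via Poisson correction.

    With the sample split across `total` partitions, the positive fraction p
    underestimates the mean copies per partition, because a partition with
    one copy and one with several light up the same way; the Poisson model
    gives lambda = -ln(1 - p) mean copies per partition.
    """
    p = positive / total
    lam = -math.log(1.0 - p)       # mean copies per partition
    return lam * total             # estimated total copies loaded

# e.g. 5,000 positive out of 20,000 partitions
copies = dpcr_copies(5_000, 20_000)
```

    Deletion load would then be the ratio of two such absolute estimates (deleted vs. total mtDNA targets), which is why the method avoids the standard-curve error that semi-quantitative PCR carries.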

  20. Polyploidization mechanisms: temperature environment can induce diploid gamete formation in Rosa sp.

    PubMed

    Pécrix, Yann; Rallo, Géraldine; Folzer, Hélène; Cigna, Mireille; Gudin, Serge; Le Bris, Manuel

    2011-06-01

    Polyploidy is an important evolutionary phenomenon but the mechanisms by which polyploidy arises still remain underexplored. There may be an environmental component to polyploidization. This study aimed to clarify how temperature may promote diploid gamete formation considered an essential element for sexual polyploidization. First of all, a detailed cytological analysis of microsporogenesis and microgametogenesis was performed to target precisely the key developmental stages which are the most sensitive to temperature. Then, heat-induced modifications in sporad and pollen characteristics were analysed through an exposition of high temperature gradient. Rosa plants are sensitive to high temperatures with a developmental sensitivity window limited to meiosis. Moreover, the range of efficient temperatures is actually narrow. 36 °C at early meiosis led to a decrease in pollen viability, pollen ectexine defects but especially the appearance of numerous diploid pollen grains. They resulted from dyads or triads mainly formed following heat-induced spindle misorientations in telophase II. A high temperature environment has the potential to increase gamete ploidy level. The high frequencies of diplogametes obtained at some extreme temperatures support the hypothesis that polyploidization events could have occurred in adverse conditions and suggest polyploidization facilitating in a global change context.

  1. Digital PCR methods improve detection sensitivity and measurement precision of low abundance mtDNA deletions

    PubMed Central

    Belmonte, Frances R.; Martin, James L.; Frescura, Kristin; Damas, Joana; Pereira, Filipe; Tarnopolsky, Mark A.; Kaufman, Brett A.

    2016-01-01

    Mitochondrial DNA (mtDNA) mutations are a common cause of primary mitochondrial disorders, and have also been implicated in a broad collection of conditions, including aging, neurodegeneration, and cancer. Prevalent among these pathogenic variants are mtDNA deletions, which show a strong bias for the loss of sequence in the major arc between, but not including, the heavy and light strand origins of replication. Because individual mtDNA deletions can accumulate focally, occur with multiple mixed breakpoints, and in the presence of normal mtDNA sequences, methods that detect broad-spectrum mutations with enhanced sensitivity and limited costs have both research and clinical applications. In this study, we evaluated semi-quantitative and digital PCR-based methods of mtDNA deletion detection using double-stranded reference templates or biological samples. Our aim was to describe key experimental assay parameters that will enable the analysis of low levels or small differences in mtDNA deletion load during disease progression, with limited false-positive detection. We determined that the digital PCR method significantly improved mtDNA deletion detection sensitivity through absolute quantitation, improved precision and reduced assay standard error. PMID:27122135

  2. Ultrasensitive microfluidic solid-phase ELISA using an actuatable microwell-patterned PDMS chip.

    PubMed

    Wang, Tanyu; Zhang, Mohan; Dreher, Dakota D; Zeng, Yong

    2013-11-07

    Quantitative detection of low abundance proteins is of significant interest for biological and clinical applications. Here we report an integrated microfluidic solid-phase ELISA platform for rapid and ultrasensitive detection of proteins with a wide dynamic range. Compared to the existing microfluidic devices that perform affinity capture and enzyme-based optical detection in a constant channel volume, the key novelty of our design is two-fold. First, our system integrates a microwell-patterned assay chamber that can be pneumatically actuated to significantly reduce the volume of chemifluorescent reaction, markedly improving the sensitivity and speed of ELISA. Second, monolithic integration of on-chip pumps and the actuatable assay chamber allow programmable fluid delivery and effective mixing for rapid and sensitive immunoassays. Ultrasensitive microfluidic ELISA was demonstrated for insulin-like growth factor 1 receptor (IGF-1R) across at least five orders of magnitude with an extremely low detection limit of 21.8 aM. The microwell-based solid-phase ELISA strategy provides an expandable platform for developing the next-generation microfluidic immunoassay systems that integrate and automate digital and analog measurements to further improve the sensitivity, dynamic ranges, and reproducibility of proteomic analysis.
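
    The 21.8 aM detection limit is derived from assay calibration; one common convention (blank mean + 3 SD interpolated on a linear calibration, assumed here with made-up numbers, not the paper's actual data) looks like:

```python
import statistics

def limit_of_detection(blank_signals, slope, intercept):
    """Concentration LOD from blank replicates and a linear calibration
    signal = slope * concentration + intercept, using the common
    'blank mean + 3*SD' signal cutoff."""
    cutoff = statistics.mean(blank_signals) + 3 * statistics.stdev(blank_signals)
    return (cutoff - intercept) / slope

# Illustrative blank replicates and calibration parameters
lod = limit_of_detection([100, 102, 98, 101, 99], slope=50.0, intercept=100.0)
```

    Reducing the reaction volume, as the actuated microwells do, raises the local product concentration and hence the slope of the calibration, which is what pushes the LOD down.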

  3. Validation and sensitivity of the FINE Bayesian network for forecasting aquatic exposure to nano-silver.

    PubMed

    Money, Eric S; Barton, Lauren E; Dawson, Joseph; Reckhow, Kenneth H; Wiesner, Mark R

    2014-03-01

    The adaptive nature of the Forecasting the Impacts of Nanomaterials in the Environment (FINE) Bayesian network is explored. We create an updated FINE model (FINEAgNP-2) for predicting aquatic exposure concentrations of silver nanoparticles (AgNP) by combining the expert-based parameters from the baseline model established in previous work with literature data related to particle behavior, exposure, and nano-ecotoxicology via parameter learning. We validate the AgNP forecast from the updated model using mesocosm-scale field data and determine the sensitivity of several key variables to changes in environmental conditions, particle characteristics, and particle fate. Results show that the prediction accuracy of the FINEAgNP-2 model increased approximately 70% over the baseline model, with an error rate of only 20%, suggesting that FINE is a reliable tool to predict aquatic concentrations of nano-silver. Sensitivity analysis suggests that fractal dimension, particle diameter, conductivity, time, and particle fate have the most influence on aquatic exposure given the current knowledge; however, numerous knowledge gaps can be identified to suggest further research efforts that will reduce the uncertainty in subsequent exposure and risk forecasts. Copyright © 2013 Elsevier B.V. All rights reserved.
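
    The parameter-learning step that updates FINE's expert-based conditional probability tables with literature data can be illustrated on a toy scale; the variable states and counts below are invented stand-ins, not FINE's actual nodes:

```python
from collections import Counter

def learn_cpt(observations, parent_states, child_states, pseudo=1.0):
    """Estimate a conditional probability table P(child | parent) from
    (parent, child) observation pairs, with Laplace smoothing so unseen
    combinations keep nonzero probability (the smoothed counts play the
    role of the expert prior being updated by data)."""
    counts = Counter(observations)
    cpt = {}
    for p in parent_states:
        total = sum(counts[(p, c)] for c in child_states) + pseudo * len(child_states)
        cpt[p] = {c: (counts[(p, c)] + pseudo) / total for c in child_states}
    return cpt

obs = [("high", "toxic"), ("high", "toxic"), ("high", "safe"), ("low", "safe")]
cpt = learn_cpt(obs, ["high", "low"], ["toxic", "safe"])
```

    Each new batch of mesocosm or literature data re-runs this counting step, which is what makes the network "adaptive" in the sense the abstract describes.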

  4. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
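
    The cumulative-probability-of-exceedance output described above can be illustrated with a plain Monte Carlo estimate; the stress distribution below is a stand-in for illustration, not the PSAM stochastic formulation:

```python
import random

def exceedance_probability(sample_stress, threshold, n=100_000, seed=1):
    """Monte Carlo estimate of P(max stress > threshold), one point on the
    exceedance distribution (CDF) that PSAM reports. `sample_stress` draws
    one stress realization from the stochastic load/property model."""
    rng = random.Random(seed)
    hits = sum(sample_stress(rng) > threshold for _ in range(n))
    return hits / n

# Toy model: stress ~ Normal(400 MPa, 30 MPa); probability of exceeding 460 MPa
p = exceedance_probability(lambda rng: rng.gauss(400.0, 30.0), threshold=460.0)
```

    Sweeping the threshold traces out the full exceedance CDF, and repeating the estimate with perturbed input distributions gives exactly the design-sensitivity information the abstract emphasizes.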

  5. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  6. Assessing gendered roles in water decision-making in semi-arid regions through sex-disaggregated water data with UNESCO-WWAP gender toolkit

    NASA Astrophysics Data System (ADS)

    Miletto, Michela; Greco, Francesca; Belfiore, Elena

    2017-04-01

    Global climate change is expected to exacerbate current and future stresses on water resources from population growth and land use, and increase the frequency and severity of droughts and floods. Women are more vulnerable to the effects of climate change than men not only because they constitute the majority of the world's poor but also because they are more dependent for their livelihood on natural resources that are threatened by climate change. In addition, social, economic and political barriers often limit their coping capacity. Women play a key role in the provision, management and safeguarding of water; nonetheless, gender inequality in water management frameworks persists around the globe. Accurate data are essential to inform decisions and support effective policies. Disaggregating water data by sex is crucial to analyse gendered roles in the water realm and inform gender-sensitive water policies in light of the global commitments to gender equality of Agenda 2030. In view of this scenario, WWAP has created an innovative toolkit for sex-disaggregated water data collection, as the result of participatory work by more than 35 experts, part of the WWAP Working Group on Sex-Disaggregated Indicators (http://www.unesco.org/new/en/natural-sciences/environment/water/wwap/water-and-gender/un-wwap-working-group-on-gender-disaggregated-indicators/#c1430774). The WWAP toolkit contains four tools: the methodology (Seager J., WWAP UNESCO, 2015), a set of key indicators, the guideline (Pangare V., WWAP UNESCO, 2015) and a questionnaire for field survey. WWAP key gender-sensitive indicators address water resources management, aspects of water quality and agricultural uses, and water resources governance and management, and investigate unaccounted labour according to gender and age. Managing water resources is key for climate adaptation. Women are particularly sensitive to water quality and the health of water-dependent ecosystems, often a source of food and job opportunities.
Extreme climatic events like floods and droughts could severely impact the status of water resources and dependent ecosystems and the sustainability of household activities and local economies, given the absence of gender-sensitive preparedness to hydrological and meteorological extremes. This paper describes the application of the WWAP Gender Toolkit to water data assessments in the semi-arid region of the Stampriet transboundary aquifer shared by Botswana, Namibia and South Africa, in the framework of the "Groundwater Resources Governance in Transboundary Aquifers" (GGRETA) project, led and executed by the UNESCO International Hydrological Programme (IHP) and financed by the Swiss Agency for Development and Cooperation (SDC). The tests in the field proved the reliability of the WWAP gender toolkit and of the selected gender-sensitive indicators in freshwater assessment. Further analysis could inform on the gaps and needs for climate adaptation practices. Field data identified socially determined differences in roles, and confirmed the prevalent role of women in managing freshwater for drinking and sanitation purposes within the household boundaries, while decision-making for water allocation and use (with implications on hydrological risk) for agriculture and livestock purposes is broadly under men's responsibility.

  7. Electrochemical Determination of Chlorpyrifos on a Nano-TiO₂/Cellulose Acetate Composite Modified Glassy Carbon Electrode.

    PubMed

    Kumaravel, Ammasai; Chandrasekaran, Maruthai

    2015-07-15

    A rapid and simple method of determination of chlorpyrifos is important in environmental monitoring and quality control. Electrochemical methods for the determination of pesticides are fast, sensitive, reproducible, and cost-effective. The key factor in electrochemical methods is the choice of suitable electrode materials. The electrode materials should have good stability, reproducibility, high sensitivity, and an easy method of preparation. Mercury-based electrodes have been widely used for the determination of chlorpyrifos. From an environmental point of view, mercury cannot be used. In this study a biocompatible nano-TiO2/cellulose acetate modified glassy carbon electrode was prepared by a simple method and used for the electrochemical sensing of chlorpyrifos in aqueous methanolic solution. Electroanalytical techniques such as cyclic voltammetry, differential pulse voltammetry, and amperometry were used in this work. This electrode showed very good stability, reproducibility, and sensitivity. A well-defined peak was obtained for the reduction of chlorpyrifos in cyclic voltammetry and differential pulse voltammetry. A smooth noise-free current response was obtained in amperometric analysis. The peak current obtained was proportional to the concentration of chlorpyrifos and was used to determine the unknown concentration of chlorpyrifos in the samples. Analytical parameters such as LOD, LOQ, and linear range were estimated. Analysis of real samples was also carried out. The results were validated through HPLC. This composite electrode can be used as an alternative to mercury electrodes reported in the literature.
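
    LOD and LOQ for a voltammetric sensor are commonly estimated with the ICH 3.3σ/S and 10σ/S conventions from the blank noise σ and calibration slope S; the values below are illustrative, since the abstract does not report them:

```python
def lod_loq(sigma_blank, slope):
    """ICH-style detection and quantitation limits from a linear calibration:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where sigma is the standard
    deviation of the blank response and S the calibration slope."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Illustrative amperometric numbers: noise 0.02 uA, slope 1.6 uA per uM
lod, loq = lod_loq(sigma_blank=0.02, slope=1.6)
```

    The linear range is then the concentration span over which the peak-current-versus-concentration fit stays linear above the LOQ.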

  8. A Web-Based Education Program for Colorectal Lesion Diagnosis with Narrow Band Imaging Classification.

    PubMed

    Aihara, Hiroyuki; Kumar, Nitin; Thompson, Christopher C

    2018-04-19

    An education system for narrow band imaging (NBI) interpretation requires sufficient exposure to key features. However, access to didactic lectures by experienced teachers is limited in the United States. To develop and assess the effectiveness of a colorectal lesion identification tutorial. In the image analysis pretest, subjects including 9 experts and 8 trainees interpreted 50 white light (WL) and 50 NBI images of colorectal lesions. Results were not reviewed with subjects. Trainees then participated in an online tutorial emphasizing NBI interpretation in colorectal lesion analysis. A post-test was administered and diagnostic yields were compared to pre-education diagnostic yields. Under the NBI mode, experts showed higher diagnostic yields (sensitivity 91.5% [87.3-94.4], specificity 90.6% [85.1-94.2], and accuracy 91.1% [88.5-93.7] with substantial interobserver agreement [κ value 0.71]) compared to trainees (sensitivity 89.6% [84.8-93.0], specificity 80.6% [73.5-86.3], and accuracy 86.0% [82.6-89.2], with substantial interobserver agreement [κ value 0.69]). The online tutorial improved the diagnostic yields of trainees to the equivalent level of experts (sensitivity 94.1% [90.0-96.6], specificity 89.0% [83.0-93.2], and accuracy 92.0% [89.3-94.7], p < 0.001 with substantial interobserver agreement [κ value 0.78]). This short, online tutorial improved diagnostic performance and interobserver agreement. © 2018 S. Karger AG, Basel.
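
    The diagnostic yields reported above can be computed from a 2x2 confusion matrix; the counts below are invented for illustration, and the kappa here measures agreement against the reference diagnosis (the abstract's kappa compares observers, but the formula is the same):

```python
def diagnostic_yields(tp, fp, fn, tn):
    """Sensitivity, specificity, accuracy, and Cohen's kappa from a 2x2
    confusion matrix (tp/fp/fn/tn = true/false positives/negatives)."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / n
    # Chance agreement for kappa, from the marginal totals
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (acc - p_e) / (1 - p_e)
    return sens, spec, acc, kappa

sens, spec, acc, kappa = diagnostic_yields(tp=45, fp=5, fn=5, tn=45)
```

    With these symmetric toy counts all three yields are 0.90 and kappa is 0.80, i.e. "substantial" agreement on the conventional Landis-Koch scale.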

  9. Security enhancement of optical encryption based on biometric array keys

    NASA Astrophysics Data System (ADS)

    Yan, Aimin; Wei, Yang; Zhang, Jingtao

    2018-07-01

    A novel optical image encryption method is proposed by using Dammann grating and biometric array keys. Dammann grating is utilized to create a 2D finite uniform-intensity spot array. In encryption, a fingerprint array is used as private encryption keys. An original image can be encrypted by a scanning Fresnel zone plate array. Encrypted signals are processed by an optical coherent heterodyne detection system. Biometric array keys and optical scanning cryptography are integrated with each other to greatly enhance information security. Numerical simulations are performed to demonstrate the feasibility and validity of this method. Analyses of key sensitivity and of resistance against possible attacks are provided.
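
    A key-sensitivity analysis checks that a tiny change in the key produces a completely different ciphertext (roughly half the bits flip). The sketch below uses a toy SHA-256 keystream cipher as a stand-in for the optical scheme; only the measurement procedure, not the cipher, reflects the paper:

```python
import hashlib

def keystream_encrypt(data, key):
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.
    XOR is an involution, so the same call also decrypts."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

def bit_change_ratio(a, b):
    """Fraction of bits that differ between two equal-length byte strings."""
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / (8 * len(a))

msg = b"plaintext for the key sensitivity check" * 10
c1 = keystream_encrypt(msg, b"secret-key")
c2 = keystream_encrypt(msg, b"secret-kez")   # single-character key change
ratio = bit_change_ratio(c1, c2)             # ~0.5 for a key-sensitive cipher
```

    A ratio near 0.5 indicates the ciphertexts are statistically unrelated, which is the property the paper's key-sensitivity analysis is meant to demonstrate.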

  10. Teacher-Student Sexual Relations: Key Risks and Ethical Issues

    ERIC Educational Resources Information Center

    Sikes, Pat

    2010-01-01

    Researching actual or purported sexual contact between teachers and students raises many difficult ethical issues, questions and dilemmas, which may help to explain why few have ventured into the field. This experientially based paper addresses key problem areas under the headings of: the ethics of researching a sensitive taboo topic; the ethics…

  11. Role of CTGF in sensitivity to hyperthermia in ovarian and uterine cancers

    DOE PAGES

    Hatakeyama, Hiroto; Wu, Sherry Y.; Lyons, Yasmin A.; ...

    2016-11-01

Even though hyperthermia is a promising treatment for cancer, the relationship between specific temperatures and clinical benefits and predictors of sensitivity of cancer to hyperthermia is poorly understood. Ovarian and uterine tumors have diverse hyperthermia sensitivities. Integrative analyses of the specific gene signatures and the differences in response to hyperthermia between hyperthermia-sensitive and -resistant cancer cells identified CTGF as a key regulator of sensitivity. CTGF silencing sensitized resistant cells to hyperthermia. CTGF small interfering RNA (siRNA) treatment also sensitized resistant cancers to localized hyperthermia induced by copper sulfide nanoparticles and near-infrared laser in orthotopic ovarian cancer models. Lastly, CTGF silencing aggravated energy stress induced by hyperthermia and enhanced apoptosis of hyperthermia-resistant cancers.

  13. Privacy Protection for Telecare Medicine Information Systems Using a Chaotic Map-Based Three-Factor Authenticated Key Agreement Scheme.

    PubMed

    Zhang, Liping; Zhu, Shaohui; Tang, Shanyu

    2017-03-01

Telecare medicine information systems (TMIS) provide flexible and convenient e-health care. However, the medical records transmitted in TMIS are exposed to unsecured public networks, so TMIS are more vulnerable to various types of security threats and attacks. To provide privacy protection for TMIS, a secure and efficient authenticated key agreement scheme is urgently needed to protect the sensitive medical data. Recently, Mishra et al. proposed a biometrics-based authenticated key agreement scheme for TMIS using a hash function and nonces; they claimed that their scheme could eliminate the security weaknesses of Yan et al.'s scheme and provide dynamic identity protection and user anonymity. In this paper, however, we demonstrate that Mishra et al.'s scheme suffers from replay attacks and man-in-the-middle attacks and fails to provide perfect forward secrecy. To overcome the weaknesses of Mishra et al.'s scheme, we then propose a three-factor authenticated key agreement scheme that enables the patient to enjoy remote healthcare services via TMIS with privacy protection. Chaotic map-based cryptography is employed in the proposed scheme to achieve a delicate balance of security and performance. Security analysis demonstrates that the proposed scheme resists various attacks and provides several attractive security properties. Performance evaluation shows that the proposed scheme increases efficiency in comparison with other related schemes.
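Chaotic map-based key agreement schemes of this kind are commonly built on Chebyshev polynomials over a prime field, whose semigroup property T_a(T_b(x)) = T_b(T_a(x)) = T_ab(x) mod p supports a Diffie-Hellman-style exchange. A toy sketch of that algebra (not the paper's actual protocol, and with tiny, insecure parameters chosen only for illustration):

```python
P = 2**13 - 1  # small prime modulus (demo assumption; real schemes use large primes)

def chebyshev(n, x, p=P):
    """T_n(x) mod p via the recurrence T_k = 2*x*T_{k-1} - T_{k-2}."""
    t0, t1 = 1, x % p
    if n == 0:
        return t0
    for _ in range(n - 1):
        t0, t1 = t1, (2 * x * t1 - t0) % p
    return t1

x = 1234                      # public seed
a, b = 77, 131                # private keys of the two parties
ya = chebyshev(a, x)          # sent by party A
yb = chebyshev(b, x)          # sent by party B
shared_a = chebyshev(a, yb)   # A computes T_a(T_b(x))
shared_b = chebyshev(b, ya)   # B computes T_b(T_a(x))
assert shared_a == shared_b == chebyshev(a * b, x)  # semigroup property
```

The identity T_a(T_b(x)) = T_ab(x) holds as a polynomial identity over the integers, so it survives reduction mod p; that commutativity is what lets both parties derive the same session key.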

  14. Evaluating integrated watershed management using multiple criteria analysis--a case study at Chittagong Hill Tracts in Bangladesh.

    PubMed

    Biswas, Shampa; Vacik, Harald; Swanson, Mark E; Haque, S M Sirajul

    2012-05-01

Criteria and indicators assessment is one way to evaluate management strategies for mountain watersheds. One such framework, Integrated Watershed Management (IWM), was employed in the Chittagong Hill Tracts region of Bangladesh using a multi-criteria analysis approach. The IWM framework, consisting of the design and application of principles, criteria, indicators, and verifiers (PCIV), facilitates active participation by diverse professionals, experts, and interest groups in watershed management, addressing demands and problems explicitly and capturing their complexity in a transparent and understandable way. Management alternatives were developed to fulfill every key component of IWM, considering the developed PCIV set and the current situation of the study area. Different management strategies, each focusing on a different approach (biodiversity conservation, flood control, soil and water quality conservation, indigenous knowledge conservation, income generation, watershed conservation, and landscape conservation), were assessed qualitatively on their potential to improve the current situation according to each verifier of the criteria and indicator set. The Analytic Hierarchy Process (AHP), including sensitivity analysis, was employed to identify an appropriate management strategy according to the overall priorities (i.e., different weights for each principle) of key informants. The AHP process indicated that a strategy focused on conservation of biodiversity provided the best option for addressing watershed-related challenges in the Chittagong Hill Tracts, Bangladesh.
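The AHP step described above derives priority weights from pairwise comparisons and checks their consistency. A hedged sketch with an illustrative 3x3 comparison matrix (made-up judgments, not the study's data), using power iteration for the principal eigenvector and the usual consistency ratio threshold of 0.1:

```python
def ahp_weights(m, iters=100):
    """Priority weights and principal-eigenvalue estimate by power iteration."""
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    lam = sum(sum(m[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return w, lam

# Pairwise comparisons of three hypothetical principles (e.g. biodiversity,
# flood control, income generation) on Saaty's 1-9 scale.
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, lam = ahp_weights(M)
ci = (lam - 3) / (3 - 1)   # consistency index for n = 3
cr = ci / 0.58             # random index RI = 0.58 for n = 3
print([round(x, 3) for x in w], round(cr, 3))
```

Sensitivity analysis then perturbs the principle weights (the rows of M) to see whether the top-ranked strategy changes, which is how the robustness of the "biodiversity conservation" recommendation would be probed.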

  15. A parametric sensitivity study for single-stage-to-orbit hypersonic vehicles using trajectory optimization

    NASA Astrophysics Data System (ADS)

    Lovell, T. Alan; Schmidt, D. K.

    1994-03-01

The class of hypersonic vehicle configurations with single-stage-to-orbit (SSTO) capability reflects highly integrated airframe and propulsion systems. These designs are also known to exhibit a large degree of interaction between the airframe and engine dynamics. Consequently, even simplified hypersonic models are characterized by tightly coupled nonlinear equations of motion. In addition, hypersonic SSTO vehicles present a major system design challenge: the vehicle's overall mission performance is a function of its subsystem efficiencies, including structural, aerodynamic, propulsive, and operational efficiency. Further, all subsystem efficiencies are interrelated; hence, independent optimization of the subsystems is not likely to lead to an optimum design. Thus, it is desired to know the effect of various subsystem efficiencies on overall mission performance. For the purposes of this analysis, mission performance is measured in terms of the payload weight inserted into orbit. In this report, a trajectory optimization problem is formulated for a generic hypersonic lifting body for a specified orbit-injection mission. A solution method is outlined, and results are detailed for the generic vehicle, referred to as the baseline model. After evaluating the performance of the baseline model, a sensitivity study is presented to determine the effect of various subsystem efficiencies on mission performance. This consists of performing a parametric analysis of the basic design parameters, generating a matrix of configurations, and determining the mission performance of each configuration. Also, the performance loss due to constraining the total head load experienced by the vehicle is evaluated. The key results from this analysis include the formulation of the sizing problem for this vehicle class using trajectory optimization, characteristics of the optimal trajectories, and the subsystem design sensitivities.

  16. A Personalized Approach of Patient-Health Care Provider Communication Regarding Colorectal Cancer Screening Options.

    PubMed

    Sava, M Gabriela; Dolan, James G; May, Jerrold H; Vargas, Luis G

    2018-07-01

    Current colorectal cancer screening guidelines by the US Preventive Services Task Force endorse multiple options for average-risk patients and recommend that screening choices should be guided by individual patient preferences. Implementing these recommendations in practice is challenging because they depend on accurate and efficient elicitation and assessment of preferences from patients who are facing a novel task. To present a methodology for analyzing the sensitivity and stability of a patient's preferences regarding colorectal cancer screening options and to provide a starting point for a personalized discussion between the patient and the health care provider about the selection of the appropriate screening option. This research is a secondary analysis of patient preference data collected as part of a previous study. We propose new measures of preference sensitivity and stability that can be used to determine if additional information provided would result in a change to the initially most preferred colorectal cancer screening option. Illustrative results of applying the methodology to the preferences of 2 patients, of different ages, are provided. The results show that different combinations of screening options are viable for each patient and that the health care provider should emphasize different information during the medical decision-making process. Sensitivity and stability analysis can supply health care providers with key topics to focus on when communicating with a patient and the degree of emphasis to place on each of them to accomplish specific goals. The insights provided by the analysis can be used by health care providers to approach communication with patients in a more personalized way, by taking into consideration patients' preferences before adding their own expertise to the discussion.

  17. The challenge of rapid diagnosis in oncology: Diagnostic accuracy and cost analysis of a large-scale one-stop breast clinic.

    PubMed

    Delaloge, Suzette; Bonastre, Julia; Borget, Isabelle; Garbay, Jean-Rémi; Fontenay, Rachel; Boinon, Diane; Saghatchian, Mahasti; Mathieu, Marie-Christine; Mazouni, Chafika; Rivera, Sofia; Uzan, Catherine; André, Fabrice; Dromain, Clarisse; Boyer, Bruno; Pistilli, Barbara; Azoulay, Sandy; Rimareix, Françoise; Bayou, El-Hadi; Sarfati, Benjamin; Caron, Hélène; Ghouadni, Amal; Leymarie, Nicolas; Canale, Sandra; Mons, Muriel; Arfi-Rouche, Julia; Arnedos, Monica; Suciu, Voichita; Vielh, Philippe; Balleyguier, Corinne

    2016-10-01

Rapid diagnosis is a key issue in modern oncology, for which one-stop breast clinics are a model. We aimed to assess the diagnostic accuracy and procedure costs of a large-scale one-stop breast clinic. A total of 10,602 individuals with suspect breast lesions attended the Gustave Roussy's regional one-stop breast clinic between 2004 and 2012. The multidisciplinary clinic uses multimodal imaging together with ultrasonography-guided fine needle aspiration for masses and ultrasonography-guided and stereotactic biopsies as needed. Diagnostic accuracy was assessed by comparing one-stop diagnosis to the consolidated diagnosis obtained after surgery or biopsy or long-term monitoring. The medical cost per patient of the care pathway was assessed from patient-level data collected prospectively. Sixty-nine percent of the patients had masses, while 31% had micro-calcifications or other non-mass lesions. In 75% of the cases (87% of masses), an exact diagnosis could be given on the same day. In the base-case analysis (i.e. considering only benign and malignant lesions at one-stop and at consolidated diagnoses), the sensitivity of the one-stop clinic was 98.4%, specificity 99.8%, positive and negative predictive values 99.7% and 99.0%. In the sensitivity analysis (reclassification of suspect, atypical and undetermined lesions), diagnostic sensitivity varied from 90.3% to 98.5% and specificity varied from 94.3% to 99.8%. The mean medical cost per patient of the one-stop diagnostic procedure was €420. One-stop breast clinics can provide timely and cost-efficient delivery of highly accurate diagnoses and serve as a model of care for multiple settings, including rapid screening-linked diagnosis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Role of an archaeal PitA transporter in the copper and arsenic resistance of Metallosphaera sedula, an extreme thermoacidophile.

    PubMed

    McCarthy, Samuel; Ai, Chenbing; Wheaton, Garrett; Tevatia, Rahul; Eckrich, Valerie; Kelly, Robert; Blum, Paul

    2014-10-01

    Thermoacidophilic archaea, such as Metallosphaera sedula, are lithoautotrophs that occupy metal-rich environments. In previous studies, an M. sedula mutant lacking the primary copper efflux transporter, CopA, became copper sensitive. In contrast, the basis for supranormal copper resistance remained unclear in the spontaneous M. sedula mutant, CuR1. Here, transcriptomic analysis of copper-shocked cultures indicated that CuR1 had a unique regulatory response to metal challenge corresponding to the upregulation of 55 genes. Genome resequencing identified 17 confirmed mutations unique to CuR1 that were likely to change gene function. Of these, 12 mapped to genes with annotated function associated with transcription, metabolism, or transport. These mutations included 7 nonsynonymous substitutions, 4 insertions, and 1 deletion. One of the insertion mutations mapped to pseudogene Msed_1517 and extended its reading frame an additional 209 amino acids. The extended mutant allele was identified as a homolog of Pho4, a family of phosphate symporters that includes the bacterial PitA proteins. Orthologs of this allele were apparent in related extremely thermoacidophilic species, suggesting M. sedula naturally lacked this gene. Phosphate transport studies combined with physiologic analysis demonstrated M. sedula PitA was a low-affinity, high-velocity secondary transporter implicated in copper resistance and arsenate sensitivity. Genetic analysis demonstrated that spontaneous arsenate-resistant mutants derived from CuR1 all underwent mutation in pitA and nonselectively became copper sensitive. Taken together, these results point to archaeal PitA as a key requirement for the increased metal resistance of strain CuR1 and its accelerated capacity for copper bioleaching. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  19. [Value of Immunohistochemical Methods in Detecting EML4-ALK Fusion Mutations: A Meta-analysis].

    PubMed

    Liu, Chang; Cai, Lu; Zhong, Diansheng; Wang, Jing

    2016-01-01

The fusion between echinoderm microtubule-associated protein 4 (EML4) and anaplastic lymphoma kinase (ALK) is present in approximately 5% of non-small cell lung cancer (NSCLC) patients and has been regarded as another new target gene after epidermal growth factor receptor (EGFR) and K-ras. Reports show that the disease control rate can reach up to 80% in NSCLC patients with the EML4-ALK fusion gene after treatment with ALK inhibitors. Thus, an accurate and rapid detection method is key to screening NSCLC patients with EML4-ALK expression. The aim of this study was to analyze the specificity and sensitivity of immunohistochemistry (IHC) in detecting EML4-ALK fusion mutations, to evaluate the accuracy and clinical value of this method, and thereby to provide a basis for individualized molecular therapy of NSCLC patients. The PubMed database was searched for all relevant publications, with a retrieval deadline of February 25, 2015, and articles were then screened according to the inclusion and exclusion criteria. Diagnostic test meta-analysis methods were used to analyze the sensitivity and specificity of IHC compared with fluorescence in situ hybridization (FISH). Eleven studies comprising 3,234 cases were included in the meta-analysis. The diagnostic odds ratio (DOR) was 1,135.00 (95%CI: 337.10-3,821.46); the area under the summary receiver operating characteristic (SROC) curve was 0.992,3 (SEAUC=0.003,2), and the Q* was 0.964,4 (SEQ*=0.008,7). Immunohistochemical detection of the EML4-ALK fusion gene mutation with a specific antibody is feasible, with high sensitivity and specificity. IHC can serve as a simple and rapid method for screening the EML4-ALK fusion gene mutation and has important clinical value.
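The per-study inputs to a diagnostic-test meta-analysis like this are 2x2 tables of the index test (IHC) against the reference standard (FISH). A minimal sketch with fabricated counts (not the paper's data), including the usual continuity correction for zero cells and a deliberately simplified equal-weight pooling of the log diagnostic odds ratio:

```python
import math

def study_metrics(tp, fp, fn, tn, cc=0.5):
    """Sensitivity, specificity, and diagnostic odds ratio for one study.
    A continuity correction cc is added to every cell when any cell is zero."""
    if 0 in (tp, fp, fn, tn):
        tp, fp, fn, tn = (x + cc for x in (tp, fp, fn, tn))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    dor = (tp * tn) / (fp * fn)
    return sens, spec, dor

# (TP, FP, FN, TN) per hypothetical study: IHC vs FISH
studies = [(30, 2, 1, 250), (18, 1, 2, 140), (25, 0, 1, 300)]
for s in studies:
    sens, spec, dor = study_metrics(*s)
    print(f"sens={sens:.3f} spec={spec:.3f} DOR={dor:.1f}")

# Equal-weight average on the log scale; real meta-analyses weight
# each study by the inverse variance of its log DOR.
pooled = math.exp(sum(math.log(study_metrics(*s)[2]) for s in studies) / len(studies))
print(f"pooled DOR ~ {pooled:.0f}")
```

The SROC curve and its AUC are then fitted across the studies' (sensitivity, 1-specificity) pairs, which is where summary figures such as the reported AUC of 0.992,3 come from.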

  20. Role of an Archaeal PitA Transporter in the Copper and Arsenic Resistance of Metallosphaera sedula, an Extreme Thermoacidophile

    PubMed Central

    McCarthy, Samuel; Ai, Chenbing; Wheaton, Garrett; Tevatia, Rahul; Eckrich, Valerie; Kelly, Robert

    2014-01-01

    Thermoacidophilic archaea, such as Metallosphaera sedula, are lithoautotrophs that occupy metal-rich environments. In previous studies, an M. sedula mutant lacking the primary copper efflux transporter, CopA, became copper sensitive. In contrast, the basis for supranormal copper resistance remained unclear in the spontaneous M. sedula mutant, CuR1. Here, transcriptomic analysis of copper-shocked cultures indicated that CuR1 had a unique regulatory response to metal challenge corresponding to the upregulation of 55 genes. Genome resequencing identified 17 confirmed mutations unique to CuR1 that were likely to change gene function. Of these, 12 mapped to genes with annotated function associated with transcription, metabolism, or transport. These mutations included 7 nonsynonymous substitutions, 4 insertions, and 1 deletion. One of the insertion mutations mapped to pseudogene Msed_1517 and extended its reading frame an additional 209 amino acids. The extended mutant allele was identified as a homolog of Pho4, a family of phosphate symporters that includes the bacterial PitA proteins. Orthologs of this allele were apparent in related extremely thermoacidophilic species, suggesting M. sedula naturally lacked this gene. Phosphate transport studies combined with physiologic analysis demonstrated M. sedula PitA was a low-affinity, high-velocity secondary transporter implicated in copper resistance and arsenate sensitivity. Genetic analysis demonstrated that spontaneous arsenate-resistant mutants derived from CuR1 all underwent mutation in pitA and nonselectively became copper sensitive. Taken together, these results point to archaeal PitA as a key requirement for the increased metal resistance of strain CuR1 and its accelerated capacity for copper bioleaching. PMID:25092032

  1. A parametric sensitivity study for single-stage-to-orbit hypersonic vehicles using trajectory optimization

    NASA Technical Reports Server (NTRS)

    Lovell, T. Alan; Schmidt, D. K.

    1994-01-01

    The class of hypersonic vehicle configurations with single stage-to-orbit (SSTO) capability reflect highly integrated airframe and propulsion systems. These designs are also known to exhibit a large degree of interaction between the airframe and engine dynamics. Consequently, even simplified hypersonic models are characterized by tightly coupled nonlinear equations of motion. In addition, hypersonic SSTO vehicles present a major system design challenge; the vehicle's overall mission performance is a function of its subsystem efficiencies including structural, aerodynamic, propulsive, and operational. Further, all subsystem efficiencies are interrelated, hence, independent optimization of the subsystems is not likely to lead to an optimum design. Thus, it is desired to know the effect of various subsystem efficiencies on overall mission performance. For the purposes of this analysis, mission performance will be measured in terms of the payload weight inserted into orbit. In this report, a trajectory optimization problem is formulated for a generic hypersonic lifting body for a specified orbit-injection mission. A solution method is outlined, and results are detailed for the generic vehicle, referred to as the baseline model. After evaluating the performance of the baseline model, a sensitivity study is presented to determine the effect of various subsystem efficiencies on mission performance. This consists of performing a parametric analysis of the basic design parameters, generating a matrix of configurations, and determining the mission performance of each configuration. Also, the performance loss due to constraining the total head load experienced by the vehicle is evaluated. The key results from this analysis include the formulation of the sizing problem for this vehicle class using trajectory optimization, characteristics of the optimal trajectories, and the subsystem design sensitivities.

  2. A cost-minimisation analysis comparing photoselective vaporisation (PVP) and transurethral resection of the prostate (TURP) for the management of symptomatic benign prostatic hyperplasia (BPH) in Queensland, Australia.

    PubMed

    Whitty, Jennifer A; Crosland, Paul; Hewson, Kaye; Narula, Rajan; Nathan, Timothy R; Campbell, Peter A; Keller, Andrew; Scuffham, Paul A

    2014-03-01

    To compare the costs of photoselective vaporisation (PVP) and transurethral resection of the prostate (TURP) for management of symptomatic benign prostatic hyperplasia (BPH) from the perspective of a Queensland public hospital provider. A decision-analytic model was used to compare the costs of PVP and TURP. Cost inputs were sourced from an audit of patients undergoing PVP or TURP across three hospitals. The probability of re-intervention was obtained from secondary literature sources. Probabilistic and multi-way sensitivity analyses were used to account for uncertainty and test the impact of varying key assumptions. In the base case analysis, which included equipment, training and re-intervention costs, PVP was AU$ 739 (95% credible interval [CrI] -12 187 to 14 516) more costly per patient than TURP. The estimate was most sensitive to changes in procedural costs, fibre costs and the probability of re-intervention. Sensitivity analyses based on data from the most favourable site or excluding equipment and training costs reduced the point estimate to favour PVP (incremental cost AU$ -684, 95% CrI -8319 to 5796 and AU$ -100, 95% CrI -13 026 to 13 678, respectively). However, CrIs were wide for all analyses. In this cost minimisation analysis, there was no significant cost difference between PVP and TURP, after accounting for equipment, training and re-intervention costs. However, PVP was associated with a shorter length of stay and lower procedural costs during audit, indicating PVP potentially provides comparatively good value for money once the technology is established. © 2013 The Authors. BJU International © 2013 BJU International.
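The probabilistic sensitivity analysis behind estimates like "AU$ 739 (95% CrI -12 187 to 14 516)" draws each uncertain input from a distribution and recomputes the incremental cost per draw. A minimal Monte Carlo sketch with entirely invented distributions (not the audit data):

```python
import random

random.seed(1)
N = 20000
diffs = []
for _ in range(N):
    pvp_proc = random.gauss(5200, 400)      # PVP procedural cost (AU$, invented)
    turp_proc = random.gauss(5600, 450)     # TURP procedural cost (AU$, invented)
    fibre = random.gauss(1200, 150)         # single-use fibre cost for PVP
    p_re_pvp = random.betavariate(8, 92)    # PVP re-intervention probability
    p_re_turp = random.betavariate(5, 95)   # TURP re-intervention probability
    re_cost = random.gauss(6000, 500)       # cost of a re-intervention
    cost_pvp = pvp_proc + fibre + p_re_pvp * re_cost
    cost_turp = turp_proc + p_re_turp * re_cost
    diffs.append(cost_pvp - cost_turp)

diffs.sort()
mean = sum(diffs) / N
lo, hi = diffs[int(0.025 * N)], diffs[int(0.975 * N)]
print(f"incremental cost AU$ {mean:.0f} (95% CrI {lo:.0f} to {hi:.0f})")
```

A credible interval spanning zero, as in the study, is exactly what "no significant cost difference" means in this framework.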

  3. An E-Hospital Security Architecture

    NASA Astrophysics Data System (ADS)

    Tian, Fang; Adams, Carlisle

In this paper, we describe how cryptography is used for network security and access control in an e-hospital. We first define the security goals of the e-hospital system and then analyze the current application system. Our design is based on this system analysis and on the relevant regulations for protecting patients' privacy. The security of the whole application system is strengthened through layered security protection. Three security domains in the e-hospital system are defined according to their sensitivity levels, and for each domain we propose different security protections. We use identity-based cryptography to establish a secure communication channel in the backbone network and policy-based cryptography to establish secure communication channels between end users and the backbone network. We also use policy-based cryptography in the access control of the application system. We use symmetric-key cryptography to protect the real data in the database. Both the identity-based and policy-based schemes build on elliptic curve cryptography, a form of public-key cryptography.
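The layered-domain idea above can be illustrated schematically. This is only a sketch of the access-control structure (the domain names, levels, and mechanism labels are invented for the example; the paper's actual identity- and policy-based elliptic-curve schemes have no standard-library implementation):

```python
# sensitivity level -> protection mechanism, per the layered design above
DOMAINS = {
    "public":   (1, "policy-based channel"),
    "clinical": (2, "identity-based backbone channel"),
    "records":  (3, "symmetric-key database encryption"),
}

def may_access(user_clearance: int, domain: str) -> bool:
    """A user may enter a domain only with clearance >= its sensitivity level."""
    level, _mechanism = DOMAINS[domain]
    return user_clearance >= level

print(may_access(2, "clinical"), may_access(1, "records"))
```

The point of the layering is that a compromise at a low-sensitivity layer does not expose the mechanisms or keys protecting higher-sensitivity data.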

  4. PGRN is a key adipokine mediating high fat diet-induced insulin resistance and obesity through IL-6 in adipose tissue.

    PubMed

    Matsubara, Toshiya; Mita, Ayako; Minami, Kohtaro; Hosooka, Tetsuya; Kitazawa, Sohei; Takahashi, Kenichi; Tamori, Yoshikazu; Yokoi, Norihide; Watanabe, Makoto; Matsuo, Ei-Ichi; Nishimura, Osamu; Seino, Susumu

    2012-01-04

    Adipose tissue secretes adipokines that mediate insulin resistance, a characteristic feature of obesity and type 2 diabetes. By differential proteome analysis of cellular models of insulin resistance, we identified progranulin (PGRN) as an adipokine induced by TNF-α and dexamethasone. PGRN in blood and adipose tissues was markedly increased in obese mouse models and was normalized with treatment of pioglitazone, an insulin-sensitizing agent. Ablation of PGRN (Grn(-/-)) prevented mice from high fat diet (HFD)-induced insulin resistance, adipocyte hypertrophy, and obesity. Grn deficiency blocked elevation of IL-6, an inflammatory cytokine, induced by HFD in blood and adipose tissues. Insulin resistance induced by chronic administration of PGRN was suppressed by neutralizing IL-6 in vivo. Thus, PGRN is a key adipokine that mediates HFD-induced insulin resistance and obesity through production of IL-6 in adipose tissue, and may be a promising therapeutic target for obesity. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Gastrointestinal stromal tumors: the histology report.

    PubMed

    Dei Tos, Angelo P; Laurino, Licia; Bearzi, Italo; Messerini, Luca; Farinati, Fabio

    2011-03-01

Gastrointestinal stromal tumors (GISTs) represent a mesenchymal neoplasm occurring primarily in the gastrointestinal tract and showing differentiation toward the interstitial cell of Cajal. Its incidence is approximately 1.5 cases/100,000/year. The stomach and small bowel are the most frequently affected anatomic sites. GIST represents a morphologically, immunophenotypically, and molecularly distinct entity, the recognition of which has profound therapeutic implications. In fact, these tumors have shown an exquisite sensitivity to treatment with the tyrosine kinase inhibitor imatinib. Diagnosis relies upon morphology along with immunodetection of KIT and/or DOG1. When dealing with KIT-negative cases, molecular analysis of the KIT/PDGFRA genes may help in confirming the diagnosis. Molecular evaluation of both genes is in any case recommended, as mutational status provides key predictive information. Pathologists also play a key role in providing an estimation of the risk of biological aggressiveness, which is currently based on the anatomic location of the tumor, size, and mitotic activity. Copyright © 2011 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  6. Integrated strategic and tactical biomass-biofuel supply chain optimization.

    PubMed

    Lin, Tao; Rodríguez, Luis F; Shastri, Yogendra N; Hansen, Alan C; Ting, K C

    2014-03-01

To ensure effective biomass feedstock provision for large-scale biofuel production, an integrated biomass supply chain optimization model was developed to minimize annual biomass-ethanol production costs by optimizing both strategic and tactical planning decisions simultaneously. The mixed integer linear programming model optimizes activities ranging from biomass harvesting, packing, in-field transportation, stacking, transportation, preprocessing, and storage to ethanol production and distribution. The numbers, locations, and capacities of facilities as well as biomass and ethanol distribution patterns are key strategic decisions, while biomass production, delivery, and operating schedules and inventory monitoring are key tactical decisions. The model was implemented to study a Miscanthus-ethanol supply chain in Illinois. The base case results showed unit Miscanthus-ethanol production costs of $0.72 L⁻¹ of ethanol. Biorefinery-related costs account for 62% of the total costs, followed by biomass procurement costs. Sensitivity analysis showed that a 50% reduction in biomass yield would increase unit production costs by 11%. Copyright © 2014 Elsevier Ltd. All rights reserved.
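The strategic layer of such a model chooses which facilities to open so that fixed costs plus downstream transport costs are minimized. A toy stand-in for the paper's MILP, solved here by brute-force enumeration (all site names and costs are invented for illustration):

```python
from itertools import combinations

fixed_cost = {"A": 900, "B": 750, "C": 1100}   # annualized facility cost (k$/yr)
# transport cost (k$/yr) from each supply region to each candidate facility
transport = {
    "north": {"A": 120, "B": 300, "C": 210},
    "south": {"A": 280, "B": 110, "C": 190},
    "west":  {"A": 250, "B": 260, "C": 90},
}

def total_cost(open_sites):
    fixed = sum(fixed_cost[s] for s in open_sites)
    # tactical layer: each region ships to its cheapest open facility
    haul = sum(min(transport[r][s] for s in open_sites) for r in transport)
    return fixed + haul

candidates = [set(c) for k in range(1, 4) for c in combinations("ABC", k)]
best = min(candidates, key=total_cost)
print(sorted(best), total_cost(best))
```

Brute force is only viable for a handful of candidate sites; the value of the MILP formulation is that it scales the same open/ship logic to realistic numbers of facilities, regions, and time periods.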

  7. Electromyogenic Artifacts and Electroencephalographic Inferences Revisited

    PubMed Central

    McMenamin, Brenton W.; Shackman, Alexander J.; Greischar, Lawrence L.; Davidson, Richard J.

    2010-01-01

    Recent years have witnessed a renewed interest in using oscillatory brain electrical activity to understand the neural bases of cognition and emotion. Electrical signals originating from pericranial muscles represent a profound threat to the validity of such research. Recently, McMenamin et al (2010) examined whether independent component analysis (ICA) provides a sensitive and specific means of correcting electromyogenic (EMG) artifacts. This report sparked the accompanying commentary (Olbrich, Jödicke, Sander, Himmerich & Hegerl, in press), and here we revisit the question of how EMG can alter inferences drawn from the EEG and what can be done to minimize its pernicious effects. Accordingly, we briefly summarize salient features of the EMG problem and review recent research investigating the utility of ICA for correcting EMG and other artifacts. We then directly address the key concerns articulated by Olbrich and provide a critique of their efforts at validating ICA. We conclude by identifying key areas for future methodological work and offer some practical recommendations for intelligently addressing EMG artifact. PMID:20981275

  8. Fourier-Mellin moment-based intertwining map for image encryption

    NASA Astrophysics Data System (ADS)

    Kaur, Manjit; Kumar, Vijay

    2018-03-01

In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and an intertwining logistic map is proposed. The Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity to the input image. A Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of the intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on the input image using the secret keys. The performance of the proposed image encryption technique has been evaluated on five well-known benchmark images and compared with seven existing encryption techniques. The experimental results reveal that the proposed technique outperforms the others in terms of entropy, correlation analysis, unified average changing intensity (UACI), and number of changing pixel rate (NPCR). The simulation results reveal that the proposed technique provides a high level of security and robustness against various types of attacks.
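Two of the evaluation metrics named above, NPCR and UACI, compare the cipher images produced from two plaintexts that differ in a single pixel. A hedged sketch on toy 4x4 "images" (not the paper's benchmarks), assuming 8-bit pixels:

```python
def npcr_uaci(c1, c2):
    """NPCR (% of differing pixels) and UACI (% average intensity change)
    between two equally sized 8-bit cipher images."""
    flat1 = [p for row in c1 for p in row]
    flat2 = [p for row in c2 for p in row]
    n = len(flat1)
    npcr = 100.0 * sum(a != b for a, b in zip(flat1, flat2)) / n
    uaci = 100.0 * sum(abs(a - b) for a, b in zip(flat1, flat2)) / (255.0 * n)
    return npcr, uaci

c1 = [[12, 200, 45, 90], [33, 18, 250, 7], [99, 140, 62, 201], [5, 77, 183, 29]]
c2 = [[13, 198, 45, 91], [30, 18, 255, 9], [97, 140, 60, 200], [5, 80, 180, 31]]
npcr, uaci = npcr_uaci(c1, c2)
print(f"NPCR={npcr:.2f}%  UACI={uaci:.3f}%")
```

For a strong cipher, NPCR is expected near 99.6% and UACI near 33.4% for 8-bit images; the low values of this toy pair are what a weak, low-sensitivity scheme would produce.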

  9. [Process and key points of clinical literature evaluation of post-marketing traditional Chinese medicine].

    PubMed

    Liu, Huan; Xie, Yanming

    2011-10-01

    The clinical literature evaluation of the post-marketing traditional Chinese medicine is a comprehensive evaluation by the comprehensive gain, analysis of the drug, literature of drug efficacy, safety, economy, based on the literature evidence and is part of the evaluation of evidence-based medicine. The literature evaluation in the post-marketing Chinese medicine clinical evaluation is in the foundation and the key position. Through the literature evaluation, it can fully grasp the information, grasp listed drug variety of traditional Chinese medicines second development orientation, make clear further clinical indications, perfect the medicines, etc. This paper discusses the main steps and emphasis of the clinical literature evaluation. Emphasizing security literature evaluation should attach importance to the security of a comprehensive collection drug information. Safety assessment should notice traditional Chinese medicine validity evaluation in improving syndrome, improveing the living quality of patients with special advantage. The economics literature evaluation should pay attention to reliability, sensitivity and practicability of the conclusion.

  10. Surface Connectivity and Interocean Exchanges From Drifter-Based Transition Matrices

    NASA Astrophysics Data System (ADS)

    McAdam, Ronan; van Sebille, Erik

    2018-01-01

    Global surface transport in the ocean can be represented by using the observed trajectories of drifters to calculate probability distribution functions. The oceanographic applications of the Markov Chain approach to modeling include tracking of floating debris and water masses, globally and on yearly-to-centennial time scales. Here we analyze the error inherent with mapping trajectories onto a grid and the consequences for ocean transport modeling and detection of accumulation structures. A sensitivity analysis of Markov Chain parameters is performed in an idealized Stommel gyre and western boundary current as well as with observed ocean drifters, complementing previous studies on widespread floating debris accumulation. Focusing on two key areas of interocean exchange—the Agulhas system and the North Atlantic intergyre transport barrier—we assess the capacity of the Markov Chain methodology to detect surface connectivity and dynamic transport barriers. Finally, we extend the methodology's functionality to separate the geostrophic and nongeostrophic contributions to interocean exchange in these key regions.
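The transition-matrix construction underlying this approach can be sketched simply: discretize drifter positions into grid cells, count cell-to-cell moves over a fixed time step, row-normalize, and evolve a tracer distribution by repeated application of the matrix. A toy example on a hypothetical 1-D line of three cells with made-up trajectories (the actual analysis uses a 2-D global grid):

```python
def transition_matrix(trajectories, n_cells):
    """Count cell-to-cell moves over one time step, then row-normalize
    so P[i][j] is the probability of moving from cell i to cell j."""
    counts = [[0] * n_cells for _ in range(n_cells)]
    for traj in trajectories:
        for a, b in zip(traj, traj[1:]):
            counts[a][b] += 1
    P = []
    for row in counts:
        total = sum(row)
        P.append([c / total if total else 0.0 for c in row])
    return P

def evolve(p, P, steps=1):
    """Apply the transition matrix repeatedly to a tracer distribution."""
    for _ in range(steps):
        p = [sum(p[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]
    return p

# Hypothetical drifters on a 3-cell line, tending to drift toward cell 2
trajs = [[0, 1, 2, 2], [0, 0, 1, 2], [1, 2, 2, 2]]
P = transition_matrix(trajs, 3)
p = evolve([1.0, 0.0, 0.0], P, steps=5)  # tracer released in cell 0
```

Iterating the matrix over long times concentrates probability in attracting cells, which is how this methodology detects accumulation structures and transport barriers.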

  11. Cost-effectiveness Analysis of Nutritional Support for the Prevention of Pressure Ulcers in High-Risk Hospitalized Patients.

    PubMed

    Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A

    2016-06-01

    To evaluate the cost-effectiveness of nutritional support compared with standard care in preventing pressure ulcers (PrUs) in high-risk hospitalized patients. An economic model using data from a systematic literature review. A meta-analysis of randomized controlled trials on the efficacy of nutritional support in reducing the incidence of PrUs was conducted. Modeled cohort of hospitalized patients at high risk of developing PrUs and malnutrition simulated during their hospital stay and up to 1 year. Standard care included PrU prevention strategies, such as redistribution surfaces, repositioning, and skin protection strategies, along with standard hospital diet. In addition to the standard care, the intervention group received nutritional support comprising patient education, nutrition goal setting, and the consumption of high-protein supplements. The analysis was from a healthcare payer perspective. Key outcomes of the model included the average costs and quality-adjusted life years. Model results were tested in univariate sensitivity analyses, and decision uncertainty was characterized using a probabilistic sensitivity analysis. Compared with standard care, nutritional support was cost saving at AU $425 per patient and marginally more effective with an average 0.005 quality-adjusted life years gained. The probability of nutritional support being cost-effective was 87%. Nutritional support to prevent PrUs in high-risk hospitalized patients is cost-effective with substantial cost savings predicted. Hospitals should implement the recommendations from the current PrU practice guidelines and offer nutritional support to high-risk patients.
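The "probability of being cost-effective" quoted above comes from a probabilistic sensitivity analysis: parameter values are drawn repeatedly from assumed distributions and the fraction of draws favouring the intervention is reported. A minimal sketch with made-up normal distributions for the incremental cost and QALY gain and an assumed willingness-to-pay threshold (the study's actual distributions and threshold are not reproduced here):

```python
import random

def psa_probability(n_draws=10000, wtp=50000.0, seed=1):
    """Fraction of Monte Carlo draws in which the intervention has a
    positive net monetary benefit at willingness-to-pay `wtp`."""
    random.seed(seed)
    favourable = 0
    for _ in range(n_draws):
        d_cost = random.gauss(-425.0, 600.0)  # incremental cost, AU$ (assumed spread)
        d_qaly = random.gauss(0.005, 0.004)   # incremental QALYs (assumed spread)
        if wtp * d_qaly - d_cost > 0:         # net monetary benefit > 0
            favourable += 1
    return favourable / n_draws

prob = psa_probability()
```

With a cost-saving mean and a small positive QALY gain, most draws favour the intervention, mirroring the high probability of cost-effectiveness reported in the abstract.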

  12. Line-Focused Optical Excitation of Parallel Acoustic Focused Sample Streams for High Volumetric and Analytical Rate Flow Cytometry.

    PubMed

    Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W

    2017-09-19

    Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.

  13. Global Expression Profiling of Low Temperature Induced Genes in the Chilling Tolerant Japonica Rice Jumli Marshi

    PubMed Central

    Chawade, Aakash; Lindlöf, Angelica; Olsson, Björn; Olsson, Olof

    2013-01-01

    Low temperature is a key factor that limits growth and productivity of many important agronomical crops worldwide. Rice (Oryza sativa L.) is negatively affected already at temperatures below +10°C and is therefore denoted as chilling sensitive. However, chilling tolerant rice cultivars exist and can be commercially cultivated at altitudes up to 3,050 meters with temperatures reaching as low as +4°C. In this work, the global transcriptional response to cold stress (+4°C) was studied in the Nepalese highland variety Jumli Marshi (spp. japonica) and 4,636 genes were identified as significantly differentially expressed within 24 hours of cold stress. Comparison with previously published microarray data from one chilling tolerant and two sensitive rice cultivars identified 182 genes differentially expressed (DE) upon cold stress in all four rice cultivars and 511 genes DE only in the chilling tolerant rice. Promoter analysis of the 182 genes suggests a complex cross-talk between ABRE and CBF regulons. Promoter analysis of the 511 genes identified over-represented ABRE motifs but not DRE motifs, suggesting a role for ABA signaling in cold tolerance. Moreover, 2,101 genes were DE in Jumli Marshi alone. By chromosomal localization analysis, 473 of these cold responsive genes were located within 13 different QTLs previously identified as cold associated. PMID:24349120

  14. Invited Review Small is beautiful: The analysis of nanogram-sized astromaterials

    NASA Astrophysics Data System (ADS)

    Zolensky, M. E.; Pieters, C.; Clark, B.; Papike, J. J.

    2000-01-01

    The capability of modern methods to characterize ultra-small samples is well established from analysis of interplanetary dust particles (IDPs), interstellar grains recovered from meteorites, and other materials requiring ultra-sensitive analytical capabilities. Powerful analytical techniques are available that require, under favorable circumstances, single particles of only a few nanograms for entire suites of fairly comprehensive characterizations. A returned sample of >1,000 particles with total mass of just one microgram permits comprehensive quantitative geochemical measurements that are impractical to carry out in situ by flight instruments. The main goal of this paper is to describe the state-of-the-art in microanalysis of astromaterials. Given that we can analyze fantastically small quantities of asteroids and comets, etc., we have to ask ourselves how representative are microscopic samples of bodies that measure a few to many km across? With the Galileo flybys of Gaspra and Ida, it is now recognized that even very small airless bodies have indeed developed a particulate regolith. Acquiring a sample of the bulk regolith, a simple sampling strategy, provides two critical pieces of information about the body. Regolith samples are excellent bulk samples since they normally contain all the key components of the local environment, albeit in particulate form. Furthermore, since this fine fraction dominates remote measurements, regolith samples also provide information about surface alteration processes and are a key link to remote sensing of other bodies. Studies indicate that a statistically significant number of nanogram-sized particles should be able to characterize the regolith of a primitive asteroid, although the presence of larger components within even primitive meteorites (e.g., Murchison), e.g., chondrules, CAI, large crystal fragments, etc., points out the limitations of using data obtained from nanogram-sized samples to characterize entire primitive asteroids. However, most important asteroidal geological processes have left their mark on the matrix, since this is the finest-grained portion and therefore most sensitive to chemical and physical changes. Thus, the following information can be learned from this fine grain size fraction alone: (1) mineral paragenesis; (2) regolith processes; (3) bulk composition; (4) conditions of thermal and aqueous alteration (if any); (5) relationships to planets, comets, and meteorites (via isotopic analyses, including oxygen); (6) abundance of water and hydrated material; (7) abundance of organics; (8) history of volatile mobility; (9) presence and origin of presolar and/or interstellar material. Most of this information can even be obtained from dust samples from bodies for which nanogram-sized samples are not truly representative. Future advances in sensitivity and accuracy of laboratory analytical techniques can be expected to enhance the science value of nano- to microgram sized samples even further. This highlights a key advantage of sample returns - that the most advanced analysis techniques can always be applied in the laboratory, and that well-preserved samples are available for future investigations.

  15. Chemotherapy curable malignancies and cancer stem cells: a biological review and hypothesis.

    PubMed

    Savage, Philip

    2016-11-21

    Cytotoxic chemotherapy brings routine cures to only a small select group of metastatic malignancies comprising gestational trophoblast tumours, germ cell tumours, acute leukemia, Hodgkin's disease, high grade lymphomas and some of the rare childhood malignancies. We have previously postulated that the extreme sensitivity to chemotherapy of these malignancies is linked to the on-going high levels of apoptotic sensitivity naturally associated with the unique genetic events of nuclear fusion, meiosis, VDJ recombination, somatic hypermutation, and gastrulation that have occurred within the cells of origin of these malignancies. In this review we examine the cancer stem cell/cancer cell relationship of each of the chemotherapy curable malignancies and how this relationship shapes the resultant biology and pro-apoptotic sensitivity of the various cancer cell types. In contrast to the common epithelial cancers, none of the chemotherapy curable malignancies has conventional hierarchical cancer stem cells. However, cells with cancer stem-like qualities can arise stochastically from within the general tumour cell population. These stochastic stem cells acquire a degree of resistance to DNA damaging agents but also retain many of the key characteristics of the cancer cells from which they develop. We would argue that the balance between the acquired resistance of the stochastic cancer stem cell and the inherent chemotherapy sensitivity of the parent tumour cells determines the overall chemotherapy curability of each diagnosis. The cancer stem cells in the chemotherapy curable malignancies appear to have two key biological differences from those of the more common chemotherapy incurable malignancies. The first difference is that the conventional hierarchical pattern of cancer stem cells is absent in each of the chemotherapy curable malignancies.
The other key difference, we suggest, is that the stochastic stem cells in the chemotherapy curable malignancies take on much of the biological character of their parent cancer cells, including the heightened pro-apoptotic sensitivity linked to each malignancy's unique associated genetic events. For the chemotherapy curable malignancies, the combination of this cancer stem cell relationship and the extreme inherent sensitivity to the induction of apoptosis by DNA damaging agents plays a key role in determining their overall curability with chemotherapy.

  16. Simulation-Based Estimates of the Effectiveness and Cost-Effectiveness of Pulmonary Rehabilitation in Patients with Chronic Obstructive Pulmonary Disease in France.

    PubMed

    Atsou, Kokuvi; Crequit, Perrine; Chouaid, Christos; Hejblum, Gilles

    2016-01-01

    The medico-economic impact of pulmonary rehabilitation in patients with chronic obstructive pulmonary disease (COPD) is poorly documented. To estimate the effectiveness and cost-effectiveness of pulmonary rehabilitation in a hypothetical cohort of COPD patients, we used a multi-state Markov model, adopting society's perspective. Simulated cohorts of French GOLD stage 2 to 4 COPD patients with and without pulmonary rehabilitation were compared in terms of life expectancy, quality-adjusted life years (QALY), disease-related costs, and the incremental cost-effectiveness ratio (ICER). Sensitivity analyses included variations of key model parameters. At the horizon of a COPD patient's remaining lifetime, pulmonary rehabilitation would result in a mean gain of 0.8 QALY, with an increase in disease-related costs of 14 102 € per patient. The ICER was 17 583 €/QALY. Sensitivity analysis showed that pulmonary rehabilitation was cost-effective in every scenario (ICER <50 000 €/QALY). These results should provide a useful basis for COPD pulmonary rehabilitation programs.
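The ICER quoted above is simply the incremental cost divided by the incremental effect. A minimal sketch using the rounded figures from the abstract (which reproduce the reported 17 583 €/QALY only approximately, since the published inputs are themselves rounded):

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    extra effect (here, euros per QALY gained)."""
    if delta_effect == 0:
        raise ValueError("ICER is undefined when the incremental effect is zero")
    return delta_cost / delta_effect

ratio = icer(delta_cost=14102.0, delta_effect=0.8)  # ~17,628 euros/QALY
cost_effective = ratio < 50000.0                    # threshold used in the abstract
```

Comparing the ratio against a willingness-to-pay threshold, as in the last line, is how the "cost-effective in every scenario" conclusion is operationalized in the sensitivity analysis.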

  17. Ag2S Quantum Dot-Sensitized Solar Cells by First Principles: The Effect of Capping Ligands and Linkers.

    PubMed

    Amaya Suárez, Javier; Plata, Jose J; Márquez, Antonio M; Fernández Sanz, Javier

    2017-09-28

    Quantum dot solar cells (QDSCs) are among the candidates for a reliable alternative to fossil fuels. However, the well-studied CdSe- and CdTe-based QDSCs present a variety of issues for use in consumer-goods applications. Silver sulfide, Ag2S, is a promising material, but poor efficiency has been reported for QDSCs based on this compound. Understanding the influence of each component of a QDSC is critical for the development of more efficient devices based on Ag2S. In this work, density functional theory calculations were performed to study the nature of the optoelectronic properties of an anatase TiO2(101) surface sensitized with different silver sulfide nanoclusters. We demonstrate that its electronic properties can be finely tuned by modifying the capping ligands and the linkers to the surface. Finally, an analysis of the electron injection mechanism for this system is presented.

  18. Microglial brain region-dependent diversity and selective regional sensitivities to ageing

    PubMed Central

    Grabert, Kathleen; Michoel, Tom; Karavolos, Michail H; Clohisey, Sara; Baillie, J Kenneth; Stevens, Mark P; Freeman, Tom C; Summers, Kim M; McColl, Barry W

    2015-01-01

    Microglia play critical roles in neural development, homeostasis and neuroinflammation and are increasingly implicated in age-related neurological dysfunction. Neurodegeneration often occurs in disease-specific spatially-restricted patterns, the origins of which are unknown. We performed the first genome-wide analysis of microglia from discrete brain regions across the adult lifespan of the mouse and reveal that microglia have distinct region-dependent transcriptional identities and age in a regionally variable manner. In the young adult brain, differences in bioenergetic and immunoregulatory pathways were the major sources of heterogeneity and suggested that cerebellar and hippocampal microglia exist in a more immune vigilant state. Immune function correlated with regional transcriptional patterns. Augmentation of the distinct cerebellar immunophenotype and a contrasting loss in distinction of the hippocampal phenotype among forebrain regions were key features during ageing. Microglial diversity may enable regionally localised homeostatic functions but could also underlie region-specific sensitivities to microglial dysregulation and involvement in age-related neurodegeneration. PMID:26780511

  19. Attentional modulation of neuronal variability in circuit models of cortex

    PubMed Central

    Kanashiro, Tatjana; Ocker, Gabriel Koch; Cohen, Marlene R; Doiron, Brent

    2017-01-01

    The circuit mechanisms behind shared neural variability (noise correlation) and its dependence on neural state are poorly understood. Visual attention is well-suited to constrain cortical models of response variability because attention both increases firing rates and their stimulus sensitivity, as well as decreases noise correlations. We provide a novel analysis of population recordings in rhesus primate visual area V4 showing that a single biophysical mechanism may underlie these diverse neural correlates of attention. We explore model cortical networks where top-down mediated increases in excitability, distributed across excitatory and inhibitory targets, capture the key neuronal correlates of attention. Our models predict that top-down signals primarily affect inhibitory neurons, whereas excitatory neurons are more sensitive to stimulus specific bottom-up inputs. Accounting for trial variability in models of state dependent modulation of neuronal activity is a critical step in building a mechanistic theory of neuronal cognition. DOI: http://dx.doi.org/10.7554/eLife.23978.001 PMID:28590902

  20. Electrostatic Effects in Filamentous Protein Aggregation

    PubMed Central

    Buell, Alexander K.; Hung, Peter; Salvatella, Xavier; Welland, Mark E.; Dobson, Christopher M.; Knowles, Tuomas P.J.

    2013-01-01

    Electrostatic forces play a key role in mediating interactions between proteins. However, gaining quantitative insights into the complex effects of electrostatics on protein behavior has proved challenging, due to the wide palette of scenarios through which both cations and anions can interact with polypeptide molecules in a specific manner or can result in screening in solution. In this article, we have used a variety of biophysical methods to probe the steady-state kinetics of fibrillar protein self-assembly in a highly quantitative manner to detect how it is modulated by changes in solution ionic strength. Due to the exponential modulation of the reaction rate by electrostatic forces, this reaction represents an exquisitely sensitive probe of these effects in protein-protein interactions. Our approach, which involves a combination of experimental kinetic measurements and theoretical analysis, reveals a hierarchy of electrostatic effects that control protein aggregation. Furthermore, our results provide a highly sensitive method for the estimation of the magnitude of binding of a variety of ions to protein molecules. PMID:23473495
