Sample records for sensitivity analysis tools

  1. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    PubMed

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

We studied which weather variables consequence analysis results are sensitive to when users model chemical leaks with offsite consequence analysis (OCA) tools. We used the OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA), used in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed by dummy regression analysis using the Statistical Package for the Social Sciences (SPSS) program. The analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability and was more sensitive to air temperature than to wind speed, according to both the KORA and ALOHA tools. Moreover, impact distance was more sensitive to the weather variables in rural conditions than in urban conditions, and the ALOHA tool was more influenced by weather variables than the KORA tool. Therefore, users of the ALOHA tool, particularly in rural conditions, should take care to avoid input errors in the weather variables, above all atmospheric stability, since such errors translate into differences in impact distance.
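
    A note on the method: the dummy-regression step in this abstract can be reproduced outside SPSS. The sketch below is illustrative only, with hypothetical column names, and assumes impact distances and weather inputs from KORA or ALOHA runs have already been tabulated to a CSV file.

    ```python
    # Illustrative dummy-variable regression for weather-variable sensitivity
    # (hypothetical data layout; not the authors' SPSS analysis).
    import pandas as pd
    import statsmodels.formula.api as smf

    # Assumed columns: impact_distance, air_temp, wind_speed, humidity,
    # stability (Pasquill class A-F), one row per OCA tool run.
    runs = pd.read_csv("oca_runs.csv")  # hypothetical file name

    # Atmospheric stability is categorical, so it enters as dummy variables.
    model = smf.ols(
        "impact_distance ~ air_temp + wind_speed + humidity + C(stability)",
        data=runs,
    ).fit()

    # Coefficients estimate the change in impact distance per unit change in each
    # weather variable; standardizing inputs first would make magnitudes comparable.
    print(model.summary())
    ```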

  2. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

Background: It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results: This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities by executing global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion: SBML-SAT provides the community of systems biologists with a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
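
    For readers unfamiliar with one of the global measures listed above, the partial rank correlation coefficient can be computed with NumPy and SciPy alone; the function below is an illustrative sketch, not SBML-SAT's own implementation.

    ```python
    # Illustrative partial rank correlation coefficient (PRCC) calculation.
    import numpy as np
    from scipy.stats import rankdata

    def prcc(X, y):
        """X: (n_samples, n_params) parameter samples; y: (n_samples,) model outputs."""
        Xr = np.column_stack([rankdata(col) for col in X.T])
        yr = rankdata(y)
        n, k = Xr.shape
        coeffs = np.empty(k)
        for i in range(k):
            others = np.column_stack([np.ones(n), np.delete(Xr, i, axis=1)])
            # Correlate the residuals of parameter i and of the output after
            # removing the linear (rank-space) effect of all other parameters.
            res_x = Xr[:, i] - others @ np.linalg.lstsq(others, Xr[:, i], rcond=None)[0]
            res_y = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
            coeffs[i] = np.corrcoef(res_x, res_y)[0, 1]
        return coeffs
    ```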

  3. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities by executing global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists with a new tool for the analysis of their SBML models of biochemical and cellular processes.

  4. Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Williams, Mark L

    2007-01-01

Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.

  5. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  6. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
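
    As an illustration of the variogram idea that the IVARS metrics integrate, the sketch below computes a directional variogram of the model response along a single parameter from an existing sample; it is a simplification (VARS itself uses structured star-based sampling and conditions on the remaining parameters), not VARS-TOOL code.

    ```python
    # Simplified directional variogram of a model response along one parameter
    # (illustrative of the VARS idea; not the VARS-TOOL implementation).
    import numpy as np

    def directional_variogram(X, y, param, h_bins):
        """Mean squared half-difference of the response for sample pairs whose
        separation in parameter `param` falls in each bin of h_bins."""
        gamma = np.zeros(len(h_bins) - 1)
        counts = np.zeros(len(h_bins) - 1)
        for i in range(len(y)):
            for j in range(i + 1, len(y)):
                h = abs(X[i, param] - X[j, param])
                b = np.searchsorted(h_bins, h) - 1
                if 0 <= b < len(gamma):
                    gamma[b] += 0.5 * (y[i] - y[j]) ** 2
                    counts[b] += 1
        return gamma / np.maximum(counts, 1)   # steep growth near h = 0 signals high sensitivity
    ```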

  7. Estimating Sobol Sensitivity Indices Using Correlations

    EPA Science Inventory

    Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
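
    The truncated abstract refers to estimating variance-based indices from correlations; one common correlation-style ("pick-freeze") estimator of a first-order Sobol index is sketched below with a toy stand-in model (illustrative, not the EPA implementation).

    ```python
    # Pick-freeze (correlation) estimator of a first-order Sobol index (illustrative).
    import numpy as np

    def first_order_sobol(f, n, dim, i, seed=0):
        rng = np.random.default_rng(seed)
        A = rng.random((n, dim))
        B = rng.random((n, dim))
        ABi = B.copy()
        ABi[:, i] = A[:, i]                 # keep factor i, resample everything else
        yA, yABi = f(A), f(ABi)
        # Cov(Y_A, Y_ABi) estimates Var(E[Y | X_i]); divide by Var(Y).
        return np.cov(yA, yABi)[0, 1] / np.var(yA, ddof=1)

    # Toy model: the first factor dominates the output variance.
    f = lambda X: np.sin(2 * np.pi * X[:, 0]) + 0.3 * X[:, 1]
    print([round(first_order_sobol(f, 20000, 2, i), 3) for i in range(2)])
    ```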

  8. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  9. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Treesearch

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...

  10. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    NASA Astrophysics Data System (ADS)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle strike interacts with critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions, and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  11. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanova, T.; Laville, C.; Dyrda, J.

    2012-07-01

The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods. (authors)

  12. Benchmark On Sensitivity Calculation (Phase III)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanova, Tatiana; Laville, Cedric; Dyrda, James

    2012-01-01

The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods.
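
    The quantity being benchmarked is, to first order, the relative change in keff per relative change in a cross section; a direct-perturbation check of a sensitivity coefficient, the kind of reference result such benchmarks are compared against, can be written as below (hypothetical eigenvalues, purely illustrative).

    ```python
    # Direct-perturbation eigenvalue sensitivity: S = (dk/k) / (dsigma/sigma),
    # here with a central difference between two perturbed transport solutions.
    def keff_sensitivity(k_plus, k_minus, k_nominal, rel_perturbation):
        """k_plus/k_minus: eigenvalues with the cross section scaled by
        (1 +/- rel_perturbation)."""
        dk_over_k = (k_plus - k_minus) / (2.0 * k_nominal)
        return dk_over_k / rel_perturbation

    # Hypothetical numbers for a 2% perturbation of a capture cross section.
    print(keff_sensitivity(k_plus=1.00124, k_minus=1.00204,
                           k_nominal=1.00164, rel_perturbation=0.02))
    ```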

  13. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  14. Automatic differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
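
    A minimal illustration of the forward mode discussed above is a dual-number class that carries a value and a derivative through each operation via the chain rule; this is a toy sketch in Python, not ADIFOR (which transforms Fortran source).

    ```python
    # Toy forward-mode automatic differentiation with dual numbers (not ADIFOR).
    import math

    class Dual:
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.value * other.deriv + self.deriv * other.value)
        __rmul__ = __mul__

    def sin(x):
        return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

    # d/dx [x*sin(x) + 3x] at x = 2: seed the input derivative with 1 and propagate.
    x = Dual(2.0, 1.0)
    y = x * sin(x) + 3 * x
    print(y.value, y.deriv)   # derivative equals sin(2) + 2*cos(2) + 3
    ```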

  15. The effective integration of analysis, modeling, and simulation tools.

    DOT National Transportation Integrated Search

    2013-08-01

    The need for model integration arises from the recognition that both transportation decisionmaking and the tools supporting it continue to increase in complexity. Many strategies that agencies evaluate require using tools that are sensitive to supply...

  16. Design sensitivity analysis and optimization tool (DSO) for sizing design applications

    NASA Technical Reports Server (NTRS)

    Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa

    1992-01-01

    The DSO tool, a structural design software system that provides the designer with a graphics-based menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including data base, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software/packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.

  17. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

Complex environmental models are now the primary tool to inform decision makers for the current or future management of environmental resources under climate and environmental change. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding if robust and stable sensitivity metrics are to be generated over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in reducing computational burden when applied to systems with an extra-large number of input factors (~100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.

  18. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
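
    The sensitivity indices referred to above are variance-decomposition quantities of the form Var(E[Y | source]) / Var(Y); a generic sketch of that calculation from grouped Monte Carlo realizations is given below (hypothetical arrays, not the authors' code).

    ```python
    # Generic variance-decomposition sensitivity index from grouped realizations:
    # S = variance over groups of E[Y | group], divided by Var(Y)  (illustrative).
    import numpy as np

    def group_sensitivity(y, group_labels):
        """y: model outputs; group_labels: which realization of one uncertainty
        source (e.g. which boundary-condition sample) produced each output."""
        y = np.asarray(y, dtype=float)
        group_labels = np.asarray(group_labels)
        groups = np.unique(group_labels)
        conditional_means = np.array([y[group_labels == g].mean() for g in groups])
        return conditional_means.var(ddof=1) / y.var(ddof=1)
    ```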

  19. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling based techniques. For contrast, some MPP based approaches are also examined.

  20. Use of a Process Analysis Tool for Diagnostic Study on Fine Particulate Matter Predictions in the U.S.-Part II: Analysis and Sensitivity Simulations

    EPA Science Inventory

    Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...

  21. Use of SCALE Continuous-Energy Monte Carlo Tools for Eigenvalue Sensitivity Coefficient Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2013-01-01

The TSUNAMI code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The CLUTCH and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE KENO framework to generate the capability for TSUNAMI-3D to perform eigenvalue sensitivity calculations in continuous-energy applications. This work explores the improvements in accuracy that can be gained in eigenvalue and eigenvalue sensitivity calculations through the use of the SCALE CE KENO and CE TSUNAMI continuous-energy Monte Carlo tools as compared to multigroup tools. The CE KENO and CE TSUNAMI tools were used to analyze two difficult models of critical benchmarks, and produced eigenvalue and eigenvalue sensitivity coefficient results that showed a marked improvement in accuracy. The CLUTCH sensitivity method in particular excelled in terms of efficiency and computational memory requirements.

  22. Fast computation of derivative based sensitivities of PSHA models via algorithmic differentiation

    NASA Astrophysics Data System (ADS)

    Leövey, Hernan; Molkenthin, Christian; Scherbaum, Frank; Griewank, Andreas; Kuehn, Nicolas; Stafford, Peter

    2015-04-01

Probabilistic seismic hazard analysis (PSHA) is the preferred tool for estimation of potential ground-shaking hazard due to future earthquakes at a site of interest. A modern PSHA represents a complex framework which combines different models with possibly many inputs. Sensitivity analysis is a valuable tool for quantifying changes of a model output as inputs are perturbed, identifying critical input parameters and obtaining insight into the model behavior. Differential sensitivity analysis relies on calculating first-order partial derivatives of the model output with respect to its inputs. Moreover, derivative-based global sensitivity measures (Sobol' & Kucherenko '09) can be used in practice to detect non-essential inputs of the models, thus restricting the focus of attention to a possibly much smaller set of inputs. Nevertheless, obtaining first-order partial derivatives of complex models with traditional approaches can be very challenging, and usually increases the computational complexity linearly with the number of inputs appearing in the models. In this study we show how Algorithmic Differentiation (AD) tools can be used in a complex framework such as PSHA to successfully estimate derivative-based sensitivities, as is the case in various other domains such as meteorology or aerodynamics, with no significant increase in the computational complexity required for the original computations. First we demonstrate the feasibility of the AD methodology by comparing AD-derived sensitivities to analytically derived sensitivities for a basic case of PSHA using a simple ground-motion prediction equation. In a second step, we derive sensitivities via AD for a more complex PSHA study using a ground motion attenuation relation based on a stochastic method to simulate strong motion. The presented approach is general enough to accommodate more advanced PSHA studies of higher complexity.
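
    The derivative-based global sensitivity measures cited (Sobol' & Kucherenko) average squared partial derivatives over the input space; the sketch below uses finite differences as a stand-in for the AD-computed derivatives and is illustrative only.

    ```python
    # Derivative-based global sensitivity measure (DGSM): mean squared partial
    # derivative over sampled inputs, with finite differences standing in for
    # algorithmic differentiation (illustrative sketch).
    import numpy as np

    def dgsm(f, samples, eps=1e-6):
        n, dim = samples.shape
        measures = np.zeros(dim)
        for x in samples:
            for i in range(dim):
                xp, xm = x.copy(), x.copy()
                xp[i] += eps
                xm[i] -= eps
                measures[i] += ((f(xp) - f(xm)) / (2.0 * eps)) ** 2
        return measures / n   # near-zero entries flag non-essential inputs
    ```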

  23. Simple Sensitivity Analysis for Orion GNC

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool identified input variables such as moments, mass, thrust dispersions, and date of launch as significant factors for success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
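
    One of the sensitivity measures described, estimating success probability as a function of a dispersed input, can be sketched as follows; the arrays and binning scheme are hypothetical and this is not the CFT implementation.

    ```python
    # Success-probability sensitivity sketch: bin one dispersed input across the
    # Monte Carlo runs and estimate P(requirement satisfied | bin).
    import numpy as np

    def success_probability_profile(x, passed, n_bins=10):
        """x: one input variable per Monte Carlo run; passed: True where the run
        satisfied the requirement."""
        x = np.asarray(x, dtype=float)
        passed = np.asarray(passed, dtype=bool)
        edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
        which = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
        return np.array([passed[which == b].mean() for b in range(n_bins)])

    # A strongly varying profile marks the input as a driving factor for that requirement.
    ```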

  24. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    PubMed Central

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289

  25. Theoretical foundations for finite-time transient stability and sensitivity analysis of power systems

    NASA Astrophysics Data System (ADS)

    Dasgupta, Sambarta

Transient stability and sensitivity analysis of power systems are problems of enormous academic and practical interest. These classical problems have received renewed interest because of the advancement in sensor technology in the form of phasor measurement units (PMUs). The advancement in sensor technology has provided a unique opportunity for the development of real-time stability monitoring and sensitivity analysis tools. The transient stability problem in power systems is inherently a problem of stability analysis of the non-equilibrium dynamics, because for a short time period following a fault or disturbance the system trajectory moves away from the equilibrium point. The real-time stability decision has to be made over this short time period. However, the existing stability definitions and hence analysis tools for transient stability are asymptotic in nature. In this thesis, we discover theoretical foundations for the short-term transient stability analysis of power systems, based on the theory of normally hyperbolic invariant manifolds and finite time Lyapunov exponents, adopted from geometric theory of dynamical systems. The theory of normally hyperbolic surfaces allows us to characterize the rate of expansion and contraction of co-dimension one material surfaces in the phase space. The expansion and contraction rates of these material surfaces can be computed in finite time. We prove that the expansion and contraction rates can be used as finite time transient stability certificates. Furthermore, material surfaces with maximum expansion and contraction rate are identified with the stability boundaries. These stability boundaries are used for computation of stability margin. We have used the theoretical framework for the development of model-based and model-free real-time stability monitoring methods. Both the model-based and model-free approaches rely on the availability of high resolution time series data from the PMUs for stability prediction. The problem of sensitivity analysis of power systems, subject to changes or uncertainty in load parameters and network topology, is also studied using the theory of normally hyperbolic manifolds. The sensitivity analysis is used for the identification and rank ordering of the critical interactions and parameters in the power network. The sensitivity analysis is carried out both in finite time and asymptotically. One of the distinguishing features of the asymptotic sensitivity analysis is that the asymptotic dynamics of the system is assumed to be a periodic orbit. For asymptotic sensitivity analysis we employ a combination of tools from ergodic theory and geometric theory of dynamical systems.

  26. Automated Sensitivity Analysis of Interplanetary Trajectories

    NASA Technical Reports Server (NTRS)

    Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno

    2017-01-01

    This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.

  27. Depression Case Finding in Individuals with Dementia: A Systematic Review and Meta-Analysis.

    PubMed

    Goodarzi, Zahra S; Mele, Bria S; Roberts, Derek J; Holroyd-Leduc, Jayna

    2017-05-01

Objectives: To compare the diagnostic accuracy of depression case finding tools with a criterion standard in the outpatient setting among adults with dementia. Design: Systematic review and meta-analysis. Setting: Studies of older outpatients with dementia. Participants: Elderly outpatients (clinic and long-term care) with dementia (N = 3,035). Measurements: Prevalence of major depression and diagnostic accuracy measures including sensitivity, specificity, and likelihood ratios. Results: From the 11,539 citations, 20 studies were included for qualitative synthesis and 15 for a meta-analysis. Tools included were the Montgomery Åsberg Depression Rating Scale, Cornell Scale for Depression in Dementia (CSDD), Geriatric Depression Scale (GDS), Center for Epidemiologic Studies Depression Scale (CES-D), Hamilton Depression Rating Scale (HDRS), Single Question, Nijmegen Observer-Rated Depression Scale, and Even Briefer Assessment Scale-Depression. The pooled prevalence of depression in individuals with dementia was 30.3% (95% CI = 22.1-38.5). The average age was 75.2 (95% CI = 71.7-78.7), and mean Mini-Mental State Examination scores ranged from 11.2 to 24. The diagnostic accuracy of the individual tools was pooled for the best-reported cutoffs and for each cutoff, if available. The CSDD had a sensitivity of 0.84 (95% CI = 0.73-0.91) and a specificity of 0.80 (95% CI = 0.65-0.90), the 30-item GDS (GDS-30) had a sensitivity of 0.62 (95% CI = 0.45-0.76) and a specificity of 0.81 (95% CI = 0.75-0.85), and the HDRS had a sensitivity of 0.86 (95% CI = 0.63-0.96) and a specificity of 0.84 (95% CI = 0.76-0.90). Summary statistics for all tools across best-reported cutoffs had significant heterogeneity. Conclusion: There are many validated tools for the detection of depression in individuals with dementia. Tools that incorporate a physician interview with patient and collateral histories, the CSDD and HDRS, have higher sensitivities, which would ensure fewer false-negatives. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.
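
    The sensitivities and specificities pooled above are each derived from a 2x2 table against the criterion standard; the underlying arithmetic, with illustrative counts only, is:

    ```python
    # Sensitivity and specificity from a 2x2 diagnostic table (illustrative counts).
    def sens_spec(tp, fn, fp, tn):
        sensitivity = tp / (tp + fn)   # P(screen positive | depression present)
        specificity = tn / (tn + fp)   # P(screen negative | depression absent)
        return sensitivity, specificity

    print(sens_spec(tp=84, fn=16, fp=20, tn=80))   # -> (0.84, 0.80)
    ```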

  28. Evaluation of an inpatient fall risk screening tool to identify the most critical fall risk factors in inpatients.

    PubMed

    Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin

    2017-03-01

Aims: To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Background: Variations exist in several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Design: Secondary data analysis. Methods: A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. Results: During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase the sensitivity values, the Youden index suggests at least 1·5 points to be the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. The fall risk factor was also significantly associated with days of hospital stay and with admission to surgical wards. The findings can raise awareness about the two most critical risk factors for falls among future clinical nurses and other healthcare professionals and thus facilitate the development of fall prevention interventions. This study highlights the need for redefining the cut-off points of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls. Furthermore, inpatients with impaired balance and impaired elimination should be closely monitored by nurses to prevent falling during hospitalisations. © 2016 John Wiley & Sons Ltd.
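
    The Youden-index step used to re-derive the cut-off can be reproduced with scikit-learn's ROC utilities; the sketch below assumes hypothetical per-patient risk scores and fall outcomes.

    ```python
    # Choosing a screening cut-off by the Youden index (sensitivity + specificity - 1).
    import numpy as np
    from sklearn.metrics import roc_curve

    def youden_cutoff(fell, risk_score):
        """fell: 1 if the inpatient fell, else 0; risk_score: screening-tool points."""
        fpr, tpr, thresholds = roc_curve(fell, risk_score)
        j = tpr - fpr                        # Youden's J at each candidate threshold
        best = int(np.argmax(j))
        return thresholds[best], tpr[best], 1.0 - fpr[best]   # cut-off, sensitivity, specificity
    ```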

  29. Identifying key sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment.

    PubMed

    Sweetapple, Christine; Fu, Guangtao; Butler, David

    2013-09-01

This study investigates sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment, through the use of local and global sensitivity analysis tools, and contributes to an in-depth understanding of wastewater treatment modelling by revealing critical parameters and parameter interactions. One-factor-at-a-time sensitivity analysis is used to screen model parameters and identify those with significant individual effects on three performance indicators: total greenhouse gas emissions, effluent quality and operational cost. Sobol's method enables identification of parameters with significant higher order effects and of particular parameter pairs to which model outputs are sensitive. Use of a variance-based global sensitivity analysis tool to investigate parameter interactions enables identification of important parameters not revealed in one-factor-at-a-time sensitivity analysis. These interaction effects have not been considered in previous studies, and accounting for them thus provides a better understanding of wastewater treatment plant model characterisation. It was found that uncertainty in modelled nitrous oxide emissions is the primary contributor to uncertainty in total greenhouse gas emissions, due largely to the interaction effects of three nitrogen conversion modelling parameters. The higher order effects of these parameters are also shown to be a key source of uncertainty in effluent quality. Copyright © 2013 Elsevier Ltd. All rights reserved.

  30. CORSSTOL: Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity

    NASA Technical Reports Server (NTRS)

    Finckenor, J.; Bevill, M.

    1995-01-01

    Cylinder Optimization of Rings, Skin, and Stringers with Tolerance (CORSSTOL) sensitivity is a design optimization program incorporating a method to examine the effects of user-provided manufacturing tolerances on weight and failure. CORSSTOL gives designers a tool to determine tolerances based on need. This is a decisive way to choose the best design among several manufacturing methods with differing capabilities and costs. CORSSTOL initially optimizes a stringer-stiffened cylinder for weight without tolerances. The skin and stringer geometry are varied, subject to stress and buckling constraints. Then the same analysis and optimization routines are used to minimize the maximum material condition weight subject to the least favorable combination of tolerances. The adjusted optimum dimensions are provided with the weight and constraint sensitivities of each design variable. The designer can immediately identify critical tolerances. The safety of parts made out of tolerance can also be determined. During design and development of weight-critical systems, design/analysis tools that provide product-oriented results are of vital significance. The development of this program and methodology provides designers with an effective cost- and weight-saving design tool. The tolerance sensitivity method can be applied to any system defined by a set of deterministic equations.

  31. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
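
    The "efficient and equitable sampling of parameter space" mentioned above is commonly done by Latin hypercube sampling; a compact NumPy version is sketched below (illustrative, not SaSAT's Matlab code).

    ```python
    # Compact Latin hypercube sample on the unit hypercube (illustrative).
    import numpy as np

    def latin_hypercube(n_samples, n_params, seed=1):
        rng = np.random.default_rng(seed)
        # One stratum per sample in each dimension, jittered within the stratum,
        # then shuffled independently per dimension.
        u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
        for j in range(n_params):
            u[:, j] = rng.permutation(u[:, j])
        return u   # map columns through inverse CDFs for non-uniform parameters
    ```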

  32. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
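
    The exact-case-deletion idea, refitting with each observation removed and recording the change in an estimate, is shown below for an ordinary regression slope; the paper applies the same idea to maximum likelihood QTL linkage parameters, so this is only an analogy.

    ```python
    # Exact case-deletion influence on a regression slope: refit with each
    # observation removed and record the change in the estimate (illustrative).
    import numpy as np

    def case_deletion_influence(x, y):
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        full_slope = np.polyfit(x, y, 1)[0]
        influence = np.empty(len(x))
        for i in range(len(x)):
            keep = np.ones(len(x), dtype=bool)
            keep[i] = False
            influence[i] = full_slope - np.polyfit(x[keep], y[keep], 1)[0]
        return influence   # large absolute values flag observations that drive the fit
    ```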

  33. Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design

    NASA Technical Reports Server (NTRS)

    Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno

    2017-01-01

    This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.

  34. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
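
    A minimal version of the metamodeling step, regressing a PSA outcome on standardized parameter draws so that the intercept approximates the base case and the slopes act as sensitivity measures, might look like this (hypothetical arrays, not the authors' code).

    ```python
    # Linear regression metamodel of probabilistic sensitivity analysis output
    # (illustrative): standardize the sampled parameters and regress the outcome.
    import numpy as np

    def metamodel(params, outcome):
        """params: (n_draws, n_params) PSA parameter samples; outcome: (n_draws,)."""
        Z = (params - params.mean(axis=0)) / params.std(axis=0, ddof=1)
        X = np.column_stack([np.ones(len(outcome)), Z])
        coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
        # Intercept ~ base-case outcome; slope magnitudes rank parameter importance.
        return coef[0], coef[1:]
    ```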

  35. Simple Sensitivity Analysis for Orion Guidance Navigation and Control

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar

    2013-01-01

The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool identified input variables such as moments, mass, thrust dispersions, and date of launch as significant factors for success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.

  36. Assessment of the predictive accuracy of five in silico prediction tools, alone or in combination, and two metaservers to classify long QT syndrome gene mutations.

    PubMed

    Leong, Ivone U S; Stuckey, Alexander; Lai, Daniel; Skinner, Jonathan R; Love, Donald R

    2015-05-13

    Long QT syndrome (LQTS) is an autosomal dominant condition predisposing to sudden death from malignant arrhythmia. Genetic testing identifies many missense single nucleotide variants of uncertain pathogenicity. Establishing genetic pathogenicity is an essential prerequisite to family cascade screening. Many laboratories use in silico prediction tools, either alone or in combination, or metaservers, in order to predict pathogenicity; however, their accuracy in the context of LQTS is unknown. We evaluated the accuracy of five in silico programs and two metaservers in the analysis of LQTS 1-3 gene variants. The in silico tools SIFT, PolyPhen-2, PROVEAN, SNPs&GO and SNAP, either alone or in all possible combinations, and the metaservers Meta-SNP and PredictSNP, were tested on 312 KCNQ1, KCNH2 and SCN5A gene variants that have previously been characterised by either in vitro or co-segregation studies as either "pathogenic" (283) or "benign" (29). The accuracy, sensitivity, specificity and Matthews Correlation Coefficient (MCC) were calculated to determine the best combination of in silico tools for each LQTS gene, and when all genes are combined. The best combination of in silico tools for KCNQ1 is PROVEAN, SNPs&GO and SIFT (accuracy 92.7%, sensitivity 93.1%, specificity 100% and MCC 0.70). The best combination of in silico tools for KCNH2 is SIFT and PROVEAN or PROVEAN, SNPs&GO and SIFT. Both combinations have the same scores for accuracy (91.1%), sensitivity (91.5%), specificity (87.5%) and MCC (0.62). In the case of SCN5A, SNAP and PROVEAN provided the best combination (accuracy 81.4%, sensitivity 86.9%, specificity 50.0%, and MCC 0.32). When all three LQT genes are combined, SIFT, PROVEAN and SNAP is the combination with the best performance (accuracy 82.7%, sensitivity 83.0%, specificity 80.0%, and MCC 0.44). Both metaservers performed better than the single in silico tools; however, they did not perform better than the best performing combination of in silico tools. The combination of in silico tools with the best performance is gene-dependent. The in silico tools reported here may have some value in assessing variants in the KCNQ1 and KCNH2 genes, but caution should be taken when the analysis is applied to SCN5A gene variants.
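
    The Matthews correlation coefficient used to score each tool combination balances all four cells of the confusion matrix, which matters here because the pathogenic and benign classes are highly imbalanced (283 vs. 29); its computation, with hypothetical counts, is:

    ```python
    # Matthews correlation coefficient from confusion-matrix counts (illustrative).
    import math

    def mcc(tp, tn, fp, fn):
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return (tp * tn - fp * fn) / denom if denom else 0.0

    print(mcc(tp=260, tn=23, fp=6, fn=23))   # hypothetical pathogenic/benign counts
    ```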

  37. The Objective Identification and Quantification of Interstitial Lung Abnormalities in Smokers.

    PubMed

    Ash, Samuel Y; Harmouche, Rola; Ross, James C; Diaz, Alejandro A; Hunninghake, Gary M; Putman, Rachel K; Onieva, Jorge; Martinez, Fernando J; Choi, Augustine M; Lynch, David A; Hatabu, Hiroto; Rosas, Ivan O; Estepar, Raul San Jose; Washko, George R

    2017-08-01

    Previous investigation suggests that visually detected interstitial changes in the lung parenchyma of smokers are highly clinically relevant and predict outcomes, including death. Visual subjective analysis to detect these changes is time-consuming, insensitive to subtle changes, and requires training to enhance reproducibility. Objective detection of such changes could provide a method of disease identification without these limitations. The goal of this study was to develop and test a fully automated image processing tool to objectively identify radiographic features associated with interstitial abnormalities in the computed tomography scans of a large cohort of smokers. An automated tool that uses local histogram analysis combined with distance from the pleural surface was used to detect radiographic features consistent with interstitial lung abnormalities in computed tomography scans from 2257 individuals from the Genetic Epidemiology of COPD study, a longitudinal observational study of smokers. The sensitivity and specificity of this tool was determined based on its ability to detect the visually identified presence of these abnormalities. The tool had a sensitivity of 87.8% and a specificity of 57.5% for the detection of interstitial lung abnormalities, with a c-statistic of 0.82, and was 100% sensitive and 56.7% specific for the detection of the visual subtype of interstitial abnormalities called fibrotic parenchymal abnormalities, with a c-statistic of 0.89. In smokers, a fully automated image processing tool is able to identify those individuals who have interstitial lung abnormalities with moderate sensitivity and specificity. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  38. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large amount of memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.

  39. Molecular Tools for Diagnosis of Visceral Leishmaniasis: Systematic Review and Meta-Analysis of Diagnostic Test Accuracy

    PubMed Central

    de Ruiter, C. M.; van der Veer, C.; Leeflang, M. M. G.; Deborggraeve, S.; Lucas, C.

    2014-01-01

    Molecular methods have been proposed as highly sensitive tools for the detection of Leishmania parasites in visceral leishmaniasis (VL) patients. Here, we evaluate the diagnostic accuracy of these tools in a meta-analysis of the published literature. The selection criteria were original studies that evaluate the sensitivities and specificities of molecular tests for diagnosis of VL, adequate classification of study participants, and the absolute numbers of true positives and negatives derivable from the data presented. Forty studies met the selection criteria, including PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), and loop-mediated isothermal amplification (LAMP). The sensitivities of the individual studies ranged from 29 to 100%, and the specificities ranged from 25 to 100%. The pooled sensitivity of PCR in whole blood was 93.1% (95% confidence interval [CI], 90.0 to 95.2), and the specificity was 95.6% (95% CI, 87.0 to 98.6). The specificity was significantly lower in consecutive studies, at 63.3% (95% CI, 53.9 to 71.8), due either to true-positive patients not being identified by parasitological methods or to the number of asymptomatic carriers in areas of endemicity. PCR for patients with HIV-VL coinfection showed high diagnostic accuracy in buffy coat and bone marrow, ranging from 93.1 to 96.9%. Molecular tools are highly sensitive assays for Leishmania detection and may contribute as an additional test in the algorithm, together with a clear clinical case definition. We observed wide variety in reference standards and study designs and now recommend consecutively designed studies. PMID:24829226

  40. Multiple shooting shadowing for sensitivity analysis of chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Blonigan, Patrick J.; Wang, Qiqi

    2018-02-01

    Sensitivity analysis methods are important tools for research and design with simulations. Many important simulations exhibit chaotic dynamics, including scale-resolving turbulent fluid flow simulations. Unfortunately, conventional sensitivity analysis methods are unable to compute useful gradient information for long-time-averaged quantities in chaotic dynamical systems. Sensitivity analysis with least squares shadowing (LSS) can compute useful gradient information for a number of chaotic systems, including simulations of chaotic vortex shedding and homogeneous isotropic turbulence. However, this gradient information comes at a very high computational cost. This paper presents multiple shooting shadowing (MSS), a more computationally efficient shadowing approach than the original LSS approach. Through an analysis of the convergence rate of MSS, it is shown that MSS can have lower memory usage and run time than LSS.

  41. Tool development to assess the work related neck and upper limb musculoskeletal disorders among female garment workers in Sri-Lanka.

    PubMed

Amarasinghe, Nirmalie Champika; De Alwis Senevirathne, Rohini

    2016-10-17

Musculoskeletal disorders (MSDs) have been identified as a predisposing factor for lower productivity, but no validated tool has been developed to assess them in the Sri Lankan context. The aim was to develop a validated tool to assess neck and upper limb MSDs. The study comprised three components: item selection, item reduction using principal component analysis, and validation. A tentative self-administered questionnaire was developed, translated, and pre-tested. Four important domains - neck, shoulder, elbow and wrist - were identified through principal component analysis. The prevalence of any MSD was 38.1%, and the prevalences of neck, shoulder, elbow and wrist MSDs were 12.85%, 13.71%, 12%, and 13.71%, respectively. Content and criterion validity of the tool were assessed. Separate ROC curves were produced, and the sensitivity and specificity for the neck (83.1%, 71.7%), shoulder (97.6%, 91.9%), elbow (98.2%, 87.2%), and wrist (97.6%, 94.9%) domains were determined. Cronbach's alpha and the correlation coefficient were above 0.7. The tool has high sensitivity, specificity, internal consistency, and test-retest reliability.

  2. Development of a generalized perturbation theory method for sensitivity analysis using continuous-energy Monte Carlo methods

    DOE PAGES

    Perfetti, Christopher M.; Rearden, Bradley T.

    2016-03-01

    The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization and reactor safety, and could help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Here, several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.

  3. Evaluation of an existing screening tool for psoriatic arthritis in people with psoriasis and the development of a new instrument: the Psoriasis Epidemiology Screening Tool (PEST) questionnaire.

    PubMed

    Ibrahim, G H; Buch, M H; Lawson, C; Waxman, R; Helliwell, P S

    2009-01-01

    To evaluate an existing tool (the Swedish modification of the Psoriasis Assessment Questionnaire) and to develop a new instrument to screen for psoriatic arthritis in people with psoriasis. The starting point was a community-based survey of people with psoriasis using questionnaires developed from the literature. Selected respondents were examined, and additional known cases of psoriatic arthritis were included in the analysis. The new instrument was developed using univariate statistics and a logistic regression model, comparing people with and without psoriatic arthritis. The instruments were compared using receiver operating characteristic (ROC) curve analysis. In total, 168 questionnaires were returned (response rate 27%) and 93 people attended for examination (55% of questionnaire respondents). Of these 93, twelve were newly diagnosed with psoriatic arthritis during this study. These 12 were supplemented by 21 people with known psoriatic arthritis. Only five questions were found to be significant predictors of psoriatic arthritis in this population. Figures for sensitivity and specificity were 0.92 and 0.78, respectively, an improvement on the Alenius tool (sensitivity and specificity, 0.63 and 0.72, respectively). A new screening tool for identifying people with psoriatic arthritis has been developed. Five simple questions demonstrated good sensitivity and specificity in this population but further validation is required.

  4. Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

    NASA Astrophysics Data System (ADS)

    García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier

    2016-04-01

    Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the use of different sensors, which implies dependence on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor in the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity analysis for each variable involved in the calibration was computed by the Sobol method, which is based on variance decomposition, and by the Fourier amplitude sensitivity test (FAST) method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
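
    For readers unfamiliar with the variance-based Sobol method mentioned above, the sketch below shows how such indices are typically computed in Python with the SALib package. The package choice, the toy "calibration error" function and the parameter names/bounds are all assumptions for illustration; they are not the paper's camera-LiDAR model.

```python
# Hedged sketch of variance-based (Sobol) sensitivity analysis with SALib.
# The three-parameter "calibration error" function is a stand-in for the real model.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["focal_length", "lidar_offset", "rotation_error"],   # hypothetical
    "bounds": [[0.9, 1.1], [-0.05, 0.05], [-0.02, 0.02]],
}

def calibration_error(x):
    f, t, r = x
    return (f - 1.0) ** 2 + 4.0 * t ** 2 + 10.0 * abs(r)   # toy response

X = saltelli.sample(problem, 1024)                 # N * (2D + 2) model evaluations
Y = np.array([calibration_error(x) for x in X])
Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: first-order={s1:.2f}, total={st:.2f}")
```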

  5. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high tier exposure modelling tool allowing propagation of uncertainty in the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty was addressed extensively through the distribution functions used to describe the input data, and its effect on model results was apportioned by applying sensitivity analysis techniques (the Morris screening method, regression analysis, and the variance-based EFAST method). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of successfully employing MERLIN-Expo in integrated, high tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  7. TAxonomy of Self-reported Sedentary behaviour Tools (TASST) framework for development, comparison and evaluation of self-report tools: content analysis and systematic review

    PubMed Central

    Dall, PM; Coulter, EH; Fitzsimons, CF; Skelton, DA; Chastin, SFM

    2017-01-01

    Objective Sedentary behaviour (SB) has distinct deleterious health outcomes, yet there is no consensus on best practice for measurement. This study aimed to identify the optimal self-report tool for population surveillance of SB, using a systematic framework. Design A framework, TAxonomy of Self-reported Sedentary behaviour Tools (TASST), consisting of four domains (type of assessment, recall period, temporal unit and assessment period), was developed based on a systematic inventory of existing tools. The inventory was achieved through a systematic review of studies reporting SB and tracing back to the original description. A systematic review of the accuracy and sensitivity to change of these tools was then mapped against TASST domains. Data sources Systematic searches were conducted via EBSCO, reference lists and expert opinion. Eligibility criteria for selecting studies The inventory included tools measuring SB in adults that could be self-completed at one sitting, and excluded tools measuring SB in specific populations or contexts. The systematic review included studies reporting on the accuracy against an objective measure of SB and/or sensitivity to change of a tool in the inventory. Results The systematic review initially identified 32 distinct tools (141 questions), which were used to develop the TASST framework. Twenty-two studies evaluated accuracy and/or sensitivity to change, representing only eight taxa. Assessing SB as a sum of behaviours and using a previous day recall were the most promising features of existing tools. Accuracy was poor for all existing tools, with underestimation and overestimation of SB. There was a lack of evidence about sensitivity to change. Conclusions Despite the limited evidence, mapping existing SB tools onto the TASST framework has enabled informed recommendations to be made about the most promising features for a surveillance tool and has identified aspects on which future research and development of SB surveillance tools should focus. Trial registration number International prospective register of systematic reviews (PROSPERO)/CRD42014009851. PMID:28391233

  8. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high measured concentrations, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA), to quantify uncertainty in the model output, and (2) sensitivity analysis (SA), to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  9. State-of-the-Art Fusion-Finder Algorithms Sensitivity and Specificity

    PubMed Central

    Carrara, Matteo; Beccuti, Marco; Lazzarato, Fulvio; Cavallo, Federica; Cordero, Francesca; Donatelli, Susanna; Calogero, Raffaele A.

    2013-01-01

    Background. Gene fusions arising from chromosomal translocations have been implicated in cancer. RNA-seq has the potential to discover such rearrangements generating functional proteins (chimera/fusion). Recently, many methods for chimera detection have been published. However, the specificity and sensitivity of those tools were not extensively investigated in a comparative way. Results. We tested eight fusion-detection tools (FusionHunter, FusionMap, FusionFinder, MapSplice, deFuse, Bellerophontes, ChimeraScan, and TopHat-fusion) to detect fusion events using synthetic and real datasets encompassing chimeras. A comparison run only on synthetic data could generate misleading results, since we found no counterpart in the real dataset. Furthermore, most tools report a very high number of false positive chimeras. In particular, the most sensitive tool, ChimeraScan, reports a large number of false positives that we were able to significantly reduce by devising and applying two filters to remove fusions not supported by fusion junction-spanning reads or encompassing large intronic regions. Conclusions. The discordant results obtained using synthetic and real datasets suggest that synthetic datasets encompassing fusion events may not fully capture the complexity of an RNA-seq experiment. Moreover, fusion detection tools are still limited in sensitivity or specificity; thus, there is space for further improvement in the fusion-finder algorithms. PMID:23555082

  10. msgbsR: An R package for analysing methylation-sensitive restriction enzyme sequencing data.

    PubMed

    Mayne, Benjamin T; Leemaqz, Shalem Y; Buckberry, Sam; Rodriguez Lopez, Carlos M; Roberts, Claire T; Bianco-Miotto, Tina; Breen, James

    2018-02-01

    Genotyping-by-sequencing (GBS) or restriction-site associated DNA marker sequencing (RAD-seq) is a practical and cost-effective method for analysing large genomes from high diversity species. This method of sequencing, coupled with methylation-sensitive enzymes (often referred to as methylation-sensitive restriction enzyme sequencing or MRE-seq), is an effective tool to study DNA methylation in parts of the genome that are inaccessible to other sequencing techniques or are not annotated in microarray technologies. Current software tools do not meet the needs of all methylation-sensitive restriction sequencing assays for determining differences in DNA methylation between samples. To fill this computational need, we present msgbsR, an R package that contains tools for the analysis of methylation-sensitive restriction enzyme sequencing experiments. msgbsR can be used to identify and quantify read counts at methylated sites directly from alignment files (BAM files) and enables verification of restriction enzyme cut sites with the correct recognition sequence of the individual enzyme. In addition, msgbsR assesses DNA methylation based on read coverage, similar to RNA sequencing experiments, rather than methylation proportion and is a useful tool in analysing differential methylation in large populations. The package is fully documented and available freely online as a Bioconductor package ( https://bioconductor.org/packages/release/bioc/html/msgbsR.html ).

  11. Multiculturally Sensitive Mental Health Scale (MSMHS): Development, Factor Analysis, Reliability, and Validity

    ERIC Educational Resources Information Center

    Chao, Ruth Chu-Lien; Green, Kathy E.

    2011-01-01

    Effectively and efficiently diagnosing African Americans' mental health has been a chronically unresolved challenge. To meet this challenge we developed a tool to better understand African Americans' mental health: the Multiculturally Sensitive Mental Health Scale (MSMHS). Three studies reporting the development and initial validation of the MSMHS…

  12. Multisite-multivariable sensitivity analysis of distributed watershed models: enhancing the perceptions from computationally frugal methods

    USDA-ARS?s Scientific Manuscript database

    This paper assesses the impact of different likelihood functions in identifying sensitive parameters of the highly parameterized, spatially distributed Soil and Water Assessment Tool (SWAT) watershed model for multiple variables at multiple sites. The global one-factor-at-a-time (OAT) method of Morr...

  13. Molecular tools for diagnosis of visceral leishmaniasis: systematic review and meta-analysis of diagnostic test accuracy.

    PubMed

    de Ruiter, C M; van der Veer, C; Leeflang, M M G; Deborggraeve, S; Lucas, C; Adams, E R

    2014-09-01

    Molecular methods have been proposed as highly sensitive tools for the detection of Leishmania parasites in visceral leishmaniasis (VL) patients. Here, we evaluate the diagnostic accuracy of these tools in a meta-analysis of the published literature. The selection criteria were original studies that evaluate the sensitivities and specificities of molecular tests for diagnosis of VL, adequate classification of study participants, and the absolute numbers of true positives and negatives derivable from the data presented. Forty studies met the selection criteria, including PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), and loop-mediated isothermal amplification (LAMP). The sensitivities of the individual studies ranged from 29 to 100%, and the specificities ranged from 25 to 100%. The pooled sensitivity of PCR in whole blood was 93.1% (95% confidence interval [CI], 90.0 to 95.2), and the specificity was 95.6% (95% CI, 87.0 to 98.6). The specificity was significantly lower in consecutive studies, at 63.3% (95% CI, 53.9 to 71.8), due either to true-positive patients not being identified by parasitological methods or to the number of asymptomatic carriers in areas of endemicity. PCR for patients with HIV-VL coinfection showed high diagnostic accuracy in buffy coat and bone marrow, ranging from 93.1 to 96.9%. Molecular tools are highly sensitive assays for Leishmania detection and may contribute as an additional test in the algorithm, together with a clear clinical case definition. We observed wide variety in reference standards and study designs and now recommend consecutively designed studies. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  14. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    DOE PAGES

    Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra; ...

    2017-11-20

    The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.
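
    The workflow described above (a space-filling Latin hypercube design plus a statistical emulator of an expensive model) can be prototyped in a few lines. In the sketch below, the quadratic "runoff" function, the parameter names and bounds, and the use of a random-forest emulator with permutation importance as a rough sensitivity ranking are all assumptions for illustration; they are not the VIC setup or the paper's variance-based indices.

```python
# Hedged sketch: Latin hypercube sampling + an emulator of an expensive model.
# The toy response function and hypothetical parameter names are illustrative only.
import numpy as np
from scipy.stats import qmc
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

names = ["infilt", "Ds", "Dsmax", "Ws", "depth2"]          # hypothetical parameters
lo = np.array([0.0, 0.0, 0.1, 0.5, 0.1])
hi = np.array([0.4, 1.0, 30.0, 1.0, 2.0])

sampler = qmc.LatinHypercube(d=len(names), seed=1)
X = qmc.scale(sampler.random(n=500), lo, hi)               # space-filling design

def toy_runoff(x):                                         # stand-in for full model runs
    return 2.0 * x[0] + 0.5 * x[2] - 3.0 * x[0] * x[3] + 0.1 * x[4] ** 2

y = np.apply_along_axis(toy_runoff, 1, X)

emulator = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
imp = permutation_importance(emulator, X, y, n_repeats=10, random_state=0)
for name, score in sorted(zip(names, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")      # rough sensitivity ranking from the emulator
```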

  15. Analysis of the sensitivity properties of a model of vector-borne bubonic plague.

    PubMed

    Buzby, Megan; Neckels, David; Antolin, Michael F; Estep, Donald

    2008-09-06

    Model sensitivity is a key to evaluation of mathematical models in ecology and evolution, especially in complex models with numerous parameters. In this paper, we use some recently developed methods for sensitivity analysis to study the parameter sensitivity of a model of vector-borne bubonic plague in a rodent population proposed by Keeling & Gilligan. The new sensitivity tools are based on a variational analysis involving the adjoint equation. The new approach provides a relatively inexpensive way to obtain derivative information about model output with respect to parameters. We use this approach to determine the sensitivity of a quantity of interest (the force of infection from rats and their fleas to humans) to various model parameters, determine a region over which linearization at a specific parameter reference point is valid, develop a global picture of the output surface, and search for maxima and minima in a given region in the parameter space.

  16. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra

    The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.

  17. Discrete Adjoint Sensitivity Analysis of Hybrid Dynamical Systems With Switching [Discrete Adjoint Sensitivity Analysis of Hybrid Dynamical Systems

    DOE PAGES

    Zhang, Hong; Abhyankar, Shrirang; Constantinescu, Emil; ...

    2017-01-24

    Sensitivity analysis is an important tool for describing power system dynamic behavior in response to parameter variations. It is a central component in preventive and corrective control applications. The existing approaches for sensitivity calculations, namely, finite-difference and forward sensitivity analysis, require a computational effort that increases linearly with the number of sensitivity parameters. In this paper, we investigate, implement, and test a discrete adjoint sensitivity approach whose computational effort is effectively independent of the number of sensitivity parameters. The proposed approach is highly efficient for calculating sensitivities of larger systems and is consistent, within machine precision, with the function whose sensitivity we are seeking. This is an essential feature for use in optimization applications. Moreover, our approach includes a consistent treatment of systems with switching, such as dc exciters, by deriving and implementing the adjoint jump conditions that arise from state-dependent and time-dependent switchings. The accuracy and the computational efficiency of the proposed approach are demonstrated in comparison with the forward sensitivity analysis approach. In conclusion, this paper focuses primarily on the power system dynamics, but the approach is general and can be applied to hybrid dynamical systems in a broader range of fields.

  18. Discrete Adjoint Sensitivity Analysis of Hybrid Dynamical Systems With Switching [Discrete Adjoint Sensitivity Analysis of Hybrid Dynamical Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hong; Abhyankar, Shrirang; Constantinescu, Emil

    Sensitivity analysis is an important tool for describing power system dynamic behavior in response to parameter variations. It is a central component in preventive and corrective control applications. The existing approaches for sensitivity calculations, namely, finite-difference and forward sensitivity analysis, require a computational effort that increases linearly with the number of sensitivity parameters. In this paper, we investigate, implement, and test a discrete adjoint sensitivity approach whose computational effort is effectively independent of the number of sensitivity parameters. The proposed approach is highly efficient for calculating sensitivities of larger systems and is consistent, within machine precision, with the function whose sensitivity we are seeking. This is an essential feature for use in optimization applications. Moreover, our approach includes a consistent treatment of systems with switching, such as dc exciters, by deriving and implementing the adjoint jump conditions that arise from state-dependent and time-dependent switchings. The accuracy and the computational efficiency of the proposed approach are demonstrated in comparison with the forward sensitivity analysis approach. In conclusion, this paper focuses primarily on the power system dynamics, but the approach is general and can be applied to hybrid dynamical systems in a broader range of fields.

  19. Allergen Sensitization Pattern by Sex: A Cluster Analysis in Korea.

    PubMed

    Ohn, Jungyoon; Paik, Seung Hwan; Doh, Eun Jin; Park, Hyun-Sun; Yoon, Hyun-Sun; Cho, Soyun

    2017-12-01

    Allergens tend to sensitize simultaneously. The etiology of this phenomenon has been suggested to be allergen cross-reactivity or concurrent exposure. However, little is known about specific allergen sensitization patterns. The aim was to investigate allergen sensitization characteristics according to gender. The multiple allergen simultaneous test (MAST) is widely used as a screening tool for detecting allergen sensitization in dermatologic clinics. We retrospectively reviewed the medical records of patients with MAST results between 2008 and 2014 in our Department of Dermatology. A cluster analysis was performed to elucidate the allergen-specific immunoglobulin (Ig)E cluster pattern. The results of MAST (39 allergen-specific IgEs) from 4,360 cases were analyzed. By cluster analysis, the 39 items were grouped into 8 clusters. Each cluster had characteristic features. Compared with the female group, the male group tended to be sensitized more frequently to all tested allergens, except for the fungus allergen cluster. The cluster and comparative analysis results demonstrate that allergen sensitization is clustered, reflecting allergen similarity or co-exposure. Only the fungus cluster allergens tended to sensitize the female group more frequently than the male group.
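
    Cluster analysis of allergen-specific IgE positivity of the kind described here can be sketched with SciPy's hierarchical clustering. The binary sensitization matrix below is synthetic and the allergen names are placeholders; this is not the MAST dataset or the study's clustering procedure.

```python
# Hedged sketch: hierarchical clustering of allergen-specific IgE positivity.
# The 0/1 matrix (cases x allergens) is synthetic, not the MAST data set.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(7)
allergens = ["D. farinae", "D. pteronyssinus", "cat", "dog", "birch", "Alternaria"]
cases = rng.integers(0, 2, size=(300, len(allergens))).astype(bool)   # sensitized yes/no

# Cluster the allergens (columns) by how often they are co-positive across cases.
dist = pdist(cases.T, metric="jaccard")
Z = linkage(dist, method="average")
labels = fcluster(Z, t=3, criterion="maxclust")             # e.g. 3 clusters
for allergen, lab in zip(allergens, labels):
    print(f"{allergen}: cluster {lab}")
```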

  20. Applying geologic sensitivity analysis to environmental risk management: The financial implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, D.T.

    The financial risks associated with environmental contamination can be staggering and are often difficult to identify and accurately assess. Geologic sensitivity analysis is gaining recognition as a significant and useful tool that can empower the user with crucial information concerning environmental risk management and brownfield redevelopment. It is particularly useful when (1) evaluating the potential risks associated with redevelopment of historical industrial facilities (brownfields) and (2) planning for future development, especially in areas of rapid development because the number of potential contaminating sources often increases with an increase in economic development. An examination of the financial implications relating to geologic sensitivity analysis in southeastern Michigan from numerous case studies indicates that the environmental cost of contamination may be 100 to 1,000 times greater at a geologically sensitive location compared to the least sensitive location. Geologic sensitivity analysis has demonstrated that near-surface geology may influence the environmental impact of a contaminated site to a greater extent than the amount and type of industrial development.

  1. Investigation, sensitivity analysis, and multi-objective optimization of effective parameters on temperature and force in robotic drilling cortical bone.

    PubMed

    Tahmasbi, Vahid; Ghoreishi, Majid; Zolfaghari, Mojtaba

    2017-11-01

    The bone drilling process is very prominent in orthopedic surgeries and in the repair of bone fractures. It is also very common in dentistry and bone sampling operations. Due to the complexity of bone and the sensitivity of the process, bone drilling is one of the most important and sensitive processes in biomedical engineering. Orthopedic surgeries can be improved using robotic systems and mechatronic tools. The most crucial problem during drilling is an unwanted increase in process temperature (higher than 47 °C), which causes thermal osteonecrosis or cell death and local burning of the bone tissue. Moreover, imposing higher forces to the bone may lead to breaking or cracking and consequently cause serious damage. In this study, a mathematical second-order linear regression model as a function of tool drilling speed, feed rate, tool diameter, and their effective interactions is introduced to predict temperature and force during the bone drilling process. This model can determine the maximum speed of surgery that remains within an acceptable temperature range. Moreover, for the first time, using designed experiments, the bone drilling process was modeled, and the drilling speed, feed rate, and tool diameter were optimized. Then, using response surface methodology and applying a multi-objective optimization, drilling force was minimized to sustain an acceptable temperature range without damaging the bone or the surrounding tissue. In addition, for the first time, Sobol statistical sensitivity analysis is used to ascertain the effect of process input parameters on process temperature and force. The results show that among all effective input parameters, tool rotational speed, feed rate, and tool diameter have the highest influence on process temperature and force, respectively. The behavior of each output parameters with variation in each input parameter is further investigated. Finally, a multi-objective optimization has been performed considering all the aforementioned parameters. This optimization yielded a set of data that can considerably improve orthopedic osteosynthesis outcomes.
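
    The second-order regression model described above (temperature and force as functions of rotational speed, feed rate and tool diameter, including interactions) corresponds to a standard quadratic response surface. The sketch below fits such a surface with scikit-learn on synthetic drilling data; the value ranges, coefficients and prediction point are illustrative assumptions, not the study's model.

```python
# Hedged sketch: second-order (quadratic) response-surface model for drilling temperature.
# Synthetic data; the speed [rpm], feed [mm/min] and diameter [mm] ranges are assumptions.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n = 60
speed = rng.uniform(500, 3000, n)
feed = rng.uniform(20, 120, n)
diam = rng.uniform(2.0, 4.5, n)
X = np.column_stack([speed, feed, diam])

# Synthetic "measured" temperature with main, interaction and quadratic effects.
temp = (25 + 0.01 * speed + 0.08 * feed + 3.0 * diam
        + 1e-5 * speed * feed + 2e-6 * speed ** 2 + rng.normal(0, 1.0, n))

model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression()).fit(X, temp)
print("R^2 on the design points:", round(model.score(X, temp), 3))
print("Predicted temperature at 2000 rpm, 60 mm/min, 3 mm drill:",
      round(float(model.predict([[2000, 60, 3.0]])[0]), 1), "degC")
```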

  2. Systematic review of fall risk screening tools for older patients in acute hospitals.

    PubMed

    Matarese, Maria; Ivziku, Dhurata; Bartolozzi, Francesco; Piredda, Michela; De Marinis, Maria Grazia

    2015-06-01

    To determine the most accurate fall risk screening tools for predicting falls among patients aged 65 years or older admitted to acute care hospitals. Falls represent a serious problem in older inpatients due to the potential physical, social, psychological and economic consequences. Older inpatients present with risk factors associated with age-related physiological and psychological changes as well as multiple morbidities. Thus, fall risk screening tools for older adults should include these specific risk factors. There are no published recommendations addressing what tools are appropriate for older hospitalized adults. Systematic review. MEDLINE, CINAHL and Cochrane electronic databases were searched between January 1981-April 2013. Only prospective validation studies reporting sensitivity and specificity values were included. Recommendations of the Cochrane Handbook of Diagnostic Test Accuracy Reviews have been followed. Three fall risk assessment tools were evaluated in seven articles. Due to the limited number of studies, meta-analysis was carried out only for the STRATIFY and Hendrich Fall Risk Model II. In the combined analysis, the Hendrich Fall Risk Model II demonstrated higher sensitivity than STRATIFY, while the STRATIFY showed higher specificity. In both tools, the Youden index showed low prognostic accuracy. The identified tools do not demonstrate predictive values as high as needed for identifying older inpatients at risk for falls. For this reason, no tool can be recommended for fall detection. More research is needed to evaluate fall risk screening tools for older inpatients. © 2014 John Wiley & Sons Ltd.
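
    The Youden index mentioned above is simply sensitivity + specificity - 1. The short sketch below applies it to two hypothetical screening tools; the sensitivity/specificity pairs are placeholders, since the abstract does not report the pooled values.

```python
# Hedged sketch: Youden index (J = sensitivity + specificity - 1) for two screening tools.
# The sensitivity/specificity pairs below are hypothetical placeholders, not the
# pooled values from the review.
def youden(sensitivity: float, specificity: float) -> float:
    return sensitivity + specificity - 1.0

tools = {"STRATIFY": (0.70, 0.65), "Hendrich II": (0.80, 0.55)}   # hypothetical values
for name, (se, sp) in tools.items():
    print(f"{name}: J = {youden(se, sp):.2f}")
# J close to 0 indicates little prognostic value; J = 1 is a perfect test.
```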

  3. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  4. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study

    PubMed Central

    Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data. PMID:28255331

  5. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study.

    PubMed

    Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.

  6. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    Many of the currently widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology and are truly surface sensitive (that is, less than 10 atomic layers) are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  7. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
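
    The advantage the abstract highlights, analytic derivatives enabling stable and efficient gradient-based optimization, can be seen in a generic example unrelated to PyCycle or OpenMDAO: supplying an exact gradient to scipy.optimize typically cuts the number of function evaluations compared with finite-difference approximation. The objective below is the standard Rosenbrock function, chosen purely for illustration.

```python
# Hedged, generic illustration (not PyCycle/OpenMDAO): gradient-based optimization
# with an analytic gradient versus finite-difference gradient approximation.
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2)

def rosen_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

x0 = np.full(8, 1.3)
analytic = minimize(rosen, x0, jac=rosen_grad, method="BFGS")
fd = minimize(rosen, x0, method="BFGS")        # gradient estimated by finite differences
print("function evaluations, analytic gradient :", analytic.nfev)
print("function evaluations, finite difference :", fd.nfev)
```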

  8. TAxonomy of Self-reported Sedentary behaviour Tools (TASST) framework for development, comparison and evaluation of self-report tools: content analysis and systematic review.

    PubMed

    Dall, P M; Coulter, E H; Fitzsimons, C F; Skelton, D A; Chastin, Sfm

    2017-04-08

    Sedentary behaviour (SB) has distinct deleterious health outcomes, yet there is no consensus on best practice for measurement. This study aimed to identify the optimal self-report tool for population surveillance of SB, using a systematic framework. A framework, TAxonomy of Self-reported Sedentary behaviour Tools (TASST), consisting of four domains (type of assessment, recall period, temporal unit and assessment period), was developed based on a systematic inventory of existing tools. The inventory was achieved through a systematic review of studies reporting SB and tracing back to the original description. A systematic review of the accuracy and sensitivity to change of these tools was then mapped against TASST domains. Systematic searches were conducted via EBSCO, reference lists and expert opinion. The inventory included tools measuring SB in adults that could be self-completed at one sitting, and excluded tools measuring SB in specific populations or contexts. The systematic review included studies reporting on the accuracy against an objective measure of SB and/or sensitivity to change of a tool in the inventory. The systematic review initially identified 32 distinct tools (141 questions), which were used to develop the TASST framework. Twenty-two studies evaluated accuracy and/or sensitivity to change, representing only eight taxa. Assessing SB as a sum of behaviours and using a previous day recall were the most promising features of existing tools. Accuracy was poor for all existing tools, with underestimation and overestimation of SB. There was a lack of evidence about sensitivity to change. Despite the limited evidence, mapping existing SB tools onto the TASST framework has enabled informed recommendations to be made about the most promising features for a surveillance tool and has identified aspects on which future research and development of SB surveillance tools should focus. International prospective register of systematic reviews (PROSPERO)/CRD42014009851. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  9. Review-of-systems questionnaire as a predictive tool for psychogenic nonepileptic seizures.

    PubMed

    Robles, Liliana; Chiang, Sharon; Haneef, Zulfi

    2015-04-01

    Patients with refractory epilepsy undergo video-electroencephalography for seizure characterization, among whom approximately 10-30% will be discharged with the diagnosis of psychogenic nonepileptic seizures (PNESs). Clinical PNES predictors have been described but in general are not sensitive or specific. We evaluated whether multiple complaints in a routine review-of-system (ROS) questionnaire could serve as a sensitive and specific marker of PNESs. We performed a retrospective analysis of a standardized ROS questionnaire completed by patients with definite PNESs and epileptic seizures (ESs) diagnosed in our adult epilepsy monitoring unit. A multivariate analysis of covariance (MANCOVA) was used to determine whether groups with PNES and ES differed with respect to the percentage of complaints in the ROS questionnaire. Tenfold cross-validation was used to evaluate the predictive error of a logistic regression classifier for PNES status based on the percentage of positive complaints in the ROS questionnaire. A total of 44 patients were included for analysis. Patients with PNESs had a significantly higher number of complaints in the ROS questionnaire compared to patients with epilepsy. A threshold of 17% positive complaints achieved a 78% specificity and 85% sensitivity for discriminating between PNESs and ESs. We conclude that the routine ROS questionnaire may be a sensitive and specific predictive tool for discriminating between PNESs and ESs. Published by Elsevier Inc.
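
    The classifier described above, a logistic regression on the percentage of positive review-of-systems complaints evaluated with tenfold cross-validation and a fixed decision threshold, follows a standard pattern. The sketch below reproduces that pattern on synthetic data with scikit-learn; the feature values, sample size and group separation are illustrative assumptions, not the study's data.

```python
# Hedged sketch: logistic regression on % positive review-of-systems complaints,
# evaluated with 10-fold cross-validation. Data are synthetic, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(42)
n = 200
pnes = rng.integers(0, 2, size=n)                       # 1 = PNES, 0 = epileptic seizures
pct_complaints = np.clip(10 + 12 * pnes + rng.normal(0, 6, n), 0, 100)  # % positive items
X = pct_complaints.reshape(-1, 1)

clf = LogisticRegression()
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
acc = cross_val_score(clf, X, pnes, cv=cv, scoring="accuracy")
print(f"10-fold CV accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")

# A fixed decision threshold on the raw feature, analogous to the 17% cut-off idea:
threshold = 17.0
pred = (pct_complaints >= threshold).astype(int)
sens = (pred[pnes == 1] == 1).mean()
spec = (pred[pnes == 0] == 0).mean()
print(f"threshold {threshold}%: sensitivity={sens:.2f}, specificity={spec:.2f}")
```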

  10. New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)

    NASA Astrophysics Data System (ADS)

    Dyrda, J.; Soppera, N.; Hill, I.; Bossant, M.; Gulliford, J.

    2017-09-01

    Following the release and initial testing period of the NEA's Nuclear Data Sensitivity Tool [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for neutron scattering angular distribution [2] and constrained Chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries, e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison of the broad range of results it is possible to obtain. The IRPhE database of reactor physics measurements is now also accessible within the tool in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite is now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra, will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data. The impact of such changes is verified through calculations which are compared to a 'direct' measurement found by adjustment of the original ENDF format file.
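
    The propagation of covariance data through sensitivity coefficients described above follows the usual first-order "sandwich rule", var(k) ~ S C S^T. The sketch below shows that arithmetic with a made-up three-group sensitivity vector and relative covariance matrix; the numbers are illustrative, not evaluated nuclear data.

```python
# Hedged sketch of the first-order "sandwich rule" var(k) ~= S C S^T used to
# propagate nuclear-data covariances through sensitivity coefficients.
# The 3-group sensitivity vector and relative covariance matrix are made up.
import numpy as np

S = np.array([0.12, 0.30, 0.05])            # dk/k per dsigma/sigma, by energy group
C = np.array([[4.0, 1.0, 0.2],              # relative covariance, in percent squared
              [1.0, 9.0, 0.5],
              [0.2, 0.5, 1.0]]) * 1e-4      # convert %^2 to (fraction)^2

var_k = S @ C @ S                           # sandwich rule
print(f"relative uncertainty in keff: {np.sqrt(var_k) * 100:.3f} %")
```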

  11. Univariate and multivariate analysis of tannin-impregnated wood species using vibrational spectroscopy.

    PubMed

    Schnabel, Thomas; Musso, Maurizio; Tondi, Gianluca

    2014-01-01

    Vibrational spectroscopy is one of the most powerful tools in polymer science. Three main techniques--Fourier transform infrared spectroscopy (FT-IR), FT-Raman spectroscopy, and FT near-infrared (NIR) spectroscopy--can also be applied to wood science. Here, these three techniques were used to investigate the chemical modification occurring in wood after impregnation with tannin-hexamine preservatives. These spectroscopic techniques have the capacity to detect the externally added tannin. FT-IR has very strong sensitivity to the aromatic peak at around 1610 cm(-1) in the tannin-treated samples, whereas FT-Raman reflects the peak at around 1600 cm(-1) for the externally added tannin. This high efficacy in distinguishing chemical features was demonstrated in univariate analysis and confirmed via cluster analysis. Conversely, the results of the NIR measurements show noticeable sensitivity for small differences. For this technique, multivariate analysis is required and with this chemometric tool, it is also possible to predict the concentration of tannin on the surface.

  12. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  13. Affective and behavioral dysfunction under antiepileptic drugs in epilepsy: Development of a new drug-sensitive screening tool.

    PubMed

    Mertens, Lea Julia; Witt, Juri-Alexander; Helmstaedter, Christoph

    2018-06-01

    Behavioral problems and psychiatric symptoms are common in patients with epilepsy and have a multifactorial origin, including adverse effects of antiepileptic drugs (AEDs). In order to develop a screening tool for behavioral AED effects, the aim of this study was to identify behavioral problems and symptoms particularly sensitive to AED drug load and the presence/absence of AEDs with known negative psychotropic profiles. Four hundred ninety-four patients with epilepsy were evaluated who had been assessed with three self-report questionnaires on mood, personality, and behavior (Beck Depression Inventory, BDI; Neurological Disorders Depression Inventory for Epilepsy extended, NDDI-E; and Fragebogen zur Persönlichkeit bei zerebralen Erkrankungen, FPZ). Drug-sensitive items were determined via correlation analyses and entered into an exploratory factor analysis for scale construction. The resulting scales were then analyzed as a function of drug treatment. Analyses revealed 30 items, which could be allocated to six behavioral domains: Emotional Lability, Depression, Aggression/Irritability, Psychosis & Suicidality, Risk- & Sensation-seeking, and Somatization. Subsequent analysis showed significant effects of the number of AEDs on behavior, as in Emotional Lability (F=2.54, p=.029), Aggression/Irritability (F=2.29, p=.046), Psychosis & Suicidality (F=2.98, p=.012), and Somatization (F=2.39, p=.038). Affective and behavioral difficulties were more prominent in those patients taking AEDs with supposedly negative psychotropic profiles. These effects were largely domain-unspecific and primarily manifested in polytherapy. Drug-sensitive behavioral domains and items were identified which qualify for a self-report screening tool. The tool indicates impairments with a higher drug load and when administering AEDs with negative psychotropic profiles. The next steps require normalization in healthy subjects and the clinical validation of the newly developed screening tool PsyTrack along with antiepileptic drug treatment. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Review of nutritional screening and assessment tools and clinical outcomes in heart failure.

    PubMed

    Lin, Hong; Zhang, Haifeng; Lin, Zheng; Li, Xinli; Kong, Xiangqin; Sun, Gouzhen

    2016-09-01

    Recent studies have suggested that undernutrition as defined using multidimensional nutritional evaluation tools may affect clinical outcomes in heart failure (HF). The evidence supporting this correlation is unclear. Therefore, we conducted this systematic review to critically appraise the use of multidimensional evaluation tools in the prediction of clinical outcomes in HF. We performed descriptive analyses of all identified articles involving qualitative analyses. We used STATA to conduct meta-analyses when at least three studies that tested the same type of nutritional assessment or screening tools and used the same outcome were identified. Sensitivity analyses were conducted to validate our positive results. We identified 17 articles with qualitative analyses and 11 with quantitative analysis after comprehensive literature searching and screening. We determined that the prevalence of malnutrition is high in HF (range 16-90 %), particularly in advanced and acute decompensated HF (approximate range 75-90 %). Undernutrition as identified by multidimensional evaluation tools may be significantly associated with hospitalization, length of stay and complications and is particularly strongly associated with high mortality. The meta-analysis revealed that compared with other tools, Mini Nutritional Assessment (MNA) scores were the strongest predictors of mortality in HF [HR (4.32, 95 % CI 2.30-8.11)]. Our results remained reliable after conducting sensitivity analyses. The prevalence of malnutrition is high in HF, particularly in advanced and acute decompensated HF. Moreover, undernutrition as identified by multidimensional evaluation tools is significantly associated with unfavourable prognoses and high mortality in HF.

  15. Two-step sensitivity testing of parametrized and regionalized life cycle assessments: methodology and case study.

    PubMed

    Mutel, Christopher L; de Baan, Laura; Hellweg, Stefanie

    2013-06-04

    Comprehensive sensitivity analysis is a significant tool to interpret and improve life cycle assessment (LCA) models, but is rarely performed. Sensitivity analysis will increase in importance as inventory databases become regionalized, increasing the number of system parameters, and parametrized, adding complexity through variables and nonlinear formulas. We propose and implement a new two-step approach to sensitivity analysis. First, we identify parameters with high global sensitivities for further examination and analysis with a screening step, the method of elementary effects. Second, the more computationally intensive contribution to variance test is used to quantify the relative importance of these parameters. The two-step sensitivity test is illustrated on a regionalized, nonlinear case study of the biodiversity impacts from land use of cocoa production, including a worldwide cocoa products trade model. Our simplified trade model can be used for transformable commodities where one is assessing market shares that vary over time. In the case study, the highly uncertain characterization factors for the Ivory Coast and Ghana contributed more than 50% of variance for almost all countries and years examined. The two-step sensitivity test allows for the interpretation, understanding, and improvement of large, complex, and nonlinear LCA systems.
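
    The screening step described here, the method of elementary effects (Morris), is commonly run with the SALib package in Python. The sketch below applies it to a toy impact function; the package choice, hypothetical parameter names and bounds, and the function itself are assumptions, not the authors' implementation.

```python
# Hedged sketch of step 1 (Morris elementary-effects screening) with SALib.
# The toy "impact" function and parameter bounds are illustrative stand-ins.
import numpy as np
from SALib.sample import morris as morris_sampler
from SALib.analyze import morris as morris_analyzer

problem = {
    "num_vars": 4,
    "names": ["char_factor_CI", "char_factor_GH", "yield", "trade_share"],
    "bounds": [[0.5, 2.0], [0.5, 2.0], [0.3, 1.0], [0.0, 1.0]],
}

def impact(x):
    cf_ci, cf_gh, yld, share = x
    return share * cf_ci / yld + (1.0 - share) * cf_gh / yld   # toy LCA-style score

X = morris_sampler.sample(problem, N=100, num_levels=4)
Y = np.apply_along_axis(impact, 1, X)
Si = morris_analyzer.analyze(problem, X, Y, num_levels=4)
for name, mu_star, sigma in zip(problem["names"], Si["mu_star"], Si["sigma"]):
    print(f"{name}: mu* = {mu_star:.3f}, sigma = {sigma:.3f}")
# Parameters with large mu* move on to the more expensive contribution-to-variance step.
```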

  16. Analysis techniques for multivariate root loci. [a tool in linear control systems

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

    Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
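
    The computation referred to above, using a generalized eigenvalue problem to obtain angles and sensitivities, can be illustrated generically: for lambda B x = A x, the first-order sensitivity of an eigenvalue to a perturbation of A is d(lambda) = (y^H dA x) / (y^H B x), with y the left eigenvector. The sketch below (arbitrary example matrices, not a control-system model) checks that formula against a direct recomputation.

```python
# Hedged sketch: generalized eigenvalues (lambda B x = A x) and first-order
# eigenvalue sensitivity d(lambda) = (y^H dA x) / (y^H B x). Matrices are arbitrary.
import numpy as np
from scipy.linalg import eig, eigvals

rng = np.random.default_rng(5)
A = rng.normal(size=(4, 4))
B = np.eye(4) + 0.1 * rng.normal(size=(4, 4))

w, vl, vr = eig(A, B, left=True, right=True)
k = 0                                        # study the first eigenvalue
lam, x, y = w[k], vr[:, k], vl[:, k]

dA = rng.normal(size=(4, 4)) * 1e-6          # small perturbation of A
dlam_pred = (y.conj() @ dA @ x) / (y.conj() @ B @ x)

w2 = eigvals(A + dA, B)                      # recompute the perturbed spectrum
dlam_true = w2[np.argmin(np.abs(w2 - lam))] - lam
print("predicted :", dlam_pred)
print("recomputed:", dlam_true)
```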

  17. Is the Job Satisfaction Survey a good tool to measure job satisfaction amongst health workers in Nepal? Results of a validation analysis.

    PubMed

    Batura, Neha; Skordis-Worrall, Jolene; Thapa, Rita; Basnyat, Regina; Morrison, Joanna

    2016-07-27

    Job satisfaction is an important predictor of an individual's intention to leave the workplace. It is increasingly being used to consider the retention of health workers in low-income countries. However, the determinants of job satisfaction vary in different contexts, and it is important to use measurement methods that are contextually appropriate. We identified a measurement tool developed by Paul Spector, and used mixed methods to assess its validity and reliability in measuring job satisfaction among maternal and newborn health workers (MNHWs) in government facilities in rural Nepal. We administered the tool to 137 MNHWs and collected qualitative data from 78 MNHWs, and district and central level stakeholders to explore definitions of job satisfaction and factors that affected it. We calculated a job satisfaction index for all MNHWs using quantitative data and tested for validity, reliability and sensitivity. We conducted qualitative content analysis and compared the job satisfaction indices with qualitative data. Results from the internal consistency tests offer encouraging evidence of the validity, reliability and sensitivity of the tool. Overall, the job satisfaction indices reflected the qualitative data. The tool was able to distinguish levels of job satisfaction among MNHWs. However, the work environment and promotion dimensions of the tool did not adequately reflect local conditions. Further, community fit was found to impact job satisfaction but was not captured by the tool. The relatively high incidence of missing responses may suggest that responding to some statements was perceived as risky. Our findings indicate that the adapted job satisfaction survey was able to measure job satisfaction in Nepal. However, it did not include key contextual factors affecting job satisfaction of MNHWs, and as such may have been less sensitive than a more inclusive measure. The findings suggest that this tool can be used in similar settings and populations, with the addition of statements reflecting the nature of the work environment and structure of the local health system. Qualitative data on job satisfaction should be collected before using the tool in a new context, to highlight any locally relevant dimensions of job satisfaction not already captured in the standard survey.
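
    Cronbach's alpha, the internal-consistency statistic referred to above, is straightforward to compute from item-level responses: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below uses synthetic Likert-style data, not the Nepal survey responses.

```python
# Hedged sketch: Cronbach's alpha for a multi-item satisfaction scale.
# The item responses are synthetic Likert-type data (1-5), not the MNHW survey.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(11)
latent = rng.normal(size=(137, 1))                      # one common satisfaction factor
items = np.clip(np.rint(3 + latent + rng.normal(0, 0.8, size=(137, 6))), 1, 5)
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```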

  18. Improved Analysis of Earth System Models and Observations using Simple Climate Models

    NASA Astrophysics Data System (ADS)

    Nadiga, B. T.; Urban, N. M.

    2016-12-01

    Earth system models (ESM) are the most comprehensive tools we have to study climate change and develop climate projections. However, the computational infrastructure required and the cost incurred in running such ESMs preclude direct use of such models in conjunction with a wide variety of tools that can further our understanding of climate. Here we are referring to tools that range from dynamical systems tools, which give insight into underlying flow structure and topology, to tools from various applied mathematical and statistical techniques that are central to quantifying stability, sensitivity, uncertainty and predictability, to machine learning tools that are now being rapidly developed or improved. Our approach to facilitate the use of such models is to analyze output of ESM experiments (cf. CMIP) using a range of simpler models that consider integral balances of important quantities such as mass and/or energy in a Bayesian framework. We highlight the use of this approach in the context of the uptake of heat by the world oceans in the ongoing global warming. Indeed, since in excess of 90% of the anomalous radiative forcing due to greenhouse gas emissions is sequestered in the world oceans, the nature of ocean heat uptake crucially determines the surface warming that is realized (cf. climate sensitivity). Nevertheless, ESMs themselves are never run long enough to directly assess climate sensitivity. So, we consider a range of models based on integral balances--balances that have to be realized in all first-principles-based models of the climate system including the most detailed state-of-the-art climate simulations. The models range from simple models of energy balance to those that consider dynamically important ocean processes such as the conveyor-belt circulation (Meridional Overturning Circulation, MOC), North Atlantic Deep Water (NADW) formation, Antarctic Circumpolar Current (ACC) and eddy mixing. Results from Bayesian analysis of such models using both ESM experiments and actual observations are presented. One such result points to the importance of direct sequestration of heat below 700 m, a process that is not allowed for in the simple models that have been traditionally used to deduce climate sensitivity.

  19. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy dispersive spectroscopy), EPMA (electron probe microanalysis), AES (Auger electron spectroscopy), etc., its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS may become as prominent in the elemental analysis field as LIBS (laser-induced breakdown spectroscopy).

  20. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.

    2017-05-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
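
    The grouping idea can be illustrated with a toy nested Monte Carlo estimate of a first-order index for a whole group of inputs, S_G = Var(E[Y | X_G]) / Var(Y); the surrogate model and grouping below are invented for demonstration and are not the Hanford flow-and-transport model.

# Illustrative sketch of a grouped first-order sensitivity index via nested Monte
# Carlo, i.e. S_G = Var( E[Y | X_G] ) / Var(Y). The toy model and grouping are
# assumptions for demonstration only, not the Hanford model.
import numpy as np

rng = np.random.default_rng(0)

def model(boundary, permeability, porosity):
    """Toy surrogate for a head/plume output."""
    return boundary * np.exp(permeability) + 0.3 * porosity

def grouped_first_order_index(n_outer=500, n_inner=200):
    cond_means = np.empty(n_outer)
    all_y = []
    for i in range(n_outer):
        boundary = rng.normal(1.0, 0.2)                  # "boundary conditions" group, fixed per outer loop
        permeability = rng.normal(0.0, 0.5, n_inner)     # remaining groups resampled
        porosity = rng.uniform(0.1, 0.4, n_inner)
        y = model(boundary, permeability, porosity)
        cond_means[i] = y.mean()
        all_y.append(y)
    total_var = np.concatenate(all_y).var()
    return cond_means.var() / total_var

print("First-order index of the boundary-condition group:",
      round(grouped_first_order_index(), 3))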

  1. A Geostatistics-Informed Hierarchical Sensitivity Analysis Method for Complex Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2017-12-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.

  2. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    NASA Astrophysics Data System (ADS)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  3. Validation of Version 3.0 of the Breast Cancer Genetics Referral Screening Tool (B-RST™).

    PubMed

    Bellcross, Cecelia; Hermstad, April; Tallo, Christine; Stanislaw, Christine

    2018-05-08

    Despite increased awareness of hereditary breast and ovarian cancer among clinicians and the public, many BRCA1/2 mutation carriers remain unaware of their risk status. The Breast Cancer Genetics Referral Screening Tool (B-RST™) was created and validated to easily identify individuals at increased risk for hereditary breast and ovarian cancer for referral to cancer genetics services. The purpose of this study was to revise B-RST™ to maximize sensitivity against BRCA1/2 mutation status. We analyzed pedigrees of 277 individuals who had undergone BRCA1/2 testing to determine modifications to the B-RST™ 2.0 algorithm that would maximize sensitivity for mutations, while maintaining simplicity. We used McNemar's chi-square test to compare validation measures between the revised version (3.0) and the 2.0 version. Algorithmic changes made to B-RST™ 2.0 increased the sensitivity against BRCA1/2 mutation analysis from 71.1 to 94.0% (P < 0.0001). While specificity decreased, all screen-positive individuals were appropriate for cancer genetics referral, the primary purpose of the tool. Despite calls for BRCA1/2 population screening, there remains a critical need to identify those most at risk who should receive cancer genetics services. B-RST™ version 3.0 demonstrates high sensitivity for BRCA1/2 mutations, yet remains a simple and quick screening tool for at-risk individuals.
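
    A hedged sketch of the statistical comparison described above: McNemar's test on paired screen results from two versions of a screening tool among known mutation carriers, using statsmodels; the counts below are invented and are not the B-RST validation data.

# Hedged illustration of comparing the sensitivity of two screening-tool versions
# on paired (same-patient) results with McNemar's test; counts below are invented,
# not the B-RST validation data.
from statsmodels.stats.contingency_tables import mcnemar

# Rows: version 2.0 screen result, columns: version 3.0 screen result,
# among true mutation carriers only (invented counts).
#                 v3.0 positive   v3.0 negative
table = [[118,               1],    # v2.0 positive
         [39,                8]]    # v2.0 negative

result = mcnemar(table, exact=True)
print(f"McNemar statistic = {result.statistic}, p = {result.pvalue:.4g}")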

  4. Modelling the exposure to chemicals for risk assessment: a comprehensive library of multimedia and PBPK models for integration, prediction, uncertainty and sensitivity analysis - the MERLIN-Expo tool.

    PubMed

    Ciffroy, P; Alfonso, B; Altenpohl, A; Banjac, Z; Bierkens, J; Brochot, C; Critto, A; De Wilde, T; Fait, G; Fierens, T; Garratt, J; Giubilato, E; Grange, E; Johansson, E; Radomyski, A; Reschwann, K; Suciu, N; Tanaka, T; Tediosi, A; Van Holderbeke, M; Verdonck, F

    2016-10-15

    MERLIN-Expo is a library of models that was developed in the frame of the FP7 EU project 4FUN in order to provide an integrated assessment tool for state-of-the-art exposure assessment for environment, biota and humans, allowing the detection of scientific uncertainties at each step of the exposure process. This paper describes the main features of the MERLIN-Expo tool. The main challenges in exposure modelling that MERLIN-Expo has tackled are: (i) the integration of multimedia (MM) models simulating the fate of chemicals in environmental media, and of physiologically based pharmacokinetic (PBPK) models simulating the fate of chemicals in human body. MERLIN-Expo thus allows the determination of internal effective chemical concentrations; (ii) the incorporation of a set of functionalities for uncertainty/sensitivity analysis, from screening to variance-based approaches. The availability of such tools for uncertainty and sensitivity analysis aimed to facilitate the incorporation of such issues in future decision making; (iii) the integration of human and wildlife biota targets with common fate modelling in the environment. MERLIN-Expo is composed of a library of fate models dedicated to non biological receptor media (surface waters, soils, outdoor air), biological media of concern for humans (several cultivated crops, mammals, milk, fish), as well as wildlife biota (primary producers in rivers, invertebrates, fish) and humans. These models can be linked together to create flexible scenarios relevant for both human and wildlife biota exposure. Standardized documentation for each model and training material were prepared to support an accurate use of the tool by end-users. One of the objectives of the 4FUN project was also to increase the confidence in the applicability of the MERLIN-Expo tool through targeted realistic case studies. In particular, we aimed at demonstrating the feasibility of building complex realistic exposure scenarios and the accuracy of the modelling predictions through a comparison with actual measurements. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  6. Improving multi-objective reservoir operation optimization with sensitivity-informed dimension reduction

    NASA Astrophysics Data System (ADS)

    Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.

    2015-08-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
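
    A minimal sketch of the screening idea, assuming SALib and a made-up reservoir performance function: decision variables whose Sobol' total-order indices are negligible can be fixed before the optimization, shrinking the search space.

# Hedged sketch of using Sobol' indices (via SALib) to screen decision variables;
# the performance function below is a made-up stand-in, not the Dahuofang
# operating model.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 4,
    "names": ["release_1", "release_2", "release_3", "release_4"],  # hypothetical
    "bounds": [[0.0, 1.0]] * 4,
}

def reservoir_objective(x):
    # Only the first two decisions matter in this toy function.
    return x[0] ** 2 + 0.8 * x[1] + 0.01 * x[2] * x[3]

X = saltelli.sample(problem, 1024)
Y = np.apply_along_axis(reservoir_objective, 1, X)
Si = sobol.analyze(problem, Y)

# Decision variables with negligible total-order indices can be fixed,
# reducing the dimensionality of the optimization problem.
for name, st in zip(problem["names"], Si["ST"]):
    print(f"{name}: ST = {st:.3f}")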

  7. Improving multi-objective reservoir operation optimization with sensitivity-informed problem decomposition

    NASA Astrophysics Data System (ADS)

    Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.

    2015-04-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving the complex multi-objective reservoir operation problems.

  8. A GIS-assisted regional screening tool to evaluate the leaching potential of volatile and non-volatile pesticides

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Ray, Chittaranjan

    2015-03-01

    A regional screening tool, which is useful in cases where few site-specific parameters are available for complex vadose zone models, assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of volatile organic compounds (VOCs) from the soil surface. The tool was modified by introducing expanded terms into the traditional pesticide ranking indices (i.e., retardation and attenuation factors), so that the leaching fraction of volatile chemicals can be estimated from recharge, soil, and chemical properties. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and a known non-leacher, determined under local conditions. The sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant, and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density in soils. When the revised tool was compared to the analytical (STANMOD) and numerical (HYDRUS-1D) models as a susceptibility measure, it ranked particular VOCs (e.g., benzene, carbofuran, and toluene) consistently with the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination by pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.
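
    The retardation and attenuation factors mentioned above are commonly written as RF = 1 + (rho_b*f_oc*Koc + theta_air*KH)/theta_FC and AF = exp(-0.693*d*RF*theta_FC/(q*t_half)); the sketch below simply evaluates these generic screening-index forms with illustrative numbers and is not the revised Hawaii tool itself.

# Sketch of the classic retardation-factor / attenuation-factor screening indices
# that tools of this kind build on; the extension for volatiles adds the air-phase
# term via Henry's law constant. Parameter values are illustrative only.
import math

def retardation_factor(theta_fc, rho_b, f_oc, k_oc, theta_air, k_h):
    """RF = 1 + (rho_b * f_oc * Koc)/theta_fc + (theta_air * KH)/theta_fc."""
    return 1.0 + (rho_b * f_oc * k_oc) / theta_fc + (theta_air * k_h) / theta_fc

def attenuation_factor(depth_m, recharge_m_per_d, half_life_d, theta_fc, rf):
    """AF = exp(-0.693 * d * RF * theta_fc / (q * t_half)); fraction leached past depth d."""
    return math.exp(-0.693 * depth_m * rf * theta_fc /
                    (recharge_m_per_d * half_life_d))

rf = retardation_factor(theta_fc=0.3, rho_b=1.4, f_oc=0.01, k_oc=50.0,
                        theta_air=0.15, k_h=0.2)
af = attenuation_factor(depth_m=5.0, recharge_m_per_d=0.002,
                        half_life_d=60.0, theta_fc=0.3, rf=rf)
print(f"RF = {rf:.2f}, AF = {af:.3e}")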

  9. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  10. Sensitivity analysis of hybrid thermoelastic techniques

    Treesearch

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  11. Selection and validation of endogenous reference genes for qRT-PCR analysis in leafy spurge (Euphorbia esula)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is the most important tool in measuring levels of gene expression due to its accuracy, specificity, and sensitivity. However, the accuracy of qRT-PCR analysis strongly depends on transcript normalization using stably expressed reference gene...

  12. A Laboratory Exercise Illustrating the Sensitivity and Specificity of Western Blot Analysis

    ERIC Educational Resources Information Center

    Chang, Ming-Mei; Lovett, Janice

    2011-01-01

    Western blot analysis, commonly known as "Western blotting," is a standard tool in every laboratory where proteins are analyzed. It involves the separation of polypeptides in polyacrylamide gels followed by the electrophoretic transfer of the separated polypeptides onto a nitrocellulose or polyvinylidene fluoride membrane. A replica of the…

  13. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  14. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  15. Is the Timed Up and Go test a useful predictor of risk of falls in community dwelling older adults: a systematic review and meta-analysis

    PubMed Central

    2014-01-01

    Background The Timed Up and Go test (TUG) is a commonly used screening tool to assist clinicians to identify patients at risk of falling. The purpose of this systematic review and meta-analysis is to determine the overall predictive value of the TUG in community-dwelling older adults. Methods A literature search was performed to identify all studies that validated the TUG test. The methodological quality of the selected studies was assessed using the QUADAS-2 tool, a validated tool for the quality assessment of diagnostic accuracy studies. A TUG score of ≥13.5 seconds was used to identify individuals at higher risk of falling. All included studies were combined using a bivariate random effects model to generate pooled estimates of sensitivity and specificity at ≥13.5 seconds. Heterogeneity was assessed using the variance of logit transformed sensitivity and specificity. Results Twenty-five studies were included in the systematic review and 10 studies were included in meta-analysis. The TUG test was found to be more useful at ruling in rather than ruling out falls in individuals classified as high risk (>13.5 sec), with a higher pooled specificity (0.74, 95% CI 0.52-0.88) than sensitivity (0.31, 95% CI 0.13-0.57). Logistic regression analysis indicated that the TUG score is not a significant predictor of falls (OR = 1.01, 95% CI 1.00-1.02, p = 0.05). Conclusion The Timed Up and Go test has limited ability to predict falls in community dwelling elderly and should not be used in isolation to identify individuals at high risk of falls in this setting. PMID:24484314
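
    A simplified stand-in for the pooling step: the review fits a bivariate random effects model to sensitivity and specificity jointly, whereas the sketch below pools only logit-transformed sensitivities with a univariate DerSimonian-Laird random-effects estimate, using invented study counts.

# Simplified sketch of pooling logit-transformed sensitivities with a
# DerSimonian-Laird random-effects model; the full analysis uses a bivariate
# model of sensitivity and specificity jointly. Study counts are invented.
import numpy as np

# (true positives, false negatives) per study at the >= 13.5 s cut-off -- invented.
studies = [(12, 20), (8, 15), (5, 9), (20, 51), (7, 18)]

tp = np.array([s[0] for s in studies], dtype=float)
fn = np.array([s[1] for s in studies], dtype=float)
# Continuity correction keeps the logit finite for zero cells.
sens = (tp + 0.5) / (tp + fn + 1.0)
y = np.log(sens / (1.0 - sens))                 # logit sensitivity
v = 1.0 / (tp + 0.5) + 1.0 / (fn + 0.5)         # approximate within-study variance

w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1.0 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
pooled_sens = 1.0 / (1.0 + np.exp(-y_re))
print(f"Pooled sensitivity (random effects): {pooled_sens:.2f}")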

  16. Using microRNA profiling in urine samples to develop a non-invasive test for bladder cancer.

    PubMed

    Mengual, Lourdes; Lozano, Juan José; Ingelmo-Torres, Mercedes; Gazquez, Cristina; Ribal, María José; Alcaraz, Antonio

    2013-12-01

    Current standard methods used to detect and monitor bladder urothelial cell carcinoma (UCC) are invasive or have low sensitivity. The incorporation into clinical practice of a non-invasive tool for UCC assessment would enormously improve patients' quality of life and outcome. This study aimed to examine the microRNA (miRNA) expression profiles in urines of UCC patients in order to develop a non-invasive accurate and reliable tool to diagnose and provide information on the aggressiveness of the tumor. We performed a global miRNA expression profiling analysis of the urinary cells from 40 UCC patients and controls using TaqMan Human MicroRNA Array followed by validation of 22 selected potentially diagnostic and prognostic miRNAs in a separate cohort of 277 samples using a miRCURY LNA qPCR system. miRNA-based signatures were developed by multivariate logistic regression analysis and internally cross-validated. In the initial cohort of patients, we identified 40 and 30 aberrantly expressed miRNA in UCC compared with control urines and in high compared with low grade tumors, respectively. Quantification of 22 key miRNAs in an independent cohort resulted in the identification of a six miRNA diagnostic signature with a sensitivity of 84.8% and specificity of 86.5% (AUC = 0.92) and a two miRNA prognostic model with a sensitivity of 84.95% and a specificity of 74.14% (AUC = 0.83). Internal cross-validation analysis confirmed the accuracy rates of both models, reinforcing the strength of our findings. Although the data needs to be externally validated, miRNA analysis in urine appears to be a valuable tool for the non-invasive assessment of UCC. Copyright © 2013 UICC.
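
    A hedged sketch of how a multi-miRNA diagnostic signature can be built and internally cross-validated with multivariate logistic regression; the expression data below are simulated, not the study's urinary measurements.

# Hedged sketch of building a multi-miRNA diagnostic signature with multivariate
# logistic regression and internal cross-validation of the AUC; the data here are
# simulated, not the urinary miRNA measurements from the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(1)
n_samples, n_mirnas = 277, 6
y = rng.integers(0, 2, n_samples)                               # 1 = UCC, 0 = control
X = rng.normal(size=(n_samples, n_mirnas)) + 0.8 * y[:, None]   # shifted expression

model = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
print(f"Cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")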

  17. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    PubMed

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
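
    The kind of perturbation experiment described above can be sketched outside PeTTSy as well: apply a temporary change to one parameter of an oscillatory ODE model and measure the effect on the solution. The van der Pol oscillator below is a stand-in model, not a PeTTSy example.

# Minimal sketch of a time-limited parameter perturbation of an oscillator and its
# effect on the solution, in the spirit of the toolbox described above. The van der
# Pol model and the bump window are assumptions for illustration only.
import numpy as np
from scipy.integrate import solve_ivp

def vdp(t, y, mu, bump_window, bump_size):
    # Temporary (windowed) perturbation of mu, as in time-limited parameter changes.
    mu_eff = mu + (bump_size if bump_window[0] <= t <= bump_window[1] else 0.0)
    return [y[1], mu_eff * (1 - y[0] ** 2) * y[1] - y[0]]

t_eval = np.linspace(0, 40, 2000)
base = solve_ivp(vdp, (0, 40), [2.0, 0.0], t_eval=t_eval,
                 args=(1.0, (10.0, 12.0), 0.0), rtol=1e-8)
pert = solve_ivp(vdp, (0, 40), [2.0, 0.0], t_eval=t_eval,
                 args=(1.0, (10.0, 12.0), 0.5), rtol=1e-8)

# A crude output sensitivity: largest deviation of the perturbed trajectory.
print("Max deviation after a temporary parameter change:",
      np.max(np.abs(pert.y[0] - base.y[0])).round(3))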

  18. Stochastic sensitivity measure for mistuned high-performance turbines

    NASA Technical Reports Server (NTRS)

    Murthy, Durbha V.; Pierre, Christophe

    1992-01-01

    A stochastic measure of sensitivity is developed in order to predict the effects of small random blade mistuning on the dynamic aeroelastic response of turbomachinery blade assemblies. This sensitivity measure is based solely on the nominal system design (i.e., on tuned system information), which makes it extremely easy and inexpensive to calculate. The measure has the potential to become a valuable design tool that will enable designers to evaluate mistuning effects at a preliminary design stage and thus assess the need for a full mistuned rotor analysis. The predictive capability of the sensitivity measure is illustrated by examining the effects of mistuning on the aeroelastic modes of the first stage of the oxidizer turbopump in the Space Shuttle Main Engine. Results from a full analysis of mistuned systems confirm that the simple stochastic sensitivity measure consistently predicts the drastic changes due to mistuning and the localization of aeroelastic vibration to a few blades.

  19. Discrete sensitivity derivatives of the Navier-Stokes equations with a parallel Krylov solver

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Taylor, Arthur C., III

    1994-01-01

    This paper solves an 'incremental' form of the sensitivity equations derived by differentiating the discretized thin-layer Navier Stokes equations with respect to certain design variables of interest. The equations are solved with a parallel, preconditioned Generalized Minimal RESidual (GMRES) solver on a distributed-memory architecture. The 'serial' sensitivity analysis code is parallelized by using the Single Program Multiple Data (SPMD) programming model, domain decomposition techniques, and message-passing tools. Sensitivity derivatives are computed for low and high Reynolds number flows over a NACA 1406 airfoil on a 32-processor Intel Hypercube, and found to be identical to those computed on a single-processor Cray Y-MP. It is estimated that the parallel sensitivity analysis code has to be run on 40-50 processors of the Intel Hypercube in order to match the single-processor processing time of a Cray Y-MP.

  20. Precision of Sensitivity in the Design Optimization of Indeterminate Structures

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Hopkins, Dale A.

    2006-01-01

    Design sensitivity is central to most optimization methods. The analytical sensitivity expression for an indeterminate structural design optimization problem can be factored into a simple determinate term and a complicated indeterminate component. Sensitivity can be approximated by retaining only the determinate term and setting the indeterminate factor to zero. The optimum solution is reached with the approximate sensitivity. The central processing unit (CPU) time to solution is substantially reduced. The benefit that accrues from using the approximate sensitivity is quantified by solving a set of problems in a controlled environment. Each problem is solved twice: first using the closed-form sensitivity expression, then using the approximation. The problem solutions use the CometBoards testbed as the optimization tool with the integrated force method as the analyzer. The modification that may be required to use the stiffness method as the analysis tool in optimization is discussed. The design optimization problem of an indeterminate structure contains many dependent constraints because of the implicit relationship between stresses, as well as the relationship between the stresses and displacements. The design optimization process can become problematic because the implicit relationship reduces the rank of the sensitivity matrix. The proposed approximation restores the full rank and enhances the robustness of the design optimization method.

  1. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
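
    The three-layer idea can be illustrated with a toy Monte Carlo in which scenario, model and parameter uncertainty are sampled in a nested fashion and the share of output variance attributable to the scenario and model layers is read off via the law of total variance; the scenarios, models and parameter ranges below are invented placeholders, not the reactive transport test case.

# Toy sketch of a three-layer (scenario / model / parameter) uncertainty structure
# and the share of output variance attributable to each layer via the law of total
# variance. The scenarios, models and parameter ranges are invented placeholders.
import numpy as np

rng = np.random.default_rng(2)

scenarios = [0.8, 1.0, 1.3]                       # e.g. alternative recharge scenarios
models = [lambda k, s: s * k,                     # alternative process models
          lambda k, s: s * np.sqrt(np.abs(k)) + 0.1]

def sample_output(n=2000):
    out = np.empty((len(scenarios), len(models), n))
    for i, s in enumerate(scenarios):
        for j, m in enumerate(models):
            k = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # parametric layer
            out[i, j] = m(k, s)
    return out

y = sample_output()
total_var = y.var()
# Variance explained by a layer: variance of that layer's conditional means.
scenario_share = y.mean(axis=(1, 2)).var() / total_var
model_share = y.mean(axis=(0, 2)).var() / total_var
print(f"Scenario share ~ {scenario_share:.2f}, model share ~ {model_share:.2f}")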

  2. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.

  3. Reliability Analysis for AFTI-F16 SRFCS Using ASSIST and SURE

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

    This paper reports the results of a study on reliability analysis of an AFTI-F16 Self-Repairing Flight Control System (SRFCS) using the software tools SURE (Semi-Markov Unreliability Range Evaluator) and ASSIST (Abstract Semi-Markov Specification Interface to the SURE Tool). The purpose of the study is to investigate the potential utility of the software tools in the ongoing effort of the NASA Aviation Safety Program, where the class of systems must be extended beyond the originally intended class of electronic digital processors. The study concludes that SURE and ASSIST are applicable to reliability analysis of flight control systems. They are especially efficient for sensitivity analysis that quantifies the dependence of system reliability on model parameters. The study also confirms an earlier finding on the dominant role of a parameter called failure coverage. The paper will remark on issues related to the improvement of coverage and the optimization of redundancy level.
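
    A generic illustration (not the SURE/ASSIST model) of why the failure-coverage parameter tends to dominate: in a simple duplex system, an uncovered first failure brings the system down immediately, so the failure probability is roughly linear in (1 - coverage).

# Toy sketch of why a coverage parameter dominates: in a simple redundant system,
# an uncovered failure fails the system immediately. This is a generic
# illustration with assumed rates, not the SURE/ASSIST semi-Markov model.
import math

def failure_probability(lmbda=1e-4, coverage=0.99, mission_hours=10.0):
    """Approximate failure probability of a duplex system over a short mission."""
    p_first = 1.0 - math.exp(-2 * lmbda * mission_hours)   # either unit fails
    p_uncovered = p_first * (1.0 - coverage)               # reconfiguration misses it
    p_second = p_first * coverage * (1.0 - math.exp(-lmbda * mission_hours))
    return p_uncovered + p_second

for c in (0.90, 0.99, 0.999):
    print(f"coverage = {c}: P(system failure) ~ {failure_probability(coverage=c):.2e}")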

  4. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provides unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  5. Fall 2014 SEI Research Review Probabilistic Analysis of Time Sensitive Systems

    DTIC Science & Technology

    2014-10-28

    Osmosis is a tool for Statistical Model Checking (SMC) with Semantic Importance Sampling. The input model is written in a subset of C, with ASSERT() statements indicating conditions that must hold, and input probability distributions are defined by the user. Osmosis runs simulations until either a target relative error or a set number of simulations is reached. (Further details: http://dreal.cs.cmu.edu/)

  6. Mapping Seabird Sensitivity to Offshore Wind Farms

    PubMed Central

    Bradbury, Gareth; Trinder, Mark; Furness, Bob; Banks, Alex N.; Caldow, Richard W. G.; Hume, Duncan

    2014-01-01

    We present a Geographic Information System (GIS) tool, SeaMaST (Seabird Mapping and Sensitivity Tool), to provide evidence on the use of sea areas by seabirds and inshore waterbirds in English territorial waters, mapping their relative sensitivity to offshore wind farms. SeaMaST is a freely available evidence source for use by all connected to the offshore wind industry and will assist statutory agencies in assessing potential risks to seabird populations from planned developments. Data were compiled from offshore boat and aerial observer surveys spanning the period 1979–2012. The data were analysed using distance analysis and Density Surface Modelling to produce predicted bird densities across a grid covering English territorial waters at a resolution of 3 km×3 km. Coefficients of Variation were estimated for each grid cell density, as an indication of confidence in predictions. Offshore wind farm sensitivity scores were compiled for seabird species using English territorial waters. The comparative risks to each species of collision with turbines and displacement from operational turbines were reviewed and scored separately, and the scores were multiplied by the bird density estimates to produce relative sensitivity maps. The sensitivity maps reflected well the amassed distributions of the most sensitive species. SeaMaST is an important new tool for assessing potential impacts on seabird populations from offshore development at a time when multiple large areas of development are proposed which overlap with many seabird species’ ranges. It will inform marine spatial planning as well as identifying priority areas of sea usage by marine birds. Example SeaMaST outputs are presented. PMID:25210739

  7. Mapping seabird sensitivity to offshore wind farms.

    PubMed

    Bradbury, Gareth; Trinder, Mark; Furness, Bob; Banks, Alex N; Caldow, Richard W G; Hume, Duncan

    2014-01-01

    We present a Geographic Information System (GIS) tool, SeaMaST (Seabird Mapping and Sensitivity Tool), to provide evidence on the use of sea areas by seabirds and inshore waterbirds in English territorial waters, mapping their relative sensitivity to offshore wind farms. SeaMaST is a freely available evidence source for use by all connected to the offshore wind industry and will assist statutory agencies in assessing potential risks to seabird populations from planned developments. Data were compiled from offshore boat and aerial observer surveys spanning the period 1979-2012. The data were analysed using distance analysis and Density Surface Modelling to produce predicted bird densities across a grid covering English territorial waters at a resolution of 3 km×3 km. Coefficients of Variation were estimated for each grid cell density, as an indication of confidence in predictions. Offshore wind farm sensitivity scores were compiled for seabird species using English territorial waters. The comparative risks to each species of collision with turbines and displacement from operational turbines were reviewed and scored separately, and the scores were multiplied by the bird density estimates to produce relative sensitivity maps. The sensitivity maps reflected well the amassed distributions of the most sensitive species. SeaMaST is an important new tool for assessing potential impacts on seabird populations from offshore development at a time when multiple large areas of development are proposed which overlap with many seabird species' ranges. It will inform marine spatial planning as well as identifying priority areas of sea usage by marine birds. Example SeaMaST outputs are presented.

  8. A systematic review and meta-analysis of quantitative interviewing tools to investigate self-reported HIV and STI associated behaviours in low- and middle-income countries.

    PubMed

    Phillips, Anna E; Gomez, Gabriella B; Boily, Marie-Claude; Garnett, Geoffrey P

    2010-12-01

    Studies identifying risks and evaluating interventions for human immunodeficiency virus (HIV) and other sexually transmitted infections often rely on self-reported measures of sensitive behaviours. Such self-reports can be subject to social desirability bias. Concerns over the accuracy of these measures have prompted efforts to improve the level of privacy and anonymity of the interview setting. This study aims to determine whether such novel tools minimize misreporting of sensitive information. Systematic review and meta-analysis of studies in low- and middle-income countries comparing traditional face-to-face interview (FTFI) with innovative tools for reporting HIV risk behaviour. Crude odds ratios (ORs) and 95% confidence intervals (CIs) were calculated. Cochran's chi-squared test of heterogeneity was performed to explore differences between estimates. Pooled estimates were determined by gender, region, education, setting and question time frame using a random effects model. We found and included 15 data sets in the meta-analysis. Most studies compared audio computer-assisted self interview (ACASI) with FTFI. There was significant heterogeneity across studies for three outcomes of interest: 'ever had sex' (I(2) = 93.4%, P < 0.001), non-condom use (I(2) = 89.3%, P < 0.001), and number of partners (I(2) = 75.3%, P < 0.001). For the fourth outcome, 'forced sex', there was homogenous increased reporting by non-FTFI methods (OR 1.47; 95% CI 1.11-1.94). Overall, non-FTFI methods were not consistently associated with a significant increase in the reporting of all outcomes. However, there was increased reporting associated with non-FTFI with region (Asia), setting (urban), education (>60% had secondary education) and a shorter question time frame. Contrary to expectation, differences between FTFI and non-interviewer-administered interview methods for the reported sensitive behaviour investigated were not uniform. However, we observed trends and variations in the level of reporting according to the outcome, study and population characteristics. FTFI may not always be inferior to innovative interview tools depending on the sensitivity of the question as well as the population assessed.

  9. Applied Meteorology Unit (AMU) Quarterly Report Third Quarter FY-08

    NASA Technical Reports Server (NTRS)

    Bauman, William; Crawford, Winifred; Barrett, Joe; Watson, Leela; Dreher, Joseph

    2008-01-01

    This report summarizes the Applied Meteorology Unit (AMU) activities for the third quarter of Fiscal Year 2008 (April - June 2008). Tasks reported on are: Peak Wind Tool for User Launch Commit Criteria (LCC), Anvil Forecast Tool in AWIPS Phase II, Completion of the Edwards Air Force Base (EAFB) Statistical Guidance Wind Tool, Volume Averaged Height Integrated Radar Reflectivity (VAHIRR), Impact of Local Sensors, Radar Scan Strategies for the PAFB WSR-74C Replacement, VAHIRR Cost Benefit Analysis, and WRF Wind Sensitivity Study at Edwards Air Force Base.

  10. Analysis of the Elodea nuttallii transcriptome in response to mercury and cadmium pollution: development of sensitive tools for rapid ecotoxicological testing.

    PubMed

    Regier, Nicole; Baerlocher, Loïc; Münsterkötter, Martin; Farinelli, Laurent; Cosio, Claudia

    2013-08-06

    Toxic metals polluting aquatic ecosystems are taken up by inhabitants and accumulate in the food web, affecting species at all trophic levels. It is therefore important to have good tools to assess the level of risk represented by toxic metals in the environment. Macrophytes are potential organisms for the identification of metal-responsive biomarkers but are still underrepresented in ecotoxicology. In the present study, we used next-generation sequencing to investigate the transcriptomic response of Elodea nuttallii exposed to enhanced concentrations of Hg and Cd. We de novo assembled more than 60 000 contigs, of which we found 170 to be regulated dose-dependently by Hg and 212 by Cd. Functional analysis showed that these genes were notably related to energy and metal homeostasis. Expression analysis using nCounter of a subset of genes showed that the gene expression pattern was able to assess toxic metal exposure in complex environmental samples and was more sensitive than other end points (e.g., bioaccumulation, photosynthesis, etc.). In conclusion, we demonstrate the feasibility of using gene expression signatures for the assessment of environmental contamination, using an organism without previous genetic information. This is of interest to ecotoxicology in a wider sense given the possibility to develop specific and sensitive bioassays.

  11. Development of a practical approach to expert elicitation for randomised controlled trials with missing health outcomes: Application to the IMPROVE trial.

    PubMed

    Mason, Alexina J; Gomes, Manuel; Grieve, Richard; Ulug, Pinar; Powell, Janet T; Carpenter, James

    2017-08-01

    The analyses of randomised controlled trials with missing data typically assume that, after conditioning on the observed data, the probability of missing data does not depend on the patient's outcome, and so the data are 'missing at random' . This assumption is usually implausible, for example, because patients in relatively poor health may be more likely to drop out. Methodological guidelines recommend that trials require sensitivity analysis, which is best informed by elicited expert opinion, to assess whether conclusions are robust to alternative assumptions about the missing data. A major barrier to implementing these methods in practice is the lack of relevant practical tools for eliciting expert opinion. We develop a new practical tool for eliciting expert opinion and demonstrate its use for randomised controlled trials with missing data. We develop and illustrate our approach for eliciting expert opinion with the IMPROVE trial (ISRCTN 48334791), an ongoing multi-centre randomised controlled trial which compares an emergency endovascular strategy versus open repair for patients with ruptured abdominal aortic aneurysm. In the IMPROVE trial at 3 months post-randomisation, 21% of surviving patients did not complete health-related quality of life questionnaires (assessed by EQ-5D-3L). We address this problem by developing a web-based tool that provides a practical approach for eliciting expert opinion about quality of life differences between patients with missing versus complete data. We show how this expert opinion can define informative priors within a fully Bayesian framework to perform sensitivity analyses that allow the missing data to depend upon unobserved patient characteristics. A total of 26 experts, of 46 asked to participate, completed the elicitation exercise. The elicited quality of life scores were lower on average for the patients with missing versus complete data, but there was considerable uncertainty in these elicited values. The missing at random analysis found that patients randomised to the emergency endovascular strategy versus open repair had higher average (95% credible interval) quality of life scores of 0.062 (-0.005 to 0.130). Our sensitivity analysis that used the elicited expert information as pooled priors found that the gain in average quality of life for the emergency endovascular strategy versus open repair was 0.076 (-0.054 to 0.198). We provide and exemplify a practical tool for eliciting the expert opinion required by recommended approaches to the sensitivity analyses of randomised controlled trials. We show how this approach allows the trial analysis to fully recognise the uncertainty that arises from making alternative, plausible assumptions about the reasons for missing data. This tool can be widely used in the design, analysis and interpretation of future trials, and to facilitate this, materials are available for download.
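
    A hedged sketch of the delta-adjustment style of sensitivity analysis described above: an elicited prior on how much lower the quality-of-life scores of patients with missing data might be is propagated through repeated imputation; all numbers are illustrative, not IMPROVE trial values.

# Hedged sketch of a delta-adjustment sensitivity analysis: an elicited prior on how
# much worse the missing patients' quality-of-life scores are is used to impute and
# propagate uncertainty. All numbers are illustrative, not IMPROVE trial values.
import numpy as np

rng = np.random.default_rng(3)

observed_qol = rng.normal(0.75, 0.2, size=120)   # completers (simulated)
n_missing = 30

def treatment_arm_mean(n_draws=5000, delta_mean=-0.10, delta_sd=0.08):
    """Posterior-style draws of the arm mean under an elicited missing-data offset."""
    means = np.empty(n_draws)
    for d in range(n_draws):
        delta = rng.normal(delta_mean, delta_sd)          # elicited prior on the offset
        imputed = rng.choice(observed_qol, n_missing) + delta
        means[d] = np.concatenate([observed_qol, imputed]).mean()
    return means

draws = treatment_arm_mean()
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"Arm mean under elicited prior: {draws.mean():.3f} (95% interval {lo:.3f} to {hi:.3f})")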

  12. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    PubMed

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.
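
    The windowed, Fourier-based coherence underlying the method can be sketched with synthetic signals: coherence between ABP and ICP is computed per window, and the window length and frequency band are exactly the kind of parameters being optimized; nothing below uses patient data.

# Sketch of the windowed, Fourier-based coherence underlying the selected
# correlation approach: coherence between ABP and ICP is computed per window, and
# window length / overlap are the kind of parameters being tuned. Signals are
# synthetic, not patient data.
import numpy as np
from scipy.signal import coherence

fs = 1.0                                   # 1 Hz monitoring data (assumed)
t = np.arange(0, 3600, 1 / fs)
rng = np.random.default_rng(4)
abp = np.sin(2 * np.pi * 0.01 * t) + 0.3 * rng.standard_normal(t.size)
icp = 0.6 * np.sin(2 * np.pi * 0.01 * t + 0.4) + 0.3 * rng.standard_normal(t.size)

window_len = 600                           # tunable analysis parameter (samples)
for start in range(0, t.size - window_len + 1, window_len):
    f, cxy = coherence(abp[start:start + window_len],
                       icp[start:start + window_len],
                       fs=fs, nperseg=256)
    band = (f > 0.005) & (f < 0.05)        # slow-wave band (illustrative)
    print(f"window starting at {start:5d}s: mean coherence = {cxy[band].mean():.2f}")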

  13. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology

    PubMed Central

    Faltermeier, Rupert; Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to adjust this mathematical tool for high sensitivity and reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. An optimized set of parameters was shown to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses. PMID:26693250

  14. Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structure element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis and sensitivity analysis tools. SASDO is applied to a simple, isolated, 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method with some reduction in the computational cost and without significant modifications to the analysis tools.

  15. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5 ). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
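
    As a hedged illustration of why the choice of risk coefficient dominates such calculations, the sketch below applies a commonly used log-linear health impact function with three alternative coefficients; the functional form and every number are assumptions for illustration, not BenMAP outputs.

      import numpy as np

      population = 1_000_000
      baseline_mortality = 0.008          # deaths per person per year (assumed)
      delta_pm = 2.0                      # ug/m3 reduction in annual PM2.5

      # Alternative risk coefficients per ug/m3 (invented values spanning a range
      # wider than any single embedded default).
      betas = {"low": 0.002, "central": 0.006, "high": 0.015}

      for label, beta in betas.items():
          avoided = population * baseline_mortality * (1 - np.exp(-beta * delta_pm))
          print(f"{label:>7} beta={beta:.3f}: ~{avoided:,.0f} avoided deaths/year")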

  16. A Chain of Modeling Tools For Gas and Aqueous Phase Chemistry

    NASA Astrophysics Data System (ADS)

    Audiffren, N.; Djouad, R.; Sportisse, B.

    Atmospheric chemistry is characterized by the use of a large set of chemical species and reactions. Handling the set of data required to define the model is quite a difficult task. We present in this short article a preprocessor for diphasic models (gas phase and aqueous phase in cloud droplets) named SPACK. The main interest of SPACK is the automatic generation of lumped species related to fast equilibria. We also developed a linear tangent model using the automatic differentiation tool named ODYSSEE in order to perform a sensitivity analysis of an atmospheric multiphase mechanism based on the RADM2 kinetic scheme. Local sensitivity coefficients are computed for two different scenarios. We focus in this study on the sensitivity of the ozone/NOx/HOx system with respect to some aqueous phase reactions and we investigate the influence of the reduction in the photolysis rates in the area below the cloud region.

  17. What Constitutes a "Good" Sensitivity Analysis? Elements and Tools for a Robust Sensitivity Analysis with Reduced Computational Cost

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin; Haghnegahdar, Amin

    2016-04-01

    Global sensitivity analysis (GSA) is a systems theoretic approach to characterizing the overall (average) sensitivity of one or more model responses across the factor space, by attributing the variability of those responses to different controlling (but uncertain) factors (e.g., model parameters, forcings, and boundary and initial conditions). GSA can be very helpful to improve the credibility and utility of Earth and Environmental System Models (EESMs), as these models are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. However, conventional approaches to GSA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we identify several important sensitivity-related characteristics of response surfaces that must be considered when investigating and interpreting the ''global sensitivity'' of a model response (e.g., a metric of model performance) to its parameters/factors. Accordingly, we present a new and general sensitivity and uncertainty analysis framework, Variogram Analysis of Response Surfaces (VARS), based on an analogy to 'variogram analysis', that characterizes a comprehensive spectrum of information on sensitivity. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices are contained within the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
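
    To make the distinction between sensitivity characterizations concrete, the toy sketch below computes a derivative-based, a variogram-like, and a crude variance-based measure for each factor of the Ishigami test function; it is a generic illustration, not the VARS or STAR-VARS algorithm.

      import numpy as np

      def model(x):
          # Ishigami test function, a standard GSA benchmark
          return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
                  + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

      rng = np.random.default_rng(2)
      n, d = 20_000, 3
      x = rng.uniform(-np.pi, np.pi, size=(n, d))
      y = model(x)

      h = 0.1
      for j in range(d):
          xp = x.copy()
          xp[:, j] += h
          yp = model(xp)

          mean_abs_ee = np.mean(np.abs(yp - y) / h)   # derivative-based (Morris-like)
          gamma_h = 0.5 * np.mean((yp - y) ** 2)      # variogram of the response at lag h

          # Crude variance-based index: variance of bin-averaged y given x_j
          bins = np.digitize(x[:, j], np.linspace(-np.pi, np.pi, 21))
          cond_means = np.array([y[bins == b].mean() for b in range(1, 21)])
          s1 = cond_means.var() / y.var()

          print(f"x{j + 1}: mean|EE|={mean_abs_ee:6.3f}  gamma(h)={gamma_h:7.3f}  S1~{s1:5.2f}")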

  18. Sensitive and comprehensive analysis of O-glycosylation in biotherapeutics: a case study of novel erythropoiesis stimulating protein.

    PubMed

    Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo

    2017-09-01

    Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with the drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assess the biotherapeutic quality and establish the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on the N-glycans due to the absence of analytical tools to liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized. Details such as O-glycan structure and O-acetyl-modification site were obtained from tandem MS. This method may be applied to QC and batch analysis of not only rhEPOs but also other biotherapeutics bearing multiple O-glycosylations.

  19. Automated Morphological Analysis of Microglia After Stroke.

    PubMed

    Heindl, Steffanie; Gesierich, Benno; Benakis, Corinne; Llovera, Gemma; Duering, Marco; Liesz, Arthur

    2018-01-01

    Microglia are the resident immune cells of the brain and react quickly to changes in their environment with transcriptional regulation and morphological changes. Brain tissue injury such as ischemic stroke induces a local inflammatory response encompassing microglial activation. The change in activation status of a microglia is reflected in its gradual morphological transformation from a highly ramified into a less ramified or amoeboid cell shape. For this reason, the morphological changes of microglia are widely utilized to quantify microglial activation and to study their involvement in virtually all brain diseases. However, the currently available methods, which are mainly based on manual rating of immunofluorescent microscopic images, are often inaccurate, rater-biased, and highly time-consuming. To address these issues, we created a fully automated image analysis tool, which enables the analysis of microglia morphology from a confocal Z-stack and provides up to 59 morphological features. We developed the algorithm on an exploratory dataset of microglial cells from a stroke mouse model and validated the findings on an independent dataset. In both datasets, we could demonstrate the ability of the algorithm to sensitively discriminate between the microglia morphology in the peri-infarct and the contralateral, unaffected cortex. Dimensionality reduction by principal component analysis allowed us to generate a highly sensitive compound score for microglial shape analysis. Finally, we tested for concordance of results between the novel automated analysis tool and the conventional manual analysis and found a high degree of correlation. In conclusion, our novel method for the fully automated analysis of microglia morphology shows excellent accuracy and time efficiency compared to traditional analysis methods. This tool, which we make openly available, could find application in studying microglia morphology using fluorescence imaging in a wide range of brain disease models.
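
    A minimal sketch of the compound-score idea, assuming a placeholder feature matrix rather than real microglia measurements, standardizes the features and projects them onto the first principal component:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      n_cells, n_features = 200, 59             # e.g. 59 features per segmented cell
      features = rng.normal(size=(n_cells, n_features))

      # Pretend half the cells come from peri-infarct cortex with shifted morphology
      group = np.repeat([0, 1], n_cells // 2)
      features[group == 1, :10] += 1.5          # de-ramified cells differ on some features

      # Standardize, then project onto the first principal component
      z = StandardScaler().fit_transform(features)
      score = PCA(n_components=1).fit_transform(z).ravel()

      print("mean compound score, contralateral:", round(score[group == 0].mean(), 2))
      print("mean compound score, peri-infarct :", round(score[group == 1].mean(), 2))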

  20. Automatic Differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. In this paper, it is assessed as a tool for engineering design. The paper discusses the forward and reverse modes of AD, their computing requirements, and approaches to implementing AD. It continues with the application of two different AD tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation. The paper concludes with the observation that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available.
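
    Forward-mode AD can be illustrated with dual numbers, which carry a derivative alongside each value through the chain rule; the toy response function below is an invented stand-in, not one of the structural analysis problems in the paper.

      class Dual:
          """Value together with a derivative, propagated by the chain rule."""

          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)

          __radd__ = __add__

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              # Product rule carried alongside the value
              return Dual(self.value * other.value,
                          self.deriv * other.value + self.value * other.deriv)

          __rmul__ = __mul__

      def tip_deflection(load, flexibility):
          # Toy response with a mild nonlinearity in the load
          return load * flexibility + load * load * 0.01 * flexibility

      load = Dual(10.0, 1.0)       # seed d/d(load) = 1
      flex = Dual(0.5, 0.0)
      out = tip_deflection(load, flex)
      print("deflection:", out.value, " d(deflection)/d(load):", out.deriv)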

  1. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. Of this list, those with the highest applicability for elemental and/or compound analysis of problems of interest in tribology, while being truly surface sensitive (that is, probing fewer than 10 atomic layers), are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  2. Screening for hearing, visual and dual sensory impairment in older adults using behavioural cues: a validation study.

    PubMed

    Roets-Merken, Lieve M; Zuidema, Sytse U; Vernooij-Dassen, Myrra J F J; Kempen, Gertrudis I J M

    2014-11-01

    This study investigated the psychometric properties of the Severe Dual Sensory Loss screening tool, a tool designed to help nurses and care assistants to identify hearing, visual and dual sensory impairment in older adults. Construct validity of the Severe Dual Sensory Loss screening tool was evaluated using Cronbach's alpha and factor analysis. Interrater reliability was calculated using Kappa statistics. To evaluate the predictive validity, sensitivity and specificity were calculated by comparison with the criterion standard assessment for hearing and vision. The criterion used for hearing impairment was a hearing loss of ≥40 decibels measured by pure-tone audiometry, and the criterion for visual impairment was a visual acuity of ≤0.3 diopter or a visual field of ≤0.3°. Feasibility was evaluated by the time needed to fill in the screening tool and the clarity of the instruction and items. Prevalence of dual sensory impairment was calculated. A total of 56 older adults receiving aged care and 12 of their nurses and care assistants participated in the study. Cronbach's alpha was 0.81 for the hearing subscale and 0.84 for the visual subscale. Factor analysis showed two constructs for hearing and two for vision. Kappa was 0.71 for the hearing subscale and 0.74 for the visual subscale. The predictive validity showed a sensitivity of 0.71 and a specificity of 0.72 for the hearing subscale; and a sensitivity of 0.69 and a specificity of 0.78 for the visual subscale. The optimum cut-off point for each subscale was a score of 1. The nurses and care assistants reported that the Severe Dual Sensory Loss screening tool was easy to use. The prevalence of hearing and vision impairment was 55% and 29%, respectively, and that of dual sensory impairment was 20%. The Severe Dual Sensory Loss screening tool was compared with the criterion standards for hearing and visual impairment and was found to be a valid and reliable tool, enabling nurses and care assistants to identify hearing, visual and dual sensory impairment among older adults. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Elemental Analysis in Biological Matrices Using ICP-MS.

    PubMed

    Hansen, Matthew N; Clogston, Jeffrey D

    2018-01-01

    The increasing exploration of metallic nanoparticles for use as cancer therapeutic agents necessitates a sensitive technique to track the clearance and distribution of the material once introduced into a living system. Inductively coupled plasma mass spectrometry (ICP-MS) provides a sensitive and selective tool for tracking the distribution of metal components from these nanotherapeutics. This chapter presents a standardized method for processing biological matrices, ensuring complete homogenization of tissues, and outlines the preparation of appropriate standards and controls. The method described herein utilized gold nanoparticle-treated samples; however, the method can easily be applied to the analysis of other metals.

  4. Inflammatory Cytokines as Preclinical Markers of Adverse Responses to Chemical Stressors

    EPA Science Inventory

    Abstract: The in vivo cytokine response to chemical stressors is a promising mainstream tool used to assess potential systemic inflammation and immune function changes. Notably, new instrumentation and statistical analysis provide the selectivity and sensitivity to rapidly diff...

  5. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  6. Preliminary clinical results: an analyzing tool for 2D optical imaging in detection of active inflammation in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Adi Aizudin Bin Radin Nasirudin, Radin; Meier, Reinhard; Ahari, Carmen; Sievert, Matti; Fiebich, Martin; Rummeny, Ernst J.; Noël, Peter B.

    2011-03-01

    Optical imaging (OI) is a relatively new method for detecting active inflammation of the hand joints of patients suffering from rheumatoid arthritis (RA). With the high number of people affected by this disease, especially in western countries, the availability of OI as an early diagnostic imaging method is clinically highly relevant. In this paper, we present a newly developed in-house OI analysis tool and a clinical evaluation study. Our analysis tool extends the capability of existing OI tools. We include many features in the tool, such as region-based image analysis, hyperperfusion curve analysis, and multi-modality image fusion to aid clinicians in localizing and determining the intensity of inflammation in joints. Additionally, image data management options, such as the full integration of PACS/RIS, are included. In our clinical study we demonstrate how OI facilitates the detection of active inflammation in rheumatoid arthritis. The preliminary clinical results indicate a sensitivity of 43.5%, a specificity of 80.3%, an accuracy of 65.7%, a positive predictive value of 76.6%, and a negative predictive value of 64.9% in relation to clinical results from MRI. The accuracy of inflammation detection serves as evidence of the potential of OI as a useful imaging modality for early detection of active inflammation in patients with rheumatoid arthritis. With our in-house developed tool we extend the usefulness of OI imaging in the clinical arena. Overall, we show that OI is a fast, inexpensive, non-invasive and nonionizing yet highly sensitive and accurate imaging modality.

  7. Automated Diabetic Retinopathy Screening and Monitoring Using Retinal Fundus Image Analysis.

    PubMed

    Bhaskaranand, Malavika; Ramachandra, Chaithanya; Bhat, Sandeep; Cuadros, Jorge; Nittala, Muneeswar Gupta; Sadda, SriniVas; Solanki, Kaushal

    2016-02-16

    Diabetic retinopathy (DR)-a common complication of diabetes-is the leading cause of vision loss among the working-age population in the western world. DR is largely asymptomatic, but if detected at early stages the progression to vision loss can be significantly slowed. With the increasing diabetic population there is an urgent need for automated DR screening and monitoring. To address this growing need, in this article we discuss an automated DR screening tool and extend it for automated estimation of microaneurysm (MA) turnover, a potential biomarker for DR risk. The DR screening tool automatically analyzes color retinal fundus images from a patient encounter for the various DR pathologies and collates the information from all the images belonging to a patient encounter to generate a patient-level screening recommendation. The MA turnover estimation tool aligns retinal images from multiple encounters of a patient, localizes MAs, and performs MA dynamics analysis to evaluate new, persistent, and disappeared lesion maps and estimate MA turnover rates. The DR screening tool achieves 90% sensitivity at 63.2% specificity on a data set of 40 542 images from 5084 patient encounters obtained from the EyePACS telescreening system. On a subset of 7 longitudinal pairs the MA turnover estimation tool identifies new and disappeared MAs with 100% sensitivity and average false positives of 0.43 and 1.6 respectively. The presented automated tools have the potential to address the growing need for DR screening and monitoring, thereby saving vision of millions of diabetic patients worldwide. © 2016 Diabetes Technology Society.

  8. General methods for sensitivity analysis of equilibrium dynamics in patch occupancy models

    USGS Publications Warehouse

    Miller, David A.W.

    2012-01-01

    Sensitivity analysis is a useful tool for the study of ecological models that has many potential applications for patch occupancy modeling. Drawing from the rich foundation of existing methods for Markov chain models, I demonstrate new methods for sensitivity analysis of the equilibrium state dynamics of occupancy models. Estimates from three previous studies are used to illustrate the utility of the sensitivity calculations: a joint occupancy model for a prey species, its predators, and habitat used by both; occurrence dynamics from a well-known metapopulation study of three butterfly species; and Golden Eagle occupancy and reproductive dynamics. I show how to deal efficiently with multistate models and how to calculate sensitivities involving derived state variables and lower-level parameters. In addition, I extend methods to incorporate environmental variation by allowing for spatial and temporal variability in transition probabilities. The approach used here is concise and general and can fully account for environmental variability in transition parameters. The methods can be used to improve inferences in occupancy studies by quantifying the effects of underlying parameters, aiding prediction of future system states, and identifying priorities for sampling effort.
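
    For the simplest single-species case, equilibrium occupancy and its sensitivities to the colonization and extinction parameters can be written down directly; the sketch below uses illustrative parameter values, not estimates from the cited studies.

      def equilibrium_occupancy(gamma, eps):
          # Stationary occupancy of the two-state (occupied/empty) Markov chain
          return gamma / (gamma + eps)

      gamma, eps = 0.30, 0.10   # colonization and extinction probabilities (illustrative)
      psi = equilibrium_occupancy(gamma, eps)

      # Analytic sensitivities of the equilibrium to the transition parameters
      d_psi_d_gamma = eps / (gamma + eps) ** 2
      d_psi_d_eps = -gamma / (gamma + eps) ** 2

      # Finite-difference check
      h = 1e-6
      fd_gamma = (equilibrium_occupancy(gamma + h, eps) - psi) / h
      fd_eps = (equilibrium_occupancy(gamma, eps + h) - psi) / h

      print(f"psi* = {psi:.3f}")
      print(f"d psi*/d gamma: analytic {d_psi_d_gamma:.3f}, finite difference {fd_gamma:.3f}")
      print(f"d psi*/d eps  : analytic {d_psi_d_eps:.3f}, finite difference {fd_eps:.3f}")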

  9. Adjoint Sensitivity Analysis for Scale-Resolving Turbulent Flow Solvers

    NASA Astrophysics Data System (ADS)

    Blonigan, Patrick; Garai, Anirban; Diosady, Laslo; Murman, Scott

    2017-11-01

    Adjoint-based sensitivity analysis methods are powerful design tools for engineers who use computational fluid dynamics. In recent years, these engineers have started to use scale-resolving simulations like large-eddy simulations (LES) and direct numerical simulations (DNS), which resolve more scales in complex flows with unsteady separation and jets than the widely-used Reynolds-averaged Navier-Stokes (RANS) methods. However, the conventional adjoint method computes large, unusable sensitivities for scale-resolving simulations, which unlike RANS simulations exhibit the chaotic dynamics inherent in turbulent flows. Sensitivity analysis based on least-squares shadowing (LSS) avoids the issues encountered by conventional adjoint methods, but has a high computational cost even for relatively small simulations. This talk discusses a more computationally efficient formulation of LSS, "non-intrusive" LSS, and its application to turbulent flows simulated with a discontinuous-Galerkin spectral-element-method LES/DNS solver. Results are presented for the minimal flow unit, a turbulent channel flow with a limited streamwise and spanwise domain.
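
    The basic adjoint idea, for a steady (non-chaotic) system, is that one extra linear solve gives the sensitivity of an objective to any number of parameters; the sketch below is a generic illustration and is unrelated to LSS or the solver described in the record.

      import numpy as np

      def assemble(p):
          # Toy "stiffness"-like matrix depending on a scalar design parameter p
          return np.array([[2.0 + p, -1.0, 0.0],
                           [-1.0, 2.0 + p, -1.0],
                           [0.0, -1.0, 2.0 + p]])

      p = 0.5
      b = np.array([1.0, 0.0, 1.0])
      g = np.array([0.0, 1.0, 0.0])      # objective J = g.T @ u picks the middle unknown

      A = assemble(p)
      u = np.linalg.solve(A, b)          # forward solve
      lam = np.linalg.solve(A.T, g)      # single adjoint solve

      dA_dp = np.eye(3)                  # here dA/dp happens to be the identity
      dJ_dp_adjoint = -lam @ dA_dp @ u   # dJ/dp = -lambda^T (dA/dp) u  (b independent of p)

      # Finite-difference check
      h = 1e-6
      u_h = np.linalg.solve(assemble(p + h), b)
      dJ_dp_fd = (g @ u_h - g @ u) / h
      print(f"adjoint dJ/dp = {dJ_dp_adjoint:.6f}, finite difference = {dJ_dp_fd:.6f}")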

  10. The Global Modeling and Assimilation Office (GMAO) 4d-Var and its Adjoint-based Tools

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Tremolet, Yannick

    2008-01-01

    The fifth generation of the Goddard Earth Observing System (GEOS-5) Data Assimilation System (DAS) is a 3d-var system that uses the Grid-point Statistical Interpolation (GSI) system developed in collaboration with NCEP, and a general circulation model developed at Goddard, that includes the finite-volume hydrodynamics of GEOS-4 wrapped in the Earth System Modeling Framework and physical packages tuned to provide a reliable hydrological cycle for the integration of the Modern Era Retrospective-analysis for Research and Applications (MERRA). This MERRA system is essentially complete and the next generation GEOS is under intense development. A prototype next generation system is now complete and has been producing preliminary results. This prototype system replaces the GSI-based Incremental Analysis Update procedure with a GSI-based 4d-var which uses the adjoint of the finite-volume hydrodynamics of GEOS-4 together with a vertical diffusing scheme for simplified physics. As part of this development we have kept the GEOS-5 IAU procedure as an option and have added the capability to experiment with a First Guess at the Appropriate Time (FGAT) procedure, thus allowing for at least three modes of running the data assimilation experiments. The prototype system is a large extension of GEOS-5 as it also includes various adjoint-based tools, namely, a forecast sensitivity tool, a singular vector tool, and an observation impact tool, that combines the model sensitivity tool with a GSI-based adjoint tool. These features bring the global data assimilation effort at Goddard up to date with technologies used in data assimilation systems at major meteorological centers elsewhere. Various aspects of the next generation GEOS will be discussed during the presentation at the Workshop, and preliminary results will illustrate the discussion.

  11. SURE reliability analysis: Program and mathematics

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; White, Allan L.

    1988-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  12. Automated subtyping of HIV-1 genetic sequences for clinical and surveillance purposes: performance evaluation of the new REGA version 3 and seven other tools.

    PubMed

    Pineda-Peña, Andrea-Clemencia; Faria, Nuno Rodrigues; Imbrechts, Stijn; Libin, Pieter; Abecasis, Ana Barroso; Deforche, Koen; Gómez-López, Arley; Camacho, Ricardo J; de Oliveira, Tulio; Vandamme, Anne-Mieke

    2013-10-01

    To investigate differences in pathogenesis, diagnosis and resistance pathways between HIV-1 subtypes, an accurate subtyping tool for large datasets is needed. We aimed to evaluate the performance of automated subtyping tools to classify the different subtypes and circulating recombinant forms using pol, the most sequenced region in clinical practice. We also present the upgraded version 3 of the Rega HIV subtyping tool (REGAv3). HIV-1 pol sequences (PR+RT) for 4674 patients retrieved from the Portuguese HIV Drug Resistance Database, and 1872 pol sequences trimmed from full-length genomes retrieved from the Los Alamos database were classified with statistical-based tools such as COMET, jpHMM and STAR; similarity-based tools such as NCBI and Stanford; and phylogenetic-based tools such as REGA version 2 (REGAv2), REGAv3, and SCUEAL. The performance of these tools, for pol, and for PR and RT separately, was compared in terms of reproducibility, sensitivity and specificity with respect to the gold standard which was manual phylogenetic analysis of the pol region. The sensitivity and specificity for subtypes B and C were more than 96% for seven tools, but were variable for other subtypes such as A, D, F and G. With regard to the most common circulating recombinant forms (CRFs), the sensitivity and specificity for CRF01_AE were ~99% with statistical-based tools, with phylogenetic-based tools and with Stanford, one of the similarity-based tools. CRF02_AG was correctly identified in more than 96% of cases by COMET, REGAv3, Stanford and STAR. All the tools reached a specificity of more than 97% for most of the subtypes and the two main CRFs (CRF01_AE and CRF02_AG). Other CRFs were identified only by COMET, REGAv2, REGAv3, and SCUEAL and with variable sensitivity. When analyzing sequences for PR and RT separately, the performance for PR was generally lower and variable between the tools. Similarity and statistical-based tools were 100% reproducible, but this was lower for phylogenetic-based tools such as REGA (~99%) and SCUEAL (~96%). REGAv3 had an improved performance for subtype B and CRF02_AG compared to REGAv2 and is now able to also identify all epidemiologically relevant CRFs. In general, the best-performing tools, in alphabetical order, were COMET, jpHMM, REGAv3, and SCUEAL when analyzing pure subtypes in the pol region, and COMET and REGAv3 when analyzing most of the CRFs. Based on this study, we recommend confirming subtyping with two well-performing tools and being cautious with the interpretation of short sequences. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
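
    The sensitivity/specificity bookkeeping behind such comparisons reduces to per-class confusion-matrix counts against the gold standard; the subtype labels in this sketch are made up for illustration.

      from collections import Counter

      # Hypothetical tool calls versus a manual phylogenetic gold standard
      gold = ["B", "B", "C", "CRF01_AE", "G", "B", "C", "CRF02_AG"]
      tool = ["B", "B", "C", "CRF01_AE", "A", "B", "B", "CRF02_AG"]

      def per_class_metrics(target):
          tp = sum(g == target and t == target for g, t in zip(gold, tool))
          fn = sum(g == target and t != target for g, t in zip(gold, tool))
          fp = sum(g != target and t == target for g, t in zip(gold, tool))
          tn = sum(g != target and t != target for g, t in zip(gold, tool))
          sens = tp / (tp + fn) if tp + fn else float("nan")
          spec = tn / (tn + fp) if tn + fp else float("nan")
          return sens, spec

      for subtype in sorted(Counter(gold)):
          sens, spec = per_class_metrics(subtype)
          print(f"{subtype:>9}: sensitivity {sens:.2f}, specificity {spec:.2f}")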

  13. ADC as a useful diagnostic tool for differentiating benign and malignant vertebral bone marrow lesions and compression fractures: a systematic review and meta-analysis.

    PubMed

    Suh, Chong Hyun; Yun, Seong Jong; Jin, Wook; Lee, Sun Hwa; Park, So Young; Ryu, Chang-Woo

    2018-07-01

    To assess the sensitivity and specificity of quantitative assessment of the apparent diffusion coefficient (ADC) for differentiating benign and malignant vertebral bone marrow lesions (BMLs) and compression fractures (CFs). An electronic literature search of MEDLINE and EMBASE was conducted. Bivariate modelling and hierarchical summary receiver operating characteristic modelling were performed to evaluate the diagnostic performance of ADC for differentiating vertebral BMLs. Subgroup analysis was performed for differentiating benign and malignant vertebral CFs. Meta-regression analyses according to subject, study and diffusion-weighted imaging (DWI) characteristics were performed. Twelve eligible studies (748 lesions, 661 patients) were included. The ADC exhibited a pooled sensitivity of 0.89 (95% confidence interval [CI] 0.80-0.94) and a pooled specificity of 0.87 (95% CI 0.78-0.93) for differentiating benign and malignant vertebral BMLs. In addition, the pooled sensitivity and specificity for differentiating benign and malignant CFs were 0.92 (95% CI 0.82-0.97) and 0.91 (95% CI 0.87-0.94), respectively. In the meta-regression analysis, the DWI slice thickness was a significant factor affecting heterogeneity (p < 0.01); thinner slice thickness (< 5 mm) showed higher specificity (95%) than thicker slice thickness (81%). Quantitative assessment of ADC is a useful diagnostic tool for differentiating benign and malignant vertebral BMLs and CFs. • Quantitative assessment of ADC is useful in differentiating vertebral BMLs. • Quantitative ADC assessment for BMLs had sensitivity of 89%, specificity of 87%. • Quantitative ADC assessment for CFs had sensitivity of 92%, specificity of 91%. • The specificity is highest (95%) with thinner (< 5 mm) DWI slice thickness.

  14. Developing the Thai Siriraj Psoriatic Arthritis Screening Tool and validating the Thai Psoriasis Epidemiology Screening Tool and the Early Arthritis for Psoriatic Patients questionnaire.

    PubMed

    Chiowchanwisawakit, Praveena; Wattanamongkolsil, Luksame; Srinonprasert, Varalak; Petcharat, Chonachan; Siriwanarangsun, Palanan; Katchamart, Wanruchada

    2016-10-01

    To validate the Thai language version of the Psoriasis Epidemiology Screening Tool (PEST) and the Early Arthritis for Psoriatic Patients Questionnaire (EARP), as well as to develop a new tool for screening psoriatic arthritis (PsA) among psoriasis (Ps) patients. This was a cross-sectional study. Ps patients visiting the psoriasis clinic at Siriraj Hospital were recruited. They completed the EARP and PEST. Full musculoskeletal history, examination, and radiography were evaluated. PsA was diagnosed by a rheumatologist's evaluation and fulfillment of the classification criteria for psoriatic arthritis. Receiver operating characteristic (ROC) curves, sensitivity, and specificity were used to evaluate the performances of the tools. The Siriraj Psoriatic Arthritis Screening Tool (SiPAT) contained questions most relevant to peripheral arthritis, axial inflammation, and enthesitis, selected from multivariate analysis. Of a total of 159 patients, the prevalence of PsA was 78.6%. The areas under the ROC curves for the Thai EARP, PEST, and SiPAT were 0.90 (95% CI 0.84, 0.96), 0.85 (0.78, 0.92), and 0.89 (0.83, 0.95), respectively. The sensitivities of SiPAT, Thai EARP, and PEST were 91.0, 83.0, and 72.0%, respectively, while the specificities were 69.0, 79.3, and 89.7%, respectively. All screening questionnaires showed good diagnostic performances. SiPAT could be considered as a screening tool given its desirable properties: higher sensitivity and shorter administration time. Thai PEST and EARP could possibly be sequentially administered for people with a positive test from SiPAT to reduce the number of false positives.

  15. Evaluation of whole genome sequencing and software tools for drug susceptibility testing of Mycobacterium tuberculosis.

    PubMed

    van Beek, J; Haanperä, M; Smit, P W; Mentula, S; Soini, H

    2018-04-11

    Culture-based assays are currently the reference standard for drug susceptibility testing for Mycobacterium tuberculosis. They provide good sensitivity and specificity but are time consuming. The objective of this study was to evaluate whether whole genome sequencing (WGS), combined with software tools for data analysis, can replace routine culture-based assays for drug susceptibility testing of M. tuberculosis. M. tuberculosis cultures sent to the Finnish mycobacterial reference laboratory in 2014 (n = 211) were phenotypically tested by Mycobacteria Growth Indicator Tube (MGIT) for first-line drug susceptibilities. WGS was performed for all isolates using the Illumina MiSeq system, and data were analysed using five software tools (PhyResSE, Mykrobe Predictor, TB Profiler, TGS-TB and KvarQ). Diagnostic time and reagent costs were estimated for both methods. The sensitivity of the five software tools to predict any resistance among strains was almost identical, ranging from 74% to 80%, and specificity was more than 95% for all software tools except for TGS-TB. The sensitivity and specificity to predict resistance to individual drugs varied considerably among the software tools. Reagent costs for MGIT and WGS were €26 and €143 per isolate respectively. Turnaround time for MGIT was 19 days (range 10-50 days) for first-line drugs, and turnaround time for WGS was estimated to be 5 days (range 3-7 days). WGS could be used as a prescreening assay for drug susceptibility testing with confirmation of resistant strains by MGIT. The functionality and ease of use of the software tools need to be improved. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  16. Sharing tools and best practice in Global Sensitivity Analysis within academia and with industry

    NASA Astrophysics Data System (ADS)

    Wagener, T.; Pianosi, F.; Noacco, V.; Sarrazin, F.

    2017-12-01

    We have spent years trying to improve the use of global sensitivity analysis (GSA) in earth and environmental modelling. Our efforts included (1) the development of tools that provide easy access to widely used GSA methods, (2) the definition of workflows so that best practice is shared in an accessible way, and (3) the development of algorithms to close gaps in available GSA methods (such as moment independent strategies) and to make GSA applications more robust (such as convergence criteria). These elements have been combined in our GSA Toolbox, called SAFE (www.safetoolbox.info), which has up to now been adopted by over 1000 (largely) academic users worldwide. However, despite growing uptake in academic circles and across a wide range of application areas, transfer to industry applications has been difficult. Initial market research regarding opportunities and barriers for uptake revealed a large potential market, but also highlighted a significant lack of knowledge regarding state-of-the-art methods and their potential value for end-users. We will present examples and discuss our experience so far in trying to overcome these problems and move beyond academia in distributing GSA tools and expertise.

  17. Adaptation and psychometric properties of the ISPCAN Child Abuse Screening Tool for use in trials (ICAST-Trial) among South African adolescents and their primary caregivers.

    PubMed

    Meinck, Franziska; Boyes, Mark E; Cluver, Lucie; Ward, Catherine L; Schmidt, Peter; DeStone, Sachin; Dunne, Michael P

    2018-05-31

    Child abuse prevention research has been hampered by a lack of validated multi-dimensional non-proprietary instruments, sensitive enough to measure change in abuse victimization or behavior. This study aimed to adapt the ICAST child abuse self-report measure (parent and child) for use in intervention studies and to investigate the psychometric properties of this substantially modified tool in a South African sample. First, cross-cultural and sensitivity adaptation of the original ICAST tools resulted in two preliminary measures (ICAST-Trial adolescents: 27 items, ICAST-Trial caregivers: 19 items). Second, ICAST-Trial data from a cluster randomized trial of a parenting intervention for families with adolescents (N = 1104, 552 caregiver-adolescent dyads) was analyzed. Confirmatory factor analysis established the hypothesized 6-factor (adolescents) and 4-factor (caregivers) structure. Removal of two items for adolescents and five for caregivers resulted in adequate model fit. Concurrent criterion validity analysis confirmed hypothesized relationships between child abuse and adolescent and caregiver mental health, adolescent behavior, discipline techniques and caregiver childhood abuse history. The resulting ICAST-Trial measures have 25 (adolescent) and 14 (caregiver) items respectively and measure physical, emotional and contact sexual abuse, neglect (both versions), and witnessing intimate partner violence and sexual harassment (adolescent version). The study established that both tools are sensitive to measuring change over time in response to a parenting intervention. The ICAST-Trial should have utility for evaluating the effectiveness of child abuse prevention efforts in similar socioeconomic contexts. Further research is needed to replicate these findings and examine cultural appropriateness, barriers for disclosure, and willingness to engage in child abuse research. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.

  18. Mast cell activation test in the diagnosis of allergic disease and anaphylaxis.

    PubMed

    Bahri, Rajia; Custovic, Adnan; Korosec, Peter; Tsoumani, Marina; Barron, Martin; Wu, Jiakai; Sayers, Rebekah; Weimann, Alf; Ruiz-Garcia, Monica; Patel, Nandinee; Robb, Abigail; Shamji, Mohamed H; Fontanella, Sara; Silar, Mira; Mills, E N Clare; Simpson, Angela; Turner, Paul J; Bulfone-Paus, Silvia

    2018-03-05

    Food allergy is an increasing public health issue and the most common cause of life-threatening anaphylactic reactions. Conventional allergy tests assess for the presence of allergen-specific IgE, significantly overestimating the rate of true clinical allergy and resulting in overdiagnosis and an adverse effect on health-related quality of life. To undertake initial validation and assessment of a novel diagnostic tool, we used the mast cell activation test (MAT). Primary human blood-derived mast cells (MCs) were generated from peripheral blood precursors, sensitized with patients' sera, and then incubated with allergen. MC degranulation was assessed by means of flow cytometry and mediator release. We compared the diagnostic performance of the MAT with that of existing diagnostic tools in a cohort of peanut-sensitized subjects undergoing double-blind, placebo-controlled challenge. Human blood-derived MCs sensitized with sera from patients with peanut, grass pollen, and Hymenoptera (wasp venom) allergy demonstrated allergen-specific and dose-dependent degranulation, as determined based on both expression of surface activation markers (CD63 and CD107a) and functional assays (prostaglandin D2 and β-hexosaminidase release). In this cohort of peanut-sensitized subjects, the MAT was found to have superior discrimination performance compared with other testing modalities, including component-resolved diagnostics and basophil activation tests. Using functional principal component analysis, we identified 5 clusters or patterns of reactivity in the resulting dose-response curves, which at preliminary analysis corresponded to the reaction phenotypes seen at challenge. The MAT is a robust tool that can confer superior diagnostic performance compared with existing allergy diagnostics and might be useful to explore differences in effector cell function between basophils and MCs during allergic reactions. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Diagnosing leprosy: revisiting the role of the slit-skin smear with critical analysis of the applicability of polymerase chain reaction in diagnosis.

    PubMed

    Banerjee, Surajita; Biswas, Nibir; Kanti Das, Nilay; Sil, Amrita; Ghosh, Pramit; Hasanoor Raja, Abu Hena; Dasgupta, Sarbani; Kanti Datta, Pijush; Bhattacharya, Basudev

    2011-12-01

    Diagnosing leprosy is challenging, especially in early-stage cases, and the need for a sensitive diagnostic tool is urgent. Polymerase chain reaction (PCR) holds promise as a simple and sensitive diagnostic tool, but its usefulness in the Indian context requires further evaluation. Slit-skin smear (SSS) remains the conventional method of leprosy detection. Hence, this study was undertaken to evaluate and compare the diagnostic efficacy of PCR versus that of SSS. Punch biopsy of skin and SSS were obtained from the active margins of lesions. Cases were clinically grouped according to whether they were multibacillary (MB) or paucibacillary (PB) and classified into tuberculoid (TT), borderline tuberculoid (BT), borderline lepromatous (BL), lepromatous (LL), histoid, and indeterminate groups after clinicopathological correlation. DNA was extracted from biopsy specimens, and multiplex PCR was carried out incorporating primers intended for the amplification of a specific 372-bp fragment of a repetitive sequence of Mycobacterium leprae DNA. Among 164 patients, PCR was positive in 82.3%. The sensitivity of PCR was significantly greater (P < 0.0001) than that of SSS in both the MB (85.9% vs. 59.8%) and PB (75.4% vs. 1.8%) subgroups; the difference in sensitivity in the PB subgroup is remarkable. Positivity by PCR and SSS was found in 100% of LL and histoid leprosy, but PCR had significantly greater (P < 0.0001) positivity in BT leprosy and was of definite increased value in indeterminate and TT leprosy. Polymerase chain reaction had higher sensitivity compared with SSS, especially in diagnostically challenging and PB cases. Thus, the use of this costly but sensitive tool should be restricted to this subgroup, because SSS is sufficiently sensitive in the diagnosis of LL and histoid leprosy. © 2011 The International Society of Dermatology.

  20. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house meshing generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  1. Analysis of significantly mutated genes as a clinical tool for the diagnosis in a case of lung cancer.

    PubMed

    Miyashita, Yoshihiro; Hirotsu, Yosuke; Tsutsui, Toshiharu; Higashi, Seishi; Sogami, Yusuke; Kakizaki, Yumiko; Goto, Taichiro; Amemiya, Kenji; Oyama, Toshio; Omata, Masao

    2017-01-01

    Bronchoendoscopic examination is not necessarily a comfortable procedure and is limited in its sensitivity, depending on the location and size of the tumor lesion. Patients with a non-diagnostic bronchoendoscopic examination often undergo further invasive examinations. A non-invasive diagnostic tool for lung cancer is desired. A 72-year-old man had a 3.0 cm × 2.5 cm mass lesion in segment B1 of the right lung. Cytological examinations of sputum, bronchial washing, and curetted samples were all "negative". We confirmed a diagnosis of lung cancer pathologically after right upper lobe resection, and also obtained concordant results by genomic analysis of the cytologically negative airway samples collected before the operation. Genetic analysis showed that the mutational profiles of the resected specimens and the airway samples were identical. These data clearly indicate that next-generation sequencing (NGS) may yield a diagnostic tool for conducting "precision medicine".

  2. Secure FAST: Security Enhancement in the NATO Time Sensitive Targeting Tool

    DTIC Science & Technology

    2010-11-01

    ...a collaboration tool designed to aid in the tracking and prosecuting of Time Sensitive Targets. The FAST tool provides user-level authentication and authorisation in terms of security. It uses operating-system-level security but does not provide application-level security for...

  3. What are the most effective methods for assessment of nutritional status in outpatients with gastric and colorectal cancer?

    PubMed

    Abe Vicente, Mariana; Barão, Katia; Silva, Tiago Donizetti; Forones, Nora Manoukian

    2013-01-01

    To evaluate methods for the identification of nutrition risk and nutritional status in outpatients with colorectal (CRC) and gastric cancer (GC), and to compare the results to those obtained for patients already treated for these cancers. A cross-sectional study was conducted on 137 patients: group 1 (n = 75) consisting of patients with GC or CRC, and group 2 (n = 62) consisting of patients after treatment of GC or CRC under follow up, who were tumor free for a period longer than 3 months. Nutritional status was assessed in these patients using objective methods [body mass index (BMI), phase angle, serum albumin]; nutritional screening tools [Malnutrition Universal Screening Tool (MUST), Malnutrition Screening Tool (MST), Nutritional Risk Index (NRI)], and subjective assessment [Patient-Generated Subjective Global Assessment (PGSGA)]. The sensitivity and specificity of each method was calculated in relation to the PG-SGA used as gold standard. One hundred thirty seven patients participated in the study. Stage IV cancer patients were more common in group 1. There was no difference in BMI between groups (p = 0.67). Analysis of the association between methods of assessing nutritional status and PG-SGA showed that the nutritional screening tools provided more significant results (p < 0.05) than the objective methods in the two groups. PG-SGA detected the highest proportion of undernourished patients in group 1. The nutritional screening tools MUST, NRI and MST were more sensitive than the objective methods. Phase angle measurement was the most sensitive objective method in group 1. The nutritional screening tools showed the best association with PG-SGA and were also more sensitive than the objective methods. The results suggest the combination of MUST and PG-SGA for patients with cancer before and after treatment. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.

  4. Budget impact analysis of trastuzumab in early breast cancer: a hospital district perspective.

    PubMed

    Purmonen, Timo T; Auvinen, Päivi K; Martikainen, Janne A

    2010-04-01

    Adjuvant trastuzumab is widely used in HER2-positive (HER2+) early breast cancer, and despite its cost-effectiveness, it causes substantial costs for health care. The purpose of the study was to develop a tool for estimating the budget impact of new cancer treatments. With this tool, we were able to estimate the budget impact of adjuvant trastuzumab, as well as the probability of staying within a given budget constraint. The created model-based evaluation tool was used to explore the budget impact of trastuzumab in early breast cancer in a single Finnish hospital district with 250,000 inhabitants. The used model took into account the number of patients, HER2+ prevalence, length and cost of treatment, and the effectiveness of the therapy. Probabilistic sensitivity analysis and alternative case scenarios were performed to ensure the robustness of the results. Introduction of adjuvant trastuzumab caused substantial costs for a relatively small hospital district. In base-case analysis the 4-year net budget impact was 1.3 million euro. The trastuzumab acquisition costs were partially offset by the reduction in costs associated with the treatment of cancer recurrence and metastatic disease. Budget impact analyses provide important information about the overall economic impact of new treatments, and thus offer complementary information to cost-effectiveness analyses. Inclusion of treatment outcomes and probabilistic sensitivity analysis provides more realistic estimates of the net budget impact. The length of trastuzumab treatment has a strong effect on the budget impact.
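
    A minimal sketch of a probabilistic budget-impact calculation of this kind, with invented population, price, and effect-size distributions (not the published model), might look as follows:

      import numpy as np

      rng = np.random.default_rng(4)
      n_sim = 10_000

      new_cases_per_year = rng.poisson(45, n_sim)           # early breast cancer cases/year
      her2_positive = rng.beta(20, 80, n_sim)               # ~20% HER2+ prevalence
      cost_trastuzumab = rng.normal(30_000, 3_000, n_sim)   # per treated patient (EUR)
      recurrences_avoided = rng.beta(5, 45, n_sim)          # absolute risk reduction
      cost_recurrence = rng.normal(60_000, 10_000, n_sim)   # cost of treating a relapse (EUR)

      treated = new_cases_per_year * her2_positive
      net_budget_impact = treated * (cost_trastuzumab
                                     - recurrences_avoided * cost_recurrence)

      budget_cap = 400_000                                  # hypothetical annual constraint (EUR)
      lo, hi = np.percentile(net_budget_impact, [2.5, 97.5])
      print(f"mean annual net impact: {net_budget_impact.mean():,.0f} EUR "
            f"(95% interval {lo:,.0f} to {hi:,.0f})")
      print(f"P(within budget cap): {(net_budget_impact <= budget_cap).mean():.2f}")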

  5. Online and offline tools for head movement compensation in MEG.

    PubMed

    Stolk, Arjen; Todorovic, Ana; Schoffelen, Jan-Mathijs; Oostenveld, Robert

    2013-03-01

    Magnetoencephalography (MEG) is measured above the head, which makes it sensitive to variations of the head position with respect to the sensors. Head movements blur the topography of the neuronal sources of the MEG signal, increase localization errors, and reduce statistical sensitivity. Here we describe two novel and readily applicable methods that compensate for the detrimental effects of head motion on the statistical sensitivity of MEG experiments. First, we introduce an online procedure that continuously monitors head position. Second, we describe an offline analysis method that takes into account the head position time-series. We quantify the performance of these methods in the context of three different experimental settings, involving somatosensory, visual and auditory stimuli, assessing both individual and group-level statistics. The online head localization procedure allowed for optimal repositioning of the subjects over multiple sessions, resulting in a 28% reduction of the variance in dipole position and an improvement of up to 15% in statistical sensitivity. Offline incorporation of the head position time-series into the general linear model resulted in improvements of group-level statistical sensitivity between 15% and 29%. These tools can substantially reduce the influence of head movement within and between sessions, increasing the sensitivity of many cognitive neuroscience experiments. Copyright © 2012 Elsevier Inc. All rights reserved.
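
    The offline idea of absorbing head position into the statistical model can be sketched as a general linear model with the head-position time-series as an extra regressor; the simulated data and single-regressor design below are placeholders, not the published pipeline.

      import numpy as np

      rng = np.random.default_rng(5)
      n_trials = 300
      stim = rng.integers(0, 2, n_trials).astype(float)      # condition regressor
      head_z = np.cumsum(rng.normal(0, 0.2, n_trials))        # slow head drift (mm)

      # Simulated single-sensor amplitude: condition effect plus movement artefact
      amplitude = 2.0 * stim - 0.8 * head_z + rng.normal(0, 1.0, n_trials)

      def condition_effect(design):
          beta, *_ = np.linalg.lstsq(design, amplitude, rcond=None)
          resid = amplitude - design @ beta
          sigma2 = resid @ resid / (n_trials - design.shape[1])
          se = np.sqrt(np.diag(np.linalg.inv(design.T @ design)) * sigma2)
          return beta[1], beta[1] / se[1]                     # effect estimate, t-value

      intercept = np.ones(n_trials)
      naive = condition_effect(np.column_stack([intercept, stim]))
      adjusted = condition_effect(np.column_stack([intercept, stim, head_z]))
      print(f"without head regressor: beta={naive[0]:.2f}, t={naive[1]:.1f}")
      print(f"with head regressor   : beta={adjusted[0]:.2f}, t={adjusted[1]:.1f}")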

  6. SCALE Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations

    DOE PAGES

    Perfetti, Christopher M.; Rearden, Bradley T.; Martin, William R.

    2016-02-25

    Sensitivity coefficients describe the fractional change in a system response that is induced by changes to system parameters and nuclear data. The Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, including quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Tracklength importance CHaracterization (CLUTCH) and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE-KENO framework of the SCALE code system to enable TSUNAMI-3D to perform eigenvalue sensitivity calculations using continuous-energy Monte Carlo methods. This work provides a detailed description of the theory behind the CLUTCH method and describes in detail its implementation. This work explores the improvements in eigenvalue sensitivity coefficient accuracy that can be gained through the use of continuous-energy sensitivity methods and also compares several sensitivity methods in terms of computational efficiency and memory requirements.

  7. Establishing the Capability of a 1D SVAT Modelling Scheme in Predicting Key Biophysical Vegetation Characterisation Parameters

    NASA Astrophysics Data System (ADS)

    Ireland, Gareth; Petropoulos, George P.; Carlson, Toby N.; Purdy, Sarah

    2015-04-01

    Sensitivity analysis (SA) consists of an integral and important validatory check of a computer simulation model before it is used to perform any kind of analysis. In the present work, we present the results from a SA performed on the SimSphere Soil Vegetation Atmosphere Transfer (SVAT) model utilising a cutting edge and robust Global Sensitivity Analysis (GSA) approach, based on the use of the Gaussian Emulation Machine for Sensitivity Analysis (GEM-SA) tool. The sensitivity of the following model outputs was evaluated: the ambient CO2 concentration and the rate of CO2 uptake by the plant, the ambient O3 concentration, the flux of O3 from the air to the plant/soil boundary, and the flux of O3 taken up by the plant alone. The most sensitive model inputs for the majority of model outputs were related to the structural properties of vegetation, namely, the Leaf Area Index, Fractional Vegetation Cover, Cuticle Resistance and Vegetation Height. External CO2 in the leaf and the O3 concentration in the air input parameters also exhibited significant influence on model outputs. This work presents a very important step towards an all-inclusive evaluation of SimSphere. Indeed, results from this study contribute decisively towards establishing its capability as a useful teaching and research tool in modelling Earth's land surface interactions. This is of considerable importance in the light of the rapidly expanding use of this model worldwide, which also includes research conducted by various Space Agencies examining its synergistic use with Earth Observation data towards the development of operational products at a global scale. This research was supported by the European Commission Marie Curie Re-Integration Grant "TRANSFORM-EO". SimSphere is currently maintained and freely distributed by the Department of Geography and Earth Sciences at Aberystwyth University (http://www.aber.ac.uk/simsphere). Keywords: CO2 flux, ambient CO2, O3 flux, SimSphere, Gaussian process emulators, BACCO GEM-SA, TRANSFORM-EO.

  8. A sensitivity analysis of regional and small watershed hydrologic models

    NASA Technical Reports Server (NTRS)

    Ambaruch, R.; Salomonson, V. V.; Simmons, J. W.

    1975-01-01

    Continuous simulation models of the hydrologic behavior of watersheds are important tools in several practical applications such as hydroelectric power planning, navigation, and flood control. Several recent studies have addressed the feasibility of using remote earth observations as sources of input data for hydrologic models. The objective of the study reported here was to determine how accurate remotely sensed measurements must be to provide inputs to hydrologic models of watersheds, within the tolerances needed for acceptably accurate synthesis of streamflow by the models. The study objective was achieved by performing a series of sensitivity analyses using continuous simulation models of three watersheds. The sensitivity analysis showed quantitatively how variations in each of 46 model inputs and parameters affect simulation accuracy with respect to five different performance indices.

  9. Structural optimization: Status and promise

    NASA Astrophysics Data System (ADS)

    Kamat, Manohar P.

    Chapters contained in this book include fundamental concepts of optimum design, mathematical programming methods for constrained optimization, function approximations, approximate reanalysis methods, dual mathematical programming methods for constrained optimization, a generalized optimality criteria method, and a tutorial and survey of multicriteria optimization in engineering. Also included are chapters on the compromise decision support problem and the adaptive linear programming algorithm, sensitivity analyses of discrete and distributed systems, the design sensitivity analysis of nonlinear structures, optimization by decomposition, mixed elements in shape sensitivity analysis of structures based on local criteria, and optimization of stiffened cylindrical shells subjected to destabilizing loads. Other chapters are on applications to fixed-wing aircraft and spacecraft, integrated optimum structural and control design, modeling concurrency in the design of composite structures, and tools for structural optimization. (No individual items are abstracted in this volume)

  10. The diagnostic accuracy of magnetic resonance venography in the detection of deep venous thrombosis: a systematic review and meta-analysis.

    PubMed

    Abdalla, G; Fawzi Matuk, R; Venugopal, V; Verde, F; Magnuson, T H; Schweitzer, M A; Steele, K E

    2015-08-01

    To search the literature for further evidence for the use of magnetic resonance venography (MRV) in the detection of suspected deep vein thrombosis (DVT) and to re-evaluate the accuracy of MRV in this setting. PubMed, EMBASE, Scopus, Cochrane, and Web of Science were searched. Study quality and the risk of bias were evaluated using the QUADAS 2. A random effects meta-analysis including subgroup and sensitivity analyses was performed. The search resulted in 23 observational studies, all from academic centres. Sixteen articles were included in the meta-analysis. The summary estimates for MRV as a diagnostic non-invasive tool revealed a sensitivity of 93% (95% confidence interval [CI]: 89% to 95%) and specificity of 96% (95% CI: 94% to 97%). The heterogeneity of the studies was high. Inconsistency (I2) for sensitivity and specificity was 80.7% and 77.9%, respectively. Further studies investigating the use of MRV in the detection of suspected DVT did not offer further evidence to support the replacement of ultrasound with MRV as the first-line investigation. However, MRV may offer an alternative tool for the detection/diagnosis of DVT in patients for whom ultrasound is inadequate or not feasible (such as obese patients). Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  11. Sensitivity of emergent sociohydrologic dynamics to internal system properties and external sociopolitical factors: Implications for water management

    NASA Astrophysics Data System (ADS)

    Elshafei, Y.; Tonts, M.; Sivapalan, M.; Hipsey, M. R.

    2016-06-01

    It is increasingly acknowledged that effective management of water resources requires a holistic understanding of the coevolving dynamics inherent in the coupled human-hydrology system. One of the fundamental information gaps concerns the sensitivity of coupled system feedbacks to various endogenous system properties and exogenous societal contexts. This paper takes a previously calibrated sociohydrology model and applies an idealized implementation, in order to: (i) explore the sensitivity of emergent dynamics resulting from bidirectional feedbacks to assumptions regarding (a) internal system properties that control the internal dynamics of the coupled system and (b) the external sociopolitical context; and (ii) interpret the results within the context of water resource management decision making. The analysis investigates feedback behavior in three ways, (a) via a global sensitivity analysis on key parameters and assessment of relevant model outputs, (b) through a comparative analysis based on hypothetical placement of the catchment along various points on the international sociopolitical gradient, and (c) by assessing the effects of various direct management intervention scenarios. Results indicate the presence of optimum windows that might offer the greatest positive impact per unit of management effort. Results further advocate management tools that encourage an adaptive learning, community-based approach with respect to water management, which are found to enhance centralized policy measures. This paper demonstrates that it is possible to use a place-based sociohydrology model to make abstractions as to the dynamics of bidirectional feedback behavior, and provide insights as to the efficacy of water management tools under different circumstances.

  12. Diagnosing Chronic Pancreatitis: Comparison and Evaluation of Different Diagnostic Tools.

    PubMed

    Issa, Yama; van Santvoort, Hjalmar C; van Dieren, Susan; Besselink, Marc G; Boermeester, Marja A; Ahmed Ali, Usama

    2017-10-01

    This study aims to compare the M-ANNHEIM, Büchler, and Lüneburg diagnostic tools for chronic pancreatitis (CP). A cross-sectional analysis of the development of CP was performed in a prospectively collected multicenter cohort including 669 patients after a first episode of acute pancreatitis. We compared the individual components of the M-ANNHEIM, Büchler, and Lüneburg tools, the agreement between tools, and estimated diagnostic accuracy using Bayesian latent-class analysis. A total of 669 patients with acute pancreatitis followed up for a median period of 57 (interquartile range, 42-70) months were included. Chronic pancreatitis was diagnosed in 50 patients (7%), 59 patients (9%), and 61 patients (9%) by the M-ANNHEIM, Lüneburg, and Büchler tools, respectively. The overall agreement between these tools was substantial (κ = 0.75). Differences between the tools regarding the following criteria led to significant changes in the total number of diagnoses of CP: abdominal pain, recurrent pancreatitis, moderate to marked ductal lesions, endocrine and exocrine insufficiency, pancreatic calcifications, and pancreatic pseudocysts. The Büchler tool had the highest sensitivity (94%), followed by the M-ANNHEIM (87%), and finally the Lüneburg tool (81%). Differences between diagnostic tools for CP are mainly attributed to presence of clinical symptoms, endocrine insufficiency, and certain morphological complications.
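
    The substantial agreement quoted above (κ = 0.75) is a chance-corrected statistic. A minimal sketch of how such a value is computed for two binary diagnostic tools follows; the diagnoses listed are invented, and agreement among three tools would normally use a multi-rater statistic such as Fleiss' kappa rather than the pairwise Cohen's kappa shown here.

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters/tools (Cohen's kappa)."""
    a, b = np.asarray(a), np.asarray(b)
    p_obs = np.mean(a == b)                               # observed agreement
    p_exp = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical CP diagnoses (1 = chronic pancreatitis) from two scoring tools.
tool_1 = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0]
tool_2 = [1, 0, 0, 1, 0, 0, 0, 0, 1, 1, 0, 0]
print(round(cohens_kappa(tool_1, tool_2), 2))
```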

  13. Multiple heteroatom substitution to graphene nanoribbon

    PubMed Central

    Meyer, Ernst

    2018-01-01

    Substituting heteroatoms into nanostructured graphene elements, such as graphene nanoribbons, offers the possibility for atomic engineering of electronic properties. To characterize these substitutions, functionalized atomic force microscopy (AFM), a technique that directly resolves chemical structures, is one of the most promising tools, yet the chemical analysis of heteroatoms has rarely been performed. We synthesized multiple heteroatom-substituted graphene nanoribbons and showed that AFM can directly resolve elemental differences, which can be correlated with the van der Waals radii as well as with the modulated local electron density caused by the substitution. This element-sensitive measurement is an important step in the analysis of functionalized two-dimensional carbon materials. PMID:29662955

  14. Frontal affinity chromatography: A unique research tool for biospecific interaction that promotes glycobiology

    PubMed Central

    KASAI, Kenichi

    2014-01-01

    Combination of bioaffinity and chromatography gave birth to affinity chromatography. A further combination with frontal analysis resulted in the creation of frontal affinity chromatography (FAC). This new versatile research tool enabled detailed analysis of weak interactions that play essential roles in living systems, especially those between complex saccharides and saccharide-binding proteins. FAC has now become the best method for the investigation of saccharide-binding proteins (lectins) from the viewpoints of sensitivity, accuracy, and efficiency, and is contributing greatly to the development of glycobiology. It opened a door leading to deeper understanding of the significance of saccharide recognition in life. The theory is also concisely described. PMID:25169774

  15. Optimization Issues with Complex Rotorcraft Comprehensive Analysis

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.; Tarzanin, Frank J.; Hirsh, Joel E.; Young, Darrell K.

    1998-01-01

    This paper investigates the use of the general purpose automatic differentiation (AD) tool called Automatic Differentiation of FORTRAN (ADIFOR) as a means of generating sensitivity derivatives for use in Boeing Helicopter's proprietary comprehensive rotor analysis code (VII). ADIFOR transforms an existing computer program into a new program that performs a sensitivity analysis in addition to the original analysis. In this study both the pros (exact derivatives, no step-size problems) and cons (more CPU, more memory) of ADIFOR are discussed. The size (based on the number of lines) of the VII code after ADIFOR processing increased by 70 percent and resulted in substantial computer memory requirements at execution. The ADIFOR derivatives took about 75 percent longer to compute than the finite-difference derivatives. However, the ADIFOR derivatives are exact and are not functions of step-size. The VII sensitivity derivatives generated by ADIFOR are compared with finite-difference derivatives. The ADIFOR and finite-difference derivatives are used in three optimization schemes to solve a low vibration rotor design problem.
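
    ADIFOR performs source transformation of Fortran, which is not reproduced here; the sketch below only illustrates the contrast the abstract draws between exact, step-size-free derivatives from automatic differentiation and step-size-dependent finite differences, using a minimal forward-mode (dual-number) implementation in Python and an invented one-variable response function.

```python
import math

class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def response(x):
    # Invented stand-in for a rotor-analysis output as a function of one design variable.
    if isinstance(x, Dual):
        return x * x * x + 2 * x.sin()
    return x ** 3 + 2 * math.sin(x)

x0 = 1.3
ad_deriv = response(Dual(x0, 1.0)).der                     # AD: exact, no step size
h = 1e-6
fd_deriv = (response(x0 + h) - response(x0 - h)) / (2 * h)  # finite difference: step-size dependent
print(ad_deriv, fd_deriv)                                   # both ~ 3*x0**2 + 2*cos(x0)
```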

  16. A framework for sensitivity analysis of decision trees.

    PubMed

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
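
    A minimal sketch of the kind of probability-perturbation check the framework formalizes is shown below: a two-strategy decision problem whose chance-node probability is shifted pessimistically and optimistically to see whether the expected-value-maximizing strategy is stable. The tree, payoffs, and probabilities are invented, and the paper's robust-optimization machinery is far more general than this.

```python
# Tiny two-strategy decision tree. Payoffs are fixed; the chance-node
# probabilities are only known imprecisely, so we check whether the
# expected-value-maximizing strategy is stable under perturbation.
strategies = {
    "A": {"p_success": 0.70, "payoff_success": 100.0, "payoff_failure": -20.0},
    "B": {"p_success": 0.55, "payoff_success": 140.0, "payoff_failure": -30.0},
}

def expected_value(s, delta=0.0):
    p = min(1.0, max(0.0, s["p_success"] + delta))   # perturbed success probability
    return p * s["payoff_success"] + (1 - p) * s["payoff_failure"]

for delta in (-0.10, -0.05, 0.0, 0.05, 0.10):        # pessimistic ... optimistic shifts
    best = max(strategies, key=lambda k: expected_value(strategies[k], delta))
    evs = {k: round(expected_value(v, delta), 1) for k, v in strategies.items()}
    print(f"shift {delta:+.2f}: best = {best}, EVs = {evs}")
```

    In this toy example the optimal strategy switches under an optimistic shift of the probabilities, which is exactly the kind of instability such a sensitivity analysis is meant to flag.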

  17. Tools for observational gait analysis in patients with stroke: a systematic review.

    PubMed

    Ferrarello, Francesco; Bianchi, Valeria Anna Maria; Baccini, Marco; Rubbieri, Gaia; Mossello, Enrico; Cavallini, Maria Chiara; Marchionni, Niccolò; Di Bari, Mauro

    2013-12-01

    Stroke severely affects walking ability, and assessment of gait kinematics is important in defining diagnosis, planning treatment, and evaluating interventions in stroke rehabilitation. Although observational gait analysis is the most common approach to evaluate gait kinematics, tools useful for this purpose have received little attention in the scientific literature and have not been thoroughly reviewed. The aims of this systematic review were to identify tools proposed to conduct observational gait analysis in adults with a stroke, to summarize evidence concerning their quality, and to assess their implementation in rehabilitation research and clinical practice. An extensive search was performed of original articles reporting on visual/observational tools developed to investigate gait kinematics in adults with a stroke. Two reviewers independently selected studies, extracted data, assessed quality of the included studies, and scored the metric properties and clinical utility of each tool. Rigor in reporting metric properties and dissemination of the tools also was evaluated. Five tools were identified, not all of which had been tested adequately for their metric properties. Evaluation of content validity was partially satisfactory. Reliability was poorly investigated in all but one tool. Concurrent validity and sensitivity to change were shown for 3 and 2 tools, respectively. Overall, adequate levels of quality were rarely reached. The dissemination of the tools was poor. Based on critical appraisal, the Gait Assessment and Intervention Tool shows a good level of quality, and its use in stroke rehabilitation is recommended. Rigorous studies are needed for the other tools in order to establish their usefulness.

  18. Development of a practical approach to expert elicitation for randomised controlled trials with missing health outcomes: Application to the IMPROVE trial

    PubMed Central

    Mason, Alexina J; Gomes, Manuel; Grieve, Richard; Ulug, Pinar; Powell, Janet T; Carpenter, James

    2017-01-01

    Background/aims: The analyses of randomised controlled trials with missing data typically assume that, after conditioning on the observed data, the probability of missing data does not depend on the patient’s outcome, and so the data are ‘missing at random’ . This assumption is usually implausible, for example, because patients in relatively poor health may be more likely to drop out. Methodological guidelines recommend that trials require sensitivity analysis, which is best informed by elicited expert opinion, to assess whether conclusions are robust to alternative assumptions about the missing data. A major barrier to implementing these methods in practice is the lack of relevant practical tools for eliciting expert opinion. We develop a new practical tool for eliciting expert opinion and demonstrate its use for randomised controlled trials with missing data. Methods: We develop and illustrate our approach for eliciting expert opinion with the IMPROVE trial (ISRCTN 48334791), an ongoing multi-centre randomised controlled trial which compares an emergency endovascular strategy versus open repair for patients with ruptured abdominal aortic aneurysm. In the IMPROVE trial at 3 months post-randomisation, 21% of surviving patients did not complete health-related quality of life questionnaires (assessed by EQ-5D-3L). We address this problem by developing a web-based tool that provides a practical approach for eliciting expert opinion about quality of life differences between patients with missing versus complete data. We show how this expert opinion can define informative priors within a fully Bayesian framework to perform sensitivity analyses that allow the missing data to depend upon unobserved patient characteristics. Results: A total of 26 experts, of 46 asked to participate, completed the elicitation exercise. The elicited quality of life scores were lower on average for the patients with missing versus complete data, but there was considerable uncertainty in these elicited values. The missing at random analysis found that patients randomised to the emergency endovascular strategy versus open repair had higher average (95% credible interval) quality of life scores of 0.062 (−0.005 to 0.130). Our sensitivity analysis that used the elicited expert information as pooled priors found that the gain in average quality of life for the emergency endovascular strategy versus open repair was 0.076 (−0.054 to 0.198). Conclusion: We provide and exemplify a practical tool for eliciting the expert opinion required by recommended approaches to the sensitivity analyses of randomised controlled trials. We show how this approach allows the trial analysis to fully recognise the uncertainty that arises from making alternative, plausible assumptions about the reasons for missing data. This tool can be widely used in the design, analysis and interpretation of future trials, and to facilitate this, materials are available for download. PMID:28675302

  19. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
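
    The efficiency argument in the abstract (gradients with respect to all parameters from a single extra solve) can be seen on a toy linear problem. The sketch below is not the CFD adjoint formulation; it assumes a linear state equation A(p)u = b with known parameter derivatives of A and compares the adjoint gradient of a scalar output with finite differences.

```python
import numpy as np

# State equation A(p) u = b, output J = g.u. One adjoint solve A^T lam = g
# gives dJ/dp_k = -lam.(dA/dp_k)u for every parameter with no extra solves.
rng = np.random.default_rng(1)
n, n_params = 5, 3
A0 = rng.random((n, n)) + n * np.eye(n)                 # well-conditioned base matrix
dA = [rng.random((n, n)) for _ in range(n_params)]      # dA/dp_k, assumed known
b, g = rng.random(n), rng.random(n)
p = np.zeros(n_params)

def assemble(p):
    return A0 + sum(pk * dAk for pk, dAk in zip(p, dA))

u = np.linalg.solve(assemble(p), b)                     # one forward solve
lam = np.linalg.solve(assemble(p).T, g)                 # one adjoint solve
adjoint_grad = np.array([-lam @ (dAk @ u) for dAk in dA])

# Finite-difference check (needs one extra forward solve per parameter).
eps = 1e-6
fd_grad = []
for k in range(n_params):
    pk = p.copy()
    pk[k] += eps
    fd_grad.append((g @ np.linalg.solve(assemble(pk), b) - g @ u) / eps)
print(adjoint_grad)
print(np.array(fd_grad))
```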

  20. Noise spectroscopy as an equilibrium analysis tool for highly sensitive electrical biosensing

    NASA Astrophysics Data System (ADS)

    Guo, Qiushi; Kong, Tao; Su, Ruigong; Zhang, Qi; Cheng, Guosheng

    2012-08-01

    We demonstrate an approach for highly sensitive bio-detection based on silicon nanowire field-effect transistors by employing low frequency noise spectroscopy analysis. The inverse of the noise amplitude of the device exhibits an enhanced gate coupling effect in the strong inversion regime when measured in buffer solution compared with air. The approach was further validated by the detection of cardiac troponin I at 0.23 ng/ml in fetal bovine serum, in which a change of 2 orders of magnitude in noise amplitude was characterized. The selectivity of the proposed approach was also assessed by the addition of 10 μg/ml bovine serum albumin solution.

  1. Refractive collimation beam shaper design and sensitivity analysis using a free-form profile construction method.

    PubMed

    Tsai, Chung-Yu

    2017-07-01

    A refractive laser beam shaper comprising two free-form profiles is presented. The profiles are designed using a free-form profile construction method such that each incident ray is directed in a certain user-specified direction or to a particular point on the target surface so as to achieve the required illumination distribution of the output beam. The validity of the proposed design method is demonstrated by means of ZEMAX simulations. The method is mathematically straightforward and easily implemented in computer code. It thus provides a convenient tool for the design and sensitivity analysis of laser beam shapers and similar optical components.

  2. Modified GMDH-NN algorithm and its application for global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Song, Shufang; Wang, Lu

    2017-11-01

    Global sensitivity analysis (GSA) is a very useful tool to evaluate the influence of input variables over their whole distribution range. The Sobol' method is the most commonly used among variance-based methods, which are efficient and popular GSA techniques. High dimensional model representation (HDMR) is a popular way to compute Sobol' indices; however, its drawbacks cannot be ignored. We show that the modified GMDH-NN algorithm can calculate the coefficients of the metamodel efficiently, so this paper combines it with HDMR and proposes the GMDH-HDMR method. The new method shows higher precision and a faster convergence rate. Several numerical and engineering examples are used to confirm its advantages.

  3. Revisiting inconsistency in large pharmacogenomic studies

    PubMed Central

    Safikhani, Zhaleh; Smirnov, Petr; Freeman, Mark; El-Hachem, Nehme; She, Adrian; Rene, Quevedo; Goldenberg, Anna; Birkbak, Nicolai J.; Hatzis, Christos; Shi, Leming; Beck, Andrew H.; Aerts, Hugo J.W.L.; Quackenbush, John; Haibe-Kains, Benjamin

    2017-01-01

    In 2013, we published a comparative analysis of mutation and gene expression profiles and drug sensitivity measurements for 15 drugs characterized in the 471 cancer cell lines screened in the Genomics of Drug Sensitivity in Cancer (GDSC) and Cancer Cell Line Encyclopedia (CCLE). While we found good concordance in gene expression profiles, there was substantial inconsistency in the drug responses reported by the GDSC and CCLE projects. We received extensive feedback on the comparisons that we performed. This feedback, along with the release of new data, prompted us to revisit our initial analysis. We present a new analysis using these expanded data, where we address the most significant suggestions for improvements on our published analysis — that targeted therapies and broad cytotoxic drugs should have been treated differently in assessing consistency, that consistency of both molecular profiles and drug sensitivity measurements should be compared across cell lines, and that the software analysis tools provided should have been easier to run, particularly as the GDSC and CCLE released additional data. Our re-analysis supports our previous finding that gene expression data are significantly more consistent than drug sensitivity measurements. Using new statistics to assess data consistency allowed identification of two broad effect drugs and three targeted drugs with moderate to good consistency in drug sensitivity data between GDSC and CCLE. For three other targeted drugs, there were not enough sensitive cell lines to assess the consistency of the pharmacological profiles. We found evidence of inconsistencies in pharmacological phenotypes for the remaining eight drugs. Overall, our findings suggest that the drug sensitivity data in GDSC and CCLE continue to present challenges for robust biomarker discovery. This re-analysis provides additional support for the argument that experimental standardization and validation of pharmacogenomic response will be necessary to advance the broad use of large pharmacogenomic screens. PMID:28928933

  4. Adjoint-based sensitivity analysis of low-order thermoacoustic networks using a wave-based approach

    NASA Astrophysics Data System (ADS)

    Aguilar, José G.; Magri, Luca; Juniper, Matthew P.

    2017-07-01

    Strict pollutant emission regulations are pushing gas turbine manufacturers to develop devices that operate in lean conditions, with the downside that combustion instabilities are more likely to occur. Methods to predict and control unstable modes inside combustion chambers have been developed in the last decades but, in some cases, they are computationally expensive. Sensitivity analysis aided by adjoint methods provides valuable sensitivity information at a low computational cost. This paper introduces adjoint methods and their application in wave-based low order network models, which are used as industrial tools, to predict and control thermoacoustic oscillations. Two thermoacoustic models of interest are analyzed. First, in the zero Mach number limit, a nonlinear eigenvalue problem is derived, and continuous and discrete adjoint methods are used to obtain the sensitivities of the system to small modifications. Sensitivities to base-state modification and feedback devices are presented. Second, a more general case with non-zero Mach number, a moving flame front and choked outlet, is presented. The influence of the entropy waves on the computed sensitivities is shown.

  5. Sobol' sensitivity analysis of NAPL-contaminated aquifer remediation process based on multiple surrogates

    NASA Astrophysics Data System (ADS)

    Luo, Jiannan; Lu, Wenxi

    2014-06-01

    Sobol' sensitivity analyses based on different surrogates were performed on a trichloroethylene (TCE)-contaminated aquifer to assess the sensitivity of the design variables of remediation duration, surfactant concentration and injection rates at four wells to remediation efficiency. First, the surrogate models of a multi-phase flow simulation model were constructed by applying radial basis function artificial neural network (RBFANN) and Kriging methods, and the two models were then compared. Based on the developed surrogate models, the Sobol' method was used to calculate the sensitivity indices of the design variables which affect the remediation efficiency. The coefficient of determination (R2) and the mean square error (MSE) of these two surrogate models demonstrated that both models had acceptable approximation accuracy; furthermore, the approximation accuracy of the Kriging model was slightly better than that of the RBFANN model. Sobol' sensitivity analysis results demonstrated that the remediation duration was the most important variable influencing remediation efficiency, followed by rates of injection at wells 1 and 3, while rates of injection at wells 2 and 4 and the surfactant concentration had negligible influence on remediation efficiency. In addition, high-order sensitivity indices were all smaller than 0.01, which indicates that interaction effects of these six factors were practically insignificant. The proposed surrogate-based Sobol' sensitivity analysis is an effective tool for calculating sensitivity indices, because it shows the relative contribution of the design variables (individuals and interactions) to the output performance variability with a limited number of runs of a computationally expensive simulation model. The sensitivity analysis results lay a foundation for optimization of the groundwater remediation process.
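
    A minimal, NumPy-only version of the first-order Sobol' estimator is sketched below. It evaluates a cheap analytic stand-in where the study would evaluate the Kriging or RBFANN surrogate of the multiphase flow simulator; the pick-and-freeze (Saltelli-type) estimator itself is the same.

```python
import numpy as np

# First-order Sobol' indices by the pick-and-freeze (Saltelli 2010) estimator,
# run here on a cheap analytic stand-in; in the paper the evaluations would be
# surrogate (Kriging / RBFANN) predictions instead of the full simulator.
def surrogate(x):                                # hypothetical response in [0, 1]^3
    return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.05 * x[:, 2]

rng = np.random.default_rng(42)
n, d = 20000, 3
A, B = rng.random((n, d)), rng.random((n, d))    # two independent sample matrices
fA, fB = surrogate(A), surrogate(B)
var_total = np.concatenate([fA, fB]).var()

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                          # freeze column i from the second matrix
    S_i = np.mean(fB * (surrogate(ABi) - fA)) / var_total
    print(f"S{i+1} ~ {S_i:.2f}")
```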

  6. The SURE Reliability Analysis Program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  7. Discrimination of surface wear on obsidian tools using LSCM and RelA: pilot study results (area-scale analysis of obsidian tool surfaces).

    PubMed

    Stemp, W James; Chung, Steven

    2011-01-01

    This pilot study tests the reliability of laser scanning confocal microscopy (LSCM) to quantitatively measure wear on experimental obsidian tools. To our knowledge, this is the first use of confocal microscopy to study wear on stone flakes made from an amorphous silicate like obsidian. Three-dimensional surface roughness or texture area scans on three obsidian flakes used on different contact materials (hide, shell, wood) were documented using the LSCM to determine whether the worn surfaces could be discriminated using area-scale analysis, specifically relative area (RelA). When coupled with the F-test, this scale-sensitive fractal analysis could not only discriminate the used from unused surfaces on individual tools, but was also capable of discriminating the wear histories of tools used on different contact materials. Results indicate that such discriminations occur at different scales. Confidence levels for the discriminations at different scales were established using the F-test (mean square ratios or MSRs). In instances where discrimination of surface roughness or texture was not possible above the established confidence level based on MSRs, photomicrographs and RelA assisted in hypothesizing why this was so. Copyright © 2011 Wiley Periodicals, Inc.

  8. Emerging spectra of singular correlation matrices under small power-map deformations

    NASA Astrophysics Data System (ADS)

    Vinayak; Schäfer, Rudi; Seligman, Thomas H.

    2013-09-01

    Correlation matrices are a standard tool in the analysis of the time evolution of complex systems in general and financial markets in particular. Yet most analyses assume stationarity of the underlying time series. This tends to be an assumption of varying and often dubious validity. The validity of the assumption improves as shorter time series are used. If many time series are used, this implies an analysis of highly singular correlation matrices. We attack this problem by using the so-called power map, which was introduced to reduce noise. Its nonlinearity breaks the degeneracy of the zero eigenvalues and we analyze the sensitivity of the so-emerging spectra to correlations. This sensitivity will be demonstrated for uncorrelated and correlated Wishart ensembles.

  9. Emerging spectra of singular correlation matrices under small power-map deformations.

    PubMed

    Vinayak; Schäfer, Rudi; Seligman, Thomas H

    2013-09-01

    Correlation matrices are a standard tool in the analysis of the time evolution of complex systems in general and financial markets in particular. Yet most analyses assume stationarity of the underlying time series. This tends to be an assumption of varying and often dubious validity. The validity of the assumption improves as shorter time series are used. If many time series are used, this implies an analysis of highly singular correlation matrices. We attack this problem by using the so-called power map, which was introduced to reduce noise. Its nonlinearity breaks the degeneracy of the zero eigenvalues and we analyze the sensitivity of the so-emerging spectra to correlations. This sensitivity will be demonstrated for uncorrelated and correlated Wishart ensembles.
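
    The construction described in the two records above can be reproduced in a few lines: build a deliberately singular correlation matrix from N standardized time series of length T < N, apply the elementwise power map with a small exponent, and watch the zero eigenvalues split into an emerging spectrum. The sketch below uses an uncorrelated (Wishart-like) ensemble and invented dimensions.

```python
import numpy as np

# Power-map deformation of a highly singular correlation matrix:
# N time series of length T < N give many exact zero eigenvalues, and the
# elementwise map c -> sign(c)|c|^(1+eps) lifts that degeneracy.
rng = np.random.default_rng(3)
N, T, eps = 100, 40, 0.05
data = rng.standard_normal((N, T))                     # uncorrelated case
data = (data - data.mean(1, keepdims=True)) / data.std(1, keepdims=True)
C = data @ data.T / T                                  # singular correlation matrix

C_def = np.sign(C) * np.abs(C) ** (1.0 + eps)          # small power-map deformation

zero_before = np.sum(np.abs(np.linalg.eigvalsh(C)) < 1e-10)
zero_after = np.sum(np.abs(np.linalg.eigvalsh(C_def)) < 1e-10)
print(zero_before, zero_after)                         # exact zeros before, an emerging spectrum after
```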

  10. R-SWAT-FME user's guide

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2012-01-01

    R program language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.

  11. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    NASA Astrophysics Data System (ADS)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    Hydrologic models usually need to be carefully calibrated against observed streamflow at the outlet of a particular drainage area. However, a large number of parameters must be fitted in the model because field measurements of them are unavailable. Therefore, it is difficult to calibrate the model over a large number of potentially uncertain parameters. This becomes even more challenging if the model is for a large watershed with multiple land uses and various geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify the most sensitive model parameters, which affect the calibrated model performance. There are many different calibration and uncertainty analysis algorithms, which can be run with different objective functions. By incorporating sensitive parameters in streamflow simulation, the effect of a suitable algorithm in improving model performance can be demonstrated with Soil and Water Assessment Tool (SWAT) modeling. In this study, SWAT was applied to the San Joaquin Watershed in California, covering 19,704 km2, to calibrate the daily streamflow. Recently, severe water stress has been escalating in this watershed due to intensified climate variability, prolonged drought, and groundwater depletion for agricultural irrigation. Therefore, given the uncertainties inherent in hydrologic modeling, it is important to perform a proper uncertainty analysis when predicting the spatial and temporal variation of hydrologic processes and evaluating the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow. To evaluate the sensitivity of the calibrated parameters, three different optimization algorithms (Sequential Uncertainty Fitting, SUFI-2; Generalized Likelihood Uncertainty Estimation, GLUE; and Parameter Solution, ParaSol) were used with four different objective functions (coefficient of determination, r2; Nash-Sutcliffe efficiency, NSE; percent bias, PBIAS; and Kling-Gupta efficiency, KGE). The preliminary results showed that using the SUFI-2 algorithm with the NSE and KGE objective functions significantly improved the calibration (e.g., r2 and NSE of 0.52 and 0.47, respectively, for the daily streamflow calibration).
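
    The four objective functions named in the abstract (r2, NSE, PBIAS, KGE) have standard closed forms; a small sketch of each, applied to an invented pair of observed and simulated streamflow series, is given below. Sign conventions for PBIAS and the exact KGE variant differ between tools, so this follows one common convention rather than necessarily the one used in the study.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 means no better than the mean."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias; positive values indicate underestimation in this convention."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 100 * np.sum(obs - sim) / np.sum(obs)

def kge(obs, sim):
    """Kling-Gupta efficiency (2009 form) from correlation, variability and bias ratios."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def r2(obs, sim):
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Invented daily streamflow values (m3/s), for illustration only.
obs = np.array([12.0, 15.0, 30.0, 22.0, 18.0, 14.0, 11.0])
sim = np.array([10.0, 16.0, 27.0, 25.0, 17.0, 12.0, 10.0])
print(round(nse(obs, sim), 2), round(pbias(obs, sim), 1),
      round(kge(obs, sim), 2), round(r2(obs, sim), 2))
```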

  12. Statistical Performances of Resistive Active Power Splitter

    NASA Astrophysics Data System (ADS)

    Lalléchère, Sébastien; Ravelo, Blaise; Thakur, Atul

    2016-03-01

    In this paper, the synthesis and sensitivity analysis of an active power splitter (PWS) are proposed. It is based on an active cell composed of a Field Effect Transistor in cascade with shunted resistors at the input and the output (resistive amplifier topology). The PWS uncertainty versus resistance tolerances is assessed using a stochastic method. Furthermore, with the proposed topology, we can easily control the device gain by varying a resistance. This provides a useful tool to analyse the statistical sensitivity of the system in an uncertain environment.

  13. The plant leaf movement analyzer (PALMA): a simple tool for the analysis of periodic cotyledon and leaf movement in Arabidopsis thaliana.

    PubMed

    Wagner, Lucas; Schmal, Christoph; Staiger, Dorothee; Danisman, Selahattin

    2017-01-01

    The analysis of circadian leaf movement rhythms is a simple yet effective method to study effects of treatments or gene mutations on the circadian clock of plants. Currently, leaf movements are analysed using time lapse photography and subsequent bioinformatics analyses of leaf movements. Programs that are used for this purpose either are able to perform one function (i.e. leaf tip detection or rhythm analysis) or their function is limited to specific computational environments. We developed a leaf movement analysis tool, PALMA, that works on the command line and combines image extraction with rhythm analysis using Fast Fourier transformation and non-linear least squares fitting. We validated PALMA in both simulated time series and in experiments using the known short-period mutant sensitivity to red light reduced 1 (srr1-1). We compared PALMA with two established leaf movement analysis tools and found it to perform equally well. Finally, we tested the effect of reduced iron conditions on the leaf movement rhythms of wild type plants. Here, we found that PALMA successfully detected period lengthening under reduced iron conditions. PALMA correctly estimated the period of both simulated and real-life leaf movement experiments. As a platform-independent console program that unites both functions needed for the analysis of circadian leaf movements, it is a valid alternative to existing leaf movement analysis tools.
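
    The two numerical steps PALMA combines, a Fourier transform to locate the dominant period and a non-linear least-squares cosine fit to refine it, can be sketched on a synthetic leaf-position trace as below. The data, sampling interval, and 24.8 h period are invented, and the real tool also performs image extraction of leaf tips, which is not shown.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic leaf-position trace: 5 days sampled every 30 minutes, 24.8 h period plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 120, 0.5)                                 # hours
true_period = 24.8
y = np.cos(2 * np.pi * t / true_period) + 0.2 * rng.standard_normal(t.size)

# Step 1: FFT gives a coarse starting guess for the period (skip the zero frequency).
freqs = np.fft.rfftfreq(t.size, d=0.5)
power = np.abs(np.fft.rfft(y - y.mean())) ** 2
guess_period = 1.0 / freqs[1:][np.argmax(power[1:])]

# Step 2: non-linear least squares refines amplitude, period, phase and offset.
def cosine(t, amp, period, phase, offset):
    return amp * np.cos(2 * np.pi * t / period + phase) + offset

popt, _ = curve_fit(cosine, t, y, p0=[1.0, guess_period, 0.0, 0.0])
print(f"FFT guess ~ {guess_period:.1f} h, fitted period ~ {popt[1]:.2f} h")
```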

  14. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  15. Sensitivity and Nonlinearity of Thermoacoustic Oscillations

    NASA Astrophysics Data System (ADS)

    Juniper, Matthew P.; Sujith, R. I.

    2018-01-01

    Nine decades of rocket engine and gas turbine development have shown that thermoacoustic oscillations are difficult to predict but can usually be eliminated with relatively small ad hoc design changes. These changes can, however, be ruinously expensive to devise. This review explains why linear and nonlinear thermoacoustic behavior is so sensitive to parameters such as operating point, fuel composition, and injector geometry. It shows how nonperiodic behavior arises in experiments and simulations and discusses how fluctuations in thermoacoustic systems with turbulent reacting flow, which are usually filtered or averaged out as noise, can reveal useful information. Finally, it proposes tools to exploit this sensitivity in the future: adjoint-based sensitivity analysis to optimize passive control designs and complex systems theory to warn of impending thermoacoustic oscillations and to identify the most sensitive elements of a thermoacoustic system.

  16. What Do We Mean By Sensitivity Analysis? The Need For A Comprehensive Characterization Of Sensitivity In Earth System Models

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Gupta, H. V.

    2014-12-01

    Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.

  17. Characterization of emission microscopy and liquid crystal thermography in IC fault localization

    NASA Astrophysics Data System (ADS)

    Lau, C. K.; Sim, K. S.

    2013-05-01

    This paper characterizes two fault localization techniques, Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT), using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. Many people mistakenly assume that both techniques are the same and that both detect hot spots in chips failing with shorts or leakage. As a result, analysts tend to use only LCT, since this technique involves a very simple test setup compared to EMMI. The omission of EMMI as the alternative technique in fault localization always leads to incomplete analysis when LCT fails to localize any hot spot on a failing chip. Therefore, this research was established to characterize and compare both techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method, the backside LCT technique, was also proposed as an alternative approach. The research observed that both techniques successfully detected the defect locations resulting from the leakage failures. LCT was observed to be more sensitive than EMMI in the frontside analysis approach. On the other hand, EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect locations and EMMI was more sensitive in detecting non-ESD defect locations. Backside LCT was proven to work as effectively as frontside LCT and is ready to serve as an alternative technique to backside EMMI. The research confirmed that LCT detects heat generation and EMMI detects photon emission (recombination radiation). The analysis results also suggested that the two techniques complement each other in IC fault localization. It is necessary for a failure analyst to use both techniques when one of them produces no result.

  18. [Validation of the Montgomery-Åsberg Depression Rating Scale (MADRS) in Colombia].

    PubMed

    Cano, Juan Fernando; Gomez Restrepo, Carlos; Rondón, Martín

    2016-01-01

    To adapt and to validate the Montgomery-Åsberg Depression Rating Scale (MADRS) in Colombia. Observational study for scale validation. Validity criteria were used to determine the severity cut-off points of the tool. Taking into account sensitivity and specificity values, those cut-off points were contrasted with ICD-10 criteria for depression severity. A factor analysis was performed. The internal consistency was determined with the same sample of patients used for the validity criteria. Inter-rater reliability was assessed by evaluating the 22 records of the patients who consented to a video interview. Sensitivity to change was established through a second application of the scale in 28 subjects after a lapse of 14 to 28 days. The study was performed in Bogotá; the tool was applied in 150 patients suffering from major depressive disorder. The cut-off point for moderate depression was 20 (sensitivity, 98%; specificity, 96%), and the cut-off point for severe depression was 34 (sensitivity, 98%; specificity, 92%). The tool appears to be a unidimensional scale with good internal consistency (α=.9168). The inter-rater reliability evaluation showed the scale to be highly reliable (intraclass correlation coefficient=.9833). The instrument has good sensitivity to change. The Colombian version of the Montgomery-Åsberg Depression Rating Scale has good psychometric properties and can be used in clinical practice and in clinical research in the field of depressive disorder. Copyright © 2015 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
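
    Choosing a severity cut-off of the kind reported above amounts to scanning candidate thresholds and reading off sensitivity and specificity against a reference classification. A minimal sketch with invented MADRS totals and reference labels is shown below; it selects the cut-off maximizing Youden's index, which is one common selection rule, not necessarily the one used in the study.

```python
import numpy as np

def sens_spec(scores, truth, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff' against a reference label."""
    scores = np.asarray(scores)
    truth = np.asarray(truth, dtype=bool)
    pred = scores >= cutoff
    return np.mean(pred[truth]), np.mean(~pred[~truth])

# Hypothetical MADRS totals and a reference classification (1 = at least moderate depression).
scores = [8, 12, 19, 20, 22, 25, 31, 34, 36, 15, 18, 21, 27, 40, 10]
truth  = [0,  0,  0,  1,  1,  1,  1,  1,  1,  0,  0,  1,  1,  1,  0]

best = max(range(10, 41), key=lambda c: sum(sens_spec(scores, truth, c)))
se, sp = sens_spec(scores, truth, best)
print(f"best cut-off {best}: sensitivity {se:.2f}, specificity {sp:.2f}")
```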

  19. The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.

    PubMed

    Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A

    2010-03-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).

  20. The Java Image Science Toolkit (JIST) for Rapid Prototyping and Publishing of Neuroimaging Software

    PubMed Central

    Lucas, Blake C.; Bogovic, John A.; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L.; Pham, Dzung

    2010-01-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162

  1. Space system operations and support cost analysis using Markov chains

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.

    1990-01-01

    This paper evaluates the use of Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
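
    A minimal version of the Markov-chain costing idea is sketched below: transient states for routine turnaround and major repair, an implicit absorbing "retired" state, and the fundamental matrix N = (I - Q)^-1 giving expected state visits, from which expected flights and operations-and-support cost follow. All transition probabilities and costs are invented; a sensitivity analysis of the kind mentioned in the abstract would simply repeat the calculation with perturbed entries.

```python
import numpy as np

# Transient states: 0 = operational turnaround, 1 = major repair.
# Row sums below 1 leave an implicit absorbing "retired" state.
P_Q = np.array([[0.93, 0.04],                            # operational -> {operational, repair}
                [0.70, 0.20]])                           # repair      -> {operational, repair}
cost_per_visit = np.array([2.0, 15.0])                   # $M per turnaround / per major repair

N = np.linalg.inv(np.eye(2) - P_Q)                       # fundamental matrix of the absorbing chain
visits_from_op = N[0]                                    # expected visits, starting operational
print("expected flights before retirement:", round(visits_from_op[0], 1))
print("expected O&S cost ($M):", round(visits_from_op @ cost_per_visit, 1))
```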

  2. Global Sensitivity of Simulated Water Balance Indicators Under Future Climate Change in the Colorado Basin

    NASA Astrophysics Data System (ADS)

    Bennett, Katrina E.; Urrego Blanco, Jorge R.; Jonko, Alexandra; Bohn, Theodore J.; Atchley, Adam L.; Urban, Nathan M.; Middleton, Richard S.

    2018-01-01

    The Colorado River Basin is a fundamentally important river for society, ecology, and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent, and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. We combine global sensitivity analysis with a space-filling Latin Hypercube Sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach. We find that snow-dominated regions are much more sensitive to uncertainties in VIC parameters. Although baseflow and runoff changes respond to parameters used in previous sensitivity studies, we discover new key parameter sensitivities. For instance, changes in runoff and evapotranspiration are sensitive to albedo, while changes in snow water equivalent are sensitive to canopy fraction and Leaf Area Index (LAI) in the VIC model. It is critical for improved modeling to narrow uncertainty in these parameters through improved observations and field studies. This is important because LAI and albedo are anticipated to change under future climate and narrowing uncertainty is paramount to advance our application of models such as VIC for water resource management.
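
    The space-filling design step mentioned above can be sketched with SciPy's quasi-Monte Carlo module: a Latin Hypercube sample over a few parameters, rescaled to their ranges. The parameter names and bounds below are invented stand-ins for VIC parameters such as albedo, canopy fraction, and an LAI multiplier; each row of the design would then drive a model (or emulator) run in a variance-based analysis.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical parameter ranges for a VIC-style sensitivity experiment.
param_bounds = {
    "albedo":          (0.10, 0.30),
    "canopy_fraction": (0.20, 0.90),
    "lai_multiplier":  (0.50, 1.50),
}
names = list(param_bounds)
lower = [param_bounds[n][0] for n in names]
upper = [param_bounds[n][1] for n in names]

sampler = qmc.LatinHypercube(d=len(names), seed=7)
unit_sample = sampler.random(n=200)                      # 200 space-filling points in [0, 1]^3
design = qmc.scale(unit_sample, lower, upper)            # rescale to the parameter ranges

print(names)
print(design[:3])                                        # first three parameter sets to run
```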

  3. Pseudotargeted MS Method for the Sensitive Analysis of Protein Phosphorylation in Protein Complexes.

    PubMed

    Lyu, Jiawen; Wang, Yan; Mao, Jiawei; Yao, Yating; Wang, Shujuan; Zheng, Yong; Ye, Mingliang

    2018-05-15

    In this study, we presented an enrichment-free approach for the sensitive analysis of protein phosphorylation in minute amounts of samples, such as purified protein complexes. This method takes advantage of the high sensitivity of parallel reaction monitoring (PRM). Specifically, low confident phosphopeptides identified from the data-dependent acquisition (DDA) data set were used to build a pseudotargeted list for PRM analysis to allow the identification of additional phosphopeptides with high confidence. The development of this targeted approach is very easy as the same sample and the same LC-system were used for the discovery and the targeted analysis phases. No sample fractionation or enrichment was required for the discovery phase which allowed this method to analyze minute amount of sample. We applied this pseudotargeted MS method to quantitatively examine phosphopeptides in affinity purified endogenous Shc1 protein complexes at four temporal stages of EGF signaling and identified 82 phospho-sites. To our knowledge, this is the highest number of phospho-sites identified from the protein complexes. This pseudotargeted MS method is highly sensitive in the identification of low abundance phosphopeptides and could be a powerful tool to study phosphorylation-regulated assembly of protein complex.

  4. Fragrances and other materials in deodorants: search for potentially sensitizing molecules using combined GC-MS and structure activity relationship (SAR) analysis.

    PubMed

    Rastogi, S C; Lepoittevin, J P; Johansen, J D; Frosch, P J; Menné, T; Bruze, M; Dreier, B; Andersen, K E; White, I R

    1998-12-01

    Deodorants are one of the most frequently-used types of cosmetics and are a source of allergic contact dermatitis. Therefore, a gas chromatography-mass spectrometric analysis of 71 deodorants was performed for identification of fragrance and non-fragrance materials present in marketed deodorants. Furthermore, the sensitizing potential of these molecules was evaluated using structure-activity relationship (SAR) analysis. This was based on the presence of 1 or more chemically reactive site(s) in the chemical structure associated with sensitizing potential. Among the many different substances used to formulate cosmetic products (over 3500), 226 chemicals were identified in a sample of 71 deodorants. Eighty-four molecules were found to contain at least 1 structural alert, and 70 to belong to, or be susceptible to being metabolized into, the chemical group of aldehydes, ketones and alpha,beta-unsaturated aldehydes, ketones or esters. The combination of GC-MS and SAR analysis could be helpful in the selection of substances for supplementary investigations regarding sensitizing properties. Thus, it may be a valuable tool in the management of contact allergy to deodorants and for producing new deodorants with a decreased propensity to cause contact allergy.

  5. Integrating exhaled breath diagnostics by disease-sniffing dogs with instrumental laboratory analysis

    EPA Science Inventory

    Dogs have been studied for many years as a medical diagnostic tool to detect a pre-clinical disease state by sniffing emissions directly from a human or an in vitro biological sample. Some of the studies report high sensitivity and specificity in blinded case-control studies. How...

  6. Sensitivity Analysis of Dispersion Model Results in the NEXUS Health Study Due to Uncertainties in Traffic-Related Emissions Inputs

    EPA Science Inventory

    Dispersion modeling tools have traditionally provided critical information for air quality management decisions, but have been used recently to provide exposure estimates to support health studies. However, these models can be challenging to implement, particularly in near-road s...

  7. Development of a sensitivity and uncertainty analysis tool in R for parametrization of the APEX model

    USDA-ARS?s Scientific Manuscript database

    Hydrologic models are used to simulate the responses of agricultural systems to different inputs and management strategies to identify alternative management practices to cope up with future climate and/or geophysical changes. The Agricultural Policy/Environmental eXtender (APEX) is a model develope...

  8. DIAGNOSTIC STUDY ON FINE PARTICULATE MATTER PREDICTIONS OF CMAQ IN THE SOUTHEASTERN U.S.

    EPA Science Inventory

    In this study, the authors use the process analysis tool embedded in CMAQ to examine major processes that govern the fate of key pollutants, identify the most influential processes that contribute to model errors, and guide the diagnostic and sensitivity studies aimed at improvin...

  9. Sensitivity analysis of dynamic biological systems with time-delays.

    PubMed

    Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang

    2010-10-15

    Mathematical modeling has long been applied to the study and analysis of complex biological systems. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of the model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations, either analytically or by symbolic manipulation, is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We previously proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). Here, the adaptive direct-decoupled algorithm is extended to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid human error in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis of DDE models with less user intervention. Compared with direct-coupled methods in theory, the extended algorithm is efficient, accurate, and easy to use for end users without a programming background to perform dynamic sensitivity analysis on complex biological systems with time-delays.
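    The role automatic differentiation plays here, supplying the Jacobians needed by the sensitivity equations without hand-derived partial derivatives, can be illustrated on a toy right-hand side; the two-state model and parameter values below are invented for illustration and are not the cardiovascular or TNF-α models analysed in the study.

      import jax
      import jax.numpy as jnp

      def rhs(x, p):
          # Toy two-state model: x is the state vector, p the parameter vector.
          return jnp.array([p[0] * x[1] - x[0],
                            -p[1] * x[1] + x[0] ** 2])

      # Forward-mode automatic differentiation gives df/dx and df/dp exactly,
      # which is what the direct method needs to assemble sensitivity equations.
      dfdx = jax.jacfwd(rhs, argnums=0)
      dfdp = jax.jacfwd(rhs, argnums=1)

      x0 = jnp.array([1.0, 0.5])
      p0 = jnp.array([2.0, 0.3])
      print(dfdx(x0, p0))  # 2x2 Jacobian with respect to the states
      print(dfdp(x0, p0))  # 2x2 Jacobian with respect to the parameters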

  10. A closure test for time-specific capture-recapture data

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1999-01-01

    The assumption of demographic closure in the analysis of capture-recapture data under closed-population models is of fundamental importance. Yet, little progress has been made in the development of omnibus tests of the closure assumption. We present a closure test for time-specific data that, in principle, tests the null hypothesis of closed-population model M(t) against the open-population Jolly-Seber model as a specific alternative. This test is chi-square, and can be decomposed into informative components that can be interpreted to determine the nature of closure violations. The test is most sensitive to permanent emigration and least sensitive to temporary emigration, and is of intermediate sensitivity to permanent or temporary immigration. This test is a versatile tool for testing the assumption of demographic closure in the analysis of capture-recapture data.

  11. Systematic review and meta-analysis of the performance of clinical risk assessment instruments for screening for osteoporosis or low bone density

    PubMed Central

    Edwards, D. L.; Saleh, A. A.; Greenspan, S. L.

    2015-01-01

    Summary We performed a systematic review and meta-analysis of the performance of clinical risk assessment instruments for screening for DXA-determined osteoporosis or low bone density. Commonly evaluated risk instruments showed high sensitivity approaching or exceeding 90% at particular thresholds within various populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. Introduction The purpose of the study is to systematically review the performance of clinical risk assessment instruments for screening for dual-energy X-ray absorptiometry (DXA)-determined osteoporosis or low bone density. Methods Systematic review and meta-analysis were performed. Multiple literature sources were searched, and data extracted and analyzed from included references. Results One hundred eight references met inclusion criteria. Studies assessed many instruments in 34 countries, most commonly the Osteoporosis Self-Assessment Tool (OST), the Simple Calculated Osteoporosis Risk Estimation (SCORE) instrument, the Osteoporosis Self-Assessment Tool for Asians (OSTA), the Osteoporosis Risk Assessment Instrument (ORAI), and body weight criteria. Meta-analyses of studies evaluating OST using a cutoff threshold of <1 to identify US postmenopausal women with osteoporosis at the femoral neck provided summary sensitivity and specificity estimates of 89% (95% CI 82–96%) and 41% (95% CI 23–59%), respectively. Meta-analyses of studies evaluating OST using a cutoff threshold of 3 to identify US men with osteoporosis at the femoral neck, total hip, or lumbar spine provided summary sensitivity and specificity estimates of 88% (95% CI 79–97%) and 55% (95% CI 42–68%), respectively. Frequently evaluated instruments each had thresholds and populations for which sensitivity for osteoporosis or low bone mass detection approached or exceeded 90% but always with a trade-off of relatively low specificity. Conclusions Commonly evaluated clinical risk assessment instruments each showed high sensitivity approaching or exceeding 90% for identifying individuals with DXA-determined osteoporosis or low BMD at certain thresholds in different populations but low specificity at thresholds required for high sensitivity. Simpler instruments, such as OST, generally performed as well as or better than more complex instruments. PMID:25644147

  12. Coupled rotor/airframe vibration analysis

    NASA Technical Reports Server (NTRS)

    Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.

    1982-01-01

    A coupled rotor/airframe vibration analysis developed as a design tool for predicting helicopter vibrations and a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representation. The theoretical background, computer program capabilities and limited correlation results are presented in this report. Correlation results using scale model wind tunnel results show that the analysis can adequately predict trends of vibration variations with airspeed and higher harmonic control effects. Predictions of absolute values of vibration levels were found to be very sensitive to modal characteristics and results were not representative of measured values.

  13. PASTA: splice junction identification from RNA-Sequencing data

    PubMed Central

    2013-01-01

    Background Next generation transcriptome sequencing (RNA-Seq) is emerging as a powerful experimental tool for the study of alternative splicing and its regulation, but requires ad-hoc analysis methods and tools. PASTA (Patterned Alignments for Splicing and Transcriptome Analysis) is a splice junction detection algorithm specifically designed for RNA-Seq data, relying on a highly accurate alignment strategy and on a combination of heuristic and statistical methods to identify exon-intron junctions with high accuracy. Results Comparisons against TopHat and other splice junction prediction software on real and simulated datasets show that PASTA exhibits high specificity and sensitivity, especially at lower coverage levels. Moreover, PASTA is highly configurable and flexible, and can therefore be applied in a wide range of analysis scenarios: it is able to handle both single-end and paired-end reads, it does not rely on the presence of canonical splicing signals, and it uses organism-specific regression models to accurately identify junctions. Conclusions PASTA is a highly efficient and sensitive tool to identify splicing junctions from RNA-Seq data. Compared to similar programs, it has the ability to identify a higher number of real splicing junctions, and provides highly annotated output files containing detailed information about their location and characteristics. Accurate junction data in turn facilitates the reconstruction of the splicing isoforms and the analysis of their expression levels, which will be performed by the remaining modules of the PASTA pipeline, still under development. Use of PASTA can therefore enable the large-scale investigation of transcription and alternative splicing. PMID:23557086

  14. On determining important aspects of mathematical models: Application to problems in physics and chemistry

    NASA Technical Reports Server (NTRS)

    Rabitz, Herschel

    1987-01-01

    The use of parametric and functional gradient sensitivity analysis techniques is considered for models described by partial differential equations. By interchanging appropriate dependent and independent variables, questions of inverse sensitivity may be addressed to gain insight into the inversion of observational data for parameter and function identification in mathematical models. It may be argued that the presence of a subset of dominant, strongly coupled dependent variables will result in the overall system sensitivity behavior collapsing into a simple set of scaling and self-similarity relations amongst elements of the entire matrix of sensitivity coefficients. These general tools are generic in nature, but herein their application to problems arising in selected areas of physics and chemistry is presented.

  15. Optical modeling of waveguide coupled TES detectors towards the SAFARI instrument for SPICA

    NASA Astrophysics Data System (ADS)

    Trappe, N.; Bracken, C.; Doherty, S.; Gao, J. R.; Glowacka, D.; Goldie, D.; Griffin, D.; Hijmering, R.; Jackson, B.; Khosropanah, P.; Mauskopf, P.; Morozov, D.; Murphy, A.; O'Sullivan, C.; Ridder, M.; Withington, S.

    2012-09-01

    The next generation of space missions targeting far-infrared wavelengths will require large-format arrays of extremely sensitive detectors. Transition Edge Sensor (TES) array technology is being developed for future far-infrared (FIR) space applications such as the SAFARI instrument for SPICA, where low noise and high sensitivity are required to achieve ambitious science goals. In this paper we describe a modal analysis of multi-moded horn antennas feeding integrating cavities that house TES detectors with superconducting film absorbers. In high-sensitivity TES detector technology the ability to control the electromagnetic and thermo-mechanical environment of the detector is critical. Simulating and understanding the optical behaviour of such detectors at far-IR wavelengths is difficult and requires development of existing analysis tools. The proposed modal approach offers a computationally efficient technique to describe the partially coherent response of the full pixel in terms of optical efficiency and power leakage between pixels. Initial work carried out as part of an ESA technical research project on optical analysis is described, and a prototype SAFARI pixel design is analyzed in which the optical coupling between the incoming field and the pixel (comprising the horn, a cavity with an air gap, and a thin absorber layer) is included in the model to allow a comprehensive optical characterization. The modal approach described is based on the mode-matching technique, where the horn and cavity are described in the traditional way while a technique to include the absorber was developed. Radiation leakage between pixels is also included, making this a powerful analysis tool.

  16. HF Propagation sensitivity study and system performance analysis with the Air Force Coverage Analysis Program (AFCAP)

    NASA Astrophysics Data System (ADS)

    Caton, R. G.; Colman, J. J.; Parris, R. T.; Nickish, L.; Bullock, G.

    2017-12-01

    The Air Force Research Laboratory, in collaboration with NorthWest Research Associates, is developing advanced software capabilities for high-fidelity simulations of high frequency (HF) sky wave propagation and performance analysis of HF systems. Based on the HiCIRF (High-frequency Channel Impulse Response Function) platform [Nickisch et al., doi:10.1029/2011RS004928], the new Air Force Coverage Analysis Program (AFCAP) provides the modular capabilities necessary for a comprehensive sensitivity study of the large number of variables which define simulations of HF propagation modes. In this paper, we report on an initial exercise of AFCAP to analyze the sensitivities of the tool to various environmental and geophysical parameters. Through examination of the channel scattering function and amplitude-range-Doppler output on two-way propagation paths with injected target signals, we compare simulated returns over a range of geophysical conditions as well as varying definitions for environmental noise, meteor clutter, and sea-state models for Bragg backscatter. We also investigate the impacts of including clutter effects due to field-aligned backscatter from small-scale ionization structures at varied levels of severity as defined by the climatological WideBand Model (WBMOD). In the absence of additional user-provided information, AFCAP relies on the International Reference Ionosphere (IRI) model to define the ionospheric state for use in 2D ray tracing algorithms. Because the AFCAP architecture includes the option for insertion of a user-defined gridded ionospheric representation, we compare output from the tool using the IRI and ionospheric definitions from assimilative models such as GPSII (GPS Ionospheric Inversion).

  17. Evaluation of FTIR spectroscopy as diagnostic tool for colorectal cancer using spectral analysis

    NASA Astrophysics Data System (ADS)

    Dong, Liu; Sun, Xuejun; Chao, Zhang; Zhang, Shiyun; Zheng, Jianbao; Gurung, Rajendra; Du, Junkai; Shi, Jingsen; Xu, Yizhuang; Zhang, Yuanfu; Wu, Jinguang

    2014-03-01

    The aim of this study is to confirm FTIR spectroscopy as a diagnostic tool for colorectal cancer. 180 freshly removed colorectal samples were collected from 90 patients for spectral analysis. The ratios of spectral intensity and relative intensity (/I1460) were calculated. Principal component analysis (PCA) and Fisher's discriminant analysis (FDA) were applied to distinguish the malignant from the normal tissues. The FTIR parameters of colorectal cancer and normal tissues could be distinguished owing to differences in the contents or configurations of nucleic acids, proteins, lipids and carbohydrates. Bands related to nitrogen-containing compounds, water, protein and nucleic acid increased significantly in the malignant group. Six parameters were selected as independent factors to construct the discriminant functions. The sensitivity of FTIR in diagnosing colorectal cancer was 96.6% by discriminant analysis. Our study demonstrates that FTIR can be a useful technique for the detection of colorectal cancer and may be applied in clinical colorectal cancer diagnosis.

  18. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  19. The development and testing of a skin tear risk assessment tool.

    PubMed

    Newall, Nelly; Lewin, Gill F; Bulsara, Max K; Carville, Keryln J; Leslie, Gavin D; Roberts, Pam A

    2017-02-01

    The aim of the present study is to develop a reliable and valid skin tear risk assessment tool. The six characteristics identified in a previous case control study as constituting the best risk model for skin tear development were used to construct a risk assessment tool. The ability of the tool to predict skin tear development was then tested in a prospective study. Between August 2012 and September 2013, 1466 tertiary hospital patients were assessed at admission and followed up for 10 days to see if they developed a skin tear. The predictive validity of the tool was assessed using receiver operating characteristic (ROC) analysis. When the tool was found not to have performed as well as hoped, secondary analyses were performed to determine whether a potentially better performing risk model could be identified. The tool was found to have high sensitivity but low specificity and therefore have inadequate predictive validity. Secondary analysis of the combined data from this and the previous case control study identified an alternative better performing risk model. The tool developed and tested in this study was found to have inadequate predictive validity. The predictive validity of an alternative, more parsimonious model now needs to be tested. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  20. Revenue Potential for Inpatient IR Consultation Services: A Financial Model.

    PubMed

    Misono, Alexander S; Mueller, Peter R; Hirsch, Joshua A; Sheridan, Robert M; Siddiqi, Assad U; Liu, Raymond W

    2016-05-01

    Interventional radiology (IR) has historically failed to fully capture the value of evaluation and management services in the inpatient setting. Understanding the financial benefits of a formally incorporated billing discipline may yield meaningful insights for interventional practices. A revenue modeling tool was created deploying standard financial modeling techniques, including sensitivity and scenario analyses. Sensitivity analysis calculates revenue fluctuation related to dynamic adjustment of discrete variables. In scenario analysis, possible future scenarios as well as the revenue potential of different-size clinical practices are modeled. Assuming a hypothetical inpatient IR consultation service with a daily patient census of 35 patients and two new consults per day, the model estimates annual charges of $2.3 million and collected revenue of $390,000. Revenues are most sensitive to provider billing documentation rates and patient volume. A range of realistic scenarios, from cautious to optimistic, results in a range of annual charges of $1.8 million to $2.7 million and a collected revenue range of $241,000 to $601,000. Even a small practice with a daily patient census of 5 and 0.20 new consults per day may expect annual charges of $320,000 and collected revenue of $55,000. A financial revenue modeling tool is a powerful adjunct in understanding the economics of an inpatient IR consultation service. Sensitivity and scenario analyses demonstrate a wide range of revenue potential and uncover levers for financial optimization. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.
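    A minimal sketch of this kind of Monte Carlo revenue model is given below; the census, documentation, charge, and collection distributions are placeholders chosen for illustration, not the article's inputs.

      import numpy as np

      rng = np.random.default_rng(0)
      n_trials = 100_000

      # Placeholder distributions for the main revenue drivers.
      daily_census = rng.normal(35, 5, n_trials).clip(min=0)       # patients seen per day
      documentation_rate = rng.uniform(0.6, 0.95, n_trials)        # share of visits billed
      charge_per_visit = rng.normal(250, 40, n_trials)             # hypothetical charge ($)
      collection_rate = rng.uniform(0.12, 0.25, n_trials)          # collected / charged

      annual_charges = daily_census * documentation_rate * charge_per_visit * 365
      collected = annual_charges * collection_rate

      print(f"median annual charges: ${np.median(annual_charges):,.0f}")
      print(f"collected revenue, 5th-95th percentile: "
            f"${np.percentile(collected, 5):,.0f} - ${np.percentile(collected, 95):,.0f}")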

  1. Identification of Shiga-Toxigenic Escherichia coli outbreak isolates by a novel data analysis tool after matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Christner, Martin; Dressler, Dirk; Andrian, Mark; Reule, Claudia; Petrini, Orlando

    2017-01-01

    The fast and reliable characterization of bacterial and fungal pathogens plays an important role in infectious disease control and tracking of outbreak agents. DNA-based methods are the gold standard for epidemiological investigations, but they are still comparatively expensive and time-consuming. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a fast, reliable and cost-effective technique now routinely used to identify clinically relevant human pathogens. It has been used for subspecies differentiation and typing, but its use for epidemiological tasks, e.g. for outbreak investigations, is often hampered by the complexity of data analysis. We have analysed publicly available MALDI-TOF mass spectra from a large outbreak of Shiga-Toxigenic Escherichia coli in northern Germany using a general-purpose software tool for the analysis of complex biological data. The software was challenged with depauperate spectra and reduced learning group sizes to mimic poor spectrum quality and scarcity of reference spectra at the onset of an outbreak. With high-quality formic acid extraction spectra, the software's built-in classifier accurately identified outbreak-related strains using as few as 10 reference spectra (99.8% sensitivity, 98.0% specificity). Selective variation of processing parameters showed impaired marker peak detection and reduced classification accuracy in samples with high background noise or artificially reduced peak counts. However, the software consistently identified mass signals suitable for a highly reliable marker-peak-based classification approach (100% sensitivity, 99.5% specificity) even from low-quality direct deposition spectra. The study demonstrates that general-purpose data analysis tools can effectively be used for the analysis of bacterial mass spectra.

  2. Assessment of insulin sensitivity by the hyperinsulinemic euglycemic clamp: Comparison with the spectral analysis of photoplethysmography.

    PubMed

    De Souza, Aglecio Luiz; Batista, Gisele Almeida; Alegre, Sarah Monte

    2017-01-01

    We compare spectral analysis of photoplethysmography (PTG) with insulin resistance measured by the hyperinsulinemic euglycemic clamp (HEC) technique. A total of 100 nondiabetic subjects, 43 men and 57 women aged 20-63 years, 30 lean, 42 overweight and 28 obese, were enrolled in the study. These subjects underwent an examination with HEC and an examination with PTG spectral analysis, with calculation of the PTG Total Power (PTG-TP). Receiver-operating characteristic (ROC) curves were constructed to determine the specificity and sensitivity of PTG-TP in the assessment of insulin resistance. There is a moderate correlation between insulin sensitivity (M-value) and PTG-TP (r = -0.64, p < 0.0001). The ROC curves showed that the most relevant cutoff for the whole study group was PTG-TP > 406.2. This cutoff had a sensitivity of 95.7%, a specificity of 84.4%, and an area under the ROC curve (AUC) of 0.929 for identifying insulin resistance. All ROC curve analyses were significant (p < 0.0001). The PTG-TP marker measured from PTG spectral analysis is a useful tool in the screening and follow-up of insulin resistance, especially in large-scale studies. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations related to the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297

  4. Xpert MTB/RIF Assay for Pulmonary Tuberculosis and Rifampicin Resistance in Children: a Meta-Analysis.

    PubMed

    Wang, X W; Pappoe, F; Huang, Y; Cheng, X W; Xu, D F; Wang, H; Xu, Y H

    2015-01-01

    The Xpert MTB/RIF assay has been recommended by WHO to replace conventional microscopy, culture, and drug resistance tests. It simultaneously detects both Mycobacterium tuberculosis infection (TB) and resistance to rifampicin (RIF) within two hours. The objective was to review the available research studies on the accuracy of the Xpert MTB/RIF assay for diagnosing pulmonary TB and RIF resistance in children. A comprehensive search of Pubmed and Embase was performed up to October 28, 2014. We identified published articles estimating the diagnostic accuracy of the Xpert MTB/RIF assay in children with or without HIV, using culture or culture plus clinical TB as the reference standard. The QUADAS-2 tool was used to evaluate the quality of the studies. Summary estimates of sensitivity, specificity, diagnostic odds ratios (DOR), and the area under the summary ROC curve (AUC) were calculated, and meta-analysis was used to establish the overall accuracy. Eleven diagnostic studies with 3801 patients were included in the systematic review. The overall analysis revealed a moderate sensitivity and high specificity of 65% (95% CI: 61 - 69%) and 99% (95% CI: 98 - 99%), respectively, and a pooled diagnostic odds ratio of 164.09 (95% CI: 111.89 - 240.64). The AUC value was 0.94. The pooled sensitivity and specificity for paediatric rifampicin resistance were 94.0% (95% CI: 80.0 - 93.0%) and 99.0% (95% CI: 95.0 - 98.0%), respectively. Hence, the Xpert MTB/RIF assay has good diagnostic performance for paediatric pulmonary tuberculosis and rifampicin resistance. The Xpert MTB/RIF is sensitive and specific for diagnosing paediatric pulmonary TB and is also effective in detecting rifampicin resistance. It can, therefore, be used as an initial diagnostic tool.

  5. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict if including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.

  6. The Clinical Utility of Methicillin Resistant Staphylococcus aureus (MRSA) Nasal Screening to Rule Out MRSA Pneumonia: A Diagnostic Meta-analysis with Antimicrobial Stewardship Implications.

    PubMed

    Parente, Diane M; Cunha, Cheston B; Mylonakis, Eleftherios; Timbrook, Tristan T

    2018-01-11

    Recent literature has highlighted MRSA nasal screening as a possible antimicrobial stewardship program (ASP) tool for avoiding unnecessary empiric MRSA therapy for pneumonia, yet current guidelines recommend MRSA therapy based on risk factors. The objective of this meta-analysis was to evaluate the diagnostic value of MRSA nasal screening in MRSA pneumonia. Pubmed and EMBASE were searched from inception to November 2016 for English studies evaluating MRSA nasal screening and development of MRSA pneumonia. Data analysis was performed using a bivariate random-effects model to estimate pooled sensitivity, specificity, and positive (PPV) and negative (NPV) predictive values. Twenty-two studies comprising 5,163 patients met our inclusion criteria. Pooled sensitivity and specificity of the MRSA nares screen for all MRSA pneumonia types were 70.9% and 90.3%, respectively. With a 10% prevalence of potential MRSA pneumonia, the calculated PPV was 44.8% while the NPV was 96.5%. The pooled sensitivity and specificity for MRSA community-acquired pneumonia (CAP) and healthcare-associated pneumonia (HCAP) were 85% and 92.1%, respectively. For CAP and HCAP, both the PPV and NPV increased, to 56.8% and 98.1%, respectively. In comparison, for MRSA ventilator-associated pneumonia (VAP), the sensitivity, specificity, PPV, and NPV were 40.3%, 93.7%, 35.7%, and 94.8%, respectively. Nares screening for MRSA had a high specificity and NPV for ruling out MRSA pneumonia, particularly in cases of CAP/HCAP. Based on the NPV, MRSA nares screening is a valuable antimicrobial stewardship tool for streamlining empiric antibiotic therapy, especially among patients with pneumonia. © The Author(s) 2018. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
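    The reported predictive values follow from Bayes' rule applied to the pooled sensitivity (Se), specificity (Sp), and the assumed 10% prevalence p; the worked calculation below reproduces the quoted 44.8% and 96.5% (values rounded):

      \mathrm{PPV} = \frac{\mathrm{Se}\,p}{\mathrm{Se}\,p + (1-\mathrm{Sp})(1-p)}
                   = \frac{0.709 \times 0.10}{0.709 \times 0.10 + 0.097 \times 0.90} \approx 0.448

      \mathrm{NPV} = \frac{\mathrm{Sp}\,(1-p)}{\mathrm{Sp}\,(1-p) + (1-\mathrm{Se})\,p}
                   = \frac{0.903 \times 0.90}{0.903 \times 0.90 + 0.291 \times 0.10} \approx 0.965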

  7. [Assessment of the validity and utility of the Beijing questionnaire as a tool to evaluate for obstructive sleep apnea hypopnea syndrome].

    PubMed

    Wang, X T; Gao, L M; Xu, W; Ding, X

    2016-10-20

    Objective: To test the Beijing questionnaire as a means of identifying patients with obstructive sleep apnea hypopnea syndrome (OSAHS). Method: The Beijing questionnaire is designed as an explorative tool consisting of 11 questions for patients with obstructive sleep apnea hypopnea, targeting key symptoms including snoring, apneas, daytime sleepiness, hypertension and overweight. Questionnaires were given to 1336 female participants aged ≥40 years living in communities and to 198 adult male subjects visiting clinics. After factor analysis, reliability checks and internal consistency studies, 59 female and 198 male subjects underwent sleep studies. Correlation analysis was performed between the scores from the Beijing questionnaire and the apnea-hypopnea index from in-laboratory polysomnography. Receiver operating characteristic curves were constructed to determine optimal sensitivity and specificity. Twenty-four male subjects were recorded in the sleep laboratory again after surgery. Result: Factor analysis reduced the 11 questions of the scale to the four common factors as designed: snoring, apneas, other symptoms, and risk factors. Cronbach's α coefficient of the scale reached 0.7. There was an acceptable level of test-retest reliability (r = 0.619, P < 0.01). The apnea-hypopnea indices were significantly correlated with the Beijing questionnaire scores (P < 0.01). For women, a Beijing questionnaire score of 19.5 provided a sensitivity of 74.3% and a specificity of 62.5%. For men, a Beijing questionnaire score of 22.5 provided a sensitivity of 90.9% and a specificity of 54.5%. The postoperative Beijing questionnaire scores changed with the apnea-hypopnea indices. Conclusion: The questionnaire has good validity and reliability and appears to be valid and sensitive to clinical change. Copyright© by the Editorial Department of Journal of Clinical Otorhinolaryngology Head and Neck Surgery.

  8. ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.

    PubMed

    Evans, Stephanie; Alden, Kieran; Cucurull-Sanchez, Lourdes; Larminie, Christopher; Coles, Mark C; Kullberg, Marika C; Timmis, Jon

    2017-02-01

    A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.
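    ASPASIA itself is a Java toolkit; as a rough illustration of the kind of SBML parameter perturbation it automates, the sketch below uses python-libsbml to write out one-at-a-time perturbed copies of a model. The model file name and parameter id are hypothetical.

      import libsbml

      doc = libsbml.readSBMLFromFile("th17_model.xml")   # hypothetical SBML model
      model = doc.getModel()

      # Perturb a single parameter by +/-10% around its calibrated baseline and
      # write each variant out, mimicking a local one-at-a-time sensitivity scan.
      baseline = model.getParameter("k_tbet").getValue()
      for scale in (0.9, 1.0, 1.1):
          model.getParameter("k_tbet").setValue(baseline * scale)
          libsbml.writeSBMLToFile(doc, f"th17_model_k_tbet_{scale:.1f}.xml")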

  9. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    Molecular analysis has three major roles in modern oncopathology--as an aid in the differential diagnosis, in molecular monitoring of diseases, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive for the differential diagnosis and for the molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflects the disease development and predicts the clinical course. We employed molecular analysis for precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIalpha, and TPX2, which are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, we may conclude that quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method broadening our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding and less time-consuming, and it is more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology which provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate the decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.

  10. Societal costs in displaced transverse olecranon fractures: using decision analysis tools to find the most cost-effective strategy between tension band wiring and locked plating.

    PubMed

    Francis, Tittu; Washington, Travis; Srivastava, Karan; Moutzouros, Vasilios; Makhni, Eric C; Hakeos, William

    2017-11-01

    Tension band wiring (TBW) and locked plating are common treatment options for Mayo IIA olecranon fractures. Clinical trials have shown excellent functional outcomes with both techniques. Although TBW implants are significantly less expensive than a locked olecranon plate, TBW often requires an additional operation for implant removal. To choose the most cost-effective treatment strategy, surgeons must understand how implant costs and return to the operating room influence the most cost-effective strategy. This cost-effectiveness analysis explored the optimal treatment strategies using decision analysis tools. An expected-value decision tree was constructed to estimate costs based on the 2 implant choices. Values for critical variables, such as the implant removal rate, were obtained from the literature. A Monte Carlo simulation consisting of 100,000 trials was used to incorporate variability in medical costs and implant removal rates. Sensitivity analysis and strategy tables were used to show how different variables influence the most cost-effective strategy. TBW was the most cost-effective strategy, with a cost savings of approximately $1300, and was the dominant strategy, being the most cost-effective solution in 63% of the Monte Carlo trials. Sensitivity analysis identified implant costs for plate fixation and surgical costs for implant removal as the parameters that most strongly influence the cost-effective strategy. Strategy tables show the most cost-effective solution as 2 parameters vary simultaneously. TBW is the most cost-effective strategy for treating Mayo IIA olecranon fractures despite a higher rate of return to the operating room. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
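    A minimal sketch of the expected-value comparison at the root of such a decision tree is given below; the implant costs, surgical costs, and removal rates are placeholders, not the study's literature-derived values.

      # Expected societal cost of each strategy = upfront cost
      # + probability-weighted cost of a return to the operating room.
      implant_cost = {"TBW": 200.0, "plate": 1200.0}    # hypothetical implant costs ($)
      index_surgery_cost = 4000.0                       # hypothetical index operation cost ($)
      removal_surgery_cost = 3000.0                     # hypothetical removal operation cost ($)
      removal_rate = {"TBW": 0.60, "plate": 0.20}       # hypothetical implant-removal rates

      def expected_cost(strategy):
          return (implant_cost[strategy] + index_surgery_cost
                  + removal_rate[strategy] * removal_surgery_cost)

      for s in ("TBW", "plate"):
          print(f"{s}: expected cost ${expected_cost(s):,.0f}")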

  11. Study on Web-Based Tool for Regional Agriculture Industry Structure Optimization Using Ajax

    NASA Astrophysics Data System (ADS)

    Huang, Xiaodong; Zhu, Yeping

    Given the state of regional agriculture industry structure adjustment information systems and current developments in information technology, this paper takes a web-based regional agriculture industry structure optimization tool as its research target. The paper introduces Ajax technology and related application frameworks to build an auxiliary toolkit for a decision support system aimed at agricultural policy makers and economy researchers. The toolkit includes a "one page"-style component for regional agriculture industry structure optimization, which provides a flexible argument-setting method supporting sensitivity analysis and the use of data and comparative-advantage analysis results, and a component that solves the linear programming model and its dual problem by the simplex method.
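    To illustrate the core of the optimization component, the sketch below solves a small primal linear program and its dual with SciPy (a library solver rather than a hand-written simplex routine); the crop-allocation numbers are invented for illustration and do not come from the paper.

      import numpy as np
      from scipy.optimize import linprog

      # Primal: maximize profit c'x subject to resource constraints A x <= b, x >= 0.
      c = np.array([300.0, 500.0])       # profit per hectare of two crops (illustrative)
      A = np.array([[1.0, 1.0],          # land (ha)
                    [2.0, 4.0]])         # labour (person-days per ha)
      b = np.array([100.0, 320.0])

      primal = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2, method="highs")

      # Dual: minimize b'y subject to A'y >= c, y >= 0 (shadow prices of land and labour).
      dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=[(0, None)] * 2, method="highs")

      print("primal allocation:", primal.x, "profit:", -primal.fun)
      print("dual shadow prices:", dual.x, "objective:", dual.fun)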

  12. Memory Circuit Fault Simulator

    NASA Technical Reports Server (NTRS)

    Sheldon, Douglas J.; McClure, Tucker

    2013-01-01

    Spacecraft are known to experience significant memory part-related failures and problems, both pre- and post-launch. These memory parts include both static and dynamic memories (SRAM and DRAM). The failures manifest themselves in a variety of ways, such as pattern-sensitive failures, timing-sensitive failures, etc. Because of the mission-critical role memory devices play in spacecraft architecture and operation, understanding their failure modes is vital to successful mission operation. To support this need, a generic simulation tool that can model different data patterns in conjunction with variable write and read conditions was developed. This tool is a mathematical and graphical way to embed pattern, electrical, and physical information to perform what-if analysis as part of a root cause failure analysis effort.

  13. Diagnostic Performance of CT for Diagnosis of Fat-Poor Angiomyolipoma in Patients With Renal Masses: A Systematic Review and Meta-Analysis.

    PubMed

    Woo, Sungmin; Suh, Chong Hyun; Cho, Jeong Yeon; Kim, Sang Youn; Kim, Seung Hyup

    2017-11-01

    The purpose of this article is to systematically review and perform a meta-analysis of the diagnostic performance of CT for the diagnosis of fat-poor angiomyolipoma (AML) in patients with renal masses. MEDLINE and EMBASE were systematically searched up to February 2, 2017. We included diagnostic accuracy studies that used CT for diagnosis of fat-poor AML in patients with renal masses, using pathologic examination as the reference standard. Two independent reviewers assessed the methodologic quality using the Quality Assessment of Diagnostic Accuracy Studies-2 tool. Sensitivity and specificity of the included studies were calculated, pooled, and plotted in a hierarchical summary ROC plot. Sensitivity analyses using several clinically relevant covariates were performed to explore heterogeneity. Fifteen studies (2258 patients) were included. Pooled sensitivity and specificity were 0.67 (95% CI, 0.48-0.81) and 0.97 (95% CI, 0.89-0.99), respectively. Substantial and considerable heterogeneity was present with regard to sensitivity and specificity (I² = 91.21% and 78.53%, respectively). In the sensitivity analyses, the specificity estimates were comparable and consistently high across all subgroups (0.93-1.00), but sensitivity estimates showed significant variation (0.14-0.82). Studies using pixel distribution analysis (n = 3) showed substantially lower sensitivity estimates (0.14; 95% CI, 0.04-0.40) compared with the remaining 12 studies (0.81; 95% CI, 0.76-0.85). CT shows moderate sensitivity and excellent specificity for the diagnosis of fat-poor AML in patients with renal masses. When methods other than pixel distribution analysis are used, better sensitivity can be achieved.

  14. Haemocompatibility of iron oxide nanoparticles synthesized for theranostic applications: a high-sensitivity microfluidic tool

    NASA Astrophysics Data System (ADS)

    Rodrigues, Raquel O.; Bañobre-López, Manuel; Gallo, Juan; Tavares, Pedro B.; Silva, Adrián M. T.; Lima, Rui; Gomes, Helder T.

    2016-07-01

    The poor heating efficiency of most reported magnetic nanoparticles (MNPs), allied to the lack of comprehensive biocompatibility and haemodynamic studies, hampers the spread of multifunctional nanoparticles as the next generation of therapeutic bio-agents in medicine. The present work reports the synthesis and characterization, with a special focus on biological/toxicological compatibility, of superparamagnetic nanoparticles with diameters around 18 nm, suitable for theranostic applications (i.e. simultaneous diagnosis and therapy of cancer). Aiming at more insight into the complex nanoparticle-red blood cell (RBC) membrane interaction, the deformability of human RBCs in contact with MNPs was assessed for the first time with a microfluidic extensional approach and used as an indicator of haematological disorders, in comparison with a conventional haematological test, the haemolysis assay. The microfluidic results highlight the potential of this microfluidic tool over traditional haemolysis analysis, detecting small increments in the rigidity of the blood cells when traditional haemotoxicology analysis showed no significant alteration (haemolysis rates lower than 2%). The detected rigidity is predicted to be due to the wrapping of small MNPs by the bilayer membrane of the RBCs, which is directly related to MNP size, shape and composition. The proposed microfluidic tool adds a new dimension to the field of nanomedicine, serving as a high-sensitivity technique capable of bringing a better understanding of the biological impact of nanoparticles developed for clinical applications.

  15. Parameterization and Sensitivity Analysis of a Complex Simulation Model for Mosquito Population Dynamics, Dengue Transmission, and Their Control

    PubMed Central

    Ellis, Alicia M.; Garcia, Andres J.; Focks, Dana A.; Morrison, Amy C.; Scott, Thomas W.

    2011-01-01

    Models can be useful tools for understanding the dynamics and control of mosquito-borne disease. More detailed models may be more realistic and better suited for understanding local disease dynamics; however, evaluating model suitability, accuracy, and performance becomes increasingly difficult with greater model complexity. Sensitivity analysis is a technique that permits exploration of complex models by evaluating the sensitivity of the model to changes in parameters. Here, we present results of sensitivity analyses of two interrelated complex simulation models of mosquito population dynamics and dengue transmission. We found that dengue transmission may be influenced most by survival in each life stage of the mosquito, mosquito biting behavior, and duration of the infectious period in humans. The importance of these biological processes for vector-borne disease models and the overwhelming lack of knowledge about them make acquisition of relevant field data on these biological processes a top research priority. PMID:21813844

  16. Perturbation analysis for patch occupancy dynamics

    USGS Publications Warehouse

    Martin, Julien; Nichols, James D.; McIntyre, Carol L.; Ferraz, Goncalo; Hines, James E.

    2009-01-01

    Perturbation analysis is a powerful tool to study population and community dynamics. This article describes expressions for sensitivity metrics reflecting changes in equilibrium occupancy resulting from small changes in the vital rates of patch occupancy dynamics (i.e., probabilities of local patch colonization and extinction). We illustrate our approach with a case study of occupancy dynamics of Golden Eagle (Aquila chrysaetos) nesting territories. Examination of the hypothesis of system equilibrium suggests that the system satisfies equilibrium conditions. Estimates of vital rates obtained using patch occupancy models are used to estimate equilibrium patch occupancy of eagles. We then compute estimates of sensitivity metrics and discuss their implications for eagle population ecology and management. Finally, we discuss the intuition underlying our sensitivity metrics and then provide examples of ecological questions that can be addressed using perturbation analyses. For instance, the sensitivity metrics lead to predictions about the relative importance of local colonization and local extinction probabilities in influencing equilibrium occupancy for rare and common species.
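    In the standard single-species patch-occupancy formulation (the article's exact notation may differ), equilibrium occupancy and the corresponding sensitivity metrics with respect to the local colonization probability \gamma and local extinction probability \varepsilon take the form

      \psi^{*} = \frac{\gamma}{\gamma + \varepsilon}, \qquad
      \frac{\partial \psi^{*}}{\partial \gamma} = \frac{\varepsilon}{(\gamma + \varepsilon)^{2}}, \qquad
      \frac{\partial \psi^{*}}{\partial \varepsilon} = -\frac{\gamma}{(\gamma + \varepsilon)^{2}}.

    Because the two derivatives have magnitudes in the ratio \varepsilon : \gamma, equilibrium occupancy of rare species (\varepsilon much larger than \gamma) is relatively more sensitive to colonization, and that of common species to extinction, which is the kind of prediction referred to above.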

  17. Visualization of Global Sensitivity Analysis Results Based on a Combination of Linearly Dependent and Independent Directions

    NASA Technical Reports Server (NTRS)

    Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    A useful technique for the validation and verification of complex flight systems is Monte Carlo Filtering -- a global sensitivity analysis that tries to find the inputs and ranges that are most likely to lead to a subset of the outputs. A thorough exploration of the parameter space for complex integrated systems may require thousands of experiments and hundreds of controlled and measured variables. Tools for analyzing this space often have limitations caused by the numerical problems associated with high dimensionality and caused by the assumption of independence of all of the dimensions. To combat both of these limitations, we propose a technique that uses a combination of the original variables with the derived variables obtained during a principal component analysis.
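    A minimal sketch of the flavor of analysis described, Monte Carlo Filtering applied both to the original inputs and to principal-component directions, is given below; the toy model, sample size, and acceptance criterion are placeholders.

      import numpy as np
      from scipy.stats import ks_2samp
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      X = rng.uniform(0.0, 1.0, size=(5000, 4))                    # 4 input parameters
      y = X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=5000)

      behavioral = y > 1.0                                         # hypothetical acceptance criterion
      scores = np.hstack([X, PCA(n_components=4).fit_transform(X)])

      # Kolmogorov-Smirnov distance between behavioral and non-behavioral runs,
      # per original input and per principal component: a large D marks a
      # direction that drives the output subset of interest.
      for j in range(scores.shape[1]):
          d, p = ks_2samp(scores[behavioral, j], scores[~behavioral, j])
          label = f"x{j + 1}" if j < 4 else f"PC{j - 3}"
          print(f"{label}: KS D = {d:.3f} (p = {p:.2g})")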

  18. Parameterization of the InVEST Crop Pollination Model to spatially predict abundance of wild blueberry (Vaccinium angustifolium Aiton) native bee pollinators in Maine, USA

    USGS Publications Warehouse

    Groff, Shannon C.; Loftin, Cynthia S.; Drummond, Frank; Bushmann, Sara; McGill, Brian J.

    2016-01-01

    Non-native honeybees have historically been managed for crop pollination; however, recent population declines draw attention to pollination services provided by native bees. We applied the InVEST Crop Pollination model, developed to predict native bee abundance from habitat resources, in Maine's wild blueberry crop landscape. We evaluated model performance with parameters informed by four approaches: 1) expert opinion; 2) sensitivity analysis; 3) sensitivity-analysis-informed model optimization; and 4) simulated annealing (uninformed) model optimization. Uninformed optimization improved model performance by 29% compared to the expert-opinion-informed model, while sensitivity-analysis-informed optimization improved model performance by 54%. This suggests that expert opinion may not yield the best parameter values for the InVEST model. The proportion of deciduous/mixed forest within 2000 m of a blueberry field also reliably predicted native bee abundance in blueberry fields; nevertheless, the InVEST model provides an efficient tool to estimate bee abundance beyond the field perimeter.

  19. Advances in ultrasensitive mass spectrometry of organic molecules.

    PubMed

    Kandiah, Mathivathani; Urban, Pawel L

    2013-06-21

    Ultrasensitive mass spectrometric analysis of organic molecules is important for various branches of chemistry, and other fields including physics, earth and environmental sciences, archaeology, biomedicine, and materials science. It finds applications--as an enabling tool--in systems biology, biological imaging, clinical analysis, and forensics. Although there are a number of technical obstacles associated with the analysis of samples by mass spectrometry at ultratrace level (for example analyte losses during sample preparation, insufficient sensitivity, ion suppression), several noteworthy developments have been made over the years. They include: sensitive ion sources, loss-free interfaces, ion optics components, efficient mass analyzers and detectors, as well as "smart" sample preparation strategies. Some of the mass spectrometric methods published to date can achieve sensitivity which is by several orders of magnitude higher than that of alternative approaches. Femto- and attomole level limits of detection are nowadays common, while zepto- and yoctomole level limits of detection have also been reported. We envision that the ultrasensitive mass spectrometric assays will soon contribute to new discoveries in bioscience and other areas.

  20. Evaluation of Uncertainty and Sensitivity in Environmental Modeling at a Radioactive Waste Management Site

    NASA Astrophysics Data System (ADS)

    Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.

    2002-05-01

    Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in these complex, high dimensional eco-system models, represented by the RWMS model, the dynamics of the systems can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, the non-linearities, and the non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used in global sensitivity analysis in these high dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.
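    As a rough illustration of a FAST-style global sensitivity analysis, the sketch below uses the SALib package on a toy three-parameter response; the parameter names, bounds, and model are placeholders, not the GoldSim RWMS model described above.

      import numpy as np
      from SALib.sample import fast_sampler
      from SALib.analyze import fast

      problem = {
          "num_vars": 3,
          "names": ["burrow_rate", "plant_uptake", "decay_const"],
          "bounds": [[0.01, 0.5], [0.001, 0.1], [0.0001, 0.01]],
      }

      X = fast_sampler.sample(problem, 1000)
      # Toy nonlinear, non-monotonic response standing in for the transport model.
      Y = X[:, 0] * np.exp(-X[:, 2] * 50.0) + np.sin(10.0 * X[:, 1])

      Si = fast.analyze(problem, Y)
      for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
          print(f"{name}: first-order = {s1:.2f}, total-order = {st:.2f}")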

  1. Setting up a proper power spectral density (PSD) and autocorrelation analysis for material and process characterization

    NASA Astrophysics Data System (ADS)

    Rutigliani, Vito; Lorusso, Gian Francesco; De Simone, Danilo; Lazzarino, Frederic; Rispens, Gijsbert; Papavieros, George; Gogolides, Evangelos; Constantoudis, Vassilios; Mack, Chris A.

    2018-03-01

    Power spectral density (PSD) analysis is playing an increasingly critical role in the understanding of line-edge roughness (LER) and linewidth roughness (LWR) in a variety of applications across the industry. It is an essential step to obtain an unbiased LWR estimate, as well as an extremely useful tool for process and material characterization. However, the PSD estimate can be affected by both random and systematic artifacts caused by image acquisition and measurement settings, which could irremediably alter its information content. In this paper, we report on the impact of various setting parameters (smoothing image processing filters, pixel size, and SEM noise levels) on the PSD estimate. We also discuss the use of PSD analysis in a variety of cases. Looking beyond the basic roughness estimate, we use PSD and autocorrelation analysis to characterize resist blur [1], as well as low- and high-frequency roughness content, and we apply this technique to guide the EUV material stack selection. Our results clearly indicate that, if properly used, the PSD methodology is a very sensitive tool to investigate material and process variations.
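    A minimal sketch of the basic workflow, estimating a line-edge PSD with Welch's method and subtracting the high-frequency noise floor before integrating back to an unbiased roughness, is shown below; the pixel size, synthetic edge, and noise-floor frequency cutoff are assumptions, not measurement settings from the paper.

      import numpy as np
      from scipy.signal import welch

      pixel_nm = 1.0                                   # along-line pixel size (assumption)
      rng = np.random.default_rng(2)
      edge = np.cumsum(rng.normal(0.0, 0.2, 4096))     # synthetic correlated edge (nm)
      edge += rng.normal(0.0, 0.5, edge.size)          # white SEM noise on top

      freq, psd = welch(edge - edge.mean(), fs=1.0 / pixel_nm, nperseg=1024)

      # The flat high-frequency plateau estimates the SEM noise contribution,
      # which is subtracted before integrating the PSD to an unbiased sigma.
      noise_floor = psd[freq > 0.4].mean()
      sigma2 = np.sum(np.clip(psd - noise_floor, 0.0, None)) * (freq[1] - freq[0])
      print(f"unbiased roughness estimate ~ {3.0 * np.sqrt(sigma2):.2f} nm (3-sigma)")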

  2. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k-eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for a gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
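    The propagation step described here, turning sensitivity coefficients plus cross-section covariances into a response uncertainty, follows the familiar "sandwich rule". A hedged toy example with made-up numbers (not TSUNAMI output) is:

    ```python
    # Sandwich rule: relative variance of a response R is S C S^T, where S holds
    # relative sensitivity coefficients and C is the relative covariance matrix
    # of the nuclear data. All values below are invented for illustration.
    import numpy as np

    # sensitivity of k-eff to three notional cross sections (dk/k per dsigma/sigma)
    S = np.array([[0.35, -0.12, 0.08]])

    # assumed relative covariance matrix of those cross sections
    C = np.array([
        [0.0025, 0.0005, 0.0000],
        [0.0005, 0.0016, 0.0002],
        [0.0000, 0.0002, 0.0036],
    ])

    var_rel = (S @ C @ S.T).item()            # relative variance of the response
    print(f"data-induced uncertainty in k-eff: {np.sqrt(var_rel) * 100:.3f} %")
    ```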

  3. Screening for sepsis in general hospitalized patients: a systematic review.

    PubMed

    Alberto, L; Marshall, A P; Walker, R; Aitken, L M

    2017-08-01

    Sepsis is a condition widely observed outside critical care areas. To examine the application of sepsis screening tools for early recognition of sepsis in general hospitalized patients to: (i) identify the accuracy of these tools; (ii) determine the outcomes associated with their implementation; and (iii) describe the implementation process. A systematic review method was used. PubMed, CINAHL, Cochrane, Scopus, Web of Science, and Embase databases were systematically searched for primary articles, published from January 1990 to June 2016, that investigated screening tools or alert mechanisms for early identification of sepsis in adult general hospitalized patients. The review protocol was registered with PROSPERO (CRD42016042261). More than 8000 citations were screened for eligibility after duplicates had been removed. Six articles met the inclusion criteria, testing two types of sepsis screening tools. Electronic tools can capture and recognize abnormal variables and activate an alert in real time. However, the accuracy of these tools was inconsistent across studies, with only one demonstrating high specificity and sensitivity. Paper-based, nurse-led screening tools appear to be more sensitive in the identification of septic patients but were only studied in small samples and particular populations. Process-of-care measures appear to be enhanced; however, demonstrating improved outcomes is more challenging. Implementation details are rarely reported. Heterogeneity of studies prevented meta-analysis. Clinicians, researchers and health decision-makers should consider these findings and limitations when implementing screening tools, research or policy on sepsis recognition in general hospitalized patients. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  4. Global sensitivity and uncertainty analysis of the nitrate leaching and crop yield simulation under different water and nitrogen management practices

    USDA-ARS?s Scientific Manuscript database

    Agricultural system models have become important tools in studying water and nitrogen (N) dynamics, as well as crop growth, under different management practices. Complexity in input parameters often leads to significant uncertainty when simulating dynamic processes such as nitrate leaching or crop y...

  5. Sensitivity analysis of the agricultural policy/environmental extender (APEX) for phosphorus loads in tile-drained landscapes

    USDA-ARS?s Scientific Manuscript database

    Numerical modeling is an economical and feasible approach for quantifying the effects of best management practices on phosphorus (P) loadings from agricultural fields. However, tools that simulate both surface and subsurface P pathways are limited and have not been robustly evaluated in tile-drained...

  6. YBYRÁ facilitates comparison of large phylogenetic trees.

    PubMed

    Machado, Denis Jacob

    2015-07-01

    The number and size of tree topologies that are being compared by phylogenetic systematists is increasing due to technological advancements in high-throughput DNA sequencing. However, we still lack tools to facilitate comparison among phylogenetic trees with a large number of terminals. The "YBYRÁ" project integrates software solutions for data analysis in phylogenetics. It comprises tools for (1) topological distance calculation based on the number of shared splits or clades, (2) sensitivity analysis and automatic generation of sensitivity plots and (3) clade diagnoses based on different categories of synapomorphies. YBYRÁ also provides (4) an original framework to facilitate the search for potential rogue taxa based on how much they affect average matching split distances (using MSdist). YBYRÁ facilitates comparison of large phylogenetic trees and outperforms competing software in terms of usability and time efficiency, especially for large data sets. The programs that comprise this toolkit are written in Python; hence, they do not require installation and have minimal dependencies. The entire project is available under an open-source licence at http://www.ib.usp.br/grant/anfibios/researchSoftware.html.

  7. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the model performance evaluation. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profiles and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation punc provides the best choice for the model performance evaluation when a conservative approach is adopted.

  8. Diagnostic value of fecal tumor M2-pyruvate kinase for CRC screening: a systematic review and meta-analysis.

    PubMed

    Li, Rui; Liu, Jianjun; Xue, Huiping; Huang, Gang

    2012-10-15

    The measurement of fecal tumor M2-pyruvate kinase (PKM2), overexpressed in tumor cells, has been proposed as a novel tool for detecting colorectal cancer (CRC). However, the sensitivity and specificity of this test varied among studies. The aim of this meta-analysis was to determine the diagnostic accuracy of fecal PKM2 for CRC and to evaluate its utility in CRC screening. It was compared to the guaiac fecal occult blood test (gFOBT) and the immunological fecal occult blood test (iFOBT). Through a comprehensive literature search, 10 studies met the inclusion criteria and were included. Summary estimates for sensitivity and specificity were calculated by using the bivariate random effect model. The hierarchical summary receiver operating characteristic curve was also undertaken. The overall sensitivity and specificity of fecal PKM2 for detecting CRC were 79% (95% CI = 75-83%) and 81% (95% CI = 73-87%), respectively. The summary positive predictive value and negative predictive value were 74% (95% CI = 56-87%) and 86% (95% CI = 79-91%), respectively. The pooled diagnostic odds ratio was 16 (95% CI = 10-26). In head-to-head comparison, the diagnostic odds ratios of PKM2 and gFOBT for CRC were 10.167 (95% CI = 5.992-17.250) and 6.557 (95% CI = 3.467-12.403), respectively. The diagnostic odds ratios of PKM2 and iFOBT for CRC were 9.542 (95% CI = 5.893-15.452) and 67.248 (95% CI = 16.194-279.26), respectively. The fecal PKM2 test was a diagnostic tool with moderate sensitivity and specificity for detecting CRC. Its diagnostic efficiency was similar to that of gFOBT. Because of its relatively low specificity and positive predictive value, fecal PKM2 is not recommended for use alone as a screening tool for CRC. Copyright © 2012 UICC.
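    For reference, the summary quantities quoted above (sensitivity, specificity, predictive values, diagnostic odds ratio) can be computed from a single 2x2 table as in the minimal sketch below; the counts are hypothetical and only chosen to roughly mirror the pooled estimates, and the actual meta-analysis used a bivariate random-effects model rather than a simple pooled table:

    ```python
    # Diagnostic summary statistics from one hypothetical 2x2 table.
    def diagnostic_summary(tp: int, fp: int, fn: int, tn: int) -> dict:
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        dor = (tp * tn) / (fp * fn)          # diagnostic odds ratio
        return {"sensitivity": sens, "specificity": spec,
                "PPV": ppv, "NPV": npv, "DOR": dor}

    # made-up counts that roughly reproduce the pooled 79% / 81% / DOR ~ 16
    print(diagnostic_summary(tp=79, fp=19, fn=21, tn=81))
    ```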

  9. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    PubMed

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available at the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics.

  10. An Excel®-based visualization tool of 2-D soil gas concentration profiles in petroleum vapor intrusion

    PubMed Central

    Verginelli, Iason; Yao, Yijun; Suuberg, Eric M.

    2017-01-01

    In this study we present a petroleum vapor intrusion tool implemented in Microsoft® Excel® using Visual Basic for Applications (VBA) and integrated within a graphical interface. The latter helps users easily visualize two-dimensional soil gas concentration profiles and indoor concentrations as a function of site-specific conditions such as source strength and depth, biodegradation reaction rate constant, soil characteristics and building features. This tool is based on a two-dimensional explicit analytical model that combines steady-state diffusion-dominated vapor transport in a homogeneous soil with a piecewise first-order aerobic biodegradation model, in which rate is limited by oxygen availability. As recommended in the recently released United States Environmental Protection Agency's final Petroleum Vapor Intrusion guidance, a sensitivity analysis and a simplified Monte Carlo uncertainty analysis are also included in the spreadsheet. PMID:28163564

  11. An Excel®-based visualization tool of 2-D soil gas concentration profiles in petroleum vapor intrusion.

    PubMed

    Verginelli, Iason; Yao, Yijun; Suuberg, Eric M

    2016-01-01

    In this study we present a petroleum vapor intrusion tool implemented in Microsoft ® Excel ® using Visual Basic for Applications (VBA) and integrated within a graphical interface. The latter helps users easily visualize two-dimensional soil gas concentration profiles and indoor concentrations as a function of site-specific conditions such as source strength and depth, biodegradation reaction rate constant, soil characteristics and building features. This tool is based on a two-dimensional explicit analytical model that combines steady-state diffusion-dominated vapor transport in a homogeneous soil with a piecewise first-order aerobic biodegradation model, in which rate is limited by oxygen availability. As recommended in the recently released United States Environmental Protection Agency's final Petroleum Vapor Intrusion guidance, a sensitivity analysis and a simplified Monte Carlo uncertainty analysis are also included in the spreadsheet.
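    A heavily simplified, hypothetical analogue of the calculation such a spreadsheet tool performs is sketched below: one-dimensional steady-state diffusion through an aerobic layer with first-order biodegradation, combined with a small Monte Carlo uncertainty analysis over the rate constant and effective diffusivity. This is not the authors' two-dimensional piecewise model, and all parameter values are assumptions:

    ```python
    # 1-D diffusion with first-order decay: attenuation ~ exp(-sqrt(k/D) * L),
    # propagated through a simple Monte Carlo over uncertain inputs.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000

    L_aerobic = rng.uniform(1.0, 3.0, n)                              # aerobic zone thickness (m)
    D_eff = rng.lognormal(mean=np.log(5e-7), sigma=0.5, size=n)       # effective diffusivity (m^2/s)
    k_bio = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=n)       # biodegradation rate (1/s)

    # attenuation of a diffusing, first-order-decaying vapor across the aerobic layer
    attenuation = np.exp(-np.sqrt(k_bio / D_eff) * L_aerobic)

    print(f"median attenuation factor: {np.median(attenuation):.2e}")
    print(f"5th-95th percentile: {np.percentile(attenuation, 5):.2e} "
          f"to {np.percentile(attenuation, 95):.2e}")
    ```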

  12. DNA marker technology for wildlife conservation

    PubMed Central

    Arif, Ibrahim A.; Khan, Haseeb A.; Bahkali, Ali H.; Al Homaidan, Ali A.; Al Farhan, Ahmad H.; Al Sadoon, Mohammad; Shobrak, Mohammad

    2011-01-01

    Use of molecular markers for identification of protected species offers great promise in the field of conservation biology. The information on genetic diversity of wildlife is necessary to ascertain the genetically deteriorated populations so that better management plans can be established for their conservation. Accurate classification of these threatened species allows understanding of the species biology and identification of distinct populations that should be managed with utmost care. Molecular markers are versatile tools for identification of populations in genetic crisis by comparing genetic diversities, which in turn helps to resolve taxonomic uncertainties and to establish management units within species. Genetic marker analysis also provides sensitive and useful tools for prevention of illegal hunting and poaching and for more effective implementation of the laws for protection of the endangered species. This review summarizes various tools of DNA marker technology for application in molecular diversity analysis with special emphasis on wildlife conservation. PMID:23961128

  13. A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Yan, Wende

    2014-01-01

    Mathematical models are useful in various fields of science and engineering. However, it is a challenge to make a model use the open and growing set of functions (e.g., model inversion) on the R platform because of the requirement of accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that aims to convert a model developed in any computer language to an R function, using the template and instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.
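    A Python analogue of the coupling idea (wrapping an arbitrary external model as a plain function via template-based input files, so that generic optimization or sensitivity routines can drive it) might look like the sketch below; the file names, executable, and output format are hypothetical placeholders:

    ```python
    # Wrap an external model executable as an ordinary callable:
    # (1) write a parameter file from a template, (2) run the executable,
    # (3) parse the output. Everything named here is a placeholder.
    import subprocess
    from pathlib import Path

    # in practice this would be read from a template file prepared by the user
    TEMPLATE = "K = {K}\nN = {N}\n"

    def run_model(params: dict) -> float:
        Path("model_input.txt").write_text(TEMPLATE.format(**params))
        subprocess.run(["./my_model", "model_input.txt"], check=True)
        # assume the model writes a single objective value to model_output.txt
        return float(Path("model_output.txt").read_text().strip())

    # once wrapped, the model can be handed to any optimizer, sampler, or
    # sensitivity routine that expects a plain Python callable, e.g.:
    # value = run_model({"K": 0.35, "N": 2.0})
    ```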

  14. Analyzing reflective narratives to assess the ethical reasoning of pediatric residents.

    PubMed

    Moon, Margaret; Taylor, Holly A; McDonald, Erin L; Hughes, Mark T; Beach, Mary Catherine; Carrese, Joseph A

    2013-01-01

    A limiting factor in ethics education in medical training has been the difficulty of assessing competence in ethics. This study was conducted to test the concept that content analysis of pediatric residents' personal reflections about ethics experiences can identify changes in ethical sensitivity and reasoning over time. Analysis of written narratives focused on two of our ethics curriculum's goals: 1) to raise sensitivity to ethical issues in everyday clinical practice and 2) to enhance critical reflection on personal and professional values as they affect patient care. Content analysis of written reflections was guided by a tool developed to identify and assess the level of ethical reasoning in eight domains determined to be important aspects of ethical competence. Based on the assessment of narratives written at two times (12 to 16 months apart) during their training, residents showed significant progress in two specific domains: use of professional values, and use of personal values. Residents did not show decline in ethical reasoning in any domain. This study demonstrates that content analysis of personal narratives may provide a useful method for assessment of developing ethical sensitivity and reasoning.

  15. ON IDENTIFIABILITY OF NONLINEAR ODE MODELS AND APPLICATIONS IN VIRAL DYNAMICS

    PubMed Central

    MIAO, HONGYU; XIA, XIAOHUA; PERELSON, ALAN S.; WU, HULIN

    2011-01-01

    Ordinary differential equations (ODE) are a powerful tool for modeling dynamic processes with wide applications in a variety of scientific fields. Over the last two decades, ODEs have also emerged as a prevailing tool in various biomedical research fields, especially in infectious disease modeling. In practice, it is important and necessary to determine unknown parameters in ODE models based on experimental data. Identifiability analysis is the first step in determining unknown parameters in ODE models, and such analysis techniques for nonlinear ODE models are still under development. In this article, we review identifiability analysis methodologies for nonlinear ODE models developed in the past one to two decades, including structural identifiability analysis, practical identifiability analysis and sensitivity-based identifiability analysis. Some advanced topics and ongoing research are also briefly reviewed. Finally, some examples from modeling viral dynamics of HIV, influenza and hepatitis viruses are given to illustrate how to apply these identifiability analysis methods in practice. PMID:21785515
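    A minimal sketch of sensitivity-based identifiability analysis, one of the approaches reviewed above, is shown below for a toy logistic-growth ODE: build the output sensitivity matrix by finite differences and inspect its singular values, where a large condition number flags poorly identifiable parameter combinations. The model, sampling times, and parameter values are assumptions:

    ```python
    # Sensitivity-based identifiability for a toy ODE model (logistic growth).
    import numpy as np
    from scipy.integrate import solve_ivp

    def simulate(theta, t_obs):
        r, K = theta
        sol = solve_ivp(lambda t, x: r * x * (1 - x / K), (0, t_obs[-1]), [1.0],
                        t_eval=t_obs, rtol=1e-8, atol=1e-10)
        return sol.y[0]

    t_obs = np.linspace(0, 10, 25)
    theta0 = np.array([0.8, 50.0])

    # finite-difference sensitivity matrix, scaled to relative parameter changes
    S = np.empty((t_obs.size, theta0.size))
    for j, th in enumerate(theta0):
        dth = 1e-4 * th
        up, dn = theta0.copy(), theta0.copy()
        up[j] += dth
        dn[j] -= dth
        S[:, j] = (simulate(up, t_obs) - simulate(dn, t_obs)) / (2 * dth) * th

    sv = np.linalg.svd(S, compute_uv=False)
    print("singular values:", sv)
    print("condition number:", sv[0] / sv[-1])   # large values flag weak identifiability
    ```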

  16. Critical comparison of diffuse reflectance spectroscopy and colorimetry as dermatological diagnostic tools for acanthosis nigricans: a chemometric approach.

    PubMed

    Devpura, Suneetha; Pattamadilok, Bensachee; Syed, Zain U; Vemulapalli, Pranita; Henderson, Marsha; Rehse, Steven J; Hamzavi, Iltefat; Lim, Henry W; Naik, Ratna

    2011-06-01

    Quantification of skin changes due to acanthosis nigricans (AN), a disorder common among insulin-resistant diabetic and obese individuals, was investigated using two optical techniques: diffuse reflectance spectroscopy (DRS) and colorimetry. Measurements were obtained from AN lesions on the neck and two control sites of eight AN patients. A principal component/discriminant function analysis successfully differentiated between AN lesion and normal skin with 87.7% sensitivity and 94.8% specificity in DRS measurements and 97.2% sensitivity and 96.4% specificity in colorimetry measurements.

  17. SCALE 6.2 Continuous-Energy TSUNAMI-3D Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2015-01-01

    The TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation) capabilities within the SCALE code system make use of sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different systems, quantifying computational biases, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved ease of use and fidelity and the desire to extend TSUNAMI analysis to advanced applications have motivated the development of a SCALE 6.2 module for calculating sensitivity coefficients using three-dimensional (3D) continuous-energy (CE) Monte Carlo methods: CE TSUNAMI-3D. This paper provides an overview of the theory, implementation, and capabilities of the CE TSUNAMI-3D sensitivity analysis methods. CE TSUNAMI contains two methods for calculating sensitivity coefficients in eigenvalue sensitivity applications: (1) the Iterated Fission Probability (IFP) method and (2) the Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Track length importance CHaracterization (CLUTCH) method. This work also presents the GEneralized Adjoint Response in Monte Carlo method (GEAR-MC), a first-of-its-kind approach for calculating adjoint-weighted, generalized response sensitivity coefficients, such as flux responses or reaction rate ratios, in CE Monte Carlo applications. The accuracy and efficiency of the CE TSUNAMI-3D eigenvalue sensitivity methods are assessed from a user perspective in a companion publication, and the accuracy and features of the CE TSUNAMI-3D GEAR-MC methods are detailed in this paper.

  18. Evaluation of nutritional screening tools among patients scheduled for heart valve surgery.

    PubMed

    Lomivorotov, Vladimir V; Efremov, Sergey M; Boboshko, Vladimir A; Nikolaev, Dmitry A; Vedernikov, Pavel E; Shilova, Anna N; Lomivorotov, Vladimir N; Karaskov, Alexander M

    2013-03-01

    The study aim was to detect the most sensitive nutritional screening tool and to assess its prognostic value with regard to an adverse clinical course in patients with heart valve disease undergoing cardiopulmonary bypass (CPB). This prospective cohort study included 441 adult patients who were screened using four nutritional screening tools: Nutritional Risk Screening 2002 (NRS-2002); Malnutrition Universal Screening Tool (MUST); Mini Nutritional Assessment (MNA); and Short Nutritional Assessment Questionnaire (SNAQ). Nutritional assessment was performed using a Subjective Global Assessment (SGA). In-hospital mortality, postoperative complications, and duration of hospital stay were each analyzed. With regard to the detection of malnutrition, the sensitivities of MUST, SNAQ, MNA and NRS-2002 were 100%, 92%, 84.6% and 43.6%, respectively. Malnutrition identified by MUST and MNA was associated with postoperative complications (OR 1.63, p = 0.033 and OR 1.6, p = 0.035) and prolonged hospitalization (OR 1.57, p = 0.048 and OR 1.7, p = 0.02). According to multivariate logistic regression analysis, along with the well-known factors of age and duration of CPB, malnutrition identified by MUST and MNA was associated with a risk of development of complications (OR 1.6, p = 0.049 and OR 1.6, p = 0.04, respectively). The sensitivities of SNAQ, MUST, NRS-2002 and MNA with regard to postoperative complications were 26.8%, 28.8%, 10%, and 31.6%, respectively. The MUST tool is preferable with regard to the detection of malnutrition. Both MUST and MNA independently predicted postoperative complications. SNAQ and NRS-2002 proved insensitive with regard to the postoperative course among patients with heart valve disease who were scheduled for cardiothoracic surgery.

  19. Sensitivity analysis of navy aviation readiness based sparing model

    DTIC Science & Technology

    2017-09-01

    (Figure 4, a research design flowchart, lays out the four steps of the methodology.) ... as a function of changes in key inputs. We develop NAVARM Experimental Designs (NED), a computational tool created by applying a state-of-the-art experimental design to the NAVARM model. Statistical analysis of the resulting data identifies the most influential cost factors. Those are, in order of ...

  20. Combustor liner durability analysis

    NASA Technical Reports Server (NTRS)

    Moreno, V.

    1981-01-01

    An 18-month combustor liner durability analysis program was conducted to evaluate the use of advanced three-dimensional transient heat transfer and nonlinear stress-strain analyses for modeling the cyclic thermomechanical response of a simulated combustor liner specimen. Cyclic life prediction technology for creep/fatigue interaction is evaluated using a variety of state-of-the-art tools for crack initiation and propagation. The sensitivity of the initiation models to a change in the operating conditions is also assessed.

  1. Rapid and sensitive detection of synthetic cannabinoids AMB-FUBINACA and α-PVP using surface enhanced Raman scattering (SERS)

    NASA Astrophysics Data System (ADS)

    Islam, Syed K.; Cheng, Yin Pak; Birke, Ronald L.; Green, Omar; Kubic, Thomas; Lombardi, John R.

    2018-04-01

    The application of surface-enhanced Raman scattering (SERS) has been reported as a fast and sensitive analytical method for the trace detection of the two most commonly known synthetic cannabinoids, AMB-FUBINACA and alpha-pyrrolidinovalerophenone (α-PVP). AMB-FUBINACA and α-PVP are two of the most dangerous synthetic cannabinoids and have been reported to cause numerous deaths in the United States. While instruments such as GC-MS and LC-MS have traditionally been recognized as analytical tools for the detection of these synthetic drugs, SERS has recently been gaining ground in their analysis due to its sensitivity in trace analysis and its effectiveness as a rapid method of detection. The present study shows a limit of detection as low as picomolar concentrations for AMB-FUBINACA, while for α-PVP the limit of detection is in the nanomolar range using SERS.

  2. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    USGS Publications Warehouse

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration and sensitivity and uncertainty analysis capabilities, through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, the R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration and sensitivity and uncertainty analysis.

  3. Simplifying global biogeochemistry models to evaluate methane emissions

    NASA Astrophysics Data System (ADS)

    Gerber, S.; Alonso-Contes, C.

    2017-12-01

    Process-based models are important tools for quantifying wetland methane emissions, particularly under climate change scenarios, but evaluating these models is often cumbersome because they are embedded in larger land-surface models where the fluctuating water table and the carbon cycle (including new readily decomposable plant material) are predicted variables. Here, we build on these large-scale models, but instead of modeling water table and plant productivity we provide their values as boundary conditions. In contrast, aerobic and anaerobic decomposition, as well as soil-column transport of oxygen and methane, are predicted by the model. Because of these simplifications, the model has the potential to be more readily adaptable to the analysis of field-scale data. Here we determine the sensitivity of the model to specific setups, parameter choices, and boundary conditions in order to determine set-up needs and inform what critical auxiliary variables need to be measured in order to better predict field-scale methane emissions from wetland soils. To that end, we performed a global sensitivity analysis that also considers non-linear interactions between processes. The global sensitivity analysis revealed, not surprisingly, that water table dynamics (both mean level and amplitude of fluctuations) and the rate of the carbon cycle (i.e. net primary productivity) are critical determinants of methane emissions. The depth scale where most of the potential decomposition occurs also affects methane emissions. Different transport mechanisms compensate each other to some degree: if plant conduits are constrained, methane emissions by diffusive flux and ebullition compensate to some degree; however, annual emissions are higher when plants help to bypass methanotrophs in temporarily unsaturated upper layers. Finally, while oxygen consumption by plant roots helps create anoxic conditions, it has little effect on overall methane emission. Our initial sensitivity analysis helps guide further model development and improvement. However, an important goal for our model is to use it in field settings as a tool to deconvolve the different processes that contribute to the net transfer of methane from soils to the atmosphere.

  4. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    DOE PAGES

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    2017-01-31

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past the application of sensitivity analysis, such as Degree of Rate Control, has been hampered by its exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. Here in this study we present an efficient and robust three stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models as we demonstrate using CO oxidation on RuO 2(110) as a prototypical reaction. In a first step, we utilize the Fisher Information Matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on linear response theory for calculating the sensitivity measure for non-critical conditions which covers the majority of cases. Finally we adopt a method for sampling coupled finite differences for evaluating the sensitivity measure of lattice based models. This allows efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano scale design of heterogeneous catalysts.

  5. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis.

    PubMed

    Hoffmann, Max J; Engelmann, Felix; Matera, Sebastian

    2017-01-28

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for the atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as degree of rate control, has been hampered by its exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study, we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models as we demonstrate using the CO oxidation on RuO 2 (110) as a prototypical reaction. In the first step, we utilize the Fisher information matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on the linear response theory for calculating the sensitivity measure for non-critical conditions which covers the majority of cases. Finally, we adapt a method for sampling coupled finite differences for evaluating the sensitivity measure for lattice based models. This allows for an efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano-scale design of heterogeneous catalysts.

  6. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past the application of sensitivity analysis, such as Degree of Rate Control, has been hampered by its exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. Here in this study we present an efficient and robust three stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models as we demonstrate using CO oxidation on RuO 2(110) as a prototypical reaction. In a first step, we utilize the Fisher Information Matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on linear response theory for calculating the sensitivity measure for non-critical conditions which covers the majority of cases. Finally we adopt a method for sampling coupled finite differences for evaluating the sensitivity measure of lattice based models. This allows efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano scale design of heterogeneous catalysts.

  7. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    NASA Astrophysics Data System (ADS)

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    2017-01-01

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for the atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as degree of rate control, has been hampered by its exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study, we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models as we demonstrate using the CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher information matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on the linear response theory for calculating the sensitivity measure for non-critical conditions which covers the majority of cases. Finally, we adapt a method for sampling coupled finite differences for evaluating the sensitivity measure for lattice based models. This allows for an efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano-scale design of heterogeneous catalysts.
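    Independently of the kinetic Monte Carlo machinery above, the sensitivity measure itself (the degree of rate control, X_i = (k_i/TOF) dTOF/dk_i) can be illustrated with a deterministic toy model evaluated by central finite differences in log space. The two-step series mechanism and rate constants below are assumptions, not the authors' RuO2(110) model:

    ```python
    # Degree of rate control for a two-step series mechanism, TOF = k1*k2/(k1 + k2),
    # estimated as d ln(TOF) / d ln(k_i) by central finite differences.
    import numpy as np

    def tof(k):
        k1, k2 = k
        return k1 * k2 / (k1 + k2)

    k0 = np.array([1.0, 10.0])           # assumed rate constants (arbitrary units)
    eps = 1e-4

    for i in range(k0.size):
        up, dn = k0.copy(), k0.copy()
        up[i] *= np.exp(eps)
        dn[i] *= np.exp(-eps)
        X_i = (np.log(tof(up)) - np.log(tof(dn))) / (2 * eps)   # d ln TOF / d ln k_i
        print(f"X_{i + 1} = {X_i:.3f}")
    # analytic check: X_1 = k2/(k1+k2) ~ 0.909, X_2 = k1/(k1+k2) ~ 0.091; they sum to 1
    ```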

  8. Analyzing the texture changes in the quantitative phase maps of adipocytes

    NASA Astrophysics Data System (ADS)

    Roitshtain, Darina; Sharabani-Yosef, Orna; Gefen, Amit; Shaked, Natan T.

    2016-03-01

    We present a new analysis tool for studying texture changes in the quantitative phase maps of live cells acquired by wide-field interferometry. The sensitivity of wide-field interferometry systems to small changes in refractive index enables visualizing cells and inner cell organelles without the use of fluorescent dyes or other cell-invasive approaches, which may affect the measurement and require external labeling. Our label-free texture-analysis tool is based directly on the optical path delay profile of the sample and does not necessitate decoupling refractive index and thickness in the cell quantitative phase profile; thus, relevant parameters can be calculated using a single-frame acquisition. Our experimental system includes a low-coherence wide-field interferometer, combined with a simultaneous fluorescence microscopy system for validation. We used this system and analysis tool to study lipid droplet formation in adipocytes. The latter demonstration is relevant for various cellular functions ranging from lipid metabolism, protein storage and degradation to viral replication. These processes are functionally linked to several physiological and pathological conditions, including obesity and metabolic diseases. Quantification of these biological phenomena based on the texture changes in the cell phase map has potential as a new cellular diagnosis tool.

  9. Derivation of Continuum Models from An Agent-based Cancer Model: Optimization and Sensitivity Analysis.

    PubMed

    Voulgarelis, Dimitrios; Velayudhan, Ajoy; Smith, Frank

    2017-01-01

    Agent-based models provide a formidable tool for exploring complex and emergent behaviour of biological systems and yield accurate results, but with the drawback of needing substantial computational power and time for subsequent analysis. On the other hand, equation-based models can more easily be used for complex analysis on a much shorter timescale. This paper formulates an ordinary differential equation and stochastic differential equation model to capture the behaviour of an existing agent-based model of tumour cell reprogramming and applies it to optimization of possible treatment as well as dosage sensitivity analysis. For certain values of the parameter space, a close match between the equation-based and agent-based models is achieved. The need for division of labour between the two approaches is explored. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  10. Magnetic particles as powerful purification tool for high sensitive mass spectrometric screening procedures.

    PubMed

    Peter, Jochen F; Otto, Angela M

    2010-02-01

    The effective isolation and purification of proteins from biological fluids is the most crucial step for successful protein analysis when only minute amounts are available. While conventional purification methods such as dialysis, ultrafiltration or protein precipitation often lead to a marked loss of protein, SPE with small-sized particles is a powerful alternative. The implementation of particles with superparamagnetic cores facilitates the handling of these particles and allows the application of particles in the nanometer to low micrometer range. Due to their small diameters, magnetic particles are advantageous for increasing sensitivity when using subsequent MS analysis or gel electrophoresis. In recent years, different types of magnetic particles have been developed for specific protein purification purposes, followed by analysis or screening procedures using MS or SDS gel electrophoresis. In this review, the use of magnetic particles for different applications, such as the extraction and analysis of DNA/RNA, peptides and proteins, is described.

  11. Hydrologic analysis for selection and placement of conservation practices at the watershed scale

    NASA Astrophysics Data System (ADS)

    Wilson, C.; Brooks, E. S.; Boll, J.

    2012-12-01

    When a water body is exceeding water quality standards and a Total Maximum Daily Load has been established, conservation practices in the watershed are able to reduce point and non-point source pollution. Hydrological analysis is needed to place conservation practices in the most hydrologically sensitive areas. The selection and placement of conservation practices, however, is challenging in ungauged watersheds with little or no data for the hydrological analysis. The objective of this research is to perform a hydrological analysis for mitigation of erosion and total phosphorus in a mixed land use watershed, and to select and place the conservation practices in the most sensitive areas. The study area is the Hangman Creek watershed in Idaho and Washington State, upstream of the Long Lake (WA) reservoir, east of Spokane, WA. While the pollutant of concern is total phosphorus (TP), reductions in TP were translated to total suspended solids, or reductions in nonpoint source erosion and sediment delivery to streams. Hydrological characterization was done with a simple web-based tool, which runs the Water Erosion Prediction Project (WEPP) model for representative land types in the watershed, where a land type is defined as a unique combination of soil type, slope configuration, land use and management, and climate. The web-based tool used site-specific spatial and temporal data on land use, soil physical parameters, slope, and climate derived from readily available data sources and provided information on potential pollutant pathways (i.e. erosion, runoff, lateral flow, and percolation). Multiple land types representative of the watershed were ranked from most effective to least effective, and displayed spatially using GIS. The methodology for the Hangman Creek watershed was validated in the nearby Paradise Creek watershed, which has long-term stream discharge and monitoring data as well as land use data. Output from the web-based tool shows the potential reductions for different tillage practices, buffer strips, streamside management, and conversion to the conservation reserve program in the watershed. The output also includes the relationship between the land area where conservation practices are placed and the potential reduction in pollution, showing the diminished returns on investment as less sensitive areas are being treated. This application of a simple web-based tool and the use of a physically based erosion model (i.e. WEPP) illustrates that quantitative, spatial and temporal analysis of changes in pollutant loading and site-specific recommendations of conservation practices can be made in ungauged watersheds.

  12. Clinical validation of the C-VAT 2.0 assessment tool for gaming disorder: A sensitivity analysis of the proposed DSM-5 criteria and the clinical characteristics of young patients with 'video game addiction'.

    PubMed

    van Rooij, Antonius J; Schoenmakers, Tim M; van de Mheen, Dike

    2017-01-01

    Clinicians struggle with the identification of video gaming problems. To address this issue, a clinical assessment tool (C-VAT 2.0) was developed and tested in a clinical setting. The instrument allows exploration of the validity of the DSM-5 proposal for 'internet gaming disorder'. Using C-VAT 2.0, the current study provides a sensitivity analysis of the proposed DSM-5 criteria in a clinical youth sample (13-23 years old) in treatment for video gaming disorder (N=32). The study also explores the clinical characteristics of these patients. The patients were all male and reported spending extensive amounts of time on video games. At least half of the patients reported playing online games (n=15). Comorbid problems were common (n=22) and included (social) anxiety disorders, PDD NOS, ADHD/ADD, parent-child relationship problems, and various types of depressive mood problems. The sensitivity of the test was good: results further show that the C-VAT correctly identified 91% of the sample at the proposed cut-off score of at least 5 out of 9 criteria. As our study did not include healthy, extreme gamers, we could not assess the specificity of the tool: future research should make this a priority. Using the proposed DSM-5 cut-off score, the C-VAT 2.0 shows preliminary validity in a sample of gamers in treatment for gaming disorder, but the discriminating value of the instrument should be studied further. In the meantime, it is crucial that therapists try to avoid false positives by using expert judgment of functional impairment in each case. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. What Role Does "Elongation" Play in "Tool-Specific" Activation and Connectivity in the Dorsal and Ventral Visual Streams?

    PubMed

    Chen, Juan; Snow, Jacqueline C; Culham, Jody C; Goodale, Melvyn A

    2018-04-01

    Images of tools induce stronger activation than images of nontools in a left-lateralized network that includes ventral-stream areas implicated in tool identification and dorsal-stream areas implicated in tool manipulation. Importantly, however, graspable tools tend to be elongated rather than stubby, and so the tool-selective responses in some of these areas may, to some extent, reflect sensitivity to elongation rather than "toolness" per se. Using functional magnetic resonance imaging, we investigated the role of elongation in driving tool-specific activation in the 2 streams and their interconnections. We showed that in some "tool-selective" areas, the coding of toolness and elongation coexisted, but in others, elongation and toolness were coded independently. Psychophysiological interaction analysis revealed that toolness, but not elongation, had a strong modulation of the connectivity between the ventral and dorsal streams. Dynamic causal modeling revealed that viewing tools (either elongated or stubby) increased the connectivity from the ventral- to the dorsal-stream tool-selective areas, but only viewing elongated tools increased the reciprocal connectivity between these areas. Overall, these data disentangle how toolness and elongation affect the activation and connectivity of the tool network and help to resolve recent controversies regarding the relative contribution of "toolness" versus elongation in driving dorsal-stream "tool-selective" areas.

  14. A Comparison of Variant Calling Pipelines Using Genome in a Bottle as a Reference

    PubMed Central

    2015-01-01

    High-throughput sequencing, especially of exomes, is a popular diagnostic tool, but it is difficult to determine which tools are the best at analyzing this data. In this study, we use the NIST Genome in a Bottle results as a novel resource for validation of our exome analysis pipeline. We use six different aligners and five different variant callers to determine which pipeline, of the 30 total, performs the best on a human exome that was used to help generate the list of variants detected by the Genome in a Bottle Consortium. Of these 30 pipelines, we found that Novoalign in conjunction with GATK UnifiedGenotyper exhibited the highest sensitivity while maintaining a low number of false positives for SNVs. However, it is apparent that indels are still difficult for any pipeline to handle with none of the tools achieving an average sensitivity higher than 33% or a Positive Predictive Value (PPV) higher than 53%. Lastly, as expected, it was found that aligners can play as vital a role in variant detection as variant callers themselves. PMID:26539496
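    A minimal sketch of the kind of benchmarking described, comparing a pipeline's calls against a truth set and reporting sensitivity and PPV, is shown below with made-up variant tuples; real comparisons additionally normalize variant representation (particularly for indels), which this ignores:

    ```python
    # Compare a call set against a truth set using (chrom, pos, ref, alt) tuples.
    truth = {("chr1", 1005, "A", "G"), ("chr1", 2040, "T", "C"), ("chr2", 77, "G", "GT")}
    calls = {("chr1", 1005, "A", "G"), ("chr2", 77, "G", "GT"), ("chr3", 15, "C", "A")}

    tp = len(truth & calls)              # true positives
    fp = len(calls - truth)              # false positives
    fn = len(truth - calls)              # false negatives

    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)                 # positive predictive value
    print(f"TP={tp} FP={fp} FN={fn}  sensitivity={sensitivity:.2f}  PPV={ppv:.2f}")
    ```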

  15. Acoustic sensors as a biophysical tool for probing cell attachment and cell/surface interactions.

    PubMed

    Saitakis, Michael; Gizeli, Electra

    2012-02-01

    Acoustic biosensors offer the possibility to analyse cell attachment and spreading. This is due to their speed of detection, their real-time, non-invasive approach, and their high sensitivity not only to mass coupling but also to viscoelastic changes occurring close to the sensor surface. Quartz crystal microbalance (QCM) and surface acoustic wave (Love-wave) systems have been used to monitor the adhesion of animal cells to various surfaces and record the behaviour of cell layers under various conditions. The sensors detect cells mostly via their sensitivity to viscoelastic and mechanical properties. In particular, the QCM sensor detects cytoskeletal rearrangements caused by specific drugs affecting either actin microfilaments or microtubules. The Love-wave sensor directly measures cell/substrate bonds via acoustic damping and provides 2D kinetic and affinity parameters. Other studies have applied the QCM sensor as a diagnostic tool for leukaemia and, potentially, for chemotherapeutic agents. Acoustic sensors have also been used in the evaluation of the cytocompatibility of artificial surfaces and, in general, they have the potential to become powerful tools for even more diverse cellular analysis.

  16. Valid screening questions useful to diagnose hand and forearm eczema are available in the Spanish language, a new tool for global research.

    PubMed

    Martí-Margarit, Anna; Manresa, Josep M; Herdman, Mike; Pujol, Ramon; Serra, Consol; Flyvholm, Mary-Ann; Giménez-Arnau, Ana M

    2015-04-01

    Hand eczema is a cutaneous disease with considerable impact. Globally valid tools that help to diagnose hand and forearm eczema are required. To validate the questions to detect hand and/or forearm eczema included in the "Nordic Occupational Skin Questionnaire" (NOSQ-2002) in the Spanish language. A prospective pilot study was conducted with 80 employees of a cleaning company and a retrospective one involving 2,546 individuals. The responses were analysed for sensitivity, specificity and positive and negative predictive values. The final diagnosis according to the patients' hospital records, the specialty care records and the physical examination was taken as the gold standard. The Dermatology Life Quality Index (DLQI) was also evaluated. Sensitivity and specificity, in a worst-case scenario (WC) combining both questions, were 96.5% and 66.7%, respectively, and in a per-protocol (PP) analysis were 96.5% and 75.2%. The validated questions detected eczema effectively, making this tool suitable for use, e.g., in multicentre epidemiological studies or clinical trials.

  17. Esophageal cancer detection based on tissue surface-enhanced Raman spectroscopy and multivariate analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shangyuan; Lin, Juqiang; Huang, Zufang; Chen, Guannan; Chen, Weisheng; Wang, Yue; Chen, Rong; Zeng, Haishan

    2013-01-01

    The capability of using silver nanoparticle-based near-infrared surface-enhanced Raman scattering (SERS) spectroscopy combined with principal component analysis (PCA) and linear discriminant analysis (LDA) to differentiate esophageal cancer tissue from normal tissue is presented. Significant differences in Raman intensities of prominent SERS bands were observed between normal and cancer tissues. PCA-LDA multivariate analysis of the measured tissue SERS spectra achieved a diagnostic sensitivity of 90.9% and specificity of 97.8%. This exploratory study demonstrated great potential for developing label-free tissue SERS analysis into a clinical tool for esophageal cancer detection.
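    A hedged sketch of a PCA-LDA spectral classifier of this general type, evaluated with leave-one-out cross-validation, is given below using scikit-learn; the "spectra" are random stand-ins with an injected band difference, not SERS data, and the number of retained components is an assumption:

    ```python
    # PCA followed by LDA, with leave-one-out cross-validated sensitivity/specificity.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 800))           # 60 synthetic spectra x 800 channels
    y = np.repeat([0, 1], 30)                # 0 = normal, 1 = cancer (synthetic labels)
    X[y == 1, 100:120] += 0.8                # inject a fake band difference

    clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    pred = cross_val_predict(clf, X, y, cv=LeaveOneOut())

    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
    ```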

  18. HPAEC-PAD for oligosaccharide analysis-novel insights into analyte sensitivity and response stability.

    PubMed

    Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra

    2017-12-01

    The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates or as ingredients in food, such as in beverages and infant milk products, demands the availability of tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has been developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study giving an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, focusing on oligosaccharides. The high sensitivity of the PAD comes at the cost of its stability, due to recession of the gold electrode. Through an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect over time. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach that takes the inevitable and analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of the PAD response drop, leading to improved data normalization.
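    The normalization idea, fitting an analyte-specific one-phase exponential decay to the drifting detector response and dividing it out, can be sketched as below with synthetic data (not the authors' calibration values):

    ```python
    # Fit a one-phase exponential decay to a drifting response and normalize by it.
    import numpy as np
    from scipy.optimize import curve_fit

    def one_phase_decay(t, plateau, span, k):
        return plateau + span * np.exp(-k * t)

    t = np.arange(0, 48, 4.0)                                            # injection time (h)
    true = one_phase_decay(t, plateau=0.55, span=0.45, k=0.08)
    resp = true * (1 + np.random.default_rng(3).normal(0, 0.02, t.size)) # noisy relative response

    popt, _ = curve_fit(one_phase_decay, t, resp, p0=(0.5, 0.5, 0.05))
    drift = one_phase_decay(t, *popt)

    normalized = resp / drift                                            # drift-corrected response
    print("fitted (plateau, span, k):", np.round(popt, 3))
    print("residual spread after correction:", np.round(normalized.std(), 4))
    ```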

  19. What nursing students reveal about and learn from mentors when using stories of clinical practice.

    PubMed

    Edwards, Sharon

    2017-02-27

    Aim This article considers findings from a narrative research analysis that illustrate what nursing students can reveal about being mentored through their stories of clinical practice experience. The aim is to advocate the use of stories as tools to assist mentors in their roles, and to express to them students' concerns, sensitivities and priorities about clinical placement experiences. The findings are extracted from the author's unpublished doctoral thesis Learning from Practice: The Value of Story in Nurse Education (Edwards 2013). Method The data are drawn from nursing students' stories about clinical practice experiences when engaged in the care of patients, and their perceived learning from them. Results Findings suggest stories can help develop understanding of nursing students' concerns, sensitivities and priorities, and can support mentors' important roles in students' learning. Conclusion The article illustrates the value of stories as learning tools in the workplace and, by looking at nursing students' stories about clinical practice, shows that paying attention to their concerns, sensitivities and priorities can improve the already significant role played by mentors in student learning.

  20. Translational Research and Plasma Proteomic in Cancer.

    PubMed

    Santini, Annamaria Chiara; Giovane, Giancarlo; Auletta, Adelaide; Di Carlo, Angelina; Fiorelli, Alfonso; Cito, Letizia; Astarita, Carlo; Giordano, Antonio; Alfano, Roberto; Feola, Antonia; Di Domenico, Marina

    2016-04-01

    Proteomics is a recent field of research in molecular biology that can help in the fight against cancer through the search for biomarkers that can detect this disease in the early stages of its development. Proteomics is a rapidly growing technology, thanks in part to the development of ever more sensitive and fast mass spectrometry analysis. Although this technique is the most widespread for the discovery of new cancer biomarkers, it still suffers from poor sensitivity and insufficient reproducibility, essentially due to tumor heterogeneity. Common technical shortcomings include limitations in the sensitivity of detecting low-abundance biomarkers and possible systematic biases in the observed data. Current research is trying to develop high-resolution proteomic instrumentation for high-throughput monitoring of protein changes that occur in cancer. In this review, we describe the basic features of the proteomic tools which have proven to be useful in cancer research, showing their advantages and disadvantages. The application of these proteomic tools could provide early biomarker detection in various cancer types and could improve the understanding of the mechanisms of tumor growth and dissemination. © 2015 Wiley Periodicals, Inc.

  1. Quantitative evaluation of skeletal muscle defects in second harmonic generation images.

    PubMed

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.
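
A minimal sketch of the texture step described above: gray-level co-occurrence matrix correlation at increasing pixel offsets, followed by a Fourier transform of the correlation curve to locate the dominant band period. The synthetic image and the scoring at the end are placeholders, not the MARS implementation.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(1)
# Placeholder "SHG image": a periodic band pattern plus noise, 8-bit grayscale.
x = np.linspace(0, 20 * np.pi, 256)
img = (np.sin(x)[None, :] * 0.5 + 0.5) * 255 + rng.normal(0, 10, (256, 256))
img = np.clip(img, 0, 255).astype(np.uint8)

# GLCM correlation as a function of horizontal pixel offset.
offsets = np.arange(1, 40)
corr = np.array([graycoprops(graycomatrix(img, [int(d)], [0], levels=256),
                             'correlation')[0, 0] for d in offsets])

# The Fourier transform of the correlation curve locates the dominant band period,
# from which a regularity score can be derived.
spectrum = np.abs(np.fft.rfft(corr - corr.mean()))
print(spectrum.argmax())
```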

  2. Quantitative evaluation of skeletal muscle defects in second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.

  3. Genetically Encoded Catalytic Hairpin Assembly for Sensitive RNA Imaging in Live Cells.

    PubMed

    Mudiyanselage, Aruni P K K Karunanayake; Yu, Qikun; Leon-Duque, Mark A; Zhao, Bin; Wu, Rigumula; You, Mingxu

    2018-06-26

    DNA and RNA nanotechnology has been used for the development of dynamic molecular devices. In particular, programmable enzyme-free nucleic acid circuits, such as catalytic hairpin assembly, have been demonstrated as useful tools for bioanalysis and to scale up system complexity to an extent beyond current cellular genetic circuits. However, the intracellular functions of most synthetic nucleic acid circuits have been hindered by challenges in the biological delivery and degradation. On the other hand, genetically encoded and transcribed RNA circuits emerge as alternative powerful tools for long-term embedded cellular analysis and regulation. Herein, we reported a genetically encoded RNA-based catalytic hairpin assembly circuit for sensitive RNA imaging inside living cells. The split version of Broccoli, a fluorogenic RNA aptamer, was used as the reporter. One target RNA can catalytically trigger the fluorescence from tens-to-hundreds of Broccoli. As a result, target RNAs can be sensitively detected. We have further engineered our circuit to allow easy programming to image various target RNA sequences. This design principle opens the arena for developing a large variety of genetically encoded RNA circuits for cellular applications.

  4. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, that consider carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporated a comprehensive R package—Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the model cost function is most sensitive to the plant production-related parameters (e.g., PPDF1 and PRDX). Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R packages such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
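
Since the study used the R package FME, the following is only a language-agnostic sketch, in Python, of a local sensitivity analysis of a model cost function by central finite differences; the surrogate model, parameter names and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def run_model(params):
    # Toy surrogate for the ecosystem model: returns a simulated flux time series.
    t = np.arange(100.0)
    return params["PPDF1"] * np.sin(t / 10.0) + params["PRDX"]

observed = run_model({"PPDF1": 1.2, "PRDX": 0.5}) + rng.normal(0, 0.1, 100)

def cost(params):
    # Sum-of-squares mismatch between simulated and "observed" fluxes.
    return float(np.sum((run_model(params) - observed) ** 2))

base = {"PPDF1": 1.0, "PRDX": 0.4}
for name in base:
    h = 0.01 * abs(base[name])
    up, dn = dict(base), dict(base)
    up[name] += h
    dn[name] -= h
    dJ_dp = (cost(up) - cost(dn)) / (2 * h)        # local derivative of the cost
    print(name, dJ_dp * base[name] / cost(base))   # scaled (relative) sensitivity
```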

  5. Application of stable isotope tools for evaluating natural and stimulated biodegradation of organic pollutants in field studies.

    PubMed

    Fischer, Anko; Manefield, Mike; Bombach, Petra

    2016-10-01

    Stable isotope tools are increasingly applied for in-depth evaluation of biodegradation of organic pollutants at contaminated field sites. They can be divided into three methods i) determination of changes in natural abundance of stable isotopes using compound-specific stable isotope analysis (CSIA), ii) detection of incorporation of stable-isotope label from a stable-isotope labelled target compound into degradation and/or mineralisation products and iii) determination of stable-isotope label incorporation into biomarkers using stable isotope probing (SIP). Stable isotope tools have been applied as key monitoring tools for multiple-line-of-evidence-approaches (MLEA) for sensitive evaluation of pollutant biodegradation. This review highlights the application of CSIA, SIP and MLEA including stable isotope tools for assessing natural and stimulated biodegradation of organic pollutants in field studies dealing with soil and groundwater contaminations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. The SURE reliability analysis program

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1986-01-01

    The SURE program is a new reliability tool for ultrareliable computer system architectures. The program is based on computational methods recently developed for the NASA Langley Research Center. These methods provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.

  7. Robust detection, isolation and accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.

    1986-01-01

    The objective is to extend the recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are to be used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and process inputs and outputs, are used to generate these innovations. Thresholds used to determine failure detection are computed based on bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed. It represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping of the Linear Quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies which used thresholds that were selected empirically. Comparison of these two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method compared to previous techniques.

  8. Diagnostic accuracy of contrast-enhanced ultrasound in assessing the therapeutic response to radio frequency ablation for liver tumors: systematic review and meta-analysis.

    PubMed

    Xuan, Min; Zhou, Fengsheng; Ding, Yan; Zhu, Qiaoying; Dong, Ji; Zhou, Hao; Cheng, Jun; Jiang, Xiao; Wu, Pengxi

    2018-04-01

    To review the diagnostic accuracy of contrast-enhanced ultrasound (CEUS) used to detect residual or recurrent liver tumors after radiofrequency ablation (RFA). Contrast-enhanced computed tomography and/or contrast-enhanced magnetic resonance imaging served as the gold standard of investigation. MEDLINE, EMBASE, and COCHRANE were systematically searched for all potentially eligible studies comparing CEUS with the reference standard following RFA. Risk of bias and applicability concerns were addressed by adopting the Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. Pooled point estimates for sensitivity, specificity, positive and negative likelihood ratios, and diagnostic odds ratios (DOR) with 95% CI were computed before plotting the sROC (summary receiver operating characteristic) curve. Meta-regression and subgroup analysis were used to identify the source of the heterogeneity that was detected. Publication bias was evaluated using Deeks' funnel plot asymmetry test. Ten eligible studies from 2001 to 2016, involving 1162 lesions, were included in the final analysis. The quality of the included studies assessed by the QUADAS-2 tool was considered reasonable. The pooled sensitivity and specificity of CEUS in detecting residual or recurrent liver tumors were 0.90 (95% CI 0.85-0.94) and 1.00 (95% CI 0.99-1.00), respectively. Overall DOR was 420.10 (95% CI 142.30-1240.20). The sources of heterogeneity could not be precisely identified by meta-regression or subgroup analysis. No evidence of publication bias was found. This study confirmed that CEUS exhibits high sensitivity and specificity in assessing therapeutic responses to RFA for liver tumors.
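
One of the pooled quantities above, the diagnostic odds ratio (DOR), can be illustrated for a single study's 2x2 counts as below; the counts are hypothetical, and pooling across studies would additionally require a random-effects or bivariate model.

```python
import math

def dor_with_ci(tp, fp, fn, tn, z=1.96):
    # Haldane correction of 0.5 avoids division by zero with empty cells.
    tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
    dor = (tp * tn) / (fp * fn)
    se_log = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)
    lower = math.exp(math.log(dor) - z * se_log)
    upper = math.exp(math.log(dor) + z * se_log)
    return dor, lower, upper

print(dor_with_ci(tp=45, fp=2, fn=5, tn=110))   # hypothetical single-study counts
```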

  9. Automated classification of focal breast lesions according to S-detect: validation and role as a clinical and teaching tool.

    PubMed

    Di Segni, Mattia; de Soccio, Valeria; Cantisani, Vito; Bonito, Giacomo; Rubini, Antonello; Di Segni, Gabriele; Lamorte, Sveva; Magri, Valentina; De Vito, Corrado; Migliara, Giuseppe; Bartolotta, Tommaso Vincenzo; Metere, Alessio; Giacomelli, Laura; de Felice, Carlo; D'Ambrosio, Ferdinando

    2018-06-01

    To assess the diagnostic performance and the potential as a teaching tool of S-detect in the assessment of focal breast lesions. 61 patients (age 21-84 years) with benign breast lesions in follow-up or candidates for pathological sampling, or with suspicious lesions that were candidates for biopsy, were enrolled. The study comprised a prospective and a retrospective phase. In the prospective phase, after completion of baseline US by an experienced breast radiologist and S-detect assessment, 5 operators with different experience and dedication to breast radiology performed elastographic exams. In the retrospective phase, the 5 operators performed a retrospective assessment and categorized lesions with the BI-RADS 2013 lexicon. Integration of S-detect with the in-training operators' evaluations was performed by giving priority to the S-detect analysis in case of disagreement. 2 × 2 contingency tables and ROC analysis were used to assess the diagnostic performances; inter-rater agreement was measured with Cohen's k; Bonferroni's test was used to compare performances. A significance threshold of p = 0.05 was adopted. All operators showed sensitivity > 90% and varying specificity (50-75%); S-detect showed sensitivity > 90% and 70.8% specificity, with inter-rater agreement ranging from moderate to good. Lower specificities were improved by the addition of S-detect. The addition of elastography did not lead to any improvement of the diagnostic performance. S-detect is a feasible tool for the characterization of breast lesions; it has potential as a teaching tool for less experienced operators.

  10. Development of the LaComm 1.0, A French medical communication analysis software: A study assessing its sensitivity to change.

    PubMed

    Gibon, Anne-Sophie; Durieux, Jean-François; Merckaert, Isabelle; Delvaux, Nicole; Farvacques, Christine; Libert, Yves; Marchal, Serge; Moucheux, Angélique; Slachmuylder, Jean-Louis; Razavi, Darius

    2017-02-01

    To test the sensitivity to change of a communication analysis software package, the LaComm 1.0, and compare it with that of the CRCWEM, using data from a randomized study assessing the efficacy of a communication skills training program designed for nurses. The program assessment included the recording of two-person simulated interviews at baseline and after training or 3 months later. Interview transcripts were analyzed using the CRCWEM and the LaComm 1.0 tools. One hundred and nine oncology nurses (mainly graduated or certified) were included in the study. The CRCWEM detected 5 changes out of 13 expected changes (38%) (e.g., more open directive questions after training) and the LaComm 1.0, 4 changes out of 7 expected changes (57%) (e.g., more empathic statements after training). For open directive questions, the effect sizes of the group-by-time changes were slightly different between tools (CRCWEM: Cohen's d=0.97; LaComm 1.0: Cohen's d=0.67). This study shows that the LaComm 1.0 is sensitive to change. The LaComm 1.0 is a valid method to assess training effectiveness in French. The use of the LaComm 1.0 in future French communication skills training programs will allow comparisons of studies. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Downregulation of serum metabolite GTA-446 as a novel potential marker for early detection of colorectal cancer.

    PubMed

    Hata, Tsuyoshi; Takemasa, Ichiro; Takahashi, Hidekazu; Haraguchi, Naotsugu; Nishimura, Junichi; Hata, Taishi; Mizushima, Tsunekazu; Doki, Yuichiro; Mori, Masaki

    2017-07-11

    We previously reported that GTA-446 may be a useful biomarker for early detection of colorectal cancer. In the present study, we confirmed the clinical feasibility of GTA-446 as a screening tool for colorectal cancer with a novel measurement system developed for clinical use. We also improved sensitivity by analysing GTA-446 levels according to gender. Serum samples were collected from 225 colorectal cancer patients and 916 healthy volunteers to measure GTA-446 levels by flow injection analysis-mass spectrometry. GTA-446 levels were downregulated in colorectal cancer patients compared with the healthy volunteers, and in females compared with the males in both groups. Receiver operating characteristic curve analysis revealed an optimal cut-off of 2.72 μmol l⁻¹ in males and 1.87 μmol l⁻¹ in females, with a large area under the curve of 0.89-0.93. The sensitivity and specificity were 90.4% and 84.9% for males, 85.2% and 80.5% for females, and 83.3% and 84.8% for all subjects, respectively. GTA-446 is a clinically relevant biomarker for colorectal cancer with high sensitivity when analysed by gender. Thus, GTA-446 is a promising tool for primary colorectal cancer screening to identify populations at a higher risk of colorectal cancer, with an emphasis on early detection.
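
A sketch of the ROC-based cut-off selection described above, using the Youden index on simulated marker values (the marker is down-regulated in cases, so the score is negated); none of the numbers correspond to GTA-446 data.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(3)
cases = rng.normal(1.8, 0.6, 200)      # simulated marker levels, lower in cases
controls = rng.normal(2.9, 0.7, 800)
values = np.concatenate([cases, controls])
labels = np.concatenate([np.ones(200), np.zeros(800)])

# The marker is down-regulated in cases, so score by the negated value.
fpr, tpr, thresholds = roc_curve(labels, -values)
youden = tpr - fpr
best = int(np.argmax(youden))
print("AUC:", auc(fpr, tpr), "optimal cut-off:", -thresholds[best])
```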

  12. Ultra-sensitive flow measurement in individual nanopores through pressure--driven particle translocation.

    PubMed

    Gadaleta, Alessandro; Biance, Anne-Laure; Siria, Alessandro; Bocquet, Lyderic

    2015-05-07

    A challenge for the development of nanofluidics is to develop new instrumentation tools, able to probe the extremely small mass transport across individual nanochannels. Such tools are a prerequisite for the fundamental exploration of the breakdown of continuum transport in nanometric confinement. In this letter, we propose a novel method for the measurement of the hydrodynamic permeability of nanometric pores, by diverting the classical technique of Coulter counting to characterize a pressure-driven flow across an individual nanopore. Both the analysis of the translocation rate and the detailed statistics of the dwell time of nanoparticles flowing across a single nanopore allow us to evaluate the permeability of the system. We reach a sensitivity for the water flow down to a few femtoliters per second, which is more than two orders of magnitude better than state-of-the-art alternative methods.

  13. Thermal sensors to control polymer forming. Challenge and solutions

    NASA Astrophysics Data System (ADS)

    Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.

    2017-10-01

    Thermal sensors have been used for many years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions…), the thermal measurement is often performed in the tool or close to its surface. Thus, it only gives partial and disturbed information. Having reliable information about the heat flux exchanges between the tool and the material during the process would be very helpful to improve the control of the process and to favor the development of new materials. In this work, we present several sensors developed in labs to study the molding steps in forming processes. The analysis of the obtained thermal measurements (temperature, heat flux) shows the sensitivity threshold that thermal sensors require in order to detect the rate of the thermal reaction on-line. Based on these data, we present new sensor designs which have been patented.

  14. Application of the SCALE TSUNAMI Tools for the Validation of Criticality Safety Calculations Involving 233U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T; Hollenbach, Daniel F

    2009-02-01

    The Radiochemical Development Facility at Oak Ridge National Laboratory has been storing solid materials containing ²³³U for decades. Preparations are under way to process these materials into a form that is inherently safe from a nuclear criticality safety perspective. This will be accomplished by down-blending the ²³³U materials with depleted or natural uranium. At the request of the U.S. Department of Energy, a study has been performed using the SCALE sensitivity and uncertainty analysis tools to demonstrate how these tools could be used to validate nuclear criticality safety calculations of selected process and storage configurations. ISOTEK nuclear criticality safety staff provided four models that are representative of the criticality safety calculations for which validation will be needed. The SCALE TSUNAMI-1D and TSUNAMI-3D sequences were used to generate energy-dependent k-eff sensitivity profiles for each nuclide and reaction present in the four safety analysis models, also referred to as the applications, and in a large set of critical experiments. The SCALE TSUNAMI-IP module was used together with the sensitivity profiles and the cross-section uncertainty data contained in the SCALE covariance data files to propagate the cross-section uncertainties (Δσ/σ) to k-eff uncertainties (Δk/k) for each application model. The SCALE TSUNAMI-IP module was also used to evaluate the similarity of each of the 672 critical experiments with each application. Results of the uncertainty analysis and similarity assessment are presented in this report. A total of 142 experiments were judged to be similar to application 1, and 68 experiments were judged to be similar to application 2. None of the 672 experiments were judged to be adequately similar to applications 3 and 4. Discussion of the uncertainty analysis and similarity assessment is provided for each of the four applications. Example upper subcritical limits (USLs) were generated for application 1 based on trending of the energy of average lethargy of neutrons causing fission, trending of the TSUNAMI similarity parameters, and use of data adjustment techniques.
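
The uncertainty propagation step described above can be illustrated with the usual sandwich rule, in which the relative variance of k-eff is SᵀCS for a sensitivity vector S and a relative cross-section covariance matrix C; the small S and C below are hypothetical placeholders, not SCALE covariance data.

```python
import numpy as np

# Hypothetical sensitivity vector: dk/k per dσ/σ for three nuclide-reaction pairs.
S = np.array([0.35, -0.12, 0.08])
# Hypothetical relative covariance matrix of the corresponding cross sections.
C = np.array([[4.0e-4, 1.0e-5, 0.0],
              [1.0e-5, 9.0e-4, 2.0e-5],
              [0.0,    2.0e-5, 2.5e-4]])

rel_var = S @ C @ S          # (Δk/k)^2 from cross-section uncertainty
print("Δk/k:", np.sqrt(rel_var))
```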

  15. Cryptic and Asymptomatic Opisthorchis felineus Infections

    PubMed Central

    Armignacco, Orlando; Ferri, Fabrizio; Gomez-Morales, Maria Angeles; Caterini, Luciano; Pozio, Edoardo

    2013-01-01

    We describe the diagnostic difficulties experienced during an opisthorchiasis outbreak. Of 31 infected individuals, 61.3% were asymptomatic, and in the 12 symptomatic individuals, the duration of non-pathognomonic symptoms was shorter than 4 weeks. Serology by enzyme-linked immunosorbent assay and polymerase chain reaction fecal analysis were shown to be the most sensitive diagnostic tools. PMID:23249682

  16. Onboard Acoustic Data-Processing for the Statistical Analysis of Array Beam-Noise,

    DTIC Science & Technology

    1980-12-15

    performance of the sonar system as a measurement tool and others that can assess the character of the ambient- noise field at the time of the measurement. In...the plot as would "dead" hydrophones. A reduction in sensitivity of a hydrophone, a faulty preamplifier , or any other fault in the acoustic channel

  17. Compilation of Abstracts of Theses Submitted by Candidates for Degrees: October 1990 to September 1991

    DTIC Science & Technology

    1991-09-30

    Tool (ASSET) COMPUTER SCIENCE Vicki Sue Abel VIEWER - A User Interface for Failure 49 Lieutenant Commander, U.S. Navy Region Analysis and Medio Monti...California Current System using a Primitive Equation Model Charles C. McGlothin, Jr. Ambient Sound in the Ocean Induced by 257 Lieutenant, U.S. Navy Heavy...parameters,, and ambient flow/oscillating flow combinations using VAX-3520 and NASA’s Supercomputers. Extensive sensitivity analysis has been performed

  18. Validity, sensitivity and specificity of the mentation, behavior and mood subscale of the UPDRS.

    PubMed

    Holroyd, Suzanne; Currie, Lillian J; Wooten, G Frederick

    2008-06-01

    The unified Parkinson's disease rating scale (UPDRS) is the most widely used tool to rate the severity and the stage of Parkinson's disease (PD). However, the mentation, behavior and mood (MBM) subscale of the UPDRS has received little investigation regarding its validity and sensitivity. Three items of this subscale were compared to criterion tests to examine validity, sensitivity and specificity. Ninety-seven patients with idiopathic PD were assessed on the UPDRS. Scores on three items of the MBM subscale, intellectual impairment, thought disorder and depression, were compared to criterion tests, the telephone interview for cognition status (TICS), psychiatric assessment for psychosis and the geriatric depression scale (GDS). Non-parametric tests of association were performed to examine concurrent validity of the MBM items. The sensitivities, specificities and optimal cutoff scores for each MBM item were estimated by receiver operating characteristic (ROC) curve analysis. The MBM items demonstrated low to moderate correlation with the criterion tests, and the sensitivity and specificity were not strong. Even using a score of 7.0 on the items of the MBM demonstrated a sensitivity/specificity of only 0.19/0.48 for intellectual impairment, 0.60/0.72 for thought disorder and 0.61/0.87 for depression. Using a more appropriate cutoff of 2.0 revealed sensitivities of 0.01, 0.38 and 0.13 respectively. The MBM subscale items of intellectual impairment, thought disorder and depression are not appropriate for screening or diagnostic purposes. Tools such as the TICS and the GDS should be considered instead.

  19. Optical coherence tomography for the diagnosis of malignant skin tumors: a meta-analysis

    NASA Astrophysics Data System (ADS)

    Xiong, Yi-Quan; Mo, Yun; Wen, Yu-Qi; Cheng, Ming-Ji; Huo, Shu-Ting; Chen, Xue-Jiao; Chen, Qing

    2018-02-01

    Optical coherence tomography (OCT) is an emergent imaging tool used for noninvasive diagnosis of skin diseases. The present meta-analysis was carried out to assess the accuracy of OCT for the diagnosis of skin cancer. We conducted a systematic literature search through EMBASE, Medline, PubMed, the Cochrane Library, and Web of Science database for relevant articles published up to June 6, 2017. The quality of the included studies was assessed using the QUADAS-2 tool and the Oxford Levels of Evidence Scale. Statistical analyses were conducted using the software Meta-Disc version 1.4 and STATA version 12.0. A total of 14 studies involving more than 813 patients with a total of 1958 lesions were included in our analyses. The pooled sensitivity and specificity of OCT for skin cancer diagnoses were 91.8% and 86.7%, respectively. Subgroup analysis showed that the pooled sensitivities of OCT for detecting basal cell carcinoma (BCC), squamous cell carcinoma (SCC), actinic keratosis, and malignant melanoma were 92.4%, 92.3%, 73.8%, and 81.0%, respectively. The pooled specificities were 86.9%, 99.5%, 91.5%, and 93.8%, respectively. OCT appears to be useful for the detection of BCC and SCC. It is a valuable diagnostic method when screening for early skin cancers.

  20. The Beck Depression Inventory (BDI-II) and a single screening question as screening tools for depressive disorder in Dutch advanced cancer patients.

    PubMed

    Warmenhoven, Franca; van Rijswijk, Eric; Engels, Yvonne; Kan, Cornelis; Prins, Judith; van Weel, Chris; Vissers, Kris

    2012-02-01

    Depression is highly prevalent in advanced cancer patients, but the diagnosis of depressive disorder in patients with advanced cancer is difficult. Screening instruments could facilitate diagnosing depressive disorder in patients with advanced cancer. The aim of this study was to determine the validity of the Beck Depression Inventory (BDI-II) and a single screening question as screening tools for depressive disorder in advanced cancer patients. Patients with advanced metastatic disease, visiting the outpatient palliative care department, were asked to fill out a self-questionnaire containing the Beck Depression Inventory (BDI-II) and a single screening question "Are you feeling depressed?" The mood section of the PRIME-MD was used as a gold standard. Sixty-one patients with advanced metastatic disease were eligible to be included in the study. Complete data were obtained from 46 patients. The area under the curve of the receiver operating characteristics analysis of the BDI-II was 0.82. The optimal cut-off point of the BDI-II was 16 with a sensitivity of 90% and a specificity of 69%. The single screening question showed a sensitivity of 50% and a specificity of 94%. The BDI-II seems an adequate screening tool for a depressive disorder in advanced cancer patients. The sensitivity of a single screening question is poor.

  1. BBMerge – Accurate paired shotgun read merging via overlap

    DOE PAGES

    Bushnell, Brian; Rood, Jonathan; Singer, Esther

    2017-10-26

    Merging paired-end shotgun reads generated on high-throughput sequencing platforms can substantially improve various subsequent bioinformatics processes, including genome assembly, binning, mapping, annotation, and clustering for taxonomic analysis. With the inexorable growth of sequence data volume and CPU core counts, the speed and scalability of read-processing tools becomes ever-more important. The accuracy of shotgun read merging is crucial as well, as errors introduced by incorrect merging percolate through to reduce the quality of downstream analysis. Thus, we designed a new tool to maximize accuracy and minimize processing time, allowing the use of read merging on larger datasets, and in analyses highly sensitive to errors. We present BBMerge, a new merging tool for paired-end shotgun sequence data. We benchmark BBMerge by comparison with eight other widely used merging tools, assessing speed, accuracy and scalability. Evaluations of both synthetic and real-world datasets demonstrate that BBMerge produces merged shotgun reads with greater accuracy and at higher speed than any existing merging tool examined. BBMerge also provides the ability to merge non-overlapping shotgun read pairs by using k-mer frequency information to assemble the unsequenced gap between reads, achieving a significantly higher merge rate while maintaining or increasing accuracy.

  2. BBMerge – Accurate paired shotgun read merging via overlap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bushnell, Brian; Rood, Jonathan; Singer, Esther

    Merging paired-end shotgun reads generated on high-throughput sequencing platforms can substantially improve various subsequent bioinformatics processes, including genome assembly, binning, mapping, annotation, and clustering for taxonomic analysis. With the inexorable growth of sequence data volume and CPU core counts, the speed and scalability of read-processing tools becomes ever-more important. The accuracy of shotgun read merging is crucial as well, as errors introduced by incorrect merging percolate through to reduce the quality of downstream analysis. Thus, we designed a new tool to maximize accuracy and minimize processing time, allowing the use of read merging on larger datasets, and in analyses highly sensitive to errors. We present BBMerge, a new merging tool for paired-end shotgun sequence data. We benchmark BBMerge by comparison with eight other widely used merging tools, assessing speed, accuracy and scalability. Evaluations of both synthetic and real-world datasets demonstrate that BBMerge produces merged shotgun reads with greater accuracy and at higher speed than any existing merging tool examined. BBMerge also provides the ability to merge non-overlapping shotgun read pairs by using k-mer frequency information to assemble the unsequenced gap between reads, achieving a significantly higher merge rate while maintaining or increasing accuracy.

  3. A global sensitivity analysis approach for morphogenesis models.

    PubMed

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
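
A minimal sketch of a Sobol-type global sensitivity workflow of the kind described above, using the SALib package on a toy scalar output; the real study analyzed image-derived outputs of a cellular Potts model, which this does not reproduce, and the parameter names are invented.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["adhesion", "chemotaxis", "elongation"],   # invented parameter names
    "bounds": [[0.0, 1.0]] * 3,
}

X = saltelli.sample(problem, 1024)
# Toy scalar model standing in for the morphogenesis simulation output measure.
Y = X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 0] * X[:, 2]

Si = sobol.analyze(problem, Y)
print(Si["S1"], Si["ST"])   # first-order and total-order Sobol indices
```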

  4. Development of a Portable Sensitive Equipment Decontamination System. Volume 2: Activated Carbon Fiber Wipe

    DTIC Science & Technology

    2010-05-01

    absorption. Thermogravimetric Analysis (TGA) was employed to measure absorption of HD and GD into the nylon fabric. TGA is an analytical tool useful in...

  5. Goal-oriented sensitivity analysis for lattice kinetic Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arampatzis, Georgios, E-mail: garab@math.uoc.gr; Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003; Katsoulakis, Markos A., E-mail: markos@math.umass.edu

    2014-03-28

    In this paper we propose a new class of coupling methods for the sensitivity analysis of high dimensional stochastic systems and in particular for lattice Kinetic Monte Carlo (KMC). Sensitivity analysis for stochastic systems is typically based on approximating continuous derivatives with respect to model parameters by the mean value of samples from a finite difference scheme. Instead of using independent samples the proposed algorithm reduces the variance of the estimator by developing a strongly correlated ("coupled") stochastic process for both the perturbed and unperturbed stochastic processes, defined in a common state space. The novelty of our construction is that the new coupled process depends on the targeted observables, e.g., coverage, Hamiltonian, spatial correlations, surface roughness, etc., hence we refer to the proposed method as goal-oriented sensitivity analysis. In particular, the rates of the coupled Continuous Time Markov Chain are obtained as solutions to a goal-oriented optimization problem, depending on the observable of interest, by considering the minimization functional of the corresponding variance. We show that this functional can be used as a diagnostic tool for the design and evaluation of different classes of couplings. Furthermore, the resulting KMC sensitivity algorithm has an easy implementation that is based on the Bortz–Kalos–Lebowitz algorithm's philosophy, where events are divided in classes depending on level sets of the observable of interest. Finally, we demonstrate in several examples including adsorption, desorption, and diffusion Kinetic Monte Carlo that for the same confidence interval and observable, the proposed goal-oriented algorithm can be two orders of magnitude faster than existing coupling algorithms for spatial KMC such as the Common Random Number approach. We also provide a complete implementation of the proposed sensitivity analysis algorithms, including various spatial KMC examples, in a supplementary MATLAB source code.

  6. RIEMS: a software pipeline for sensitive and comprehensive taxonomic classification of reads from metagenomics datasets.

    PubMed

    Scheuch, Matthias; Höper, Dirk; Beer, Martin

    2015-03-03

    Fuelled by the advent and subsequent development of next generation sequencing technologies, metagenomics became a powerful tool for the analysis of microbial communities both scientifically and diagnostically. The biggest challenge is the extraction of relevant information from the huge sequence datasets generated for metagenomics studies. Although a plethora of tools are available, data analysis is still a bottleneck. To overcome the bottleneck of data analysis, we developed an automated computational workflow called RIEMS - Reliable Information Extraction from Metagenomic Sequence datasets. RIEMS assigns every individual read sequence within a dataset taxonomically by cascading different sequence analyses with decreasing stringency of the assignments using various software applications. After completion of the analyses, the results are summarised in a clearly structured result protocol organised taxonomically. The high accuracy and performance of RIEMS analyses were proven in comparison with other tools for metagenomics data analysis using simulated sequencing read datasets. RIEMS has the potential to fill the gap that still exists with regard to data analysis for metagenomics studies. The usefulness and power of RIEMS for the analysis of genuine sequencing datasets was demonstrated with an early version of RIEMS in 2011 when it was used to detect the orthobunyavirus sequences leading to the discovery of Schmallenberg virus.

  7. Parameter identification and optimization of slide guide joint of CNC machine tools

    NASA Astrophysics Data System (ADS)

    Zhou, S.; Sun, B. B.

    2017-11-01

    The joint surface has an important influence on the performance of CNC machine tools. In order to identify the dynamic parameters of the slide guide joint, a parametric finite element model of the joint is established and an optimum design method is used based on finite element simulation and modal testing. The mode that has the most influence on the dynamics of the slip joint is then found through harmonic response analysis. Taking the frequency of this mode as the objective, a sensitivity analysis of the stiffness of each joint surface is carried out using Latin Hypercube Sampling and Monte Carlo simulation. The result shows that the vertical stiffness of the slip joint surface constituted by the bed and the slide plate has the most obvious influence on the structure. Therefore, this stiffness is taken as the optimization variable and the optimal value is obtained through studying the relationship between structural dynamic performance and stiffness. With the stiffness values before and after optimization applied to the FEM of the machine tool, it is found that the dynamic performance of the machine tool is improved.
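
A hedged sketch of a Latin Hypercube screening of joint stiffnesses against a natural frequency, in the spirit of the analysis above; the surrogate frequency function, stiffness ranges and parameter names are assumptions, not the paper's finite element model.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

def first_mode_freq(k):
    # Toy surrogate: the frequency rises with stiffness, dominated by the first term.
    return 50.0 + 8.0 * np.log(k[:, 0]) + 1.5 * np.log(k[:, 1]) + 0.5 * np.log(k[:, 2])

sampler = qmc.LatinHypercube(d=3, seed=4)
unit = sampler.random(n=500)
# Assumed stiffness ranges (N/m) for three joint surfaces.
stiffness = qmc.scale(unit, [1e8, 1e8, 1e8], [1e10, 1e10, 1e10])

freq = first_mode_freq(stiffness)
# Rank correlation of each stiffness with the frequency as a sensitivity measure.
for i, name in enumerate(["vertical", "lateral", "roll"]):
    print(name, spearmanr(stiffness[:, i], freq).correlation)
```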

  8. Development and Validation of the Controller Acceptance Rating Scale (CARS): Results of Empirical Research

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Kerns, Karol; Bone, Randall

    2001-01-01

    The measurement of operational acceptability is important for the development, implementation, and evolution of air traffic management decision support tools. The Controller Acceptance Rating Scale was developed at NASA Ames Research Center for the development and evaluation of the Passive Final Approach Spacing Tool. CARS was modeled after a well-known pilot evaluation rating instrument, the Cooper-Harper Scale, and has since been used in the evaluation of the User Request Evaluation Tool, developed by MITRE's Center for Advanced Aviation System Development. In this paper, we provide a discussion of the development of CARS and an analysis of the empirical data collected with CARS to examine construct validity. Results of intraclass correlations indicated statistically significant reliability for the CARS. From the subjective workload data that were collected in conjunction with the CARS, it appears that the expected set of workload attributes was correlated with the CARS. As expected, the analysis also showed that CARS was a sensitive indicator of the impact of decision support tools on controller operations. Suggestions for future CARS development and its improvement are also provided.

  9. Combining magnetic nanoparticle with biotinylated nanobodies for rapid and sensitive detection of influenza H3N2

    PubMed Central

    2014-01-01

    Our objective is to develop a rapid and sensitive assay based on magnetic beads to detect the concentration of influenza H3N2. The possibility of using variable domain heavy-chain antibodies (nanobody) as diagnostic tools for influenza H3N2 was investigated. A healthy camel was immunized with inactivated influenza H3N2. A nanobody library of 8 × 10⁸ clones was constructed and phage displayed. After three successive biopanning steps, H3N2-specific nanobodies were successfully isolated, expressed in Escherichia coli, and purified. Sequence analysis of the nanobodies revealed that we possessed four classes of nanobodies against H3N2. Two nanobodies were further used to prepare our rapid diagnostic kit. Biotinylated nanobody was effectively immobilized onto the surface of streptavidin magnetic beads. The modified magnetic beads with nanobody specifically capture influenza H3N2 and can still be recognized by horseradish peroxidase (HRP)-conjugated nanobodies. Under optimized conditions, the present immunoassay exhibited relatively sensitive detection with a limit of 50 ng/mL. In conclusion, by combining magnetic beads with specific nanobodies, this assay provides a promising influenza detection assay to develop a potential rapid, sensitive, and low-cost diagnostic tool to screen for influenza infections. PMID:25328501

  10. Combining magnetic nanoparticle with biotinylated nanobodies for rapid and sensitive detection of influenza H3N2

    NASA Astrophysics Data System (ADS)

    Zhu, Min; Hu, Yonghong; Li, Guirong; Ou, Weijun; Mao, Panyong; Xin, Shaojie; Wan, Yakun

    2014-09-01

    Our objective is to develop a rapid and sensitive assay based on magnetic beads to detect the concentration of influenza H3N2. The possibility of using variable domain heavy-chain antibodies (nanobody) as diagnostic tools for influenza H3N2 was investigated. A healthy camel was immunized with inactivated influenza H3N2. A nanobody library of 8 × 10⁸ clones was constructed and phage displayed. After three successive biopanning steps, H3N2-specific nanobodies were successfully isolated, expressed in Escherichia coli, and purified. Sequence analysis of the nanobodies revealed that we possessed four classes of nanobodies against H3N2. Two nanobodies were further used to prepare our rapid diagnostic kit. Biotinylated nanobody was effectively immobilized onto the surface of streptavidin magnetic beads. The modified magnetic beads with nanobody specifically capture influenza H3N2 and can still be recognized by horseradish peroxidase (HRP)-conjugated nanobodies. Under optimized conditions, the present immunoassay exhibited relatively sensitive detection with a limit of 50 ng/mL. In conclusion, by combining magnetic beads with specific nanobodies, this assay provides a promising influenza detection assay to develop a potential rapid, sensitive, and low-cost diagnostic tool to screen for influenza infections.

  11. Sensitivity of Podosphaera xanthii populations to anti-powdery-mildew fungicides in Spain.

    PubMed

    Bellón-Gómez, Davinia; Vela-Corcía, David; Pérez-García, Alejandro; Torés, Juan A

    2015-10-01

    Cucurbit powdery mildew caused by Podosphaera xanthii limits crop production in Spain, where disease control is largely dependent on fungicides. In previous studies, high levels of resistance to QoI and DMI fungicides were documented in south-central Spain. The aim of this study was to investigate the sensitivity of P. xanthii populations to other fungicides and to provide tools for improved disease management. Using a leaf-disc assay, sensitivity to thiophanate-methyl, bupirimate and quinoxyfen of 50 isolates of P. xanthii was analysed to determine discriminatory concentrations between sensitive and resistant isolates. With the exception of thiophanate-methyl, no clearly different groups of isolates could be identified, and as a result, discriminatory concentrations were established on the basis of the maximum fungicide field application rate. Subsequently, a survey of P. xanthii resistance to these fungicides was carried out by testing a collection of 237 isolates obtained during the 2002-2011 cucurbit growing seasons. This analysis revealed very high levels of resistance to thiophanate-methyl (95%). By contrast, no resistance to bupirimate and quinoxyfen was found. Results suggest that thiophanate-methyl has become completely ineffective for controlling cucurbit powdery mildew in Spain. By contrast, bupirimate and quinoxyfen remain as very effective tools for cucurbit powdery mildew management. © 2014 Society of Chemical Industry.

  12. Major incident triage: Derivation and comparative analysis of the Modified Physiological Triage Tool (MPTT).

    PubMed

    Vassallo, James; Beavis, John; Smith, Jason E; Wallis, Lee A

    2017-05-01

    Triage is a key principle in the effective management of a major incident. There are at least three different triage systems in use worldwide and previous attempts to validate them have revealed limited sensitivity. Within a civilian adult population, there has been no work to develop an improved system. A retrospective database review of the UK Joint Theatre Trauma Registry was performed for all adult patients (>18 years) presenting to a deployed Military Treatment Facility between 2006 and 2013. Patients were defined as Priority One if they had received one or more life-saving interventions from a previously defined list. Using first recorded hospital physiological data (HR/RR/GCS), binary logistic regression models were used to derive optimum physiological ranges to predict need for life-saving intervention. This allowed for the derivation of the Modified Physiological Triage Tool-MPTT (GCS≥14, HR≥100, 12
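
The modelling step described above, a binary logistic regression of priority-one status on first recorded HR, RR and GCS, can be sketched as follows; the data are simulated, not the Joint Theatre Trauma Registry, and the coefficients are arbitrary.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 1000
hr = rng.normal(95, 25, n)                  # heart rate
rr = rng.normal(20, 6, n)                   # respiratory rate
gcs = rng.integers(3, 16, n).astype(float)  # Glasgow Coma Scale

# Simulated "priority one" outcome with an arbitrary underlying relationship.
logit = 0.03 * (hr - 90) + 0.08 * (rr - 18) - 0.4 * (gcs - 14)
priority_one = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([hr, rr, gcs])
model = LogisticRegression(max_iter=1000).fit(X, priority_one)
print(model.coef_, model.intercept_)
# Physiological thresholds can then be read off where the predicted probability
# of needing a life-saving intervention rises sharply.
```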

  13. High-fidelity modeling and impact footprint prediction for vehicle breakup analysis

    NASA Astrophysics Data System (ADS)

    Ling, Lisa

    For decades, vehicle breakup analysis had been performed for space missions that used nuclear heater or power units in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative to assess possible environmental impacts, obtain launch approval, and for launch contingency planning. In order to accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify the additional dynamics modeling and capabilities for the analysis tool with the objectives to (1) have the capability to predict impact point and footprint, (2) increase the fidelity in the prediction of vehicle breakup, and (3) reduce the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of impact footprint. The functions to increase the fidelity in the prediction of vehicle breakup included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrary-shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.

  14. Indel analysis by droplet digital PCR: a sensitive method for DNA mixture detection and chimerism analysis.

    PubMed

    Santurtún, Ana; Riancho, José A; Arozamena, Jana; López-Duarte, Mónica; Zarrabeitia, María T

    2017-01-01

    Several methods have been developed to determine genetic profiles from mixed samples and for chimerism analysis in transplanted patients. The aim of this study was to explore the effectiveness of using droplet digital PCR (ddPCR) for mixed chimerism detection (a mixture of genetic profiles resulting after allogeneic hematopoietic stem cell transplantation (HSCT)). We analyzed 25 DNA samples from patients who had undergone HSCT and compared the performance of ddPCR and two established methods for chimerism detection, based upon Indel and STR analysis, respectively. Additionally, eight artificial mixture DNA samples were created to evaluate the sensitivity of ddPCR. Our results show that the chimerism percentages estimated by the analysis of a single Indel using ddPCR were very similar to those calculated by the amplification of 15 STRs (r² = 0.970) and to the results obtained by the amplification of 38 Indels (r² = 0.975). Moreover, the amplification of a single Indel by ddPCR was sensitive enough to detect a minor DNA contributor comprising down to 0.5% of the sample. We conclude that ddPCR can be a powerful tool for the determination of a genetic profile of forensic mixtures and clinical chimerism analysis when traditional techniques are not sensitive enough.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Marshall, William BJ J

    In the course of criticality code validation, outlier cases are frequently encountered. Historically, the causes of these unexpected results could be diagnosed only through comparison with other similar cases or through the known presence of a unique component of the critical experiment. The sensitivity and uncertainty (S/U) analysis tools available in the SCALE 6.1 code system provide a much broader range of options to examine underlying causes of outlier cases. This paper presents some case studies performed as a part of the recent validation of the KENO codes in SCALE 6.1 using S/U tools to examine potential causes of biases.

  16. High Sensitivity Combined with Extended Structural Coverage of Labile Compounds via Nanoelectrospray Ionization at Subambient Pressures

    DOE PAGES

    Cox, Jonathan T.; Kronewitter, Scott R.; Shukla, Anil K.; ...

    2014-09-15

    Subambient pressure ionization with nanoelectrospray (SPIN) has proven to be effective in producing ions with high efficiency and transmitting them to low pressures for high sensitivity mass spectrometry (MS) analysis. Here we present evidence that not only does the SPIN source improve MS sensitivity but also allows for gentler ionization conditions. The gentleness of a conventional heated capillary electrospray ionization (ESI) source and the SPIN source was compared by the liquid chromatography mass spectrometry (LC-MS) analysis of colominic acid. Colominic acid is a mixture of sialic acid polymers of different lengths containing labile glycosidic linkages between monomer units necessitating a gentle ion source. By coupling the SPIN source with high resolution mass spectrometry and using advanced data processing tools, we demonstrate much extended coverage of sialic acid polymer chains as compared to using the conventional ESI source. Additionally we show that SPIN-LC-MS is effective in elucidating polymer features with high efficiency and high sensitivity previously unattainable by the conventional ESI-LC-MS methods.

  17. Feasibility study of basic characterization of MAGAT polymer gel using CBCT attached in linear accelerator: Preliminary study

    NASA Astrophysics Data System (ADS)

    Sathiyaraj, P.; Samuel, E. James jebaseelan

    2018-01-01

    The aim of this study is to evaluate the methacrylic acid, gelatin and tetrakis (hydroxymethyl) phosphonium chloride gel (MAGAT) using cone beam computed tomography (CBCT) attached to a modern linear accelerator. To compare the results of standard diagnostic computed tomography (CT) with CBCT, different parameters such as linearity, sensitivity and temporal stability were checked. The MAGAT gel showed good linearity for both diagnostic CT and CBCT measurements. Sensitivity and temporal stability were also comparable with diagnostic CT measurements. In both modalities, the sensitivity of the MAGAT gel increased up to day 4 and then decreased until day 10 post-irradiation. Since all measurements (linearity, sensitivity and temporal stability) from diagnostic CT and CBCT were comparable, CBCT could be a potential tool for dose analysis studies with polymer gel dosimeters.
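    As a minimal sketch of how linearity and sensitivity are typically summarized for a gel dosimeter read out on two scanners, the snippet below fits a straight line to hypothetical dose-response readings and reports the slope (sensitivity) and R²; the numbers are placeholders, not the study's measurements.

```python
import numpy as np

dose_gy = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
delta_ct = np.array([0.0, 1.9, 4.1, 6.2, 8.0, 10.1])    # hypothetical diagnostic CT response (HU)
delta_cbct = np.array([0.0, 1.7, 3.6, 5.5, 7.4, 9.2])   # hypothetical CBCT response (HU)

for name, response in (("CT", delta_ct), ("CBCT", delta_cbct)):
    slope, intercept = np.polyfit(dose_gy, response, 1)   # sensitivity = slope of dose response
    r2 = np.corrcoef(dose_gy, response)[0, 1] ** 2
    print(f"{name}: sensitivity = {slope:.2f} HU/Gy, linearity R^2 = {r2:.4f}")
```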

  18. Design and characterization of planar capacitive imaging probe based on the measurement sensitivity distribution

    NASA Astrophysics Data System (ADS)

    Yin, X.; Chen, G.; Li, W.; Hutchins, D. A.

    2013-01-01

    Previous work has indicated that the capacitive imaging (CI) technique is a useful NDE tool that can be used on a wide range of materials, including metals, glass/carbon fibre composite materials and concrete. The imaging performance of the CI technique for a given application is determined by the design parameters and characteristics of the CI probe. In this paper, a rapid method for calculating the whole-probe sensitivity distribution based on a finite element model (FEM) is presented to provide a direct view of the imaging capabilities of the planar CI probe. Sensitivity distributions of CI probes with different geometries were obtained, and factors influencing the sensitivity distribution were studied. Comparisons between CI probes with point-to-point and back-to-back triangular electrode pairs were made based on analysis of the corresponding sensitivity distributions. The results indicated that the sensitivity distribution could be useful for optimising the probe design parameters and predicting the imaging performance.

  19. Critical comparison of diffuse reflectance spectroscopy and colorimetry as dermatological diagnostic tools for acanthosis nigricans: a chemometric approach

    PubMed Central

    Devpura, Suneetha; Pattamadilok, Bensachee; Syed, Zain U.; Vemulapalli, Pranita; Henderson, Marsha; Rehse, Steven J.; Hamzavi, Iltefat; Lim, Henry W.; Naik, Ratna

    2011-01-01

    Quantification of skin changes due to acanthosis nigricans (AN), a disorder common among insulin-resistant diabetic and obese individuals, was investigated using two optical techniques: diffuse reflectance spectroscopy (DRS) and colorimetry. Measurements were obtained from AN lesions on the neck and two control sites of eight AN patients. A principal component/discriminant function analysis successfully differentiated between AN lesion and normal skin with 87.7% sensitivity and 94.8% specificity in DRS measurements and 97.2% sensitivity and 96.4% specificity in colorimetry measurements. PMID:21698027
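    A generic way to reproduce this style of analysis is a principal component reduction followed by a linear discriminant classifier, with sensitivity and specificity taken from cross-validated predictions. The sketch below uses scikit-learn on a random placeholder feature matrix standing in for the DRS or colorimetry measurements; nothing here is the authors' pipeline or data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(48, 200))      # placeholder: 48 measurement sites x 200 spectral features
y = rng.integers(0, 2, size=48)     # placeholder labels: 1 = AN lesion, 0 = normal skin

clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
y_pred = cross_val_predict(clf, X, y, cv=5)

tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.1%}, specificity = {tn / (tn + fp):.1%}")
```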

  20. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    PubMed Central

    Calcagno, Cristina; Coppo, Mario

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aimed at supporting bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are demonstrated on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, turning into big data that should be analysed by statistical and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed. PMID:25050327

  1. On designing multicore-aware simulators for systems biology endowed with OnLine statistics.

    PubMed

    Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aimed at supporting bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are demonstrated on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, turning into big data that should be analysed by statistical and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
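    The FastFlow implementation itself is C++, but the pipelining idea described above can be sketched language-agnostically: a simulation stage streams trajectory points as soon as they are produced, and an analysis stage updates running statistics immediately instead of waiting for all trajectories to finish. The toy Python sketch below (invented dynamics, Welford's online mean/variance) illustrates only that streaming pattern, not the tool's actual code.

```python
import random

def simulate(n_traj, n_steps, x0=100.0):
    """Toy stochastic simulator: emits (step, trajectory_id, value) as soon as it is computed."""
    for traj in range(n_traj):
        x = x0
        for step in range(n_steps):
            x += random.gauss(0.0, 1.0)          # placeholder dynamics
            yield step, traj, x

class OnlineStats:
    """Welford's online mean/variance, kept per time step."""
    def __init__(self, n_steps):
        self.n = [0] * n_steps
        self.mean = [0.0] * n_steps
        self.m2 = [0.0] * n_steps

    def update(self, step, value):
        self.n[step] += 1
        delta = value - self.mean[step]
        self.mean[step] += delta / self.n[step]
        self.m2[step] += delta * (value - self.mean[step])

    def variance(self, step):
        return self.m2[step] / (self.n[step] - 1) if self.n[step] > 1 else float("nan")

stats = OnlineStats(n_steps=50)
for step, traj, value in simulate(n_traj=1000, n_steps=50):
    stats.update(step, value)                    # the analysis stage consumes the stream directly

print(f"step 49: mean = {stats.mean[49]:.2f}, variance = {stats.variance(49):.2f}")
```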

  2. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.

    PubMed

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
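    For readers who want to try a comparable global sensitivity analysis, the sketch below runs a Sobol analysis with the SALib package on a toy stand-in for the agent-based model; the parameter names, bounds and output function are invented for illustration and do not reproduce the authors' ABM.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical parameter space loosely mirroring the factors named in the abstract.
problem = {
    "num_vars": 3,
    "names": ["cellulase_half_life", "exoglucanase_activity", "cellulase_fraction"],
    "bounds": [[1.0, 48.0], [0.1, 10.0], [0.0, 1.0]],
}

def toy_model(x):
    """Placeholder for the ABM output, e.g. cellulose hydrolysed after 24 h."""
    half_life, exo_activity, fraction = x
    return 0.3 * half_life + 2.0 * exo_activity * fraction

X = saltelli.sample(problem, 1024)           # N * (2D + 2) parameter sets
Y = np.array([toy_model(row) for row in X])
Si = sobol.analyze(problem, Y)

for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: first-order = {s1:.2f}, total = {st:.2f}")
```

    Total-order indices (ST) include interaction effects, which is what distinguishes a global analysis of this kind from one-at-a-time perturbation.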

  3. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    NASA Astrophysics Data System (ADS)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.

  4. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis

    PubMed Central

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement. PMID:24830736

  5. Model-based POD study of manual ultrasound inspection and sensitivity analysis using metamodel

    NASA Astrophysics Data System (ADS)

    Ribay, Guillemette; Artusi, Xavier; Jenson, Frédéric; Reece, Christopher; Lhuillier, Pierre-Emile

    2016-02-01

    The reliability of NDE can be quantified using the Probability of Detection (POD) approach. Previous studies have shown the potential of the model-assisted POD (MAPOD) approach to replace expensive experimental determination of POD curves. In this paper, we make use of the CIVA software to determine POD curves for a manual ultrasonic inspection of a heavy component, for which a full experimental POD campaign was not available. The influential parameters were determined by expert analysis. The semi-analytical models used in CIVA for wave propagation and beam-defect interaction have been validated over the range of variation of the influential parameters by comparison with finite element modelling (Athena). The POD curves are computed for "hit/miss" and "â versus a" analyses. The validity of the Berens hypotheses is evaluated with statistical tools. A sensitivity study is performed to measure the relative influence of parameters on the variance of the defect response amplitude, using the Sobol sensitivity index. A metamodel is also built to reduce computing cost and enhance the precision of the estimated indices.
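    For orientation, the sketch below shows the standard "â versus a" (Berens) POD calculation that such studies rely on: a linear regression of log response on log defect size, with POD(a) obtained from the fitted normal residual model and a decision threshold. The data, threshold and units are synthetic; this is not the CIVA/metamodel workflow itself.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic "â versus a" data: defect size a (mm) and response amplitude â.
a = rng.uniform(0.5, 5.0, 200)
a_hat = np.exp(1.2 + 0.9 * np.log(a) + rng.normal(0.0, 0.3, a.size))
threshold = 3.0                                   # invented decision threshold on â

# Berens model: ln(â) = b0 + b1 ln(a) + eps, with eps ~ N(0, sigma^2)
b1, b0, *_ = stats.linregress(np.log(a), np.log(a_hat))
sigma = (np.log(a_hat) - (b0 + b1 * np.log(a))).std(ddof=2)

def pod(size_mm):
    """POD(a) = P(ln â > ln threshold) under the fitted Berens model."""
    return stats.norm.sf(np.log(threshold) - (b0 + b1 * np.log(size_mm)), scale=sigma)

for size_mm in (1.0, 2.0, 3.0):
    print(f"POD({size_mm} mm) = {pod(size_mm):.3f}")

a90 = np.exp((np.log(threshold) + 1.2816 * sigma - b0) / b1)   # size detected with 90% probability
print(f"a90 is approximately {a90:.2f} mm")
```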

  6. Comparative analysis of methicillin-sensitive and resistant Staphylococcus aureus exposed to emodin based on proteomic profiling.

    PubMed

    Ji, Xiaoyu; Liu, Xiaoqiang; Peng, Yuanxia; Zhan, Ruoting; Xu, Hui; Ge, Xijin

    2017-12-09

    Emodin has strong antibacterial activity, including against methicillin-resistant Staphylococcus aureus (MRSA). However, the mechanism by which emodin induces growth inhibition in MRSA remains unclear. In this study, the isobaric tags for relative and absolute quantitation (iTRAQ) proteomics approach was used to investigate the modes of action of emodin on an MRSA isolate and on methicillin-sensitive S. aureus ATCC29213 (MSSA). Proteomic analysis showed that the expression levels of 145 and 122 proteins were changed significantly in MRSA and MSSA, respectively, after emodin treatment. Comparative analysis of the functions of the differentially expressed proteins in the two strains was performed using the bioinformatics tool Blast2GO and the STRING database. Proteins related to pyruvate pathway imbalance, inhibition of protein synthesis, and suppression of DNA synthesis were found in both the methicillin-sensitive and methicillin-resistant strains. Moreover, interference with proteins related to a membrane damage mechanism was also observed in MRSA. Our findings indicate that emodin is a potential antibacterial agent targeting MRSA via multiple mechanisms. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Association between atopic dermatitis and contact sensitization: A systematic review and meta-analysis.

    PubMed

    Hamann, Carsten R; Hamann, Dathan; Egeberg, Alexander; Johansen, Jeanne D; Silverberg, Jonathan; Thyssen, Jacob P

    2017-07-01

    It is unclear whether patients with atopic dermatitis (AD) have an altered prevalence or risk for contact sensitization. Increased exposure to chemicals in topical products together with impaired skin barrier function suggest a higher risk, whereas the immune profile suggests a lower risk. To perform a systematic review and meta-analysis of the association between AD and contact sensitization. The PubMed/Medline, Embase, and Cochrane databases were searched for articles that reported on contact sensitization in individuals with and without AD. The literature search yielded 10,083 citations; 417 were selected based on title and abstract screening and 74 met inclusion criteria. In a pooled analysis, no significant difference in contact sensitization between AD and controls was evident (random effects model odds ratio [OR] = 0.891; 95% confidence interval [CI] = 0.771-1.03). There was a positive correlation in studies that compared AD patients with individuals from the general population (OR 1.50, 95% CI 1.23-1.93) but an inverse association when comparing with referred populations (OR 0.753, 95% CI 0.63-0.90). Included studies used different tools to diagnose AD and did not always provide information on current or past disease. Patch test allergens varied between studies. No overall relationship between AD and contact sensitization was found. We recommend that clinicians consider patch testing AD patients when allergic contact dermatitis is suspected. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
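    The pooled odds ratios quoted above come from a random-effects model; the snippet below sketches the DerSimonian-Laird version of that calculation on a handful of invented study-level odds ratios and confidence intervals, purely to make the arithmetic concrete (it is not the review's dataset or software).

```python
import numpy as np

# Invented per-study odds ratios with 95% confidence intervals: (OR, lower, upper).
studies = [(1.4, 0.9, 2.2), (0.7, 0.5, 1.0), (1.1, 0.8, 1.5), (0.6, 0.4, 0.9)]

y = np.array([np.log(o) for o, lo, hi in studies])                     # log odds ratios
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for o, lo, hi in studies])
w = 1.0 / se**2                                                        # fixed-effect weights

# DerSimonian-Laird estimate of the between-study variance tau^2.
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_re = 1.0 / (se**2 + tau2)                                            # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))

print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-{np.exp(pooled + 1.96 * se_pooled):.2f})")
```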

  8. DNA Electrochemistry and Electrochemical Sensors for Nucleic Acids.

    PubMed

    Ferapontova, Elena E

    2018-06-12

    Sensitive, specific, and fast analysis of nucleic acids (NAs) is strongly needed in medicine, environmental science, biodefence, and agriculture for the study of bacterial contamination of food and beverages and genetically modified organisms. Electrochemistry offers accurate, simple, inexpensive, and robust tools for the development of such analytical platforms that can successfully compete with other approaches for NA detection. Here, electrode reactions of DNA, basic principles of electrochemical NA analysis, and their relevance for practical applications are reviewed and critically discussed.

  9. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    PubMed Central

    2012-01-01

    Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. Conclusions The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint. PMID:22962944

  10. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: an example from a vertigo phase III study with longitudinal count data as primary endpoint.

    PubMed

    Adrion, Christine; Mansmann, Ulrich

    2012-09-10

    A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.
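    To make the model-ranking criterion concrete, the short sketch below computes the mean logarithmic score from leave-one-out predictive probabilities (the conditional predictive ordinates that INLA can return); the probability values are placeholders for two hypothetical competing GLMMs, not results from the trial data.

```python
import numpy as np

# Placeholder leave-one-out predictive probabilities p(y_i | y_-i) under two competing models.
cpo_model_a = np.array([0.21, 0.14, 0.33, 0.08, 0.25, 0.19])
cpo_model_b = np.array([0.11, 0.09, 0.30, 0.02, 0.18, 0.15])

def mean_log_score(cpo):
    """Mean logarithmic score = -mean(log CPO); lower values indicate better predictive fit."""
    return -np.mean(np.log(cpo))

for name, cpo in (("model A", cpo_model_a), ("model B", cpo_model_b)):
    print(f"{name}: mean logarithmic score = {mean_log_score(cpo):.3f}")
```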

  11. Are quantitative sensitivity analysis methods always reliable?

    NASA Astrophysics Data System (ADS)

    Huang, X.

    2016-12-01

    Physical parameterizations developed to represent subgrid-scale physical processes include various uncertain parameters, leading to large uncertainties in today's Earth System Models (ESMs). Sensitivity analysis (SA) is an efficient approach for quantitatively determining how the uncertainty of an evaluation metric can be apportioned to each parameter. SA can also identify the most influential parameters and thereby reduce the dimensionality of the parametric space. In previous studies, SA-based approaches such as Sobol' and Fourier amplitude sensitivity testing (FAST) divide the parameters into sensitive and insensitive groups; the former is retained while the latter is eliminated for a given scientific study. However, these approaches ignore the loss of the interaction effects between the retained parameters and the eliminated ones, which are also part of the total sensitivity indices. As a result, the wrong sensitive parameters might be identified by these traditional SA approaches and tools. In this study, we propose a dynamic global sensitivity analysis method (DGSAM), which iteratively removes the least important parameter until only two parameters remain. We use CLM-CASA, a global terrestrial model, as an example to verify our findings, with sample sizes ranging from 7000 to 280000. The results show that DGSAM identifies more influential parameters, which is confirmed by parameter calibration experiments using four popular optimization methods. For example, optimization using the top three parameters selected by DGSAM achieved roughly a 10% improvement over optimization using those selected by Sobol'. Furthermore, the computational cost of calibration was reduced to 1/6 of the original. In the future, it will be necessary to explore alternative SA methods that emphasize parameter interactions.
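    The iterative idea can be sketched as follows: compute total-order sensitivity indices, fix the least influential parameter at a nominal value, and repeat on the reduced space until two parameters remain. The toy example below uses SALib and an invented four-parameter model; it illustrates the loop only and is not the authors' DGSAM implementation or the CLM-CASA setup.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

names = ["p1", "p2", "p3", "p4"]
bounds = {n: [0.0, 1.0] for n in names}
nominal = {n: 0.5 for n in names}          # values at which eliminated parameters are fixed

def model(params):
    """Invented model with an interaction term, standing in for the land model output."""
    return (params["p1"] * params["p2"] + 0.2 * params["p3"]
            + 0.05 * params["p4"] + params["p1"] ** 2)

active = list(names)
while len(active) > 2:
    problem = {"num_vars": len(active), "names": active,
               "bounds": [bounds[n] for n in active]}
    X = saltelli.sample(problem, 512)
    Y = np.array([model({**nominal, **dict(zip(active, row))}) for row in X])
    st = sobol.analyze(problem, Y)["ST"]   # total-order indices on the current subspace
    least = active[int(np.argmin(st))]
    print(f"fixing least influential parameter: {least} (ST = {st.min():.3f})")
    active.remove(least)

print("retained parameters:", active)
```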

  12. @NWTC Newsletter: Summer 2014 | Wind | NREL

    Science.gov Websites

    [Website listing residue; recoverable item titles include "Boosting Wind Plant Power Output by 4%-5% through Coordinated Turbine ...", "Part 2: Wind Farm Wake Models", "New Framework Transforms FAST Wind Turbine Modeling Tool (Fact Sheet)", and "Sensitivity Analysis of Wind Plant Performance to Key Turbine Design Parameters: A Systems Engineering ..."]

  13. Diagnostic test accuracy of nutritional tools used to identify undernutrition in patients with colorectal cancer: a systematic review.

    PubMed

    Håkonsen, Sasja Jul; Pedersen, Preben Ulrich; Bath-Hextall, Fiona; Kirkpatrick, Pamela

    2015-05-15

    Effective nutritional screening, nutritional care planning and nutritional support are essential in all settings, and there is no doubt that a health service seeking to increase safety and clinical effectiveness must take nutritional care seriously. Screening and early detection of malnutrition are crucial in identifying patients at nutritional risk. There is a high prevalence of malnutrition in hospitalized patients undergoing treatment for colorectal cancer. To synthesize the best available evidence regarding the diagnostic test accuracy of nutritional tools (sensitivity and specificity) used to identify malnutrition (specifically undernutrition) in patients with colorectal cancer (such as the Malnutrition Screening Tool and Nutritional Risk Index) compared to reference tests (such as the Subjective Global Assessment or Patient Generated Subjective Global Assessment). Patients with colorectal cancer requiring either (or all of) surgery, chemotherapy and/or radiotherapy in secondary care. Focus of the review: The diagnostic test accuracy of validated assessment tools/instruments (such as the Malnutrition Screening Tool and Nutritional Risk Index) in the diagnosis of malnutrition (specifically under-nutrition) in patients with colorectal cancer, relative to reference tests (Subjective Global Assessment or Patient Generated Subjective Global Assessment). Types of studies: Diagnostic test accuracy studies regardless of study design. Studies published in English, German, Danish, Swedish and Norwegian were considered for inclusion in this review. Databases were searched from their inception to April 2014. Methodological quality was determined using the Quality Assessment of Diagnostic Accuracy Studies checklist. Data were collected using the data extraction form of the Standards for Reporting Studies of Diagnostic Accuracy checklist for the reporting of studies of diagnostic accuracy. The accuracy of diagnostic tests is presented in terms of sensitivity, specificity, and positive and negative predictive values. In addition, the positive likelihood ratio (sensitivity / [1 - specificity]) and negative likelihood ratio ([1 - sensitivity] / specificity) were also calculated and presented in this review to provide information about the likelihood that a given test result would be expected when the target condition is present compared with the likelihood that the same result would be expected when the condition is absent. Not all trials reported true positive, true negative, false positive and false negative rates; these rates were therefore calculated based on the data in the published papers. A two-by-two truth table was reconstructed for each study, and sensitivity, specificity, positive predictive value, negative predictive value, positive likelihood ratio and negative likelihood ratio were calculated for each study. A summary receiver operator characteristics curve was constructed to determine the relationship between sensitivity and specificity, and the area under the summary receiver operator characteristics curve, which measures the usefulness of a test, was calculated. Meta-analysis was not considered appropriate, therefore data were synthesized in a narrative summary. One study evaluated the Malnutrition Screening Tool against the reference standard Patient-Generated Subjective Global Assessment. The sensitivity was 56% and the specificity 84%. The positive likelihood ratio was 3.100, the negative likelihood ratio was 0.59, the diagnostic odds ratio (95% CI) was 5.20 (1.09-24.90), and the Area Under the Curve (AUC) represented only poor to fair diagnostic test accuracy. A total of two studies evaluated the diagnostic accuracy of the Malnutrition Universal Screening Tool (MUST) (index test) compared to both the Subjective Global Assessment (SGA) (reference standard) and the PG-SGA (reference standard) in patients with colorectal cancer. For MUST vs SGA the sensitivity of the tool was 96%, specificity was 75%, LR+ 3.826, LR- 0.058, diagnostic OR (95% CI) 66.00 (6.61-659.24), and the AUC represented excellent diagnostic accuracy. For MUST vs PG-SGA the sensitivity of the tool was 72%, specificity 48.9%, LR+ 1.382, LR- 0.579, diagnostic OR (95% CI) 2.39 (0.87-6.58), and the AUC indicated that the tool failed as a diagnostic test to identify patients with colorectal cancer at nutritional risk. The Nutrition Risk Index (NRI) compared to SGA showed a sensitivity of 95.2%, specificity of 62.5%, LR+ 2.521, LR- 0.087, diagnostic OR (95% CI) 28.89 (6.93-120.40), and the AUC represented good diagnostic accuracy. For NRI vs PG-SGA the sensitivity of the tool was 68%, specificity 64%, LR+ 1.947, LR- 0.487, diagnostic OR (95% CI) 4.00 (1.23-13.01), and the AUC indicated poor diagnostic test accuracy. There is no single, specific tool used to screen or assess the nutritional status of colorectal cancer patients. All tools showed varied diagnostic accuracies when compared to the reference standards SGA and PG-SGA. Hence clinical judgment, combined with perhaps the SGA or PG-SGA, should play a major role. The PG-SGA offers several advantages over the SGA tool: 1) the patient completes the medical history component, thereby decreasing the amount of time involved; 2) it contains more nutrition impact symptoms, which are important to the patient with cancer; and 3) it has a scoring system that allows patients to be triaged for nutritional intervention. Therefore, the PG-SGA could be used as a nutrition assessment tool as it allows quick identification and prioritization of colorectal cancer patients with malnutrition in combination with other parameters. This systematic review highlights the need for the following: further studies need to investigate the diagnostic accuracy of already existing nutritional screening tools in the context of colorectal cancer patients, and if new screening tools are developed, they should be developed and validated in the specific clinical context within the same patient population (colorectal cancer patients). The Joanna Briggs Institute.
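    For readers who want to reproduce this style of calculation, the helper below derives sensitivity, specificity, predictive values, likelihood ratios and the diagnostic odds ratio from a reconstructed two-by-two table; the counts are placeholders, not figures from any of the included studies.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from a two-by-two truth table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sensitivity / (1.0 - specificity),
        "LR-": (1.0 - sensitivity) / specificity,
        "DOR": (tp * tn) / (fp * fn),
    }

# Placeholder counts for one index test against a reference standard.
for name, value in diagnostic_metrics(tp=30, fp=10, fn=20, tn=40).items():
    print(f"{name}: {value:.2f}")
```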

  14. Fast assessment of planar chromatographic layers quality using pulse thermovision method.

    PubMed

    Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2014-12-19

    The main goal of this paper is to demonstrate the capability of the pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light-scattering and semi-transparent planar stationary phases. Successful visualization of stationary phase defects required signal processing protocols based on wavelet filtration, correlation analysis and k-means 3D segmentation. This post-processing approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. In particular, a number of TLC and HPTLC stationary phases, including silica, cellulose, aluminum oxide, polyamide and octadecylsilane, with adsorbent layers ranging from 100 to 250 μm were investigated. The presented detection protocol can be used as an efficient tool for fast screening of the overall heterogeneity of any layered material. Moreover, the described procedure is very fast (a few seconds including acquisition and data processing) and may be applied for online control of fabrication processes. Beyond planar chromatographic plates, this protocol can be used for the assessment of other planar separation tools, such as paper-based analytical devices or micro total analysis systems consisting of organic and non-organic layers. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Chemical Visualization of Phosphoproteomes on Membrane*

    PubMed Central

    Iliuk, Anton; Liu, X. Shawn; Xue, Liang; Liu, Xiaoqi; Tao, W. Andy

    2012-01-01

    With new discoveries of important roles of phosphorylation on a daily basis, phospho-specific antibodies, as the primary tool for on-membrane detection of phosphoproteins, face enormous challenges. To address an urgent need for convenient and reliable analysis of phosphorylation events, we report a novel strategy for sensitive phosphorylation analysis in the Western blotting format. The chemical reagent, which we termed pIMAGO, is based on a multifunctionalized soluble nanopolymer and is capable of selectively binding to phosphorylated residues independent of amino acid microenvironment, thus offering great promise as a universal tool in biological analyses where the site of phosphorylation is not known or its specific antibody is not available. The specificity and sensitivity of the approach was first examined using a mixture of standard proteins. The method was then applied to monitor phosphorylation changes in in vitro kinase and phosphatase assays. Finally, to demonstrate the unique ability of pIMAGO to measure endogenous phosphorylation, we used it to visualize and determine the differences in phosphorylated proteins that interact with wild-type and kinase dead mutant of Polo-like kinase 1 during mitosis, the results of which were further confirmed by a quantitative phosphoproteomics experiment. PMID:22593177

  16. Refining and validating a two-stage and web-based cancer risk assessment tool for village doctors in China.

    PubMed

    Shen, Xing-Rong; Chai, Jing; Feng, Rui; Liu, Tong-Zhu; Tong, Gui-Xian; Cheng, Jing; Li, Kai-Chun; Xie, Shao-Yu; Shi, Yong; Wang, De-Bin

    2014-01-01

    The big gap between the efficacy of population-level prevention and expectations, due to the heterogeneity and complexity of cancer etiologic factors, calls for selective yet personalized interventions based on effective risk assessment. This paper documents our research protocol aimed at refining and validating a two-stage, web-based cancer risk assessment tool, from a tentative one in use by an ongoing project, capable of identifying individuals at elevated risk for one or more types of the 80% leading cancers in rural China with adequate sensitivity and specificity, and featuring low cost, easy application, and cultural and technical sensitivity for farmers and village doctors. The protocol adopted a modified population-based case-control design using 72,000 non-patients as controls, 2,200 cancer patients as cases, and another 600 patients as cases for external validation. Factors taken into account comprised 8 domains, including diet and nutrition, risk behaviors, family history, precancerous diseases, related medical procedures, exposure to environmental hazards, mood and feelings, physical activities, and anthropologic and biologic factors. Modeling explored various methodologies such as empirical analysis, logistic regression, neural network analysis and decision theory, together with both internal and external validation using concordance statistics, predictive values, etc.

  17. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  18. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  19. Evaluation of the Predictive Index for Osteoporosis as a Clinical Tool to Identify the Risk of Osteoporosis in Korean Men by Using the Korea National Health and Nutrition Examination Survey Data.

    PubMed

    Moon, Ji Hyun; Kim, Lee Oh; Kim, Hyeon Ju; Kong, Mi Hee

    2016-11-01

    We previously proposed the Predictive Index for Osteoporosis as a new index to identify men who require bone mineral density measurement. However, the previous study had limitations such as a single-center design and small sample size. Here, we evaluated the usefulness of the Predictive Index for Osteoporosis using the nationally representative data of the Korea National Health and Nutrition Examination Survey. Participants underwent bone mineral density measurements via dual energy X-ray absorptiometry, and the Predictive Index for Osteoporosis and Osteoporosis Self-Assessment Tool for Asians were assessed. Receiver operating characteristic analysis was used to obtain optimal cut-off points for the Predictive Index for Osteoporosis and Osteoporosis Self-Assessment Tool for Asians, and the predictability of osteoporosis for the 2 indices was compared. Both indices were useful clinical tools for identifying osteoporosis risk in Korean men. The optimal cut-off value for the Predictive Index for Osteoporosis was 1.07 (sensitivity, 67.6%; specificity, 72.7%; area under the curve, 0.743). When using a cut-off point of 0.5 for the Osteoporosis Self-Assessment Tool for Asians, the sensitivity and specificity were 71.9% and 64.0%, respectively, and the area under the curve was 0.737. The Predictive Index for Osteoporosis was as useful as the Osteoporosis Self-Assessment Tool for Asians as a screening index to identify candidates for dual energy X-ray absorptiometry among men aged 50-69 years.
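    As an illustration of how such a cut-off is typically derived, the sketch below runs a receiver operating characteristic analysis on synthetic index scores with scikit-learn and picks the threshold maximizing the Youden index (sensitivity + specificity - 1); the scores, prevalence and resulting numbers are invented and unrelated to the survey data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Synthetic index values: higher scores on average for men with osteoporosis (placeholder data).
scores = np.concatenate([rng.normal(1.3, 0.5, 80), rng.normal(0.8, 0.5, 320)])
osteoporosis = np.concatenate([np.ones(80, dtype=int), np.zeros(320, dtype=int)])

fpr, tpr, thresholds = roc_curve(osteoporosis, scores)
best = int(np.argmax(tpr - fpr))          # Youden index J = sensitivity + specificity - 1

print(f"AUC = {roc_auc_score(osteoporosis, scores):.3f}")
print(f"optimal cut-off = {thresholds[best]:.2f} "
      f"(sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%})")
```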

  20. Evaluation of in silico tools to predict the skin sensitization potential of chemicals.

    PubMed

    Verheyen, G R; Braeken, E; Van Deun, K; Van Miert, S

    2017-01-01

    Public domain and commercial in silico tools were compared for their performance in predicting the skin sensitization potential of chemicals. The packages were either statistical based (Vega, CASE Ultra) or rule based (OECD Toolbox, Toxtree, Derek Nexus). In practice, several of these in silico tools are used in gap filling and read-across, but here their use was limited to make predictions based on presence/absence of structural features associated to sensitization. The top 400 ranking substances of the ATSDR 2011 Priority List of Hazardous Substances were selected as a starting point. Experimental information was identified for 160 chemically diverse substances (82 positive and 78 negative). The prediction for skin sensitization potential was compared with the experimental data. Rule-based tools perform slightly better, with accuracies ranging from 0.6 (OECD Toolbox) to 0.78 (Derek Nexus), compared with statistical tools that had accuracies ranging from 0.48 (Vega) to 0.73 (CASE Ultra - LLNA weak model). Combining models increased the performance, with positive and negative predictive values up to 80% and 84%, respectively. However, the number of substances that were predicted positive or negative for skin sensitization in both models was low. Adding more substances to the dataset will increase the confidence in the conclusions reached. The insights obtained in this evaluation are incorporated in a web database www.asopus.weebly.com that provides a potential end user context for the scope and performance of different in silico tools with respect to a common dataset of curated skin sensitization data.

  1. The use of the sexual function questionnaire as a screening tool for women with sexual dysfunction.

    PubMed

    Quirk, Frances; Haughie, Scott; Symonds, Tara

    2005-07-01

    To determine if the validated Sexual Function Questionnaire (SFQ), developed to assess efficacy in female sexual dysfunction (FSD) clinical trials, may also have utility in identifying target populations for such studies. Data from five clinical trials and two general population surveys were used to analyze the utility of the SFQ as a tool to discriminate between the presence of specific components of FSD (i.e., hypoactive sexual desire disorder, female sexual arousal disorder, female orgasmic disorder, and dyspareunia). Sensitivity/specificity analysis and logistic regression analysis, using data from all five clinical studies and the general population surveys, confirmed that the SFQ domains have utility in detecting the presence of specific components of FSD and provide scores indicative of the presence of a specific sexual disorder. The SFQ is a valuable new tool for detecting the presence of FSD and identifying the specific components of sexual functions affected (desire, arousal, orgasm, or dyspareunia).

  2. Benchmarking of a treatment planning system for spot scanning proton therapy: Comparison and analysis of robustness to setup errors of photon IMRT and proton SFUD treatment plans of base of skull meningioma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harding, R., E-mail: ruth.harding2@wales.nhs.uk; Trnková, P.; Lomax, A. J.

    Purpose: Base of skull meningioma can be treated with both intensity modulated radiation therapy (IMRT) and spot scanned proton therapy (PT). One of the main benefits of PT is better sparing of organs at risk, but due to the physical and dosimetric characteristics of protons, spot scanned PT can be more sensitive to the uncertainties encountered in the treatment process compared with photon treatment. Therefore, robustness analysis should be part of a comprehensive comparison between these two treatment methods in order to quantify and understand the sensitivity of the treatment techniques to uncertainties. The aim of this work was to benchmark a spot scanning treatment planning system for planning of base of skull meningioma and to compare the created plans and analyze their robustness to setup errors against the IMRT technique. Methods: Plans were produced for three base of skull meningioma cases: IMRT planned with a commercial TPS [Monaco (Elekta AB, Sweden)]; single field uniform dose (SFUD) spot scanning PT produced with an in-house TPS (PSI-plan); and SFUD spot scanning PT plan created with a commercial TPS [XiO (Elekta AB, Sweden)]. A tool for evaluating robustness to random setup errors was created and, for each plan, both a dosimetric evaluation and a robustness analysis to setup errors were performed. Results: It was possible to create clinically acceptable treatment plans for spot scanning proton therapy of meningioma with a commercially available TPS. However, since each treatment planning system uses different methods, this comparison showed different dosimetric results as well as different sensitivities to setup uncertainties. The results confirmed the necessity of an analysis tool for assessing plan robustness to provide a fair comparison of photon and proton plans. Conclusions: Robustness analysis is a critical part of plan evaluation when comparing IMRT plans with spot scanned proton therapy plans.
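    A very reduced sketch of the robustness-evaluation idea (not the in-house tool described above) is to sample random setup errors, rigidly shift the dose distribution relative to the target, and record how a coverage metric such as D95 degrades. The dose grid, target, voxel size and error magnitude below are all synthetic placeholders; a real evaluation would use the TPS dose grids and clinical structures.

```python
import numpy as np
from scipy.ndimage import shift as shift_grid

rng = np.random.default_rng(42)

# Synthetic dose grid (Gy) with 1 mm voxels: a 54 Gy plateau around a spherical target.
zz, yy, xx = np.mgrid[:60, :60, :60]
r2 = (zz - 30) ** 2 + (yy - 30) ** 2 + (xx - 30) ** 2
dose = np.where(r2 <= 14 ** 2, 54.0, 0.0)
target = r2 <= 10 ** 2

def d95(dose_grid, mask):
    """Dose received by at least 95% of the target volume."""
    return np.percentile(dose_grid[mask], 5)

d95_shifted = []
for _ in range(100):
    setup_error_mm = rng.normal(0.0, 2.0, size=3)   # random setup error, 2 mm SD per axis
    d95_shifted.append(d95(shift_grid(dose, setup_error_mm, order=1, mode="nearest"), target))

print(f"nominal D95 = {d95(dose, target):.1f} Gy")
print(f"D95 under random setup errors: mean {np.mean(d95_shifted):.1f} Gy, "
      f"worst {np.min(d95_shifted):.1f} Gy")
```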

  3. Evaluating an holistic assessment tool for palliative care practice.

    PubMed

    McIlfatrick, Sonja; Hasson, Felicity

    2014-04-01

    To evaluate a holistic assessment tool for palliative care practice. This included identifying patients' needs using the holistic tool and exploring the usability, applicability and barriers and facilitators towards implementation in practice. The delivery of effective holistic palliative care requires a careful assessment of the patients' needs and circumstances. Whilst holistic assessment of palliative care needs is advocated, questions exist around the appropriateness of tools to assist this process. Mixed-method research design. Data collection involved an analysis of piloted holistic assessments undertaken using the tool (n = 132) and two focus groups with healthcare professionals (n = 10). The tool enabled health professionals to identify and gain an understanding of the needs of the patients, specifically in relation to the physical healthcare needs. Differences, however, between the analysis of the tool documentation and focus group responses were identified in particular areas. For example, 59 (68·8%) respondents had discussed preferred priorities of care with the patient; however, focus group comments revealed participants had concerns around this. Similarly, whilst over half of responses (n = 50; 57·5%) had considered a prognostic clinical indicator for the patient as an action, focus group results indicated questions around healthcare professionals' knowledge and perceived usefulness of such indicators. Positive aspects of the tool were that it was easy to understand and captured the needs of individuals. Negative aspects of the tool were that it was repetitive and the experience of assessors required consideration. The tool evaluation identified questions regarding holistic assessment in palliative care practice and the importance of communication. A holistic assessment tool can support patient assessment and identification of patients' needs in the 'real world' of palliative care practice, but the 'tool' is merely an aid to assist professionals to discuss difficult and sensitive aspects of care. © 2013 John Wiley & Sons Ltd.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; Parashar, Manu; Lewis, Nancy Jo

    The Real Time System Operations (RTSO) 2006-2007 project focused on two parallel technical tasks: (1) Real-Time Applications of Phasors for Monitoring, Alarming and Control; and (2) Real-Time Voltage Security Assessment (RTVSA) Prototype Tool. The overall goal of the phasor applications project was to accelerate adoption and foster greater use of new, more accurate, time-synchronized phasor measurements by conducting research and prototyping applications on California ISO's phasor platform - Real-Time Dynamics Monitoring System (RTDMS) -- that provide previously unavailable information on the dynamic stability of the grid. Feasibility assessment studies were conducted on potential application of this technology for small-signal stability monitoring, validating/improving existing stability nomograms, conducting frequency response analysis, and obtaining real-time sensitivity information on key metrics to assess grid stress. Based on study findings, prototype applications for real-time visualization and alarming, small-signal stability monitoring, measurement based sensitivity analysis and frequency response assessment were developed, factory- and field-tested at the California ISO and at BPA. The goal of the RTVSA project was to provide California ISO with a prototype voltage security assessment tool that runs in real time within California ISO's new reliability and congestion management system. CERTS conducted a technical assessment of appropriate algorithms, developed a prototype incorporating state-of-art algorithms (such as the continuation power flow, direct method, boundary orbiting method, and hyperplanes) into a framework most suitable for an operations environment. Based on study findings, a functional specification was prepared, which the California ISO has since used to procure a production-quality tool that is now a part of a suite of advanced computational tools that is used by California ISO for reliability and congestion management.

  5. Novel strategy for typing Mycoplasma pneumoniae isolates by use of matrix-assisted laser desorption ionization-time of flight mass spectrometry coupled with ClinProTools.

    PubMed

    Xiao, Di; Zhao, Fei; Zhang, Huifang; Meng, Fanliang; Zhang, Jianzhong

    2014-08-01

    The typing of Mycoplasma pneumoniae mainly relies on the detection of nucleic acid, which is limited by the use of a single gene target, complex operation procedures, and a lengthy assay time. Here, matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) coupled to ClinProTools was used to discover MALDI-TOF MS biomarker peaks and to generate a classification model based on a genetic algorithm (GA) to differentiate between type 1 and type 2 M. pneumoniae isolates. Twenty-five M. pneumoniae strains were used to construct the analysis model, and 43 Mycoplasma strains were used for validation. For the GA typing model, the cross-validation values, which reflect the ability of the model to handle variability among the test spectra, and the recognition capability value, which reflects the model's ability to correctly identify its component spectra, were all 100%. This model contained 7 biomarker peaks (m/z 3,318.8, 3,215.0, 5,091.8, 5,766.8, 6,337.1, 6,431.1, and 6,979.9) used to correctly identify 31 type 1 and 7 type 2 M. pneumoniae isolates from the 43 Mycoplasma strains with a sensitivity and specificity of 100%. The strain distribution map and principal component analysis based on the GA classification model also clearly showed that the type 1 and type 2 M. pneumoniae isolates can be divided into two categories based on their peptide mass fingerprints. With the obvious advantages of being rapid, highly accurate, and highly sensitive and having a low cost and high throughput, MALDI-TOF MS coupled with ClinProTools is a powerful and reliable tool for M. pneumoniae typing. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  6. Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers

    NASA Technical Reports Server (NTRS)

    Kenny, Sean (Technical Monitor); Wertz, Julie

    2002-01-01

    As technological systems grow in capability, they also grow in complexity. Because of this complexity, it is no longer possible for a designer to use engineering judgement alone to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can aid the designer in making high level architecture decisions. Once these key components have been identified, two main approaches to improving a system through them exist: add redundancy or improve the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two, applied in varying degrees to each component. This research therefore addresses the question of how to divide funds between adding redundancy and improving component reliability so as to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance by coupling different life cycle metrics into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area are presented.
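    The redundancy-versus-reliability trade-off can be made concrete with a toy allocation problem: spend a fixed budget on either extra redundant units or reliability-improvement steps for each component of a small series system, and let simulated annealing search the discrete design space. Every number below (reliabilities, costs, budget, cooling schedule) is invented for illustration; this is not the SSI model described above.

```python
import math
import random

random.seed(0)

# Toy series system: (base unit reliability, cost per redundant unit, cost per improvement step).
components = [(0.90, 2.0, 1.5), (0.95, 3.0, 1.0), (0.85, 1.0, 2.0)]
BUDGET = 12.0
IMPROVEMENT = 0.5          # each improvement step halves a unit's failure probability

def system_reliability(plan):
    """plan[i] = (extra_redundant_units, improvement_steps) for component i."""
    r_sys = 1.0
    for (r0, _, _), (k, s) in zip(components, plan):
        r_unit = 1.0 - (1.0 - r0) * IMPROVEMENT ** s
        r_sys *= 1.0 - (1.0 - r_unit) ** (1 + k)     # k extra units in parallel
    return r_sys

def cost(plan):
    return sum(k * c_red + s * c_imp for (_, c_red, c_imp), (k, s) in zip(components, plan))

def neighbour(plan):
    new = [list(p) for p in plan]
    i, j = random.randrange(len(components)), random.randrange(2)
    new[i][j] = max(0, new[i][j] + random.choice([-1, 1]))
    return [tuple(p) for p in new]

plan = best = [(0, 0)] * len(components)
for step in range(5000):
    temperature = max(1e-9, 1.0 * 0.999 ** step)     # geometric cooling schedule
    candidate = neighbour(plan)
    if cost(candidate) > BUDGET:
        continue
    delta = system_reliability(candidate) - system_reliability(plan)
    if delta > 0 or random.random() < math.exp(delta / temperature):
        plan = candidate
        if system_reliability(plan) > system_reliability(best):
            best = plan

print("best allocation (redundant units, improvement steps):", best)
print(f"system reliability = {system_reliability(best):.4f} at cost {cost(best):.1f} (budget {BUDGET})")
```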

  7. [Usefulness of scoring risk for adverse outcomes in older patients with the Identification of Seniors at Risk scale and the Triage Risk Screening Tool: a meta-analysis].

    PubMed

    Rivero-Santana, Amado; Del Pino-Sedeño, Tasmania; Ramallo-Fariña, Yolanda; Vergara, Itziar; Serrano-Aguilar, Pedro

    2017-02-01

    A considerable proportion of the geriatric population experiences unfavorable outcomes after hospital emergency department care. An assessment of the risk for adverse outcomes would facilitate changes in clinical management by adjusting available resources to needs according to an individual patient's risk. Risk assessment tools are available, but their prognostic precision varies. This systematic review sought to quantify the prognostic precision of 2 geriatric screening and risk assessment tools commonly used in emergency settings for patients at high risk of adverse outcomes (revisits, functional deterioration, readmissions, or death): the Identification of Seniors at Risk (ISAR) scale and the Triage Risk Screening Tool (TRST). We searched PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, and SCOPUS, with no date limits, to find relevant studies. Quality was assessed with the QUADAS-2 checklist (for quality assessment of diagnostic accuracy studies). We pooled data on the prognostic yield reported for the ISAR and TRST scores for each short- and medium-term outcome using bivariate random-effects modeling. The sensitivity of the ISAR scoring system as a whole ranged between 67% and 99%; specificity fell between 21% and 41%. TRST sensitivity ranged between 52% and 75% and specificity between 39% and 51%. We conclude that the tools currently used to assess the risk of adverse outcomes in patients of advanced age attended in hospital emergency departments do not have adequate prognostic precision to be clinically useful.

  8. Endobronchial Ultrasound for Nodal Staging of Non-Small Cell Lung Cancer Patients with Radiologically Normal Mediastinum: A Meta-Analysis.

    PubMed

    El-Osta, Hazem; Jani, Pushan; Mansour, Ali; Rascoe, Philip; Jafri, Syed

    2018-04-23

    An accurate assessment of the mediastinal lymph nodes status is essential in the staging and treatment planning of potentially resectable non-small cell lung cancer (NSCLC). We performed this meta-analysis to evaluate the role of endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) in detecting occult mediastinal disease in NSCLC with no radiologic mediastinal involvement. The PubMed, Embase, and Cochrane libraries were searched for studies describing the role of EBUS-TBNA in lung cancer patients with radiologically negative mediastinum. The individual and pooled sensitivity, prevalence, negative predictive value (NPV), and diagnostic odds ratio (DOR) were calculated using the random effects model. Metaregression analysis, heterogeneity, and publication bias were also assessed. A total of 13 studies that met the inclusion criteria were included in the meta-analysis. The pooled effect size of the different diagnostic parameters were estimated as follows: prevalence, 12.8% (95% CI, 10.4%-15.7%); sensitivity, 49.5% (95% confidence interval [CI], 36.4%-62.6%); NPV, 93.0% (95% CI, 90.3%-95.0%); and log DOR, 5.069 (95% CI, 4.212-5.925). Significant heterogeneity was noticeable for the sensitivity, disease prevalence, and NPV, but not observed for log DOR. Publication bias was detected for sensitivity, NPV and log DOR but not for prevalence. Bivariate meta-regression analysis showed no significant association between the pooled calculated parameters and the type of anesthesia, imaging utilized to define negative mediastinum, rapid on-site test usage, and presence of bias by QUADAS-2 tool. Interestingly, we observed a greater sensitivity, NPV and log DOR for studies published prior to 2010, and for prospective multicenter studies. Among NSCLC patients with a radiologically normal mediastinum, the prevalence of mediastinal disease is 12.8% and the sensitivity of EBUS-TBNA is 49.5%. Despite the low sensitivity, the resulting NPV of 93.0% for EBUS-TBNA suggests that mediastinal metastasis is uncommon in such patients.

  9. High-throughput and sensitive analysis of 3-monochloropropane-1,2-diol fatty acid esters in edible oils by supercritical fluid chromatography/tandem mass spectrometry.

    PubMed

    Hori, Katsuhito; Matsubara, Atsuki; Uchikata, Takato; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi

    2012-08-10

    We have established a high-throughput and sensitive analytical method based on supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry (QqQ MS) for 3-monochloropropane-1,2-diol (3-MCPD) fatty acid esters in edible oils. All analytes were successfully separated within 9 min without sample purification. The system was precise and sensitive, with a limit of detection less than 0.063 mg/kg. The recovery rate of 3-MCPD fatty acid esters spiked into oil samples was in the range of 62.68-115.23%. Furthermore, several edible oils were tested for analyzing 3-MCPD fatty acid ester profiles. This is the first report on the analysis of 3-MCPD fatty acid esters by SFC/QqQ MS. The developed method will be a powerful tool for investigating 3-MCPD fatty acid esters in edible oils. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Chapter 5: Modulation Excitation Spectroscopy with Phase-Sensitive Detection for Surface Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shulda, Sarah; Richards, Ryan M.

    Advancements in in situ spectroscopic techniques have led to significant progress being made in elucidating heterogeneous reaction mechanisms. The potential of these progressive methods is often limited only by the complexity of the system and noise in the data. Short-lived intermediates can be challenging, if not impossible, to identify with conventional spectral analysis. Often equally difficult is separating signals that arise from active and inactive species. Modulation excitation spectroscopy combined with phase-sensitive detection analysis is a powerful tool for removing noise from the data while simultaneously revealing the underlying kinetics of the reaction. A stimulus is applied at a constant frequency to the reaction system, for example, a reactant cycled with an inert phase. Through mathematical manipulation of the data, any signal contributing to the overall spectra but not oscillating with the same frequency as the stimulus will be dampened or removed. With phase-sensitive detection, signals oscillating with the stimulus frequency but with various lag times are amplified, providing valuable kinetic information. In this chapter, some examples are provided from the literature that have successfully used modulation excitation spectroscopy with phase-sensitive detection to uncover previously unobserved reaction intermediates and kinetics. Examples from a broad range of spectroscopic methods are included to provide perspective to the reader.
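
    A minimal numerical sketch of the demodulation step described above: only contributions that oscillate at the stimulus frequency survive the phase-sensitive integral, while static (spectator) signals and random noise are suppressed. The synthetic signal, phase lag, and noise level below are illustrative assumptions, not data from the chapter.

```python
import numpy as np

# Phase-sensitive detection (PSD) sketch on a synthetic modulated signal.
T = 1.0                              # one full modulation period
t = np.linspace(0.0, T, 1000, endpoint=False)
omega = 2 * np.pi / T

responding = np.sin(omega * t + 0.6)                         # species following the stimulus (0.6 rad lag)
static = 0.8 * np.ones_like(t)                               # inactive (spectator) contribution
noise = 0.3 * np.random.default_rng(0).normal(size=t.size)
signal = responding + static + noise

def demodulate(signal, k=1, phi=0.0):
    """Discrete approximation of (2/T) * integral of A(t) * sin(k*omega*t + phi) dt."""
    return 2.0 * np.mean(signal * np.sin(k * omega * t + phi))

phases = np.linspace(0.0, 2 * np.pi, 72, endpoint=False)
amplitudes = [demodulate(signal, k=1, phi=p) for p in phases]
best = phases[int(np.argmax(amplitudes))]
print(f"demodulated amplitude peaks near phi ~ {best:.2f} rad (the phase lag of the responding species)")
```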

  11. Calibration and Validation of the Nonpoint Source Pollution and Erosion Comparison Tool, N-SPECT, for Tropical Conditions

    NASA Astrophysics Data System (ADS)

    Fares, A.; Cheng, C. L.; Dogan, A.

    2006-12-01

    Impaired water quality caused by agriculture, urbanization, and spread of invasive species has been identified as a major factor in the degradation of coastal ecosystems in the tropics. Watershed-scale nonpoint source pollution models facilitate the evaluation of effective management practices to alleviate the negative impacts of different land-use changes. The Non-Point Source Pollution and Erosion Comparison Tool (N-SPECT) is a newly released watershed model that was not previously tested under tropical conditions. The two objectives of this study were to: i) calibrate and validate N-SPECT for the Hanalei Watershed of the Hawai`ian island of Kaua`i; ii) evaluate the performance of N-SPECT under tropical conditions using the sensitivity analysis approach. The Hanalei watershed contains one of the wettest points on Earth, Mt. Waialeale, with an average annual rainfall of 11,000 mm. This rainfall decreases to 2,000 mm at the outlet of the watershed near the coast. The number of rain days is one of the major input parameters that influences N-SPECT's simulation results. This parameter was used to account for plant canopy interception losses. The watershed was divided into sub-basins to accurately distribute the number of rain days throughout the watershed. Total runoff volume predicted by the model compared well with measured data. The model underestimated measured runoff by 1% for the calibration period and 5% for the validation period due to higher intensity precipitation in the validation period. Sensitivity analysis revealed that the model was most sensitive to the number of rain days, followed by canopy interception, and least sensitive to the number of sub-basins. The sediment and water quality portion of the model is currently being evaluated.
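
    A one-at-a-time perturbation sketch of the kind of sensitivity screening reported here: each input is varied around a baseline while the others are held fixed, and the relative change in the output is recorded. The runoff function, parameter names, and baseline values below are hypothetical stand-ins, not the N-SPECT formulation.

```python
# One-at-a-time (OAT) sensitivity sketch: perturb each input by +/-10% around a
# baseline and record the relative swing of the output.
def runoff(rain_days, canopy_interception, n_subbasins):
    # toy relation: more rain days -> more runoff; interception reduces it slightly
    effective_rain = max(rain_days * (25.0 - canopy_interception), 0.0)
    return effective_rain * (1.0 + 0.01 * n_subbasins)

baseline = {"rain_days": 180, "canopy_interception": 3.0, "n_subbasins": 12}
base_out = runoff(**baseline)

for name, value in baseline.items():
    outs = []
    for factor in (0.9, 1.1):
        perturbed = dict(baseline, **{name: value * factor})
        outs.append(runoff(**perturbed))
    swing = (max(outs) - min(outs)) / base_out
    print(f"{name:>20s}: relative output swing {swing:.2%}")
```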

  12. Schistosomal MicroRNAs Isolated From Extracellular Vesicles in Sera of Infected Patients: A New Tool for Diagnosis and Follow-up of Human Schistosomiasis.

    PubMed

    Meningher, Tal; Lerman, Galya; Regev-Rudzki, Neta; Gold, Daniel; Ben-Dov, Iddo Z; Sidi, Yechezkel; Avni, Dror; Schwartz, Eli

    2017-02-01

    Schistosomiasis traditionally has been diagnosed by detecting eggs in stool or urine. However, the sensitivity of these examinations is limited, especially in travelers with a low worm burden. Serologic tests have a greater sensitivity, but their results remain positive regardless of treatment and thus cannot be used for follow-up of patients. We hypothesized that detection of worm microRNAs (miRNAs) in serum can overcome the drawbacks of the existing diagnostic methods. Twenty-six returning travelers with schistosomiasis (based on positive results of serologic tests or detection of ova) and 17 healthy controls were included in the study. Quantitative reverse transcription polymerase chain reaction (qRT-PCR) amplification of miRNA extracted directly from 500 µL of serum had limited sensitivity and specificity. However, qRT-PCR analysis of RNA extracted from 200 μL of serum extracellular vesicles detected 4 schistosomal miRNAs; the sensitivity and specificity of the 2 highest expressed miRNAs (bantam and miR-2c-3p) were 86% and 84%, respectively. In 7 patients with posttreatment serum available for analysis, we observed outcomes ranging from a reduction in the schistosomal miRNA level to full recovery from disease. qRT-PCR of pathogen miRNAs isolated from extracellular vesicles in sera from infected individuals may provide a new tool for diagnosing schistosomiasis in patients with a low parasite burden. This assay could also be used for evaluating the outcome of therapy, as well as disease-control programs. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  13. Measuring voice outcomes: state of the science review.

    PubMed

    Carding, Pau N; Wilson, J A; MacKenzie, K; Deary, I J

    2009-08-01

    Researchers evaluating voice disorder interventions currently have a plethora of voice outcome measurement tools from which to choose. Faced with such a wide choice, it would be beneficial to establish a clear rationale to guide selection. This article reviews the published literature on the three main areas of voice outcome assessment: (1) perceptual rating of voice quality, (2) acoustic measurement of the speech signal and (3) patient self-reporting of voice problems. We analysed the published reliability, validity, sensitivity to change and utility of the common outcome measurement tools in each area. From the data, we suggest that routine voice outcome measurement should include (1) an expert rating of voice quality (using the Grade-Roughness-Breathiness-Asthenia-Strain rating scale) and (2) a short self-reporting tool (either the Vocal Performance Questionnaire or the Vocal Handicap Index 10). These measures have high validity, the best reported reliability to date, good sensitivity to change data and excellent utility ratings. However, their application and administration require attention to detail. Acoustic measurement has arguable validity and poor reliability data at the present time. Other areas of voice outcome measurement (e.g. stroboscopy and aerodynamic phonatory measurements) require similarly detailed research and analysis.

  14. Detection of Onchocerca volvulus in Skin Snips by Microscopy and Real-Time Polymerase Chain Reaction: Implications for Monitoring and Evaluation Activities.

    PubMed

    Thiele, Elizabeth A; Cama, Vitaliano A; Lakwo, Thomson; Mekasha, Sindeaw; Abanyie, Francisca; Sleshi, Markos; Kebede, Amha; Cantey, Paul T

    2016-04-01

    Microscopic evaluation of skin biopsies is the monitoring and evaluation (M and E) method currently used by multiple onchocerciasis elimination programs in Africa. However, as repeated mass drug administration suppresses microfilarial loads, the sensitivity and programmatic utility of skin snip microscopy is expected to decrease. Using a pan-filarial real-time polymerase chain reaction with melt curve analysis (qPCR-MCA), we evaluated 1) the use of a single-step molecular assay for detecting and identifying Onchocerca volvulus microfilariae in residual skin snips and 2) the sensitivity of skin snip microscopy relative to qPCR-MCA. Skin snips were collected and examined with routine microscopy in hyperendemic regions of Uganda and Ethiopia (N= 500 each) and "residual" skin snips (tissue remaining after induced microfilarial emergence) were tested with qPCR-MCA. qPCR-MCA detected Onchocerca DNA in 223 residual snips: 139 of 147 microscopy(+) and 84 among microscopy(-) snips, suggesting overall sensitivity of microscopy was 62.3% (139/223) relative to qPCR-MCA (75.6% in Uganda and 28.6% in Ethiopia). These findings demonstrate the insufficient sensitivity of skin snip microscopy for reliable programmatic monitoring. Molecular tools such as qPCR-MCA can augment sensitivity and provide diagnostic confirmation of skin biopsies and will be useful for evaluation or validation of new onchocerciasis M and E tools. © The American Society of Tropical Medicine and Hygiene.

  15. MutScan: fast detection and visualization of target mutations by scanning FASTQ data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Wen, Tiexiang; Li, Hong; Xu, Mingyan; Gu, Jia

    2018-01-22

    Some types of clinical genetic tests, such as cancer testing using circulating tumor DNA (ctDNA), require sensitive detection of known target mutations. However, conventional next-generation sequencing (NGS) data analysis pipelines typically involve different steps of filtering, which may cause missed detection of key mutations with low frequencies. Variant validation is also indicated for key mutations detected by bioinformatics pipelines. Typically, this process can be executed using alignment visualization tools such as IGV or GenomeBrowse. However, these tools are too heavy and therefore unsuitable for validating mutations in ultra-deep sequencing data. We developed MutScan to address problems of sensitive detection and efficient validation for target mutations. MutScan involves highly optimized string-searching algorithms, which can scan input FASTQ files to grab all reads that support target mutations. The collected supporting reads for each target mutation will be piled up and visualized using web technologies such as HTML and JavaScript. Algorithms such as rolling hash and bloom filter are applied to accelerate scanning and make MutScan able to detect or visualize target mutations very quickly. MutScan is a tool for the detection and visualization of target mutations by only scanning FASTQ raw data directly. Compared to conventional pipelines, it offers very high performance, executing about 20 times faster, and maximal sensitivity, since it can grab mutations with even a single supporting read. MutScan visualizes detected mutations by generating interactive pile-ups using web technologies. These can serve to validate target mutations, thus avoiding false positives. Furthermore, MutScan can visualize all mutation records in a VCF file to HTML pages for cloud-friendly VCF validation. MutScan is an open source tool available at GitHub: https://github.com/OpenGene/MutScan.
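
    A minimal Rabin-Karp-style rolling-hash scan over a read, in the spirit of the string-searching approach described above. This is an illustrative sketch only: MutScan itself is a C++ tool and additionally uses bloom filters and pile-up visualization, and the target sequence and read below are made up.

```python
# Rolling-hash (Rabin-Karp style) scan for target k-mers in a sequencing read.
BASE, MOD = 4, (1 << 61) - 1
ENC = {"A": 0, "C": 1, "G": 2, "T": 3}

def kmer_hash(seq):
    h = 0
    for ch in seq:
        h = (h * BASE + ENC[ch]) % MOD
    return h

def scan_read(read, targets, k):
    """Yield (position, target) for every k-mer in the read matching a target context."""
    if len(read) < k:
        return
    target_hashes = {kmer_hash(t): t for t in targets}
    top = pow(BASE, k - 1, MOD)
    h = kmer_hash(read[:k])
    for i in range(len(read) - k + 1):
        if h in target_hashes and read[i:i + k] == target_hashes[h]:  # string compare rules out collisions
            yield i, target_hashes[h]
        if i + k < len(read):
            h = ((h - ENC[read[i]] * top) * BASE + ENC[read[i + k]]) % MOD

# A "target" is the local sequence context around a known mutation (hypothetical example).
targets = ["ACGTTGCA"]
read = "TTTACGTTGCAGGG"
print(list(scan_read(read, targets, k=8)))   # [(3, 'ACGTTGCA')]
```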

  16. The current role of high-resolution mass spectrometry in food analysis.

    PubMed

    Kaufmann, Anton

    2012-05-01

    High-resolution mass spectrometry (HRMS), which is used for residue analysis in food, has gained wider acceptance in the last few years. This development is due to the availability of more rugged, sensitive, and selective instrumentation. The benefits provided by HRMS over classical unit-mass-resolution tandem mass spectrometry are considerable. These benefits include the collection of full-scan spectra, which provides greater insight into the composition of a sample. Consequently, the analyst has the freedom to measure compounds without previous compound-specific tuning, the possibility of retrospective data analysis, and the capability of performing structural elucidations of unknown or suspected compounds. HRMS strongly competes with classical tandem mass spectrometry in the field of quantitative multiresidue methods (e.g., pesticides and veterinary drugs). It is one of the most promising tools when moving towards nontargeted approaches. Certain hardware and software issues still have to be addressed by the instrument manufacturers for it to dislodge tandem mass spectrometry from its position as the standard trace analysis tool.

  17. Investigation of priorities in water quality management based on correlations and variations.

    PubMed

    Boyacıoğlu, Hülya; Gündogdu, Vildan; Boyacıoğlu, Hayal

    2013-04-15

    The development of water quality assessment strategies investigating spatial and temporal changes caused by natural and anthropogenic phenomena is an important tool in management practices. This paper used cluster analysis, the water quality index method, sensitivity analysis and canonical correlation analysis to investigate priorities in pollution control activities. Data sets representing 22 surface water quality parameters were subject to analysis. Results revealed that organic pollution was a serious threat to overall water quality in the region. In addition, oil and grease, lead and mercury were the critical variables violating the standard. In contrast to inorganic variables, organic and physical-inorganic chemical parameters were influenced by variations in physical conditions (discharge, temperature). This study showed that information produced based on the variations and correlations in water quality data sets can be helpful to investigate priorities in water management activities. Moreover, statistical techniques and index methods are useful tools in the data-to-information transformation process. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Acoustic emission and nondestructive evaluation of biomaterials and tissues.

    PubMed

    Kohn, D H

    1995-01-01

    Acoustic emission (AE) is an acoustic wave generated by the release of energy from localized sources in a material subjected to an externally applied stimulus. This technique may be used nondestructively to analyze tissues, materials, and biomaterial/tissue interfaces. Applications of AE include use as an early warning tool for detecting tissue and material defects and incipient failure, monitoring damage progression, predicting failure, characterizing failure mechanisms, and serving as a tool to aid in understanding material properties and structure-function relations. All these applications may be performed in real time. This review discusses general principles of AE monitoring and the use of the technique in 3 areas of importance to biomedical engineering: (1) analysis of biomaterials, (2) analysis of tissues, and (3) analysis of tissue/biomaterial interfaces. Focus in these areas is on detection sensitivity, methods of signal analysis in both the time and frequency domains, the relationship between acoustic signals and microstructural phenomena, and the uses of the technique in establishing a relationship between signals and failure mechanisms.

  19. Using a Software Tool in Forecasting: a Case Study of Sales Forecasting Taking into Account Data Uncertainty

    NASA Astrophysics Data System (ADS)

    Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter

    2016-10-01

    Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects the business outcomes and ultimately may significantly affect the economic stability of the company. The accuracy of the prediction depends on the suitability of the forecasting methods used, experience, quality of input data, time period and other factors. The input data are usually not deterministic but are often of a random nature. They are affected by uncertainties of the market environment and many other factors. By taking the input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. A forecasting approach is proposed, and the impact of uncertain input parameters on the target forecast value is simulated in a case-study model. A statistical analysis and risk analysis of the forecast results is carried out, including sensitivity analysis and variables impact analysis.
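
    A minimal Monte Carlo sketch of propagating input uncertainty into a forecast: uncertain inputs are drawn from assumed distributions, the forecast relation is evaluated for each draw, and the result is summarized as a prediction interval rather than a single number. The distributions and the revenue relation below are hypothetical stand-ins, not the article's case-study model.

```python
import numpy as np

# Monte Carlo propagation of input uncertainty into a simple sales forecast.
rng = np.random.default_rng(42)
n = 100_000

base_demand = rng.normal(loc=10_000, scale=800, size=n)                   # units per period
growth      = rng.triangular(left=-0.02, mode=0.03, right=0.08, size=n)   # period-on-period growth
price       = rng.uniform(low=9.5, high=11.0, size=n)                     # currency units per unit

revenue = base_demand * (1 + growth) * price

p5, p50, p95 = np.percentile(revenue, [5, 50, 95])
print(f"forecast revenue: median {p50:,.0f}, 90% interval [{p5:,.0f}, {p95:,.0f}]")
```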

  20. A rapid and sensitive fluorometric method for the quantitative analysis of snake venom metalloproteases and their inhibitors.

    PubMed

    Biardi, J E; Nguyen, K T; Lander, S; Whitley, M; Nambiar, K P

    2011-02-01

    Metalloproteases are responsible for the hemorrhagic effects of many snake venoms and contribute to other pathways that lead to local tissue damage. Methods that quantify snake venom metalloproteases (SVMP) are therefore valuable tools in research on the clinical, physiological, and biochemical effects of envenomation. Comparative analysis of individual, population, and species differences requires screening of large numbers of samples and treatments, and therefore requires a method of quantifying SVMP activity that is simple, rapid, and sensitive. This paper demonstrates the properties of a new fluorometric assay of SVMP activity that can provide a measure of metalloprotease activity in 1 h. The assay is reliable, with variation among replicates sufficiently small to reliably detect differences between species (F(19,60) = 2924, p < 0.001), even for those venoms with low overall activity. It is also sensitive enough to detect differences among venoms using <2 ng of whole venom protein. We provide an example use of this assay to detect the presence of natural SVMP inhibitors in minute samples of blood plasma from rock squirrels (S. variegatus), a natural prey species for North American rattlesnakes. We propose this assay is a useful addition to the set of tools used to characterize venoms, as well as for high-throughput screening of natural or synthetic inhibitors and other novel therapeutic agents against SVMP effects. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. Horizontal transfer of miR-106a/b from cisplatin resistant hepatocarcinoma cells can alter the sensitivity of cervical cancer cells to cisplatin.

    PubMed

    Raji, Grace R; Sruthi, T V; Edatt, Lincy; Haritha, K; Sharath Shankar, S; Sameer Kumar, V B

    2017-10-01

    Recent studies indicate that horizontal transfer of genetic material can act as a communication tool between heterogeneous populations of tumour cells, thus altering the chemosensitivity of tumour cells. The present study was designed to check whether the horizontal transfer of miRNAs released by cisplatin-resistant (Cp-r) hepatocarcinoma cells can alter the sensitivity of cervical cancer cells. For this, exosomes secreted by cisplatin-resistant and cisplatin-sensitive HepG2 cells (EXres and EXsen) were isolated and characterised. Cytotoxicity analysis showed that EXres can make HeLa cells resistant to cisplatin. Analysis of miR-106a/b levels in EXres and EXsen showed that their levels vary. Mechanistic studies showed that miR-106a/b play an important role in the EXsen- and EXres-mediated change in chemosensitivity of HeLa cells to cisplatin. Further, SIRT1 was identified as a major target of miR-106a/b using in silico tools, and this was confirmed experimentally. The effect of miR-106a/b on chemosensitivity was also seen to depend on the regulation of SIRT1 by miR-106a/b. In brief, this study brings to light the SIRT1-dependent mechanism of miR-106a/b-mediated regulation of chemosensitivity upon horizontal transfer from one cell type to another. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. A computational model that predicts behavioral sensitivity to intracortical microstimulation

    PubMed Central

    Kim, Sungshin; Callier, Thierri; Bensmaia, Sliman J.

    2016-01-01

    Objective: Intracortical microstimulation (ICMS) is a powerful tool to investigate the neural mechanisms of perception and can be used to restore sensation for patients who have lost it. While sensitivity to ICMS has previously been characterized, no systematic framework has been developed to summarize the detectability of individual ICMS pulse trains or the discriminability of pairs of pulse trains. Approach: We develop a simple simulation that describes the responses of a population of neurons to a train of electrical pulses delivered through a microelectrode. We then perform an ideal observer analysis on the simulated population responses to predict the behavioral performance of non-human primates in ICMS detection and discrimination tasks. Main results: Our computational model can predict behavioral performance across a wide range of stimulation conditions with high accuracy (R² = 0.97) and generalizes to novel ICMS pulse trains that were not used to fit its parameters. Furthermore, the model provides a theoretical basis for the finding that amplitude discrimination based on ICMS violates Weber's law. Significance: The model can be used to characterize the sensitivity to ICMS across the range of perceptible and safe stimulation regimes. As such, it will be a useful tool for both neuroscience and neuroprosthetics. PMID:27977419
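
    A minimal sketch of the ideal-observer step described above: population spike counts are simulated with and without an ICMS pulse train, and the separation between the two count distributions is converted into predicted two-interval detection performance. The firing rates, neuron count, window, and Poisson assumption are illustrative stand-ins, not the authors' fitted model.

```python
import numpy as np
from scipy.stats import norm

# Ideal-observer sketch: summed population spike counts with vs. without stimulation.
rng = np.random.default_rng(1)
n_neurons, n_trials = 200, 5000
baseline_rate = 5.0            # spikes/s per neuron, no stimulation (assumed)
evoked_rate   = 6.5            # spikes/s per neuron during the pulse train (assumed)
window        = 0.2            # integration window in seconds

catch = rng.poisson(baseline_rate * window, size=(n_trials, n_neurons)).sum(axis=1)
stim  = rng.poisson(evoked_rate  * window, size=(n_trials, n_neurons)).sum(axis=1)

d_prime = (stim.mean() - catch.mean()) / np.sqrt(0.5 * (stim.var() + catch.var()))
p_correct = norm.cdf(d_prime / np.sqrt(2))   # ideal observer in a two-interval detection task
print(f"d' = {d_prime:.2f}, predicted percent correct = {100 * p_correct:.1f}%")
```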

  3. A computational model that predicts behavioral sensitivity to intracortical microstimulation.

    PubMed

    Kim, Sungshin; Callier, Thierri; Bensmaia, Sliman J

    2017-02-01

    Intracortical microstimulation (ICMS) is a powerful tool to investigate the neural mechanisms of perception and can be used to restore sensation for patients who have lost it. While sensitivity to ICMS has previously been characterized, no systematic framework has been developed to summarize the detectability of individual ICMS pulse trains or the discriminability of pairs of pulse trains. We develop a simple simulation that describes the responses of a population of neurons to a train of electrical pulses delivered through a microelectrode. We then perform an ideal observer analysis on the simulated population responses to predict the behavioral performance of non-human primates in ICMS detection and discrimination tasks. Our computational model can predict behavioral performance across a wide range of stimulation conditions with high accuracy (R² = 0.97) and generalizes to novel ICMS pulse trains that were not used to fit its parameters. Furthermore, the model provides a theoretical basis for the finding that amplitude discrimination based on ICMS violates Weber's law. The model can be used to characterize the sensitivity to ICMS across the range of perceptible and safe stimulation regimes. As such, it will be a useful tool for both neuroscience and neuroprosthetics.

  4. A computational model that predicts behavioral sensitivity to intracortical microstimulation

    NASA Astrophysics Data System (ADS)

    Kim, Sungshin; Callier, Thierri; Bensmaia, Sliman J.

    2017-02-01

    Objective. Intracortical microstimulation (ICMS) is a powerful tool to investigate the neural mechanisms of perception and can be used to restore sensation for patients who have lost it. While sensitivity to ICMS has previously been characterized, no systematic framework has been developed to summarize the detectability of individual ICMS pulse trains or the discriminability of pairs of pulse trains. Approach. We develop a simple simulation that describes the responses of a population of neurons to a train of electrical pulses delivered through a microelectrode. We then perform an ideal observer analysis on the simulated population responses to predict the behavioral performance of non-human primates in ICMS detection and discrimination tasks. Main results. Our computational model can predict behavioral performance across a wide range of stimulation conditions with high accuracy (R² = 0.97) and generalizes to novel ICMS pulse trains that were not used to fit its parameters. Furthermore, the model provides a theoretical basis for the finding that amplitude discrimination based on ICMS violates Weber’s law. Significance. The model can be used to characterize the sensitivity to ICMS across the range of perceptible and safe stimulation regimes. As such, it will be a useful tool for both neuroscience and neuroprosthetics.

  5. [Criterion Validity of the German Version of the CES-D in the General Population].

    PubMed

    Jahn, Rebecca; Baumgartner, Josef S; van den Nest, Miriam; Friedrich, Fabian; Alexandrowicz, Rainer W; Wancata, Johannes

    2018-04-17

    The "Center of Epidemiologic Studies - Depression scale" (CES-D) is a well-known screening tool for depression. Until now, the criterion validity of the German version of the CES-D had not been investigated in a sample of the adult general population. 508 study participants from the Austrian general population completed the CES-D. ICD-10 diagnoses were established using the Schedules for Clinical Assessment in Neuropsychiatry (SCAN). Receiver Operating Characteristics (ROC) analysis was conducted. Possible gender differences were explored. Overall discriminating performance of the CES-D was sufficient (ROC-AUC 0.836). Using the traditional cut-off values of 15/16 and 21/22, the sensitivity was 43.2% and 32.4%, respectively. The cut-off value developed on the basis of our sample was 9/10, with a sensitivity of 81.1% and a specificity of 74.3%. There were no significant gender differences. This is the first study investigating the criterion validity of the German version of the CES-D in the general population. The optimal cut-off values yielded sufficient sensitivity and specificity, comparable to the values of other screening tools. © Georg Thieme Verlag KG Stuttgart · New York.
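
    A minimal sketch of the cut-off analysis behind figures like these: sensitivity and specificity are computed at each candidate cut-off, and an "optimal" threshold is chosen, for example by the Youden index. The simulated score distributions and group sizes below are illustrative, not the study data.

```python
import numpy as np

# Cut-off analysis on synthetic CES-D-like scores (a case is "positive" if score >= cutoff).
rng = np.random.default_rng(7)
scores_depressed = rng.normal(loc=18, scale=7, size=50).clip(0, 60).round()
scores_healthy   = rng.normal(loc=7,  scale=5, size=450).clip(0, 60).round()

best = None
for cutoff in range(1, 40):
    sens = np.mean(scores_depressed >= cutoff)
    spec = np.mean(scores_healthy < cutoff)
    youden = sens + spec - 1
    if best is None or youden > best[0]:
        best = (youden, cutoff, sens, spec)

_, cutoff, sens, spec = best
print(f"Youden-optimal cut-off {cutoff - 1}/{cutoff}: sensitivity {sens:.1%}, specificity {spec:.1%}")
```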

  6. Measuring Road Network Vulnerability with Sensitivity Analysis

    PubMed Central

    Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin

    2017-01-01

    This paper focuses on the development of a method for road network vulnerability analysis from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining the traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change in the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and make the application of vulnerability analysis to large actual road networks possible. Finally, all the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of road networks and identifying critical infrastructures in transportation planning and management, especially in resource allocation for mitigation and recovery. PMID:28125706

  7. Motif-based analysis of large nucleotide data sets using MEME-ChIP

    PubMed Central

    Ma, Wenxiu; Noble, William S; Bailey, Timothy L

    2014-01-01

    MEME-ChIP is a web-based tool for analyzing motifs in large DNA or RNA data sets. It can analyze peak regions identified by ChIP-seq, cross-linking sites identified by CLIP-seq and related assays, as well as sets of genomic regions selected using other criteria. MEME-ChIP performs de novo motif discovery, motif enrichment analysis, motif location analysis and motif clustering, providing a comprehensive picture of the DNA or RNA motifs that are enriched in the input sequences. MEME-ChIP performs two complementary types of de novo motif discovery: weight matrix–based discovery for high accuracy; and word-based discovery for high sensitivity. Motif enrichment analysis using DNA or RNA motifs from human, mouse, worm, fly and other model organisms provides even greater sensitivity. MEME-ChIP’s interactive HTML output groups and aligns significant motifs to ease interpretation. This protocol takes less than 3 h, and it provides motif discovery approaches that are distinct and complementary to other online methods. PMID:24853928

  8. Sensitivity of Podosphaera aphanis isolates to DMI fungicides: distribution and reduced cross-sensitivity.

    PubMed

    Sombardier, Audrey; Dufour, Marie-Cécile; Blancard, Dominique; Corio-Costet, Marie-France

    2010-01-01

    Management of strawberry powdery mildew, Podosphaera aphanis (Wallr.), requires numerous fungicide treatments. Limiting epidemics is heavily dependent on sterol demethylation inhibitors (DMIs) such as myclobutanil or penconazole. Recently, a noticeable reduction in the efficacy of these triazole fungicides was reported by strawberry growers in France. The goal of this study was to investigate the state of DMI sensitivity of French P. aphanis and provide tools for improved pest management. Using leaf disc sporulation assays, sensitivity to myclobutanil and penconazole of 23 isolates of P. aphanis was monitored. Myclobutanil EC(50) ranged from less than 0.1 to 14.67 mg L(-1) and for penconazole from 0.04 to 4.2 mg L(-1). A cross-analysis and a Venn diagram showed that there was reduced sensitivity and a positive correlation between the isolates less sensitive to myclobutanil and those less sensitive to penconazole; 73.9% of isolates were less sensitive to a DMI and 47.8% exhibited less sensitivity to both fungicides. The results show that sensitivity to myclobutanil and, to a lesser extent, to penconazole has decreased in strawberry powdery mildew in France. Therefore, urgent action is required in order to document its appearance and optimise methods of control.
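
    A minimal sketch of how an EC(50) can be estimated from leaf-disc dose-response data by fitting a log-logistic curve. The doses and relative sporulation values below are hypothetical, and the two-parameter model is an illustrative choice rather than the authors' protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-parameter log-logistic dose-response model: response = 1 / (1 + (dose/EC50)^slope).
def log_logistic(dose, ec50, slope):
    return 1.0 / (1.0 + (dose / ec50) ** slope)

doses = np.array([0.01, 0.1, 0.5, 1.0, 5.0, 10.0])          # mg/L, hypothetical
response = np.array([0.98, 0.90, 0.70, 0.52, 0.15, 0.06])   # sporulation relative to untreated control

(ec50, slope), _ = curve_fit(log_logistic, doses, response,
                             p0=(1.0, 1.0), bounds=([1e-6, 0.1], [100.0, 10.0]))
print(f"EC50 ~ {ec50:.2f} mg/L (slope ~ {slope:.2f})")
```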

  9. Stereometric parameters change vs. Topographic Change Analysis (TCA) agreement in Heidelberg Retina Tomography III (HRT-3) early detection of clinical significant glaucoma progression.

    PubMed

    Dascalu, A M; Cherecheanu, A P; Stana, D; Voinea, L; Ciuluvica, R; Savlovschi, C; Serban, D

    2014-01-01

    To investigate the sensitivity and specificity of stereometric parameter change analysis vs. Topographic Change Analysis in the early detection of glaucoma progression. 81 patients with POAG were monitored for 4 years (GAT monthly, SAP every 6 months, optic disc photographs and HRT3 yearly). The exclusion criteria were other optic disc or retinal pathology; topographic standard deviation (TSD) > 30; inter-test variation of reference height > 25 μm. The criterion for structural progression was the following: at least 20 adjacent super-pixels with a clinically significant decrease in height (>5%). Sixteen of the 81 patients presented structural progression on TCA. The most useful stereometric parameters for the early detection of glaucoma progression were the following: Rim Area change (sensitivity 100%, specificity 74.2% for a cut-off value of -0.05), C/D Area change (sensitivity 85.7%, specificity 71.5% for a cut-off value of 0.02), C/D linear change (sensitivity 85.7%, specificity 71.5% for a cut-off value of 0.02), Rim Volume change (sensitivity 71.4%, specificity 88.8% for a cut-off value of -0.04). RNFL Thickness change (<0) was highly sensitive (82%), but less specific for glaucoma progression (45.2%). Changes in the other stereometric parameters have limited diagnostic value for the early detection of glaucoma progression. TCA is a valuable tool for the assessment of structural progression in glaucoma patients, and its inter-test variability is low. In the long term, quantitative analysis of stereometric parameter change is also very important. The most relevant parameters to detect progression are RA, C/D Area, Linear C/D and RV.

  10. Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines

    NASA Astrophysics Data System (ADS)

    Massa, Luca

    A computational tool is developed for the time-accurate sensitivity analysis of the stage performance of hot gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated in the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial flow turbines, is developed. A sensitivity analysis capability is added to the flow solver, by rendering it able to accurately evaluate the derivatives of the time varying output functions. The complex Taylor's series expansion (CTSE) technique is reviewed. Two of its formulations are used to demonstrate the accuracy and time dependency of the differentiation process. The results are compared with finite difference (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are proposed and applied. Selective differentiation of the method for solving the non-linear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time dependent sensitivity derivatives are computed in run times comparable to the ones required by the FD approach.
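
    For a real-analytic function the CTSE reduces to the complex-step derivative f'(x) ≈ Im[f(x + ih)]/h, which involves no subtractive cancellation and therefore tolerates arbitrarily small steps. A minimal standalone demonstration follows; the test function is a common smooth benchmark, not the turbomachinery solver.

```python
import cmath
import math

# Complex-step derivative versus central finite differences on a smooth test function.
def f(z):
    return cmath.exp(z) / cmath.sqrt(cmath.sin(z) ** 3 + cmath.cos(z) ** 3)

def f_prime_analytic(x):
    # f = e^x * g^(-1/2) with g = sin^3 x + cos^3 x, so f' = f * (1 - g'/(2g))
    g = math.sin(x) ** 3 + math.cos(x) ** 3
    g_prime = 3 * math.sin(x) * math.cos(x) * (math.sin(x) - math.cos(x))
    return f(x).real * (1 - g_prime / (2 * g))

x0 = 1.5
complex_step = f(x0 + 1j * 1e-200).imag / 1e-200          # accurate even for an absurdly small step
central_fd = (f(x0 + 1e-8).real - f(x0 - 1e-8).real) / 2e-8

print(f"analytic     : {f_prime_analytic(x0):.15f}")
print(f"complex step : {complex_step:.15f}")
print(f"central FD   : {central_fd:.15f}")
```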

  11. Using equitable impact sensitive tool (EQUIST) and knowledge translation to promote evidence to policy link in maternal and child health: report of first EQUIST training workshop in Nigeria.

    PubMed

    Uneke, Chigozie Jesse; Sombie, Issiaka; Uro-Chukwu, Henry Chukwuemeka; Johnson, Ermel; Okonofua, Friday

    2017-01-01

    The Equitable Impact Sensitive Tool (EQUIST) designed by UNICEF and knowledge translation (KT) are important strategies that can help policymakers to improve equity and evidence-informed policy making in maternal, newborn and child health (MNCH). The purpose of this study was to improve the knowledge and capacity of an MNCH implementation research team (IRT) and policy makers to use EQUIST and KT. A modified "before and after" intervention study design was used in which outcomes were measured on the target participants both before and after the intervention (workshop) was implemented. A 5-point Likert scale according to the degree of adequacy was employed. A three-day intensive EQUIST and KT training workshop was organized in Edo State, Nigeria with 45 participants in attendance. Some of the topics covered included: (i) Knowledge translation models, measures & tools; (ii) Policy review, analysis and contextualization; (iii) Policy formulation and legislation process; (iv) EQUIST Overview & Theory of change; (v) EQUIST's situation analysis, scenario analysis and scenario comparison. The pre-workshop mean of understanding of use of KT ranged from 2.02-3.41, while the post-workshop mean ranged from 3.24-4.30. The pre-workshop mean of understanding of use of EQUIST ranged from 1.66-2.41, while the post-workshop mean ranged from 3.56-4.54 on the 5-point scale. The percentage increase in mean of KT and EQUIST at the end of the workshop ranged from 8.0%-88.1% and 65.6%-158.4% respectively. Findings of this study suggest that policymakers' and researchers' KT and EQUIST use competence relevant to evidence-informed policymaking can be enhanced through a training workshop.

  12. Using equitable impact sensitive tool (EQUIST) and knowledge translation to promote evidence to policy link in maternal and child health: report of first EQUIST training workshop in Nigeria

    PubMed Central

    Uneke, Chigozie Jesse; Sombie, Issiaka; Uro-Chukwu, Henry Chukwuemeka; Johnson, Ermel; Okonofua, Friday

    2017-01-01

    The Equitable Impact Sensitive Tool (EQUIST) designed by UNICEF and knowledge translation (KT) are important strategies that can help policymakers to improve equity and evidence-informed policy making in maternal, newborn and child health (MNCH). The purpose of this study was to improve the knowledge and capacity of an MNCH implementation research team (IRT) and policy makers to use EQUIST and KT. A modified “before and after” intervention study design was used in which outcomes were measured on the target participants both before and after the intervention (workshop) was implemented. A 5-point Likert scale according to the degree of adequacy was employed. A three-day intensive EQUIST and KT training workshop was organized in Edo State, Nigeria with 45 participants in attendance. Some of the topics covered included: (i) Knowledge translation models, measures & tools; (ii) Policy review, analysis and contextualization; (iii) Policy formulation and legislation process; (iv) EQUIST Overview & Theory of change; (v) EQUIST's situation analysis, scenario analysis and scenario comparison. The pre-workshop mean of understanding of use of KT ranged from 2.02-3.41, while the post-workshop mean ranged from 3.24-4.30. The pre-workshop mean of understanding of use of EQUIST ranged from 1.66-2.41, while the post-workshop mean ranged from 3.56-4.54 on the 5-point scale. The percentage increase in mean of KT and EQUIST at the end of the workshop ranged from 8.0%-88.1% and 65.6%-158.4% respectively. Findings of this study suggest that policymakers' and researchers' KT and EQUIST use competence relevant to evidence-informed policymaking can be enhanced through a training workshop. PMID:29158860

  13. Fluorescence Spectroscopy as a Tool for the Assessment of Liver Samples with Several Stages of Fibrosis.

    PubMed

    Fabila-Bustos, Diego A; Arroyo-Camarena, Úrsula D; López-Vancell, María D; Durán-Padilla, Marco A; Azuceno-García, Itzel; Stolik-Isakina, Suren; Valor-Reed, Alma; Ibarra-Coronado, Elizabeth; Hernández-Quintanar, Luis F; Escobedo, Galileo; de la Rosa-Vázquez, José M

    2018-03-01

    During the last years, fluorescence spectroscopy has been used as a potential tool for the evaluation and characterization of tissues with different disease conditions due to its low cost, high sensitivity, and minimally invasive or noninvasive character. In this study, fluorescence spectroscopy was used to study 19 paraffin blocks containing human liver tissue from biopsies. All samples were previously analyzed by two senior pathologists in a single-blind trial. After their evaluation, four liver samples were classified as nonfibrosis (F0), four as initial fibrosis (F1-F2), four as advanced fibrosis (F3), and six as cirrhosis (F4). The fluorescence was induced at different wavelengths as follows: 330, 365, and 405 nm using a portable fiber-optic system. The fluorescence spectra were recorded in the range of 400-750 nm. A distinctive correlation between the shape of each spectrum and the level of fibrosis in the liver sample was detected. A multivariate statistical analysis based on principal component analysis followed by linear discriminant analysis was applied to develop algorithms able to distinguish different stages of fibrosis based on the characteristics of the fluorescence spectra. Pairwise comparisons were performed: F0 versus F1-F2, F1-F2 versus F3, F3 versus F4, and F1-F2 versus F4. The algorithms applied to each set of data yielded values of sensitivity and specificity that were higher than 90% and 95%, respectively, in all the analyzed cases. With this study, it is concluded that fluorescence spectroscopy can be used as a complementary tool for the assessment of liver fibrosis in liver tissue samples, which sets the stage for subsequent clinical trials.
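
    A minimal sketch of a PCA-then-LDA classification pipeline of the kind described above, evaluated with leave-one-out cross-validation. The synthetic "spectra" (two classes differing by a small band shift), class sizes, and number of principal components are illustrative assumptions, not the study's data or settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut
from sklearn.pipeline import make_pipeline

# PCA followed by LDA, with leave-one-out validation on synthetic spectra.
rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 750, 350)

def synth_spectrum(peak_nm):
    band = np.exp(-((wavelengths - peak_nm) ** 2) / (2 * 30.0 ** 2))
    return band + 0.05 * rng.normal(size=wavelengths.size)

X = np.array([synth_spectrum(500) for _ in range(20)] + [synth_spectrum(515) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)   # e.g. 0 = early fibrosis, 1 = advanced fibrosis (labels are illustrative)

model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
correct = 0
for train, test in LeaveOneOut().split(X):
    model.fit(X[train], y[train])
    correct += int(model.predict(X[test])[0] == y[test][0])
print(f"leave-one-out accuracy: {correct / len(y):.2f}")
```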

  14. Toxicity risk assessment of mercury, DDT and arsenic legacy pollution in sediments: A triad approach under low concentration conditions.

    PubMed

    Marziali, L; Rosignoli, F; Drago, A; Pascariello, S; Valsecchi, L; Rossaro, B; Guzzella, L

    2017-09-01

    The determination of sediment toxicity is challenging due to site-specific factors affecting pollutant distribution and bioavailability, especially when contamination levels are close to expected non-effect concentrations. Different lines of evidence and sensitive tools are necessary for a proper toxicity risk assessment. We examined the case study of the Toce River (Northern Italy), where past industrial activities caused Hg, DDT and As enrichment in sediments. A triad approach comprising chemical, ecotoxicological and ecological analyses (benthic invertebrates) was carried out for risk assessment of residual contamination in river sediments. A "blank" site upstream from the industrial site was selected for comparison with the sites downstream. Sediment, water and benthic invertebrate samplings were carried out following standard protocols. Results emphasized that although the emissions of the industrial site ceased about 20 years ago, sediments in the downstream section of the river remain contaminated by Hg, DDT and As, with concentrations exceeding Threshold Effect Concentrations. A chronic whole-sediment test with Chironomus riparius showed a decreased development rate and a lower number of eggs per mass in the contaminated sediments. The benthic community was analyzed with the calculation of integrated (STAR_ICMi) and stressor-specific metrics (SPEAR pesticide and mean sensitivity to Hg), but no significant differences were found between upstream and downstream sites. On the other hand, multivariate analysis (partial Redundancy Analysis and variation partitioning) emphasized a slight impact on the invertebrate community, accounting for 5% variation in taxa composition. Results show that legacy contaminants in sediments, even at low concentrations, may be bioavailable and possibly toxic for benthic invertebrates. At low concentration levels, sensitive and site-specific tools need to be developed for a proper risk analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Differentiation of Leishmania species by FT-IR spectroscopy

    NASA Astrophysics Data System (ADS)

    Aguiar, Josafá C.; Mittmann, Josane; Ferreira, Isabelle; Ferreira-Strixino, Juliana; Raniero, Leandro

    2015-05-01

    Leishmaniasis is a parasitic infectious disease caused by protozoa that belong to the genus Leishmania. It is transmitted by the bite of an infected female sand fly. The disease is endemic in 88 countries (Desjeux, 2001 [1]), 16 developed countries and 72 developing countries, on four continents. In Brazil, epidemiological data show the disease is present in all Brazilian regions, with the highest incidences in the North and Northeast. There are several methods used to diagnose leishmaniasis, but these procedures have many limitations: they are time consuming, have low sensitivity, and are expensive. In this context, Fourier Transform Infrared Spectroscopy (FT-IR) analysis has the potential to provide rapid results and may be adapted for a clinical test with high sensitivity and specificity. In this work, FT-IR was used as a tool to investigate the promastigotes of the Leishmania amazonensis, Leishmania chagasi, and Leishmania major species. The spectra were analyzed by cluster analysis and a deconvolution procedure based on the spectra's second derivatives. Results: cluster analysis found four specific regions that are able to identify the Leishmania species. The dendrogram representation clearly indicates the heterogeneity among Leishmania species. The band deconvolution done by curve fitting in these regions quantitatively differentiated the polysaccharides, amide III, phospholipids, proteins, and nucleic acids. L. chagasi and L. major showed a greater biochemical similarity and have three bands that were not registered in L. amazonensis. L. amazonensis presented three specific bands that were not recorded in the other two species. It is evident that the FT-IR method is an indispensable tool to discriminate these parasites. The high sensitivity and specificity of this technique opens up the possibilities for further studies on the characterization of other microorganisms.

  16. Sensitivity and Specificity Analysis: Use of Emoticon for Screening of Depression in Elderly in Singapore.

    PubMed

    Tan, Laurence; Toh, Hui Jin; Sim, Lai Kiow; Low, James Alvin

    2018-03-01

    The current screening tools for depression can be tedious to administer, especially in the elderly population with hearing impairment and/or limited proficiency in English language. To look at the feasibility of using an emoticon scale as a screening and assessment tool for depression in the elderly. Cross-sectional study. A total of 77 elderly patients completed the study from June 2014 to August 2015 in a general geriatric outpatient clinic of an acute care hospital in Singapore. Patients rated their mood using an emoticon scale, which ranges from 1 (most happy face) to 7 (most sad face). Depression was assessed using the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV) criteria as the gold standard. Sensitivity and specificity for depression were calculated for the cutoff scores from 1 to 7 on the emoticon scale. The sensitivity percentages were low across all cutoff scores. The specificity was more than 90% for cutoff scores of 5 and above on the emoticon scale. However, all the patients who had depression diagnosed using the DSM-IV criteria did not have emoticon scores of 5 and above. The emoticon scale was easy to use, but its effectiveness in the screening of depression in the elderly needs to be explored further. The inability to use the emoticon scale as a tool may be due to the lack of measurement of the other domains of the DSM-IV criteria (sleep, energy, appetite, etc.), rather than a failure of the emoticon scale to assess mood.

  17. Simplified tools for evaluating domestic ventilation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maansson, L.G.; Orme, M.

    1999-07-01

    Within an International Energy Agency (IEA) project, Annex 27, experts from 8 countries (Canada, France, Italy, Japan, The Netherlands, Sweden, UK and USA) have developed simplified tools for evaluating domestic ventilation systems during the heating season. Tools for building and user aspects, thermal comfort, noise, energy, life cycle cost, reliability and indoor air quality (IAQ) have been devised. The results can be used both for dwellings at the design stage and after construction. The tools lead to immediate answers and indications about the consequences of different choices that may arise during discussion with clients. This paper presents an introduction to these tools. Example applications of the indoor air quality and energy simplified tools are also provided. The IAQ tool accounts for constant emission sources, CO₂, cooking products, tobacco smoke, condensation risks, humidity levels (i.e., for judging the risk for mould and house dust mites), and pressure difference (for identifying the risk for radon or landfill spillage entering the dwelling or problems with indoor combustion appliances). An elaborated set of design parameters was worked out that resulted in about 17,000 combinations. By using multi-variate analysis it was possible to reduce this to 174 combinations for IAQ. In addition, a sensitivity analysis was made using 990 combinations. The results from all the runs were used to develop a simplified tool, as well as quantifying equations relying on the design parameters. A computerized energy tool has also been developed within this project, which takes into account air tightness, climate, window airing pattern, outdoor air flow rate and heat exchange efficiency.

  18. Spectral discrimination of serum from liver cancer and liver cirrhosis using Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Yang, Tianyue; Li, Xiaozhou; Yu, Ting; Sun, Ruomin; Li, Siqi

    2011-07-01

    In this paper, Raman spectra of human serum were measured using Raman spectroscopy, and the spectra were analyzed by the multivariate statistical method of principal component analysis (PCA). Linear discriminant analysis (LDA) was then used as the diagnostic algorithm to differentiate the loading scores of the different diseases. An artificial neural network (ANN) was used for cross-validation. The diagnostic sensitivity and specificity of PCA-LDA are 88% and 79%, while those of PCA-ANN are 89% and 95%. It can be seen that these modern analysis methods are useful tools for the analysis of serum spectra for diagnosing diseases.

  19. Frequency analysis for modulation-enhanced powder diffraction.

    PubMed

    Chernyshov, Dmitry; Dyadkin, Vadim; van Beek, Wouter; Urakawa, Atsushi

    2016-07-01

    Periodic modulation of external conditions on a crystalline sample with a consequent analysis of periodic diffraction response has been recently proposed as a tool to enhance experimental sensitivity for minor structural changes. Here the intensity distributions for both a linear and nonlinear structural response induced by a symmetric and periodic stimulus are analysed. The analysis is further extended for powder diffraction when an external perturbation changes not only the intensity of Bragg lines but also their positions. The derived results should serve as a basis for a quantitative modelling of modulation-enhanced diffraction data measured in real conditions.

  20. Analytical improvements of hybrid LC-MS/MS techniques for the efficient evaluation of emerging contaminants in river waters: a case study of the Henares River (Madrid, Spain).

    PubMed

    Pérez-Parada, Andrés; Gómez-Ramos, María del Mar; Martínez Bueno, María Jesús; Uclés, Samanta; Uclés, Ana; Fernández-Alba, Amadeo R

    2012-02-01

    Instrumental capabilities and software tools of modern hybrid mass spectrometry (MS) instruments such as high-resolution mass spectrometry (HRMS), quadrupole time-of-flight (QTOF), and quadrupole linear ion trap (QLIT) were experimentally investigated for the study of emerging contaminants in Henares River water samples. Automated screening and confirmatory capabilities of QTOF working in full-scan MS and tandem MS (MS/MS) were explored when dealing with real samples. The influence of sensitivity and resolving power on mass accuracy was investigated for the correct assignment of the amoxicillin transformation product 5(R) amoxicillin-diketopiperazine-2',5' as an example of a nontarget compound. On the other hand, a comparison of quantitative and qualitative strategies based on direct injection analysis and off-line solid-phase extraction sample treatment was carried out using two different QLIT instruments for a selected group of emerging contaminants operating in selected reaction monitoring (SRM) and information-dependent acquisition (IDA) modes. Software-aided screening usually needs a further confirmatory step. The resolving power and MS/MS capability of QTOF were shown to confirm or reject most findings in river water, although sensitivity-related limitations are usually encountered. The superior sensitivity of modern QLIT-MS/MS offered the possibility of direct injection analysis for proper quantitative study of a variety of contaminants, while it simultaneously reduced the matrix effect and increased the reliability of the results. Confirmation of ethylamphetamine, which lacks a second SRM transition, was accomplished by using the IDA feature. Hybrid MS instruments equipped with high resolution and high sensitivity contribute to enlarging the scope of targeted analytes in river waters. However, in the tested instruments, there is a margin of improvement, principally in the required sensitivity and in data treatment software tools devoted to reliable confirmation and improved automated data processing.

  1. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes.

    PubMed

    Naujokaitis-Lewis, Ilona; Curtis, Janelle M R

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along with demographic parameters in sensitivity routines. GRIP 2.0 is an important decision-support tool that can be used to prioritize research, identify habitat-based thresholds and management intervention points to improve probability of species persistence, and evaluate trade-offs of alternative management options.
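
    A minimal sketch of a sampling-based global sensitivity analysis in the spirit of the approach described above (not the GRIP 2.0 implementation): all inputs are varied simultaneously and each is ranked by the rank correlation of its samples with the predicted risk. The toy extinction-risk model, parameter names, and ranges are hypothetical stand-ins.

```python
import numpy as np
from scipy.stats import spearmanr

# Sampling-based global sensitivity sketch on a toy extinction-risk model.
rng = np.random.default_rng(3)
n = 2000

habitat_amount = rng.uniform(0.1, 1.0, n)    # fraction of the landscape that is suitable
adult_survival = rng.uniform(0.85, 0.99, n)  # annual adult survival rate
disease_impact = rng.uniform(0.0, 0.3, n)    # proportional survival reduction (e.g. blister rust)

def extinction_risk(habitat, survival, disease):
    effective_survival = survival * (1 - disease)
    growth_proxy = effective_survival * (0.5 + 0.5 * habitat)
    return 1.0 / (1.0 + np.exp(25 * (growth_proxy - 0.8)))   # higher growth proxy -> lower risk

risk = extinction_risk(habitat_amount, adult_survival, disease_impact)

for name, samples in [("habitat_amount", habitat_amount),
                      ("adult_survival", adult_survival),
                      ("disease_impact", disease_impact)]:
    rho = spearmanr(samples, risk)[0]
    print(f"{name:>15s}: rank correlation with risk = {rho:+.2f}")
```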

  2. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes

    PubMed Central

    Curtis, Janelle M.R.

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along with demographic parameters in sensitivity routines. GRIP 2.0 is an important decision-support tool that can be used to prioritize research, identify habitat-based thresholds and management intervention points to improve probability of species persistence, and evaluate trade-offs of alternative management options. PMID:27547529

  3. Performance of in silico prediction tools for the classification of rare BRCA1/2 missense variants in clinical diagnostics.

    PubMed

    Ernst, Corinna; Hahnen, Eric; Engel, Christoph; Nothnagel, Michael; Weber, Jonas; Schmutzler, Rita K; Hauke, Jan

    2018-03-27

    The use of next-generation sequencing approaches in clinical diagnostics has led to a tremendous increase in data and a vast number of variants of uncertain significance that require interpretation. Therefore, prediction of the effects of missense mutations using in silico tools has become a frequently used approach. The aim of this study was to assess the reliability of in silico prediction as a basis for clinical decision making in the context of hereditary breast and/or ovarian cancer. We tested the performance of four prediction tools (Align-GVGD, SIFT, PolyPhen-2, MutationTaster2) using a set of 236 BRCA1/2 missense variants that had previously been classified by expert committees. However, a major pitfall in the creation of a reliable evaluation set for our purpose is the generally accepted classification of BRCA1/2 missense variants using the multifactorial likelihood model, which is partially based on Align-GVGD results. To overcome this drawback we identified 161 variants whose classification is independent of any previous in silico prediction. In addition to their performance as stand-alone tools, we examined the sensitivity, specificity, accuracy and Matthews correlation coefficient (MCC) of combined approaches. PolyPhen-2 achieved the lowest sensitivity (0.67), specificity (0.67), accuracy (0.67) and MCC (0.39). Align-GVGD achieved the highest values of specificity (0.92), accuracy (0.92) and MCC (0.73), but was outperformed regarding its sensitivity (0.90) by SIFT (1.00) and MutationTaster2 (1.00). All tools suffered from poor specificities, resulting in an unacceptable proportion of false positive results in a clinical setting. This shortcoming could not be bypassed by combining these tools. In the best case scenario, 138 families would be affected by the misclassification of neutral variants within the cohort of patients of the German Consortium for Hereditary Breast and Ovarian Cancer. We show that, due to low specificities, state-of-the-art in silico prediction tools are not suitable for predicting the pathogenicity of variants of uncertain significance in BRCA1/2. Thus, clinical consequences should never be based solely on in silico forecasts. However, our data suggest that SIFT and MutationTaster2 could be suitable to predict benignity, as neither tool produced false negative predictions in our analysis.

  4. Biochemical component identification by plasmonic improved whispering gallery mode optical resonance based sensor

    NASA Astrophysics Data System (ADS)

    Saetchnikov, Vladimir A.; Tcherniavskaia, Elina A.; Saetchnikov, Anton V.; Schweiger, Gustav; Ostendorf, Andreas

    2014-05-01

    Experimental data on the detection and identification of a variety of biochemical agents, such as proteins, microelements and antibiotics of different generations, in both single- and multi-component solutions over a wide concentration range, analyzed from the light-scattering parameters of a whispering gallery mode optical resonance based sensor, are presented. Multiplexing over parameters and components was realized using a developed fluidic sensor cell with dielectric microspheres fixed in an adhesive layer, together with data processing. Biochemical component identification was performed using the developed network analysis techniques. The approach is demonstrated to be applicable to both single-agent and multi-component biochemical analysis. A novel technique based on optical resonance in microring structures, plasmon resonance and identification tools has been developed. To improve the sensitivity of the microring structures, the microspheres fixed by adhesive were pretreated with a gold nanoparticle solution. An alternative technique used thin gold films deposited on the substrate below the adhesive layer. Both biomolecule and nanoparticle injections caused considerable changes in the optical resonance spectra. Plasmonic gold layers of optimized thickness also improved the parameters of the optical resonance spectra. Biochemical component identification was likewise performed by the developed network analysis techniques for both single- and multi-component solutions. The advantages of plasmon-enhanced optical microcavity resonance combined with multiparameter identification tools are thus used to develop a new platform for an ultra-sensitive, label-free biomedical sensor.

  5. Parameter sensitivity analysis of the mixed Green-Ampt/Curve-Number method for rainfall excess estimation in small ungauged catchments

    NASA Astrophysics Data System (ADS)

    Romano, N.; Petroselli, A.; Grimaldi, S.

    2012-04-01

    With the aim of combining the practical advantages of the Soil Conservation Service - Curve Number (SCS-CN) method and the Green-Ampt (GA) infiltration model, we have developed a mixed procedure, which is referred to as CN4GA (Curve Number for Green-Ampt). The basic concept is that, for a given storm, the computed SCS-CN total net rainfall amount is used to calibrate the soil hydraulic conductivity parameter of the Green-Ampt model so as to distribute in time the information provided by the SCS-CN method. In a previous contribution, the proposed mixed procedure was evaluated on 100 observed events, showing encouraging results. In this study, a sensitivity analysis is carried out to further explore the feasibility of applying the CN4GA tool in small ungauged catchments. The proposed mixed procedure constrains the GA model with boundary and initial conditions so that the GA soil hydraulic parameters are expected to be insensitive to the net hyetograph peak. To verify and evaluate this behaviour, a synthetic design hyetograph and synthetic rainfall time series are selected and used in a Monte Carlo analysis. The results are encouraging and confirm that the parameter variability makes the proposed method an appropriate tool for hydrologic predictions in ungauged catchments. Keywords: SCS-CN method, Green-Ampt method, rainfall excess, ungauged basins, design hydrograph, rainfall-runoff modelling.
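
    For reference, the SCS-CN rainfall excess relation that CN4GA uses to fix the total net rainfall is not reproduced in the record above; the textbook form (in millimetres), which the procedure may adapt, is

        \[
        Q = \frac{(P - I_a)^2}{P - I_a + S} \quad (P > I_a;\ Q = 0 \text{ otherwise}), \qquad
        I_a = 0.2\,S, \qquad S = \frac{25400}{CN} - 254
        \]

    where Q is the rainfall excess, P the storm rainfall, I_a the initial abstraction and S the potential maximum retention. CN4GA may adopt a different initial abstraction ratio, so this is only the standard form given for orientation.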

  6. In Silico PCR Tools for a Fast Primer, Probe, and Advanced Searching.

    PubMed

    Kalendar, Ruslan; Muterko, Alexandr; Shamekova, Malika; Zhambakin, Kabyl

    2017-01-01

    The polymerase chain reaction (PCR) is fundamental to molecular biology and is the most important practical molecular technique for the research laboratory. The principle of this technique has been further used and applied in many other simple or complex nucleic acid amplification technologies (NAAT). In parallel to laboratory "wet bench" experiments for nucleic acid amplification technologies, in silico or virtual (bioinformatics) approaches have been developed, among which is in silico PCR analysis. In silico NAAT analysis is a useful and efficient complementary method to ensure the specificity of primers or probes for an extensive range of PCR applications, from homology gene discovery and molecular diagnosis to DNA fingerprinting and repeat searching. Predicting the sensitivity and specificity of primers and probes requires a search to determine whether they match a database with an optimal number of mismatches, similarity, and stability. In the development of in silico bioinformatics tools for nucleic acid amplification technologies, the prospects for the development of new NAAT or similar approaches should be taken into account, including forward-looking and comprehensive analysis that is not limited to only one PCR technique variant. The software FastPCR and the online Java web tool are integrated tools for in silico PCR of linear and circular DNA, for multiple primer or probe searches in large or small databases, and for advanced searches. These tools are suitable for processing batch files, which is essential for automation when working with large amounts of data. The FastPCR software is available for download at http://primerdigital.com/fastpcr.html and the online Java version at http://primerdigital.com/tools/pcr.html.

  7. Development of Response Surface Models for Rapid Analysis & Multidisciplinary Optimization of Launch Vehicle Design Concepts

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1999-01-01

    Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and lifecycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools, operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO and fast sensitivity simulations. A second-order response surface model of the form given below has been commonly used in RSM, since in many cases it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
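
    The quadratic form referred to above is not reproduced in this record; the second-order polynomial conventionally used in RSM (the standard form, not quoted from the paper) is

        \[
        \hat{y} = \beta_0 + \sum_{i=1}^{k}\beta_i x_i + \sum_{i=1}^{k}\beta_{ii} x_i^2 + \sum_{i<j}\beta_{ij} x_i x_j ,
        \]

    where the x_i are the k design variables, \hat{y} is the approximated performance characteristic, and the \beta coefficients are estimated by least squares from a designed set of analysis-code runs.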

  8. IUS solid rocket motor contamination prediction methods

    NASA Technical Reports Server (NTRS)

    Mullen, C. R.; Kearnes, J. H.

    1980-01-01

    A series of computer codes was developed to predict solid rocket motor-produced contamination of sensitive spacecraft surfaces. Subscale and flight test data have confirmed some of the analytical results. Application of the analysis tools to a typical spacecraft has provided early identification of potential spacecraft contamination problems and insight into their solutions, e.g., flight plan modifications, plume or outgassing shields, and/or contamination covers.

  9. Brunnstrom Recovery Stage and Motricity Index for the Evaluation of Upper Extremity in Stroke: Analysis for Correlation and Responsiveness

    ERIC Educational Resources Information Center

    Safaz, Ismail; Ylmaz, Bilge; Yasar, Evren; Alaca, Rdvan

    2009-01-01

    The aim of this study was to find out first whether Brunnstrom recovery stage (BRS) and motricity index (MI) were correlated with each other and second to observe whether the two assessment tools were sensitive to changes regarding the rehabilitation outcome. Forty-six stroke patients who were admitted to the Stroke Rehabilitation Unit at our…

  10. Diagnostic accuracy of the aspartate aminotransferase-to-platelet ratio index for the prediction of hepatitis B-related fibrosis: a leading meta-analysis

    PubMed Central

    2012-01-01

    Background The aspartate aminotransferase-to-platelet ratio index (APRI), a tool with limited expense and widespread availability, is a promising noninvasive alternative to liver biopsy for detecting hepatic fibrosis. The objective of this study was to systematically review the performance of the APRI in predicting significant fibrosis and cirrhosis in hepatitis B-related fibrosis. Methods Areas under summary receiver operating characteristic curves (AUROC), sensitivity and specificity were used to examine the accuracy of the APRI for the diagnosis of hepatitis B-related significant fibrosis and cirrhosis. Heterogeneity was explored using meta-regression. Results Nine studies were included in this meta-analysis (n = 1,798). Prevalence of significant fibrosis and cirrhosis were 53.1% and 13.5%, respectively. The summary AUROCs of the APRI for significant fibrosis and cirrhosis were 0.79 and 0.75, respectively. For significant fibrosis, an APRI threshold of 0.5 was 84% sensitive and 41% specific. At the cutoff of 1.5, the summary sensitivity and specificity were 49% and 84%, respectively. For cirrhosis, an APRI threshold of 1.0-1.5 was 54% sensitive and 78% specific. At the cutoff of 2.0, the summary sensitivity and specificity were 28% and 87%, respectively. Meta-regression analysis indicated that the APRI accuracy for both significant fibrosis and cirrhosis was affected by the histological classification system, but not influenced by the interval between biopsy and APRI measurement or by blind biopsy. Conclusion Our meta-analysis suggests that the APRI shows limited value in identifying hepatitis B-related significant fibrosis and cirrhosis. PMID:22333407
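
    The record does not define the index itself; the conventional definition (a standard formula, not quoted from the paper) is

        \[
        \mathrm{APRI} = \frac{\mathrm{AST}/\mathrm{ULN}}{\text{platelet count } (10^{9}/\mathrm{L})} \times 100 ,
        \]

    where AST is the serum aspartate aminotransferase level and ULN is its upper limit of normal; the thresholds of 0.5, 1.5 and 2.0 quoted above are values of this ratio.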

  11. Accuracy analysis and design of A3 parallel spindle head

    NASA Astrophysics Data System (ADS)

    Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan

    2016-03-01

    As functional components of machine tools, parallel mechanisms are widely used in high efficiency machining of aviation components, and accuracy is one of the critical technical indexes. Many researchers have focused on the accuracy problem of parallel mechanisms, but in terms of controlling the errors and improving the accuracy in the design and manufacturing stages, further efforts are required. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis and tolerance allocation are investigated. Based on the inverse kinematic analysis, the error model of the A3 head is established by using first-order perturbation theory and the vector chain method. According to the mapping property of the motion and constraint Jacobian matrices, the compensatable and uncompensatable error sources which affect the accuracy of the end-effector are separated. Furthermore, sensitivity analysis is performed on the uncompensatable error sources. A sensitivity probabilistic model is established and a global sensitivity index is proposed to analyze the influence of the uncompensatable error sources on the accuracy of the end-effector of the mechanism. The results show that orientation error sources have a larger effect on the accuracy of the end-effector. Based on the sensitivity analysis results, the tolerance design is converted into a nonlinearly constrained optimization problem with minimum manufacturing cost as the objective. By utilizing a genetic algorithm, the allocation of the tolerances on each component is finally determined. According to the tolerance allocation results, the tolerance ranges of ten kinds of geometric error sources are obtained. These research achievements can provide fundamental guidelines for component manufacturing and assembly of this kind of parallel mechanism.

  12. Utility of quantitative sensory testing and screening tools in identifying HIV-associated peripheral neuropathy in Western Kenya: pilot testing.

    PubMed

    Cettomai, Deanna; Kwasa, Judith; Kendi, Caroline; Birbeck, Gretchen L; Price, Richard W; Bukusi, Elizabeth A; Cohen, Craig R; Meyer, Ana-Claire

    2010-12-08

    Neuropathy is the most common neurologic complication of HIV but is widely under-diagnosed in resource-constrained settings. We aimed to identify tools that accurately distinguish individuals with moderate/severe peripheral neuropathy and can be administered by non-physician healthcare workers (HCWs) in resource-constrained settings. We enrolled a convenience sample of 30 HIV-infected outpatients from a Kenyan HIV-care clinic. An HCW administered the Neuropathy Severity Score (NSS), Single Question Neuropathy Screen (Single-QNS), Subjective Peripheral Neuropathy Screen (Subjective-PNS), and Brief Peripheral Neuropathy Screen (Brief-PNS). Monofilament, graduated tuning fork, and two-point discrimination examinations were performed. Tools were validated against a neurologist's clinical assessment of moderate/severe neuropathy. The sample was 57% male, with a mean age of 38.6 years and a mean CD4 count of 324 cells/µL. The neurologist's assessment identified 20% (6/30) with moderate/severe neuropathy. Diagnostic utilities for moderate/severe neuropathy were: Single-QNS--83% sensitivity, 71% specificity; Subjective-PNS-total--83% sensitivity, 83% specificity; Subjective-PNS-max and NSS--67% sensitivity, 92% specificity; Brief-PNS--0% sensitivity, 92% specificity; monofilament--100% sensitivity, 88% specificity; graduated tuning fork--83% sensitivity, 88% specificity; two-point discrimination--75% sensitivity, 58% specificity. Pilot testing suggests the Single-QNS, Subjective-PNS, and monofilament examination accurately identify HIV-infected patients with moderate/severe neuropathy and may be useful diagnostic tools in resource-constrained settings.

  13. CometQ: An automated tool for the detection and quantification of DNA damage using comet assay image analysis.

    PubMed

    Ganapathy, Sreelatha; Muraleedharan, Aparna; Sathidevi, Puthumangalathu Savithri; Chand, Parkash; Rajkumar, Ravi Philip

    2016-09-01

    DNA damage analysis plays an important role in determining the approaches for treatment and prevention of various diseases like cancer, schizophrenia and other heritable diseases. The comet assay is a sensitive and versatile method for DNA damage analysis. The main objective of this work is to implement a fully automated tool for the detection and quantification of DNA damage by analysing comet assay images. The comet assay image analysis consists of four stages: (1) classification, (2) comet segmentation, (3) comet partitioning and (4) comet quantification. The main features of the proposed software are the design and development of four comet segmentation methods, and the automatic routing of the input comet assay image to the most suitable one among these methods depending on the type of the image (silver stained or fluorescent stained) as well as the level of DNA damage (heavily damaged or lightly/moderately damaged). A classifier stage, based on a support vector machine (SVM), is designed and implemented at the front end to categorise the input image into one of the above four groups to ensure proper routing. Comet segmentation is followed by comet partitioning, which is implemented using a novel technique coined modified fuzzy clustering. Comet parameters are calculated in the comet quantification stage and are saved in an Excel file. Our dataset consists of 600 silver stained images obtained from 40 schizophrenia patients with different levels of severity, admitted to a tertiary hospital in South India, and 56 fluorescent stained images obtained from different internet sources. The performance of "CometQ", the proposed standalone application for automated analysis of comet assay images, was evaluated by a clinical expert and also compared with that of a recent, related software package, OpenComet. CometQ gave 90.26% positive predictive value (PPV) and 93.34% sensitivity, which are much higher than those of OpenComet, especially in the case of silver stained images. The results are validated using a confusion matrix and the Jaccard index (JI). Comet assay images obtained after DNA damage repair by incubation in the nutrient medium were also analysed, and CometQ showed a significant change in all the comet parameters in most of the cases. Results show that CometQ is an accurate and efficient tool with good sensitivity and PPV for DNA damage analysis using comet assay images. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
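
    A minimal sketch of the routing logic described above: an SVM front end assigns an input image to one of four groups (stain type and damage level) and dispatches it to a matching segmentation routine. The feature extraction and the segmentation functions below are hypothetical placeholders, not CometQ's actual implementation.

        import numpy as np
        from sklearn.svm import SVC

        GROUPS = ["silver_heavy", "silver_light", "fluor_heavy", "fluor_light"]

        def extract_features(image):
            # Placeholder features: intensity statistics plus a crude texture measure.
            return np.array([image.mean(), image.std(), np.abs(np.diff(image, axis=0)).mean()])

        # Placeholder segmentation methods, one per image group.
        SEGMENTERS = {g: (lambda img, name=g: f"mask_from_{name}") for g in GROUPS}

        def train_front_end(images, labels):
            # Stage 1: train the SVM classifier on labelled training images.
            X = np.array([extract_features(img) for img in images])
            return SVC(kernel="rbf").fit(X, labels)

        def route_and_segment(clf, image):
            # The stage-1 prediction decides which segmentation method runs in stage 2.
            group = clf.predict(extract_features(image).reshape(1, -1))[0]
            return group, SEGMENTERS[group](image)

        rng = np.random.default_rng(0)
        train_imgs = [rng.random((64, 64)) * (i + 1) for i in range(8)]
        train_lbls = [GROUPS[i % 4] for i in range(8)]
        clf = train_front_end(train_imgs, train_lbls)
        print(route_and_segment(clf, rng.random((64, 64))))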

  14. FY09 Final Report for LDRD Project: Understanding Viral Quasispecies Evolution through Computation and Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, C

    2009-11-12

    In FY09 they will (1) complete the implementation, verification, calibration, and sensitivity and scalability analysis of the in-cell virus replication model; (2) complete the design of the cell culture (cell-to-cell infection) model; (3) continue the research, design, and development of their bioinformatics tools: the Web-based structure-alignment-based sequence variability tool and the functional annotation of the genome database; (4) collaborate with the University of California at San Francisco on areas of common interest; and (5) submit journal articles that describe the in-cell model with simulations and the bioinformatics approaches to evaluation of genome variability and fitness.

  15. NEMOTAM: tangent and adjoint models for the ocean modelling platform NEMO

    NASA Astrophysics Data System (ADS)

    Vidard, A.; Bouttier, P.-A.; Vigilant, F.

    2015-04-01

    Tangent linear and adjoint models (TAMs) are efficient tools to analyse and to control dynamical systems such as NEMO. They can be involved in a large range of applications such as sensitivity analysis, parameter estimation or the computation of characteristic vectors. A TAM is also required by the 4D-Var algorithm, which is one of the major methods in data assimilation. This paper describes the development and the validation of the tangent linear and adjoint model for the NEMO ocean modelling platform (NEMOTAM). The diagnostic tools that are available alongside NEMOTAM are detailed and discussed, and several applications are also presented.

  16. NEMOTAM: tangent and adjoint models for the ocean modelling platform NEMO

    NASA Astrophysics Data System (ADS)

    Vidard, A.; Bouttier, P.-A.; Vigilant, F.

    2014-10-01

    Tangent linear and adjoint models (TAMs) are efficient tools to analyse and to control dynamical systems such as NEMO. They can be involved in a large range of applications such as sensitivity analysis, parameter estimation or the computation of characteristic vectors. A TAM is also required by the 4D-Var algorithm, which is one of the major methods in data assimilation. This paper describes the development and the validation of the tangent linear and adjoint model for the NEMO ocean modelling platform (NEMOTAM). The diagnostic tools that are available alongside NEMOTAM are detailed and discussed, and several applications are also presented.

  17. An evaluation of copy number variation detection tools for cancer using whole exome sequencing data.

    PubMed

    Zare, Fatima; Dow, Michelle; Monteleone, Nicholas; Hosny, Abdelrahman; Nabavi, Sheida

    2017-05-31

    Recently copy number variation (CNV) has gained considerable interest as a type of genomic/genetic variation that plays an important role in disease susceptibility. Advances in sequencing technology have created an opportunity for detecting CNVs more accurately. Recently whole exome sequencing (WES) has become the primary strategy for sequencing patient samples and studying their genomic aberrations. However, compared to whole genome sequencing, WES introduces more biases and noise that make CNV detection very challenging. Additionally, the complexity of tumors makes the detection of cancer-specific CNVs even more difficult. Although many CNV detection tools have been developed since the introduction of NGS data, there are few tools for somatic CNV detection from WES data in cancer. In this study, we evaluated the performance of the most recent and commonly used CNV detection tools for WES data in cancer to address their limitations and provide guidelines for developing new ones. We focused on the tools that have been designed, or have the ability, to detect cancer somatic aberrations. We compared the performance of the tools in terms of sensitivity and false discovery rate (FDR) using real data and simulated data. Comparative analysis of the results of the tools showed that there is a low consensus among the tools in calling CNVs. Using real data, the tools show moderate sensitivity (~50% - ~80%), fair specificity (~70% - ~94%) and poor FDRs (~27% - ~60%). Also, using simulated data we observed that increasing the coverage beyond 10× in exonic regions does not improve the detection power of the tools significantly. The limited performance of the current CNV detection tools for WES data in cancer indicates the need for developing more efficient and precise CNV detection methods. Due to the complexity of tumors and the high level of noise and biases in WES data, employing advanced novel segmentation, normalization and de-noising techniques that are designed specifically for cancer data is necessary. CNV detection development also suffers from the lack of a gold standard for performance evaluation. Finally, developing tools with user-friendly interfaces and visualization features can enhance CNV studies for a broader range of users.
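
    A minimal sketch of how sensitivity and FDR can be tabulated against a truth set, assuming a simple 50% reciprocal-overlap matching rule on genomic intervals; the rule and the data are illustrative only and this is not the evaluation pipeline used in the study.

        # Score CNV calls against a truth set by 50% reciprocal overlap.

        def overlap(a, b):
            # Length of the overlap between two (start, end) intervals.
            return max(0, min(a[1], b[1]) - max(a[0], b[0]))

        def matches(call, truth, frac=0.5):
            # Reciprocal overlap: both intervals must be covered by at least `frac`.
            ov = overlap(call, truth)
            return ov >= frac * (call[1] - call[0]) and ov >= frac * (truth[1] - truth[0])

        def evaluate(calls, truths, frac=0.5):
            tp = sum(any(matches(c, t, frac) for t in truths) for c in calls)
            found = sum(any(matches(c, t, frac) for c in calls) for t in truths)
            sensitivity = found / len(truths) if truths else 0.0
            fdr = (len(calls) - tp) / len(calls) if calls else 0.0
            return sensitivity, fdr

        truth_cnvs = [(1_000, 5_000), (20_000, 30_000), (50_000, 52_000)]
        called_cnvs = [(1_200, 4_800), (40_000, 45_000)]
        print(evaluate(called_cnvs, truth_cnvs))  # -> (0.333..., 0.5)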

  18. Clinical color vision testing and correlation with visual function.

    PubMed

    Zhao, Jiawei; Davé, Sarita B; Wang, Jiangxia; Subramanian, Prem S

    2015-09-01

    To determine if Hardy-Rand-Rittler (H-R-R) and Ishihara testing are accurate estimates of color vision in subjects with acquired visual dysfunction. Assessment of diagnostic tools. Twenty-two subjects with optic neuropathy (aged 18-65) and 18 control subjects were recruited prospectively from an outpatient clinic. Individuals with visual acuity (VA) <20/200 or with congenital color blindness were excluded. All subjects underwent a comprehensive eye examination including VA, color vision, and contrast sensitivity testing. Color vision was assessed using H-R-R and Ishihara plates and Farnsworth D-15 (D-15) discs. D-15 is the accepted standard for detecting and classifying color vision deficits. Contrast sensitivity was measured using Pelli-Robson contrast sensitivity charts. No relationship was found between H-R-R and D-15 scores (P = .477). H-R-R score and contrast sensitivity were positively correlated (P = .003). On multivariate analysis, contrast sensitivity (β = 8.61, P < .001) and VA (β = 2.01, P = .022) both showed association with H-R-R scores. Similar to H-R-R, Ishihara score did not correlate with D-15 score (P = .973), but on multivariate analysis was related to contrast sensitivity (β = 8.69, P < .001). H-R-R and Ishihara scores had an equivalent relationship with contrast sensitivity (P = .069). Neither H-R-R nor Ishihara testing appears to assess color identification in patients with optic neuropathy. Both H-R-R and Ishihara testing are correlated with contrast sensitivity, and these tests may be useful clinical surrogates for contrast sensitivity testing. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Old model organisms and new behavioral end-points: Swimming alteration as an ecotoxicological response.

    PubMed

    Faimali, Marco; Gambardella, Chiara; Costa, Elisa; Piazza, Veronica; Morgana, Silvia; Estévez-Calvar, Noelia; Garaventa, Francesca

    2017-07-01

    Behavioral responses of aquatic organisms have received much less attention than developmental or reproductive ones due to the scarcity of user-friendly tools for their acquisition. The technological development of data acquisition systems for quantifying behavior in the aquatic environment, and the increase in studies on the relationship between the behavior of aquatic organisms and their physiological/ecological activities, have generated renewed interest in using behavioral responses in marine ecotoxicology as well. Recent reviews on the freshwater environment show that behavioral end-points are comparatively fast and sensitive, and warrant further attention as tools for assessing the toxicological effects of environmental contaminants. In this mini-review, we perform a systematic analysis of the most recent works that have used marine invertebrate swimming alteration as a behavioral end-point in ecotoxicological studies, assessing the differences between behavioral and acute responses in a wide range of species in order to compare their sensitivity. Copyright © 2016. Published by Elsevier Ltd.

  20. Applications of automatic differentiation in computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Carle, A.; Bischof, C.; Haigler, Kara J.; Newman, Perry A.

    1994-01-01

    Automatic differentiation (AD) is a powerful computational method that provides for computing exact sensitivity derivatives (SD) from existing computer programs for multidisciplinary design optimization (MDO) or sensitivity analysis. A pre-compiler AD tool for FORTRAN programs called ADIFOR has been developed. The ADIFOR tool has been easily and quickly applied by NASA Langley researchers to assess the feasibility and computational impact of AD in MDO with several different FORTRAN programs. These include a state-of-the-art three-dimensional multigrid Navier-Stokes flow solver for wings or aircraft configurations in transonic turbulent flow. With ADIFOR the user specifies sets of independent and dependent variables within an existing computer code. ADIFOR then traces the dependency path throughout the code, applies the chain rule to formulate derivative expressions, and generates new code to compute the required SD matrix. The resulting codes have been verified to compute exact non-geometric and geometric SD for a variety of cases, in less time than is required to compute the SD matrix using centered divided differences.
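
    ADIFOR itself performs source-to-source transformation of Fortran; the following is only a minimal Python illustration of the chain-rule propagation that forward-mode AD carries out, using dual numbers. The example function and all names are illustrative, not part of ADIFOR.

        import math

        class Dual:
            # Dual number (value, derivative); arithmetic propagates exact derivatives.
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def dsin(x):
            # Chain rule for sin: d/dt sin(x(t)) = cos(x) * dx/dt.
            return Dual(math.sin(x.val), math.cos(x.val) * x.der)

        def f(x, y):
            # Example "existing code": f(x, y) = x*y + sin(x).
            return x * y + dsin(x)

        # Derivative of f with respect to x at (x, y) = (1.5, 2.0):
        # seed x with derivative 1 and y with derivative 0.
        out = f(Dual(1.5, 1.0), Dual(2.0, 0.0))
        print(out.val, out.der)   # exact df/dx = y + cos(x) = 2 + cos(1.5)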

  1. Can we use high precision metal isotope analysis to improve our understanding of cancer?

    PubMed

    Larner, Fiona

    2016-01-01

    High precision natural isotope analyses are widely used in geosciences to trace elemental transport pathways. The use of this analytical tool is increasing in nutritional and disease-related research. In recent months, a number of groups have shown the potential this technique has in providing new observations for various cancers when applied to trace metal metabolism. The deconvolution of isotopic signatures, however, relies on mathematical models and geochemical data, which are not representative of the system under investigation. In addition to relevant biochemical studies of protein-metal isotopic interactions, technological development both in terms of sample throughput and detection sensitivity of these elements is now needed to translate this novel approach into a mainstream analytical tool. Following this, essential background healthy population studies must be performed, alongside observational, cross-sectional disease-based studies. Only then can the sensitivity and specificity of isotopic analyses be tested alongside currently employed methods, and important questions such as the influence of cancer heterogeneity and disease stage on isotopic signatures be addressed.

  2. Global Sensitivity Analysis with Small Sample Sizes: Ordinary Least Squares Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Michael J.; Liu, Wei; Sivaramakrishnan, Raghu

    2016-12-21

    A new version of global sensitivity analysis is developed in this paper. This new version, coupled with tools from statistics, machine learning, and optimization, can devise small sample sizes that allow for the accurate ordering of sensitivity coefficients for the first 10-30 most sensitive chemical reactions in complex chemical-kinetic mechanisms, and is particularly useful for studying the chemistry in realistic devices. A key part of the paper is calibration of these small samples. Because these small sample sizes are developed for use in realistic combustion devices, the calibration is done over the ranges of conditions in such devices, with a test case being the operating conditions of a compression ignition engine studied earlier. Compression ignition engines operate under low-temperature combustion conditions with quite complicated chemistry, making this calibration difficult and leading to the possibility of false positives and false negatives in the ordering of the reactions. An important aspect of the paper is therefore showing how to handle the trade-off between false positives and false negatives using ideas from the multiobjective optimization literature. The combination of the new global sensitivity method and the calibration yields sample sizes approximately a factor of 10 smaller than were available with our previous algorithm.
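
    A minimal sketch of the general ordinary least squares idea, not the authors' algorithm: sample the uncertain inputs with a deliberately small sample, fit an OLS surrogate, and order the inputs by the magnitude of their scaled coefficients. The toy model and all parameter names are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def model(x):
            # Toy stand-in for an expensive kinetics simulation: the response depends
            # strongly on inputs 0 and 3 and weakly on the rest.
            return 5.0 * x[:, 0] - 3.0 * x[:, 3] + 0.2 * x[:, 1] + 0.01 * x[:, 2]

        n_samples, n_inputs = 40, 5          # deliberately small sample size
        X = rng.uniform(-1.0, 1.0, size=(n_samples, n_inputs))
        y = model(X) + 0.05 * rng.standard_normal(n_samples)

        # Ordinary least squares fit with an intercept column.
        A = np.column_stack([np.ones(n_samples), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Scale coefficients by the input spread so magnitudes are comparable,
        # then order the inputs from most to least sensitive.
        sens = np.abs(coef[1:]) * X.std(axis=0)
        ranking = np.argsort(sens)[::-1]
        print("sensitivity ordering (most to least):", ranking)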

  3. Evaluation of the AnnAGNPS Model for Predicting Runoff and Nutrient Export in a Typical Small Watershed in the Hilly Region of Taihu Lake.

    PubMed

    Luo, Chuan; Li, Zhaofu; Li, Hengpeng; Chen, Xiaomin

    2015-09-02

    The application of hydrological and water quality models is an efficient approach to better understand the processes of environmental deterioration. This study evaluated the ability of the Annualized Agricultural Non-Point Source (AnnAGNPS) model to predict runoff, total nitrogen (TN) and total phosphorus (TP) loading in a typical small watershed of a hilly region near Taihu Lake, China. Runoff was calibrated and validated at both annual and monthly scales, and parameter sensitivity analysis was performed for TN and TP before the two water quality components were calibrated. The results showed that the model satisfactorily simulated runoff at annual and monthly scales, during both the calibration and validation processes. Additionally, the parameter sensitivity analysis showed that TN output was most sensitive to the parameters Fertilizer rate, Fertilizer organic, Canopy cover and Fertilizer inorganic. For TP, the most influential parameters were Residue mass ratio, Fertilizer rate, Fertilizer inorganic and Canopy cover. Based on these sensitive parameters, calibration was performed. TN loading produced satisfactory results for both the calibration and validation processes, whereas the performance for TP loading was slightly poorer. The simulation results showed that AnnAGNPS has the potential to be used as a valuable tool for the planning and management of watersheds.

  4. 78 FR 13874 - Watershed Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    ... an improved understanding of methodological challenges associated with integrating existing tools and... methodological challenges associated with integrating existing tools (e.g., climate models, downscaling... sensitivity to methodological choices such as different approaches for downscaling global climate change...

  5. Aerodynamic Shape Sensitivity Analysis and Design Optimization of Complex Configurations Using Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Newman, James C., III; Barnwell, Richard W.

    1997-01-01

    A three-dimensional unstructured grid approach to aerodynamic shape sensitivity analysis and design optimization has been developed and is extended to model geometrically complex configurations. The advantage of unstructured grids (when compared with a structured-grid approach) is their inherent ability to discretize irregularly shaped domains with greater efficiency and less effort. Hence, this approach is ideally suited for geometrically complex configurations of practical interest. In this work the nonlinear Euler equations are solved using an upwind, cell-centered, finite-volume scheme. The discrete, linearized systems which result from this scheme are solved iteratively by a preconditioned conjugate-gradient-like algorithm known as GMRES for the two-dimensional geometry and by a Gauss-Seidel algorithm for the three-dimensional geometry; similar procedures are used to solve the accompanying linear aerodynamic sensitivity equations in incremental iterative form. As shown, this particular form of the sensitivity equation makes large-scale gradient-based aerodynamic optimization possible by taking advantage of memory-efficient methods to construct exact Jacobian matrix-vector products. Simple parameterization techniques are utilized for demonstrative purposes. Once the surface has been deformed, the unstructured grid is adapted by considering the mesh as a system of interconnected springs. Grid sensitivities are obtained by differentiating the surface parameterization and the grid adaptation algorithms with ADIFOR (an advanced automatic-differentiation software tool). To demonstrate the ability of this procedure to analyze and design complex configurations of practical interest, the sensitivity analysis and shape optimization have been performed for a two-dimensional high-lift multielement airfoil and for a three-dimensional Boeing 747-200 aircraft.

  6. Sensitivity and specificity of the American College of Rheumatology 1987 criteria for the diagnosis of rheumatoid arthritis according to disease duration: a systematic literature review and meta-analysis.

    PubMed

    Banal, F; Dougados, M; Combescure, C; Gossec, L

    2009-07-01

    To evaluate the ability of the widely used ACR set of criteria (both list and tree format) to diagnose RA compared with expert opinion according to disease duration. A systematic literature review was conducted in PubMed and Embase databases. All articles reporting the prevalence of RA according to ACR criteria and expert opinion in cohorts of early (<1 year duration) or established (>1 year) arthritis were analysed to calculate the sensitivity and specificity of ACR 1987 criteria against the "gold standard" (expert opinion). A meta-analysis using a summary receiver operating characteristic (SROC) curve was performed and pooled sensitivity and specificity were calculated with confidence intervals. Of 138 publications initially identified, 19 were analysable (total 7438 patients, 3883 RA). In early arthritis, pooled sensitivity and specificity of the ACR set of criteria were 77% (68% to 84%) and 77% (68% to 84%) in the list format versus 80% (72% to 88%) and 33% (24% to 43%) in the tree format. In established arthritis, sensitivity and specificity were respectively 79% (71% to 85%) and 90% (84% to 94%) versus 80% (71% to 85%) and 93% (86% to 97%). The SROC meta-analysis confirmed the statistically significant differences, suggesting that diagnostic performances of ACR list criteria are better in established arthritis. The specificity of ACR 1987 criteria in early RA is low, and these criteria should not be used as diagnostic tools. Sensitivity and specificity in established RA are higher, which reflects their use as classification criteria gold standard.

  7. Metacarpophalangeal pattern profile analysis: useful diagnostic tool for differentiating between dyschondrosteosis, Turner syndrome, and hypochondroplasia.

    PubMed

    Laurencikas, E; Sävendahl, L; Jorulf, H

    2006-06-01

    To assess the value of the metacarpophalangeal pattern profile (MCPP) analysis as a diagnostic tool for differentiating between patients with dyschondrosteosis, Turner syndrome, and hypochondroplasia. Radiographic and clinical data from 135 patients between 1 and 51 years of age were collected and analyzed. The study included 25 patients with hypochondroplasia (HCP), 39 with dyschondrosteosis (LWD), and 71 with Turner syndrome (TS). Hand pattern profiles were calculated and compared with those of 110 normal individuals. Pearson correlation coefficient (r) and multivariate discriminant analysis were used for pattern profile analysis. Pattern variability index, a measure of dysmorphogenesis, was calculated for LWD, TS, HCP, and normal controls. Our results demonstrate that patients with LWD, TS, or HCP have distinct pattern profiles that are significantly different from each other and from those of normal controls. Discriminant analysis yielded correct classification of normal versus abnormal individuals in 84% of cases. Classification of the patients into LWD, TS, and HCP groups was successful in 75%. The correct classification rate was higher (85%) when differentiating two pathological groups at a time. Pattern variability index was not helpful for differential diagnosis of LWD, TS, and HCP. Patients with LWD, TS, or HCP have distinct MCPPs and can be successfully differentiated from each other using advanced MCPP analysis. Discriminant analysis is to be preferred over Pearson correlation coefficient because it is a more sensitive and specific technique. MCPP analysis is a helpful tool for differentiating between syndromes with similar clinical and radiological abnormalities.

  8. Development of TUA-WELLNESS screening tool for screening risk of mild cognitive impairment among community-dwelling older adults

    PubMed Central

    Vanoh, Divya; Shahar, Suzana; Rosdinom, Razali; Din, Normah Che; Yahya, Hanis Mastura; Omar, Azahadi

    2016-01-01

    Background and aim Focus on screening for cognitive impairment has to be given particular importance because of the rising older adult population. Thus, this study aimed to develop and assess a brief screening tool consisting of ten items that can be self-administered by community-dwelling older adults (TUA-WELLNESS). Methodology A total of 1,993 noninstitutionalized respondents aged 60 years and above were selected for this study. The dependent variable was mild cognitive impairment (MCI) assessed using neuropsychological test batteries. The items for the screening tool comprised a wide range of factors that were chosen mainly from the analysis of ordinal logistic regression (OLR) and based on past literature. A suitable cut-off point was developed using receiver operating characteristic analysis. Results A total of ten items were included in the screening tool. Of the ten items, eight were found to be significant by ordinal logistic regression, and the remaining two items were included because they showed strong associations with cognitive impairment in previous studies. The area under the curve (AUC), sensitivity, and specificity for the cut-off of 11 were 0.84, 83.3%, and 73.4%, respectively. Conclusion The TUA-WELLNESS screening tool has been used to screen for major risk factors of MCI among Malaysian older adults. This tool is only suitable for basic MCI risk screening purposes and should not be used for diagnostic purposes. PMID:27274208

  9. Modification and Validation of the Triglyceride-to-HDL Cholesterol Ratio as a Surrogate of Insulin Sensitivity in White Juveniles and Adults without Diabetes Mellitus: The Single Point Insulin Sensitivity Estimator (SPISE).

    PubMed

    Paulmichl, Katharina; Hatunic, Mensud; Højlund, Kurt; Jotic, Aleksandra; Krebs, Michael; Mitrakou, Asimina; Porcellati, Francesca; Tura, Andrea; Bergsten, Peter; Forslund, Anders; Manell, Hannes; Widhalm, Kurt; Weghuber, Daniel; Anderwald, Christian-Heinz

    2016-09-01

    The triglyceride-to-HDL cholesterol (TG/HDL-C) ratio was introduced as a tool to estimate insulin resistance, because circulating lipid measurements are available in routine settings. Insulin, C-peptide, and free fatty acids are components of other insulin-sensitivity indices, but their measurement is expensive. Easier and more affordable tools are of interest for both pediatric and adult patients. Study participants from the Relationship Between Insulin Sensitivity and Cardiovascular Disease study [43.9 (8.3) years, n = 1260] as well as the Beta-Cell Function in Juvenile Diabetes and Obesity study cohort [15 (1.9) years, n = 29] underwent oral glucose tolerance tests and euglycemic clamp tests for estimation of whole-body insulin sensitivity and calculation of insulin sensitivity indices. To refine the TG/HDL ratio, mathematical modeling was applied including body mass index (BMI), fasting TG, and HDL cholesterol and compared to the clamp-derived M-value as an estimate of insulin sensitivity. Each modeling result was scored by identifying insulin resistance and by correlation coefficient. The Single Point Insulin Sensitivity Estimator (SPISE) was compared to traditional insulin sensitivity indices using area under the ROC curve (aROC) analysis and the χ² test. The novel formula for SPISE was computed as follows: SPISE = 600 × HDL-C^0.185/(TG^0.2 × BMI^1.338), with fasting HDL-C (mg/dL), fasting TG concentrations (mg/dL), and BMI (kg/m^2). A cutoff value of 6.61 corresponds to an M-value smaller than 4.7 mg · kg^-1 · min^-1 (aROC, M: 0.797). SPISE showed a significantly better aROC than the TG/HDL-C ratio. The SPISE aROC was comparable to that of the Matsuda ISI (insulin sensitivity index) and equal to those of the QUICKI (quantitative insulin sensitivity check index) and HOMA-IR (homeostasis model assessment-insulin resistance) when calculated with M-values. SPISE seems well suited as a surrogate for whole-body insulin sensitivity, computed from an inexpensive fasting single-point blood draw and BMI, in white adolescents and adults. © 2016 American Association for Clinical Chemistry.
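
    A minimal sketch implementing the published formula and cutoff; the function names are illustrative, and the inputs must use the units stated above (mg/dL for lipids, kg/m^2 for BMI).

        def spise(hdl_mg_dl: float, tg_mg_dl: float, bmi_kg_m2: float) -> float:
            # SPISE = 600 * HDL-C^0.185 / (TG^0.2 * BMI^1.338), per the record above.
            return 600.0 * hdl_mg_dl**0.185 / (tg_mg_dl**0.2 * bmi_kg_m2**1.338)

        def likely_insulin_resistant(hdl_mg_dl, tg_mg_dl, bmi_kg_m2, cutoff=6.61):
            # Values below the reported cutoff of 6.61 correspond to a clamp M-value
            # below 4.7 mg/kg/min, i.e. reduced insulin sensitivity.
            return spise(hdl_mg_dl, tg_mg_dl, bmi_kg_m2) < cutoff

        print(round(spise(55.0, 110.0, 24.0), 2), likely_insulin_resistant(55.0, 110.0, 24.0))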

  10. General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.

    2011-01-01

    The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.

  11. Accuracy of Nutritional Screening Tools in Assessing the Risk of Undernutrition in Hospitalized Children.

    PubMed

    Huysentruyt, Koen; Devreker, Thierry; Dejonckheere, Joachim; De Schepper, Jean; Vandenplas, Yvan; Cools, Filip

    2015-08-01

    The aim of the present study was to evaluate the predictive accuracy of screening tools for assessing nutritional risk in hospitalized children in developed countries. The study involved a systematic review of literature (MEDLINE, EMBASE, and Cochrane Central databases up to January 17, 2014) of studies on the diagnostic performance of pediatric nutritional screening tools. Methodological quality was assessed using a modified QUADAS tool. Sensitivity and specificity were calculated for each screening tool per validation method. A meta-analysis was performed to estimate the risk ratio of different screening result categories of being truly at nutritional risk. A total of 11 studies were included on ≥1 of the following screening tools: Pediatric Nutritional Risk Score, Screening Tool for the Assessment of Malnutrition in Paediatrics, Paediatric Yorkhill Malnutrition Score, and Screening Tool for Risk on Nutritional Status and Growth. Because of variation in reference standards, a direct comparison of the predictive accuracy of the screening tools was not possible. A meta-analysis was performed on 1629 children from 7 different studies. The risk ratio of being truly at nutritional risk was 0.349 (95% confidence interval [CI] 0.16-0.78) for children in the low versus moderate screening category and 0.292 (95% CI 0.19-0.44) in the moderate versus high screening category. There is insufficient evidence to choose 1 nutritional screening tool over another based on their predictive accuracy. The estimated risk of being at "true nutritional risk" increases with each category of screening test result. Each screening category should be linked to a specific course of action, although further research is needed.

  12. Simple Nutrition Screening Tool for Pediatric Inpatients.

    PubMed

    White, Melinda; Lawson, Karen; Ramsey, Rebecca; Dennis, Nicole; Hutchinson, Zoe; Soh, Xin Ying; Matsuyama, Misa; Doolan, Annabel; Todd, Alwyn; Elliott, Aoife; Bell, Kristie; Littlewood, Robyn

    2016-03-01

    Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Two affirmative answers to the PNST questions were found to maximize the specificity and sensitivity to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with the sensitivity and specificity being less than 69%. The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk. © 2014 American Society for Parenteral and Enteral Nutrition.

  13. An analysis, sensitivity and prediction of winter fog events using FASP model over Indo-Gangetic plains, India

    NASA Astrophysics Data System (ADS)

    Srivastava, S. K., Sr.; Sharma, D. A.; Sachdeva, K.

    2017-12-01

    Indo-Gangetic plains of India experience severe fog conditions during the peak winter months of December and January every year. In this paper an attempt has been made to analyze the spatial and temporal variability of winter fog over the Indo-Gangetic plains. Further, an attempt has also been made to configure an efficient meso-scale numerical weather prediction model using different parameterization schemes and to develop a forecasting tool for the prediction of fog during the winter months over the Indo-Gangetic plains. The study revealed that an alarming increasing positive trend in fog frequency prevails over many locations of the IGP. Hot spot and cluster analyses were conducted to identify the zones most prone to fog, using GIS and inferential statistical tools respectively. Hot spots on average experience fog on 68.27% of days, followed by moderate and cold spots with 48.03% and 21.79%, respectively. The study proposes a new FASP (Fog Analysis, Sensitivity and Prediction) model for the overall analysis and prediction of fog at a particular location and period over the IGP. In the first phase of this model, long-term climatological fog data for a location are analyzed to determine their characteristics and prevailing trend using various advanced statistical techniques. In the second phase, a sensitivity test is conducted with different combinations of parameterization schemes to determine the most suitable combination for fog simulation over a particular location and period. In the third and final phase, an ARIMA model is first used to predict the number of fog days in the future; a numerical model is then used to predict the meteorological parameters favourable for fog, and finally a hybrid model is used for the fog forecast over the study location. The results of the FASP model are validated against ground-based fog observations using statistical tools. Forecast fog-grams generated using the hybrid model during January 2017 show highly encouraging results for fog occurrence/non-occurrence at forecast lead times of 25 to 72 hours. The model predicted fog occurrence/non-occurrence with more than 85% accuracy over most of the locations across the study area. The minimum visibility departure is within 500 m on 90% of occasions over the central IGP and within 1000 m on more than 80% of occasions over most locations across the Indo-Gangetic plains.
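
    A minimal sketch of the ARIMA step referred to above, fitted to made-up seasonal fog-day counts with statsmodels; the model order and the data are assumptions, not the authors' configuration.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # Made-up December-January fog-day counts for 15 past seasons (illustrative only).
        fog_days = np.array([18, 21, 19, 24, 22, 25, 23, 27, 26, 28, 27, 30, 29, 31, 32], dtype=float)

        # Fit a simple ARIMA(1, 1, 1); the order is an assumption, not from the study.
        fit = ARIMA(fog_days, order=(1, 1, 1)).fit()

        # Forecast the number of fog days for the next two seasons.
        print(fit.forecast(steps=2))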

  14. Biotechnical use of polymerase chain reaction for microbiological analysis of biological samples.

    PubMed

    Lantz, P G; Abu al-Soud, W; Knutsson, R; Hahn-Hägerdal, B; Rådström, P

    2000-01-01

    Since its introduction in the mid-1980s, polymerase chain reaction (PCR) technology has been recognised as a rapid, sensitive and specific molecular diagnostic tool for the analysis of micro-organisms in clinical, environmental and food samples. Although this technique can be extremely effective with pure solutions of nucleic acids, its sensitivity may be reduced dramatically when applied directly to biological samples. This review describes PCR technology as a microbial detection method, PCR inhibitors in biological samples, and various sample preparation techniques that can be used to facilitate PCR detection by either separating the micro-organisms from PCR inhibitors and/or concentrating the micro-organisms to detectable concentrations. Parts of this review are updated and based on a doctoral thesis by Lantz [1] and on a review discussing methods to overcome PCR inhibition in foods [2].

  15. Pulsed quantum cascade laser-based cavity ring-down spectroscopy for ammonia detection in breath.

    PubMed

    Manne, Jagadeeshwari; Sukhorukov, Oleksandr; Jäger, Wolfgang; Tulip, John

    2006-12-20

    Breath analysis can be a valuable, noninvasive tool for the clinical diagnosis of a number of pathological conditions. The detection of ammonia in exhaled breath is of particular interest for it has been linked to kidney malfunction and peptic ulcers. Pulsed cavity ringdown spectroscopy in the mid-IR region has developed into a sensitive analytical technique for trace gas analysis. A gas analyzer based on a pulsed mid-IR quantum cascade laser operating near 970 cm(-1) has been developed for the detection of ammonia levels in breath. We report a sensitivity of approximately 50 parts per billion with a 20 s time resolution for ammonia detection in breath with this system. The challenges and possible solutions for the quantification of ammonia in human breath by the described technique are discussed.

  16. Damage classification and estimation in experimental structures using time series analysis and pattern recognition

    NASA Astrophysics Data System (ADS)

    de Lautour, Oliver R.; Omenzetter, Piotr

    2010-07-01

    Developed for studying long sequences of regularly sampled data, time series analysis methods are being increasingly investigated for use in Structural Health Monitoring (SHM). In this research, autoregressive (AR) models were used to fit the acceleration time histories obtained from two experimental structures, a 3-storey bookshelf structure and the ASCE Phase II Experimental SHM Benchmark Structure, in the undamaged state and a limited number of damaged states. The coefficients of the AR models were considered to be damage-sensitive features and used as input to an Artificial Neural Network (ANN). The ANN was trained to classify damage cases or estimate remaining structural stiffness. The results showed that the combination of AR models and ANNs is an efficient tool for damage classification and estimation, and performs well using a small number of damage-sensitive features and a limited number of sensors.
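
    A minimal sketch of the feature pipeline described above, run on synthetic signals: AR coefficients are fitted to each acceleration record by least squares and fed to a small neural network classifier. The data, model order and network size are illustrative assumptions, not the study's configuration.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)

        def ar_coefficients(x, order=4):
            # Least-squares fit of an AR(order) model; the coefficients are the features.
            X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
            y = x[order:]
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coef

        def synthetic_record(damaged):
            # Toy acceleration record: damage shifts the dominant frequency slightly.
            t = np.arange(512)
            freq = 0.12 if damaged else 0.10
            return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(t.size)

        # Build a small training set of AR-coefficient features and damage labels.
        labels = rng.integers(0, 2, size=60)
        features = np.array([ar_coefficients(synthetic_record(bool(d))) for d in labels])

        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        clf.fit(features, labels)

        test = ar_coefficients(synthetic_record(damaged=True)).reshape(1, -1)
        print("predicted state (1 = damaged):", clf.predict(test)[0])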

  17. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description, from a user's and programmer's perspective, of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset, without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the user to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python; it is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).

  18. GDA, a web-based tool for Genomics and Drugs integrated analysis.

    PubMed

    Caroli, Jimmy; Sorrentino, Giovanni; Forcato, Mattia; Del Sal, Giannino; Bicciato, Silvio

    2018-05-25

    Several major screenings of genetic profiling and drug testing in cancer cell lines proved that the integration of genomic portraits and compound activities is effective in discovering new genetic markers of drug sensitivity and clinically relevant anticancer compounds. Although most genetic and drug response data are publicly available, the availability of user-friendly tools for their integrative analysis remains limited, thus hampering an effective exploitation of this information. Here, we present GDA, a web-based tool for Genomics and Drugs integrated Analysis that combines drug response data for >50 800 compounds with mutations and gene expression profiles across 73 cancer cell lines. Genomic and pharmacological data are integrated through a modular architecture that allows users to identify compounds active towards cancer cell lines bearing a specific genomic background and, conversely, the mutational or transcriptional status of cells responding or non-responding to a specific compound. Results are presented through intuitive graphical representations and supplemented with information obtained from public repositories. As both personalized targeted therapies and drug repurposing are gaining increasing attention, GDA represents a resource to formulate hypotheses on the interplay between genomic traits and drug response in cancer. GDA is freely available at http://gda.unimore.it/.

  19. Nondestructive surface analysis for material research using fiber optic vibrational spectroscopy

    NASA Astrophysics Data System (ADS)

    Afanasyeva, Natalia I.

    2001-11-01

    Advanced methods of fiber optic vibrational spectroscopy (FOVS) have been developed in conjunction with an interferometer and low-loss, flexible, and nontoxic optical fibers, sensors, and probes. The combination of optical fibers and sensors with a Fourier transform (FT) spectrometer has been used in the range from 2.5 to 12 micrometers. This technique serves as an ideal diagnostic tool for surface analysis of numerous diverse materials such as complex structured materials, fluids, coatings, implants, living cells, plants, and tissue. Such surfaces, as well as living tissue or plants, are very difficult to investigate in vivo by traditional FT infrared or Raman spectroscopy methods. The FOVS technique is nondestructive, noninvasive, fast (15 sec) and capable of operating in a remote sampling regime (up to a fiber length of 3 m). Fourier transform infrared (FTIR) and Raman fiber optic spectroscopy operating with optical fibers have been suggested as a powerful new tool. These are highly sensitive techniques for structural studies in material research and various applications during process analysis to determine molecular composition, chemical bonds, and molecular conformations. These techniques could be developed as a new tool for quality control of numerous materials as well as noninvasive biopsy.

  20. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  1. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.

  2. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.

  3. Automatic differentiation evaluated as a tool for rotorcraft design and optimization

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.

    1995-01-01

    This paper investigates the use of automatic differentiation (AD) as a means for generating sensitivity analyses in rotorcraft design and optimization. This technique transforms an existing computer program into a new program that performs sensitivity analysis in addition to the original analysis. The original FORTRAN program calculates a set of dependent (output) variables from a set of independent (input) variables; the new FORTRAN program calculates the partial derivatives of the dependent variables with respect to the independent variables. The AD technique is a systematic implementation of the chain rule of differentiation; this method produces derivatives to machine accuracy at a cost that is comparable with that of finite-differencing methods. For this study, an analysis code that consists of the Langley-developed hover analysis HOVT, the comprehensive rotor analysis CAMRAD/JA, and associated preprocessors is processed through the AD preprocessor ADIFOR 2.0. The resulting derivatives are compared with derivatives obtained from finite-differencing techniques. The derivatives obtained with ADIFOR 2.0 are exact within machine accuracy and do not depend on the selection of step size, unlike the derivatives obtained with finite-differencing techniques.
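
    The step-size issue mentioned above is easy to demonstrate. The following hedged sketch (the test function is an arbitrary stand-in, not the rotorcraft analysis) compares central finite differences at several step sizes with the exact derivative that an AD tool such as ADIFOR would deliver:

        import numpy as np

        def f(x):
            return np.exp(np.sin(x)) * x**2                    # stand-in for an analysis code

        def dfdx_exact(x):                                     # what AD delivers (machine accuracy)
            return np.exp(np.sin(x)) * (x**2 * np.cos(x) + 2.0 * x)

        x0 = 1.3
        for h in (1e-2, 1e-5, 1e-8, 1e-11):                    # finite-difference error depends on h
            fd = (f(x0 + h) - f(x0 - h)) / (2.0 * h)
            print(f"h={h:.0e}  error={abs(fd - dfdx_exact(x0)):.2e}")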

  4. Fluid Film Bearing Code Development

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The next generation of rocket engine turbopumps is being developed by industry through Government-directed contracts. These turbopumps will use fluid film bearings because they eliminate the life and shaft-speed limitations of rolling-element bearings, increase turbopump design flexibility, and reduce the need for turbopump overhauls and maintenance. The design of the fluid film bearings for these turbopumps, however, requires sophisticated analysis tools to model the complex physical behavior characteristic of fluid film bearings operating at high speeds with low viscosity fluids. State-of-the-art analysis and design tools are being developed at the Texas A&M University under a grant guided by the NASA Lewis Research Center. The latest version of the code, HYDROFLEXT, is a thermohydrodynamic bulk flow analysis with fluid compressibility, full inertia, and fully developed turbulence models. It can predict the static and dynamic force response of rigid and flexible pad hydrodynamic bearings and of rigid and tilting pad hydrostatic bearings. The Texas A&M code is a comprehensive analysis tool, incorporating key fluid phenomena pertinent to bearings that operate at high speeds with low-viscosity fluids typical of those used in rocket engine turbopumps. Specifically, the energy equation was implemented in the code to enable fluid properties to vary with temperature and pressure. This is particularly important for cryogenic fluids because their properties are sensitive to temperature as well as pressure. As shown in the figure, predicted bearing mass flow rates vary significantly depending on the fluid model used. Because cryogens are semicompressible fluids and the bearing dynamic characteristics are highly sensitive to fluid compressibility, fluid compressibility effects are also modeled. The code contains fluid properties for liquid hydrogen, liquid oxygen, and liquid nitrogen as well as for water and air. Other fluids can be handled by the code provided that the user inputs information that relates the fluid transport properties to the temperature.

  5. Adverse outcomes in older adults attending emergency departments: a systematic review and meta-analysis of the Identification of Seniors At Risk (ISAR) screening tool.

    PubMed

    Galvin, Rose; Gilleit, Yannick; Wallace, Emma; Cousins, Gráinne; Bolmer, Manon; Rainer, Timothy; Smith, Susan M; Fahey, Tom

    2017-03-01

    Older adults are frequent users of emergency services and demonstrate high rates of adverse outcomes following emergency care. To perform a systematic review and meta-analysis of the Identification of Seniors At Risk (ISAR) screening tool, to determine its predictive value in identifying adults ≥65 years at risk of functional decline, unplanned emergency department (ED) readmission, emergency hospitalisation or death within 180 days after index ED visit/hospitalisation. A systematic literature search was conducted in PubMed, EMBASE, CINAHL, EBSCO and the Cochrane Library to identify validation and impact analysis studies of the ISAR tool. A pre-specified ISAR score of ≥2 (maximum score 6 points) was used to identify patients at high risk of adverse outcomes. A bivariate random effects model generated pooled estimates of sensitivity and specificity. Statistical heterogeneity was explored and methodological quality was assessed using validated criteria. Thirty-two validation studies (n = 12,939) are included. At ≥2, the pooled sensitivity of the ISAR for predicting ED return, emergency hospitalisation and mortality at 6 months is 0.80 (95% confidence interval (CI) 0.70-0.87), 0.82 (95% CI 0.74-0.88) and 0.87 (95% CI 0.75-0.94), respectively, with a pooled specificity of 0.31 (95% CI 0.24-0.38), 0.32 (95% CI 0.24-0.41) and 0.35 (95% CI 0.26-0.44). Similar values are demonstrated at 30 and 90 days. Three heterogeneous impact analysis studies examined the clinical implementation of the ISAR and reported mixed findings across patient and process outcomes. The ISAR has modest predictive accuracy and may serve as a decision-making adjunct when determining which older adults can be safely discharged. © The Author 2016. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  6. Physiologically based pharmacokinetic modeling of a homologous series of barbiturates in the rat: a sensitivity analysis.

    PubMed

    Nestorov, I A; Aarons, L J; Rowland, M

    1997-08-01

    Sensitivity analysis studies the effects of the inherent variability and uncertainty in model parameters on the model outputs and may be a useful tool at all stages of the pharmacokinetic modeling process. The present study examined the sensitivity of a whole-body physiologically based pharmacokinetic (PBPK) model for the distribution kinetics of nine 5-n-alkyl-5-ethyl barbituric acids in arterial blood and 14 tissues (lung, liver, kidney, stomach, pancreas, spleen, gut, muscle, adipose, skin, bone, heart, brain, testes) after i.v. bolus administration to rats. The aims were to obtain new insights into the model used, to rank the model parameters involved according to their impact on the model outputs and to study the changes in the sensitivity induced by the increase in the lipophilicity of the homologues on ascending the series. Two approaches for sensitivity analysis have been implemented. The first, based on the Matrix Perturbation Theory, uses a sensitivity index defined as the normalized sensitivity of the 2-norm of the model compartmental matrix to perturbations in its entries. The second approach uses the traditional definition of the normalized sensitivity function as the relative change in a model state (a tissue concentration) corresponding to a relative change in a model parameter. Autosensitivity has been defined as sensitivity of a state to any of its parameters; cross-sensitivity as the sensitivity of a state to any other states' parameters. Using the two approaches, the sensitivity of representative tissue concentrations (lung, liver, kidney, stomach, gut, adipose, heart, and brain) to the following model parameters has been analyzed: tissue-to-unbound plasma partition coefficients, tissue blood flows, unbound renal and intrinsic hepatic clearance, and the permeability surface area product of the brain. Both the tissues and the parameters were ranked according to their sensitivity and impact. The following general conclusions were drawn: (i) the overall sensitivity of the system to all parameters involved is small due to the weak connectivity of the system structure; (ii) the time course of both the auto- and cross-sensitivity functions for all tissues depends on the dynamics of the tissues themselves, e.g., the higher the perfusion of a tissue, the higher are both its cross-sensitivity to other tissues' parameters and the cross-sensitivities of other tissues to its parameters; and (iii) with a few exceptions, there is not a marked influence of the lipophilicity of the homologues on either the pattern or the values of the sensitivity functions. The estimates of the sensitivity and the subsequent tissue and parameter rankings may be extended to other drugs sharing the same common structure of the whole-body PBPK model and having similar model parameters. The results also show that the computationally simple Matrix Perturbation Analysis should be used only when an initial idea about the sensitivity of a system is required. If comprehensive information regarding the sensitivity is needed, the numerically expensive Direct Sensitivity Analysis should be used.
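
    The traditional normalized sensitivity function used in the second approach can be illustrated with a toy single-tissue model (the function and parameter values below are placeholders, not the whole-body PBPK model of the study): S(t) is the relative change in a tissue concentration per relative change in a parameter, estimated here by a small symmetric perturbation.

        import numpy as np

        def tissue_conc(t, Q=1.0, V=0.5, Kp=2.0, C_art=1.0):
            # Toy uptake curve standing in for one PBPK tissue state.
            k = Q / (V * Kp)
            return Kp * C_art * (1.0 - np.exp(-k * t))

        def normalized_sensitivity(t, param, base, delta=1e-3):
            # S(t) = (dC/C) / (dp/p), via a small relative perturbation of one parameter.
            lo  = tissue_conc(t, **{param: base * (1 - delta)})
            hi  = tissue_conc(t, **{param: base * (1 + delta)})
            mid = tissue_conc(t, **{param: base})
            return (hi - lo) / (2 * delta * base) * (base / mid)

        t = np.linspace(0.1, 10.0, 50)
        S_Kp = normalized_sensitivity(t, "Kp", 2.0)   # autosensitivity of the tissue to its own Kp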

  7. Resilience through adaptation

    PubMed Central

    van Voorn, George A. K.; Ligtenberg, Arend; Molenaar, Jaap

    2017-01-01

    Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover’s distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system. PMID:28196372

  8. Resilience through adaptation.

    PubMed

    Ten Broeke, Guus A; van Voorn, George A K; Ligtenberg, Arend; Molenaar, Jaap

    2017-01-01

    Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover's distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system.
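
    A minimal sketch of the core comparison (assuming two sets of Monte Carlo replicates of one ABM output at a fixed time point; scipy's wasserstein_distance implements the one-dimensional earth-mover's distance used to quantify the difference between the two output distributions):

        import numpy as np
        from scipy.stats import wasserstein_distance

        rng = np.random.default_rng(1)
        # Placeholder replicates of the common-pool resource level at one time point
        resource_without_adaptation = rng.normal(loc=40.0, scale=8.0, size=500)
        resource_with_adaptation    = rng.normal(loc=55.0, scale=6.0, size=500)

        emd = wasserstein_distance(resource_without_adaptation, resource_with_adaptation)
        print(f"earth-mover's distance between output distributions: {emd:.2f}")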

  9. Expanding the occupational health methodology: A concatenated artificial neural network approach to model the burnout process in Chinese nurses.

    PubMed

    Ladstätter, Felix; Garrosa, Eva; Moreno-Jiménez, Bernardo; Ponsoda, Vicente; Reales Aviles, José Manuel; Dai, Junming

    2016-01-01

    Artificial neural networks are sophisticated modelling and prediction tools capable of extracting complex, non-linear relationships between predictor (input) and predicted (output) variables. This study explores this capacity by modelling non-linearities in the hardiness-modulated burnout process with a neural network. Specifically, two multi-layer feed-forward artificial neural networks are concatenated in an attempt to model the composite non-linear burnout process. Sensitivity analysis, a Monte Carlo-based global simulation technique, is then utilised to examine the first-order effects of the predictor variables on the burnout sub-dimensions and consequences. Results show that (1) this concatenated artificial neural network approach is feasible for modelling the burnout process, (2) sensitivity analysis is a prolific method to study the relative importance of predictor variables and (3) the relationships among the variables involved in the development of burnout and its consequences are non-linear to different degrees. Many relationships among variables (e.g., stressors and strains) are not linear, yet researchers use linear methods such as Pearson correlation or linear regression to analyse these relationships. Artificial neural network analysis is an innovative method to analyse non-linear relationships and, in combination with sensitivity analysis, is superior to linear methods.

  10. Detection of sdhB Gene Mutations in SDHI-Resistant Isolates of Botrytis cinerea Using High Resolution Melting (HRM) Analysis.

    PubMed

    Samaras, Anastasios; Madesis, Panagiotis; Karaoglanidis, George S

    2016-01-01

    Botrytis cinerea is a high-risk pathogen for fungicide resistance development. Pathogen resistance to SDHIs is associated with several mutations in the sdhB gene. The diversity of mutations and their differential effect on cross-resistance patterns among SDHIs and the fitness of resistant strains necessitate the availability of a tool for their rapid identification. This study was initiated to develop and validate a high-resolution melting (HRM) analysis for the identification of the P225H/F/L/T, N230I, and H272L/R/Y mutations. Based on the sequence of the sdhB subunit of resistant and sensitive isolates, a universal primer pair was designed. The specificity of the HRM analysis primers was verified to ensure against cross-reaction with other fungal species, and its sensitivity was evaluated using known concentrations of mutant DNA. The melting curve analysis generated nine distinct curve profiles, enabling the discrimination of all four mutations located at codon 225, the N230I mutation, the three mutations located in codon 272, and the non-mutated isolates (isolates of wild-type sensitivity). Similar results were obtained when DNA was extracted directly from artificially inoculated strawberry fruit. The method was validated by monitoring the presence of sdhB mutations in samples of naturally infected strawberry fruits and stone fruit rootstock seedling plants showing damping-off symptoms. HRM analysis data were compared with a standard PIRA-PCR technique and an absolute agreement was observed, suggesting that in both populations the H272R mutation was the predominant one, while H272Y, N230I, and P225H were detected in lower frequencies. The results of the study suggest that HRM analysis can be a useful tool for sensitive, accurate, and rapid identification of several sdhB mutations in B. cinerea and it is expected to contribute to routine fungicide resistance monitoring or assessments of the effectiveness of anti-resistance strategies implemented in crops heavily treated with botryticides.

  11. Detection of sdhB Gene Mutations in SDHI-Resistant Isolates of Botrytis cinerea Using High Resolution Melting (HRM) Analysis

    PubMed Central

    Samaras, Anastasios; Madesis, Panagiotis; Karaoglanidis, George S.

    2016-01-01

    Botrytis cinerea is a high-risk pathogen for fungicide resistance development. Pathogen resistance to SDHIs is associated with several mutations in the sdhB gene. The diversity of mutations and their differential effect on cross-resistance patterns among SDHIs and the fitness of resistant strains necessitate the availability of a tool for their rapid identification. This study was initiated to develop and validate a high-resolution melting (HRM) analysis for the identification of the P225H/F/L/T, N230I, and H272L/R/Y mutations. Based on the sequence of the sdhB subunit of resistant and sensitive isolates, a universal primer pair was designed. The specificity of the HRM analysis primers was verified to ensure against cross-reaction with other fungal species, and its sensitivity was evaluated using known concentrations of mutant DNA. The melting curve analysis generated nine distinct curve profiles, enabling the discrimination of all four mutations located at codon 225, the N230I mutation, the three mutations located in codon 272, and the non-mutated isolates (isolates of wild-type sensitivity). Similar results were obtained when DNA was extracted directly from artificially inoculated strawberry fruit. The method was validated by monitoring the presence of sdhB mutations in samples of naturally infected strawberry fruits and stone fruit rootstock seedling plants showing damping-off symptoms. HRM analysis data were compared with a standard PIRA-PCR technique and an absolute agreement was observed, suggesting that in both populations the H272R mutation was the predominant one, while H272Y, N230I, and P225H were detected in lower frequencies. The results of the study suggest that HRM analysis can be a useful tool for sensitive, accurate, and rapid identification of several sdhB mutations in B. cinerea and it is expected to contribute to routine fungicide resistance monitoring or assessments of the effectiveness of anti-resistance strategies implemented in crops heavily treated with botryticides. PMID:27895633

  12. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    PubMed

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-05-01

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.

  13. PREVALENCE OF METABOLIC SYNDROME IN YOUNG MEXICANS: A SENSITIVITY ANALYSIS ON ITS COMPONENTS.

    PubMed

    Murguía-Romero, Miguel; Jiménez-Flores, J Rafael; Sigrist-Flores, Santiago C; Tapia-Pancardo, Diana C; Jiménez-Ramos, Arnulfo; Méndez-Cruz, A René; Villalobos-Molina, Rafael

    2015-07-28

    Obesity is a worldwide epidemic, and the high prevalence of diabetes type II (DM2) and cardiovascular disease (CVD) is in great part a consequence of that epidemic. Metabolic syndrome (MetS) is a useful tool to estimate the risk of a young population evolving to DM2 and CVD. The aims were to estimate the MetS prevalence in young Mexicans, and to evaluate each parameter as an independent indicator through a sensitivity analysis. The prevalence of MetS was estimated in 6 063 young people of the Mexico City metropolitan area. A sensitivity analysis was conducted to estimate the performance of each one of the components of MetS as an indicator of the presence of MetS itself. Five statistics of the sensitivity analysis were calculated for each MetS component and the other parameters included: sensitivity, specificity, positive predictive value or precision, negative predictive value, and accuracy. The prevalence of MetS in the young Mexican population was estimated to be 13.4%. Waist circumference presented the highest sensitivity (96.8% women; 90.0% men), blood pressure presented the highest specificity for women (97.7%) and glucose for men (91.0%). When all five statistics are considered, triglycerides are the component with the highest values, showing a value of 75% or more in four of them. Differences by sex are detected in the averages of all MetS components in young people without alterations. Young Mexicans are highly prone to acquiring MetS: 71% have at least one and up to five MetS parameters altered, and 13.4% of them have MetS. Of all five MetS components, waist circumference presented the highest sensitivity as a predictor of MetS; triglycerides are the best parameter if a single factor is to be taken as the sole predictor of MetS in the young Mexican population, and triglycerides are also the parameter with the highest accuracy. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
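
    For reference, the five statistics reported for each MetS component follow directly from a 2x2 table of component status against MetS status; the sketch below (with made-up counts, not the study's data) shows the computation:

        def screening_statistics(tp, fp, fn, tn):
            # Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 table.
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv":         tp / (tp + fp),
                "npv":         tn / (tn + fn),
                "accuracy":    (tp + tn) / (tp + fp + fn + tn),
            }

        # Illustrative counts for one component (e.g., waist circumference) vs. MetS status
        print(screening_statistics(tp=780, fp=2100, fn=32, tn=3151))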

  14. Parametric sensitivity analysis of an agro-economic model of management of irrigation water

    NASA Astrophysics Data System (ADS)

    El Ouadi, Ihssan; Ouazar, Driss; El Menyari, Younesse

    2015-04-01

    The current work aims to build an analysis and decision support tool for policy options concerning the optimal allocation of water resources, while allowing a better reflection on the issue of valuation of water by the agricultural sector in particular. Thus, a model disaggregated by farm type was developed for the rural town of Ait Ben Yacoub, located in eastern Morocco. This model integrates economic, agronomic and hydraulic data and simulates agricultural gross margin across this area, taking into consideration changes in public policy and climatic conditions and the competition for collective resources. To identify the model input parameters that influence the results of the model, a parametric sensitivity analysis is performed by the "One-Factor-At-A-Time" approach within the "Screening Designs" method. Preliminary results of this analysis show that, among the 10 parameters analyzed, 6 parameters significantly affect the objective function of the model; in order of influence these are: i) coefficient of crop yield response to water, ii) average daily gain in weight of livestock, iii) exchange of livestock reproduction, iv) maximum yield of crops, v) supply of irrigation water and vi) precipitation. These 6 parameters register sensitivity indexes ranging between 0.22 and 1.28. These results show high uncertainties in these parameters that can dramatically skew the results of the model, and the need to pay particular attention to their estimates. Keywords: water, agriculture, modeling, optimal allocation, parametric sensitivity analysis, Screening Designs, One-Factor-At-A-Time, agricultural policy, climate change.
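
    A rough sketch of the One-Factor-At-A-Time screening step described above (the objective function and baseline values are placeholders for the agro-economic model): each parameter is perturbed in turn from the baseline and a dimensionless sensitivity index is formed from the relative change in the objective.

        def gross_margin(params):
            # Placeholder for the agro-economic model's objective function.
            return (3.0 * params["yield_response"] + 1.5 * params["max_yield"]
                    + 0.8 * params["water_supply"] + 0.2 * params["precipitation"])

        baseline = {"yield_response": 1.0, "max_yield": 5.0,
                    "water_supply": 2.0, "precipitation": 0.6}

        def oat_indices(model, base, step=0.10):
            # Relative change in output per relative change in each input, one factor at a time.
            y0, indices = model(base), {}
            for name, value in base.items():
                perturbed = dict(base, **{name: value * (1 + step)})
                indices[name] = ((model(perturbed) - y0) / y0) / step
            return indices

        print(oat_indices(gross_margin, baseline))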

  15. Determining the best treatment for simple bone cyst: a decision analysis.

    PubMed

    Lee, Seung Yeol; Chung, Chin Youb; Lee, Kyoung Min; Sung, Ki Hyuk; Won, Sung Hun; Choi, In Ho; Cho, Tae-Joon; Yoo, Won Joon; Yeo, Ji Hyun; Park, Moon Seok

    2014-03-01

    The treatment of simple bone cysts (SBC) in children varies significantly among physicians. This study examined which procedure is better for the treatment of SBC, using a decision analysis based on current published evidence. A decision tree focused on five treatment modalities of SBC (observation, steroid injection, autologous bone marrow injection, decompression, and curettage with bone graft) was created. Each treatment modality was further branched, according to the presence and severity of complications. The probabilities of all cases were obtained by literature review. A roll-back tool was utilized to determine the most preferred treatment modality. One-way sensitivity analysis was performed to determine the threshold value of the treatment modalities. Two-way sensitivity analysis was utilized to examine the joint impact of changes in probabilities of two parameters. The decision model favored autologous bone marrow injection. The expected value of autologous bone marrow injection was 0.9445, while those of observation, steroid injection, decompression, and curettage and bone graft were 0.9318, 0.9400, 0.9395, and 0.9342, respectively. One-way sensitivity analysis showed that autologous bone marrow injection was better than decompression in expected value when the rate of pathologic fracture, or positive symptoms of SBC after autologous bone marrow injection, was lower than 20.4%. In our study, autologous bone marrow injection was found to be the best choice for the treatment of SBC. However, the results were sensitive to the rate of pathologic fracture after treatment of SBC. Physicians should consider the possibility of pathologic fracture when they determine a treatment method for SBC.
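
    The roll-back step reduces each treatment arm to a probability-weighted sum of outcome utilities and picks the arm with the highest expected value. A minimal sketch (the probabilities and utilities below are illustrative, not the values from the study):

        # One-stage decision-tree roll-back for SBC treatment (illustrative numbers).
        # Each branch is (probability, utility); probabilities within an option sum to 1.
        treatments = {
            "observation":                 [(0.70, 1.00), (0.30, 0.77)],
            "autologous_marrow_injection": [(0.85, 1.00), (0.15, 0.63)],
            "decompression":               [(0.83, 1.00), (0.17, 0.66)],
        }

        expected = {name: sum(p * u for p, u in branches)
                    for name, branches in treatments.items()}
        best = max(expected, key=expected.get)
        print(expected, "->", best)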

  16. Characterization of non-polar aromatic hydrocarbons in crude oil using atmospheric pressure laser ionization and Fourier transform ion cyclotron resonance mass spectrometry (APLI FT-ICR MS).

    PubMed

    Schrader, Wolfgang; Panda, Saroj K; Brockmann, Klaus J; Benter, Thorsten

    2008-07-01

    We report on the successful application of the recently introduced atmospheric pressure laser ionization (APLI) method as a novel tool for the analysis of crude oil and its components. Using Fourier transform ion cyclotron resonance mass spectrometry, unambiguous determination of key compounds in this complex matrix with unprecedented sensitivity is presented.

  17. Evaluation of digital real-time PCR assay as a molecular diagnostic tool for single-cell analysis.

    PubMed

    Chang, Chia-Hao; Mau-Hsu, Daxen; Chen, Ke-Cheng; Wei, Cheng-Wey; Chiu, Chiung-Ying; Young, Tai-Horng

    2018-02-21

    In a single-cell study, isolating and identifying single cells are essential, but these processes often require a large investment of time or money. The aim of this study was to isolate and analyse single cells using a novel platform, the PanelChip™ Analysis System, which includes a 2500-microwell chip and a digital real-time polymerase chain reaction (dqPCR) assay, in comparison with a standard quantitative real-time PCR (qPCR) assay. Through the serial dilution of a known concentration standard, namely pUC19, the accuracy and sensitivity levels of the two methodologies were compared. The two systems were tested on the basis of expression levels of the genetic markers vimentin, E-cadherin, N-cadherin and GAPDH in A549 lung carcinoma cells at two known concentrations. Furthermore, the influence of a known PCR inhibitor commonly found in blood samples, heparin, was evaluated in both methodologies. Finally, mathematical models were proposed and the single-cell separation method was verified; moreover, gene expression levels during epithelial-mesenchymal transition in single cells under TGFβ1 treatment were measured. The conclusion drawn is that dqPCR performed using PanelChip™ is superior to the standard qPCR in terms of sensitivity, precision, and heparin tolerance. The dqPCR assay is a potential tool for clinical diagnosis and single-cell applications.

  18. Recent Advances in Biosensing With Photonic Crystal Surfaces: A Review

    PubMed Central

    Cunningham, B.T.; Zhang, M.; Zhuo, Y.; Kwon, L.; Race, C.

    2016-01-01

    Photonic crystal surfaces that are designed to function as wavelength-selective optical resonators have become a widely adopted platform for label-free biosensing, and for enhancement of the output of photon-emitting tags used throughout life science research and in vitro diagnostics. While some applications, such as analysis of drug-protein interactions, require extremely high resolution and the ability to accurately correct for measurement artifacts, others require sensitivity that is high enough for detection of disease biomarkers in serum with concentrations less than 1 pg/ml. As the analysis of cells becomes increasingly important for studying the behavior of stem cells, cancer cells, and biofilms under a variety of conditions, approaches that enable high resolution imaging of live cells without cytotoxic stains or photobleachable fluorescent dyes are providing new tools to biologists who seek to observe individual cells over extended time periods. This paper will review several recent advances in photonic crystal biosensor detection instrumentation and device structures that are being applied towards direct detection of small molecules in the context of high throughput drug screening, photonic crystal fluorescence enhancement as utilized for high sensitivity multiplexed cancer biomarker detection, and label-free high resolution imaging of cells and individual nanoparticles as a new tool for life science research and single-molecule diagnostics. PMID:27642265

  19. Development of a music therapy assessment tool for patients in low awareness states.

    PubMed

    Magee, Wendy L

    2007-01-01

    People in low awareness states following profound brain injury typically demonstrate subtle changes in functional behaviors which challenge the sensitivity of measurement tools. Failure to identify and measure changes in functioning can lead to misdiagnosis and withdrawal of treatment with this population. Thus, the development of tools which are sensitive to responsiveness is of central concern. As the auditory modality has been found to be particularly sensitive in identifying responses indicating awareness, a convincing case can be made for music therapy as a treatment medium. However, little has been recommended about protocols for intervention or tools for measuring patient responses within the music therapy setting. This paper presents the rationale for an assessment tool specifically designed to measure responses in the music therapy setting with patients who are diagnosed as minimally conscious or in a vegetative state. Developed over fourteen years as part of interdisciplinary assessment and treatment, the music therapy assessment tool for low awareness states (MATLAS) contains fourteen items which rate behavioral responses across a number of domains. The tool can provide important information for interdisciplinary assessment and treatment particularly in the auditory and communication domains. Recommendations are made for testing its reliability and validity through research.

  20. The choice of prior distribution for a covariance matrix in multivariate meta-analysis: a simulation study.

    PubMed

    Hurtado Rúa, Sandra M; Mazumdar, Madhu; Strawderman, Robert L

    2015-12-30

    Bayesian meta-analysis is an increasingly important component of clinical research, with multivariate meta-analysis a promising tool for studies with multiple endpoints. Model assumptions, including the choice of priors, are crucial aspects of multivariate Bayesian meta-analysis (MBMA) models. In a given model, two different prior distributions can lead to different inferences about a particular parameter. A simulation study was performed in which the impact of families of prior distributions for the covariance matrix of a multivariate normal random effects MBMA model was analyzed. Inferences about effect sizes were not particularly sensitive to prior choice, but the related covariance estimates were. A few families of prior distributions with small relative biases, tight mean squared errors, and close to nominal coverage for the effect size estimates were identified. Our results demonstrate the need for sensitivity analysis and suggest some guidelines for choosing prior distributions in this class of problems. The MBMA models proposed here are illustrated in a small meta-analysis example from the periodontal field and a medium meta-analysis from the study of stroke. Copyright © 2015 John Wiley & Sons, Ltd.

  1. A simple Bird Sensitivity to Oil Index as a management tool in coastal and marine areas subject to oil spills when few biological information is available.

    PubMed

    Romero, A F; Oliveira, M; Abessa, D M S

    2018-03-01

    This study sought to develop a simple index for ranking birds' environmental sensitivity to oil in which birds are used as biological indicators. The study area consisted of both the Santos Estuarine System (SES), and the Laje de Santos Marine State Park (LSMSP), located in Southeastern Brazil. Information on the bird species and their feeding and nesting behaviors were obtained from the literature and were the basis of the sensitivity index created. The SES had a higher number of species, but only about 30% were found to be highly sensitive. The LSMSP presented a much lower number of species, but all of them were considered to be highly sensitive to oil. Due to its simplicity, this index can be employed worldwide as a decision-making tool that may be integrated into other management tools, particularly when robust information on the biology of birds is lacking. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. A Sensitive and Effective Proteomic Approach to Identify She-Donkey’s and Goat’s Milk Adulterations by MALDI-TOF MS Fingerprinting

    PubMed Central

    Di Girolamo, Francesco; Masotti, Andrea; Salvatori, Guglielmo; Scapaticci, Margherita; Muraca, Maurizio; Putignani, Lorenza

    2014-01-01

    She-donkey’s milk (DM) and goat’s milk (GM) are commonly used in newborn and infant feeding because they are less allergenic than other milk types. It is, therefore, mandatory to avoid adulteration and contamination by other milk allergens, developing fast and efficient analytical methods to assess the authenticity of these precious nutrients. In this experimental work, a sensitive and robust matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) profiling was designed to assess the genuineness of DM and GM milks. This workflow allows the identification of DM and GM adulteration at levels of 0.5%, thus, representing a sensitive tool for milk adulteration analysis, if compared with other laborious and time-consuming analytical procedures. PMID:25110863

  3. Exploring the Relationship between Reward and Punishment Sensitivity and Gambling Disorder in a Clinical Sample: A Path Modeling Analysis.

    PubMed

    Jiménez-Murcia, Susana; Fernández-Aranda, Fernando; Mestre-Bach, Gemma; Granero, Roser; Tárrega, Salomé; Torrubia, Rafael; Aymamí, Neus; Gómez-Peña, Mónica; Soriano-Mas, Carles; Steward, Trevor; Moragas, Laura; Baño, Marta; Del Pino-Gutiérrez, Amparo; Menchón, José M

    2017-06-01

    Most individuals will gamble during their lifetime, yet only a select few will develop gambling disorder. Gray's Reinforcement Sensitivity Theory holds promise for providing insight into gambling disorder etiology and symptomatology as it ascertains that neurobiological differences in reward and punishment sensitivity play a crucial role in determining an individual's affect and motives. The aim of the study was to assess a mediational pathway, which included patients' sex, personality traits, reward and punishment sensitivity, and gambling-severity variables. The Sensitivity to Punishment and Sensitivity to Reward Questionnaire, the South Oaks Gambling Screen, the Symptom Checklist-Revised, and the Temperament and Character Inventory-Revised were administered to a sample of gambling disorder outpatients (N = 831), diagnosed according to DSM-5 criteria, attending a specialized outpatient unit. Sociodemographic variables were also recorded. A structural equation model found that both reward and punishment sensitivity were positively and directly associated with increased gambling severity, sociodemographic variables, and certain personality traits while also revealing a complex mediational role for these dimensions. To this end, our findings suggest that the Sensitivity to Punishment and Sensitivity to Reward Questionnaire could be a useful tool for gaining a better understanding of different gambling disorder phenotypes and developing tailored interventions.

  4. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    PubMed

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than life time prevalence. This study was designed to assess the usability of the DSM-IV criteria based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.

  5. A probabilistic sizing tool and Monte Carlo analysis for entry vehicle ablative thermal protection systems

    NASA Astrophysics Data System (ADS)

    Mazzaracchio, Antonio; Marchetti, Mario

    2010-03-01

    Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, which uses the Monte Carlo technique, allows a simulation to run over stochastic series. This performs an uncertainty and sensitivity analysis, which estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches, such as the Root-Sum-Square method. The developed tool was verified by comparing the results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins on sizing heat shields that are currently proposed for vehicles using rigid aeroshells for future aerocapture missions at Neptune, and identifying the major sources of uncertainty in the material response.
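
    A rough sketch of the Monte Carlo monitor described above (the input distributions and the bondline-temperature relation are drastically simplified stand-ins, not the actual ablation and thermal response model): uncertain inputs are sampled, the response is evaluated for a candidate thickness, and the probability of meeting the temperature requirement is estimated.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 20_000
        thickness = 0.045                                    # candidate heat-shield thickness, m

        # Stand-in uncertain inputs (a real tool samples material/trajectory parameters)
        heat_load    = rng.normal(250.0, 25.0, N)            # MJ/m^2
        conductivity = rng.lognormal(np.log(0.12), 0.10, N)  # W/(m K)

        # Grossly simplified surrogate for the bondline temperature, K
        bondline_temp = 300.0 + 0.375 * heat_load * conductivity / thickness

        p_ok = np.mean(bondline_temp < 560.0)                # probability of meeting the requirement
        print(f"P(bondline below limit) = {p_ok:.3f}")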

  6. Comparison between laser terahertz emission microscope and conventional methods for analysis of polycrystalline silicon solar cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakanishi, Hidetoshi, E-mail: nakanisi@screen.co.jp; Ito, Akira, E-mail: a.ito@screen.co.jp; Takayama, Kazuhisa, E-mail: takayama.k0123@gmail.com

    2015-11-15

    A laser terahertz emission microscope (LTEM) can be used for noncontact inspection to detect the waveforms of photoinduced terahertz emissions from material devices. In this study, we experimentally compared the performance of LTEM with conventional analysis methods, e.g., electroluminescence (EL), photoluminescence (PL), and laser beam induced current (LBIC), as an inspection method for solar cells. The results showed that LTEM was more sensitive to the characteristics of the depletion layer of the polycrystalline solar cell compared with EL, PL, and LBIC and that it could be used as a complementary tool to the conventional analysis methods for a solar cell.

  7. CXTFIT/Excel A modular adaptable code for parameter estimation, sensitivity analysis and uncertainty analysis for laboratory or field tracer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Guoping; Mayes, Melanie; Parker, Jack C

    2010-01-01

    We implemented the widely used CXTFIT code in Excel to provide flexibility and added sensitivity and uncertainty analysis functions to improve transport parameter estimation and to facilitate model discrimination for multi-tracer experiments on structured soils. Analytical solutions for one-dimensional equilibrium and nonequilibrium convection dispersion equations were coded as VBA functions so that they could be used as ordinary math functions in Excel for forward predictions. Macros with user-friendly interfaces were developed for optimization, sensitivity analysis, uncertainty analysis, error propagation, response surface calculation, and Monte Carlo analysis. As a result, any parameter with transformations (e.g., dimensionless, log-transformed, species-dependent reactions, etc.) could be estimated with uncertainty and sensitivity quantification for multiple tracer data at multiple locations and times. Prior information and observation errors could be incorporated into the weighted nonlinear least squares method with a penalty function. Users are able to change selected parameter values and view the results via embedded graphics, resulting in a flexible tool applicable to modeling transport processes and to teaching students about parameter estimation. The code was verified by comparing to a number of benchmarks with CXTFIT 2.0. It was applied to improve parameter estimation for four typical tracer experiment data sets in the literature using multi-model evaluation and comparison. Additional examples were included to illustrate the flexibilities and advantages of CXTFIT/Excel. The VBA macros were designed for general purpose and could be used for any parameter estimation/model calibration when the forward solution is implemented in Excel. A step-by-step tutorial, example Excel files and the code are provided as supplemental material.
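
    As a language-neutral illustration of the forward model behind CXTFIT-style fitting (this sketch is in Python rather than the Excel/VBA implementation described above, and the breakthrough data are synthetic placeholders), the Ogata-Banks analytical solution of the one-dimensional equilibrium CDE for a continuous step input is coded and fitted by nonlinear least squares:

        import numpy as np
        from scipy.special import erfc
        from scipy.optimize import curve_fit

        x = 30.0                                             # column length, cm (assumed)

        def cde_step(t, v, D):
            # Ogata-Banks solution of the 1-D equilibrium CDE, continuous step input (C/C0).
            a = (x - v * t) / (2.0 * np.sqrt(D * t))
            b = (x + v * t) / (2.0 * np.sqrt(D * t))
            return 0.5 * (erfc(a) + np.exp(v * x / D) * erfc(b))

        # Synthetic breakthrough curve standing in for measured tracer data
        t_obs = np.linspace(5.0, 60.0, 12)
        c_obs = cde_step(t_obs, v=1.0, D=2.0) + np.random.default_rng(0).normal(0, 0.01, t_obs.size)

        (v_hat, D_hat), cov = curve_fit(cde_step, t_obs, c_obs, p0=[0.8, 1.0])
        print(v_hat, D_hat, np.sqrt(np.diag(cov)))           # estimates and standard errors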

  8. CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.

    PubMed

    Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro

    2017-03-30

    Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over the initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific data set: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combination already originally described or even finding new ones. CombiROC is a novel tool for the scientific community freely available at http://CombiROC.eu.
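
    The underlying idea of scoring a marker combination with sensitivity/specificity filters can be sketched outside the web tool itself; the snippet below (synthetic two-marker data and an unweighted-sum combination, both illustrative) uses scikit-learn's ROC utilities:

        import numpy as np
        from sklearn.metrics import roc_curve, auc

        rng = np.random.default_rng(3)
        y  = np.r_[np.zeros(100), np.ones(100)]              # 0 = control, 1 = diseased (synthetic)
        m1 = np.r_[rng.normal(0.0, 1, 100), rng.normal(1.0, 1, 100)]
        m2 = np.r_[rng.normal(0.0, 1, 100), rng.normal(0.8, 1, 100)]

        combo = m1 + m2                                       # simplest combination: unweighted sum
        fpr, tpr, thresholds = roc_curve(y, combo)
        print("AUC of the combined marker:", auc(fpr, tpr))

        # Sensitivity/specificity filter: keep thresholds with sens >= 0.8 and spec >= 0.7
        keep = (tpr >= 0.8) & ((1 - fpr) >= 0.7)
        print("qualifying thresholds:", thresholds[keep])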

  9. Assessing fears of preschool children with nighttime fears by a parent version of the fear survey schedule for preschool children.

    PubMed

    Kushnir, Jonathan; Gothelf, Doron; Sadeh, Avi

    2015-01-01

    Although excessive fears are common in preschool children, validated assessment tools for this age are lacking. Our aim was to modify and provide preliminary evidence of the utility of a preschoolers' fear screening tool, a parent-reported Fear Survey Schedule for Preschool Children (FSS-PC). 109 Israeli preschool children (aged 4-6 years) with chronic night time fears (NF) and 30 healthy children (controls) participated. The FSS-PC analysis included: 1) internal reliability, 2) correlations between FSS-PC scores and Child Behavior Checklist (CBCL) measures, 3) differences between NF and a comparison sample of FSS-PC scores, and 4) FSS-PC sensitivity in detecting change in NF following an intervention for NF. There were low-to-medium positive correlations between the FSS-PC scores and several internalizing scales of the CBCL measures. FSS-PC scores in the NF group were significantly higher than the control children's score. FSS-PC scores had adequate internal reliability and were also sensitive for detecting significant changes in fear levels following behavioral interventions. Unique cultural and environmental circumstances and specific study group. This new version of the FSS-PC may provide clinicians with a novel and useful screening tool for early assessment of fear- and anxiety-related phenomena of preschool children.

  10. Mid-frequency Band Dynamics of Large Space Structures

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.; Adams, Douglas S.

    2004-01-01

    High and low intensity dynamic environments experienced by a spacecraft during launch and on-orbit operations, respectively, induce structural loads and motions, which are difficult to reliably predict. Structural dynamics in low- and mid-frequency bands are sensitive to component interface uncertainty and non-linearity as evidenced in laboratory testing and flight operations. Analytical tools for prediction of linear system response are not necessarily adequate for reliable prediction of mid-frequency band dynamics and analysis of measured laboratory and flight data. A new MATLAB toolbox, designed to address the key challenges of mid-frequency band dynamics, is introduced in this paper. Finite-element models of major subassemblies are defined following rational frequency-wavelength guidelines. For computational efficiency, these subassemblies are described as linear, component mode models. The complete structural system model is composed of component mode subassemblies and linear or non-linear joint descriptions. Computation and display of structural dynamic responses are accomplished employing well-established, stable numerical methods, modern signal processing procedures and descriptive graphical tools. Parametric sensitivity and Monte-Carlo based system identification tools are used to reconcile models with experimental data and investigate the effects of uncertainties. Models and dynamic responses are exported for employment in applications, such as detailed structural integrity and mechanical-optical-control performance analyses.

  11. Using false colors to protect visual privacy of sensitive content

    NASA Astrophysics Data System (ADS)

    Ćiftçi, Serdar; Korshunov, Pavel; Akyüz, Ahmet O.; Ebrahimi, Touradj

    2015-03-01

    Many tools have been proposed for protecting visual privacy, but those available today lack either all or some of the important properties expected from such tools. Therefore, in this paper, we propose a simple yet effective method for privacy protection based on false color visualization, which maps the color palette of an image into a different color palette, possibly after a compressive point transformation of the original pixel data, distorting the details of the original image. This method does not require any prior detection of faces or other sensitive regions and, hence, unlike typical privacy protection methods, it is less sensitive to inaccurate computer vision algorithms. It is also secure as the look-up tables can be encrypted, reversible as table look-ups can be inverted, flexible as it is independent of format or encoding, adjustable as the final result can be computed by interpolating the false color image with the original using different degrees of interpolation, less distracting as it does not create visually unpleasant artifacts, and selective as it better preserves the semantic structure of the input. Four different color scales and four different compression functions, on which the proposed method relies, are evaluated via objective (three face recognition algorithms) and subjective (50 human subjects in an online study) assessments using faces from the FERET public dataset. The evaluations demonstrate that the DEF and RBS color scales lead to the strongest privacy protection, while the compression functions add little to the strength of privacy protection. Statistical analysis also shows that recognition algorithms and human subjects perceive the proposed protection similarly.

  12. The diagnostic value of polymerase chain reaction for Mycobacterium tuberculosis to distinguish intestinal tuberculosis from crohn's disease: A meta-analysis.

    PubMed

    Jin, Ting; Fei, Baoying; Zhang, Yu; He, Xujun

    2017-01-01

    Intestinal tuberculosis (ITB) and Crohn's disease (CD) are important differential diagnoses that can be difficult to distinguish. Polymerase chain reaction (PCR) for Mycobacterium tuberculosis (MTB) is an efficient and promising tool. This meta-analysis was performed to systematically and objectively assess the potential diagnostic accuracy and clinical value of PCR for MTB in distinguishing ITB from CD. We searched PubMed, Embase, Web of Science, Science Direct, and the Cochrane Library for eligible studies, and nine articles with 12 groups of data were identified. The included studies were subjected to quality assessment using the revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. The summary estimates were as follows: sensitivity 0.47 (95% CI: 0.42-0.51); specificity 0.95 (95% CI: 0.93-0.97); the positive likelihood ratio (PLR) 10.68 (95% CI: 6.98-16.35); the negative likelihood ratio (NLR) 0.49 (95% CI: 0.33-0.71); and diagnostic odds ratio (DOR) 21.92 (95% CI: 13.17-36.48). The area under the curve (AUC) was 0.9311, with a Q* value of 0.8664. Heterogeneity was found in the NLR. The heterogeneity of the studies was evaluated by meta-regression analysis and subgroup analysis. The current evidence suggests that PCR for MTB is a promising and highly specific diagnostic method to distinguish ITB from CD. However, physicians should also keep in mind that negative results cannot exclude ITB for its low sensitivity. Additional prospective studies are needed to further evaluate the diagnostic accuracy of PCR.
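
    For readers less familiar with the summary measures quoted above, the sketch below computes sensitivity, specificity, the likelihood ratios, and the diagnostic odds ratio from a single 2x2 table; the counts are hypothetical, and the pooling across studies performed in the meta-analysis is a separate modeling step not shown here.

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard accuracy measures from one 2x2 table of test vs. reference."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        plr = sens / (1.0 - spec)   # positive likelihood ratio
        nlr = (1.0 - sens) / spec   # negative likelihood ratio
        dor = plr / nlr             # diagnostic odds ratio
        return {"sensitivity": sens, "specificity": spec,
                "PLR": plr, "NLR": nlr, "DOR": dor}

    # Hypothetical counts chosen only to illustrate the calculation.
    print(diagnostic_metrics(tp=47, fp=5, fn=53, tn=95))
    ```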

  13. A parallel and sensitive software tool for methylation analysis on multicore platforms.

    PubMed

    Tárraga, Joaquín; Pérez, Mariano; Orduña, Juan M; Duato, José; Medina, Ignacio; Dopazo, Joaquín

    2015-10-01

    DNA methylation analysis suffers from very long processing times, as the advent of next-generation sequencers has shifted the bottleneck of genomic studies from the sequencers that obtain the DNA samples to the software that performs the analysis of these samples. The existing software for methylation analysis does not seem to scale efficiently with either the size of the dataset or the length of the reads to be analyzed. As sequencers are expected to provide longer and longer reads in the near future, efficient and scalable methylation software should be developed. We present a new software tool, called HPG-Methyl, which efficiently maps bisulphite sequencing reads onto DNA and analyzes DNA methylation. The strategy used by this software consists of leveraging the speed of the Burrows-Wheeler Transform to map a large number of DNA fragments (reads) rapidly, as well as the accuracy of the Smith-Waterman algorithm, which is employed exclusively to deal with the most ambiguous and shortest reads. Experimental results on platforms with Intel multicore processors show that HPG-Methyl significantly outperforms state-of-the-art software such as Bismark, BS-Seeker or BSMAP in both execution time and sensitivity, particularly for long bisulphite reads. The software is available as C libraries and functions, together with instructions to compile and execute it, by sftp to anonymous@clariano.uv.es (password 'anonymous'). juan.orduna@uv.es or jdopazo@cipf.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
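
    The accuracy-critical step mentioned above is a Smith-Waterman local alignment reserved for the most ambiguous reads; a minimal score-only version is sketched below in Python (HPG-Methyl itself is written in C, and the scoring parameters here are illustrative).

    ```python
    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        """Best local alignment score between sequences a and b
        (score matrix only; traceback of the alignment is omitted)."""
        rows, cols = len(a) + 1, len(b) + 1
        H = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                best = max(best, H[i][j])
        return best

    print(smith_waterman_score("TTACGGTTA", "ACGTTTA"))  # toy read vs. reference
    ```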

  14. Genome-wide comparison of paired fresh frozen and formalin-fixed paraffin-embedded gliomas by custom BAC and oligonucleotide array comparative genomic hybridization: facilitating analysis of archival gliomas.

    PubMed

    Mohapatra, Gayatry; Engler, David A; Starbuck, Kristen D; Kim, James C; Bernay, Derek C; Scangas, George A; Rousseau, Audrey; Batchelor, Tracy T; Betensky, Rebecca A; Louis, David N

    2011-04-01

    Array comparative genomic hybridization (aCGH) is a powerful tool for detecting DNA copy number alterations (CNA). Because diffuse malignant gliomas are often sampled by small biopsies, formalin-fixed paraffin-embedded (FFPE) blocks are often the only tissue available for genetic analysis; FFPE tissues are also needed to study the intratumoral heterogeneity that characterizes these neoplasms. In this paper, we present a combination of evaluations and technical advances that provide strong support for the ready use of oligonucleotide aCGH on FFPE diffuse gliomas. We first compared aCGH using bacterial artificial chromosome (BAC) arrays in 45 paired frozen and FFPE gliomas, and demonstrate a high concordance rate between FFPE and frozen DNA in an individual clone-level analysis of sensitivity and specificity, assuring that under certain array conditions, frozen and FFPE DNA can perform nearly identically. However, because oligonucleotide arrays offer advantages over BAC arrays in genomic coverage and practical availability, we next developed a method of labeling DNA from FFPE tissue that allows efficient hybridization to oligonucleotide arrays. To demonstrate utility in FFPE tissues, we applied this approach to biphasic anaplastic oligoastrocytomas and demonstrate CNA differences between DNA obtained from the two components. Therefore, BAC and oligonucleotide aCGH can be sensitive and specific tools for detecting CNAs in FFPE DNA, and novel labeling techniques enable the routine use of oligonucleotide arrays for FFPE DNA. In combination, these advances should facilitate genome-wide analysis of rare, small and/or histologically heterogeneous gliomas from FFPE tissues.

  15. The role of spectrophotometry in the diagnosis of melanoma.

    PubMed

    Ascierto, Paolo A; Palla, Marco; Ayala, Fabrizio; De Michele, Ileana; Caracò, Corrado; Daponte, Antonio; Simeone, Ester; Mori, Stefano; Del Giudice, Maurizio; Satriano, Rocco A; Vozza, Antonio; Palmieri, Giuseppe; Mozzillo, Nicola

    2010-08-13

    Spectrophotometry (SPT) could represent a promising technique for the diagnosis of cutaneous melanoma (CM) at earlier stages of the disease. Starting from our experience, we further assessed the role of SPT in early CM detection. During a health campaign for malignant melanoma at the National Cancer Institute of Naples, we identified a subset of 54 lesions to be addressed to surgical excision and histological examination. Before surgery, all patients were investigated by clinical and epiluminescence microscopy (ELM) screenings; selected lesions underwent spectrophotometer analysis. For SPT, we used a video spectrophotometer imaging system (Spectroshade MHT S.p.A., Verona, Italy). Among the 54 patients harbouring cutaneous pigmented lesions, we compared the results of the SPT screening with the histological diagnoses and evaluated both the sensitivity and specificity of detecting CM using either SPT or conventional approaches. For all pigmented lesions, agreement between histology and SPT classification was 57.4%. The sensitivity and specificity of SPT in detecting melanoma were 66.6% and 76.2%, respectively. Although SPT is still considered a valuable diagnostic tool for CM, its low accuracy, sensitivity, and specificity remain the main obstacle to the introduction of such a methodology into clinical practice. Dermoscopy remains the best diagnostic tool for the preoperative diagnosis of pigmented skin lesions.

  16. The role of spectrophotometry in the diagnosis of melanoma

    PubMed Central

    2010-01-01

    Background Spectrophotometry (SPT) could represent a promising technique for the diagnosis of cutaneous melanoma (CM) at earlier stages of the disease. Starting from our experience, we further assessed the role of SPT in early CM detection. Methods During a health campaign for malignant melanoma at the National Cancer Institute of Naples, we identified a subset of 54 lesions to be addressed to surgical excision and histological examination. Before surgery, all patients were investigated by clinical and epiluminescence microscopy (ELM) screenings; selected lesions underwent spectrophotometer analysis. For SPT, we used a video spectrophotometer imaging system (Spectroshade® MHT S.p.A., Verona, Italy). Results Among the 54 patients harbouring cutaneous pigmented lesions, we compared the results of the SPT screening with the histological diagnoses and evaluated both the sensitivity and specificity of detecting CM using either SPT or conventional approaches. For all pigmented lesions, agreement between histology and SPT classification was 57.4%. The sensitivity and specificity of SPT in detecting melanoma were 66.6% and 76.2%, respectively. Conclusions Although SPT is still considered a valuable diagnostic tool for CM, its low accuracy, sensitivity, and specificity remain the main obstacle to the introduction of such a methodology into clinical practice. Dermoscopy remains the best diagnostic tool for the preoperative diagnosis of pigmented skin lesions. PMID:20707921

  17. Comparison of loop-mediated isothermal amplification assay and smear microscopy with culture for the diagnostic accuracy of tuberculosis.

    PubMed

    Gelaw, Baye; Shiferaw, Yitayal; Alemayehu, Marta; Bashaw, Abate Assefa

    2017-01-17

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is one of the leading causes of death from infectious diseases worldwide. Sputum smear microscopy remains the most widely available pulmonary TB diagnostic tool, particularly in resource-limited settings. A highly sensitive diagnostic test requiring minimal infrastructure, cost, and training is needed. Hence, we assessed the diagnostic performance of the loop-mediated isothermal amplification (LAMP) assay in detecting M. tuberculosis infection in sputum samples compared with LED fluorescent smear microscopy and culture. A cross-sectional study was conducted at the University of Gondar Hospital from June 01, 2015 to August 30, 2015. Pulmonary TB diagnosis was performed using sputum LED fluorescence smear microscopy, the TB-LAMP assay, and culture. A descriptive analysis was used to determine demographic characteristics of the study participants. Analysis of sensitivity and specificity for smear microscopy and TB-LAMP, with culture as the reference test, was performed. Cohen's kappa was calculated as a measure of agreement between the tests. A total of 78 sputum samples from presumptive pulmonary TB patients were analyzed. The overall sensitivity and specificity of LAMP were 75 and 98%, respectively. Among smear-negative sputum samples, LAMP showed 33.3% sensitivity and 100% specificity. Smear microscopy showed 78.6% sensitivity and 98% specificity. LAMP and smear in series had a sensitivity of 67.8% and a specificity of 100%. LAMP and smear in parallel had a sensitivity of 85.7% and a specificity of 96%. The agreement between LAMP and fluorescent smear microscopy tests was very good (κ = 0.83, P-value ≤0.0001). TB-LAMP showed similar specificity but slightly lower sensitivity compared with LED fluorescence microscopy. The specificity of LAMP and smear microscopy in series was high. The sensitivity of LAMP was insufficient for smear-negative sputum samples.
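
    The series and parallel results above were computed directly from the paired test outcomes; under an assumption of conditional independence they are often approximated with the standard formulas sketched below, alongside Cohen's kappa for agreement (the 2x2 agreement counts in the example are hypothetical, so the numbers will not exactly match those reported).

    ```python
    def combine_tests(se1, sp1, se2, sp2):
        """Sensitivity/specificity of two tests combined, assuming conditional
        independence. Series: positive only if both tests are positive.
        Parallel: positive if either test is positive."""
        series = (se1 * se2, sp1 + sp2 - sp1 * sp2)
        parallel = (se1 + se2 - se1 * se2, sp1 * sp2)
        return {"series": series, "parallel": parallel}

    def cohens_kappa(both_pos, a_only, b_only, both_neg):
        """Chance-corrected agreement between two binary tests."""
        n = both_pos + a_only + b_only + both_neg
        observed = (both_pos + both_neg) / n
        expected = ((both_pos + a_only) * (both_pos + b_only) +
                    (b_only + both_neg) * (a_only + both_neg)) / n ** 2
        return (observed - expected) / (1.0 - expected)

    print(combine_tests(0.75, 0.98, 0.786, 0.98))  # LAMP and smear, illustrative
    print(cohens_kappa(25, 3, 2, 48))              # hypothetical agreement table
    ```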

  18. Assessing Instructional Sensitivity Using the Pre-Post Difference Index: A Nontechnical Tool for Extension Educators

    ERIC Educational Resources Information Center

    Adedokun, Omolola A.

    2018-01-01

    This article provides an illustrative description of the pre-post difference index (PPDI), a simple, nontechnical yet robust tool for examining the instructional sensitivity of assessment items. Extension educators often design pretest-posttest instruments to assess the impact of their curricula on participants' knowledge and understanding of the…

  19. Sensitivity analysis in practice: providing an uncertainty budget when applying supplement 1 to the GUM

    NASA Astrophysics Data System (ADS)

    Allard, Alexandre; Fischer, Nicolas

    2018-06-01

    Sensitivity analysis associated with the evaluation of measurement uncertainty is a very important tool for the metrologist, enabling them to provide an uncertainty budget and to gain a better understanding of the measurand and the underlying measurement process. Using the GUM uncertainty framework, the contribution of an input quantity to the variance of the output quantity is obtained through so-called ‘sensitivity coefficients’. In contrast, such coefficients are no longer computed in cases where a Monte-Carlo method is used. In such a case, supplement 1 to the GUM suggests varying the input quantities one at a time, which is not an efficient method and may provide incorrect contributions to the variance in cases where significant interactions arise. This paper proposes different methods for the elaboration of the uncertainty budget associated with a Monte Carlo method. An application to the mass calibration example described in supplement 1 to the GUM is performed with the corresponding R code for implementation. Finally, guidance is given for choosing a method, including suggestions for a future revision of supplement 1 to the GUM.
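
    One simple way to build such a budget from a joint Monte Carlo sample, instead of varying the inputs one at a time, is to apportion the output variance using squared standardized regression coefficients; the sketch below illustrates this for a toy two-input measurement model. This is only one member of the family of approaches discussed in this context, it is not claimed to be the paper's recommended method, and it is only reliable for near-linear models with independent inputs.

    ```python
    import numpy as np

    def uncertainty_budget(samples, output):
        """Approximate each input's share of the output variance from a joint
        Monte Carlo sample, via a linear regression of output on inputs
        (valid for near-linear models with independent inputs)."""
        names = list(samples)
        X = np.column_stack([np.ones(len(output))] + [samples[k] for k in names])
        beta, *_ = np.linalg.lstsq(X, output, rcond=None)
        var_y = np.var(output)
        return {k: beta[i + 1] ** 2 * np.var(samples[k]) / var_y
                for i, k in enumerate(names)}

    rng = np.random.default_rng(0)
    m_ref = rng.normal(100.0, 0.005, 100_000)   # illustrative input quantity
    delta_m = rng.normal(0.0, 0.010, 100_000)   # illustrative correction term
    y = m_ref + delta_m                         # toy measurement model
    print(uncertainty_budget({"m_ref": m_ref, "delta_m": delta_m}, y))
    ```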

  20. Signatures of mountain building: Detrital zircon U/Pb ages from northeast Tibet

    USGS Publications Warehouse

    Lease, Richard O.; Burbank, Douglas W.; Gehrels, George E.; Wang, Zhicai; Yuan, Daoyang

    2007-01-01

    Although detrital zircon has proven to be a powerful tool for determining provenance, past work has focused primarily on delimiting regional source terranes. Here we explore the limits of spatial resolution and stratigraphic sensitivity of detrital zircon in ascertaining provenance, and we demonstrate its ability to detect source changes for terranes separated by only a few tens of kilometers. For such an analysis to succeed for a given mountain, discrete intrarange source terranes must have unique U/Pb zircon age signatures and sediments eroded from the range must have well-defined depositional ages. Here we use ∼1400 single-grain U/Pb zircon ages from northeastern Tibet to identify and analyze an area that satisfies these conditions. This analysis shows that the edges of intermontane basins are stratigraphically sensitive to discrete, punctuated changes in local source terranes. By tracking eroding rock units chronologically through the stratigraphic record, this sensitivity permits the detection of the differential rock uplift and progressive erosion that began ca. 8 Ma in the Laji Shan, a 10-25-km-wide range in northeastern Tibet with a unique U/Pb age signature.

  1. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  2. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  3. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion system mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  4. Validation of a Brazilian version of the moral sensitivity questionnaire.

    PubMed

    Dalla Nora, Carlise R; Zoboli, Elma Lcp; Vieira, Margarida M

    2017-01-01

    Moral sensitivity has been identified as a foundational component of ethical action. Diminished or absent moral sensitivity can result in deficient care. In this context, assessing moral sensitivity is imperative for designing interventions to facilitate ethical practice and ensure that nurses make appropriate decisions. The main purpose of this study was to validate a scale for examining the moral sensitivity of Brazilian nurses. A pre-existing scale, the Moral Sensitivity Questionnaire, which was developed by Lützén, was used after the deletion of three items. The reliability and validity of the scale were examined using Cronbach's alpha and factor analysis, respectively. Participants and research context: Overall, 316 nurses from Rio Grande do Sul, Brazil, participated in the study. Ethical considerations: This study was approved by the Ethics Committee of Research of the Nursing School of the University of São Paulo. The Moral Sensitivity Questionnaire contained 27 items that were distributed across four dimensions: interpersonal orientation, professional knowledge, moral conflict and moral meaning. The questionnaire accounted for 55.8% of the total variance, with Cronbach's alpha of 0.82. The mean score for moral sensitivity was 4.45 (out of 7). The results of this study were compared with studies from other countries to examine the structure and implications of the moral sensitivity of nurses in Brazil. The Moral Sensitivity Questionnaire is an appropriate tool for examining the moral sensitivity of Brazilian nurses.
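
    For reference, the internal-reliability figure quoted above is Cronbach's alpha, computed for k items as

        \alpha = \frac{k}{k-1} \left( 1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2} \right),

    where \sigma_i^2 are the item variances and \sigma_X^2 is the variance of the total score; with the 27 retained items the study reports \alpha = 0.82.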

  5. SEM analysis of ionizing radiation effects in linear integrated circuits. [Scanning Electron Microscope

    NASA Technical Reports Server (NTRS)

    Stanley, A. G.; Gauthier, M. K.

    1977-01-01

    A successful diagnostic technique was developed using a scanning electron microscope (SEM) as a precision tool to determine ionization effects in integrated circuits. Previous SEM methods irradiated the entire semiconductor chip or major areas. These large-area exposure methods do not reveal the exact components that are sensitive to radiation. To locate these sensitive components, a new method was developed, which consisted of successively irradiating selected components on the device chip with equal doses of electrons (10^6 rad(Si)), while the whole device was subjected to representative bias conditions. A suitable device parameter was measured in situ after each successive irradiation with the beam off.

  6. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.

  7. BAYESIAN ANALYSIS TO EVALUATE TESTS FOR THE DETECTION OF MYCOBACTERIUM BOVIS INFECTION IN FREE-RANGING WILD BISON (BISON BISON ATHABASCAE) IN THE ABSENCE OF A GOLD STANDARD.

    PubMed

    Chapinal, Núria; Schumaker, Brant A; Joly, Damien O; Elkin, Brett T; Stephen, Craig

    2015-07-01

    We estimated the sensitivity and specificity of the caudal-fold skin test (CFT), the fluorescent polarization assay (FPA), and the rapid lateral-flow test (RT) for the detection of Mycobacterium bovis in free-ranging wild wood bison (Bison bison athabascae), in the absence of a gold standard, by using Bayesian analysis, and then used those estimates to forecast the performance of a pairwise combination of tests in parallel. In 1998-99, 212 wood bison from Wood Buffalo National Park (Canada) were tested for M. bovis infection using CFT and two serologic tests (FPA and RT). The sensitivity and specificity of each test were estimated using a three-test, one-population, Bayesian model allowing for conditional dependence between FPA and RT. The sensitivity and specificity of the combination of CFT and each serologic test in parallel were calculated assuming conditional independence. The test performance estimates were influenced by the prior values chosen. However, the rank of tests and combinations of tests based on those estimates remained constant. The CFT was the most sensitive test and the FPA was the least sensitive, whereas RT was the most specific test and CFT was the least specific. In conclusion, given the fact that gold standards for the detection of M. bovis are imperfect and difficult to obtain in the field, Bayesian analysis holds promise as a tool to rank tests and combinations of tests based on their performance. Combining a skin test with an animal-side serologic test, such as RT, increases sensitivity in the detection of M. bovis and is a good approach to enhance disease eradication or control in wild bison.

  8. Functional interaction analysis of GM1-related carbohydrates and Vibrio cholerae toxins using carbohydrate microarray.

    PubMed

    Kim, Chang Sup; Seo, Jeong Hyun; Cha, Hyung Joon

    2012-08-07

    The development of analytical tools is important for understanding the infection mechanisms of pathogenic bacteria or viruses. In the present work, a functional carbohydrate microarray combined with a fluorescence immunoassay was developed to analyze the interactions of Vibrio cholerae toxin (ctx) proteins and GM1-related carbohydrates. Ctx proteins were loaded onto the surface-immobilized GM1 pentasaccharide and six related carbohydrates, and their binding affinities were detected immunologically. The analysis of the ctx-carbohydrate interactions revealed that the intrinsic selectivity of ctx was GM1 pentasaccharide ≫ GM2 tetrasaccharide > asialo GM1 tetrasaccharide ≥ GM3 trisaccharide, indicating that a two-finger grip formation and the terminal monosaccharides play important roles in the ctx-GM1 interaction. In addition, whole cholera toxin (ctxAB5) had a stricter substrate specificity and a stronger binding affinity than the cholera toxin B subunit (ctxB) alone. On the basis of the quantitative analysis, the carbohydrate microarray detected the ctxAB5-GM1 interaction with a limit of detection (LOD) of 2 ng/mL (23 pM), which is comparable to other reported high-sensitivity assay tools. In addition, the carbohydrate microarray successfully detected the actual toxin directly secreted from V. cholerae, without showing cross-reactivity to other bacteria. Collectively, these results demonstrate that the functional carbohydrate microarray is suitable for analyzing toxin protein-carbohydrate interactions and can be applied as a biosensor for toxin detection.
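
    As a rough consistency check on the quoted limit of detection: assuming a molecular mass of about 84 kDa for the whole toxin ctxAB5 (an assumed value, not stated in the abstract), 2 ng/mL corresponds to (2 × 10^-6 g/L) / (8.4 × 10^4 g/mol) ≈ 2.4 × 10^-11 mol/L, i.e. roughly 24 pM, in line with the quoted 23 pM.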

  9. Spinal Cord Injury Pain Instrument and painDETECT questionnaire: Convergent construct validity in individuals with Spinal Cord Injury.

    PubMed

    Franz, S; Schuld, C; Wilder-Smith, E P; Heutehaus, L; Lang, S; Gantz, S; Schuh-Hofer, S; Treede, R-D; Bryce, T N; Wang, H; Weidner, N

    2017-11-01

    Neuropathic pain (NeuP) is a frequent sequel of spinal cord injury (SCI). The SCI Pain Instrument (SCIPI) was developed as a SCI-specific NeuP screening tool. A preliminary validation reported encouraging results requiring further evaluation in terms of psychometric properties. The painDETECT questionnaire (PDQ), a commonly applied NeuP assessment tool, was primarily validated in German, but not specifically developed for SCI and not yet validated according to current diagnostic guidelines. We aimed to provide convergent construct validity and to identify the optimal item combination for the SCIPI. The PDQ was re-evaluated according to current guidelines with respect to SCI-related NeuP. Prospective monocentric study. Subjects received a neurological examination according to the International Standards for Neurological Classification of SCI. After linguistic validation of the SCIPI, the IASP grading system served as reference to diagnose NeuP, accompanied by the PDQ after its re-evaluation as a binary classifier. Statistics were evaluated through ROC analysis, with the area under the ROC curve (AUROC) as the optimality criterion. The SCIPI was refined by systematic item permutation. Eighty-eight individuals were assessed with the German SCIPI. Of 127 possible combinations, a 4-item SCIPI (cut-off score = 1.5, sensitivity = 0.864, specificity = 0.839) was identified as the most reasonable. The SCIPI showed a strong correlation (r_sp = 0.76) with the PDQ. ROC analysis of SCIPI/PDQ (AUROC = 0.877) revealed results comparable to SCIPI/IASP (AUROC = 0.916). ROC analysis of PDQ/IASP delivered a score threshold of 10.5 (sensitivity = 0.727, specificity = 0.903). The SCIPI is a valid, easy-to-apply NeuP screening tool in SCI. The PDQ is recommended as a complementary NeuP assessment tool in SCI, e.g. to monitor pain severity and/or its time-dependent course. In SCI-related pain, both the SCIPI and painDETECT show strong convergent construct validity versus the current IASP grading system. The SCIPI is now optimized from a 7-item to an easy-to-apply 4-item screening tool in German and English. We provided evidence that the scope of painDETECT can be expanded to individuals with SCI. © 2017 European Pain Federation - EFIC®.

  10. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.

  11. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  12. Efficient computation paths for the systematic analysis of sensitivities

    NASA Astrophysics Data System (ADS)

    Greppi, Paolo; Arato, Elisabetta

    2013-01-01

    A systematic sensitivity analysis requires computing the model on all points of a multi-dimensional grid covering the domain of interest, defined by the ranges of variability of the inputs. The issues to efficiently perform such analyses on algebraic models are handling solution failures within and close to the feasible region and minimizing the total iteration count. Scanning the domain in the obvious order is sub-optimal in terms of total iterations and is likely to cause many solution failures. The problem of choosing a better order can be translated geometrically into finding Hamiltonian paths on certain grid graphs. This work proposes two paths, one based on a mixed-radix Gray code and the other, a quasi-spiral path, produced by a novel heuristic algorithm. Some simple, easy-to-visualize examples are presented, followed by performance results for the quasi-spiral algorithm and the practical application of the different paths in a process simulation tool.
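
    One simple way to realize a mixed-radix Gray-code traversal of such a grid, so that consecutive model evaluations change only one input and only by one level, is the reflected ("boustrophedon") ordering sketched below; this is an illustrative construction, not necessarily the paper's exact path, and the quasi-spiral ordering is not shown.

    ```python
    def mixed_radix_gray(radices):
        """Yield grid indices so that consecutive points differ in exactly one
        coordinate, and only by one step: a reflected mixed-radix Gray code."""
        if not radices:
            yield ()
            return
        tails = list(mixed_radix_gray(radices[1:]))
        for i in range(radices[0]):
            ordered = tails if i % 2 == 0 else reversed(tails)
            for tail in ordered:
                yield (i,) + tail

    # A 3 x 2 x 4 sensitivity grid: 24 points, each one step from its predecessor.
    for point in mixed_radix_gray([3, 2, 4]):
        print(point)
    ```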

  13. Visible and Extended Near-Infrared Multispectral Imaging for Skin Cancer Diagnosis

    PubMed Central

    Rey-Barroso, Laura; Burgos-Fernández, Francisco J.; Delpueyo, Xana; Ares, Miguel; Malvehy, Josep; Puig, Susana

    2018-01-01

    With the goal of diagnosing skin cancer in an early and noninvasive way, an extended near infrared multispectral imaging system based on an InGaAs sensor with sensitivity from 995 nm to 1613 nm was built to evaluate deeper skin layers thanks to the higher penetration of photons at these wavelengths. The outcomes of this device were combined with those of a previously developed multispectral system that works in the visible and near infrared range (414 nm–995 nm). Both provide spectral and spatial information from skin lesions. A classification method to discriminate between melanomas and nevi was developed based on the analysis of first-order statistics descriptors, principal component analysis, and support vector machine tools. The system provided a sensitivity of 78.6% and a specificity of 84.6%, the latter one being improved with respect to that offered by silicon sensors. PMID:29734747

  14. Infrared imaging: a potential powerful tool for neuroimaging and neurodiagnostics

    PubMed Central

    Khoshakhlagh, Arezou; Gunapala, Sarath D.

    2017-01-01

    Infrared (IR) imaging is used to detect the subtle changes in temperature needed to accurately detect and monitor disease. Technological advances have made IR a highly sensitive and reliable detection tool with strong potential in medical and neurophotonics applications. An overview of IR imaging is provided, focusing on quantum well IR detectors developed at the Jet Propulsion Laboratory as a noninvasive, nonradiating imaging tool that could be applied in neuroscience and neurosurgery, where sensitive detection of cellular temperature changes is involved. PMID:28382311

  15. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation of and procurement processes for simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of this SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations predicted the preferred simulator with good (87%) sensitivity, whereas variations restricted to cost and customer service or to cost and technical stability showed decreased sensitivity (≤54%). The majority (73%) of participants agreed that the SVI was helpful in guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose that the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don; Bowman, Stephen M

    2009-01-01

    This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k_eff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k_eff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
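
    To first order, the k_eff sensitivity data referred to above are relative sensitivity coefficients of the form

        S_{k,\sigma} = \frac{\delta k / k}{\delta \sigma / \sigma},

    i.e. the fractional change in k_eff per fractional change in a nuclide-reaction cross section in a given energy group; the TSUNAMI sequences evaluate these with adjoint-based first-order perturbation theory, while direct perturbations serve as the confirmation step mentioned in the primer.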

  17. Clinical and pathological tools for identifying microsatellite instability in colorectal cancer

    PubMed Central

    Krivokapić, Zoran; Marković, Srdjan; Antić, Jadranka; Dimitrijević, Ivan; Bojić, Daniela; Svorcan, Petar; Jojić, Njegica; Damjanović, Svetozar

    2012-01-01

    Aim To assess the practical accuracy of the revised Bethesda criteria (BGrev), the pathological predictive model (MsPath), and histopathological parameters for detection of the high-frequency microsatellite instability (MSI-H) phenotype in patients with colorectal carcinoma (CRC). Method Tumors from 150 patients with CRC were analyzed for MSI using a fluorescence-based pentaplex polymerase chain reaction technique. For all patients, we evaluated age, sex, family history of cancer, localization, tumor differentiation, mucin production, lymphocytic infiltration (TIL), and Union for International Cancer Control stage. Patients were classified according to the BGrev, and the groups were compared. The utility of the BGrev, MsPath, and clinical and histopathological parameters for predicting microsatellite tumor status was assessed by univariate logistic regression analysis and by calculating the sensitivity, specificity, and positive (PPV) and negative (NPV) predictive values. Results Fifteen out of 45 patients who met and 4 of 105 patients who did not meet the BGrev criteria had MSI-H CRC. Sensitivity, specificity, PPV, and NPV for BGrev were 78.9%, 77%, 30%, and 70%, respectively. MSI histology (the third BGrev criterion without age limit) was as sensitive as BGrev, but more specific. The MsPath model was more sensitive than BGrev (86%), with similar specificity. Fulfillment of any BGrev criterion, mucinous differentiation, and right-sided CRC were singled out as independent factors to identify MSI-H colorectal cancer. Conclusion The BGrev, the MsPath model, and MSI histology are useful tools for selecting patients for MSI testing. PMID:22911525

  18. Space station integrated wall design and penetration damage control

    NASA Technical Reports Server (NTRS)

    Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.

    1987-01-01

    The analysis code BUMPER executes a numerical solution to the problem of calculating the probability of no penetration (PNP) of a spacecraft subject to man-made orbital debris or meteoroid impact. The code, written in FORTRAN 77 with no VAX extensions, was developed on a DEC VAX 11/780 computer running the Virtual Memory System (VMS) operating system. To help illustrate the steps involved, a single sample analysis is performed. The example used is the space station reference configuration. The finite element model (FEM) of this configuration is relatively complex but demonstrates many BUMPER features. The computer tools and guidelines are described for constructing a FEM for the space station under consideration. The methods used to analyze the sensitivity of PNP to variations in design are described. Ways are suggested for developing contour plots of the sensitivity study data. Additional BUMPER analysis examples are provided, including FEMs, command inputs, and data outputs. The mathematical theory used as the basis for the code is described, illustrating the data flow within the analysis.

  19. Increasing productivity for the analysis of trace contaminants in food by gas chromatography-mass spectrometry using automated liner exchange, backflushing and heart-cutting.

    PubMed

    David, Frank; Tienpont, Bart; Devos, Christophe; Lerch, Oliver; Sandra, Pat

    2013-10-25

    Laboratories focusing on residue analysis in food are continuously seeking to increase sample throughput by minimizing sample preparation. Generic sample extraction methods such as QuEChERS lack selectivity and consequently extracts are not free from non-volatile material that contaminates the analytical system. Co-extracted matrix constituents interfere with target analytes, even if highly sensitive and selective GC-MS/MS is used. A number of GC approaches are described that can be used to increase laboratory productivity. These techniques include automated inlet liner exchange and column backflushing for preservation of the performance of the analytical system and heart-cutting two-dimensional GC for increasing sensitivity and selectivity. The application of these tools is illustrated by the analysis of pesticides in vegetables and fruits, PCBs in milk powder and coplanar PCBs in fish. It is demonstrated that considerable increase in productivity can be achieved by decreasing instrument down-time, while analytical performance is equal or better compared to conventional trace contaminant analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favorite, Jeffrey A.

    SENSMG is a tool for computing first-order sensitivities of neutron reaction rates, reaction-rate ratios, leakage, k_eff, and α using the PARTISN multigroup discrete-ordinates code. SENSMG computes sensitivities to all of the transport cross sections and data (total, fission, nu, chi, and all scattering moments), two edit cross sections (absorption and capture), and the density for every isotope and energy group. It also computes sensitivities to the mass density for every material and derivatives with respect to all interface locations. The tool can be used for one-dimensional spherical (r) and two-dimensional cylindrical (r-z) geometries. The tool can be used for fixed-source and eigenvalue problems. The tool implements Generalized Perturbation Theory (GPT) as discussed by Williams and Stacey. Section II of this report describes the theory behind adjoint-based sensitivities, gives the equations that SENSMG solves, and defines the sensitivities that are output. Section III describes the user interface, including the input file and command line options. Section IV describes the output. Section V gives some notes about the coding that may be of interest. Section VI discusses verification, which is ongoing. Section VII lists needs and ideas for future work. Appendix A lists all of the input files whose results are presented in Sec. VI.

  1. Optical Coherence Tomography Angiography versus Dye Angiography in Age-Related Macular Degeneration: Sensitivity and Specificity Analysis.

    PubMed

    Nikolopoulou, Eleni; Lorusso, Massimo; Micelli Ferrari, Luisa; Cicinelli, Maria Vittoria; Bandello, Francesco; Querques, Giuseppe; Micelli Ferrari, Tommaso

    2018-01-01

    Optical coherence tomography angiography (OCTA) could be a valid tool to detect choroidal neovascularization (CNV) in neovascular age-related macular degeneration (nAMD), allowing analysis of the type, morphology, and extent of CNV in most cases. To determine the sensitivity and specificity of OCTA in detecting CNV secondary to nAMD, compared to fluorescein angiography (FA) and indocyanine green angiography (ICGA). Prospective observational study. Patients with suspected nAMD were recruited between May and December 2016. Patients underwent FA, ICGA, spectral domain OCT, and OCTA (AngioVue, Optovue, Inc.). Sensitivity and specificity of FA, with or without ICGA, were assessed and compared with OCTA. Seventy eyes of 70 consecutive patients were included: 32 eyes (45.7%) with type I CNV, 8 eyes (11.4%) with type II CNV, 4 eyes (5.7%) with type III CNV, 6 eyes (8.6%) with mixed type I and type II CNV, and 20 eyes (28.6%) with no CNV. Sensitivity of OCTA was 88% and specificity was 90%. Concordance between FA/ICGA and OCTA was very good (0.91; range 0.81-1.00). OCTA showed high sensitivity and specificity for detection of CNV. Concordance between OCTA and gold-standard dye-based techniques was excellent. OCTA may represent a first-line noninvasive method for the diagnosis of nAMD.

  2. Temperature sensitivity analysis of polarity controlled electrostatically doped tunnel field-effect transistor

    NASA Astrophysics Data System (ADS)

    Nigam, Kaushal; Pandey, Sunil; Kondekar, P. N.; Sharma, Dheeraj

    2016-09-01

    Conventional tunnel field-effect transistors (TFETs) have shown potential to scale down into the sub-22 nm regime due to their lower sub-threshold slope and robustness against short-channel effects (SCEs); however, sensitivity to temperature variation is a major concern. Therefore, for the first time, we present a temperature sensitivity analysis of a polarity-controlled electrostatically doped tunnel field-effect transistor (ED-TFET). Different performance metrics and analog/RF figures of merit were considered and compared for both devices, and simulations were performed using the Silvaco ATLAS device tool. We found that the variation in ON-state current in the ED-TFET is almost temperature independent due to the electrostatic doping mechanism, whereas it increases in the conventional TFET at higher temperature. Above room temperature, the variations in ION, IOFF, and SS sensitivity in the ED-TFET are only 0.11%/K, 2.21%/K, and 0.63%/K, while in the conventional TFET the variations are 0.43%/K, 2.99%/K, and 0.71%/K, respectively. Below room temperature, the variation in ED-TFET ION is 0.195%/K compared to 0.27%/K for the conventional TFET. Moreover, the analysis shows that the incomplete ionization effect in the conventional TFET severely affects the drive current and the threshold voltage, whereas the ED-TFET remains unaffected. Hence, the proposed ED-TFET is less sensitive to temperature variation and can be used for cryogenic as well as high-temperature applications.
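
    Read as relative variations per kelvin, the percentage figures quoted above correspond to a temperature sensitivity of the form

        S_X = \frac{1}{X} \frac{\Delta X}{\Delta T} \times 100 \ [\%/\mathrm{K}]

    for a quantity X such as the ON-current, OFF-current, or sub-threshold slope; this reading is an assumption based on the units given rather than a definition stated in the abstract.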

  3. Evaluation of the AnnAGNPS Model for Predicting Runoff and Nutrient Export in a Typical Small Watershed in the Hilly Region of Taihu Lake

    PubMed Central

    Luo, Chuan; Li, Zhaofu; Li, Hengpeng; Chen, Xiaomin

    2015-01-01

    The application of hydrological and water quality models is an efficient approach to better understand the processes of environmental deterioration. This study evaluated the ability of the Annualized Agricultural Non-Point Source (AnnAGNPS) model to predict runoff, total nitrogen (TN), and total phosphorus (TP) loading in a typical small watershed of a hilly region near Taihu Lake, China. Runoff was calibrated and validated at both annual and monthly scales, and parameter sensitivity analysis was performed for TN and TP before the two water quality components were calibrated. The results showed that the model satisfactorily simulated runoff at annual and monthly scales, during both the calibration and validation processes. Additionally, the parameter sensitivity analysis showed that TN output was most sensitive to the parameters Fertilizer rate, Fertilizer organic, Canopy cover, and Fertilizer inorganic. In terms of TP, the most sensitive parameters were Residue mass ratio, Fertilizer rate, Fertilizer inorganic, and Canopy cover. Based on these sensitive parameters, calibration was performed. TN loading produced satisfactory results for both the calibration and validation processes, whereas the performance for TP loading was slightly poorer. The simulation results showed that AnnAGNPS has the potential to be used as a valuable tool for the planning and management of watersheds. PMID:26364642

  4. What to do with thyroid nodules showing benign cytology and BRAF(V600E) mutation? A study based on clinical and radiologic features using a highly sensitive analytic method.

    PubMed

    Kim, Soo-Yeon; Kim, Eun-Kyung; Kwak, Jin Young; Moon, Hee Jung; Yoon, Jung Hyun

    2015-02-01

    BRAF(V600E) mutation analysis has been used as a complementary diagnostic tool to ultrasonography-guided fine-needle aspiration (US-FNA) in the diagnosis of thyroid nodules, with a reported specificity of up to 100%. When highly sensitive analytic methods are used, however, false-positive results of BRAF(V600E) mutation analysis have been reported. In this study, we investigated the clinical features, US features, and outcomes of patients with thyroid nodules with benign cytology but positive BRAF(V600E) mutation using highly sensitive analytic methods from US-FNA. This study included 22 nodules in 22 patients (3 men, 19 women; mean age, 53 years) with benign cytology but positive BRAF(V600E) mutation from US-FNA. US features were categorized according to the internal components, echogenicity, margin, calcifications, and shape. Suspicious US features included marked hypoechogenicity, noncircumscribed margins, micro or mixed calcifications, and nonparallel shape. Nodules were considered to have either concordant or discordant US features relative to benign cytology. Medical records and imaging studies were reviewed for final cytopathology results and outcomes during follow-up. Among the 22 nodules, 17 nodules were reviewed. Fifteen of 17 nodules were malignant, and 2 were benign. The benign nodules were confirmed as adenomatous hyperplasia with underlying lymphocytic thyroiditis and a fibrotic nodule with dense calcification. Thirteen of the 15 malignant nodules had 2 or more suspicious US features, and all 15 nodules were considered to have discordant cytology considering suspicious US features. Five nodules had been followed with US or US-FNA without resection and did not show changes in size or US features on follow-up US examinations. BRAF(V600E) mutation analysis is a highly sensitive diagnostic tool in the diagnosis of papillary thyroid carcinomas. In the management of thyroid nodules with benign cytology but positive BRAF(V600E) mutation, thyroidectomy should be considered for nodules that have 2 or more suspicious US features and are considered discordant on image-cytology correlation. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Spin, twist and hadron structure in deep inelastic processes

    NASA Astrophysics Data System (ADS)

    Jaffe, R. L.; Meyer, H.; Piller, G.

    These notes provide an introduction to polarization effects in deep inelastic processes in QCD. We emphasize recent work on transverse asymmetries, subdominant effects, and the role of polarization in fragmentation and in purely hadronic processes. After a review of kinematics and some basic tools of short distance analysis, we study the twist, helicity, chirality and transversity dependence of a variety of high energy processes sensitive to the quark and gluon substructure of hadrons.

  6. Diagnostic value of 18F-FDG-PET/CT for the evaluation of solitary pulmonary nodules: a systematic review and meta-analysis.

    PubMed

    Ruilong, Zong; Daohai, Xie; Li, Geng; Xiaohong, Wang; Chunjie, Wang; Lei, Tian

    2017-01-01

    To carry out a meta-analysis on the performance of fluorine-18-fluorodeoxyglucose (18F-FDG) PET/computed tomography (PET/CT) for the evaluation of solitary pulmonary nodules. In the meta-analysis, we performed searches of several electronic databases for relevant studies, including Google Scholar, PubMed, Cochrane Library, and several Chinese databases. The quality of all included studies was assessed by the Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. Two observers independently extracted data from eligible articles. For the meta-analysis, the total sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratios were pooled. A summary receiver operating characteristic curve was constructed. The I² test was performed to assess the impact of study heterogeneity on the results of the meta-analysis. Meta-regression and subgroup analysis were carried out to investigate the potential covariates that might have considerable impacts on heterogeneity. Overall, 12 studies were included in this meta-analysis, including a total of 1297 patients and 1301 pulmonary nodules. The pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio with corresponding 95% confidence intervals (CIs) were 0.82 (95% CI, 0.76-0.87), 0.81 (95% CI, 0.66-0.90), 4.3 (95% CI, 2.3-7.9), and 0.22 (95% CI, 0.16-0.30), respectively. Significant heterogeneity was observed in sensitivity (I² = 81.1%) and specificity (I² = 89.6%). Subgroup analysis showed that the best results for sensitivity (0.90; 95% CI, 0.68-0.86) and accuracy (0.93; 95% CI, 0.90-0.95) were present in a prospective study. The results of our analysis suggest that PET/CT is a useful tool for detecting malignant pulmonary nodules qualitatively. Although current evidence showed moderate accuracy for PET/CT in differentiating malignant from benign solitary pulmonary nodules, further work needs to be carried out to improve its reliability.

  7. Implementation Strategies for Gender-Sensitive Public Health Practice: A European Workshop.

    PubMed

    Oertelt-Prigione, Sabine; Dalibert, Lucie; Verdonk, Petra; Stutz, Elisabeth Zemp; Klinge, Ineke

    2017-11-01

    Providing a robust scientific background for the focus on gender-sensitive public health and a systematic approach to its implementation. Within the FP7-EUGenMed project ( http://eugenmed.eu ) a workshop on sex and gender in public health was convened on February 2-3, 2015. The experts participated in moderated discussion rounds to (1) assemble available knowledge and (2) identify structural influences on practice implementation. The findings were summarized and analyzed in iterative rounds to define overarching strategies and principles. The participants discussed the rationale for implementing gender-sensitive public health and identified priorities and key stakeholders to engage in the process. Communication strategies and specific promotion strategies with distinct stakeholders were defined. A comprehensive list of gender-sensitive practices was established using the recently published taxonomy of the Expert Recommendations for Implementing Change (ERIC) project as a blueprint. A clearly defined implementation strategy should be mandated for all new projects in the field of gender-sensitive public health. Our tool can support researchers and practitioners with the analysis of current and past research as well as with the planning of new projects.

  8. Sensitivity of charge transport measurements to local inhomogeneities

    NASA Astrophysics Data System (ADS)

    Koon, Daniel; Wang, Fei; Hjorth Petersen, Dirch; Hansen, Ole

    2012-02-01

    We derive analytic expressions for the sensitivity of resistive and Hall measurements to local variations in a specimen's material properties in the combined linear limit of both small magnetic fields and small perturbations, presenting exact, algebraic expressions both for four-point probe measurements on an infinite plane and for symmetric, circular van der Pauw discs. We then generalize the results to obtain corrections to the sensitivities both for finite magnetic fields and for finite perturbations. Calculated functions match published results and computer simulations, and provide an intuitive, visual explanation for experimental misassignment of carrier type in n-type ZnO and agree with published experimental results for holes in a uniform material. These results simplify calculation and plotting of the sensitivities on an NxN grid from a problem of order N^5 to one of order N^3 in the arbitrary case and of order N^2 in the handful of cases that can be solved exactly, putting a powerful tool for inhomogeneity analysis in the hands of the researcher: calculation of the sensitivities requires little more than the solution of Laplace's equation on the specimen geometry.
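
    As the abstract notes, calculating these sensitivity functions requires little more than solving Laplace's equation on the specimen geometry. The Python sketch below solves a toy version of that step, the potential on a square specimen with two fixed-potential contacts and insulating edges, by Jacobi iteration; the grid size, contact positions and probe locations are illustrative assumptions, not the authors' configuration.

        # Potential on an N x N specimen: Laplace's equation with two
        # Dirichlet contacts and insulating (Neumann) edges, solved by
        # Jacobi iteration. All geometry choices are illustrative.
        import numpy as np

        N = 41
        phi = np.zeros((N, N))
        # fixed-potential contacts (hypothetical positions), +1 V and -1 V
        contacts = {(0, 10): 1.0, (N - 1, 30): -1.0}

        for _ in range(40000):
            new = phi.copy()
            # Jacobi update of the interior
            new[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                                      phi[1:-1, :-2] + phi[1:-1, 2:])
            # insulating edges: mirror the adjacent interior row/column
            new[0, :], new[-1, :] = new[1, :], new[-2, :]
            new[:, 0], new[:, -1] = new[:, 1], new[:, -2]
            for (i, j), v in contacts.items():
                new[i, j] = v
            if np.max(np.abs(new - phi)) < 1e-6:
                phi = new
                break
            phi = new

        # voltage between two other boundary points (illustrative probes)
        print("probe voltage:", phi[0, 35] - phi[N - 1, 5])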

  9. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media, and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of a rectangular computational mesh, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.
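
    The mesh-generation step described above (automatic creation of elements, connections and their properties for a rectangular grid) amounts to straightforward bookkeeping; the Python sketch below illustrates the idea with invented dimensions and a made-up naming scheme, not iMatTOUGH's actual input format.

        # Rectangular-mesh bookkeeping: element centroids, volumes, and face
        # connections between neighbouring cells, in the spirit of what a
        # TOUGH2 pre-processor must generate. Dimensions and names are
        # arbitrary illustration, not the tool's real format.
        import itertools

        nx, ny, nz = 4, 3, 2          # number of cells in each direction
        dx, dy, dz = 1.0, 2.0, 0.5    # cell sizes (m)

        elements = {}
        for i, j, k in itertools.product(range(nx), range(ny), range(nz)):
            elements[f"E{i}{j}{k}"] = {
                "centroid": ((i + 0.5) * dx, (j + 0.5) * dy, (k + 0.5) * dz),
                "volume": dx * dy * dz,
            }

        connections = []  # (element A, element B, interface area, distance)
        for i, j, k in itertools.product(range(nx), range(ny), range(nz)):
            if i + 1 < nx:
                connections.append((f"E{i}{j}{k}", f"E{i+1}{j}{k}", dy * dz, dx))
            if j + 1 < ny:
                connections.append((f"E{i}{j}{k}", f"E{i}{j+1}{k}", dx * dz, dy))
            if k + 1 < nz:
                connections.append((f"E{i}{j}{k}", f"E{i}{j}{k+1}", dx * dy, dz))

        print(len(elements), "elements,", len(connections), "connections")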

  10. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  11. BMDExpress Data Viewer: A Visualization Tool to Analyze ...

    EPA Pesticide Factsheets

    Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at which biological perturbations occur. However, graphing and analytical capabilities within BMDExpress are limited, and the analysis of output files is challenging. We developed a web-based application, BMDExpress Data Viewer, for visualization and graphical analyses of BMDExpress output files. The software application consists of two main components: ‘Summary Visualization Tools’ and ‘Dataset Exploratory Tools’. We demonstrate through two case studies that the ‘Summary Visualization Tools’ can be used to examine and assess the distributions of probe and pathway BMD outputs, as well as derive a potential regulatory BMD through the modes or means of the distributions. The ‘Functional Enrichment Analysis’ tool presents biological processes in a two-dimensional bubble chart view. By applying filters of pathway enrichment p-value and minimum number of significant genes, we showed that the Functional Enrichment Analysis tool can be applied to select pathways that are potentially sensitive to chemical perturbations. The ‘Multiple Dataset Comparison’ tool enables comparison of BMDs across multiple experiments (e.g., across time points, tissues, or organisms, etc.). The ‘BMDL-BM
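
    The filtering described for the Functional Enrichment Analysis tool (a pathway enrichment p-value cut-off combined with a minimum number of significant genes) can be pictured as a simple table filter; the Python sketch below uses invented column names, thresholds and values purely for illustration, not BMDExpress Data Viewer's actual schema.

        # Filter pathway-level BMD output by enrichment p-value and by a
        # minimum count of significant genes, then sort by a summary BMD.
        # Column names, cut-offs and values are assumptions for illustration.
        import pandas as pd

        pathways = pd.DataFrame({
            "pathway": ["oxidative stress", "DNA repair", "lipid metabolism"],
            "enrichment_p": [0.001, 0.2, 0.01],
            "n_significant_genes": [12, 3, 7],
            "median_gene_bmd": [4.2, 15.0, 6.8],   # hypothetical dose units
        })

        filtered = pathways[(pathways["enrichment_p"] < 0.05) &
                            (pathways["n_significant_genes"] >= 5)]
        print(filtered.sort_values("median_gene_bmd"))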

  12. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE PAGES

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    2016-04-01

    TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media, and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of a rectangular computational mesh, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.

  13. The Sensitivity and Specificity of Depression Screening Tools among Adults with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Ailey, Sarah H.

    2009-01-01

    This study describes the validity and the sensitivity and specificity of depression screening tools among adults with intellectual disabilities (ID). Subjects (N = 75) were interviewed with the Beck Depression Inventory II (BDI-II) and the Glasgow Depression Scale for People with a Learning Disability (GDS-LD) and also completed a clinical…

  14. Identifying substance misuse in primary care: TAPS Tool compared to the WHO ASSIST.

    PubMed

    Schwartz, R P; McNeely, J; Wu, L T; Sharma, G; Wahle, A; Cushing, C; Nordeck, C D; Sharma, A; O'Grady, K E; Gryczynski, J; Mitchell, S G; Ali, R L; Marsden, J; Subramaniam, G A

    2017-05-01

    There is a need for screening and brief assessment instruments to identify primary care patients with substance use problems. This study's aim was to examine the performance of a two-step screening and brief assessment instrument, the TAPS Tool, compared to the WHO ASSIST. Two thousand adult primary care patients recruited from five primary care clinics in four Eastern US states completed the TAPS Tool followed by the ASSIST. The ability of the TAPS Tool to identify moderate- and high-risk use scores on the ASSIST was examined using sensitivity and specificity analyses. The interviewer and self-administered computer tablet versions of the TAPS Tool generated similar results. The interviewer-administered version (at a cut-off of 2) had acceptable sensitivity and specificity for high-risk tobacco (0.90 and 0.77) and alcohol (0.87 and 0.80) use. For illicit drugs, sensitivities were >0.82 and specificities >0.92. The TAPS (at a cut-off of 1) had good sensitivity and specificity for moderate-risk tobacco use (0.83 and 0.97) and alcohol (0.83 and 0.74). Among illicit drugs, sensitivity was acceptable for moderate-risk marijuana use (0.71), while it was low for all other illicit drugs and non-medical use of prescription medications. Specificities were 0.97 or higher for all illicit drugs and prescription medications. The TAPS Tool identified adult primary care patients with high-risk ASSIST scores for all substances as well as moderate-risk users of tobacco, alcohol, and marijuana, although it did not perform well in identifying patients with moderate-risk use of other drugs or non-medical use of prescription medications. The advantages of the TAPS Tool over the ASSIST are its more limited number of items and focus solely on substance use in the past 3 months. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Clinical effectiveness and cost-effectiveness of parenting interventions for children with severe attachment problems: a systematic review and meta-analysis.

    PubMed

    Wright, Barry; Barry, Melissa; Hughes, Ellen; Trépel, Dominic; Ali, Shehzad; Allgar, Victoria; Cottrill, Lucy; Duffy, Steven; Fell, Jenny; Glanville, Julie; Glaser, Danya; Hackney, Lisa; Manea, Laura; McMillan, Dean; Palmer, Stephen; Prior, Vivien; Whitton, Clare; Perry, Amanda; Gilbody, Simon

    2015-07-01

    Services have variable practices for identifying and providing interventions for 'severe attachment problems' (disorganised attachment patterns and attachment disorders). Several government reports have highlighted the need for better parenting interventions in at-risk groups. This report was commissioned to evaluate the clinical effectiveness and cost-effectiveness of parenting interventions for children with severe attachment problems (the main review). One supplementary review explored the evaluation of assessment tools and a second reviewed 10-year outcome data to better inform health economic aspects of the main review. A total of 29 electronic databases were searched with additional mechanisms for identifying a wide pool of references using the Cochrane methodology. Examples of databases searched include PsycINFO (1806 to January week 1, 2012), MEDLINE and MEDLINE In-Process & Other Non-Indexed Citations (1946 to December week 4, 2011) and EMBASE (1974 to week 1, 2012). Searches were carried out between 6 and 12 January 2012. Papers identified were screened and data were extracted by two independent reviewers, with disagreements arbitrated by a third independent reviewer. Quality assessment tools were used, including quality assessment of diagnostic accuracy studies - version 2 and the Cochrane risk of bias tool. Meta-analysis of randomised controlled trials (RCTs) of parenting interventions was undertaken. A health economics analysis was conducted. The initial search returned 10,167 citations. This yielded 29 RCTs in the main review of parenting interventions to improve attachment patterns, and one involving children with reactive attachment disorder. A meta-analysis of eight studies seeking to improve outcome in at-risk populations showed statistically significant improvement in disorganised attachment. The interventions saw less disorganised attachment at outcome than the control (odds ratio 0.47, 95% confidence interval 0.34 to 0.65; p < 0.00001). Much of this focused around interventions improving maternal sensitivity, with or without video feedback. In our first supplementary review, 35 papers evaluated an attachment assessment tool demonstrating validity or psychometric data. Only five reported test-retest data. Twenty-six studies reported inter-rater reliability, with 24 reporting a level of 0.7 or above. Cronbach's alphas were reported in 12 studies for the comparative tests (11 with α > 0.7) and four studies for the reference tests (four with α > 0.7). Three carried out concurrent validity comparing the Strange Situation Procedure (SSP) with another assessment tool. These had good sensitivity but poor specificity. The Disturbances of Attachment Interview had good sensitivity and specificity with the research diagnostic criteria (RDC) for attachment disorders. In our supplementary review of 10-year outcomes in cohorts using a baseline reference standard, two studies were found with disorganised attachment at baseline, with one finding raised psychopathology in adolescence. Budget impact analysis of costs was estimated because a decision model could not be justifiably populated. This, alongside other findings, informed research priorities. There are relatively few UK-based clinical trials. A 10-year follow-up, while necessary for our health economists for long-term sequelae, yielded a limited number of papers. Maternal sensitivity interventions show good outcomes in at-risk populations, but require further research with complex children. 
The SSP and RDC for attachment disorders remain the reference standards for identification until more concurrent and predictive validity research is conducted. A birth cohort with sequential attachment measures and outcomes across different domains is recommended with further, methodologically sound randomised controlled intervention trials. The main area identified for future work was a need for good-quality RCTs in at-risk groups such as those entering foster care or adoption. This study is registered as PROSPERO CRD42011001395. The National Institute for Health Research Health Technology Assessment programme.
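
    The headline result above is a pooled odds ratio with a 95% confidence interval; as a reminder of how such an estimate is formed at the single-study level, the Python sketch below computes an odds ratio and Wald confidence interval from an invented 2x2 table (the counts do not reproduce the pooled estimate in the abstract).

        # Odds ratio and Wald 95% CI from a 2x2 table
        # (event = disorganised attachment at outcome).
        # Counts are invented and do not reproduce the pooled estimate quoted above.
        import math

        a, b = 40, 110   # intervention: events, non-events (hypothetical)
        c, d = 70, 90    # control:      events, non-events (hypothetical)

        odds_ratio = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
        print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")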

  16. Highly sensitive transient absorption imaging of graphene and graphene oxide in living cells and circulating blood.

    PubMed

    Li, Junjie; Zhang, Weixia; Chung, Ting-Fung; Slipchenko, Mikhail N; Chen, Yong P; Cheng, Ji-Xin; Yang, Chen

    2015-07-23

    We report a transient absorption (TA) imaging method for fast visualization and quantitative layer analysis of graphene and graphene oxide (GO). Graphene on various substrates was imaged in both forward and backward detection under ambient conditions at a speed of 2 μs per pixel. The TA intensity increased linearly with the layer number of graphene. Real-time TA imaging of GO in vitro, with the capability of quantitative analysis of intracellular concentration, and ex vivo in circulating blood was demonstrated. These results suggest that TA microscopy is a valid tool for the study of graphene-based materials.

  17. Accidental falls in hospital inpatients: evaluation of sensitivity and specificity of two risk assessment tools.

    PubMed

    Lovallo, Carmela; Rolandi, Stefano; Rossetti, Anna Maria; Lusignani, Maura

    2010-03-01

    This paper is a report of a study comparing the effectiveness of two falls risk assessment tools (Conley Scale and Hendrich Risk Model) by using them simultaneously with the same sample of hospital inpatients. Different risk assessment tools are available in literature. However, neither recent critical reviews nor international guidelines on fall prevention have identified tools that can be generalized to all categories of hospitalized patients. A prospective observational study was carried out in acute medical, surgical wards and rehabilitation units. From October 2007 to January 2008, 1148 patients were assessed with both instruments, subsequently noting the occurrence of falls. The sensitivity, specificity, positive and negative predictive values, and Receiver Operating Characteristics curves were calculated. The number of patients correctly identified with the Conley Scale (n = 41) was higher than with the Hendrich Model (n = 27). The Conley Scale gave sensitivity and specificity values of 69.49% and 61% respectively. The Hendrich Model gave a sensitivity value of 45.76% and a specificity value of 71%. Positive and negative predictive values were comparable. The Conley Scale is indicated for use in the medical sector, on the strength of its high sensitivity. However, since its specificity is very low, it is deemed useful to submit individual patients giving positive results to more in-depth clinical evaluation in order to decide whether preventive measures need to be taken. In surgical sectors, the low sensitivity values given by both scales suggest that further studies are warranted.

  18. Landfill Site Selection by AHP Based Multi-criteria Decision Making Tool: A Case Study in Kolkata, India

    NASA Astrophysics Data System (ADS)

    Majumdar, Ankush; Hazra, Tumpa; Dutta, Amit

    2017-09-01

    This work presents a Multi-criteria Decision Making (MCDM) tool to select a landfill site from three candidate sites proposed for the Kolkata Municipal Corporation (KMC) area that complies with accessibility, receptor, environment, public acceptability, geological and economic criteria. The Analytical Hierarchy Process has been used to solve the MCDM problem. The suitability of the three sites (viz. Natagachi, Gangajoara and Kharamba) as landfills, as proposed by KMC, has been checked using a Landfill Site Sensitivity Index (LSSI) as well as an Economic Viability Index (EVI). Land area availability for disposing of the huge quantity of Municipal Solid Waste for the design period has been checked. Analysis of the studied sites shows that they are moderately suitable for landfill facility construction, as both LSSI and EVI scores lie between 300 and 750. The proposed approach represents an effective MCDM tool for siting sanitary landfills in the growing metropolitan cities of developing countries like India.
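
    The Analytical Hierarchy Process step at the core of this MCDM tool derives criterion weights from a reciprocal pairwise comparison matrix via its principal eigenvector, with a consistency check; the Python sketch below shows that step on an invented three-criterion matrix, not the judgements used in the study.

        # AHP weighting step: criterion weights are the normalised principal
        # eigenvector of a reciprocal pairwise comparison matrix, followed by
        # a consistency-ratio check. The matrix is an invented example, not
        # the judgements used in the cited study.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],   # e.g. receptor vs environment vs economics
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty random index
        print("weights:", np.round(weights, 3), " CR:", round(ci / ri, 3))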

  19. Strategies and tools for whole genome alignments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couronne, Olivier; Poliakov, Alexander; Bray, Nicolas

    2002-11-25

    The availability of the assembled mouse genome makespossible, for the first time, an alignment and comparison of two largevertebrate genomes. We have investigated different strategies ofalignment for the subsequent analysis of conservation of genomes that areeffective for different quality assemblies. These strategies were appliedto the comparison of the working draft of the human genome with the MouseGenome Sequencing Consortium assembly, as well as other intermediatemouse assemblies. Our methods are fast and the resulting alignmentsexhibit a high degree of sensitivity, covering more than 90 percent ofknown coding exons in the human genome. We have obtained such coveragewhile preserving specificity. With amore » view towards the end user, we havedeveloped a suite of tools and websites for automatically aligning, andsubsequently browsing and working with whole genome comparisons. Wedescribe the use of these tools to identify conserved non-coding regionsbetween the human and mouse genomes, some of which have not beenidentified by other methods.« less

  20. Value of circulating cell-free DNA analysis as a diagnostic tool for breast cancer: a meta-analysis

    PubMed Central

    Ma, Xuelei; Zhang, Jing; Hu, Xiuying

    2017-01-01

    Objectives The aim of this study was to systematically evaluate the diagnostic value of cell free DNA (cfDNA) for breast cancer. Results Among 308 candidate articles, 25 with relevant diagnostic screening qualified for final analysis. The mean sensitivity, specificity and area under the curve (AUC) of SROC plots for 24 studies that distinguished breast cancer patients from healthy controls were 0.70, 0.87, and 0.9314, yielding a DOR of 32.31. When analyzed in subgroups, the 14 quantitative studies produced sensitivity, specificity, AUC, and a DOR of 0.78, 0.83, 0.9116, and 24.40. The 10 qualitative studies produced 0.50, 0.98, 0.9919, and 68.45. For 8 studies that distinguished malignant breast cancer from benign diseases, the specificity, sensitivity, AUC and DOR were 0.75, 0.79, 0.8213, and 9.49. No covariate factors had a significant correlation with relative DOR. Deeks' funnel plots indicated an absence of publication bias. Materials and Methods Databases were searched for studies involving the use of cfDNA to diagnose breast cancer. The studies were analyzed to determine sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio (DOR), and the summary receiver operating characteristic (SROC). Covariates were evaluated for effect on relative DOR. Deeks' funnel plots were generated to measure publication bias. Conclusions Our analysis suggests a promising diagnostic potential of using cfDNA for breast cancer screening, but this diagnostic method is not yet independently sufficient. Further work refining qualitative cfDNA assays will improve the correct diagnosis of breast cancers. PMID:28460452

  1. "A Bayesian sensitivity analysis to evaluate the impact of unmeasured confounding with external data: a real world comparative effectiveness study in osteoporosis".

    PubMed

    Zhang, Xiang; Faries, Douglas E; Boytsov, Natalie; Stamey, James D; Seaman, John W

    2016-09-01

    Observational studies are frequently used to assess the effectiveness of medical interventions in routine clinical practice. However, the use of observational data for comparative effectiveness is challenged by selection bias and the potential of unmeasured confounding. This is especially problematic for analyses using a health care administrative database, in which key clinical measures are often not available. This paper provides an approach to conducting a sensitivity analysis to investigate the impact of unmeasured confounding in observational studies. In a real world osteoporosis comparative effectiveness study, the bone mineral density (BMD) score, an important predictor of fracture risk and a factor in the selection of osteoporosis treatments, is unavailable in the database, and lack of baseline BMD could potentially lead to significant selection bias. We implemented Bayesian twin-regression models, which simultaneously model both the observed outcome and the unobserved unmeasured confounder, using information from external sources. A sensitivity analysis was also conducted to assess the robustness of our conclusions to changes in such external data. The use of Bayesian modeling in this study suggests that the lack of baseline BMD did have a strong impact on the analysis, reversing the direction of the estimated effect (odds ratio of fracture incidence at 24 months: 0.40 vs. 1.36, with/without adjusting for unmeasured baseline BMD). The Bayesian twin-regression models provide a flexible sensitivity analysis tool to quantitatively assess the impact of unmeasured confounding in observational studies. Copyright © 2016 John Wiley & Sons, Ltd.
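
    The study itself uses Bayesian twin-regression models; a much simpler, non-Bayesian way to convey the same idea is to compute how an observed effect estimate would shift under assumed values for an unmeasured binary confounder. The Python sketch below applies the standard bias-factor formula for a binary confounder, with assumed prevalences and an assumed confounder-outcome risk ratio; all numbers are illustrative and the approximation is far cruder than the models used in the paper.

        # Simplified (non-Bayesian) sensitivity analysis for an unmeasured
        # binary confounder: scan assumed confounder-outcome risk ratios and
        # confounder prevalences in each treatment group, and report the
        # correspondingly adjusted effect. All numbers are illustrative
        # assumptions, not study data, and the bias factor is approximate.
        import itertools

        observed_or = 1.36          # hypothetical unadjusted effect estimate

        for rr_conf, p_treated, p_control in itertools.product(
                [1.5, 2.0, 3.0],    # assumed confounder-outcome risk ratio
                [0.3, 0.5],         # assumed prevalence of low BMD, treated group
                [0.6, 0.8]):        # assumed prevalence of low BMD, control group
            bias = ((p_treated * (rr_conf - 1) + 1) /
                    (p_control * (rr_conf - 1) + 1))
            adjusted = observed_or / bias
            print(f"RR={rr_conf:.1f} p1={p_treated:.1f} p0={p_control:.1f} "
                  f"-> adjusted OR ~ {adjusted:.2f}")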

  2. A Survey of FDG- and Amyloid-PET Imaging in Dementia and GRADE Analysis

    PubMed Central

    Daniela, Perani; Orazio, Schillaci; Alessandro, Padovani; Mariano, Nobili Flavio; Leonardo, Iaccarino; Pasquale Anthony, Della Rosa; Giovanni, Frisoni; Carlo, Caltagirone

    2014-01-01

    PET based tools can improve the early diagnosis of Alzheimer's disease (AD) and differential diagnosis of dementia. The importance of identifying individuals at risk of developing dementia among people with subjective cognitive complaints or mild cognitive impairment has clinical, social, and therapeutic implications. Within the two major classes of AD biomarkers currently identified, that is, markers of pathology and neurodegeneration, amyloid- and FDG-PET imaging represent decisive tools for their measurement. As a consequence, the PET tools have been recognized to be of crucial value in the recent guidelines for the early diagnosis of AD and other dementia conditions. The recommendations, however, rest on a large PET imaging literature based on visual assessment methods, which greatly reduces sensitivity and specificity and lacks a clear cut-off between normal and pathological findings. PET imaging can be assessed using parametric or voxel-wise analyses by comparing the subject's scan with a normative data set, significantly increasing the diagnostic accuracy. This paper is a survey of the relevant literature on FDG and amyloid-PET imaging aimed at establishing the value of quantification for the early and differential diagnosis of AD. This allowed a meta-analysis and GRADE analysis revealing high values for PET imaging that might be useful in considering recommendations. PMID:24772437

  3. Assessing Predictive Validity of Pressure Ulcer Risk Scales- A Systematic Review and Meta-Analysis

    PubMed Central

    PARK, Seong-Hi; LEE, Hea Shoon

    2016-01-01

    Background: The purpose of this study was to present a scientific rationale for pressure ulcer risk scales (Cubbin & Jackson, modified Braden, Norton, and Waterlow) as nursing diagnosis tools by examining their predictive validity for pressure sores. Methods: Articles published between 1966 and 2013 from periodicals indexed in the Ovid Medline, Embase, CINAHL, KoreaMed, NDSL, and other databases were selected using the key word “pressure ulcer”. QUADAS-II was applied for assessment of the internal validity of the diagnostic studies. Selected studies were analyzed using meta-analysis with MetaDisc 1.4. Results: Seventeen diagnostic studies with high methodological quality, involving 5,185 patients, were included. In the results of the meta-analysis, the sROC AUC of the Braden, Norton, and Waterlow scales was over 0.7, showing moderate predictive validity, but interpretation is limited owing to significant differences between studies. In addition, the Waterlow scale is insufficient as a screening tool owing to its low sensitivity compared with other scales. Conclusion: The contemporary pressure ulcer risk scales are not suitable for uniform practice on patients under standardized criteria. Therefore, in order to provide more effective nursing care for bedsores, a new or modified pressure ulcer risk scale should be developed upon the strengths and weaknesses of existing tools. PMID:27114977

  4. How can sensitivity analysis improve the robustness of mathematical models utilized by the re/insurance industry?

    NASA Astrophysics Data System (ADS)

    Noacco, V.; Wagener, T.; Pianosi, F.; Philp, T.

    2017-12-01

    Insurance companies provide insurance against a wide range of threats, such as natural catastrophes, nuclear incidents and terrorism. To quantify risk and support investment decisions, mathematical models are used, for example to set the premiums charged to clients that protect from financial loss, should deleterious events occur. While these models are essential tools for adequately assessing the risk attached to an insurer's portfolio, their development is costly and their value for decision-making may be limited by an incomplete understanding of uncertainty and sensitivity. Aside from the business need to understand risk and uncertainty, the insurance sector also faces regulation which requires them to test their models in such a way that uncertainties are appropriately captured and that plans are in place to assess the risks and their mitigation. The building and testing of models constitutes a high cost for insurance companies, and it is a time intensive activity. This study uses an established global sensitivity analysis toolbox (SAFE) to more efficiently capture the uncertainties and sensitivities embedded in models used by a leading re/insurance firm, with structured approaches to validate these models and test the impact of assumptions on the model predictions. It is hoped that this in turn will lead to better-informed and more robust business decisions.
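
    Variance-based global sensitivity analysis of the kind SAFE supports asks how much of the output variance each input explains on its own. The Python sketch below estimates crude first-order indices for a toy insurance-loss model by binning each input and comparing the variance of conditional means with the total variance; the model, parameter ranges and estimator are illustrative simplifications, not the firm's models or the SAFE API.

        # Crude first-order sensitivity indices for a toy loss model,
        # estimated by binning each input and comparing the variance of
        # conditional means to the total output variance. The model and
        # ranges are invented; SAFE provides far more rigorous estimators
        # (e.g. variance-based Sobol', PAWN, elementary effects).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 20000
        freq = rng.uniform(0.5, 2.0, n)       # event frequency multiplier (assumed)
        sev = rng.lognormal(0.0, 0.5, n)      # severity factor (assumed)
        deduct = rng.uniform(0.0, 0.3, n)     # deductible fraction (assumed)

        loss = freq * sev * (1 - deduct)      # toy portfolio loss model

        def first_order_index(x, y, bins=20):
            edges = np.quantile(x, np.linspace(0, 1, bins + 1))
            idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
            cond_means = np.array([y[idx == b].mean() for b in range(bins)])
            return cond_means.var() / y.var()

        for name, x in [("frequency", freq), ("severity", sev), ("deductible", deduct)]:
            print(name, round(first_order_index(x, loss), 3))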

  5. A Meta-analysis for the Diagnostic Performance of Transient Elastography for Clinically Significant Portal Hypertension.

    PubMed

    You, Myung-Won; Kim, Kyung Won; Pyo, Junhee; Huh, Jimi; Kim, Hyoung Jung; Lee, So Jung; Park, Seong Ho

    2017-01-01

    We aimed to evaluate the correlation between liver stiffness measurement using transient elastography (TE-LSM) and hepatic venous pressure gradient and the diagnostic performance of TE-LSM in assessing clinically significant portal hypertension through meta-analysis. Eleven studies were included from thorough literature research and selection processes. The summary correlation coefficient was 0.783 (95% confidence interval [CI], 0.737-0.823). Summary sensitivity, specificity and area under the hierarchical summary receiver operating characteristic curve (AUC) were 87.5% (95% CI, 75.8-93.9%), 85.3 % (95% CI, 76.9-90.9%) and 0.9, respectively. The subgroup with low cut-off values of 13.6-18 kPa had better summary estimates (sensitivity 91.2%, specificity 81.3% and partial AUC 0.921) than the subgroup with high cut-off values of 21-25 kPa (sensitivity 71.2%, specificity 90.9% and partial AUC 0.769). In summary, TE-LSM correlated well with hepatic venous pressure gradient and represented good diagnostic performance in diagnosing clinically significant portal hypertension. For use as a sensitive screening tool, we propose using low cut-off values of 13.6-18 kPa in TE-LSM. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  6. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    NASA Astrophysics Data System (ADS)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical formulations that are discussed are applicable to ascent on Earth or other planets as well as other rocket-powered systems such as sounding rockets and ballistic missiles.
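
    The close agreement between linear covariance and Monte Carlo results reported above is what one expects when the dynamics are nearly linear over the dispersion range. The Python sketch below demonstrates the comparison on a toy discrete-time linear system; the state model, noise levels and step count are invented, and the actual tool applies the same idea to a full nonlinear 6-DOF ascent simulation.

        # Toy comparison of linear covariance propagation with Monte Carlo
        # dispersion for x_{k+1} = A x_k + w_k. For a linear model the two
        # agree to within sampling error. All matrices are illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        A = np.array([[1.0, 0.1],
                      [0.0, 1.0]])            # simple position/velocity model (assumed)
        Q = np.diag([1e-4, 1e-3])             # process noise covariance (assumed)
        P0 = np.diag([0.01, 0.04])            # initial dispersion covariance (assumed)
        steps, n_mc = 50, 5000

        # Linear covariance propagation
        P = P0.copy()
        for _ in range(steps):
            P = A @ P @ A.T + Q

        # Monte Carlo dispersion
        x = rng.multivariate_normal([0, 0], P0, size=n_mc)
        for _ in range(steps):
            x = x @ A.T + rng.multivariate_normal([0, 0], Q, size=n_mc)

        print("LinCov final std :", np.sqrt(np.diag(P)))
        print("MonteC final std :", x.std(axis=0))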

  7. Validity of a modified Parkinson's disease screening questionnaire in India: effects of literacy of participants and medical training of screeners and implications for screening efforts in developing countries.

    PubMed

    Sarangmath, Nagaraja; Rattihalli, Rohini; Ragothaman, Mona; Gopalkrishna, Gururaj; Doddaballapur, Subbakrishna; Louis, Elan D; Muthane, Uday B

    2005-12-01

    The prevalence of Parkinson's disease (PD) is low among Indians, except in the Parsis. Data for Indians come from studies using different screening tools and criteria to detect PD. An epidemiological study in India, which has nearly a billion people, more than 18 spoken languages, and varying levels of literacy, requires development and validation of a screening tool for PD. The objectives of this study are to (1) validate a modified version of a widely used screening questionnaire for PD to suit the needs of the Indian population; (2) compare the use of a nonmedical assistant (NMA) with the use of a medical person during screening; and (3) compare the effect of literacy of participants on the validity of the screening tool. The validity of the questionnaire was tested on 125 participants from a home for the elderly. NMAs of similar background and medical personnel administered the modified screening questionnaire. A movement disorder neurologist blind to the responses on the questionnaire, examined participants independently and diagnosed if participants had PD. The questionnaire was validated in the movement disorders clinic, on known PD patients and their family members without PD. In the movement disorders clinic, sensitivity and specificity of the questionnaire were 100% and 89%, respectively. Fifty-seven participants were included for analysis. The questionnaire had a higher sensitivity when NMAs (75%) rather than the medical personnel (61%) administered it, and its specificity was higher with the medical personnel (61%) than with NMAs (55% and 25%). The questionnaire had a higher specificity in literates than illiterates, whereas sensitivity varied considerably. The modified questionnaire translated in a local Indian language had reasonable sensitivity and can be used to screen individuals for PD in epidemiological studies in India. This questionnaire can be administered by NMAs to screen PD and this strategy would reduce manpower costs. Literacy may influence epidemiological estimates when screening PD.

  8. Recent advances in proteomic applications for schistosomiasis research: potential clinical impact.

    PubMed

    Sotillo, Javier; Doolan, Denise; Loukas, Alex

    2017-02-01

    Schistosomiasis is a neglected tropical disease affecting hundreds of millions of people worldwide. Recent advances in the field of proteomics and the development of new and highly sensitive mass spectrometers and quantitative techniques have provided new tools for advancing the molecular biology, cell biology, diagnosis and vaccine development for public health threats such as schistosomiasis. Areas covered: In this review we describe the latest advances in research that utilizes proteomics-based tools to address some of the key challenges to developing effective interventions against schistosomiasis. We also provide information about the potential of extracellular vesicles to advance the fight against this devastating disease. Expert commentary: Different proteins are already being tested as vaccines against schistosomiasis with promising results. The re-analysis of the Schistosoma spp. proteomes using new and more sensitive mass spectrometers as well as better separation approaches will help identify more vaccine targets in a rational and informed manner. In addition, the recent development of new proteome microarrays will facilitate characterisation of novel markers of infection as well as new vaccine and diagnostic candidate antigens.

  9. Development of a screening tool to predict malnutrition among children under two years old in Zambia

    PubMed Central

    Hasegawa, Junko; Ito, Yoichi M; Yamauchi, Taro

    2017-01-01

    Background: Maternal and child undernutrition is an important issue, particularly in low- and middle-income countries. Children at high risk of malnutrition should be prioritized to receive necessary interventions to minimize such risk. Several risk factors have been proposed; however, until now, there has been no appropriate evaluation method to identify these children. In sub-Saharan Africa, children commonly receive regular check-ups from community health workers. A simple and easy nutrition assessment method is therefore needed for use by semi-professional health workers. Objectives: The aim of this study was to develop and test a practical screening tool for community use in predicting growth stunting in children under two years in rural Zambia. Methods: Field research was conducted from July to August 2014 in Southern Province, Zambia. Two hundred and sixty-four mother-child pairs participated in the study. Anthropometric measurements were performed on all children and mothers, and all mothers were interviewed. Risk factors for the screening test were estimated by using least absolute shrinkage and selection operator analysis. After re-evaluating all participants using the new screening tool, a receiver operating characteristic curve was drawn to set the cut-off value. Sensitivity and specificity were also calculated. Results: The screening tool included age, weight-for-age Z-score status, birth weight, feeding status, history of sibling death, multiple birth, and maternal education level. The total score ranged from 0 to 22, and the cut-off value was eight. Sensitivity and specificity were 0.963 and 0.697 respectively. Conclusions: A screening tool was developed to predict children at high risk of malnutrition living in Zambia. Further longitudinal studies are needed to confirm the test’s validity in detecting future stunting and to investigate the effectiveness of malnutrition treatment. PMID:28730929
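
    Two of the steps described above, risk-factor selection with a LASSO-type penalty and choice of a cut-off from a receiver operating characteristic curve, are sketched below in Python on simulated data; the feature set, penalty strength and Youden-index rule are illustrative assumptions, not the study's exact procedure.

        # L1-penalised (LASSO-type) selection of risk factors followed by a
        # ROC-based choice of cut-off, on simulated data. Feature count,
        # penalty strength and the Youden-index rule are illustrative.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(2)
        n = 300
        X = rng.normal(size=(n, 6))                      # 6 candidate risk factors
        true_beta = np.array([1.2, 0.0, 0.8, 0.0, 0.0, -0.9])
        p = 1 / (1 + np.exp(-(X @ true_beta - 0.5)))
        y = rng.binomial(1, p)                           # 1 = stunted (simulated)

        model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
        print("selected coefficients:", np.round(model.coef_[0], 2))

        scores = model.predict_proba(X)[:, 1]
        fpr, tpr, thresholds = roc_curve(y, scores)
        best = np.argmax(tpr - fpr)                      # Youden index
        print("cut-off:", round(thresholds[best], 2),
              "sensitivity:", round(tpr[best], 2),
              "specificity:", round(1 - fpr[best], 2))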

  10. Assessment of olfactory function after traumatic brain injury: comparison of single odour tool with detailed assessment tool.

    PubMed

    Humphries, Thomas; Singh, Rajiv

    2018-01-01

    Olfactory disturbance (OD) is common after traumatic brain injury (TBI). Screening for OD can be performed by several different methods. While odour identification tools are considered more accurate, they are time consuming. The predominant method in clinical practice remains the use of a single odour. This study aimed to compare a brief single-odour identification tool (BSOIT) with a more detailed 12-odour assessment tool. One hundred seventy consecutive patients with TBI had their olfaction assessed using BSOIT and a 12-item tool at a single time point. The sensitivity and specificity of the BSOIT were calculated. The sensitivity and specificity of the BSOIT as compared to the Burghart tool were 57.5% and 100%, respectively, for all ODs (anosmia and hyposmia). The sensitivity and specificity for anosmia only were 93.5% and 96.7%, respectively. For the two tools, the Cohen's kappa coefficient showed moderate agreement when both anosmia and hyposmia were considered (k = 0.619, p < 0.001) but a very strong agreement when only anosmia was considered (k = 0.844, p < 0.001). For both the tools, anosmia had a significant association with TBI severity (p < 0.001). However, hyposmia showed no such association. The BSOIT is very effective at identifying anosmia but not hyposmia, producing comparable results to a more detailed test. It can be effective in clinical practice and takes considerably less time.
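
    Agreement between the two tools is summarised above with Cohen's kappa; the Python sketch below shows how kappa is computed from a cross-tabulation of the two ratings, using invented counts that do not reproduce the values reported in the abstract.

        # Cohen's kappa for agreement between a brief single-odour screen and
        # a multi-item reference, from a 2x2 cross-tabulation.
        # Counts are invented for illustration only.
        def cohens_kappa(table):
            """table[i][j] = count with tool A giving category i, tool B giving j."""
            n = sum(sum(row) for row in table)
            po = sum(table[i][i] for i in range(len(table))) / n
            row_tot = [sum(row) for row in table]
            col_tot = [sum(col) for col in zip(*table)]
            pe = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
            return (po - pe) / (1 - pe)

        # rows: brief tool (anosmia / not), cols: 12-item tool (anosmia / not)
        print(round(cohens_kappa([[29, 2], [3, 136]]), 3))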

  11. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents the sensitivity and uncertainty analysis of an integrated urban drainage model which includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified in order to include the micropollutant assessment (namely, sulfamethoxazole - SMX). The model also takes into account the interactions between the three components of the system: sewer system (SS), wastewater treatment plant (WWTP) and receiving water body (RWB). The analysis has been applied to an experimental catchment nearby Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by a different combination of uncertain sub-systems (i.e., SS, WWTP and RWB), have been considered, applying the Extended-FAST method for the sensitivity analysis in order to select the key factors affecting RWB quality and to design a reliable/useful experimental campaign. Results have demonstrated that sensitivity analysis is a powerful tool for increasing operator confidence in the modelling results. The approach adopted here can be used for blocking some non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS have been found to be the most relevant factors affecting the SMX modeling in the RWB when all model factors (scenario 1) or model factors of the SS (scenarios 2 and 3) are varied. If only the factors related to the WWTP are changed (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced by the aerobic sorption coefficient (up to 95% of the total variance for SSMX,max). A progressive uncertainty reduction from upstream to downstream was found for the soluble fraction of SMX in the RWB.

  12. Finite Element Model Calibration Approach for Ares I-X

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Buehrle, Ralph D.; Templeton, Justin D.; Gaspar, James L.; Lazor, Daniel R.; Parks, Russell A.; Bartolotta, Paul A.

    2010-01-01

    Ares I-X is a pathfinder vehicle concept under development by NASA to demonstrate a new class of launch vehicles. Although this vehicle is essentially a shell of what the Ares I vehicle will be, efforts are underway to model and calibrate the analytical models before its maiden flight. Work reported in this document will summarize the model calibration approach used including uncertainty quantification of vehicle responses and the use of non-conventional boundary conditions during component testing. Since finite element modeling is the primary modeling tool, the calibration process uses these models, often developed by different groups, to assess model deficiencies and to update parameters to reconcile test with predictions. Data for two major component tests and the flight vehicle are presented along with the calibration results. For calibration, sensitivity analysis is conducted using Analysis of Variance (ANOVA). To reduce the computational burden associated with ANOVA calculations, response surface models are used in lieu of computationally intensive finite element solutions. From the sensitivity studies, parameter importance is assessed as a function of frequency. In addition, the work presents an approach to evaluate the probability that a parameter set exists to reconcile test with analysis. Comparisons of pretest predictions of frequency response uncertainty bounds with measured data, results from the variance-based sensitivity analysis, and results from component test models with calibrated boundary stiffness models are all presented.

  13. Finite Element Model Calibration Approach for Ares I-X

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Buehrle, Ralph D.; Templeton, Justin D.; Lazor, Daniel R.; Gaspar, James L.; Parks, Russel A.; Bartolotta, Paul A.

    2010-01-01

    Ares I-X is a pathfinder vehicle concept under development by NASA to demonstrate a new class of launch vehicles. Although this vehicle is essentially a shell of what the Ares I vehicle will be, efforts are underway to model and calibrate the analytical models before its maiden flight. Work reported in this document will summarize the model calibration approach used including uncertainty quantification of vehicle responses and the use of nonconventional boundary conditions during component testing. Since finite element modeling is the primary modeling tool, the calibration process uses these models, often developed by different groups, to assess model deficiencies and to update parameters to reconcile test with predictions. Data for two major component tests and the flight vehicle are presented along with the calibration results. For calibration, sensitivity analysis is conducted using Analysis of Variance (ANOVA). To reduce the computational burden associated with ANOVA calculations, response surface models are used in lieu of computationally intensive finite element solutions. From the sensitivity studies, parameter importance is assessed as a function of frequency. In addition, the work presents an approach to evaluate the probability that a parameter set exists to reconcile test with analysis. Comparisons of pre-test predictions of frequency response uncertainty bounds with measured data, results from the variance-based sensitivity analysis, and results from component test models with calibrated boundary stiffness models are all presented.
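
    The abstract describes replacing computationally intensive finite element solutions with response surface models for the ANOVA-based sensitivity study. The Python sketch below fits a quadratic response surface to a handful of evaluations of a stand-in function and then evaluates the cheap surrogate many times; the function, sample sizes and parameter ranges are illustrative, not the Ares I-X models.

        # Quadratic response-surface surrogate: fit polynomial coefficients to
        # a few "expensive" model evaluations, then use the cheap surrogate
        # for the many evaluations a variance-based/ANOVA study needs.
        # The underlying function stands in for a finite element solve.
        import numpy as np

        rng = np.random.default_rng(3)

        def expensive_model(k1, k2):
            # stand-in for a frequency predicted by a finite element model (assumed)
            return 10 + 2.0 * k1 - 1.5 * k2 + 0.8 * k1 * k2 + 0.3 * k1**2

        X = rng.uniform(-1, 1, size=(30, 2))            # sampled stiffness parameters
        y = expensive_model(X[:, 0], X[:, 1])

        def basis(X):
            k1, k2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(k1), k1, k2, k1 * k2, k1**2, k2**2])

        coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

        X_big = rng.uniform(-1, 1, size=(100000, 2))    # cheap surrogate evaluations
        y_hat = basis(X_big) @ coef
        print("surrogate coefficients:", np.round(coef, 3))
        print("surrogate output variance:", round(y_hat.var(), 3))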

  14. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype in an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.

  15. Comparison of Subjective Global Assessment and Protein Energy Wasting Score to Nutrition Evaluations Conducted by Registered Dietitian Nutritionists in Identifying Protein Energy Wasting Risk in Maintenance Hemodialysis Patients.

    PubMed

    Sum, Simon Siu-Man; Marcus, Andrea F; Blair, Debra; Olejnik, Laura A; Cao, Joyce; Parrott, J Scott; Peters, Emily N; Hand, Rosa K; Byham-Gray, Laura D

    2017-09-01

    To compare the 7-point subjective global assessment (SGA) and the protein energy wasting (PEW) score with nutrition evaluations conducted by registered dietitian nutritionists in identifying PEW risk in stage 5 chronic kidney disease patients on maintenance hemodialysis. This study is a secondary analysis of a cross-sectional study entitled "Development and Validation of a Predictive Energy Equation in Hemodialysis". PEW risk identified by the 7-point SGA and the PEW score was compared against the nutrition evaluations conducted by registered dietitian nutritionists through data examination from the original study (reference standard). A total of 133 patients were included for the analysis. The sensitivity, specificity, positive and negative predictive value (PPV and NPV), positive and negative likelihood ratio (PLR and NLR) of both scoring tools were calculated when compared against the reference standard. The patients were predominantly African American (n = 112, 84.2%), non-Hispanic (n = 101, 75.9%), and male (n = 80, 60.2%). Both the 7-point SGA (sensitivity = 78.6%, specificity = 59.1%, PPV = 33.9%, NPV = 91.2%, PLR = 1.9, and NLR = 0.4) and the PEW score (sensitivity = 100%, specificity = 28.6%, PPV = 27.2%, NPV = 100%, PLR = 1.4, and NLR = 0) were more sensitive than specific in identifying PEW risk. The 7-point SGA may miss 21.4% of patients having PEW and falsely identify 40.9% of patients who do not have PEW. The PEW score can identify PEW risk in all patients, but 71.4% of patients identified may not have PEW risk. Both the 7-point SGA and the PEW score could identify PEW risk. The 7-point SGA was more specific, and the PEW score was more sensitive. Both scoring tools can be used with clinical confidence to identify patients who are actually not at PEW risk. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
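
    The comparison above rests on the standard 2x2 screening metrics (sensitivity, specificity, predictive values and likelihood ratios); the Python sketch below computes all of them from a single invented table, not the study's data.

        # Sensitivity, specificity, predictive values and likelihood ratios
        # from a 2x2 screening table (screening tool vs reference standard).
        # Counts are invented for illustration only.
        def diagnostic_metrics(tp, fp, fn, tn):
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            return {
                "sensitivity": sens,
                "specificity": spec,
                "PPV": tp / (tp + fp),
                "NPV": tn / (tn + fn),
                "PLR": sens / (1 - spec),
                "NLR": (1 - sens) / spec,
            }

        for name, value in diagnostic_metrics(tp=22, fp=43, fn=6, tn=62).items():
            print(f"{name}: {value:.2f}")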

  16. An optimized rapid bisulfite conversion method with high recovery of cell-free DNA.

    PubMed

    Yi, Shaohua; Long, Fei; Cheng, Juanbo; Huang, Daixin

    2017-12-19

    Methylation analysis of cell-free DNA is an encouraging tool for tumor diagnosis, monitoring and prognosis. The sensitivity of methylation analysis is a very important matter due to the tiny amounts of cell-free DNA available in plasma. Most current methods of DNA methylation analysis are based on the difference in bisulfite-mediated deamination of cytosine between cytosine and 5-methylcytosine. However, the recovery of bisulfite-converted DNA with current methods is very poor for the methylation analysis of cell-free DNA. We optimized a rapid method for the crucial steps of bisulfite conversion with high recovery of cell-free DNA. A rapid deamination step and alkaline desulfonation were combined with the purification of DNA on a silica column. The conversion efficiency and recovery of bisulfite-treated DNA were investigated by droplet digital PCR. The optimization of the reaction results in complete cytosine conversion in 30 min at 70 °C and about 65% recovery of bisulfite-treated cell-free DNA, which is higher than with current methods. The method allows high recovery from low levels of bisulfite-treated cell-free DNA, enhancing the sensitivity of methylation detection from cell-free DNA.

  17. Near-infrared confocal micro-Raman spectroscopy combined with PCA-LDA multivariate analysis for detection of esophageal cancer

    NASA Astrophysics Data System (ADS)

    Chen, Long; Wang, Yue; Liu, Nenrong; Lin, Duo; Weng, Cuncheng; Zhang, Jixue; Zhu, Lihuan; Chen, Weisheng; Chen, Rong; Feng, Shangyuan

    2013-06-01

    The diagnostic capability of using tissue intrinsic micro-Raman signals to obtain biochemical information from human esophageal tissue is presented in this paper. Near-infrared micro-Raman spectroscopy combined with multivariate analysis was applied for discrimination of esophageal cancer tissue from normal tissue samples. Micro-Raman spectroscopy measurements were performed on 54 esophageal cancer tissues and 55 normal tissues in the 400-1750 cm⁻¹ range. The mean Raman spectra showed significant differences between the two groups. Tentative assignments of the Raman bands in the measured tissue spectra suggested some changes in protein structure, a decrease in the relative amount of lactose, and increases in the percentages of tryptophan, collagen and phenylalanine content in esophageal cancer tissue as compared to those of normal tissue. The diagnostic algorithms based on principal component analysis (PCA) and linear discriminant analysis (LDA) achieved a diagnostic sensitivity of 87.0% and specificity of 70.9% for separating cancer from normal esophageal tissue samples. The results demonstrated that near-infrared micro-Raman spectroscopy combined with PCA-LDA analysis could be an effective and sensitive tool for identification of esophageal cancer.
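
    The diagnostic algorithm described above is a PCA-LDA pipeline applied to spectra; the Python sketch below reproduces the shape of that workflow with scikit-learn on simulated spectra containing a synthetic class difference, reporting cross-validated sensitivity and specificity. The simulated data, component count and band positions are illustrative assumptions, not the measured Raman data.

        # PCA followed by linear discriminant analysis on simulated "spectra",
        # with cross-validated sensitivity and specificity. Data are random
        # noise plus a synthetic band-intensity difference, purely illustrative.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_predict
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(4)
        n_per_class, n_wavenumbers = 55, 500
        normal = rng.normal(0, 1, (n_per_class, n_wavenumbers))
        cancer = rng.normal(0, 1, (n_per_class, n_wavenumbers))
        cancer[:, 100:120] += 0.8               # synthetic band-intensity difference
        X = np.vstack([normal, cancer])
        y = np.array([0] * n_per_class + [1] * n_per_class)   # 1 = cancer

        pipe = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
        pred = cross_val_predict(pipe, X, y, cv=5)

        tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
        tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
        print("sensitivity:", round(tp / (tp + fn), 2),
              "specificity:", round(tn / (tn + fp), 2))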

  18. Sensitivity tests to define the source apportionment performance criteria in the DeltaSA tool

    NASA Astrophysics Data System (ADS)

    Pernigotti, Denise; Belis, Claudio A.

    2017-04-01

    Identification and quantification of the contribution of emission sources to a given area is a key task for the design of abatement strategies. Moreover, European member states are obliged to report this kind of information for zones where pollution levels exceed the limit values. At present, little is known about the performance and uncertainty of the variety of methodologies used for source apportionment and the comparability between the results of studies using different approaches. DeltaSA (source apportionment Delta) is a tool developed by the EC-JRC to support particulate matter source apportionment modellers in the identification of sources (for factor analysis studies) and/or in the measurement of their performance. Source identification is performed by the tool by measuring the proximity of any user chemical profile to preloaded repository data (SPECIATE and SPECIEUROPE). The model performance criteria are based on standard statistical indexes calculated by comparing participants' source contribution estimates and their time series with preloaded reference data. Those preloaded data refer to previous European SA intercomparison exercises: the first with real world data (22 participants), the second with synthetic data (25 participants) and the last with real world data, which was also extended to Chemical Transport Models (38 receptor models and 4 CTMs). The references used for the model performance evaluation are 'true' (predefined by JRC) for the synthetic exercise, while they are calculated as the ensemble average of the participants' results in the real world intercomparisons. The candidates used for each source ensemble reference calculation were selected among participants' results based on a number of consistency checks plus the similarity of their chemical profiles to the measured repository data. The estimation of the ensemble reference uncertainty is crucial in order to evaluate the users' performance against it. For this reason, a sensitivity analysis on different methods to estimate the ensemble references' uncertainties was performed by re-analyzing the synthetic intercomparison dataset, the only one where 'true' reference and ensemble reference contributions were both present. DeltaSA is now available on-line and will be presented, with a critical discussion of the sensitivity analysis on the ensemble reference uncertainty. In particular, the degree of mutual agreement among participants on the presence of a certain source should be taken into account. The importance of synthetic intercomparisons for catching biases common to receptor models will also be stressed.

  19. Detecting anxiety in individuals with Parkinson disease: A systematic review.

    PubMed

    Mele, Bria; Holroyd-Leduc, Jayna; Smith, Eric E; Pringsheim, Tamara; Ismail, Zahinoor; Goodarzi, Zahra

    2018-01-02

    To examine the diagnostic accuracy of anxiety detection tools compared with a gold standard in outpatient settings among adults with Parkinson disease (PD). A systematic review was conducted. MEDLINE, EMBASE, PsycINFO, and the Cochrane Database of Systematic Reviews were searched up to April 7, 2017. Prevalence of anxiety and diagnostic accuracy measures, including sensitivity, specificity, and likelihood ratios, were gathered. Pooled prevalence of anxiety was calculated using Mantel-Haenszel-weighted DerSimonian and Laird models. A total of 6,300 citations were reviewed, with 6 full-text articles included for synthesis. Tools included within this study were the Beck Anxiety Inventory, Geriatric Anxiety Inventory (GAI), Hamilton Anxiety Rating Scale, Hospital Anxiety and Depression Scale-Anxiety, Parkinson's Anxiety Scale (PAS), and Mini-Social Phobia Inventory. Anxiety diagnoses made included generalized anxiety disorder, social phobia, and any anxiety type. Pooled prevalence of anxiety was 30.1% (95% confidence interval 26.1%-34.0%). The GAI had the best reported sensitivity of 0.86 and a specificity of 0.88. The observer-rated PAS had a sensitivity of 0.71 and the highest specificity of 0.91. While 6 tools have been validated for anxiety screening in PD populations, most are validated only in single studies. The GAI is brief and easy to use, with a good balance of sensitivity and specificity. The PAS was developed specifically for PD, is brief, and has self- and observer-rated scales, but with lower sensitivity. Health care practitioners involved in PD care need to be aware of the available validated tools and choose one that fits their practice. Copyright © 2017 American Academy of Neurology.
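
    For illustration only (the review's exact computation is not given in this abstract), a DerSimonian-Laird random-effects pooling of study prevalences can be sketched as below; the per-study event counts and the use of raw proportions (rather than a transformed scale) are assumptions.

        # Hedged sketch: random-effects (DerSimonian-Laird) pooled prevalence.
        import numpy as np

        def dl_pooled_prevalence(events, totals, z=1.96):
            events, totals = np.asarray(events, float), np.asarray(totals, float)
            p = events / totals
            var = p * (1 - p) / totals                 # within-study variance
            w = 1.0 / var                              # inverse-variance weights
            p_fixed = np.sum(w * p) / np.sum(w)
            q = np.sum(w * (p - p_fixed) ** 2)         # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(p) - 1)) / c)    # between-study variance
            w_re = 1.0 / (var + tau2)                  # random-effects weights
            p_re = np.sum(w_re * p) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return p_re, (p_re - z * se, p_re + z * se)  # estimate and 95% CI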

  20. Performance characteristics of five triage tools for major incidents involving traumatic injuries to children.

    PubMed

    Price, C L; Brace-McDonnell, S J; Stallard, N; Bleetman, A; Maconochie, I; Perkins, G D

    2016-05-01

    Context: Triage tools are an essential component of the emergency response to a major incident. Although fortunately rare, mass casualty incidents involving children are possible, which mandates reliable triage tools to determine the priority of treatment. The objective was to determine the performance characteristics of five major incident triage tools amongst paediatric casualties who have sustained traumatic injuries. This was a retrospective observational cohort study using data from 31,292 patients aged less than 16 years who sustained a traumatic injury; data were obtained from the UK Trauma Audit and Research Network (TARN) database. Interventions: Statistical evaluation of five triage tools (JumpSTART, START, CareFlight, Paediatric Triage Tape/Sieve and Triage Sort) to predict death or severe traumatic injury (injury severity score >15). Main outcome measures: Performance characteristics of the triage tools (sensitivity, specificity and level of agreement between triage tools) in identifying patients at high risk of death or severe injury. Of the 31,292 cases, 1029 died (3.3%), 6842 (21.9%) had major trauma (defined by an injury severity score >15) and 14,711 (47%) were aged 8 years or younger. There was variation in the accuracy of the tools in predicting major trauma or death (sensitivities ranging between 36.4% and 96.2%; specificities 66.0%-89.8%). Performance characteristics varied with the age of the child. CareFlight had the best overall performance at predicting death, with a sensitivity and specificity (95% CI) of 95.3% (93.8-96.8) and 80.4% (80.0-80.9), respectively. JumpSTART was superior for the triaging of children under 8 years, with a sensitivity and specificity (95% CI) of 86.3% (83.1-89.5) and 84.8% (84.2-85.5), respectively. The triage tools were generally better at identifying patients who would die than at identifying those with non-fatal severe injury. This statistical evaluation demonstrated variability in the accuracy of triage tools at predicting outcomes for children who sustain traumatic injuries; no single tool performed consistently well across all evaluated scenarios. Copyright © 2015 Elsevier Ltd. All rights reserved.
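
    The sketch below is a generic illustration of the outcome measures named above, not the study's analysis code: sensitivity and specificity with normal-approximation 95% confidence intervals for a tool's prediction of the composite outcome (death or injury severity score > 15); the boolean arrays are hypothetical stand-ins for the TARN records.

        # Hedged sketch: triage-tool sensitivity/specificity with 95% CIs.
        import numpy as np

        def tool_accuracy(predicted_priority, severe_outcome, z=1.96):
            pred = np.asarray(predicted_priority, bool)
            outc = np.asarray(severe_outcome, bool)
            tp, fn = np.sum(pred & outc), np.sum(~pred & outc)
            tn, fp = np.sum(~pred & ~outc), np.sum(pred & ~outc)

            def prop_ci(k, n):
                p = k / n
                half = z * np.sqrt(p * (1.0 - p) / n)   # normal approximation
                return p, (p - half, p + half)

            sensitivity, sens_ci = prop_ci(tp, tp + fn)
            specificity, spec_ci = prop_ci(tn, tn + fp)
            return sensitivity, sens_ci, specificity, spec_ci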
