Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
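The Monte Carlo-plus-bootstrap procedure the abstract describes can be sketched in a few lines. Everything below is illustrative: the sample sizes and cost/effect distributions are invented stand-ins, not the paper's H. pylori eradication model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented patient-level data for two strategies (stand-ins for the
# H. pylori model's inputs): per-patient cost and effectiveness.
n_pat = 200
cost_a = rng.normal(500.0, 80.0, n_pat)      # eradication strategy
cost_b = rng.normal(650.0, 90.0, n_pat)      # comparator
eff_a = rng.normal(0.80, 0.05, n_pat)        # e.g. cure-probability score
eff_b = rng.normal(0.70, 0.05, n_pat)

def icer(ca, cb, ea, eb):
    """Incremental cost-effectiveness ratio of A versus B."""
    return (ca.mean() - cb.mean()) / (ea.mean() - eb.mean())

# Bootstrap: resample each arm with replacement instead of assuming a
# theoretical distribution for every model parameter.
n_boot = 2000
icers = np.empty(n_boot)
for i in range(n_boot):
    ia = rng.integers(0, n_pat, n_pat)
    ib = rng.integers(0, n_pat, n_pat)
    icers[i] = icer(cost_a[ia], cost_b[ib], eff_a[ia], eff_b[ib])

lo, hi = np.percentile(icers, [2.5, 97.5])
print(f"bootstrap 95% interval for the ICER: ({lo:.0f}, {hi:.0f})")
```

The interval comes straight from the empirical bootstrap distribution, which is the point of the paper: no parametric distribution has to be assumed for the uncertain inputs.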
Probabilistic Sensitivity Analysis of Fretting Fatigue (Preprint)
2009-04-01
AFRL-RX-WP-TP-2009-4091. Probabilistic Sensitivity Analysis of Fretting Fatigue (Preprint). Patrick J. Golden, Air Force Research Laboratory, Wright-Patterson AFB, OH 45433; Harry R. Millwater.
Probabilistic Sensitivity Analysis with Respect to Bounds of Truncated Distributions (PREPRINT)
2010-04-01
AFRL-RX-WP-TP-2010-4147. Probabilistic Sensitivity Analysis with Respect to Bounds of Truncated Distributions (Preprint). H. Millwater and Y. Feng, Department of Mechanical ... (program element 62102F, in-house effort).
Briggs, Andrew H; Ades, A E; Price, Martin J
2003-01-01
In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes using probabilistic sensitivity analysis, and a method is required to provide probabilistic probabilities over multiple branches that appropriately represents uncertainty while satisfying the requirement that mutually exclusive event probabilities should sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
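A minimal sketch of the Dirichlet approach described above, with hypothetical transition counts. The flat Dirichlet(1, 1, 1) prior used here is one common choice for handling a zero count, not necessarily the authors' exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed transition counts from one Markov state to three successor
# states (hypothetical data; note the zero count for the third branch).
counts = np.array([45, 12, 0])

# Bayesian approach: a Dirichlet(1, 1, 1) prior plus the observed counts
# gives a Dirichlet posterior, so even the zero-count transition receives
# non-zero probability in the probabilistic sensitivity analysis.
posterior = counts + 1

draws = rng.dirichlet(posterior, size=5000)

# Every draw is a complete probability vector over the branches, so the
# mutually exclusive probabilities always sum to 1 by construction.
assert np.allclose(draws.sum(axis=1), 1.0)
print(draws.mean(axis=0))  # posterior mean = (46, 13, 1) / 60
```

This is exactly the property the abstract emphasizes: no renormalization step is needed during sensitivity analysis, because each sampled vector already sums to 1.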
Structural reliability methods: Code development status
NASA Astrophysics Data System (ADS)
Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.
1991-05-01
The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
Is probabilistic bias analysis approximately Bayesian?
MacLehose, Richard F.; Gustafson, Paul
2011-01-01
Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
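The iterative sampling that probabilistic bias analysis uses can be sketched as a simple back-calculation. All counts and bias-parameter ranges below are invented for illustration; the uniform priors stand in for whatever bias distributions an analyst would actually justify.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2x2 case-control table (classified exposed/unexposed).
a, b = 120, 80     # cases
c, d = 90, 110     # controls

# Draw sensitivity/specificity of exposure classification from assumed
# bias distributions, separately for cases and controls so the
# misclassification can be differential.
n = 5000
se_case = rng.uniform(0.75, 0.95, n); sp_case = rng.uniform(0.85, 1.00, n)
se_ctrl = rng.uniform(0.75, 0.95, n); sp_ctrl = rng.uniform(0.85, 1.00, n)

# Back-calculate "true" exposed counts: observed = A*Se + (N-A)*(1-Sp),
# so A = (observed - N*(1-Sp)) / (Se + Sp - 1).
A = (a - (a + b) * (1 - sp_case)) / (se_case + sp_case - 1)
C = (c - (c + d) * (1 - sp_ctrl)) / (se_ctrl + sp_ctrl - 1)
B, D = (a + b) - A, (c + d) - C

ok = (A > 0) & (B > 0) & (C > 0) & (D > 0)   # discard impossible draws
or_adj = (A[ok] * D[ok]) / (B[ok] * C[ok])
print(np.percentile(or_adj, [2.5, 50, 97.5]))
```

The resulting distribution of adjusted odds ratios is what the paper compares against a formal Bayesian posterior.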
Probabilistic Aeroelastic Analysis of Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.
2004-01-01
A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic-flow cascade, comparisons are also made among different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.
Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
The objective of this report is to summarize the deterministic and probabilistic structural evaluation of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures made with advanced materials. The probabilistic evaluation is performed using IPACS, the integrated probabilistic assessment of composite structures computer code. The effects of uncertainties in primitive variables related to the material, fabrication process, and loading on material properties and structural response are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to the detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) the failure load and the buckling load were predicted, (2) a coupled non-deterministic multidisciplinary structural analysis was performed, and (3) it was demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.
NASA Astrophysics Data System (ADS)
Králik, Juraj
2017-07-01
The paper presents a probabilistic and sensitivity analysis of the efficiency of the damping devices in a nuclear power plant cover under the impact of a dropped TK C30 nuclear-fuel container. A spatial finite element idealization of the nuclear power plant structure is used. A steel pipe damper system is proposed to dissipate the kinetic energy of the container's free fall. Experimental results on the behavior of the shock damper's basic element under impact loads are presented. The Newmark integration method is used to solve the dynamic equations. The sensitivity and probabilistic analysis of the damping devices was carried out in the AntHILL and ANSYS software.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.
Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony
2005-04-01
Recently the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into a broader context of the requirements for decision making; to explain the general approach that was taken in its development; and to address each of the issues which have been raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis which can best meet these needs can then be identified. It will be argued that the guidance on dealing with uncertainty and, in particular, the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need fully to characterise decision uncertainty and to inform the research agenda will be identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.
Probabilistic analysis of a materially nonlinear structure
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.
1990-01-01
A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
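A stripped-down stand-in for the cylinder example: only the elastic Lamé solution is used here (the paper's analysis includes plasticity, which this sketch omits), with an illustrative normally distributed internal pressure.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical thick-walled cylinder geometry (inner/outer radii, metres).
a, b = 0.10, 0.20
r = 0.15                                  # radius where radial stress is evaluated

# Internal pressure, normally distributed as in the paper (values invented).
p = rng.normal(60e6, 6e6, size=20000)     # Pa

# Elastic Lamé solution for radial stress (compressive, hence negative);
# the plasticity present in the paper's problem is not modeled here.
sigma_r = p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)

# Empirical CDF of the radial stress at a chosen level.
level = -20e6
cdf_at_level = np.mean(sigma_r <= level)
print(f"P(sigma_r <= {level:.0f} Pa) = {cdf_at_level:.4f}")
```

A brute-force Monte Carlo like this is the benchmark against which the AMV procedure's efficiency is usually judged.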
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
Probabilistic structural analysis using a general purpose finite element program
NASA Astrophysics Data System (ADS)
Riha, D. S.; Millwater, H. R.; Thacker, B. H.
1992-07-01
This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
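The coupling of design sensitivities with fast probability integration can be illustrated by a mean-value, first-order sketch on a closed-form cantilever rather than MSC/NASTRAN output; all dimensions, distributions, and the allowable are invented.

```python
import math

# Illustrative cantilever: tip displacement d = P L^3 / (3 E I), I = w t^3 / 12.
P, L, w = 100.0, 1.0, 0.05            # load (N), length (m), width (m)
mu_E, sd_E = 70e9, 5e9                # Young's modulus (Pa), assumed normal
mu_t, sd_t = 0.020, 0.001             # thickness (m), assumed normal
d_allow = 0.020                       # allowable tip displacement (m)

def disp(E, t):
    I = w * t**3 / 12.0
    return P * L**3 / (3.0 * E * I)

# Limit state g = d_allow - d, linearized at the mean point.
d0 = disp(mu_E, mu_t)
g0 = d_allow - d0

# "Design sensitivities": analytic partials of g (d ~ 1/E and d ~ 1/t^3).
dg_dE = d0 / mu_E                     # -dd/dE = +d/E
dg_dt = 3.0 * d0 / mu_t               # -dd/dt = +3d/t

# First-order reliability index and failure probability P(g < 0).
sd_g = math.hypot(dg_dE * sd_E, dg_dt * sd_t)
beta = g0 / sd_g
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))
print(f"beta = {beta:.2f}, Pf ~ {pf:.1e}")
```

In the paper the partials come from NASTRAN's design sensitivity analysis and the probability integration is more sophisticated than this one-step linearization, but the division of labor is the same.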
A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two part paper. This Part 2 paper discusses sensitivity and uncertainty analyses conducted to assess the key m...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Toll, J.; Cothern, K.
1995-12-31
The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations over the period 1977-1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. They considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically those parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed the authors to reduce the number of parameters modeled probabilistically from 16 to 5. This reduced the computational complexity of the Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
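The local sensitivity, range sensitivity, and rank-correlation steps can be sketched on a toy three-parameter model; the real PCHEPM model is far larger, and every function and number here is invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for a concentration model: which inputs dominate the output?
def model(k_part, k_volat, load):
    return load * k_part / (1.0 + k_volat)

nominal = dict(k_part=0.5, k_volat=0.2, load=100.0)
ranges = dict(k_part=(0.3, 0.7), k_volat=(0.05, 0.5), load=(80.0, 120.0))

# Local sensitivity: relative output change per 1% input perturbation.
base = model(**nominal)
local = {}
for name in nominal:
    bumped = dict(nominal); bumped[name] *= 1.01
    local[name] = (model(**bumped) - base) / (0.01 * base)

# Range sensitivity: output spread when one input sweeps its full range.
span = {}
for name, (low, high) in ranges.items():
    at_low = dict(nominal); at_low[name] = low
    at_high = dict(nominal); at_high[name] = high
    span[name] = abs(model(**at_high) - model(**at_low))

# Rank (Spearman) correlation from a joint Monte Carlo sample.
n = 2000
samples = {k: rng.uniform(*ranges[k], n) for k in ranges}
out = model(samples["k_part"], samples["k_volat"], samples["load"])

def spearman(x, y):
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

rank = {k: spearman(samples[k], out) for k in samples}
print(local, span, rank)
```

Scoring parameters across the three dictionaries, as the authors do, would flag `k_part` as the dominant uncertainty in this toy case.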
Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design
NASA Technical Reports Server (NTRS)
Kuguoglu, Latife; Ludwiczak, Damian
2006-01-01
The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) is evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel; the desire was to let the sensitivity analysis identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed. The panel's structural performance was assessed in the presence of uncertainties in the loading, fabrication process variables, and material properties. Stress and displacement contours from the deterministic structural analysis at the mean values were computed and the results are presented, followed by a probabilistic evaluation to determine the effect of the primitive variables on the panel's structural performance. Based on uncertainties in material properties, structural geometry, and loading, the results of the displacement and stress analysis are used as an input file for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as maximum displacement, maximum tensile and compressive stresses of the facesheet in the x and y directions, and maximum von Mises stresses of the tube, to the loading and design variables is determined under the boundary condition in which all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel are identified.
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
NASA Technical Reports Server (NTRS)
Price, J. M.; Ortega, R.
1998-01-01
Probabilistic methods are not universally accepted for the design and analysis of aerospace structures. The validity of this approach must be demonstrated to encourage its acceptance as a viable design and analysis tool for estimating structural reliability. The objective of this study is to develop a well-characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure-mode level.
Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.
2013-01-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559
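The comparison of multiple imputation with probabilistic sensitivity analysis can be sketched on synthetic cost data. The "MI" here is a deliberately simple single-parameter version, not a full proper-imputation procedure, and every distribution is invented.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical per-participant cost data with missing entries (np.nan).
costs = rng.gamma(shape=3.0, scale=200.0, size=40)
costs[rng.choice(40, size=8, replace=False)] = np.nan
obs = costs[~np.isnan(costs)]

# (a) Simple imputation: fill gaps with normal draws fitted to the
# observed values, average the completed-data means over m repetitions.
# (A normal can produce negative costs; a real analysis would address this.)
m = 50
mi_means = []
for _ in range(m):
    completed = costs.copy()
    completed[np.isnan(costs)] = rng.normal(obs.mean(), obs.std(ddof=1),
                                            size=np.isnan(costs).sum())
    mi_means.append(completed.mean())

# (b) Probabilistic sensitivity analysis: fit a gamma to the observed
# costs (method of moments) and simulate whole samples from it, so the
# mean cost itself carries a distribution.
shape_hat = obs.mean() ** 2 / obs.var(ddof=1)
scale_hat = obs.var(ddof=1) / obs.mean()
psa_means = np.array([
    rng.gamma(shape_hat, scale_hat, size=costs.size).mean()
    for _ in range(2000)
])
print(np.mean(mi_means), psa_means.mean(), psa_means.std())
```

The gamma choice in (b) respects the skewed, strictly positive nature of cost data, which is one reason the distributional approach can capture more of the variability than imputation from a symmetric fill-in distribution.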
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
Oil pipeline networks are among the most important facilities for energy transportation, but accidents on them can result in serious disasters. Analysis models for these accidents have been established mainly with three methods: event trees, accident simulation, and Bayesian networks. Among these, the Bayesian network is well suited to probabilistic analysis, but not all the important influencing factors have been considered, and a deployment rule for the factors has not been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model. Moreover, the paper introduces a deployment rule for these factors. The model can be used for probabilistic and sensitivity analysis of oil pipeline network accidents.
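A toy discrete Bayesian network in the spirit of the model, evaluated by brute-force enumeration; the node names and all conditional probabilities are invented, not taken from the paper.

```python
# Root-node priors (illustrative).
P_corr = {True: 0.10, False: 0.90}            # corrosion present
P_dig  = {True: 0.05, False: 0.95}            # third-party digging
P_resp = {True: 0.80, False: 0.20}            # effective emergency response

def p_leak(corr, dig):                        # CPT: P(leak | corrosion, digging)
    return {(True, True): 0.50, (True, False): 0.15,
            (False, True): 0.20, (False, False): 0.01}[(corr, dig)]

def p_disaster(leak, resp):                   # CPT: P(disaster | leak, response)
    if not leak:
        return 0.001
    return 0.05 if resp else 0.40

def prob_disaster(resp_fixed=None):
    """Marginal P(disaster), optionally intervening on emergency response."""
    total = 0.0
    for corr in (True, False):
        for dig in (True, False):
            for leak in (True, False):
                pl = p_leak(corr, dig) if leak else 1 - p_leak(corr, dig)
                for resp in (True, False):
                    if resp_fixed is not None and resp != resp_fixed:
                        continue
                    pr = 1.0 if resp_fixed is not None else P_resp[resp]
                    total += (P_corr[corr] * P_dig[dig] * pl * pr
                              * p_disaster(leak, resp))
    return total

# Sensitivity to the emergency-response factor: compare the two settings.
print(prob_disaster(resp_fixed=True), prob_disaster(resp_fixed=False))
```

Comparing the marginal disaster probability under each fixed setting of a factor is the simplest form of the sensitivity analysis the paper performs with its full network.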
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate stress concentration factors in composite laminates. The new approach couples probabilistic composite mechanics with probabilistic finite element structural analysis. Probabilistic composite mechanics is used to describe all the uncertainties inherent in composite material properties, while probabilistic finite element analysis is used to describe the uncertainties associated with methods to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in laminates made from three different composite systems. Simulated results match experimental data for probability density and cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
Application of a stochastic snowmelt model for probabilistic decisionmaking
NASA Technical Reports Server (NTRS)
Mccuen, R. H.
1983-01-01
A stochastic form of the snowmelt runoff model that can be used for probabilistic decision-making was developed. Using probabilistic streamflow predictions instead of single-valued deterministic predictions leads to more accurate decisions. While the accuracy of the output function is important in decision-making, it is also important to understand the relative importance of the coefficients. Therefore, a sensitivity analysis was made for each of the coefficients.
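A per-coefficient sensitivity analysis of this kind can be sketched with a toy degree-day melt relation; the model form, the coefficient values, and the finite-difference approach here are illustrative assumptions, not the paper's actual stochastic model.

```python
def melt(c_m, temp, t_base):
    """Toy degree-day snowmelt model (mm/day); zero below the base temperature."""
    return max(c_m * (temp - t_base), 0.0)

def relative_sensitivity(f, params, name, h=1e-6):
    """Dimensionless sensitivity (dM/dp)*(p/M) via a forward finite difference."""
    base = f(**params)
    bumped = dict(params, **{name: params[name] * (1 + h)})
    return (f(**bumped) - base) / (params[name] * h) * params[name] / base

params = {"c_m": 4.0, "temp": 6.0, "t_base": 1.0}
for name in params:
    print(name, round(relative_sensitivity(melt, params, name), 3))
```

Ranking coefficients by the magnitude of this dimensionless sensitivity identifies which ones dominate the output, which is exactly the question the abstract raises.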
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first modeled deterministically using FEA. Variations in selected geometric dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. The geometric variations include chord length and height for the blade, and inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program, in conjunction with modules from the probabilistic analysis program NESTEM, to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
Probabilistic boundary element method
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Raveendra, S. T.
1989-01-01
The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogous to other structural analysis codes such as finite elements in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only, thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.
NASA Technical Reports Server (NTRS)
McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.
2012-01-01
This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling based techniques. For contrast, some MPP based approaches are also examined.
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
NASA Astrophysics Data System (ADS)
Yu, Bo; Ning, Chao-lie; Li, Bing
2017-03-01
A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that, the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparison with the second-order reliability method and Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
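For intuition, a crude Monte Carlo stand-in for such a chloride-ingress reliability check can be sketched as below (FORM and the Nataf transformation are replaced by plain sampling of independent variables). The error-function solution of Fick's second law is standard, but every numerical value here is a placeholder assumption, not a parameter from the paper.

```python
import math
import random

def chloride(c_s, d_coef, cover, t):
    """Chloride concentration at depth `cover` after time t, from the
    error-function solution of Fick's second law:
    C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t))))."""
    return c_s * (1.0 - math.erf(cover / (2.0 * math.sqrt(d_coef * t))))

def failure_probability(n, t, c_crit=0.35, seed=1):
    """Monte Carlo estimate of P(chloride at rebar depth exceeds threshold)."""
    random.seed(seed)
    fails = 0
    for _ in range(n):
        c_s = random.gauss(0.6, 0.1)                  # surface chloride (% wt)
        d = max(random.gauss(1e-12, 2e-13), 1e-14)    # diffusion coeff (m^2/s)
        cover = max(random.gauss(0.06, 0.005), 1e-3)  # cover depth (m)
        if chloride(c_s, d, cover, t) > c_crit:
            fails += 1
    return fails / n

t_50yr = 50 * 365.25 * 24 * 3600.0
print(failure_probability(20000, t_50yr))
```

FORM replaces this brute-force sampling with a search for the most probable failure point in standard normal space, which is far cheaper when failure probabilities are small.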
Probabilistic Cues to Grammatical Category in English Orthography and Their Influence during Reading
ERIC Educational Resources Information Center
Arciuli, Joanne; Monaghan, Padraic
2009-01-01
We investigated probabilistic cues to grammatical category (noun vs. verb) in English orthography. These cues are located in both the beginnings and endings of words--as identified in our large-scale corpus analysis. Experiment 1 tested participants' sensitivity to beginning and ending cues while making speeded grammatical classifications.…
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of constituent materials (fiber and matrix) to predict macroscopic behavior are simulated using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load and environment dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000, to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
Probabilistic structural analysis of a truss typical for space station
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.
1990-01-01
A three-bay, space, cantilever truss is probabilistically evaluated using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) to identify and quantify the uncertainties, and respective sensitivities, associated with corresponding uncertainties in the primitive variables (structural, material, and loads parameters) that define the truss. The distribution of each of these primitive variables is described in terms of one of several available distributions, such as the Weibull, exponential, normal, and log-normal. The cumulative distribution functions (CDFs) for the response functions considered, and the sensitivities associated with the primitive variables for a given response, are investigated. These sensitivities help in determining the dominant primitive variables for that response.
Influences of geological parameters to probabilistic assessment of slope stability of embankment
NASA Astrophysics Data System (ADS)
Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr
2018-04-01
This article considers the influences of geological parameters on the slope stability of an embankment in probabilistic analysis using the SLOPE/W computational system. Stability of a simple slope is evaluated with and without pore-water pressure on the basis of variation of the soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. The Monte Carlo simulation technique is employed to perform the analysis of the critical slip surface. Sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the factor of safety of the slope.
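A minimal version of this Monte Carlo slope-stability workflow can be sketched with the closed-form infinite-slope factor of safety for a dry slope (SLOPE/W's slip-surface search is not reproduced); all soil parameter distributions below are hypothetical.

```python
import math
import random

def factor_of_safety(c, phi_deg, gamma, h=5.0, beta_deg=30.0):
    """Infinite-slope factor of safety, dry slope (no pore-water pressure):
    FS = (c + gamma*h*cos^2(beta)*tan(phi)) / (gamma*h*sin(beta)*cos(beta))."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + gamma * h * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * h * math.sin(beta) * math.cos(beta)
    return resisting / driving

def p_failure(n=20000, seed=42):
    """Monte Carlo probability of FS < 1 with normally distributed soil properties."""
    random.seed(seed)
    fails = sum(
        factor_of_safety(
            c=random.gauss(10.0, 3.0),        # cohesion, kPa
            phi_deg=random.gauss(25.0, 3.0),  # internal friction angle, deg
            gamma=random.gauss(18.0, 1.0),    # unit weight, kN/m^3
        ) < 1.0
        for _ in range(n)
    )
    return fails / n

print(p_failure())
```

Rerunning `p_failure` with one parameter's standard deviation inflated at a time gives a simple sensitivity ranking of the geological parameters.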
Dong, Hengjin; Buxton, Martin
2006-01-01
The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed in quality-adjusted life years (QALYs). The simulation was carried out initially for 120 monthly cycles, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. Then, a probabilistic sensitivity analysis was carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a cost-effective technology in the long term, although the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant because it was both cheaper and produced more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the CAS extra cost, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to the utilities of the other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
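The mechanics of this kind of Markov-model probabilistic sensitivity analysis can be sketched in miniature: a three-state monthly-cycle cohort trace, repeated over random draws of the revision risk and a hypothetical "effect of CAS" multiplier. All states, probabilities, costs, and utilities below are invented placeholders, not the study's nine-state model or its inputs.

```python
import random

def cohort(p_rev, p_death=0.001, cycles=120, u_well=0.85, u_post=0.70,
           c_cycle=50.0, c_rev=8000.0):
    """Monthly-cycle Markov trace over states Well / Post-revision / Dead."""
    well, post = 1.0, 0.0
    qalys = costs = 0.0
    for _ in range(cycles):
        qalys += (well * u_well + post * u_post) / 12.0  # utility per month
        costs += (well + post) * c_cycle + well * p_rev * c_rev
        well, post = (well * (1.0 - p_rev - p_death),
                      post * (1.0 - p_death) + well * p_rev)
    return qalys, costs

def psa_median_icer(n=2000, cas_extra_cost=1500.0, seed=7):
    """Probabilistic sensitivity analysis: sample inputs, collect ICERs."""
    random.seed(seed)
    icers = []
    for _ in range(n):
        p_rev = min(max(random.gauss(0.004, 0.001), 1e-4), 0.05)
        effect = random.uniform(0.5, 0.9)  # hypothetical CAS revision multiplier
        q0, c0 = cohort(p_rev)                    # conventional TKR
        q1, c1 = cohort(p_rev * effect)           # computer-assisted TKR
        icers.append((c1 + cas_extra_cost - c0) / (q1 - q0))
    icers.sort()
    return icers[len(icers) // 2]

print(psa_median_icer())
```

In a real analysis the full ICER distribution (not just its median) would be summarized, e.g. as a cost-effectiveness acceptability curve.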
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
Design for cyclic loading endurance of composites
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.
1993-01-01
The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing-type structures is described. The code performs a complete probabilistic analysis for composites, taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of the cumulative distribution function (CDF) and probability density function (PDF) of the fatigue life of a wing-type composite structure under different hygrothermal environments subjected to random pressure loading. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.
Andronis, L; Barton, P; Bryan, S
2009-06-01
To determine how we define good practice in sensitivity analysis in general and probabilistic sensitivity analysis (PSA) in particular, and to what extent it has been adhered to in the independent economic evaluations undertaken for the National Institute for Health and Clinical Excellence (NICE) over recent years; to establish what policy impact sensitivity analysis has in the context of NICE, and policy-makers' views on sensitivity analysis and uncertainty, and what use is made of sensitivity analysis in policy decision-making. Three major electronic databases, MEDLINE, EMBASE and the NHS Economic Evaluation Database, were searched from inception to February 2008. The meaning of 'good practice' in the broad area of sensitivity analysis was explored through a review of the literature. An audit was undertaken of the 15 most recent NICE multiple technology appraisal judgements and their related reports to assess how sensitivity analysis has been undertaken by independent academic teams for NICE. A review of the policy and guidance documents issued by NICE aimed to assess the policy impact of the sensitivity analysis and the PSA in particular. Qualitative interview data from NICE Technology Appraisal Committee members, collected as part of an earlier study, were also analysed to assess the value attached to the sensitivity analysis components of the economic analyses conducted for NICE. All forms of sensitivity analysis, notably both deterministic and probabilistic approaches, have their supporters and their detractors. Practice in relation to univariate sensitivity analysis is highly variable, with considerable lack of clarity in relation to the methods used and the basis of the ranges employed. In relation to PSA, there is a high level of variability in the form of distribution used for similar parameters, and the justification for such choices is rarely given. Virtually all analyses failed to consider correlations within the PSA, and this is an area of concern. 
Uncertainty is considered explicitly in the process of arriving at a decision by the NICE Technology Appraisal Committee, and a correlation between high levels of uncertainty and negative decisions was indicated. The findings suggest considerable value in deterministic sensitivity analysis. Such analyses serve to highlight which model parameters are critical to driving a decision. Strong support was expressed for PSA, principally because it provides an indication of the parameter uncertainty around the incremental cost-effectiveness ratio. The review and the policy impact assessment focused exclusively on documentary evidence, excluding other sources that might have revealed further insights on this issue. In seeking to address parameter uncertainty, both deterministic and probabilistic sensitivity analyses should be used. It is evident that some cost-effectiveness work, especially around the sensitivity analysis components, represents a challenge in making it accessible to those making decisions. This speaks to the training agenda for those sitting on such decision-making bodies, and to the importance of clear presentation of analyses by the academic community.
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on fuzzy neural network regression (termed DCFRM) is proposed, integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind it is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated with the proposed method, considering fluid-structure interaction. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while maintaining acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby enriches the theory and methods of mechanical reliability design.
Probabilistic Analysis of Gas Turbine Field Performance
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2002-01-01
A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
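A toy version of this kind of probabilistic cycle analysis can be sketched with a crude Brayton-cycle efficiency and Pearson correlations as stand-ins for the sensitivity factors; the efficiency model, the input distributions, and the use of correlation as the sensitivity measure are all simplifying assumptions, not the paper's method.

```python
import math
import random

def thermal_efficiency(pressure_ratio, gamma=1.4, eta_comp=0.85, eta_turb=0.90):
    """Ideal Brayton efficiency, crudely degraded by component efficiencies."""
    ideal = 1.0 - pressure_ratio ** ((1.0 - gamma) / gamma)
    return ideal * eta_comp * eta_turb

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mc_sensitivity(n=5000, seed=3):
    """Rank uncertain inputs by their correlation with the sampled efficiency."""
    random.seed(seed)
    draws = {"pressure_ratio": [], "eta_comp": [], "eta_turb": []}
    effs = []
    for _ in range(n):
        r = random.gauss(10.0, 0.5)
        c = random.gauss(0.85, 0.02)
        t = random.gauss(0.90, 0.02)
        draws["pressure_ratio"].append(r)
        draws["eta_comp"].append(c)
        draws["eta_turb"].append(t)
        effs.append(thermal_efficiency(r, eta_comp=c, eta_turb=t))
    return {k: pearson(v, effs) for k, v in draws.items()}

print(mc_sensitivity())
```

The sorted list of sampled efficiencies also yields the empirical CDF of thermal efficiency that the abstract describes.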
Reliability, Risk and Cost Trade-Offs for Composite Designs
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1996-01-01
Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry and loading conditions. The probability density function of the first buckling load for a set of uncertain variables is computed. The probabilistic sensitivity factors of the uncertain variables with respect to the first buckling load are calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to the requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of the probabilistic sensitivity factors. With optimum design parameters such as the mean and coefficient of variation (representing the range of scatter) of the uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost due to failure occurrence are computationally determined. The results for the fuselage panel analysis show that the higher the cost due to failure occurrence, the smaller the optimum coefficient of variation of the fiber modulus (design parameter) in the longitudinal direction.
Nelson, S D; Nelson, R E; Cannon, G W; Lawrence, P; Battistone, M J; Grotzke, M; Rosenblum, Y; LaFleur, J
2014-12-01
This is a cost-effectiveness analysis of training rural providers to identify and treat osteoporosis. Results showed a slight cost savings, an increase in life years, an increase in treatment rates, and a decrease in fracture incidence. However, the results were sensitive to small differences in effectiveness, being cost-effective in 70 % of simulations during probabilistic sensitivity analysis. We evaluated the cost-effectiveness of training rural providers to identify and treat veterans at risk for fragility fractures relative to referring these patients to an urban medical center for specialist care. The model evaluated the impact of training on patient life years, quality-adjusted life years (QALYs), treatment rates, fracture incidence, and costs from the perspective of the Department of Veterans Affairs. We constructed a Markov microsimulation model to compare costs and outcomes of a hypothetical cohort of veterans seen by rural providers. Parameter estimates were derived from previously published studies, and we conducted one-way and probabilistic sensitivity analyses on the parameter inputs. Base-case analysis showed that training resulted in no additional costs and an extra 0.083 life years (0.054 QALYs). Our model projected that as a result of training, more patients with osteoporosis would receive treatment (81.3 vs. 12.2 %), and all patients would have a lower incidence of fractures per 1,000 patient years (hip, 1.628 vs. 1.913; clinical vertebral, 0.566 vs. 1.037) when seen by a trained provider compared to an untrained provider. Results remained consistent in one-way sensitivity analysis, and in probabilistic sensitivity analysis training rural providers was cost-effective (less than $50,000/QALY) in 70 % of the simulations. Training rural providers to identify and treat veterans at risk for fragility fractures has the potential to be cost-effective, but the results are sensitive to small differences in effectiveness.
It appears that provider education alone is not enough to make a significant difference in fragility fracture rates among veterans.
Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...
2015-07-01
In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.
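A minimal stand-in for this style of Bayesian calibration is sketched below: a random-walk Metropolis sampler fits one parameter of a linear toy model (in place of DALEC), with the statistical model-error term appearing as a sampled noise scale sigma. All data and settings are synthetic.

```python
import math
import random

def log_posterior(theta, sigma, xs, ys):
    """Gaussian likelihood with model-error scale sigma; flat priors on positives."""
    if theta <= 0 or sigma <= 0:
        return -math.inf
    ll = 0.0
    for x, y in zip(xs, ys):
        pred = theta * x  # stand-in "process model"
        ll += -0.5 * ((y - pred) / sigma) ** 2 - math.log(sigma)
    return ll

def metropolis(xs, ys, n=5000, seed=0):
    """Random-walk Metropolis over (theta, sigma); returns the theta chain."""
    random.seed(seed)
    theta, sigma = 1.0, 1.0
    lp = log_posterior(theta, sigma, xs, ys)
    chain = []
    for _ in range(n):
        t_new = theta + random.gauss(0, 0.05)
        s_new = sigma + random.gauss(0, 0.05)
        lp_new = log_posterior(t_new, s_new, xs, ys)
        if math.log(random.random()) < lp_new - lp:  # accept/reject step
            theta, sigma, lp = t_new, s_new, lp_new
        chain.append(theta)
    return chain

# Synthetic observations generated from a "true" theta of 2.0 plus noise
random.seed(1)
xs = [i / 10 for i in range(1, 21)]
ys = [2.0 * x + random.gauss(0, 0.2) for x in xs]
chain = metropolis(xs, ys)
est = sum(chain[2000:]) / len(chain[2000:])  # posterior mean after burn-in
print(est)
```

A posterior predictive check then draws parameters from the post-burn-in chain, simulates new data, and compares the simulations against the held-out observations.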
Simulation of probabilistic wind loads and building analysis
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Chamis, Christos C.
1991-01-01
Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties in wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.
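The idea of simulating a wind-pressure CDF can be illustrated with plain sampling (the MFIE model and the CLS/NESSUS codes are not reproduced): wind speed is drawn from an assumed Weibull distribution and converted to a quasi-static face pressure. The Weibull parameters and pressure coefficient below are hypothetical.

```python
import bisect
import math
import random

def wind_pressure(v, rho=1.225, cp=0.8):
    """Quasi-static face pressure q = 0.5 * rho * Cp * V^2 (Pa)."""
    return 0.5 * rho * cp * v * v

def pressure_cdf(n=20000, shape=2.0, scale=8.0, seed=11):
    """Empirical CDF of wall pressure for Weibull-distributed wind speed."""
    random.seed(seed)
    samples = sorted(wind_pressure(random.weibullvariate(scale, shape))
                     for _ in range(n))
    return lambda q: bisect.bisect_right(samples, q) / n

cdf = pressure_cdf()
# P(V <= scale) for a Weibull is 1 - e^-1; pressure is monotone in V,
# so the empirical CDF at the corresponding pressure should be close to that.
print(cdf(wind_pressure(8.0)))
```

In the full analysis this sampled pressure load would be propagated through a stochastic finite element model to obtain response CDFs, rather than read off directly.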
Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.
2003-01-01
Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. 
The six parameters are space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow with Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density function of predicted aerodynamic damping and frequency for flutter and the response amplitudes for forced response.
B-value and slip rate sensitivity analysis for PGA value in Lembang fault and Cimandiri fault area
NASA Astrophysics Data System (ADS)
Pratama, Cecep; Ito, Takeo; Meilano, Irwan; Nugraha, Andri Dian
2017-07-01
We examine the slip rate and b-value contributions to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, i.e., a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi and Bandung using PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimate is the crustal fault. A Monte Carlo approach has been developed to assess the sensitivity. The uncertainty and coefficient of variation from the slip rate and b-value in the Lembang and Cimandiri Fault areas have been calculated. We observe that seismic hazard estimates are sensitive to fault slip rate and b-value, with uncertainties of about 0.25 g and 0.1-0.2 g, respectively. For specific sites, we found seismic hazard estimates of 0.49 ± 0.13 g (COV 27%) for Sukabumi and 0.39 ± 0.05 g (COV 13%) for Bandung.
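The propagation of slip-rate uncertainty into hazard uncertainty can be sketched with a heavily simplified Poisson recurrence model: the fault's event rate is taken proportional to slip rate, and the 50-year exceedance probability follows from the Poisson assumption. Every number below (mean slip rate, slip per event, conditional exceedance probability) is a hypothetical placeholder, not a value from the study.

```python
import math
import random
import statistics

def annual_rate(slip_rate_mm, slip_per_event_m=1.0):
    """Characteristic-event annual rate implied by the fault slip rate."""
    return (slip_rate_mm / 1000.0) / slip_per_event_m

def exceedance_prob(rate, years=50.0, p_exceed_given_event=0.3):
    """P(at least one PGA exceedance in `years`) under a Poisson model."""
    return 1.0 - math.exp(-rate * p_exceed_given_event * years)

def slip_rate_cov(n=10000, mean=6.0, sd=1.5, seed=5):
    """Coefficient of variation of the 50-year hazard due to slip-rate scatter."""
    random.seed(seed)
    probs = [exceedance_prob(annual_rate(max(random.gauss(mean, sd), 0.01)))
             for _ in range(n)]
    return statistics.pstdev(probs) / statistics.fmean(probs)

print(slip_rate_cov())
```

Because the exceedance probability is nearly linear in the rate at these small values, the hazard COV here tracks the slip-rate COV closely.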
Monte Carlo simulation for slip rate sensitivity analysis in Cimandiri fault area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratama, Cecep, E-mail: great.pratama@gmail.com; Meilano, Irwan; Nugraha, Andri Dian
Slip rate is used to estimate the earthquake recurrence relationship, which has the greatest influence on the hazard level. We examine the slip rate contribution to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, i.e., a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi using PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimate is the crustal fault. A Monte Carlo approach has been developed to assess the sensitivity, and the properties of the Monte Carlo simulations have been assessed. The uncertainty and coefficient of variation from the slip rate for the Cimandiri Fault area have been calculated. We observe that the seismic hazard estimate is sensitive to fault slip rate, with a seismic hazard uncertainty of about 0.25 g. For specific sites, we found the seismic hazard estimate for Sukabumi to be between 0.4904 and 0.8465 g, with uncertainty between 0.0847 and 0.2389 g and COV between 17.7% and 29.8%.
Probabilistic analysis of bladed turbine disks and the effect of mistuning
NASA Technical Reports Server (NTRS)
Shah, A. R.; Nagpal, V. K.; Chamis, Christos C.
1990-01-01
Probabilistic assessment of the maximum blade response on a mistuned rotor disk is performed using the computer code NESSUS. The uncertainties in natural frequency, excitation frequency, amplitude of excitation and damping are included to obtain the cumulative distribution function (CDF) of blade responses. Advanced mean value first order analysis is used to compute CDF. The sensitivities of different random variables are identified. Effect of the number of blades on a rotor on mistuning is evaluated. It is shown that the uncertainties associated with the forcing function parameters have significant effect on the response distribution of the bladed rotor.
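A toy Monte Carlo version of this kind of analysis, using a single-degree-of-freedom magnification factor in place of the NESSUS mistuned-rotor model; all distributions below are assumptions for illustration, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Uncertain inputs (illustrative values): frequency ratio r,
# damping ratio zeta, and forcing amplitude F.
r = rng.normal(0.95, 0.02, n)
zeta = rng.normal(0.01, 0.002, n).clip(1e-4)
F = rng.normal(1.0, 0.1, n)

# Single-degree-of-freedom magnification factor as a stand-in for blade response.
resp = F / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

# Empirical CDF of the response at one level, plus crude linear-correlation
# sensitivity factors for each random variable.
p_exceed = (resp > 30.0).mean()
sens = {name: abs(np.corrcoef(x, resp)[0, 1])
        for name, x in (("r", r), ("zeta", zeta), ("F", F))}
```

The dictionary of correlations plays the role of the sensitivity ranking reported in the abstract; NESSUS instead derives sensitivities from the advanced mean value method.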
Hofer, Florian; Achelrod, Dmitrij; Stargardt, Tom
2016-12-01
Chronic obstructive pulmonary disease (COPD) poses major challenges for health care systems. Previous studies suggest that telemonitoring could be effective in preventing hospitalisations and hence reduce costs. The aim was to evaluate whether telemonitoring interventions for COPD are cost-effective from the perspective of German statutory sickness funds. A cost-utility analysis was conducted using a combination of a Markov model and a decision tree. Telemonitoring as an add-on to standard treatment was compared with standard treatment alone. The model consisted of four transition stages to account for COPD severity and a terminal stage for death. Within each cycle, the frequency of exacerbations as well as costs (in 2015 euros) and quality-adjusted life years (QALYs) for each stage were calculated. Values for input parameters were taken from the literature. Deterministic and probabilistic sensitivity analyses were conducted. In the base case, telemonitoring led to an increase in incremental costs (€866 per patient) but also in incremental QALYs (0.05 per patient). The incremental cost-effectiveness ratio (ICER) was thus €17,410 per QALY gained. A deterministic sensitivity analysis showed that the hospitalisation rate and the costs of telemonitoring equipment greatly affected results. The probabilistic ICER averaged €34,432 per QALY (95% confidence interval €12,161–56,703). We provide evidence that telemonitoring may be cost-effective in Germany from a payer's point of view. This holds even after deterministic and probabilistic sensitivity analyses.
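The base-case ICER and a minimal probabilistic sensitivity analysis can be reproduced as follows. The distributions placed on the increments are illustrative assumptions, not the study's; note that the published increments are rounded, so €866/0.05 gives €17,320 rather than the reported €17,410:

```python
import numpy as np

rng = np.random.default_rng(1)

# Base-case increments from the abstract (telemonitoring vs standard care).
inc_cost, inc_qaly = 866.0, 0.05
icer = inc_cost / inc_qaly   # ratio of rounded increments, ~= €17,320/QALY

# A minimal probabilistic sensitivity analysis: put assumed distributions on
# the increments and summarise the resulting ICER distribution.
costs = rng.gamma(shape=16.0, scale=inc_cost / 16.0, size=10_000)  # mean €866
qalys = rng.normal(inc_qaly, 0.01, size=10_000).clip(1e-3)
psa_icers = costs / qalys
lo, hi = np.percentile(psa_icers, [2.5, 97.5])
```

In the study itself, each PSA draw reruns the whole Markov model rather than perturbing the increments directly, which is why its probabilistic mean ICER (€34,432) differs from the base case.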
Role of ionotropic glutamate receptors in delay and probability discounting in the rat.
Yates, Justin R; Batten, Seth R; Bardo, Michael T; Beckmann, Joshua S
2015-04-01
Discounting of delayed and probabilistic reinforcement is linked to increased drug use and pathological gambling. Understanding the neurobiology of discounting is important for designing treatments for these disorders. Glutamate is considered to be involved in addiction-like behaviors; however, the role of ionotropic glutamate receptors (iGluRs) in discounting remains unclear. The current study examined the effects of N-methyl-D-aspartate (NMDA) and α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) glutamate receptor blockade on performance in delay and probability discounting tasks. Following training in either delay or probability discounting, rats (n = 12, each task) received pretreatments of the NMDA receptor antagonists MK-801 (0, 0.01, 0.03, 0.1, or 0.3 mg/kg, s.c.) or ketamine (0, 1.0, 5.0, or 10.0 mg/kg, i.p.), as well as the AMPA receptor antagonist CNQX (0, 1.0, 3.0, or 5.6 mg/kg, i.p.). Hyperbolic discounting functions were used to estimate sensitivity to delayed/probabilistic reinforcement and sensitivity to reinforcer amount. An intermediate dose of MK-801 (0.03 mg/kg) decreased sensitivity to both delayed and probabilistic reinforcement. In contrast, ketamine did not affect the rate of discounting in either task but decreased sensitivity to reinforcer amount. CNQX did not alter sensitivity to reinforcer amount or delayed/probabilistic reinforcement. These results show that blockade of NMDA receptors, but not AMPA receptors, decreases sensitivity to delayed/probabilistic reinforcement (MK-801) and sensitivity to reinforcer amount (ketamine). The differential effects of MK-801 and ketamine demonstrate that sensitivities to delayed/probabilistic reinforcement and reinforcer amount are pharmacologically dissociable.
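Hyperbolic discounting fits like those used in this study estimate V = A/(1 + kD), where D is the delay (or odds-against for probability discounting) and k indexes sensitivity to delayed/probabilistic reinforcement. A minimal sketch with hypothetical indifference points generated from k = 0.05, using a grid search in place of nonlinear least squares:

```python
import numpy as np

# Hyperbolic discounting function: V = A / (1 + k*D).
def hyperbolic(D, A, k):
    return A / (1 + k * D)

# Hypothetical indifference points for a 100-unit reward (not the rats' data).
delays = np.array([0, 5, 10, 20, 40, 60.0])
values = np.array([100, 80, 66, 50, 33, 25.0])

# Grid-search fit of k with A fixed at 100.
ks = np.linspace(0.001, 1.0, 2000)
sse = [((values - hyperbolic(delays, 100, k)) ** 2).sum() for k in ks]
k_hat = ks[int(np.argmin(sse))]
```

A larger fitted k means steeper discounting, i.e. greater sensitivity to delay or uncertainty, which is the quantity compared across drug doses in the abstract.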
A probabilistic maintenance model for diesel engines
NASA Astrophysics Data System (ADS)
Pathirana, Shan; Abeygunawardane, Saranga Kumudu
2018-02-01
In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from the inspection and maintenance histories of diesel engines and from experts' views. Reliability indices and costs were calculated for the present maintenance policy of diesel engines. A sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life-cycle cost of diesel engines.

Dynamic Probabilistic Instability of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2009-01-01
A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis; the solution method is an incrementally updated Lagrangian scheme. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the loading rate, the higher the buckling load and the shorter the time to buckling; the lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.
Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2003-01-01
The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.
Probabilistic Analysis of Large-Scale Composite Structures Using the IPACS Code
NASA Technical Reports Server (NTRS)
Lemonds, Jeffrey; Kumar, Virendra
1995-01-01
An investigation was performed to ascertain the feasibility of using IPACS (Integrated Probabilistic Assessment of Composite Structures) for probabilistic analysis of a composite fan blade, the development of which is being pursued by various industries for the next generation of aircraft engines. A model representative of the class of fan blades used in the GE90 engine was chosen as the structural component to be analyzed with IPACS. In this study, typical uncertainties are assumed at the ply level, and structural responses for ply stresses and frequencies are evaluated in the form of cumulative distribution functions. Because of the geometric complexity of the blade, the number of plies varies from several hundred at the root to about a hundred at the tip. This represents an extremely complex composites application for the IPACS code. A sensitivity study with respect to the various random variables is also performed.
Aldridge, Robert W; Shaji, Kunju; Hayward, Andrew C; Abubakar, Ibrahim
2015-01-01
The Enhanced Matching System (EMS) is a probabilistic record linkage program developed by the tuberculosis section at Public Health England to match data for individuals across two datasets. This paper outlines how EMS works and investigates its accuracy for linkage across public health datasets. EMS is a configurable Microsoft SQL Server database program. To examine the accuracy of EMS, two public health databases were matched using National Health Service (NHS) numbers as a gold standard unique identifier. Probabilistic linkage was then performed on the same two datasets without inclusion of NHS number. Sensitivity analyses were carried out to examine the effect of varying matching process parameters. Exact matching using NHS number between two datasets (containing 5931 and 1759 records) identified 1071 matched pairs. EMS probabilistic linkage identified 1068 record pairs. The sensitivity of probabilistic linkage was calculated as 99.5% (95%CI: 98.9, 99.8), specificity 100.0% (95%CI: 99.9, 100.0), positive predictive value 99.8% (95%CI: 99.3, 100.0), and negative predictive value 99.9% (95%CI: 99.8, 100.0). Probabilistic matching was most accurate when including address variables and using the automatically generated threshold for determining links with manual review. With the establishment of national electronic datasets across health and social care, EMS enables previously unanswerable research questions to be tackled with confidence in the accuracy of the linkage process. In scenarios where a small sample is being matched into a very large database (such as national records of hospital attendance) then, compared to results presented in this analysis, the positive predictive value or sensitivity may drop according to the prevalence of matches between databases. 
Despite this possible limitation, probabilistic linkage has great potential to be used where exact matching using a common identifier is not possible, including in low-income settings, and for vulnerable groups such as homeless populations, where the absence of unique identifiers and lower data quality has historically hindered the ability to identify individuals across datasets.
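The accuracy measures reported for EMS follow from standard confusion-matrix formulas. The cell counts below are reconstructed from the abstract's figures (1071 gold-standard NHS-number pairs, 1068 probabilistically linked pairs), not taken from the paper itself:

```python
# Reconstructed record-linkage confusion matrix (illustrative split of the
# 1068 linked pairs into true/false positives consistent with the abstract).
tp, fp = 1066, 2                       # linked pairs that are true / false matches
fn = 1071 - tp                         # gold-standard pairs the linkage missed
tn = 5931 * 1759 - tp - fp - fn        # all remaining cross-dataset pairs

sensitivity = tp / (tp + fn)           # ~99.5%, as reported
specificity = tn / (tn + fp)           # ~100.0%
ppv = tp / (tp + fp)                   # ~99.8%
npv = tn / (tn + fn)                   # ~99.9%
```

The enormous true-negative count (every non-matching cross-pair) is why specificity and NPV are effectively 100% even with a handful of errors, and why PPV is the metric that degrades when a small sample is matched into a very large database.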
Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M
2017-06-01
This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) were compared, as were differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied to each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to sensitivity in revealing the DRTT, detection of additional fiber tracts, and processing time. Two sets of regions of interest (ROIs) guided deterministic tractography of the DRTT and the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking, but took substantially longer to process. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent, but consistently more posterior, to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although the sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.
Wilhelm, C J; Mitchell, S H
2008-10-01
Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.
Bucci, Monica; Mandelli, Maria Luisa; Berman, Jeffrey I.; Amirbekian, Bagrat; Nguyen, Christopher; Berger, Mitchel S.; Henry, Roland G.
2013-01-01
Introduction Diffusion MRI tractography has been increasingly used to delineate white matter pathways in vivo for which the leading clinical application is presurgical mapping of eloquent regions. However, there is rare opportunity to quantify the accuracy or sensitivity of these approaches to delineate white matter fiber pathways in vivo due to the lack of a gold standard. Intraoperative electrical stimulation (IES) provides a gold standard for the location and existence of functional motor pathways that can be used to determine the accuracy and sensitivity of fiber tracking algorithms. In this study we used intraoperative stimulation from brain tumor patients as a gold standard to estimate the sensitivity and accuracy of diffusion tensor MRI (DTI) and q-ball models of diffusion with deterministic and probabilistic fiber tracking algorithms for delineation of motor pathways. Methods We used preoperative high angular resolution diffusion MRI (HARDI) data (55 directions, b = 2000 s/mm2) acquired in a clinically feasible time frame from 12 patients who underwent a craniotomy for resection of a cerebral glioma. The corticospinal fiber tracts were delineated with DTI and q-ball models using deterministic and probabilistic algorithms. We used cortical and white matter IES sites as a gold standard for the presence and location of functional motor pathways. Sensitivity was defined as the true positive rate of delineating fiber pathways based on cortical IES stimulation sites. For accuracy and precision of the course of the fiber tracts, we measured the distance between the subcortical stimulation sites and the tractography result. Positive predictive rate of the delineated tracts was assessed by comparison of subcortical IES motor function (upper extremity, lower extremity, face) with the connection of the tractography pathway in the motor cortex. Results We obtained 21 cortical and 8 subcortical IES sites from intraoperative mapping of motor pathways. 
Probabilistic q-ball had the best sensitivity (79%), as determined from cortical IES, compared to deterministic q-ball (50%), probabilistic DTI (36%), and deterministic DTI (10%). The sensitivity using the q-ball algorithm (65%) was significantly higher than using DTI (23%) (p < 0.001), and the probabilistic algorithms (58%) were more sensitive than deterministic approaches (30%) (p = 0.003). Probabilistic q-ball fiber tracks had the smallest offset to the subcortical stimulation sites. The offsets between diffusion fiber tracks and subcortical IES sites increased significantly in those cases where the diffusion fiber tracks were visibly thinner than expected. There was perfect concordance between the subcortical IES function (e.g. hand stimulation) and the cortical connection of the nearest diffusion fiber track (e.g. upper extremity cortex). Discussion This study highlights the tremendous utility of intraoperative stimulation sites as a gold standard from which to evaluate diffusion MRI fiber tracking methods and has provided an objective standard for evaluating different diffusion models and approaches to fiber tracking. Probabilistic q-ball fiber tractography was significantly better than DTI methods in terms of sensitivity and accuracy of the course through the white matter. The commonly used DTI fiber tracking approach was shown to have very poor sensitivity (as low as 10% for deterministic DTI fiber tracking) for delineation of the lateral aspects of the corticospinal tract in our study. Effects of the tumor/edema resulted in significantly larger offsets between the subcortical IES and the preoperative fiber tracks. The data show that probabilistic HARDI tractography is the most objective and reproducible analysis, but given the small sample and number of stimulation points, generalizations from our results should be made with caution.
Indeed our results inform the capabilities of preoperative diffusion fiber tracking and indicate that such data should be used carefully when making pre-surgical and intra-operative management decisions. PMID:24273719
Sensitivity Analysis for Probabilistic Neural Network Structure Reduction.
Kowalski, Piotr A; Kusy, Maciej
2018-05-01
In this paper, we propose the use of local sensitivity analysis (LSA) for the structure simplification of the probabilistic neural network (PNN). Three algorithms are introduced. The first applies LSA to reduce the PNN input layer by selecting significant features of the input patterns. The second utilizes LSA to remove redundant pattern neurons of the network. The third combines the first two and shows how they can work together. A PNN with a product kernel estimator is used, where each multiplicand computes a one-dimensional Cauchy function; the smoothing parameter is therefore calculated separately for each dimension by means of the plug-in method. The classification qualities of the reduced and full-structure PNNs are compared. Furthermore, we evaluate the performance of PNNs to which global sensitivity analysis (GSA) and common reduction methods are applied, both in the input layer and in the pattern layer. The models are tested on the classification problems of eight repository data sets. A 10-fold cross-validation procedure is used to determine the prediction ability of the networks. Based on the obtained results, it is shown that LSA can be used as an alternative PNN reduction approach.
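A minimal PNN with a product Cauchy kernel, as described in this abstract, might look like the sketch below. The smoothing parameters are fixed by hand here rather than computed with the plug-in method, and the toy data are invented:

```python
import numpy as np

def cauchy_product_kernel(x, center, h):
    # Product kernel: one 1-D Cauchy factor per input dimension,
    # each with its own smoothing parameter h[d].
    z = (x - center) / h
    return np.prod(1.0 / (np.pi * h * (1.0 + z**2)))

def pnn_classify(x, patterns, labels, h):
    # Pattern layer: one kernel per training pattern; summation layer:
    # average activation per class; output layer: argmax.
    classes = np.unique(labels)
    scores = [np.mean([cauchy_product_kernel(x, p, h)
                       for p in patterns[labels == c]]) for c in classes]
    return classes[int(np.argmax(scores))]

# Tiny two-class toy problem (illustrative, not a repository data set).
patterns = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
labels = np.array([0, 0, 1, 1])
h = np.array([0.3, 0.3])
```

The LSA-based reduction in the paper would then score each input feature (and each pattern neuron) by the network's local sensitivity to it and drop the insignificant ones.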
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)
2001-01-01
This project included both research and education objectives. The goal was to advance innovative research and education in theoretical and computational probabilistic structural analysis, reliability, and life prediction, for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI), along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor, and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed using the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of the uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model had previously been calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness.
The reliability did not change significantly for a change in distribution type except for a change in distribution from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate research and education components of this project resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into the existing deterministic finite element analysis (FEA) of fabric-based engine containment systems through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research is to implement probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions; however, even when a deterministic optimization yields a cost-effective structure, the design can be highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints.
This part of the research starts with an introduction to reliability analysis, including first-order and second-order reliability analysis, followed by simulation techniques used to obtain the probability of failure and the reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation incorporating sensitivity analysis, which is used to remove the highly reliable constraints from the RBDO, thereby reducing the computational time and the number of function evaluations. Finally, the reliability analysis concepts and RBDO are applied to finite element 2D truss problems and a planar beam problem, and the results are presented and discussed.
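The reliability-analysis building block referred to above (probability of failure and reliability index) can be sketched for the simplest limit state g = R - S, failure occurring when g < 0. All distributions and values below are illustrative assumptions:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)

# Monte Carlo estimate of the probability of failure for g = R - S.
R = rng.normal(10.0, 1.0, 100_000)   # resistance (illustrative)
S = rng.normal(6.0, 1.5, 100_000)    # load (illustrative)
pf = float((R - S < 0.0).mean())

# First-order check: for normally distributed g, the reliability index is
# beta = mean(g)/std(g) = 4 / sqrt(1^2 + 1.5^2), and pf = Phi(-beta).
beta = 4.0 / np.hypot(1.0, 1.5)
pf_form = NormalDist().cdf(-beta)
```

In RBDO, constraints whose beta is well above the target (highly reliable constraints) are the ones the proposed decoupled procedure screens out to save function evaluations.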
Dynamic Stability of Uncertain Laminated Beams Under Subtangential Loads
NASA Technical Reports Server (NTRS)
Goyal, Vijay K.; Kapania, Rakesh K.; Adelman, Howard (Technical Monitor); Horta, Lucas (Technical Monitor)
2002-01-01
Because of the inherent complexity of fiber-reinforced laminated composites, it can be challenging to manufacture composite structures according to their exact design specifications, resulting in unwanted material and geometric uncertainties. In this research, we focus on the deterministic and probabilistic stability analysis of laminated structures subject to subtangential loading, a combination of conservative and nonconservative tangential loads, using the dynamic criterion. A shear-deformable laminated beam element, including warping effects, is derived to study the deterministic and probabilistic response of laminated beams. This twenty-one-degree-of-freedom element can be used for solving both static and dynamic problems. In the first-order shear-deformable model used here, we have employed a more accurate method to obtain the transverse shear correction factor. The dynamic version of the principle of virtual work for laminated composites is expressed in its nondimensional form, and the element tangent stiffness and mass matrices are obtained using analytical integration. The stability is studied by giving the structure a small disturbance about an equilibrium configuration and observing whether the resulting response remains small. To study the dynamic behavior with uncertainties included in the problem, three models were developed: exact Monte Carlo simulation, sensitivity-based Monte Carlo simulation, and probabilistic FEA. These methods were integrated into the developed finite element analysis. Perturbation and sensitivity analysis have also been used to study nonconservative problems, as well as the stability analysis using the dynamic criterion.
Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem
Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; ...
2015-01-01
In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.
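The nested aleatory-epistemic sampling proposed here can be sketched with a toy model: an outer loop over epistemic realisations, each of which propagates a full inner sample of the aleatory variables. The model function and parameter ranges below are assumptions, not the challenge-problem model:

```python
import numpy as np

rng = np.random.default_rng(3)

def model(e, a):
    # Toy quantity of interest mixing an epistemic parameter e
    # and an aleatory variable a (illustrative stand-in).
    return e * a + a**2

# Outer loop: epistemic uncertainty; inner loop: aleatory uncertainty.
# Each epistemic draw yields a full distribution of the QoI.
p95_per_epistemic = []
for e in rng.uniform(0.5, 1.5, size=50):      # epistemic samples
    a = rng.normal(0.0, 1.0, size=2000)       # aleatory samples
    p95_per_epistemic.append(np.percentile(model(e, a), 95))

# The spread of an aleatory statistic (here the 95th percentile) across
# epistemic draws summarises the epistemic contribution to uncertainty.
band = (min(p95_per_epistemic), max(p95_per_epistemic))
```

A variance-based (Sobol-type) sensitivity analysis, as used in the paper, would then attribute the variance of such statistics to the individual epistemic and aleatory parameters.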
Affective and cognitive factors influencing sensitivity to probabilistic information.
Tyszka, Tadeusz; Sawicki, Przemyslaw
2011-11-01
In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.
Arcella, D; Soggiu, M E; Leclercq, C
2003-10-01
For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers have become more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and of cyclamate from soft drinks by Italian teenagers, using the software, and to evaluate the impact of the inclusion/exclusion of indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the concentrations of intense sweeteners in their sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments: only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
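The role of market share and brand loyalty in such a model can be illustrated with a toy intake simulation: a loyal consumer repeatedly ingests one brand's concentration, rather than the mean concentration across brands. All concentrations, shares, and consumption parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

# Hypothetical brand-level sweetener concentrations (mg/L) and market shares.
brand_conc = np.array([80.0, 120.0, 250.0])
market_share = np.array([0.6, 0.3, 0.1])

# Daily consumption of soft drinks (L/day), lognormal by assumption.
consumption = rng.lognormal(mean=np.log(0.25), sigma=0.5, size=n)

# Brand-loyal consumers: each simulated person sticks to one brand,
# drawn according to market share.
brand = rng.choice(3, size=n, p=market_share)
intake_loyal = consumption * brand_conc[brand]

# Naive model ignoring share/loyalty: everyone gets the mean concentration.
intake_naive = consumption * brand_conc.mean()

p95_loyal = np.percentile(intake_loyal, 95)
p95_naive = np.percentile(intake_naive, 95)
```

The two models disagree precisely at the high percentiles that matter for exposure assessment, which is the abstract's point about brand indicators affecting validation.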
NASA Technical Reports Server (NTRS)
Bast, Callie Corinne Scheidt
1994-01-01
This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
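A randomized multifactor strength-degradation equation of the PROMISS type can be sketched as a product of effect terms, S/S0 = prod_i [(A_iU - A_i)/(A_iU - A_i0)]^a_i, with the empirical exponents treated as random variables. The single-effect example below uses illustrative numbers, not the calibrated Inconel 718 constants:

```python
import numpy as np

rng = np.random.default_rng(11)

# Single temperature effect with a random empirical exponent a:
# strength ratio S/S0 = ((T_ult - T) / (T_ult - T_ref)) ** a.
T, T_ult, T_ref = 900.0, 1300.0, 300.0     # current/ultimate/reference (illustrative)
a = rng.normal(0.5, 0.05, size=10_000)     # assumed mean and std for the exponent

ratios = ((T_ult - T) / (T_ult - T_ref)) ** a   # Monte Carlo lifetime strength ratio
cdf_at_median = (ratios <= np.median(ratios)).mean()
```

The empirical CDF of `ratios` is the kind of output PROMISS produces; additional effects (mechanical fatigue, creep, thermal fatigue) enter as further multiplicative terms under the model's independence assumption.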
Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L
2016-12-01
We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction: nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, the number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence and concern of bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
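The core of a probabilistic bias analysis for an unmeasured confounder can be sketched as below. The bias-parameter distributions are illustrative assumptions (this is exactly the kind of specification the review found missing in most studies), and the adjustment uses the standard external-adjustment bias factor rather than any particular reviewed study's model.

```python
import random

def bias_adjusted_rr(rr_observed, n_sim=20000, seed=1):
    """Probabilistic bias analysis for one unmeasured binary confounder.

    Bias parameters are drawn from explicit distributions and the observed
    risk ratio is divided by the simulated bias factor; returns the 2.5th,
    50th, and 97.5th percentiles of the bias-adjusted risk ratio.
    """
    rng = random.Random(seed)
    adjusted = []
    for _ in range(n_sim):
        rr_cd = rng.triangular(1.5, 4.0, 2.5)  # confounder-outcome risk ratio
        p1 = rng.uniform(0.4, 0.7)             # confounder prevalence, exposed
        p0 = rng.uniform(0.1, 0.4)             # confounder prevalence, unexposed
        bias = (p1 * (rr_cd - 1) + 1) / (p0 * (rr_cd - 1) + 1)
        adjusted.append(rr_observed / bias)
    adjusted.sort()
    return (adjusted[int(0.025 * n_sim)],
            adjusted[n_sim // 2],
            adjusted[int(0.975 * n_sim)])
```

Reporting the chosen distributions, the number of iterations, and the resulting simulation interval, as this sketch makes explicit, is precisely what the review recommends.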
Sensitivity Analysis in Sequential Decision Models.
Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet
2017-02-01
Sequential decision problems are frequently encountered in medical decision making and are commonly solved using Markov decision processes (MDPs). Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against the uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically in the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in that policy for a given willingness-to-pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
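The multivariate idea, recording how often the recommended policy remains optimal across joint parameter draws, can be sketched generically; the two-policy toy model and its distributions below are assumptions for illustration, not the authors' MDP case study.

```python
import random

def policy_confidence(sample_params, evaluate_policy, policies, n_psa=2000, seed=2):
    """Fraction of PSA iterations in which each policy is optimal.

    One joint parameter draw is taken per iteration; the winning fractions
    give the height of a policy acceptability curve (at full willingness
    to accept the base-case policy).
    """
    rng = random.Random(seed)
    wins = {p: 0 for p in policies}
    for _ in range(n_psa):
        params = sample_params(rng)  # joint uncertainty draw
        best = max(policies, key=lambda p: evaluate_policy(p, params))
        wins[best] += 1
    return {p: wins[p] / n_psa for p in policies}

# Toy two-policy model: policy A's value scales with an uncertain efficacy theta.
conf = policy_confidence(
    sample_params=lambda r: {"theta": r.gauss(1.0, 0.3)},
    evaluate_policy=lambda p, q: q["theta"] * 0.9 if p == "A" else 0.8,
    policies=["A", "B"],
)
```

In a real sequential model, `evaluate_policy` would re-solve or re-simulate the MDP under each parameter draw, which is where the computational burden the authors address arises.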
Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.
NASA Astrophysics Data System (ADS)
Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan
2017-09-01
Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs of maintenance interventions can be estimated more reliably by integrating the probabilistic nature of the failures associated with these interventions into the LCC models. Reliability, Maintainability, Availability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, integrating RAMS into the LCC analysis yields reliable predictions of system maintenance costs and reveals, through sensitivity analyses, how these costs depend on specific cost drivers. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition- and risk-based planning of maintenance activities as well as for decision support in long term strategic investment planning.
Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.
Zhao, Yuchao; Frey, H Christopher
2004-11-01
Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed from the product of the uncertain emission factors and the uncertain activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
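An empirical bootstrap of the kind applied to the uncensored factor data can be sketched for a simple sample; the emission-factor values below are hypothetical, chosen only to show how a skewed data set yields the asymmetric percentage bounds quoted in the abstract.

```python
import random

def bootstrap_uncertainty(emission_factors, n_boot=5000, seed=3):
    """Empirical bootstrap of the mean emission factor.

    Resamples the measurements with replacement and returns the 2.5th/97.5th
    percentiles of the bootstrap means as percentage deviations from the
    observed mean, the form in which the inventory uncertainties are quoted.
    """
    rng = random.Random(seed)
    n = len(emission_factors)
    obs_mean = sum(emission_factors) / n
    means = sorted(
        sum(rng.choice(emission_factors) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo, hi = means[int(0.025 * n_boot)], means[int(0.975 * n_boot)]
    return 100 * (lo - obs_mean) / obs_mean, 100 * (hi - obs_mean) / obs_mean

# Hypothetical, right-skewed emission factors (g/kg fuel):
low_pct, high_pct = bootstrap_uncertainty([0.4, 0.5, 0.6, 0.7, 0.9, 1.4, 2.8, 6.1])
```

For censored (nondetect-containing) data sets, the resampling would instead be parametric, drawing from the MLE-fitted distribution rather than from the raw measurements.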
NASA Astrophysics Data System (ADS)
Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.
2002-05-01
Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. 
For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in these complex, high dimensional eco-system models, represented by the RWMS model, the dynamics of the systems can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, the non-linearities, and the non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used in global sensitivity analysis in these high dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.
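As a rough stand-in for FAST, a first-order variance-based index can be estimated by binning the model output on each input; unlike correlation-based measures, this captures the non-linear and non-monotonic effects discussed above. The three-input test function is an illustrative assumption, not the RWMS model.

```python
import math
import random

def first_order_indices(model, dim, n=20000, n_bins=40, seed=4):
    """Crude first-order variance-based sensitivity indices via binning.

    Estimates S_i = Var(E[Y | X_i]) / Var(Y) by averaging Y within bins of
    each input X_i; inputs are taken as independent U(0,1) here.
    """
    rng = random.Random(seed)
    xs = [[rng.random() for _ in range(dim)] for _ in range(n)]
    ys = [model(x) for x in xs]
    mean_y = sum(ys) / n
    var_y = sum((y - mean_y) ** 2 for y in ys) / n
    indices = []
    for i in range(dim):
        bins = [[] for _ in range(n_bins)]
        for x, y in zip(xs, ys):
            bins[min(int(x[i] * n_bins), n_bins - 1)].append(y)
        var_cond = sum(len(b) * (sum(b) / len(b) - mean_y) ** 2
                       for b in bins if b) / n
        indices.append(var_cond / var_y)
    return indices

# Non-monotonic toy model: x0 acts through a full sine period, x2 is inert.
S = first_order_indices(lambda x: math.sin(2 * math.pi * x[0]) + 0.3 * x[1], dim=3)
```

A rank correlation would score the sine-driven input near zero despite it dominating the variance; the binned index correctly ranks it first, which is the point of using global methods such as FAST in non-monotonic models.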
Non-Deterministic Dynamic Instability of Composite Shells
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2004-01-01
A computationally effective method is described for evaluating the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method uses an incrementally updated Lagrangian formulation. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the loading rate, the higher the buckling load and the shorter the time to buckling, and that at a specific time, the lower the probability, the lower the buckling load. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.
This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC by Brookhaven National Laboratory. The DCPRA is a full scope Level I effort and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.
Jongeneel, W P; Delmaar, J E; Bokkers, B G H
2018-06-08
A methodology to assess the health impact of skin sensitizers is introduced, which consists of the comparison of the probabilistic aggregated exposure with a probabilistic (individual) human sensitization or elicitation induction dose. The health impact of potential policy measures aimed at reducing the concentration of a fragrance allergen, geraniol, in consumer products is analysed in a simulated population derived from multiple product use surveys. Our analysis shows that current dermal exposure to geraniol from personal care and household cleaning products leads to new cases of contact allergy and induces clinical symptoms in those already sensitized. We estimate that this exposure results yearly in 34 new cases of geraniol contact allergy per million consumers in Western and Northern Europe, mainly due to exposure to household cleaning products. About twice as many consumers (60 per million) are projected to suffer from clinical symptoms due to re-exposure to geraniol. Policy measures restricting geraniol concentrations to <0.01% will noticeably reduce new cases of sensitization and decrease both the number of people with clinical symptoms and the frequency of occurrence of these clinical symptoms. The estimated numbers should be interpreted with caution and provide only a rough indication of the health impact. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; Shah, Ashwin R.
1997-01-01
The properties of ceramic matrix composites (CMCs) are known to display a considerable amount of scatter due to variations in fiber/matrix properties, interphase properties, interphase bonding, amount of matrix voids, and many geometry- or fabrication-related parameters, such as ply thickness and ply orientation. This paper summarizes preliminary studies in which formal probabilistic descriptions of the material-behavior- and fabrication-related parameters were incorporated into micromechanics and macromechanics for CMCs. In this process two existing methodologies, namely CMC micromechanics and macromechanics analysis and a fast probability integration (FPI) technique, are synergistically coupled to obtain the probabilistic composite behavior or response. Preliminary results in the form of cumulative probability distributions and information on the probability sensitivities of the response to primitive variables for a unidirectional silicon carbide/reaction-bonded silicon nitride (SiC/RBSN) CMC are presented. The cumulative distribution functions are computed for composite moduli, thermal expansion coefficients, thermal conductivities, and longitudinal tensile strength at room temperature. The variations in the constituent properties that directly affect these composite properties are accounted for via assumed probabilistic distributions. Collectively, the results show that the present technique provides valuable information about the composite properties and sensitivity factors, which is useful to design or test engineers. Furthermore, the present methodology is computationally more efficient than a standard Monte Carlo simulation technique, and the agreement between the two solutions is excellent, as shown via select examples.
Arrieta, Oscar; Anaya, Pablo; Morales-Oyarvide, Vicente; Ramírez-Tirado, Laura Alejandra; Polanco, Ana C
2016-09-01
Assess the cost-effectiveness of an EGFR-mutation testing strategy for advanced NSCLC in first-line therapy with either gefitinib or carboplatin-paclitaxel in Mexican institutions. Cost-effectiveness analysis using a discrete event simulation (DES) model to simulate two therapeutic strategies in patients with advanced NSCLC. Strategy one included patients tested for EGFR-mutation and therapy given accordingly. Strategy two included chemotherapy for all patients without testing. All results are presented in 2014 US dollars. The analysis was made with data on the Mexican frequency of EGFR-mutation. A univariate sensitivity analysis was conducted on EGFR prevalence. Progression-free survival (PFS) transition probabilities were estimated from IPASS trial data and simulated with a Weibull distribution, with parallel trials run to calculate a probabilistic sensitivity analysis. PFS of patients in the testing strategy was 6.76 months (95 % CI 6.10-7.44) vs 5.85 months (95 % CI 5.43-6.29) in the non-testing group. The one-way sensitivity analysis showed that PFS has a direct relationship with EGFR-mutation prevalence, while the ICER and testing cost have an inverse relationship with EGFR-mutation prevalence. The probabilistic sensitivity analysis showed that all iterations had incremental costs and incremental PFS for strategy 1 in comparison with strategy 2. There is a direct relationship between the ICER and the cost of EGFR testing, and an inverse relationship with the prevalence of EGFR-mutation. When prevalence is >10 %, the ICER remains constant. This study could impact Mexican and Latin American health policies regarding mutation detection testing and treatment for advanced NSCLC.
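Sampling Weibull-distributed progression times, as in the DES model, can be sketched as follows; the shape and scale values are hypothetical placeholders, not the parameters fitted to the IPASS data.

```python
import random

def simulate_pfs(shape, scale_months, n_patients=5000, seed=5):
    """Draw progression-free survival times from a Weibull distribution.

    Note: random.weibullvariate takes (scale, shape) in that order.
    Returns (mean PFS, median PFS) in months.
    """
    rng = random.Random(seed)
    times = sorted(rng.weibullvariate(scale_months, shape)
                   for _ in range(n_patients))
    return sum(times) / n_patients, times[n_patients // 2]

# Hypothetical arms: EGFR-testing strategy vs. chemotherapy for all.
mean_test, _ = simulate_pfs(shape=1.3, scale_months=7.3)
mean_chemo, _ = simulate_pfs(shape=1.3, scale_months=6.3)
```

In a probabilistic sensitivity analysis the shape and scale parameters themselves would be redrawn each iteration (e.g. from their estimated sampling distributions) rather than held fixed as here.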
Probabilistic brain tissue segmentation in neonatal magnetic resonance imaging.
Anbeek, Petronella; Vincken, Koen L; Groenendaal, Floris; Koeman, Annemieke; van Osch, Matthias J P; van der Grond, Jeroen
2008-02-01
A fully automated method has been developed for segmentation of four different structures in the neonatal brain: white matter (WM), central gray matter (CEGM), cortical gray matter (COGM), and cerebrospinal fluid (CSF). The segmentation algorithm is based on information from T2-weighted (T2-w) and inversion recovery (IR) scans. The method uses a K nearest neighbor (KNN) classification technique with features derived from spatial information and voxel intensities. Probabilistic segmentations of each tissue type were generated. By applying thresholds on these probability maps, binary segmentations were obtained. These final segmentations were evaluated by comparison with a gold standard. The sensitivity, specificity, and Dice similarity index (SI) were calculated for quantitative validation of the results. High sensitivity and specificity with respect to the gold standard were reached: sensitivity >0.82 and specificity >0.9 for all tissue types. Tissue volumes were calculated from the binary and probabilistic segmentations. The probabilistic segmentation volumes of all tissue types accurately estimated the gold standard volumes. The KNN approach offers valuable ways for neonatal brain segmentation. The probabilistic outcomes provide a useful tool for accurate volume measurements. The described method is based on routine diagnostic magnetic resonance imaging (MRI) and is suitable for large population studies.
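The Dice similarity index used for validation, SI = 2|A ∩ B| / (|A| + |B|), together with the thresholding of a probability map into a binary segmentation, can be sketched on a toy 1-D example (real use would operate on 3-D voxel arrays).

```python
def dice_index(seg_a, seg_b):
    """Dice similarity index of two equal-length binary segmentations:
    SI = 2|A ∩ B| / (|A| + |B|)."""
    inter = sum(a and b for a, b in zip(seg_a, seg_b))
    size = sum(seg_a) + sum(seg_b)
    return 2 * inter / size if size else 1.0

def threshold(prob_map, t=0.5):
    """Binarise a per-voxel tissue probability map."""
    return [1 if p >= t else 0 for p in prob_map]

# Toy 1-D probability map against a gold-standard segmentation:
gold = [0, 0, 1, 1, 1, 0, 1, 0]
probs = [0.1, 0.4, 0.9, 0.8, 0.6, 0.3, 0.2, 0.1]
si = dice_index(threshold(probs), gold)  # 2*3 / (3+4) = 6/7
```

Varying the threshold `t` trades sensitivity against specificity, which is why the authors evaluate both for each tissue type.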
Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls
NASA Astrophysics Data System (ADS)
Guha Ray, A.; Baidya, D. K.
2012-09-01
Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall highlights the fact that high sensitivity of a particular variable for a particular mode of failure does not necessarily imply a remarkable contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (R_f) for each random variable based on the combined effects of the failure probability (P_f) of each mode of failure of a gravity retaining wall and the sensitivity of each of the random variables to these failure modes. P_f is calculated by Monte Carlo simulation, and the sensitivity analysis of each random variable is carried out by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that R_f for the friction angle of the backfill soil (φ_1) increases and that for the cohesion of the foundation soil (c_2) decreases with increasing variation of φ_1, while R_f for the unit weights of both soils (γ_1 and γ_2) and for the friction angle of the foundation soil (φ_2) remains almost constant under variation of the soil properties. The results compared well with some existing deterministic and probabilistic methods, and the approach was found to be cost-effective. It is seen that if the variation of φ_1 remains within 5 %, a significant reduction in cross-sectional area can be achieved, but if the variation exceeds 7-8 %, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.
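The Monte Carlo estimate of P_f for a single failure mode can be sketched with a simple sliding limit state; the wall weight, height, and soil-parameter distributions below are hypothetical, not the paper's design values.

```python
import math
import random

def failure_probability(limit_state, sample_vars, n=50000, seed=6):
    """Monte Carlo P_f: fraction of random draws with limit state g(X) < 0."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sample_vars(rng)) < 0)
    return fails / n

def sample(rng):
    """One joint draw of the random soil variables (hypothetical statistics)."""
    return {
        "phi1": math.radians(rng.gauss(32.0, 2.0)),  # backfill friction angle
        "gamma1": rng.gauss(18.0, 0.9),              # backfill unit weight, kN/m^3
        "phi2": math.radians(rng.gauss(28.0, 2.0)),  # foundation friction angle
    }

def g_sliding(v, weight=150.0, height=5.0):
    """Sliding limit state: base friction resistance minus active earth thrust."""
    ka = math.tan(math.pi / 4 - v["phi1"] / 2) ** 2  # Rankine active coefficient
    thrust = 0.5 * ka * v["gamma1"] * height ** 2
    return weight * math.tan(v["phi2"]) - thrust

pf = failure_probability(g_sliding, sample)
```

The overall failure probability would combine such per-mode estimates (sliding, overturning, bearing), which is why a variable highly sensitive in one mode need not dominate the total, the point the abstract makes.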
Turnes, Juan; Domínguez-Hernández, Raquel; Casado, Miguel Ángel
To evaluate the cost-effectiveness of a strategy based on direct-acting antivirals (DAAs) following the marketing of simeprevir and sofosbuvir (post-DAA) versus a pre-direct-acting antiviral strategy (pre-DAA) in patients with chronic hepatitis C, from the perspective of the Spanish National Health System. A decision tree combined with a Markov model was used to estimate the direct health costs (€, 2016) and health outcomes (quality-adjusted life years, QALYs) throughout the patient's life, with an annual discount rate of 3%. The sustained virological response, percentage of patients treated or not treated in each strategy, clinical characteristics of the patients, annual likelihood of transition, costs of treating and managing the disease, and utilities were obtained from the literature. The cost-effectiveness analysis was expressed as an incremental cost-effectiveness ratio (incremental cost per QALY gained). A deterministic sensitivity analysis and a probabilistic sensitivity analysis were performed. The post-DAA strategy showed higher health costs per patient (€30,944 vs. €23,707) than the pre-DAA strategy. However, it was associated with an increase of QALYs gained (15.79 vs. 12.83), showing an incremental cost-effectiveness ratio of €2,439 per QALY. The deterministic sensitivity analysis and the probabilistic sensitivity analysis showed the robustness of the results, with the post-DAA strategy being cost-effective in 99% of cases compared to the pre-DAA strategy. Compared to the pre-DAA strategy, the post-DAA strategy is efficient for the treatment of chronic hepatitis C in Spain, resulting in a much lower cost per QALY than the efficiency threshold used in Spain (€30,000 per QALY). Copyright © 2017 Elsevier España, S.L.U., AEEH y AEG. All rights reserved.
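The incremental cost-effectiveness ratio reported above follows directly from the per-patient costs and QALYs; recomputing it from the rounded abstract figures gives approximately €2,445/QALY, close to the paper's €2,439 (the difference is rounding of the inputs).

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Rounded per-patient figures from the abstract (post-DAA vs. pre-DAA):
ratio = icer(30944, 15.79, 23707, 12.83)
cost_effective = ratio < 30000  # Spanish efficiency threshold, EUR/QALY
```

A probabilistic sensitivity analysis repeats this calculation over joint draws of the model parameters; the "cost-effective in 99% of cases" result is the fraction of draws for which the ratio stays below the threshold.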
Probabilistic Evaluation of Blade Impact Damage
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Abumeri, G. H.
2003-01-01
The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluation are expressed as probability cumulative distribution functions and probabilistic sensitivities. They show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.
NASA Technical Reports Server (NTRS)
Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.
2010-01-01
Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
NASA Technical Reports Server (NTRS)
Lyle, Karen H.
2014-01-01
Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology validation via flight testing. This paper explores the implementation of probabilistic methods in the sensitivity analysis of the structural response of a Hypersonic Inflatable Aerodynamic Decelerator (HIAD). HIAD architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during re-entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. In the example presented here, the structural parameters of an existing HIAD model have been varied to illustrate the design approach utilizing uncertainty-based methods. Surrogate models have been used to reduce computational expense by several orders of magnitude. The suitability of the design is based on assessing variation in the resulting cone angle; the acceptable cone-angle variation would be set by the aerodynamic requirements.
A Probabilistic Design Method Applied to Smart Composite Structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1995-01-01
A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
Multi-disciplinary coupling effects for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions which govern the accurate response of propulsion systems. Results are presented for propulsion system responses including multi-disciplinary coupling effects using coupled multi-discipline thermal, structural, and acoustic tailoring; an integrated system of multi-disciplinary simulators; coupled material behavior/fabrication process tailoring; sensitivities using a probabilistic simulator; and coupled materials, structures, fracture, and probabilistic behavior simulator. The results demonstrate that superior designs can be achieved if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated coupled multi-discipline numerical propulsion system simulator.
Multi-disciplinary coupling for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions for determining the true response of propulsion systems. Results are presented for propulsion system responses including multi-discipline coupling effects via (1) coupled multi-discipline tailoring, (2) an integrated system of multidisciplinary simulators, (3) coupled material-behavior/fabrication-process tailoring, (4) sensitivities using a probabilistic simulator, and (5) coupled materials/structures/fracture/probabilistic behavior simulator. The results show that the best designs can be determined if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated interactive multi-discipline numerical propulsion system simulator.
Donnarumma, Francesco; Maisto, Domenico; Pezzulo, Giovanni
2016-01-01
How do humans and other animals face novel problems for which predefined solutions are not available? Human problem solving links to flexible reasoning and inference rather than to slow trial-and-error learning. It has received considerable attention since the early days of cognitive science, giving rise to well known cognitive architectures such as SOAR and ACT-R, but its computational and brain mechanisms remain incompletely known. Furthermore, it is still unclear whether problem solving is a "specialized" domain or module of cognition, in the sense that it requires computations that are fundamentally different from those supporting perception and action systems. Here we advance a novel view of human problem solving as probabilistic inference with subgoaling. In this perspective, key insights from cognitive architectures are retained, such as the importance of using subgoals to split problems into subproblems. However, here the underlying computations use probabilistic inference methods analogous to those that are increasingly popular in the study of perception and action systems. To test our model we focus on the widely used Tower of Hanoi (ToH) task, and show that our proposed method can reproduce characteristic idiosyncrasies of human problem solvers: their sensitivity to the "community structure" of the ToH and their difficulties in executing so-called "counterintuitive" movements. Our analysis reveals that subgoals have two key roles in probabilistic inference and problem solving. First, prior beliefs on (likely) useful subgoals carve the problem space and define an implicit metric for the problem at hand, a metric to which humans are sensitive. Second, subgoals are used as waypoints in the probabilistic problem solving inference and permit finding effective solutions; when suitable subgoals are unavailable, problem solving deficits follow.
Our study thus suggests that a probabilistic inference scheme enhanced with subgoals provides a comprehensive framework to study problem solving and its deficits. PMID:27074140
Donnarumma, Francesco; Maisto, Domenico; Pezzulo, Giovanni
2016-04-01
How do humans and other animals face novel problems for which predefined solutions are not available? Human problem solving links to flexible reasoning and inference rather than to slow trial-and-error learning. It has received considerable attention since the early days of cognitive science, giving rise to well known cognitive architectures such as SOAR and ACT-R, but its computational and brain mechanisms remain incompletely known. Furthermore, it is still unclear whether problem solving is a "specialized" domain or module of cognition, in the sense that it requires computations that are fundamentally different from those supporting perception and action systems. Here we advance a novel view of human problem solving as probabilistic inference with subgoaling. In this perspective, key insights from cognitive architectures are retained such as the importance of using subgoals to split problems into subproblems. However, here the underlying computations use probabilistic inference methods analogous to those that are increasingly popular in the study of perception and action systems. To test our model we focus on the widely used Tower of Hanoi (ToH) task, and show that our proposed method can reproduce characteristic idiosyncrasies of human problem solvers: their sensitivity to the "community structure" of the ToH and their difficulties in executing so-called "counterintuitive" movements. Our analysis reveals that subgoals have two key roles in probabilistic inference and problem solving. First, prior beliefs on (likely) useful subgoals carve the problem space and define an implicit metric for the problem at hand-a metric to which humans are sensitive. Second, subgoals are used as waypoints in the probabilistic problem solving inference and permit to find effective solutions that, when unavailable, lead to problem solving deficits. 
Our study thus suggests that a probabilistic inference scheme enhanced with subgoals provides a comprehensive framework to study problem solving and its deficits. PMID:27074140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qu, Xuanlu M.; Louie, Alexander V.; Ashman, Jonathan
Purpose: Surgery combined with radiation therapy (RT) is the cornerstone of multidisciplinary management of extremity soft tissue sarcoma (STS). Although RT can be given in either the preoperative or the postoperative setting with similar local recurrence and survival outcomes, the side effect profiles, costs, and long-term functional outcomes are different. The aim of this study was to use decision analysis to determine optimal sequencing of RT with surgery in patients with extremity STS. Methods and Materials: A cost-effectiveness analysis was conducted using a state transition Markov model, with quality-adjusted life years (QALYs) as the primary outcome. A time horizon of 5 years, a cycle length of 3 months, and a willingness-to-pay threshold of $50,000/QALY were used. One-way deterministic sensitivity analyses were performed to determine the thresholds at which each strategy would be preferred. The robustness of the model was assessed by probabilistic sensitivity analysis. Results: Preoperative RT is a more cost-effective strategy ($26,633/3.00 QALYs) than postoperative RT ($28,028/2.86 QALYs) in our base case scenario. Preoperative RT is the superior strategy with either 3-dimensional conformal RT or intensity-modulated RT. One-way sensitivity analyses identified the relative risk of chronic adverse events as having the greatest influence on the preferred timing of RT. The likelihood of preoperative RT being the preferred strategy was 82% on probabilistic sensitivity analysis. Conclusions: Preoperative RT is more cost effective than postoperative RT in the management of resectable extremity STS, primarily because of the higher incidence of chronic adverse events with RT in the postoperative setting.
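The state-transition mechanics behind such a model can be sketched in a few lines. The sketch below is a hypothetical three-state cohort model with invented transition probabilities, utilities, and costs (not the published model's inputs); it accumulates costs and QALYs over 3-month cycles and checks for dominance:

```python
def run_cohort(p_recur, p_die, u_well, u_recur, cost_cycle, n_cycles=20):
    """Push a cohort through a 3-state Markov model (well -> recurrence
    -> dead), accruing cost and QALYs each 3-month (0.25-year) cycle."""
    well, recur, dead = 1.0, 0.0, 0.0
    cost = qalys = 0.0
    for _ in range(n_cycles):
        well, recur, dead = (well * (1 - p_recur - p_die),
                             recur * (1 - p_die) + well * p_recur,
                             dead + (well + recur) * p_die)
        cost += (well + recur) * cost_cycle          # only the living accrue cost
        qalys += 0.25 * (well * u_well + recur * u_recur)
    return cost, qalys

# Invented inputs for two strategies (e.g. preoperative vs postoperative RT)
pre = run_cohort(p_recur=0.03, p_die=0.01, u_well=0.85, u_recur=0.60, cost_cycle=1200)
post = run_cohort(p_recur=0.04, p_die=0.01, u_well=0.82, u_recur=0.60, cost_cycle=1250)
# A strategy "dominates" when it is both cheaper and more effective,
# mirroring the base-case pattern reported in the abstract.
dominant = pre[0] < post[0] and pre[1] > post[1]
```

In a full analysis the inputs would themselves be drawn from distributions on each model run, which is what the probabilistic sensitivity analysis adds.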
Optimal management of colorectal liver metastases in older patients: a decision analysis
Yang, Simon; Alibhai, Shabbir MH; Kennedy, Erin D; El-Sedfy, Abraham; Dixon, Matthew; Coburn, Natalie; Kiss, Alex; Law, Calvin HL
2014-01-01
Background Comparative trials evaluating management strategies for colorectal cancer liver metastases (CLM) are lacking, especially for older patients. This study developed a decision-analytic model to quantify outcomes associated with treatment strategies for CLM in older patients. Methods A Markov-decision model was built to examine the effect on life expectancy (LE) and quality-adjusted life expectancy (QALE) for best supportive care (BSC), systemic chemotherapy (SC), radiofrequency ablation (RFA) and hepatic resection (HR). The baseline patient cohort assumptions included healthy 70-year-old CLM patients after a primary cancer resection. Event and transition probabilities and utilities were derived from a literature review. Deterministic and probabilistic sensitivity analyses were performed on all study parameters. Results In base case analysis, BSC, SC, RFA and HR yielded LEs of 11.9, 23.1, 34.8 and 37.0 months, and QALEs of 7.8, 13.2, 22.0 and 25.0 months, respectively. Model results were sensitive to age, comorbidity, length of model simulation and utility after HR. Probabilistic sensitivity analysis showed increasing preference for RFA over HR with increasing patient age. Conclusions HR may be optimal for healthy 70-year-old patients with CLM. In older patients with comorbidities, RFA may provide better LE and QALE. Treatment decisions in older cancer patients should account for patient age, comorbidities, local expertise and individual values. PMID:24961482
Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Bryant, Larry
2014-01-01
Command File Errors (CFEs) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, greater emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing the CFE rates. More recently, probabilistic modeling techniques have been used for more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, composed of the hardware, software, and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root causes and sensitivities of the various components of the mission system and to develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective, and unbiased tool for flood susceptibility evaluation.
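The OWA combination rule at the core of this approach is compact enough to state directly. The criterion scores and order weights below are invented for illustration; the point is that the weights attach to rank positions rather than to criteria, so skewing them toward the top or bottom ranks encodes the analyst's risk attitude:

```python
def owa(values, order_weights):
    """Ordered weighted averaging: sort criterion scores in descending
    order, then combine them with weights tied to rank positions.
    Weights skewed to the top ranks give an optimistic ("OR-like")
    aggregation; skewed to the bottom ranks, a pessimistic ("AND-like") one."""
    assert abs(sum(order_weights) - 1.0) < 1e-9
    ranked = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(order_weights, ranked))

scores = [0.9, 0.4, 0.6]                      # standardized scores for one raster cell
optimistic  = owa(scores, [0.6, 0.3, 0.1])    # emphasizes the best criteria
neutral     = owa(scores, [1/3, 1/3, 1/3])    # plain average
pessimistic = owa(scores, [0.1, 0.3, 0.6])    # emphasizes the worst criteria
```

A Monte Carlo wrapper would redraw the criterion weights from their distributions on each iteration and tally how often each cell is classified as susceptible.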
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Boyce, Lola
1995-01-01
This report presents the results of both the fifth- and sixth-year efforts of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA). The research included on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for five variables, namely, high temperature, high-cycle and low-cycle mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using an updated version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of high-cycle mechanical fatigue, creep, and thermal fatigue was performed. Then, using the current version of PROMISS, entitled PROMISS94, a second sensitivity study including the effect of low-cycle mechanical fatigue, as well as the three previous effects, was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect.
In addition, verification studies comparing a combination of high-cycle mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
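The flavor of such a randomized multifactor strength model can be illustrated with a short Monte Carlo sketch. This is a generic multiplicative degradation form with invented effect levels and randomized exponents, not the actual PROMISS equation or its Inconel 718 calibrations; it builds an empirical CDF of the lifetime strength ratio:

```python
import random
random.seed(1)

def strength_ratio(effects):
    """Multiplicative degradation: each effect contributes a factor
    ((limit - current) / (limit - reference)) ** exponent, so the
    strength ratio drops as an effect approaches its limiting value."""
    r = 1.0
    for current, reference, limit, exponent in effects:
        r *= ((limit - current) / (limit - reference)) ** exponent
    return r

samples = []
for _ in range(5000):
    # randomized exponents stand in for empirical material constants
    # with scatter (values are invented, not Inconel 718 calibrations)
    effects = [
        (600.0, 20.0, 1300.0, random.gauss(0.5, 0.05)),  # temperature, deg C
        (1e6, 1e3, 1e8, random.gauss(0.1, 0.02)),        # fatigue cycles
    ]
    samples.append(strength_ratio(effects))

samples.sort()                       # empirical CDF of lifetime strength
median = samples[len(samples) // 2]  # CDF read at the 50% level
```

Reading the sorted samples at several probability levels reproduces the kind of cumulative distribution functions the report uses to show sensitivity of lifetime strength to each effect.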
Space transportation architecture: Reliability sensitivities
NASA Technical Reports Server (NTRS)
Williams, A. M.
1992-01-01
A sensitivity analysis is presented of the benefits and drawbacks associated with a proposed Earth-to-orbit vehicle architecture. The architecture represents a fleet of six vehicles (two existing, four proposed) that would be responsible for performing various missions as mandated by NASA and the U.S. Air Force. Each vehicle has a prescribed flight rate per year for a period of 31 years. By exposing this fleet of vehicles to a probabilistic environment in which the fleet experiences failures, downtimes, setbacks, etc., the analysis determines the resiliency and costs associated with the fleet for specific vehicle/subsystem reliabilities. The resources required were actual observed data on the failures and downtimes associated with existing vehicles, data based on engineering judgement for proposed vehicles, and the development of a sensitivity analysis program.
Space system operations and support cost analysis using Markov chains
NASA Technical Reports Server (NTRS)
Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.
1990-01-01
This paper evaluates the use of Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
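A minimal version of such a Markov-chain cost model can be written directly from the transition matrix. The transition probabilities and per-cycle costs below are invented for illustration, not the paper's values; the sketch accrues expected operations and support cost until the vehicle is absorbed into a "retired" state:

```python
def expected_cost_and_life(p, cost, horizon=124):
    """Walk a state-occupancy vector through the chain, accruing the
    per-cycle cost of each transient state; "retired" is absorbing.
    Returns (expected total cost, expected cycles in service)."""
    states = list(p.keys())
    occ = {s: 0.0 for s in states}
    occ["operational"] = 1.0
    total_cost = expected_cycles = 0.0
    for _ in range(horizon):
        total_cost += sum(occ[s] * cost.get(s, 0.0) for s in states)
        expected_cycles += sum(occ[s] for s in states if s != "retired")
        nxt = {s: 0.0 for s in states}
        for s in states:
            for t, pr in p[s].items():
                nxt[t] += occ[s] * pr
        occ = nxt
    return total_cost, expected_cycles

# Hypothetical quarterly transition probabilities and per-cycle costs ($M)
p = {"operational": {"operational": 0.90, "maintenance": 0.08, "retired": 0.02},
     "maintenance": {"operational": 0.70, "maintenance": 0.25, "retired": 0.05},
     "retired":     {"retired": 1.0}}
cost = {"operational": 2.0, "maintenance": 5.0}
total, cycles = expected_cost_and_life(p, cost)
```

Perturbing individual transition probabilities and re-running gives exactly the kind of sensitivity analysis the paper carries out on key model inputs.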
St-Onge, Maude; Fan, Eddy; Mégarbane, Bruno; Hancock-Howard, Rebecca; Coyte, Peter C
2015-04-01
Venoarterial extracorporeal membrane oxygenation represents an emerging and recommended option to treat life-threatening cardiotoxicant poisoning. The objective of this cost-effectiveness analysis was to estimate the incremental cost-effectiveness ratio of using venoarterial extracorporeal membrane oxygenation for adults in cardiotoxicant-induced shock or cardiac arrest compared with standard care. Adults in shock or in cardiac arrest secondary to cardiotoxicant poisoning were studied with a lifetime horizon and a societal perspective. Venoarterial extracorporeal membrane oxygenation cost effectiveness was calculated using a decision analysis tree, with the effect of the intervention and the probabilities used in the model taken from an observational study representing the highest level of evidence available. The costs (2013 Canadian dollars, where $1.00 Canadian = $0.9562 US dollars) were documented with interviews, reviews of official provincial documents, or published articles. A series of one-way sensitivity analyses and a probabilistic sensitivity analysis using Monte Carlo simulation were used to evaluate uncertainty in the decision model. The cost per life year (LY) gained in the extracorporeal membrane oxygenation group was $145 931/18 LY compared with $88 450/10 LY in the non-extracorporeal membrane oxygenation group. The incremental cost-effectiveness ratio ($7185/LY but $34 311/LY using a more pessimistic approach) was mainly influenced by the probability of survival. The probabilistic sensitivity analysis identified variability in both cost and effectiveness. Venoarterial extracorporeal membrane oxygenation may be cost effective in treating cardiotoxicant poisonings. Copyright © 2014 Elsevier Inc. All rights reserved.
Xie, Feng; O'Reilly, Daria; Ferrusi, Ilia L; Blackhouse, Gord; Bowen, James M; Tarride, Jean-Eric; Goeree, Ron
2009-05-01
The aim of this paper is to present an economic evaluation of diagnostic technologies using Helicobacter pylori screening strategies for the prevention of gastric cancer as an illustration. A Markov model was constructed to compare the lifetime cost and effectiveness of 4 potential strategies: no screening, the serology test by enzyme-linked immunosorbent assay (ELISA), the stool antigen test (SAT), and the (13)C-urea breath test (UBT) for the detection of H. pylori among a hypothetical cohort of 10,000 Canadian men aged 35 years. Special parameter consideration included the sensitivity and specificity of each screening strategy, which determined the model structure and treatment regimen. The primary outcome measured was the incremental cost-effectiveness ratio between the screening strategies and the no-screening strategy. Base-case analysis and probabilistic sensitivity analysis were performed using the point estimates of the parameters and Monte Carlo simulations, respectively. Compared with the no-screening strategy in the base-case analysis, the incremental cost-effectiveness ratio was $33,000 per quality-adjusted life-year (QALY) for the ELISA, $29,800 per QALY for the SAT, and $50,400 per QALY for the UBT. The probabilistic sensitivity analysis revealed that the no-screening strategy was more cost effective if the willingness to pay (WTP) was <$20,000 per QALY, while the SAT had the highest probability of being cost effective if the WTP was >$30,000 per QALY. Neither the ELISA nor the UBT was a cost-effective strategy over a wide range of WTP values. Although the UBT had the highest sensitivity and specificity, either no screening or the SAT could be the most cost-effective strategy depending on the WTP threshold values from an economic perspective. This highlights the importance of economic evaluations of diagnostic technologies.
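WTP-dependent conclusions of this kind are usually summarized as a cost-effectiveness acceptability curve. The sketch below computes, from hypothetical PSA draws (invented costs and QALYs, not the paper's values), the probability that each strategy maximizes net monetary benefit at a given willingness to pay:

```python
import random
random.seed(7)

def ceac(psa_samples, wtp):
    """Probability that each strategy maximizes net monetary benefit
    (NMB = wtp * QALYs - cost) across the simulated parameter draws."""
    names = list(psa_samples[0].keys())
    wins = {n: 0 for n in names}
    for draw in psa_samples:
        best = max(names, key=lambda n: wtp * draw[n][1] - draw[n][0])
        wins[best] += 1
    return {n: wins[n] / len(psa_samples) for n in names}

# Hypothetical PSA iterations: {strategy: (cost, QALYs)} per draw
draws = [{"no screening": (random.gauss(1000, 100), random.gauss(20.00, 0.05)),
          "SAT":          (random.gauss(2500, 300), random.gauss(20.05, 0.05))}
         for _ in range(2000)]

low_wtp  = ceac(draws, wtp=10_000)   # the cheap strategy tends to win
high_wtp = ceac(draws, wtp=60_000)   # the more effective strategy tends to win
```

Sweeping `wtp` over a grid and plotting the win probabilities reproduces the crossover behavior the abstract describes between the no-screening and SAT strategies.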
A probabilistic approach to aircraft design emphasizing stability and control uncertainties
NASA Astrophysics Data System (ADS)
Delaurentis, Daniel Andrew
In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision-making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) Consistent, traceable uncertainty classification and representation; (2) Concise mathematical statement of the Probabilistic Robust Design problem; (3) Variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) Probabilistic Sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity.
Specifically, the implementation involves the study of relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.
Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction
Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip
2015-01-01
Random forests consisting of an ensemble of regression trees with equal weights are frequently used for design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approached the ensemble of probabilistic trees’ prediction from the perspectives of a mixture distribution and as a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic and cancer cell line encyclopedia dataset and illustrated that tree weights can be selected to reduce the average length of the CI without increase in mean error. PMID:27081304
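The confidence-interval construction described here can be sketched as a weighted mixture of per-tree normal predictions. The numbers are invented and this is a simplified reading of the approach, not the authors' implementation; it shows how down-weighting a noisy tree can tighten the interval without moving the mean much:

```python
def ensemble_interval(preds, weights, z=1.96):
    """Treat tree outputs as a weighted mixture of normals: combine the
    per-tree (mean, variance) pairs into a mixture mean and variance,
    then report an approximate 95% confidence interval."""
    assert abs(sum(weights) - 1.0) < 1e-9
    mean = sum(w * m for w, (m, _) in zip(weights, preds))
    # mixture variance = E[within-tree variance] + Var[tree means]
    var = sum(w * (v + (m - mean) ** 2) for w, (m, v) in zip(weights, preds))
    half = z * var ** 0.5
    return mean - half, mean + half

preds = [(0.52, 0.010), (0.48, 0.012), (0.55, 0.020)]   # per-tree (mean, var)
equal    = ensemble_interval(preds, [1/3, 1/3, 1/3])    # standard random forest
weighted = ensemble_interval(preds, [0.5, 0.4, 0.1])    # down-weight the noisy tree
```

Optimizing the weights against interval length, subject to holding mean error, is the tree-weight selection step the abstract describes.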
Analytic uncertainty and sensitivity analysis of models with input correlations
NASA Astrophysics Data System (ADS)
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often arise in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
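A first-order (delta-method) version of this analysis makes the role of input correlations explicit: the variance of the model response is the quadratic form g^T C g in the sensitivity vector g and input covariance matrix C, which splits into independent terms g_i^2 C_ii plus cross terms 2 g_i g_j C_ij. The sketch below, with invented sensitivities and variances, shows how a positive correlation between inputs with opposing sensitivities shrinks the propagated variance:

```python
def propagated_variance(grad, cov):
    """First-order (delta-method) variance of Y = f(X): g^T C g."""
    n = len(grad)
    return sum(grad[i] * grad[j] * cov[i][j] for i in range(n) for j in range(n))

grad = [2.0, -1.0]                      # sensitivities dY/dx1, dY/dx2 (invented)
cov_indep = [[0.04, 0.0], [0.0, 0.09]]  # uncorrelated inputs
rho = 0.8
c12 = rho * (0.04 * 0.09) ** 0.5        # covariance implied by correlation 0.8
cov_corr = [[0.04, c12], [c12, 0.09]]

v_indep = propagated_variance(grad, cov_indep)  # 2^2 * 0.04 + (-1)^2 * 0.09 = 0.25
v_corr  = propagated_variance(grad, cov_corr)   # cross term 2*2*(-1)*c12 reduces it
```

Comparing `v_indep` and `v_corr` answers exactly the question the paper poses: whether ignoring the input correlations materially changes the uncertainty in the model response.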
Smith, William B; Steinberg, Joni; Scholtes, Stefan; Mcnamara, Iain R
2017-03-01
To compare the age-based cost-effectiveness of total knee arthroplasty (TKA), unicompartmental knee arthroplasty (UKA), and high tibial osteotomy (HTO) for the treatment of medial compartment knee osteoarthritis (MCOA). A Markov model was used to simulate theoretical cohorts of patients 40, 50, 60, and 70 years of age undergoing primary TKA, UKA, or HTO. Costs and outcomes associated with initial and subsequent interventions were estimated by following these virtual cohorts over a 10-year period. Revision and mortality rates, costs, and functional outcome data were estimated from a systematic review of the literature. Probabilistic analysis was conducted to accommodate these parameters' inherent uncertainty, and both discrete and probabilistic sensitivity analyses were utilized to assess the robustness of the model's outputs to changes in key variables. HTO was most likely to be cost-effective in cohorts under 60, and UKA most likely in those 60 and over. Probabilistic results did not indicate one intervention to be significantly more cost-effective than another. The model was exquisitely sensitive to changes in utility (functional outcome), somewhat sensitive to changes in cost, and least sensitive to changes in 10-year revision risk. HTO may be the most cost-effective option when treating MCOA in younger patients, while UKA may be preferred in older patients. Functional utility is the primary driver of the cost-effectiveness of these interventions. For the clinician, this study supports HTO as a competitive treatment option in young patient populations. It also validates each one of the three interventions considered as potentially optimal, depending heavily on patient preferences and functional utility derived over time.
Kawasaki, Ryo; Akune, Yoko; Hiratsuka, Yoshimune; Fukuhara, Shunichi; Yamada, Masakazu
2015-02-01
To evaluate the cost-effectiveness for a screening interval longer than 1 year detecting diabetic retinopathy (DR) through the estimation of incremental costs per quality-adjusted life year (QALY) based on the best available clinical data in Japan. A Markov model with a probabilistic cohort analysis was framed to calculate incremental costs per QALY gained by implementing a screening program detecting DR in Japan. A 1-year cycle length and population size of 50,000 with a 50-year time horizon (age 40-90 years) was used. Best available clinical data from publications and national surveillance data was used, and a model was designed including current diagnosis and management of DR with corresponding visual outcomes. One-way and probabilistic sensitivity analyses were performed considering uncertainties in the parameters. In the base-case analysis, the strategy with a screening program resulted in an incremental cost of 5,147 Japanese yen (¥; US$64.6) and incremental effectiveness of 0.0054 QALYs per person screened. The incremental cost-effectiveness ratio was ¥944,981 (US$11,857) per QALY. The simulation suggested that screening would result in a significant reduction in blindness in people aged 40 years or over (-16%). Sensitivity analyses suggested that in order to achieve both reductions in blindness and cost-effectiveness in Japan, the screening program should screen those aged 53-84 years, at intervals of 3 years or less. An eye screening program in Japan would be cost-effective in detecting DR and preventing blindness from DR, even allowing for the uncertainties in estimates of costs, utility, and current management of DR.
Efficient Sensitivity Methods for Probabilistic Lifing and Engine Prognostics
2010-09-01
AFRL-RX-WP-TR-2010-4297. Harry Millwater, Ronald Bagley, Jose Garza, D. Wagner, Andrew Bates, and Andy Voorhees.
Analysis of the stochastic excitability in the flow chemical reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bashkirtseva, Irina
2015-11-30
A dynamic model of the thermochemical process in a flow reactor is considered. We study the influence of random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on stochastic sensitivity functions and confidence domains is applied. It is shown how the elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.
Time to angiographic reperfusion in acute ischemic stroke: decision analysis.
Vagal, Achala S; Khatri, Pooja; Broderick, Joseph P; Tomsick, Thomas A; Yeatts, Sharon D; Eckman, Mark H
2014-12-01
Our objective was to use decision analytic modeling to compare 2 treatment strategies of intravenous recombinant tissue-type plasminogen activator (r-tPA) alone versus combined intravenous r-tPA/endovascular therapy in a subgroup of patients with large vessel (internal carotid artery terminus, M1, and M2) occlusion based on varying times to angiographic reperfusion and varying rates of reperfusion. We developed a decision model using Interventional Management of Stroke (IMS) III trial data and comprehensive literature review. We performed 1-way sensitivity analyses for time to reperfusion and 2-way sensitivity for time to reperfusion and rate of reperfusion success. We also performed probabilistic sensitivity analyses to address uncertainty in total time to reperfusion for the endovascular approach. In the base case, endovascular approach yielded a higher expected utility (6.38 quality-adjusted life years) than the intravenous-only arm (5.42 quality-adjusted life years). One-way sensitivity analyses demonstrated superiority of endovascular treatment to intravenous-only arm unless time to reperfusion exceeded 347 minutes. Two-way sensitivity analysis demonstrated that endovascular treatment was preferred when probability of reperfusion is high and time to reperfusion is small. Probabilistic sensitivity results demonstrated an average gain for endovascular therapy of 0.76 quality-adjusted life years (SD 0.82) compared with the intravenous-only approach. In our post hoc model with its underlying limitations, endovascular therapy after intravenous r-tPA is the preferred treatment as compared with intravenous r-tPA alone. However, if time to reperfusion exceeds 347 minutes, intravenous r-tPA alone is the recommended strategy. This warrants validation in a randomized, prospective trial among patients with large vessel occlusions. © 2014 American Heart Association, Inc.
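The one-way sensitivity analysis on time to reperfusion can be mimicked with a toy threshold search. The linear decline rate below is hypothetical, chosen only so that the flip lands at the 347-minute threshold reported in the abstract; the base-case QALY values for the two arms are taken from the abstract:

```python
def preferred(time_min, iv_qaly=5.42, endo_base=6.38,
              decline_per_min=0.004, base_time=107):
    """Endovascular QALYs (hypothetically) decline linearly once
    reperfusion is delayed past the base-case time; IV-only is flat."""
    endo_qaly = endo_base - decline_per_min * max(0, time_min - base_time)
    return "endovascular" if endo_qaly - iv_qaly > 1e-9 else "iv only"

# sweep time to reperfusion to locate the strategy-flip threshold
threshold = next(t for t in range(100, 600) if preferred(t) == "iv only")
```

A probabilistic sensitivity analysis would instead draw `time_min` (and the decline rate) from distributions and report the distribution of the QALY gain, as the authors do.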
Reconciling uncertain costs and benefits in bayes nets for invasive species management
Burgman, M.A.; Wintle, B.A.; Thompson, C.A.; Moilanen, A.; Runge, M.C.; Ben-Haim, Y.
2010-01-01
Bayes nets are used increasingly to characterize environmental systems and formalize probabilistic reasoning to support decision making. These networks treat probabilities as exact quantities. Sensitivity analysis can be used to evaluate the importance of assumptions and parameter estimates. Here, we outline an application of info-gap theory to Bayes nets that evaluates the sensitivity of decisions to possibly large errors in the underlying probability estimates and utilities. We apply it to an example of management and eradication of Red Imported Fire Ants in Southern Queensland, Australia, and show how changes in management decisions can be justified when uncertainty is considered. © 2009 Society for Risk Analysis.
Regier, Dean A; Friedman, Jan M; Marra, Carlo A
2010-05-14
Array genomic hybridization (AGH) provides a higher detection rate than does conventional cytogenetic testing when searching for chromosomal imbalance causing intellectual disability (ID). AGH is more costly than conventional cytogenetic testing, and it remains unclear whether AGH provides good value for money. Decision analytic modeling was used to evaluate the trade-off between costs, clinical effectiveness, and benefit of an AGH testing strategy compared to a conventional testing strategy. The trade-off between cost and effectiveness was expressed via the incremental cost-effectiveness ratio. Probabilistic sensitivity analysis was performed via Monte Carlo simulation. The baseline AGH testing strategy led to an average cost increase of $217 (95% CI $172-$261) per patient and an additional 8.2 diagnoses in every 100 tested (0.082; 95% CI 0.044-0.119). The mean incremental cost per additional diagnosis was $2646 (95% CI $1619-$5296). Probabilistic sensitivity analysis demonstrated that there was a 95% probability that AGH would be cost effective if decision makers were willing to pay $4550 for an additional diagnosis. Our model suggests that using AGH instead of conventional karyotyping for most ID patients provides good value for money. Deterministic sensitivity analysis found that employing AGH after first-line cytogenetic testing had proven uninformative did not provide good value for money when compared to using AGH as first-line testing. Copyright (c) 2010 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
Connectome sensitivity or specificity: which is more important?
Zalesky, Andrew; Fornito, Alex; Cocchi, Luca; Gollo, Leonardo L; van den Heuvel, Martijn P; Breakspear, Michael
2016-11-15
Connectomes with high sensitivity and high specificity are unattainable with current axonal fiber reconstruction methods, particularly at the macro-scale afforded by magnetic resonance imaging. Tensor-guided deterministic tractography yields sparse connectomes that are incomplete and contain false negatives (FNs), whereas probabilistic methods steered by crossing-fiber models yield dense connectomes, often with low specificity due to false positives (FPs). Densely reconstructed probabilistic connectomes are typically thresholded to improve specificity at the cost of a reduction in sensitivity. What is the optimal tradeoff between connectome sensitivity and specificity? We show empirically and theoretically that specificity is paramount. Our evaluations of the impact of FPs and FNs on empirical connectomes indicate that specificity is at least twice as important as sensitivity when estimating key properties of brain networks, including topological measures of network clustering, network efficiency and network modularity. Our asymptotic analysis of small-world networks with idealized modular structure reveals that as the number of nodes grows, specificity becomes exactly twice as important as sensitivity to the estimation of the clustering coefficient. For the estimation of network efficiency, the relative importance of specificity grows linearly with the number of nodes. The greater importance of specificity is due to FPs occurring more prevalently between network modules rather than within them. These spurious inter-modular connections have a dramatic impact on network topology. We argue that efforts to maximize the sensitivity of connectome reconstruction should be realigned with the need to map brain networks with high specificity. Copyright © 2016 Elsevier Inc. All rights reserved.
Malekpour, Shirin; Langeveld, Jeroen; Letema, Sammy; Clemens, François; van Lier, Jules B
2013-03-30
This paper introduces the probabilistic evaluation framework, to enable transparent and objective decision-making in technology selection for sanitation solutions in low-income countries. The probabilistic framework recognizes the often poor quality of the available data for evaluations. Within this framework, evaluations are based on the probabilities that the expected outcomes occur in practice, considering the uncertainties in evaluation parameters. Consequently, the outcome of an evaluation is not a single point estimate but a range of possible outcomes. A first trial application of this framework for evaluation of sanitation options in the Nyalenda settlement in Kisumu, Kenya, showed how the range of values that an evaluation parameter may take in practice influences the evaluation outcomes. In addition, as the probabilistic evaluation requires various site-specific data, sensitivity analysis was performed to determine the influence of the quality of each data set on the evaluation outcomes. Based on that, data collection activities can be (re)directed, in a trade-off between the required investments in those activities and the resolution of the decisions that are to be made. Copyright © 2013 Elsevier Ltd. All rights reserved.
Global sensitivity analysis of multiscale properties of porous materials
NASA Astrophysics Data System (ADS)
Um, Kimoon; Zhang, Xuan; Katsoulakis, Markos; Plechac, Petr; Tartakovsky, Daniel M.
2018-02-01
Ubiquitous uncertainty about pore geometry inevitably undermines the veracity of pore- and multi-scale simulations of transport phenomena in porous media. It raises two fundamental issues: sensitivity of effective material properties to pore-scale parameters and statistical parameterization of Darcy-scale models that accounts for pore-scale uncertainty. Homogenization-based maps of pore-scale parameters onto their Darcy-scale counterparts facilitate both sensitivity analysis (SA) and uncertainty quantification. We treat uncertain geometric characteristics of a hierarchical porous medium as random variables to conduct global SA and to derive probabilistic descriptors of effective diffusion coefficients and effective sorption rate. Our analysis is formulated in terms of a solute diffusing through a fluid-filled pore space while sorbing to the solid matrix. Yet it is sufficiently general to be applied to other multiscale porous media phenomena that are amenable to homogenization.
Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D Gareth; Astley, Sue; Payne, Katherine
2017-09-01
To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1, risk 2, masking [supplemental screening for women with higher breast density], and masking and risk 1) compared with the current UK NBSP and no screening. The model assumed a lifetime horizon, the health service perspective to identify costs (£, 2015), and measured consequences in quality-adjusted life-years (QALYs). Multiple data sources were used: systematic reviews of effectiveness and utility, published studies reporting costs, and cohort studies embedded in existing NBSPs. Model parameter uncertainty was assessed using probabilistic sensitivity analysis and one-way sensitivity analysis. The base-case analysis, supported by probabilistic sensitivity analysis, suggested that the risk-stratified NBSPs (risk 1 and risk 2) were relatively cost-effective when compared with the current UK NBSP, with incremental cost-effectiveness ratios of £16,689 per QALY and £23,924 per QALY, respectively. Stratified NBSPs including masking approaches (supplemental screening for women with higher breast density) were not cost-effective alternatives, with incremental cost-effectiveness ratios of £212,947 per QALY (masking) and £75,254 per QALY (risk 1 and masking). When compared with no screening, all stratified NBSPs could be considered cost-effective. Key drivers of cost-effectiveness were discount rate, natural history model parameters, mammographic sensitivity, and biopsy rates for recalled cases. A key assumption was that the risk model used in the stratification process was perfectly calibrated to the population. This early model-based cost-effectiveness analysis provides indicative evidence for decision makers to understand the key drivers of costs and QALYs for exemplar stratified NBSPs.
Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Probabilistic analysis of preload in the abutment screw of a dental implant complex.
Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R
2008-09-01
Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. 
The preload is predominately affected by the applied torque and coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment screw at high values of the preload CDF. Lubrication at the threaded surfaces between the abutment screw and implant bore affects the preload developed in the implant complex. For the well-lubricated surfaces, only approximately 50% of implants will have preload values within the generally accepted range. This probability can be improved by applying a higher torque than normally recommended or a more closely controlled torque than typically achieved. It is also suggested that materials with higher elastic moduli be used in the manufacture of the abutment screw to achieve a higher preload.
Probabilistic risk analysis of building contamination.
Bolster, D T; Tartakovsky, D M
2008-10-01
We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
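The fault-tree combination of component failure probabilities described above can be sketched in a few lines. The events and probabilities below are hypothetical, chosen only to show the OR/AND gate algebra for independent events:

```python
def and_gate(*probs):
    """All independent events must occur (product of probabilities)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """At least one independent event occurs (complement of none occurring)."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: contamination occurs if a contaminant source is
# present AND (ventilation fails OR remediation fails). All probabilities
# are made up for illustration.
p_source = 0.05      # contaminant source present
p_vent_fail = 0.10   # ventilation system failure
p_remed_fail = 0.20  # remediation failure

p_contamination = and_gate(p_source, or_gate(p_vent_fail, p_remed_fail))
print(round(p_contamination, 4))  # 0.05 * (1 - 0.9 * 0.8) = 0.014
```

A sensitivity study then amounts to perturbing each leaf probability and observing the change in the top-event probability.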
Probabilistic Dynamic Buckling of Smart Composite Shells
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10 percent at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.
Probabilistic Dynamic Buckling of Smart Composite Shells
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2007-01-01
A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of intraply hybrid composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right next to the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.
2016-11-09
the model does not become a full probabilistic attack graph analysis of the network, whose data requirements are currently unrealistic. The second...flow. – Untrustworthy persons may intentionally try to exfiltrate known sensitive data to external networks. People may also unintentionally leak...section will provide details on the components, procedures, data requirements, and parameters required to instantiate the network porosity model. These
NASA Astrophysics Data System (ADS)
Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan
2017-09-01
Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems is mainly dependent on the selection of the most relevant features. This becomes harder when the dataset contains missing values for the different features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced dimensional feature subset, and provides diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using Probabilistic Principal Component Analysis (PPCA). PPCA extracts the projection vectors which contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to radial basis function (RBF) kernel based Support Vector Machines (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to the existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
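The projection-based dimensionality reduction at the heart of this approach can be illustrated without any ML library. The sketch below extracts a dominant principal component by power iteration on the covariance matrix, a deliberate simplification of PPCA, using made-up two-dimensional data rather than the UCI sets:

```python
def covariance(data):
    """Sample covariance matrix of a list of equal-length rows."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    cov = [[0.0] * d for _ in range(d)]
    for row in data:
        for i in range(d):
            for j in range(d):
                cov[i][j] += (row[i] - means[i]) * (row[j] - means[j]) / (n - 1)
    return cov

def power_iteration(mat, steps=200):
    """Dominant eigenvector (unit length) by repeated multiplication."""
    d = len(mat)
    v = [1.0] * d
    for _ in range(steps):
        w = [sum(mat[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy data lying mostly along the (1, 1) direction
data = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.05), (-1.0, -0.95), (0.5, 0.4)]
pc1 = power_iteration(covariance(data))
# Reduce each 2-D point to a single coordinate along the first component
reduced = [sum(x * v for x, v in zip(row, pc1)) for row in data]
print(pc1)
```

The recovered direction is close to (1, 1)/√2, the axis of highest covariance; the paper's pipeline would feed such reduced coordinates to an RBF-kernel SVM.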
NASA Astrophysics Data System (ADS)
Olivia, G.; Santoso, A.; Prayogo, D. N.
2017-11-01
Nowadays, competition increasingly takes place between entire supply chains rather than individual firms, so a good coordination system between supply chain members is crucial. This paper focuses on the development of a coordination model between a single supplier and buyers in a supply chain. The proposed optimization model was designed to determine the optimal number of deliveries from a supplier to buyers in order to minimize the total cost over a planning horizon. Components of the total supply chain cost consist of transportation costs, handling costs of the supplier and buyers, and stock-out costs. In the proposed optimization model, the supplier can supply various types of items to retailers whose item demand patterns are probabilistic. Sensitivity analysis of the proposed model was conducted to test the effect of changes in transportation costs, handling costs and production capacity of the supplier. The results of the sensitivity analysis showed that changes in transportation costs, handling costs and production capacity significantly influence the decisions on the optimal number of deliveries of each item to the buyers.
Wu, James X; Sacks, Greg D; Dawes, Aaron J; DeUgarte, Daniel; Lee, Steven L
2017-07-01
Several studies have demonstrated the safety and short-term success of nonoperative management in children with acute, uncomplicated appendicitis. Nonoperative management spares the patients and their family the upfront cost and discomfort of surgery, but also risks recurrent appendicitis. Using decision-tree software, we evaluated the cost-effectiveness of nonoperative management versus routine laparoscopic appendectomy. Model variables were abstracted from a review of the literature, the Healthcare Cost and Utilization Project, and the Medicare Physician Fee Schedule. Model uncertainty was assessed using both one-way and probabilistic sensitivity analyses. We used a $100,000 per quality-adjusted life year (QALY) threshold for cost-effectiveness. Operative management cost $11,119 and yielded 23.56 quality-adjusted life months (QALMs). Nonoperative management cost $2277 less than operative management, but yielded 0.03 fewer QALMs. The incremental cost-effectiveness ratio of routine laparoscopic appendectomy was $910,800 per QALY gained. This greatly exceeds the $100,000/QALY threshold and was not cost-effective. One-way sensitivity analysis found that operative management would become cost-effective if the 1-year recurrence rate of acute appendicitis exceeded 39.8%. Probabilistic sensitivity analysis indicated that nonoperative management was cost-effective in 92% of simulations. Based on our model, nonoperative management is more cost-effective than routine laparoscopic appendectomy for children with acute, uncomplicated appendicitis. Cost-Effectiveness Study: Level II. Published by Elsevier Inc.
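A probabilistic sensitivity analysis of this kind can be sketched generically with Monte Carlo draws and the net monetary benefit (NMB = willingness-to-pay × effectiveness − cost). The point estimates echo the abstract, but every distribution, standard deviation, and the per-month willingness-to-pay conversion below are invented for illustration, so the resulting probability will not match the study's 92%:

```python
import random

random.seed(42)
WTP_PER_QALM = 100_000 / 12  # $100,000/QALY expressed per quality-adjusted life month

def nonoperative_preferred():
    """One Monte Carlo draw: does nonoperative management win on NMB?"""
    cost_op = random.gauss(11_119, 1_000)   # operative cost (hypothetical spread)
    qalm_op = random.gauss(23.56, 0.5)      # operative effectiveness, QALMs
    cost_non = random.gauss(11_119 - 2_277, 1_000)
    qalm_non = random.gauss(23.56 - 0.03, 0.5)
    nmb_op = WTP_PER_QALM * qalm_op - cost_op
    nmb_non = WTP_PER_QALM * qalm_non - cost_non
    return nmb_non >= nmb_op

n = 10_000
p_nonoperative = sum(nonoperative_preferred() for _ in range(n)) / n
print(p_nonoperative)
```

Repeating this at a range of willingness-to-pay thresholds yields a cost-effectiveness acceptability curve.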
A cost-effectiveness analysis of two different antimicrobial stewardship programs.
Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia
2016-01-01
There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of antimicrobial stewardship programs. A 30-day Markov model was developed to analyze the cost-effectiveness of a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil. Clinical data derived from a historical cohort that compared two different strategies of antimicrobial stewardship programs and had 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram, and a cost-effectiveness acceptability curve. The Bundled Strategy was more expensive (cost difference US$ 2119.70) but more efficient (US$ 27,549.15 vs. US$ 29,011.46). Deterministic and probabilistic sensitivity analyses suggested that critical variables did not alter the final incremental cost-effectiveness ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics would be of special interest from societal and clinical perspectives. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.
Louie, Michelle; Spencer, Jennifer; Wheeler, Stephanie; Ellis, Victoria; Toubia, Tarek; Schiff, Lauren D; Siedhoff, Matthew T; Moulder, Janelle K
2017-11-01
A better understanding of the relative risks and benefits of common treatment options for abnormal uterine bleeding (AUB) can help providers and patients to make balanced, evidence-based decisions. To provide comparative estimates of clinical outcomes after placement of levonorgestrel-releasing intrauterine system (LNG-IUS), ablation, or hysterectomy for AUB. A PubMED search was done using combinations of search terms related to abnormal uterine bleeding, LNG-IUS, hysterectomy, endometrial ablation, cost-benefit analysis, cost-effectiveness, and quality-adjusted life years. Full articles published in 2006-2016 available in English comparing at least two treatment modalities of interest among women of reproductive age with AUB were included. A decision tree was generated to compare clinical outcomes in a hypothetical cohort of 100 000 premenopausal women with nonmalignant AUB. We evaluated complications, mortality, and treatment outcomes over a 5-year period, calculated cumulative quality-adjusted life years (QALYs), and conducted probabilistic sensitivity analysis. Levonorgestrel-releasing intrauterine system had the highest number of QALYs (406 920), followed by hysterectomy (403 466), non-resectoscopic ablation (399 244), and resectoscopic ablation (395 827). Ablation had more treatment failures and complications than LNG-IUS and hysterectomy. Findings were robust in probabilistic sensitivity analysis. Levonorgestrel-releasing intrauterine system and hysterectomy outperformed endometrial ablation for treatment of AUB. © 2017 International Federation of Gynecology and Obstetrics.
Gandhoke, Gurpreet S; Pease, Matthew; Smith, Kenneth J; Sekula, Raymond F
2017-09-01
To perform a cost-minimization study comparing the supraorbital and endoscopic endonasal (EEA) approach with or without craniotomy for the resection of olfactory groove meningiomas (OGMs). We built a decision tree using probabilities of gross total resection (GTR) and cerebrospinal fluid (CSF) leak rates with the supraorbital approach versus EEA with and without additional craniotomy. The cost (not charge or reimbursement) at each "stem" of this decision tree for both surgical options was obtained from our hospital's finance department. After a base case calculation, we applied plausible ranges to all parameters and carried out multiple 1-way sensitivity analyses. Probabilistic sensitivity analyses confirmed our results. The probabilities of GTR (0.8) and CSF leak (0.2) for the supraorbital craniotomy were obtained from our series of 5 patients who underwent a supraorbital approach for the resection of an OGM. The mean tumor volume was 54.6 cm³ (range, 17-94.2 cm³). Literature-reported rates of GTR (0.6) and CSF leak (0.3) with EEA were applied to our economic analysis. Supraorbital craniotomy was the preferred strategy, with an expected value of $29,423, compared with an EEA cost of $83,838. On multiple 1-way sensitivity analyses, supraorbital craniotomy remained the preferred strategy, with a minimum cost savings of $46,000 and a maximum savings of $64,000. Probabilistic sensitivity analysis found the lowest cost difference between the 2 surgical options to be $37,431. Compared with EEA, supraorbital craniotomy provides substantial cost savings in the treatment of OGMs. Given the potential differences in effectiveness between approaches, a cost-effectiveness analysis should be undertaken. Copyright © 2017 Elsevier Inc. All rights reserved.
Wu, Bin; Ye, Ming; Chen, Huafeng; Shen, Jinfang F
2012-02-01
Adding trastuzumab to a conventional regimen of chemotherapy can improve survival in patients with human epidermal growth factor receptor 2 (HER2)-positive advanced gastric or gastroesophageal junction (GEJ) cancer, but the economic impact of this practice is unknown. The purpose of this cost-effectiveness analysis was to estimate the effects of adding trastuzumab to standard chemotherapy in patients with HER2-positive advanced gastric or GEJ cancer on health and economic outcomes in China. A Markov model was developed to simulate the clinical course of typical patients with HER2-positive advanced gastric or GEJ cancer. Five-year quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs) were estimated. Model inputs were derived from the published literature and government sources. Direct costs were estimated from the perspective of Chinese society. One-way and probabilistic sensitivity analyses were conducted. On baseline analysis, the addition of trastuzumab increased cost and QALY by $56,004.30 (year-2010 US $) and 0.18, respectively, relative to conventional chemotherapy, resulting in an ICER of $251,667.10/QALY gained. Probabilistic sensitivity analyses supported that the addition of trastuzumab was not cost-effective. Budgetary impact analysis estimated that the annual increase in fiscal expenditures would be ~$1 billion. On univariate sensitivity analysis, the median overall survival time for conventional chemotherapy was the most influential factor with respect to the robustness of the model. The findings from the present analysis suggest that the addition of trastuzumab to conventional chemotherapy might not be cost-effective in patients with HER2-positive advanced gastric or GEJ cancer. Copyright © 2012 Elsevier HS Journals, Inc. All rights reserved.
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian
2012-04-01
Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory and variance propagation, were used to characterize uncertainties associated with the risk assessment of sigma PAH8 in surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of hazard quotients of sigma PAH8 in surface waters of Taihu Lake. These distributions indicated that the confidence intervals of the hazard quotient at the 90% confidence level were 0.00018-0.89 and 0.00017-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of sigma PAH8 to aquatic organisms ranged from 0.00017 to 0.99 using interval analysis. The confidence interval was (0.0015, 0.0163) at the 90% confidence level calculated using fuzzy set theory, and (0.00016, 0.88) at the 90% confidence level based on variance propagation. These results indicated that the ecological risk of sigma PAH8 to aquatic organisms was low. Each method has its own set of advantages and limitations, being based on a different theory; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate method to assess the risk of sigma PAH8 in surface water of Taihu Lake, providing an important scientific foundation for risk management and control of organic pollutants in water.
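The difference between plain Monte Carlo sampling and Latin hypercube sampling can be sketched with a toy hazard quotient HQ = exposure / toxicity. The uniform input ranges below are invented for illustration and are not the Taihu Lake data:

```python
import random

random.seed(1)

def mcs_uniform(n):
    """Plain Monte Carlo: n independent U(0, 1) draws."""
    return [random.random() for _ in range(n)]

def lhs_uniform(n):
    """Latin hypercube: one draw per equal-probability stratum, shuffled."""
    samples = [(i + random.random()) / n for i in range(n)]
    random.shuffle(samples)
    return samples

def hq_samples(sampler, n, exp_rng=(0.01, 2.0), tox_rng=(0.5, 5.0)):
    """Hazard quotient samples for uniformly distributed exposure and toxicity."""
    exposure = [exp_rng[0] + u * (exp_rng[1] - exp_rng[0]) for u in sampler(n)]
    toxicity = [tox_rng[0] + u * (tox_rng[1] - tox_rng[0]) for u in sampler(n)]
    return [e / t for e, t in zip(exposure, toxicity)]

n = 2_000
hq_mcs = hq_samples(mcs_uniform, n)
hq_lhs = hq_samples(lhs_uniform, n)
p_exceed = sum(h > 1 for h in hq_lhs) / n  # probability that HQ exceeds 1
print(sum(hq_mcs) / n, sum(hq_lhs) / n, p_exceed)
```

Both samplers target the same distribution; LHS simply stratifies each marginal so the empirical distribution of the HQ converges with fewer samples.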
A Robust Approach to Risk Assessment Based on Species Sensitivity Distributions.
Monti, Gianna S; Filzmoser, Peter; Deutsch, Roland C
2018-05-03
The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected belonging to a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation. © 2018 Society for Risk Analysis.
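For context, the classical (non-robust) log-normal SSD inversion that such robust estimators improve upon can be sketched with the standard library alone. The toxicity values are hypothetical, and a robust variant would swap the mean and standard deviation for outlier-resistant analogues such as the median and MAD:

```python
import math
from statistics import NormalDist, mean, stdev

# Hypothetical species toxicity values (e.g. EC50s in mg/L)
toxicity = [1.2, 3.4, 0.8, 5.6, 2.1, 4.3, 1.9, 2.8]
logs = [math.log(x) for x in toxicity]
mu, sigma = mean(logs), stdev(logs)  # classical (non-robust) estimates

def hc(p):
    """Concentration hazardous to a fraction p of species (log-normal SSD)."""
    return math.exp(NormalDist(mu, sigma).inv_cdf(p))

hc5 = hc(0.05)  # concentration hazardous to 5% of the community
print(round(hc5, 3))
```

The HCp is simply the p-quantile of the fitted log-normal SSD, so any distortion of mu or sigma by outliers propagates directly into the estimated safe concentration.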
Cost-effectiveness of prucalopride in the treatment of chronic constipation in the Netherlands
Nuijten, Mark J. C.; Dubois, Dominique J.; Joseph, Alain; Annemans, Lieven
2015-01-01
Objective: To assess the cost-effectiveness of prucalopride vs. continued laxative treatment for chronic constipation in patients in the Netherlands in whom laxatives have failed to provide adequate relief. Methods: A Markov model was developed to estimate the cost-effectiveness of prucalopride in patients with chronic constipation receiving standard laxative treatment from the perspective of Dutch payers in 2011. Data sources included published prucalopride clinical trials, published Dutch price/tariff lists, and national population statistics. The model simulated the clinical and economic outcomes associated with prucalopride vs. standard treatment and had a cycle length of 1 month and a follow-up time of 1 year. Response to treatment was defined as the proportion of patients who achieved “normal bowel function”. One-way and probabilistic sensitivity analyses were conducted to test the robustness of the base case. Results: In the base case analysis, the cost of prucalopride relative to continued laxative treatment was € 9015 per quality-adjusted life-year (QALY). Extensive sensitivity analyses and scenario analyses confirmed that the base case cost-effectiveness estimate was robust. One-way sensitivity analyses showed that the model was most sensitive to the response to prucalopride; incremental cost-effectiveness ratios ranged from € 6475 to 15,380 per QALY. Probabilistic sensitivity analyses indicated that there is a greater than 80% probability that prucalopride would be cost-effective compared with continued standard treatment, assuming a willingness-to-pay threshold of € 20,000 per QALY from a Dutch societal perspective. A scenario analysis was performed for women only, which resulted in a cost-effectiveness ratio of € 7773 per QALY. Conclusion: Prucalopride was cost-effective in a Dutch patient population, as well as in a women-only subgroup, who had chronic constipation and who obtained inadequate relief from laxatives. PMID:25926794
Herring, William; Pearson, Isobel; Purser, Molly; Nakhaipour, Hamid Reza; Haiderali, Amin; Wolowacz, Sorrel; Jayasundara, Kavisha
2016-01-01
Our objective was to estimate the cost effectiveness of ofatumumab plus chlorambucil (OChl) versus chlorambucil in patients with chronic lymphocytic leukaemia for whom fludarabine-based therapies are considered inappropriate from the perspective of the publicly funded healthcare system in Canada. A semi-Markov model (3-month cycle length) used survival curves to govern progression-free survival (PFS) and overall survival (OS). Efficacy and safety data and health-state utility values were estimated from the COMPLEMENT-1 trial. Post-progression treatment patterns were based on clinical guidelines, Canadian treatment practices and published literature. Total and incremental expected lifetime costs (in Canadian dollars [$Can], year 2013 values), life-years and quality-adjusted life-years (QALYs) were computed. Uncertainty was assessed via deterministic and probabilistic sensitivity analyses. The discounted lifetime health and economic outcomes estimated by the model showed that, compared with chlorambucil, first-line treatment with OChl led to an increase in QALYs (0.41) and total costs ($Can27,866) and to an incremental cost-effectiveness ratio (ICER) of $Can68,647 per QALY gained. In deterministic sensitivity analyses, the ICER was most sensitive to the modelling time horizon and to the extrapolation of OS treatment effects beyond the trial duration. In probabilistic sensitivity analysis, the probability of cost effectiveness at a willingness-to-pay threshold of $Can100,000 per QALY gained was 59 %. Base-case results indicated that improved overall response and PFS for OChl compared with chlorambucil translated to improved quality-adjusted life expectancy. Sensitivity analysis suggested that OChl is likely to be cost effective subject to uncertainty associated with the presence of any long-term OS benefit and the model time horizon.
Probabilistic structural analysis of aerospace components using NESSUS
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.
1988-01-01
Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.
Siribumrungwong, Boonying; Noorit, Pinit; Wilasrusmee, Chumpon; Leelahavarong, Pattara; Thakkinstian, Ammarin; Teerawattananon, Yot
2016-09-01
To conduct economic evaluations of radiofrequency ablation, ultrasound-guided foam sclerotherapy and surgery for great saphenous vein ablation. A cost-utility and cohort analysis from a societal perspective was performed to estimate incremental cost-effectiveness ratios. Transition probabilities were drawn from a meta-analysis. Direct medical, direct non-medical, and indirect costs, and utilities, were taken from standard Thai costings and the cohort. Probabilistic sensitivity analysis was performed to assess parameter uncertainties. Seventy-seven patients (31 radiofrequency ablation, 19 ultrasound-guided foam sclerotherapy, and 27 surgeries) were enrolled from October 2011 to February 2013. Compared with surgery, radiofrequency ablation cost 12,935 and 20,872 Baht more for outpatient and inpatient care, respectively, whereas ultrasound-guided foam sclerotherapy cost 6,159 Baht less and 1,558 Baht more for outpatient and inpatient care, respectively. At one year, radiofrequency ablation yielded slightly fewer quality-adjusted life-years, whereas ultrasound-guided foam sclerotherapy yielded an additional 0.025 quality-adjusted life-years. Because it cost less and yielded more quality-adjusted life-years than the compared alternatives, outpatient ultrasound-guided foam sclerotherapy was the dominant option. Probabilistic sensitivity analysis showed that, at the Thai ceiling threshold of 160,000 Baht per quality-adjusted life-year gained, ultrasound-guided foam sclerotherapy had a 0.71 probability of being cost-effective. Based on one-year results, ultrasound-guided foam sclerotherapy seems to be cost-effective compared to surgery for treating great saphenous vein reflux in Thailand. © The Author(s) 2015.
Economic evaluation of floseal compared to nasal packing for the management of anterior epistaxis.
Le, Andre; Thavorn, Kednapa; Lasso, Andrea; Kilty, Shaun J
2018-01-04
To evaluate the cost-effectiveness of Floseal, a topically applied hemostatic agent, and nasal packing for the management of epistaxis in Canada. Outcomes research, a cost-utility analysis. We developed a Markov model to compare the costs and health outcomes of Floseal with nasal packing over a lifetime horizon from the perspective of a publicly funded healthcare system. A cycle length of 1 year was used. Efficacy of Floseal and packing was sought from the published literature. Unit costs were gathered from a hospital case costing system, whereas physician fees were extracted from the Ontario Schedule of Benefits for Physician Services. Results were expressed as an incremental cost per quality-adjusted life year (QALY) gained. A series of one-way sensitivity and probabilistic sensitivity analyses were performed. From the perspective of a publicly funded healthcare system, the Floseal treatment strategy was associated with higher costs ($2,067) and greater QALYs (0.27) than nasal packing. Our findings were highly sensitive to discount rates, the cost of Floseal, and the cost of nasal packing. The probabilistic sensitivity analysis suggested that the probability that Floseal treatment is cost-effective reached 99% if the willingness-to-pay threshold was greater than $120,000 per QALY gained. Prior studies have demonstrated Floseal to be an effective treatment for anterior epistaxis. In the Canadian healthcare system, Floseal treatment appears to be a cost-effective treatment option compared to nasal packing for anterior epistaxis. 2c Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.
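A Markov cost-utility model of this kind accumulates discounted costs and QALYs over yearly cycles. A minimal two-state sketch, with entirely hypothetical transition probabilities, costs, and utilities (not the study's inputs):

```python
# Two-state Markov cohort sketch ("well" vs. "rebleed", treated as absorbing),
# 1-year cycles with discounting. All transition probabilities, costs, and
# utilities below are hypothetical placeholders, not the study's inputs.

def markov_trace(p_rebleed, cost, utility, years=40, disc=0.05):
    well = 1.0                                # cohort starts in "well"
    total_cost = total_qaly = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + disc) ** t           # discount factor for cycle t
        frac_rebleed = 1.0 - well
        total_cost += d * (well * cost["well"] + frac_rebleed * cost["rebleed"])
        total_qaly += d * (well * utility["well"] + frac_rebleed * utility["rebleed"])
        well *= 1.0 - p_rebleed               # fraction avoiding rebleeding

    return total_cost, total_qaly

utilities = {"well": 0.9, "rebleed": 0.7}
hemostatic = markov_trace(0.05, {"well": 120, "rebleed": 900}, utilities)
packing = markov_trace(0.15, {"well": 80, "rebleed": 900}, utilities)
d_cost = hemostatic[0] - packing[0]
d_qaly = hemostatic[1] - packing[1]
print(d_cost, d_qaly)  # with these toy inputs, the lower-rebleed arm dominates
```

A real model would add more states (e.g. death) and cycle corrections; this sketch only shows the bookkeeping.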
NASA Technical Reports Server (NTRS)
Gaebler, John A.; Tolson, Robert H.
2010-01-01
In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding which aims to reduce the computational cost of assessing the failure probability. Next a variance-based sensitivity analysis was studied for the ability to identify which input variable uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.
Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector
NASA Astrophysics Data System (ADS)
Lenel, U. R.; Davies, D. G. S.; Moore, M. A.
An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is being used to examine the sensitivity of the outcome to uncertainties in input quantities, in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, Jeff; Ayala, Samuel
2000-01-01
NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.
Varier, Raghu U; Biltaji, Eman; Smith, Kenneth J; Roberts, Mark S; Kyle Jensen, M; LaFleur, Joanne; Nelson, Richard E
2015-04-01
Clostridium difficile infection (CDI) places a high burden on the US healthcare system. Recurrent CDI (RCDI) occurs frequently. Recently proposed guidelines from the American College of Gastroenterology (ACG) and the American Gastroenterology Association (AGA) include fecal microbiota transplantation (FMT) as a therapeutic option for RCDI. The purpose of this study was to estimate the cost-effectiveness of FMT compared with vancomycin for the treatment of RCDI in adults, specifically following guidelines proposed by the ACG and AGA. We constructed a decision-analytic computer simulation using inputs from the published literature to compare the standard approach using tapered vancomycin to FMT for RCDI from the third-party payer perspective. Our effectiveness measure was quality-adjusted life years (QALYs). Because simulated patients were followed for 90 days, discounting was not necessary. One-way and probabilistic sensitivity analyses were performed. Base-case analysis showed that FMT was less costly ($1,669 vs $3,788) and more effective (0.242 QALYs vs 0.235 QALYs) than vancomycin for RCDI. One-way sensitivity analyses showed that FMT was the dominant strategy (both less expensive and more effective) if cure rates for FMT and vancomycin were ≥70% and <91%, respectively, and if the cost of FMT was <$3,206. Probabilistic sensitivity analysis, varying all parameters simultaneously, showed that FMT was the dominant strategy over 10,000 second-order Monte Carlo simulations. Our results suggest that FMT may be a cost-saving intervention in managing RCDI. Implementation of FMT for RCDI may help decrease the economic burden to the healthcare system.
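The "dominant strategy" finding in a probabilistic sensitivity analysis is the fraction of second-order Monte Carlo draws in which one option is both cheaper and more effective. A sketch with hypothetical distributions loosely centred on the base-case figures above (the spreads and normality are assumptions for illustration):

```python
import random

random.seed(1)
N = 10_000

# Hypothetical second-order draws around the reported base case.
dominant = 0
for _ in range(N):
    cost_fmt   = random.gauss(1669, 300)    # FMT cost, $
    cost_vanco = random.gauss(3788, 500)    # vancomycin cost, $
    qaly_fmt   = random.gauss(0.242, 0.004)
    qaly_vanco = random.gauss(0.235, 0.004)
    # "Dominant" means less costly AND more effective.
    if cost_fmt < cost_vanco and qaly_fmt > qaly_vanco:
        dominant += 1

print(dominant / N)  # share of simulations in which FMT dominates
```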
Cost–effectiveness analysis of quadrivalent influenza vaccine in Spain
García, Amos; Ortiz de Lejarazu, Raúl; Reina, Jordi; Callejo, Daniel; Cuervo, Jesús; Morano Larragueta, Raúl
2016-01-01
Influenza has a major impact on healthcare systems and society, but can be prevented using vaccination. The World Health Organization (WHO) currently recommends that influenza vaccines should include at least two virus A and one virus B lineage (trivalent vaccine; TIV). A new quadrivalent vaccine (QIV), which includes an additional B virus strain, received regulatory approval and is now recommended by several countries. The present study estimates the cost-effectiveness of replacing TIVs with QIV for risk groups and the elderly population in Spain. A static, lifetime, multi-cohort Markov model with a one-year cycle time was adapted to assess the costs and health outcomes associated with a switch from TIV to QIV. The model followed a cohort vaccinated each year according to health authority recommendations, for the duration of their lives. National epidemiological data allowed the determination of whether the B strain included in TIVs matched the circulating one. A societal perspective was considered, costs and outcomes were discounted at 3%, and one-way and probabilistic sensitivity analyses were performed. Compared to TIVs, QIV prevented more influenza cases and influenza-related complications and deaths during periods of B-mismatch strains in the TIV. The incremental cost-effectiveness ratio (ICER) was 8,748€/quality-adjusted life year (QALY). One-way sensitivity analysis showed mismatch with the B lineage included in the TIV was the main driver for the ICER. Probabilistic sensitivity analysis shows the ICER below 30,000€/QALY in 96% of simulations. Replacing TIVs with QIV in Spain could improve influenza prevention by avoiding B virus mismatch and provide a cost-effective healthcare intervention. PMID:27184622
Improving the quality of pressure ulcer care with prevention: a cost-effectiveness analysis.
Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Sullivan, Patrick W
2011-04-01
In October 2008, the Centers for Medicare and Medicaid Services discontinued reimbursement for hospital-acquired pressure ulcers (HAPUs), thus placing stress on hospitals to prevent incidence of this costly condition. To evaluate whether prevention methods are cost-effective compared with standard care in the management of HAPUs. A semi-Markov model simulated the admission of patients to an acute care hospital from the time of admission through 1 year using the societal perspective. The model simulated health states that could potentially lead to an HAPU through either the practice of "prevention" or "standard care." Univariate sensitivity analyses, threshold analyses, and Bayesian multivariate probabilistic sensitivity analysis using 10,000 Monte Carlo simulations were conducted. Cost per quality-adjusted life-years (QALYs) gained for the prevention of HAPUs. Prevention was cost saving and resulted in greater expected effectiveness compared with the standard care approach per hospitalization. The expected cost of prevention was $7276.35, and the expected effectiveness was 11.241 QALYs. The expected cost for standard care was $10,053.95, and the expected effectiveness was 9.342 QALYs. The multivariate probabilistic sensitivity analysis showed that prevention resulted in cost savings in 99.99% of the simulations. The threshold cost of prevention was $821.53 per day per person, whereas the cost of prevention was estimated to be $54.66 per day per person. This study suggests that it is more cost effective to pay for prevention of HAPUs compared with standard care. Continuous preventive care of HAPUs in acutely ill patients could potentially reduce incidence and prevalence, as well as lead to lower expenditures.
Ariza, R; Van Walsem, A; Canal, C; Roldán, C; Betegón, L; Oyagüez, I; Janssen, K
2014-07-01
To compare the cost of treating rheumatoid arthritis patients who have failed an initial treatment with methotrexate, with subcutaneous abatacept versus other first-line biologic disease-modifying antirheumatic drugs. Subcutaneous abatacept was considered comparable to intravenous abatacept, adalimumab, certolizumab pegol, etanercept, golimumab, infliximab and tocilizumab, based on indirect comparison using mixed treatment analysis. A cost-minimization analysis was therefore considered appropriate. The Spanish Health System perspective and a 3-year time horizon were selected. Pharmaceutical and administration costs (Euros 2013) of all available first-line biological disease-modifying antirheumatic drugs were considered. Administration costs were obtained from a local costs database. Patients were assumed to weigh 70 kg. A 3% annual discount rate was applied. Deterministic and probabilistic sensitivity analyses were performed. In the base case, subcutaneous abatacept proved to be less costly than all other biologic antirheumatic drugs (savings ranging from Euros 831.42 versus infliximab to Euros 9,741.69 versus tocilizumab). Subcutaneous abatacept was associated with a cost of Euros 10,760.41 per patient during the first year of treatment and Euros 10,261.29 in subsequent years. The total 3-year cost of subcutaneous abatacept was Euros 29,953.89 per patient. Sensitivity analyses proved the model to be robust. Subcutaneous abatacept remained cost-saving in 100% of probabilistic sensitivity analysis simulations versus adalimumab, certolizumab, etanercept and golimumab, in more than 99.6% versus intravenous abatacept and tocilizumab, and in 62.3% versus infliximab. Treatment with subcutaneous abatacept is cost-saving versus intravenous abatacept, adalimumab, certolizumab, etanercept, golimumab, infliximab and tocilizumab in the management of rheumatoid arthritis patients initiating treatment with biological antirheumatic drugs.
Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Zhang, Xinke; Hay, Joel W; Niu, Xiaoli
2015-01-01
The aim of the study was to compare the cost effectiveness of fingolimod, teriflunomide, dimethyl fumarate, and intramuscular (IM) interferon (IFN)-β(1a) as first-line therapies in the treatment of patients with relapsing-remitting multiple sclerosis (RRMS). A Markov model was developed to evaluate the cost effectiveness of disease-modifying drugs (DMDs) from a US societal perspective. The time horizon in the base case was 5 years. The primary outcome was incremental net monetary benefit (INMB), and the secondary outcome was incremental cost-effectiveness ratio (ICER). The base case INMB willingness-to-pay (WTP) threshold was assumed to be US$150,000 per quality-adjusted life year (QALY), and the costs were in 2012 US dollars. One-way sensitivity analyses and probabilistic sensitivity analysis were conducted to test the robustness of the model results. Dimethyl fumarate dominated all other therapies over the range of WTPs, from US$0 to US$180,000. Compared with IM IFN-β(1a), at a WTP of US$150,000, INMBs were estimated at US$36,567, US$49,780, and US$80,611 for fingolimod, teriflunomide, and dimethyl fumarate, respectively. The ICER of fingolimod versus teriflunomide was US$3,201,672. One-way sensitivity analyses demonstrated the model results were sensitive to the acquisition costs of DMDs and the time horizon, but in most scenarios, cost-effectiveness rankings remained stable. Probabilistic sensitivity analysis showed that for more than 90% of the simulations, dimethyl fumarate was the optimal therapy across all WTP values. The three oral therapies were favored in the cost-effectiveness analysis. Of the four DMDs, dimethyl fumarate was a dominant therapy to manage RRMS. Apart from dimethyl fumarate, teriflunomide was the most cost-effective therapy compared with IM IFN-β(1a), with an ICER of US$7,115.
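The incremental net monetary benefit used as the primary outcome above has a simple closed form, INMB = WTP × ΔQALY − Δcost. A sketch (the ΔQALY/Δcost pair below is a hypothetical stand-in chosen to reproduce one reported INMB; the abstract does not report the underlying pair):

```python
# INMB = WTP * incremental QALYs - incremental cost. The (d_qaly, d_cost)
# arguments below are hypothetical stand-ins, not the study's estimates.

def inmb(wtp, d_qaly, d_cost):
    """Incremental net monetary benefit at willingness-to-pay `wtp`."""
    return wtp * d_qaly - d_cost

print(inmb(150_000, 0.8, 39_389))  # 80611.0 -- matches the INMB reported
                                   # for dimethyl fumarate
```

A positive INMB at the chosen threshold is equivalent to the ICER falling below that threshold (for a more costly, more effective option).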
Probabilistic estimates of drought impacts on agricultural production
NASA Astrophysics Data System (ADS)
Madadgar, Shahrbanou; AghaKouchak, Amir; Farahmand, Alireza; Davis, Steven J.
2017-08-01
Increases in the severity and frequency of drought in a warming climate may negatively impact agricultural production and food security. Unlike previous studies that have estimated agricultural impacts of climate conditions using single-crop yield distributions, we develop a multivariate probabilistic model that uses projected climatic conditions (e.g., precipitation amount or soil moisture) throughout a growing season to estimate the probability distribution of crop yields. We demonstrate the model with an analysis of the historical period 1980-2012, including the Millennium Drought in Australia (2001-2009). We find that precipitation and soil moisture deficits in dry growing seasons reduced the average annual yield of the five largest crops in Australia (wheat, broad beans, canola, lupine, and barley) by 25-45% relative to wet growing seasons. Our model can thus produce region- and crop-specific agricultural sensitivities to climate conditions and variability. Probabilistic estimates of yield may help decision-makers in government and business to quantitatively assess the vulnerability of agriculture to climate variations.
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks among these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
Palbociclib in hormone receptor positive advanced breast cancer: A cost-utility analysis.
Raphael, J; Helou, J; Pritchard, K I; Naimark, D M
2017-11-01
The addition of palbociclib to letrozole improves progression-free survival in the first-line treatment of hormone receptor positive advanced breast cancer (ABC). This study assesses the cost-utility of palbociclib from the Canadian healthcare payer perspective. A probabilistic discrete event simulation (DES) model was developed and parameterised with data from the PALOMA 1 and 2 trials and other sources. The incremental cost per quality-adjusted life-month (QALM) gained for palbociclib was calculated. A time horizon of 15 years was used in the base case, with costs and effectiveness discounted at 5% annually. Time-to-progression and time-to-death were derived from Weibull and exponential distributions, respectively. Expected costs were based on Ontario fees and other sources. Probabilistic sensitivity analyses were conducted to account for parameter uncertainty. Compared to letrozole, the addition of palbociclib provided an additional 14.7 QALM at an incremental cost of $161,508. The resulting incremental cost-effectiveness ratio was $10,999/QALM gained. Assuming a willingness-to-pay (WTP) of $4,167/QALM, the probability that palbociclib is cost-effective was 0%. Cost-effectiveness acceptability curves derived from a probabilistic sensitivity analysis showed that at a WTP of $11,000/QALM gained, the probability that palbociclib is cost-effective was 50%. The addition of palbociclib to letrozole is unlikely to be cost-effective for the treatment of ABC from a Canadian healthcare perspective at its current price. While ABC patients derive a meaningful clinical benefit from palbociclib, consideration should be given to increasing the WTP threshold and reducing the drug price to render this strategy more affordable. Copyright © 2017 Elsevier Ltd. All rights reserved.
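A discrete event simulation of this type samples patient-level event times from fitted distributions. A minimal sketch of sampling time-to-progression from a Weibull and time-to-death from an exponential via inverse-transform sampling; all parameters and the utility weight below are hypothetical placeholders, not values from the PALOMA trials:

```python
import math
import random

random.seed(2)

def weibull_sample(shape, scale):
    # Inverse-transform sampling: T = scale * (-ln U)**(1/shape), U ~ (0, 1]
    u = 1.0 - random.random()
    return scale * (-math.log(u)) ** (1.0 / shape)

def exponential_sample(rate):
    u = 1.0 - random.random()
    return -math.log(u) / rate

n, horizon = 10_000, 15 * 12              # 15-year horizon in months
qalm_total = 0.0
for _ in range(n):
    ttp = weibull_sample(1.4, 30.0)       # months to progression (hypothetical)
    ttd = exponential_sample(1 / 40)      # months to death (hypothetical)
    # Credit pre-progression time at a hypothetical utility weight of 0.8,
    # truncated at death or the model horizon.
    qalm_total += 0.8 * min(ttp, ttd, horizon)

print(qalm_total / n)  # mean quality-adjusted life-months per simulated patient
```

A full model would also accrue post-progression time, costs, and discounting along each simulated patient trajectory.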
Probabilistic assessment of roadway departure risk in a curve
NASA Astrophysics Data System (ADS)
Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.
2011-10-01
Roadway departure while cornering accounts for a major share of car accidents and casualties in France. Even though a drastic policy against overspeeding has contributed to reducing accidents, other factors obviously exist. This article presents the construction of a probabilistic strategy for roadway departure risk assessment. A specific vehicle dynamics model is developed in which some parameters are modelled by random variables. These parameters are selected via a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Structural reliability methods are then employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement a warning system that alerts the driver to dangerous situations.
Acevedo, Joseph R; Fero, Katherine E; Wilson, Bayard; Sacco, Assuntina G; Mell, Loren K; Coffey, Charles S; Murphy, James D
2016-11-10
Purpose Recently, a large randomized trial found a survival advantage among patients who received elective neck dissection in conjunction with primary surgery for clinically node-negative oral cavity cancer compared with those receiving primary surgery alone. However, elective neck dissection comes with greater upfront cost and patient morbidity. We present a cost-effectiveness analysis of elective neck dissection for the initial surgical management of early-stage oral cavity cancer. Methods We constructed a Markov model to simulate primary, adjuvant, and salvage therapy; disease recurrence; and survival in patients with T1/T2 clinically node-negative oral cavity squamous cell carcinoma. Transition probabilities were derived from clinical trial data; costs (in 2015 US dollars) and health utilities were estimated from the literature. Incremental cost-effectiveness ratios, expressed as dollars per quality-adjusted life-year (QALY), were calculated, with incremental cost-effectiveness ratios less than $100,000/QALY considered cost-effective. We conducted one-way and probabilistic sensitivity analyses to examine model uncertainty. Results Our base-case model found that over a lifetime the addition of elective neck dissection to primary surgery reduced overall costs by $6,000 and improved effectiveness by 0.42 QALYs compared with primary surgery alone. The decrease in overall cost despite the added neck dissection was a result of less use of salvage therapy. On one-way sensitivity analysis, the model was most sensitive to assumptions about disease recurrence, survival, and the health utility reduction from a neck dissection. Probabilistic sensitivity analysis found that treatment with elective neck dissection was cost-effective 76% of the time at a willingness-to-pay threshold of $100,000/QALY.
Conclusion Our study found that the addition of elective neck dissection reduces costs and improves health outcomes, making this a cost-effective treatment strategy for patients with early-stage oral cavity cancer.
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.
Shen, Nicole T; Leff, Jared A; Schneider, Yecheskel; Crawford, Carl V; Maw, Anna; Bosworth, Brian; Simon, Matthew S
2017-01-01
Systematic reviews with meta-analyses and meta-regression suggest that timely probiotic use can prevent Clostridium difficile infection (CDI) in hospitalized adults receiving antibiotics, but the cost effectiveness is unknown. We sought to evaluate the cost effectiveness of probiotic use for prevention of CDI versus no probiotic use in the United States. We programmed a decision analytic model using published literature and national databases with a 1-year time horizon. The base case was modeled as a hypothetical cohort of hospitalized adults (mean age 68) receiving antibiotics with and without concurrent probiotic administration. Projected outcomes included quality-adjusted life-years (QALYs), costs (2013 US dollars), incremental cost-effectiveness ratios (ICERs; $/QALY), and cost per infection avoided. One-way, two-way, and probabilistic sensitivity analyses were conducted, and scenarios of different age cohorts were considered. ICERs less than $100,000 per QALY were considered cost-effective. Probiotic use dominated (more effective and less costly) no probiotic use. Results were sensitive to probiotic efficacy (relative risk <0.73), the baseline risk of CDI (>1.6%), the risk of probiotic-associated bacteremia/fungemia (<0.26%), probiotic cost (<$130), and age (>65). In probabilistic sensitivity analysis, at a willingness-to-pay threshold of $100,000/QALY, probiotics were the optimal strategy in 69.4% of simulations. Our findings suggest that probiotic use may be a cost-effective strategy to prevent CDI in hospitalized adults receiving antibiotics aged 65 or older or when the baseline risk of CDI exceeds 1.6%.
Carlson, Lucas C; Slobogean, Gerard P; Pollak, Andrew N
2012-01-01
In an effort to sustainably strengthen orthopaedic trauma care in Haiti, a 2-year Orthopaedic Trauma Care Specialist (OTCS) program for Haitian physicians has been developed. The program will provide focused training in orthopaedic trauma surgery and fracture care utilizing a train-the-trainer approach. The purpose of this analysis was to calculate the cost-effectiveness of the program relative to its potential to decrease disability in the Haitian population. Using established methodology originally outlined in the World Health Organization's Global Burden of Disease project, a cost-effectiveness analysis was performed for the OTCS program in Haiti. Costs and disability-adjusted life-years (DALYs) averted were estimated per fellow trained in the OTCS program by using a 20-year career time horizon. Probabilistic sensitivity analysis was used to simultaneously test the joint uncertainty of the cost and averted DALY estimates. A willingness-to-pay threshold of $1200 per DALY averted, equal to the gross domestic product per capita in Haiti, was selected on the basis of World Health Organization's definition of highly cost-effective health interventions. The OTCS program results in an incremental cost of $1,542,544 ± $109,134 and 12,213 ± 2,983 DALYs averted per fellow trained. The cost-effectiveness ratio of $133.97 ± $34.71 per DALY averted is well below the threshold of $1200 per DALY averted. Furthermore, sensitivity analysis suggests that implementing the OTCS program is the economically preferred strategy with more than 95% probability at a willingness-to-pay threshold of $200 per DALY averted and across the entire range of potential variable inputs. The current economic analysis suggests the OTCS program to be a highly cost-effective intervention. Probabilistic sensitivity analysis demonstrates that the conclusions remain stable even when considering the joint uncertainty of the cost and DALY estimates. 
Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
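The cost-per-DALY arithmetic behind the record above can be sketched from its reported point estimates; note that the published $133.97/DALY figure comes from the probabilistic analysis (a mean over simulations), so the simple ratio of point estimates below differs slightly.

```python
# Sketch of the cost-effectiveness arithmetic reported above, using
# the record's point estimates.
incremental_cost = 1_542_544   # USD per fellow trained
dalys_averted = 12_213         # DALYs averted per fellow trained

cost_per_daly = incremental_cost / dalys_averted   # about $126/DALY

# Compare against the WHO-derived willingness-to-pay threshold of
# $1,200 per DALY averted (Haiti's GDP per capita).
wtp_threshold = 1_200
assert cost_per_daly < wtp_threshold
```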
A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Etingov, Pavel V.; Ren, Huiying
2016-07-18
This paper describes a framework of incorporating smart sampling techniques in a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps to reflect the impact of uncertainties caused by variable generation and load on potential violations of transmission limits.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
Learning to Look: Probabilistic Variation and Noise Guide Infants' Eye Movements
ERIC Educational Resources Information Center
Tummeltshammer, Kristen Swan; Kirkham, Natasha Z.
2013-01-01
Young infants have demonstrated a remarkable sensitivity to probabilistic relations among visual features (Fiser & Aslin, 2002; Kirkham et al., 2002). Previous research has raised important questions regarding the usefulness of statistical learning in an environment filled with variability and noise, such as an infant's natural world. In…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...
Fall 2014 SEI Research Review Probabilistic Analysis of Time Sensitive Systems
2014-10-28
Osmosis SMC Tool: Osmosis is a tool for Statistical Model Checking (SMC) with Semantic Importance Sampling. • The input model is written in a subset of C; ASSERT() statements in the model indicate conditions that must hold. • Input probability distributions are defined by the user. • Osmosis returns results based on either a target relative error or a set number of simulations. (Osmosis Main Algorithm; http://dreal.cs.cmu.edu/)
Kunz, Wolfgang G; Hunink, M G Myriam; Sommer, Wieland H; Beyer, Sebastian E; Meinel, Felix G; Dorn, Franziska; Wirth, Stefan; Reiser, Maximilian F; Ertl-Wagner, Birgit; Thierfelder, Kolja M
2016-11-01
Endovascular therapy in addition to standard care (EVT+SC) has been demonstrated to be more effective than SC in acute ischemic large vessel occlusion stroke. Our aim was to determine the cost-effectiveness of EVT+SC depending on patients' initial National Institutes of Health Stroke Scale (NIHSS) score, time from symptom onset, Alberta Stroke Program Early CT Score (ASPECTS), and occlusion location. A decision model based on Markov simulations estimated lifetime costs and quality-adjusted life years (QALYs) associated with both strategies applied in a US setting. Model input parameters were obtained from the literature, including recently pooled outcome data of 5 randomized controlled trials (ESCAPE [Endovascular Treatment for Small Core and Proximal Occlusion Ischemic Stroke], EXTEND-IA [Extending the Time for Thrombolysis in Emergency Neurological Deficits-Intra-Arterial], MR CLEAN [Multicenter Randomized Clinical Trial of Endovascular Treatment for Acute Ischemic Stroke in the Netherlands], REVASCAT [Randomized Trial of Revascularization With Solitaire FR Device Versus Best Medical Therapy in the Treatment of Acute Stroke Due to Anterior Circulation Large Vessel Occlusion Presenting Within 8 Hours of Symptom Onset], and SWIFT PRIME [Solitaire With the Intention for Thrombectomy as Primary Endovascular Treatment]). Probabilistic sensitivity analysis was performed to estimate uncertainty of the model results. Net monetary benefits, incremental costs, incremental effectiveness, and incremental cost-effectiveness ratios were derived from the probabilistic sensitivity analysis. The willingness-to-pay was set to $50 000/QALY. Overall, EVT+SC was cost-effective compared with SC (incremental cost: $4938, incremental effectiveness: 1.59 QALYs, and incremental cost-effectiveness ratio: $3110/QALY) in 100% of simulations. In all patient subgroups, EVT+SC led to gained QALYs (range: 0.47-2.12), and mean incremental cost-effectiveness ratios were considered cost-effective. 
However, subgroups with ASPECTS ≤5 or with M2 occlusions showed considerably higher incremental cost-effectiveness ratios ($14,273/QALY and $28,812/QALY, respectively) and only reached suboptimal acceptability in the probabilistic sensitivity analysis (75.5% and 59.4%, respectively). All other subgroups had acceptability rates of 90% to 100%. EVT+SC is cost-effective in most subgroups. In patients with ASPECTS ≤5 or with M2 occlusions, cost-effectiveness remains uncertain based on current data. © 2016 American Heart Association, Inc.
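A minimal sketch of the incremental cost-effectiveness ratio (ICER) comparison used throughout these records, using the pooled base-case values reported above for EVT+SC versus SC; the function name is illustrative.

```python
# Minimal ICER sketch: incremental cost divided by incremental
# effectiveness, compared against a willingness-to-pay threshold.
def icer(delta_cost, delta_qalys):
    """Cost per quality-adjusted life year (QALY) gained."""
    return delta_cost / delta_qalys

# Base-case values reported above for EVT+SC versus SC.
ratio = icer(delta_cost=4938, delta_qalys=1.59)   # about $3,106/QALY

# The record's willingness-to-pay threshold is $50,000/QALY.
assert ratio < 50_000
```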
Zhao, Yueyuan; Zhang, Xuefeng; Zhu, Fengcai; Jin, Hui; Wang, Bei
2016-08-02
Objective To estimate the cost-effectiveness of hepatitis E vaccination among pregnant women in epidemic regions. Methods A decision tree model was constructed to evaluate the cost-effectiveness of 3 hepatitis E virus vaccination strategies from a societal perspective. The model parameters were estimated on the basis of published studies and experts' experience. Sensitivity analysis was used to evaluate the uncertainties of the model. Results Vaccination was more economically effective on the basis of the incremental cost-effectiveness ratio (ICER < 3 times China's per capita gross domestic product per quality-adjusted life year [QALY]); moreover, screening and vaccination had higher QALYs and lower costs compared with universal vaccination. No parameters significantly impacted the ICER in one-way sensitivity analysis, and probabilistic sensitivity analysis also showed screening and vaccination to be the dominant strategy. Conclusion Screening and vaccination is the most economical strategy for pregnant women in epidemic regions; however, further studies are necessary to confirm the efficacy and safety of the hepatitis E vaccines.
Budget impact analysis of trastuzumab in early breast cancer: a hospital district perspective.
Purmonen, Timo T; Auvinen, Päivi K; Martikainen, Janne A
2010-04-01
Adjuvant trastuzumab is widely used in HER2-positive (HER2+) early breast cancer, and despite its cost-effectiveness, it causes substantial costs for health care. The purpose of the study was to develop a tool for estimating the budget impact of new cancer treatments. With this tool, we were able to estimate the budget impact of adjuvant trastuzumab, as well as the probability of staying within a given budget constraint. The resulting model-based evaluation tool was used to explore the budget impact of trastuzumab in early breast cancer in a single Finnish hospital district with 250,000 inhabitants. The model took into account the number of patients, HER2+ prevalence, length and cost of treatment, and the effectiveness of the therapy. Probabilistic sensitivity analysis and alternative case scenarios were performed to ensure the robustness of the results. Introduction of adjuvant trastuzumab caused substantial costs for a relatively small hospital district: in base-case analysis the 4-year net budget impact was €1.3 million. The trastuzumab acquisition costs were partially offset by the reduction in costs associated with the treatment of cancer recurrence and metastatic disease. Budget impact analyses provide important information about the overall economic impact of new treatments, and thus offer complementary information to cost-effectiveness analyses. Inclusion of treatment outcomes and probabilistic sensitivity analysis provides more realistic estimates of the net budget impact. The length of trastuzumab treatment has a strong effect on the budget impact.
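The net budget impact logic described above (acquisition costs partially offset by avoided recurrence costs) can be sketched as follows. Every input is an illustrative placeholder, not one of the study's values; they are round numbers that happen to reproduce the reported 4-year total of 1.3 million euro.

```python
# Hypothetical sketch of a net budget impact calculation of the kind
# described above. All inputs are illustrative placeholders, not the
# study's values.
n_patients = 40                     # eligible HER2+ patients over 4 years (assumed)
acquisition_per_patient = 40_000    # trastuzumab cost per patient, EUR (assumed)
recurrences_avoided = 6             # assumed
cost_per_recurrence = 50_000        # avoided cost per recurrence, EUR (assumed)

acquisition = n_patients * acquisition_per_patient
offsets = recurrences_avoided * cost_per_recurrence
net_budget_impact = acquisition - offsets   # EUR over 4 years
```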
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
Formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, including probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural, and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a mathematical ability test as having high mathematical ability. The subjects were given probability tasks covering sample space, probability of an event, and probability comparison. The data analysis consisted of categorization, reduction, interpretation, and conclusion; credibility of the data was established through time triangulation. The results indicated that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers formulate probability learning goals for elementary school students, and teachers could take gender differences into account when teaching probability.
Probabilistic failure analysis of bone using a finite element model of mineral-collagen composites.
Dong, X Neil; Guda, Teja; Millwater, Harry R; Wang, Xiaodu
2009-02-09
Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that the microdamage progression would be along the plane of the initial defect when the debonding at mineral-collagen interfaces was either absent or limited in the vicinity of the defect. In this case, the formation of a linear microcrack would be facilitated. However, the microdamage progression would be scattered away from the initial defect plane if interfacial debonding takes place at a large scale. This would suggest the possible formation of diffuse damage. In addition to interfacial debonding, the sensitivity analyses indicated that the microdamage progression was also dependent on the other material and ultrastructural properties of bone. The intensity of stress concentration accompanied with microdamage progression was more sensitive to the elastic modulus of the mineral phase and the nonlinearity of the collagen phase, whereas the scattering of failure location was largely dependent on the mineral to collagen ratio and the nonlinearity of the collagen phase. The findings of this study may help understanding the post-yield behavior of bone at the ultrastructural level and shed light on the underlying mechanism of bone fractures.
Cost-utility analysis of botulinum toxin type A products for the treatment of cervical dystonia.
Kazerooni, Rashid; Broadhead, Christine
2015-02-15
A cost-utility analysis of botulinum toxin type A products for the treatment of cervical dystonia (CD) was conducted. A cost-utility analysis of botulinum toxin type A products was conducted from the U.S. government perspective using a decision-analysis model with a one-year time horizon. Probabilities of the model were taken from several studies using the three botulinum type A products approved by the Food and Drug Administration for the treatment of CD: onabotulinumtoxinA (Botox), abobotulinumtoxinA (Dysport), and incobotulinumtoxinA (Xeomin). The main outcome measurement was successful treatment response with botulinum toxin type A, measured in quality-adjusted life years (QALYs). Response was defined as a patient who experienced improvement of CD symptoms without a severe adverse event. Probabilistic sensitivity analysis was conducted to test robustness of the base-case results. All three botulinum toxin type A agents were cost-effective at a willingness-to-pay threshold of $100,000 per QALY. Xeomin was the most cost-effective with a cost-effectiveness ratio of $27,548 per QALY. Xeomin was dominant over the alternative agents with equivalent efficacy outcomes and lower costs. Dysport had the second lowest cost-effectiveness ratio ($36,678), followed by Botox ($49,337). The probabilistic sensitivity analysis supported the results of the base-case analysis. Dysport was associated with the lowest wastage (2.2%), followed by Xeomin (10%) and Botox (22.9%). A cost-utility analysis found that Xeomin was the more cost-effective botulinum toxin type A product compared with Botox and Dysport for the treatment of CD. Wastage associated with the respective products may have a large effect on the cost-effectiveness of the agents. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rucker, D.F.
2000-09-01
This report presents a probabilistic safety assessment of radioactive doses resulting from accident scenarios, complementing the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends that both assessments be conducted to ensure that "an adequate level of safety has been achieved and that no major contributors to risk are overlooked" (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The SAR calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDFs) sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose.
Statistical information was then derived from the batch of 10,000 iterations, including the 5%, 50%, and 95% dose likelihoods and the sensitivity of the calculated doses to each assumption. As one would intuitively expect, the doses from the probabilistic assessment for most scenarios were found to be much lower than those from the deterministic assessment. The lower doses of the probabilistic assessment can be attributed to a "smearing" of values from the high and low ends of the PDF spectrum of the various input parameters. The analysis also found a potential weakness in the deterministic analysis used in the SAR: a detail of drum loading was not taken into consideration. Waste emplacement operations thus far have handled drums from each shipment as a single unit, i.e. drums from each shipment are kept together. Shipments typically come from a single waste stream, and therefore the curie loading of each drum can be considered nearly identical to that of its neighbor. Calculations show that if large numbers of drums are used in the accident scenario assessment, e.g. 28 drums in the waste hoist failure scenario (CH5), then the probabilistic dose assessment calculations will diverge from the deterministically determined doses. As currently calculated, the deterministic dose assessment assumes one drum loaded to the maximum allowable (80 PE-Ci) and the remainder at 10% of the maximum. The effective average drum curie content is therefore lower in the deterministic assessment than in the probabilistic assessment for a large number of drums. EEG recommends that the WIPP SAR calculations be revisited and updated to include a probabilistic safety assessment.
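The Monte Carlo procedure this record describes (replace point values with probability density functions, sample 10,000 times, report the 5%, 50%, and 95% dose likelihoods) can be sketched as below; the parameter names, ranges, and dose model are illustrative assumptions, not the WIPP SAR inputs.

```python
import random

random.seed(12345)  # reproducible illustration

# One iteration: draw each uncertain parameter from its distribution
# and compute a dose. The dose model here is a toy stand-in.
def sample_dose():
    drum_curies = random.uniform(8.0, 80.0)        # drum content, PE-Ci (assumed range)
    release_fraction = random.uniform(1e-4, 1e-2)  # assumed range
    dose_per_curie = 0.5                           # rem per released PE-Ci (assumed)
    return drum_curies * release_fraction * dose_per_curie

# 10,000 iterations, then the 5%, 50%, and 95% dose likelihoods.
doses = sorted(sample_dose() for _ in range(10_000))
p5, p50, p95 = (doses[int(q * len(doses))] for q in (0.05, 0.50, 0.95))
```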
Dalziel, Kim; Round, Ali; Garside, Ruth; Stein, Ken
2005-01-01
To evaluate the cost utility of imatinib compared with interferon (IFN)-α or hydroxycarbamide (hydroxyurea) for first-line treatment of chronic myeloid leukaemia. A cost-utility (Markov) model within the setting of the UK NHS and viewed from a health system perspective was adopted. Transition probabilities and relative risks were estimated from published literature. Costs of drug treatment, outpatient care, bone marrow biopsies, radiography, blood transfusions and inpatient care were obtained from the British National Formulary and local hospital databases. Costs (£, year 2001-03 values) were discounted at 6%. Quality-of-life (QOL) data were obtained from the published literature and discounted at 1.5%. The main outcome measure was cost per QALY gained. Extensive one-way sensitivity analyses were performed along with probabilistic (stochastic) analysis. The incremental cost-effectiveness ratio (ICER) of imatinib, compared with IFNα, was £26,180 per QALY gained (one-way sensitivity analyses ranged from £19,449 to £51,870) and compared with hydroxycarbamide was £86,934 per QALY (one-way sensitivity analyses ranged from £69,701 to £147,095) [£1 = $US1.691 = €1.535 as at 31 December 2002]. Based on the probabilistic sensitivity analysis, 50% of the ICERs for imatinib, compared with IFNα, fell below a threshold of approximately £31,000 per QALY gained. Fifty percent of ICERs for imatinib, compared with hydroxycarbamide, fell below approximately £95,000 per QALY gained. This model suggests, given its underlying data and assumptions, that imatinib may be moderately cost effective when compared with IFNα but considerably less cost effective when compared with hydroxycarbamide. There are, however, many uncertainties due to the lack of long-term data.
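The probabilistic statement that 50% of ICERs fell below a threshold is a percentile of a simulated ICER distribution. A sketch of how such a distribution is generated, under assumed input distributions that are not the study's own:

```python
import random

random.seed(7)

# Sketch of a probabilistic sensitivity analysis of the kind reported
# above: draw incremental costs and QALYs from distributions, form the
# ICER for each draw, and read off the share below a threshold. The
# distributions are illustrative assumptions, not the study's inputs.
def simulated_icer():
    delta_cost = random.gauss(26_180, 5_000)   # incremental cost, GBP (assumed)
    delta_qalys = random.gauss(1.0, 0.15)      # incremental QALYs (assumed)
    return delta_cost / delta_qalys

icers = [simulated_icer() for _ in range(10_000)]
share_below = sum(i < 31_000 for i in icers) / len(icers)
```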
A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’
2017-01-01
ABSTRACT Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. 1Introduction2Preemption3Structural Equation Models4The Halpern and Pearl Definition of ‘Actual Cause’5Preemption Again6The Probabilistic Case7Probabilistic Causal Models8A Proposed Probabilistic Extension of Halpern and Pearl’s Definition9Twardy and Korb’s Account10Probabilistic Fizzling11Conclusion PMID:29593362
Willis, Sarah R; Ahmed, Hashim U; Moore, Caroline M; Donaldson, Ian; Emberton, Mark; Miners, Alec H; van der Meulen, Jan
2014-01-01
Objective To compare the diagnostic outcomes of the current approach of transrectal ultrasound (TRUS)-guided biopsy in men with suspected prostate cancer to an alternative approach using multiparametric MRI (mpMRI), followed by MRI-targeted biopsy if positive. Design Clinical decision analysis was used to synthesise data from recently emerging evidence in a format that is relevant for clinical decision making. Population A hypothetical cohort of 1000 men with suspected prostate cancer. Interventions mpMRI and, if positive, MRI-targeted biopsy compared with TRUS-guided biopsy in all men. Outcome measures We report the number of men expected to undergo a biopsy as well as the numbers of correctly identified patients with or without prostate cancer. A probabilistic sensitivity analysis was carried out using Monte Carlo simulation to explore the impact of statistical uncertainty in the diagnostic parameters. Results In 1000 men, mpMRI followed by MRI-targeted biopsy ‘clinically dominates’ TRUS-guided biopsy as it results in fewer expected biopsies (600 vs 1000), more men being correctly identified as having clinically significant cancer (320 vs 250), and fewer men being falsely identified (20 vs 50). The mpMRI-based strategy dominated TRUS-guided biopsy in 86% of the simulations in the probabilistic sensitivity analysis. Conclusions Our analysis suggests that mpMRI followed by MRI-targeted biopsy is likely to result in fewer and better biopsies than TRUS-guided biopsy. Future research in prostate cancer should focus on providing precise estimates of key diagnostic parameters. PMID:24934207
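The "clinical dominance" claim above can be checked directly from the three reported outcomes per 1,000 men; the `dominates` helper is an illustrative formalization, not code from the study.

```python
# Reported outcomes per 1,000 men for each strategy.
trus = {"biopsies": 1000, "correct_significant": 250, "false_ids": 50}
mri = {"biopsies": 600, "correct_significant": 320, "false_ids": 20}

# A strategy dominates another when it is at least as good on every
# outcome (fewer biopsies, more correct identifications, fewer false
# identifications) and strictly better on at least one.
def dominates(a, b):
    no_worse = (a["biopsies"] <= b["biopsies"]
                and a["correct_significant"] >= b["correct_significant"]
                and a["false_ids"] <= b["false_ids"])
    strictly_better = (a["biopsies"] < b["biopsies"]
                       or a["correct_significant"] > b["correct_significant"]
                       or a["false_ids"] < b["false_ids"])
    return no_worse and strictly_better

assert dominates(mri, trus)
```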
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PSAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc. on design parameters, then the calculated response sensitivities may be of significantly higher accuracy.
NASA Technical Reports Server (NTRS)
1992-01-01
The papers presented at the symposium cover aerodynamics, design applications, propulsion systems, high-speed flight, structures, controls, sensitivity analysis, optimization algorithms, and space structures applications. Other topics include helicopter rotor design, artificial intelligence/neural nets, and computational aspects of optimization. Papers are included on flutter calculations for a system with interacting nonlinearities, optimization in solid rocket booster application, improving the efficiency of aerodynamic shape optimization procedures, nonlinear control theory, and probabilistic structural analysis of space truss structures for nonuniform thermal environmental effects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peace, Gerald; Goering, Timothy James; Miller, Mark Laverne
2007-01-01
A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
Chatterjee, Abhishek; Macarios, David; Griffin, Leah; Kosowski, Tomasz; Pyfer, Bryan J; Offodile, Anaeze C; Driscoll, Daniel; Maddali, Sirish; Attwood, John
2015-11-01
Sartorius flap coverage and adjunctive negative pressure wound therapy (NPWT) have been described in managing infected vascular groin grafts with varying cost and clinical success. We performed a cost-utility analysis comparing sartorius flap with NPWT in managing an infected vascular groin graft. A literature review compiling outcomes for sartorius flap and NPWT interventions was conducted from peer-reviewed journals in MEDLINE (PubMed) and EMBASE. Utility scores were derived from expert opinion and used to estimate quality-adjusted life years (QALYs). Medicare current procedure terminology and diagnosis-related groups codes were used to assess the costs for successful graft salvage with the associated complications. Incremental cost-effectiveness was assessed at $50,000/QALY, and both univariate and probabilistic sensitivity analyses were conducted to assess robustness of the conclusions. Thirty-two studies were used pooling 384 patients (234 sartorius flaps and 150 NPWT). NPWT had better clinical outcomes (86.7% success rate, 0.9% minor complication rate, and 13.3% major complication rate) than sartorius flap (81.6% success rate, 8.0% minor complication rate, and 18.4% major complication rate). NPWT was less costly ($12,366 versus $23,516) and slightly more effective (12.06 QALY versus 12.05 QALY) compared with sartorius flap. Sensitivity analyses confirmed the robustness of the base case findings; NPWT was either cost-effective at $50,000/QALY or dominated sartorius flap in 81.6% of all probabilistic sensitivity analyses. In our cost-utility analysis, use of adjunctive NPWT, along with debridement and antibiotic treatment, for managing infected vascular groin graft wounds was found to be a more cost-effective option when compared with sartorius flaps.
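Net monetary benefit (NMB = willingness-to-pay × QALYs − cost) is a standard way to restate the comparison above at the stated $50,000/QALY threshold; the cost and QALY figures are those reported in this record.

```python
# Net monetary benefit at a willingness-to-pay (WTP) threshold:
# NMB = WTP * QALYs - cost. Figures are those reported above.
def nmb(qalys, cost, wtp=50_000):
    return wtp * qalys - cost

nmb_npwt = nmb(12.06, 12_366)   # negative pressure wound therapy
nmb_flap = nmb(12.05, 23_516)   # sartorius flap

# NPWT is slightly more effective and less costly, so its NMB is higher.
assert nmb_npwt > nmb_flap
```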
van den Houten, M M L; Lauret, G J; Fakhry, F; Fokkenrood, H J P; van Asselt, A D I; Hunink, M G M; Teijink, J A W
2016-11-01
Current guidelines recommend supervised exercise therapy (SET) as the preferred initial treatment for patients with intermittent claudication. The availability of SET programmes is, however, limited and such programmes are often not reimbursed. Evidence for the long-term cost-effectiveness of SET compared with endovascular revascularization (ER) as primary treatment for intermittent claudication might aid widespread adoption in clinical practice. A Markov model was constructed to determine the incremental costs, incremental quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratio of SET versus ER for a hypothetical cohort of patients with newly diagnosed intermittent claudication, from the Dutch healthcare payer's perspective. In the event of primary treatment failure, possible secondary interventions were repeat ER, open revascularization or major amputation. Data sources for model parameters included original data from two RCTs, as well as evidence from the medical literature. The robustness of the results was tested with probabilistic and one-way sensitivity analysis. Considering a 5-year time horizon, probabilistic sensitivity analysis revealed that SET was associated with cost savings compared with ER (-€6412, 95 per cent credibility interval (CrI) -€11 874 to -€1939). The mean difference in effectiveness was -0·07 (95 per cent CrI -0·27 to 0·16) QALYs. ER was associated with an additional €91 600 per QALY gained compared with SET. One-way sensitivity analysis indicated more favourable cost-effectiveness for ER in subsets of patients with low quality-of-life scores at baseline. SET is a more cost-effective primary treatment for intermittent claudication than ER. These results support implementation of supervised exercise programmes in clinical practice. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
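A Markov cohort model of the kind described accumulates discounted costs and QALYs as the cohort moves between health states each cycle. A minimal three-state sketch over the study's 5-year horizon; the transition probabilities, costs, and utilities below are illustrative assumptions, not the study's inputs:

```python
# States: 0 = event-free, 1 = recurrent claudication / re-intervention,
# 2 = dead (absorbing). All numbers are hypothetical.
P = [  # annual transition probabilities (rows sum to 1)
    [0.80, 0.15, 0.05],
    [0.10, 0.80, 0.10],
    [0.00, 0.00, 1.00],
]
cost = [500.0, 4_000.0, 0.0]   # annual cost per state, EUR (illustrative)
utility = [0.80, 0.60, 0.0]    # annual QALY weight per state
discount = 0.03

cohort = [1.0, 0.0, 0.0]       # everyone starts event-free
total_cost = total_qaly = 0.0
for year in range(5):          # 5-year horizon as in the study
    d = 1.0 / (1.0 + discount) ** year
    total_cost += d * sum(c * s for c, s in zip(cost, cohort))
    total_qaly += d * sum(u * s for u, s in zip(utility, cohort))
    # Advance the cohort one cycle through the transition matrix.
    cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]

print(f"discounted cost per patient : EUR {total_cost:,.0f}")
print(f"discounted QALYs per patient: {total_qaly:.2f}")
```

Running the same trace with strategy-specific transition probabilities and costs yields the incremental cost and QALY differences from which the ICER is computed.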
Qian, Yushen; Pollom, Erqi L.; King, Martin T.; Dudley, Sara A.; Shaffer, Jenny L.; Chang, Daniel T.; Gibbs, Iris C.; Goldhaber-Fiebert, Jeremy D.; Horst, Kathleen C.
2016-01-01
Purpose The Clinical Evaluation of Pertuzumab and Trastuzumab (CLEOPATRA) study showed a 15.7-month survival benefit with the addition of pertuzumab to docetaxel and trastuzumab (THP) as first-line treatment for patients with human epidermal growth factor receptor 2 (HER2) –overexpressing metastatic breast cancer. We performed a cost-effectiveness analysis to assess the value of adding pertuzumab. Patient and Methods We developed a decision-analytic Markov model to evaluate the cost effectiveness of docetaxel plus trastuzumab (TH) with or without pertuzumab in US patients with metastatic breast cancer. The model followed patients weekly over their remaining lifetimes. Health states included stable disease, progressing disease, hospice, and death. Transition probabilities were based on the CLEOPATRA study. Costs reflected the 2014 Medicare rates. Health state utilities were the same as those used in other recent cost-effectiveness studies of trastuzumab and pertuzumab. Outcomes included health benefits expressed as discounted quality-adjusted life-years (QALYs), costs in US dollars, and cost effectiveness expressed as an incremental cost-effectiveness ratio. One- and multiway deterministic and probabilistic sensitivity analyses explored the effects of specific assumptions. Results Modeled median survival was 39.4 months for TH and 56.9 months for THP. The addition of pertuzumab resulted in an additional 1.81 life-years gained, or 0.62 QALYs, at a cost of $472,668 per QALY gained. Deterministic sensitivity analysis showed that THP is unlikely to be cost effective even under the most favorable assumptions, and probabilistic sensitivity analysis predicted 0% chance of cost effectiveness at a willingness to pay of $100,000 per QALY gained. Conclusion THP in patients with metastatic HER2-positive breast cancer is unlikely to be cost effective in the United States. PMID:26351332
Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components
NASA Technical Reports Server (NTRS)
1999-01-01
Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
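The underlying reliability calculation can be illustrated with a stress-strength Monte Carlo sketch: sample the uncertain material property and the stochastic load, and count the fraction of realizations in which stress exceeds strength. The distributions and values below are illustrative assumptions, not PSAM/NESSUS inputs:

```python
import random

random.seed(0)

def estimate_failure_probability(n=100_000):
    failures = 0
    for _ in range(n):
        # Material property stochasticity: yield strength varies with
        # composition and manufacturing (MPa, illustrative values).
        yield_strength = random.gauss(900.0, 60.0)
        # Stochastic thermomechanical load mapped to a stress at a
        # critical location (MPa, illustrative values).
        applied_stress = random.gauss(650.0, 80.0)
        if applied_stress > yield_strength:
            failures += 1
    return failures / n

pf = estimate_failure_probability()
print(f"estimated probability of failure: {pf:.4f}")
```

Production codes such as NESSUS use fast probability integration rather than brute-force sampling to get the same tail probabilities far more cheaply, but the quantity being computed is the same.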
Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean
2018-05-01
A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and inherent characteristics of the system's input material, there are a number of stochastic variables which may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling Engine (SE) power output in the range of 61.5-73 W with a high confidence interval (CI) of 95%. In addition, there is high probability (with 95% CI) that the NMT system can achieve positive net power output between 15.8 and 35 W. A sensitivity study reveals the system power performance is mostly affected by SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO 2 /NO x emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements on the system design and operational strategy and this probabilistic assessment framework can also be applied to similar complex engineering systems.
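The non-intrusive approach described, a small number of expensive deterministic simulations, a cheap surrogate fitted to them, then Monte Carlo on the surrogate, can be sketched as follows. A simple quadratic fit stands in for the study's artificial neural network, and the process model and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Step 1: a handful of expensive "deterministic process simulations".
# This made-up response of SE power output (W) to heater temperature (C)
# stands in for the thermodynamic model; it is illustrative only.
def process_model(heater_temp_c):
    return 0.25 * heater_temp_c - 1.2e-4 * heater_temp_c**2 - 30.0

design_points = np.linspace(300.0, 600.0, 8)
responses = process_model(design_points)

# Step 2: fit a cheap surrogate (a quadratic here, where the NMT study
# used an artificial neural network approximation model).
surrogate = np.poly1d(np.polyfit(design_points, responses, deg=2))

# Step 3: Monte Carlo simulation on the surrogate, with the heater
# temperature treated as an uncertain input.
temps = rng.normal(450.0, 40.0, size=50_000)
power = surrogate(temps)
lo, hi = np.percentile(power, [2.5, 97.5])
print(f"SE power output, 95% interval: {lo:.1f} to {hi:.1f} W")
```

Because each surrogate evaluation is essentially free, tens of thousands of Monte Carlo samples become affordable even when each underlying process simulation takes hours.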
System Risk Assessment and Allocation in Conceptual Design
NASA Technical Reports Server (NTRS)
Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)
2003-01-01
As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion systems components
NASA Technical Reports Server (NTRS)
1991-01-01
Summarized here is the technical effort and computer code developed during the five-year duration of the program for probabilistic structural analysis methods. The summary includes a brief description of the computer code manuals and a detailed description of code validation demonstration cases for random vibrations of a discharge duct, probabilistic material nonlinearities of a liquid oxygen post, and probabilistic buckling of a transfer tube liner.
NASA Astrophysics Data System (ADS)
Bashkirtseva, Irina; Ryashko, Lev; Ryazanova, Tatyana
2017-09-01
The problem of analyzing noise-induced extinction in multidimensional population systems is considered. To investigate the conditions of extinction caused by random disturbances, a new approach based on the stochastic sensitivity function technique and confidence domains is suggested and applied to a tritrophic population model of interacting prey, predator, and top predator. This approach allows a constructive analysis of the probabilistic mechanisms of the transition to noise-induced extinction from both equilibrium and oscillatory regimes of coexistence. Within this analysis, a method of principal directions is suggested for reducing the dimension of the confidence domains: the principal subspace of the dispersion of random states is defined by the ratio of the eigenvalues of the stochastic sensitivity matrix. A detailed analysis of two scenarios of noise-induced extinction, as a function of the parameters of the tritrophic system, is carried out.
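The principal-directions reduction rests on an eigen-decomposition of the stochastic sensitivity matrix: directions with small eigenvalues contribute little to the dispersion of random states and can be dropped. A minimal sketch with a hypothetical 3x3 sensitivity matrix (illustrative values, not from the paper's model):

```python
import numpy as np

# Hypothetical stochastic sensitivity matrix for a tritrophic
# (prey, predator, top predator) equilibrium; symmetric positive
# definite by construction, values illustrative only.
W = np.array([
    [2.0, 0.6, 0.1],
    [0.6, 1.2, 0.3],
    [0.1, 0.3, 0.2],
])

eigvals, eigvecs = np.linalg.eigh(W)      # eigh returns ascending order
order = np.argsort(eigvals)[::-1]         # re-sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The ratio of eigenvalues identifies the principal subspace: confidence
# domains can be built in the span of the leading eigenvectors only.
explained = eigvals / eigvals.sum()
for k, (lam, frac) in enumerate(zip(eigvals, explained), 1):
    print(f"direction {k}: eigenvalue {lam:.3f} "
          f"({100 * frac:.1f}% of dispersion)")
```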
Superposition-Based Analysis of First-Order Probabilistic Timed Automata
NASA Astrophysics Data System (ADS)
Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph
This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.
Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, J.; Ayala, S.
1999-01-01
NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.
NASA Astrophysics Data System (ADS)
Rabatel, Matthias; Rampal, Pierre; Carrassi, Alberto; Bertino, Laurent; Jones, Christopher K. R. T.
2018-03-01
We present a sensitivity analysis and discuss the probabilistic forecast capabilities of the novel sea ice model neXtSIM used in hindcast mode. The study pertains to the response of the model to the uncertainty on winds using probabilistic forecasts of ice trajectories. neXtSIM is a continuous Lagrangian numerical model that uses an elasto-brittle rheology to simulate the ice response to external forces. The sensitivity analysis is based on a Monte Carlo sampling of 12 members. The response of the model to the uncertainties is evaluated in terms of simulated ice drift distances from their initial positions, and from the mean position of the ensemble, over the mid-term forecast horizon of 10 days. The simulated ice drift is decomposed into advective and diffusive parts that are characterised separately both spatially and temporally and compared to what is obtained with a free-drift model, that is, when the ice rheology does not play any role in the modelled physics of the ice. The seasonal variability of the model sensitivity is presented and shows the role of the ice compactness and rheology in the ice drift response at both local and regional scales in the Arctic. Indeed, the ice drift simulated by neXtSIM in summer is close to the one obtained with the free-drift model, while the more compact and solid ice pack shows a significantly different mechanical and drift behaviour in winter. For the winter period analysed in this study, we also show that, in contrast to the free-drift model, neXtSIM reproduces the sea ice Lagrangian diffusion regimes as found from observed trajectories. The forecast capability of neXtSIM is also evaluated using a large set of real buoy trajectories and compared to the capability of the free-drift model. We found that neXtSIM performs significantly better in simulating sea ice drift, both in terms of forecast error and as a tool to assist search and rescue operations, although the sources of uncertainties assumed for the present experiment are not sufficient for complete coverage of the observed IABP positions.
Baschet, Louise; Bourguignon, Sandrine; Marque, Sébastien; Durand-Zaleski, Isabelle; Teiger, Emmanuel; Wilquin, Fanny; Levesque, Karine
2016-01-01
To determine the cost-effectiveness of drug-eluting stents (DES) compared with bare-metal stents (BMS) in patients requiring a percutaneous coronary intervention in France, using a recent meta-analysis including second-generation DES. A cost-effectiveness analysis was performed in the French National Health Insurance setting. Effectiveness estimates were taken from a meta-analysis of 117 762 patient-years across 76 randomised trials. The main effectiveness criterion was major cardiac event-free survival. Effectiveness and costs were modelled over a 5-year horizon using a three-state Markov model. Incremental cost-effectiveness ratios and a cost-effectiveness acceptability curve were calculated for a range of thresholds for willingness to pay per year without major cardiac event gain. Deterministic and probabilistic sensitivity analyses were performed. Base case results demonstrated that DES are dominant over BMS, with an increase in event-free survival and a cost-reduction of €184, primarily due to a diminution of second revascularisations, and an absence of myocardial infarction and stent thrombosis. These results are robust for uncertainty on one-way deterministic and probabilistic sensitivity analyses. Using a cost-effectiveness threshold of €7000 per major cardiac event-free year gained, DES has a >95% probability of being cost-effective versus BMS. Following DES price decrease, new-generation DES development and taking into account recent meta-analyses results, the DES can now be considered cost-effective regardless of selective indication in France, according to European recommendations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodges, Joseph C., E-mail: joseph.hodges@utsouthwestern.edu; Beg, Muhammad S.; Das, Prajnan
2014-07-15
Purpose: To compare the cost-effectiveness of intensity modulated radiation therapy (IMRT) and 3-dimensional conformal radiation therapy (3D-CRT) for anal cancer and determine disease, patient, and treatment parameters that influence the result. Methods and Materials: A Markov decision model was designed with the various disease states for the base case of a 65-year-old patient with anal cancer treated with either IMRT or 3D-CRT and concurrent chemotherapy. Health states accounting for rates of local failure, colostomy failure, treatment breaks, patient prognosis, acute and late toxicities, and the utility of toxicities were informed by existing literature and analyzed with deterministic and probabilistic sensitivity analysis. Results: In the base case, mean costs and quality-adjusted life expectancy in years (QALY) for IMRT and 3D-CRT were $32,291 (4.81) and $28,444 (4.78), respectively, resulting in an incremental cost-effectiveness ratio of $128,233/QALY for IMRT compared with 3D-CRT. Probabilistic sensitivity analysis found that IMRT was cost-effective in 22%, 47%, and 65% of iterations at willingness-to-pay thresholds of $50,000, $100,000, and $150,000 per QALY, respectively. Conclusions: In our base model, IMRT was a cost-ineffective strategy despite the reduced acute treatment toxicities and their associated costs of management. The model outcome was sensitive to variations in local and colostomy failure rates, as well as patient-reported utilities relating to acute toxicities.
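Reporting cost-effectiveness probabilities at several willingness-to-pay thresholds, as this study does, is a cost-effectiveness acceptability curve evaluated at a few points. A minimal sketch; the spread parameters below are illustrative assumptions around the reported base-case means, not the study's actual distributions:

```python
import random

random.seed(3)

def draw():
    # Illustrative distributions around the reported base case
    # (IMRT $32,291 / 4.81 QALY vs 3D-CRT $28,444 / 4.78 QALY);
    # standard deviations are assumed for illustration.
    return (random.gauss(32_291, 3_000), random.gauss(4.81, 0.15),
            random.gauss(28_444, 3_000), random.gauss(4.78, 0.15))

draws = [draw() for _ in range(20_000)]
results = {}
for wtp in (50_000, 100_000, 150_000):
    # IMRT is cost-effective in a draw if its net monetary benefit
    # exceeds that of 3D-CRT at this threshold.
    pref = sum(1 for ci, qi, cc, qc in draws
               if wtp * qi - ci > wtp * qc - cc)
    results[wtp] = pref / len(draws)
    print(f"WTP ${wtp:>7,}/QALY: IMRT cost-effective "
          f"in {100 * results[wtp]:.0f}% of draws")
```

Evaluating the same fraction over a fine grid of thresholds and plotting it gives the full acceptability curve.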
Herzer, Kurt R; Niessen, Louis; Constenla, Dagna O; Ward, William J; Pronovost, Peter J
2014-01-01
Objective To assess the cost-effectiveness of a multifaceted quality improvement programme focused on reducing central line-associated bloodstream infections in intensive care units. Design Cost-effectiveness analysis using a decision tree model to compare programme to non-programme intensive care units. Setting USA. Population Adult patients in the intensive care unit. Costs Economic costs of the programme and of central line-associated bloodstream infections were estimated from the perspective of the hospital and presented in 2013 US dollars. Main outcome measures Central line-associated bloodstream infections prevented, deaths averted due to central line-associated bloodstream infections prevented, and incremental cost-effectiveness ratios. Probabilistic sensitivity analysis was performed. Results Compared with current practice, the programme is strongly dominant and reduces bloodstream infections and deaths at no additional cost. The probabilistic sensitivity analysis showed that there was an almost 80% probability that the programme reduces bloodstream infections and the infections’ economic costs to hospitals. The opportunity cost of a bloodstream infection to a hospital was the most important model parameter in these analyses. Conclusions This multifaceted quality improvement programme, as it is currently implemented by hospitals on an increasingly large scale in the USA, likely reduces the economic costs of central line-associated bloodstream infections for US hospitals. Awareness among hospitals about the programme's benefits should enhance implementation. The programme's implementation has the potential to substantially reduce morbidity, mortality and economic costs associated with central line-associated bloodstream infections. PMID:25256190
Kovac, Jason Ronald; Fantus, Jake; Lipshultz, Larry I; Fischer, Marc Anthony; Klinghoffer, Zachery
2014-09-01
Varicoceles are a common cause of male infertility; repair can be accomplished using either surgical or radiological means. We compare the cost-effectiveness of the gold standard, the microsurgical varicocele repair (MV), to the options of a nonmicrosurgical approach (NMV) and percutaneous embolization (PE) to manage varicocele-associated infertility. A Markov decision-analysis model was developed to estimate costs and pregnancy rates. Within the model, recurrences following MV and NMV were re-treated with PE and recurrences following PE were treated with repeat PE, MV or NMV. Pregnancy and recurrence rates were based on the literature, while costs were obtained from institutional and government supplied data. Univariate and probabilistic sensitivity analyses were performed to determine the effects of the various parameters on model outcomes. Primary treatment with MV was the most cost-effective strategy at $5402 CAD (Canadian)/pregnancy. Primary treatment with NMV was the least costly approach, but it also yielded the fewest pregnancies. Primary treatment with PE was the least cost-effective strategy costing about $7300 CAD/pregnancy. Probabilistic sensitivity analysis reinforced MV as the most cost-effective strategy at a willingness-to-pay threshold of >$4100 CAD/pregnancy. MV yielded the most pregnancies at acceptable levels of incremental costs. As such, it is the preferred primary treatment strategy for varicocele-associated infertility. Treatment with PE was the least cost-effective approach and, as such, is best used only in cases of surgical failure.
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
Leff, Jared A; Schneider, Yecheskel; Crawford, Carl V; Maw, Anna; Bosworth, Brian; Simon, Matthew S
2017-01-01
Abstract Background Systematic reviews with meta-analyses and meta-regression suggest that timely probiotic use can prevent Clostridium difficile infection (CDI) in hospitalized adults receiving antibiotics, but the cost effectiveness is unknown. We sought to evaluate the cost effectiveness of probiotic use for prevention of CDI versus no probiotic use in the United States. Methods We programmed a decision analytic model using published literature and national databases with a 1-year time horizon. The base case was modeled as a hypothetical cohort of hospitalized adults (mean age 68) receiving antibiotics with and without concurrent probiotic administration. Projected outcomes included quality-adjusted life-years (QALYs), costs (2013 US dollars), incremental cost-effectiveness ratios (ICERs; $/QALY), and cost per infection avoided. One-way, two-way, and probabilistic sensitivity analyses were conducted, and scenarios of different age cohorts were considered. ICERs less than $100,000 per QALY were considered cost-effective. Results Probiotic use dominated (more effective and less costly) no probiotic use. Results were sensitive to probiotic efficacy (relative risk <0.73), the baseline risk of CDI (>1.6%), the risk of probiotic-associated bacteremia/fungemia (<0.26%), probiotic cost (<$130), and age (>65). In probabilistic sensitivity analysis, at a willingness-to-pay threshold of $100,000/QALY, probiotics were the optimal strategy in 69.4% of simulations. Conclusions Our findings suggest that probiotic use may be a cost-effective strategy to prevent CDI in hospitalized adults receiving antibiotics age 65 or older or when the baseline risk of CDI exceeds 1.6%. PMID:29230429
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Clifford Kuofei
Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
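Regression-based sensitivity analysis on a Monte Carlo sample, as used here, ranks inputs by their standardized regression coefficients. A minimal sketch with a toy steady-state flux expression (D*K/L) and illustrative stand-in distributions, not the PAPA model's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 5_000

# Uncertain inputs for a toy flux model (illustrative stand-ins for the
# skin-transport parameters, not the PAPA model's variables).
diffusivity = rng.lognormal(mean=-2.0, sigma=0.4, size=n)
partition   = rng.lognormal(mean=0.0, sigma=0.3, size=n)
thickness   = rng.normal(20.0, 2.0, size=n)   # micrometres

# Simple steady-state flux stand-in: J = D * K / L.
flux = diffusivity * partition / thickness

# Standardized regression coefficients rank parameter importance, in the
# spirit of the stepwise linear regression used in the study (stepwise
# selection itself is omitted for brevity).
X = np.column_stack([diffusivity, partition, thickness])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (flux - flux.mean()) / flux.std()
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Xs]), ys,
                           rcond=None)
for name, b in zip(["diffusivity", "partition", "thickness"], beta[1:]):
    print(f"{name:12s} SRC = {b:+.3f}")
```

Repeating the regression on fluxes saved at several simulation times shows how parameter importance shifts between the transient and steady-state regimes.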
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material, geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms and the addition of Monte Carlo simulation as an alternate. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.
1998-01-01
PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform the deterministic as well as probabilistic analyses to predict thermomechanical properties. This User's guide details the step-by-step procedure to create an input file and update/modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte-Carlo simulation methods are available for the uncertainty simulation. Various options available in the code to simulate probabilistic material properties and quantify sensitivity of the primitive random variables have been described. Deterministic as well as probabilistic results are described using demonstration problems. For detailed theoretical description of deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996 and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites", NASA TM 4766, June 1997.
Jayaraman, Sudha P; Jiang, Yushan; Resch, Stephen; Askari, Reza; Klompas, Michael
2016-10-01
Interventions to contain two multi-drug-resistant Acinetobacter (MDRA) outbreaks reduced the incidence of multi-drug-resistant (MDR) organisms, specifically methicillin-resistant Staphylococcus aureus, vancomycin-resistant Enterococcus, and Clostridium difficile in the general surgery intensive care unit (ICU) of our hospital. We therefore conducted a cost-effectiveness analysis of a proactive model infection-control program to reduce transmission of MDR organisms based on the practices used to control the MDRA outbreak. We created a model of a proactive infection control program based on the 2011 MDRA outbreak response. We built a decision analysis model and performed univariable and probabilistic sensitivity analyses to evaluate the cost-effectiveness of the proposed program compared with standard infection control practices to reduce transmission of these MDR organisms. The cost of a proactive infection control program would be $68,509 per year. The incremental cost-effectiveness ratio (ICER) was calculated to be $3,804 per aversion of transmission of MDR organisms in a one-year period compared with standard infection control. On the basis of probabilistic sensitivity analysis, a willingness-to-pay (WTP) threshold of $14,000 per transmission averted would have a 42% probability of being cost-effective, rising to 100% at $22,000 per transmission averted. This analysis gives an estimated ICER for implementing a proactive program to prevent transmission of MDR organisms in the general surgery ICU. To better understand the causal relations between the critical steps in the program and the rate reductions, a randomized study of a package of interventions to prevent healthcare-associated infections should be considered.
Cost-effectiveness analysis of ibrutinib in patients with Waldenström macroglobulinemia in Italy.
Aiello, Andrea; D'Ausilio, Anna; Lo Muto, Roberta; Randon, Francesca; Laurenti, Luca
2017-01-01
Background and Objective: Ibrutinib has recently been approved in Europe for Waldenström macroglobulinemia (WM) in symptomatic patients who have received at least one prior therapy, or as first-line treatment for patients unsuitable for chemo-immunotherapy. The aim of the study is to estimate the incremental cost-effectiveness ratio (ICER) of ibrutinib in relapsed/refractory WM, compared with the Italian current therapeutic pathways (CTP). Methods: A Markov model was adapted for Italy from the National Health System perspective. Input data from the literature as well as from global trials were used. The percentage use of therapies and healthcare resource consumption were estimated according to expert panel advice. Drug ex-factory prices and national tariffs were used to estimate costs. The model had a 15-year time horizon, with a 3.0% discount rate for both clinical and economic data. Deterministic and probabilistic sensitivity analyses were performed to test the robustness of the results. Results: Ibrutinib resulted in increased life years gained (LYGs) and increased costs compared to CTP, with an ICER of €52,698/LYG. Sensitivity analyses confirmed the base-case results. Specifically, in the probabilistic analysis, at a willingness-to-pay threshold of €60,000/LYG ibrutinib was cost-effective in 84% of simulations. Conclusions: Ibrutinib has demonstrated a positive cost-effectiveness profile in Italy.
Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J
2009-04-01
An accurate assessment of shoulder kinematics is useful for understanding both normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency of the results across subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects, and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and in refining kinematic measurement techniques.
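The landmark-uncertainty simulation can be illustrated with a toy version of the same idea: perturb nominal landmark coordinates with a 4 mm standard deviation and read off the 1st-99th percentile envelope of a derived angle. The landmark coordinates and the single-angle description below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
sd = 4.0  # landmark location standard deviation in mm, as in the study

# Hypothetical nominal landmarks (mm): two points defining a humeral long axis.
p_proximal = np.array([0.0, 0.0, 300.0])
p_distal = np.array([0.0, 20.0, 0.0])

# Perturb each landmark independently in x, y, and z.
prox = p_proximal + rng.normal(0.0, sd, (n, 3))
dist = p_distal + rng.normal(0.0, sd, (n, 3))

# Elevation of the axis relative to vertical, a stand-in for one Euler angle.
axis = prox - dist
cos_t = axis[:, 2] / np.linalg.norm(axis, axis=1)
angle = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

lo, hi = np.percentile(angle, [1, 99])
envelope = hi - lo
print(f"1st-99th percentile envelope: {envelope:.2f} degrees")
```

The full study propagates 13 landmarks into complete Euler-angle sequences; the mechanics (independent coordinate perturbation, percentile bounds) are the same.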
Pouzou, Jane G; Cullen, Alison C; Yost, Michael G; Kissel, John C; Fenske, Richard A
2017-11-06
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. © 2017 Society for Risk Analysis.
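Exceedance fractions like those above come from propagating unit-exposure, use-amount, and body-weight distributions through a dose equation and comparing each draw to the toxicological benchmark. A schematic example with made-up distributions and a made-up benchmark (not AHED/PHED values):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical handler-exposure inputs (illustrative stand-ins for database values).
unit_exposure = rng.lognormal(np.log(0.05), 1.0, n)    # mg per lb active ingredient handled
amount_handled = rng.lognormal(np.log(200.0), 0.5, n)  # lb active ingredient per day
body_weight = np.clip(rng.normal(80.0, 12.0, n), 40.0, None)  # kg

dose = unit_exposure * amount_handled / body_weight    # mg/kg/day
benchmark = 2.0  # hypothetical level of concern, mg/kg/day

frac_exceeding = float(np.mean(dose > benchmark))
print(f"fraction of workers exceeding the level of concern: {frac_exceeding:.1%}")
```

A deterministic point estimate would hide this tail; the probabilistic version makes the size of the exceeding subgroup explicit.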
Cost-Effectiveness Analysis of Regorafenib for Metastatic Colorectal Cancer
Goldstein, Daniel A.; Ahmad, Bilal B.; Chen, Qiushi; Ayer, Turgay; Howard, David H.; Lipscomb, Joseph; El-Rayes, Bassel F.; Flowers, Christopher R.
2015-01-01
Purpose Regorafenib is a standard-care option for treatment-refractory metastatic colorectal cancer that increases median overall survival by 6 weeks compared with placebo. Given this small incremental clinical benefit, we evaluated the cost-effectiveness of regorafenib in the third-line setting for patients with metastatic colorectal cancer from the US payer perspective. Methods We developed a Markov model to compare the cost and effectiveness of regorafenib with those of placebo in the third-line treatment of metastatic colorectal cancer. Health outcomes were measured in life-years and quality-adjusted life-years (QALYs). Drug costs were based on Medicare reimbursement rates in 2014. Model robustness was addressed in univariable and probabilistic sensitivity analyses. Results Regorafenib provided an additional 0.04 QALYs (0.13 life-years) at a cost of $40,000, resulting in an incremental cost-effectiveness ratio of $900,000 per QALY. The incremental cost-effectiveness ratio for regorafenib was > $550,000 per QALY in all of our univariable and probabilistic sensitivity analyses. Conclusion Regorafenib provides minimal incremental benefit at high incremental cost per QALY in the third-line management of metastatic colorectal cancer. The cost-effectiveness of regorafenib could be improved by the use of value-based pricing. PMID:26304904
Probabilistic QoS Analysis In Wireless Sensor Networks
2012-04-01
Wang, Yunbo. "Probabilistic QoS Analysis in Wireless Sensor Networks" (2012). Computer Science and Engineering: Theses, Dissertations, and Student Research.
Process for computing geometric perturbations for probabilistic analysis
Fitch, Simeon H. K. (Charlottesville, VA); Riha, David S. (San Antonio, TX); Thacker, Ben H. (San Antonio, TX)
2012-04-10
A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
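As a sketch of this idea (hypothetical coordinates and displacement field, not the patented mean-value-coordinate computation), one perturbation realization scales a per-node displacement vector by a random amplitude drawn from the geometric uncertainty distribution:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical nominal node coordinates for a region of interest (2D for brevity)
# and a precomputed unit displacement vector per node.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
disp = np.array([[0.0, -1.0], [0.0, -1.0], [0.0, 1.0], [0.0, 1.0]])

# One geometric perturbation realization: scale the displacement field by a
# random amplitude (e.g., a thickness tolerance) and move every node.
amplitude = rng.normal(0.0, 0.01)
perturbed = nodes + amplitude * disp

print(perturbed.shape)  # each realization yields a full perturbed mesh
```

In a probabilistic finite element run, each sampled amplitude produces one perturbed mesh to analyze.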
Sensitivity of the Dengue Surveillance System in Brazil for Detecting Hospitalized Cases
2016-01-01
We evaluated the sensitivity of the dengue surveillance system in detecting hospitalized cases in ten capital cities in Brazil from 2008 to 2013 using a probabilistic record linkage of two independent information systems: hospitalization (SIH-SUS), adopted as the gold standard, and surveillance (SINAN). Sensitivity was defined as the proportion of cases reported to the surveillance system among the suspected hospitalized cases registered in SIH-SUS. Of the 48,174 hospitalizations registered in SIH-SUS, 24,469 (50.7%) were reported and registered in SINAN, indicating an overall sensitivity of 50.8% (95%CI 50.3–51.2). The observed sensitivity for each of the municipalities included in the study ranged from 22.0% to 99.1%. The combination of the two data sources identified 71,161 hospitalizations, an increase of 97.0% over SINAN alone. Our results allowed establishing the proportion of underreported dengue hospitalizations in the public health system in Brazil, highlighting the use of probabilistic record linkage as a valuable tool for evaluating surveillance systems. PMID:27192405
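The headline sensitivity estimate is a simple proportion with a binomial confidence interval over the gold-standard denominator. Recomputing it from the figures in the abstract (using a Wilson interval, one common choice for the CI):

```python
import math

def sensitivity_ci(reported: int, total: int, z: float = 1.96):
    """Sensitivity (reported / gold-standard total) with a Wilson 95% CI."""
    p = reported / total
    denom = 1 + z**2 / total
    centre = (p + z**2 / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return p, centre - half, centre + half

# Figures from the abstract: 24,469 of 48,174 hospitalizations reported to SINAN.
p, lo, hi = sensitivity_ci(24_469, 48_174)
print(f"sensitivity = {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
# → sensitivity = 50.8% (95% CI 50.3%-51.2%)
```

The result matches the abstract's 50.8% (95%CI 50.3–51.2); the 50.7% reported-proportion figure is the same ratio at a coarser rounding.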
Ranking of sabotage/tampering avoidance technology alternatives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, W.B.; Tabatabai, A.S.; Powers, T.B.
1986-01-01
Pacific Northwest Laboratory conducted a study to evaluate alternatives to the design and operation of nuclear power plants, emphasizing a reduction of their vulnerability to sabotage. Estimates of core melt accident frequency during normal operations and from sabotage/tampering events were used to rank the alternatives. Core melt frequency for normal operations was estimated using sensitivity analysis of results of probabilistic risk assessments. Core melt frequency for sabotage/tampering was estimated by developing a model based on probabilistic risk analyses, historic data, engineering judgment, and safeguards analyses of plant locations where core melt events could be initiated. Results indicate the most effective alternatives focus on large areas of the plant, increase safety system redundancy, and reduce reliance on single locations for mitigation of transients. Less effective options focus on specific areas of the plant, reduce reliance on some plant areas for safe shutdown, and focus on less vulnerable targets.
Assuring Life in Composite Systems
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A computational simulation method is presented for assuring life in composite systems, using dynamic buckling of smart composite shells as an example. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables an effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on average, the use of smart fibers improved the shell buckling resistance by about 9% at different probabilities and delayed the onset of buckling. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load, while uncertainties in the electric field strength and smart material volume fraction have moderate effects, and hence a moderate influence on the assured life of the shell.
You, J H S; Wong, W C W; Ip, M; Lee, N L S; Ho, S C
2009-11-01
To compare cost and quality-adjusted life-years (QALYs) gained by influenza vaccination with or without pneumococcal vaccination in the elderly living in long-term care facilities (LTCFs). Cost-effectiveness analysis based on Markov modelling over 5 years, from a Hong Kong public health provider's perspective, on a hypothetical cohort of LTCF residents aged 65 years or older. Benefit-cost ratio (BCR) and net present value (NPV) of two vaccination strategies versus no vaccination were estimated. The cost and QALYs gained by the two vaccination strategies were compared by Student's t-test in probabilistic sensitivity analysis (10,000 Monte Carlo simulations). Both vaccination strategies had high BCRs and NPVs (6.39 and US$334 for influenza vaccination; 5.10 and US$332 for influenza plus pneumococcal vaccination). In base case analysis, the two vaccination strategies were expected to cost less and gain higher QALYs than no vaccination. In probabilistic sensitivity analysis, the cost of combined vaccination and influenza vaccination was significantly lower (p<0.001) than the cost of no vaccination. Both vaccination strategies gained significantly higher (p<0.001) QALYs than no vaccination. The QALYs gained by combined vaccination were significantly higher (p = 0.030) than those gained by influenza vaccination alone. The total cost of combined vaccination was significantly lower (p = 0.011) than that of influenza vaccination. Influenza vaccination with or without pneumococcal vaccination appears to be less costly with higher QALYs gained than no vaccination, over a 5-year period, for elderly people living in LTCFs from the perspective of a Hong Kong public health organisation. Combined vaccination was more likely to gain higher QALYs with lower total cost than influenza vaccination alone.
Impact of Missing Data for Body Mass Index in an Epidemiologic Study.
Razzaghi, Hilda; Tinker, Sarah C; Herring, Amy H; Howards, Penelope P; Waller, D Kim; Johnson, Candice Y
2016-07-01
Objective To assess the potential impact of missing body mass index (BMI) data on the association between prepregnancy obesity and specific birth defects. Methods Data from the National Birth Defects Prevention Study (NBDPS) were analyzed. We assessed the factors associated with missing BMI data among mothers of infants without birth defects. Four analytic methods were then used to assess the impact of missing BMI data on the association between maternal prepregnancy obesity and three birth defects: spina bifida, gastroschisis, and cleft lip with/without cleft palate. The analytic methods were: (1) complete case analysis; (2) assignment of missing values to either obese or normal BMI; (3) multiple imputation; and (4) probabilistic sensitivity analysis. Logistic regression was used to estimate crude and adjusted odds ratios (aOR) and 95% confidence intervals (CI). Results Of NBDPS control mothers, 4.6% were missing BMI data, and most of the missing values were attributable to missing height (~90%). Missing BMI data was associated with birth outside of the US (aOR 8.6; 95% CI 5.5, 13.4), interview in Spanish (aOR 2.4; 95% CI 1.8, 3.2), Hispanic ethnicity (aOR 2.0; 95% CI 1.2, 3.4), and <12 years of education (aOR 2.3; 95% CI 1.7, 3.1). Overall, the results of the multiple imputation and probabilistic sensitivity analysis were similar to those of the complete case analysis. Conclusions Although in some scenarios missing BMI data can bias the magnitude of an association, it does not appear likely to have affected conclusions from a traditional complete case analysis of these data.
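Method (4), probabilistic sensitivity analysis for missing data (often called probabilistic bias analysis), can be sketched as follows: repeatedly reallocate the missing-BMI records to obese/non-obese according to bias parameters drawn from assumed distributions, and summarize the distribution of the resulting odds ratios. All counts and bias-parameter distributions below are invented for illustration, not NBDPS data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 2x2 counts (exposure = prepregnancy obesity) plus missing-BMI counts.
case_obese, case_not, case_missing = 60, 240, 15
ctrl_obese, ctrl_not, ctrl_missing = 400, 2600, 145

ors = []
for _ in range(5_000):
    # Draw the unknown probability that a missing record is obese, allowing it
    # to differ between cases and controls (the bias parameters).
    p_case = rng.beta(2, 8)
    p_ctrl = rng.beta(2, 8)
    k_case = rng.binomial(case_missing, p_case)
    k_ctrl = rng.binomial(ctrl_missing, p_ctrl)
    a = case_obese + k_case            # exposed cases after reallocation
    b = case_not + case_missing - k_case
    c = ctrl_obese + k_ctrl            # exposed controls after reallocation
    d = ctrl_not + ctrl_missing - k_ctrl
    ors.append((a * d) / (b * c))

ors = np.array(ors)
print(f"median OR {np.median(ors):.2f}, 95% simulation interval "
      f"{np.percentile(ors, 2.5):.2f}-{np.percentile(ors, 97.5):.2f}")
```

If the simulation interval stays close to the complete-case estimate, the missing data are unlikely to have changed the study's conclusions, which is the comparison the abstract reports.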
Makhija, D; Rock, M; Xiong, Y; Epstein, J D; Arnold, M R; Lattouf, O M; Calcaterra, D
2017-06-01
A recent retrospective comparative effectiveness study found that use of the FLOSEAL Hemostatic Matrix in cardiac surgery was associated with significantly lower risks of complications, blood transfusions, and surgical revisions, and shorter length of surgery, than use of SURGIFLO Hemostatic Matrix. These outcome improvements in cardiac surgery procedures may translate to economic savings for hospitals and payers. The objective of this study was to estimate the cost-consequence of two flowable hemostatic matrices (FLOSEAL or SURGIFLO) in cardiac surgeries for US hospitals. A cost-consequence model was constructed using clinical outcomes from a previously published retrospective comparative effectiveness study of FLOSEAL vs SURGIFLO in adult cardiac surgeries. The model accounted for the reported differences between these products in length of surgery, rates of major and minor complications, surgical revisions, and blood product transfusions. Costs were derived from the Healthcare Cost and Utilization Project's National Inpatient Sample (NIS) 2012 database and converted to 2015 US dollars. Savings were modeled for a hospital performing 245 cardiac surgeries annually, which was identified as the average for hospitals in the NIS dataset. One-way sensitivity analysis and probabilistic sensitivity analysis were performed to test model robustness. The results suggest that if FLOSEAL is utilized in a hospital that performs 245 mixed cardiac surgery procedures annually, 11 major complications, 31 minor complications, nine surgical revisions, 79 blood product transfusions, and 260.3 h of cumulative operating time could be avoided. These improved outcomes correspond to a net annualized saving of $1,532,896. Cost savings remained consistent, between $1.3M and $1.8M in the one-way sensitivity analysis and between $911K and $2.4M in the probabilistic sensitivity analysis, even after accounting for the uncertainty around clinical and cost inputs.
Outcome differences associated with FLOSEAL vs SURGIFLO that were previously reported in a comparative effectiveness study may result in substantial cost savings for US hospitals.
Bounthavong, Mark; Pruitt, Larry D; Smolenski, Derek J; Gahm, Gregory A; Bansal, Aasthaa; Hansen, Ryan N
2018-02-01
Introduction Home-based telebehavioural healthcare improves access to mental health care for patients restricted by travel burden. However, there is limited evidence assessing the economic value of home-based telebehavioural health care compared to in-person care. We sought to compare the economic impact of home-based telebehavioural health care and in-person care for depression among current and former US service members. Methods We performed trial-based cost-minimisation and cost-utility analyses to assess the economic impact of home-based telebehavioural health care versus in-person behavioural care for depression. Our analyses focused on the payer perspective (Department of Defense and Department of Veterans Affairs) at three months. We also performed a scenario analysis where all patients possessed video-conferencing technology that was approved by these agencies. The cost-utility analysis evaluated the impact of different depression categories on the incremental cost-effectiveness ratio. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model assumptions. Results In the base case analysis the total direct cost of home-based telebehavioural health care was higher than in-person care (US$71,974 versus US$20,322). Assuming that patients possessed government-approved video-conferencing technology, home-based telebehavioural health care was less costly compared to in-person care (US$19,177 versus US$20,322). In one-way sensitivity analyses, the proportion of patients possessing personal computers was a major driver of direct costs. In the cost-utility analysis, home-based telebehavioural health care was dominant when patients possessed video-conferencing technology. Results from probabilistic sensitivity analyses did not differ substantially from base case results. 
Discussion Home-based telebehavioural health care is dependent on the cost of supplying video-conferencing technology to patients but offers the opportunity to increase access to care. Health-care policies centred on implementation of home-based telebehavioural health care should ensure that these technologies are able to be successfully deployed on patients' existing technology.
Brandsch, Rainer
2017-10-01
Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters such as diffusion and partition coefficients related to the individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability due to highly overestimated migration results. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters, among others, are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration results with related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented in view of three case studies: (1) sensitivity analysis, (2) functional barrier efficiency, and (3) validation by experimental testing. Based on the migration predicted by probabilistic modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible, and associated migration risks and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
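As an illustration of the Monte Carlo mechanics described here, a sketch using a simplified short-time, semi-infinite-layer migration approximation and invented parameter values (not a validated FCM model):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Hypothetical inputs: initial migrant concentration in the polymer, polymer
# density, fixed contact time, and a lognormal-uncertain diffusion coefficient.
c_p0 = 500.0            # mg migrant per kg polymer
rho_p = 1.0             # g/cm^3
t = 10 * 24 * 3600.0    # 10 days of contact, in seconds
D = rng.lognormal(np.log(1e-12), 1.0, n)  # cm^2/s, sampled per Monte Carlo draw

# Short-time approximation for area-related migration from a thick layer:
# m_A = 2 * (c_p0 * rho_p / 1000) * sqrt(D * t / pi)   [mg/cm^2]
m_area = 2 * (c_p0 * rho_p / 1000.0) * np.sqrt(D * t / np.pi) * 100  # mg/dm^2

print(f"median {np.median(m_area):.4f} mg/dm^2, "
      f"95th percentile {np.percentile(m_area, 95):.4f} mg/dm^2")
```

Reporting a high percentile alongside the median is what replaces the historical upper-limit concept: the estimate carries its own uncertainty rather than a single worst-case multiplier.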
Zeng, Xiaohui; Li, Jianhe; Peng, Liubao; Wang, Yunhua; Tan, Chongqing; Chen, Gannong; Wan, Xiaomin; Lu, Qiong; Yi, Lidan
2014-01-01
Maintenance gefitinib significantly prolonged progression-free survival (PFS) compared with placebo in patients from eastern Asia with locally advanced/metastatic non-small-cell lung cancer (NSCLC) who had completed four cycles (21 days per cycle) of first-line platinum-based combination chemotherapy without disease progression. The objective of the current study was to evaluate the cost-effectiveness of maintenance gefitinib therapy after four cycles of standard first-line platinum-based chemotherapy for patients with locally advanced or metastatic NSCLC with unknown EGFR mutation status, from a Chinese health care system perspective. A semi-Markov model was designed to evaluate the cost-effectiveness of maintenance gefitinib treatment. Two-parameter Weibull and log-logistic distributions were fitted to the PFS and overall survival curves independently. One-way and probabilistic sensitivity analyses were conducted to assess the stability of the model. The base-case analysis suggested that maintenance gefitinib would increase benefits over a 1-, 3-, 6- or 10-year time horizon, at an incremental $184,829, $19,214, $19,328, and $21,308 per quality-adjusted life-year (QALY) gained, respectively. The most sensitive variable in the cost-effectiveness analysis was the utility of PFS plus rash, followed by the utility of PFS plus diarrhoea, the utility of progressed disease, the price of gefitinib, the cost of follow-up treatment in the progressed-survival state, and the utility of PFS on oral therapy. The price of gefitinib is the parameter with the greatest potential to reduce the incremental cost per QALY. Probabilistic sensitivity analysis indicated that the probability of maintenance gefitinib being cost-effective was zero at a willingness-to-pay (WTP) threshold of $16,349 (3 × the per-capita gross domestic product of China). The sensitivity analyses all suggested that the model was robust. 
Maintenance gefitinib following first-line platinum-based chemotherapy for patients with locally advanced/metastatic NSCLC with unknown EGFR mutations is not cost-effective. Decreasing the price of gefitinib may be a preferential choice for meeting widely treatment demands in China.
Guerriero, Carla; Cairns, John; Roberts, Ian; Rodgers, Anthony; Whittaker, Robyn; Free, Caroline
2013-10-01
The txt2stop trial has shown that mobile-phone-based smoking cessation support doubles biochemically validated quitting at 6 months. This study examines the cost-effectiveness of smoking cessation support delivered by mobile phone text messaging. The lifetime incremental costs and benefits of adding text-based support to current practice are estimated from a UK NHS perspective using a Markov model. The cost-effectiveness was measured in terms of cost per quitter, cost per life year gained and cost per QALY gained. As in previous studies, smokers are assumed to face a higher risk of experiencing the following five diseases: lung cancer, stroke, myocardial infarction, chronic obstructive pulmonary disease, and coronary heart disease (i.e. the main fatal or disabling, but by no means the only, adverse effects of prolonged smoking). The treatment costs and health state values associated with these diseases were identified from the literature. The analysis was based on the age and gender distribution observed in the txt2stop trial. Effectiveness and cost parameters were varied in deterministic sensitivity analyses, and a probabilistic sensitivity analysis was also performed. The cost of text-based support per 1,000 enrolled smokers is £16,120, which, given an estimated 58 additional quitters at 6 months, equates to £278 per quitter. However, when the future NHS costs saved (as a result of reduced smoking) are included, text-based support would be cost saving. It is estimated that 18 LYs are gained per 1,000 smokers (0.3 LYs per quitter) receiving text-based support, and 29 QALYs are gained (0.5 QALYs per quitter). The deterministic sensitivity analysis indicated that changes in individual model parameters did not alter the conclusion that this is a cost-effective intervention. Similarly, the probabilistic sensitivity analysis indicated a >90 % chance that the intervention will be cost saving. 
This study shows that under a wide variety of conditions, personalised smoking cessation advice and support by mobile phone message is both beneficial for health and cost saving to a health system.
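The headline cost-per-quitter figure in the txt2stop analysis is simple arithmetic over the trial's incremental effect, before downstream NHS savings are counted:

```python
# Reproducing the abstract's arithmetic: £16,120 of text-based support per
# 1,000 enrolled smokers, yielding an estimated 58 additional quitters at 6 months.
cost_per_1000 = 16_120.0   # GBP
extra_quitters = 58

cost_per_quitter = cost_per_1000 / extra_quitters
print(f"cost per quitter: £{cost_per_quitter:.0f}")  # → cost per quitter: £278
```

The Markov model then offsets this against the avoided treatment costs of the five smoking-related diseases, which is why the intervention ends up cost saving overall.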
Data-Conditioned Distributions of Groundwater Recharge Under Climate Change Scenarios
NASA Astrophysics Data System (ADS)
McLaughlin, D.; Ng, G. C.; Entekhabi, D.; Scanlon, B.
2008-12-01
Groundwater recharge is likely to be impacted by climate change, with changes in precipitation amounts altering moisture availability and changes in temperature affecting evaporative demand. This could have major implications for sustainable aquifer pumping rates and contaminant transport into groundwater reservoirs in the future, thus making predictions of recharge under climate change very important. Unfortunately, in dry environments where groundwater resources are often most critical, low recharge rates are difficult to resolve due to high sensitivity to modeling and input errors. Some recent studies on climate change and groundwater have considered recharge using a suite of general circulation model (GCM) weather predictions, an obvious and key source of uncertainty. This work extends beyond those efforts by also accounting for uncertainty in other land-surface model inputs in a probabilistic manner. Recharge predictions are made using a range of GCM projections for a rain-fed cotton site in the semi-arid Southern High Plains region of Texas. Results showed that model simulations using a range of unconstrained literature-based parameter values produce highly uncertain and often misleading recharge rates. Thus, distributional recharge predictions are found using soil and vegetation parameters conditioned on current unsaturated zone soil moisture and chloride concentration observations; assimilation of observations is carried out with an ensemble importance sampling method. Our findings show that the predicted distribution shapes can differ for the various GCM conditions considered, underscoring the importance of probabilistic analysis over deterministic simulations. The recharge predictions indicate that the temporal distribution (over seasons and rain events) of climate change will be particularly critical for groundwater impacts. 
Overall, changes in recharge amounts and intensity were often more pronounced than changes in annual precipitation and temperature, thus suggesting high susceptibility of groundwater systems to future climate change. Our approach provides a probabilistic sensitivity analysis of recharge under potential climate changes, which will be critical for future management of water resources.
NASA Technical Reports Server (NTRS)
Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis
2010-01-01
Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission-based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the moon; lunar surface activities; ascent from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components/systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.
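A phased-mission model ultimately rolls per-phase event-tree results into a mission-level figure; in the simplest serial approximation the phase success probabilities multiply. The probabilities below are purely illustrative placeholders, not values from the CEV PRA:

```python
# Serial phased-mission roll-up with hypothetical per-phase success probabilities.
phases = {
    "launch (cargo vessel)": 0.995,
    "launch (crew vessel)": 0.995,
    "low-Earth-orbit rendezvous": 0.999,
    "translunar transit": 0.998,
    "lunar surface activities": 0.997,
    "lunar ascent": 0.996,
    "return to Earth": 0.995,
}

p_success = 1.0
for p in phases.values():
    p_success *= p  # independence across phases assumed in this sketch

print(f"mission success probability: {p_success:.4f} "
      f"(loss-of-mission risk about 1 in {1 / (1 - p_success):.0f})")
```

The fault trees behind each phase supply these probabilities in a real PRA, and shared-cause dependencies between phases break the simple product shown here.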
NASA Astrophysics Data System (ADS)
Mazzaracchio, Antonio; Marchetti, Mario
2010-03-01
Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, which uses the Monte Carlo technique, allows a simulation to run over stochastic series. This provides an uncertainty and sensitivity analysis that estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches such as the root-sum-square method. The developed tool was verified by comparing its results with those from previous work on probabilistic sizing methodologies for thermal protection systems, which are based on an industry-standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins for sizing heat shields currently proposed for vehicles using rigid aeroshells in future aerocapture missions at Neptune, and to identify the major sources of uncertainty in the material response.
Alternate Methods in Refining the SLS Nozzle Plug Loads
NASA Technical Reports Server (NTRS)
Burbank, Scott; Allen, Andrew
2013-01-01
Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: 1) the main engine startup pressure and 2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis, which showed a 31% higher safety factor compared to the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of the environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (a 1-in-370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach in other design and analysis activities can yield realistic yet adequately conservative solutions.
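The contrast between deterministic worst-case stacking and a probabilistic 3-sigma load can be reproduced schematically. The load model and distributions below are invented stand-ins for the historical pressure/temperature data, not SLS values:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Hypothetical environmental inputs standing in for five years of historical data.
pressure = rng.normal(14.7, 0.1, n)   # ambient pressure, psia
temp = rng.normal(295.0, 8.0, n)      # temperature, K

# Illustrative load model: load grows with pressure and falls with temperature.
load = pressure * (295.0 / temp)

# A "3-sigma" condition corresponds to roughly a 1-in-370 exceedance probability.
load_3sigma = np.quantile(load, 1.0 - 1.0 / 370.0)

# Deterministic worst-case stacking of the extremes seen in the sample.
worst_case = pressure.max() * (295.0 / temp.min())

print(f"3-sigma load {load_3sigma:.3f} vs worst-case stack {worst_case:.3f}")
```

The gap between the two numbers is the conservatism the probabilistic approach removes: simultaneous extremes of pressure and temperature are far rarer than either extreme alone.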
Development of probabilistic internal dosimetry computer code
NASA Astrophysics Data System (ADS)
Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki
2017-02-01
Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was developed. Based on this system, we developed a probabilistic internal-dose-assessment code in MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.
In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
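The propagation the record describes — uncertain measurement, uncertain biokinetic parameters, uncertain dose coefficient, all pushed through Monte Carlo to percentile summaries — can be sketched as below. Every distribution is an illustrative assumption, not an ICRP or site-specific value.

```python
import random

random.seed(7)

# Chain: measured activity -> intake -> committed dose, with each link
# uncertain. All distributions are invented for illustration.
doses = []
for _ in range(50_000):
    measured_bq = random.lognormvariate(3.0, 0.2)    # bioassay result (Bq)
    intake_fraction = random.betavariate(20, 80)     # biokinetic parameter
    dose_coeff = random.lognormvariate(-13.0, 0.3)   # dose coefficient (Sv/Bq)
    intake_bq = measured_bq / intake_fraction
    doses.append(intake_bq * dose_coeff)
doses.sort()

def percentile(sorted_xs, q):
    return sorted_xs[min(len(sorted_xs) - 1, int(q * len(sorted_xs)))]

# The same summary statistics the study reports for its sample scenarios
summary = {q: percentile(doses, q) for q in (0.025, 0.05, 0.5, 0.95, 0.975)}
```

A high percentile such as `summary[0.95]` is the kind of conservative statistic the abstract suggests for intervention decisions.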
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case where they were equal. The failed region patterns between the two methods are similar; however, differences arise because element elimination relieves stress in the deterministic method, causing probabilistic failed regions to continue to accumulate after the point at which no further deterministic failed regions would be predicted.
Both the probabilistic and deterministic methods indicate similar trends with regards to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
Probabilistic Risk Assessment: A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk and probability theory) in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.
Probabilistic simulation of uncertainties in thermal structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael
1990-01-01
Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment; and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
Herzer, Kurt R; Niessen, Louis; Constenla, Dagna O; Ward, William J; Pronovost, Peter J
2014-09-25
To assess the cost-effectiveness of a multifaceted quality improvement programme focused on reducing central line-associated bloodstream infections in intensive care units. Cost-effectiveness analysis using a decision tree model to compare programme to non-programme intensive care units. USA. Adult patients in the intensive care unit. Economic costs of the programme and of central line-associated bloodstream infections were estimated from the perspective of the hospital and presented in 2013 US dollars. Central line-associated bloodstream infections prevented, deaths averted due to central line-associated bloodstream infections prevented, and incremental cost-effectiveness ratios. Probabilistic sensitivity analysis was performed. Compared with current practice, the programme is strongly dominant and reduces bloodstream infections and deaths at no additional cost. The probabilistic sensitivity analysis showed that there was an almost 80% probability that the programme reduces bloodstream infections and the infections' economic costs to hospitals. The opportunity cost of a bloodstream infection to a hospital was the most important model parameter in these analyses. This multifaceted quality improvement programme, as it is currently implemented by hospitals on an increasingly large scale in the USA, likely reduces the economic costs of central line-associated bloodstream infections for US hospitals. Awareness among hospitals about the programme's benefits should enhance implementation. The programme's implementation has the potential to substantially reduce morbidity, mortality and economic costs associated with central line-associated bloodstream infections. Published by the BMJ Publishing Group Limited.
A probabilistic Hu-Washizu variational principle
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
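For reference, the deterministic three-field functional that the PHWVP randomizes treats displacement u, strain ε, and stress σ as independent fields. In the standard linear-elastic form (the probabilistic extension takes C, b, t̄, and ū as random quantities):

```latex
\Pi_{HW}(u,\varepsilon,\sigma)
  = \int_{\Omega} \Big[ \tfrac{1}{2}\,\varepsilon : C : \varepsilon
      + \sigma : \big(\nabla^{s} u - \varepsilon\big) - b \cdot u \Big] \, d\Omega
  - \int_{\Gamma_t} \bar{t} \cdot u \, d\Gamma
  - \int_{\Gamma_u} (\sigma n) \cdot (u - \bar{u}) \, d\Gamma
```

Stationarity with respect to the three fields recovers the constitutive law (σ = C : ε), compatibility (ε = ∇ˢu), and equilibrium with the boundary conditions, which is why the formulation admits probabilistic treatment of all of them at once.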
Cost-effectiveness analysis of neurocognitive-sparing treatments for brain metastases.
Savitz, Samuel T; Chen, Ronald C; Sher, David J
2015-12-01
Decisions regarding how to treat patients who have 1 to 3 brain metastases require important tradeoffs between controlling recurrences, side effects, and costs. In this analysis, the authors compared novel treatments versus usual care to determine the incremental cost-effectiveness ratio from a payer's (Medicare) perspective. Cost-effectiveness was evaluated using a microsimulation of a Markov model for 60 one-month cycles. The model used 4 simulated cohorts of patients aged 65 years with 1 to 3 brain metastases. The 4 cohorts had a median survival of 3, 6, 12, and 24 months to test the sensitivity of the model to different prognoses. The treatment alternatives evaluated included stereotactic radiosurgery (SRS) with 3 variants of salvage after recurrence (whole-brain radiotherapy [WBRT], hippocampal avoidance WBRT [HA-WBRT], SRS plus WBRT, and SRS plus HA-WBRT). The findings were tested for robustness using probabilistic and deterministic sensitivity analyses. Traditional radiation therapies remained cost-effective for patients in the 3-month and 6-month cohorts. In the cohorts with longer median survival, HA-WBRT and SRS plus HA-WBRT became cost-effective relative to traditional treatments. When the treatments that involved HA-WBRT were excluded, either SRS alone or SRS plus WBRT was cost-effective relative to WBRT alone. The deterministic and probabilistic sensitivity analyses confirmed the robustness of these results. HA-WBRT and SRS plus HA-WBRT were cost-effective for 2 of the 4 cohorts, demonstrating the value of controlling late brain toxicity with this novel therapy. Cost-effectiveness depended on patient life expectancy. SRS was cost-effective in the cohorts with short prognoses (3 and 6 months), whereas HA-WBRT and SRS plus HA-WBRT were cost-effective in the cohorts with longer prognoses (12 and 24 months). © 2015 American Cancer Society.
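A microsimulation of a Markov model with monthly cycles, as used in the study above, can be sketched as follows. The states, utilities, and transition probabilities here are invented and far simpler than the published model; they only show the mechanics of accumulating QALYs patient by patient.

```python
import random

random.seed(9)

# Toy patient-level Markov microsimulation with monthly cycles.
P_DEATH = 0.10      # per-cycle probability of death (assumed)
P_RECUR = 0.08      # per-cycle probability of intracranial recurrence (assumed)
CYCLES = 60         # 60 one-month cycles, as in the analysis

def simulate_patient():
    qalys, state = 0.0, "stable"
    for _ in range(CYCLES):
        qalys += (0.80 if state == "stable" else 0.60) / 12.0  # monthly utility
        r = random.random()
        if r < P_DEATH:
            return qalys
        if state == "stable" and r < P_DEATH + P_RECUR:
            state = "recurred"
    return qalys

mean_qalys = sum(simulate_patient() for _ in range(10_000)) / 10_000
```

Running one such cohort per treatment arm and differencing mean costs and `mean_qalys` yields the incremental cost-effectiveness ratios the abstract compares.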
The potential cost-effectiveness of vaccination against herpes zoster and post-herpetic neuralgia.
Brisson, Marc; Pellissier, James M; Camden, Stéphanie; Quach, Caroline; De Wals, Philippe
2008-01-01
A clinical trial has shown that a live-attenuated varicella-zoster virus vaccine is effective against herpes zoster (HZ) and post-herpetic neuralgia (PHN). The aim of this study was to examine the cost-effectiveness of vaccination against HZ and PHN in Canada. A cohort model was developed to estimate the burden of HZ and the cost-effectiveness of HZ vaccination, using Canadian population-based data. Different ages at vaccination were examined and probabilistic sensitivity analysis was performed. The economic evaluation was conducted from the ministry of health perspective and 5% discounting was used for costs and benefits. In Canada (population = 30 million), we estimate that each year there are 130,000 new cases of HZ, 17,000 cases of PHN and 20 deaths. Most of the pain and suffering is borne by adults over the age of 60 years and is due to PHN. Vaccinating 65-year-olds (HZ efficacy = 63%, PHN efficacy = 67%, no waning, cost/course = $150) is estimated to cost $33,000 per QALY-gained (90% CrI: 19,000-63,000). Assuming the cost per course of HZ vaccination is $150, probabilistic sensitivity analysis suggest that vaccinating between 65 and 75 years of age will likely yield cost-effectiveness ratios below $40,000 per Quality-Adjusted Life-Year (QALY) gained, while vaccinating adults older than 75 years will yield ratios less than $70,000 per QALY-gained. These results are most sensitive to the duration of vaccine protection and the cost of vaccination. In conclusion, results suggest that vaccinating adults between the ages of 65 and 75 years is likely to be cost-effective and thus to be a judicious use of scarce health care resources.
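The probabilistic sensitivity analysis behind a cost-per-QALY estimate with a credible interval, as reported above, amounts to resampling the model inputs and collecting the resulting ratios. The sketch below uses the $150 course cost from the abstract but invents every distribution; it is not the Canadian model.

```python
import random

random.seed(42)

COST_PER_COURSE = 150.0                             # from the abstract
ratios = []
for _ in range(10_000):
    # Assumed distributions, purely for illustration
    qaly_gain = 30.0 * random.betavariate(4, 996)   # QALYs gained per vaccinee
    averted_costs = random.gauss(25.0, 10.0)        # averted HZ/PHN care costs
    net_cost = COST_PER_COURSE - averted_costs
    ratios.append(net_cost / qaly_gain)             # cost per QALY gained
ratios.sort()

median_icer = ratios[len(ratios) // 2]
cri_90 = (ratios[int(0.05 * len(ratios))],          # 90% credible interval
          ratios[int(0.95 * len(ratios))])
```

Reporting the median with `cri_90` mirrors the "$33,000 per QALY (90% CrI: 19,000–63,000)" style of result in the abstract.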
Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik
2017-12-15
Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point-estimates and distributions of time-to-event and health economic outcomes. To assess sample size impact on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point-estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
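The two approaches the study compares can be illustrated on a one-parameter example: fitting an exponential time-to-event distribution to patient-level data, then reflecting parameter uncertainty either by bootstrap refits or by sampling from the parameter's asymptotic Normal distribution. The data and rate below are hypothetical.

```python
import math
import random
import statistics

random.seed(3)

# Hypothetical patient-level time-to-event data (exponential, rate 0.1/month)
data = [random.expovariate(0.1) for _ in range(500)]

def mle_rate(xs):
    # maximum-likelihood estimate of an exponential rate
    return len(xs) / sum(xs)

# Approach 1: non-parametric bootstrap -- resample patients and refit
boot_rates = [mle_rate(random.choices(data, k=len(data))) for _ in range(2000)]

# Approach 2: sample the parameter from its asymptotic Normal distribution
# (a multivariate Normal in the general, multi-parameter case)
mle = mle_rate(data)
se = mle / math.sqrt(len(data))
normal_rates = [random.gauss(mle, se) for _ in range(2000)]

spread_boot = statistics.stdev(boot_rates)
spread_normal = statistics.stdev(normal_rates)
```

At this sample size the two spreads agree closely; with very small samples the Normal approximation can produce extreme (even negative) rates, which is the failure mode the study observed at n = 25.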
Specifying design conservatism: Worst case versus probabilistic analysis
NASA Technical Reports Server (NTRS)
Miles, Ralph F., Jr.
1993-01-01
Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.
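The distinction drawn above — a worst-case analysis says nothing about failure probability, while a probabilistic analysis estimates it directly — can be shown with a toy demand-versus-capability check. All numbers are invented for illustration.

```python
import random

random.seed(0)

# Demand on the system: two uncertain contributions (assumed Normal)
def sampled_demand():
    return random.gauss(100.0, 5.0) + random.gauss(0.0, 3.0)

# Worst-case analysis: stack the 3-sigma extremes of both contributions
worst_case_demand = 100.0 + 3 * 5.0 + 3 * 3.0   # = 124.0

capability = 118.0
worst_case_ok = worst_case_demand <= capability   # worst case rejects the design

# Probabilistic analysis: estimate the failure probability directly
n = 200_000
failures = sum(sampled_demand() > capability for _ in range(n))
p_fail = failures / n
```

Here the worst-case check fails the design outright, while the Monte Carlo estimate shows the actual failure probability is on the order of 10⁻³, making the conservatism trade-off explicit.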
Phillips, Benjamin U; Dewan, Sigma; Nilsson, Simon R O; Robbins, Trevor W; Heath, Christopher J; Saksida, Lisa M; Bussey, Timothy J; Alsiö, Johan
2018-04-22
Dysregulation of the serotonin (5-HT) system is a pathophysiological component in major depressive disorder (MDD), a condition closely associated with abnormal emotional responsivity to positive and negative feedback. However, the precise mechanism through which 5-HT tone biases feedback responsivity remains unclear. 5-HT2C receptors (5-HT2CRs) are closely linked with aspects of depressive symptomatology, including abnormalities in reinforcement processes and response to stress. Thus, we aimed to determine the impact of 5-HT2CR function on response to feedback in biased reinforcement learning. We used two touchscreen assays designed to assess the impact of positive and negative feedback on probabilistic reinforcement in mice, including a novel valence-probe visual discrimination (VPVD) and a probabilistic reversal learning procedure (PRL). Systemic administration of a 5-HT2CR agonist and antagonist resulted in selective changes in the balance of feedback sensitivity bias on these tasks. Specifically, on VPVD, SB 242084, the 5-HT2CR antagonist, impaired acquisition of a discrimination dependent on appropriate integration of positive and negative feedback. On PRL, SB 242084 at 1 mg/kg resulted in changes in behaviour consistent with reduced sensitivity to positive feedback. In contrast, WAY 163909, the 5-HT2CR agonist, resulted in changes associated with increased sensitivity to positive feedback and decreased sensitivity to negative feedback. These results suggest that 5-HT2CRs tightly regulate feedback sensitivity bias in mice with consequent effects on learning and cognitive flexibility and specify a framework for the influence of 5-HT2CRs on sensitivity to reinforcement.
Life Predicted in a Probabilistic Design Space for Brittle Materials With Transient Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Palfi, Tamas; Reh, Stefan
2005-01-01
Analytical techniques have progressively become more sophisticated, and now we can consider the probabilistic nature of the entire space of random input variables on the lifetime reliability of brittle structures. This was demonstrated with NASA's CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code combined with the commercially available ANSYS/Probabilistic Design System (ANSYS/PDS), a probabilistic analysis tool that is an integral part of the ANSYS finite-element analysis program. ANSYS/PDS allows probabilistic loads, component geometry, and material properties to be considered in the finite-element analysis. CARES/Life predicts the time-dependent probability of failure of brittle material structures under generalized thermomechanical loading--such as that found in a turbine engine hot-section. Glenn researchers coupled ANSYS/PDS with CARES/Life to assess the effects of the stochastic variables of component geometry, loading, and material properties on the predicted life of the component for fully transient thermomechanical loading and cyclic loading.
Emulation for probabilistic weather forecasting
NASA Astrophysics Data System (ADS)
Cornford, Dan; Barillec, Remi
2010-05-01
Numerical weather prediction models are typically very expensive to run due to their complexity and resolution. Characterising the sensitivity of the model to its initial condition and/or to its parameters requires numerous runs of the model, which is impractical for all but the simplest models. To produce probabilistic forecasts requires knowledge of the distribution of the model outputs, given the distribution over the inputs, where the inputs include the initial conditions, boundary conditions and model parameters. Such uncertainty analysis for complex weather prediction models seems a long way off, given current computing power, with ensembles providing only a partial answer. One possible way forward that we develop in this work is the use of statistical emulators. Emulators provide an efficient statistical approximation to the model (or simulator) while quantifying the uncertainty introduced. In the emulator framework, a Gaussian process is fitted to the simulator response as a function of the simulator inputs using some training data. The emulator is essentially an interpolator of the simulator output and the response in unobserved areas is dictated by the choice of covariance structure and parameters in the Gaussian process. Suitable parameters are inferred from the data in a maximum likelihood, or Bayesian framework. Once trained, the emulator allows operations such as sensitivity analysis or uncertainty analysis to be performed at a much lower computational cost. The efficiency of emulators can be further improved by exploiting the redundancy in the simulator output through appropriate dimension reduction techniques. We demonstrate this using both Principal Component Analysis on the model output and a new reduced-rank emulator in which an optimal linear projection operator is estimated jointly with other parameters, in the context of simple low order models, such as the Lorenz 40D system. 
We present the application of emulators to probabilistic weather forecasting, where the construction of the emulator training set replaces the traditional ensemble model runs. Thus the actual forecast distributions are computed using the emulator conditioned on the 'ensemble runs', which are chosen to explore the plausible input space using relatively crude experimental design methods. One benefit here is that the ensemble does not need to be a sample from the true distribution of the input space, rather it should cover that input space in some sense. The probabilistic forecasts are computed using Monte Carlo methods sampling from the input distribution and using the emulator to produce the output distribution. Finally we discuss the limitations of this approach and briefly mention how we might use similar methods to learn the model error within a framework that incorporates a data-assimilation-like aspect, using emulators and learning complex model error representations. We suggest future directions for research in the area that will be necessary to apply the method to more realistic numerical weather prediction models.
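The core of the emulator framework described above — fit a Gaussian process to a handful of expensive simulator runs, then interpolate cheaply — can be sketched in one dimension with fixed kernel hyperparameters (a real emulator would infer them by maximum likelihood or Bayesian methods). The simulator and all settings below are toy assumptions.

```python
import math

# Stand-in for the expensive simulator (cheap here, so the emulator can
# be checked against it)
def simulator(x):
    return math.sin(3.0 * x) + 0.5 * x

def rbf(a, b, ell=0.4, var=1.0):
    # squared-exponential covariance with fixed hyperparameters
    return var * math.exp(-0.5 * ((a - b) / ell) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting (n is tiny here)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Training design: a handful of simulator runs (the 'ensemble')
xs = [2.0 * i / 7 for i in range(8)]
ys = [simulator(x) for x in xs]
K = [[rbf(a, b) + (1e-6 if i == j else 0.0) for j, b in enumerate(xs)]
     for i, a in enumerate(xs)]
alpha = solve(K, ys)

def emulate(x):
    # Gaussian-process posterior mean: k(x, X) K^{-1} y
    return sum(rbf(x, xi) * a for xi, a in zip(xs, alpha))
```

Once `emulate` is trained, Monte Carlo sampling over the input distribution through it, rather than through `simulator`, is what makes the probabilistic forecast affordable.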
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means (KM) approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM, called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty.
Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses.
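Fuzzy C-means, the probabilistic analog of K-means used in this work, replaces hard cluster assignments with per-point membership weights. A minimal sketch on synthetic 2-D points (stand-ins for GFP-peak topographies; fuzzifier m = 2, deterministic initialization, all values illustrative):

```python
import math
import random

random.seed(5)

def fcm(points, centers, m=2.0, iters=50):
    # Alternate membership and center updates until iters is reached
    for _ in range(iters):
        U = []
        for p in points:
            d = [max(1e-12, math.dist(p, ctr)) for ctr in centers]
            inv = [dd ** (-2.0 / (m - 1.0)) for dd in d]
            s = sum(inv)
            U.append([v / s for v in inv])   # memberships sum to 1 per point
        centers = []
        for j in range(len(U[0])):
            w = [U[i][j] ** m for i in range(len(points))]
            tot = sum(w)
            centers.append(tuple(
                sum(wi * p[k] for wi, p in zip(w, points)) / tot
                for k in range(len(points[0]))))
    return centers, U

# Two well-separated synthetic blobs
pts = ([(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(50)]
       + [(random.gauss(4, 0.3), random.gauss(4, 0.3)) for _ in range(50)])
centers, U = fcm(pts, centers=[pts[0], pts[-1]])
```

Each row of `U` is a distribution over clusters, which is exactly the uncertainty information a hard K-means assignment discards and which the study propagates into microstate labeling.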
Tu, H Y V; Pemberton, J; Lorenzo, A J; Braga, L H
2015-10-01
For infants with hydronephrosis, continuous antibiotic prophylaxis (CAP) may reduce urinary tract infections (UTIs); however, its value remains controversial. Recent studies have suggested that neonates with severe obstructive hydronephrosis are at an increased risk of UTIs, and support the use of CAP. Other studies have demonstrated the negligible risk for UTIs in the setting of suspected ureteropelvic junction obstruction and have highlighted the limited role of CAP in hydronephrosis. Furthermore, economic studies in this patient population have been sparse. This study aimed to evaluate whether the use of CAP is an efficient expenditure for preventing UTIs in children with high-grade hydronephrosis within the first 2 years of life. A decision model was used to estimate expected costs, clinical outcomes and quality-adjusted life years (QALYs) of CAP versus no CAP (Fig. 1). Cost data were collected from provincial databases and converted to 2013 Canadian dollars (CAD). Estimates of risks and health utility values were extracted from published literature. The analysis was performed over a time horizon of 2 years. One-way and probabilistic sensitivity analyses were carried out to assess uncertainty and robustness. Overall, CAP use was less costly and provided a minimal increase in health utility when compared to no CAP (Table). The mean cost over two years for CAP and no CAP was CAD$1571.19 and CAD$1956.44, respectively. The use of CAP reduced outpatient-managed UTIs by 0.21 infections and UTIs requiring hospitalization by 0.04 infections over 2 years. Cost-utility analysis revealed an increase of 0.0001 QALYs/year when using CAP. The CAP arm exhibited strong dominance over no CAP in all sensitivity analyses and across all willingness-to-pay thresholds. The use of CAP exhibited strong dominance in the economic evaluation, despite a small gain of 0.0001 QALYs/year. Whether this slight gain is clinically significant remains to be determined. 
However, small QALY gains have been reported in other pediatric economic evaluations. Strengths of this study included the use of data from a recent systematic review and meta-analysis, in addition to a comprehensive probabilistic sensitivity analysis. Limitations of this study included the use of estimates for UTI probabilities in the second year of life and health utility values, given that they were lacking in the literature. Spontaneous resolution of hydronephrosis and surgical management were also not implemented in this model. To prevent UTIs within the first 2 years of life in infants with high-grade hydronephrosis, this probabilistic model has shown that CAP use is a prudent expenditure of healthcare resources when compared to no CAP. Copyright © 2015 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.
A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration
NASA Technical Reports Server (NTRS)
Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce
2008-01-01
Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
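The probabilistic strategic analysis described above — simulate many "possible" campaign outcomes under event likelihoods and contingency rules, then compare against the deterministic plan — can be sketched with a single stochastic element. The 5% launch-failure probability and the retry rule below are invented for illustration, not values from the study.

```python
import random

random.seed(11)

# Deterministic plan: 6 missions flown in 6 consecutive launch windows.
P_FAIL = 0.05     # assumed per-launch failure probability
MISSIONS = 6

def campaign_windows():
    # Contingency rule: a failed mission is reflown at the next window
    windows = 0
    for _ in range(MISSIONS):
        while True:
            windows += 1
            if random.random() > P_FAIL:   # launch succeeds
                break
    return windows

durations = sorted(campaign_windows() for _ in range(20_000))
deterministic = MISSIONS                        # the planned schedule length
p95 = durations[int(0.95 * len(durations))]     # robustness metric
```

Comparing `p95` (or the full distribution of `durations`) against `deterministic` quantifies how robust the planned schedule is to adverse events, which is the comparison the methodology formalizes.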
Internal validation of STRmix™ for the interpretation of single source and mixed DNA profiles.
Moretti, Tamyra R; Just, Rebecca S; Kehl, Susannah C; Willis, Leah E; Buckleton, John S; Bright, Jo-Anne; Taylor, Duncan A; Onorato, Anthony J
2017-07-01
The interpretation of DNA evidence can entail analysis of challenging STR typing results. Genotypes inferred from low quality or quantity specimens, or mixed DNA samples originating from multiple contributors, can result in weak or inconclusive match probabilities when a binary interpretation method and necessary thresholds (such as a stochastic threshold) are employed. Probabilistic genotyping approaches, such as fully continuous methods that incorporate empirically determined biological parameter models, enable usage of more of the profile information and reduce subjectivity in interpretation. As a result, software-based probabilistic analyses tend to produce more consistent and more informative results regarding potential contributors to DNA evidence. Studies to assess and internally validate the probabilistic genotyping software STRmix™ for casework usage at the Federal Bureau of Investigation Laboratory were conducted using lab-specific parameters and more than 300 single-source and mixed contributor profiles. Simulated forensic specimens, including constructed mixtures that included DNA from two to five donors across a broad range of template amounts and contributor proportions, were used to examine the sensitivity and specificity of the system via more than 60,000 tests comparing hundreds of known contributors and non-contributors to the specimens. Conditioned analyses, concurrent interpretation of amplification replicates, and application of an incorrect contributor number were also performed to further investigate software performance and probe the limitations of the system. In addition, the results from manual and probabilistic interpretation of both prepared and evidentiary mixtures were compared. 
The findings support that STRmix™ is sufficiently robust for implementation in forensic laboratories, offering numerous advantages over historical methods of DNA profile analysis and greater statistical power for the estimation of evidentiary weight, and can be used reliably in human identification testing. With few exceptions, likelihood ratio results reflected intuitively correct estimates of the weight of the genotype possibilities and known contributor genotypes. This comprehensive evaluation provides a model in accordance with SWGDAM recommendations for internal validation of a probabilistic genotyping system for DNA evidence interpretation. Copyright © 2017. Published by Elsevier B.V.
Global/local methods for probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.
1993-01-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.
Global/local methods for probabilistic structural analysis
NASA Astrophysics Data System (ADS)
Millwater, H. R.; Wu, Y.-T.
1993-04-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.
Carlson, Josh J; Suh, Kangho; Orfanos, Panos; Wong, William
2018-04-01
The recently completed ALEX trial demonstrated that alectinib improved progression-free survival and delayed time to central nervous system progression compared with crizotinib in patients with anaplastic lymphoma kinase-positive non-small-cell lung cancer. However, the long-term clinical and economic impact of using alectinib vs. crizotinib has not been evaluated. The objective of this study was to determine the potential cost utility of alectinib vs. crizotinib from a US payer perspective. A cost-utility model was developed using partition survival methods and three health states: progression-free, post-progression, and death. ALEX trial data informed the progression-free and overall survival estimates. Costs included drug treatments and supportive care (central nervous system and non-central nervous system). Utility values were obtained from trial data and literature. Sensitivity analyses included one-way and probabilistic sensitivity analyses. Treatment with alectinib vs. crizotinib resulted in a gain of 0.91 life-years, 0.87 quality-adjusted life-years, and incremental costs of US$34,151, resulting in an incremental cost-effectiveness ratio of US$39,312/quality-adjusted life-year. Drug costs and utilities in the progression-free health state were the main drivers of the model in the one-way sensitivity analysis. From the probabilistic sensitivity analysis, alectinib had a 64% probability of being cost effective at a willingness-to-pay threshold of US$100,000/quality-adjusted life-year. Alectinib increased time in the progression-free state and quality-adjusted life-years vs. crizotinib. The marginal cost increase was reflective of longer treatment durations in the progression-free state. Central nervous system-related costs were considerably lower with alectinib. Our results suggest that compared with crizotinib, alectinib may be a cost-effective therapy for treatment-naïve patients with anaplastic lymphoma kinase-positive non-small-cell lung cancer.
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
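The kind of uncertainty propagation PICAN performs can be illustrated with the simplest micromechanics relation, the rule of mixtures for the longitudinal ply modulus; the distributions below are hypothetical stand-ins for constituent scatter, not PICAN's actual models.

```python
import random
import statistics

def rule_of_mixtures(E_f, E_m, Vf):
    """Longitudinal modulus of a unidirectional ply (simplest micromechanics)."""
    return Vf * E_f + (1.0 - Vf) * E_m

def simulate_modulus(n=20_000, seed=0):
    """Propagate constituent (fiber/matrix) scatter to the ply modulus
    by Monte Carlo simulation; returns (mean, standard deviation)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        E_f = rng.gauss(230.0, 11.5)   # fiber modulus, GPa (assumed 5% c.o.v.)
        E_m = rng.gauss(3.5, 0.35)     # matrix modulus, GPa (assumed 10% c.o.v.)
        Vf = rng.gauss(0.60, 0.02)     # fiber volume fraction (assumed scatter)
        samples.append(rule_of_mixtures(E_f, E_m, Vf))
    return statistics.mean(samples), statistics.stdev(samples)

mean_E, std_E = simulate_modulus()     # mean ≈ 0.6*230 + 0.4*3.5 ≈ 139.4 GPa
```

Comparing the sampled standard deviation against the contribution of each input taken alone is the brute-force version of the sensitivity ranking the abstract describes.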
Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.; Perkins, D.
2009-01-01
We produce probabilistic seismic-hazard assessments for the central Apennines, Italy, using time-dependent models that are characterized using a Brownian passage time recurrence model. Using aperiodicity parameters, α, of 0.3, 0.5, and 0.7, we examine the sensitivity of the probabilistic ground motion and its deaggregation to these parameters. For the seismic source model we incorporate both smoothed historical seismicity over the area and geological information on faults. We use the maximum magnitude model for the fault sources together with a uniform probability of rupture along the fault (floating fault model) to model fictitious faults to account for earthquakes that cannot be correlated with known geologic structural segmentation.
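The Brownian passage time (inverse Gaussian) recurrence density and the conditional rupture probability it yields can be sketched as follows; the mean recurrence interval, elapsed time, and forecast window used in the usage lines are hypothetical, chosen only to show how the aperiodicity α enters.

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian passage time (inverse Gaussian) density; mu is the mean
    recurrence interval, alpha the aperiodicity."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, n=20_000):
    """CDF by trapezoidal integration -- adequate for illustration."""
    if t <= 0.0:
        return 0.0
    h = t / n
    s = 0.5 * (bpt_pdf(1e-12, mu, alpha) + bpt_pdf(t, mu, alpha))
    for i in range(1, n):
        s += bpt_pdf(i * h, mu, alpha)
    return s * h

def conditional_probability(t_elapsed, dt, mu, alpha):
    """P(rupture within the next dt years | quiescent for t_elapsed years)."""
    F_t = bpt_cdf(t_elapsed, mu, alpha)
    return (bpt_cdf(t_elapsed + dt, mu, alpha) - F_t) / (1.0 - F_t)

# hypothetical fault: mean recurrence 100 yr, 80 yr since last event
F_mean = bpt_cdf(100.0, 100.0, 0.5)
p30 = conditional_probability(80.0, 30.0, 100.0, 0.5)
```

Re-evaluating `p30` for α of 0.3, 0.5, and 0.7 reproduces, in miniature, the sensitivity study the abstract describes.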
Probabilistic assessment of smart composite structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael C.
1994-01-01
A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.
Probabilistic simulation of the human factor in structural reliability
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1993-01-01
A formal approach is described in an attempt to computationally simulate the probable ranges of uncertainties of the human factor in structural probabilistic assessments. A multi-factor interaction equation (MFIE) model has been adopted for this purpose. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are considered to demonstrate the concept. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Probabilistic sensitivity studies are subsequently performed to assess the suitability of the MFIE and the validity of the whole approach. Results obtained show that the uncertainties range from five to thirty percent for the most optimistic case, assuming one hundred percent for no error.
Probabilistic simulation of the human factor in structural reliability
NASA Astrophysics Data System (ADS)
Chamis, Christos C.; Singhal, Surendra N.
1994-09-01
The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).
Probabilistic Simulation of the Human Factor in Structural Reliability
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Singhal, Surendra N.
1994-01-01
The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).
Large-scale automated histology in the pursuit of connectomes.
Kleinfeld, David; Bharioke, Arjun; Blinder, Pablo; Bock, Davi D; Briggman, Kevin L; Chklovskii, Dmitri B; Denk, Winfried; Helmstaedter, Moritz; Kaufhold, John P; Lee, Wei-Chung Allen; Meyer, Hanno S; Micheva, Kristina D; Oberlaender, Marcel; Prohaska, Steffen; Reid, R Clay; Smith, Stephen J; Takemura, Shinya; Tsai, Philbert S; Sakmann, Bert
2011-11-09
How does the brain compute? Answering this question necessitates neuronal connectomes, annotated graphs of all synaptic connections within defined brain areas. Further, understanding the energetics of the brain's computations requires vascular graphs. The assembly of a connectome requires sensitive hardware tools to measure neuronal and neurovascular features in all three dimensions, as well as software and machine learning for data analysis and visualization. We present the state of the art on the reconstruction of circuits and vasculature that link brain anatomy and function. Analysis at the scale of tens of nanometers yields connections between identified neurons, while analysis at the micrometer scale yields probabilistic rules of connection between neurons and exact vascular connectivity.
Large-Scale Automated Histology in the Pursuit of Connectomes
Bharioke, Arjun; Blinder, Pablo; Bock, Davi D.; Briggman, Kevin L.; Chklovskii, Dmitri B.; Denk, Winfried; Helmstaedter, Moritz; Kaufhold, John P.; Lee, Wei-Chung Allen; Meyer, Hanno S.; Micheva, Kristina D.; Oberlaender, Marcel; Prohaska, Steffen; Reid, R. Clay; Smith, Stephen J.; Takemura, Shinya; Tsai, Philbert S.; Sakmann, Bert
2011-01-01
How does the brain compute? Answering this question necessitates neuronal connectomes, annotated graphs of all synaptic connections within defined brain areas. Further, understanding the energetics of the brain's computations requires vascular graphs. The assembly of a connectome requires sensitive hardware tools to measure neuronal and neurovascular features in all three dimensions, as well as software and machine learning for data analysis and visualization. We present the state of the art on the reconstruction of circuits and vasculature that link brain anatomy and function. Analysis at the scale of tens of nanometers yields connections between identified neurons, while analysis at the micrometer scale yields probabilistic rules of connection between neurons and exact vascular connectivity. PMID:22072665
NASA Technical Reports Server (NTRS)
1992-01-01
The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.
Cost-effectiveness analysis of treatment strategies for initial Clostridium difficile infection.
Varier, R U; Biltaji, E; Smith, K J; Roberts, M S; Jensen, M K; LaFleur, J; Nelson, R E
2014-12-01
Clostridium difficile infection (CDI) is costly. Current guidelines recommend metronidazole as first-line therapy and vancomycin as an alternative. Recurrence is common. Faecal microbiota transplantation (FMT) is an effective therapy for recurrent CDI (RCDI). This study explores the cost-effectiveness of FMT, vancomycin and metronidazole for initial CDI. We constructed a decision-analytic computer simulation using inputs from published literature to compare FMT with a 10-14-day course of oral metronidazole or vancomycin for initial CDI. Parameters included cure rates (baseline value (range)) for metronidazole (80% (65-85%)), vancomycin (90% (88-92%)) and FMT (91% (83-100%)). Direct costs of metronidazole, vancomycin and FMT, adjusted to 2011 dollars, were $57 ($43-72), $1347 ($1195-1499) and $1086 ($815-1358), respectively. Our effectiveness measure was quality-adjusted life years (QALYs). One-way and probabilistic sensitivity analyses were conducted from the third-party payer perspective. Analysis using baseline values showed that FMT ($1669, 0.242 QALYs) dominated (i.e. was less costly and more effective) vancomycin ($1890, 0.241 QALYs). FMT was more costly and more effective than metronidazole ($1167, 0.238 QALYs), yielding an incremental cost-effectiveness ratio (ICER) of $124,964/QALY. One-way sensitivity analyses showed that metronidazole dominated both strategies if its probability of cure were >90%; FMT dominated if it cost <$584. In a probabilistic sensitivity analysis at a willingness-to-pay threshold of $100,000/QALY, metronidazole was favoured in 55% of model iterations; FMT was favoured in 38%. Metronidazole, as the first-line treatment for CDIs, is less costly. FMT and vancomycin are more effective. However, FMT is less likely to be economically favourable, and vancomycin is unlikely to be favourable as first-line therapy when compared with FMT.
© 2014 The Authors Clinical Microbiology and Infection © 2014 European Society of Clinical Microbiology and Infectious Diseases.
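The dominance logic and the ICER reported in the abstract above follow from simple arithmetic on the per-strategy cost and QALY totals; the rounded inputs below reproduce the published figures only approximately.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio in $/QALY; a strategy that is
    both cheaper and more effective 'dominates' and needs no ratio."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"
    return d_cost / d_qaly

# FMT vs vancomycin: less costly AND more effective (values from the abstract)
fmt_vs_vanc = icer(1669, 0.242, 1890, 0.241)
# FMT vs metronidazole: more costly and more effective, so a ratio is reported
fmt_vs_metro = icer(1669, 0.242, 1167, 0.238)  # ~$125,500/QALY with rounded inputs
```

The small gap between ~$125,500 and the published $124,964/QALY reflects rounding of the cost and QALY inputs, not a different calculation.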
Tadmouri, Abir; Blomkvist, Josefin; Landais, Cécile; Seymour, Jerome; Azmoun, Alexandre
2018-02-01
Although left ventricular assist devices (LVADs) are currently approved for coverage and reimbursement in France, no French cost-effectiveness (CE) data are available to support this decision. This study aimed at estimating the CE of LVAD compared with medical management in the French health system. Individual patient data from the 'French hospital discharge database' (Medicalization of information systems program) were analysed using the Kaplan-Meier method. Outcomes were time to death, time to heart transplantation (HTx), and time to death after HTx. A micro-costing method was used to calculate the monthly costs extracted from the Program for the Medicalization of Information Systems. A multistate Markov monthly cycle model was developed to assess CE. The analysis over a lifetime horizon was performed from the perspective of the French healthcare payer; discount rates were 4%. Probabilistic and deterministic sensitivity analyses were performed. Outcomes were quality-adjusted life years (QALYs) and incremental CE ratio (ICER). Mean QALY for an LVAD patient was 1.5 at a lifetime cost of €190 739, delivering a probabilistic ICER of €125 580/QALY [95% confidence interval: 105 587 to 150 314]. The sensitivity analysis showed that the ICER was mainly sensitive to two factors: (i) the high acquisition cost of the device and (ii) the device performance in terms of patient survival. Our economic evaluation showed that the use of LVAD in patients with end-stage heart failure yields greater benefit in terms of survival than medical management, at an extra lifetime cost exceeding €100 000/QALY. Technological advances and reductions in device cost should therefore improve overall CE. © 2017 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.
NASA Astrophysics Data System (ADS)
Ha, Taesung
A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in the human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in the parameters of that model. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as a sensitivity analysis in the PRA model.
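The competing-random-variables idea behind such an HEP estimate can be sketched directly: draw a phenomenological time and an operator performance time, and count the fraction of trials in which the operator is too slow. The log-normal parameters below are hypothetical placeholders, not the study's fitted distributions.

```python
import math
import random

def estimate_hep(n=100_000, seed=0):
    """HEP = P(performance time > phenomenological time), by Monte Carlo."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        t_phenom = rng.lognormvariate(math.log(30.0), 0.3)   # time to core damage (min), assumed
        t_perform = rng.lognormvariate(math.log(15.0), 0.5)  # operator action time (min), assumed
        if t_perform > t_phenom:
            failures += 1
    return failures / n

hep = estimate_hep()
```

Swapping in alternative distributions for either time and re-running is exactly the distributional sensitivity study the abstract describes.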
Haucke, Florian
2010-11-01
Radon is a naturally occurring inert radioactive gas found in soils and rocks that can accumulate in dwellings, and is associated with an increased risk of lung cancer. This study aims to analyze the cost effectiveness of different intervention strategies to reduce radon concentrations in existing German dwellings. The cost effectiveness analysis (CEA) was conducted as a scenario analysis, where each scenario represents a specific regulatory regime. A decision theoretic model was developed, which reflects accepted recommendations for radon screening and mitigation and uses most up-to-date data on radon distribution and relative risks. The model was programmed to account for compliance with respect to the single steps of radon intervention, as well as data on the sensitivity/specificity of radon tests. A societal perspective was adopted to calculate costs and effects. All scenarios were calculated for different action levels. Cost effectiveness was measured in costs per averted case of lung cancer, costs per life year gained and costs per quality adjusted life year (QALY) gained. Univariate and multivariate deterministic and probabilistic sensitivity analyses (SA) were performed. Probabilistic sensitivity analyses were based on Monte Carlo simulations with 5000 model runs. The results show that legal regulations with mandatory screening and mitigation for indoor radon levels >100 Bq/m(3) are most cost effective. Incremental cost effectiveness compared to the no mitigation base case is 25,181 euro (95% CI: 7371 euro-90,593 euro) per QALY gained. Other intervention strategies focussing primarily on the personal responsibility for screening and/or mitigative actions show considerably worse cost effectiveness ratios. However, targeting radon intervention to radon-prone areas is significantly more cost effective. Most of the uncertainty that surrounds the results can be ascribed to the relative risk of radon exposure. 
It can be concluded that in the light of international experience a legal regulation requiring radon screening and, if necessary, mitigation is justifiable under the terms of CEA. Copyright 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Khabbazan, Mohammad Mohammadi; Roshan, Elnaz; Held, Hermann
2017-04-01
In principle solar radiation management (SRM) offers an option to ameliorate anthropogenic temperature rise. However we cannot expect it to simultaneously compensate for anthropogenic changes in further climate variables in a perfect manner. Here, we ask to what extent a proponent of the 2°C-temperature target would apply SRM in conjunction with mitigation in view of global or regional disparities in precipitation changes. We apply cost-risk analysis (CRA), which is a decision analytic framework that makes a trade-off between the expected welfare-loss from climate policy costs and the climate risks from transgressing a climate target. Here, in both global-scale and 'Giorgi'-regional-scale analyses, we evaluate the optimal mixture of SRM and mitigation under probabilistic information about climate sensitivity. To do so, we generalize CRA for the sake of including not only temperature risk, but also globally aggregated and regionally disaggregated precipitation risks. Social welfare is maximized for the following three valuation scenarios: temperature-risk-only, precipitation-risk-only, and equally weighted both-risks. For now, the Giorgi regions are treated by equal weight. We find that for regionally differentiated precipitation targets, the usage of SRM will be comparably more restricted. In the course of time, a cooling of up to 1.3°C can be attributed to SRM for the latter scenario and for a median climate sensitivity of 3°C (for a global target only, this number reduces by 0.5°C). Our results indicate that although SRM would almost completely substitute for mitigation in the globally aggregated analysis, it only saves 70% to 75% of the welfare-loss compared to a purely mitigation-based analysis (from economic costs and climate risks, approximately 4% in terms of BGE) when considering regional precipitation risks in precipitation-risk-only and both-risks scenarios. 
It remains to be shown how the inclusion of further risks or different regional weights would change that picture.
Schechter, Clyde B; Basch, Charles E; Caban, Arlene; Walker, Elizabeth A
2008-01-01
In a clinical trial, we have previously shown that a telephone intervention can significantly increase participation in dilated fundus examination (DFE) screening among low-income adults with diabetes. Here the costs and cost-effectiveness ratio of this intervention are calculated. Intervention effectiveness was estimated as the difference in DFE utilization between the telephone intervention and print groups from the clinical trial multiplied by the size of the telephone intervention group. A micro-costing approach was used. Personnel time was aggregated from logs kept during the clinical trial of the intervention. Wage rates were taken from a commercial compensation database. Telephone charges were estimated based on prevailing fees. The cost-effectiveness ratio was calculated as the ratio of total costs of the intervention to the number of DFEs gained by the intervention. A sensitivity analysis estimated the cost-effectiveness of a more limited telephone intervention. A probabilistic sensitivity analysis using bootstrap samples from the clinical trial results quantified the uncertainties in resource utilization and intervention effectiveness. Net intervention costs were US$18,676.06, with an associated gain of 43.7 DFEs and 16.4 new diagnoses of diabetic retinopathy. The cost-effectiveness ratio is US$427.37 per DFE gained. A restricted intervention limiting the number of calls to 5, as opposed to 7, would achieve the same results, but would cost approximately 17% less. In the probabilistic sensitivity analysis, the 5th and 95th percentiles of the cost-effectiveness ratio were US$304.05 and US$692.52 per DFE gained, respectively. Our telephone intervention is more expensive than simple mail or telephone reminders used in other settings to promote preventive care; it is, however, also considerably more effective, and is effective in a low-income minority population at greater risk for diabetes complications. 
The costs are dominated by labor costs, and may be substantially defrayed, without loss of effectiveness, by restricting the number of telephone calls to 5 per patient. PMID:19668428
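The bootstrap step of a probabilistic sensitivity analysis like the one above resamples patient-level (cost, effect) pairs with replacement and recomputes the cost-effectiveness ratio each time; percentiles of the resulting distribution give interval estimates like the 5th/95th percentiles quoted. The per-patient data below are made up purely for illustration.

```python
import random

def bootstrap_cer(costs, effects, n_boot=2000, seed=1):
    """Empirical 5th/95th percentile interval for total cost per unit effect."""
    rng = random.Random(seed)
    data = list(zip(costs, effects))
    ratios = []
    for _ in range(n_boot):
        sample = [rng.choice(data) for _ in data]          # resample with replacement
        ratios.append(sum(c for c, _ in sample) / sum(e for _, e in sample))
    ratios.sort()
    return ratios[int(0.05 * n_boot)], ratios[int(0.95 * n_boot)]

# hypothetical per-patient data: (cost in US$, DFEs gained)
costs = [380.0, 410.0, 425.0, 440.0, 455.0, 470.0]
effects = [0.8, 1.0, 1.2, 0.9, 1.1, 1.0]
lo, hi = bootstrap_cer(costs, effects)
```

Because the ratio is recomputed on each resample, the interval reflects joint uncertainty in resource utilization and effectiveness without assuming a parametric form.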
NASA Technical Reports Server (NTRS)
Fayssal, Safie; Weldon, Danny
2008-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I Launch Vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.
Probabilistic Simulation of Multi-Scale Composite Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2012-01-01
A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in composite typical laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference indicates that nothing fundamental has been done since that time.
Lesage, Elise; Aronson, Sarah E; Sutherland, Matthew T; Ross, Thomas J; Salmeron, Betty Jo; Stein, Elliot A
2017-06-01
Withdrawal from nicotine is an important contributor to smoking relapse. Understanding how reward-based decision making is affected by abstinence and by pharmacotherapies such as nicotine replacement therapy and varenicline tartrate may aid cessation treatment. To independently assess the effects of nicotine dependence and stimulation of the nicotinic acetylcholine receptor on the ability to interpret valence information (reward sensitivity) and subsequently alter behavior as reward contingencies change (cognitive flexibility) in a probabilistic reversal learning task. Nicotine-dependent smokers and nonsmokers completed a probabilistic reversal learning task during acquisition of functional magnetic resonance imaging (fMRI) in a 2-drug, double-blind placebo-controlled crossover design conducted from January 21, 2009, to September 29, 2011. Smokers were abstinent from cigarette smoking for 12 hours for all sessions. In a fully Latin square fashion, participants in both groups underwent MRI twice while receiving varenicline and twice while receiving a placebo pill, wearing either a nicotine or a placebo patch. Imaging analysis was performed from June 15, 2015, to August 10, 2016. A well-established computational model captured effects of smoking status and administration of nicotine and varenicline on probabilistic reversal learning choice behavior. Neural effects of smoking status, nicotine, and varenicline were tested for on MRI contrasts that captured reward sensitivity and cognitive flexibility. The study included 24 nicotine-dependent smokers (12 women and 12 men; mean [SD] age, 35.8 [9.9] years) and 20 nonsmokers (10 women and 10 men; mean [SD] age, 30.4 [7.2] years). Computational modeling indicated that abstinent smokers were biased toward response shifting and that their decisions were less sensitive to the available evidence, suggesting increased impulsivity during withdrawal. These behavioral impairments were mitigated with nicotine and varenicline. 
Similarly, decreased mesocorticolimbic activity associated with cognitive flexibility in abstinent smokers was restored to the level of nonsmokers following stimulation of nicotinic acetylcholine receptors (familywise error-corrected P < .05). Conversely, neural signatures of decreased reward sensitivity in smokers (vs nonsmokers; familywise error-corrected P < .05) in the dorsal striatum and anterior cingulate cortex were not mitigated by nicotine or varenicline. There was a double dissociation between the effects of chronic nicotine dependence on neural representations of reward sensitivity and acute effects of stimulation of nicotinic acetylcholine receptors on behavioral and neural signatures of cognitive flexibility in smokers. These chronic and acute pharmacologic effects were observed in overlapping mesocorticolimbic regions, suggesting that available pharmacotherapies may alleviate deficits in the same circuitry for certain mental computations but not for others. clinicaltrials.gov Identifier: NCT00830739.
Probabilistic load simulation: Code development status
NASA Astrophysics Data System (ADS)
Newell, J. F.; Ho, H.
1991-05-01
The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra experienced by space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system, which also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads, with load information supplied from the CLS knowledge base.
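The division of labor described above, with a knowledge base supplying load data and a simulation module doing the numerics, can be caricatured by the numerical half alone. The sketch below perturbs a nominal load profile with a random run-to-run scale factor and point-wise fluctuation, then extracts a high-percentile peak load; the profile and all distribution parameters are invented for illustration and are not CLS data:

```python
import math
import random

def simulate_load_spectrum(base_profile, n_runs=5000, seed=7):
    """Generate probabilistic peak loads from a nominal (deterministic) profile.

    Each run scales the nominal profile by a random systematic factor and adds
    a fluctuating component, then records the peak load of that run. Returns
    the 99th-percentile peak load over all runs.
    """
    rng = random.Random(seed)
    peaks = []
    for _ in range(n_runs):
        scale = rng.lognormvariate(0.0, 0.05)                # systematic scatter
        run = [scale * p + rng.gauss(0.0, 0.02 * p) for p in base_profile]
        peaks.append(max(run))
    peaks.sort()
    return peaks[int(0.99 * n_runs)]

# Hypothetical nominal thrust-like profile (arbitrary units).
nominal = [math.sin(i / 10.0) * 50.0 + 100.0 for i in range(100)]
```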
Screen or not to screen for peripheral arterial disease: guidance from a decision model.
Vaidya, Anil; Joore, Manuela A; Ten Cate-Hoek, Arina J; Ten Cate, Hugo; Severens, Johan L
2014-01-29
Asymptomatic Peripheral Arterial Disease (PAD) is associated with greater risk of acute cardiovascular events. This study aims to determine the cost-effectiveness of one-time PAD screening using the Ankle Brachial Index (ABI) test and subsequent antiplatelet preventive treatment (low-dose aspirin or clopidogrel) in individuals at high risk for acute cardiovascular events, compared to no screening and no treatment, using decision-analytic modelling. A probabilistic Markov model was developed to evaluate the lifetime cost-effectiveness of the strategy of selective PAD screening and consequent preventive treatment compared to no screening and no preventive treatment. The analysis was conducted from the Dutch societal perspective and, to address decision uncertainty, probabilistic sensitivity analysis was performed. Results were based on average values of 1000 Monte Carlo simulations, using discount rates of 1.5% and 4% for effects and costs respectively. One-way sensitivity analyses were performed to identify the two most influential model parameters affecting model outputs. A two-way sensitivity analysis was then conducted for combinations of values tested for these two most influential parameters. For the PAD screening strategy, life years and quality-adjusted life years gained were 21.79 and 15.66 respectively at a lifetime cost of 26,548 Euros. Compared to no screening and treatment (20.69 life years, 15.58 quality-adjusted life years, 28,052 Euros), these results indicate that PAD screening and treatment is a dominant strategy. The cost-effectiveness acceptability curves show an 88% probability of PAD screening being cost-effective at a willingness-to-pay (WTP) threshold of 40,000 Euros. In a scenario analysis using clopidogrel as an alternative antiplatelet drug, the PAD screening strategy remained dominant.
This decision analysis suggests that targeted ABI screening and subsequent secondary prevention of cardiovascular events using low-dose aspirin or clopidogrel in the identified patients is a cost-effective strategy. Implementation of targeted PAD screening and subsequent treatment in primary care practices and in public health programs is likely to improve societal health and to save health care costs by reducing catastrophic cardiovascular events.
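The acceptability-curve logic reported above reduces to a net-monetary-benefit Monte Carlo: count the fraction of draws in which WTP x incremental QALYs minus incremental cost is positive. The distributions below are loose illustrations keyed to the reported point estimates (0.08 incremental QALYs; roughly 1,504 Euros saved), not the published model inputs:

```python
import random

def ceac_probability(n=10000, wtp=40000, seed=3):
    """Fraction of Monte Carlo draws in which screening has positive
    incremental net monetary benefit at the given WTP threshold."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        d_qaly = rng.gauss(0.08, 0.05)        # incremental QALYs (assumed spread)
        d_cost = rng.gauss(-1504.0, 1000.0)   # incremental cost; negative = saving
        if wtp * d_qaly - d_cost > 0.0:       # net monetary benefit > 0
            wins += 1
    return wins / n
```

A dominant strategy (more QALYs, lower cost) drives this fraction close to 1 at any positive WTP, which is the shape the abstract's 88%-at-40,000-Euros result gestures at.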
Zeng, Xiaohui; Peng, Liubao; Li, Jianhe; Chen, Gannong; Tan, Chongqing; Wang, Siying; Wan, Xiaomin; Ouyang, Lihui; Zhao, Ziying
2013-01-01
Continuation maintenance treatment with pemetrexed is approved by current clinical guidelines as a category 2A recommendation after induction therapy with cisplatin and pemetrexed chemotherapy (CP strategy) for patients with advanced nonsquamous non-small-cell lung cancer (NSCLC). However, the cost-effectiveness of the treatment remains unclear. We completed a trial-based assessment, from the perspective of the Chinese health care system, of the cost-effectiveness of maintenance pemetrexed treatment after a CP strategy for patients with advanced nonsquamous NSCLC. A Markov model was developed to estimate costs and benefits. It was based on a clinical trial that compared continuation maintenance pemetrexed therapy plus best supportive care (BSC) versus placebo plus BSC after a CP strategy for advanced nonsquamous NSCLC. Sensitivity analyses were conducted to assess the stability of the model. The model base case analysis suggested that continuation maintenance pemetrexed therapy after a CP strategy would increase benefits in a 1-, 2-, 5-, or 10-year time horizon, with incremental costs of $183,589.06, $126,353.16, $124,766.68, and $124,793.12 per quality-adjusted life-year gained, respectively. The most sensitive influential variable in the cost-effectiveness analysis was the utility of the progression-free survival state, followed by proportion of patients with postdiscontinuation therapy in both arms, proportion of BSC costs for PFS versus progressed survival state, and cost of pemetrexed. Probabilistic sensitivity analysis indicated that the cost-effective probability of adding continuation maintenance pemetrexed therapy to BSC was zero. One-way and probabilistic sensitivity analyses revealed that the Markov model was robust. Continuation maintenance of pemetrexed after a CP strategy for patients with advanced nonsquamous NSCLC is not cost-effective based on a recent clinical trial. 
Decreasing the price or adjusting the dosage of pemetrexed may be a better option for meeting the treatment demands of Chinese patients. Copyright © 2013 Elsevier HS Journals, Inc. All rights reserved.
Benchmarking Exercises To Validate The Updated ELLWF GoldSim Slit Trench Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, G. A.; Hiergesell, R. A.
2013-11-12
The Savannah River National Laboratory (SRNL) results of the 2008 Performance Assessment (PA) (WSRC, 2008) sensitivity/uncertainty analyses conducted for the trenches located in the E-Area Low-Level Waste Facility (ELLWF) were subject to review by the United States Department of Energy (U.S. DOE) Low-Level Waste Disposal Facility Federal Review Group (LFRG) (LFRG, 2008). LFRG comments were generally approving of the use of probabilistic modeling in GoldSim to support the quantitative sensitivity analysis. A recommendation was made, however, that the probabilistic models be revised and updated to bolster their defensibility. SRS committed to addressing those comments and, in response, contracted with Neptune and Company to rewrite the three GoldSim models. The initial portion of this work, development of Slit Trench (ST), Engineered Trench (ET), and Components-in-Grout (CIG) trench GoldSim models, has been completed. The work described in this report utilizes these revised models to test and evaluate the results against the 2008 PORFLOW model results. This was accomplished by first performing a rigorous code-to-code comparison of the PORFLOW and GoldSim codes and then performing a deterministic comparison of the two-dimensional (2D) unsaturated zone and three-dimensional (3D) saturated zone PORFLOW Slit Trench models against results from the one-dimensional (1D) GoldSim Slit Trench model. The results of the code-to-code comparison indicate that when the mechanisms of radioactive decay, partitioning of contaminants between solid and fluid, implementation of specific boundary conditions, and the imposition of solubility controls were all tested using identical flow fields, GoldSim and PORFLOW produce nearly identical results. It is also noted that GoldSim has an advantage over PORFLOW in that it simulates all radionuclides simultaneously, thus avoiding a potential problem as demonstrated in the Case Study (see Section 2.6).
Hence, it was concluded that the follow-on work using GoldSim to develop 1D equivalent models of the PORFLOW multi-dimensional models was justified. The comparison of GoldSim 1D equivalent models to PORFLOW multi-dimensional models was made at two locations in the model domains: at the unsaturated-saturated zone interface and at the 100-m point of compliance. PORFLOW model results from the 2008 PA were utilized to investigate the comparison. By making iterative adjustments to certain water flux terms in the GoldSim models it was possible to produce contaminant mass fluxes and water concentrations that were highly similar to the PORFLOW model results at the two locations where comparisons were made. Based on the ability of the GoldSim 1D trench models to produce mass flux and concentration curves that are sufficiently similar to the multi-dimensional PORFLOW models for all of the evaluated radionuclides and their progeny, it is concluded that the use of the GoldSim 1D equivalent Slit and Engineered Trench models for further probabilistic sensitivity and uncertainty analysis of ELLWF trench units is justified. A revision to the original report was undertaken to correct mislabeling on the y-axes of the compliance point concentration graphs, to modify the terminology used to define the "blended" source term case for the saturated zone to make it consistent with terminology used in the 2008 PA, and to make a more definitive statement regarding the justification of the use of the GoldSim 1D equivalent trench models for follow-on probabilistic sensitivity and uncertainty analysis.
Dornic, N; Ficheux, A S; Bernard, A; Roudot, A C
2017-08-01
The notes of guidance for the testing of cosmetic ingredients and their safety evaluation by the Scientific Committee on Consumer Safety (SCCS) is a document dedicated to ensuring the safety of European consumers. It contains useful data for risk assessment, such as default values for Skin Surface Area (SSA). A more in-depth study of anthropometric data across Europe reveals considerable variation. The default SSA value was derived from a study of the Dutch population, which is known to be one of the tallest in the world. This value could be inadequate for shorter populations of Europe. Data were collected in a survey on cosmetic consumption in France. Probabilistic treatment of these data, and analysis of the case of methylisothiazolinone, a sensitizer recently evaluated by a deterministic approach submitted to the SCCS, suggest that the default SSA value used in quantitative risk assessment might not be relevant for a significant share of the French female population. Other female populations of Southern Europe may also be excluded. This is of importance given that some studies show an increasing risk of developing skin sensitization among women. The disparities in anthropometric data across Europe should be taken into consideration. Copyright © 2017 Elsevier Ltd. All rights reserved.
Mahajan, Ruhi; Viangteeravat, Teeradache; Akbilgic, Oguz
2017-12-01
A timely diagnosis of congestive heart failure (CHF) is crucial to evade a life-threatening event. This paper presents a novel probabilistic symbol pattern recognition (PSPR) approach to detect CHF in subjects from their cardiac interbeat (R-R) intervals. PSPR discretizes each continuous R-R interval time series by mapping them onto an eight-symbol alphabet and then models the pattern transition behavior in the symbolic representation of the series. The PSPR-based analysis of the discretized series from 107 subjects (69 normal and 38 CHF subjects) yielded discernible features to distinguish normal subjects and subjects with CHF. In addition to PSPR features, we also extracted features using the time-domain heart rate variability measures such as average and standard deviation of R-R intervals. An ensemble of bagged decision trees was used to classify two groups resulting in a five-fold cross-validation accuracy, specificity, and sensitivity of 98.1%, 100%, and 94.7%, respectively. However, a 20% holdout validation yielded an accuracy, specificity, and sensitivity of 99.5%, 100%, and 98.57%, respectively. Results from this study suggest that features obtained with the combination of PSPR and long-term heart rate variability measures can be used in developing automated CHF diagnosis tools. Copyright © 2017 Elsevier B.V. All rights reserved.
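The first PSPR step described above, mapping a continuous R-R series onto an eight-symbol alphabet and then modeling symbol-to-symbol transition behavior, can be sketched as follows. Equal-frequency (quantile) binning is an assumption here; the paper's exact discretization scheme is not reproduced:

```python
def symbolize(series, n_symbols=8):
    """Map a continuous R-R interval series onto an n-symbol alphabet
    using equal-frequency bins (an assumed binning choice)."""
    ranked = sorted(series)
    edges = [ranked[int(len(ranked) * k / n_symbols)] for k in range(1, n_symbols)]

    def sym(x):
        for i, e in enumerate(edges):
            if x < e:
                return i
        return n_symbols - 1

    return [sym(x) for x in series]

def transition_matrix(symbols, n_symbols=8):
    """Row-normalized symbol-to-symbol transition probabilities; these
    pattern-transition features feed the downstream classifier."""
    counts = [[0] * n_symbols for _ in range(n_symbols)]
    for a, b in zip(symbols, symbols[1:]):
        counts[a][b] += 1
    return [[c / max(sum(row), 1) for c in row] for row in counts]
```

Flattening the 8x8 matrix gives a fixed-length feature vector that could be combined with time-domain heart rate variability features, as the abstract describes.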
NASA Technical Reports Server (NTRS)
Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.
1995-01-01
Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.
Hiraishi, Kunihiko
2014-01-01
One of the significant topics in systems biology is to develop control theory for gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (or activated) by manipulating external stimuli and expression of other genes. Control theory of GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), which is one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for control theory of GRNs. PMID:24587766
Integrated Technology Assessment Center (ITAC) Update
NASA Technical Reports Server (NTRS)
Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)
2002-01-01
The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP, and integrated concept models (ICM) of these concepts are developed. Key technologies of interest are identified and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and technology projections are then fed into ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single-point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to the ASTP using a multivariate decision-making process to provide inputs for technology prioritization within the ASTP. The ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO), rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all-rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for their Flexible Aerospace System Solution for Tomorrow (FASST) concept, and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC-based concept in the near term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion of safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
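For the Weibull case the article highlights, the closed-form reliability and its Monte Carlo counterpart can be placed side by side; the stress level and Weibull parameters below are arbitrary illustration values, not figures from the article:

```python
import math
import random

def weibull_reliability(stress, scale, shape):
    """Closed-form two-parameter Weibull survival probability at a stress level:
    R = exp(-(stress/scale)^shape)."""
    return math.exp(-((stress / scale) ** shape))

def monte_carlo_reliability(stress, scale, shape, n=200000, seed=5):
    """Monte Carlo check: sample strengths by inverse-CDF transform of a
    uniform variate, then count the fraction with strength above the stress."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n):
        strength = scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
        survived += strength > stress
    return survived / n
```

The agreement between the two estimates illustrates why the closed-form Weibull result is the "rare instance" the authors mention: for most limit states only the sampling route (or FPI-style approximation) is available.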
Fusar-Poli, P; Schultze-Lutter, F
2016-02-01
Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review the risk or cumulative incidence of psychosis, the person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
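The measures the guide reviews reduce to a few lines of arithmetic on a 2x2 outcome table, with Bayes' theorem applied in odds form to move from pre-test to post-test probability. The cell counts and pre-test probability in the usage lines are invented for illustration:

```python
def prognostic_measures(tp, fp, fn, tn):
    """Key accuracy measures from a 2x2 table (test positive/negative vs
    transition to psychosis yes/no)."""
    sens = tp / (tp + fn)                  # sensitivity
    spec = tn / (tn + fp)                  # specificity
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),             # positive predictive value
        "npv": tn / (tn + fn),             # negative predictive value
        "lr_pos": sens / (1.0 - spec),     # positive likelihood ratio
        "lr_neg": (1.0 - sens) / spec,     # negative likelihood ratio
    }

def posttest_probability(pretest, lr):
    """Bayes' theorem in odds form: post-test odds = pre-test odds x LR."""
    odds = pretest / (1.0 - pretest) * lr
    return odds / (1.0 + odds)

# Hypothetical cohort: 30 true positives, 20 false positives, etc.
measures = prognostic_measures(tp=30, fp=20, fn=10, tn=80)
p_post = posttest_probability(0.20, measures["lr_pos"])  # assumed 20% pre-test risk
```

Note that the predictive values (unlike sensitivity and specificity) depend on the prevalence in the table, which is exactly why the odds-form update matters when the clinic population differs from the study population.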
Gutiérrez, Simón; Fernandez, Carlos; Barata, Carlos; Tarazona, José Vicente
2009-12-20
This work presents a computer model for Risk Assessment of Basins by Ecotoxicological Evaluation (RABETOX). The model is based on whole-effluent toxicity testing and water flows along a specific river basin. It is capable of estimating the risk along a river segment using deterministic and probabilistic approaches. The Henares River Basin was selected as a case study to demonstrate the importance of seasonal hydrological variations in Mediterranean regions. As model inputs, two different ecotoxicity tests (the miniaturized Daphnia magna acute test and the D. magna feeding test) were performed on grab samples from five wastewater treatment plant effluents. Also used as model inputs were flow data from the past 25 years, water velocity measurements, and precise distance measurements using Geographical Information Systems (GIS). The model was implemented in a spreadsheet, and the results were interpreted and represented using GIS in order to facilitate risk communication. To better understand the bioassay results, the effluents were screened through SPME-GC/MS analysis. The deterministic model, run for each month of one calendar year, showed a significant seasonal variation of risk while revealing that September represents the worst-case scenario, with values up to 950 Risk Units. This classifies the entire area of study for the month of September as "sublethal significant risk for standard species". The probabilistic approach using Monte Carlo analysis was performed on 7 different forecast points distributed along the Henares River. A 0% probability of finding "low risk" was found at all forecast points, with a more than 50% probability of finding "potential risk for sensitive species". The values obtained through both the deterministic and probabilistic approaches reveal the presence of certain substances which might be causing sublethal effects in the aquatic species present in the Henares River.
Electromagnetic Compatibility (EMC) in Microelectronics.
1983-02-01
[Only fragments of this report survive extraction. The abstract describes a procedure for assessing EMC in microelectronics. References cited include: "Fault Tree Analysis", System Safety Symposium, June 8-9, 1965, Seattle: The Boeing Company; and Fussell, J.B., "Fault Tree Analysis - Concepts and ...". Recovered section headings: 2.1 Background; 2.2 The Probabilistic Nature of EMC; 2.3 The Probabilistic Approach; 2.4 The Compatibility Factor; 3 Applying Probabilistic ...]
NESSUS/EXPERT - An expert system for probabilistic structural analysis methods
NASA Technical Reports Server (NTRS)
Millwater, H.; Palmer, K.; Fink, P.
1988-01-01
An expert system (NESSUS/EXPERT) is presented which provides assistance in using probabilistic structural analysis methods. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator. NESSUS/EXPERT was developed with a combination of FORTRAN and CLIPS, a C language expert system tool, to exploit the strengths of each language.
Modeling marine oily wastewater treatment by a probabilistic agent-based approach.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Ye, Xudong
2018-02-01
This study developed a novel probabilistic agent-based approach for modeling marine oily wastewater treatment processes. It begins by constructing a probability-based agent simulation model, followed by a global sensitivity analysis and a genetic algorithm-based calibration. The proposed modeling approach was tested through a case study of the removal of naphthalene from marine oily wastewater using UV irradiation. The removal of naphthalene was described by an agent-based simulation model using 8 types of agents and 11 reactions. Each reaction was governed by a probability parameter that determines its occurrence. The modeling results showed that the root mean square errors between modeled and observed removal rates were 8.73% and 11.03% for calibration and validation runs, respectively. Reaction competition was analyzed by comparing agent-based reaction probabilities, while the agents' heterogeneity was visualized by plotting their real-time spatial distribution, showing a strong potential for reactor design and process optimization. Copyright © 2017 Elsevier Ltd. All rights reserved.
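The core device, a probability parameter deciding whether each reaction fires for each agent in each time step, can be sketched with a single degradation reaction; the agent count, reaction probability, and step count below are illustrative stand-ins, not the calibrated values from the paper's 8-agent, 11-reaction model:

```python
import random

def simulate_removal(n_agents=1000, p_react=0.05, steps=60, seed=11):
    """Minimal probabilistic-agent sketch: each naphthalene 'agent' is removed
    in a time step with probability p_react, the probability parameter that
    governs the reaction's occurrence. Returns the count of surviving agents
    after each step."""
    rng = random.Random(seed)
    alive = n_agents
    history = [alive]
    for _ in range(steps):
        alive -= sum(1 for _ in range(alive) if rng.random() < p_react)
        history.append(alive)
    return history

history = simulate_removal()
removal_rate = 1.0 - history[-1] / history[0]   # overall removal fraction
```

In the full approach, several such probability parameters compete for the same agents, which is what the paper's reaction-competition comparison examines.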
Guymon, Gary L.; Yen, Chung-Cheng
1990-01-01
The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three variables: hydraulic conductivity, storage coefficient or specific yield, and source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position in most of the basin is small, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.
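One common variant of the two-point probability idea is the Rosenblueth point-estimate method: run the deterministic model at mean plus or minus one standard deviation of every uncertain variable (2^n runs for n variables) and recover the output moments from those runs. A sketch, using a toy linear function in place of the groundwater model:

```python
import itertools
import math

def two_point_estimate(model, means, sds):
    """Rosenblueth-style two-point estimate: evaluate the deterministic model
    at mean +/- one standard deviation of each uncertain variable (all 2^n
    sign combinations) and recover the output mean and standard deviation."""
    outputs = []
    for signs in itertools.product((-1.0, 1.0), repeat=len(means)):
        x = [m + s * sd for m, s, sd in zip(means, signs, sds)]
        outputs.append(model(x))
    mean = sum(outputs) / len(outputs)
    var = sum((y - mean) ** 2 for y in outputs) / len(outputs)
    return mean, math.sqrt(var)

# Toy linear stand-in for the deterministic water-table model, with two
# hypothetical uncertain inputs (e.g. conductivity-like and storage-like).
wt_mean, wt_sd = two_point_estimate(lambda x: 2.0 * x[0] + 3.0 * x[1],
                                    means=[1.0, 2.0], sds=[0.1, 0.2])
```

For a linear model the recovered moments are exact; for the nonlinear cases the abstract flags (pumping areas, complex geology), the method's accuracy degrades, which is consistent with the overestimated coefficients of variation reported.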
NASA Astrophysics Data System (ADS)
Fei, Cheng-Wei; Bai, Guang-Chen
2014-12-01
To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distribution collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distribution collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for BTRRC design, improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
Quantification of uncertainties in the performance of smart composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1993-01-01
A composite wing with spars, bulkheads, and built-in control devices is evaluated using a method for the probabilistic assessment of smart composite structures. Structural responses (such as change in angle of attack, vertical displacements, and stresses in regular plies with traditional materials and in control plies with mixed traditional and actuation materials) are probabilistically assessed to quantify their respective scatter. Probabilistic sensitivity factors are computed to identify those parameters that have a significant influence on a specific structural response. Results show that the uncertainties in the responses of smart composite structures can be quantified. Responses such as structural deformation, ply stresses, frequencies, and buckling loads in the presence of defects can be reliably controlled to satisfy specified design requirements.
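A simple sampling-based stand-in for probabilistic sensitivity factors is to correlate each sampled input with the response and rank by magnitude; the sketch below does this with Pearson correlation on a toy linear response (the factors in the cited assessment method come from the probabilistic structural analysis itself, not from this shortcut):

```python
import math
import random

def probabilistic_sensitivity_factors(model, dists, n=5000, seed=9):
    """Rank uncertain inputs by Pearson correlation with the model response,
    a simple sampling-based surrogate for probabilistic sensitivity factors.
    dists is a list of (mean, sd) pairs, one per input."""
    rng = random.Random(seed)
    xs = [[rng.gauss(m, s) for (m, s) in dists] for _ in range(n)]
    ys = [model(x) for x in xs]

    def corr(a, b):
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / n
        va = sum((u - ma) ** 2 for u in a) / n
        vb = sum((v - mb) ** 2 for v in b) / n
        return cov / math.sqrt(va * vb)

    return [corr([x[j] for x in xs], ys) for j in range(len(dists))]

# Toy response in which the first input dominates, as its factor should show.
factors = probabilistic_sensitivity_factors(lambda x: 5.0 * x[0] + 1.0 * x[1],
                                            [(0.0, 1.0), (0.0, 1.0)])
```

Inputs with factors near zero are candidates for fixing at nominal values, which is how such factors guide the "significant influence" screening the abstract describes.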
Tee, Augustine; Chow, Wai Leng; Burke, Colin; Basavarajaiah, Guruprasad
2018-03-16
In light of the growing evidence base for better clinical results with the use of the dual bronchodilator indacaterol/glycopyrronium (IND/GLY) over inhaled corticosteroid-containing salmeterol/fluticasone combination (SFC), this study aimed to evaluate the cost-effectiveness of IND/GLY over SFC in patients with moderate-to-severe chronic obstructive pulmonary disease (COPD) who are at low risk of exacerbations in the Singapore healthcare setting. A previously published patient-level simulation model was adapted for use in Singapore by applying local unit costs. The model was populated with clinical data from the LANTERN and ECLIPSE studies. Both costs and health outcomes were predicted for the lifetime horizon from a payer's perspective and were discounted at 3% per annum. Costs were expressed in 2015 USD. Uncertainty was assessed through probabilistic sensitivity analysis. Compared to SFC, use of IND/GLY increased mean life expectancy by 0.316 years and mean quality-adjusted life-years (QALYs) by 0.246 years, and decreased mean total treatment costs (drug costs and management of associated events) by USD 1,474 over the entire lifetime horizon. IND/GLY was considered to be 100% cost-effective at a threshold of 1 × gross domestic product per capita. The probabilistic sensitivity analysis results showed that IND/GLY was 100% cost-effective at a threshold of USD 2,000 when compared to SFC. IND/GLY was estimated to be highly cost-effective compared to SFC in patients with moderate-to-severe COPD who are not at high risk of exacerbations in the Singapore healthcare setting.
Casciano, Roman; Chulikavit, Maruit; Di Lorenzo, Giuseppe; Liu, Zhimei; Baladi, Jean-Francois; Wang, Xufang; Robertson, Justin; Garrison, Lou
2011-01-01
A recent indirect comparison study showed that sunitinib-refractory metastatic renal cell carcinoma (mRCC) patients treated with everolimus are expected to have improved overall survival outcomes compared to patients treated with sorafenib. This analysis examines the likely cost-effectiveness of everolimus versus sorafenib in this setting from a US payer perspective. A Markov model was developed to simulate a cohort of sunitinib-refractory mRCC patients and to estimate the cost per incremental life-years gained (LYG) and quality-adjusted life-years (QALYs) gained. Markov states included are stable disease without adverse events, stable disease with adverse events, disease progression, and death. Transition probabilities were estimated using a subset of the RECORD-1 patient population receiving everolimus after sunitinib, and a comparable population receiving sorafenib in a single-arm phase II study. Costs of antitumor therapies were based on wholesale acquisition cost. Health state costs accounted for physician visits, tests, adverse events, postprogression therapy, and end-of-life care. The model extrapolated beyond the trial time horizon for up to 6 years based on published trial data. Deterministic and probabilistic sensitivity analyses were conducted. The estimated gain over sorafenib treatment was 1.273 LYs (0.916 QALYs) at an incremental cost of $81,643. The deterministic analysis resulted in an incremental cost-effectiveness ratio (ICER) of $64,155/LYG ($89,160/QALY). The probabilistic sensitivity analysis demonstrated that results were highly consistent across simulations. As the ICER fell within the cost per QALY range for many other widely used oncology medicines, everolimus is projected to be a cost-effective treatment relative to sorafenib for sunitinib-refractory mRCC. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Candia, Roberto; Naimark, David; Sander, Beate; Nguyen, Geoffrey C
2017-11-01
Postoperative recurrence of Crohn's disease is common. This study sought to assess whether the postoperative management should be based on biological therapy alone or combined with thiopurines and whether the therapy should be started immediately after surgery or guided by either endoscopic or clinical recurrence. A Markov model was developed to estimate expected health outcomes in quality-adjusted life years (QALYs) and costs in Canadian dollars (CAD$) accrued by hypothetical patients with high recurrence risk after ileocolic resection. Eight strategies of postoperative management were evaluated. A lifetime time horizon, an annual discount rate of 5%, a societal perspective, and a cost-effectiveness threshold of 50,000 CAD$/QALY were assumed. Deterministic and probabilistic sensitivity analyses were conducted. The model was validated against randomized trials and historical cohorts. Three strategies dominated the others: endoscopy-guided full step-up therapy (14.80 QALYs, CAD$ 462,180), thiopurines immediately post-surgery plus endoscopy-guided biological step-up therapy (14.89 QALYs, CAD$ 464,099) and combination therapy immediately post-surgery (14.94 QALYs, CAD$ 483,685). The second strategy was the most cost-effective, assuming a cost-effectiveness threshold of 50,000 CAD$/QALY. Probabilistic sensitivity analysis showed that the second strategy has the highest probability of being the optimal alternative in all comparisons at cost-effectiveness thresholds from 30,000 to 100,000 CAD$/QALY. The strategies guided only by clinical recurrence and those using biologics alone were dominated. According to this decision analysis, thiopurines immediately after surgery and addition of biologics guided by endoscopic recurrence is the optimal strategy of postoperative management in patients with Crohn's disease with high risk of recurrence (see Video Abstract, Supplemental Digital Content 1, http://links.lww.com/IBD/B654).
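The abstract above ranks strategies by their probability of being cost-effective across willingness-to-pay thresholds (a cost-effectiveness acceptability curve). A minimal sketch of that calculation follows; the cost and QALY distributions are hypothetical stand-ins loosely inspired by the reported point estimates, not the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo draws over parameter uncertainty

# Hypothetical (cost, QALY) samples for two strategies; a real analysis
# would generate these by re-running the Markov model per parameter draw.
cost_a = rng.normal(462_000, 10_000, n)
qaly_a = rng.normal(14.80, 0.10, n)
cost_b = rng.normal(464_000, 10_000, n)
qaly_b = rng.normal(14.89, 0.10, n)

def prob_b_optimal(threshold):
    """Share of draws in which strategy B has the higher net monetary benefit."""
    nmb_a = threshold * qaly_a - cost_a
    nmb_b = threshold * qaly_b - cost_b
    return float(np.mean(nmb_b > nmb_a))

# One point per willingness-to-pay threshold traces out the CEAC.
ceac = {wtp: prob_b_optimal(wtp) for wtp in (30_000, 50_000, 100_000)}
```

Because strategy B buys extra QALYs at extra cost, its probability of being optimal rises as the willingness-to-pay threshold increases, which is the pattern the abstract reports across the 30,000 to 100,000 CAD$/QALY range.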
Sutton, A J; Vohra, R S; Hollyman, M; Marriott, P J; Buja, A; Alderson, D; Pasquali, S; Griffiths, E A
2017-01-01
The optimal timing of cholecystectomy for patients admitted with acute gallbladder pathology is unclear. Some studies have shown that emergency cholecystectomy during the index admission can reduce length of hospital stay with similar rates of conversion to open surgery, complications and mortality compared with a 'delayed' operation following discharge. Others have reported that cholecystectomy during the index acute admission results in higher morbidity, extended length of stay and increased costs. This study examined the cost-effectiveness of emergency versus delayed cholecystectomy for acute benign gallbladder disease. Using data from a prospective population-based cohort study examining the outcomes of cholecystectomy in the UK and Ireland, a model-based cost-utility analysis was conducted from the perspective of the UK National Health Service, with a 1-year time horizon for costs and outcomes. Probabilistic sensitivity analysis was used to investigate the impact of parameter uncertainty on the results obtained from the model. Emergency cholecystectomy was found to be less costly (£4570 versus £4720; €5484 versus €5664) and more effective (0·8868 versus 0·8662 QALYs) than delayed cholecystectomy. Probabilistic sensitivity analysis showed that the emergency strategy is more than 60 per cent likely to be cost-effective across willingness-to-pay values for the QALY from £0 to £100 000 (€0-120 000). Emergency cholecystectomy is less costly and more effective than delayed cholecystectomy. This approach is likely to be beneficial to patients in terms of improved health outcomes and to the healthcare provider owing to the reduced costs. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
Parallel computing for probabilistic fatigue analysis
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.
1993-01-01
This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single- or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that the distributed-memory architecture is preferable to shared memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems
NASA Technical Reports Server (NTRS)
Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.
2005-01-01
The current standards for handling uncertainty in control systems use interval bounds to define the uncertain parameters. This approach gives no information about the likelihood of system performance, only the response bounds. When used in design, current methods such as mu-analysis can lead to overly conservative controller designs, because worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strengths of each in handling probability. A hybrid method is developed using these reliability tools to efficiently propagate probabilistic uncertainty through classical control analysis problems. The method is applied to classical response analysis as well as to analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach to calculate the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.
Probabilistic structural analysis methods for space transportation propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.
1991-01-01
Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd-stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.
Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2011-01-01
A methodology to compute the probabilistic combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability of a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life at 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, it is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
Probabilistic Simulation for Combined Cycle Fatigue in Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A methodology to compute the probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability of a (0/+/- 45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life at 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, it is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A methodology to compute the probabilistic combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability of a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life at 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, it is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro
2013-05-21
We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. The strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but also often subjective and ad hoc. Our program overcomes these problems by detecting and identifying metabolites automatically, separating isomeric metabolites, and removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. Our software program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. As a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data applicable to any instrument or any experimental condition.
NASA Astrophysics Data System (ADS)
Mahata, Avik; Mukhopadhyay, Tanmoy; Adhikari, Sondipon
2016-03-01
Nano-twinned structures are mechanically stronger, more ductile, and more stable than their non-twinned forms. We have investigated the effect of varying twin spacing and twin boundary width (TBW) on the yield strength of nano-twinned copper in a probabilistic framework. An efficient surrogate modelling approach based on polynomial chaos expansion has been proposed for the analysis. Effectively utilising 15 sets of expensive molecular dynamics simulations, thousands of outputs have been obtained corresponding to different sets of twin spacing and twin width using virtual experiments based on the surrogates. One of the major outcomes of this work is that there exists an optimal combination of twin boundary spacing and twin width up to which the strength can be increased; beyond that critical point the nanowires weaken. This study also reveals that the yield strength of nano-twinned copper is more sensitive to TBW than to twin spacing. Such robust inferences could be drawn only by applying the surrogate modelling approach, which makes it feasible to obtain results corresponding to 40 000 combinations of twin boundary spacing and twin width in a computationally efficient framework.
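The surrogate-based workflow described above (a few expensive simulations, then thousands of cheap virtual experiments, then an optimum search) can be sketched with an ordinary polynomial response surface fitted by least squares. The model, sample sizes, and coefficients below are hypothetical stand-ins for the molecular dynamics runs and the chaos expansion:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "expensive" model of yield strength vs twin spacing (s)
# and twin boundary width (w); stands in for a molecular dynamics run.
def expensive_model(s, w):
    return 5.0 + 1.2 * s - 0.08 * s**2 - 0.6 * w  # toy response

# 15 training runs, mimicking the 15 MD simulations in the abstract.
s_train = rng.uniform(1.0, 10.0, 15)
w_train = rng.uniform(0.5, 2.0, 15)
y_train = expensive_model(s_train, w_train)

# Quadratic polynomial basis (a simple stand-in for a chaos expansion).
def basis(s, w):
    return np.column_stack([np.ones_like(s), s, w, s * w, s**2, w**2])

coef, *_ = np.linalg.lstsq(basis(s_train, w_train), y_train, rcond=None)

# Tens of thousands of cheap "virtual experiments" on the surrogate.
s_new = rng.uniform(1.0, 10.0, 40_000)
w_new = rng.uniform(0.5, 2.0, 40_000)
y_hat = basis(s_new, w_new) @ coef

s_opt = s_new[np.argmax(y_hat)]  # spacing that maximises predicted strength
```

Because the toy response is exactly quadratic, the fitted surface reproduces it and the virtual experiments locate the optimal spacing; with real simulation data the surrogate would only approximate the response, which is why the abstract's probabilistic framing matters.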
A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.
Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E
2016-06-21
We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. The method is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared and the significance of the differences given uncertainty in the parameters, which parameters have the most influence on this difference, and how we can identify the resolvable scenarios (where one alternative in the comparison has a clearly lower environmental impact). This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
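The scenario-aware comparison described above, in which each scenario is checked for a "resolvable" (clearly preferable) alternative under parameter uncertainty, can be sketched as follows; the impact models, scenario variable, and noise level are all hypothetical:

```python
import random

random.seed(3)

# Hypothetical impact models (e.g., kg CO2-eq) for two pavement
# alternatives; the scenario variable is the traffic level.
def impact_a(traffic, u):
    return 100.0 + 0.8 * traffic + u

def impact_b(traffic, u):
    return 120.0 + 0.5 * traffic + u

def prob_a_lower(traffic, n=5000):
    """Probability that A has lower impact in this scenario, under uncertainty."""
    wins = sum(
        impact_a(traffic, random.gauss(0, 10)) < impact_b(traffic, random.gauss(0, 10))
        for _ in range(n)
    )
    return wins / n

# A scenario is "resolved" when one alternative is clearly preferable.
resolved = {t: prob_a_lower(t) for t in (10, 50, 100)}
```

In this toy setup, low-traffic scenarios resolve in favour of A and high-traffic scenarios in favour of B, while intermediate scenarios remain unresolved, which is exactly the situation where the paper's sensitivity analysis would be used to identify the parameters worth refining.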
Probabilistic Prediction of Lifetimes of Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.
2006-01-01
ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
Verhoef, Talitha I; Trend, Verena; Kelly, Barry; Robinson, Nigel; Fox, Paul; Morris, Stephen
2016-07-22
We evaluated the cost-effectiveness of the Give it a Go programme, which offers free leisure centre memberships to physically inactive members of the public receiving state benefits in a single London Borough. A decision-analytic Markov model was developed to analyse lifetime costs and quality-adjusted life-years (QALYs) of 1025 people recruited to the intervention versus no intervention. In the intervention group, people were offered 4 months of free membership at a leisure centre. Physical activity levels were assessed at 0 and 4 months using the International Physical Activity Questionnaire (IPAQ). Higher levels of physical activity were assumed to decrease the risk of coronary heart disease, stroke and diabetes mellitus type II, as well as improve mental health. Costs were assessed from a National Health Service (NHS) perspective. Uncertainty was assessed using one-way and probabilistic sensitivity analyses. One hundred and fifty-nine participants (15.5%) completed the programme by attending the leisure centre for 4 months. Compared with no intervention, Give it a Go increased costs by £67.25 and QALYs by 0.0033 (equivalent to 1.21 days in full health) per recruited person. The incremental cost per QALY gained was £20,347. The results were highly sensitive to the magnitude of the mental health gain due to physical activity and the duration of the effect of the programme (1 year in the base-case analysis). When the mental health gain was omitted from the analysis, the incremental cost per QALY gained increased to almost £1.5 million. In the probabilistic sensitivity analysis, the incremental cost per QALY gained was below £20,000 in 39% of the 5000 simulations. Give it a Go did not significantly increase life expectancy, but had a positive influence on quality of life owing to the mental health gain of physical activity.
If the increase in physical activity caused by Give it a Go lasts for more than 1 year, the programme would be cost-effective given a willingness to pay for a QALY of £20,000.
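The headline numbers in this abstract follow from the standard ICER arithmetic, which can be checked directly from the per-person figures (the small discrepancy with the reported £20,347 comes from rounding of the published inputs):

```python
# Incremental cost-effectiveness ratio from the abstract's per-person figures.
delta_cost = 67.25    # extra cost per recruited person (GBP)
delta_qaly = 0.0033   # extra QALYs per recruited person

icer = delta_cost / delta_qaly          # GBP per QALY gained, ~20,400
days_full_health = delta_qaly * 365.25  # ~1.21 days in full health
```

This also makes the willingness-to-pay comparison transparent: at a £20,000 threshold the point estimate sits just above the line, which is consistent with only 39% of the probabilistic simulations falling below it.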
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
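The parameter-ranking step described above (identifying which random variables drive the blade-tip response variability) can be sketched by perturbing one variable at a time and comparing the response scatter. The toy displacement formula and nominal values below are illustrative stand-ins for the NESSUS finite-element model, not its actual equations:

```python
import random
import statistics

random.seed(7)

def tip_displacement(thickness, modulus, poisson, density):
    # Toy response standing in for the finite-element blade model:
    # displacement grows with density and falls with stiffness.
    return density / (modulus * thickness**3) * (1.0 + 0.1 * poisson)

nominal = {"thickness": 1.0, "modulus": 200.0, "poisson": 0.3, "density": 8.2}

def scatter(varying, cov=0.05, n=4000):
    """Std. dev. of the response when only `varying` fluctuates (5% c.o.v.)."""
    out = []
    for _ in range(n):
        p = dict(nominal)
        p[varying] = random.gauss(nominal[varying], cov * nominal[varying])
        out.append(tip_displacement(**p))
    return statistics.stdev(out)

# Variables ranked by how much response scatter they induce.
ranking = sorted(nominal, key=scatter, reverse=True)
```

In this toy model the cubic dependence on thickness makes it dominate, while Poisson's ratio barely matters, mirroring the abstract's finding that modulus of elasticity contributed significantly to blade-tip variability while Poisson's ratio did not.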
ERIC Educational Resources Information Center
Little, Daniel R.; Lewandowsky, Stephan
2009-01-01
Despite the fact that categories are often composed of correlated features, the evidence that people detect and use these correlations during intentional category learning has been overwhelmingly negative to date. Nonetheless, on other categorization tasks, such as feature prediction, people show evidence of correlational sensitivity. A…
Durán, I; Beiras, R
2013-10-01
Acute water quality criteria (WQC) for the protection of coastal ecosystems are developed on the basis of short-term ecotoxicological data using the most sensitive life stages of representative species from the main taxa of marine water column organisms. A probabilistic approach based on species sensitivity distribution (SSD) curves has been chosen and compared to the WQC obtained applying an assessment factor to the critical toxicity values, i.e. the 'deterministic' approach. The criteria obtained from HC5 values (5th percentile of the SSD) were 1.01 μg/l for Hg, 1.39 μg/l for Cu, 3.83 μg/l for Cd, 25.3 μg/l for Pb and 8.24 μg/l for Zn. Using sensitive early life stages and very sensitive endpoints allowed calculation of WQC for marine coastal ecosystems. These probabilistic WQC, intended to protect 95% of the species in 95% of the cases, were calculated on the basis of a limited ecotoxicological dataset, avoiding the use of large and uncertain assessment factors. Copyright © 2013 Elsevier B.V. All rights reserved.
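The HC5 used above is the 5th percentile of a species sensitivity distribution fitted to per-species toxicity values, commonly assumed log-normal. A minimal sketch with hypothetical toxicity data (not the paper's dataset):

```python
import math
import statistics

# Hypothetical acute toxicity values (ug/L) for several species; in an SSD
# the HC5 is the 5th percentile of a distribution fitted to such data.
toxicity = [2.1, 3.5, 5.0, 8.2, 12.0, 20.0, 33.0, 55.0]

logs = [math.log10(x) for x in toxicity]
mu = statistics.fmean(logs)
sigma = statistics.stdev(logs)   # sample standard deviation of log10 values

z_05 = -1.6448536269514722       # 5th-percentile z-score of N(0, 1)
hc5 = 10 ** (mu + z_05 * sigma)  # concentration protecting ~95% of species
```

Deriving the criterion from the fitted distribution's lower tail, rather than dividing the single most sensitive value by a large assessment factor, is the contrast the abstract draws between the probabilistic and deterministic approaches.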
Do, Hongdo; Molania, Ramyar
2017-01-01
The identification of genomic rearrangements with high sensitivity and specificity using massively parallel sequencing remains a major challenge, particularly in precision medicine and cancer research. Here, we describe a new method for detecting rearrangements, GRIDSS (Genome Rearrangement IDentification Software Suite). GRIDSS is a multithreaded structural variant (SV) caller that performs efficient genome-wide break-end assembly prior to variant calling using a novel positional de Bruijn graph-based assembler. By combining assembly, split read, and read pair evidence using a probabilistic scoring, GRIDSS achieves high sensitivity and specificity on simulated, cell line, and patient tumor data, recently winning SV subchallenge #5 of the ICGC-TCGA DREAM8.5 Somatic Mutation Calling Challenge. On human cell line data, GRIDSS halves the false discovery rate compared to other recent methods while matching or exceeding their sensitivity. GRIDSS identifies nontemplate sequence insertions, microhomologies, and large imperfect homologies, estimates a quality score for each breakpoint, stratifies calls into high or low confidence, and supports multisample analysis. PMID:29097403
Almansa, Carmen; Martínez-Paz, José M
2011-03-01
Cost-benefit analysis is a standard methodological platform for public investment evaluation. In high environmental impact projects, with a long-term effect on future generations, the choice of discount rate and time horizon is of particular relevance, because it can lead to very different profitability assessments. This paper describes some recent approaches to environmental discounting and applies them, together with a number of classical procedures, to the economic evaluation of a plant for the desalination of irrigation return water from intensive farming, aimed at halting the degradation of an area of great ecological value, the Mar Menor, in South Eastern Spain. A Monte Carlo procedure is used in four CBA approaches and three time horizons to carry out a probabilistic sensitivity analysis designed to integrate the views of an international panel of experts in environmental discounting with the uncertainty affecting the market price of the project's main output, i.e., irrigation water for a water-deprived area. The results show which discounting scenarios most accurately estimate the socio-environmental profitability of the project while also considering the risk associated with these two key parameters. The analysis also provides some methodological findings regarding ways of assessing financial and environmental profitability in decisions concerning public investment in the environment. Copyright © 2010 Elsevier B.V. All rights reserved.
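The probabilistic treatment of discounting described above can be sketched by drawing the discount rate from a distribution (standing in for the panel of experts' divergent views) and recording the share of Monte Carlo draws in which the project's net present value is positive; all figures below are hypothetical:

```python
import random

random.seed(42)

def npv(cash_flows, rate):
    """Net present value of annual cash flows at a constant discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: large upfront cost, a 50-year stream of benefits.
flows = [-1000.0] + [60.0] * 50

# Probabilistic sensitivity analysis over the discount rate: each draw
# represents one plausible expert view of the social discount rate.
samples = [npv(flows, random.uniform(0.01, 0.06)) for _ in range(5000)]
prob_profitable = sum(v > 0 for v in samples) / len(samples)
```

The same project is profitable at a 1% rate and unprofitable at 6%, so the choice of discounting scenario flips the verdict; reporting the probability of a positive NPV, as the paper does, summarises that sensitivity in a single number.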
Coupled Multi-Disciplinary Optimization for Structural Reliability and Affordability
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
A computational simulation method is presented for Non-Deterministic Multidisciplinary Optimization of engine composite materials and structures. A hypothetical engine duct made with ceramic matrix composites (CMC) is evaluated probabilistically in the presence of combined thermo-mechanical loading. The structure is tailored by quantifying the uncertainties in all relevant design variables such as fabrication, material, and loading parameters. The probabilistic sensitivities are used to select critical design variables for optimization. In this paper, two approaches for non-deterministic optimization are presented. The non-deterministic minimization of combined failure stress criterion is carried out by: (1) performing probabilistic evaluation first and then optimization and (2) performing optimization first and then probabilistic evaluation. The first approach shows that the optimization feasible region can be bounded by a set of prescribed probability limits and that the optimization follows the cumulative distribution function between those limits. The second approach shows that the optimization feasible region is bounded by 0.50 and 0.999 probabilities.
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh
1998-01-01
An integrated probabilistic approach has been developed to assess composites for high-temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven ceramic matrix composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. The approach quantified the influences of uncertainties inherent in constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 F. The quantitative information is presented in the form of cumulative distribution functions (CDFs), probability density functions (PDFs), and primitive-variable sensitivities of the response. Results indicate that the scatter in the response variables was reduced by 30-50% when the uncertainties in the most influential primitive variables were reduced by 50%.
Arrospide, Arantzazu; Rue, Montserrat; van Ravesteyn, Nicolien T; Comas, Merce; Soto-Gordoa, Myriam; Sarriugarte, Garbiñe; Mar, Javier
2016-06-01
Breast cancer (BC) screening in the Basque Country has shown a 20% reduction in the number of BC deaths and an acceptable overdiagnosis level (4% of screen-detected BC). The aim of this study was to evaluate the breast cancer early detection programme in the Basque Country in terms of retrospective cost-effectiveness and budget impact from 1996 to 2011. A discrete event simulation model was built to reproduce the natural history of BC. We estimated, over lifetime follow-up, the total cost of BC (screening, diagnosis and treatment), as well as quality-adjusted life years (QALY), for women invited to participate in the evaluated programme during the 15-year period, both in the actual screening scenario and in a hypothetical unscreened scenario. An incremental cost-effectiveness ratio was calculated with the use of aggregated costs. In addition, annual costs were considered for the budget impact analysis. Both population-level and single-cohort analyses were performed. A probabilistic sensitivity analysis was applied to assess the impact of parameter uncertainty. The actual screening programme involved a cost of 1,127 million euros and provided 6.7 million QALYs over the lifetime of the target population, resulting in a gain of 8,666 QALYs for an additional cost of 36.4 million euros, compared with the unscreened scenario. Thus, the incremental cost-effectiveness ratio was 4,214€/QALY. All the model runs in the probabilistic sensitivity analysis resulted in an incremental cost-effectiveness ratio lower than 10,000€/QALY. The screening programme increased the annual budget of the Basque Health Service by 5.2 million euros from year 2000 onwards. The BC screening programme in the Basque Country proved to be cost-effective during the evaluated period and had an affordable budget impact. These results confirm the epidemiological benefits related to the centralised screening system and support the continuation of the programme.
NASA Technical Reports Server (NTRS)
Rajagopal, K. R.
1992-01-01
The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
Ruiz-Ramos, Jesus; Frasquet, Juan; Romá, Eva; Poveda-Andres, Jose Luis; Salavert-Leti, Miguel; Castellanos, Alvaro; Ramirez, Paula
2017-06-01
To evaluate the cost-effectiveness of antimicrobial stewardship (AS) program implementation focused on critical care units, based on assumptions for the Spanish setting. A decision model was designed comparing the costs and outcomes of sepsis, community-acquired pneumonia, and nosocomial infections (including catheter-related bacteremia, urinary tract infection, and ventilator-associated pneumonia) in critical care units with or without an AS. Model variables and costs, along with their distributions, were obtained from the literature. The study was performed from the Spanish National Health System (NHS) perspective, including only direct costs. The Incremental Cost-Effectiveness Ratio (ICER) was analysed with regard to the ability of the program to reduce multi-drug resistant bacteria. Uncertainty in the ICERs was evaluated with probabilistic sensitivity analyses. In the short term, implementing an AS reduces the consumption of antimicrobials with a net benefit of €71,738. In the long term, maintenance of the program involves an additional cost to the system of €107,569. The cost per avoided resistance was €7,342, and the cost per life-year gained (LYG) was €9,788. Results from the probabilistic sensitivity analysis showed a more than 90% likelihood that an AS would be cost-effective at a level of €8,000 per LYG. Limitations include the wide variability of the economic results obtained from implementing this type of AS program, and the limited information on its impact on patient outcomes and on the resistances avoided. Implementing an AS focused on critical care patients is a cost-effective tool in the long term: implementation costs are amortized by the reduction in antimicrobial consumption and the prevention of infections by multidrug-resistant pathogens.
Fleurence, Rachael L
2005-01-01
The cost-effectiveness of alternating pressure-relieving devices, mattress replacements, and mattress overlays, compared with a standard hospital mattress (high-specification foam mattress), for the prevention and treatment of pressure ulcers in hospital patients in the United Kingdom was investigated. A decision-analytic model was constructed to evaluate different strategies to prevent or treat pressure ulcers. Three scenarios were evaluated: the prevention of pressure ulcers, the treatment of superficial ulcers, and the treatment of severe ulcers. Epidemiological and effectiveness data were obtained from the clinical literature. Expert opinion using a rating scale technique was used to obtain quality-of-life data. Costs of the devices were obtained from manufacturers, whereas costs of treatment were obtained from the literature. Uncertainty was explored through probabilistic sensitivity analysis. Using 30,000 pounds sterling/QALY (quality-adjusted life year) as the decision-maker's cut-off point (the current UK standard), in scenario 1 (prevention) the cost-effective strategy was the mattress overlay at 1, 4, and 12 weeks. In scenarios 2 and 3, the cost-effective strategy was the mattress replacement at 1, 4, and 12 weeks. Standard care was a dominated intervention in all scenarios for values of the decision-maker's ceiling ratio ranging from 5,000 pounds sterling to 100,000 pounds sterling/QALY. However, the probabilistic sensitivity analysis results reflected the high uncertainty surrounding the choice of devices. Current information suggests that alternating pressure mattress overlays may be cost-effective for the prevention of pressure ulcers, whereas alternating pressure mattress replacements appear to be cost-effective for the treatment of superficial and severe pressure ulcers.
Müller, Dirk; Danner, Marion; Rhiem, Kerstin; Stollenwerk, Björn; Engel, Christoph; Rasche, Linda; Borsi, Lisa; Schmutzler, Rita; Stock, Stephanie
2018-04-01
Women with a BRCA1 or BRCA2 mutation are at increased risk of developing breast and/or ovarian cancer. This economic modeling study evaluated different preventive interventions for 30-year-old women with a confirmed BRCA (1 or 2) mutation. A Markov model was developed to estimate the costs and benefits [i.e., quality-adjusted life years (QALYs) and life years gained (LYG)] associated with prophylactic bilateral mastectomy (BM), prophylactic bilateral salpingo-oophorectomy (BSO), BM plus BSO, BM plus BSO at age 40, and intensified surveillance. Relevant input data were obtained from a large German database including 5902 women with BRCA 1 or 2, and from the literature. The analysis was performed from the German Statutory Health Insurance (SHI) perspective. In order to assess the robustness of the results, deterministic and probabilistic sensitivity analyses were performed. With costs of €29,434 and a gain in QALYs of 17.7 (LYG 19.9), BM plus BSO at age 30 was less expensive and more effective than the other strategies, followed by BM plus BSO at age 40. Women who were offered the surveillance strategy had the highest costs at the lowest gain in QALYs/LYG. In the probabilistic sensitivity analysis, the probability of cost-saving was 57% for BM plus BSO. At a willingness-to-pay (WTP) threshold of €10,000 per QALY, the probability of the intervention being cost-effective was 80%. From the SHI perspective, undergoing BM plus immediate BSO should be recommended to BRCA 1 or 2 mutation carriers due to its favorable comparative cost-effectiveness.
Probabilistic reasoning in data analysis.
Sirovich, Lawrence
2011-09-20
This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
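The lecture's central idea, arrivals that are independent of prior history, can be illustrated with a short simulation sketch: summing memoryless exponential waiting times up to a time horizon yields Poisson-distributed arrival counts. The rate and horizon below are arbitrary illustrative values, not taken from the lecture notes.

```python
import random

def simulate_poisson_arrivals(rate, t_end, rng):
    """Simulate Markovian (history-independent) arrivals on [0, t_end]
    by summing independent exponential waiting times with the given rate."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # memoryless waiting time
        if t > t_end:
            return times
        times.append(t)

# Over many unit intervals with rate 3, the mean arrival count per
# interval should be close to 3 (counts are Poisson(3)-distributed).
rng = random.Random(42)
counts = [len(simulate_poisson_arrivals(3.0, 1.0, rng)) for _ in range(1000)]
mean_count = sum(counts) / len(counts)
```

The same construction underlies waiting-time arguments in the biomedical examples the lecture describes: the count in a window follows a Poisson distribution precisely because the inter-arrival times are exponential.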
The cost-effectiveness of screening for colorectal cancer.
Telford, Jennifer J; Levy, Adrian R; Sambrook, Jennifer C; Zou, Denise; Enns, Robert A
2010-09-07
Published decision analyses show that screening for colorectal cancer is cost-effective. However, because of the number of tests available, the optimal screening strategy in Canada is unknown. We estimated the incremental cost-effectiveness of 10 strategies for colorectal cancer screening, as well as no screening, incorporating quality of life, noncompliance and data on the costs and benefits of chemotherapy. We used a probabilistic Markov model to estimate the costs and quality-adjusted life expectancy of 50-year-old average-risk Canadians without screening and with screening by each test. We populated the model with data from the published literature. We calculated costs from the perspective of a third-party payer, with inflation to 2007 Canadian dollars. Of the 10 strategies considered, we focused on three tests currently being used for population screening in some Canadian provinces: low-sensitivity guaiac fecal occult blood test, performed annually; fecal immunochemical test, performed annually; and colonoscopy, performed every 10 years. These strategies reduced the incidence of colorectal cancer by 44%, 65% and 81%, and mortality by 55%, 74% and 83%, respectively, compared with no screening. These strategies generated incremental cost-effectiveness ratios of $9159, $611 and $6133 per quality-adjusted life year, respectively. The findings were robust to probabilistic sensitivity analysis. Colonoscopy every 10 years yielded the greatest net health benefit. Screening for colorectal cancer is cost-effective over conventional levels of willingness to pay. Annual high-sensitivity fecal occult blood testing, such as a fecal immunochemical test, or colonoscopy every 10 years offer the best value for the money in Canada.
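Markov cohort models of the kind used in this and the other screening evaluations above follow a closed cohort through health states with fixed per-cycle transition probabilities, accumulating discounted quality-adjusted life expectancy. A minimal sketch with hypothetical three-state numbers (not the study's calibrated inputs):

```python
# Hypothetical three-state cohort model: well -> cancer -> dead.
# Each row of P holds the per-cycle transition probabilities out of a state.
P = [[0.98, 0.015, 0.005],
     [0.00, 0.85,  0.15 ],
     [0.00, 0.00,  1.00 ]]
utilities = [1.0, 0.7, 0.0]  # quality weight accrued per cycle in each state

def markov_cohort_qalys(P, utilities, cycles, discount=0.03):
    """Follow a closed cohort through the states, accumulating
    discounted quality-adjusted life expectancy cycle by cycle."""
    dist = [1.0, 0.0, 0.0]  # entire cohort starts in 'well'
    total = 0.0
    for c in range(cycles):
        total += sum(p * u for p, u in zip(dist, utilities)) / (1 + discount) ** c
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return total

qalys_50 = markov_cohort_qalys(P, utilities, 50)
```

Comparing such totals (and analogous cost accumulations) between a screened and an unscreened arm is what produces the incremental cost-effectiveness ratios reported above.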
Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis
NASA Astrophysics Data System (ADS)
Lari, S.; Frattini, P.; Crosta, G. B.
2009-04-01
We performed a quantitative probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss, in terms of economic damage and loss of life, for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Because of the lack of information available for evaluating the hazards, the value of the exposed elements (e.g., residential and industrial areas, population, lifelines, sensitive elements such as schools and hospitals) and the process-specific vulnerability, and because of gaps in knowledge of the processes themselves (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables a homogeneous uncertainty was assigned over the whole study area, for instance for the number of buildings of various typologies and for the event occurrence probability. In other cases, as for phenomenon intensity (e.g., water depth during a flood) and probability of impact, the uncertainty was defined in relation to the census parcel area; assuming some variables to be homogeneously distributed or averaged over the census parcels introduces a larger error for larger parcels. We propagated the uncertainty through the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions): a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. In all three cases the uncertainty of the final risk value is around 30% of the expected value.
Each of the models, nevertheless, requires different assumptions and computational effort, and provides results at a different level of detail.
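The contrast between the propagation models can be sketched for a toy product-form risk function: FOSM carries only means and variances through first-order derivatives, while Monte Carlo samples the inputs and evaluates the output distribution directly. All numbers below are hypothetical, not the study's inputs.

```python
import math
import random

def risk(p, value, vuln):
    """Toy risk function: expected annual loss = occurrence prob x value x vulnerability."""
    return p * value * vuln

# Hypothetical means and standard deviations of the three uncertain inputs.
means = (0.01, 1e6, 0.4)
sds = (0.002, 2e5, 0.08)

# FOSM: first-order Taylor propagation. For the product f = p*V*v the
# partial derivatives are V*v, p*v and p*V, evaluated at the means.
p, V, v = means
grads = (V * v, p * v, p * V)
fosm_mean = risk(*means)
fosm_sd = math.sqrt(sum((g * s) ** 2 for g, s in zip(grads, sds)))

# Monte Carlo: sample the inputs and evaluate the risk directly.
rng = random.Random(0)
samples = [risk(*(rng.gauss(m, s) for m, s in zip(means, sds)))
           for _ in range(20000)]
mc_mean = sum(samples) / len(samples)
mc_sd = math.sqrt(sum((x - mc_mean) ** 2 for x in samples) / (len(samples) - 1))
```

For a nearly linear output like this product, the two estimates agree closely, which mirrors the abstract's finding that the three models give similar expected losses; FOSM is far cheaper, while Monte Carlo also recovers the full output distribution.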
Probabilistic arithmetic automata and their applications.
Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven
2012-01-01
We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
The direct and indirect cost of diabetes in Italy: a prevalence probabilistic approach.
Marcellusi, A; Viti, R; Mecozzi, A; Mennini, F S
2016-03-01
Diabetes mellitus is a chronic degenerative disease associated with a high risk of chronic complications and comorbidities. However, very few data are available on the associated cost. The objective of this study was to identify the available information on the epidemiology of the disease and to estimate the average annual cost of treating diabetes in Italy incurred by the national health service and by society. A probabilistic prevalence-based cost-of-illness model was developed to calculate an aggregate measure of the economic burden associated with the disease, in terms of direct medical costs (drugs, hospitalizations, monitoring and adverse events) and indirect costs (absenteeism and early retirement). A systematic review of the literature was conducted to obtain both the epidemiological and the economic data. Furthermore, one-way and probabilistic sensitivity analyses with 5,000 Monte Carlo simulations were performed to test the robustness of the results and to define 95% CIs. The model estimated a prevalence of 2.6 million patients under drug therapy in Italy. The total economic burden of diabetic patients in Italy amounted to €20.3 billion/year (95% CI €18.61 to €22.29 billion), 54% of which was associated with indirect costs (95% CI €10.10 to €11.62 billion) and 46% with direct costs (95% CI €8.11 to €11.06 billion). This is the first study conducted in Italy aimed at estimating the direct and indirect costs of diabetes with a probabilistic prevalence-based approach. As might be expected, the lack of information means that the real burden of diabetes is partly underestimated, especially with regard to indirect costs. Nevertheless, this approach is useful for policy makers seeking to understand the economic implications of diabetes treatment in Italy.
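A prevalence-based probabilistic cost-of-illness calculation of the kind described can be sketched as follows: draw the uncertain inputs from distributions, combine them per simulation, and report the 95% CI from the simulation percentiles. The input distributions and their parameters here are illustrative stand-ins, not the study's estimates.

```python
import random

def probabilistic_cost_of_illness(n_sims=5000, seed=1):
    """Sketch of a prevalence-based probabilistic cost model: draw uncertain
    inputs, combine them, and report the mean with a 95% CI taken from the
    2.5th and 97.5th percentiles of the Monte Carlo simulations."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        prevalence = rng.gauss(2.6e6, 0.1e6)  # patients (hypothetical spread)
        direct = rng.gauss(3600, 400)         # direct cost per patient, EUR (hypothetical)
        indirect = rng.gauss(4200, 400)       # indirect cost per patient, EUR (hypothetical)
        totals.append(prevalence * (direct + indirect))
    totals.sort()
    lo = totals[int(0.025 * n_sims)]
    hi = totals[int(0.975 * n_sims)]
    mean = sum(totals) / n_sims
    return mean, lo, hi

mean, lo, hi = probabilistic_cost_of_illness()
```

The percentile interval is the standard way such models turn parameter uncertainty into the CI-style statements quoted in the abstract.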
Cost-effectiveness of breast cancer screening using mammography in Vietnamese women
2018-01-01
Background The incidence rate of breast cancer is increasing, and it has become the most common cancer in Vietnamese women, while the survival rate is lower than that of developed countries. Early detection to improve breast cancer survival, as well as reducing risk factors, remains the cornerstone of breast cancer control according to the World Health Organization (WHO). This study aims to evaluate the costs and outcomes of introducing a mammography screening program for Vietnamese women aged 45–64 years, compared to the current situation of no screening. Methods Decision analytical modeling using Markov chain analysis was used to estimate costs and health outcomes over a lifetime horizon. Model inputs were derived from published literature, and the results were reported as incremental cost-effectiveness ratios (ICERs) and/or incremental net monetary benefits (INMBs). One-way sensitivity analyses and probabilistic sensitivity analyses were performed to assess parameter uncertainty. Results The ICER per life year gained of the first round of mammography screening was US$3647.06 and US$4405.44 for women aged 50–54 years and 55–59 years, respectively. In the probabilistic sensitivity analyses, mammography screening in the 50–54 and 55–59 age groups was cost-effective in 100% of cases at a threshold of three times the Vietnamese Gross Domestic Product (GDP) per capita, i.e., US$6332.70. However, less than 50% of the cases in the 60–64 age group and 0% of the cases in the 45–49 age group were cost-effective at the WHO threshold. The ICERs were sensitive to the discount rate, mammography sensitivity, and the transition probability from remission to distant recurrence in stage II for all age groups. Conclusion From the healthcare payer viewpoint, offering the first round of mammography screening to Vietnamese women aged 50–59 years should be considered, at the given threshold of three times the Vietnamese GDP per capita. PMID:29579131
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.; Simonen, F.A.
1992-05-01
Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.
Commercialization of NESSUS: Status
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Millwater, Harry R.
1991-01-01
A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the ongoing commercialization effort is to begin the transfer of technology developed under the Probabilistic Structural Analysis Method (PSAM) program into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general-purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost-effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next-generation Space Shuttle Main Engine.
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
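For independent basic events, the probabilistic quantification of a fault tree like those described above reduces to simple gate algebra: AND gates multiply event probabilities, and OR gates combine them through the complement rule. A minimal sketch with hypothetical failure probabilities (not values from the IME or discrete-engine analyses):

```python
def and_gate(*probs):
    """Probability that all independent basic events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Probability that at least one independent basic event occurs:
    1 minus the probability that none occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical top event: loss of thrust if (pump A AND pump B fail)
# OR the controller fails.
p_top = or_gate(and_gate(1e-3, 1e-3), 5e-6)
```

A probabilistic tool such as IRRAS propagates distributions (rather than point values) through this same gate structure, whereas a fuzzy approach replaces the probabilities with membership functions expressing vagueness.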
NASA Astrophysics Data System (ADS)
Kaźmierczak, Bartosz; Wartalska, Katarzyna; Wdowikowski, Marcin; Kotowski, Andrzej
2017-11-01
Modern scientific research on heavy rainfall analysis for sewerage design indicates the need to develop and use probabilistic rain models. One of the issues that remains to be resolved is the shortest rainfall duration to be analyzed: it is commonly held that 5 minutes is best, while the shortest duration measured by national services is often 10 or even 15 minutes. The main aim of this paper is to present the differences between the results of probabilistic rainfall models derived from rainfall time series that include and exclude the 5-minute rainfall duration. The analyses cover the long period 1961-2010 at the Polish meteorological station Legnica. Four probability distributions were used to develop the probabilistic model best fitted to the measured rainfall data. The results clearly indicate that models including the 5-minute rainfall duration are more appropriate to use.
Time Alignment as a Necessary Step in the Analysis of Sleep Probabilistic Curves
NASA Astrophysics Data System (ADS)
Rošt'áková, Zuzana; Rosipal, Roman
2018-02-01
Sleep can be characterised as a dynamic process that passes through a finite set of sleep stages during the night. The standard Rechtschaffen and Kales sleep model produces a discrete representation of sleep and does not take into account its dynamic structure. In contrast, the continuous sleep representation provided by the probabilistic sleep model accounts for the dynamics of the sleep process. However, analysis of the sleep probabilistic curves is problematic when time misalignment is present. In this study, we highlight the necessity of curve synchronisation before further analysis. The original and the time-aligned sleep probabilistic curves were transformed into a finite-dimensional vector space, and their ability to predict subjects' age or daily measures was evaluated. We conclude that curve alignment significantly improves the prediction of the daily measures, especially in the case of the S2-related sleep states or slow wave sleep.
Probabilistic classifiers with high-dimensional data
Kim, Kyung In; Simon, Richard
2011-01-01
For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small-n, large-p classification problems despite their importance in medical decision making. In this paper, we introduce 2 criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed 2 extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated, or at least not "anticonservative", using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
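The well-calibratedness criterion can be checked with a simple reliability (calibration) table: bin the predicted class-1 probabilities and compare each bin's mean prediction with the observed class-1 rate. This sketch is a generic illustration of the idea, not the authors' specific evaluation measures, and the toy data are constructed to be perfectly calibrated.

```python
def calibration_table(probs, labels, n_bins=5):
    """Group predicted class-1 probabilities into equal-width bins and
    compare the mean predicted probability with the observed class-1 rate;
    a well-calibrated classifier shows close agreement in every bin."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        idx = min(int(p * n_bins), n_bins - 1)  # clamp p == 1.0 into top bin
        bins[idx].append((p, y))
    table = []
    for b in bins:
        if b:
            mean_p = sum(p for p, _ in b) / len(b)
            obs = sum(y for _, y in b) / len(b)
            table.append((mean_p, obs, len(b)))
    return table

# Toy data: predictions of 0.1 are realised as class 1 exactly 10% of the
# time, and predictions of 0.9 exactly 90% of the time.
probs = [0.1] * 10 + [0.9] * 10
labels = [1] + [0] * 9 + [1] * 9 + [0]
table = calibration_table(probs, labels)
```

An anticonservative classifier, in contrast, would show observed rates systematically less extreme than its predicted probabilities.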
Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects
NASA Technical Reports Server (NTRS)
Nagpal, V. K.
1985-01-01
A probabilistic study was initiated to evaluate the effects of tolerances in the geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are believed to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat-plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special-purpose code based on the finite element approach. The analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.
Précis of bayesian rationality: The probabilistic approach to human reasoning.
Oaksford, Mike; Chater, Nick
2009-02-01
According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... Role of Risk Analysis in Decision-Making AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... documents entitled, ``Using Probabilistic Methods to Enhance the Role of Risk Analysis in Decision- Making... Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making, with Case Study Examples'' and...
NASA Astrophysics Data System (ADS)
Avital, Matan; Kamai, Ronnie; Davis, Michael; Dor, Ory
2018-02-01
We present a full probabilistic seismic hazard analysis (PSHA) sensitivity study for two sites in southern Israel, one in the near field of a major fault system and one farther away. The PSHA is conducted for alternative source representations, using alternative model parameters for the main seismic sources, such as slip rate and Mmax, among others. The analysis also considers the effect of the ground motion prediction equation (GMPE) on the hazard results. In this way, the two types of epistemic uncertainty, modelling uncertainty and parametric uncertainty, are treated and addressed. We quantify the uncertainty propagation by testing its influence on the final calculated hazard, such that the controlling knowledge gaps are identified and can be treated in future studies. We find that current practice in Israel, as represented by the current version of the building code, grossly underestimates the hazard: by approximately 40% at short return periods (e.g. 10% in 50 years) and by as much as 150% at long return periods (e.g. an annual exceedance probability of 10E-5). The analysis shows that this underestimation is most probably due to a combination of factors, including the source definitions as well as the GMPE used for the analysis.
ProbCD: enrichment analysis accounting for categorization uncertainty.
Vêncio, Ricardo Z N; Shmulevich, Ilya
2007-10-12
As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, or to deal with uncertainty in high-throughput datasets, current enrichment methods largely ignore this probabilistic information since they are mainly based on variants of the Fisher Exact Test. We developed open-source R-based software for probabilistic categorical data analysis, ProbCD, that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, concerning the enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of the high-throughput experimental techniques and (ii) probabilistic gene annotation.
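The expected contingency table described above follows from linearity of expectation over the Bernoulli indicators; a minimal sketch (the probabilities and list membership below are invented for illustration):

```python
def expected_table(p_category, in_list):
    """Expected 2x2 enrichment table when category membership is
    probabilistic: gene i belongs to the category with probability
    p_category[i] (a Bernoulli scheme), and in_list[i] flags whether
    it is in the gene list of interest. Each cell is the expectation
    of a sum of Bernoulli indicators."""
    a = sum(p for p, s in zip(p_category, in_list) if s)      # in list, in category
    b = sum(1 - p for p, s in zip(p_category, in_list) if s)  # in list, not in category
    c = sum(p for p, s in zip(p_category, in_list) if not s)  # not in list, in category
    d = sum(1 - p for p, s in zip(p_category, in_list) if not s)
    return a, b, c, d

# Toy example: 4 genes, the first 2 in the list of interest.
table = expected_table([0.9, 0.2, 0.8, 0.1], [True, True, False, False])
print(table)  # (1.1, 0.9, 0.9, 1.1)
```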
Tan, Chongqing; Peng, Liubao; Zeng, Xiaohui; Li, Jianhe; Wan, Xiaomin; Chen, Gannong; Yi, Lidan; Luo, Xia; Zhao, Ziying
2013-01-01
First-line postoperative adjuvant chemotherapies with S-1 and capecitabine and oxaliplatin (XELOX) were first recommended for resectable gastric cancer patients in the 2010 and 2011 Chinese NCCN Clinical Practice Guidelines in Oncology: Gastric Cancer; however, their economic impact in China is unknown. The aim of this study was to compare the cost-effectiveness of adjuvant chemotherapy with XELOX, with S-1 and no treatment after a gastrectomy with extended (D2) lymph-node dissection among patients with stage II-IIIB gastric cancer. A Markov model, based on data from two clinical phase III trials, was developed to analyse the cost-effectiveness of patients in the XELOX group, S-1 group and surgery only (SO) group. The costs were estimated from the perspective of the Chinese healthcare system. The utilities were assumed on the basis of previously published reports. Costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) were calculated with a lifetime horizon. One-way and probabilistic sensitivity analyses were performed. For the base case, XELOX had the lowest total cost ($44,568) and cost-effectiveness ratio ($7,360/QALY). The scenario analyses showed that SO was dominated by XELOX and that the ICER of S-1 was $58,843/QALY compared with XELOX. The one-way sensitivity analysis showed that the most influential parameter was the utility of disease-free survival. The probabilistic sensitivity analysis predicted a 75.8% likelihood that the ICER for XELOX would be less than $13,527 compared with S-1. When the ICER exceeded $38,000, the likelihood of S-1 being cost-effective was greater than 50%. Our results suggest that for patients in China with resectable disease, first-line adjuvant chemotherapy with XELOX after a D2 gastrectomy is the best option compared with S-1 and SO. In addition, S-1 might be a better choice at higher willingness-to-pay thresholds.
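The ICER and dominance logic used throughout these cost-effectiveness evaluations reduces to a few lines; a sketch with invented QALY totals (only the $44,568 XELOX total cost is taken from the abstract above):

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of strategy B over A,
    in $/QALY gained."""
    d_cost, d_qaly = cost_b - cost_a, qaly_b - qaly_a
    if d_qaly <= 0 and d_cost >= 0:
        return None  # B dominated by A: never cost-effective
    if d_qaly >= 0 and d_cost <= 0:
        return 0.0   # B dominates A (cost-saving); ratio not meaningful
    return d_cost / d_qaly

# Illustrative comparison: XELOX-like baseline vs a costlier,
# slightly more effective option (QALY figures hypothetical).
print(icer(44568, 6.05, 50000, 6.15))  # ~54,320 $/QALY
```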
The Probability Heuristics Model of Syllogistic Reasoning.
ERIC Educational Resources Information Center
Chater, Nick; Oaksford, Mike
1999-01-01
Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…
An Instructional Module on Mokken Scale Analysis
ERIC Educational Resources Information Center
Wind, Stefanie A.
2017-01-01
Mokken scale analysis (MSA) is a probabilistic-nonparametric approach to item response theory (IRT) that can be used to evaluate fundamental measurement properties with less strict assumptions than parametric IRT models. This instructional module provides an introduction to MSA as a probabilistic-nonparametric framework in which to explore…
Milanović, Jovica V
2017-08-13
Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
Mulla, Mubashir; Schulte, Klaus-Martin
2012-01-01
Cervical lymph nodes (CLNs) are the most common site of metastases in papillary thyroid cancer (PTC). Ultrasound scan (US) is the most commonly used imaging modality in the evaluation of CLNs in PTC. Computerised tomography (CT) and 18fluorodeoxyglucose positron emission tomography (18FDG PET–CT) are used less commonly. It is widely believed that the above imaging techniques should guide the surgical approach to the patient with PTC. Methods We performed a systematic review of imaging studies from the literature assessing their usefulness for the detection of metastatic CLNs in PTC. We evaluated the authors' interpretation of their numeric findings specifically with regard to 'sensitivity' and 'negative predictive value' (NPV) by comparing their use against standard definitions of these terms in probabilistic statistics. Results A total of 16 studies used probabilistic terms to describe the value of US for the detection of LN metastases. Only 6 (37.5%) calculated sensitivity and NPV correctly. For CT, out of the eight studies, only 1 (12.5%) used correct terms to describe analytical results. One study looked at magnetic resonance imaging, while three assessed 18FDG PET–CT, none of which provided correct calculations for sensitivity and NPV. Conclusion Imaging provides high specificity for the detection of cervical metastases of PTC. However, sensitivity and NPV are low. The majority of studies reporting a high sensitivity have not used key terms according to standard definitions of probabilistic statistics. Against common opinion, there is no current evidence that failure to find LN metastases on ultrasound or cross-sectional imaging can be used to guide surgical decision making. PMID:23781308
NASA Technical Reports Server (NTRS)
Ryan, Robert S.; Townsend, John S.
1993-01-01
The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.
Contreras-Hernández, Iris; Mould-Quevedo, Joaquín F; Torres-González, Rubén; Goycochea-Robles, María Victoria; Pacheco-Domínguez, Reyna Lizette; Sánchez-García, Sergio; Mejía-Aranguré, Juan Manuel; Garduño-Espinosa, Juan
2008-11-12
Osteoarthritis (OA) is one of the main causes of disability worldwide, especially in persons >55 years of age. Currently, controversy remains about the best therapeutic alternative for this disease when evaluated from a cost-effectiveness viewpoint. For Social Security Institutions in developing countries, it is very important to assess what drugs may decrease the subsequent use of medical care resources, considering their adverse events that are known to have a significant increase in medical care costs of patients with OA. Three treatment alternatives were compared: celecoxib (200 mg twice daily), non-selective NSAIDs (naproxen, 500 mg twice daily; diclofenac, 100 mg twice daily; and piroxicam, 20 mg/day) and acetaminophen, 1000 mg twice daily. The aim of this study was to identify the most cost-effective first-choice pharmacological treatment for the control of joint pain secondary to OA in patients treated at the Instituto Mexicano del Seguro Social (IMSS). A cost-effectiveness assessment was carried out. A systematic review of the literature was performed to obtain transition probabilities. In order to evaluate analysis robustness, one-way and probabilistic sensitivity analyses were conducted. Estimations were done for a 6-month period. Treatment demonstrating the best cost-effectiveness results [lowest cost-effectiveness ratio $17.5 pesos/patient ($1.75 USD)] was celecoxib. According to the one-way sensitivity analysis, celecoxib would need to markedly decrease its effectiveness in order for it to not be the optimal treatment option. In the probabilistic analysis, both in the construction of the acceptability curves and in the estimation of net economic benefits, the most cost-effective option was celecoxib. From a Mexican institutional perspective and probably in other Social Security Institutions in similar developing countries, the most cost-effective option for treatment of knee and/or hip OA would be celecoxib.
Blázquez-Pérez, Antonio; San Miguel, Ramón; Mar, Javier
2013-10-01
Chronic hepatitis C is the leading cause of chronic liver disease, representing a significant burden in terms of morbidity, mortality and costs. A new scenario of therapy for hepatitis C virus (HCV) genotype 1 infection is being established with the approval of two effective HCV protease inhibitors (PIs) in combination with the standard of care (SOC), peginterferon and ribavirin. Our objective was to estimate the cost effectiveness of combination therapy with new PIs (boceprevir and telaprevir) plus peginterferon and ribavirin versus SOC in treatment-naive patients with HCV genotype 1 according to data obtained from clinical trials (CTs). A Markov model simulating chronic HCV progression was used to estimate disease treatment costs and effects over patients' lifetimes, in the Spanish national public healthcare system. The target population was treatment-naive patients with chronic HCV genotype 1, demographic characteristics for whom were obtained from the published pivotal CTs SPRINT and ADVANCE. Three options were analysed for each PI based on results from the two CTs: universal triple therapy, interleukin (IL)-28B-guided therapy and dual therapy with peginterferon and ribavirin. A univariate sensitivity analysis was performed to evaluate the uncertainty of certain parameters: age at start of treatment, transition probabilities, drug costs, CT efficacy results and a higher hazard ratio for all-cause mortality for patients with chronic HCV. Probabilistic sensitivity analyses were also carried out. Incremental cost-effectiveness ratios (ICERs), expressed in year-2012 euros per quality-adjusted life-year (QALY) gained, were used as outcome measures.
According to the base-case analysis, using dual therapy as the comparator, the alternative IL28B-guided therapy presents a more favorable ICER (€18,079/QALY for boceprevir and €25,914/QALY for telaprevir) than the universal triple therapy option (€27,594/QALY for boceprevir and €33,751/QALY for telaprevir), with an ICER clearly below the efficiency threshold for medical interventions in the Spanish setting. Sensitivity analysis showed that age at the beginning of treatment was an important factor that influenced the ICER. A potential reduction in PI costs would also clearly improve the ICER, and transition probabilities influenced the results, but to a lesser extent. Probabilistic sensitivity analyses showed that 95 % of the simulations presented an ICER below €40,000/QALY. Post hoc estimations of sustained virological responses of the IL28B-guided therapeutic option represented a limitation of the study. The therapeutic options analysed for the base-case cohort can be considered cost-effective interventions for the Spanish healthcare framework. Sensitivity analysis estimated an acceptability threshold of the IL28B-guided strategy of patients younger than 60 years.
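The cohort mechanics of a Markov model like the one above can be sketched in a few lines; the three-state structure and transition probabilities below are hypothetical, not the published model:

```python
def run_markov(P, start, cycles):
    """Advance a cohort through a Markov disease model: each cycle the
    state distribution is multiplied by the row-stochastic transition
    matrix P."""
    state = list(start)
    for _ in range(cycles):
        state = [sum(state[i] * P[i][j] for i in range(len(state)))
                 for j in range(len(state))]
    return state

# Hypothetical states: chronic HCV, cirrhosis, death (absorbing).
P = [[0.95, 0.04, 0.01],
     [0.00, 0.90, 0.10],
     [0.00, 0.00, 1.00]]
trace = run_markov(P, [1.0, 0.0, 0.0], 10)
print([round(s, 3) for s in trace])  # cohort shares after 10 cycles
```

Costs and QALYs would then be accumulated per cycle by weighting each state's share by its cost and utility.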
Losina, Elena; Dervan, Elizabeth E.; Paltiel, A. David; Dong, Yan; Wright, R. John; Spindler, Kurt P.; Mandl, Lisa A.; Jones, Morgan H.; Marx, Robert G.; Safran-Norton, Clare E.; Katz, Jeffrey N.
2015-01-01
Background Arthroscopic partial meniscectomy (APM) is extensively used to relieve pain in patients with symptomatic meniscal tear (MT) and knee osteoarthritis (OA). Recent studies have failed to show the superiority of APM compared to other treatments. We aim to examine whether existing evidence is sufficient to reject use of APM as a cost-effective treatment for MT+OA. Methods We built a patient-level microsimulation using Monte Carlo methods and evaluated three strategies: Physical therapy (‘PT’) alone; PT followed by APM if subjects continued to experience pain (‘Delayed APM’); and ‘Immediate APM’. Our subject population was US adults with symptomatic MT and knee OA over a 10 year time horizon. We assessed treatment outcomes using societal costs, quality-adjusted life years (QALYs), and calculated incremental cost-effectiveness ratios (ICERs), incorporating productivity costs as a sensitivity analysis. We also conducted a value-of-information analysis using probabilistic sensitivity analyses. Results Calculated ICERs were estimated to be $12,900/QALY for Delayed APM as compared to PT and $103,200/QALY for Immediate APM as compared to Delayed APM. In sensitivity analyses, inclusion of time costs made Delayed APM cost-saving as compared to PT. Improving efficacy of Delayed APM led to higher incremental costs and lower incremental effectiveness of Immediate APM in comparison to Delayed APM. Probabilistic sensitivity analyses indicated that PT had 3.0% probability of being cost-effective at a willingness-to-pay (WTP) threshold of $50,000/QALY. Delayed APM was cost effective 57.7% of the time at WTP = $50,000/QALY and 50.2% at WTP = $100,000/QALY. The probability of Immediate APM being cost-effective did not exceed 50% unless WTP exceeded $103,000/QALY. Conclusions We conclude that current cost-effectiveness evidence does not support unqualified rejection of either Immediate or Delayed APM for the treatment of MT+OA. 
The amount society would be willing to pay for additional information on treatment outcomes greatly exceeds the cost of conducting another randomized controlled trial on APM. PMID:26086246
Fischer, Paul W; Cullen, Alison C; Ettl, Gregory J
2017-01-01
The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45- and 65-year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber-oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3-14%), and short-term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land-holding costs, a no-harvest management scenario would become revenue-positive at a carbon credit break-point price of $14.17/Mg carbon dioxide equivalent (CO2e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business-as-usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation. © 2016 Society for Risk Analysis.
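Apportioning output variance among inputs, as in the sensitivity analysis above, can be sketched with a linear Monte Carlo model. The input names and coefficients below are hypothetical, chosen only to echo the study's qualitative ranking (growth-model error >> timber prices >> carbon price):

```python
import random
import statistics

def variance_shares(n=20000, seed=0):
    """Monte Carlo sketch: for a linear model with independent inputs,
    each input's share of output variance is coef**2 * var(input) /
    var(output)."""
    rng = random.Random(seed)
    coeffs = {"growth_model": 50.0, "timber_price": 30.0, "carbon_price": 10.0}
    draws = {k: [rng.gauss(0.0, 1.0) for _ in range(n)] for k in coeffs}
    npv = [100.0 + sum(c * draws[k][i] for k, c in coeffs.items())
           for i in range(n)]
    total = statistics.pvariance(npv)
    return {k: c ** 2 * statistics.pvariance(draws[k]) / total
            for k, c in coeffs.items()}

shares = variance_shares()
print({k: round(v, 2) for k, v in shares.items()})
```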
EXPERIENCES WITH USING PROBABILISTIC EXPOSURE ANALYSIS METHODS IN THE U.S. EPA
Over the past decade various Offices and Programs within the U.S. EPA have either initiated or increased the development and application of probabilistic exposure analysis models. These models have been applied to a broad range of research or regulatory problems in EPA, such as e...
NASA Technical Reports Server (NTRS)
Hill, Geoffrey A.; Olson, Erik D.
2004-01-01
Due to the growing problem of noise in today's air transportation system, there is a need to incorporate noise considerations in the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300 passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels and the resulting trade space is explored. Methods demonstrated include: single point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
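The core of a response surface is a polynomial fit to model evaluations; a minimal one-dimensional sketch (the data are synthetic, not BWB noise levels):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ a + b*x + c*x**2, the simplest response
    surface: solve the 3x3 normal equations by Gaussian elimination."""
    S = [sum(x ** k for x in xs) for k in range(5)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    r = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    for i in range(3):                      # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            r[j] -= f * r[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back substitution
        coef[i] = (r[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, 3))) / A[i][i]
    return coef

# Recover a known surface exactly from noise-free samples.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
coef = fit_quadratic(xs, [2.0 + 0.5 * x - 0.25 * x * x for x in xs])
print([round(c, 6) for c in coef])  # [2.0, 0.5, -0.25]
```

Once fitted, the polynomial can be evaluated thousands of times per second, which is what makes the parametric, optimization, and probabilistic studies listed above tractable.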
An advanced probabilistic structural analysis method for implicit performance functions
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.
1989-01-01
In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
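The mean-based, second-moment baseline that the AMV method improves on can be sketched in a few lines (the closed-form response below stands in for what would, in practice, be an implicit finite element solve):

```python
import math

def fosm(g, mu, sigma, h=1e-5):
    """Mean-based first-order second-moment estimate of the mean and
    standard deviation of g(X) for independent inputs; partial
    derivatives taken by central finite differences."""
    mean = g(mu)
    var = 0.0
    for i in range(len(mu)):
        up = list(mu); up[i] += h
        dn = list(mu); dn[i] -= h
        grad = (g(up) - g(dn)) / (2.0 * h)
        var += (grad * sigma[i]) ** 2
    return mean, math.sqrt(var)

# Toy response standing in for an implicit performance function.
g = lambda x: x[0] ** 2 + 2.0 * x[1]
mean, std = fosm(g, [1.0, 1.0], [0.1, 0.1])
print(round(mean, 3), round(std, 3))  # 3.0 0.283
```

As the abstract notes, this gives only the first two moments; the AMV method goes further and estimates the response distribution itself.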
MacDonald, Gary P
2010-08-01
The Food and Drug Administration (FDA) recently approved rosuvastatin calcium for prevention of cardiovascular events in patients who have elevated levels of high-sensitivity C-reactive protein (hs-CRP) but not overt hyperlipidemia. The FDA's decision was based primarily on research reported by the JUPITER (Justification for the Use of Statins in Prevention: An Intervention Trial Evaluating Rosuvastatin) Study Group. The cost-effectiveness of such treatment is unknown. To compare the cost-effectiveness of treatment with rosuvastatin vs standard management, according to Framingham Risk Score (FRS), for the primary prevention of cardiovascular events in patients who have hs-CRP levels of 2.0 mg/L or higher and low-density lipoprotein cholesterol (LDL-C) levels of less than 130 mg/dL. A Markov-type model was used to calculate the incremental cost-effectiveness ratio of rosuvastatin (20 mg daily) vs standard management for the primary prevention of cardiovascular events in patients over a 10-year period. Cost data were obtained from the Centers for Medicare & Medicaid Services and the Red Book drug reference. Health utility measures were obtained from the literature. Cardiovascular event data were obtained directly from the JUPITER Study Group. One-way sensitivity analysis and probabilistic sensitivity analysis were conducted. Treating patients with rosuvastatin to prevent cardiovascular events based on a hs-CRP level greater than 2.0 mg/L and an LDL-C level of 130 mg/dL or lower would result in estimated incremental cost-effectiveness ratios of $35,455 per quality-adjusted life year (QALY) in patients with an FRS greater than 10% and $90,714 per QALY in patients with an FRS less than or equal to 10%. Results of probabilistic sensitivity analysis suggested that in patients with an FRS greater than 10%, the probability that rosuvastatin is considered cost-effective at $50,000 per QALY is approximately 98%. 
In patients with an FRS less than or equal to 10%, the probability that rosuvastatin is considered cost-effective at $50,000 per QALY is 0%. Compared with standard management, treatment with rosuvastatin is a cost-effective strategy over a 10-year period for preventing cardiovascular events in patients with FRS greater than 10%, elevated hs-CRP levels, and normal LDL-C levels.
Probabilistic interpretation of Peelle's pertinent puzzle and its resolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, Kenneth M.; Kawano, T.; Talou, P.
2004-01-01
Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.
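The "implausible answer" can be reproduced directly. With the commonly quoted PPP numbers (two measurements with independent 10% errors plus a fully correlated 20% normalization error), the generalized least-squares mean falls below both measurements:

```python
def gls_mean(y, cov):
    """Generalized least-squares estimate of a common mean from two
    correlated measurements: mu = (1^T C^-1 y) / (1^T C^-1 1). For a
    2x2 C the determinant cancels in the ratio, so the adjugate
    matrix suffices."""
    (a, b), (c, d) = cov
    adj = [(d, -b), (-c, a)]                 # det(C) * C^-1
    num = sum(r0 * y[0] + r1 * y[1] for r0, r1 in adj)
    den = sum(r0 + r1 for r0, r1 in adj)
    return num / den

# Classic PPP setup: measurements 1.5 and 1.0, 10% independent errors,
# plus a fully correlated 20% normalization error.
y = (1.5, 1.0)
cov = [[0.15 ** 2 + 0.04 * y[0] * y[0], 0.04 * y[0] * y[1]],
       [0.04 * y[0] * y[1], 0.10 ** 2 + 0.04 * y[1] * y[1]]]
print(round(gls_mean(y, cov), 3))  # 0.882 -- below BOTH measurements
```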
Structural reliability assessment capability in NESSUS
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.
1992-01-01
The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
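The component-reliability quantity such a code computes is, at bottom, the probability that a limit state is violated; a crude Monte Carlo sketch (the resistance/load distributions are hypothetical stand-ins, not NESSUS models, which use far more efficient probabilistic algorithms):

```python
import random

def failure_probability(n=200000, seed=2):
    """Crude Monte Carlo estimate of component failure probability for
    the limit state g = R - S < 0 (resistance minus load), with
    hypothetical normal distributions for R and S."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n)
                   if rng.gauss(3.0, 0.3) - rng.gauss(2.0, 0.4) < 0)
    return failures / n

# Here g ~ N(1.0, 0.5), so the exact answer is Phi(-2) ~ 0.0228.
print(failure_probability())
```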
Fast computation of derivative based sensitivities of PSHA models via algorithmic differentiation
NASA Astrophysics Data System (ADS)
Leövey, Hernan; Molkenthin, Christian; Scherbaum, Frank; Griewank, Andreas; Kuehn, Nicolas; Stafford, Peter
2015-04-01
Probabilistic seismic hazard analysis (PSHA) is the preferred tool for estimation of potential ground-shaking hazard due to future earthquakes at a site of interest. A modern PSHA represents a complex framework which combines different models with possibly many inputs. Sensitivity analysis is a valuable tool for quantifying changes of a model output as inputs are perturbed, identifying critical input parameters and obtaining insight into the model behavior. Differential sensitivity analysis relies on calculating first-order partial derivatives of the model output with respect to its inputs. Moreover, derivative based global sensitivity measures (Sobol' & Kucherenko '09) can be used in practice to detect non-essential inputs of the models, thus restricting the focus of attention to a possibly much smaller set of inputs. Nevertheless, obtaining first-order partial derivatives of complex models with traditional approaches can be very challenging, and usually increases the computation complexity linearly with the number of inputs appearing in the models. In this study we show how Algorithmic Differentiation (AD) tools can be used in a complex framework such as PSHA to successfully estimate derivative based sensitivities, as is the case in various other domains such as meteorology or aerodynamics, with no significant increase in the computation complexity required for the original computations. First, we demonstrate the feasibility of the AD methodology by comparing AD derived sensitivities to analytically derived sensitivities for a basic case of PSHA using a simple ground-motion prediction equation. In a second step, we derive sensitivities via AD for a more complex PSHA study using a ground motion attenuation relation based on a stochastic method to simulate strong motion. The presented approach is general enough to accommodate more advanced PSHA studies of higher complexity.
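The idea behind forward-mode AD can be sketched with dual numbers: carry a value and its derivative together, so one evaluation of the model yields an exact partial derivative with no finite-difference step size to tune. The GMPE-like form and its coefficients below are invented for illustration:

```python
import math

class Dual:
    """Minimal forward-mode AD number: a value paired with its
    derivative, propagated exactly through + and *."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def log_dual(x):
    """Natural log with derivative propagation: d(ln x) = dx / x."""
    return Dual(math.log(x.val), x.dot / x.val)

# Hypothetical GMPE-like form: ln(PGA) = c0 + c1*M + c2*ln(R).
def ln_pga(m, r):
    return 0.5 + 1.2 * m + (-1.0) * log_dual(r)

# Seed M with derivative 1 to get d ln(PGA)/dM at M=6, R=20 km.
print(ln_pga(Dual(6.0, 1.0), Dual(20.0, 0.0)).dot)  # 1.2
```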
Chit, Ayman; Roiz, Julie; Aballea, Samuel
2015-01-01
Ontario, Canada, immunizes against influenza using a trivalent inactivated influenza vaccine (IIV3) under a Universal Influenza Immunization Program (UIIP). The UIIP offers IIV3 free-of-charge to all Ontarians over 6 months of age. A newly approved quadrivalent inactivated influenza vaccine (IIV4) offers wider protection against influenza B disease. We explored the expected cost-utility and budget impact of replacing IIV3 with IIV4, within the context of Ontario's UIIP, using a probabilistic and static cost-utility model. Wherever possible, epidemiological and cost data were obtained from Ontario sources. Canadian or U.S. sources were used when Ontario data were not available. Vaccine efficacy for IIV3 was obtained from the literature. IIV4 efficacy was derived from meta-analysis of strain-specific vaccine efficacy. Conservatively, herd protection was not considered. In the base case, we used IIV3 and IIV4 prices of $5.5/dose and $7/dose, respectively. We conducted a sensitivity analysis on the price of IIV4, as well as standard univariate and multivariate statistical uncertainty analyses. Over a typical influenza season, relative to IIV3, IIV4 is expected to avert an additional 2,516 influenza cases, 1,683 influenza-associated medical visits, 27 influenza-associated hospitalizations, and 5 influenza-associated deaths. From a societal perspective, IIV4 would generate 76 more Quality Adjusted Life Years (QALYs) and a net societal budget impact of $4,784,112. The incremental cost effectiveness ratio for this comparison was $63,773/QALY. IIV4 remains cost-effective up to a 53% price premium over IIV3. A probabilistic sensitivity analysis showed that IIV4 was cost-effective with a probability of 65% for a threshold of $100,000/QALY gained. IIV4 is expected to achieve reductions in influenza-related morbidity and mortality compared to IIV3. 
Despite not accounting for herd protection, IIV4 is still expected to be a cost-effective alternative to IIV3 up to a price premium of 53%. Our conclusions were robust in the face of sensitivity analyses.
Grau, Santiago; Lozano, Virginia; Valladares, Amparo; Cavanillas, Rafael; Xie, Yang; Nocea, Gonzalo
2014-01-01
Background Clinical efficacy of antibiotics may be affected by changes in the susceptibility of microorganisms to antimicrobial agents. The purpose of this study is to assess how these changes could affect the initial efficacy of ertapenem and ceftriaxone in the treatment of community-acquired pneumonia (CAP) in elderly patients, and the potential consequences this may have for health care costs. Methods Initial efficacy in the elderly was obtained from a combined analysis of two multicenter, randomized studies. An alternative scenario was analyzed using initial efficacy data according to the pneumonia severity index (PSI). Country-specific pathogen distribution was obtained from a national epidemiological study, and microbiological susceptibilities to first- and second-line therapies were obtained from Spanish or European surveillance studies. A decision analytic model was used to compare ertapenem versus ceftriaxone for CAP inpatient treatment. Inputs of the model were the expected effectiveness estimated previously and resource use, from a Spanish national health system perspective. Outcomes include the difference in the proportion of successfully treated patients and the difference in total costs between ertapenem and ceftriaxone. One-way and probabilistic sensitivity analyses were performed. Results First-line treatment of CAP with ertapenem led to a higher proportion of successfully treated patients compared with ceftriaxone in Spain. One-way sensitivity analysis showed that length of stay was the key parameter of the model. Probabilistic sensitivity analysis showed that ertapenem can be a cost-saving strategy compared with ceftriaxone, with a 59% probability of being dominant (lower costs with additional health benefits) for both elderly patients (>65 years) and patients with PSI >3. Conclusion The incorporation of current antimicrobial susceptibility into the initial clinical efficacy has a significant impact on outcomes and costs in CAP treatment. 
The treatment with ertapenem compared with ceftriaxone resulted in better clinical outcomes and lower treatment costs for two segments of the Spanish population: elderly patients and patients with severe pneumonia (PSI >3). PMID:24611019
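A minimal decision-tree sketch of this kind of comparison: the expected cost per patient is the probability-weighted cost of the success and failure branches. All inputs below (success probabilities, lengths of stay, unit costs) are placeholders for illustration, not the study's Spanish estimates.

```python
def expected_cost(p_success, los_success, los_failure, cost_per_day, second_line_cost):
    """Expected cost per patient in a two-branch decision tree."""
    success = p_success * los_success * cost_per_day
    failure = (1 - p_success) * (los_failure * cost_per_day + second_line_cost)
    return success + failure

# hypothetical inputs for illustration only
cost_ertapenem = expected_cost(0.90, los_success=8, los_failure=14,
                               cost_per_day=400, second_line_cost=1200)
cost_ceftriaxone = expected_cost(0.85, los_success=8, los_failure=14,
                                 cost_per_day=400, second_line_cost=1200)
incremental = cost_ceftriaxone - cost_ertapenem
```

Because length of stay multiplies the daily cost in both branches, varying `los_failure` in a one-way sensitivity analysis quickly shows why that parameter tends to dominate the result.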
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Hermanto, Didik
2017-08-01
This research is a developmental study of probabilistic thinking-oriented learning tools for probability material at the ninth-grade level, aimed at producing a good set of such tools. The subjects were the IX-A students of MTs Model Bangkalan. The study used the 4-D development model, modified to three stages: define, design and develop. The teaching and learning tools consist of a lesson plan, a students' worksheet, teaching media and a students' achievement test. The research instruments were a learning-tools validation sheet, a teachers' activity sheet, a students' activity sheet, a students' response questionnaire and the students' achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of probabilistic thinking-oriented learning tools for teaching probability to ninth-grade students that was judged valid: the tools were revised based on the validation, and the classroom experiment showed that the teachers' classroom management was effective, the students' activities were good, the students' responses to the learning tools were positive, and the achievement test satisfied the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.
Puttarajappa, Chethan; Wijkstrom, Martin; Ganoza, Armando; Lopez, Roberto; Tevar, Amit
2018-01-01
Background Recent studies have reported a significant decrease in wound problems and hospital stay in obese patients undergoing renal transplantation by robotic-assisted minimally invasive techniques, with no difference in graft function. Objective Due to the lack of cost-benefit studies on the use of robotic-assisted renal transplantation versus the open surgical procedure, the primary aim of our study is to develop a Markov model to analyze the cost-benefit of robotic surgery versus traditional open surgery in obese patients in need of a renal transplant. Methods Electronic searches will be conducted to identify studies comparing open renal transplantation versus robotic-assisted renal transplantation. Costs associated with the two surgical techniques will incorporate the expenses of the resources used for the operations. A decision analysis model will be developed to simulate a randomized controlled trial comparing three interventional arms: (1) continuation of renal replacement therapy for patients who are considered non-suitable candidates for renal transplantation due to obesity, (2) transplant recipients undergoing open transplant surgery, and (3) transplant patients undergoing robotic-assisted renal transplantation. TreeAge Pro 2017 R1 (TreeAge Software, Williamstown, MA, USA) will be used to create a Markov model, and microsimulation will be used to compare costs and benefits for the two competing surgical interventions. Results The model will simulate a randomized controlled trial of adult obese patients affected by end-stage renal disease undergoing renal transplantation. The absorbing state of the model will be patients' death from any cause. 
By choosing death as the absorbing state, we will be able to simulate the population of renal transplant recipients from the day of their randomization to transplant surgery or continuation on renal replacement therapy until their death, and to perform sensitivity analysis around patients' age at the time of randomization to determine whether age is a critical variable for the cost-benefit or cost-effectiveness analysis comparing renal replacement therapy, robotic-assisted surgery and open renal transplant surgery. Conclusions After running the model, one of the three competing strategies will emerge as the most cost-beneficial or cost-effective under common circumstances. To assess the robustness of the results of the model, a multivariable probabilistic sensitivity analysis will be performed by modifying the mean values and confidence intervals of key parameters, with the main intent of assessing whether the winning strategy is sensitive to rigorous and plausible variations of those values. PMID:29519780
Sironi, Emanuele; Taroni, Franco; Baldinotti, Claudio; Nardi, Cosimo; Norelli, Gian-Aristide; Gallidabino, Matteo; Pinchi, Vilma
2017-11-14
The present study aimed to investigate the performance of a Bayesian method in the evaluation of dental age-related evidence collected by means of a geometrical approximation procedure of the pulp chamber volume. Measurement of this volume was based on three-dimensional cone beam computed tomography images. The Bayesian method was applied by means of a probabilistic graphical model, namely a Bayesian network. Performance of the method was investigated in terms of accuracy and bias of the decisional outcomes. The influence of an informed elicitation of the prior belief about chronological age was also studied by means of a sensitivity analysis. Outcomes in terms of accuracy were consistent with standard requirements for forensic adult age estimation. Findings also indicated that the Bayesian method does not show a particular tendency towards under- or overestimation of the age variable. Outcomes of the sensitivity analysis showed that estimation results are improved with a rational elicitation of the prior probabilities of age.
Agreement With Conjoined NPs Reflects Language Experience.
Lorimor, Heidi; Adams, Nora C; Middleton, Erica L
2018-01-01
An important question within psycholinguistic research is whether grammatical features, such as number values on nouns, are probabilistic or discrete. Similarly, researchers have debated whether grammatical specifications are only set for individual lexical items, or whether certain types of noun phrases (NPs) also obtain number valuations at the phrasal level. Through a corpus analysis and an oral production task, we show that conjoined NPs can take both singular and plural verb agreement and that notional number (i.e., the numerosity of the referent of the subject noun phrase) plays an important role in agreement with conjoined NPs. In two written production tasks, we show that participants who are exposed to plural (versus singular or unmarked) agreement with conjoined NPs in a biasing story are more likely to produce plural agreement with conjoined NPs on a subsequent production task. This suggests that, in addition to their sensitivity to notional information, conjoined NPs have probabilistic grammatical specifications that reflect their distributional properties in language. These results provide important evidence that grammatical number reflects language experience, and that this language experience impacts agreement at the phrasal level, and not just the lexical level.
NASA Astrophysics Data System (ADS)
Yuniar, S.; Wangsaputra, R.; Sinaga, A. T.
2018-03-01
This study aims to develop a combined (joint) economic lot size model between supplier and manufacturer for imperfect production processes with probabilistic demand patterns and constant lead times. The supplier produces the product within a certain time interval and then ships it to the manufacturer in lots of a certain size. The imperfect supplier production system is characterized by a probability of defective product (γ). The model's decision variables are the manufacturer's order lot size, the supplier's lot size, and the manufacturer's reorder point. The optimal decision variables are obtained by minimizing the total expected cost of the combined supplier-manufacturer system borne by both parties. The developed model is compared with a transactional partnership model, in which the supplier does not participate in improving the efficiency of the manufacturer's inventory system. A numerical example illustrates both the JELS model and the transactional partnership model. Sensitivity analysis is performed by varying the parameters to analyze the behavior of the developed model.
Objectively combining AR5 instrumental period and paleoclimate climate sensitivity evidence
NASA Astrophysics Data System (ADS)
Lewis, Nicholas; Grünwald, Peter
2018-03-01
Combining instrumental period evidence regarding equilibrium climate sensitivity with largely independent paleoclimate proxy evidence should enable a more constrained sensitivity estimate to be obtained. Previous subjective Bayesian approaches involved selecting a prior probability distribution reflecting the investigators' beliefs about climate sensitivity. Here a recently developed approach employing two different statistical methods, objective Bayesian and frequentist likelihood-ratio, is used to combine instrumental period and paleoclimate evidence based on data presented and assessments made in the IPCC Fifth Assessment Report. Probabilistic estimates from each source of evidence are represented by posterior probability density functions (PDFs) of physically-appropriate form that can be uniquely factored into a likelihood function and a noninformative prior distribution. The three-parameter form is shown to fit a wide range of estimated climate sensitivity PDFs accurately. The likelihood functions relating to the probabilistic estimates from the two sources are multiplicatively combined and a prior is derived that is noninformative for inference from the combined evidence. A posterior PDF that incorporates the evidence from both sources is produced using a single-step approach, which avoids the order-dependency that would arise if Bayesian updating were used. Results are compared with an alternative approach using the frequentist signed root likelihood ratio method. Results from these two methods are effectively identical, and provide a 5-95% range for climate sensitivity of 1.1-4.05 K (median 1.87 K).
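The multiplicative combination of likelihoods can be illustrated on a grid. The sketch below stands in two hypothetical log-normal likelihood shapes for the instrumental and paleoclimate evidence (all parameters are invented), multiplies them pointwise, and confirms that the combined 5-95% range is tighter than either source alone.

```python
import math

STEP = 0.01
S = [1.0 + STEP * i for i in range(1, 600)]   # sensitivity grid, 1.01..6.99 K

def lognorm(x, mu, sigma):
    """Log-normal density, used here as a stand-in likelihood shape."""
    return (math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2))
            / (x * sigma * math.sqrt(2 * math.pi)))

# hypothetical likelihoods for the two evidence sources (illustrative parameters)
inst = [lognorm(s, math.log(2.0), 0.45) for s in S]    # instrumental period
paleo = [lognorm(s, math.log(2.8), 0.60) for s in S]   # paleoclimate proxies
both = [a * b for a, b in zip(inst, paleo)]            # multiplicative combination

def credible_range(dens, lo=0.05, hi=0.95):
    """5-95% range of the normalized density on the grid."""
    w = [d * STEP for d in dens]
    total = sum(w)
    acc, bounds = 0.0, []
    for s, wi in zip(S, w):
        acc += wi
        if not bounds and acc >= lo * total:
            bounds.append(s)
        if len(bounds) == 1 and acc >= hi * total:
            bounds.append(s)
            break
    return tuple(bounds)

def width(r):
    return r[1] - r[0]

r_inst = credible_range(inst)
r_paleo = credible_range(paleo)
r_both = credible_range(both)
```

In log-space the combined likelihood's precision is the sum of the two sources' precisions, which is why the product is always at least as well constrained as either input.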
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
Galperine, Tatiana; Denies, Fanette; Lannoy, Damien; Lenne, Xavier; Odou, Pascal; Guery, Benoit; Dervaux, Benoit
2017-01-01
Background Clostridium difficile infection (CDI) is characterized by high rates of recurrence, resulting in substantial health care costs. The aim of this study was to analyze the cost-effectiveness of treatments for the management of second recurrence of community-onset CDI in France. Methods We developed a decision-analytic simulation model to compare 5 treatments for the management of second recurrence of community-onset CDI: pulsed-tapered vancomycin, fidaxomicin, fecal microbiota transplantation (FMT) via colonoscopy, FMT via duodenal infusion, and FMT via enema. The model outcome was the incremental cost-effectiveness ratio (ICER), expressed as cost per quality-adjusted life year (QALY) among the 5 treatments. ICERs were interpreted using a willingness-to-pay threshold of €32,000/QALY. Uncertainty was evaluated through deterministic and probabilistic sensitivity analyses. Results Three strategies were on the efficiency frontier: pulsed-tapered vancomycin, FMT via enema, and FMT via colonoscopy, in order of increasing effectiveness. FMT via duodenal infusion and fidaxomicin were dominated (i.e. less effective and costlier) by FMT via colonoscopy and FMT via enema. FMT via enema compared with pulsed-tapered vancomycin had an ICER of €18,092/QALY. The ICER for FMT via colonoscopy versus FMT via enema was €73,653/QALY. Probabilistic sensitivity analysis with 10,000 Monte Carlo simulations showed that FMT via enema was the most cost-effective strategy in 58% of simulations and FMT via colonoscopy was favored in 19% at a willingness-to-pay threshold of €32,000/QALY. Conclusions FMT via enema is the most cost-effective initial strategy for the management of second recurrence of community-onset CDI at a willingness-to-pay threshold of €32,000/QALY. PMID:28103289
Oakley, Jeremy E.; Brennan, Alan; Breeze, Penny
2015-01-01
Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method. PMID:25810269
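The regression idea can be sketched with a toy decision problem. Because the simulated study outcome below is a discrete success count, the conditional mean of net benefit at each observed count serves as the nonparametric regression; the Beta prior, net-benefit function, and study size are all invented for illustration and are not the authors' case study.

```python
import random
from collections import defaultdict

random.seed(7)

N = 20000       # PSA sample size
N_STUDY = 50    # size of the proposed study

# hypothetical decision problem: adopt a new treatment whose response
# probability p is uncertain (Beta prior); net benefit is linear in p
p = [random.betavariate(8, 12) for _ in range(N)]
nb_new = [20000 * pi - 9000 for pi in p]   # new treatment (toy net benefit)
nb_old = [0.0] * N                         # comparator as the zero reference

# one simulated study outcome per PSA draw: successes out of N_STUDY
x = [sum(random.random() < pi for _ in range(N_STUDY)) for pi in p]

# nonparametric regression of net benefit on the simulated data:
# since x is discrete, the conditional mean per observed value suffices
tot, cnt = defaultdict(float), defaultdict(int)
for xi, nbi in zip(x, nb_new):
    tot[xi] += nbi
    cnt[xi] += 1
ghat = {k: tot[k] / cnt[k] for k in tot}   # estimate of E[NB_new | X = x]

value_now = max(sum(nb_new) / N, 0.0)                       # decide on current evidence
value_with_study = sum(max(ghat[xi], 0.0) for xi in x) / N  # decide after seeing X
evsi = value_with_study - value_now
```

Note that the model (here, the net-benefit line) is evaluated only once per PSA draw; no inner Bayesian-updating loop is needed, which is the computational saving the abstract describes.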
Mittmann, Nicole; Chan, Brian C; Craven, B Cathy; Isogai, Pierre K; Houghton, Pamela
2011-06-01
To evaluate the incremental cost-effectiveness of electrical stimulation (ES) plus standard wound care (SWC) compared with SWC alone in a spinal cord injury (SCI) population with grade III/IV pressure ulcers (PUs), from the public payer perspective. A decision analytic model was constructed over a 1-year time horizon to determine the incremental cost-effectiveness of ES plus SWC versus SWC in a cohort of participants with SCI and grade III/IV PUs. Model inputs, namely clinical probabilities and direct health system and medical resource use, were based on published literature and on a randomized controlled trial of ES plus SWC versus SWC. Costs (Can $) included outpatient (clinic, home care, health professional) and inpatient management (surgery, complications). One-way and probabilistic sensitivity analyses (1000 Monte Carlo iterations) were conducted. The perspective of this analysis is that of a Canadian public health system payer. The model target population was an SCI cohort with grade III/IV PUs. The outcome measure was incremental cost per PU healed. ES plus SWC was associated with better outcomes and lower costs: a 16.4% increase in PUs healed and a cost savings of $224 at 1 year. ES plus SWC was thus considered a dominant economic comparator. Probabilistic sensitivity analysis resulted in economic dominance for ES plus SWC in 62% of simulations, with another 35% having incremental cost-effectiveness ratios of $50,000 or less per PU healed. The largest driver of the economic model was the percentage of PUs healed with ES plus SWC. The addition of ES to SWC improved healing of grade III/IV PUs and reduced costs in an SCI population. Copyright © 2011 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
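A probabilistic sensitivity analysis of this kind can be sketched by sampling the incremental cost-effect plane. The point estimates below (16.4% more ulcers healed, $224 saved) come from the abstract, but the distributions and spread parameters are assumed purely for illustration.

```python
import random

random.seed(0)

def psa(iters=1000, wtp=50000):
    """Count PSA draws that are dominant or acceptable at the threshold."""
    dominant = acceptable = 0
    for _ in range(iters):
        # hypothetical second-order distributions around the trial point estimates
        d_effect = random.gauss(0.164, 0.08)   # incremental PUs healed per patient
        d_cost = random.gauss(-224.0, 600.0)   # incremental cost (Can$)
        if d_effect > 0 and d_cost < 0:
            dominant += 1                      # better and cheaper
        elif d_effect > 0 and d_cost / d_effect <= wtp:
            acceptable += 1                    # ICER under the threshold
    return dominant / iters, acceptable / iters

p_dominant, p_acceptable = psa()
```

Sweeping `wtp` over a range of values and plotting `p_dominant + p_acceptable` gives the familiar cost-effectiveness acceptability curve.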
Potential cost-effectiveness of universal access to modern contraceptives in Uganda.
Babigumira, Joseph B; Stergachis, Andy; Veenstra, David L; Gardner, Jacqueline S; Ngonzi, Joseph; Mukasa-Kivunike, Peter; Garrison, Louis P
2012-01-01
Over two thirds of women who need contraception in Uganda lack access to modern effective methods. This study was conducted to estimate the potential cost-effectiveness of achieving universal access to modern contraceptives in Uganda by implementing a hypothetical new contraceptive program (NCP) from both societal and governmental (Ministry of Health (MoH)) perspectives. A Markov model was developed to compare the NCP to the status quo or current contraceptive program (CCP). The model followed a hypothetical cohort of 15-year old girls over a lifetime horizon. Data were obtained from the Uganda National Demographic and Health Survey and from published and unpublished sources. Costs, life expectancy, disability-adjusted life expectancy, pregnancies, fertility and incremental cost-effectiveness measured as cost per life-year (LY) gained, cost per disability-adjusted life-year (DALY) averted, cost per pregnancy averted and cost per unit of fertility reduction were calculated. Univariate and probabilistic sensitivity analyses were performed to examine the robustness of results. Mean discounted life expectancy and disability-adjusted life expectancy (DALE) were higher under the NCP vs. CCP (28.74 vs. 28.65 years and 27.38 vs. 27.01 respectively). Mean pregnancies and live births per woman were lower under the NCP (9.51 vs. 7.90 and 6.92 vs. 5.79 respectively). Mean lifetime societal costs per woman were lower for the NCP from the societal perspective ($1,949 vs. $1,987) and the MoH perspective ($636 vs. $685). In the incremental analysis, the NCP dominated the CCP, i.e. it was both less costly and more effective. The results were robust to univariate and probabilistic sensitivity analysis. Universal access to modern contraceptives in Uganda appears to be highly cost-effective. Increasing contraceptive coverage should be considered among Uganda's public health priorities.
Cost-Effectiveness of Thrombolysis within 4.5 Hours of Acute Ischemic Stroke in China
Zhao, Xingquan; Liao, Xiaoling; Wang, Chunjuan; Du, Wanliang; Liu, Gaifen; Liu, Liping; Wang, Chunxue; Wang, Yilong; Wang, Yongjun
2014-01-01
Background Previous economic studies conducted in developed countries showed intravenous tissue-type plasminogen activator (tPA) is cost-effective for acute ischemic stroke. The present study aimed to determine the cost-effectiveness of tPA treatment in China, the largest developing country. Methods A combination of a decision tree and a Markov model was developed to determine the cost-effectiveness of tPA treatment versus non-tPA treatment within 4.5 hours after stroke onset. Outcome and cost data were derived from the database of the Thrombolysis Implementation and Monitor of acute ischemic Stroke in China (TIMS-China) study. Efficacy data were derived from a pooled analysis of the ECASS, ATLANTIS, NINDS, and EPITHET trials. Costs and quality-adjusted life-years (QALYs) were compared in both the short term (2 years) and the long term (30 years). One-way and probabilistic sensitivity analyses were performed to test the robustness of the results. Results Compared with non-tPA treatment, tPA treatment within 4.5 hours led to a short-term gain of 0.101 QALYs at an additional cost of CNY 9,520 (US$ 1,460), yielding an incremental cost-effectiveness ratio (ICER) of CNY 94,300 (US$ 14,500) per QALY gained in 2 years; and to a long-term gain of 0.422 QALYs at an additional cost of CNY 6,530 (US$ 1,000), yielding an ICER of CNY 15,500 (US$ 2,380) per QALY gained in 30 years. Probabilistic sensitivity analysis showed that tPA treatment is cost-effective in 98.7% of the simulations at a willingness-to-pay threshold of CNY 105,000 (US$ 16,200) per QALY. Conclusions Intravenous tPA treatment within 4.5 hours is highly cost-effective for acute ischemic stroke in China. PMID:25329637
de Geus, S W L; Evans, D B; Bliss, L A; Eskander, M F; Smith, J K; Wolff, R A; Miksad, R A; Weinstein, M C; Tseng, J F
2016-10-01
Neoadjuvant therapy is gaining acceptance as a valid treatment option for borderline resectable pancreatic cancer; however, its value for clearly resectable pancreatic cancer remains controversial. The aim of this study was to use a Markov decision analysis model, in the absence of adequately powered randomized trials, to compare the life expectancy (LE) and quality-adjusted life expectancy (QALE) of neoadjuvant therapy to conventional upfront surgical strategies in resectable pancreatic cancer patients. A Markov decision model was created to compare two strategies: attempted pancreatic resection followed by adjuvant chemoradiotherapy and neoadjuvant chemoradiotherapy followed by restaging with, if appropriate, attempted pancreatic resection. Data obtained through a comprehensive systematic search in PUBMED of the literature from 2000 to 2015 were used to estimate the probabilities used in the model. Deterministic and probabilistic sensitivity analyses were performed. Of the 786 potentially eligible studies identified, 22 studies met the inclusion criteria and were used to extract the probabilities used in the model. Base case analyses of the model showed a higher LE (32.2 vs. 26.7 months) and QALE (25.5 vs. 20.8 quality-adjusted life months) for patients in the neoadjuvant therapy arm compared to upfront surgery. Probabilistic sensitivity analyses for LE and QALE revealed that neoadjuvant therapy is favorable in 59% and 60% of the cases respectively. Although conceptual, these data suggest that neoadjuvant therapy offers substantial benefit in LE and QALE for resectable pancreatic cancer patients. These findings highlight the value of further prospective randomized trials comparing neoadjuvant therapy to conventional upfront surgical strategies. Copyright © 2016 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
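A Markov cohort model of this type can be sketched with a three-state monthly trace. The transition probabilities and utility weights below are illustrative placeholders, not the probabilities extracted from the 22 included studies.

```python
def markov_run(trans, utilities, cycles=120):
    """Monthly-cycle Markov cohort trace; returns (life months, QA life months)."""
    occ = [1.0, 0.0, 0.0]   # the whole cohort starts in the first state
    le = qale = 0.0
    for _ in range(cycles):
        # redistribute state occupancy according to the transition matrix
        occ = [sum(occ[i] * trans[i][j] for i in range(3)) for j in range(3)]
        le += occ[0] + occ[1]                               # months alive
        qale += occ[0] * utilities[0] + occ[1] * utilities[1]
    return le, qale

# states: disease-free, recurrence, dead; each row must sum to 1
# (monthly transition probabilities below are assumed, for illustration)
neoadjuvant = [[0.96, 0.03, 0.01],
               [0.00, 0.93, 0.07],
               [0.00, 0.00, 1.00]]
upfront = [[0.95, 0.04, 0.01],
           [0.00, 0.92, 0.08],
           [0.00, 0.00, 1.00]]
utilities = (0.80, 0.55)    # quality weights per living state (assumed)

le_neo, qale_neo = markov_run(neoadjuvant, utilities)
le_up, qale_up = markov_run(upfront, utilities)
```

Life expectancy is the sum of the alive-state occupancy over cycles; weighting each living state by its utility turns the same trace into quality-adjusted life expectancy.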
Kim, Dong-Jin; Kim, Ho-Sook; Oh, Minkyung; Kim, Eun-Young; Shin, Jae-Gook
2017-10-01
Although studies assessing the cost effectiveness of genotype-guided warfarin dosing for the management of atrial fibrillation, deep vein thrombosis, and pulmonary embolism have been reported, no publications have addressed genotype-guided warfarin therapy in mechanical heart valve replacement (MHVR) patients or genotype-guided warfarin therapy under the fee-for-service (FFS) insurance system. The aim of this study was to evaluate the cost effectiveness of genotype-guided warfarin dosing in patients with MHVR under the FFS system from the Korea healthcare sector perspective. A decision-analytic Markov model was developed to evaluate the cost effectiveness of genotype-guided warfarin dosing compared with standard dosing. Estimates of clinical adverse event rates and health state utilities were derived from the published literature. The outcome measure was the incremental cost-effectiveness ratio (ICER) per quality-adjusted life-year (QALY). One-way and probabilistic sensitivity analyses were performed to explore the range of plausible results. In a base-case analysis, genotype-guided warfarin dosing was associated with marginally higher QALYs than standard warfarin dosing (6.088 vs. 6.083, respectively), at a slightly higher cost (US$6.8) (year 2016 values). The ICER was US$1356.2 per QALY gained. In probabilistic sensitivity analysis, there was an 82.7% probability that genotype-guided dosing was dominant compared with standard dosing, and a 99.8% probability that it was cost effective at a willingness-to-pay threshold of US$50,000 per QALY gained. Compared with only standard warfarin therapy, genotype-guided warfarin dosing was cost effective in MHVR patients under the FFS insurance system.
NASA Astrophysics Data System (ADS)
Zamora, N.; Hoechner, A.; Babeyko, A. Y.
2014-12-01
Iran and Pakistan are countries frequently affected by destructive earthquakes: for instance, the magnitude 6.6 Bam earthquake in Iran in 2003, with about 30,000 casualties, or the magnitude 7.6 Kashmir earthquake in Pakistan in 2005, with about 80,000 casualties. Both events took place inland, but in terms of magnitude, even significantly larger events can be expected to happen offshore, at the Makran subduction zone. This small subduction zone is seismically rather quiescent; nevertheless, a tsunami caused by a thrust event in 1945 (the Balochistan earthquake) led to about 4,000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss the possibility of rather rare magnitude 9 events at the Makran subduction zone. We analyze the seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 100,000 years. All events are projected onto the plate interface using scaling relations, and a tsunami model is run for every scenario. The tsunami hazard along the coast is computed and presented in the form of annual probability of exceedance, probabilistic tsunami height for different time periods, and other measures. We show how the hazard responds to variation of the Gutenberg-Richter parameters and maximum magnitudes. We model the historic Balochistan event and its effect in terms of coastal wave heights. Finally, we show how effective tsunami early warning could be achieved using an array of high-precision real-time GNSS (Global Navigation Satellite System) receivers along the coast, by applying the approach to the 1945 event and performing a sensitivity analysis.
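Once a synthetic catalog exists, the hazard computation reduces to counting exceedances and assuming Poissonian recurrence, P(T) = 1 - exp(-lambda * T). The catalog below is a hypothetical stand-in (heavy-tailed random heights), not output of the tsunami model.

```python
import math
import random

random.seed(42)

CATALOG_YEARS = 100_000
# hypothetical synthetic catalog: one max coastal wave height (m) per event,
# drawn from a heavy-tailed distribution as a stand-in for tsunami modeling
heights = [0.2 * random.paretovariate(1.5) for _ in range(3000)]

def prob_exceed(h, window_years=50):
    """Probability of at least one exceedance of height h in the time window,
    assuming Poissonian event recurrence: P = 1 - exp(-rate * T)."""
    rate = sum(x > h for x in heights) / CATALOG_YEARS   # annual exceedance rate
    return 1.0 - math.exp(-rate * window_years)
```

Evaluating `prob_exceed` over a range of heights yields the hazard curve for one coastal point; repeating the count under perturbed Gutenberg-Richter parameters is how the sensitivity of the hazard to those parameters can be examined.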
Wu, Bin; Li, Jin; Wu, Haixiang
2015-11-01
To investigate the cost-effectiveness of different screening intervals for diabetic retinopathy (DR) in Chinese patients with newly diagnosed type 2 diabetes mellitus (T2DM), in a general clinical setting within the Chinese healthcare system. A cost-effectiveness model was developed to simulate the disease course of a Chinese population newly diagnosed with diabetes. Different DR screening programs were modeled to project economic outcomes. To develop the economic model, we calibrated the progression rates of DR to fit Chinese epidemiologic data derived from the published literature. Costs were estimated from the perspective of the Chinese healthcare system, and the analysis was run over a lifetime horizon. One-way and probabilistic sensitivity analyses were performed. Outcome measures were total costs, vision outcomes, costs per quality-adjusted life year (QALY), and the incremental cost-effectiveness ratio (ICER) of screening strategies compared with no screening. DR screening is effective in Chinese patients with newly diagnosed T2DM, and screening strategies with ≥4-year intervals were cost-effective (ICER <$7,485 per QALY) compared with no screening. Screening every 4 years produced the greatest increase in QALYs (11.066) among the cost-effective strategies. The screening intervals could vary considerably with age at T2DM diagnosis. Probabilistic sensitivity analyses demonstrated the consistency and robustness of the cost-effectiveness of the 4-year interval screening strategy. The findings suggest that a 4-year interval screening strategy is likely to be more cost-effective than screening every 1 to 3 years in comparison with no screening in the Chinese setting. The screening intervals might be tailored according to the age at T2DM diagnosis.
Cost-effectiveness of thrombolysis within 4.5 hours of acute ischemic stroke in China.
Pan, Yuesong; Chen, Qidong; Zhao, Xingquan; Liao, Xiaoling; Wang, Chunjuan; Du, Wanliang; Liu, Gaifen; Liu, Liping; Wang, Chunxue; Wang, Yilong; Wang, Yongjun
2014-01-01
Previous economic studies conducted in developed countries showed intravenous tissue-type plasminogen activator (tPA) is cost-effective for acute ischemic stroke. The present study aimed to determine the cost-effectiveness of tPA treatment in China, the largest developing country. A combination of a decision tree and a Markov model was developed to determine the cost-effectiveness of tPA treatment versus non-tPA treatment within 4.5 hours after stroke onset. Outcome and cost data were derived from the database of the Thrombolysis Implementation and Monitor of acute ischemic Stroke in China (TIMS-China) study. Efficacy data were derived from a pooled analysis of the ECASS, ATLANTIS, NINDS, and EPITHET trials. Costs and quality-adjusted life-years (QALYs) were compared over both the short term (2 years) and the long term (30 years). One-way and probabilistic sensitivity analyses were performed to test the robustness of the results. Compared with non-tPA treatment, tPA treatment within 4.5 hours led to a short-term gain of 0.101 QALYs at an additional cost of CNY 9,520 (US$ 1,460), yielding an incremental cost-effectiveness ratio (ICER) of CNY 94,300 (US$ 14,500) per QALY gained over 2 years, and to a long-term gain of 0.422 QALYs at an additional cost of CNY 6,530 (US$ 1,000), yielding an ICER of CNY 15,500 (US$ 2,380) per QALY gained over 30 years. Probabilistic sensitivity analysis showed that tPA treatment is cost-effective in 98.7% of the simulations at a willingness-to-pay threshold of CNY 105,000 (US$ 16,200) per QALY. Intravenous tPA treatment within 4.5 hours is highly cost-effective for acute ischemic stroke in China.
Probabilistic safety assessment of the design of a tall buildings under the extreme load
DOE Office of Scientific and Technical Information (OSTI.GOV)
Králik, Juraj, E-mail: juraj.kralik@stuba.sk
2016-06-08
The paper describes experiences from the deterministic and probabilistic analysis of the safety of a tall building structure. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. The uncertainties of the model and of the resistance of the structure are considered using simulation methods. The Monte Carlo, LHS, and RSM probabilistic methods are compared with the deterministic results. The effectiveness of the probabilistic design of structures using finite element methods is demonstrated on an example probability analysis of the safety of a tall building.
Constraining ozone-precursor responsiveness using ambient measurements
This study develops probabilistic estimates of ozone (O3) sensitivities to precursor emissions by incorporating uncertainties in photochemical modeling and evaluating model performance based on ground-level observations of O3 and oxides of nitrogen (NOx). Uncertainties in model form...
Corcoran, R; Rowse, G; Moore, R; Blackwood, N; Kinderman, P; Howard, R; Cummins, S; Bentall, R P
2008-11-01
A tendency to make hasty decisions on probabilistic reasoning tasks and a difficulty attributing mental states to others are key cognitive features of persecutory delusions (PDs) in the context of schizophrenia. This study examines whether these same psychological anomalies characterize PDs when they present in the context of psychotic depression. Performance on measures of probabilistic reasoning and theory of mind (ToM) was examined in five subgroups differing in diagnostic category and current illness status. A tendency to draw hasty conclusions in probabilistic settings and poor ToM, as tested in story format, feature in PDs irrespective of diagnosis. Furthermore, performance on the ToM story task correlated with the degree of distress caused by, and preoccupation with, the current PDs in the currently deluded groups. By contrast, performance on the non-verbal ToM task appears to be more sensitive to diagnosis, as patients with schizophrenia spectrum disorders perform worse on this task than those with depression, irrespective of the presence of PDs. The psychological anomalies associated with PDs examined here are transdiagnostic, but different measures of ToM may be more or less sensitive to indices of severity of the PDs, diagnosis, and trait- or state-related cognitive effects.
Risk analysis of analytical validations by probabilistic modification of FMEA.
Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J
2012-05-01
Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection, and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near-Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are reinterpreted using this probabilistic modification. With this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively: for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
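The probabilistic modification described in this entry can be sketched as follows: replacing categorical occurrence and detection scores with estimated relative frequencies lets the frequency of occurring-but-undetected failures be computed directly. The failure modes and numbers below are invented for illustration, not taken from the paper:

```python
# Hypothetical failure modes for an NIR screening procedure (illustrative
# values only): each mode carries an estimated relative frequency of
# occurrence, a probability of detection, and a categorical severity.
failure_modes = {
    "wrong reference spectrum": {"occurrence": 0.010, "detection": 0.95, "severity": 4},
    "sample mislabelling":      {"occurrence": 0.005, "detection": 0.80, "severity": 5},
    "instrument drift":         {"occurrence": 0.020, "detection": 0.99, "severity": 3},
}

# Frequency of an occurring AND undetected failure, per mode
for name, fm in failure_modes.items():
    fm["undetected"] = fm["occurrence"] * (1.0 - fm["detection"])
    print(f"{name:28s} undetected frequency = {fm['undetected']:.5f}")

# Probability that at least one failure mode occurs undetected in a run,
# assuming independence between failure modes
p_none = 1.0
for fm in failure_modes.values():
    p_none *= 1.0 - fm["undetected"]
p_any = 1.0 - p_none
print(f"P(any undetected failure) = {p_any:.5f}")
```

The per-mode `undetected` frequencies can then be combined with the retained categorical severity to prioritize corrective actions.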
Quantifying Uncertainties in the Thermo-Mechanical Properties of Particulate Reinforced Composites
NASA Technical Reports Server (NTRS)
Mital, Subodh K.; Murthy, Pappu L. N.
1999-01-01
The present paper reports results from a computational simulation of probabilistic particulate-reinforced composite behavior. The approach combines simplified micromechanics of particulate-reinforced composites with a Fast Probability Integration (FPI) technique. Sample results are presented for an Al/SiC_p (silicon carbide particles in an aluminum matrix) composite. The probability density functions for the composite moduli, thermal expansion coefficient, and thermal conductivities, along with their sensitivity factors, are computed. The effect of different assumed distributions and the effect of reducing scatter in the constituent properties on the thermal expansion coefficient are also evaluated. The variations in the constituent properties that directly affect these composite properties are accounted for by assumed probabilistic distributions. The results show that the present technique provides valuable information about the scatter in composite properties and the sensitivity factors, which is useful to test and design engineers.
Accuracy analysis and design of A3 parallel spindle head
NASA Astrophysics Data System (ADS)
Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan
2016-03-01
As functional components of machine tools, parallel mechanisms are widely used in high-efficiency machining of aviation components, and accuracy is one of their critical technical indexes. Many researchers have focused on the accuracy problem of parallel mechanisms, but in terms of controlling the errors and improving the accuracy at the design and manufacturing stage, further efforts are required. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis, and tolerance allocation are investigated. Based on the inverse kinematic analysis, the error model of the A3 head is established using first-order perturbation theory and the vector chain method. According to the mapping property of the motion and constraint Jacobian matrix, the compensatable and uncompensatable error sources that affect the accuracy of the end-effector are separated. Furthermore, sensitivity analysis is performed on the uncompensatable error sources. A sensitivity probabilistic model is established, and a global sensitivity index is proposed to analyze the influence of the uncompensatable error sources on the accuracy of the end-effector of the mechanism. The results show that orientation error sources have a greater effect on the accuracy of the end-effector. Based on the sensitivity analysis results, the tolerance design is converted into a nonlinearly constrained optimization problem with minimum manufacturing cost as the optimization objective. Using a genetic algorithm, the allocation of the tolerances on each component is finally determined. From the tolerance allocation results, the tolerance ranges of ten kinds of geometric error sources are obtained. These research achievements can provide fundamental guidelines for component manufacturing and assembly of this kind of parallel mechanism.
Development of a probabilistic analysis methodology for structural reliability estimation
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.
1991-01-01
A novel probabilistic analysis method for the assessment of structural reliability is presented, combining fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
Fernandes, Silke; Sicuri, Elisa; Kayentao, Kassoum; van Eijk, Anne Maria; Hill, Jenny; Webster, Jayne; Were, Vincent; Akazili, James; Madanitsa, Mwayi; ter Kuile, Feiko O; Hanson, Kara
2015-03-01
In 2012, WHO changed its recommendation for intermittent preventive treatment of malaria during pregnancy (IPTp) from two doses to monthly doses of sulfadoxine-pyrimethamine during the second and third trimesters, but noted the importance of a cost-effectiveness analysis to lend support to the decision of policy makers. We therefore estimated the incremental cost-effectiveness of IPTp with three or more (IPTp-SP3+) versus two doses of sulfadoxine-pyrimethamine (IPTp-SP2). For this analysis, we used data from a 2013 meta-analysis of seven studies in sub-Saharan Africa. We developed a decision tree model with a lifetime horizon. We analysed the base case from a societal perspective. We did deterministic and probabilistic sensitivity analyses with appropriate parameter ranges and distributions for settings with low, moderate, and high background risk of low birthweight, and did a separate analysis for HIV-negative women. Parameters in the model were obtained for all countries included in the original meta-analysis. We did simulations in hypothetical cohorts of 1000 pregnant women receiving either IPTp-SP3+ or IPTp-SP2. We calculated disability-adjusted life-years (DALYs) for low birthweight, severe to moderate anaemia, and clinical malaria. We calculated cost estimates from data obtained in observational studies, exit surveys, and from public procurement databases. We give financial and economic costs in constant 2012 US$. The main outcome measure was the incremental cost per DALY averted. The delivery of IPTp-SP3+ to 1000 pregnant women averted 113·4 DALYs at an incremental cost of $825·67 producing an incremental cost-effectiveness ratio (ICER) of $7·28 per DALY averted. The results remained robust in the deterministic sensitivity analysis. In the probabilistic sensitivity analyses, the ICER was $7·7 per DALY averted for moderate risk of low birthweight, $19·4 per DALY averted for low risk, and $4·0 per DALY averted for high risk. 
The ICER for HIV-negative women was $6·2 per DALY averted. Our findings lend strong support to the WHO guidelines that recommend a monthly dose of IPTp-SP from the second trimester onwards. Funding: Malaria in Pregnancy Consortium and the Bill & Melinda Gates Foundation. Copyright © 2015 Fernandes et al. Open Access article distributed under the terms of CC BY-NC-SA. All rights reserved.
Gofer-Levi, M; Silberg, T; Brezner, A; Vakil, E
2014-09-01
Children learn to engage their surroundings skillfully, adopting implicit knowledge of complex regularities and associations. Probabilistic classification learning (PCL) is a type of cognitive procedural learning in which different cues are probabilistically associated with specific outcomes. Little is known about the effects of developmental disorders on cognitive skill acquisition. Twenty-four children and adolescents with cerebral palsy (CP) were compared to 24 typically developing (TD) youth in their ability to learn probabilistic associations. Performance was examined in relation to general cognitive abilities, level of motor impairment and age. Improvement in PCL was observed for all participants, with no relation to IQ. An age effect was found only among TD children. Learning curves of children with CP on a cognitive procedural learning task differ from those of TD peers and do not appear to be age sensitive. Copyright © 2014 Elsevier Ltd. All rights reserved.
Noradrenergic modulation of risk/reward decision making.
Montes, David R; Stopper, Colin M; Floresco, Stan B
2015-08-01
Catecholamine transmission modulates numerous cognitive and reward-related processes that can subserve more complex functions such as cost/benefit decision making. Dopamine has been shown to play an integral role in decisions involving reward uncertainty, yet there is a paucity of research investigating the contributions of noradrenaline (NA) transmission to these functions. The present study was designed to elucidate the contribution of NA to risk/reward decision making in rats, assessed with a probabilistic discounting task. We examined the effects of reducing noradrenergic transmission with the α2 agonist clonidine (10-100 μg/kg), and increasing activity at α2A receptor sites with the agonist guanfacine (0.1-1 mg/kg), the α2 antagonist yohimbine (1-3 mg/kg), and the noradrenaline transporter (NET) inhibitor atomoxetine (0.3-3 mg/kg) on probabilistic discounting. Rats chose between a small/certain reward and a larger/risky reward, wherein the probability of obtaining the larger reward either decreased (100-12.5 %) or increased (12.5-100 %) over a session. In well-trained rats, clonidine reduced risky choice by decreasing reward sensitivity, whereas guanfacine did not affect choice behavior. Yohimbine impaired adjustments in decision biases as reward probability changed within a session by altering negative feedback sensitivity. In a subset of rats that displayed prominent discounting of probabilistic rewards, the lowest dose of atomoxetine increased preference for the large/risky reward when this option had greater long-term utility. These data highlight an important and previously uncharacterized role for noradrenergic transmission in mediating different aspects of risk/reward decision making and mediating reward and negative feedback sensitivity.
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)
NASA Astrophysics Data System (ADS)
Rahmani, E.; Hense, A.
2017-12-01
Are there deterministic relations between phenological and climate parameters? The answer is surely 'no'. This answer motivated us to approach the problem through probabilistic theories. We thus developed a probabilistic phenological model, which has the advantage of giving additional information in terms of uncertainty. To that end, we turned to a statistical approach named survival analysis. Survival analysis deals with death in biological organisms and failure in mechanical systems; in the survival analysis literature, death or failure is considered an event. By event, in this research, we mean the ripening date of wheat, and we assume only one event in this special case. By time, we mean the growing duration from sowing to ripening, the lifetime of wheat, which is a function of GDD. More precisely, we perform a probabilistic forecast of wheat ripening, with probability values between 0 and 1. Here, the survivor function gives the probability that not-yet-ripened wheat survives longer than a specific time, or will survive to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to the GDD as a function of growth duration. Verification of the models obtained is done using the CRPS skill score (CRPSS). The positive values of CRPSS indicate the large superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modeling is beneficial, meaningful, and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
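The survival step in this entry can be sketched with a normal fit and its survivor function, using Python's stdlib `statistics.NormalDist`; the station GDD values below are invented for illustration:

```python
from statistics import NormalDist

# Hypothetical GDD totals at ripening for one station (illustrative data)
gdd_at_ripening = [2100, 2180, 2230, 2150, 2260, 2200, 2120, 2240]

# Fit a normal distribution to the ripening GDD (sample mean and std. dev.)
mu = sum(gdd_at_ripening) / len(gdd_at_ripening)
var = sum((g - mu) ** 2 for g in gdd_at_ripening) / (len(gdd_at_ripening) - 1)
dist = NormalDist(mu, var ** 0.5)

def survival(gdd: float) -> float:
    """Survivor function: probability the crop is NOT yet ripe at this GDD."""
    return 1.0 - dist.cdf(gdd)

for g in (2000, 2190, 2400):
    print(f"S({g}) = {survival(g):.3f}")
```

The survivor function decreases monotonically with accumulated GDD, which is the probabilistic counterpart of a fixed deterministic GDD ripening threshold.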
Naci, Huseyin; de Lissovoy, Gregory; Hollenbeak, Christopher; Custer, Brian; Hofmann, Axel; McClellan, William; Gitlin, Matthew
2012-01-01
To determine whether Medicare's decision to cover routine administration of erythropoietin stimulating agents (ESAs) to treat anemia of end-stage renal disease (ESRD) has been a cost-effective policy relative to standard of care at the time. The authors used summary statistics from the actual cohort of ESRD patients receiving ESAs between 1995 and 2004 to create a simulated patient cohort, which was compared with a comparable simulated cohort assumed to rely solely on blood transfusions. Outcomes modeled from the Medicare perspective included estimated treatment costs, life-years gained, and quality-adjusted life-years (QALYs). Incremental cost-effectiveness ratio (ICER) was calculated relative to the hypothetical reference case of no ESA use in the transfusion cohort. Sensitivity of the results to model assumptions was tested using one-way and probabilistic sensitivity analyses. Estimated total costs incurred by the ESRD population were $155.47B for the cohort receiving ESAs and $155.22B for the cohort receiving routine blood transfusions. Estimated QALYs were 2.56M and 2.29M, respectively, for the two groups. The ICER of ESAs compared to routine blood transfusions was estimated as $873 per QALY gained. The model was sensitive to a number of parameters according to one-way and probabilistic sensitivity analyses. This model was counter-factual as the actual comparison group, whose anemia was managed via transfusion and iron supplements, rapidly disappeared following introduction of ESAs. In addition, a large number of model parameters were obtained from observational studies due to the lack of randomized trial evidence in the literature. This study indicates that Medicare's coverage of ESAs appears to have been cost effective based on commonly accepted levels of willingness-to-pay. The ESRD population achieved substantial clinical benefit at a reasonable cost to society.
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic FEM analysis code.
Incorporating psychological influences in probabilistic cost analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kujawski, Edouard; Alvaro, Mariana; Edwards, William
2004-01-08
Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (the MAIMS principle): cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences, such as overconfidence in assessing uncertainties and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings on human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @RISK and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the scope and magnitude of the cost-overrun problem, the benefits are likely to be significant.
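The MAIMS principle described in this entry is easy to demonstrate by Monte Carlo: if each element's realized cost can never fall below its allocated budget, the expected total exceeds the classical roll-up in which underruns offset overruns. All cost-element parameters below are assumptions for illustration, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

# Hypothetical 3-parameter Weibull cost elements (location, scale, shape)
# with an allocated budget per element; values are illustrative only.
elements = [
    {"loc": 80.0,  "scale": 30.0, "shape": 2.0, "budget": 100.0},
    {"loc": 150.0, "scale": 60.0, "shape": 1.8, "budget": 190.0},
    {"loc": 40.0,  "scale": 15.0, "shape": 2.5, "budget": 50.0},
]

ideal = np.zeros(n)   # classical roll-up: underruns offset overruns
maims = np.zeros(n)   # MAIMS: an element never costs less than its budget

for e in elements:
    cost = e["loc"] + e["scale"] * rng.weibull(e["shape"], n)
    ideal += cost
    maims += np.maximum(cost, e["budget"])

print(f"mean ideal total = {ideal.mean():8.1f}")
print(f"mean MAIMS total = {maims.mean():8.1f}")
```

The gap between the two totals is the underrun money that, under MAIMS, is spent rather than returned; correlated cost elements would widen the spread of both distributions.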
ENDANGERED AQUATIC VERTEBRATES: COMPARATIVE AND PROBABILISTIC-BASED TOXICOLOGY
It has previously been assumed that endangered, threatened, and candidate endangered species (collectively known as “listed” species) are uniquely sensitive to chemicals. The purpose of this cooperative research effort (U.S. Environmental Protection Agency, U.S. Geological Surve...
Probability and possibility-based representations of uncertainty in fault tree analysis.
Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje
2013-01-01
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
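A common possibility-to-probability transformation used in hybrid frameworks like the one this entry describes samples an alpha level uniformly, then samples uniformly within the corresponding alpha-cut; the triangular possibility distribution below (support and core values for a basic-event probability) is a hypothetical example, not taken from the article:

```python
import random

random.seed(1)

# Triangular possibility distribution pi(x): support [a, b], core c.
# Hypothetical bounds for a basic-event (limiting relative frequency)
# probability in a fault tree.
a, c, b = 0.001, 0.005, 0.02

def sample_from_possibility() -> float:
    """Possibility-to-probability transformation by uniform sampling of
    alpha-cuts: each alpha-cut of a triangular pi is the interval
    [a + alpha*(c - a), b - alpha*(b - c)]."""
    alpha = random.random()
    lo = a + alpha * (c - a)   # left endpoint of the alpha-cut
    hi = b - alpha * (b - c)   # right endpoint of the alpha-cut
    return random.uniform(lo, hi)

draws = [sample_from_possibility() for _ in range(10_000)]
print(f"mean sampled basic-event probability ~ {sum(draws) / len(draws):.4f}")
```

Feeding such draws into a standard Monte Carlo propagation through the fault tree gives the purely probabilistic counterpart of the possibilistic analysis.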
Cost-effectiveness of minimally invasive sacroiliac joint fusion.
Cher, Daniel J; Frasco, Melissa A; Arnold, Renée Jg; Polly, David W
2016-01-01
Sacroiliac joint (SIJ) disorders are common in patients with chronic lower back pain. Minimally invasive surgical options have been shown to be effective for the treatment of chronic SIJ dysfunction. To determine the cost-effectiveness of minimally invasive SIJ fusion. Data from two prospective, multicenter, clinical trials were used to inform a Markov process cost-utility model to evaluate cumulative 5-year health quality and costs after minimally invasive SIJ fusion using triangular titanium implants or non-surgical treatment. The analysis was performed from a third-party perspective. The model specifically incorporated variation in resource utilization observed in the randomized trial. Multiple one-way and probabilistic sensitivity analyses were performed. SIJ fusion was associated with a gain of approximately 0.74 quality-adjusted life years (QALYs) at a cost of US$13,313 per QALY gained. In multiple one-way sensitivity analyses all scenarios resulted in an incremental cost-effectiveness ratio (ICER) <$26,000/QALY. Probabilistic analyses showed a high degree of certainty that the maximum ICER for SIJ fusion was less than commonly selected thresholds for acceptability (mean ICER =$13,687, 95% confidence interval $5,162-$28,085). SIJ fusion provided potential cost savings per QALY gained compared to non-surgical treatment after a treatment horizon of greater than 13 years. Compared to traditional non-surgical treatments, SIJ fusion is a cost-effective, and, in the long term, cost-saving strategy for the treatment of SIJ dysfunction due to degenerative sacroiliitis or SIJ disruption.
Bernard-Arnoux, F; Lamure, M; Ducray, F; Aulagner, G; Honnorat, J; Armoiry, X
2016-08-01
There is strong concern about the costs associated with adding tumor-treating fields (TTF) therapy to standard first-line treatment for glioblastoma (GBM). Hence, we aimed to determine the cost-effectiveness of TTF therapy for the treatment of newly diagnosed patients with GBM. We developed a 3-health-state Markov model. The perspective was that of the French Health Insurance, and the horizon was lifetime. We calculated the transition probabilities from the survival parameters reported in the EF-14 trial. The main outcome measure was incremental effectiveness expressed as life-years gained (LYG). Input costs were derived from the literature. We calculated the incremental cost-effectiveness ratio (ICER) expressed as cost/LYG. We used 1-way deterministic and probabilistic sensitivity analyses to evaluate the model uncertainty. In the base-case analysis, adding TTF therapy to standard of care increased life expectancy by 4.08 months (0.34 LYG) and costs by €185,476 per patient. The ICER was €549,909/LYG. The discounted ICER was €596,411/LYG. The parameters with the most influence on the ICER were the cost of TTF therapy, followed equally by overall survival and progression-free survival in both arms. The probabilistic sensitivity analysis showed a 95% confidence interval for the ICER of €447,017/LYG to €745,805/LYG, with a 0% chance of being cost-effective at a threshold of €100,000/LYG. The ICER of TTF therapy as first-line treatment is far beyond conventional thresholds due to the prohibitive announced cost of the device. Strong price regulation by health authorities could make this technology more affordable and consequently accessible to patients.
Design optimization and probabilistic analysis of a hydrodynamic journal bearing
NASA Technical Reports Server (NTRS)
Liniecki, Alexander G.
1990-01-01
A nonlinear constrained optimization of a hydrodynamic bearing was performed, yielding three main variables: radial clearance, bearing length-to-diameter ratio, and lubricating oil viscosity. A combined model of temperature rise and oil supply was adopted as the objective function. The optimized bearing was then simulated for a population of 1000 cases using the Monte Carlo method. The so-called 'optimal solution' produced more than 50 percent failed bearings, because their minimum oil film thickness violated the stipulated minimum constraint value. After the sensitivities of several variables were investigated, a change of oil viscosity is suggested as a remedy.
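The failure mode described here, where a deterministic optimum sits on a constraint boundary so that roughly half of a simulated population violates it, can be sketched with a toy Monte Carlo. The tolerances and the film-thickness surrogate below are illustrative assumptions, not the paper's bearing model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000  # simulated bearings, as in the study

# Hypothetical manufacturing/operating scatter around a deterministic optimum.
clearance = rng.normal(50e-6, 5e-6, n)   # radial clearance, m
viscosity = rng.normal(0.025, 0.003, n)  # oil viscosity, Pa*s

# Toy surrogate for minimum film thickness: thicker with more viscosity,
# thinner with larger clearance (stands in for the full hydrodynamic model).
h_min = 0.4 * clearance * (viscosity / 0.025) ** 0.5

h_allow = 20e-6  # minimum-film-thickness constraint
failed = np.mean(h_min < h_allow)
print(f"fraction of simulated bearings violating the constraint: {failed:.1%}")
```

Because the nominal design lands exactly on the constraint (0.4 x 50e-6 m = 20e-6 m), about half the sampled bearings fail, which is the behavior the abstract reports for the deterministic optimum.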
Learning Probabilistic Logic Models from Probabilistic Examples
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2009-01-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348
Probabilistically modeling lava flows with MOLASSES
NASA Astrophysics Data System (ADS)
Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.
2017-12-01
Modeling lava flows through Cellular Automata methods enables a computationally inexpensive means to quickly forecast lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point-source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that will produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results, and MOLASSES outputs a probability map of inundation instead of a determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against the real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the Positive Predictive Value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulation space that was actually inundated by lava.
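The two goodness-of-fit statistics named in this abstract are simple ratios over observed and simulated inundation masks. A minimal sketch with a hypothetical 3x3 grid follows; real comparisons use DEM-registered flow maps, not toy arrays.

```python
import numpy as np

# Hypothetical observed vs. simulated inundation masks (illustrative data).
observed  = np.array([[1, 1, 0],
                      [1, 0, 0],
                      [0, 0, 0]], dtype=bool)
simulated = np.array([[1, 1, 1],
                      [0, 0, 0],
                      [0, 0, 0]], dtype=bool)

true_pos = np.sum(observed & simulated)  # cells both flag as inundated

sensitivity = true_pos / observed.sum()  # fraction of the real flow captured
ppv = true_pos / simulated.sum()         # fraction of the simulation that was real

print(sensitivity, ppv)
```

A simulator can score high sensitivity simply by over-predicting inundation everywhere; reporting PPV alongside it penalizes that, which is the argument the abstract makes.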
Probabilistic reversal learning is impaired in Parkinson's disease
Peterson, David A.; Elliott, Christian; Song, David D.; Makeig, Scott; Sejnowski, Terrence J.; Poizner, Howard
2009-01-01
In many everyday settings, the relationship between our choices and their potentially rewarding outcomes is probabilistic and dynamic. In addition, the difficulty of the choices can vary widely. Although a large body of theoretical and empirical evidence suggests that dopamine mediates rewarded learning, the influence of dopamine in probabilistic and dynamic rewarded learning remains unclear. We adapted a probabilistic rewarded learning task originally used to study firing rates of dopamine cells in primate substantia nigra pars compacta (Morris et al. 2006) for use as a reversal learning task with humans. We sought to investigate how the dopamine depletion in Parkinson's disease (PD) affects probabilistic reward learning and adaptation to a reversal in reward contingencies. Over the course of 256 trials, subjects learned to choose the more favorable from among pairs of images with small or large differences in reward probabilities. During a subsequent otherwise identical reversal phase, the reward probability contingencies for the stimuli were reversed. Seventeen PD patients of mild to moderate severity were studied off their dopaminergic medications and compared with 15 age-matched controls. Compared to controls, PD patients had distinct pre- and post-reversal deficiencies depending upon the difficulty of the choices they had to learn. The patients also exhibited compromised adaptability to the reversal. A computational model of the subjects’ trial-by-trial choices demonstrated that the adaptability was sensitive to the gain with which patients weighted pre-reversal feedback. Collectively, the results implicate the nigral dopaminergic system in learning to make choices in environments with probabilistic and dynamic reward contingencies. PMID:19628022
Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. A previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
Probabilistic evaluation of SSME structural components
NASA Astrophysics Data System (ADS)
Rajagopal, K. R.; Newell, J. F.; Ho, H.
1991-05-01
The application of the Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) families of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components is described. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have typically been used in past deterministic analysis studies.
Establishing Cost-Effective Allocation of Proton Therapy for Breast Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mailhot Vega, Raymond B.; Ishaq, Omar; Raldow, Ann
Purpose: Cardiac toxicity due to conventional breast radiation therapy (RT) has been extensively reported, and it affects both the life expectancy and quality of life of affected women. Given the favorable oncologic outcomes in most women irradiated for breast cancer, it is increasingly paramount to minimize treatment side effects and improve survivorship for these patients. Proton RT offers promise in limiting heart dose, but the modality is costly and access is limited. Using cost-effectiveness analysis, we provide a decision-making tool to help determine which breast cancer patients may benefit from proton RT referral. Methods and Materials: A Markov cohort model was constructed to compare the cost-effectiveness of proton versus photon RT for breast cancer management. The model was analyzed for different strata of women based on age (40 years, 50 years, and 60 years) and the presence or lack of cardiac risk factors (CRFs). Model entrants could have 1 of 3 health states: healthy, alive with coronary heart disease (CHD), or dead. Base-case analysis assumed CHD was managed medically. No difference in tumor control was assumed between arms. Probabilistic sensitivity analysis was performed to test model robustness and the influence of including catheterization as a downstream possibility within the health state of CHD. Results: Proton RT was not cost-effective in women without CRFs or a mean heart dose (MHD) <5 Gy. Base-case analysis noted cost-effectiveness for proton RT in women with ≥1 CRF at an approximate minimum MHD of 6 Gy with a willingness-to-pay threshold of $100,000/quality-adjusted life-year. For women with ≥1 CRF, probabilistic sensitivity analysis noted the preference of proton RT for an MHD ≥5 Gy with a similar willingness-to-pay threshold. Conclusions: Despite the cost of treatment, scenarios do exist whereby proton therapy is cost-effective.
Referral for proton therapy may be cost-effective for patients with ≥1 CRF in cases for which photon plans are unable to achieve an MHD <5 Gy.
Seismic Hazard Analysis — Quo vadis?
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2008-05-01
The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and insufficient understanding of modern concepts of risk analysis, as revealed in some recent large-scale studies. These deficiencies result in an inability to correctly treat dependencies between physical parameters and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by systematic use of expert elicitation has, so far, not improved the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. The assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, yet the current method is based on a probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to too conservative results, especially if applied for generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
Reliability and risk assessment of structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1991-01-01
Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.
Probabilistic Based Modeling and Simulation Assessment
2010-06-01
different crash and blast scenarios. With the integration of the high-fidelity neck and head model, a methodology to calculate the probability of injury... variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and... first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast.
Research on probabilistic information processing
NASA Technical Reports Server (NTRS)
Edwards, W.
1973-01-01
The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.
Application of Probabilistic Analysis to Aircraft Impact Dynamics
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
NASA Astrophysics Data System (ADS)
Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.
2012-12-01
Future emission reduction potential is poorly characterized, especially when uncertainty is taken into account. Mitigation measures for some economic sectors have been proposed, but few studies evaluate the amount of PM emission reduction that can be obtained in future years by different emission reduction strategies. We attribute the absence of helpful mitigation strategy analysis to limitations in the technical detail of future emission scenarios, which result in the inability to relate technological or regulatory intervention to emission changes. The purpose of this work is to provide a better understanding of the potential benefits of mitigation policies in addressing global and regional emissions. In this work, we introduce a probabilistic approach to explore the impacts of retrofit and scrappage programs on global PM emissions from on-road vehicles in the coming decades. This approach includes scenario analysis, sensitivity analysis, and Monte Carlo simulations. A dynamic model of vehicle population linked to emission characteristics, SPEW-Trend, is used to estimate future emissions and make policy evaluations. Three basic questions are answered in this work: (1) what contribution can these two programs make to reducing global emissions in the future? (2) in which regions are such programs most and least effective in reducing emissions, and what features of the vehicle fleet cause these results? (3) what is the level of confidence in the projected emission reductions, given uncertain parameters describing the dynamic vehicle fleet?
NASA Technical Reports Server (NTRS)
Sobel, Larry; Buttitta, Claudio; Suarez, James
1993-01-01
Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched IM6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.
NASA Technical Reports Server (NTRS)
Dec, John A.; Braun, Robert D.
2011-01-01
A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process, beginning with a parameter sensitivity study and followed by a deterministic analysis from which an optimum design can be determined. The design process concludes with a Monte Carlo simulation in which the probabilities of exceeding design specifications are estimated. The design methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
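The final step of the methodology described here, estimating the probability of exceeding a design specification by Monte Carlo, reduces to counting exceedances over sampled responses. A sketch with illustrative response distributions and allowables (not the Crew Exploration Vehicle compression-pad data) follows.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # Monte Carlo samples

# Hypothetical response scatter from a thermal/structural model
# (illustrative distributions only).
bondline_temp = rng.normal(520.0, 25.0, n)  # K
max_stress    = rng.normal(80.0, 8.0, n)    # MPa

temp_allow, stress_allow = 560.0, 100.0     # assumed design specifications

# Probability of exceeding each design specification.
p_temp   = np.mean(bondline_temp > temp_allow)
p_stress = np.mean(max_stress > stress_allow)
print(p_temp, p_stress)
```

In practice the sampled responses come from repeated runs of the finite element model over sampled inputs, often via a cheaper surrogate; the exceedance counting at the end is the same.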
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
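The metamodeling idea above, regressing PSA outcomes on standardized inputs so that the intercept approximates the base-case outcome and each coefficient gives a one-standard-deviation sensitivity, can be sketched as follows. The three inputs and the outcome function are hypothetical stand-ins, not the cancer cure model's.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000  # PSA cohorts

# Hypothetical standardized model inputs (illustrative names).
x = rng.standard_normal((n, 3))
names = ["p_cure", "c_treat", "u_well"]

# Toy simulation outcome: net benefit as a noisy function of the inputs.
y = 1_000 + 300 * x[:, 0] - 150 * x[:, 1] + 50 * x[:, 2] + rng.normal(0, 20, n)

# Linear regression metamodel: intercept ~ base-case outcome,
# coefficients ~ one-SD sensitivity of the outcome to each input.
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print("base case ~", round(coef[0]))
for name, b in zip(names, coef[1:]):
    print(f"{name}: {b:+.1f} per SD")
```

Ranking inputs by the absolute value of their coefficients reproduces a tornado-diagram-style ordering, but from the full PSA sample rather than one-factor-at-a-time runs.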
Probabilistic evaluation of fuselage-type composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1992-01-01
A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
Probabilistic simulation of uncertainties in composite uniaxial strengths
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Stock, T. A.
1990-01-01
Probabilistic composite micromechanics methods are developed that simulate uncertainties in unidirectional fiber composite strengths. These methods take the form of computational procedures using composite mechanics with Monte Carlo simulation. The variables for which uncertainties are accounted include constituent strengths and their respective scatter. A graphite/epoxy unidirectional composite (ply) is studied to illustrate the procedure and its effectiveness in formally estimating the probable scatter in the composite uniaxial strengths. The results show that ply longitudinal tensile and compressive, transverse compressive, and intralaminar shear strengths are not sensitive to single-fiber anomalies (breaks, interfacial disbonds, matrix microcracks); however, the ply transverse tensile strength is.
Bardach, Ariel Esteban; Garay, Osvaldo Ulises; Calderón, María; Pichón-Riviére, Andrés; Augustovski, Federico; Martí, Sebastián García; Cortiñas, Paula; Gonzalez, Marino; Naranjo, Laura T; Gomez, Jorge Alberto; Caporale, Joaquín Enzo
2017-02-02
Cervical cancer (CC) and genital warts (GW) are a significant public health issue in Venezuela. Our objective was to assess the cost-effectiveness of the two available vaccines against Human Papillomavirus (HPV), bivalent and quadrivalent, in Venezuelan girls in order to inform decision-makers. A previously published Markov cohort model, informed by the best available evidence, was adapted to the Venezuelan context to evaluate the effects of vaccination on health and healthcare costs from the perspective of the healthcare payer in a cohort of 264,489 11-year-old girls. Costs and quality-adjusted life years (QALYs) were discounted at 5%. Eight scenarios were analyzed to depict the cost-effectiveness under alternative vaccine prices, exchange rates, and dosing schemes. Deterministic and probabilistic sensitivity analyses were performed. Compared to screening only, the bivalent and quadrivalent vaccines were cost-saving in all scenarios, avoiding 2,310 and 2,143 deaths and 4,781 and 4,431 CC cases, respectively, and up to 18,459 GW cases for the quadrivalent vaccine, while gaining 4,486 and 4,395 discounted QALYs, respectively. For both vaccines, the main determinants of variation in the incremental cost-effectiveness ratio in deterministic and probabilistic sensitivity analyses were transition probabilities, vaccine and cancer-treatment costs, and the distribution of HPV 16 and 18 in CC cases. When comparing vaccines, neither was consistently more cost-effective than the other. In these comparisons, the main determinants in sensitivity analyses were GW incidence, the level of cross-protection and, for some scenarios, vaccine costs. Immunization with the bivalent or quadrivalent HPV vaccine was shown to be cost-saving or cost-effective in Venezuela, falling below the threshold of one Gross Domestic Product (GDP) per capita (104,404 VEF) per QALY gained. Deterministic and probabilistic sensitivity analyses confirmed the robustness of these results.
Economic evaluation of DNA ploidy analysis vs liquid-based cytology for cervical screening.
Nghiem, V T; Davies, K R; Beck, J R; Follen, M; MacAulay, C; Guillaud, M; Cantor, S B
2015-06-09
DNA ploidy analysis involves automated quantification of chromosomal aneuploidy, a potential marker of progression toward cervical carcinoma. We evaluated the cost-effectiveness of this method for cervical screening, comparing five ploidy strategies (using different numbers of aneuploid cells as cut points) with liquid-based Papanicolaou smear and no screening. A state-transition Markov model simulated the natural history of HPV infection and possible progression into cervical neoplasia in a cohort of 12-year-old females. The analysis evaluated cost in 2012 US$ and effectiveness in quality-adjusted life-years (QALYs) from a health-system perspective throughout a lifetime horizon in the US setting. We calculated incremental cost-effectiveness ratios (ICERs) to determine the best strategy. The robustness of optimal choices was examined in deterministic and probabilistic sensitivity analyses. In the base-case analysis, the ploidy 4 cell strategy was cost-effective, yielding an increase of 0.032 QALY and an ICER of $18,264/QALY compared to no screening. For most scenarios in the deterministic sensitivity analysis, the ploidy 4 cell strategy was the only cost-effective strategy. Cost-effectiveness acceptability curves showed that this strategy was more likely to be cost-effective than the Papanicolaou smear. Compared to the liquid-based Papanicolaou smear, screening with a DNA ploidy strategy appeared less costly and comparably effective.
Probabilistic finite elements for fracture and fatigue analysis
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.
1989-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in it is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. The performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructure such as dams, transport infrastructure, chemical plants and nuclear power plants. Beyond the design of infrastructure itself, it is often of interest to assess the efficiency of the design measures taken, which requires a method for meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed as a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required by the method are entirely the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, originally developed by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) development of bounding critical scenarios for each class based on the solution of an optimization problem, and (4) evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). Compared with traditional PSHA, the advantages of the method are (1) its flexibility, allowing different probabilistic models of earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) its mathematically consistent treatment of uncertainties, and (3) its explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
Paixão, Enny S; Harron, Katie; Andrade, Kleydson; Teixeira, Maria Glória; Fiaccone, Rosemeire L; Costa, Maria da Conceição N; Rodrigues, Laura C
2017-07-17
Due to the increasing availability of individual-level information across different electronic datasets, record linkage has become an efficient and important research tool. High quality linkage is essential for producing robust results. The objective of this study was to describe the process of preparing and linking national Brazilian datasets, and to compare the accuracy of different linkage methods for assessing the risk of stillbirth due to dengue in pregnancy. We linked mothers and stillbirths in two routinely collected datasets from Brazil for 2009-2010: for dengue in pregnancy, notifications of infectious diseases (SINAN); for stillbirths, mortality (SIM). Since there was no unique identifier, we used probabilistic linkage based on maternal name, age and municipality. We compared two probabilistic approaches, each with two thresholds: 1) a bespoke linkage algorithm; 2) standard linkage software widely used in Brazil (ReclinkIII); manual review was used to identify further links. Sensitivity and positive predictive value (PPV) were estimated using a subset of gold-standard data created through manual review. We examined the characteristics of false-matches and missed-matches to identify any sources of bias. From records of 678,999 dengue cases and 62,373 stillbirths, the gold-standard linkage identified 191 cases. The bespoke linkage algorithm with a conservative threshold produced 131 links, with sensitivity = 64.4% (68 missed-matches) and PPV = 92.5% (8 false-matches). Manual review of uncertain links identified an additional 37 links, increasing sensitivity to 83.7%. The bespoke algorithm with a relaxed threshold identified 132 true matches (sensitivity = 69.1%), but introduced 61 false-matches (PPV = 68.4%). ReclinkIII produced lower sensitivity and PPV than the bespoke linkage algorithm. Linkage error was not associated with any recorded study variables.
Despite a lack of unique identifiers for linking mothers and stillbirths, we demonstrate a high standard of linkage of large routine databases from a middle income country. Probabilistic linkage and manual review were essential for accurately identifying cases for a case-control study, but this approach may not be feasible for larger databases or for linkage of more common outcomes.
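The sensitivity and PPV figures above come from comparing a candidate set of linked record pairs against the gold-standard set. A hedged sketch of that bookkeeping (the pair IDs are invented, and `linkage_accuracy` is illustrative, not from any linkage package):

```python
def linkage_accuracy(candidate_links, gold_links):
    """Sensitivity and PPV of a candidate linkage against gold-standard pairs."""
    candidate, gold = set(candidate_links), set(gold_links)
    true_matches = candidate & gold
    sensitivity = len(true_matches) / len(gold)       # true links found
    ppv = len(true_matches) / len(candidate)          # claimed links correct
    return sensitivity, ppv

# Toy (mother_id, stillbirth_id) pairs; all IDs are invented for illustration.
gold = [(1, 10), (2, 20), (3, 30), (4, 40)]
found = [(1, 10), (2, 20), (5, 50)]
sens, ppv = linkage_accuracy(found, gold)   # 2 of 4 found; 2 of 3 claims correct
```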
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.
NASA Astrophysics Data System (ADS)
Peng, Chi; Cai, Yimin; Wang, Tieyu; Xiao, Rongbo; Chen, Weiping
2016-11-01
In this study, we proposed a Regional Probabilistic Risk Assessment (RPRA) to estimate the health risks of exposing residents to heavy metals in different environmental media and land uses. The means and ranges of heavy metal concentrations were measured in water, sediments, soil profiles and surface soils under four land uses along the Shunde Waterway, a drinking water supply area in China. Hazard quotients (HQs) were estimated for various exposure routes and heavy metal species. Riverbank vegetable plots and private vegetable plots had 95th percentiles of total HQs greater than 3 and 1, respectively, indicating high risks from cultivation on the flooded riverbank. Vegetable uptake and leaching to groundwater were the two transfer routes of soil metals causing high health risks. Exposure during outdoor recreation, farming and swimming along the Shunde Waterway is theoretically safe. Arsenic and cadmium were identified as the priority pollutants contributing the most risk among the heavy metals. Sensitivity analysis showed that the exposure route, variations in exposure parameters, mobility of heavy metals in soil, and metal concentrations all influenced the risk estimates.
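A hazard quotient divides an estimated daily dose by a reference dose, and a probabilistic assessment of this kind examines percentiles of the total HQ across Monte Carlo draws. A toy sketch of that idea; every distribution and reference dose below is invented for illustration, not taken from the study:

```python
import random

def total_hq_p95(n=10000, seed=0):
    """Monte Carlo 95th percentile of a total hazard quotient (HQ) summed over
    two exposure routes. HQ = daily dose / reference dose; all parameter values
    here are hypothetical."""
    rng = random.Random(seed)
    rfd_oral, rfd_dermal = 3e-4, 1e-4                    # assumed reference doses
    totals = []
    for _ in range(n):
        conc = rng.lognormvariate(0.0, 0.5)              # metal level, mg/kg
        dose_oral = conc * rng.uniform(5e-7, 2e-6)       # ingestion dose factor
        dose_dermal = conc * rng.uniform(1e-7, 5e-7)     # dermal dose factor
        totals.append(dose_oral / rfd_oral + dose_dermal / rfd_dermal)
    totals.sort()
    return totals[int(0.95 * n)]

p95 = total_hq_p95()   # a 95th-percentile HQ above 1 would flag potential risk
```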
Impact of uncertainties in free stream conditions on the aerodynamics of a rectangular cylinder
NASA Astrophysics Data System (ADS)
Mariotti, Alessandro; Shoeibi Omrani, Pejman; Witteveen, Jeroen; Salvetti, Maria Vittoria
2015-11-01
The BARC benchmark deals with the flow around a rectangular cylinder with a chord-to-depth ratio equal to 5. This flow configuration is of practical interest for civil and industrial structures and is characterized by massively separated flow and unsteadiness. In a recent review of BARC results, significant dispersion was observed in both experimental and numerical predictions of some flow quantities, which are extremely sensitive to the various uncertainties that may be present in experiments and simulations. Besides modeling and numerical errors, it is difficult in simulations to exactly reproduce the experimental conditions because of uncertainties in the set-up parameters, which sometimes cannot be exactly controlled or characterized. Probabilistic methods and URANS simulations are used to investigate the impact of uncertainties in the following set-up parameters: the angle of incidence and the free stream longitudinal turbulence intensity and length scale. Stochastic collocation is employed to perform the probabilistic propagation of the uncertainty. The discretization and modeling errors are estimated by repeating the same analysis for different grids and turbulence models. The results obtained for different assumed PDFs of the set-up parameters are also compared.
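Stochastic collocation propagates an input uncertainty by evaluating the deterministic solver at a small set of quadrature nodes and forming weighted statistics of the outputs. A minimal sketch for one Gaussian-distributed set-up parameter using probabilists' Gauss-Hermite nodes; the quadratic surrogate stands in for a URANS run and is purely illustrative:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def collocate(model, mu, sigma, n_nodes=5):
    """Propagate one Gaussian uncertain parameter through `model` by stochastic
    collocation; returns the mean and variance of the model output."""
    x, w = hermegauss(n_nodes)       # nodes/weights for weight exp(-x^2/2)
    w = w / w.sum()                  # normalize weights to a probability measure
    y = np.array([model(mu + sigma * xi) for xi in x])
    mean = np.dot(w, y)
    var = np.dot(w, (y - mean) ** 2)
    return mean, var

# Toy quadratic surrogate for a force coefficient vs. angle of incidence (assumed):
mean, var = collocate(lambda a: 1.0 + 0.1 * a + 0.05 * a**2, mu=0.0, sigma=1.0)
```

Five nodes integrate polynomials up to degree nine exactly, so this quadratic surrogate's statistics (mean 1.05, variance 0.015) are recovered exactly.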
Sensitivity to Uncertainty in Asteroid Impact Risk Assessment
NASA Astrophysics Data System (ADS)
Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.
2015-12-01
The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio
2016-11-01
The study focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models in which a significant number of parameters and complex exposure scenarios may be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty onto the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty was extensively addressed in the distribution functions describing the input data, and its effect on model results was examined by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated that MERLIN-Expo can be successfully employed in integrated, high-tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
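Of the sensitivity techniques mentioned, regression analysis is the simplest: standardized regression coefficients rank inputs by their linear influence on the Monte Carlo output. A sketch with synthetic data, where the three columns only loosely stand in for parameters such as half-life, body weight and lipid fraction:

```python
import numpy as np

def src_sensitivity(X, y):
    """Standardized regression coefficients (SRCs): rank Monte Carlo inputs by
    their linear influence on the model output."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)     # z-score inputs
    ys = (y - y.mean()) / y.std()                 # z-score output
    coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return coef

# Synthetic Monte Carlo sample: the first input dominates the output by design.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=2000)
src = src_sensitivity(X, y)   # |src[0]| should be largest, |src[2]| near zero
```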
Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce
2018-03-01
The levels, sources and quantitative probabilistic health risks of polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g⁻¹. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic risk assessments for both carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% probability of exceeding the acceptable threshold value (10⁻⁶), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of TCR, while inhalation intake is negligible. The three PAHs contributing most to TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
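The 45% exceedance result is the kind of output a Monte Carlo risk simulation produces directly: the fraction of draws in which TCR passes the 10⁻⁶ threshold. A hedged sketch whose dose distributions are entirely invented (only the 7.3 BaP-equivalent slope factor is a commonly used value); the inhalation route is deliberately made negligible, echoing the study's finding:

```python
import random

def prob_tcr_exceeds(threshold=1e-6, n=20000, seed=42):
    """Monte Carlo estimate of the probability that total carcinogenic risk
    (TCR = slope factor * summed dose) exceeds a threshold."""
    rng = random.Random(seed)
    slope = 7.3                                   # BaP-equivalent slope factor
    hits = 0
    for _ in range(n):
        ef = rng.uniform(180, 350)                # exposure frequency, days/yr
        dose_oral = rng.lognormvariate(-21.4, 1.0) * ef    # mg/kg/day (assumed)
        dose_inhal = rng.lognormvariate(-25.0, 1.0) * ef   # much smaller route
        hits += slope * (dose_oral + dose_inhal) > threshold
    return hits / n

p = prob_tcr_exceeds()   # comparable in spirit to the study's ~45% exceedance
```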
Sensitivity and specificity of univariate MRI analysis of experimentally degraded cartilage
Lin, Ping-Chang; Reiter, David A.; Spencer, Richard G.
2010-01-01
MRI is increasingly used to evaluate cartilage in tissue constructs, explants, and animal and patient studies. However, while mean values of MR parameters, including T1, T2, magnetization transfer rate k_m, apparent diffusion coefficient ADC, and the dGEMRIC-derived fixed charge density, correlate with tissue status, the ability to classify tissue according to these parameters has not been explored. Therefore, the sensitivity and specificity with which each of these parameters was able to distinguish between normal and trypsin-degraded, and between normal and collagenase-degraded, cartilage explants were determined. Initial analysis was performed using a training set to determine simple group means, to which parameters obtained from a validation set were compared. T1 and ADC showed the greatest ability to discriminate between normal and degraded cartilage. Further analysis with k-means clustering, which eliminates the need for a priori identification of sample status, generally performed comparably. Use of fuzzy c-means (FCM) clustering to define centroids likewise did not improve discrimination. Finally, an FCM clustering approach was implemented in which validation samples were assigned in a probabilistic fashion to control and degraded groups, reflecting the range of tissue characteristics seen with cartilage degradation. PMID:19705467
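k-means clustering assigns each sample to the nearest of k centroids without requiring a priori labels, which is why it suits the unlabeled-classification setting above. A self-contained 1-D sketch with two clusters, using invented T1-like values rather than the study's data:

```python
def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means, e.g. separating normal from degraded cartilage
    by a single MR parameter such as T1 (all values here are hypothetical)."""
    lo, hi = min(values), max(values)
    c = [lo, hi]                                  # initialize at the extremes
    for _ in range(iters):
        groups = [[], []]
        for v in values:
            groups[abs(v - c[0]) > abs(v - c[1])].append(v)   # nearest centroid
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c

normal = [900, 920, 880, 910]        # invented T1 values (ms) for controls
degraded = [1100, 1150, 1080, 1120]  # invented T1 values for degraded samples
centroids = kmeans_1d(normal + degraded)
# Classify a held-out sample by its nearest centroid (True -> degraded cluster):
label = abs(1090 - centroids[0]) > abs(1090 - centroids[1])
```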
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve design process. The specific objectives are: 1. To learn and understand the Probabilistic design analysis using NESSUS. 2. To assign Design Projects to either undergraduate or graduate students on the application of NESSUS. 3. To integrate the application of NESSUS into some selected senior level courses in Civil and Mechanical Engineering curricula. 4. To develop courseware in Probabilistic Design methodology to be included in a graduate level Design Methodology course. 5. To study the relationship between the Probabilistic design methodology and Axiomatic design methodology.
Probabilistic structural analysis to quantify uncertainties associated with turbopump blades
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.
1988-01-01
A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades to evaluate the tolerance limits on the design. A methodology based on probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.
Shih, Ya-Chen Tina; Chien, Chun-Ru; Moguel, Rocio; Hernandez, Mike; Hajek, Richard A; Jones, Lovell A
2016-04-01
To assess the cost-effectiveness of implementing a patient navigation (PN) program with capitated payment for Medicare beneficiaries diagnosed with lung cancer. Cost-effectiveness analysis. A Markov model to capture the disease progression of lung cancer and characterize the clinical benefits of PN services as timeliness of treatment and care coordination. Taking a payer's perspective, we estimated the lifetime costs, life years (LYs), and quality-adjusted life years (QALYs) and addressed uncertainties in one-way and probabilistic sensitivity analyses. Model inputs were extracted from the literature, supplemented with data from a Centers for Medicare and Medicaid Services demonstration project. Compared to usual care, PN services incurred higher costs but also yielded better outcomes. The incremental cost and effectiveness were $9,145 and 0.47 QALYs, respectively, resulting in an incremental cost-effectiveness ratio of $19,312/QALY. One-way sensitivity analysis indicated that findings were most sensitive to a parameter capturing the PN survival benefit for local-stage patients. The cost-effectiveness acceptability curve showed that the probability of the PN program being cost-effective was 0.80 and 0.91 at societal willingness-to-pay thresholds of $50,000/QALY and $100,000/QALY, respectively. Instituting a capitated PN program is cost-effective for lung cancer patients in Medicare. Future research should evaluate whether the same conclusion holds in other cancers. © Health Research and Educational Trust.
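A cost-effectiveness acceptability curve is computed from probabilistic sensitivity analysis draws: at each willingness-to-pay value it is the fraction of draws with positive incremental net monetary benefit. A sketch using a small invented grid of draws around the reported $9,145 and 0.47 QALY point estimates (the grid is illustrative, not the study's PSA output):

```python
def acceptability_curve(psa_samples, wtp_values):
    """Cost-effectiveness acceptability curve: for each willingness-to-pay,
    the fraction of PSA draws where net monetary benefit (wtp*dQALY - dCost)
    favors the program over usual care."""
    curve = {}
    for wtp in wtp_values:
        favorable = sum(wtp * dq - dc > 0 for dc, dq in psa_samples)
        curve[wtp] = favorable / len(psa_samples)
    return curve

# Hypothetical PSA draws: (delta_cost, delta_qaly) of PN vs usual care.
samples = [(9145 + 1500 * s, 0.47 + 0.2 * t)
           for s in (-1, 0, 1) for t in (-1, 0, 1)]
ceac = acceptability_curve(samples, wtp_values=[20000, 50000, 100000])
```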
Probabilistic structural analysis of space propulsion system LOX post
NASA Technical Reports Server (NTRS)
Newell, J. F.; Rajagopal, K. R.; Ho, H. W.; Cunniff, J. M.
1990-01-01
The probabilistic structural analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is applied to characterize the dynamic loading and response of the Space Shuttle main engine (SSME) LOX post. The design and operation of the SSME are reviewed; the LOX post structure is described; and particular attention is given to the generation of composite load spectra, the finite-element model of the LOX post, and the steps in the NESSUS structural analysis. The results are presented in extensive tables and graphs, and it is shown that NESSUS correctly predicts the structural effects of changes in the temperature loading. The probabilistic approach also facilitates (1) damage assessments for a given failure model (based on gas temperature, heat-shield gap, and material properties) and (2) correlation of the gas temperature with operational parameters such as engine thrust.
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
Costing the satellite power system
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1978-01-01
The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, costing estimating relationships, grass roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
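A Monte Carlo cost-risk analysis replaces single-point component costs with distributions and reports the resulting spread rather than one deterministic total. A sketch using triangular distributions, a common choice when only low/most-likely/high estimates exist; every dollar figure below is invented:

```python
import random

def cost_risk(n=10000, seed=7):
    """Monte Carlo cost-risk sketch: total program cost as a sum of uncertain
    components; returns the mean and a 90% interval (all figures hypothetical)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        transport = rng.triangular(4.0, 9.0, 6.0)   # $B: low, high, most likely
        satellite = rng.triangular(3.0, 8.0, 5.0)
        ground = rng.triangular(1.0, 3.0, 1.5)
        totals.append(transport + satellite + ground)
    totals.sort()
    mean = sum(totals) / n
    return mean, (totals[int(0.05 * n)], totals[int(0.95 * n)])

mean, (low, high) = cost_risk()   # decision-makers see a range, not a point
```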
76 FR 28102 - Notice of Issuance of Regulatory Guide
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-13
..., Probabilistic Risk Assessment Branch, Division of Risk Analysis, Office of Nuclear Regulatory Research, U.S... approaches and methods (whether quantitative or qualitative, deterministic or probabilistic), data, and... uses in evaluating specific problems or postulated accidents, and data that the staff needs in its...
Probability from a Socio-Cultural Perspective
ERIC Educational Resources Information Center
Sharma, Sashi
2016-01-01
There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…
Probabilistic Exposure Analysis for Chemical Risk Characterization
Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.
2009-01-01
This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
Data analysis using scale-space filtering and Bayesian probabilistic reasoning
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter
1991-01-01
This paper describes a program for analysis of output curves from a Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from a DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the mineral in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlation to infer qualitative features outperforms a domain-independent algorithm that does not.
Osterhoff, Georg; O'Hara, Nathan N; D'Cruz, Jennifer; Sprague, Sheila A; Bansback, Nick; Evaniew, Nathan; Slobogean, Gerard P
2017-03-01
There is ongoing debate regarding the optimal surgical treatment of complex proximal humeral fractures in elderly patients. To evaluate the cost-effectiveness of reverse total shoulder arthroplasty (RTSA) compared with hemiarthroplasty (HA) in the management of complex proximal humeral fractures, using a cost-utility analysis. On the basis of data from published literature, a cost-utility analysis was conducted using decision tree and Markov modeling. A single-payer perspective, with a willingness-to-pay (WTP) threshold of Can$50,000 (Canadian dollars), and a lifetime time horizon were used. The incremental cost-effectiveness ratio (ICER) was used as the study's primary outcome measure. In comparison with HA, the incremental cost per quality-adjusted life-year gained for RTSA was Can$13,679. One-way sensitivity analysis revealed the model to be sensitive to the RTSA implant cost and the RTSA procedural cost. The ICER of Can$13,679 is well below the WTP threshold of Can$50,000, and probabilistic sensitivity analysis demonstrated that 92.6% of model simulations favored RTSA. Our economic analysis found that RTSA for the treatment of complex proximal humeral fractures in the elderly is the preferred economic strategy when compared with HA. The ICER of RTSA is well below standard WTP thresholds, and its estimate of cost-effectiveness is similar to other highly successful orthopedic strategies such as total hip arthroplasty for the treatment of hip arthritis. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Dopamine neurons learn relative chosen value from probabilistic rewards
Lak, Armin; Stauffer, William R; Schultz, Wolfram
2016-01-01
Economic theories posit reward probability as one of the factors defining reward value. Individuals learn the value of cues that predict probabilistic rewards from experienced reward frequencies. Building on the notion that responses of dopamine neurons increase with reward probability and expected value, we asked how dopamine neurons in monkeys acquire this value signal that may represent an economic decision variable. We found in a Pavlovian learning task that reward probability-dependent value signals arose from experienced reward frequencies. We then assessed neuronal response acquisition during choices among probabilistic rewards. Here, dopamine responses became sensitive to the value of both chosen and unchosen options. Both experiments also showed novelty responses of dopamine neurons that decreased as learning advanced. These results show that dopamine neurons acquire predictive value signals from the frequency of experienced rewards. This flexible and fast signal reflects a specific decision variable and could update neuronal decision mechanisms. DOI: http://dx.doi.org/10.7554/eLife.18044.001 PMID:27787196
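The finding that value signals track experienced reward frequency is captured by a simple delta-rule model, in which a prediction error updates the cue value on each trial. A sketch of that standard model (the parameters are illustrative; this is not the paper's analysis code):

```python
import random

def learn_value(reward_prob, trials=2000, alpha=0.1, seed=3):
    """Delta-rule sketch: the learned value converges toward the experienced
    reward frequency, as dopamine prediction-error accounts suggest."""
    rng = random.Random(seed)
    v = 0.0
    for _ in range(trials):
        r = 1.0 if rng.random() < reward_prob else 0.0   # probabilistic reward
        v += alpha * (r - v)                             # prediction-error update
    return v

v = learn_value(0.75)   # settles near the 0.75 reward probability
```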
Probabilistic Assessment of Fracture Progression in Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank
1999-01-01
This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically, during progressive fracture. The effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation. The sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes starting from damage initiation to unstable propagation and to global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Boyce, Lola
1995-01-01
The development of a methodology for probabilistic material strength degradation is described. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep and thermal fatigue. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing predictions of high-cycle mechanical fatigue and high temperature effects with experiments are presented. Results from this limited verification study strongly support the representation of material degradation by randomized multifactor interaction models.
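A multifactor equation of this kind multiplies together one term per effect, each of the form ((ultimate - current)/(ultimate - reference)) raised to an empirical exponent. A hedged sketch of how randomizing the current effect values yields a distribution of lifetime strength ratios; the functional form follows the multifactor interaction idea, but every number below is invented and this is not PROMISS itself:

```python
import random

def strength_ratio(effects):
    """Multifactor interaction sketch: each effect scales lifetime strength by
    ((ultimate - current)/(ultimate - reference)) ** exponent."""
    ratio = 1.0
    for ultimate, reference, current, exponent in effects:
        ratio *= ((ultimate - current) / (ultimate - reference)) ** exponent
    return ratio

rng = random.Random(5)
samples = sorted(
    strength_ratio([
        (1600.0, 20.0, rng.uniform(700.0, 900.0), 0.5),   # temperature effect
        (1e7, 0.0, rng.uniform(1e5, 1e6), 0.25),          # fatigue-cycle effect
    ])
    for _ in range(5000)
)
median_ratio = samples[2500]   # sorted samples form the CDF of strength ratio
```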
Optimization of Adaptive Intraply Hybrid Fiber Composites with Reliability Considerations
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1994-01-01
The reliability with bounded distribution parameters (mean, standard deviation) was maximized and the reliability-based cost was minimized for adaptive intra-ply hybrid fiber composites by using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties including those in constituent material properties, fabrication variables, structure geometry, and control-related parameters. Probabilistic sensitivity factors were computed and used in the optimization procedures. For actuated change in the angle of attack of an airfoil-like composite shell structure with an adaptive torque plate, the reliability was maximized to 0.9999 probability, with constraints on the mean and standard deviation of the actuation material volume ratio (percentage of actuation composite material in a ply) and the actuation strain coefficient. The reliability-based cost was minimized for an airfoil-like composite shell structure with an adaptive skin and a mean actuation material volume ratio as the design parameter. At a 0.9 mean actuation material volume ratio, the minimum cost was obtained.
Wali, Arvin R; Park, Charlie C; Santiago-Dieppa, David R; Vaida, Florin; Murphy, James D; Khalessi, Alexander A
2017-06-01
OBJECTIVE Rupture of large or giant intracranial aneurysms leads to significant morbidity, mortality, and health care costs. Both coiling and the Pipeline embolization device (PED) have been shown to be safe and clinically effective for the treatment of unruptured large and giant intracranial aneurysms; however, the relative cost-to-outcome ratio is unknown. The authors present the first cost-effectiveness analysis to compare the economic impact of the PED with that of coiling or no treatment for the endovascular management of large or giant intracranial aneurysms. METHODS A Markov model was constructed to simulate a 60-year-old woman with a large or giant intracranial aneurysm considering a PED, endovascular coiling, or no treatment in terms of neurological outcome, angiographic outcome, retreatment rates, procedural and rehabilitation costs, and rupture rates. Transition probabilities were derived from prior literature reporting outcomes and costs of PED, coiling, and no treatment for the management of aneurysms. Cost-effectiveness was assessed using incremental cost-effectiveness ratios (ICERs), defined as the difference in costs divided by the difference in quality-adjusted life years (QALYs); ICERs < $50,000/QALY gained were considered cost-effective. To study parameter uncertainty, 1-way, 2-way, and probabilistic sensitivity analyses were performed. RESULTS The base-case model demonstrated lifetime QALYs of 12.72 for patients in the PED cohort, 12.89 for the endovascular coiling cohort, and 9.7 for patients in the no-treatment cohort. Lifetime rehabilitation and treatment costs were $59,837.52 for PED; $79,025.42 for endovascular coiling; and $193,531.29 in the no-treatment cohort. Patients who did not undergo elective treatment were subject to increased rates of aneurysm rupture and high treatment and rehabilitation costs.
One-way sensitivity analysis demonstrated that the model was most sensitive to assumptions about the costs and mortality risks for PED and coiling. Probabilistic sampling demonstrated that PED was the cost-effective strategy in 58.4% of iterations, coiling was the cost-effective strategy in 41.4% of iterations, and the no-treatment option was the cost-effective strategy in only 0.2% of iterations. CONCLUSIONS The authors' cost-effectiveness model demonstrated that elective endovascular techniques such as PED and endovascular coiling are cost-effective strategies for improving health outcomes and lifetime quality of life measures in patients with large or giant unruptured intracranial aneurysms.
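The base-case arithmetic reported above reduces to the standard ICER formula. A minimal sketch, where the costs and QALYs are quoted from the abstract and the dominance interpretation is the usual convention, not a claim about the authors' actual model code:

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of strategy A versus B:
    (cost_A - cost_B) / (QALY_A - QALY_B)."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Base-case values quoted in the abstract above: (lifetime cost, lifetime QALYs).
ped = (59_837.52, 12.72)
coiling = (79_025.42, 12.89)
no_treatment = (193_531.29, 9.70)

# PED versus no treatment: PED costs less AND yields more QALYs,
# so it dominates; the negative ICER reflects that dominance.
print(icer(*ped, *no_treatment))
```

The same function applied to coiling versus PED gives the extra cost per extra QALY of coiling, which is then compared against the $50,000/QALY threshold.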
Chowdhury, Enayet K.; Ademi, Zanfina; Moss, John R.; Wing, Lindon M.H.; Reid, Christopher M.
2015-01-01
Abstract The objective of this study was to examine the cost-effectiveness of angiotensin-converting enzyme inhibitor (ACEI)-based treatment compared with thiazide diuretic-based treatment for hypertension in elderly Australians considering diabetes as an outcome along with cardiovascular outcomes from the Australian government's perspective. We used a cost–utility analysis to estimate the incremental cost-effectiveness ratio (ICER) per quality-adjusted life-year (QALY) gained. Data on cardiovascular events and new onset of diabetes were used from the Second Australian National Blood Pressure Study, a randomized clinical trial comparing diuretic-based (hydrochlorothiazide) versus ACEI-based (enalapril) treatment in 6083 elderly (age ≥65 years) hypertensive patients over a median 4.1-year period. For this economic analysis, the total study population was stratified into 2 groups. Group A was restricted to participants diabetes free at baseline (n = 5642); group B was restricted to participants with preexisting diabetes mellitus (type 1 or type 2) at baseline (n = 441). Data on utility scores for different events were used from available published literatures; whereas, treatment and adverse event management costs were calculated from direct health care costs available from Australian government reimbursement data. Costs and QALYs were discounted at 5% per annum. One-way and probabilistic sensitivity analyses were performed to assess the uncertainty around utilities and cost data. After a treatment period of 5 years, for group A, the ICER was Australian dollars (AUD) 27,698 (€ 18,004; AUD 1–€ 0.65) per QALY gained comparing ACEI-based treatment with diuretic-based treatment (sensitive to the utility value for new-onset diabetes). In group B, ACEI-based treatment was a dominant strategy (both more effective and cost-saving). 
On probabilistic sensitivity analysis, the ICERs per QALY gained were always below AUD 50,000 for group B; whereas for group A, the probability of being below AUD 50,000 was 85%. Although the dispensed price of diuretic-based treatment of hypertension in the elderly is lower, upon considering the potential enhanced likelihood of the development of diabetes in addition to the costs of treating cardiovascular disease, ACEI-based treatment may be a more cost-effective strategy in this population. PMID:25738481
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system, and computer readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm and supports all common PRA analysis functions, including cut sets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
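The fault-tree machinery that DEFT builds on can be illustrated for the static case. This sketch computes a top-event probability from minimal cut sets by inclusion-exclusion, assuming independent basic events; the dynamic gates that DEFT adds (spares, sequence dependencies) require Markov or algebraic solutions and are not shown:

```python
from itertools import combinations

def top_event_prob(cut_sets, p):
    """Probability of the top event as the union of minimal cut sets,
    computed by inclusion-exclusion under independence of basic events.
    Static fault trees only; a toy, not the DEFT algorithm itself."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            events = set().union(*combo)   # union of the chosen cut sets
            term = 1.0
            for e in events:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

# Hypothetical basic-event probabilities; top event = (A and B) or C.
p = {"A": 0.01, "B": 0.02, "C": 0.05}
print(top_event_prob([{"A", "B"}, {"C"}], p))
```

In an event tree, a value like this would sit at a pivot node as the branch-failure probability; DEFT's contribution is letting that pivot be a dynamic fault tree rather than a static one.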
Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew
2018-01-01
This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4. PMID:29434562
Probabilistic liquefaction triggering based on the cone penetration test
Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.
2005-01-01
Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.
Risk-Based Probabilistic Approach to Aeropropulsion System Assessment
NASA Technical Reports Server (NTRS)
Tong, Michael T.
2002-01-01
In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. 
The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines) with the fast probability integration technique (FPI). FPI was developed by Southwest Research Institute under contract with the NASA Glenn Research Center. The results were plotted in the form of cumulative distribution functions and sensitivity analyses and were compared with results from the traditional deterministic approach. The comparison showed that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system. The current work addressed the application of the probabilistic approach to assess specific fuel consumption, engine thrust, and weight. Similarly, the approach can be used to assess other aspects of aeropropulsion system performance, such as cost, acoustic noise, and emissions. Additional information is included in the original extended abstract.
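As a stand-in for the FPI-driven propagation described above, a plain Monte Carlo sketch produces the same kinds of outputs: response scatter and input sensitivities. The toy response model and all numbers are illustrative assumptions, not NEPP/WATE quantities:

```python
import math
import random

def simulate(n=20_000, seed=1):
    """Propagate uncertain component efficiencies into a toy response
    (a stand-in 'specific fuel consumption'); purely illustrative."""
    rng = random.Random(seed)
    eff_comp, eff_turb, sfc = [], [], []
    for _ in range(n):
        ec = rng.gauss(0.86, 0.01)   # compressor efficiency (assumed)
        et = rng.gauss(0.90, 0.01)   # turbine efficiency (assumed)
        eff_comp.append(ec)
        eff_turb.append(et)
        sfc.append(1.0 / (ec * et))  # toy model: lower efficiency -> higher SFC
    return eff_comp, eff_turb, sfc

def pearson(x, y):
    """Sample correlation, used here as a simple sensitivity measure."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

ec, et, sfc = simulate()
# Both inputs correlate negatively with SFC; sorting the samples of
# `sfc` would give the cumulative distribution function of the response.
print(pearson(ec, sfc), pearson(et, sfc))
```

The appeal of FPI over this brute-force approach is that it reaches comparable tail probabilities with far fewer response evaluations.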
NASA Astrophysics Data System (ADS)
Sanchez, J.
2018-06-01
This paper presents the application and analysis of the asymptotic approximation method for single degree-of-freedom systems. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
NASA Technical Reports Server (NTRS)
Abdi, Frank
1996-01-01
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software 'GENOA' is dedicated to parallel and high speed analysis to perform probabilistic evaluation of high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) Utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high temperature composite affordable; (2) Computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) Implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism, and increasing convergence rates through high- and low-level processor assignment; (4) Creating the framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid, and distributed-workstation computers; and (5) Market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; none,; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C#, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
Fisher, Mark; Walker, Andrew; Falqués, Meritxell; Moya, Miguel; Rance, Mark; Taylor, Douglas; Lindner, Leandro
2016-12-01
Presently, linaclotide is the only EMA-approved therapy indicated for the treatment of irritable bowel syndrome with constipation (IBS-C). This study sought to determine the cost-effectiveness of linaclotide compared to antidepressants for the treatment of adults with moderate to severe IBS-C who have previously received antispasmodics and/or laxatives. A Markov model was created to estimate costs and QALYs over a 5-year time horizon from the perspective of NHS Scotland. Health states were based on treatment satisfaction (satisfied, moderately satisfied, not satisfied) and mortality. Transition probabilities were based on satisfaction data from the linaclotide pivotal studies and Scottish general all-cause mortality statistics. Treatment costs were calculated from the British National Formulary. NHS resource use and disease-related costs for each health state were estimated from Scottish clinician interviews in combination with NHS Reference costs. Quality of life was based on EQ-5D data collected from the pivotal studies. Costs and QALYs were discounted at 3.5 % per annum. Uncertainty was explored through extensive deterministic and probabilistic sensitivity analyses. Over a 5-year time horizon, the additional costs and QALYs generated with linaclotide were £659 and 0.089, resulting in an incremental cost-effectiveness ratio of £7370 per QALY versus antidepressants. Based on the probabilistic sensitivity analysis, the likelihood that linaclotide was cost-effective at a willingness to pay of £20,000 per QALY was 73 %. Linaclotide can be a cost-effective treatment for adults with moderate-to-severe IBS-C who have previously received antispasmodics and/or laxatives in Scotland.
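The 3.5% per-annum discounting applied to the costs and QALYs above can be sketched in a few lines. The flat 100-per-year stream is a made-up example, not study data:

```python
def discounted_total(annual_values, rate=0.035):
    """Present value of a stream of annual costs or QALYs, discounted at
    `rate` per annum (3.5% as in the analysis above), with year 0
    taken undiscounted -- one common convention among several."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

# A flat 100-per-year stream over the 5-year horizon (illustrative only).
total = discounted_total([100.0] * 5)
print(round(total, 2))  # less than the undiscounted 500
```

Applying the same function to per-state cost and utility streams, weighted by the Markov cohort occupancy in each cycle, yields the discounted totals from which the £7370-per-QALY ICER is computed.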
Pan, Yuesong; Wang, Anxin; Liu, Gaifen; Zhao, Xingquan; Meng, Xia; Zhao, Kun; Liu, Liping; Wang, Chunxue; Johnston, S. Claiborne; Wang, Yilong; Wang, Yongjun
2014-01-01
Background Treatment with the combination of clopidogrel and aspirin taken soon after a transient ischemic attack (TIA) or minor stroke was shown to reduce the 90-day risk of stroke in a large trial in China, but the cost-effectiveness is unknown. This study sought to estimate the cost-effectiveness of the clopidogrel-aspirin regimen for acute TIA or minor stroke. Methods and Results A Markov model was created to determine the cost-effectiveness of treatment of acute TIA or minor stroke patients with clopidogrel-aspirin compared with aspirin alone. Inputs for the model were obtained from clinical trial data, claims databases, and the published literature. The main outcome measure was cost per quality-adjusted life-year (QALY) gained. One-way and multivariable probabilistic sensitivity analyses were performed to test the robustness of the findings. Compared with aspirin alone, clopidogrel-aspirin resulted in a lifetime gain of 0.037 QALYs at an additional cost of CNY 1250 (US$ 192), yielding an incremental cost-effectiveness ratio of CNY 33 800 (US$ 5200) per QALY gained. Probabilistic sensitivity analysis showed that clopidogrel-aspirin therapy was more cost-effective in 95.7% of the simulations at a willingness-to-pay threshold recommended by the World Health Organization of CNY 105 000 (US$ 16 200) per QALY. Conclusions The early 90-day clopidogrel-aspirin regimen for acute TIA or minor stroke is highly cost-effective in China. Although clopidogrel is available generically, the brand-name Plavix is used in China; if Plavix were generic, treatment with clopidogrel-aspirin would have been cost saving. PMID:24904018
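A result like "cost-effective in 95.7% of simulations" is conventionally the fraction of probabilistic-sensitivity-analysis draws with positive incremental net monetary benefit at the willingness-to-pay threshold. A sketch, where the normal distributions around the base case are assumptions, not the study's actual parameter distributions:

```python
import random

def prob_cost_effective(sims, wtp):
    """Fraction of PSA draws in which the new strategy is cost-effective,
    i.e. incremental net monetary benefit NMB = wtp * dQALY - dCost > 0."""
    wins = sum(1 for d_cost, d_qaly in sims if wtp * d_qaly - d_cost > 0)
    return wins / len(sims)

# Illustrative draws centered on the abstract's base case
# (dCost ~ CNY 1250, dQALY ~ 0.037); the spreads are invented.
rng = random.Random(42)
sims = [(rng.gauss(1250, 400), rng.gauss(0.037, 0.012)) for _ in range(10_000)]
print(prob_cost_effective(sims, wtp=105_000))
```

Evaluating the same function over a grid of `wtp` values traces out a cost-effectiveness acceptability curve.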
Linke, Julia; Wessa, Michèle
2017-09-01
High reward sensitivity and wanting of rewarding stimuli help to identify and motivate repetition of pleasant activities. This behavioral activation is thought to increase positive emotions. Therefore, both mechanisms are highly relevant for resilience against depressive symptoms. Yet, these mechanisms have not been targeted by psychotherapeutic interventions. In the present study, we tested a mental imagery training comprising eight 10-minute sessions every second day and delivered via the Internet to healthy volunteers (N = 30, 21 female, mean age of 23.8 years, Caucasian) who were preselected for low reward sensitivity. Participants were paired according to age, sex, reward sensitivity, and mental imagery ability. Then, members of each pair were randomly assigned to either the intervention or wait condition. Ratings of wanting and response bias toward probabilistic reward cues (Probabilistic Reward Task) served as primary outcomes. We further tested whether training effects extended to approach behavior (Approach Avoidance Task) and depressive symptoms (Beck Depression Inventory). The intervention led to an increase in wanting (p < .001, η 2 p = .45) and reward sensitivity (p = .004, η 2 p = .27). Further, the training group displayed faster approach toward positive edibles and activities (p = .025, η 2 p = .18) and reductions in depressive symptoms (p = .028, η 2 p = .16). Results extend existing literature by showing that mental imagery training can increase wanting of rewarding stimuli and reward sensitivity. Further, the training appears to reduce depressive symptoms and thus may foster the successful implementation of exsiting treatments for depression such as behavioral activation and could also increase resilience against depressive symptoms. Copyright © 2017. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.
2003-04-01
Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates to 2100 or beyond annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model and impacts are simulated based on a modified version of Tol's FUND model.
The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte-Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of the CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF: one reflects year-to-year fluctuations around the expected value of a given variable (e.g., the standard error of annual GDP growth), and the other is fixed within each CCRAF variant and represents essential constants within the "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed from historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and with quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making," which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As the uncertainty for some variables evolves over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
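The two random-component types described above (a constant drawn once per variant versus year-to-year fluctuations) can be sketched as nested sampling loops. The equation form and every number below are illustrative assumptions, not CCRAF's:

```python
import random

def run_variants(n_variants=200, years=50, seed=7):
    """Sketch of a two-level Monte Carlo: an outer draw fixed within each
    variant (a 'world' constant such as climate sensitivity) and an inner
    per-year fluctuation around an expected growth rate."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_variants):
        climate_sensitivity = rng.gauss(3.0, 1.0)  # fixed within this variant
        x = 1.0                                    # index of, say, regional GDP
        for _ in range(years):
            growth = rng.gauss(0.02, 0.01)         # year-to-year fluctuation
            x *= 1.0 + growth
        finals.append((climate_sensitivity, x))
    return finals

worlds = run_variants()
```

Spread across the 200 variants mixes both uncertainty types; holding the per-variant constant fixed and rerunning isolates the contribution of the annual noise alone.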
Lexical Frequency Profiles and Zipf's Law
ERIC Educational Resources Information Center
Edwards, Roderick; Collins, Laura
2011-01-01
Laufer and Nation (1995) proposed that the Lexical Frequency Profile (LFP) can estimate the size of a second-language writer's productive vocabulary. Meara (2005) questioned the sensitivity and the reliability of LFPs for estimating vocabulary sizes, based on the results obtained from probabilistic simulations of LFPs. However, the underlying…
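The kind of probabilistic simulation at issue can be sketched by sampling word tokens from a Zipf distribution and computing a two-band frequency profile. The band size, exponent, essay length, and vocabulary size are arbitrary choices for illustration, not Meara's simulation parameters:

```python
import random

def zipf_sample(n_tokens, vocab_size, s=1.0, seed=3):
    """Draw word tokens (by frequency rank) from a Zipf distribution,
    P(rank r) proportional to 1/r**s."""
    weights = [1.0 / (r ** s) for r in range(1, vocab_size + 1)]
    rng = random.Random(seed)
    return rng.choices(range(1, vocab_size + 1), weights=weights, k=n_tokens)

def lfp(tokens, band=1000):
    """Toy Lexical Frequency Profile: share of tokens drawn from the
    first `band` most frequent words versus beyond that band."""
    in_band = sum(1 for r in tokens if r <= band)
    return in_band / len(tokens), 1 - in_band / len(tokens)

tokens = zipf_sample(300, vocab_size=5000)  # a 300-word essay, 5000-word vocabulary
print(lfp(tokens))
```

Repeating this for many simulated "writers" with different vocabulary sizes shows how much (or how little) the profile shifts, which is the crux of the sensitivity question debated above.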
Cost-effectiveness of unicondylar versus total knee arthroplasty: a Markov model analysis.
Peersman, Geert; Jak, Wouter; Vandenlangenbergh, Tom; Jans, Christophe; Cartier, Philippe; Fennema, Peter
2014-01-01
Unicondylar knee arthroplasty (UKA) is believed to lead to less morbidity and enhanced functional outcomes when compared with total knee arthroplasty (TKA). Conversely, UKA is also associated with a higher revision risk than TKA. In order to further clarify the key differences between these separate procedures, the current study assessing the cost-effectiveness of UKA versus TKA was undertaken. A state-transition Markov model was developed to compare the cost-effectiveness of UKA versus TKA for unicondylar osteoarthritis using a Belgian payer's perspective. The model was designed to include the possibility of two revision procedures. Model estimates were obtained through literature review and revision rates were based on registry data. Threshold analysis and probabilistic sensitivity analysis were performed to assess the model's robustness. UKA was associated with a cost reduction of €2,807 and a utility gain of 0.04 quality-adjusted life years in comparison with TKA. Analysis determined that the model is sensitive to clinical effectiveness, and that a marginal reduction in the clinical performance of UKA would lead to TKA being the more cost-effective solution. UKA yields clear advantages in terms of costs and marginal advantages in terms of health effects, in comparison with TKA. © 2014 Elsevier B.V. All rights reserved.
Modeling Array Stations in SIG-VISA
NASA Astrophysics Data System (ADS)
Ding, N.; Moore, D.; Russell, S.
2013-12-01
We add support for array stations to SIG-VISA, a system for nuclear monitoring using probabilistic inference on seismic signals. Array stations comprise a large portion of the IMS network; they can provide increased sensitivity and more accurate directional information compared to single-component stations. Our existing model assumed that signals were independent at each station, which is false when many stations are close together, as in an array. The new model removes that assumption by jointly modeling signals across array elements. This is done by extending our existing Gaussian process (GP) regression models, also known as kriging, from a 3-dimensional single-component space of events to a 6-dimensional space of station-event pairs. For each array and each event attribute (including coda decay, coda height, amplitude transfer and travel time), we model the joint distribution across array elements using a Gaussian process that learns the correlation lengthscale across the array, thereby incorporating information from array stations into the probabilistic inference framework. To evaluate the effectiveness of our model, we perform 'probabilistic beamforming' on new events using our GP model, i.e., we compute the event azimuth having highest posterior probability under the model, conditioned on the signals at array elements. We compare the results from our probabilistic inference model to the beamforming currently performed by IMS station processing.
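The GP regression ("kriging") component can be illustrated with a minimal posterior-mean computation. The kernel choice, lengthscale, element coordinates, and residual values are toy assumptions, not the SIG-VISA model:

```python
import math

def rbf(x1, x2, lengthscale=1.0):
    """Squared-exponential kernel; the lengthscale plays the role of the
    learned correlation length across array elements described above."""
    d2 = sum((a - b) ** 2 for a, b in zip(x1, x2))
    return math.exp(-0.5 * d2 / lengthscale ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (fine for tiny systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def gp_mean(X, y, x_star, ls=1.0, noise=1e-6):
    """GP/kriging posterior mean at x_star: k_*^T (K + noise*I)^{-1} y."""
    K = [[rbf(a, b, ls) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    alpha = solve(K, y)
    return sum(rbf(x_star, xi, ls) * ai for xi, ai in zip(X, alpha))

# Toy data: an event attribute (e.g. a travel-time residual) observed at
# three array-element positions; values are made up.
X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
y = [0.10, 0.30, 0.20]
print(gp_mean(X, y, (0.5, 0.5)))
```

With nearly zero observation noise, the posterior mean interpolates the training points, and predictions between elements are smoothed according to the lengthscale, which is the correlation structure the abstract describes learning per array.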
Rygula, Rafal; Clarke, Hannah F.; Cardinal, Rudolf N.; Cockcroft, Gemma J.; Xia, Jing; Dalley, Jeff W.; Robbins, Trevor W.; Roberts, Angela C.
2015-01-01
Understanding the role of serotonin (or 5-hydroxytryptamine, 5-HT) in aversive processing has been hampered by the contradictory findings, across studies, of increased sensitivity to punishment in terms of subsequent response choice but decreased sensitivity to punishment-induced response suppression following gross depletion of central 5-HT. To address this apparent discrepancy, the present study determined whether both effects could be found in the same animals by performing localized 5-HT depletions in the amygdala or orbitofrontal cortex (OFC) of a New World monkey, the common marmoset. 5-HT depletion in the amygdala impaired response choice on a probabilistic visual discrimination task by increasing the effectiveness of misleading, or false, punishment and reward, and decreased response suppression in a variable interval test of punishment sensitivity that employed the same reward and punisher. 5-HT depletion in the OFC also disrupted probabilistic discrimination learning and decreased response suppression. Computational modeling of behavior on the discrimination task showed that the lesions reduced reinforcement sensitivity. A novel, unitary account of the findings in terms of the causal role of 5-HT in the anticipation of both negative and positive motivational outcomes is proposed and discussed in relation to current theories of 5-HT function and our understanding of mood and anxiety disorders. PMID:24879752
Probabilistic Simulation of Progressive Fracture in Bolted-Joint Composite Laminates
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Singhal, S. N.; Chamis, C. C.
1996-01-01
This report describes computational methods to probabilistically simulate fracture in bolted composite structures. An innovative approach that is independent of stress intensity factors and fracture toughness was used to simulate progressive fracture. The effect of design variable uncertainties on structural damage was also quantified. A fast probability integrator assessed the scatter in the composite structure response before and after damage. Then the sensitivity of the response to design variables was computed. General-purpose methods, which are applicable to bolted joints in all types of structures and in all fracture processes-from damage initiation to unstable propagation and global structure collapse-were used. These methods were demonstrated for a bolted joint of a polymer matrix composite panel under edge loads. The effects of the fabrication process were included in the simulation of damage in the bolted panel. Results showed that the most effective way to reduce end displacement at fracture is to control both the load and the ply thickness. The cumulative probability for longitudinal stress in all plies was most sensitive to the load; in the 0 deg. plies it was very sensitive to ply thickness. The cumulative probability for transverse stress was most sensitive to the matrix coefficient of thermal expansion. In addition, fiber volume ratio and fiber transverse modulus both contributed significantly to the cumulative probability for the transverse stresses in all the plies.
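The sensitivity ranking reported above (response scatter driven by load and ply thickness) can be illustrated with a toy Monte Carlo surrogate in the spirit of a fast probability integrator; the response function and the distributions below are hypothetical, not taken from the report:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
# Hypothetical design-variable distributions (illustrative values only)
load = rng.normal(100.0, 10.0, n)        # edge load, 10% scatter
ply_t = rng.normal(0.127, 0.00635, n)    # ply thickness, 5% scatter

# Simple surrogate response: stress grows with load, drops with thickness
stress = load / ply_t

# Rank inputs by absolute correlation with the scattered response
sens = {name: abs(np.corrcoef(x, stress)[0, 1])
        for name, x in [("load", load), ("ply_thickness", ply_t)]}
```

With these coefficients of variation the load dominates the response scatter, mirroring the kind of ranking the probabilistic simulation produces for the longitudinal ply stresses.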
NASA Astrophysics Data System (ADS)
Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.
2017-07-01
The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU: ≤5 min). The resulting standard deviation (expectation value) of dose shows average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction.
Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.
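The core idea of APM, closed-form propagation of Gaussian input uncertainty into dose moments without scenario sampling, can be sketched for a linear dose model. The influence matrix and covariances below are illustrative; the actual method works with Gaussian pencil-beam integrals rather than this toy linear map:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical dose-influence matrix D: dose_i = sum_j D_ij * w_j,
# with Gaussian uncertainty on the weights w (a stand-in for the
# correlated range/setup errors APM handles analytically).
D = rng.uniform(0, 1, size=(4, 3))
mu = np.array([10.0, 12.0, 8.0])
Sigma = np.diag([1.0, 0.5, 2.0])

# Closed-form moments of the dose distribution (no sampling needed)
mean_d = D @ mu
cov_d = D @ Sigma @ D.T

# Monte Carlo reference (5000 samples, as in the paper's benchmark)
w = rng.multivariate_normal(mu, Sigma, size=5000)
d = w @ D.T
```

For a linear map the analytical moments and the sampled moments agree up to Monte Carlo error, which is the trade-off the abstract quantifies for the full nonlinear pencil-beam model.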
A Computational Approach for Probabilistic Analysis of Water Impact Simulations
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.
2009-01-01
NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those faced in the 1960s during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis-of-variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
Modelling default and likelihood reasoning as probabilistic reasoning
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
2009-07-01
Performance Analysis of the Probabilistic Multi-Hypothesis Tracking Algorithm on the SEABAR Data Sets Dr. Christian G. Hempel Naval...Hypothesis Tracking," NUWC-NPT Technical Report 10,428, Naval Undersea Warfare Center Division, Newport, RI, 15 February 1995. [2] G. McLachlan, T...the 9th International Conference on Information Fusion, Florence Italy, July, 2006. [8] C. Hempel, "Track Initialization for Multi-Static Active Sonar
Campbell, Kieran R.
2016-01-01
Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a ‘pseudotime’ where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852
Three key areas of scientific inquiry in the study of human exposure to environmental contaminants are 1) assessment of aggregate (i.e., multi-pathway, multi-route) exposures, 2) application of probabilistic methods to exposure prediction, and 3) the interpretation of biomarker m...
A Probabilistic Model of Phonological Relationships from Contrast to Allophony
ERIC Educational Resources Information Center
Hall, Kathleen Currie
2009-01-01
This dissertation proposes a model of phonological relationships, the Probabilistic Phonological Relationship Model (PPRM), that quantifies how predictably distributed two sounds in a relationship are. It builds on a core premise of traditional phonological analysis, that the ability to define phonological relationships such as contrast and…
Mapping anhedonia onto reinforcement learning: a behavioural meta-analysis
2013-01-01
Background Depression is characterised partly by blunted reactions to reward. However, tasks probing this deficiency have not distinguished insensitivity to reward from insensitivity to the prediction errors for reward that determine learning and are putatively reported by the phasic activity of dopamine neurons. We attempted to disentangle these factors with respect to anhedonia in the context of stress, Major Depressive Disorder (MDD), Bipolar Disorder (BPD) and a dopaminergic challenge. Methods Six behavioural datasets involving 392 experimental sessions were subjected to a model-based, Bayesian meta-analysis. Participants across all six studies performed a probabilistic reward task that used an asymmetric reinforcement schedule to assess reward learning. Healthy controls were tested under baseline conditions, stress or after receiving the dopamine D2 agonist pramipexole. In addition, participants with current or past MDD or BPD were evaluated. Reinforcement learning models isolated the contributions of variation in reward sensitivity and learning rate. Results MDD and anhedonia reduced reward sensitivity more than they affected the learning rate, while a low dose of the dopamine D2 agonist pramipexole showed the opposite pattern. Stress led to a pattern consistent with a mixed effect on reward sensitivity and learning rate. Conclusion Reward-related learning reflected at least two partially separable contributions. The first related to phasic prediction error signalling, and was preferentially modulated by a low dose of the dopamine agonist pramipexole. The second related directly to reward sensitivity, and was preferentially reduced in MDD and anhedonia. Stress altered both components. Collectively, these findings highlight the contribution of model-based reinforcement learning meta-analysis for dissecting anhedonic behavior. PMID:23782813
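The model-based separation of reward sensitivity from learning rate can be sketched with a toy Q-learning agent on an asymmetric two-armed task. The parameter values and update rule below are a simplified stand-in for the study's reinforcement-learning models, not the actual fitted model:

```python
import numpy as np

def simulate_agent(rho, alpha, n_trials=2000, seed=0):
    """Q-learning on a 2-armed asymmetric-reward task.

    rho  : reward sensitivity (scales the subjective value of reward)
    alpha: learning rate applied to the prediction error
    """
    rng = np.random.default_rng(seed)
    p_reward = np.array([0.6, 0.3])   # 'rich' vs 'lean' option
    Q = np.zeros(2)
    rich_choices = 0
    for _ in range(n_trials):
        p = np.exp(Q) / np.exp(Q).sum()       # softmax choice rule
        a = rng.choice(2, p=p)
        r = float(rng.random() < p_reward[a])
        Q[a] += alpha * (rho * r - Q[a])      # prediction-error update
        rich_choices += (a == 0)
    return rich_choices / n_trials

bias_healthy = simulate_agent(rho=3.0, alpha=0.3)
bias_anhedonic = simulate_agent(rho=1.0, alpha=0.3)  # blunted reward sensitivity
```

Lowering rho while holding alpha fixed blunts the developing response bias toward the rich option, which is the behavioral signature the meta-analysis attributes to MDD and anhedonia.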
Jensen, Cathrine Elgaard; Riis, Allan; Petersen, Karin Dam; Jensen, Martin Bach; Pedersen, Kjeld Møller
2017-05-01
In connection with the publication of a clinical practice guideline on the management of low back pain (LBP) in general practice in Denmark, a cluster randomised controlled trial was conducted. In this trial, a multifaceted guideline implementation strategy to improve general practitioners' treatment of patients with LBP was compared with a usual implementation strategy. The aim was to determine whether the multifaceted strategy was cost effective, as compared with the usual implementation strategy. The economic evaluation was conducted as a cost-utility analysis in which costs, collected from a societal perspective, and quality-adjusted life years were used as outcome measures. The analysis was conducted as a within-trial analysis with a 12-month time horizon consistent with the follow-up period of the clinical trial. To adjust for a priori selected covariates, generalised linear models with a gamma family were used to estimate incremental costs and quality-adjusted life years. Furthermore, both deterministic and probabilistic sensitivity analyses were conducted. Results showed that costs associated with primary health care were higher, whereas secondary health care costs were lower for the intervention group when compared with the control group. When adjusting for covariates, the intervention was less costly, and there was no significant difference in effect between the 2 groups. Sensitivity analyses showed that results were sensitive to uncertainty. In conclusion, the multifaceted implementation strategy was cost saving when compared with the usual strategy for implementing LBP clinical practice guidelines in general practice. Furthermore, there was no significant difference in effect, and the estimate was sensitive to uncertainty.
Probabilistic structural analysis to quantify uncertainties associated with turbopump blades
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.
1987-01-01
A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for the past two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.
Chlan, Linda L; Heiderscheit, Annette; Skaar, Debra J; Neidecker, Marjorie V
2018-05-04
Music intervention has been shown to reduce anxiety and sedative exposure among mechanically ventilated patients. Whether music intervention reduces ICU costs is not known. The aim of this study was to examine ICU costs for patients receiving a patient-directed music intervention compared with patients who received usual ICU care. A cost-effectiveness analysis from the hospital perspective was conducted to determine if patient-directed music intervention was cost-effective in improving patient-reported anxiety. Cost savings were also evaluated. One-way and probabilistic sensitivity analyses determined the influence of input variation on the cost-effectiveness. Midwestern ICUs. Adult ICU patients from a parent clinical trial receiving mechanical ventilatory support. Patients receiving the experimental patient-directed music intervention received an MP3 player, noise-canceling headphones, and music tailored to individual preferences by a music therapist. The base case cost-effectiveness analysis estimated patient-directed music intervention reduced anxiety by 19 points on the Visual Analogue Scale-Anxiety with a reduction in cost of $2,322/patient compared with usual ICU care, resulting in patient-directed music dominance. The probabilistic cost-effectiveness analysis found that average patient-directed music intervention costs were $2,155 less than usual ICU care and projected that cost saving is achieved in 70% of 1,000 iterations. Based on break-even analyses, cost saving is achieved if the per-patient cost of patient-directed music intervention remains below $2,651, a value eight times the base case of $329. Patient-directed music intervention is cost-effective for reducing anxiety in mechanically ventilated ICU patients.
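The probabilistic (Monte Carlo) sensitivity analysis described above can be sketched as follows. The cost distributions are invented for illustration; only the 1,000 iterations and the $329 base-case intervention cost are taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(3)
n_iter = 1000  # PSA iterations, as in the study

# Hypothetical per-patient cost distributions (illustrative, not study data)
cost_usual = rng.gamma(shape=20.0, scale=500.0, size=n_iter)  # mean ~ $10,000
cost_music = rng.gamma(shape=20.0, scale=400.0, size=n_iter)  # mean ~ $8,000
intervention_cost = 329.0  # base-case cost of the music intervention

# Incremental cost per iteration; negative values mean cost saving
incremental_cost = (cost_music + intervention_cost) - cost_usual
p_cost_saving = float(np.mean(incremental_cost < 0))
```

The fraction of iterations with negative incremental cost is the "cost saving achieved in X% of iterations" figure reported by such analyses.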
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four Projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - Case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for what problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - Too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not True. There are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - probabilistic reliability literature search has been completed along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - Correct reliability computations both at the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - A draft of volume 1 comprising aerospace applications has been released. 
Volume 2, a compilation of real world applications of probabilistic methods with essential information demonstrating application type and time/cost savings by the use of probabilistic methods for generic applications is in progress. Relevance to Industry & Government - Too often, we say, 'The Proof is in the Pudding'. With help from many contributors, we hope to produce such a document. Problem is - not too many people are coming forward due to the proprietary nature of the data. So, we are asking to document only minimum information including problem description, what method used, did it result in any savings, and how much?; (5) Software Reliability - software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood & addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standard/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - Any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance & life.
Nazir, Jameel; Maman, Khaled; Neine, Mohamed-Elmoctar; Briquet, Benjamin; Odeyemi, Isaac A O; Hakimi, Zalmai; Garnham, Andy; Aballéa, Samuel
2015-09-01
Mirabegron, a first-in-class selective oral β3-adrenoceptor agonist, has similar efficacy to most antimuscarinic agents and a lower incidence of dry mouth in patients with overactive bladder (OAB). To evaluate the cost-effectiveness of mirabegron 50 mg compared with oral antimuscarinic agents in adults with OAB from a UK National Health Service perspective. A Markov model including health states for symptom severity, treatment status, and adverse events was developed. Cycle length was 1 month, and the time horizon was 5 years. Antimuscarinic comparators were tolterodine extended release, solifenacin, fesoterodine, oxybutynin extended release and immediate release (IR), darifenacin, and trospium chloride modified release. Transition probabilities for symptom severity levels and adverse events were estimated from a mirabegron trial and a mixed treatment comparison. Estimates for other inputs were obtained from published literature or expert opinion. Quality-adjusted life-years (QALYs) and total health care costs, including costs of drug acquisition, physician visits, incontinence pad use, and botox injections, were modeled. Deterministic and probabilistic sensitivity analyses were performed. Base-case incremental cost-effectiveness ratios ranged from £367 (vs. solifenacin 10 mg) to £15,593 (vs. oxybutynin IR 10 mg) per QALY gained. Probabilistic sensitivity analyses showed that at a willingness-to-pay threshold of £20,000/QALY gained, the probability of mirabegron 50 mg being cost-effective ranged from 70.2% versus oxybutynin IR 10 mg to 97.8% versus darifenacin 15 mg. A limitation of our analysis is the uncertainty due to the lack of direct comparisons of mirabegron with other agents; a mixed treatment comparison using rigorous methodology provided the data for the analysis, but the studies involved showed heterogeneity. 
Mirabegron 50 mg appears to be cost-effective compared with standard oral antimuscarinic agents for the treatment of adults with OAB from a UK National Health Service perspective. Copyright © 2015. Published by Elsevier Inc.
Cost-effectiveness of pazopanib compared with sunitinib in metastatic renal cell carcinoma in Canada
Amdahl, J.; Diaz, J.; Park, J.; Nakhaipour, H.R.; Delea, T.E.
2016-01-01
Background In Canada and elsewhere, pazopanib and sunitinib—tyrosine kinase inhibitors targeting the vascular endothelial growth factor receptors—are recommended as first-line treatment for patients with metastatic renal cell carcinoma (mrcc). A large randomized noninferiority trial of pazopanib versus sunitinib (comparz) demonstrated that the two drugs have similar efficacy; however, patients randomized to pazopanib experienced better health-related quality of life (hrqol) and nominally lower rates of non-study medical resource utilization. Methods The cost-effectiveness of pazopanib compared with sunitinib for first-line treatment of mrcc from a Canadian health care system perspective was evaluated using a partitioned-survival model that incorporated data from comparz and other secondary sources. The time horizon of 5 years was based on the maximum duration of follow-up in the final analysis of overall survival from the comparz trial. Analyses were conducted first using list prices for pazopanib and sunitinib and then by assuming that the prices of sunitinib and pazopanib would be equivalent. Results Based on list prices, expected costs were CA$10,293 less with pazopanib than with sunitinib. Pazopanib was estimated to yield 0.059 more quality-adjusted life-years (qalys). Pazopanib was therefore dominant (more qalys and lower costs) compared with sunitinib in the base case. In probabilistic sensitivity analyses, pazopanib was dominant in 79% of simulations and was cost-effective in 90%–100% of simulations at a threshold cost-effectiveness ratio of CA$100,000. Assuming equivalent pricing, pazopanib yielded CA$917 in savings in the base case, was dominant in 36% of probabilistic sensitivity analysis simulations, and was cost-effective in 89% of simulations at a threshold cost-effectiveness ratio of CA$100,000. 
Conclusions Compared with sunitinib, pazopanib is likely to be a cost-effective option for first-line treatment of mrcc from a Canadian health care perspective. PMID:27536183
Luebke, Thomas; Brunkwall, Jan
2014-05-01
This study weighed the cost and benefit of thoracic endovascular aortic repair (TEVAR) vs open repair (OR) in the treatment of an acute complicated type B aortic dissection (TBAD) by estimating the cost-effectiveness to determine an optimal treatment strategy based on the best currently available evidence. A cost-utility analysis from the perspective of the health system payer was performed using a decision analytic model. Within this model, the 1-year survival, quality-adjusted life-years (QALYs), and costs for a hypothetical cohort of patients with an acute complicated TBAD managed with TEVAR or OR were evaluated. Clinical effectiveness data, cost data, and transitional probabilities of different health states were derived from previously published high-quality studies or meta-analyses. Probabilistic sensitivity analyses were performed on uncertain model parameters. The base-case analysis showed, in terms of QALYs, that OR appeared to be more expensive (incremental cost of €17,252.60) and less effective (-0.19 QALYs) compared with TEVAR; hence, in terms of the incremental cost-effectiveness ratio, OR was dominated by TEVAR. As a result, the incremental cost-effectiveness ratio (ie, the cost per life-year saved) was not calculated. The average cost-effectiveness ratio of TEVAR and OR per QALY gained was €56,316.79 and €108,421.91, respectively. In probabilistic sensitivity analyses, TEVAR was economically dominant in 100% of cases. The probability that TEVAR was economically attractive at a willingness-to-pay threshold of €50,000/QALY gained was 100%. The present results suggest that TEVAR yielded more QALYs and was associated with lower 1-year costs compared with OR in patients with an acute complicated TBAD. As a result, from the cost-effectiveness point of view, TEVAR is the dominant therapy over OR for this disease under the predefined conditions. Copyright © 2014 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Potential Cost-Effectiveness of Universal Access to Modern Contraceptives in Uganda
Babigumira, Joseph B.; Stergachis, Andy; Veenstra, David L.; Gardner, Jacqueline S.; Ngonzi, Joseph; Mukasa-Kivunike, Peter; Garrison, Louis P.
2012-01-01
Background Over two thirds of women who need contraception in Uganda lack access to modern effective methods. This study was conducted to estimate the potential cost-effectiveness of achieving universal access to modern contraceptives in Uganda by implementing a hypothetical new contraceptive program (NCP) from both societal and governmental (Ministry of Health (MoH)) perspectives. Methodology/Principal Findings A Markov model was developed to compare the NCP to the status quo or current contraceptive program (CCP). The model followed a hypothetical cohort of 15-year old girls over a lifetime horizon. Data were obtained from the Uganda National Demographic and Health Survey and from published and unpublished sources. Costs, life expectancy, disability-adjusted life expectancy, pregnancies, fertility and incremental cost-effectiveness measured as cost per life-year (LY) gained, cost per disability-adjusted life-year (DALY) averted, cost per pregnancy averted and cost per unit of fertility reduction were calculated. Univariate and probabilistic sensitivity analyses were performed to examine the robustness of results. Mean discounted life expectancy and disability-adjusted life expectancy (DALE) were higher under the NCP vs. CCP (28.74 vs. 28.65 years and 27.38 vs. 27.01 respectively). Mean pregnancies and live births per woman were lower under the NCP (9.51 vs. 7.90 and 6.92 vs. 5.79 respectively). Mean lifetime societal costs per woman were lower for the NCP from the societal perspective ($1,949 vs. $1,987) and the MoH perspective ($636 vs. $685). In the incremental analysis, the NCP dominated the CCP, i.e. it was both less costly and more effective. The results were robust to univariate and probabilistic sensitivity analysis. Conclusion/Significance Universal access to modern contraceptives in Uganda appears to be highly cost-effective. Increasing contraceptive coverage should be considered among Uganda's public health priorities. PMID:22363480
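A Markov cohort model of the kind used in the study above can be sketched in a few lines. The states, transition probabilities, and utility weights below are hypothetical, chosen only to show the mechanics of cycling a cohort and accumulating QALYs:

```python
import numpy as np

# Hypothetical 3-state annual-cycle Markov model (Well, Pregnant, Dead);
# each row gives the transition probabilities out of one state.
P = np.array([
    [0.85, 0.10, 0.05],   # Well -> Well / Pregnant / Dead
    [0.80, 0.15, 0.05],   # Pregnant -> Well / Pregnant / Dead
    [0.00, 0.00, 1.00],   # Dead is absorbing
])
utility = np.array([1.0, 0.9, 0.0])  # per-cycle QALY weights
cohort = np.array([1.0, 0.0, 0.0])   # everyone starts in Well

qalys = 0.0
for cycle in range(50):          # lifetime horizon, undiscounted here
    qalys += cohort @ utility    # QALYs accrued this cycle
    cohort = cohort @ P          # advance the cohort one cycle
```

Comparing two such models (e.g. CCP vs NCP transition probabilities) and attaching costs per state yields the incremental cost per QALY reported in the abstract; the probabilistic sensitivity analysis then redraws P and the costs from distributions on each iteration.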
Pollom, Erqi L; Lee, Kyueun; Durkee, Ben Y; Grade, Madeline; Mokhtari, Daniel A; Wahl, Daniel R; Feng, Mary; Kothary, Nishita; Koong, Albert C; Owens, Douglas K; Goldhaber-Fiebert, Jeremy; Chang, Daniel T
2017-05-01
Purpose To assess the cost-effectiveness of stereotactic body radiation therapy (SBRT) versus radiofrequency ablation (RFA) for patients with inoperable localized hepatocellular carcinoma (HCC) who are eligible for both SBRT and RFA. Materials and Methods A decision-analytic Markov model was developed for patients with inoperable, localized HCC who were eligible for both RFA and SBRT to evaluate the cost-effectiveness of the following treatment strategies: (a) SBRT as initial treatment followed by SBRT for local progression (SBRT-SBRT), (b) RFA followed by RFA for local progression (RFA-RFA), (c) SBRT followed by RFA for local progression (SBRT-RFA), and (d) RFA followed by SBRT for local progression (RFA-SBRT). Probabilities of disease progression, treatment characteristics, and mortality were derived from published studies. Outcomes included health benefits expressed as discounted quality-adjusted life years (QALYs), costs in U.S. dollars, and cost-effectiveness expressed as an incremental cost-effectiveness ratio. Deterministic and probabilistic sensitivity analysis was performed to assess the robustness of the findings. Results In the base case, SBRT-SBRT yielded the most QALYs (1.565) and cost $197 557. RFA-SBRT yielded 1.558 QALYs and cost $193 288. SBRT-SBRT was not cost-effective, at $558 679 per QALY gained relative to RFA-SBRT. RFA-SBRT was the preferred strategy, because RFA-RFA and SBRT-RFA were less effective and more costly. In all evaluated scenarios, SBRT was preferred as salvage therapy for local progression after RFA. Probabilistic sensitivity analysis showed that at a willingness-to-pay threshold of $100 000 per QALY gained, RFA-SBRT was preferred in 65.8% of simulations. Conclusion SBRT for initial treatment of localized, inoperable HCC is not cost-effective. However, SBRT is the preferred salvage therapy for local progression after RFA. © RSNA, 2017 Online supplemental material is available for this article.
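The incremental cost-effectiveness ratio in the abstract above follows the standard definition ICER = ΔC/ΔE. A minimal sketch using the rounded figures reported (because the published QALY totals are rounded to three decimals, the recomputed ratio lands near $610 000 rather than the published $558 679):

```python
def icer(cost_a: float, qaly_a: float, cost_b: float, qaly_b: float) -> float:
    """Incremental cost-effectiveness ratio of strategy A relative to strategy B."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Rounded figures from the abstract: SBRT-SBRT vs RFA-SBRT
ratio = icer(197_557, 1.565, 193_288, 1.558)
print(f"ICER = ${ratio:,.0f} per QALY gained")
```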
Banack, Hailey R; Stokes, Andrew; Fox, Matthew P; Hovey, Kathleen M; Cespedes-Feliciano, Elizabeth M; LeBlanc, Erin; Bird, Chloe; Caan, Bette J; Kroenke, Candyce H; Allison, Matthew A; Going, Scott B; Snetslaar, Linda; Cheng, Ting-Yuan David; Chlebowski, Rowan T; Stefanick, Marcia L; LaMonte, Michael J; Wactawski-Wende, Jean
2018-06-01
There is widespread concern about the use of body mass index (BMI) to define obesity status in postmenopausal women because it may not accurately represent an individual's true obesity status. The objective of the present study is to examine and adjust for exposure misclassification bias from using an indirect measure of obesity (BMI) compared with a direct measure of obesity (percent body fat). We used data from postmenopausal non-Hispanic black and non-Hispanic white women in the Women's Health Initiative (WHI; n=126,459). Within the WHI, a sample of 11,018 women were invited to participate in a sub-study involving dual-energy x-ray absorptiometry (DXA) scans. We examined indices of validity comparing BMI-defined obesity (≥30kg/m) with obesity defined by percent body fat. We then used probabilistic bias analysis models stratified by age and race to explore the effect of exposure misclassification on the obesity-mortality relationship. Validation analyses highlight that using a BMI cutpoint of 30 kg/m to define obesity in postmenopausal women is associated with poor validity. There were notable differences in sensitivity by age and race. Results from the stratified bias analysis demonstrated that failing to adjust for exposure misclassification bias results in attenuated estimates of the obesity-mortality relationship. For example, in non-Hispanic white women age 50-59, the conventional risk difference was 0.017 (95% CI 0.01, 0.023) and the bias-adjusted risk difference was 0.035 (95% SI 0.028, 0.043). These results demonstrate the importance of using quantitative bias analysis techniques to account for non-differential exposure misclassification of BMI-defined obesity.
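The bias-adjustment logic described above can be sketched as a simple probabilistic bias analysis: sample sensitivity and specificity from assumed bias distributions, back-correct the 2×2 cell counts for nondifferential exposure misclassification, and recompute the risk difference. All counts and distribution bounds below are invented for illustration, not the WHI data:

```python
import random

def correct_exposed(observed: float, total: float, se: float, sp: float) -> float:
    """Back-correct an observed exposed count within a stratum for
    nondifferential exposure misclassification (sensitivity se, specificity sp)."""
    return (observed - (1.0 - sp) * total) / (se + sp - 1.0)

def bias_adjusted_rd(a, n_cases, b, n_noncases, se, sp):
    """Risk difference after correcting exposure classification in the
    case and non-case strata (all counts illustrative)."""
    A = correct_exposed(a, n_cases, se, sp)       # corrected exposed cases
    B = correct_exposed(b, n_noncases, se, sp)    # corrected exposed non-cases
    C, D = n_cases - A, n_noncases - B            # corrected unexposed counts
    return A / (A + B) - C / (C + D)

random.seed(1)
# Monte Carlo over uncertain Se/Sp (illustrative uniform bias distributions)
rds = sorted(
    bias_adjusted_rd(400, 1000, 2000, 9000,
                     se=random.uniform(0.75, 0.95),
                     sp=random.uniform(0.90, 0.99))
    for _ in range(5000)
)
print("median adjusted RD:", round(rds[2500], 4),
      "95% simulation interval:", (round(rds[125], 4), round(rds[4874], 4)))
```

With perfect classification (Se = Sp = 1) the correction collapses to the conventional estimate, which is a useful sanity check on the algebra.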
Chou, Berry Yun-Hua; Liao, Chung-Min; Lin, Ming-Chao; Cheng, Hsu-Hui
2006-05-01
This paper presents a toxicokinetic/toxicodynamic analysis to appraise arsenic (As) bioaccumulation in farmed juvenile milkfish Chanos chanos in a blackfoot disease (BFD)-endemic area of Taiwan; probabilistic incremental lifetime cancer risk (ILCR) and hazard quotient (HQ) models are also employed to assess the range of exposures for fishers and non-fishers who eat the contaminated fish. We conducted a 7-day exposure experiment to obtain toxicokinetic parameters, and a simple critical-body-burden toxicity model was verified with LC50(t) data obtained from a 7-day acute toxicity bioassay. The acute toxicity bioassay indicates that the 96-h LC50 for juvenile milkfish exposed to As is 7.29 (95% CI: 3.10-10.47) mg l(-1). Our risk analysis for milkfish reared in the BFD-endemic area indicates a low likelihood that survival is being affected by waterborne As. Human risk analysis demonstrates that the 90th-percentile exposure ILCRs for fishers in the BFD-endemic area are on the order of 10(-3), indicating a high potential carcinogenic risk, whereas there is no significant cancer risk for non-fishers (ILCRs around 10(-5)). All predicted 90th-percentile HQs are less than 1 for non-fishers, yet larger than 10 for fishers, reflecting the larger contribution of farmed-milkfish consumption. Sensitivity analysis indicates that, to increase the accuracy of the results, efforts should focus on better-defined probability distributions for the milkfish daily consumption rate and the As level in milkfish. We show that theoretical human health risks from consuming As-contaminated milkfish in the BFD-endemic area are alarming under the conservative conditions of a probabilistic risk assessment model.
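The ILCR and HQ metrics used above follow the standard exposure-factor forms: a lifetime average daily dose multiplied by a cancer slope factor, and a chronic daily dose divided by a reference dose. A hedged Monte Carlo sketch (all concentrations, intake rates, slope factor, and reference dose below are illustrative placeholders, not the study's fitted distributions):

```python
import math, random

def lifetime_avg_dose(conc, intake, body_wt, ef=350.0, ed=30.0, at=70.0 * 365.0):
    """Lifetime average daily dose (mg/kg/day) in the standard
    exposure-factor form: C * IR * EF * ED / (BW * AT)."""
    return conc * intake * ef * ed / (body_wt * at)

CSF = 1.5    # assumed cancer slope factor for inorganic As, (mg/kg/day)^-1
RFD = 3e-4   # assumed oral reference dose for As, mg/kg/day

random.seed(7)
ilcr, hq = [], []
for _ in range(20_000):
    conc = random.lognormvariate(math.log(0.5), 0.4)     # As in fish, mg/kg (illustrative)
    intake = random.lognormvariate(math.log(0.06), 0.5)  # fish eaten, kg/day (illustrative)
    ilcr.append(lifetime_avg_dose(conc, intake, 65.0) * CSF)
    hq.append(conc * intake / 65.0 / RFD)                # chronic daily dose over RfD

ilcr.sort(); hq.sort()
p90 = int(0.9 * len(ilcr))
print(f"90th-percentile ILCR ~ {ilcr[p90]:.1e}, 90th-percentile HQ ~ {hq[p90]:.1f}")
```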
Fracture mechanics analysis of cracked structures using weight function and neural network method
NASA Astrophysics Data System (ADS)
Chen, J. G.; Zang, F. G.; Yang, Y.; Shi, K. K.; Fu, X. L.
2018-06-01
Stress intensity factors (SIFs) due to thermal-mechanical loads have been established using the weight function method. Two reference stress states were used to determine the coefficients in the weight function. Results were evaluated against data from the literature and show good agreement. The SIFs for cracks subjected to arbitrary loads can therefore be determined quickly using the derived weight function, and the presented method can be used for probabilistic fracture mechanics analysis. A probabilistic methodology combining Monte Carlo simulation with a neural network (MCNN) has been developed. The results indicate that an accurate probabilistic characterization of KI can be obtained using the developed method. The probability of failure increases with increasing load, and the relationship between load and failure probability is nonlinear.
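The weight function idea is that K_I = ∫₀ᵃ σ(x) m(x,a) dx, so once m(x,a) is known the SIF for any load follows by quadrature. As a self-contained sketch (using the exact symmetric weight function for a through crack in an infinite plate, not the paper's thermal-mechanical geometry), the substitution x = a·sin t removes the square-root singularity at the crack tip:

```python
import math

def sif_weight_function(stress, a, n=2000):
    """K_I = integral_0^a stress(x) * m(x,a) dx with the Griffith-crack
    symmetric weight function m(x,a) = 2a / (sqrt(pi*a) * sqrt(a^2 - x^2)).
    Substituting x = a*sin(t) gives
    K_I = (2a / sqrt(pi*a)) * integral_0^{pi/2} stress(a*sin(t)) dt,
    evaluated here by the midpoint rule."""
    h = (math.pi / 2.0) / n
    s = sum(stress(a * math.sin((i + 0.5) * h)) for i in range(n)) * h
    return 2.0 * a / math.sqrt(math.pi * a) * s

# Uniform remote stress recovers the closed form K = sigma * sqrt(pi * a)
sigma, a = 100e6, 0.01              # 100 MPa, 10 mm half-crack (illustrative)
k = sif_weight_function(lambda x: sigma, a)
print(k / 1e6, "MPa*m^0.5")         # close to 17.72
```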
Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems
NASA Technical Reports Server (NTRS)
Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.
1992-01-01
The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
Lester-Coll, Nataniel H; Dosoretz, Arie P; Magnuson, William J; Laurans, Maxwell S; Chiang, Veronica L; Yu, James B
2016-12-01
OBJECTIVE The JLGK0901 study found that stereotactic radiosurgery (SRS) is a safe and effective treatment option for treating up to 10 brain metastases. The purpose of this study is to determine the cost-effectiveness of treating up to 10 brain metastases with SRS, whole-brain radiation therapy (WBRT), or SRS and immediate WBRT (SRS+WBRT). METHODS A Markov model was developed to evaluate the cost effectiveness of SRS, WBRT, and SRS+WBRT in patients with 1 or 2-10 brain metastases. Transition probabilities were derived from the JLGK0901 study and modified according to the recurrence rates observed in the Radiation Therapy Oncology Group (RTOG) 9508 and European Organization for Research and Treatment of Cancer (EORTC) 22952-26001 studies to simulate the outcomes for patients who receive WBRT. Costs are based on 2015 Medicare reimbursements. Health state utilities were prospectively collected using the Standard Gamble method. End points included cost, quality-adjusted life years (QALYs), and incremental cost-effectiveness ratios (ICERs). The willingness-to-pay (WTP) threshold was $100,000 per QALY. One-way and probabilistic sensitivity analyses explored uncertainty with regard to the model assumptions. RESULTS In patients with 1 brain metastasis, the ICERs for SRS versus WBRT, SRS versus SRS+WBRT, and SRS+WBRT versus WBRT were $117,418, $51,348, and $746,997 per QALY gained, respectively. In patients with 2-10 brain metastases, the ICERs were $123,256, $58,903, and $821,042 per QALY gained, respectively. On the sensitivity analyses, the model was sensitive to the cost of SRS and the utilities associated with stable post-SRS and post-WBRT states. In patients with 2-10 brain metastases, SRS versus WBRT becomes cost-effective if the cost of SRS is reduced by $3512. SRS versus WBRT was also cost effective at a WTP of $200,000 per QALY on the probabilistic sensitivity analysis. 
CONCLUSIONS The most cost-effective strategy for patients with up to 10 brain metastases is SRS alone relative to SRS+WBRT. SRS alone may also be cost-effective relative to WBRT alone, but this depends on WTP, the cost of SRS, and patient preferences.
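The probabilistic sensitivity analysis reported above can be summarized with net monetary benefit: sample a (QALY, cost) pair per strategy in each simulation and count the fraction of draws in which each strategy maximizes NMB at the willingness-to-pay threshold. The distributions below are invented for illustration, not the study's inputs:

```python
import random

def nmb(qalys: float, cost: float, wtp: float) -> float:
    """Net monetary benefit of a strategy at willingness-to-pay `wtp` per QALY."""
    return wtp * qalys - cost

random.seed(42)
WTP, SIMS = 100_000, 10_000
wins = {"SRS": 0, "WBRT": 0}
for _ in range(SIMS):
    # Second-order uncertainty around invented mean (QALY, cost) pairs
    srs = nmb(random.gauss(2.0, 0.2), random.gauss(30_000, 5_000), WTP)
    wbrt = nmb(random.gauss(1.8, 0.2), random.gauss(18_000, 3_000), WTP)
    wins["SRS" if srs > wbrt else "WBRT"] += 1

print({k: v / SIMS for k, v in wins.items()})  # cost-effectiveness acceptability
```

Repeating the tally over a grid of WTP values yields the familiar cost-effectiveness acceptability curve.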
Probabilistic structural analysis methods and applications
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.
1988-01-01
An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
Probabilistic methods for rotordynamics analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
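For a single second-order equation m s² + c s + k, the Routh-Hurwitz condition reduces to all coefficients being positive, so the probability of instability can be estimated by direct Monte Carlo sampling of the uncertain coefficients. (The paper uses fast probability integration and adaptive importance sampling for efficiency; the plain-sampling sketch below, with an invented damping distribution, only illustrates the instability criterion.)

```python
import math, random

def unstable(m: float, c: float, k: float) -> bool:
    """Routh-Hurwitz for m*s^2 + c*s + k: stable iff m, c, k are all positive."""
    return not (m > 0 and c > 0 and k > 0)

random.seed(0)
n = 200_000
mu_c, sd_c = 2.0, 1.0          # uncertain damping coefficient (illustrative)
hits = sum(unstable(1.0, random.gauss(mu_c, sd_c), 4.0) for _ in range(n))
p_mc = hits / n

# Analytic check: instability occurs exactly when c <= 0, i.e. Phi(-mu/sd)
p_exact = 0.5 * (1.0 + math.erf((-mu_c / sd_c) / math.sqrt(2.0)))
print(p_mc, p_exact)           # both close to 0.0228
```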
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
NASA Astrophysics Data System (ADS)
Baer, P.; Mastrandrea, M.
2006-12-01
Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question, what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs.
We suggest further that where there is insufficient evidence to clearly favor one range of probabilistic projections over another, the choice of results on which to base policy must necessarily involve ethical considerations, as they have inevitable consequences for the distribution of risk. In particular, the choice to use a more "optimistic" PDF for climate sensitivity (or other components of the causal chain) leads to the allowance of higher emissions consistent with any specified goal for risk reduction, and thus leads to higher climate impacts, in exchange for lower mitigation costs.
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
It is meaningful for tsunami assessment, as for seismic design, to evaluate phenomena beyond the design basis: even once a design-basis tsunami height is set, the actual tsunami height may exceed it because of uncertainties in tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic-tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by the 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis; however, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed.
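The logic-tree aggregation described above can be sketched as: each branch contributes one hazard curve with a weight, and the weighted mean and weighted fractiles at each tsunami height summarize the epistemic uncertainty. Branch weights, rates, and the exponential curve shape below are invented for illustration:

```python
import math

# Each logic-tree branch: (weight, annual rate, characteristic height in m)
branches = [(0.2, 0.02, 2.0), (0.5, 0.01, 3.0), (0.3, 0.005, 5.0)]

def exceedance(rate: float, h0: float, h: float) -> float:
    """Illustrative branch hazard curve: annual rate of exceeding height h."""
    return rate * math.exp(-h / h0)

def weighted_fractile(values_weights, q: float) -> float:
    """Smallest value whose cumulative branch weight reaches q."""
    cum = 0.0
    for v, w in sorted(values_weights):
        cum += w
        if cum >= q:
            return v
    return max(v for v, _ in values_weights)

for h in (1.0, 5.0, 10.0):
    vals = [(exceedance(r, h0, h), w) for w, r, h0 in branches]
    mean = sum(v * w for v, w in vals)
    print(f"h={h:4.1f} m  mean={mean:.2e}  "
          f"16%={weighted_fractile(vals, 0.16):.2e}  "
          f"84%={weighted_fractile(vals, 0.84):.2e}")
```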
Nonlinear probabilistic finite element models of laminated composite shells
NASA Technical Reports Server (NTRS)
Engelstad, S. P.; Reddy, J. N.
1993-01-01
A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation from a macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data are compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
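The first-order second-moment technique mentioned above propagates input means and variances through a response function via a first-order Taylor expansion: for independent inputs, Var[g] ≈ Σᵢ (∂g/∂xᵢ)² σᵢ². A generic sketch with finite-difference sensitivities (the real procedure differentiates a finite-element response, not a closed-form function; the method is exact for linear responses):

```python
def fosm(g, means, sds, h=1e-6):
    """First-order second-moment estimate of the mean and variance of g(x)
    for independent inputs with the given means and standard deviations."""
    mu = g(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        xp = list(means); xp[i] = m + h
        xm = list(means); xm[i] = m - h
        dgdx = (g(xp) - g(xm)) / (2.0 * h)  # central-difference sensitivity
        var += (dgdx * s) ** 2               # independent-inputs approximation
    return mu, var

# Linear response: FOSM is exact.  g = 3*x + 2*y
mu, var = fosm(lambda x: 3.0 * x[0] + 2.0 * x[1], [1.0, 2.0], [0.5, 1.0])
print(mu, var)
```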
Use of limited data to construct Bayesian networks for probabilistic risk assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, Katrina M.; Swiler, Laura Painton
2013-03-01
Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft causal factors, such as digital instrumentation and control, passive components, aging, common-cause failure, and human errors. Bayesian networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early-career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.
Frommholz, Ingo; Roelleke, Thomas
2016-01-01
Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea for modelling Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague matches of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view of the methods of the HySpirit system that make PDatalog applicable in real-scale applications involving a wide range of requirements typical for data (information) management and analysis.
Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections
NASA Astrophysics Data System (ADS)
Wakazuki, Y.
2015-12-01
A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty of multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from the GCM simulation results were statistically analyzed using singular value decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes equal to their standard deviations, were extracted and added to the ensemble mean of the climatological increments. The analyzed multiple modal increments were used to create multiple modal lateral boundary conditions for the future-climate regional climate model (RCM) simulations by adding them to an objective analysis dataset. This data handling can be regarded as an advanced form of the pseudo-global-warming (PGW) method previously developed by Kimura and Kitoh (2007). The incremental handling of the GCM simulations approximates probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for a given mode were used to estimate the response to that mode's perturbation. For the probabilistic analysis, climatological variables of the RCMs were assumed to respond linearly to the multiple modal perturbations, although nonlinearity was seen for local-scale rainfall. The probability distribution of temperature could be estimated with two-mode perturbation simulations, for which the number of future-climate RCM simulations is five. Local-scale rainfall, on the other hand, needed four-mode simulations, for which the number of RCM simulations is nine. The probabilistic method is expected to be used for regional-scale climate change impact assessment in the future.
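The incremental (pseudo-global-warming) handling can be sketched per variable: take the GCM future-minus-present increments across the ensemble, form the ensemble mean, and build ±1 s.d. perturbed increments to add to a present-climate analysis field. (The paper extracts modes with singular value decomposition; the per-variable sketch below, with invented numbers, skips that step.)

```python
import statistics

# Illustrative ensemble of GCM climatological increments (future - present), K
increments = {"T2m": [2.1, 2.8, 3.4, 2.5, 3.0], "SST": [1.5, 2.2, 1.8, 2.6, 1.9]}
analysis = {"T2m": 288.0, "SST": 290.5}   # present-climate analysis values, K

boundary_conditions = {}
for var, dxs in increments.items():
    mean = statistics.mean(dxs)
    sd = statistics.stdev(dxs)            # spread across the GCM ensemble
    boundary_conditions[var] = {
        "mean": analysis[var] + mean,          # ensemble-mean PGW condition
        "plus": analysis[var] + mean + sd,     # +1 s.d. perturbed condition
        "minus": analysis[var] + mean - sd,    # -1 s.d. perturbed condition
    }

print(boundary_conditions["T2m"])
```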
A framework for the probabilistic analysis of meteotsunamis
Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.
2014-01-01
A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
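The Monte Carlo aggregation step described above can be sketched as: draw a synthetic catalog of pressure disturbances at the Poisson rate, map each to a maximum coastal amplitude, and convert the catalog into an annualized exceedance-rate curve. A toy amplitude function stands in below for the hydrodynamic model, and all rates and parameter distributions are invented for illustration:

```python
import math, random

random.seed(3)
RATE, YEARS = 4.0, 10_000   # squall lines per year, catalog length (illustrative)

def poisson(lam: float) -> int:
    """Knuth's method for drawing a Poisson-distributed event count."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def toy_amplitude(speed: float, dp: float) -> float:
    """Stand-in for the hydrodynamic model: amplitude (m) grows with the
    pressure jump (hPa) and peaks near a resonant disturbance speed (m/s)."""
    return 0.05 * dp / (1.0 + ((speed - 30.0) / 10.0) ** 2)

amps = []
for _ in range(YEARS):
    for _ in range(poisson(RATE)):               # events this catalog year
        speed = random.gauss(25.0, 8.0)          # sampled disturbance speed
        dp = random.lognormvariate(math.log(2.0), 0.5)  # sampled pressure jump
        amps.append(toy_amplitude(speed, dp))

for a in (0.05, 0.10, 0.20):                     # hazard curve points
    rate = sum(amp > a for amp in amps) / YEARS
    print(f"annual rate of max amplitude > {a:.2f} m: {rate:.3f}")
```

Resampling the whole catalog many times, as the paper does, turns these point estimates into mean and quantile hazard curves.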
Bolaños-Díaz, Rafael; Tejada, Romina A; Beltrán, Jessica; Escobedo-Palza, Seimer
2016-01-01
To determine the cost-effectiveness of human papillomavirus (HPV) vaccination plus cervical lesion screening versus screening alone for the prevention of uterine cervical cancer (UCC). This cost-effectiveness evaluation from the perspective of the Ministry of Health employed a Markov model with a 70-year time horizon and three alternatives for UCC prevention (screening alone, screening + bivalent vaccine, and screening + quadrivalent vaccine) in a hypothetical cohort of 10-year-old girls. Our model, which was particularly sensitive to variations in coverage and in the prevalence of persistent infection by oncogenic genotypes not included in the vaccine, revealed that HPV vaccination plus screening is more cost-effective than screening alone, assuming a willingness to pay of S/ 2 000 (USD 1 290.32) per subject. In the deterministic analysis, the bivalent vaccine was marginally more cost-effective than the quadrivalent vaccine (S/ 48 [USD 30.97] vs. S/ 166 [USD 107.10] per quality-adjusted life-year, respectively). However, in the probabilistic analysis, both interventions generated clouds of overlapping points and were thus cost-effective and interchangeable, although the quadrivalent vaccine tended to be more cost-effective. Assuming a willingness to pay of S/ 2 000 (USD 1 290.32), screening plus vaccination was more cost-effective than screening alone. The difference in cost-effectiveness between the two vaccines lacked probabilistic robustness, and therefore the vaccines can be considered interchangeable from a cost-effectiveness perspective.
Rudmik, Luke; Smith, Kristine A; Soler, Zachary M; Schlosser, Rodney J; Smith, Timothy L
2014-10-01
Idiopathic olfactory loss is a common clinical scenario encountered by otolaryngologists. While trying to allocate limited health care resources appropriately, the decision to obtain a magnetic resonance imaging (MRI) scan to investigate for a rare intracranial abnormality can be difficult. To evaluate the cost-effectiveness of ordering routine MRI in patients with idiopathic olfactory loss. We performed a modeling-based economic evaluation with a time horizon of less than 1 year. Patients included in the analysis had idiopathic olfactory loss defined by no preceding viral illness or head trauma and negative findings of a physical examination and nasal endoscopy. Routine MRI vs no-imaging strategies. We developed a decision tree economic model from the societal perspective. Effectiveness, probability, and cost data were obtained from the published literature. Litigation rates and costs related to a missed diagnosis were obtained from the Physicians Insurers Association of America. A univariate threshold analysis and multivariate probabilistic sensitivity analysis were performed to quantify the degree of certainty in the economic conclusion of the reference case. The comparative groups included those who underwent routine MRI of the brain with contrast alone and those who underwent no brain imaging. The primary outcome was the cost per correct diagnosis of idiopathic olfactory loss. The mean (SD) cost for the MRI strategy totaled $2400.00 ($1717.54) and was effective 100% of the time, whereas the mean (SD) cost for the no-imaging strategy totaled $86.61 ($107.40) and was effective 98% of the time. The incremental cost-effectiveness ratio for the MRI strategy compared with the no-imaging strategy was $115 669.50, which is higher than most acceptable willingness-to-pay thresholds. 
The threshold analysis demonstrated that when the probability of having a treatable intracranial disease process reached 7.9%, the incremental cost-effectiveness ratio for MRI vs no imaging was $24 654.38. The probabilistic sensitivity analysis demonstrated that the no-imaging strategy was the cost-effective decision with 81% certainty at a willingness-to-pay threshold of $50 000. This economic evaluation suggests that the most cost-effective decision is to not obtain a routine MRI scan of the brain in patients with idiopathic olfactory loss. Outcomes from this study may be used to counsel patients and aid in the decision-making process.
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and analyzes designs that determine the stochastic properties of MEMS. For CMCs this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin-film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
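The Weibull size effect referenced above is the prediction that characteristic strength decreases with specimen volume as (V₀/V)^(1/m); observing that scaling in bulge-test specimens is what would confirm Weibull-controlled strength. A sketch of the standard two-parameter form (all numbers illustrative):

```python
import math

def weibull_pf(stress: float, scale: float, modulus: float, v_ratio: float = 1.0) -> float:
    """Two-parameter Weibull failure probability for a specimen of volume
    v_ratio * V0 under uniform stress: Pf = 1 - exp(-(V/V0) * (s/s0)^m)."""
    return 1.0 - math.exp(-v_ratio * (stress / scale) ** modulus)

def strength_scale(v_ratio: float, modulus: float) -> float:
    """Size effect: characteristic strength of volume V relative to V0."""
    return v_ratio ** (-1.0 / modulus)

# At the characteristic strength of the reference volume, Pf = 1 - 1/e
print(weibull_pf(300.0, 300.0, 10))

# An 8x larger specimen with m = 10 is roughly 19% weaker at equal Pf
print(strength_scale(8.0, 10))
```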
A lifetime Markov model for the economic evaluation of chronic obstructive pulmonary disease.
Menn, Petra; Leidl, Reiner; Holle, Rolf
2012-09-01
Chronic obstructive pulmonary disease (COPD) is currently the fourth leading cause of death worldwide. It has serious health effects and causes substantial costs for society. The aim of the present paper was to develop a state-of-the-art decision-analytic model of COPD whereby the cost effectiveness of interventions in Germany can be estimated. To demonstrate the applicability of the model, a smoking cessation programme was evaluated against usual care. A seven-stage Markov model (disease stages I to IV according to the GOLD [Global Initiative for Chronic Obstructive Lung Disease] classification, states after lung-volume reduction surgery and lung transplantation, death) was developed to conduct a cost-utility analysis from the societal perspective over a time horizon of 10, 40 and 60 years. Patients entered the cohort model at the age of 45 with mild COPD. Exacerbations were classified into three levels: mild, moderate and severe. Estimation of stage-specific probabilities (for smokers and quitters), utilities and costs was based on German data where possible. Data on effectiveness of the intervention was retrieved from the literature. A discount rate of 3% was applied to costs and effects. Probabilistic sensitivity analysis was used to assess the robustness of the results. The smoking cessation programme was the dominant strategy compared with usual care, and the intervention resulted in an increase in health effects of 0.54 QALYs and a cost reduction of €1115 per patient (year 2007 prices) after 60 years. In the probabilistic analysis, the intervention dominated in about 95% of the simulations. Sensitivity analyses showed that uncertainty primarily originated from data on disease progression and treatment cost in the early stages of disease. The model developed allows the long-term cost effectiveness of interventions to be estimated, and has been adapted to Germany. 
The model suggests that the smoking cessation programme evaluated was more effective than usual care as well as being cost-saving. Most patients had mild or moderate COPD, stages for which parameter uncertainty was found to be high. This raises the need to improve data on the early stages of COPD.
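The cohort mechanics behind such a Markov model can be sketched in a few lines. Everything below is hypothetical placeholder data (three states instead of seven, invented transition probabilities, costs and utilities), not the paper's German inputs; only the 3% discount rate and the 60-year horizon mirror the abstract.

```python
import numpy as np

# Hypothetical three-state Markov cohort model (moderate, severe, dead).
P = np.array([            # annual transition matrix (rows sum to 1)
    [0.90, 0.08, 0.02],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],
])
cost = np.array([1200.0, 4800.0, 0.0])   # annual cost per state (EUR), invented
utility = np.array([0.80, 0.55, 0.0])    # annual utility per state, invented
r = 0.03                                 # discount rate for costs and effects

dist = np.array([1.0, 0.0, 0.0])         # cohort starts in "moderate"
total_cost = total_qaly = 0.0
for year in range(60):                   # 60-year horizon, as in the abstract
    disc = 1.0 / (1.0 + r) ** year
    total_cost += disc * dist @ cost
    total_qaly += disc * dist @ utility
    dist = dist @ P                      # advance the cohort one annual cycle

print(f"discounted cost: {total_cost:.0f} EUR, discounted QALYs: {total_qaly:.2f}")
```

Comparing two such runs (e.g. with quit-adjusted transition probabilities) yields the incremental costs and QALYs reported in a cost-utility analysis.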
Moshyk, A; Martel, M-J; Tahami Monfared, A A; Goeree, R
2016-01-01
New regimens for the treatment of chronic hepatitis C virus (HCV) genotype 3 have demonstrated substantial improvement in sustained virologic response (SVR) compared with existing therapies, but are considerably more expensive. The objective of this study was to evaluate the cost-effectiveness of two novel all-oral, interferon-free regimens for the treatment of patients with HCV genotype 3: daclatasvir plus sofosbuvir (DCV + SOF) and sofosbuvir plus ribavirin (SOF + RBV), from a Canadian health-system perspective. A decision-analytic Markov model was developed to compare the effect of various treatment strategies on the natural history of the disease and their associated costs in treatment-naïve and treatment-experienced patients. Patients were initially distributed across fibrosis stages F0-F4, and may incur disease progression through fibrosis stages and on to end-stage liver disease complications and death; or may achieve SVR. Clinical efficacy, health-related quality-of-life, costs, and transition probabilities were based on published literature. Probabilistic sensitivity analysis was performed to assess parameter uncertainty associated with the analysis. In treatment-naïve patients, the expected quality-adjusted life years (QALYs) for interferon-free regimens were higher for DCV + SOF (12.37) and SOF + RBV (12.48) compared to that of pINF + RBV (11.71) over a lifetime horizon, applying their clinical trial treatment durations. The expected costs were higher for DCV + SOF ($170,371) and SOF + RBV ($194,776) vs the pINF + RBV regimen ($90,905). Compared to pINF + RBV, the incremental cost-effectiveness ratios (ICERs) were $120,671 and $135,398 per QALY for DCV + SOF and SOF + RBV, respectively. In treatment-experienced patients, the DCV + SOF regimen dominated the SOF + RBV regimen. Probabilistic sensitivity analysis indicated a 100% probability that a DCV + SOF regimen was cost saving in treatment-experienced patients.
Daclatasvir plus sofosbuvir is a safe and effective option for the treatment of chronic HCV genotype 3 patients. This regimen could be considered a cost-effective option for patients with HCV genotype-3 infection who have experienced first-line treatment with peg-interferon/ribavirin.
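The ICER arithmetic in the abstract can be reproduced directly; it is just the incremental cost divided by the incremental QALYs. A minimal sketch using the treatment-naïve figures quoted above:

```python
# Incremental cost-effectiveness ratio: extra cost per extra QALY gained.
def icer(cost_new, qaly_new, cost_old, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# DCV + SOF vs pINF + RBV, using the rounded figures from the abstract.
print(round(icer(170371, 12.37, 90905, 11.71)))
```

With these rounded inputs the result is about $120,403 per QALY; the quoted $120,671 differs slightly because the abstract rounds the underlying costs and QALYs before reporting them.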
NASA Astrophysics Data System (ADS)
Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang
2017-05-01
Distributed generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and disturb the normal operation of the power system. Moreover, because wind and solar irradiation are random, the output of DG is random too, which makes the harmonics generated by the DG uncertain. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions: sunny, cloudy, rainy and snowy days. The probability distribution function of the DPV output power in each typical weather condition was obtained via maximum likelihood estimation. The Monte-Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different harmonic orders as well as the total harmonic distortion (THD) under typical weather conditions. The case study was based on the IEEE 33-bus system, and the resulting harmonic voltage content distributions and THD values under typical weather conditions were compared.
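The Monte-Carlo step can be sketched as follows. The Beta-shaped PV output, the assumption that harmonic injection scales linearly with inverter output, and all coefficients are hypothetical stand-ins; a real study would run a harmonic power flow on the IEEE 33-bus system for each sample. Only the THD definition, THD = sqrt(sum of squared harmonic voltages) / fundamental voltage, is standard.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# PV output as a fraction of rating; a Beta shape loosely mimics a cloudy day.
p_pv = rng.beta(2.0, 3.0, size=n)
# Assumed per-unit harmonic voltage magnitudes at orders 5, 7, 11, scaled
# linearly with inverter output (hypothetical linearisation).
v1 = 1.0                                    # fundamental voltage, per unit
v_h = np.outer(p_pv, [0.02, 0.015, 0.008])  # shape (n, 3)
thd = np.sqrt((v_h ** 2).sum(axis=1)) / v1  # THD per Monte-Carlo sample
print(f"mean THD = {thd.mean():.4f}, 95th percentile = {np.percentile(thd, 95):.4f}")
```

Repeating the experiment with a different Beta shape per weather type gives the per-weather THD distributions that the study compares.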
Cost-Utility Analysis of Lurasidone Versus Aripiprazole in Adults with Schizophrenia.
Rajagopalan, Krithika; Trueman, David; Crowe, Lydia; Squirrell, Daniel; Loebel, Antony
2016-07-01
In 2014, lurasidone, an atypical antipsychotic, was approved for the treatment of schizophrenia in adults. It is an alternative treatment option to aripiprazole, and when compared with aripiprazole, lurasidone was associated with improved symptom reduction and reduced risk of weight gain and relapse. We conducted a cost-utility analysis of lurasidone versus aripiprazole from the perspective of healthcare services, using Scotland and Wales as specific case studies. A 10-year Markov model, incorporating a 6-week acute phase and a maintenance phase across three health states (discontinuation, relapse, death) was constructed. Six-week probabilities of discontinuation and adverse events were based on a published independent mixed-treatment comparison; long-term risks of relapse and discontinuation were from an indirect comparison. Costs included drug therapy, relapse, and outpatient, primary and residential care. Costs and benefits were discounted at 3.5 %. Utility estimates were taken from published literature, and cost effectiveness was expressed as total 10-year incremental costs and quality-adjusted life-years (QALYs). Lurasidone yielded a cost saving of £3383 and an improvement of 0.005 QALYs versus aripiprazole, in Scotland. Deterministic sensitivity analysis demonstrated that results were sensitive to relapse rates, while probabilistic sensitivity analysis suggested that lurasidone had the highest expected net benefit at willingness-to-pay thresholds of £20,000-30,000 per QALY. The probability that lurasidone was a cost-effective treatment strategy was approximately 75 % at all willingness-to-pay thresholds, with similar results being obtained for the Welsh analysis. Our analysis suggests that lurasidone would provide an effective, cost-saving alternative for the healthcare service in the treatment of adult patients with schizophrenia.
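The probabilistic step reported above can be sketched with a net-monetary-benefit simulation. The normal distributions below are hypothetical: they are centred on the point estimates quoted in the abstract (a £3383 saving and 0.005 QALYs gained), with spreads chosen purely for illustration so the sketch lands near the ~75% figure; the actual analysis sampled the model's full parameter set.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
d_cost = rng.normal(-3383.0, 5000.0, n)   # incremental cost (negative = saving)
d_qaly = rng.normal(0.005, 0.02, n)       # incremental QALYs

for wtp in (20_000, 30_000):              # willingness-to-pay per QALY (GBP)
    nmb = wtp * d_qaly - d_cost           # net monetary benefit per simulation
    print(f"P(cost-effective at £{wtp}/QALY) = {(nmb > 0).mean():.2f}")
```

The probability of cost-effectiveness at a threshold is simply the fraction of simulations with positive net monetary benefit at that threshold.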
Nonequilibrium Probabilistic Dynamics of the Logistic Map at the Edge of Chaos
NASA Astrophysics Data System (ADS)
Borges, Ernesto P.; Tsallis, Constantino; Añaños, Garín F.; de Oliveira, Paulo Murilo
2002-12-01
We consider nonequilibrium probabilistic dynamics in logistic-like maps x_{t+1} = 1 - a|x_t|^z (z > 1) at their chaos threshold: We first introduce many initial conditions within one among W >> 1 intervals partitioning the phase space and focus on the unique value q_sen < 1 for which the entropic form S_q ≡ (1 - ∑_{i=1}^W p_i^q)/(q - 1) increases linearly with time.
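The numerical experiment described above can be sketched for z = 2. The chaos-threshold parameter a_c ≈ 1.4011551... (the Feigenbaum point) and the entropic index value used below (q_sen ≈ 0.2445, the value reported in the literature for this map) are taken as given; bin count, ensemble size, and iteration count are arbitrary choices for illustration.

```python
import numpy as np

a_c = 1.40115518909   # chaos-threshold parameter for the z = 2 logistic map
W = 1000              # equal cells partitioning the phase space [-1, 1]
N = 100_000           # initial conditions, all placed inside one cell
q = 0.2445            # q_sen for z = 2, as reported in the literature

x = np.random.default_rng(1).uniform(-1.0, -1.0 + 2.0 / W, N)  # one cell
for _ in range(50):
    x = 1.0 - a_c * np.abs(x) ** 2        # x_{t+1} = 1 - a |x_t|^z with z = 2

counts = np.histogram(x, bins=W, range=(-1.0, 1.0))[0]
p = counts[counts > 0] / N                # occupation probabilities per cell
S_q = (1.0 - np.sum(p ** q)) / (q - 1.0)  # Tsallis entropy of the ensemble
print(f"S_q after 50 iterations: {S_q:.3f}")
```

Recording S_q at every iteration (rather than only at t = 50) would exhibit the linear-in-time growth that singles out q_sen.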
Probabilistic direct counterfactual quantum communication
NASA Astrophysics Data System (ADS)
Zhang, Sheng
2017-02-01
It is striking that the quantum Zeno effect can be used to launch a direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality comes at a price. First, the transmission takes much longer than a classical transmission would. Second, the chained-cycle structure makes these protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol could evolve into a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).
The Importance of Calibration in Clinical Psychology.
Lindhiem, Oliver; Petersen, Isaac T; Mentch, Lucas K; Youngstrom, Eric A
2018-02-01
Accuracy has several elements, not all of which have received equal attention in the field of clinical psychology. Calibration, the degree to which a probabilistic estimate of an event reflects the true underlying probability of the event, has largely been neglected in the field of clinical psychology in favor of other components of accuracy such as discrimination (e.g., sensitivity, specificity, area under the receiver operating characteristic curve). Although it is frequently overlooked, calibration is a critical component of accuracy with particular relevance for prognostic models and risk-assessment tools. With advances in personalized medicine and the increasing use of probabilistic (0% to 100%) estimates and predictions in mental health research, the need for careful attention to calibration has become increasingly important.
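A calibration check of the kind described above amounts to binning probabilistic predictions and comparing each bin's mean predicted probability with the observed event frequency. The decile binning and the simulated data below are illustrative choices, not a prescription from the article; the simulated predictions are perfectly calibrated by construction, so predicted and observed values should agree up to sampling noise.

```python
import numpy as np

def calibration_table(p_pred, y, n_bins=10):
    """Per-bin (mean predicted probability, observed frequency, count)."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (p_pred >= lo) & (p_pred < hi)
        if mask.any():
            rows.append((p_pred[mask].mean(), y[mask].mean(), int(mask.sum())))
    return rows

rng = np.random.default_rng(7)
p = rng.uniform(0, 1, 5000)                      # probabilistic predictions
y = (rng.uniform(0, 1, 5000) < p).astype(int)    # outcomes, calibrated by design
for pred, obs, n in calibration_table(p, y):
    print(f"predicted {pred:.2f}  observed {obs:.2f}  (n={n})")
```

A miscalibrated model (e.g. one that is systematically overconfident) would show observed frequencies pulled toward the base rate relative to the predicted values, even if discrimination metrics such as AUC remained high.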
Influential input classification in probabilistic multimedia models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddalena, Randy L.; McKone, Thomas E.; Hsieh, Dennis P.H.
1999-05-01
Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the uncertainty and/or variability associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution. The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently allocate resources for constructing distributions, one should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model and an even smaller set determines the shape of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site specific models and improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.
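The screening idea, ranking inputs by how much of the outcome variance they account for, can be illustrated with a toy model. This is not the authors' novel method; it is the simplest common proxy (squared sample correlation between each input and the Monte Carlo outcome), applied to a hypothetical three-input "model" in which one input dominates.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
x1 = rng.lognormal(0.0, 1.0, n)   # high-variance, dominant input
x2 = rng.normal(5.0, 0.5, n)      # mildly influential input
x3 = rng.uniform(0.0, 1.0, n)     # nearly irrelevant input
y = x1 * x2 + 0.01 * x3           # toy fate "model" outcome

for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    r2 = np.corrcoef(x, y)[0, 1] ** 2   # crude share-of-variance measure
    print(f"{name}: r^2 = {r2:.4f}")
```

In this setup x1 accounts for nearly all of the outcome variance, so resources for careful distribution construction would be spent on x1 first; x3 could safely be fixed at a point value.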
Analysis of sensitivity to different parameterization schemes for a subtropical cyclone
NASA Astrophysics Data System (ADS)
Quitián-Hernández, L.; Fernández-González, S.; González-Alemán, J. J.; Valero, F.; Martín, M. L.
2018-05-01
A sensitivity analysis to diverse WRF model physical parameterization schemes is carried out during the lifecycle of a subtropical cyclone (STC). STCs are low-pressure systems that share tropical and extratropical characteristics, with hybrid thermal structures. In October 2014, a STC made landfall in the Canary Islands, causing widespread damage from strong winds and precipitation there. The system began to develop on October 18 and its effects lasted until October 21. Accurate simulation of this type of cyclone continues to be a major challenge because of its rapid intensification and unique characteristics. In the present study, several numerical simulations were performed using the WRF model to perform a sensitivity analysis of its various parameterization schemes for the development and intensification of the STC. The combination of parameterization schemes that best simulated this type of phenomenon was thereby determined. In particular, the parameterization combinations that included the Tiedtke cumulus schemes had the most positive effects on model results. Moreover, concerning STC track validation, optimal results were attained when the STC was fully formed and all convective processes stabilized. Furthermore, to identify the parameterization schemes that optimally categorize STC structure, a verification using Cyclone Phase Space was performed. Consequently, the combinations of parameterizations including the Tiedtke cumulus schemes were again the best at categorizing the cyclone's subtropical structure. For strength validation, related atmospheric variables such as wind speed and precipitable water were analyzed. Finally, the effects of using a deterministic or probabilistic approach in simulating intense convective phenomena were evaluated.
The Nature and Variability of Ensemble Sensitivity Fields that Diagnose Severe Convection
NASA Astrophysics Data System (ADS)
Ancell, B. C.
2017-12-01
Ensemble sensitivity analysis (ESA) is a statistical technique that uses information from an ensemble of forecasts to reveal relationships between chosen forecast metrics and the larger atmospheric state at various forecast times. A number of studies have employed ESA from the perspectives of dynamical interpretation, observation targeting, and ensemble subsetting toward improved probabilistic prediction of high-impact events, mostly at synoptic scales. We tested ESA using convective forecast metrics at the 2016 HWT Spring Forecast Experiment to understand the utility of convective ensemble sensitivity fields in improving forecasts of severe convection and its individual hazards. The main purpose of this evaluation was to understand the temporal coherence and general characteristics of convective sensitivity fields toward future use in improving ensemble predictability within an operational framework. The magnitude and coverage of simulated reflectivity, updraft helicity, and surface wind speed were used as response functions, and the sensitivity of these functions to winds, temperatures, geopotential heights, and dew points at different atmospheric levels and at different forecast times were evaluated on a daily basis throughout the HWT Spring Forecast experiment. These sensitivities were calculated within the Texas Tech real-time ensemble system, which possesses 42 members that run twice daily to 48-hr forecast time. Here we summarize both the findings regarding the nature of the sensitivity fields and the evaluation of the participants that reflects their opinions of the utility of operational ESA. The future direction of ESA for operational use will also be discussed.
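At its core, ESA regresses a scalar response function J on an earlier-time state variable x across the ensemble members: the sensitivity is the slope cov(J, x)/var(x). The sketch below uses synthetic data standing in for a 42-member ensemble (the size of the Texas Tech system mentioned above); the variable names and numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(11)
M = 42                                  # ensemble members
x = rng.normal(280.0, 2.0, M)           # e.g. early-time 2-m temperature (K)
# Response function, e.g. later reflectivity coverage; built here with a known
# slope of 5 plus noise, so the estimate should recover roughly that value.
J = 5.0 * (x - 280.0) + rng.normal(0.0, 3.0, M)

sensitivity = np.cov(J, x)[0, 1] / np.var(x, ddof=1)   # dJ/dx across members
print(f"dJ/dx ~ {sensitivity:.2f} per unit of x")
```

In practice this slope is computed at every grid point and level, producing the spatial sensitivity fields whose coherence and character the experiment evaluated.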
A new discriminative kernel from probabilistic models.
Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert
2002-10-01
Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel, derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.
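The Fisher kernel construction referenced above can be made concrete for the smallest possible case: a Gaussian with unknown mean mu and fixed sigma. The Fisher score is U(x) = d/dmu log p(x|mu) = (x - mu)/sigma^2, the Fisher information for the mean is I = 1/sigma^2, and the kernel is K(x, x') = U(x) I^{-1} U(x'). The parameter values are arbitrary.

```python
mu, sigma = 0.0, 1.5   # illustrative model parameters

def fisher_score(x):
    # Gradient of the log-likelihood with respect to mu.
    return (x - mu) / sigma**2

def fisher_kernel(x, x_prime):
    info = 1.0 / sigma**2                      # Fisher information for mu
    return fisher_score(x) * (1.0 / info) * fisher_score(x_prime)

# For this one-parameter model the kernel reduces to (x - mu)(x' - mu) / sigma^2.
print(fisher_kernel(2.0, 3.0))
```

The TOP kernel proposed in the paper replaces the marginal log-likelihood gradient with tangent vectors of the posterior log-odds, which is the discriminative twist; the plumbing (score vector, metric, inner product) is the same.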
Probabilistic assessment of uncertain adaptive hybrid composites
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1994-01-01
Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.
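The probabilistic-buckling idea can be sketched with a deliberately simplified stand-in: Euler buckling of a pinned column, P_cr = pi^2 E I / L^2, with random scatter in the modulus E. The distribution parameters and geometry below are hypothetical; the actual assessment used intraply hybrid composite mechanics and probabilistic structural analysis codes, not a closed-form column formula.

```python
import numpy as np

rng = np.random.default_rng(5)
E = rng.normal(70e9, 3.5e9, 50_000)   # modulus of elasticity (Pa), 5% CoV, assumed
I = 2.0e-8                            # second moment of area (m^4), assumed
L = 1.2                               # column length (m), assumed

P_cr = np.pi**2 * E * I / L**2        # Euler critical buckling load per sample
print(f"mean = {P_cr.mean():.0f} N, 1st percentile = {np.percentile(P_cr, 1):.0f} N")
```

The empirical distribution of P_cr is the "computationally simulated probability density function of the structural behavior" in miniature; low percentiles of it are what a reliability-based design would compare against applied loads.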
Zhao, Zhenguo; Shi, Wenbo
2014-01-01
Probabilistic signature scheme has been widely used in modern electronic commerce since it could provide integrity, authenticity, and nonrepudiation. Recently, Wu and Lin proposed a novel probabilistic signature (PS) scheme using the bilinear square Diffie-Hellman (BSDH) problem. They also extended it to a universal designated verifier signature (UDVS) scheme. In this paper, we analyze the security of Wu et al.'s PS scheme and UDVS scheme. Through concrete attacks, we demonstrate both of their schemes are not unforgeable. The security analysis shows that their schemes are not suitable for practical applications.