Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
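To make the procedure concrete, here is a minimal sketch of a bootstrap-based probabilistic sensitivity analysis of the kind described above: patient-level cost and effect data are resampled with replacement and the incremental cost-effectiveness ratio (ICER) is recomputed on each draw. The data, distributions, and sample sizes are invented for illustration and are not the inputs of the H. pylori model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical patient-level costs and effects for two strategies
# (stand-ins for the trial data that would inform the model).
cost_a, eff_a = rng.normal(1200, 300, 200), rng.normal(0.70, 0.10, 200)
cost_b, eff_b = rng.normal(1500, 300, 200), rng.normal(0.78, 0.10, 200)

def bootstrap_icer(n_reps=5000):
    """Resample patients with replacement; recompute the ICER each time."""
    n = len(cost_a)
    icers = np.empty(n_reps)
    for i in range(n_reps):
        ia = rng.integers(0, n, n)  # bootstrap indices, strategy A
        ib = rng.integers(0, n, n)  # bootstrap indices, strategy B
        d_cost = cost_b[ib].mean() - cost_a[ia].mean()
        d_eff = eff_b[ib].mean() - eff_a[ia].mean()
        icers[i] = d_cost / d_eff
    return icers

print(np.percentile(bootstrap_icer(), [2.5, 50, 97.5]))  # ICER uncertainty interval
```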
A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two part paper. This Part 2 paper discusses sensitivity and uncertainty analyses conducted to assess the key m...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peace, Gerald; Goering, Timothy James; Miller, Mark Laverne
2007-01-01
A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.
Briggs, Andrew H; Ades, A E; Price, Martin J
2003-01-01
In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes probabilistic sensitivity analysis, and a method is therefore required to place probability distributions over multiple branches in a way that appropriately represents uncertainty while satisfying the requirement that mutually exclusive event probabilities sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
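A minimal sketch of the approach, assuming hypothetical transition counts for a three-state Markov model (note the zero count, handled here with a uniform Dirichlet prior):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical observed transition counts; row i gives exits from state i.
counts = np.array([[85, 12, 0],
                   [0, 60, 15],
                   [0, 0, 1]])

def sample_transition_matrix(counts, prior=1.0):
    """Draw each row from its posterior Dirichlet; the prior handles zero counts."""
    return np.vstack([rng.dirichlet(row + prior) for row in counts])

P = sample_transition_matrix(counts)
print(P, P.sum(axis=1))  # each row sums to 1 by construction
```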
PCEMCAN - Probabilistic Ceramic Matrix Composites Analyzer: User's Guide, Version 1.0
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Mital, Subodh K.; Murthy, Pappu L. N.
1998-01-01
PCEMCAN (Probabilistic CEramic Matrix Composites ANalyzer) is an integrated computer code developed at NASA Lewis Research Center that simulates uncertainties associated with the constituent properties, manufacturing process, and geometric parameters of fiber-reinforced ceramic matrix composites and quantifies their random thermomechanical behavior. The PCEMCAN code can perform deterministic as well as probabilistic analyses to predict thermomechanical properties. This user's guide details the step-by-step procedure to create an input file and to update or modify the material properties database required to run the PCEMCAN computer code. An overview of the geometric conventions, micromechanical unit cell, nonlinear constitutive relationship, and probabilistic simulation methodology is also provided in the manual. Fast probability integration as well as Monte Carlo simulation methods are available for the uncertainty simulation. The various options available in the code to simulate probabilistic material properties and quantify the sensitivity of the primitive random variables are described. Deterministic as well as probabilistic results are illustrated using demonstration problems. For a detailed theoretical description of the deterministic and probabilistic analyses, the user is referred to the companion documents "Computational Simulation of Continuous Fiber-Reinforced Ceramic Matrix Composite Behavior," NASA TP-3602, 1996, and "Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites," NASA TM-4766, June 1997.
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic because of the complexity and large size of the model, meaning that each analysis produces a single-valued result for power capability. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
The objective of this report is to summarize the deterministic and probabilistic structural evaluation of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures made with advanced materials. The probabilistic evaluation is performed using the Integrated Probabilistic Assessment of Composite Structures computer code (IPACS). The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material properties and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluations of the two structures. The results show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to the detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) prediction of the failure load and the buckling load, (2) coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstration that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.
This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC, by Brookhaven National Laboratory. The DCPRA is a full-scope Level I effort, and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Toll, J.; Cothern, K.
1995-12-31
The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977-1991, in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. They considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed the authors to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of the Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
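The rank correlation step can be sketched as follows; the five-parameter toy model below merely stands in for PCHEPM and is purely illustrative.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Toy model: the output depends strongly on parameters 0 and 2, weakly on 4.
n = 2000
X = rng.uniform(0.5, 1.5, size=(n, 5))
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] ** 2 + 0.1 * X[:, 4] + rng.normal(0, 0.05, n)

# Spearman rank correlation of each parameter with the prediction flags
# the dominant contributors to output uncertainty.
for j in range(X.shape[1]):
    rho, _ = spearmanr(X[:, j], y)
    print(f"parameter {j}: rho = {rho:+.2f}")
```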
Andronis, L; Barton, P; Bryan, S
2009-06-01
To determine how good practice is defined in sensitivity analysis in general, and in probabilistic sensitivity analysis (PSA) in particular; to assess the extent to which such practice has been adhered to in the independent economic evaluations undertaken for the National Institute for Health and Clinical Excellence (NICE) over recent years; and to establish what policy impact sensitivity analysis has in the context of NICE, what views policy-makers hold on sensitivity analysis and uncertainty, and what use is made of sensitivity analysis in policy decision-making. Three major electronic databases, MEDLINE, EMBASE and the NHS Economic Evaluation Database, were searched from inception to February 2008. The meaning of 'good practice' in the broad area of sensitivity analysis was explored through a review of the literature. An audit was undertaken of the 15 most recent NICE multiple technology appraisal judgements and their related reports to assess how sensitivity analysis has been undertaken by independent academic teams for NICE. A review of the policy and guidance documents issued by NICE aimed to assess the policy impact of the sensitivity analysis and the PSA in particular. Qualitative interview data from NICE Technology Appraisal Committee members, collected as part of an earlier study, were also analysed to assess the value attached to the sensitivity analysis components of the economic analyses conducted for NICE. All forms of sensitivity analysis, notably both deterministic and probabilistic approaches, have their supporters and their detractors. Practice in univariate sensitivity analysis is highly variable, with considerable lack of clarity in the methods used and the basis of the ranges employed. In PSA, there is a high level of variability in the form of distribution used for similar parameters, and the justification for such choices is rarely given. Virtually all analyses failed to consider correlations within the PSA, and this is an area of concern. Uncertainty is considered explicitly in the process of arriving at a decision by the NICE Technology Appraisal Committee, and a correlation between high levels of uncertainty and negative decisions was indicated. The findings suggest considerable value in deterministic sensitivity analysis. Such analyses serve to highlight which model parameters are critical in driving a decision. Strong support was expressed for PSA, principally because it provides an indication of the parameter uncertainty around the incremental cost-effectiveness ratio. The review and the policy impact assessment focused exclusively on documentary evidence, excluding other sources that might have revealed further insights on this issue. In seeking to address parameter uncertainty, both deterministic and probabilistic sensitivity analyses should be used. It is evident that some cost-effectiveness work, especially the sensitivity analysis components, is a challenge to make accessible to those making decisions. This speaks to the training agenda for those sitting on such decision-making bodies, and to the importance of clear presentation of analyses by the academic community.
Bardach, Ariel Esteban; Garay, Osvaldo Ulises; Calderón, María; Pichón-Riviére, Andrés; Augustovski, Federico; Martí, Sebastián García; Cortiñas, Paula; Gonzalez, Marino; Naranjo, Laura T; Gomez, Jorge Alberto; Caporale, Joaquín Enzo
2017-02-02
Cervical cancer (CC) and genital warts (GW) are a significant public health issue in Venezuela. Our objective was to assess the cost-effectiveness of the two available vaccines against Human Papillomavirus (HPV), bivalent and quadrivalent, in Venezuelan girls in order to inform decision-makers. A previously published Markov cohort model, informed by the best available evidence, was adapted to the Venezuelan context to evaluate the effects of vaccination on health and healthcare costs from the perspective of the healthcare payer in a cohort of 264,489 11-year-old girls. Costs and quality-adjusted life years (QALYs) were discounted at 5%. Eight scenarios were analyzed to depict the cost-effectiveness under alternative vaccine prices, exchange rates and dosing schemes. Deterministic and probabilistic sensitivity analyses were performed. Compared to screening only, the bivalent and quadrivalent vaccines were cost-saving in all scenarios, avoiding 2,310 and 2,143 deaths and 4,781 and 4,431 CC cases respectively, plus up to 18,459 GW cases for the quadrivalent vaccine, and gaining 4,486 and 4,395 discounted QALYs respectively. For both vaccines, the main determinants of variation in the incremental cost-effectiveness ratio after running deterministic and probabilistic sensitivity analyses were transition probabilities, vaccine and cancer-treatment costs, and the distribution of HPV 16 and 18 in CC cases. When comparing vaccines, neither was consistently more cost-effective than the other. In the sensitivity analyses for these comparisons, the main determinants were GW incidence, the level of cross-protection and, for some scenarios, vaccine costs. Immunization with the bivalent or quadrivalent HPV vaccine was shown to be cost-saving or cost-effective in Venezuela, falling below the threshold of one Gross Domestic Product (GDP) per capita (104,404 VEF) per QALY gained. Deterministic and probabilistic sensitivity analyses confirmed the robustness of these results.
Cost-effectiveness of prucalopride in the treatment of chronic constipation in the Netherlands
Nuijten, Mark J. C.; Dubois, Dominique J.; Joseph, Alain; Annemans, Lieven
2015-01-01
Objective: To assess the cost-effectiveness of prucalopride vs. continued laxative treatment for chronic constipation in patients in the Netherlands in whom laxatives have failed to provide adequate relief. Methods: A Markov model was developed to estimate the cost-effectiveness of prucalopride in patients with chronic constipation receiving standard laxative treatment, from the perspective of Dutch payers in 2011. Data sources included published prucalopride clinical trials, published Dutch price/tariff lists, and national population statistics. The model simulated the clinical and economic outcomes associated with prucalopride vs. standard treatment and had a cycle length of 1 month and a follow-up time of 1 year. Response to treatment was defined as the proportion of patients who achieved "normal bowel function". One-way and probabilistic sensitivity analyses were conducted to test the robustness of the base case. Results: In the base case analysis, the cost of prucalopride relative to continued laxative treatment was €9,015 per quality-adjusted life-year (QALY). Extensive sensitivity analyses and scenario analyses confirmed that the base case cost-effectiveness estimate was robust. One-way sensitivity analyses showed that the model was most sensitive to the response to prucalopride; incremental cost-effectiveness ratios ranged from €6,475 to €15,380 per QALY. Probabilistic sensitivity analyses indicated a greater than 80% probability that prucalopride would be cost-effective compared with continued standard treatment, assuming a willingness-to-pay threshold of €20,000 per QALY from a Dutch societal perspective. A scenario analysis performed for women only resulted in a cost-effectiveness ratio of €7,773 per QALY. Conclusion: Prucalopride was cost-effective in a Dutch patient population, as well as in a women-only subgroup, who had chronic constipation and who obtained inadequate relief from laxatives. PMID:25926794
Hofer, Florian; Achelrod, Dmitrij; Stargardt, Tom
2016-12-01
Chronic obstructive pulmonary disease (COPD) poses major challenges for health care systems. Previous studies suggest that telemonitoring could be effective in preventing hospitalisations and hence reduce costs. The aim was to evaluate whether telemonitoring interventions for COPD are cost-effective from the perspective of German statutory sickness funds. A cost-utility analysis was conducted using a combination of a Markov model and a decision tree. Telemonitoring as an add-on to standard treatment was compared with standard treatment alone. The model consisted of four transition stages to account for COPD severity, plus a terminal stage for death. Within each cycle, the frequency of exacerbations was calculated, along with costs (in 2015 euros) and quality-adjusted life years (QALYs) for each stage. Values for input parameters were taken from the literature. Deterministic and probabilistic sensitivity analyses were conducted. In the base case, telemonitoring led to an increase in incremental costs (€866 per patient) but also in incremental QALYs (0.05 per patient). The incremental cost-effectiveness ratio (ICER) was thus €17,410 per QALY gained. A deterministic sensitivity analysis showed that the hospitalisation rate and the cost of telemonitoring equipment greatly affected results. The probabilistic ICER averaged €34,432 per QALY (95% confidence interval €12,161 to €56,703). We provide evidence that telemonitoring may be cost-effective in Germany from a payer's point of view. This holds even after deterministic and probabilistic sensitivity analyses.
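As a sketch of how such probabilistic results can be summarized, the snippet below draws incremental costs and QALYs and computes the ICER and the probability of cost-effectiveness at a threshold; the normal distributions and the threshold are assumptions loosely anchored to the base-case figures above, not the study's actual model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed PSA draws of incremental cost (EUR) and incremental QALYs.
d_cost = rng.normal(866, 250, 10_000)
d_qaly = rng.normal(0.05, 0.02, 10_000)

icer = d_cost.mean() / d_qaly.mean()
wtp = 35_000  # assumed willingness-to-pay threshold, EUR per QALY
prob_ce = np.mean(wtp * d_qaly - d_cost > 0)  # P(net monetary benefit > 0)
print(f"ICER ~ {icer:,.0f} EUR/QALY; P(cost-effective) = {prob_ce:.2f}")
```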
Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design
NASA Technical Reports Server (NTRS)
Kuguoglu, Latife; Ludwiczak, Damian
2006-01-01
The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) is evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel, letting the sensitivity analysis information identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed, assessing the panel's structural performance in the presence of uncertainties in the loading, fabrication process variables, and material properties. Stress and displacement contours from the deterministic structural analysis at mean probability were computed and the results are presented, followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel structural performance. Based on uncertainties in material properties, structural geometry and loading, the results of the displacement and stress analysis are used as an input file for the probabilistic analysis of the panel. The sensitivity of the structural responses (maximum displacement, maximum tensile and compressive stresses of the facesheet in the x and y directions, and maximum von Mises stresses of the tube) to the loading and design variables is determined under the boundary condition in which all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel are identified.
Nelson, S D; Nelson, R E; Cannon, G W; Lawrence, P; Battistone, M J; Grotzke, M; Rosenblum, Y; LaFleur, J
2014-12-01
This is a cost-effectiveness analysis of training rural providers to identify and treat osteoporosis. Results showed a slight cost savings, an increase in life years, an increase in treatment rates, and a decrease in fracture incidence. However, the results were sensitive to small differences in effectiveness, being cost-effective in 70% of simulations during probabilistic sensitivity analysis. We evaluated the cost-effectiveness of training rural providers to identify and treat veterans at risk for fragility fractures, relative to referring these patients to an urban medical center for specialist care. The model evaluated the impact of training on patient life years, quality-adjusted life years (QALYs), treatment rates, fracture incidence, and costs from the perspective of the Department of Veterans Affairs. We constructed a Markov microsimulation model to compare costs and outcomes of a hypothetical cohort of veterans seen by rural providers. Parameter estimates were derived from previously published studies, and we conducted one-way and probabilistic sensitivity analyses on the parameter inputs. Base-case analysis showed that training resulted in no additional costs and an extra 0.083 life years (0.054 QALYs). Our model projected that, as a result of training, more patients with osteoporosis would receive treatment (81.3 vs. 12.2%), and all patients would have a lower incidence of fractures per 1,000 patient-years (hip, 1.628 vs. 1.913; clinical vertebral, 0.566 vs. 1.037) when seen by a trained provider compared to an untrained provider. Results remained consistent in one-way sensitivity analysis, and in probabilistic sensitivity analyses training rural providers was cost-effective (less than $50,000/QALY) in 70% of the simulations. Training rural providers to identify and treat veterans at risk for fragility fractures has the potential to be cost-effective, but the results are sensitive to small differences in effectiveness. It appears that provider education alone is not enough to make a significant difference in fragility fracture rates among veterans.
Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.
2013-01-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559
Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.
2003-01-01
Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and in industry. At present, however, these analyses are used for turbomachinery design with uncertainties accounted for by safety factors. This approach may lead to overly conservative designs, thereby reducing the potential for designing higher-efficiency engines. Integrating deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe, reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are probability density functions for the response properties. The probabilistic sensitivities of the response variables to uncertainty in the primitive variables are obtained as a byproduct of the FPI technique. The combined aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Of the 11 design parameters, 6 are considered to have probabilistic variation: space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow at Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density functions of the predicted aerodynamic damping and frequency for flutter and of the response amplitudes for forced response.
Time to angiographic reperfusion in acute ischemic stroke: decision analysis.
Vagal, Achala S; Khatri, Pooja; Broderick, Joseph P; Tomsick, Thomas A; Yeatts, Sharon D; Eckman, Mark H
2014-12-01
Our objective was to use decision analytic modeling to compare 2 treatment strategies, intravenous recombinant tissue-type plasminogen activator (r-tPA) alone versus combined intravenous r-tPA/endovascular therapy, in a subgroup of patients with large vessel (internal carotid artery terminus, M1, and M2) occlusion, based on varying times to angiographic reperfusion and varying rates of reperfusion. We developed a decision model using Interventional Management of Stroke (IMS) III trial data and a comprehensive literature review. We performed 1-way sensitivity analyses for time to reperfusion and 2-way sensitivity analyses for time to reperfusion and rate of reperfusion success. We also performed probabilistic sensitivity analyses to address uncertainty in total time to reperfusion for the endovascular approach. In the base case, the endovascular approach yielded a higher expected utility (6.38 quality-adjusted life years) than the intravenous-only arm (5.42 quality-adjusted life years). One-way sensitivity analyses demonstrated superiority of endovascular treatment over the intravenous-only arm unless time to reperfusion exceeded 347 minutes. Two-way sensitivity analysis demonstrated that endovascular treatment was preferred when the probability of reperfusion is high and time to reperfusion is short. Probabilistic sensitivity results demonstrated an average gain for endovascular therapy of 0.76 quality-adjusted life years (SD 0.82) compared with the intravenous-only approach. In our post hoc model, with its underlying limitations, endovascular therapy after intravenous r-tPA is the preferred treatment as compared with intravenous r-tPA alone. However, if time to reperfusion exceeds 347 minutes, intravenous r-tPA alone is the recommended strategy. This warrants validation in a randomized, prospective trial among patients with large vessel occlusions.
Aldridge, Robert W; Shaji, Kunju; Hayward, Andrew C; Abubakar, Ibrahim
2015-01-01
The Enhanced Matching System (EMS) is a probabilistic record linkage program developed by the tuberculosis section at Public Health England to match data for individuals across two datasets. This paper outlines how EMS works and investigates its accuracy for linkage across public health datasets. EMS is a configurable Microsoft SQL Server database program. To examine the accuracy of EMS, two public health databases were matched using National Health Service (NHS) numbers as a gold standard unique identifier. Probabilistic linkage was then performed on the same two datasets without inclusion of NHS number. Sensitivity analyses were carried out to examine the effect of varying matching process parameters. Exact matching using NHS number between two datasets (containing 5931 and 1759 records) identified 1071 matched pairs. EMS probabilistic linkage identified 1068 record pairs. The sensitivity of probabilistic linkage was calculated as 99.5% (95%CI: 98.9, 99.8), specificity 100.0% (95%CI: 99.9, 100.0), positive predictive value 99.8% (95%CI: 99.3, 100.0), and negative predictive value 99.9% (95%CI: 99.8, 100.0). Probabilistic matching was most accurate when including address variables and using the automatically generated threshold for determining links with manual review. With the establishment of national electronic datasets across health and social care, EMS enables previously unanswerable research questions to be tackled with confidence in the accuracy of the linkage process. In scenarios where a small sample is being matched into a very large database (such as national records of hospital attendance) then, compared to results presented in this analysis, the positive predictive value or sensitivity may drop according to the prevalence of matches between databases. Despite this possible limitation, probabilistic linkage has great potential to be used where exact matching using a common identifier is not possible, including in low-income settings, and for vulnerable groups such as homeless populations, where the absence of unique identifiers and lower data quality has historically hindered the ability to identify individuals across datasets.
Sensitivity Analysis in Sequential Decision Models.
Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet
2017-02-01
Sequential decision problems, commonly solved using Markov decision processes (MDPs), are frequently encountered in medical decision making. Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems, because this would require evaluating all possible decision sequences, typically on the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy, considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in that policy for a given willingness-to-pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
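The acceptability constructions described above extend the usual cost-effectiveness acceptability curve; a minimal two-policy sketch with invented PSA samples:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented PSA samples of (cost, effectiveness) for two policies.
cost = {"A": rng.normal(10_000, 2_000, 5_000), "B": rng.normal(14_000, 2_500, 5_000)}
eff = {"A": rng.normal(4.0, 0.5, 5_000), "B": rng.normal(4.3, 0.5, 5_000)}

# At each willingness-to-pay, report the share of PSA draws in which
# policy B has the higher net monetary benefit.
for wtp in (10_000, 30_000, 50_000, 70_000):
    nmb = {k: wtp * eff[k] - cost[k] for k in cost}
    print(f"WTP {wtp:>6,}: P(B optimal) = {np.mean(nmb['B'] > nmb['A']):.2f}")
```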
Linear regression metamodeling as a tool to summarize and present simulation model results.
Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M
2013-10-01
Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
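A minimal sketch of the metamodeling idea: regress a simulated PSA outcome on standardized inputs, so the intercept recovers the base-case outcome and the coefficients rank parameter influence. The three-parameter model below is illustrative, not the cancer cure model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated PSA: 10,000 draws of three standardized inputs and an outcome.
n = 10_000
Z = rng.normal(size=(n, 3))
y = 500 + 120 * Z[:, 0] - 40 * Z[:, 1] + 5 * Z[:, 2] + rng.normal(0, 20, n)

# Ordinary least squares: intercept ~ base-case outcome; coefficient
# magnitudes rank the contribution of each parameter's uncertainty.
X = np.column_stack([np.ones(n), Z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("base case:", round(beta[0], 1), "| coefficients:", np.round(beta[1:], 1))
```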
Probabilistic evaluation of fuselage-type composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1992-01-01
A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stress/strain etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables in the constituent, ply, laminate and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequency, displacement, strain and stress are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
Probabilistic failure analysis of bone using a finite element model of mineral-collagen composites.
Dong, X Neil; Guda, Teja; Millwater, Harry R; Wang, Xiaodu
2009-02-09
Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that microdamage progression would be along the plane of the initial defect when debonding at mineral-collagen interfaces was either absent or limited to the vicinity of the defect; in this case, the formation of a linear microcrack would be facilitated. However, microdamage progression would be scattered away from the initial defect plane if interfacial debonding took place at a large scale, suggesting the possible formation of diffuse damage. In addition to interfacial debonding, the sensitivity analyses indicated that microdamage progression was also dependent on the other material and ultrastructural properties of bone. The intensity of stress concentration accompanying microdamage progression was more sensitive to the elastic modulus of the mineral phase and the nonlinearity of the collagen phase, whereas the scattering of failure location was largely dependent on the mineral-to-collagen ratio and the nonlinearity of the collagen phase. The findings of this study may help in understanding the post-yield behavior of bone at the ultrastructural level and shed light on the underlying mechanism of bone fractures.
Ranking of sabotage/tampering avoidance technology alternatives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, W.B.; Tabatabai, A.S.; Powers, T.B.
1986-01-01
Pacific Northwest Laboratory conducted a study to evaluate alternatives to the design and operation of nuclear power plants, emphasizing a reduction of their vulnerability to sabotage. Estimates of core melt accident frequency during normal operations and from sabotage/tampering events were used to rank the alternatives. Core melt frequency for normal operations was estimated using sensitivity analysis of results of probabilistic risk assessments. Core melt frequency for sabotage/tampering was estimated by developing a model based on probabilistic risk analyses, historic data, engineering judgment, and safeguards analyses of plant locations where core melt events could be initiated. Results indicate the most effective alternatives focus on large areas of the plant, increase safety system redundancy, and reduce reliance on single locations for mitigation of transients. Less effective options focus on specific areas of the plant, reduce reliance on some plant areas for safe shutdown, and focus on less vulnerable targets.
Dalziel, Kim; Round, Ali; Garside, Ruth; Stein, Ken
2005-01-01
To evaluate the cost utility of imatinib compared with interferon (IFN)-alpha or hydroxycarbamide (hydroxyurea) for first-line treatment of chronic myeloid leukaemia. A cost-utility (Markov) model within the setting of the UK NHS and viewed from a health system perspective was adopted. Transition probabilities and relative risks were estimated from published literature. Costs of drug treatment, outpatient care, bone marrow biopsies, radiography, blood transfusions and inpatient care were obtained from the British National Formulary and local hospital databases. Costs (£, year 2001-03 values) were discounted at 6%. Quality-of-life (QOL) data were obtained from the published literature and discounted at 1.5%. The main outcome measure was cost per QALY gained. Extensive one-way sensitivity analyses were performed along with probabilistic (stochastic) analysis. The incremental cost-effectiveness ratio (ICER) of imatinib, compared with IFNalpha, was £26,180 per QALY gained (one-way sensitivity analyses ranged from £19,449 to £51,870) and compared with hydroxycarbamide was £86,934 per QALY (one-way sensitivity analyses ranged from £69,701 to £147,095) [£1 = $US1.691 = €1.535 as at 31 December 2002]. Based on the probabilistic sensitivity analysis, 50% of the ICERs for imatinib, compared with IFNalpha, fell below a threshold of approximately £31,000 per QALY gained. Fifty percent of ICERs for imatinib, compared with hydroxycarbamide, fell below approximately £95,000 per QALY gained. This model suggests, given its underlying data and assumptions, that imatinib may be moderately cost effective when compared with IFNalpha but considerably less cost effective when compared with hydroxycarbamide. There are, however, many uncertainties due to the lack of long-term data.
Gandhoke, Gurpreet S; Pease, Matthew; Smith, Kenneth J; Sekula, Raymond F
2017-09-01
To perform a cost-minimization study comparing the supraorbital and endoscopic endonasal (EEA) approaches, with or without craniotomy, for the resection of olfactory groove meningiomas (OGMs). We built a decision tree using probabilities of gross total resection (GTR) and cerebrospinal fluid (CSF) leak rates with the supraorbital approach versus EEA with and without additional craniotomy. The cost (not charge or reimbursement) at each "stem" of this decision tree for both surgical options was obtained from our hospital's finance department. After a base case calculation, we applied plausible ranges to all parameters and carried out multiple 1-way sensitivity analyses. Probabilistic sensitivity analyses confirmed our results. The probabilities of GTR (0.8) and CSF leak (0.2) for the supraorbital craniotomy were obtained from our series of 5 patients who underwent a supraorbital approach for the resection of an OGM. The mean tumor volume was 54.6 cm³ (range, 17-94.2 cm³). Literature-reported rates of GTR (0.6) and CSF leak (0.3) with EEA were applied to our economic analysis. Supraorbital craniotomy was the preferred strategy, with an expected value of $29,423, compared with an EEA cost of $83,838. On multiple 1-way sensitivity analyses, supraorbital craniotomy remained the preferred strategy, with a minimum cost savings of $46,000 and a maximum savings of $64,000. Probabilistic sensitivity analysis found the lowest cost difference between the 2 surgical options to be $37,431. Compared with EEA, supraorbital craniotomy provides substantial cost savings in the treatment of OGMs. Given the potential differences in effectiveness between approaches, a cost-effectiveness analysis should be undertaken.
Probabilistic Sensitivity Analysis with Respect to Bounds of Truncated Distributions (PREPRINT)
2010-04-01
Millwater, H.; Feng, Y. (AFRL-RX-WP-TP-2010-4147)
Pouzou, Jane G.; Cullen, Alison C.; Yost, Michael G.; Kissel, John C.; Fenske, Richard A.
2018-01-01
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide-handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach. PMID:29105804
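A hedged sketch of this style of probabilistic exposure calculation: a dose rate is simulated as unit exposure times amount handled divided by body weight, and the fraction of draws above a level of concern is reported. Every distribution and the threshold below are placeholders, not AHED/PHED values or an EPA no-observable-effect level.

```python
import numpy as np

rng = np.random.default_rng(11)

n = 100_000
unit_exposure = rng.lognormal(np.log(0.02), 1.0, n)   # mg per lb a.i. handled (assumed)
amount_handled = rng.triangular(50, 200, 400, n)      # lb a.i. per day (assumed)
body_weight = rng.normal(80, 12, n)                   # kg (assumed)

dose = unit_exposure * amount_handled / body_weight   # mg/kg/day
level_of_concern = 0.05                               # assumed, mg/kg/day
print(f"fraction exceeding level of concern: {np.mean(dose > level_of_concern):.1%}")
```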
Cost-Effectiveness Analysis of Regorafenib for Metastatic Colorectal Cancer
Goldstein, Daniel A.; Ahmad, Bilal B.; Chen, Qiushi; Ayer, Turgay; Howard, David H.; Lipscomb, Joseph; El-Rayes, Bassel F.; Flowers, Christopher R.
2015-01-01
Purpose: Regorafenib is a standard-care option for treatment-refractory metastatic colorectal cancer that increases median overall survival by 6 weeks compared with placebo. Given this small incremental clinical benefit, we evaluated the cost-effectiveness of regorafenib in the third-line setting for patients with metastatic colorectal cancer from the US payer perspective. Methods: We developed a Markov model to compare the cost and effectiveness of regorafenib with those of placebo in the third-line treatment of metastatic colorectal cancer. Health outcomes were measured in life-years and quality-adjusted life-years (QALYs). Drug costs were based on Medicare reimbursement rates in 2014. Model robustness was addressed in univariable and probabilistic sensitivity analyses. Results: Regorafenib provided an additional 0.04 QALYs (0.13 life-years) at a cost of $40,000, resulting in an incremental cost-effectiveness ratio of $900,000 per QALY. The incremental cost-effectiveness ratio for regorafenib was > $550,000 per QALY in all of our univariable and probabilistic sensitivity analyses. Conclusion: Regorafenib provides minimal incremental benefit at high incremental cost per QALY in the third-line management of metastatic colorectal cancer. The cost-effectiveness of regorafenib could be improved by the use of value-based pricing. PMID:26304904
Probabilistic Sensitivity Analysis of Fretting Fatigue (Preprint)
2009-04-01
Golden, Patrick J.; Millwater, Harry R. (Air Force Research Laboratory, Wright-Patterson AFB, OH; AFRL-RX-WP-TP-2009-4091)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Clifford Kuofei
Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
Dong, Hengjin; Buxton, Martin
2006-01-01
The objective of this study is to apply a Markov model to compare the cost-effectiveness of total knee replacement (TKR) using computer-assisted surgery (CAS) with that of TKR using a conventional manual method, in the absence of formal clinical trial evidence. A structured search was carried out to identify evidence relating to the clinical outcome, cost, and effectiveness of TKR. Nine Markov states were identified based on the progress of the disease after TKR. Effectiveness was expressed in quality-adjusted life years (QALYs). The simulation was carried out initially for 120 cycles of one month each, starting with 1,000 TKRs. A discount rate of 3.5 percent was used for both cost and effectiveness in the incremental cost-effectiveness analysis. A probabilistic sensitivity analysis was then carried out using a Monte Carlo approach with 10,000 iterations. Computer-assisted TKR was a cost-effective technology in the long term, although the QALYs gained were small. After the first 2 years, computer-assisted TKR was dominant, being both cheaper and yielding more QALYs. The incremental cost-effectiveness ratio (ICER) was sensitive to the "effect of CAS," to the extra cost of CAS, and to the utility of the state "Normal health after primary TKR," but it was not sensitive to the utilities of the other Markov states. Both probabilistic and deterministic analyses produced similar cumulative serious or minor complication rates and complex or simple revision rates. They also produced similar ICERs. Compared with conventional TKR, computer-assisted TKR is a cost-saving technology in the long term and may offer small additional QALYs. The "effect of CAS" is to reduce revision rates and complications through more accurate and precise alignment, and although the conclusions from the model, even when allowing for a full probabilistic analysis of uncertainty, are clear, the "effect of CAS" on the rate of revisions awaits long-term clinical evidence.
Cost-effectiveness of breast cancer screening using mammography in Vietnamese women
2018-01-01
Background The incidence rate of breast cancer is increasing and has become the most common cancer in Vietnamese women while the survival rate is lower than that of developed countries. Early detection to improve breast cancer survival as well as reducing risk factors remains the cornerstone of breast cancer control according to the World Health Organization (WHO). This study aims to evaluate the costs and outcomes of introducing a mammography screening program for Vietnamese women aged 45–64 years, compared to the current situation of no screening. Methods Decision analytical modeling using Markov chain analysis was used to estimate costs and health outcomes over a lifetime horizon. Model inputs were derived from published literature and the results were reported as incremental cost-effectiveness ratios (ICERs) and/or incremental net monetary benefits (INMBs). One-way sensitivity analyses and probabilistic sensitivity analyses were performed to assess parameter uncertainty. Results The ICER per life year gained of the first round of mammography screening was US$3647.06 and US$4405.44 for women aged 50–54 years and 55–59 years, respectively. In probabilistic sensitivity analyses, mammography screening in the 50–54 age group and the 55–59 age group were cost-effective in 100% of cases at a threshold of three times the Vietnamese Gross Domestic Product (GDP) i.e., US$6332.70. However, less than 50% of the cases in the 60–64 age group and 0% of the cases in the 45–49 age group were cost effective at the WHO threshold. The ICERs were sensitive to the discount rate, mammography sensitivity, and transition probability from remission to distant recurrence in stage II for all age groups. Conclusion From the healthcare payer viewpoint, offering the first round of mammography screening to Vietnamese women aged 50–59 years should be considered, with the given threshold of three times the Vietnamese GDP per capita. PMID:29579131
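For readers unfamiliar with the summary measures used in this and the surrounding abstracts, the following sketch computes an ICER and an INMB from hypothetical incremental values; only the formulas (ICER = dC/dE; INMB = WTP*dE - dC) are standard practice, and the numbers are illustrative, not the study's.

```python
# Hypothetical incremental values for a screening strategy vs no screening.
d_cost = 1200.0          # incremental cost (US$)
d_effect = 0.33          # incremental life years gained
wtp = 6332.70            # threshold: three times GDP per capita (US$)

icer = d_cost / d_effect
inmb = wtp * d_effect - d_cost
print(f"ICER = ${icer:,.2f} per LY; INMB = ${inmb:,.2f}")
# The strategy is cost-effective when ICER < wtp, equivalently INMB > 0.
```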
Role of ionotropic glutamate receptors in delay and probability discounting in the rat.
Yates, Justin R; Batten, Seth R; Bardo, Michael T; Beckmann, Joshua S
2015-04-01
Discounting of delayed and probabilistic reinforcement is linked to increased drug use and pathological gambling. Understanding the neurobiology of discounting is important for designing treatments for these disorders. Glutamate is considered to be involved in addiction-like behaviors; however, the role of ionotropic glutamate receptors (iGluRs) in discounting remains unclear. The current study examined the effects of N-methyl-D-aspartate (NMDA) and α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) glutamate receptor blockade on performance in delay and probability discounting tasks. Following training in either delay or probability discounting, rats (n = 12, each task) received pretreatments of the NMDA receptor antagonists MK-801 (0, 0.01, 0.03, 0.1, or 0.3 mg/kg, s.c.) or ketamine (0, 1.0, 5.0, or 10.0 mg/kg, i.p.), as well as the AMPA receptor antagonist CNQX (0, 1.0, 3.0, or 5.6 mg/kg, i.p.). Hyperbolic discounting functions were used to estimate sensitivity to delayed/probabilistic reinforcement and sensitivity to reinforcer amount. An intermediate dose of MK-801 (0.03 mg/kg) decreased sensitivity to both delayed and probabilistic reinforcement. In contrast, ketamine did not affect the rate of discounting in either task but decreased sensitivity to reinforcer amount. CNQX did not alter sensitivity to reinforcer amount or delayed/probabilistic reinforcement. These results show that blockade of NMDA receptors, but not AMPA receptors, decreases sensitivity to delayed/probabilistic reinforcement (MK-801) and sensitivity to reinforcer amount (ketamine). The differential effects of MK-801 and ketamine demonstrate that sensitivities to delayed/probabilistic reinforcement and reinforcer amount are pharmacologically dissociable.
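The hyperbolic discounting functions mentioned above are typically of Mazur's form V = A/(1 + kD); the sketch below fits the discount rate k to hypothetical indifference-point data (the study's actual data and fitting procedure are not reproduced here).

```python
import numpy as np
from scipy.optimize import curve_fit

# Mazur's hyperbolic discounting: subjective value falls as 1/(1 + k*delay).
def hyperbolic(delay, k):
    return 1.0 / (1.0 + k * delay)

delays = np.array([0, 5, 10, 20, 40, 60])                # seconds
indiff = np.array([1.0, 0.80, 0.62, 0.45, 0.28, 0.20])   # fraction of large reward

(k_hat,), _ = curve_fit(hyperbolic, delays, indiff, p0=[0.05])
print(f"estimated discount rate k = {k_hat:.3f}")
# A larger k means steeper discounting, i.e. greater sensitivity to delayed
# (or, with odds-against substituted for delay, probabilistic) reward.
```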
Bounthavong, Mark; Pruitt, Larry D; Smolenski, Derek J; Gahm, Gregory A; Bansal, Aasthaa; Hansen, Ryan N
2018-02-01
Introduction Home-based telebehavioural healthcare improves access to mental health care for patients restricted by travel burden. However, there is limited evidence assessing the economic value of home-based telebehavioural health care compared to in-person care. We sought to compare the economic impact of home-based telebehavioural health care and in-person care for depression among current and former US service members. Methods We performed trial-based cost-minimisation and cost-utility analyses to assess the economic impact of home-based telebehavioural health care versus in-person behavioural care for depression. Our analyses focused on the payer perspective (Department of Defense and Department of Veterans Affairs) at three months. We also performed a scenario analysis where all patients possessed video-conferencing technology that was approved by these agencies. The cost-utility analysis evaluated the impact of different depression categories on the incremental cost-effectiveness ratio. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model assumptions. Results In the base case analysis the total direct cost of home-based telebehavioural health care was higher than in-person care (US$71,974 versus US$20,322). Assuming that patients possessed government-approved video-conferencing technology, home-based telebehavioural health care was less costly compared to in-person care (US$19,177 versus US$20,322). In one-way sensitivity analyses, the proportion of patients possessing personal computers was a major driver of direct costs. In the cost-utility analysis, home-based telebehavioural health care was dominant when patients possessed video-conferencing technology. Results from probabilistic sensitivity analyses did not differ substantially from base case results. Discussion Home-based telebehavioural health care is dependent on the cost of supplying video-conferencing technology to patients but offers the opportunity to increase access to care. Health-care policies centred on implementation of home-based telebehavioural health care should ensure that these technologies are able to be successfully deployed on patients' existing technology.
Herring, William; Pearson, Isobel; Purser, Molly; Nakhaipour, Hamid Reza; Haiderali, Amin; Wolowacz, Sorrel; Jayasundara, Kavisha
2016-01-01
Our objective was to estimate the cost effectiveness of ofatumumab plus chlorambucil (OChl) versus chlorambucil in patients with chronic lymphocytic leukaemia for whom fludarabine-based therapies are considered inappropriate from the perspective of the publicly funded healthcare system in Canada. A semi-Markov model (3-month cycle length) used survival curves to govern progression-free survival (PFS) and overall survival (OS). Efficacy and safety data and health-state utility values were estimated from the COMPLEMENT-1 trial. Post-progression treatment patterns were based on clinical guidelines, Canadian treatment practices and published literature. Total and incremental expected lifetime costs (in Canadian dollars [$Can], year 2013 values), life-years and quality-adjusted life-years (QALYs) were computed. Uncertainty was assessed via deterministic and probabilistic sensitivity analyses. The discounted lifetime health and economic outcomes estimated by the model showed that, compared with chlorambucil, first-line treatment with OChl led to an increase in QALYs (0.41) and total costs ($Can27,866) and to an incremental cost-effectiveness ratio (ICER) of $Can68,647 per QALY gained. In deterministic sensitivity analyses, the ICER was most sensitive to the modelling time horizon and to the extrapolation of OS treatment effects beyond the trial duration. In probabilistic sensitivity analysis, the probability of cost effectiveness at a willingness-to-pay threshold of $Can100,000 per QALY gained was 59 %. Base-case results indicated that improved overall response and PFS for OChl compared with chlorambucil translated to improved quality-adjusted life expectancy. Sensitivity analysis suggested that OChl is likely to be cost effective subject to uncertainty associated with the presence of any long-term OS benefit and the model time horizon.
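A minimal sketch of a survival-curve-driven ("partitioned survival") cost-effectiveness calculation of the general kind described above; the exponential PFS and OS curves, utilities, and discount rate are placeholders, not the COMPLEMENT-1 estimates.

```python
import numpy as np

# 3-month cycles over 10 years; state occupancy follows the curves.
t = np.arange(0, 40) * 0.25           # years
pfs = np.exp(-0.35 * t)               # P(progression-free)
os_ = np.exp(-0.20 * t)               # P(alive)
prog = np.clip(os_ - pfs, 0.0, None)  # P(alive, post-progression)

u_pfs, u_prog = 0.80, 0.60            # health-state utilities (illustrative)
disc = 1.05 ** (-t)                   # 5% annual discounting
qalys = np.sum(disc * (u_pfs * pfs + u_prog * prog) * 0.25)
print(f"discounted QALYs: {qalys:.2f}")
```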
Probabilistic risk analysis of building contamination.
Bolster, D T; Tartakovsky, D M
2008-10-01
We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
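A toy fault-tree calculation in the spirit of the PRA described above, assuming independent basic events; the events and probabilities are invented for illustration and are not estimates from the paper.

```python
# Contamination reaches occupants only if a source is present AND
# (filtration fails OR the damper fails open).
p_source = 0.05
p_filter_fail = 0.02
p_damper_fail = 0.01

# OR gate for independent basic events: 1 - prod(1 - p_i)
p_path_open = 1 - (1 - p_filter_fail) * (1 - p_damper_fail)
# AND gate: product of independent event probabilities
p_contamination = p_source * p_path_open
print(f"P(building contamination) = {p_contamination:.2e}")
```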
Chatterjee, Abhishek; Macarios, David; Griffin, Leah; Kosowski, Tomasz; Pyfer, Bryan J; Offodile, Anaeze C; Driscoll, Daniel; Maddali, Sirish; Attwood, John
2015-11-01
Sartorius flap coverage and adjunctive negative pressure wound therapy (NPWT) have been described in managing infected vascular groin grafts with varying cost and clinical success. We performed a cost-utility analysis comparing sartorius flap with NPWT in managing an infected vascular groin graft. A literature review compiling outcomes for sartorius flap and NPWT interventions was conducted from peer-reviewed journals in MEDLINE (PubMed) and EMBASE. Utility scores were derived from expert opinion and used to estimate quality-adjusted life years (QALYs). Medicare current procedure terminology and diagnosis-related groups codes were used to assess the costs for successful graft salvage with the associated complications. Incremental cost-effectiveness was assessed at $50,000/QALY, and both univariate and probabilistic sensitivity analyses were conducted to assess robustness of the conclusions. Thirty-two studies were used pooling 384 patients (234 sartorius flaps and 150 NPWT). NPWT had better clinical outcomes (86.7% success rate, 0.9% minor complication rate, and 13.3% major complication rate) than sartorius flap (81.6% success rate, 8.0% minor complication rate, and 18.4% major complication rate). NPWT was less costly ($12,366 versus $23,516) and slightly more effective (12.06 QALY versus 12.05 QALY) compared with sartorius flap. Sensitivity analyses confirmed the robustness of the base case findings; NPWT was either cost-effective at $50,000/QALY or dominated sartorius flap in 81.6% of all probabilistic sensitivity analyses. In our cost-utility analysis, use of adjunctive NPWT, along with debridement and antibiotic treatment, for managing infected vascular groin graft wounds was found to be a more cost-effective option when compared with sartorius flaps.
Shen, Nicole T; Leff, Jared A; Schneider, Yecheskel; Crawford, Carl V; Maw, Anna; Bosworth, Brian; Simon, Matthew S
2017-01-01
Systematic reviews with meta-analyses and meta-regression suggest that timely probiotic use can prevent Clostridium difficile infection (CDI) in hospitalized adults receiving antibiotics, but the cost effectiveness is unknown. We sought to evaluate the cost effectiveness of probiotic use for prevention of CDI versus no probiotic use in the United States. We programmed a decision analytic model using published literature and national databases with a 1-year time horizon. The base case was modeled as a hypothetical cohort of hospitalized adults (mean age 68) receiving antibiotics with and without concurrent probiotic administration. Projected outcomes included quality-adjusted life-years (QALYs), costs (2013 US dollars), incremental cost-effectiveness ratios (ICERs; $/QALY), and cost per infection avoided. One-way, two-way, and probabilistic sensitivity analyses were conducted, and scenarios of different age cohorts were considered. ICERs less than $100,000 per QALY were considered cost effective. Probiotic use dominated (more effective and less costly) no probiotic use. Results were sensitive to probiotic efficacy (relative risk <0.73), the baseline risk of CDI (>1.6%), the risk of probiotic-associated bacteremia/fungemia (<0.26%), probiotic cost (<$130), and age (>65). In probabilistic sensitivity analysis, at a willingness-to-pay threshold of $100,000/QALY, probiotics were the optimal strategy in 69.4% of simulations. Our findings suggest that probiotic use may be a cost-effective strategy to prevent CDI in hospitalized adults aged 65 or older receiving antibiotics, or when the baseline risk of CDI exceeds 1.6%.
Kawasaki, Ryo; Akune, Yoko; Hiratsuka, Yoshimune; Fukuhara, Shunichi; Yamada, Masakazu
2015-02-01
To evaluate the cost-effectiveness of screening intervals longer than 1 year for detecting diabetic retinopathy (DR), by estimating incremental costs per quality-adjusted life year (QALY) based on the best available clinical data in Japan. A Markov model with a probabilistic cohort analysis was framed to calculate incremental costs per QALY gained by implementing a screening program detecting DR in Japan. A 1-year cycle length and a population size of 50,000 with a 50-year time horizon (age 40-90 years) were used. The best available clinical data from publications and national surveillance data were used, and a model was designed including current diagnosis and management of DR with corresponding visual outcomes. One-way and probabilistic sensitivity analyses were performed considering uncertainties in the parameters. In the base-case analysis, the strategy with a screening program resulted in an incremental cost of 5,147 Japanese yen (¥; US$64.6) and incremental effectiveness of 0.0054 QALYs per person screened. The incremental cost-effectiveness ratio was ¥944,981 (US$11,857) per QALY. The simulation suggested that screening would result in a significant reduction in blindness in people aged 40 years or over (-16%). Sensitivity analyses suggested that in order to achieve both reductions in blindness and cost-effectiveness in Japan, the screening program should screen those aged 53-84 years, at intervals of 3 years or less. An eye screening program in Japan would be cost-effective in detecting DR and preventing blindness from DR, even allowing for the uncertainties in estimates of costs, utility, and current management of DR.
Smith, William B; Steinberg, Joni; Scholtes, Stefan; Mcnamara, Iain R
2017-03-01
To compare the age-based cost-effectiveness of total knee arthroplasty (TKA), unicompartmental knee arthroplasty (UKA), and high tibial osteotomy (HTO) for the treatment of medial compartment knee osteoarthritis (MCOA). A Markov model was used to simulate theoretical cohorts of patients 40, 50, 60, and 70 years of age undergoing primary TKA, UKA, or HTO. Costs and outcomes associated with initial and subsequent interventions were estimated by following these virtual cohorts over a 10-year period. Revision and mortality rates, costs, and functional outcome data were estimated from a systematic review of the literature. Probabilistic analysis was conducted to accommodate these parameters' inherent uncertainty, and both discrete and probabilistic sensitivity analyses were utilized to assess the robustness of the model's outputs to changes in key variables. HTO was most likely to be cost-effective in cohorts under 60, and UKA most likely in those 60 and over. Probabilistic results did not indicate one intervention to be significantly more cost-effective than another. The model was exquisitely sensitive to changes in utility (functional outcome), somewhat sensitive to changes in cost, and least sensitive to changes in 10-year revision risk. HTO may be the most cost-effective option when treating MCOA in younger patients, while UKA may be preferred in older patients. Functional utility is the primary driver of the cost-effectiveness of these interventions. For the clinician, this study supports HTO as a competitive treatment option in young patient populations. It also validates each one of the three interventions considered as potentially optimal, depending heavily on patient preferences and functional utility derived over time.
Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M
2017-06-01
This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied to each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts, as well as processing time. Two sets of regions of interest (ROIs) guided deterministic tractography of the DRTT or the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking. Probabilistic tracking lasted substantially longer than deterministic tracking. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent but consistently more posterior to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although the sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Cost-effectiveness of minimally invasive sacroiliac joint fusion.
Cher, Daniel J; Frasco, Melissa A; Arnold, Renée Jg; Polly, David W
2016-01-01
Sacroiliac joint (SIJ) disorders are common in patients with chronic lower back pain. Minimally invasive surgical options have been shown to be effective for the treatment of chronic SIJ dysfunction. To determine the cost-effectiveness of minimally invasive SIJ fusion. Data from two prospective, multicenter, clinical trials were used to inform a Markov process cost-utility model to evaluate cumulative 5-year health quality and costs after minimally invasive SIJ fusion using triangular titanium implants or non-surgical treatment. The analysis was performed from a third-party perspective. The model specifically incorporated variation in resource utilization observed in the randomized trial. Multiple one-way and probabilistic sensitivity analyses were performed. SIJ fusion was associated with a gain of approximately 0.74 quality-adjusted life years (QALYs) at a cost of US$13,313 per QALY gained. In multiple one-way sensitivity analyses all scenarios resulted in an incremental cost-effectiveness ratio (ICER) <$26,000/QALY. Probabilistic analyses showed a high degree of certainty that the maximum ICER for SIJ fusion was less than commonly selected thresholds for acceptability (mean ICER =$13,687, 95% confidence interval $5,162-$28,085). SIJ fusion provided potential cost savings per QALY gained compared to non-surgical treatment after a treatment horizon of greater than 13 years. Compared to traditional non-surgical treatments, SIJ fusion is a cost-effective, and, in the long term, cost-saving strategy for the treatment of SIJ dysfunction due to degenerative sacroiliitis or SIJ disruption.
Baschet, Louise; Bourguignon, Sandrine; Marque, Sébastien; Durand-Zaleski, Isabelle; Teiger, Emmanuel; Wilquin, Fanny; Levesque, Karine
2016-01-01
To determine the cost-effectiveness of drug-eluting stents (DES) compared with bare-metal stents (BMS) in patients requiring a percutaneous coronary intervention in France, using a recent meta-analysis including second-generation DES. A cost-effectiveness analysis was performed in the French National Health Insurance setting. Effectiveness estimates were taken from a meta-analysis of 117,762 patient-years across 76 randomised trials. The main effectiveness criterion was major cardiac event-free survival. Effectiveness and costs were modelled over a 5-year horizon using a three-state Markov model. Incremental cost-effectiveness ratios and a cost-effectiveness acceptability curve were calculated for a range of willingness-to-pay thresholds per major cardiac event-free year gained. Deterministic and probabilistic sensitivity analyses were performed. Base case results demonstrated that DES are dominant over BMS, with an increase in event-free survival and a cost reduction of €184, primarily due to fewer second revascularisations and the avoidance of myocardial infarction and stent thrombosis. These results were robust to uncertainty in one-way deterministic and probabilistic sensitivity analyses. Using a cost-effectiveness threshold of €7000 per major cardiac event-free year gained, DES has a >95% probability of being cost-effective versus BMS. Following the decrease in DES prices, the development of new-generation DES, and recent meta-analysis results, DES can now be considered cost-effective regardless of selective indication in France, according to European recommendations.
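A cost-effectiveness acceptability curve like the one mentioned above can be generated directly from probabilistic sensitivity analysis draws; the sketch below uses hypothetical distributions for the incremental cost and effect, not the study's outputs.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical PSA output: joint draws of incremental cost (DES vs BMS)
# and incremental event-free years; distributions are illustrative.
d_cost = rng.normal(-184.0, 300.0, size=10_000)   # euros (negative = saving)
d_eff = rng.normal(0.05, 0.03, size=10_000)       # event-free years gained

# CEAC: fraction of draws with positive net benefit at each threshold.
for wtp in (0, 2000, 4000, 7000, 10000):
    prob_ce = np.mean(wtp * d_eff - d_cost > 0)
    print(f"WTP {wtp:>6} euros/event-free year: P(cost-effective) = {prob_ce:.2f}")
```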
Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.
Wilhelm, C J; Mitchell, S H
2008-10-01
Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.
Sensitivity to Uncertainty in Asteroid Impact Risk Assessment
NASA Astrophysics Data System (ADS)
Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.
2015-12-01
The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
NASA Astrophysics Data System (ADS)
Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.
2002-05-01
Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in these complex, high dimensional eco-system models, represented by the RWMS model, the dynamics of the systems can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, the non-linearities, and the non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used in global sensitivity analysis in these high dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.
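As a simple contrast to the FAST and MARS tools named above, the sketch below applies rank regression to a deliberately non-monotonic toy model; rank regression is a global measure that tolerates non-linearity but, unlike FAST, understates the non-monotonic input, which is part of the point the authors make about high-dimensional, non-monotonic models. All inputs and the response are illustrative.

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(3)
n = 20_000

# Non-monotonic toy model standing in for the RWMS transport system.
x1 = rng.uniform(0, 1, n)   # e.g. burrow density
x2 = rng.uniform(0, 1, n)   # e.g. plant uptake rate
x3 = rng.uniform(0, 1, n)   # e.g. decay-chain branching factor
y = np.sin(2 * np.pi * x1) ** 2 + 3 * x2 + 0.1 * x3 * x2

# Rank-transform before regression; the coefficient for x1 comes out
# near zero despite x1 mattering, illustrating the non-monotonicity issue.
R = np.column_stack([rankdata(v) for v in (x1, x2, x3)])
ry = rankdata(y)
Rs = (R - R.mean(0)) / R.std(0)
coef, *_ = np.linalg.lstsq(Rs, (ry - ry.mean()) / ry.std(), rcond=None)
print(dict(zip(["x1", "x2", "x3"], np.round(coef, 3))))
```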
Jongeneel, W P; Delmaar, J E; Bokkers, B G H
2018-06-08
A methodology to assess the health impact of skin sensitizers is introduced, which consists of the comparison of the probabilistic aggregated exposure with a probabilistic (individual) human sensitization or elicitation induction dose. The health impact of potential policy measures aimed at reducing the concentration of a fragrance allergen, geraniol, in consumer products is analysed in a simulated population derived from multiple product use surveys. Our analysis shows that current dermal exposure to geraniol from personal care and household cleaning products leads to new cases of contact allergy and induces clinical symptoms for those already sensitized. We estimate that this exposure results yearly in 34 new cases of geraniol contact allergy per million consumers in Western and Northern Europe, mainly due to exposure to household cleaning products. About twice as many consumers (60 per million) are projected to suffer from clinical symptoms due to re-exposure to geraniol. Policy measures restricting geraniol concentrations to <0.01% will noticeably reduce new cases of sensitization and decrease the number of people with clinical symptoms as well as the frequency of occurrence of these clinical symptoms. The estimated numbers should be interpreted with caution and provide only a rough indication of the health impact. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
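The core comparison described, probabilistic aggregate exposure against a probabilistic individual induction dose, reduces to an exceedance fraction over a simulated population; the sketch below uses invented lognormal distributions, not the surveyed ones.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000  # simulated consumers

# Hypothetical lognormal aggregate dermal exposure and a lognormal
# individual sensitization induction dose; parameters are placeholders.
exposure = rng.lognormal(mean=-4.0, sigma=1.2, size=n)        # ug/cm^2/day
induction_dose = rng.lognormal(mean=0.5, sigma=1.0, size=n)   # ug/cm^2/day

# A new case is projected whenever a consumer's exposure exceeds
# their individual induction dose.
new_cases_per_million = np.mean(exposure > induction_dose) * 1e6
print(f"projected new sensitizations: {new_cases_per_million:.0f} per million")
```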
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qu, Xuanlu M.; Louie, Alexander V.; Ashman, Jonathan
Purpose: Surgery combined with radiation therapy (RT) is the cornerstone of multidisciplinary management of extremity soft tissue sarcoma (STS). Although RT can be given in either the preoperative or the postoperative setting with similar local recurrence and survival outcomes, the side effect profiles, costs, and long-term functional outcomes are different. The aim of this study was to use decision analysis to determine optimal sequencing of RT with surgery in patients with extremity STS. Methods and Materials: A cost-effectiveness analysis was conducted using a state transition Markov model, with quality-adjusted life years (QALYs) as the primary outcome. A time horizon of 5 years, a cycle length of 3 months, and a willingness-to-pay threshold of $50,000/QALY were used. One-way deterministic sensitivity analyses were performed to determine the thresholds at which each strategy would be preferred. The robustness of the model was assessed by probabilistic sensitivity analysis. Results: Preoperative RT is a more cost-effective strategy ($26,633/3.00 QALYs) than postoperative RT ($28,028/2.86 QALYs) in our base case scenario. Preoperative RT is the superior strategy with either 3-dimensional conformal RT or intensity-modulated RT. One-way sensitivity analyses identified the relative risk of chronic adverse events as having the greatest influence on the preferred timing of RT. The likelihood of preoperative RT being the preferred strategy was 82% on probabilistic sensitivity analysis. Conclusions: Preoperative RT is more cost effective than postoperative RT in the management of resectable extremity STS, primarily because of the higher incidence of chronic adverse events with RT in the postoperative setting.
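A one-way deterministic sensitivity analysis of the kind reported can be sketched as a threshold sweep; the base-case QALYs and costs below are taken from the abstract, but the linear mapping from the relative risk of chronic adverse events to QALYs is an invented placeholder.

```python
import numpy as np

# Net-benefit difference (preop minus postop) as the relative risk of
# chronic adverse events for preoperative RT varies. All mappings are
# illustrative.
def net_benefit_diff(rr_chronic, wtp=50_000.0):
    qaly_pre = 3.00 - 0.40 * (rr_chronic - 1.0)   # assumed QALY penalty per unit RR
    qaly_post, cost_pre, cost_post = 2.86, 26_633.0, 28_028.0
    return (wtp * qaly_pre - cost_pre) - (wtp * qaly_post - cost_post)

for rr in np.linspace(0.5, 2.0, 7):
    nb = net_benefit_diff(rr)
    status = "preferred" if nb > 0 else "dispreferred"
    print(f"RR={rr:.2f}: preoperative RT {status} (net benefit {nb:+,.0f})")
```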
An economic evaluation of intravenous versus oral iron supplementation in people on haemodialysis.
Wong, Germaine; Howard, Kirsten; Hodson, Elisabeth; Irving, Michelle; Craig, Jonathan C
2013-02-01
Iron supplementation can be administered either intravenously or orally in patients with chronic kidney disease (CKD) and iron deficiency anaemia, but practice varies widely. The aim of this study was to estimate the health care costs and benefits of parenteral iron compared with oral iron in haemodialysis patients receiving erythropoiesis-stimulating agents (ESAs). Using a broad health care funder perspective, a probabilistic Markov model was constructed to compare the cost-effectiveness and cost-utility of parenteral iron therapy versus oral iron for the management of haemodialysis patients with relative iron deficiency. A series of one-way, multi-way and probabilistic sensitivity analyses were conducted to assess the robustness of the model structure and the extent to which the model's assumptions were sensitive to the uncertainties within the input variables. Compared with oral iron, the incremental cost-effectiveness ratios (ICERs) for parenteral iron were $74,760 per life year saved and $34,660 per quality-adjusted life year (QALY) gained. A series of one-way sensitivity analyses showed that the ICER was most sensitive to the probability of achieving haemoglobin (Hb) targets using supplemental iron with a consequential decrease in the standard ESA doses, and to the relative increase in all-cause mortality risk associated with low Hb levels (Hb < 9.0 g/dL). If the willingness-to-pay threshold was set at $50,000/QALY, the proportion of simulations that showed parenteral iron was cost-effective compared with oral iron was over 90%. Assuming that there is an overall increased mortality risk associated with very low Hb levels (<9.0 g/dL), using parenteral iron to achieve an Hb target between 9.5 and 12 g/dL is cost-effective compared with oral iron therapy among haemodialysis patients with relative iron deficiency.
NASA Technical Reports Server (NTRS)
Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.
2010-01-01
Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
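A minimal limit-state sketch of the PDA idea: sample the driving parameters, evaluate a physics-flavoured margin g = capacity - demand, and count failures; the distributions are illustrative, not Ares I design data.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Failure when stress demand exceeds capacity, i.e. g = capacity - demand < 0.
capacity = rng.normal(350.0, 25.0, size=n)            # MPa
demand = rng.lognormal(mean=5.6, sigma=0.15, size=n)  # MPa (~270 median)

g = capacity - demand
p_fail = np.mean(g < 0)
print(f"P(failure) = {p_fail:.2e}")

# Crude sensitivity: correlation of each input with the margin g.
for name, v in [("capacity", capacity), ("demand", demand)]:
    print(name, np.round(np.corrcoef(v, g)[0, 1], 2))
```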
Cost-effectiveness analysis of ibrutinib in patients with Waldenström macroglobulinemia in Italy.
Aiello, Andrea; D'Ausilio, Anna; Lo Muto, Roberta; Randon, Francesca; Laurenti, Luca
2017-01-01
Background and Objective: Ibrutinib has recently been approved in Europe for Waldenström Macroglobulinemia (WM) in symptomatic patients who have received at least one prior therapy, or in first-line treatment for patients unsuitable for chemo-immunotherapy. The aim of the study is to estimate the incremental cost-effectiveness ratio (ICER) of ibrutinib in relapsed/refractory WM, compared with the Italian current therapeutic pathways (CTP). Methods: A Markov model was adapted for Italy considering the National Health System perspective. Input data from the literature as well as global trials were used. The percentage use of therapies and healthcare resource consumption were estimated according to expert panel advice. Drug ex-factory prices and national tariffs were used for estimating costs. The model had a 15-year time horizon, with a 3.0% discount rate for both clinical and economic data. Deterministic and probabilistic sensitivity analyses were performed to test the robustness of the results. Results: Ibrutinib resulted in increased Life Years Gained (LYGs) and increased costs compared to CTP, with an ICER of €52,698/LYG. Sensitivity analyses confirmed the base-case results. Specifically, in the probabilistic analysis, at a willingness-to-pay threshold of €60,000/LYG ibrutinib was cost-effective in 84% of simulations. Conclusions: Ibrutinib has demonstrated a positive cost-effectiveness profile in Italy.
Varier, Raghu U; Biltaji, Eman; Smith, Kenneth J; Roberts, Mark S; Kyle Jensen, M; LaFleur, Joanne; Nelson, Richard E
2015-04-01
Clostridium difficile infection (CDI) places a high burden on the US healthcare system. Recurrent CDI (RCDI) occurs frequently. Recently proposed guidelines from the American College of Gastroenterology (ACG) and the American Gastroenterological Association (AGA) include fecal microbiota transplantation (FMT) as a therapeutic option for RCDI. The purpose of this study was to estimate the cost-effectiveness of FMT compared with vancomycin for the treatment of RCDI in adults, specifically following guidelines proposed by the ACG and AGA. We constructed a decision-analytic computer simulation using inputs from the published literature to compare the standard approach using tapered vancomycin to FMT for RCDI from the third-party payer perspective. Our effectiveness measure was quality-adjusted life years (QALYs). Because simulated patients were followed for 90 days, discounting was not necessary. One-way and probabilistic sensitivity analyses were performed. Base-case analysis showed that FMT was less costly ($1,669 vs $3,788) and more effective (0.242 QALYs vs 0.235 QALYs) than vancomycin for RCDI. One-way sensitivity analyses showed that FMT was the dominant strategy (both less expensive and more effective) if cure rates for FMT and vancomycin were ≥70% and <91%, respectively, and if the cost of FMT was <$3,206. Probabilistic sensitivity analysis, varying all parameters simultaneously, showed that FMT was the dominant strategy over 10,000 second-order Monte Carlo simulations. Our results suggest that FMT may be a cost-saving intervention in managing RCDI. Implementation of FMT for RCDI may help decrease the economic burden to the healthcare system.
Improving the quality of pressure ulcer care with prevention: a cost-effectiveness analysis.
Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Sullivan, Patrick W
2011-04-01
In October 2008, Centers for Medicare and Medicaid Services discontinued reimbursement for hospital-acquired pressure ulcers (HAPUs), thus placing stress on hospitals to prevent incidence of this costly condition. To evaluate whether prevention methods are cost-effective compared with standard care in the management of HAPUs. A semi-Markov model simulated the admission of patients to an acute care hospital from the time of admission through 1 year using the societal perspective. The model simulated health states that could potentially lead to an HAPU through either the practice of "prevention" or "standard care." Univariate sensitivity analyses, threshold analyses, and Bayesian multivariate probabilistic sensitivity analysis using 10,000 Monte Carlo simulations were conducted. Cost per quality-adjusted life-years (QALYs) gained for the prevention of HAPUs. Prevention was cost saving and resulted in greater expected effectiveness compared with the standard care approach per hospitalization. The expected cost of prevention was $7276.35, and the expected effectiveness was 11.241 QALYs. The expected cost for standard care was $10,053.95, and the expected effectiveness was 9.342 QALYs. The multivariate probabilistic sensitivity analysis showed that prevention resulted in cost savings in 99.99% of the simulations. The threshold cost of prevention was $821.53 per day per person, whereas the cost of prevention was estimated to be $54.66 per day per person. This study suggests that it is more cost effective to pay for prevention of HAPUs compared with standard care. Continuous preventive care of HAPUs in acutely ill patients could potentially reduce incidence and prevalence, as well as lead to lower expenditures.
NASA Astrophysics Data System (ADS)
Mazzaracchio, Antonio; Marchetti, Mario
2010-03-01
Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, which uses the Monte Carlo technique, allows a simulation to run over stochastic series. This performs an uncertainty and sensitivity analysis, which estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches, such as the Root-Sum-Square method. The developed tool was verified by comparing the results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins on sizing heat shields that are currently proposed for vehicles using rigid aeroshells for future aerocapture missions at Neptune, and identifying the major sources of uncertainty in the material response.
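The contrast drawn above with the Root-Sum-Square method can be sketched as follows, assuming a linearized bondline-temperature model with invented sensitivities: RSS stacks 1-sigma effects into a single deterministic margin, while Monte Carlo yields a probability of exceeding the temperature requirement.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# Linearized temperature model: all numbers are illustrative placeholders.
nominal_T = 480.0   # K at the bondline for nominal inputs
limit_T = 560.0     # K allowable

# Sensitivity of bondline temperature to each uncertain input (dT per 1-sigma):
# heating rate, conductivity, density.
sens = np.array([30.0, 22.0, 14.0])

# RSS combines 1-sigma effects; Monte Carlo samples them jointly.
rss_T = nominal_T + np.sqrt(np.sum(sens ** 2))
z = rng.normal(size=(n, 3))
mc_T = nominal_T + z @ sens
print(f"RSS worst-case T = {rss_T:.0f} K (limit {limit_T} K)")
print(f"Monte Carlo P(T > limit) = {np.mean(mc_T > limit_T):.3f}")
```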
Zhang, Xinke; Hay, Joel W; Niu, Xiaoli
2015-01-01
The aim of the study was to compare the cost effectiveness of fingolimod, teriflunomide, dimethyl fumarate, and intramuscular (IM) interferon (IFN)-β-1a as first-line therapies in the treatment of patients with relapsing-remitting multiple sclerosis (RRMS). A Markov model was developed to evaluate the cost effectiveness of disease-modifying drugs (DMDs) from a US societal perspective. The time horizon in the base case was 5 years. The primary outcome was incremental net monetary benefit (INMB), and the secondary outcome was incremental cost-effectiveness ratio (ICER). The base case INMB willingness-to-pay (WTP) threshold was assumed to be US$150,000 per quality-adjusted life year (QALY), and the costs were in 2012 US dollars. One-way sensitivity analyses and probabilistic sensitivity analysis were conducted to test the robustness of the model results. Dimethyl fumarate dominated all other therapies over the range of WTPs, from US$0 to US$180,000. Compared with IM IFN-β-1a, at a WTP of US$150,000, INMBs were estimated at US$36,567, US$49,780, and US$80,611 for fingolimod, teriflunomide, and dimethyl fumarate, respectively. The ICER of fingolimod versus teriflunomide was US$3,201,672. One-way sensitivity analyses demonstrated the model results were sensitive to the acquisition costs of DMDs and the time horizon, but in most scenarios, cost-effectiveness rankings remained stable. Probabilistic sensitivity analysis showed that for more than 90% of the simulations, dimethyl fumarate was the optimal therapy across all WTP values. The three oral therapies were favored in the cost-effectiveness analysis. Of the four DMDs, dimethyl fumarate was a dominant therapy to manage RRMS. Apart from dimethyl fumarate, teriflunomide was the most cost-effective therapy compared with IM IFN-β-1a, with an ICER of US$7,115.
Konopka, Joseph F.; Gomoll, Andreas H.; Thornhill, Thomas S.; Katz, Jeffrey N.; Losina, Elena
2015-01-01
Background: Surgical options for the management of medial compartment osteoarthritis of the varus knee include high tibial osteotomy, unicompartmental knee arthroplasty, and total knee arthroplasty. We sought to determine the cost-effectiveness of high tibial osteotomy and unicompartmental knee arthroplasty as alternatives to total knee arthroplasty for patients fifty to sixty years of age. Methods: We built a probabilistic state-transition computer model with health states defined by pain, postoperative complications, and subsequent surgical procedures. We estimated transition probabilities from published literature. Costs were determined from Medicare reimbursement schedules. Health outcomes were measured in quality-adjusted life-years (QALYs). We conducted analyses over patients’ lifetimes from the societal perspective, with health and cost outcomes discounted by 3% annually. We used probabilistic sensitivity analyses to account for uncertainty in data inputs. Results: The estimated discounted QALYs were 14.62, 14.63, and 14.64 for high tibial osteotomy, unicompartmental knee arthroplasty, and total knee arthroplasty, respectively. Discounted total direct medical costs were $20,436 for high tibial osteotomy, $24,637 for unicompartmental knee arthroplasty, and $24,761 for total knee arthroplasty (in 2012 U.S. dollars). The incremental cost-effectiveness ratio (ICER) was $231,900 per QALY for total knee arthroplasty and $420,100 per QALY for unicompartmental knee arthroplasty. Probabilistic sensitivity analyses showed that, at a willingness-to-pay (WTP) threshold of $50,000 per QALY, high tibial osteotomy was cost-effective 57% of the time; total knee arthroplasty, 24%; and unicompartmental knee arthroplasty, 19%. At a WTP threshold of $100,000 per QALY, high tibial osteotomy was cost-effective 43% of time; total knee arthroplasty, 31%; and unicompartmental knee arthroplasty, 26%. Conclusions: In fifty to sixty-year-old patients with medial unicompartmental knee osteoarthritis, high tibial osteotomy is an attractive option compared with unicompartmental knee arthroplasty and total knee arthroplasty. This finding supports greater utilization of high tibial osteotomy for these patients. The cost-effectiveness of high tibial osteotomy and of unicompartmental knee arthroplasty depend on rates of conversion to total knee arthroplasty and the clinical outcomes of the conversions. Level of Evidence: Economic Level II. See Instructions for Authors for a complete description of levels of evidence. PMID:25995491
Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector
NASA Astrophysics Data System (ADS)
Lenel, U. R.; Davies, D. G. S.; Moore, M. A.
An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is being used to examine the sensitivity of the outcome to uncertainties in input quantities in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.
Probabilistic brain tissue segmentation in neonatal magnetic resonance imaging.
Anbeek, Petronella; Vincken, Koen L; Groenendaal, Floris; Koeman, Annemieke; van Osch, Matthias J P; van der Grond, Jeroen
2008-02-01
A fully automated method has been developed for segmentation of four different structures in the neonatal brain: white matter (WM), central gray matter (CEGM), cortical gray matter (COGM), and cerebrospinal fluid (CSF). The segmentation algorithm is based on information from T2-weighted (T2-w) and inversion recovery (IR) scans. The method uses a K nearest neighbor (KNN) classification technique with features derived from spatial information and voxel intensities. Probabilistic segmentations of each tissue type were generated. By applying thresholds to these probability maps, binary segmentations were obtained. These final segmentations were evaluated by comparison with a gold standard. The sensitivity, specificity, and Dice similarity index (SI) were calculated for quantitative validation of the results. High sensitivity and specificity with respect to the gold standard were reached: sensitivity >0.82 and specificity >0.9 for all tissue types. Tissue volumes were calculated from the binary and probabilistic segmentations. The probabilistic segmentation volumes of all tissue types accurately estimated the gold standard volumes. The KNN approach offers a valuable tool for neonatal brain segmentation. The probabilistic outcomes provide a useful tool for accurate volume measurements. The described method is based on routine diagnostic magnetic resonance imaging (MRI) and is suitable for large population studies.
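A compact sketch of the KNN pipeline described (probabilistic maps, thresholded binary maps, Dice index), using synthetic two-class "voxels" in place of the T2-weighted/IR features and gold-standard labels.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(7)

def sample(n):
    # Synthetic voxels: class 1 ("CSF") is shifted in the T2-like feature;
    # the other three features stand in for IR intensity and position.
    y = rng.integers(0, 2, n)
    X = rng.normal(size=(n, 4))
    X[:, 0] += 2.0 * y
    return X, y

X_train, y_train = sample(2000)
X_test, y_test = sample(500)

knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)
p = knn.predict_proba(X_test)[:, 1]   # probabilistic segmentation
seg = p > 0.5                         # binary map by thresholding

# Dice similarity index against the gold standard
inter = np.sum(seg & (y_test == 1))
dice = 2 * inter / (seg.sum() + (y_test == 1).sum())
print(f"Dice SI = {dice:.2f}; probabilistic volume = {p.sum():.1f} voxels")
```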
Probabilistic Evaluation of Blade Impact Damage
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Abumeri, G. H.
2003-01-01
The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are given in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at a 0.999 probability of structural failure and substantial damage tolerance at a 0.01 probability.
Tan, Chongqing; Peng, Liubao; Zeng, Xiaohui; Li, Jianhe; Wan, Xiaomin; Chen, Gannong; Yi, Lidan; Luo, Xia; Zhao, Ziying
2013-01-01
First-line postoperative adjuvant chemotherapies with S-1 and capecitabine and oxaliplatin (XELOX) were first recommended for resectable gastric cancer patients in the 2010 and 2011 Chinese NCCN Clinical Practice Guidelines in Oncology: Gastric Cancer; however, their economic impact in China is unknown. The aim of this study was to compare the cost-effectiveness of adjuvant chemotherapy with XELOX, with S-1, and no treatment after a gastrectomy with extended (D2) lymph-node dissection among patients with stage II-IIIB gastric cancer. A Markov model, based on data from two clinical phase III trials, was developed to analyse the cost-effectiveness for patients in the XELOX group, the S-1 group, and the surgery-only (SO) group. Costs were estimated from the perspective of the Chinese healthcare system. Utilities were assumed on the basis of previously published reports. Costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) were calculated over a lifetime horizon. One-way and probabilistic sensitivity analyses were performed. For the base case, XELOX had the lowest total cost ($44,568) and cost-effectiveness ratio ($7,360/QALY). The scenario analyses showed that SO was dominated by XELOX and that the ICER of S-1 was $58,843/QALY compared with XELOX. The one-way sensitivity analysis showed that the most influential parameter was the utility of disease-free survival. The probabilistic sensitivity analysis predicted a 75.8% likelihood that the ICER for XELOX would be less than $13,527 compared with S-1. When the willingness-to-pay threshold exceeded $38,000/QALY, the likelihood that S-1 was cost-effective was greater than 50%. Our results suggest that, for patients in China with resectable disease, first-line adjuvant chemotherapy with XELOX after a D2 gastrectomy is the best of the three options considered in this study. S-1 might be a better choice at a higher willingness-to-pay threshold.
Structural reliability methods: Code development status
NASA Astrophysics Data System (ADS)
Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.
1991-05-01
The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
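A rough sketch of the first-order, mean-value reliability idea underlying fast probability integration, in which response sensitivities plus input scatter yield a failure probability estimate; the limit state and all statistics below are illustrative and are not NESSUS internals:

    import numpy as np
    from scipy.stats import norm

    mu = np.array([200.0, 50.0])       # mean strength, mean applied stress (MPa)
    sigma = np.array([20.0, 10.0])     # input standard deviations
    grad = np.array([1.0, -1.0])       # dg/dx at the mean, for g = strength - stress

    g_mean = grad @ mu                             # mean safety margin
    g_std = np.sqrt(np.sum((grad * sigma) ** 2))   # first-order margin std
    beta = g_mean / g_std                          # reliability index
    alpha = grad * sigma / g_std                   # sensitivity factors
    print("P(failure) ~", norm.cdf(-beta))
    print("sensitivity factors:", alpha.round(3))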
Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L
2016-12-01
We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction: nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, the number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence of and concern about bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
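For readers unfamiliar with the technique, a minimal probabilistic bias analysis for an unmeasured confounder might look like the sketch below, with explicit prior distributions on the bias parameters (the reporting practice this review found lacking); all counts and priors are invented:

    import numpy as np

    rng = np.random.default_rng(42)
    n_iter = 100_000
    rr_observed = 1.8                                  # hypothetical observed risk ratio

    rr_cd = rng.lognormal(mean=np.log(2.0), sigma=0.2, size=n_iter)  # confounder-disease RR
    p1 = rng.uniform(0.4, 0.6, size=n_iter)            # confounder prevalence, exposed
    p0 = rng.uniform(0.2, 0.4, size=n_iter)            # confounder prevalence, unexposed

    bias_factor = (rr_cd * p1 + (1 - p1)) / (rr_cd * p0 + (1 - p0))
    rr_adjusted = rr_observed / bias_factor

    lo, med, hi = np.percentile(rr_adjusted, [2.5, 50, 97.5])
    print(f"bias-adjusted RR: {med:.2f} (95% simulation interval {lo:.2f}-{hi:.2f})")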
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.
Ariza, R; Van Walsem, A; Canal, C; Roldán, C; Betegón, L; Oyagüez, I; Janssen, K
2014-07-01
To compare the cost of treating rheumatoid arthritis patients that have failed an initial treatment with methotrexate, with subcutaneous abatacept versus other first-line biologic disease-modifying antirheumatic drugs. Subcutaneous abatacept was considered comparable to intravenous abatacept, adalimumab, certolizumab pegol, etanercept, golimumab, infliximab and tocilizumab, based on indirect comparison using mixed treatment analysis. A cost-minimization analysis was therefore considered appropriate. The Spanish Health System perspective and a 3 year time horizon were selected. Pharmaceutical and administration costs (Euros 2013) of all available first-line biological disease-modifying antirheumatic drugs were considered. Administration costs were obtained from a local costs database. Patients were considered to have a weight of 70 kg. A 3% annual discount rate was applied. Deterministic and probabilistic sensitivity analyses were performed. In the base case, subcutaneous abatacept proved to be less costly than all other biologic antirheumatic drugs (cost differences ranging from Euros -831.42 versus infliximab to Euros -9,741.69 versus tocilizumab). Subcutaneous abatacept was associated with a cost of Euros 10,760.41 per patient during the first year of treatment and Euros 10,261.29 in subsequent years. The total 3-year cost of subcutaneous abatacept was Euros 29,953.89 per patient. Sensitivity analyses proved the model to be robust. Subcutaneous abatacept remained cost-saving in 100% of probabilistic sensitivity analysis simulations versus adalimumab, certolizumab, etanercept and golimumab, in more than 99.6% versus intravenous abatacept and tocilizumab, and in 62.3% versus infliximab. Treatment with subcutaneous abatacept is cost-saving versus intravenous abatacept, adalimumab, certolizumab, etanercept, golimumab, infliximab and tocilizumab in the management of rheumatoid arthritis patients initiating treatment with biological antirheumatic drugs. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Cost-effectiveness analysis of neurocognitive-sparing treatments for brain metastases.
Savitz, Samuel T; Chen, Ronald C; Sher, David J
2015-12-01
Decisions regarding how to treat patients who have 1 to 3 brain metastases require important tradeoffs between controlling recurrences, side effects, and costs. In this analysis, the authors compared novel treatments versus usual care to determine the incremental cost-effectiveness ratio from a payer's (Medicare) perspective. Cost-effectiveness was evaluated using a microsimulation of a Markov model for 60 one-month cycles. The model used 4 simulated cohorts of patients aged 65 years with 1 to 3 brain metastases. The 4 cohorts had a median survival of 3, 6, 12, and 24 months to test the sensitivity of the model to different prognoses. The treatment alternatives evaluated included stereotactic radiosurgery (SRS) with several variants of salvage after recurrence (whole-brain radiotherapy [WBRT], hippocampal avoidance WBRT [HA-WBRT], SRS plus WBRT, and SRS plus HA-WBRT). The findings were tested for robustness using probabilistic and deterministic sensitivity analyses. Traditional radiation therapies remained cost-effective for patients in the 3-month and 6-month cohorts. In the cohorts with longer median survival, HA-WBRT and SRS plus HA-WBRT became cost-effective relative to traditional treatments. When the treatments that involved HA-WBRT were excluded, either SRS alone or SRS plus WBRT was cost-effective relative to WBRT alone. The deterministic and probabilistic sensitivity analyses confirmed the robustness of these results. HA-WBRT and SRS plus HA-WBRT were cost-effective for 2 of the 4 cohorts, demonstrating the value of controlling late brain toxicity with this novel therapy. Cost-effectiveness depended on patient life expectancy. SRS was cost-effective in the cohorts with short prognoses (3 and 6 months), whereas HA-WBRT and SRS plus HA-WBRT were cost-effective in the cohorts with longer prognoses (12 and 24 months). © 2015 American Cancer Society.
Wu, Bin; Ye, Ming; Chen, Huafeng; Shen, Jinfang F
2012-02-01
Adding trastuzumab to a conventional regimen of chemotherapy can improve survival in patients with human epidermal growth factor receptor 2 (HER2)-positive advanced gastric or gastroesophageal junction (GEJ) cancer, but the economic impact of this practice is unknown. The purpose of this cost-effectiveness analysis was to estimate the effects of adding trastuzumab to standard chemotherapy in patients with HER2-positive advanced gastric or GEJ cancer on health and economic outcomes in China. A Markov model was developed to simulate the clinical course of typical patients with HER2-positive advanced gastric or GEJ cancer. Five-year quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs) were estimated. Model inputs were derived from the published literature and government sources. Direct costs were estimated from the perspective of Chinese society. One-way and probabilistic sensitivity analyses were conducted. On baseline analysis, the addition of trastuzumab increased cost and QALY by $56,004.30 (year-2010 US $) and 0.18, respectively, relative to conventional chemotherapy, resulting in an ICER of $251,667.10/QALY gained. Probabilistic sensitivity analyses supported that the addition of trastuzumab was not cost-effective. Budgetary impact analysis estimated that the annual increase in fiscal expenditures would be ~$1 billion. On univariate sensitivity analysis, the median overall survival time for conventional chemotherapy was the most influential factor with respect to the robustness of the model. The findings from the present analysis suggest that the addition of trastuzumab to conventional chemotherapy might not be cost-effective in patients with HER2-positive advanced gastric or GEJ cancer. Copyright © 2012 Elsevier HS Journals, Inc. All rights reserved.
A Probabilistic Design Method Applied to Smart Composite Structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1995-01-01
A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
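The reported relationship between sensitivity factors and reliability improvement can be illustrated with a small Monte Carlo experiment; the resistance-load limit state and statistics are invented:

    import numpy as np

    rng = np.random.default_rng(1)

    def p_fail(mu_r, sd_r, mu_s, sd_s, n=1_000_000):
        # Margin g = resistance - load; failure when g < 0
        g = rng.normal(mu_r, sd_r, n) - rng.normal(mu_s, sd_s, n)
        return (g < 0).mean()

    print("baseline                :", p_fail(200, 20, 150, 10))
    print("halved dominant scatter :", p_fail(200, 10, 150, 10))
    print("raised resistance mean  :", p_fail(220, 20, 150, 10))

Shrinking the scatter of the variable with the largest sensitivity factor, or shifting the mean of a variable in its favorable direction, both lower the failure probability, mirroring the design rules stated in the abstract.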
Losina, Elena; Dervan, Elizabeth E.; Paltiel, A. David; Dong, Yan; Wright, R. John; Spindler, Kurt P.; Mandl, Lisa A.; Jones, Morgan H.; Marx, Robert G.; Safran-Norton, Clare E.; Katz, Jeffrey N.
2015-01-01
Background Arthroscopic partial meniscectomy (APM) is extensively used to relieve pain in patients with symptomatic meniscal tear (MT) and knee osteoarthritis (OA). Recent studies have failed to show the superiority of APM compared to other treatments. We aim to examine whether existing evidence is sufficient to reject use of APM as a cost-effective treatment for MT+OA. Methods We built a patient-level microsimulation using Monte Carlo methods and evaluated three strategies: Physical therapy (‘PT’) alone; PT followed by APM if subjects continued to experience pain (‘Delayed APM’); and ‘Immediate APM’. Our subject population was US adults with symptomatic MT and knee OA over a 10 year time horizon. We assessed treatment outcomes using societal costs, quality-adjusted life years (QALYs), and calculated incremental cost-effectiveness ratios (ICERs), incorporating productivity costs as a sensitivity analysis. We also conducted a value-of-information analysis using probabilistic sensitivity analyses. Results Calculated ICERs were estimated to be $12,900/QALY for Delayed APM as compared to PT and $103,200/QALY for Immediate APM as compared to Delayed APM. In sensitivity analyses, inclusion of time costs made Delayed APM cost-saving as compared to PT. Improving efficacy of Delayed APM led to higher incremental costs and lower incremental effectiveness of Immediate APM in comparison to Delayed APM. Probabilistic sensitivity analyses indicated that PT had 3.0% probability of being cost-effective at a willingness-to-pay (WTP) threshold of $50,000/QALY. Delayed APM was cost effective 57.7% of the time at WTP = $50,000/QALY and 50.2% at WTP = $100,000/QALY. The probability of Immediate APM being cost-effective did not exceed 50% unless WTP exceeded $103,000/QALY. Conclusions We conclude that current cost-effectiveness evidence does not support unqualified rejection of either Immediate or Delayed APM for the treatment of MT+OA. The amount to which society would be willing to pay for additional information on treatment outcomes greatly exceeds the cost of conducting another randomized controlled trial on APM. PMID:26086246
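The probabilities quoted above come from cost-effectiveness acceptability calculations; a minimal sketch from hypothetical PSA draws (the cost and QALY distributions are placeholders, not the study's outputs):

    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000
    # Hypothetical joint PSA output (cost in $, effect in QALYs) for two strategies
    cost_pt = rng.normal(4_000, 800, n)
    cost_apm = rng.normal(6_500, 900, n)
    qaly_pt = rng.normal(6.00, 0.25, n)
    qaly_apm = rng.normal(6.20, 0.25, n)

    for wtp in (50_000, 100_000):
        # Net monetary benefit converts QALYs to dollars at the WTP threshold
        nmb_pt = wtp * qaly_pt - cost_pt
        nmb_apm = wtp * qaly_apm - cost_apm
        p = (nmb_apm > nmb_pt).mean()
        print(f"P(Delayed APM cost-effective at WTP ${wtp:,}): {p:.1%}")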
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
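A minimal Monte Carlo fault-tree sketch of the probabilistic side of such an analysis; the gate structure and basic-event probabilities are illustrative, not the IME or discrete-engine trees:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000
    A = rng.random(n) < 1e-3          # basic event A occurs
    B = rng.random(n) < 1e-3
    C = rng.random(n) < 1e-4
    top = (A & B) | C                 # top event = (A AND B) OR C
    print("estimated top-event probability:", top.mean())
    # Analytic check under independence: P = pA*pB + pC - pA*pB*pC
    print("analytic:", 1e-3 * 1e-3 + 1e-4 - 1e-3 * 1e-3 * 1e-4)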
Budget impact analysis of trastuzumab in early breast cancer: a hospital district perspective.
Purmonen, Timo T; Auvinen, Päivi K; Martikainen, Janne A
2010-04-01
Adjuvant trastuzumab is widely used in HER2-positive (HER2+) early breast cancer, and despite its cost-effectiveness, it causes substantial costs for health care. The purpose of the study was to develop a tool for estimating the budget impact of new cancer treatments. With this tool, we were able to estimate the budget impact of adjuvant trastuzumab, as well as the probability of staying within a given budget constraint. The created model-based evaluation tool was used to explore the budget impact of trastuzumab in early breast cancer in a single Finnish hospital district with 250,000 inhabitants. The used model took into account the number of patients, HER2+ prevalence, length and cost of treatment, and the effectiveness of the therapy. Probabilistic sensitivity analysis and alternative case scenarios were performed to ensure the robustness of the results. Introduction of adjuvant trastuzumab caused substantial costs for a relatively small hospital district. In base-case analysis the 4-year net budget impact was 1.3 million euro. The trastuzumab acquisition costs were partially offset by the reduction in costs associated with the treatment of cancer recurrence and metastatic disease. Budget impact analyses provide important information about the overall economic impact of new treatments, and thus offer complementary information to cost-effectiveness analyses. Inclusion of treatment outcomes and probabilistic sensitivity analysis provides more realistic estimates of the net budget impact. The length of trastuzumab treatment has a strong effect on the budget impact.
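A hedged sketch of the net-budget-impact calculation with the probability of staying within a budget constraint; every input below is an invented placeholder, not the Finnish hospital district's data:

    import numpy as np

    rng = np.random.default_rng(11)
    n = 20_000
    patients = rng.poisson(40, n)                  # eligible HER2+ patients over 4 years
    cost_per_pt = rng.normal(33_000, 4_000, n)     # trastuzumab course cost (EUR)
    offset_per_pt = rng.normal(6_000, 2_000, n)    # avoided recurrence/metastasis costs

    net_impact = patients * (cost_per_pt - offset_per_pt)
    print("mean 4-year net budget impact: EUR", round(net_impact.mean()))
    print("P(net impact <= EUR 1.5M):", (net_impact <= 1_500_000).mean())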
Probabilistic Aeroelastic Analysis of Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.
2004-01-01
A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.
de Geus, S W L; Evans, D B; Bliss, L A; Eskander, M F; Smith, J K; Wolff, R A; Miksad, R A; Weinstein, M C; Tseng, J F
2016-10-01
Neoadjuvant therapy is gaining acceptance as a valid treatment option for borderline resectable pancreatic cancer; however, its value for clearly resectable pancreatic cancer remains controversial. The aim of this study was to use a Markov decision analysis model, in the absence of adequately powered randomized trials, to compare the life expectancy (LE) and quality-adjusted life expectancy (QALE) of neoadjuvant therapy to conventional upfront surgical strategies in resectable pancreatic cancer patients. A Markov decision model was created to compare two strategies: attempted pancreatic resection followed by adjuvant chemoradiotherapy and neoadjuvant chemoradiotherapy followed by restaging with, if appropriate, attempted pancreatic resection. Data obtained through a comprehensive systematic search in PUBMED of the literature from 2000 to 2015 were used to estimate the probabilities used in the model. Deterministic and probabilistic sensitivity analyses were performed. Of the 786 potentially eligible studies identified, 22 studies met the inclusion criteria and were used to extract the probabilities used in the model. Base case analyses of the model showed a higher LE (32.2 vs. 26.7 months) and QALE (25.5 vs. 20.8 quality-adjusted life months) for patients in the neoadjuvant therapy arm compared to upfront surgery. Probabilistic sensitivity analyses for LE and QALE revealed that neoadjuvant therapy is favorable in 59% and 60% of the cases respectively. Although conceptual, these data suggest that neoadjuvant therapy offers substantial benefit in LE and QALE for resectable pancreatic cancer patients. These findings highlight the value of further prospective randomized trials comparing neoadjuvant therapy to conventional upfront surgical strategies. Copyright © 2016 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
Dolk, Christiaan; Eichner, Martin; Welte, Robert; Anastassopoulou, Anastassia; Van Bellinghen, Laure-Anne; Poulsen Nautrup, Barbara; Van Vlaenderen, Ilse; Schmidt-Ott, Ruprecht; Schwehm, Markus; Postma, Maarten
2016-12-01
Seasonal influenza infection is primarily caused by circulation of two influenza A strain subtypes and strains from two B lineages that vary each year. Trivalent influenza vaccine (TIV) contains only one of the two B-lineage strains, resulting in mismatches between vaccine strains and the predominant circulating B lineage. Quadrivalent influenza vaccine (QIV) includes both B-lineage strains. The objective was to estimate the cost-utility of introducing QIV to replace TIV in Germany. An individual-based dynamic transmission model (4Flu) using German data was used to provide realistic estimates of the impact of TIV and QIV on age-specific influenza infections. Cases were linked to health and economic outcomes to calculate the cost-utility of QIV versus TIV, from both a societal and payer perspective. Costs and effects were discounted at 3.0 and 1.5 % respectively, with 2014 as the base year. Univariate and probabilistic sensitivity analyses were conducted. Using QIV instead of TIV resulted in additional quality-adjusted life-years (QALYs) and cost savings from the societal perspective (i.e. it represents the dominant strategy) and an incremental cost-utility ratio (ICUR) of €14,461 per QALY from a healthcare payer perspective. In all univariate analyses, QIV remained cost-effective (ICUR <€50,000). In probabilistic sensitivity analyses, QIV was cost-effective in >98 and >99 % of the simulations from the societal and payer perspective, respectively. This analysis suggests that QIV in Germany would provide additional health gains while being cost-saving to society or costing €14,461 per QALY gained from the healthcare payer perspective, compared with TIV.
Probabilistic analysis of a materially nonlinear structure
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.
1990-01-01
A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
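A simplified Monte Carlo version of this problem, keeping the abstract's input models (Weibull yield stress, normal pressure) but using only elastic Lamé stresses, so the plasticity central to the NESSUS study is deliberately omitted; geometry and distribution parameters are invented:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 200_000
    a, b = 0.05, 0.10                            # inner/outer radius (m)

    yield_stress = 420e6 * rng.weibull(10.0, n)  # Weibull yield stress (Pa)
    pressure = rng.normal(120e6, 12e6, n)        # normal internal pressure (Pa)

    # Elastic Lame hoop stress at the inner wall, the maximum-stress location:
    # sigma_theta(a) = p * (a^2 + b^2) / (b^2 - a^2)
    hoop = pressure * (a**2 + b**2) / (b**2 - a**2)
    margin = yield_stress - hoop                 # < 0 means onset of yielding

    print("P(yield onset):", (margin < 0).mean())
    print("margin percentiles (MPa):",
          np.round(np.percentile(margin, [1, 10, 50]) / 1e6, 1))
    # Crude sensitivity ranking via correlation of each input with the margin
    print("corr(yield, margin):   ", np.corrcoef(yield_stress, margin)[0, 1].round(2))
    print("corr(pressure, margin):", np.corrcoef(pressure, margin)[0, 1].round(2))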
Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.
NASA Astrophysics Data System (ADS)
Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan
2017-09-01
Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long-term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated with these interventions into the LCC models. Reliability, Availability, Maintainability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS in the LCC analysis allows obtaining reliable predictions of system maintenance costs and the dependencies of these costs on specific cost drivers through sensitivity analyses. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the ongoing H2020 project INFRALERT. Such a RAMS & LCC analysis provides relevant probabilistic information to be used for condition- and risk-based planning of maintenance activities as well as for decision support in long-term strategic investment planning.
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
Naci, Huseyin; de Lissovoy, Gregory; Hollenbeak, Christopher; Custer, Brian; Hofmann, Axel; McClellan, William; Gitlin, Matthew
2012-01-01
To determine whether Medicare's decision to cover routine administration of erythropoietin stimulating agents (ESAs) to treat anemia of end-stage renal disease (ESRD) has been a cost-effective policy relative to standard of care at the time. The authors used summary statistics from the actual cohort of ESRD patients receiving ESAs between 1995 and 2004 to create a simulated patient cohort, which was compared with a comparable simulated cohort assumed to rely solely on blood transfusions. Outcomes modeled from the Medicare perspective included estimated treatment costs, life-years gained, and quality-adjusted life-years (QALYs). Incremental cost-effectiveness ratio (ICER) was calculated relative to the hypothetical reference case of no ESA use in the transfusion cohort. Sensitivity of the results to model assumptions was tested using one-way and probabilistic sensitivity analyses. Estimated total costs incurred by the ESRD population were $155.47B for the cohort receiving ESAs and $155.22B for the cohort receiving routine blood transfusions. Estimated QALYs were 2.56M and 2.29M, respectively, for the two groups. The ICER of ESAs compared to routine blood transfusions was estimated as $873 per QALY gained. The model was sensitive to a number of parameters according to one-way and probabilistic sensitivity analyses. This model was counter-factual as the actual comparison group, whose anemia was managed via transfusion and iron supplements, rapidly disappeared following introduction of ESAs. In addition, a large number of model parameters were obtained from observational studies due to the lack of randomized trial evidence in the literature. This study indicates that Medicare's coverage of ESAs appears to have been cost effective based on commonly accepted levels of willingness-to-pay. The ESRD population achieved substantial clinical benefit at a reasonable cost to society.
Carlson, Josh J; Suh, Kangho; Orfanos, Panos; Wong, William
2018-04-01
The recently completed ALEX trial demonstrated that alectinib improved progression-free survival and delayed time to central nervous system progression compared with crizotinib in patients with anaplastic lymphoma kinase-positive non-small-cell lung cancer. However, the long-term clinical and economic impact of using alectinib vs. crizotinib has not been evaluated. The objective of this study was to determine the potential cost utility of alectinib vs. crizotinib from a US payer perspective. A cost-utility model was developed using partition survival methods and three health states: progression-free, post-progression, and death. ALEX trial data informed the progression-free and overall survival estimates. Costs included drug treatments and supportive care (central nervous system and non-central nervous system). Utility values were obtained from trial data and literature. Sensitivity analyses included one-way and probabilistic sensitivity analyses. Treatment with alectinib vs. crizotinib resulted in a gain of 0.91 life-years, 0.87 quality-adjusted life-years, and incremental costs of US$34,151, resulting in an incremental cost-effectiveness ratio of US$39,312/quality-adjusted life-year. Drug costs and utilities in the progression-free health state were the main drivers of the model in the one-way sensitivity analysis. From the probabilistic sensitivity analysis, alectinib had a 64% probability of being cost-effective at a willingness-to-pay threshold of US$100,000/quality-adjusted life-year. Alectinib increased time in the progression-free state and quality-adjusted life-years vs. crizotinib. The marginal cost increase was reflective of longer treatment durations in the progression-free state. Central nervous system-related costs were considerably lower with alectinib. Our results suggest that compared with crizotinib, alectinib may be a cost-effective therapy for treatment-naïve patients with anaplastic lymphoma kinase-positive non-small-cell lung cancer.
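A minimal partition-survival sketch with the three states named above; the survival curves are exponential placeholders, not ALEX trial estimates:

    import numpy as np

    t = np.arange(0, 120)                 # months
    pfs = np.exp(-t / 30.0)               # progression-free survival curve
    os_ = np.exp(-t / 55.0)               # overall survival curve
    os_ = np.maximum(os_, pfs)            # enforce OS >= PFS

    pf = pfs                              # progression-free occupancy
    pd_ = os_ - pfs                       # post-progression occupancy
    dead = 1.0 - os_

    u_pf, u_pd = 0.80, 0.60               # illustrative utilities
    qalys = np.sum(u_pf * pf + u_pd * pd_) / 12.0
    print(f"undiscounted QALYs: {qalys:.2f}")

State occupancy is read directly off the two survival curves rather than from transition probabilities, which is the defining simplification of the partition-survival approach.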
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.; Simonen, F.A.
1992-05-01
Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs.
NASA Technical Reports Server (NTRS)
Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.
1995-01-01
Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.
Wu, James X; Sacks, Greg D; Dawes, Aaron J; DeUgarte, Daniel; Lee, Steven L
2017-07-01
Several studies have demonstrated the safety and short-term success of nonoperative management in children with acute, uncomplicated appendicitis. Nonoperative management spares the patients and their family the upfront cost and discomfort of surgery, but also risks recurrent appendicitis. Using decision-tree software, we evaluated the cost-effectiveness of nonoperative management versus routine laparoscopic appendectomy. Model variables were abstracted from a review of the literature, Healthcare Cost and Utilization Project, and Medicare Physician Fee schedule. Model uncertainty was assessed using both one-way and probabilistic sensitivity analyses. We used a $100,000 per quality adjusted life year (QALY) threshold for cost-effectiveness. Operative management cost $11,119 and yielded 23.56 quality-adjusted life months (QALMs). Nonoperative management cost $2277 less than operative management, but yielded 0.03 fewer QALMs. The incremental cost-to-effectiveness ratio of routine laparoscopic appendectomy was $910,800 per QALY gained. This greatly exceeds the $100,000/QALY threshold and was not cost-effective. One-way sensitivity analysis found that operative management would become cost-effective if the 1-year recurrence rate of acute appendicitis exceeded 39.8%. Probabilistic sensitivity analysis indicated that nonoperative management was cost-effective in 92% of simulations. Based on our model, nonoperative management is more cost-effective than routine laparoscopic appendectomy for children with acute, uncomplicated appendicitis. Cost-Effectiveness Study: Level II. Published by Elsevier Inc.
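A hedged decision-tree sketch that reproduces the abstract's base case (incremental cost $2,277, 0.03 fewer QALMs, ICER $910,800/QALY) and shows how the ICER collapses as the recurrence rate rises; the assumed base recurrence rate and per-recurrence decrements are placeholders, not the study's inputs:

    COST_OP, QALM_OP = 11_119.0, 23.56          # operative arm (from the abstract)
    BASE_NONOP_COST, BASE_NONOP_QALM = 8_842.0, 23.53
    BASE_RECUR = 0.15                           # assumed base 1-year recurrence rate

    def nonop(p):
        # Each extra recurrence adds one appendectomy's cost and a small QALM loss;
        # both increments are placeholders.
        cost = BASE_NONOP_COST + (p - BASE_RECUR) * COST_OP
        qalm = BASE_NONOP_QALM - (p - BASE_RECUR) * 0.3
        return cost, qalm

    for p in (0.15, 0.25, 0.40):
        c, q = nonop(p)
        d_cost, d_qaly = COST_OP - c, (QALM_OP - q) / 12.0
        if d_cost <= 0:
            print(f"recurrence {p:.0%}: surgery saves money and QALYs (dominant)")
        else:
            print(f"recurrence {p:.0%}: ICER of surgery ${d_cost / d_qaly:,.0f}/QALY")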
Optimal management of colorectal liver metastases in older patients: a decision analysis
Yang, Simon; Alibhai, Shabbir MH; Kennedy, Erin D; El-Sedfy, Abraham; Dixon, Matthew; Coburn, Natalie; Kiss, Alex; Law, Calvin HL
2014-01-01
Background Comparative trials evaluating management strategies for colorectal cancer liver metastases (CLM) are lacking, especially for older patients. This study developed a decision-analytic model to quantify outcomes associated with treatment strategies for CLM in older patients. Methods A Markov-decision model was built to examine the effect on life expectancy (LE) and quality-adjusted life expectancy (QALE) for best supportive care (BSC), systemic chemotherapy (SC), radiofrequency ablation (RFA) and hepatic resection (HR). The baseline patient cohort assumptions included healthy 70-year-old CLM patients after a primary cancer resection. Event and transition probabilities and utilities were derived from a literature review. Deterministic and probabilistic sensitivity analyses were performed on all study parameters. Results In base case analysis, BSC, SC, RFA and HR yielded LEs of 11.9, 23.1, 34.8 and 37.0 months, and QALEs of 7.8, 13.2, 22.0 and 25.0 months, respectively. Model results were sensitive to age, comorbidity, length of model simulation and utility after HR. Probabilistic sensitivity analysis showed increasing preference for RFA over HR with increasing patient age. Conclusions HR may be optimal for healthy 70-year-old patients with CLM. In older patients with comorbidities, RFA may provide better LE and QALE. Treatment decisions in older cancer patients should account for patient age, comorbidities, local expertise and individual values. PMID:24961482
Internal validation of STRmix™ for the interpretation of single source and mixed DNA profiles.
Moretti, Tamyra R; Just, Rebecca S; Kehl, Susannah C; Willis, Leah E; Buckleton, John S; Bright, Jo-Anne; Taylor, Duncan A; Onorato, Anthony J
2017-07-01
The interpretation of DNA evidence can entail analysis of challenging STR typing results. Genotypes inferred from low quality or quantity specimens, or mixed DNA samples originating from multiple contributors, can result in weak or inconclusive match probabilities when a binary interpretation method and necessary thresholds (such as a stochastic threshold) are employed. Probabilistic genotyping approaches, such as fully continuous methods that incorporate empirically determined biological parameter models, enable usage of more of the profile information and reduce subjectivity in interpretation. As a result, software-based probabilistic analyses tend to produce more consistent and more informative results regarding potential contributors to DNA evidence. Studies to assess and internally validate the probabilistic genotyping software STRmix™ for casework usage at the Federal Bureau of Investigation Laboratory were conducted using lab-specific parameters and more than 300 single-source and mixed contributor profiles. Simulated forensic specimens, including constructed mixtures that included DNA from two to five donors across a broad range of template amounts and contributor proportions, were used to examine the sensitivity and specificity of the system via more than 60,000 tests comparing hundreds of known contributors and non-contributors to the specimens. Conditioned analyses, concurrent interpretation of amplification replicates, and application of an incorrect contributor number were also performed to further investigate software performance and probe the limitations of the system. In addition, the results from manual and probabilistic interpretation of both prepared and evidentiary mixtures were compared. The findings support that STRmix™ is sufficiently robust for implementation in forensic laboratories, offering numerous advantages over historical methods of DNA profile analysis and greater statistical power for the estimation of evidentiary weight, and can be used reliably in human identification testing. With few exceptions, likelihood ratio results reflected intuitively correct estimates of the weight of the genotype possibilities and known contributor genotypes. This comprehensive evaluation provides a model in accordance with SWGDAM recommendations for internal validation of a probabilistic genotyping system for DNA evidence interpretation. Copyright © 2017. Published by Elsevier B.V.
Probabilistic load simulation: Code development status
NASA Astrophysics Data System (ADS)
Newell, J. F.; Ho, H.
1991-05-01
The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction
Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip
2015-01-01
Random forests consisting of an ensemble of regression trees with equal weights are frequently used for design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We analyzed the ensemble prediction of the probabilistic trees both as a mixture distribution and as a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic data and the Cancer Cell Line Encyclopedia dataset and illustrated that tree weights can be selected to reduce the average length of the CI without an increase in mean error. PMID:27081304
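The mixture-distribution view mentioned above can be made concrete in a few lines; the per-tree means, variances, and weights below are illustrative:

    import numpy as np

    mu = np.array([2.1, 1.8, 2.4, 2.0])       # per-tree predictive means
    var = np.array([0.30, 0.25, 0.40, 0.20])  # per-tree predictive variances
    w = np.array([0.35, 0.25, 0.20, 0.20])    # tree weights, sum to 1

    m = np.sum(w * mu)                                    # mixture mean
    v = np.sum(w * (var + mu**2)) - m**2                  # mixture variance
    ci = (m - 1.96 * np.sqrt(v), m + 1.96 * np.sqrt(v))   # normal-approx 95% CI
    print(f"prediction {m:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")

Because the mixture variance depends on the weights, optimizing the weights can tighten the CI without changing the mean prediction much, which is the trade-off the article exploits.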
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
NASA Astrophysics Data System (ADS)
Králik, Juraj
2017-07-01
The paper presents a probabilistic and sensitivity analysis of the efficiency of the damping devices protecting the nuclear power plant cover under the impact of a dropped TK C30 nuclear fuel container. A three-dimensional finite element idealization of the nuclear power plant structure is used. A steel pipe damper system is proposed to dissipate the kinetic energy of the container's free fall. Experimental results on the behavior of the basic shock-damper element under impact loads are presented. The Newmark integration method is used to solve the dynamic equations. The sensitivity and probabilistic analyses of the damping devices were carried out in the AntHILL and ANSYS software.
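A compact sketch of average-acceleration Newmark integration for a single-degree-of-freedom system under a short impact pulse; the structural parameters and pulse are illustrative, not the plant or container model:

    m, c, k = 1.0e4, 2.0e4, 4.0e6        # mass (kg), damping (N s/m), stiffness (N/m)
    beta, gamma = 0.25, 0.5              # average-acceleration Newmark parameters
    dt, nsteps = 1.0e-3, 2000

    u = v = a = 0.0                      # at rest, zero load at t = 0
    keff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    peak = 0.0
    for i in range(nsteps):
        t = (i + 1) * dt
        f = 5.0e5 if t < 0.02 else 0.0   # rectangular impact pulse (N)
        # Effective load built from the previous step's state
        feff = (f
                + m * (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
                + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                       + dt * (gamma / (2 * beta) - 1) * a))
        u_new = feff / keff
        a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
        v = v + dt * ((1 - gamma) * a + gamma * a_new)
        u, a = u_new, a_new
        peak = max(peak, abs(u))
    print("peak displacement (m):", round(peak, 4))

In a probabilistic run, the inputs (mass, stiffness, pulse magnitude) would be sampled from distributions and this integration repeated per sample.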
Efficient Sensitivity Methods for Probabilistic Lifing and Engine Prognostics
2010-09-01
AFRL-RX-WP-TR-2010-4297: Efficient Sensitivity Methods for Probabilistic Lifing and Engine Prognostics. Harry Millwater, Ronald Bagley, Jose Garza, D. Wagner, Andrew Bates, and Andy Voorhees.
Ruiz-Ramos, Jesus; Frasquet, Juan; Romá, Eva; Poveda-Andres, Jose Luis; Salavert-Leti, Miguel; Castellanos, Alvaro; Ramirez, Paula
2017-06-01
To evaluate the cost-effectiveness of antimicrobial stewardship (AS) program implementation focused on critical care units based on assumptions for the Spanish setting. A decision model comparing costs and outcomes of sepsis, community-acquired pneumonia, and nosocomial infections (including catheter-related bacteremia, urinary tract infection, and ventilator-associated pneumonia) in critical care units with or without an AS was designed. Model variables and costs, along with their distributions, were obtained from the literature. The study was performed from the Spanish National Health System (NHS) perspective, including only direct costs. The incremental cost-effectiveness ratio (ICER) was analysed with regard to the ability of the program to reduce multi-drug resistant bacteria. Uncertainty in ICERs was evaluated with probabilistic sensitivity analyses. In the short term, implementing an AS reduces the consumption of antimicrobials with a net benefit of €71,738. In the long term, the maintenance of the program involves an additional cost to the system of €107,569. Cost per avoided resistance was €7,342, and cost per life-year gained (LYG) was €9,788. Results from the probabilistic sensitivity analysis showed that there was a more than 90% likelihood that an AS would be cost-effective at a level of €8,000 per LYG. Limitations include the wide variability of the economic results obtained from implementing this type of AS program and the limited information on its impact on patient outcomes and on the resistance avoided. Implementing an AS focusing on critical care patients is a long-term cost-effective tool. Implementation costs are amortized by reducing antimicrobial consumption to prevent infection by multidrug-resistant pathogens.
Application of a stochastic snowmelt model for probabilistic decisionmaking
NASA Technical Reports Server (NTRS)
Mccuen, R. H.
1983-01-01
A stochastic form of the snowmelt runoff model that can be used for probabilistic decision-making was developed. The use of probabilistic streamflow predictions instead of single valued deterministic predictions leads to greater accuracy in decisions. While the accuracy of the output function is important in decisionmaking, it is also important to understand the relative importance of the coefficients. Therefore, a sensitivity analysis was made for each of the coefficients.
Economic evaluation of floseal compared to nasal packing for the management of anterior epistaxis.
Le, Andre; Thavorn, Kednapa; Lasso, Andrea; Kilty, Shaun J
2018-01-04
To evaluate the cost-effectiveness of Floseal, a topically applied hemostatic agent, and nasal packing for the management of epistaxis in Canada. Outcomes research, a cost-utility analysis. We developed a Markov model to compare the costs and health outcomes of Floseal with nasal packing over a lifetime horizon from the perspective of a publicly funded healthcare system. A cycle length of 1 year was used. Efficacy of Floseal and packing was sought from the published literature. Unit costs were gathered from a hospital case costing system, whereas physician fees were extracted from the Ontario Schedule of Benefits for Physician Services. Results were expressed as an incremental cost per quality-adjusted life year (QALY) gained. A series of one-way sensitivity and probabilistic sensitivity analyses were performed. From the perspective of a publicly funded healthcare system, the Floseal treatment strategy was associated with higher costs ($2,067) and greater QALYs (0.27) than nasal packing. Our findings were highly sensitive to discount rates, the cost of Floseal, and the cost of nasal packing. The probabilistic sensitivity analysis suggested that the probability that Floseal treatment is cost-effective reached 99% if the willingness-to-pay threshold was greater than $120,000 per QALY gained. Prior studies have demonstrated Floseal to be an effective treatment for anterior epistaxis. In the Canadian healthcare system, Floseal treatment appears to be a cost-effective treatment option compared to nasal packing for anterior epistaxis. 2c Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.
Is probabilistic bias analysis approximately Bayesian?
MacLehose, Richard F.; Gustafson, Paul
2011-01-01
Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
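A minimal sketch of the iterative sampling the authors compare with Bayesian adjustment, here correcting nondifferential exposure misclassification via the classic matrix method; all counts and Beta priors are invented:

    import numpy as np

    rng = np.random.default_rng(9)
    n_iter = 50_000
    a_obs, n_cases = 300, 1000        # exposed cases / total cases (observed)
    b_obs, n_ctrls = 200, 1000        # exposed controls / total controls (observed)

    se = rng.beta(80, 20, n_iter)     # prior on classification sensitivity (~0.8)
    sp = rng.beta(95, 5, n_iter)      # prior on specificity (~0.95)

    def correct(x_obs, n):
        # Invert E[x_obs] = se * x_true + (1 - sp) * (n - x_true)
        return (x_obs - (1 - sp) * n) / (se + sp - 1)

    a = correct(a_obs, n_cases)
    b = correct(b_obs, n_ctrls)
    ok = (a > 0) & (a < n_cases) & (b > 0) & (b < n_ctrls)   # keep admissible draws
    or_adj = (a[ok] / (n_cases - a[ok])) / (b[ok] / (n_ctrls - b[ok]))
    print("median bias-adjusted OR:", np.round(np.median(or_adj), 2))

Discarding inadmissible draws is one of the places where this procedure departs from a formal Bayesian posterior, which is part of the distinction the article examines.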
Bucci, Monica; Mandelli, Maria Luisa; Berman, Jeffrey I.; Amirbekian, Bagrat; Nguyen, Christopher; Berger, Mitchel S.; Henry, Roland G.
2013-01-01
Introduction Diffusion MRI tractography has been increasingly used to delineate white matter pathways in vivo for which the leading clinical application is presurgical mapping of eloquent regions. However, there is rarely an opportunity to quantify the accuracy or sensitivity of these approaches to delineate white matter fiber pathways in vivo due to the lack of a gold standard. Intraoperative electrical stimulation (IES) provides a gold standard for the location and existence of functional motor pathways that can be used to determine the accuracy and sensitivity of fiber tracking algorithms. In this study, we used intraoperative stimulation from brain tumor patients as a gold standard to estimate the sensitivity and accuracy of diffusion tensor MRI (DTI) and q-ball models of diffusion with deterministic and probabilistic fiber tracking algorithms for delineation of motor pathways. Methods We used preoperative high angular resolution diffusion MRI (HARDI) data (55 directions, b = 2000 s/mm2) acquired in a clinically feasible time frame from 12 patients who underwent a craniotomy for resection of a cerebral glioma. The corticospinal fiber tracts were delineated with DTI and q-ball models using deterministic and probabilistic algorithms. We used cortical and white matter IES sites as a gold standard for the presence and location of functional motor pathways. Sensitivity was defined as the true positive rate of delineating fiber pathways based on cortical IES stimulation sites. For accuracy and precision of the course of the fiber tracts, we measured the distance between the subcortical stimulation sites and the tractography result. The positive predictive rate of the delineated tracts was assessed by comparison of subcortical IES motor function (upper extremity, lower extremity, face) with the connection of the tractography pathway in the motor cortex. Results We obtained 21 cortical and 8 subcortical IES sites from intraoperative mapping of motor pathways. Probabilistic q-ball had the best sensitivity (79%) as determined from cortical IES compared to deterministic q-ball (50%), probabilistic DTI (36%), and deterministic DTI (10%). The sensitivity using the q-ball algorithm (65%) was significantly higher than using DTI (23%) (p < 0.001), and the probabilistic algorithms (58%) were more sensitive than deterministic approaches (30%) (p = 0.003). Probabilistic q-ball fiber tracks had the smallest offset to the subcortical stimulation sites. The offsets between diffusion fiber tracks and subcortical IES sites were increased significantly for those cases where the diffusion fiber tracks were visibly thinner than expected. There was perfect concordance between the subcortical IES function (e.g. hand stimulation) and the cortical connection of the nearest diffusion fiber track (e.g. upper extremity cortex). Discussion This study highlights the tremendous utility of intraoperative stimulation sites to provide a gold standard from which to evaluate diffusion MRI fiber tracking methods and has provided an objective standard for evaluation of different diffusion models and approaches to fiber tracking. The probabilistic q-ball fiber tractography was significantly better than DTI methods in terms of sensitivity and accuracy of the course through the white matter. The commonly used DTI fiber tracking approach was shown to have very poor sensitivity (as low as 10% for deterministic DTI fiber tracking) for delineation of the lateral aspects of the corticospinal tract in our study.
The presence of tumor and edema resulted in significantly larger offsets between the subcortical IES sites and the preoperative fiber tracks. These data show that probabilistic HARDI tractography is the most objective and reproducible analysis, but given the small sample and limited number of stimulation points, generalizations from our results should be made with caution. Indeed, our results inform the capabilities of preoperative diffusion fiber tracking and indicate that such data should be used carefully when making pre-surgical and intra-operative management decisions. PMID:24273719
Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.
Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony
2005-04-01
Recently the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into a broader context of the requirements for decision making; to explain the general approach that was taken in its development; and to address each of the issues which have been raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis which can best meet these needs can then be identified. It will be argued that the guidance on dealing with uncertainty and, in particular, the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need fully to characterise decision uncertainty and to inform the research agenda will be identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.
The influence of number line estimation precision and numeracy on risky financial decision making.
Park, Inkyung; Cho, Soohyun
2018-01-10
This study examined whether different aspects of mathematical proficiency influence one's ability to make adaptive financial decisions. "Numeracy" refers to the ability to process numerical and probabilistic information and is commonly reported as an important factor which contributes to financial decision-making ability. The precision of mental number representation (MNR), measured with the number line estimation (NLE) task has been reported to be another critical factor. This study aimed to examine the contribution of these mathematical proficiencies while controlling for the influence of fluid intelligence, math anxiety and personality factors. In our decision-making task, participants chose between two options offering probabilistic monetary gain or loss. Sensitivity to expected value was measured as an index for the ability to discriminate between optimal versus suboptimal options. Partial correlation and hierarchical regression analyses revealed that NLE precision better explained EV sensitivity compared to numeracy, after controlling for all covariates. These results suggest that individuals with more precise MNR are capable of making more rational financial decisions. We also propose that the measurement of "numeracy," which is commonly used interchangeably with general mathematical proficiency, should include more diverse aspects of mathematical cognition including basic understanding of number magnitude. © 2018 International Union of Psychological Science.
Huang, Huan; Taylor, Douglas C A; Carson, Robyn T; Sarocco, Phil; Friedman, Mark; Munsell, Michael; Blum, Steven I; Menzin, Joseph
2015-04-01
To use decision-analytic modeling techniques to evaluate the effectiveness and costs of linaclotide vs lubiprostone in the treatment of adult patients with irritable bowel syndrome with constipation (IBS-C). Using model inputs derived from published literature, linaclotide Phase III trial data, and a physician survey, a decision-tree model was constructed. Response to therapy was defined as (1) a ≥ 14-point increase from baseline in IBS-Quality-of-Life (IBS-QoL) questionnaire overall score at week 12 or (2) one of the top two responses (moderately/significantly relieved) on a 7-point IBS symptom relief question in ≥ 2 of 3 months. Patients who do not respond to therapy are assumed to fail therapy and accrue costs associated with treatment failure. The model time horizon is aligned with the 12-week clinical trial duration. Model outputs include the number of responders, quality-adjusted life-years (QALYs), and total costs (direct and indirect). Both one-way and probabilistic sensitivity analyses were conducted. Treatment for IBS-C with linaclotide produced more responders than lubiprostone for both response definitions (19.3% vs 13.0% and 61.8% vs 57.2% for IBS-QoL and symptom relief, respectively), lower per-patient costs ($803 vs $911 and $977 vs $1056), and higher QALYs (0.1921 vs 0.1917 and 0.1909 vs 0.1894) over the 12-week time horizon. Results were similar for most one-way sensitivity analyses. In probabilistic sensitivity analyses, the majority of simulations resulted in linaclotide having higher treatment response rates and lower per-patient costs. No head-to-head trials comparing linaclotide with lubiprostone are available; therefore, placebo-adjusted estimates of relative efficacy were derived for model inputs. The time horizon of the model is relatively short, as it was limited to the duration of available clinical trial data. Linaclotide was found to be a less costly option vs lubiprostone for the treatment of adult patients with IBS-C.
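A minimal sketch of the responder/non-responder decision-tree logic such a model rests on; every numeric input below is a hypothetical placeholder, not a trial-derived value:

```python
# Two-arm responder/non-responder decision tree: expected cost and QALYs
# per arm, then a dominance check or ICER. All inputs are hypothetical.

def arm_outcomes(p_respond, cost_drug, cost_failure, qaly_responder, qaly_failure):
    """Expected cost and QALYs for one branch of the decision tree."""
    cost = cost_drug + (1 - p_respond) * cost_failure
    qalys = p_respond * qaly_responder + (1 - p_respond) * qaly_failure
    return cost, qalys

c_a, q_a = arm_outcomes(0.19, 500, 400, 0.20, 0.18)  # arm A (hypothetical)
c_b, q_b = arm_outcomes(0.13, 550, 400, 0.20, 0.18)  # arm B (hypothetical)

if c_a <= c_b and q_a >= q_b:
    print("Arm A dominates (cheaper and more effective)")
else:
    print(f"ICER = {(c_a - c_b) / (q_a - q_b):,.0f} per QALY")
```

With these placeholder inputs arm A dominates; under other inputs the last line reports the ICER instead.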
Mandrik, Olena; Corro Ramos, Isaac; Knies, Saskia; Al, Maiwenn; Severens, Johan L
2015-01-01
The aim of this study was to assess the cost-effectiveness, from a health care perspective, of adding rituximab to the fludarabine and cyclophosphamide regimen (FCR versus FC) for treatment-naïve and refractory/relapsed Ukrainian patients with chronic lymphocytic leukemia. A decision-analytic Markov cohort model with three health states and a 1-month cycle time was developed and run over a lifetime horizon. Data from two multinational, prospective, open-label Phase 3 studies were used to assess patients' survival. While utilities were generalized from UK data, local resource utilization and disease-associated treatment, hospitalization, and side-effect costs were applied. An alternative scenario was run to assess the impact of the lower life expectancy of the general population in Ukraine on the incremental cost-effectiveness ratio (ICER) for treatment-naïve patients. One-way, two-way, and probabilistic sensitivity analyses were conducted to assess the robustness of the results. The ICER (in US dollars) of treating chronic lymphocytic leukemia patients with FCR versus FC is US$8,704 per quality-adjusted life year gained for treatment-naïve patients and US$11,056 for refractory/relapsed patients. When survival data were adjusted to the lower life expectancy of the general population in Ukraine, the ICER for treatment-naïve patients exceeded US$13,000, more than three times the current gross domestic product per capita in Ukraine. Sensitivity analyses showed a high impact of rituximab costs and a moderate impact of differences in utilities on the ICER. Furthermore, probabilistic sensitivity analyses showed that for refractory/relapsed patients the probability of FCR being cost-effective is higher than for treatment-naïve patients and is close to one if the threshold exceeds US$15,000. State coverage of rituximab may be considered cost-effective for the Ukrainian population under conditions of economic stability, cost-effectiveness threshold growth, or rituximab price negotiations.
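A toy three-state monthly Markov cohort trace of the kind described above, assuming invented transition probabilities, state costs, utilities, and a 3% annual discount rate:

```python
import numpy as np

# Three states: progression-free, progressed, dead; 1-month cycle.
P = np.array([[0.97, 0.02, 0.01],   # from progression-free
              [0.00, 0.95, 0.05],   # from progressed
              [0.00, 0.00, 1.00]])  # dead is absorbing
cost = np.array([1200.0, 800.0, 0.0])   # monthly cost per state (invented)
utility = np.array([0.80, 0.60, 0.0])   # utility weight per state (invented)
disc = (1 + 0.03) ** (-1 / 12)          # monthly discount factor (3%/yr)

state = np.array([1.0, 0.0, 0.0])       # cohort starts progression-free
total_cost = total_qalys = 0.0
for month in range(12 * 40):            # approximate lifetime horizon
    total_cost += state @ cost * disc ** month
    total_qalys += state @ utility / 12 * disc ** month
    state = state @ P                   # propagate the cohort one cycle
print(f"cost={total_cost:,.0f}, QALYs={total_qalys:.2f}")
```

Running the trace for each arm and differencing the totals gives the incremental cost and QALYs that feed the ICER.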
Palbociclib in hormone receptor positive advanced breast cancer: A cost-utility analysis.
Raphael, J; Helou, J; Pritchard, K I; Naimark, D M
2017-11-01
The addition of palbociclib to letrozole improves progression-free survival in the first-line treatment of hormone receptor positive advanced breast cancer (ABC). This study assesses the cost-utility of palbociclib from the Canadian healthcare payer perspective. A probabilistic discrete event simulation (DES) model was developed and parameterised with data from the PALOMA 1 and 2 trials and other sources. The incremental cost per quality-adjusted life-month (QALM) gained for palbociclib was calculated. A time horizon of 15 years was used in the base case, with costs and effectiveness discounted at 5% annually. Time-to-progression and time-to-death were derived from Weibull and exponential distributions, respectively. Expected costs were based on Ontario fees and other sources. Probabilistic sensitivity analyses were conducted to account for parameter uncertainty. Compared to letrozole, the addition of palbociclib provided an additional 14.7 QALMs at an incremental cost of $161,508. The resulting incremental cost-effectiveness ratio was $10,999/QALM gained. Assuming a willingness-to-pay (WTP) of $4167/QALM, the probability of palbociclib being cost-effective was 0%. Cost-effectiveness acceptability curves derived from a probabilistic sensitivity analysis showed that at a WTP of $11,000/QALM gained, the probability of palbociclib being cost-effective was 50%. The addition of palbociclib to letrozole is unlikely to be cost-effective for the treatment of ABC from a Canadian healthcare perspective at its current price. While ABC patients derive a meaningful clinical benefit from palbociclib, consideration should be given to increasing the WTP threshold and reducing the drug price to render this strategy more affordable. Copyright © 2017 Elsevier Ltd. All rights reserved.
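A sketch of the per-patient sampling step that such a DES model rests on, with invented Weibull and exponential parameters standing in for the trial-derived fits:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample time-to-progression (Weibull) and time-to-death (exponential)
# per simulated patient; all shape/scale values are illustrative.
n = 10_000
shape, scale_months = 1.4, 24.0             # Weibull TTP (hypothetical)
ttp = scale_months * rng.weibull(shape, n)
ttd = rng.exponential(48.0, n)              # exponential TTD (hypothetical)
ttp = np.minimum(ttp, ttd)                  # death censors progression

print(f"median TTP = {np.median(ttp):.1f} months, "
      f"median OS = {np.median(ttd):.1f} months")
```

Costs and QALMs then accrue along each sampled patient history, and re-drawing the distribution parameters across runs produces the probabilistic sensitivity analysis.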
Probabilistic Integrated Assessment of "Dangerous" Climate Change
NASA Astrophysics Data System (ADS)
Mastrandrea, Michael D.; Schneider, Stephen H.
2004-04-01
Climate policy decisions are being made despite layers of uncertainty. Such decisions directly influence the potential for "dangerous anthropogenic interference with the climate system." We mapped a metric for this concept, based on the Intergovernmental Panel on Climate Change assessment of climate impacts, onto probability distributions of future climate change produced from uncertainty in key parameters of the coupled social-natural system: climate sensitivity, climate damages, and discount rate. Analyses with a simple integrated assessment model found that, under midrange assumptions, endogenously calculated, optimal climate policy controls can reduce the probability of dangerous anthropogenic interference from ~45% under minimal controls to near zero.
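A toy Monte Carlo in this spirit: propagate an assumed climate-sensitivity distribution into a probability of exceeding a warming threshold. The distribution, CO2 trajectory, and threshold below are all illustrative, not the paper's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample climate sensitivity (warming per CO2 doubling), convert an assumed
# future CO2 level into warming, and estimate an exceedance probability.
sensitivity = rng.lognormal(np.log(3.0), 0.4, 100_000)  # K per doubling
co2_ratio = 550.0 / 280.0                   # assumed future / preindustrial
warming = sensitivity * np.log2(co2_ratio)
print(f"P(warming > 2 K) = {np.mean(warming > 2.0):.0%}")
```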
Hypoglycemia alarm enhancement using data fusion.
Skladnev, Victor N; Tarnavskii, Stanislav; McGregor, Thomas; Ghevondian, Nejhdeh; Gourlay, Steve; Jones, Timothy W
2010-01-01
The acceptance of closed-loop blood glucose (BG) control using continuous glucose monitoring systems (CGMS) is likely to improve with enhanced performance of their integral hypoglycemia alarms. This article presents an in silico analysis (based on clinical data) of a modeled CGMS alarm system with trained thresholds for type 1 diabetes mellitus (T1DM) patients, augmented by sensor fusion from a prototype hypoglycemia alarm system (HypoMon). This prototype alarm system is based on largely independent autonomic nervous system (ANS) response features. Alarm performance was modeled using overnight BG profiles recorded previously in 98 T1DM volunteers. These data included the corresponding ANS response features detected by HypoMon (AiMedics Pty. Ltd.) systems. CGMS data and alarms were simulated by applying a probabilistic model to these overnight BG profiles. The probabilistic model used a mean response delay of 7.1 minutes, measurement error offsets on each sample with standard deviation (SD) 4.5 mg/dl (0.25 mmol/liter), and vertical shifts (calibration offsets) with SD 19.8 mg/dl (1.1 mmol/liter). Modeling produced 90 to 100 simulated measurements per patient. Alarm systems for all analyses were optimized on a training set of 46 patients and evaluated on a test set of 56 patients. The split between the sets was based on enrollment dates. Optimization was based on detection accuracy but not time to detection. The contribution of this form of data fusion to hypoglycemia alarm performance was evaluated by comparing the performance of the trained CGMS and fused-data algorithms on the test set under the same evaluation conditions. The simulated addition of HypoMon data produced an improvement in CGMS hypoglycemia alarm performance of 10% at equal specificity. Sensitivity improved from 87% (CGMS as stand-alone measurement) to 97% for the enhanced alarm system. Specificity was held constant at 85%. Positive predictive values on the test set improved from 61% to 66%, with negative predictive values improving from 96% to 99%. These enhancements were stable within sensitivity analyses. Sensitivity analyses also suggested larger performance increases at lower CGMS alarm performance levels. Autonomic nervous system response features provide complementary information suitable for fusion with CGMS data to enhance nocturnal hypoglycemia alarms. © 2010 Diabetes Technology Society.
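A minimal simulation of the CGMS noise model described above (sensor lag, per-sample error, per-night calibration offset); the underlying BG trace, the 5-minute sampling grid, and the alarm threshold are assumptions added for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Apply a ~7.1-minute lag, per-sample noise (SD 4.5 mg/dl) and a per-night
# calibration offset (SD 19.8 mg/dl) to a placeholder true-BG trace.
minutes = np.arange(0, 480, 5)              # one simulated night, 5-min grid
true_bg = 90 + 25 * np.sin(minutes / 90.0)  # placeholder BG profile (mg/dl)

lag_samples = round(7.1 / 5)                # mean response delay in samples
delayed = np.roll(true_bg, lag_samples)
delayed[:lag_samples] = true_bg[0]          # pad the start of the night

cgms = delayed + rng.normal(0, 4.5, true_bg.size) + rng.normal(0, 19.8)
alarm = cgms < 70                           # simple threshold alarm (mg/dl)
print(f"{alarm.sum()} alarm samples out of {alarm.size}")
```

Repeating this per recorded profile yields the 90-100 simulated measurements per patient against which alarm thresholds can be trained and tested.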
Blázquez-Pérez, Antonio; San Miguel, Ramón; Mar, Javier
2013-10-01
Chronic hepatitis C is the leading cause of chronic liver disease, representing a significant burden in terms of morbidity, mortality and costs. A new scenario of therapy for hepatitis C virus (HCV) genotype 1 infection is being established with the approval of two effective HCV protease inhibitors (PIs) in combination with the standard of care (SOC), peginterferon and ribavirin. Our objective was to estimate the cost-effectiveness of combination therapy with the new PIs (boceprevir and telaprevir) plus peginterferon and ribavirin versus SOC in treatment-naive patients with HCV genotype 1, according to data obtained from clinical trials (CTs). A Markov model simulating chronic HCV progression was used to estimate disease treatment costs and effects over patients' lifetimes, in the Spanish national public healthcare system. The target population was treatment-naive patients with chronic HCV genotype 1, whose demographic characteristics were obtained from the published pivotal CTs SPRINT and ADVANCE. Three options were analysed for each PI based on results from the two CTs: universal triple therapy, interleukin (IL)-28B-guided therapy and dual therapy with peginterferon and ribavirin. A univariate sensitivity analysis was performed to evaluate the uncertainty of certain parameters: age at start of treatment, transition probabilities, drug costs, CT efficacy results and a higher hazard ratio for all-cause mortality for patients with chronic HCV. Probabilistic sensitivity analyses were also carried out. Incremental cost-effectiveness ratios (ICERs), expressed in 2012 euros per quality-adjusted life-year (QALY) gained, were used as outcome measures. According to the base-case analysis, using dual therapy as the comparator, the IL28B-guided therapy alternative presents a more favorable ICER (€18,079/QALY for boceprevir and €25,914/QALY for telaprevir) than the universal triple therapy option (€27,594/QALY for boceprevir and €33,751/QALY for telaprevir), with an ICER clearly below the efficiency threshold for medical interventions in the Spanish setting. Sensitivity analysis showed that age at the beginning of treatment was an important factor influencing the ICER. A potential reduction in PI costs would also clearly improve the ICER, and transition probabilities influenced the results, but to a lesser extent. Probabilistic sensitivity analyses showed that 95% of the simulations presented an ICER below €40,000/QALY. Post hoc estimation of sustained virological responses for the IL28B-guided therapeutic option is a limitation of the study. The therapeutic options analysed can be considered cost-effective interventions for the base-case cohort in the Spanish healthcare framework. Sensitivity analysis indicated that the IL28B-guided strategy remains acceptable for patients younger than 60 years.
Cost-effectiveness analysis of treatment strategies for initial Clostridium difficile infection.
Varier, R U; Biltaji, E; Smith, K J; Roberts, M S; Jensen, M K; LaFleur, J; Nelson, R E
2014-12-01
Clostridium difficile infection (CDI) is costly. Current guidelines recommend metronidazole as first-line therapy and vancomycin as an alternative. Recurrence is common. Faecal microbiota transplantation (FMT) is an effective therapy for recurrent CDI (RCDI). This study explores the cost-effectiveness of FMT, vancomycin and metronidazole for initial CDI. We constructed a decision-analytic computer simulation using inputs from the published literature to compare FMT with a 10-14-day course of oral metronidazole or vancomycin for initial CDI. Parameters included cure rates (baseline value (range)) for metronidazole (80% (65-85%)), vancomycin (90% (88-92%)) and FMT (91% (83-100%)). Direct costs of metronidazole, vancomycin and FMT, adjusted to 2011 dollars, were $57 ($43-72), $1347 ($1195-1499) and $1086 ($815-1358), respectively. Our effectiveness measure was quality-adjusted life years (QALYs). One-way and probabilistic sensitivity analyses were conducted from the third-party payer perspective. Analysis using baseline values showed that FMT ($1669, 0.242 QALYs) dominated (i.e. was less costly and more effective than) vancomycin ($1890, 0.241 QALYs). FMT was more costly and more effective than metronidazole ($1167, 0.238 QALYs), yielding an incremental cost-effectiveness ratio (ICER) of $124 964/QALY. One-way sensitivity analyses showed that metronidazole dominated both strategies if its probability of cure were >90%; FMT dominated if it cost <$584. In a probabilistic sensitivity analysis at a willingness-to-pay threshold of $100 000/QALY, metronidazole was favoured in 55% of model iterations; FMT was favoured in 38%. Metronidazole, as the first-line treatment for CDI, is less costly; FMT and vancomycin are more effective. However, FMT is less likely to be economically favourable, and vancomycin is unlikely to be favourable as first-line therapy when compared with FMT. © 2014 The Authors Clinical Microbiology and Infection © 2014 European Society of Clinical Microbiology and Infectious Diseases.
Tadmouri, Abir; Blomkvist, Josefin; Landais, Cécile; Seymour, Jerome; Azmoun, Alexandre
2018-02-01
Although left ventricular assist devices (LVADs) are currently approved for coverage and reimbursement in France, no French cost-effectiveness (CE) data are available to support this decision. This study aimed at estimating the CE of LVADs compared with medical management in the French health system. Individual patient data from the French hospital discharge database (Program for the Medicalization of Information Systems) were analysed using the Kaplan-Meier method. Outcomes were time to death, time to heart transplantation (HTx), and time to death after HTx. A micro-costing method was used to calculate monthly costs extracted from the Program for the Medicalization of Information Systems. A multistate Markov model with a monthly cycle was developed to assess CE. The analysis was performed over a lifetime horizon from the perspective of the French healthcare payer; discount rates were 4%. Probabilistic and deterministic sensitivity analyses were performed. Outcomes were quality-adjusted life years (QALYs) and the incremental CE ratio (ICER). Mean QALYs for an LVAD patient were 1.5 at a lifetime cost of €190 739, yielding a probabilistic ICER of €125 580/QALY (95% confidence interval: 105 587 to 150 314). The sensitivity analysis showed that the ICER was mainly sensitive to two factors: (i) the high acquisition cost of the device and (ii) the device's performance in terms of patient survival. Our economic evaluation showed that the use of LVADs in patients with end-stage heart failure yields greater survival benefit than medical management at an extra lifetime cost exceeding €100 000/QALY. Technological advances and device cost reductions should hence lead to an improvement in overall CE. © 2017 The Authors. ESC Heart Failure published by John Wiley & Sons Ltd on behalf of the European Society of Cardiology.
Haucke, Florian
2010-11-01
Radon is a naturally occurring inert radioactive gas found in soils and rocks that can accumulate in dwellings, and is associated with an increased risk of lung cancer. This study aims to analyze the cost effectiveness of different intervention strategies to reduce radon concentrations in existing German dwellings. The cost effectiveness analysis (CEA) was conducted as a scenario analysis, where each scenario represents a specific regulatory regime. A decision theoretic model was developed, which reflects accepted recommendations for radon screening and mitigation and uses the most up-to-date data on radon distribution and relative risks. The model was programmed to account for compliance with respect to the single steps of radon intervention, as well as data on the sensitivity/specificity of radon tests. A societal perspective was adopted to calculate costs and effects. All scenarios were calculated for different action levels. Cost effectiveness was measured in costs per averted case of lung cancer, costs per life year gained and costs per quality adjusted life year (QALY) gained. Univariate and multivariate deterministic and probabilistic sensitivity analyses (SA) were performed. Probabilistic sensitivity analyses were based on Monte Carlo simulations with 5000 model runs. The results show that legal regulations with mandatory screening and mitigation for indoor radon levels >100 Bq/m³ are most cost effective. Incremental cost effectiveness compared to the no-mitigation base case is €25,181 (95% CI: €7,371-€90,593) per QALY gained. Other intervention strategies focussing primarily on personal responsibility for screening and/or mitigative actions show considerably worse cost-effectiveness ratios. However, targeting radon intervention to radon-prone areas is significantly more cost effective. Most of the uncertainty that surrounds the results can be ascribed to the relative risk of radon exposure. It can be concluded that, in the light of international experience, a legal regulation requiring radon screening and, if necessary, mitigation is justifiable under the terms of CEA. Copyright 2010 Elsevier Ltd. All rights reserved.
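A sketch of how a 5000-run probabilistic sensitivity analysis yields such a percentile interval for cost per QALY gained; both input distributions below are invented, and taking percentiles of the cost/QALY ratio is itself a simplification:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate incremental costs and QALYs across 5000 model runs, then report
# a percentile interval for the cost per QALY gained.
runs = 5000
inc_cost = rng.normal(25_000, 6_000, runs)  # incremental cost (euro)
inc_qaly = rng.gamma(4.0, 0.25, runs)       # incremental QALYs gained

icer = inc_cost / inc_qaly
lo, hi = np.percentile(icer, [2.5, 97.5])
print(f"median {np.median(icer):,.0f}, 95% CI {lo:,.0f}-{hi:,.0f} euro/QALY")
```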
Probabilistic structural analysis of a truss typical for space station
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.
1990-01-01
A three-bay cantilevered space truss is probabilistically evaluated using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) to identify and quantify the uncertainties and respective sensitivities associated with corresponding uncertainties in the primitive variables (structural, material, and loads parameters) that define the truss. The distribution of each of these primitive variables is described in terms of one of several available distributions, such as the Weibull, exponential, normal, and log-normal. The cumulative distribution functions (CDFs) for the response functions considered and the sensitivities associated with the primitive variables for a given response are investigated. These sensitivities help in determining the dominant primitive variables for that response.
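A plain Monte Carlo stand-in for this kind of analysis on a single axial member: sample assumed primitive-variable distributions, estimate an exceedance probability for the response, and rank sensitivities by correlation. All distributions and values below are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Primitive variables: modulus, cross-section area, axial load.
n = 50_000
E = rng.lognormal(np.log(70e9), 0.05, n)    # modulus (Pa)
A = rng.normal(4e-4, 1e-5, n)               # cross-section area (m^2)
F = 1000 * rng.weibull(5.0, n) + 4000       # axial load (N)
L = 2.0                                     # member length (m), deterministic

elong = F * L / (E * A)                     # axial elongation (m)
print("P(elongation > 0.4 mm) =", np.mean(elong > 4e-4))
for name, x in [("E", E), ("A", A), ("F", F)]:
    print(name, "corr with response:", round(np.corrcoef(x, elong)[0, 1], 2))
```

NESSUS replaces the brute-force sampling with fast probability integration, but the inputs (distributions per primitive variable) and outputs (response CDF plus sensitivities) have the same shape.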
Probabilistic and Possibilistic Analyses of the Strength of a Bonded Joint
NASA Technical Reports Server (NTRS)
Stroud, W. Jefferson; Krishnamurthy, T.; Smith, Steven A.
2001-01-01
The effects of uncertainties on the strength of a single lap shear joint are examined. Probabilistic and possibilistic methods are used to account for the uncertainties. Linear and geometrically nonlinear finite element analyses are used in the studies. To evaluate the strength of the joint, fracture in the adhesive and material strength failure in the strap are considered. The study shows that linear analyses yield conservative predictions of failure loads. The possibilistic approach for treating uncertainties appears to be viable for preliminary design, but with several qualifications.
NASA Astrophysics Data System (ADS)
Yu, Bo; Ning, Chao-lie; Li, Bing
2017-03-01
A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, taking into account uncertainties in environmental, material, structural and execution conditions. A time-dependent probabilistic model of chloride ingress was first established to consider the variations in the governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. The Nataf transformation was then adopted to transform the non-normal random variables from the original physical space into the independent standard normal space. The durability limit-state function and its gradient vector with respect to the original physical parameters were derived analytically, and the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparison with the second-order reliability method and Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were investigated. The proposed probabilistic framework can be incorporated in decision-making algorithms for the maintenance and repair of deteriorating concrete structures in marine environments.
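A crude Monte Carlo version of the chloride-ingress limit state (the paper itself uses FORM with the Nataf transformation): failure occurs when the chloride level at the rebar depth exceeds a critical value. The Fick's-law error-function solution is standard, but every distribution below is an assumption, not the paper's calibrated input:

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(5)

n = 100_000
Cs = rng.lognormal(np.log(3.5), 0.2, n)     # surface chloride (kg/m^3)
D0 = rng.lognormal(np.log(6e-12), 0.3, n)   # diffusion coeff. at t0 (m^2/s)
m = rng.normal(0.3, 0.05, n)                # age factor
Ccr = rng.normal(1.2, 0.12, n)              # critical chloride (kg/m^3)
cover = rng.normal(0.05, 0.005, n)          # concrete cover (m)

year = 3.156e7                              # seconds per year; t0 = 1 year
for years in (10, 30, 50):
    t = years * year
    D = D0 * (year / t) ** m                # time-decaying diffusivity
    C = Cs * (1 - erf(cover / (2 * np.sqrt(D * t))))
    print(f"{years} yr: P(failure) = {np.mean(C > Ccr):.3f}")
```

FORM approximates the same failure probability from the limit-state gradient at the design point, at a fraction of the sampling cost.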
Luebke, Thomas; Brunkwall, Jan
2014-05-01
This study weighed the costs and benefits of thoracic endovascular aortic repair (TEVAR) vs open repair (OR) in the treatment of acute complicated type B aortic dissection (TBAD) by estimating the cost-effectiveness to determine an optimal treatment strategy based on the best currently available evidence. A cost-utility analysis from the perspective of the health system payer was performed using a decision analytic model. Within this model, the 1-year survival, quality-adjusted life-years (QALYs), and costs for a hypothetical cohort of patients with acute complicated TBAD managed with TEVAR or OR were evaluated. Clinical effectiveness data, cost data, and transition probabilities of different health states were derived from previously published high-quality studies or meta-analyses. Probabilistic sensitivity analyses were performed on uncertain model parameters. The base-case analysis showed, in terms of QALYs, that OR was more expensive (incremental cost of €17,252.60) and less effective (-0.19 QALYs) compared with TEVAR; hence, OR was dominated by TEVAR and the incremental cost-effectiveness ratio (ie, the cost per life-year saved) was not calculated. The average cost-effectiveness ratios of TEVAR and OR per QALY gained were €56,316.79 and €108,421.91, respectively. In probabilistic sensitivity analyses, TEVAR was economically dominant in 100% of cases. The probability that TEVAR was economically attractive at a willingness-to-pay threshold of €50,000/QALY gained was 100%. The present results suggest that TEVAR yielded more QALYs and was associated with lower 1-year costs compared with OR in patients with acute complicated TBAD. As a result, from the cost-effectiveness point of view, TEVAR is the dominant therapy over OR for this disease under the predefined conditions. Copyright © 2014 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects
NASA Technical Reports Server (NTRS)
Nagpal, V. K.
1985-01-01
A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified that are believed to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat-plate geometry as well as for a space shuttle main engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project, developed at the Southwest Research Institute, integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components for the probabilistic analysis of structures, including an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The expert system captures and utilizes PSAM knowledge and experience: NESSUS/EXPERT is an interactive, menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic finite element analysis code.
Zeng, Xiaohui; Li, Jianhe; Peng, Liubao; Wang, Yunhua; Tan, Chongqing; Chen, Gannong; Wan, Xiaomin; Lu, Qiong; Yi, Lidan
2014-01-01
Maintenance gefitinib significantly prolonged progression-free survival (PFS) compared with placebo in patients from eastern Asia with locally advanced/metastatic non-small-cell lung cancer (NSCLC) after four chemotherapeutic cycles (21 days per cycle) of first-line platinum-based combination chemotherapy without disease progression. The objective of the current study was to evaluate the cost-effectiveness of maintenance gefitinib therapy after four cycles of standard first-line platinum-based chemotherapy for patients with locally advanced or metastatic NSCLC with unknown EGFR mutation status, from a Chinese health care system perspective. A semi-Markov model was designed to evaluate the cost-effectiveness of maintenance gefitinib treatment. Two-parameter Weibull and log-logistic distributions were fitted independently to the PFS and overall survival curves. One-way and probabilistic sensitivity analyses were conducted to assess the stability of the model. The base-case analysis suggested that maintenance gefitinib would increase benefits over a 1-, 3-, 6- or 10-year time horizon, at incremental costs of $184,829, $19,214, $19,328, and $21,308 per quality-adjusted life-year (QALY) gained, respectively. The most sensitive variable in the cost-effectiveness analysis was the utility of PFS plus rash, followed by the utility of PFS plus diarrhoea, the utility of progressed disease, the price of gefitinib, the cost of follow-up treatment in the progressed-survival state, and the utility of PFS on oral therapy. The price of gefitinib is the parameter that could most reduce the incremental cost per QALY. Probabilistic sensitivity analysis indicated that the probability of maintenance gefitinib being cost-effective was zero at a willingness-to-pay (WTP) threshold of $16,349 (3 × the per-capita gross domestic product of China). The sensitivity analyses all suggested that the model was robust. Maintenance gefitinib following first-line platinum-based chemotherapy for patients with locally advanced/metastatic NSCLC with unknown EGFR mutation status is not cost-effective. Decreasing the price of gefitinib may be a preferential choice for meeting wide treatment demand in China.
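The parametric survival curves such a model extrapolates from, with invented shape and scale values standing in for the fitted estimates:

```python
import numpy as np

# Two-parameter Weibull for PFS and log-logistic for OS; medians follow in
# closed form from each family. Shape/scale values are illustrative only.
def weibull_surv(t, shape, scale):
    return np.exp(-(t / scale) ** shape)

def loglogistic_surv(t, shape, scale):
    return 1.0 / (1.0 + (t / scale) ** shape)

t = np.linspace(0, 60, 61)                  # months
pfs = weibull_surv(t, shape=1.3, scale=10.0)
os_ = loglogistic_surv(t, shape=2.0, scale=20.0)
print(f"S_pfs(12 mo) = {pfs[12]:.2f}, S_os(24 mo) = {os_[24]:.2f}")

# Median survival: solve S(t) = 0.5 analytically for each family.
pfs_median = 10.0 * np.log(2.0) ** (1 / 1.3)
os_median = 20.0                            # log-logistic median = scale
print(f"median PFS ≈ {pfs_median:.1f} mo, median OS = {os_median:.1f} mo")
```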
Lester-Coll, Nataniel H; Dosoretz, Arie P; Magnuson, William J; Laurans, Maxwell S; Chiang, Veronica L; Yu, James B
2016-12-01
OBJECTIVE The JLGK0901 study found that stereotactic radiosurgery (SRS) is a safe and effective treatment option for treating up to 10 brain metastases. The purpose of this study is to determine the cost-effectiveness of treating up to 10 brain metastases with SRS, whole-brain radiation therapy (WBRT), or SRS and immediate WBRT (SRS+WBRT). METHODS A Markov model was developed to evaluate the cost effectiveness of SRS, WBRT, and SRS+WBRT in patients with 1 or 2-10 brain metastases. Transition probabilities were derived from the JLGK0901 study and modified according to the recurrence rates observed in the Radiation Therapy Oncology Group (RTOG) 9508 and European Organization for Research and Treatment of Cancer (EORTC) 22952-26001 studies to simulate the outcomes for patients who receive WBRT. Costs are based on 2015 Medicare reimbursements. Health state utilities were prospectively collected using the Standard Gamble method. End points included cost, quality-adjusted life years (QALYs), and incremental cost-effectiveness ratios (ICERs). The willingness-to-pay (WTP) threshold was $100,000 per QALY. One-way and probabilistic sensitivity analyses explored uncertainty with regard to the model assumptions. RESULTS In patients with 1 brain metastasis, the ICERs for SRS versus WBRT, SRS versus SRS+WBRT, and SRS+WBRT versus WBRT were $117,418, $51,348, and $746,997 per QALY gained, respectively. In patients with 2-10 brain metastases, the ICERs were $123,256, $58,903, and $821,042 per QALY gained, respectively. On the sensitivity analyses, the model was sensitive to the cost of SRS and the utilities associated with stable post-SRS and post-WBRT states. In patients with 2-10 brain metastases, SRS versus WBRT becomes cost-effective if the cost of SRS is reduced by $3512. SRS versus WBRT was also cost effective at a WTP of $200,000 per QALY on the probabilistic sensitivity analysis. CONCLUSIONS The most cost-effective strategy for patients with up to 10 brain metastases is SRS alone relative to SRS+WBRT. SRS alone may also be cost-effective relative to WBRT alone, but this depends on WTP, the cost of SRS, and patient preferences.
Zeng, Xiaohui; Peng, Liubao; Li, Jianhe; Chen, Gannong; Tan, Chongqing; Wang, Siying; Wan, Xiaomin; Ouyang, Lihui; Zhao, Ziying
2013-01-01
Continuation maintenance treatment with pemetrexed is approved by current clinical guidelines as a category 2A recommendation after induction therapy with cisplatin and pemetrexed chemotherapy (CP strategy) for patients with advanced nonsquamous non-small-cell lung cancer (NSCLC). However, the cost-effectiveness of the treatment remains unclear. We completed a trial-based assessment, from the perspective of the Chinese health care system, of the cost-effectiveness of maintenance pemetrexed treatment after a CP strategy for patients with advanced nonsquamous NSCLC. A Markov model was developed to estimate costs and benefits, based on a clinical trial that compared continuation maintenance pemetrexed therapy plus best supportive care (BSC) versus placebo plus BSC after a CP strategy for advanced nonsquamous NSCLC. Sensitivity analyses were conducted to assess the stability of the model. The base-case analysis suggested that continuation maintenance pemetrexed therapy after a CP strategy would increase benefits over a 1-, 2-, 5-, or 10-year time horizon, at incremental costs of $183,589.06, $126,353.16, $124,766.68, and $124,793.12 per quality-adjusted life-year gained, respectively. The most sensitive variable in the cost-effectiveness analysis was the utility of the progression-free survival state, followed by the proportion of patients with postdiscontinuation therapy in both arms, the proportion of BSC costs in the PFS versus progressed-survival state, and the cost of pemetrexed. Probabilistic sensitivity analysis indicated that the probability of adding continuation maintenance pemetrexed therapy to BSC being cost-effective was zero. One-way and probabilistic sensitivity analyses revealed that the Markov model was robust. Continuation maintenance pemetrexed after a CP strategy for patients with advanced nonsquamous NSCLC is not cost-effective based on a recent clinical trial. Decreasing the price or adjusting the dosage of pemetrexed may be a better option for meeting the treatment demands of Chinese patients. Copyright © 2013 Elsevier HS Journals, Inc. All rights reserved.
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic-level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic-level decision making. The strategic-level exploration architecture model is designed to perform analysis at as high a level as possible while still capturing those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks among these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic-level analysis methodology has previously been applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analyses performed on lunar exploration architecture scenarios are also presented.
Wheeler, Stephanie B.; Stranix-Chibanda, Lynda; Hosek, Sybil G.; Watts, D. Heather; Siberry, George K.; Spiegel, Hans M. L.; Stringer, Jeffrey S.; Chi, Benjamin H.
2016-01-01
Introduction: Antiretroviral pre-exposure prophylaxis (PrEP) for the prevention of HIV acquisition is cost-effective when delivered to those at substantial risk. Despite a high incidence of HIV infection among pregnant and breastfeeding women in sub-Saharan Africa (SSA), a theoretical increased risk of preterm birth on PrEP could outweigh the HIV prevention benefit. Methods: We developed a decision analytic model to evaluate a strategy of daily oral PrEP during pregnancy and breastfeeding in SSA. We approached the analysis from a health care system perspective across a lifetime time horizon. Model inputs were derived from existing literature and local sources. The incremental cost-effectiveness ratio (ICER) of PrEP versus no PrEP was calculated in 2015 U.S. dollars per disability-adjusted life year (DALY) averted. We evaluated the effect of uncertainty in baseline estimates through one-way and probabilistic sensitivity analyses. Results: PrEP administered to pregnant and breastfeeding women in SSA was cost-effective. In a base case of 10,000 women, the administration of PrEP averted 381 HIV infections but resulted in 779 more preterm births. PrEP was more costly per person ($450 versus $117), but resulted in fewer DALYs (3.15 versus 3.49). The ICER of $965/DALY averted was below the recommended regional threshold for cost-effectiveness of $6462/DALY. Probabilistic sensitivity analyses demonstrated robustness of the model. Conclusions: Providing PrEP to pregnant and breastfeeding women in SSA is likely cost-effective, although more data are needed about adherence and safety. For populations at high risk of HIV acquisition, PrEP may be considered as part of a broader combination HIV prevention strategy. PMID:27355502
Wu, Bin; Li, Jin; Wu, Haixiang
2015-11-01
To investigate the cost-effectiveness of different screening intervals for diabetic retinopathy (DR) in Chinese patients with newly diagnosed type 2 diabetes mellitus (T2DM), in the Chinese healthcare system and a general clinical setting. A cost-effectiveness model was developed to simulate the disease course of a Chinese population newly diagnosed with diabetes. Different DR screening programs were modeled to project economic outcomes. To develop the economic model, we calibrated the progression rates of DR to fit Chinese epidemiologic data derived from the published literature. Costs were estimated from the perspective of the Chinese healthcare system, and the analysis was run over a lifetime horizon. One-way and probabilistic sensitivity analyses were performed. Outcomes were total costs, vision outcomes, costs per quality-adjusted life year (QALY), and the incremental cost-effectiveness ratio (ICER) of screening strategies compared with no screening. DR screening is effective in Chinese patients with newly diagnosed T2DM, and screening strategies with ≥4-year intervals were cost-effective (ICER <$7,485 per QALY) compared with no screening. Screening every 4 years produced the greatest increase in QALYs (11.066) among the cost-effective strategies. The optimal screening interval could vary dramatically with age at T2DM diagnosis. Probabilistic sensitivity analyses demonstrated the consistency and robustness of the cost-effectiveness of the 4-year-interval screening strategy. The findings suggest that a 4-year-interval screening strategy is likely to be more cost-effective than screening every 1 to 3 years in comparison with no screening in the Chinese setting. The screening interval might be tailored according to age at T2DM diagnosis.
Learning to Look: Probabilistic Variation and Noise Guide Infants' Eye Movements
ERIC Educational Resources Information Center
Tummeltshammer, Kristen Swan; Kirkham, Natasha Z.
2013-01-01
Young infants have demonstrated a remarkable sensitivity to probabilistic relations among visual features (Fiser & Aslin, 2002; Kirkham et al., 2002). Previous research has raised important questions regarding the usefulness of statistical learning in an environment filled with variability and noise, such as an infant's natural world. In…
Probabilistic Cues to Grammatical Category in English Orthography and Their Influence during Reading
ERIC Educational Resources Information Center
Arciuli, Joanne; Monaghan, Padraic
2009-01-01
We investigated probabilistic cues to grammatical category (noun vs. verb) in English orthography. These cues are located in both the beginnings and endings of words--as identified in our large-scale corpus analysis. Experiment 1 tested participants' sensitivity to beginning and ending cues while making speeded grammatical classifications.…
Reliability, Risk and Cost Trade-Offs for Composite Designs
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1996-01-01
Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry and loading conditions. The probability density function of the first buckling load for a set of uncertain variables is computed. The probabilistic sensitivity factors of the uncertain variables with respect to the first buckling load are calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to the requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of the probabilistic sensitivity factors. With optimum design parameters such as the mean and coefficient of variation (representing the range of scatter) of the uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost due to failure occurrence are computationally determined. The results for the fuselage panel analysis show that the higher the cost due to failure occurrence, the smaller the optimum coefficient of variation of the fiber modulus (a design parameter) in the longitudinal direction.
Probabilistic, Decision-theoretic Disease Surveillance and Control
Wagner, Michael; Tsui, Fuchiang; Cooper, Gregory; Espino, Jeremy U.; Harkema, Hendrik; Levander, John; Villamarin, Ricardo; Voorhees, Ronald; Millett, Nicholas; Keane, Christopher; Dey, Anind; Razdan, Manik; Hu, Yang; Tsai, Ming; Brown, Shawn; Lee, Bruce Y.; Gallagher, Anthony; Potter, Margaret
2011-01-01
The Pittsburgh Center of Excellence in Public Health Informatics has developed a probabilistic, decision-theoretic system for disease surveillance and control for use in Allegheny County, PA and later in Tarrant County, TX. This paper describes the software components of the system and its knowledge bases. The paper uses influenza surveillance to illustrate how the software components transform data collected by the healthcare system into population level analyses and decision analyses of potential outbreak-control measures. PMID:23569617
Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J
2009-04-01
An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
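A much-reduced version of the landmark-uncertainty idea: perturb two landmarks defining a segment axis with isotropic 4-mm SD and report the 1st-99th percentile envelope of one angle. The geometry is invented and stands in for the full 13-landmark, three-angle analysis:

```python
import numpy as np

rng = np.random.default_rng(6)

n = 10_000
prox = np.array([0.0, 0.0, 0.0])            # proximal landmark (mm)
dist = np.array([20.0, 0.0, 280.0])         # distal landmark (mm)

# Perturb both landmarks independently and recompute the segment axis.
axis = (dist + rng.normal(0, 4, (n, 3))) - (prox + rng.normal(0, 4, (n, 3)))
elev = np.degrees(np.arctan2(axis[:, 0], axis[:, 2]))  # angle from vertical
lo, hi = np.percentile(elev, [1, 99])
print(f"1-99% envelope: {hi - lo:.1f} degrees")
```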
NASA Technical Reports Server (NTRS)
Price, J. M.; Ortega, R.
1998-01-01
Probabilistic methods are not a universally accepted approach for the design and analysis of aerospace structures. The validity of the approach must be demonstrated to encourage its acceptance as a viable design and analysis tool for estimating structural reliability. The objective of this study is to develop a well-characterized finite population of similar aerospace structures that can be used to (1) validate probabilistic codes, (2) demonstrate the basic principles behind probabilistic methods, (3) formulate general guidelines for characterization of material drivers (such as elastic modulus) when limited data are available, and (4) investigate how the drivers affect the results of sensitivity analysis at the component/failure-mode level.
Probabilistic structural analysis using a general purpose finite element program
NASA Astrophysics Data System (ADS)
Riha, D. S.; Millwater, H. R.; Thacker, B. H.
1992-07-01
This paper presents an accurate and efficient method to predict the probabilistic response of structural quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis, with fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation, with excellent accuracy.
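A brute-force Monte Carlo benchmark of the kind used for verification here, applied to a closed-form cantilever tip deflection; the distributions and geometry are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Tip deflection of a cantilever with a point end load: w = P L^3 / (3 E I).
n = 200_000
P = rng.normal(1000.0, 100.0, n)            # end load (N)
E = rng.lognormal(np.log(200e9), 0.05, n)   # modulus (Pa)
L, I = 1.5, 8.0e-7                          # length (m), second moment (m^4)

w = P * L**3 / (3 * E * I)
print(f"mean = {w.mean()*1e3:.2f} mm, P(w > 8 mm) = {np.mean(w > 8e-3):.4f}")
```

Fast probability integration reaches a comparable exceedance probability from far fewer response evaluations, which is what makes it attractive when each evaluation is a full finite element solve.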
Cost–effectiveness analysis of quadrivalent influenza vaccine in Spain
García, Amos; Ortiz de Lejarazu, Raúl; Reina, Jordi; Callejo, Daniel; Cuervo, Jesús; Morano Larragueta, Raúl
2016-01-01
Influenza has a major impact on healthcare systems and society, but can be prevented using vaccination. The World Health Organization (WHO) currently recommends that influenza vaccines should include at least two virus A strains and one virus B lineage (trivalent vaccine; TIV). A new quadrivalent vaccine (QIV), which includes an additional B virus strain, received regulatory approval and is now recommended by several countries. The present study estimates the cost-effectiveness of replacing TIVs with QIV for risk groups and the elderly population in Spain. A static, lifetime, multi-cohort Markov model with a one-year cycle time was adapted to assess the costs and health outcomes associated with a switch from TIV to QIV. The model followed a cohort vaccinated each year according to health authority recommendations, for the duration of their lives. National epidemiological data allowed the determination of whether the B strain included in TIVs matched the circulating one. A societal perspective was taken, costs and outcomes were discounted at 3%, and one-way and probabilistic sensitivity analyses were performed. Compared with TIVs, QIV prevented more influenza cases and influenza-related complications and deaths during seasons in which the B strain in the TIV mismatched the circulating lineage. The incremental cost-effectiveness ratio (ICER) was 8,748€/quality-adjusted life year (QALY). One-way sensitivity analysis showed that mismatch with the B lineage included in the TIV was the main driver of the ICER. Probabilistic sensitivity analysis showed an ICER below 30,000€/QALY in 96% of simulations. Replacing TIVs with QIV in Spain could improve influenza prevention by avoiding B virus mismatch and provide a cost-effective healthcare intervention. PMID:27184622
The cost-effectiveness of screening for colorectal cancer.
Telford, Jennifer J; Levy, Adrian R; Sambrook, Jennifer C; Zou, Denise; Enns, Robert A
2010-09-07
Published decision analyses show that screening for colorectal cancer is cost-effective. However, because of the number of tests available, the optimal screening strategy in Canada is unknown. We estimated the incremental cost-effectiveness of 10 strategies for colorectal cancer screening, as well as no screening, incorporating quality of life, noncompliance and data on the costs and benefits of chemotherapy. We used a probabilistic Markov model to estimate the costs and quality-adjusted life expectancy of 50-year-old average-risk Canadians without screening and with screening by each test. We populated the model with data from the published literature. We calculated costs from the perspective of a third-party payer, with inflation to 2007 Canadian dollars. Of the 10 strategies considered, we focused on three tests currently being used for population screening in some Canadian provinces: low-sensitivity guaiac fecal occult blood test, performed annually; fecal immunochemical test, performed annually; and colonoscopy, performed every 10 years. These strategies reduced the incidence of colorectal cancer by 44%, 65% and 81%, and mortality by 55%, 74% and 83%, respectively, compared with no screening. These strategies generated incremental cost-effectiveness ratios of $9159, $611 and $6133 per quality-adjusted life year, respectively. The findings were robust to probabilistic sensitivity analysis. Colonoscopy every 10 years yielded the greatest net health benefit. Screening for colorectal cancer is cost-effective over conventional levels of willingness to pay. Annual high-sensitivity fecal occult blood testing, such as a fecal immunochemical test, or colonoscopy every 10 years offer the best value for the money in Canada.
St-Onge, Maude; Fan, Eddy; Mégarbane, Bruno; Hancock-Howard, Rebecca; Coyte, Peter C
2015-04-01
Venoarterial extracorporeal membrane oxygenation represents an emerging and recommended option to treat life-threatening cardiotoxicant poisoning. The objective of this cost-effectiveness analysis was to estimate the incremental cost-effectiveness ratio of using venoarterial extracorporeal membrane oxygenation for adults in cardiotoxicant-induced shock or cardiac arrest compared with standard care. Adults in shock or cardiac arrest secondary to cardiotoxicant poisoning were studied over a lifetime horizon and from a societal perspective. The cost-effectiveness of venoarterial extracorporeal membrane oxygenation was calculated using a decision analysis tree, with the effect of the intervention and the probabilities used in the model taken from an observational study representing the highest level of evidence available. Costs (2013 Canadian dollars, where $1.00 Canadian = $0.9562 US) were documented through interviews, reviews of official provincial documents, or published articles. A series of one-way sensitivity analyses and a probabilistic sensitivity analysis using Monte Carlo simulation were used to evaluate uncertainty in the decision model. The extracorporeal membrane oxygenation group accrued $145 931 in costs for 18 life years (LY) gained, compared with $88 450 for 10 LY in the non-extracorporeal membrane oxygenation group. The incremental cost-effectiveness ratio ($7185/LY, but $34 311/LY under a more pessimistic approach) was mainly influenced by the probability of survival. The probabilistic sensitivity analysis identified variability in both cost and effectiveness. Venoarterial extracorporeal membrane oxygenation may be cost effective in treating cardiotoxicant poisonings. Copyright © 2014 Elsevier Inc. All rights reserved.
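The ICER arithmetic reported above, written out with the values taken from the abstract:

```python
# Incremental cost-effectiveness ratio: incremental cost divided by
# incremental effectiveness (life years gained).
cost_ecmo, ly_ecmo = 145_931, 18    # ECMO arm: cost (Can$), life years
cost_std, ly_std = 88_450, 10       # standard-care arm

icer = (cost_ecmo - cost_std) / (ly_ecmo - ly_std)
print(f"ICER = ${icer:,.0f} per life year gained")   # ≈ $7,185/LY
```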
Repetitive Elements May Comprise Over Two-Thirds of the Human Genome
de Koning, A. P. Jason; Gu, Wanjun; Castoe, Todd A.; Batzer, Mark A.; Pollock, David D.
2011-01-01
Transposable elements (TEs) are conventionally identified in eukaryotic genomes by alignment to consensus element sequences. Using this approach, about half of the human genome has been previously identified as TEs and low-complexity repeats. We recently developed a highly sensitive alternative de novo strategy, P-clouds, that instead searches for clusters of high-abundance oligonucleotides that are related in sequence space (oligo “clouds”). We show here that P-clouds predicts >840 Mbp of additional repetitive sequences in the human genome, thus suggesting that 66%–69% of the human genome is repetitive or repeat-derived. To investigate this remarkable difference, we conducted detailed analyses of the ability of both P-clouds and a commonly used conventional approach, RepeatMasker (RM), to detect different sized fragments of the highly abundant human Alu and MIR SINEs. RM can have surprisingly low sensitivity for even moderately long fragments, in contrast to P-clouds, which has good sensitivity down to small fragment sizes (∼25 bp). Although short fragments have a high intrinsic probability of being false positives, we performed a probabilistic annotation that reflects this fact. We further developed “element-specific” P-clouds (ESPs) to identify novel Alu and MIR SINE elements, and using it we identified ∼100 Mb of previously unannotated human elements. ESP estimates of new MIR sequences are in good agreement with RM-based predictions of the amount that RM missed. These results highlight the need for combined, probabilistic genome annotation approaches and suggest that the human genome consists of substantially more repetitive sequence than previously believed. PMID:22144907
Neptune: a bioinformatics tool for rapid discovery of genomic variation in bacterial populations
Marinier, Eric; Zaheer, Rahat; Berry, Chrystal; Weedmark, Kelly A.; Domaratzki, Michael; Mabon, Philip; Knox, Natalie C.; Reimer, Aleisha R.; Graham, Morag R.; Chui, Linda; Patterson-Fortin, Laura; Zhang, Jian; Pagotto, Franco; Farber, Jeff; Mahony, Jim; Seyer, Karine; Bekal, Sadjia; Tremblay, Cécile; Isaac-Renton, Judy; Prystajecky, Natalie; Chen, Jessica; Slade, Peter
2017-01-01
The ready availability of vast amounts of genomic sequence data has created the need to rethink comparative genomics algorithms using ‘big data’ approaches. Neptune is an efficient system for rapidly locating differentially abundant genomic content in bacterial populations using an exact k-mer matching strategy, while accommodating k-mer mismatches. Neptune’s loci discovery process identifies sequences that are sufficiently common to a group of target sequences and sufficiently absent from non-targets using probabilistic models. Neptune uses parallel computing to efficiently identify and extract these loci from draft genome assemblies without requiring multiple sequence alignments or other computationally expensive comparative sequence analyses. Tests on simulated and real datasets showed that Neptune rapidly identifies regions that are both sensitive and specific. We demonstrate that this system can identify trait-specific loci from different bacterial lineages. Neptune is broadly applicable for comparative bacterial analyses, yet will particularly benefit pathogenomic applications, owing to efficient and sensitive discovery of differentially abundant genomic loci. The software is available for download at: http://github.com/phac-nml/neptune. PMID:29048594
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKone, T.E.; Enoch, K.G.
2002-08-01
CalTOX has been developed as a set of spreadsheet models and spreadsheet data sets to assist in assessing human exposures from continuous releases to multiple environmental media, i.e. air, soil, and water. It has also been used for waste classification and for setting soil clean-up levels at uncontrolled hazardous waste sites. The modeling components of CalTOX include a multimedia transport and transformation model, multi-pathway exposure scenario models, and add-ins to quantify and evaluate uncertainty and variability. All parameter values used as inputs to CalTOX are distributions, described in terms of mean values and a coefficient of variation, rather than as point estimates or plausible upper values such as most other models employ. This probabilistic approach allows both sensitivity and uncertainty analyses to be directly incorporated into the model operation. This manual provides CalTOX users with a brief overview of the CalTOX spreadsheet model and provides instructions for using the spreadsheet to make deterministic and probabilistic calculations of source-dose-risk relationships.
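Since CalTOX specifies inputs as a mean plus a coefficient of variation, a common implementation choice (assumed here as one illustration, not the only distribution CalTOX supports) is a lognormal distribution, whose log-scale parameters follow directly from the mean and CV:

```python
import numpy as np

def lognormal_from_mean_cv(mean, cv, size, rng):
    """Sample a lognormal given its arithmetic mean and coefficient of variation."""
    sigma2 = np.log(1.0 + cv**2)      # variance of log(X), since CV^2 = exp(sigma^2) - 1
    mu = np.log(mean) - 0.5 * sigma2  # chosen so that E[X] = mean
    return rng.lognormal(mu, np.sqrt(sigma2), size)

rng = np.random.default_rng(1)
x = lognormal_from_mean_cv(mean=2.5, cv=0.4, size=100_000, rng=rng)
print(x.mean(), x.std() / x.mean())   # ~2.5 and ~0.4, recovering the inputs
```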
ORNL Pre-test Analyses of A Large-scale Experiment in STYLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Paul T; Yin, Shengjun; Klasky, Hilda B
Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large-scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management of non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large-scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to also assess the level of confidence that can be placed in the best-estimate finite-element solutions.
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
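The central output described here, a cumulative probability of exceedance distribution with confidence bounds, can be emulated in a few lines. The response variable below is a stand-in, not one of the PSAM structural models, and the bootstrap band is one simple way of attaching confidence bounds.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in structural response (e.g. a maximum stress, in MPa); in PSAM this
# would come from the probabilistic finite-element or boundary-element models.
stress = rng.normal(400, 40, 5000) + rng.gamma(2.0, 10.0, 5000)

levels = np.linspace(stress.min(), stress.max(), 50)
p_exceed = np.array([(stress > s).mean() for s in levels])  # exceedance probabilities

# Simple bootstrap 90% confidence band on the exceedance curve
boot = []
for _ in range(200):
    resample = rng.choice(stress, stress.size)  # resample with replacement
    boot.append([(resample > s).mean() for s in levels])
lo, hi = np.percentile(np.array(boot), [5, 95], axis=0)
print(p_exceed[::10], lo[::10], hi[::10])
```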
NASA Astrophysics Data System (ADS)
Rabatel, Matthias; Rampal, Pierre; Carrassi, Alberto; Bertino, Laurent; Jones, Christopher K. R. T.
2018-03-01
We present a sensitivity analysis and discuss the probabilistic forecast capabilities of the novel sea ice model neXtSIM used in hindcast mode. The study pertains to the response of the model to the uncertainty in winds using probabilistic forecasts of ice trajectories. neXtSIM is a continuous Lagrangian numerical model that uses an elasto-brittle rheology to simulate the ice response to external forces. The sensitivity analysis is based on a Monte Carlo sampling of 12 members. The response of the model to the uncertainties is evaluated in terms of simulated ice drift distances from their initial positions, and from the mean position of the ensemble, over the mid-term forecast horizon of 10 days. The simulated ice drift is decomposed into advective and diffusive parts that are characterised separately both spatially and temporally and compared to what is obtained with a free-drift model, that is, when the ice rheology does not play any role in the modelled physics of the ice. The seasonal variability of the model sensitivity is presented and shows the role of the ice compactness and rheology in the ice drift response at both local and regional scales in the Arctic. Indeed, the ice drift simulated by neXtSIM in summer is close to the one obtained with the free-drift model, while the more compact and solid ice pack shows a significantly different mechanical and drift behaviour in winter. For the winter period analysed in this study, we also show that, in contrast to the free-drift model, neXtSIM reproduces the sea ice Lagrangian diffusion regimes as found from observed trajectories. The forecast capability of neXtSIM is also evaluated using a large set of real buoy trajectories and compared to the capability of the free-drift model. We found that neXtSIM performs significantly better in simulating sea ice drift, both in terms of forecast error and as a tool to assist search and rescue operations, although the sources of uncertainties assumed for the present experiment are not sufficient for complete coverage of the observed IABP positions.
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) are propagated to predict macroscopic behavior using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
Comparison of bias analysis strategies applied to a large data set.
Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M
2014-07-01
Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
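For readers unfamiliar with the method, the following sketch shows a summary-level probabilistic bias analysis for exposure misclassification; the record-level weighting strategies compared in the paper are more involved. The 2 × 2 counts and the Beta priors for sensitivity and specificity are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
# Invented observed 2x2 table (exposure classification is imperfect):
a, b = 300, 9_700        # exposed cases, exposed noncases (as classified)
c, d = 2_700, 287_300    # unexposed cases, unexposed noncases (as classified)

rr_adj = []
for _ in range(20_000):
    se = rng.beta(85, 15)  # sensitivity prior, mean ~0.85 (assumed)
    sp = rng.beta(98, 2)   # specificity prior, mean ~0.98 (assumed)
    # Back-calculate "true" exposed counts among cases and noncases from
    #   observed exposed = Se * true_exposed + (1 - Sp) * true_unexposed
    A = (a - (1 - sp) * (a + c)) / (se - (1 - sp))
    B = (b - (1 - sp) * (b + d)) / (se - (1 - sp))
    C, D = (a + c) - A, (b + d) - B
    if min(A, B, C, D) <= 0:
        continue  # draw incompatible with the observed data; discard
    rr_adj.append((A / (A + B)) / (C / (C + D)))

print(np.percentile(rr_adj, [2.5, 50, 97.5]))  # simulation interval, corrected RR
```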
Chua, Wen Bing Brandon; Cheen, Hua Heng McVin; Kong, Ming Chai; Chen, Li Li; Wee, Hwee Lin
2016-10-01
Background Oral anticoagulation with warfarin is the cornerstone therapy in atrial fibrillation (AF) for stroke prevention. Multi-disciplinary anticoagulation management services have been shown to be cost-effective in the United States, Hong Kong and Thailand, but the findings are not readily generalizable to Singapore's healthcare system. Objective This study aimed to evaluate the cost-effectiveness of pharmacist-managed anticoagulation clinic (ACC) compared with usual care (UC) for the management of older adults with AF receiving oral anticoagulation with warfarin. Setting Pharmacist-managed ACC in an academic medical centre. Method A Markov model with 3-month cycle length and 30-year time horizon compared costs and quality-adjusted life-years (QALYs) of ACC and UC from the patient's and healthcare provider's perspectives. Four pathways based on time in therapeutic range (TTR) were: ACC TTR < 70 %, ACC TTR ≥ 70 %, UC TTR < 70 % and UC TTR ≥ 70 %. A hypothetical cohort of 70-year-old Singaporean AF patients receiving warfarin was utilised. Local data from national disease registries, patient surveys and hospital databases were used. When local data was not available, published studies on Asian populations were utilized when available. One-way sensitivity analyses and probabilistic sensitivity analyses were performed to account for uncertainties. Costs and QALYs were discounted annually by 3 %. Main outcome measure Costs and QALYs of ACC and UC. Results Pharmacist-managed ACC was found to dominate UC in all comparisons. It improved effectiveness by 0.19 and 0.13 QALYs at TTR < 70 % and TTR ≥ 70 % respectively compared with UC. From the patient's perspective, ACC reduced costs by SG$1222.67 (€1110.24) for TTR < 70 % and SG$1008.16 (€915.46) for TTR ≥ 70 %. Similar trends were observed from the healthcare provider's perspective, with ACC reducing costs by SG$1444.79 (€1311.94) for TTR < 70 % and SG$1269.17 (€1152.46) for TTR ≥ 70 % compared with UC. The results were robust to variations of the parameters over their plausible ranges in one-way sensitivity analyses. Probabilistic sensitivity analyses demonstrated that ACC was cost-effective more than 79 % of the time from both perspectives at a willingness-to-pay threshold of SG$69,050 (€62,701) per QALY. Conclusion Pharmacist-managed ACC is more effective and less costly compared with UC regardless of the quality of anticoagulation therapy. The findings support the current body of evidence demonstrating the cost-effectiveness of ACC.
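A minimal Markov cohort model of the type described, with 3-month cycles, a long horizon, and 3% annual discounting, might look like the following; the states, transition matrix, costs, and utilities are placeholders, not the study's calibrated inputs.

```python
import numpy as np

# Minimal Markov cohort sketch (assumed states and inputs, not the study's model).
states = ["well", "stroke", "bleed", "dead"]
P = np.array([            # 3-month transition probabilities (illustrative)
    [0.970, 0.010, 0.015, 0.005],
    [0.000, 0.930, 0.000, 0.070],
    [0.700, 0.000, 0.250, 0.050],
    [0.000, 0.000, 0.000, 1.000],
])
cost = np.array([150.0, 2500.0, 3000.0, 0.0])   # cost per 3-month cycle
util = np.array([0.80, 0.55, 0.60, 0.0]) / 4.0  # QALYs accrued per cycle

cohort = np.array([1.0, 0.0, 0.0, 0.0])         # everyone starts in "well"
total_cost = total_qaly = 0.0
for cycle in range(30 * 4):                     # 30-year horizon, 3-month cycles
    disc = 1.03 ** -(cycle / 4)                 # 3% annual discounting per cycle
    total_cost += disc * cohort @ cost
    total_qaly += disc * cohort @ util
    cohort = cohort @ P                         # advance the cohort one cycle
print(round(total_cost), round(total_qaly, 3))
```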
Probabilistic Assessment of National Wind Tunnel
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M.; Chamis, C. C.
1996-01-01
A preliminary probabilistic structural assessment of the critical section of the National Wind Tunnel (NWT) is performed using the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) computer code, demonstrating the code's ability to address reliability issues of the NWT. Uncertainties in the geometry, material properties, loads and stiffener location of the NWT are considered in the reliability assessment. Probabilistic stress, frequency, buckling, fatigue and proof-load analyses are performed. These analyses cover the major global and some local design requirements. Based on the assumed uncertainties, the results indicate a minimum reliability of 0.999 for the NWT. Preliminary life-prediction results show that the life of the NWT is governed by the fatigue of welds. A reliability-based proof-test assessment is also performed.
Methods for Probabilistic Radiological Dose Assessment at a High-Level Radioactive Waste Repository.
NASA Astrophysics Data System (ADS)
Maheras, Steven James
Methods were developed to assess and evaluate the uncertainty in offsite and onsite radiological dose at a high-level radioactive waste repository to show reasonable assurance that compliance with applicable regulatory requirements will be achieved. Uncertainty in offsite dose was assessed by employing a stochastic precode in conjunction with Monte Carlo simulation using an offsite radiological dose assessment code. Uncertainty in onsite dose was assessed by employing a discrete-event simulation model of repository operations in conjunction with an occupational radiological dose assessment model. Complementary cumulative distribution functions of offsite and onsite dose were used to illustrate reasonable assurance. Offsite dose analyses were performed for iodine-129, cesium-137, strontium-90, and plutonium-239. Complementary cumulative distribution functions of offsite dose were constructed; offsite dose was lognormally distributed with a range of two orders of magnitude. However, plutonium-239 results were not lognormally distributed and exhibited a range of less than one order of magnitude. Onsite dose analyses were performed for the preliminary inspection, receiving and handling, and the underground areas of the repository. Complementary cumulative distribution functions of onsite dose were constructed and exhibited ranges of less than one order of magnitude. A preliminary sensitivity analysis of the receiving and handling areas was conducted using a regression metamodel. Sensitivity coefficients and partial correlation coefficients were used as measures of sensitivity. Model output was most sensitive to parameters related to cask handling operations. Model output showed little sensitivity to parameters related to cask inspections.
Hoshi, Shu-Ling; Kondo, Masahide; Okubo, Ichiro
2017-05-31
The extended use of varicella vaccine in adults aged 50 and older against herpes zoster (HZ) was recently approved in Japan, which has raised the need to evaluate its value for money. We conducted a cost-effectiveness analysis with Markov modelling to evaluate the efficiency of a varicella vaccine immunisation programme for the elderly in Japan. Four strategies with different ages at which to receive a shot of vaccine were set, namely: (1) 65-84, (2) 70-84, (3) 75-84 and (4) 80-84 years old (y.o.). Incremental cost-effectiveness ratios (ICERs) compared with no programme were calculated from the societal perspective. The health states for the target cohort are: without any HZ-related disease, acute HZ followed by recovery, post-herpetic neuralgia (PHN) followed by recovery, post HZ/PHN, and general death. The transition probabilities, utility weights to estimate quality-adjusted life years (QALYs) and disease treatment costs were either calculated or cited from the literature. The cost per course of vaccination was assumed to be ¥10,000 (US$91). The model, with a one-year cycle, runs until surviving individuals reach 100 y.o. ICERs ranged from ¥2,812,000/US$25,680 to ¥3,644,000/US$33,279 per QALY gained, with the 65-84 y.o. strategy having the lowest ICER and the 80-84 y.o. strategy the highest. None of the alternatives was strongly dominated by another, while the 80-84 y.o. and 70-84 y.o. strategies were extendedly dominated by 65-84 y.o. Probabilistic sensitivity analyses showed that the probability that the ICER is under ¥5,000,000/US$45,662 per QALY gained was 100% for the 65-84 y.o., 70-84 y.o. and 75-84 y.o. strategies, and 98.4% for 80-84 y.o. We found that vaccinating individuals aged 65-84, 70-84, 75-84, and 80-84 with varicella vaccine to prevent HZ-associated disease in Japan can be cost-effective from the societal perspective, with the 65-84 y.o. strategy as the optimal alternative. The results are supported by one-way sensitivity analyses and probabilistic sensitivity analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.
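Ranking strategies like these involves checking strong and extended dominance before computing ICERs on the efficient frontier. Below is a generic sketch; the numbers in the demonstration call are hypothetical, not the study's results.

```python
def icer_frontier(strategies):
    """strategies: list of (name, cost, qaly). Returns frontier ICERs vs next-cheapest."""
    s = sorted(strategies, key=lambda t: t[1])           # ascending cost
    # Strong dominance: drop options that cost more yet yield no more QALYs
    s = [x for x in s if not any(y[1] <= x[1] and y[2] > x[2] for y in s if y is not x)]
    # Extended dominance: drop options whose ICER exceeds that of the next option up
    changed = True
    while changed:
        changed = False
        for i in range(1, len(s) - 1):
            icer_i = (s[i][1] - s[i-1][1]) / (s[i][2] - s[i-1][2])
            icer_n = (s[i+1][1] - s[i][1]) / (s[i+1][2] - s[i][2])
            if icer_i > icer_n:
                del s[i]; changed = True; break
    return [(s[i][0], (s[i][1]-s[i-1][1]) / (s[i][2]-s[i-1][2])) for i in range(1, len(s))]

# Hypothetical (cost, QALY) values for illustration only:
print(icer_frontier([("none", 0, 10.0), ("80-84", 900, 10.02),
                     ("75-84", 1500, 10.05), ("65-84", 2600, 10.10)]))
```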
2013-01-01
Background Tools to support clinical or patient decision-making in the treatment/management of a health condition are used in a range of clinical settings for numerous preference-sensitive healthcare decisions. Their impact in clinical practice is largely dependent on their quality across a range of domains. We critically analysed currently available tools to support decision making or patient understanding in the treatment of acute ischaemic stroke with intravenous thrombolysis, as an exemplar to provide clinicians/researchers with practical guidance on development, evaluation and implementation of such tools for other preference-sensitive treatment options/decisions in different clinical contexts. Methods Tools were identified from bibliographic databases, Internet searches and a survey of UK and North American stroke networks. Two reviewers critically analysed tools to establish: information on benefits/risks of thrombolysis included in tools, and the methods used to convey probabilistic information (verbal descriptors, numerical and graphical); adherence to guidance on presenting outcome probabilities (IPDASi probabilities items) and information content (Picker Institute Checklist); readability (Fog Index); and the extent to which tools had comprehensive development processes. Results Nine of the 26 tools identified included information on a full range of benefits/risks of thrombolysis. Verbal descriptors, frequencies and percentages were used to convey probabilistic information in 20, 19 and 18 tools respectively, whilst nine used graphical methods. Shortcomings in presentation of outcome probabilities (e.g. omitting outcomes without treatment) were identified. Patient information tools had an aggregate median Fog index score of 10. None of the tools had comprehensive development processes. Conclusions Tools to support decision making or patient understanding in the treatment of acute stroke with thrombolysis have been sub-optimally developed. Development of tools should utilise mixed methods and strategies to meaningfully involve clinicians, patients and their relatives in an iterative design process; include evidence-based methods to augment interpretability of textual and probabilistic information (e.g. graphical displays showing natural frequencies) on the full range of outcome states associated with available options; and address patients with different levels of health literacy. Implementation of tools will be enhanced when mechanisms are in place to periodically assess the relevance of tools and, where necessary, update the mode of delivery, form and information content. PMID:23777368
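The Fog (Gunning Fog) index used for the readability assessment has a simple closed form: 0.4 × (average sentence length + percentage of words with three or more syllables). A rough implementation, using a crude vowel-group heuristic for syllables, is:

```python
import re

def syllables(word):
    """Crude syllable count: number of consecutive-vowel groups."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fog_index(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    complex_words = [w for w in words if syllables(w) >= 3]
    return 0.4 * (len(words) / sentences + 100 * len(complex_words) / len(words))

print(round(fog_index("Thrombolysis dissolves the clot. It may cause bleeding."), 1))
```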
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik
2017-12-15
Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation study and a case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
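The first of the two approaches, non-parametric bootstrapping of the fitted distribution parameters, can be sketched as follows. The Weibull choice, the sample size, and the absence of censoring are simplifying assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Stand-in patient-level time-to-event data (no censoring, for simplicity):
times = stats.weibull_min.rvs(c=1.4, scale=12.0, size=60, random_state=rng)

params = []
for _ in range(500):
    resample = rng.choice(times, times.size)          # nonparametric bootstrap
    shape, _, scale = stats.weibull_min.fit(resample, floc=0)
    params.append((shape, scale))
params = np.array(params)

# Each PSA iteration would use one bootstrapped (shape, scale) pair, so that
# parameter uncertainty propagates into the patient-level (stochastic) sampling.
print(params.mean(axis=0), params.std(axis=0))
```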
Qian, Yushen; Pollom, Erqi L.; King, Martin T.; Dudley, Sara A.; Shaffer, Jenny L.; Chang, Daniel T.; Gibbs, Iris C.; Goldhaber-Fiebert, Jeremy D.; Horst, Kathleen C.
2016-01-01
Purpose The Clinical Evaluation of Pertuzumab and Trastuzumab (CLEOPATRA) study showed a 15.7-month survival benefit with the addition of pertuzumab to docetaxel and trastuzumab (THP) as first-line treatment for patients with human epidermal growth factor receptor 2 (HER2)-overexpressing metastatic breast cancer. We performed a cost-effectiveness analysis to assess the value of adding pertuzumab. Patients and Methods We developed a decision-analytic Markov model to evaluate the cost effectiveness of docetaxel plus trastuzumab (TH) with or without pertuzumab in US patients with metastatic breast cancer. The model followed patients weekly over their remaining lifetimes. Health states included stable disease, progressing disease, hospice, and death. Transition probabilities were based on the CLEOPATRA study. Costs reflected the 2014 Medicare rates. Health state utilities were the same as those used in other recent cost-effectiveness studies of trastuzumab and pertuzumab. Outcomes included health benefits expressed as discounted quality-adjusted life-years (QALYs), costs in US dollars, and cost effectiveness expressed as an incremental cost-effectiveness ratio. One- and multiway deterministic and probabilistic sensitivity analyses explored the effects of specific assumptions. Results Modeled median survival was 39.4 months for TH and 56.9 months for THP. The addition of pertuzumab resulted in an additional 1.81 life-years gained, or 0.62 QALYs, at a cost of $472,668 per QALY gained. Deterministic sensitivity analysis showed that THP is unlikely to be cost effective even under the most favorable assumptions, and probabilistic sensitivity analysis predicted 0% chance of cost effectiveness at a willingness to pay of $100,000 per QALY gained. Conclusion THP in patients with metastatic HER2-positive breast cancer is unlikely to be cost effective in the United States. PMID:26351332
Chit, Ayman; Roiz, Julie; Aballea, Samuel
2015-01-01
Ontario, Canada, immunizes against influenza using a trivalent inactivated influenza vaccine (IIV3) under a Universal Influenza Immunization Program (UIIP). The UIIP offers IIV3 free-of-charge to all Ontarians over 6 months of age. A newly approved quadrivalent inactivated influenza vaccine (IIV4) offers wider protection against influenza B disease. We explored the expected cost-utility and budget impact of replacing IIV3 with IIV4, within the context of Ontario's UIIP, using a probabilistic and static cost-utility model. Wherever possible, epidemiological and cost data were obtained from Ontario sources. Canadian or U.S. sources were used when Ontario data were not available. Vaccine efficacy for IIV3 was obtained from the literature. IIV4 efficacy was derived from meta-analysis of strain-specific vaccine efficacy. Conservatively, herd protection was not considered. In the base case, we used IIV3 and IIV4 prices of $5.5/dose and $7/dose, respectively. We conducted a sensitivity analysis on the price of IIV4, as well as standard univariate and multivariate statistical uncertainty analyses. Over a typical influenza season, relative to IIV3, IIV4 is expected to avert an additional 2,516 influenza cases, 1,683 influenza-associated medical visits, 27 influenza-associated hospitalizations, and 5 influenza-associated deaths. From a societal perspective, IIV4 would generate 76 more Quality Adjusted Life Years (QALYs) and a net societal budget impact of $4,784,112. The incremental cost effectiveness ratio for this comparison was $63,773/QALY. IIV4 remains cost-effective up to a 53% price premium over IIV3. A probabilistic sensitivity analysis showed that IIV4 was cost-effective with a probability of 65% for a threshold of $100,000/QALY gained. IIV4 is expected to achieve reductions in influenza-related morbidity and mortality compared to IIV3. Despite not accounting for herd protection, IIV4 is still expected to be a cost-effective alternative to IIV3 up to a price premium of 53%. Our conclusions were robust in the face of sensitivity analyses.
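The 65% figure is the kind of number read off a cost-effectiveness acceptability curve (CEAC). A CEAC is straightforward to compute from PSA output: at each willingness-to-pay value, count the fraction of simulations with positive incremental net monetary benefit. The means below match figures quoted in the abstract, but the spreads are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000
# Illustrative PSA draws for IIV4 vs IIV3 (means from the abstract, spreads assumed):
d_cost = rng.normal(4_784_112, 1_500_000, n)   # net societal budget impact
d_qaly = rng.normal(76, 25, n)                 # incremental QALYs

wtp_grid = np.arange(0, 200_001, 5_000)
ceac = [((wtp * d_qaly - d_cost) > 0).mean() for wtp in wtp_grid]
# ceac[i] = share of simulations with positive incremental net monetary benefit
# at willingness-to-pay wtp_grid[i]; plotted against WTP, this is the CEAC.
print(dict(zip(wtp_grid[::8], np.round(ceac[::8], 2))))
```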
Verhoef, Talitha I; Trend, Verena; Kelly, Barry; Robinson, Nigel; Fox, Paul; Morris, Stephen
2016-07-22
We evaluated the cost-effectiveness of the Give it a Go programme, which offers free leisure centre memberships to physically inactive members of the public in a single London Borough receiving state benefits. A decision analytic Markov model was developed to analyse lifetime costs and quality-adjusted life-years (QALYs) of 1025 people recruited to the intervention versus no intervention. In the intervention group, people were offered 4 months of free membership at a leisure centre. Physical activity levels were assessed at 0 and 4 months using the International Physical Activity Questionnaire (IPAQ). Higher levels of physical activity were assumed to decrease the risk of coronary heart disease, stroke and diabetes mellitus type II, as well as improve mental health. Costs were assessed from a National Health Service (NHS) perspective. Uncertainty was assessed using one-way and probabilistic sensitivity analyses. One hundred and fifty-nine participants (15.5%) completed the programme by attending the leisure centre for 4 months. Compared with no intervention, Give it a Go increased costs by £67.25 and QALYs by 0.0033 (equivalent to 1.21 days in full health) per recruited person. The incremental costs per QALY gained were £20,347. The results were highly sensitive to the magnitude of mental health gain due to physical activity and the duration of the effect of the programme (1 year in the base case analysis). When the mental health gain was omitted from the analysis, the incremental cost per QALY gained increased to almost £1.5 million. In the probabilistic sensitivity analysis, the incremental costs per QALY gained were below £20,000 in 39% of the 5000 simulations. Give it a Go did not significantly increase life expectancy, but had a positive influence on quality of life due to the mental health gain of physical activity. If the increase in physical activity caused by Give it a Go lasts for more than 1 year, the programme would be cost-effective given a willingness to pay for a QALY of £20,000.
Sarma-based key-group method for rock slope reliability analyses
NASA Astrophysics Data System (ADS)
Yarahmadi Bafghi, A. R.; Verdel, T.
2005-08-01
The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for purposes of stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsable blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We will discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to take account of the uncertainty due to physical and mechanical data (density, cohesion and angle of friction). We will then show how such reliability analyses can be introduced into SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.
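As a reminder of what the FOSM method referred to above computes: for a linear limit-state g = R − S with independent inputs, the reliability index and failure probability follow in closed form. The force moments below are invented; in the PSKGM they would come from the Sarma-based group analysis.

```python
from math import sqrt
from scipy.stats import norm

# FOSM on a linear safety margin g = R - S (resisting minus driving forces).
mu_R, sd_R = 1200.0, 180.0   # resisting force: mean, standard deviation (assumed)
mu_S, sd_S = 800.0, 160.0    # driving force: mean, standard deviation (assumed)

beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)  # reliability index
pf = norm.cdf(-beta)                            # probability of failure
print(f"beta = {beta:.2f}, Pf = {pf:.3%}")
```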
Herzer, Kurt R; Niessen, Louis; Constenla, Dagna O; Ward, William J; Pronovost, Peter J
2014-01-01
Objective To assess the cost-effectiveness of a multifaceted quality improvement programme focused on reducing central line-associated bloodstream infections in intensive care units. Design Cost-effectiveness analysis using a decision tree model to compare programme to non-programme intensive care units. Setting USA. Population Adult patients in the intensive care unit. Costs Economic costs of the programme and of central line-associated bloodstream infections were estimated from the perspective of the hospital and presented in 2013 US dollars. Main outcome measures Central line-associated bloodstream infections prevented, deaths averted due to central line-associated bloodstream infections prevented, and incremental cost-effectiveness ratios. Probabilistic sensitivity analysis was performed. Results Compared with current practice, the programme is strongly dominant and reduces bloodstream infections and deaths at no additional cost. The probabilistic sensitivity analysis showed that there was an almost 80% probability that the programme reduces bloodstream infections and the infections’ economic costs to hospitals. The opportunity cost of a bloodstream infection to a hospital was the most important model parameter in these analyses. Conclusions This multifaceted quality improvement programme, as it is currently implemented by hospitals on an increasingly large scale in the USA, likely reduces the economic costs of central line-associated bloodstream infections for US hospitals. Awareness among hospitals about the programme's benefits should enhance implementation. The programme's implementation has the potential to substantially reduce morbidity, mortality and economic costs associated with central line-associated bloodstream infections. PMID:25256190
Kovac, Jason Ronald; Fantus, Jake; Lipshultz, Larry I; Fischer, Marc Anthony; Klinghoffer, Zachery
2014-09-01
Varicoceles are a common cause of male infertility; repair can be accomplished using either surgical or radiological means. We compared the cost-effectiveness of the gold standard, microsurgical varicocele repair (MV), with a nonmicrosurgical approach (NMV) and percutaneous embolization (PE) for managing varicocele-associated infertility. A Markov decision-analysis model was developed to estimate costs and pregnancy rates. Within the model, recurrences following MV and NMV were re-treated with PE and recurrences following PE were treated with repeat PE, MV or NMV. Pregnancy and recurrence rates were based on the literature, while costs were obtained from institutional and government-supplied data. Univariate and probabilistic sensitivity analyses were performed to determine the effects of the various parameters on model outcomes. Primary treatment with MV was the most cost-effective strategy at $5402 CAD (Canadian)/pregnancy. Primary treatment with NMV was the least costly approach, but it also yielded the fewest pregnancies. Primary treatment with PE was the least cost-effective strategy costing about $7300 CAD/pregnancy. Probabilistic sensitivity analysis reinforced MV as the most cost-effective strategy at a willingness-to-pay threshold of >$4100 CAD/pregnancy. MV yielded the most pregnancies at acceptable levels of incremental costs. As such, it is the preferred primary treatment strategy for varicocele-associated infertility. Treatment with PE was the least cost-effective approach and, as such, is best used only in cases of surgical failure.
Sensitivity of the Dengue Surveillance System in Brazil for Detecting Hospitalized Cases
2016-01-01
We evaluated the sensitivity of the dengue surveillance system in detecting hospitalized cases in ten capital cities in Brazil from 2008 to 2013 using a probabilistic record linkage of two independent information systems: hospitalization (SIH-SUS), adopted as the gold standard, and surveillance (SINAN). Sensitivity was defined as the proportion of cases reported to the surveillance system among the suspected hospitalized cases registered in SIH-SUS. Of the 48,174 hospitalizations registered in SIH-SUS, 24,469 (50.7%) were reported and registered in SINAN, indicating an overall sensitivity of 50.8% (95%CI 50.3–51.2). The observed sensitivity for each of the municipalities included in the study ranged from 22.0% to 99.1%. The combination of the two data sources identified 71,161 hospitalizations, an increase of 97.0% over SINAN itself. Our results allowed us to establish the proportion of underreported dengue hospitalizations in the public health system in Brazil, highlighting the use of probabilistic record linkage as a valuable tool for evaluating surveillance systems. PMID:27192405
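Probabilistic record linkage of this kind typically scores candidate record pairs with Fellegi–Sunter log-likelihood weights. A minimal sketch, with assumed m- and u-probabilities rather than values estimated from the SIH-SUS/SINAN data, is:

```python
from math import log2

# Fellegi-Sunter match weights for one comparison field:
#   m = P(field agrees | records are a true match)
#   u = P(field agrees | records are not a match)
def field_weights(m, u):
    return log2(m / u), log2((1 - m) / (1 - u))   # (agreement, disagreement)

# Assumed m/u probabilities for illustration:
fields = {"name": (0.95, 0.01), "birthdate": (0.97, 0.005), "municipality": (0.90, 0.05)}

def record_score(agreements):
    """agreements: dict field -> bool; the total weight decides link / review / non-link."""
    score = 0.0
    for f, agrees in agreements.items():
        w_agree, w_disagree = field_weights(*fields[f])
        score += w_agree if agrees else w_disagree
    return score

print(record_score({"name": True, "birthdate": True, "municipality": False}))
```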
Tu, H Y V; Pemberton, J; Lorenzo, A J; Braga, L H
2015-10-01
For infants with hydronephrosis, continuous antibiotic prophylaxis (CAP) may reduce urinary tract infections (UTIs); however, its value remains controversial. Recent studies have suggested that neonates with severe obstructive hydronephrosis are at an increased risk of UTIs, and support the use of CAP. Other studies have demonstrated the negligible risk for UTIs in the setting of suspected ureteropelvic junction obstruction and have highlighted the limited role of CAP in hydronephrosis. Furthermore, economic studies in this patient population have been sparse. This study aimed to evaluate whether the use of CAP is an efficient expenditure for preventing UTIs in children with high-grade hydronephrosis within the first 2 years of life. A decision model was used to estimate expected costs, clinical outcomes and quality-adjusted life years (QALYs) of CAP versus no CAP (Fig. 1). Cost data were collected from provincial databases and converted to 2013 Canadian dollars (CAD). Estimates of risks and health utility values were extracted from published literature. The analysis was performed over a time horizon of 2 years. One-way and probabilistic sensitivity analyses were carried out to assess uncertainty and robustness. Overall, CAP use was less costly and provided a minimal increase in health utility when compared to no CAP (Table). The mean cost over two years for CAP and no CAP was CAD$1571.19 and CAD$1956.44, respectively. The use of CAP reduced outpatient-managed UTIs by 0.21 infections and UTIs requiring hospitalization by 0.04 infections over 2 years. Cost-utility analysis revealed an increase of 0.0001 QALYs/year when using CAP. The CAP arm exhibited strong dominance over no CAP in all sensitivity analyses and across all willingness-to-pay thresholds. The use of CAP exhibited strong dominance in the economic evaluation, despite a small gain of 0.0001 QALYs/year. Whether this slight gain is clinically significant remains to be determined. However, small QALY gains have been reported in other pediatric economic evaluations. Strengths of this study included the use of data from a recent systematic review and meta-analysis, in addition to a comprehensive probabilistic sensitivity analysis. Limitations of this study included the use of estimates for UTI probabilities in the second year of life and health utility values, given that they were lacking in the literature. Spontaneous resolution of hydronephrosis and surgical management were also not implemented in this model. To prevent UTIs within the first 2 years of life in infants with high-grade hydronephrosis, this probabilistic model has shown that CAP use is a prudent expenditure of healthcare resources when compared to no CAP. Copyright © 2015 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.
Denis Valle; Benjamin Baiser; Christopher W. Woodall; Robin Chazdon; Jerome Chave
2014-01-01
We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates...
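In the LDA framing used here, sites play the role of documents and species counts the role of word counts. A minimal sketch with scikit-learn on a synthetic site-by-species matrix:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(2)
# Site-by-species abundance matrix (rows = sites, columns = species); synthetic here.
counts = rng.poisson(lam=3.0, size=(40, 25))

lda = LatentDirichletAllocation(n_components=3, random_state=0)
site_mix = lda.fit_transform(counts)  # each row: a site's mixture over 3 communities
communities = lda.components_         # each row: a community's species profile (unnormalised)

print(site_mix[:3].round(2))          # mixtures can shift abruptly or gradually across sites
```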
Probabilistic analysis of preload in the abutment screw of a dental implant complex.
Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R
2008-09-01
Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. The preload is predominately affected by the applied torque and coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment screw at high values of the preload CDF. Lubrication at the threaded surfaces between the abutment screw and implant bore affects the preload developed in the implant complex. For the well-lubricated surfaces, only approximately 50% of implants will have preload values within the generally accepted range. This probability can be improved by applying a higher torque than normally recommended or a more closely controlled torque than typically achieved. It is also suggested that materials with higher elastic moduli be used in the manufacture of the abutment screw to achieve a higher preload.
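The torque-to-preload relationship at the heart of this analysis can be emulated with the standard long-form torque equation in place of the paper's finite-element model. Every geometric and statistical input below (pitch, effective radii, and the torque and friction distributions) is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000

# Long-form torque equation: T = F * (p/(2*pi) + mu*r_t/cos(alpha) + mu*r_h).
p, r_t, r_h = 0.4e-3, 0.9e-3, 1.2e-3   # pitch and effective thread/head radii [m] (assumed)
alpha = np.radians(30)                  # half-angle of a 60-degree thread form

T  = rng.normal(0.32, 0.02, n)              # applied torque [N*m] (~32 N*cm, assumed)
mu = rng.normal(0.12, 0.02, n).clip(0.03)   # lubricated friction coefficient (assumed)

F = T / (p / (2 * np.pi) + mu * r_t / np.cos(alpha) + mu * r_h)  # preload [N]

F.sort()
cdf = np.arange(1, n + 1) / n   # plotting (F, cdf) gives the kind of preload CDF described
print("preload 5/50/95th percentiles [N]:", np.percentile(F, [5, 50, 95]).round(0))
```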
Do probabilistic forecasts lead to better decisions?
NASA Astrophysics Data System (ADS)
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2013-06-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
Subgroup Economic Evaluation of Radiotherapy for Breast Cancer After Mastectomy.
Wan, Xiaomin; Peng, Liubao; Ma, Jinan; Chen, Gannong; Li, Yuanjian
2015-11-01
A recent meta-analysis by the Early Breast Cancer Trialists' Collaborative Group found significant improvements achieved by postmastectomy radiotherapy (PMRT) for patients with breast cancer with 1 to 3 positive nodes (pN1-3). It is unclear whether PMRT is cost-effective for subgroups of patients with positive nodes. To determine the cost-effectiveness of PMRT for subgroups of patients with breast cancer with positive nodes. A semi-Markov model was constructed to estimate the expected lifetime costs, life expectancy, and quality-adjusted life-years for patients receiving or not receiving radiation therapy. Clinical and health utilities data were from meta-analyses by the Early Breast Cancer Trialists' Collaborative Group or randomized clinical trials. Costs were estimated from the perspective of the Chinese society. One-way and probabilistic sensitivity analyses were performed. The incremental cost-effectiveness ratio was estimated as $7984, $4043, $3572, and $19,021 per quality-adjusted life-year for patients with positive nodes (pN+), patients with pN1-3, patients with pN1-3 who received systemic therapy, and patients with >4 positive nodes (pN4+), respectively. According to World Health Organization recommendations, these incremental cost-effectiveness ratios were judged as cost-effective. However, the results of one-way sensitivity analyses suggested that the results were highly sensitive to the relative effectiveness of PMRT (rate ratio). Nevertheless, the addition of PMRT for patients with pN1-3 in China has a reasonable chance to be cost-effective and may be judged as an efficient deployment of limited health resources, and the risk and uncertainty of PMRT are relatively greater for patients with pN4+. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.
Stepp, J.C.; Wong, I.; Whitney, J.; Quittmeyer, R.; Abrahamson, N.; Toro, G.; Young, S.R.; Coppersmith, K.; Savy, J.; Sullivan, T.
2001-01-01
Probabilistic seismic hazard analyses were conducted to estimate both ground motion and fault displacement hazards at the potential geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain, Nevada. The study is believed to be the largest and most comprehensive analysis ever conducted for ground-shaking hazard and is a first-of-a-kind assessment of probabilistic fault displacement hazard. The major emphasis of the study was on the quantification of epistemic uncertainty. Six teams of three experts performed seismic source and fault displacement evaluations, and seven individual experts provided ground motion evaluations. State-of-the-practice expert elicitation processes involving structured workshops, consensus identification of parameters and issues to be evaluated, common sharing of data and information, and open exchanges about the basis for preliminary interpretations were implemented. Ground-shaking hazard was computed for a hypothetical rock outcrop at -300 m, the depth of the potential waste emplacement drifts, at the designated design annual exceedance probabilities of 10⁻³ and 10⁻⁴. The fault displacement hazard was calculated at the design annual exceedance probabilities of 10⁻⁴ and 10⁻⁵.
Dynamic Probabilistic Instability of Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2009-01-01
A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is an incrementally updated Lagrangian formulation. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load and the loading rate are the dominant uncertainties, in that order.
The cost-effectiveness of etanercept in patients with severe ankylosing spondylitis in the UK.
Ara, R M; Reynolds, A V; Conway, P
2007-08-01
To examine the costs and benefits associated with long-term etanercept (ETN) treatment in patients with severe ankylosing spondylitis (AS) in the UK in accordance with the BSR guidelines. A mathematical model was constructed to estimate the costs and benefits associated with ETN plus non-steroidal anti-inflammatory drugs (NSAIDs) compared with NSAIDs alone. Individual patient data from Phase III RCTs were used to inform the proportion and magnitude of initial response to treatment and changes in health-related quality of life. A retrospective costing exercise on patients attending a UK secondary care rheumatology unit was used to inform disease costs. Published evidence on long-term disease progression was extrapolated over a 25-yr horizon. Uncertainty was examined using probabilistic sensitivity analyses. Over a 25-yr horizon, ETN plus NSAIDs gave 1.58 more QALYs at an additional cost of £35,978 when compared with NSAID treatment alone. This equates to a central estimate of £22,700 per QALY. The incremental costs per QALY using shorter time periods were £27,600, £23,600 and £22,600 at 2, 5 and 15 yrs, respectively. Using a 25-yr horizon, 93% of results from the probabilistic analyses fall below a threshold of £25,000 per QALY. This study demonstrates the potential cost-effectiveness of ETN plus NSAIDs compared with NSAIDs alone in patients with severe AS treated according to the BSR guidelines in the UK.
Acevedo, Joseph R; Fero, Katherine E; Wilson, Bayard; Sacco, Assuntina G; Mell, Loren K; Coffey, Charles S; Murphy, James D
2016-11-10
Purpose Recently, a large randomized trial found a survival advantage among patients who received elective neck dissection in conjunction with primary surgery for clinically node-negative oral cavity cancer compared with those receiving primary surgery alone. However, elective neck dissection comes with greater upfront cost and patient morbidity. We present a cost-effectiveness analysis of elective neck dissection for the initial surgical management of early-stage oral cavity cancer. Methods We constructed a Markov model to simulate primary, adjuvant, and salvage therapy; disease recurrence; and survival in patients with T1/T2 clinically node-negative oral cavity squamous cell carcinoma. Transition probabilities were derived from clinical trial data; costs (in 2015 US dollars) and health utilities were estimated from the literature. Incremental cost-effectiveness ratios, expressed as dollars per quality-adjusted life-year (QALY), were calculated with incremental cost-effectiveness ratios less than $100,000/QALY considered cost effective. We conducted one-way and probabilistic sensitivity analyses to examine model uncertainty. Results Our base-case model found that over a lifetime the addition of elective neck dissection to primary surgery reduced overall costs by $6,000 and improved effectiveness by 0.42 QALYs compared with primary surgery alone. The decrease in overall cost despite the added neck dissection was a result of less use of salvage therapy. On one-way sensitivity analysis, the model was most sensitive to assumptions about disease recurrence, survival, and the health utility reduction from a neck dissection. Probabilistic sensitivity analysis found that treatment with elective neck dissection was cost effective 76% of the time at a willingness-to-pay threshold of $100,000/QALY. Conclusion Our study found that the addition of elective neck dissection reduces costs and improves health outcomes, making this a cost-effective treatment strategy for patients with early-stage oral cavity cancer.
NASA Astrophysics Data System (ADS)
Donovan, Amy; Oppenheimer, Clive; Bravo, Michael
2012-12-01
This paper constitutes a philosophical and social scientific study of expert elicitation in the assessment and management of volcanic risk on Montserrat during the 1995-present volcanic activity. It outlines the broader context of subjective probabilistic methods and then uses a mixed-method approach to analyse the use of these methods in volcanic crises. Data from a global survey of volcanologists regarding the use of statistical methods in hazard assessment are presented. Detailed qualitative data from Montserrat are then discussed, particularly concerning the expert elicitation procedure that was pioneered during the eruptions. These data are analysed and conclusions about the use of these methods in volcanology are drawn. The paper finds that while many volcanologists are open to the use of these methods, there are still some concerns, which are similar to the concerns encountered in the literature on probabilistic and deterministic approaches to seismic hazard analysis.
Tsugawa, Hiroshi; Arita, Masanori; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Bamba, Takeshi; Fukusaki, Eiichiro
2013-05-21
We developed a new software program, MRMPROBS, for widely targeted metabolomics using the large-scale multiple reaction monitoring (MRM) mode. This strategy has become increasingly popular for the simultaneous analysis of up to several hundred metabolites with high sensitivity, selectivity, and quantitative capability. However, the traditional method of assessing measured metabolomics data without probabilistic criteria is not only time-consuming but often subjective and makeshift. Our program overcomes these problems by detecting and identifying metabolites automatically, separating isomeric metabolites, and removing background noise using a probabilistic score defined as the odds ratio from an optimized multivariate logistic regression model. The program also provides a user-friendly graphical interface to curate and organize data matrices and to apply principal component analyses and statistical tests. As a demonstration, we conducted a widely targeted metabolome analysis (152 metabolites) of propagating Saccharomyces cerevisiae measured at 15 time points by gas and liquid chromatography coupled to triple quadrupole mass spectrometry. MRMPROBS is a useful and practical tool for the assessment of large-scale MRM data applicable to any instrument or experimental condition.
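The scoring idea described, a probabilistic score defined as the odds ratio from a multivariate logistic regression, can be sketched as follows. The peak features, training data, and fitted model below are invented stand-ins, not MRMPROBS internals:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical peak-level features: retention-time error (min), co-elution
# similarity of the MRM transitions, and signal-to-noise ratio, labelled
# 1 for true metabolite peaks and 0 for background noise.
rng = np.random.default_rng(0)
n = 200
X_true = rng.normal([0.1, 0.9, 50.0], [0.1, 0.05, 20.0], size=(n, 3))
X_noise = rng.normal([0.5, 0.5, 5.0], [0.3, 0.20, 3.0], size=(n, 3))
X = np.vstack([X_true, X_noise])
y = np.array([1] * n + [0] * n)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a candidate peak by the odds p/(1-p) that it is a genuine metabolite,
# in the spirit of the odds-ratio score described in the abstract.
candidate = np.array([[0.15, 0.85, 30.0]])
p = model.predict_proba(candidate)[0, 1]
odds = np.exp(model.decision_function(candidate)[0])  # odds = p / (1 - p)
print(f"posterior p = {p:.3f}, odds score = {odds:.2f}")
```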
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
An oil pipeline network is one of the most important energy transportation facilities, but accidents in such a network can result in serious disasters. Analysis models for these accidents have been established mainly with three methods: event trees, accident simulation, and Bayesian networks. Among these, the Bayesian network is well suited to probabilistic analysis, but existing models do not consider all the important influencing factors, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model, and a deployment rule for these factors is introduced. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
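A toy version of such a network illustrates the mechanics: root causes feed a conditional probability table, and both marginal and diagnostic queries follow by enumeration. All probabilities below are invented, and the real model's many factors (environment, emergency response) are reduced to two causes:

```python
# Toy Bayesian-network sketch for pipeline failure with invented numbers:
# two root causes, corrosion (C) and third-party damage (T), feed a leak
# node (L) through a conditional probability table.
P_C = 0.05
P_T = 0.02
P_L_GIVEN = {(True, True): 0.95, (True, False): 0.40,
             (False, True): 0.60, (False, False): 0.001}

def joint(c, t, leak):
    """Joint probability P(C=c, T=t, L=leak) under the toy network."""
    pc = P_C if c else 1.0 - P_C
    pt = P_T if t else 1.0 - P_T
    pl = P_L_GIVEN[(c, t)] if leak else 1.0 - P_L_GIVEN[(c, t)]
    return pc * pt * pl

# Marginal probability of a leak, and the diagnostic posterior of corrosion
# given an observed leak, both by enumeration over the root causes.
p_leak = sum(joint(c, t, True) for c in (True, False) for t in (True, False))
p_corrosion = sum(joint(True, t, True) for t in (True, False)) / p_leak
print(f"P(leak) = {p_leak:.4f}")
print(f"P(corrosion | leak) = {p_corrosion:.3f}")
```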
Nazir, Jameel; Maman, Khaled; Neine, Mohamed-Elmoctar; Briquet, Benjamin; Odeyemi, Isaac A O; Hakimi, Zalmai; Garnham, Andy; Aballéa, Samuel
2015-09-01
Mirabegron, a first-in-class selective oral β3-adrenoceptor agonist, has similar efficacy to most antimuscarinic agents and a lower incidence of dry mouth in patients with overactive bladder (OAB). To evaluate the cost-effectiveness of mirabegron 50 mg compared with oral antimuscarinic agents in adults with OAB from a UK National Health Service perspective. A Markov model including health states for symptom severity, treatment status, and adverse events was developed. Cycle length was 1 month, and the time horizon was 5 years. Antimuscarinic comparators were tolterodine extended release, solifenacin, fesoterodine, oxybutynin extended release and immediate release (IR), darifenacin, and trospium chloride modified release. Transition probabilities for symptom severity levels and adverse events were estimated from a mirabegron trial and a mixed treatment comparison. Estimates for other inputs were obtained from published literature or expert opinion. Quality-adjusted life-years (QALYs) and total health care costs, including costs of drug acquisition, physician visits, incontinence pad use, and botox injections, were modeled. Deterministic and probabilistic sensitivity analyses were performed. Base-case incremental cost-effectiveness ratios ranged from £367 (vs. solifenacin 10 mg) to £15,593 (vs. oxybutynin IR 10 mg) per QALY gained. Probabilistic sensitivity analyses showed that at a willingness-to-pay threshold of £20,000/QALY gained, the probability of mirabegron 50 mg being cost-effective ranged from 70.2% versus oxybutynin IR 10 mg to 97.8% versus darifenacin 15 mg. A limitation of our analysis is the uncertainty due to the lack of direct comparisons of mirabegron with other agents; a mixed treatment comparison using rigorous methodology provided the data for the analysis, but the studies involved showed heterogeneity. Mirabegron 50 mg appears to be cost-effective compared with standard oral antimuscarinic agents for the treatment of adults with OAB from a UK National Health Service perspective. Copyright © 2015. Published by Elsevier Inc.
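The probabilistic sensitivity analysis reported here follows the usual recipe: draw each uncertain parameter from an assigned distribution (beta for probabilities and utilities, gamma for costs), propagate through the model, and count the share of draws that are cost-effective at the threshold. A minimal sketch; all distributions and values are invented, not the model's calibrated inputs:

```python
import numpy as np

# Monte Carlo probabilistic sensitivity analysis (PSA) sketch.
rng = np.random.default_rng(1)
n = 10_000
wtp = 20_000  # willingness-to-pay threshold, GBP per QALY

# Incremental QALYs driven by an uncertain utility gain (beta-distributed).
utility_gain = rng.beta(a=8, b=92, size=n)               # mean ~ 0.08
delta_qalys = utility_gain * 1.5                         # over 1.5 discounted years

# Incremental costs (gamma-distributed, strictly positive).
delta_costs = rng.gamma(shape=16.0, scale=50.0, size=n)  # mean ~ 800 GBP

# A strategy is cost-effective when incremental net monetary benefit > 0.
inmb = wtp * delta_qalys - delta_costs
print(f"P(cost-effective at {wtp:,} GBP/QALY) = {(inmb > 0).mean():.1%}")
```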
Cost-effectiveness of pazopanib compared with sunitinib in metastatic renal cell carcinoma in Canada
Amdahl, J.; Diaz, J.; Park, J.; Nakhaipour, H.R.; Delea, T.E.
2016-01-01
Background In Canada and elsewhere, pazopanib and sunitinib—tyrosine kinase inhibitors targeting the vascular endothelial growth factor receptors—are recommended as first-line treatment for patients with metastatic renal cell carcinoma (mRCC). A large randomized noninferiority trial of pazopanib versus sunitinib (COMPARZ) demonstrated that the two drugs have similar efficacy; however, patients randomized to pazopanib experienced better health-related quality of life (HRQOL) and nominally lower rates of non-study medical resource utilization. Methods The cost-effectiveness of pazopanib compared with sunitinib for first-line treatment of mRCC from a Canadian health care system perspective was evaluated using a partitioned-survival model that incorporated data from COMPARZ and other secondary sources. The time horizon of 5 years was based on the maximum duration of follow-up in the final analysis of overall survival from the COMPARZ trial. Analyses were conducted first using list prices for pazopanib and sunitinib and then by assuming that the prices of sunitinib and pazopanib would be equivalent. Results Based on list prices, expected costs were CA$10,293 less with pazopanib than with sunitinib. Pazopanib was estimated to yield 0.059 more quality-adjusted life-years (QALYs). Pazopanib was therefore dominant (more QALYs and lower costs) compared with sunitinib in the base case. In probabilistic sensitivity analyses, pazopanib was dominant in 79% of simulations and was cost-effective in 90%–100% of simulations at a threshold cost-effectiveness ratio of CA$100,000. Assuming equivalent pricing, pazopanib yielded CA$917 in savings in the base case, was dominant in 36% of probabilistic sensitivity analysis simulations, and was cost-effective in 89% of simulations at a threshold cost-effectiveness ratio of CA$100,000. Conclusions Compared with sunitinib, pazopanib is likely to be a cost-effective option for first-line treatment of mRCC from a Canadian health care perspective. PMID:27536183
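A partitioned-survival model, unlike a Markov model, reads state occupancy directly off the survival curves. A minimal sketch with hypothetical exponential OS and PFS curves standing in for the fitted trial curves:

```python
import numpy as np

# Partitioned-survival sketch: progression-free = PFS, progressed = OS - PFS,
# dead = 1 - OS. Rates and utilities below are invented for illustration.
t = np.arange(60)                       # months over a 5-year horizon
pfs = np.exp(-0.08 * t)                 # hypothetical PFS curve
os_ = np.exp(-0.04 * t)                 # hypothetical OS curve

progression_free = pfs
progressed = np.clip(os_ - pfs, 0.0, None)  # occupancy cannot be negative
dead = 1.0 - os_

# Quality-adjusted survival: weight time in each state by its utility.
u_pf, u_prog = 0.80, 0.60               # invented health utilities
qaly_months = (progression_free * u_pf + progressed * u_prog).sum()
print(f"QALYs over 5 years: {qaly_months / 12:.2f}")
```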
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is twofold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case, where they were equal. The failed region patterns between models are similar; however, differences arise because element elimination reduces stress, causing probabilistic failed regions to continue to accumulate after no further deterministic failed regions would be predicted. Both the probabilistic and deterministic methods indicate similar trends with regard to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
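The two criteria can be contrasted in a few lines: a deterministic rule thresholds plastic strain, while a probabilistic rule draws failures from a strain- and age-dependent probability. The failure function and all numbers below are invented for illustration, not the GHBMC model's calibration:

```python
import math
import random

# Deterministic rule: remove elements past a fixed plastic-strain threshold.
# Probabilistic rule: each element fails with an age-dependent probability
# as a function of strain (a hypothetical logistic form).
random.seed(8)

STRAIN_LIMIT = 0.018  # deterministic threshold: 1.8% effective plastic strain

def p_fail(strain, age):
    """Hypothetical age-based probabilistic failure function."""
    midpoint = 0.025 - 0.0001 * (age - 45)  # older ribs fail at lower strain
    return 1.0 / (1.0 + math.exp(-(strain - midpoint) / 0.003))

strains = [random.uniform(0.0, 0.04) for _ in range(1000)]  # element strains
deterministic = sum(s > STRAIN_LIMIT for s in strains)
probabilistic = sum(random.random() < p_fail(s, age=65) for s in strains)
print(f"failed elements, deterministic: {deterministic}, probabilistic (age 65): {probabilistic}")
```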
Health economics and outcomes methods in risk-based decision-making for blood safety.
Custer, Brian; Janssen, Mart P
2015-08-01
Analytical methods appropriate for health economic assessments of transfusion safety interventions have not previously been described in ways that facilitate their use. Within the context of risk-based decision-making (RBDM), health economics can be important for optimizing decisions among competing interventions. The objective of this review is to address key considerations and limitations of current methods as they apply to blood safety. Because a voluntary blood supply is an example of a public good, analyses should be conducted from the societal perspective when possible. Two primary study designs are recommended for most blood safety intervention assessments: budget impact analysis (BIA), which measures the cost to implement an intervention both for the blood operator and in a broader context, and cost-utility analysis (CUA), which measures the ratio between costs and health gain achieved, in terms of reduced morbidity and mortality, by use of an intervention. These analyses often have important limitations because data that reflect specific aspects, for example, blood recipient population characteristics or complication rates, are not available. Sensitivity analyses play an important role. The impact of various uncertain factors can be studied conjointly in probabilistic sensitivity analyses. The use of BIA and CUA together provides a comprehensive assessment of the costs and benefits from implementing (or not) specific interventions. RBDM is multifaceted and impacts a broad spectrum of stakeholders. Gathering and analyzing health economic evidence as part of the RBDM process enhances the quality, completeness, and transparency of decision-making. © 2015 AABB.
Acute stress selectively reduces reward sensitivity
Berghorst, Lisa H.; Bogdan, Ryan; Frank, Michael J.; Pizzagalli, Diego A.
2013-01-01
Stress may promote the onset of psychopathology by disrupting reward processing. However, the extent to which stress impairs reward processing, rather than incentive processing more generally, is unclear. To evaluate the specificity of stress-induced reward processing disruption, 100 psychiatrically healthy females were administered a probabilistic stimulus selection task (PSST) that enabled comparison of sensitivity to reward-driven (Go) and punishment-driven (NoGo) learning under either “no stress” or “stress” (threat-of-shock) conditions. Cortisol samples and self-report measures were collected. Contrary to hypotheses, the groups did not differ significantly in task performance or cortisol reactivity. However, further analyses focusing only on individuals under “stress” who were high responders with regard to both cortisol reactivity and self-reported negative affect revealed reduced reward sensitivity relative to individuals tested in the “no stress” condition; importantly, these deficits were reward-specific. Overall, findings provide preliminary evidence that stress-reactive individuals show diminished sensitivity to reward, but not punishment, under stress. While such results highlight the possibility that stress-induced anhedonia might be an important mechanism linking stress to affective disorders, future studies are necessary to confirm this conjecture. PMID:23596406
Probabilistic assessment of uncertain adaptive hybrid composites
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1994-01-01
Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.
Phillips, Benjamin U; Dewan, Sigma; Nilsson, Simon R O; Robbins, Trevor W; Heath, Christopher J; Saksida, Lisa M; Bussey, Timothy J; Alsiö, Johan
2018-04-22
Dysregulation of the serotonin (5-HT) system is a pathophysiological component in major depressive disorder (MDD), a condition closely associated with abnormal emotional responsivity to positive and negative feedback. However, the precise mechanism through which 5-HT tone biases feedback responsivity remains unclear. 5-HT2C receptors (5-HT2CRs) are closely linked with aspects of depressive symptomatology, including abnormalities in reinforcement processes and response to stress. Thus, we aimed to determine the impact of 5-HT2CR function on response to feedback in biased reinforcement learning. We used two touchscreen assays designed to assess the impact of positive and negative feedback on probabilistic reinforcement in mice, including a novel valence-probe visual discrimination (VPVD) and a probabilistic reversal learning procedure (PRL). Systemic administration of a 5-HT2CR agonist and antagonist resulted in selective changes in the balance of feedback sensitivity bias on these tasks. Specifically, on VPVD, SB 242084, the 5-HT2CR antagonist, impaired acquisition of a discrimination dependent on appropriate integration of positive and negative feedback. On PRL, SB 242084 at 1 mg/kg resulted in changes in behaviour consistent with reduced sensitivity to positive feedback. In contrast, WAY 163909, the 5-HT2CR agonist, resulted in changes associated with increased sensitivity to positive feedback and decreased sensitivity to negative feedback. These results suggest that 5-HT2CRs tightly regulate feedback sensitivity bias in mice with consequent effects on learning and cognitive flexibility and specify a framework for the influence of 5-HT2CRs on sensitivity to reinforcement.
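Feedback-specific sensitivity of this kind is commonly modeled by giving a reinforcement-learning agent separate learning rates for positive and negative outcomes. A minimal probabilistic reversal learning sketch under that assumption; the task structure and parameter values are invented, and this is not the authors' analysis model:

```python
import numpy as np

# Q-learning agent with feedback-specific learning rates on a two-option
# probabilistic reversal learning (PRL) task.
rng = np.random.default_rng(2)

def run_prl(alpha_pos, alpha_neg, beta=5.0, trials=400):
    q = np.zeros(2)                  # action values for the two stimuli
    p_reward = np.array([0.8, 0.2])  # stimulus 0 is initially 'correct'
    correct = 0
    for t in range(trials):
        if t == trials // 2:         # mid-session reversal of contingencies
            p_reward = p_reward[::-1]
        p_choice = np.exp(beta * q) / np.exp(beta * q).sum()  # softmax choice
        a = rng.choice(2, p=p_choice)
        rewarded = rng.random() < p_reward[a]
        alpha = alpha_pos if rewarded else alpha_neg          # feedback-specific
        q[a] += alpha * (float(rewarded) - q[a])
        correct += int(a == p_reward.argmax())
    return correct / trials

# Blunting positive-feedback sensitivity (lower alpha_pos) degrades performance.
print(f"balanced learner:        {run_prl(0.30, 0.30):.2f}")
print(f"blunted positive update: {run_prl(0.05, 0.30):.2f}")
```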
Herpes zoster vaccine: A health economic evaluation for Switzerland.
Blank, Patricia R; Ademi, Zanfina; Lu, Xiaoyan; Szucs, Thomas D; Schwenkglenks, Matthias
2017-07-03
Herpes zoster (HZ) or "shingles" results from a reactivation of the varicella zoster virus (VZV) acquired during primary infection (chickenpox) and surviving in the dorsal root ganglia. In about 20% of cases, a complication occurs, known as post-herpetic neuralgia (PHN). A live attenuated vaccine against VZV is available for the prevention of HZ and subsequent PHN. The present study aims to update an earlier evaluation estimating the cost-effectiveness of the HZ vaccine from a Swiss third party payer perspective. It takes into account updated vaccine prices, a different age cohort, latest clinical data and burden of illness data. A Markov model was developed to simulate the lifetime consequences of vaccinating 15% of the Swiss population aged 65-79 y. Information from sentinel data, official statistics and published literature were used. Endpoints assessed were number of HZ and PHN cases, quality-adjusted life years (QALYs), costs of hospitalizations, consultations and prescriptions. Based on a vaccine price of CHF 162, the vaccination strategy accrued additional costs of CHF 17,720,087 and gained 594 QALYs. The incremental cost-effectiveness ratio (ICER) was CHF 29,814 per QALY gained. Sensitivity analyses showed that the results were most sensitive to epidemiological inputs, utility values, discount rates, duration of vaccine efficacy, and vaccine price. Probabilistic sensitivity analyses indicated a more than 99% chance that the ICER was below 40,000 CHF per QALY. Findings were in line with existing cost-effectiveness analyses of HZ vaccination. This updated study supports the value of an HZ vaccination strategy targeting the Swiss population aged 65-79 y.
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on a fuzzy neural network regression model (DCFRM) is proposed, integrating the distributed collaborative response surface method with fuzzy neural network regression. The mathematical model of DCFRM is established and the probabilistic design idea behind it is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated with the proposed method, considering fluid-structure interaction. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode on the turbine blisk are obtained, providing a useful reference for improving the performance and reliability of aeroengines. Comparison with other methods shows that DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby enriches the theory and methods of mechanical reliability design.
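The response-surface idea underlying such methods can be sketched simply: fit a cheap regression surrogate to a handful of expensive simulations, then run the large Monte Carlo on the surrogate. The stress function, distributions, and limit below are invented stand-ins for a blisk finite-element analysis, and an ordinary quadratic surface stands in for the fuzzy neural network regression:

```python
import numpy as np

# Surrogate-based probabilistic analysis: train on a small design of
# experiments, then do cheap Monte Carlo on the fitted response surface.
rng = np.random.default_rng(3)

def expensive_stress(x):             # stand-in for a finite-element solve
    rpm, temp = x[..., 0], x[..., 1]
    return 200.0 + 0.8 * rpm + 0.3 * temp + 0.002 * rpm * temp

# Small design of experiments to train the quadratic response surface.
X = rng.uniform([80.0, 500.0], [120.0, 700.0], size=(50, 2))
y = expensive_stress(X)

def feats(X):                        # [1, x1, x2, x1^2, x2^2, x1*x2]
    return np.column_stack([np.ones(len(X)), X, X**2, X[:, :1] * X[:, 1:]])

coef, *_ = np.linalg.lstsq(feats(X), y, rcond=None)

# Cheap Monte Carlo on the surrogate: probability stress exceeds a limit.
Xmc = rng.normal([100.0, 600.0], [5.0, 30.0], size=(100_000, 2))
stress_hat = feats(Xmc) @ coef
print(f"P(stress > 610) ~ {(stress_hat > 610.0).mean():.4f}")  # hypothetical limit
```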
Cost-effectiveness of chronic hepatitis C treatment with thymosin alpha-1.
García-Contreras, Fernando; Nevárez-Sida, Armando; Constantino-Casas, Patricia; Abud-Bastida, Fernando; Garduño-Espinosa, Juan
2006-07-01
More than one million individuals in Mexico are infected with hepatitis C virus (HCV), and 80% are at risk for developing a chronic infection that could lead to hepatic cirrhosis and other complications that impact quality of life and institutional costs. The objective of the study was to determine the most cost-effective treatment against HCV among the following: peginterferon, peginterferon plus ribavirin, peginterferon plus ribavirin plus thymosin, and no treatment. We carried out a cost-effectiveness analysis from the institutional perspective, with a 45-year time frame and a 3% discount rate for costs and effectiveness. We employed a Bayesian-focused decision tree and a Markov model. One- and two-way sensitivity analyses were performed, as well as threshold-oriented and probabilistic analyses, and we obtained acceptability curves and net health benefits. Triple therapy (peginterferon plus ribavirin plus thymosin alpha-1) was dominant, with lower cost and higher utility relative to peginterferon plus ribavirin, peginterferon alone, and no treatment. With triple therapy the cost per unit of success was 1,908 USD per quality-adjusted life-year (QALY), compared with 2,277 USD/QALY for peginterferon plus ribavirin, 2,929 USD/QALY for peginterferon alone, and 4,204 USD/QALY for no treatment. Sensitivity analyses confirmed the robustness of the base case. The peginterferon plus ribavirin plus thymosin alpha-1 option was dominant (lowest cost and highest effectiveness). Using no drug was the most expensive and least effective option.
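The acceptability curves and net health benefits mentioned rest on the net-benefit framework. A minimal sketch; the absolute costs and QALYs below are invented, and only the ordering of the four strategies mirrors the abstract:

```python
# Net-benefit sketch: at willingness-to-pay lam, net monetary benefit is
# NMB = lam * QALYs - cost (net health benefit is QALYs - cost / lam).
strategies = {
    "no treatment":                (42_000.0, 10.0),
    "peginterferon":               (38_000.0, 13.0),
    "peginterferon + ribavirin":   (34_000.0, 15.0),
    "triple therapy (+ thymosin)": (31_000.0, 16.2),
}
lam = 5_000.0  # USD per QALY threshold (illustrative)
for name, (cost, qalys) in strategies.items():
    nmb = lam * qalys - cost
    print(f"{name:28s} NMB = ${nmb:,.0f}")
# A dominant strategy (lowest cost, highest effectiveness) maximizes net
# benefit at every positive threshold.
```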
Flash-flood early warning using weather radar data: from nowcasting to forecasting
NASA Astrophysics Data System (ADS)
Liechti, Katharina; Panziera, Luca; Germann, Urs; Zappa, Massimiliano
2013-04-01
In our study we explore the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for 1389 hours between June 2007 and December 2010 for which NORA forecasts were issued, due to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was also remarkably skilful even with the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic forcing.
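Skill of such probabilistic forecasts is typically verified with scores like the Brier score against a climatological reference. A minimal sketch with synthetic forecasts and observations, not the study's verification setup:

```python
import numpy as np

# Brier score for probabilistic exceedance forecasts: mean squared error
# between the forecast probability that discharge exceeds a threshold and
# the binary observed outcome, with climatology as the reference.
rng = np.random.default_rng(4)
n = 500
p_forecast = rng.uniform(0.0, 1.0, n)   # ensemble-derived P(Q > threshold)
observed = (rng.uniform(0.0, 1.0, n) < p_forecast).astype(float)  # reliable toy forecasts

brier = np.mean((p_forecast - observed) ** 2)
base_rate = observed.mean()             # climatological forecast probability
brier_clim = np.mean((base_rate - observed) ** 2)
print(f"Brier = {brier:.3f}, skill score vs climatology = {1 - brier / brier_clim:.2f}")
```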
Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce
2018-03-01
The levels, sources and quantitative probabilistic health risks for polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng g⁻¹. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic risks, both carcinogenic and non-carcinogenic, posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% probability of exceeding the acceptable threshold value (10⁻⁶), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of TCR, while inhalation intake is negligible. The three PAHs contributing most to TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities. Copyright © 2017 Elsevier B.V. All rights reserved.
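The Monte Carlo risk calculation described follows the standard incremental-lifetime-cancer-risk form, sampling each exposure factor from a distribution. A minimal sketch for the oral pathway; every distribution and constant below is an illustrative assumption, not the study's fitted inputs:

```python
import numpy as np

# Oral-pathway incremental lifetime cancer risk (ILCR):
#   ILCR = C * IR * CF * EF * ED / (BW * AT) * SF
rng = np.random.default_rng(5)
n = 100_000
C = rng.lognormal(mean=np.log(1500.0), sigma=0.8, size=n)  # BaP-equivalent soil conc., ng/g
IR = rng.normal(100.0, 20.0, n).clip(10.0)     # soil ingestion rate, mg/day
EF = rng.triangular(180.0, 250.0, 350.0, n)    # exposure frequency, days/year
ED = rng.uniform(20.0, 40.0, n)                # exposure duration, years
BW = rng.normal(60.0, 10.0, n).clip(30.0)      # body weight, kg
AT = 70.0 * 365.0                              # averaging time, days
CF = 1e-9   # unit conversion: 1e-6 mg per ng x 1e-3 g per mg of soil
SF = 7.3    # oral slope factor for BaP, (mg/kg-day)^-1

dose = C * IR * CF * EF * ED / (BW * AT)       # chronic daily dose, mg/kg-day
ilcr = dose * SF
print(f"median ILCR = {np.median(ilcr):.2e}, P(ILCR > 1e-6) = {(ilcr > 1e-6).mean():.1%}")
```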
Probabilistic boundary element method
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Raveendra, S. T.
1989-01-01
The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results for a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite elements, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
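The simplest instance of this approach, propagating constituent scatter up to a ply property, can be written directly with the rule of mixtures. A minimal Monte Carlo sketch with invented distributions; PICAN's micromechanics are far richer than this:

```python
import numpy as np

# Monte Carlo propagation of constituent scatter to a ply property via the
# longitudinal rule of mixtures, E11 = Vf*Ef + (1 - Vf)*Em.
rng = np.random.default_rng(6)
n = 50_000
Ef = rng.normal(380.0, 19.0, n)                  # fiber modulus, GPa (~5% CoV)
Em = rng.normal(3.5, 0.35, n)                    # matrix modulus, GPa (~10% CoV)
Vf = rng.normal(0.60, 0.03, n).clip(0.40, 0.70)  # fiber volume fraction

E11 = Vf * Ef + (1.0 - Vf) * Em                  # longitudinal ply modulus
print(f"E11: mean = {E11.mean():.1f} GPa, CoV = {E11.std() / E11.mean():.1%}")

# Crude probabilistic sensitivity ranking: input-output correlations.
for name, x in [("Ef", Ef), ("Em", Em), ("Vf", Vf)]:
    print(f"corr({name}, E11) = {np.corrcoef(x, E11)[0, 1]:+.2f}")
```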
Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.; Perkins, D.
2009-01-01
We produce probabilistic seismic-hazard assessments for the central Apennines, Italy, using time-dependent models that are characterized using a Brownian passage time recurrence model. Using aperiodicity parameters, α, of 0.3, 0.5, and 0.7, we examine the sensitivity of the probabilistic ground motion and its deaggregation to these parameters. For the seismic source model we incorporate both smoothed historical seismicity over the area and geological information on faults. We use the maximum magnitude model for the fault sources together with a uniform probability of rupture along the fault (floating fault model) to model fictitious faults to account for earthquakes that cannot be correlated with known geologic structural segmentation.
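The time-dependent ingredient here, the Brownian passage time model, is an inverse Gaussian renewal distribution whose spread is set by the aperiodicity α. A minimal sketch of the conditional rupture probability with an invented fault; the scipy parametrisation used is stated in the comments:

```python
from scipy.stats import invgauss

# The BPT distribution with mean recurrence mu and aperiodicity alpha
# corresponds, in scipy's parametrisation, to invgauss(alpha**2,
# scale=mu / alpha**2): mean = mu, standard deviation = alpha * mu.
def conditional_prob(mu, alpha, t_elapsed, dt):
    """P(rupture in (t, t + dt] | quiescent through t) under a BPT renewal model."""
    F = invgauss(alpha**2, scale=mu / alpha**2).cdf
    return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

mu = 700.0  # mean recurrence interval, years (hypothetical fault)
for alpha in (0.3, 0.5, 0.7):  # the aperiodicity values examined above
    p = conditional_prob(mu, alpha, t_elapsed=600.0, dt=50.0)
    print(f"alpha = {alpha}: P(rupture in next 50 yr) = {p:.3f}")
```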
Probabilistic assessment of smart composite structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael C.
1994-01-01
A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.
Probabilistic simulation of the human factor in structural reliability
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1993-01-01
A formal approach is described in an attempt to computationally simulate the probable ranges of uncertainties of the human factor in structural probabilistic assessments. A multi-factor interaction equation (MFIE) model has been adopted for this purpose. Human factors such as marital status, professional status, home life, job satisfaction, work load and health are considered to demonstrate the concept. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Probabilistic sensitivity studies are subsequently performed to assess the suitability of the MFIE and the validity of the whole approach. Results show that the uncertainties range from five to thirty percent for the most optimistic case, which assumes 100 percent for no error.
Probabilistic simulation of the human factor in structural reliability
NASA Astrophysics Data System (ADS)
Chamis, Christos C.; Singhal, Surendra N.
1994-09-01
The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).
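The MFIE referred to is commonly written as a product of factor terms raised to calibrated exponents. A minimal sketch of that product form; the factors, bounds, and exponents are invented, and the calibration used in these papers is not reproduced here:

```python
# Multifactor interaction equation (MFIE) sketch in its common product form,
#   P/P0 = prod_i ((xF_i - x_i) / (xF_i - x0_i)) ** e_i,
# where each factor degrades performance as it approaches its extreme xF.
def mfie(factors):
    """factors: list of (current x, reference x0, extreme xF, exponent e)."""
    ratio = 1.0
    for x, x0, xF, e in factors:
        ratio *= ((xF - x) / (xF - x0)) ** e
    return ratio

factors = [
    (0.6, 0.0, 1.0, 0.25),  # normalised work load (hypothetical)
    (0.2, 0.0, 1.0, 0.25),  # job dissatisfaction (hypothetical)
    (0.1, 0.0, 1.0, 0.25),  # health impairment (hypothetical)
]
print(f"performance ratio = {mfie(factors):.2f}")  # 1.0 would mean no degradation
```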
Fernandes, Silke; Sicuri, Elisa; Kayentao, Kassoum; van Eijk, Anne Maria; Hill, Jenny; Webster, Jayne; Were, Vincent; Akazili, James; Madanitsa, Mwayi; ter Kuile, Feiko O; Hanson, Kara
2015-03-01
In 2012, WHO changed its recommendation for intermittent preventive treatment of malaria during pregnancy (IPTp) from two doses to monthly doses of sulfadoxine-pyrimethamine during the second and third trimesters, but noted the importance of a cost-effectiveness analysis to lend support to the decision of policy makers. We therefore estimated the incremental cost-effectiveness of IPTp with three or more (IPTp-SP3+) versus two doses of sulfadoxine-pyrimethamine (IPTp-SP2). For this analysis, we used data from a 2013 meta-analysis of seven studies in sub-Saharan Africa. We developed a decision tree model with a lifetime horizon. We analysed the base case from a societal perspective. We did deterministic and probabilistic sensitivity analyses with appropriate parameter ranges and distributions for settings with low, moderate, and high background risk of low birthweight, and did a separate analysis for HIV-negative women. Parameters in the model were obtained for all countries included in the original meta-analysis. We did simulations in hypothetical cohorts of 1000 pregnant women receiving either IPTp-SP3+ or IPTp-SP2. We calculated disability-adjusted life-years (DALYs) for low birthweight, severe to moderate anaemia, and clinical malaria. We calculated cost estimates from data obtained in observational studies, exit surveys, and from public procurement databases. We give financial and economic costs in constant 2012 US$. The main outcome measure was the incremental cost per DALY averted. The delivery of IPTp-SP3+ to 1000 pregnant women averted 113·4 DALYs at an incremental cost of $825·67 producing an incremental cost-effectiveness ratio (ICER) of $7·28 per DALY averted. The results remained robust in the deterministic sensitivity analysis. In the probabilistic sensitivity analyses, the ICER was $7·7 per DALY averted for moderate risk of low birthweight, $19·4 per DALY averted for low risk, and $4·0 per DALY averted for high risk. The ICER for HIV-negative women was $6·2 per DALY averted. Our findings lend strong support to the WHO guidelines that recommend a monthly dose of IPTp-SP from the second trimester onwards. Malaria in Pregnancy Consortium and the Bill & Melinda Gates Foundation. Copyright © 2015 Fernandes et al. Open Access article distributed under the terms of CC BY-NC-SA.
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
Probabilistic Simulation of Multi-Scale Composite Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2012-01-01
A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. Citation of the original reference indicates that nothing fundamental in the approach has changed since that time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1982-07-01
A probabilistic risk assessment (PRA) was made of the Browns Ferry, Unit 1, nuclear plant as part of the Nuclear Regulatory Commission's Interim Reliability Evaluation Program (IREP). Specific goals of the study were to identify the dominant contributors to core melt, develop a foundation for more extensive use of PRA methods, expand the cadre of experienced PRA practitioners, and apply procedures for extension of IREP analyses to other domestic light water reactors. Event tree and fault tree analyses were used to estimate the frequency of accident sequences initiated by transients and loss of coolant accidents. External events such as floods, fires, earthquakes, and sabotage were beyond the scope of this study and were, therefore, excluded. From these sequences, the dominant contributors to probable core melt frequency were chosen. Uncertainty and sensitivity analyses were performed on these sequences to better understand the limitations associated with the estimated sequence frequencies. Dominant sequences were grouped according to common containment failure modes and corresponding release categories on the basis of comparison with analyses of similar designs rather than on the basis of detailed plant-specific calculations.
Fisher, Mark; Walker, Andrew; Falqués, Meritxell; Moya, Miguel; Rance, Mark; Taylor, Douglas; Lindner, Leandro
2016-12-01
Presently, linaclotide is the only EMA-approved therapy indicated for the treatment of irritable bowel syndrome with constipation (IBS-C). This study sought to determine the cost-effectiveness of linaclotide compared to antidepressants for the treatment of adults with moderate to severe IBS-C who have previously received antispasmodics and/or laxatives. A Markov model was created to estimate costs and QALYs over a 5-year time horizon from the perspective of NHS Scotland. Health states were based on treatment satisfaction (satisfied, moderately satisfied, not satisfied) and mortality. Transition probabilities were based on satisfaction data from the linaclotide pivotal studies and Scottish general all-cause mortality statistics. Treatment costs were calculated from the British National Formulary. NHS resource use and disease-related costs for each health state were estimated from Scottish clinician interviews in combination with NHS Reference costs. Quality of life was based on EQ-5D data collected from the pivotal studies. Costs and QALYs were discounted at 3.5 % per annum. Uncertainty was explored through extensive deterministic and probabilistic sensitivity analyses. Over a 5-year time horizon, the additional costs and QALYs generated with linaclotide were £659 and 0.089, resulting in an incremental cost-effectiveness ratio of £7370 per QALY versus antidepressants. Based on the probabilistic sensitivity analysis, the likelihood that linaclotide was cost-effective at a willingness to pay of £20,000 per QALY was 73 %. Linaclotide can be a cost-effective treatment for adults with moderate-to-severe IBS-C who have previously received antispasmodics and/or laxatives in Scotland.
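The deterministic side of such analyses is usually a one-way sweep: vary one input across its range with the others held at base case and record the swing in the ICER. A minimal sketch; the base-case figures are the 5-year numbers from the abstract, while the low/high ranges are invented for illustration:

```python
# One-way (deterministic) sensitivity sketch over the ICER.
def icer(d_cost, d_qaly):
    return d_cost / d_qaly

base = {"d_cost": 659.0, "d_qaly": 0.089}  # 5-year incremental cost (GBP) and QALYs
ranges = {
    "d_cost": (450.0, 900.0),   # hypothetical bounds, GBP
    "d_qaly": (0.05, 0.13),     # hypothetical bounds, QALYs
}
for name, (lo, hi) in ranges.items():
    vals = []
    for v in (lo, hi):
        args = dict(base, **{name: v})  # perturb one input, hold the rest
        vals.append(icer(**args))
    print(f"{name}: ICER swings from {min(vals):,.0f} to {max(vals):,.0f} GBP/QALY")
```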
NASA Astrophysics Data System (ADS)
Separovic, Leo; Husain, Syed Zahid; Yu, Wei
2015-09-01
Internal variability (IV) in dynamical downscaling with limited-area models (LAMs) represents a source of error inherent to the downscaled fields, which originates from the sensitive dependence of the models on arbitrarily small modifications. If IV is large it may impose the need for probabilistic verification of the downscaled information. Atmospheric spectral nudging (ASN) can reduce IV in LAMs as it constrains the large-scale components of LAM fields in the interior of the computational domain and thus prevents any considerable penetration of sensitively dependent deviations into the range of large scales. Using initial condition ensembles, the present study quantifies the impact of ASN on IV in LAM simulations in the range of fine scales that are not controlled by spectral nudging. Four simulation configurations that all include strong ASN but differ in the nudging settings are considered. In the fifth configuration, grid nudging of land surface variables toward high-resolution surface analyses is applied. The results show that the IV at scales larger than 300 km can be suppressed by selecting an appropriate ASN setup. At scales between 300 and 30 km, however, in all configurations, the hourly near-surface temperature, humidity, and winds are only partly reproducible. Nudging the land surface variables is found to have the potential to significantly reduce IV, particularly for fine-scale temperature and humidity. On the other hand, hourly precipitation accumulations at these scales are generally irreproducible in all configurations, and a probabilistic approach to downscaling is therefore recommended.
Herzer, Kurt R; Niessen, Louis; Constenla, Dagna O; Ward, William J; Pronovost, Peter J
2014-09-25
To assess the cost-effectiveness of a multifaceted quality improvement programme focused on reducing central line-associated bloodstream infections in intensive care units. Cost-effectiveness analysis using a decision tree model to compare programme to non-programme intensive care units. USA. Adult patients in the intensive care unit. Economic costs of the programme and of central line-associated bloodstream infections were estimated from the perspective of the hospital and presented in 2013 US dollars. Central line-associated bloodstream infections prevented, deaths averted due to central line-associated bloodstream infections prevented, and incremental cost-effectiveness ratios. Probabilistic sensitivity analysis was performed. Compared with current practice, the programme is strongly dominant and reduces bloodstream infections and deaths at no additional cost. The probabilistic sensitivity analysis showed that there was an almost 80% probability that the programme reduces bloodstream infections and the infections' economic costs to hospitals. The opportunity cost of a bloodstream infection to a hospital was the most important model parameter in these analyses. This multifaceted quality improvement programme, as it is currently implemented by hospitals on an increasingly large scale in the USA, likely reduces the economic costs of central line-associated bloodstream infections for US hospitals. Awareness among hospitals about the programme's benefits should enhance implementation. The programme's implementation has the potential to substantially reduce morbidity, mortality and economic costs associated with central line-associated bloodstream infections. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Jayaraman, Sudha P; Jiang, Yushan; Resch, Stephen; Askari, Reza; Klompas, Michael
2016-10-01
Interventions to contain two multi-drug-resistant Acinetobacter (MDRA) outbreaks reduced the incidence of multi-drug-resistant (MDR) organisms, specifically methicillin-resistant Staphylococcus aureus, vancomycin-resistant Enterococcus, and Clostridium difficile in the general surgery intensive care unit (ICU) of our hospital. We therefore conducted a cost-effectiveness analysis of a model proactive infection-control program to reduce transmission of MDR organisms, based on the practices used to control the MDRA outbreak. We created a model of a proactive infection control program based on the 2011 MDRA outbreak response. We built a decision analysis model and performed univariable and probabilistic sensitivity analyses to evaluate the cost-effectiveness of the proposed program compared with standard infection control practices to reduce transmission of these MDR organisms. The cost of a proactive infection control program would be $68,509 per year. The incremental cost-effectiveness ratio (ICER) was calculated to be $3,804 per aversion of transmission of MDR organisms in a one-year period compared with standard infection control. On the basis of probabilistic sensitivity analysis, at a willingness-to-pay (WTP) threshold of $14,000 per transmission averted, the program would have a 42% probability of being cost-effective, rising to 100% at $22,000 per transmission averted. This analysis gives an estimated ICER for implementing a proactive program to prevent transmission of MDR organisms in the general surgery ICU. To better understand the causal relations between the critical steps in the program and the rate reductions, a randomized study of a package of interventions to prevent healthcare-associated infections should be considered.
Cost-effectiveness of influenza vaccination of older adults in the ED setting.
Patterson, Brian W; Khare, Rahul K; Courtney, D Mark; Lee, Todd A; Kyriacou, Demetrios N
2012-09-01
Adults older than 50 years are at greater risk for death and severe disability from influenza. Persons in this age group, however, are frequently not vaccinated, despite extensive efforts by physicians to provide this preventive measure in primary care settings. We performed this study to determine if influenza vaccination of older adults in the emergency department (ED) may be cost-effective. Using a probabilistic decision model with quasi-Markov modeling of a typical influenza season, we calculated costs and health outcomes for a hypothetical cohort of patients using parameters from the literature. Three ED-based intervention strategies were compared: (1) no vaccination offered, (2) vaccination offered to patients older than 65 years (limited strategy), and (3) vaccination offered to all patients who are 50 years and older (inclusive strategy). Outcomes were measured as costs, lives saved, and incremental costs per life saved. We performed deterministic and probabilistic sensitivity analyses. Vaccination of patients 50 years of age and older results in an incremental cost of $34,610 per life saved when compared with the no-vaccination strategy. Limiting vaccination to only those older than 65 years results in an incremental cost of $13,084 per life saved. Results were sensitive to changes in vaccine cost but were insensitive to changes in other model parameters. Vaccination of older adults against influenza in the ED setting is cost-effective, especially for those older than 65 years. Emergency departments may be an important setting for providing influenza vaccination to adults who may otherwise have remained unvaccinated. Copyright © 2012 Elsevier Inc. All rights reserved.
Pan, Yuesong; Wang, Anxin; Liu, Gaifen; Zhao, Xingquan; Meng, Xia; Zhao, Kun; Liu, Liping; Wang, Chunxue; Johnston, S. Claiborne; Wang, Yilong; Wang, Yongjun
2014-01-01
Background Treatment with the combination of clopidogrel and aspirin taken soon after a transient ischemic attack (TIA) or minor stroke was shown to reduce the 90‐day risk of stroke in a large trial in China, but the cost‐effectiveness is unknown. This study sought to estimate the cost‐effectiveness of the clopidogrel‐aspirin regimen for acute TIA or minor stroke. Methods and Results A Markov model was created to determine the cost‐effectiveness of treatment of acute TIA or minor stroke patients with clopidogrel‐aspirin compared with aspirin alone. Inputs for the model were obtained from clinical trial data, claims databases, and the published literature. The main outcome measure was cost per quality‐adjusted life‐year (QALY) gained. One‐way and multivariable probabilistic sensitivity analyses were performed to test the robustness of the findings. Compared with aspirin alone, clopidogrel‐aspirin resulted in a lifetime gain of 0.037 QALYs at an additional cost of CNY 1250 (US$ 192), yielding an incremental cost‐effectiveness ratio of CNY 33 800 (US$ 5200) per QALY gained. Probabilistic sensitivity analysis showed that clopidogrel‐aspirin therapy was more cost‐effective in 95.7% of the simulations at a willingness‐to‐pay threshold of CNY 105 000 (US$ 16 200) per QALY, as recommended by the World Health Organization. Conclusions The early 90‐day clopidogrel‐aspirin regimen for acute TIA or minor stroke is highly cost‐effective in China. Although clopidogrel is available generically, Plavix is the branded product used in China; if Plavix were generic, treatment with clopidogrel‐aspirin would have been cost saving. PMID:24904018
Müller, Dirk; Danner, Marion; Rhiem, Kerstin; Stollenwerk, Björn; Engel, Christoph; Rasche, Linda; Borsi, Lisa; Schmutzler, Rita; Stock, Stephanie
2018-04-01
Women with a BRCA1 or BRCA2 mutation are at increased risk of developing breast and/or ovarian cancer. This economic modeling study evaluated different preventive interventions for 30-year-old women with a confirmed BRCA (1 or 2) mutation. A Markov model was developed to estimate the costs and benefits [i.e., quality-adjusted life years (QALYs) and life years gained (LYG)] associated with prophylactic bilateral mastectomy (BM), prophylactic bilateral salpingo-oophorectomy (BSO), BM plus BSO, BM plus BSO at age 40, and intensified surveillance. Relevant input data was obtained from a large German database including 5902 women with BRCA 1 or 2, and from the literature. The analysis was performed from the German Statutory Health Insurance (SHI) perspective. In order to assess the robustness of the results, deterministic and probabilistic sensitivity analyses were performed. With costs of €29,434 and a gain in QALYs of 17.7 (LYG 19.9), BM plus BSO at age 30 was less expensive and more effective than the other strategies, followed by BM plus BSO at age 40. Women who were offered the surveillance strategy had the highest costs at the lowest gain in QALYs/LYG. In the probabilistic sensitivity analysis, the probability of cost-saving was 57% for BM plus BSO. At a WTP of 10,000 € per QALY, the probability of the intervention being cost-effective was 80%. From the SHI perspective, undergoing BM plus immediate BSO should be recommended to BRCA 1 or 2 mutation carriers due to its favorable comparative cost-effectiveness.
Hiraishi, Kunihiko
2014-01-01
One of the significant topics in systems biology is to develop control theory for gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (or activated) by manipulating external stimuli and the expression of other genes. Control theory for GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, in which gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for control theory of GRNs. PMID:24587766
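A context-sensitive PBN is a collection of Boolean networks with a random switching rule. A minimal simulation sketch with invented update rules (not the WNT5A network), estimating a long-run expression probability of the sort a model checker such as PRISM would compute exactly:

```python
import random

# Minimal context-sensitive probabilistic Boolean network (CS-PBN): two
# contexts, each a set of Boolean update rules over three genes; at every
# step the network switches context with probability q, then updates all
# genes synchronously.
random.seed(7)

contexts = [
    # context 0: x0' = x1 AND x2;  x1' = NOT x0;  x2' = x1
    lambda x: (x[1] and x[2], not x[0], x[1]),
    # context 1: x0' = x1 OR x2;   x1' = x2;      x2' = NOT x0
    lambda x: (x[1] or x[2], x[2], not x[0]),
]
q = 0.1  # context-switching probability

def step(state, ctx):
    if random.random() < q:
        ctx = random.randrange(len(contexts))   # switch context
    return tuple(bool(v) for v in contexts[ctx](state)), ctx

# Long-run estimate of P(x0 = ON) by simulation.
state, ctx, on = (True, False, True), 0, 0
steps = 100_000
for _ in range(steps):
    state, ctx = step(state, ctx)
    on += state[0]
print(f"P(x0 = ON) ~ {on / steps:.3f}")
```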
Affective and cognitive factors influencing sensitivity to probabilistic information.
Tyszka, Tadeusz; Sawicki, Przemyslaw
2011-11-01
In Study 1, different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (a within-subject factor). After the presentation of each probability level, participants were asked to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of Study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that, for emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than the other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing (the rational deliberative versus the affective experiential) and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.
Probabilistic Micromechanics and Macromechanics for Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; Shah, Ashwin R.
1997-01-01
The properties of ceramic matrix composites (CMCs) are known to display a considerable amount of scatter due to variations in fiber/matrix properties, interphase properties, interphase bonding, amount of matrix voids, and many geometry- or fabrication-related parameters, such as ply thickness and ply orientation. This paper summarizes preliminary studies in which formal probabilistic descriptions of the material-behavior- and fabrication-related parameters were incorporated into micromechanics and macromechanics for CMCs. In this process, two existing methodologies, namely CMC micromechanics and macromechanics analysis and a fast probability integration (FPI) technique, are synergistically coupled to obtain the probabilistic composite behavior or response. Preliminary results in the form of cumulative probability distributions and information on the probability sensitivities of the response to primitive variables for a unidirectional silicon carbide/reaction-bonded silicon nitride (SiC/RBSN) CMC are presented. The cumulative distribution functions are computed for composite moduli, thermal expansion coefficients, thermal conductivities, and longitudinal tensile strength at room temperature. The variations in the constituent properties that directly affect these composite properties are accounted for via assumed probabilistic distributions. Collectively, the results show that the present technique provides valuable information about the composite properties and sensitivity factors, which is useful to design or test engineers. Furthermore, the present methodology is computationally more efficient than a standard Monte Carlo simulation technique, and the agreement between the two solutions is excellent, as shown via select examples.
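As a point of reference for what the FPI technique accelerates, the following sketch shows the brute-force Monte Carlo alternative: propagating invented constituent-property scatter through a simple rule-of-mixtures relation to get the CDF of a longitudinal modulus and a crude correlation-based sensitivity ranking (the distributions and the mixture rule are assumptions for illustration, not the paper's micromechanics models):

```python
# Monte Carlo baseline for probabilistic micromechanics: propagate scatter in
# constituent properties to the longitudinal composite modulus E11 via the
# rule of mixtures. Distributions are invented; fast probability integration
# (FPI) targets the same CDF with far fewer model evaluations.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
E_f = rng.normal(380e9, 19e9, n)    # fiber modulus [Pa], ~5% scatter (assumed)
E_m = rng.normal(110e9, 11e9, n)    # matrix modulus [Pa], ~10% scatter (assumed)
V_f = rng.normal(0.40, 0.02, n)     # fiber volume ratio (assumed)
v_void = rng.uniform(0.0, 0.05, n)  # matrix void volume ratio (assumed)

E11 = (E_f * V_f + E_m * (1 - V_f)) * (1 - v_void)  # rule of mixtures + voids

# Cumulative distribution at a few probability levels
for p in (0.01, 0.50, 0.99):
    print(f"P={p:.2f}: E11 = {np.quantile(E11, p)/1e9:.1f} GPa")

# Crude sensitivity ranking: correlation of each primitive variable with E11
for name, x in [("E_f", E_f), ("E_m", E_m), ("V_f", V_f), ("void", v_void)]:
    print(f"corr(E11, {name}) = {np.corrcoef(E11, x)[0, 1]:+.2f}")
```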
Donnarumma, Francesco; Maisto, Domenico; Pezzulo, Giovanni
2016-01-01
How do humans and other animals face novel problems for which predefined solutions are not available? Human problem solving links to flexible reasoning and inference rather than to slow trial-and-error learning. It has received considerable attention since the early days of cognitive science, giving rise to well known cognitive architectures such as SOAR and ACT-R, but its computational and brain mechanisms remain incompletely known. Furthermore, it is still unclear whether problem solving is a “specialized” domain or module of cognition, in the sense that it requires computations that are fundamentally different from those supporting perception and action systems. Here we advance a novel view of human problem solving as probabilistic inference with subgoaling. In this perspective, key insights from cognitive architectures are retained such as the importance of using subgoals to split problems into subproblems. However, here the underlying computations use probabilistic inference methods analogous to those that are increasingly popular in the study of perception and action systems. To test our model we focus on the widely used Tower of Hanoi (ToH) task, and show that our proposed method can reproduce characteristic idiosyncrasies of human problem solvers: their sensitivity to the “community structure” of the ToH and their difficulties in executing so-called “counterintuitive” movements. Our analysis reveals that subgoals have two key roles in probabilistic inference and problem solving. First, prior beliefs on (likely) useful subgoals carve the problem space and define an implicit metric for the problem at hand—a metric to which humans are sensitive. Second, subgoals are used as waypoints in the probabilistic problem solving inference and permit to find effective solutions that, when unavailable, lead to problem solving deficits. Our study thus suggests that a probabilistic inference scheme enhanced with subgoals provides a comprehensive framework to study problem solving and its deficits. PMID:27074140
NASA Technical Reports Server (NTRS)
McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.
2012-01-01
This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and center-of-gravity (cg) location, and requirements on adaptor stiffnesses, while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling-based techniques. For contrast, some most-probable-point (MPP)-based approaches are also examined.
Jensen, Cathrine Elgaard; Riis, Allan; Petersen, Karin Dam; Jensen, Martin Bach; Pedersen, Kjeld Møller
2017-05-01
In connection with the publication of a clinical practice guideline on the management of low back pain (LBP) in general practice in Denmark, a cluster randomised controlled trial was conducted. In this trial, a multifaceted guideline implementation strategy to improve general practitioners' treatment of patients with LBP was compared with a usual implementation strategy. The aim was to determine whether the multifaceted strategy was cost-effective compared with the usual implementation strategy. The economic evaluation was conducted as a cost-utility analysis in which costs, collected from a societal perspective, and quality-adjusted life years were used as outcome measures. The analysis was conducted as a within-trial analysis with a 12-month time horizon, consistent with the follow-up period of the clinical trial. To adjust for a priori selected covariates, generalised linear models with a gamma family were used to estimate incremental costs and quality-adjusted life years. Furthermore, both deterministic and probabilistic sensitivity analyses were conducted. Results showed that costs associated with primary health care were higher, whereas secondary health care costs were lower, for the intervention group when compared with the control group. When adjusting for covariates, the intervention was less costly, and there was no significant difference in effect between the 2 groups. Sensitivity analyses showed that the results were sensitive to parameter uncertainty. In conclusion, the multifaceted implementation strategy was cost saving when compared with the usual strategy for implementing LBP clinical practice guidelines in general practice. Furthermore, there was no significant difference in effect, and the estimate was sensitive to uncertainty.
Chlan, Linda L; Heiderscheit, Annette; Skaar, Debra J; Neidecker, Marjorie V
2018-05-04
Music intervention has been shown to reduce anxiety and sedative exposure among mechanically ventilated patients. Whether music intervention reduces ICU costs is not known. The aim of this study was to examine ICU costs for patients receiving a patient-directed music intervention compared with patients who received usual ICU care. A cost-effectiveness analysis from the hospital perspective was conducted to determine if patient-directed music intervention was cost-effective in improving patient-reported anxiety. Cost savings were also evaluated. One-way and probabilistic sensitivity analyses determined the influence of input variation on the cost-effectiveness. Midwestern ICUs. Adult ICU patients from a parent clinical trial receiving mechanical ventilatory support. Patients receiving the experimental patient-directed music intervention received an MP3 player, noise-canceling headphones, and music tailored to individual preferences by a music therapist. The base-case cost-effectiveness analysis estimated that patient-directed music intervention reduced anxiety by 19 points on the Visual Analogue Scale-Anxiety with a reduction in cost of $2,322/patient compared with usual ICU care, resulting in dominance of the patient-directed music intervention. The probabilistic cost-effectiveness analysis found that average patient-directed music intervention costs were $2,155 less than usual ICU care and projected that cost saving is achieved in 70% of 1,000 iterations. Based on break-even analyses, cost saving is achieved if the per-patient cost of patient-directed music intervention remains below $2,651, a value eight times the base case of $329. Patient-directed music intervention is cost-effective for reducing anxiety in mechanically ventilated ICU patients.
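The break-even logic reads naturally as a small probabilistic sketch: scan candidate intervention costs and record how often the intervention arm stays cheaper across PSA draws. The cost distributions below are invented; only the headline figures from the abstract ($329 base case, roughly $2,651 break-even) anchor the scale.

```python
# Break-even sketch: at what per-patient intervention cost does the expected
# saving versus usual care vanish? Distributions below are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000
# Invented cost distributions for usual care and for the non-intervention part
# of the music arm; their mean difference plays the role of the gross saving.
cost_usual = rng.gamma(shape=16.0, scale=750.0, size=n)   # mean ~ $12,000
cost_music = rng.gamma(shape=16.0, scale=595.0, size=n)   # mean ~ $9,520

def prob_cost_saving(intervention_cost):
    """Share of PSA iterations in which the music arm is cheaper overall."""
    return np.mean(cost_music + intervention_cost < cost_usual)

# Scan intervention costs to locate the break-even point (roughly where the
# mean saving is exhausted); compare with the base-case cost of $329.
for c in (329, 1_000, 2_000, 2_651, 4_000):
    print(f"intervention cost ${c:>5}: cost saving in "
          f"{100 * prob_cost_saving(c):.0f}% of runs")
```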
NASA Astrophysics Data System (ADS)
Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang
2017-05-01
Distributed generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. On the other hand, because of the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after integration of a distributed photovoltaic (DPV) system under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in each typical weather condition was obtained via maximum-likelihood parameter estimation. A Monte Carlo simulation was adopted to calculate the probabilistic distribution of harmonic voltage content at different harmonic orders, as well as the total harmonic distortion (THD), in each typical weather condition. The case study was based on the IEEE 33-bus system, and the resulting probabilistic distributions of harmonic voltage content and THD in the typical weather conditions were compared.
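A minimal version of the Monte Carlo step can be sketched as follows, under heavy assumptions: PV output per weather type follows an invented Beta distribution, and each harmonic voltage is taken as proportional to output (a stand-in for the harmonic power-flow calculation on the IEEE 33-bus system, which is not reproduced here):

```python
# Monte Carlo sketch of harmonic-distortion statistics under uncertain PV
# output. The injection model is invented for illustration: each harmonic
# voltage is assumed proportional to the (random) PV output power.
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# PV output as a fraction of rating; a Beta shape per weather type (assumed).
weather = {"sunny": (8, 2), "cloudy": (3, 3), "rainy": (2, 6)}
# Assumed per-unit harmonic voltage at full PV output for orders 5, 7, 11, 13.
h_pu_at_full = np.array([0.020, 0.014, 0.008, 0.006])

for name, (a, b) in weather.items():
    p = rng.beta(a, b, size=n)                       # PV output samples
    v_h = p[:, None] * h_pu_at_full                  # harmonic voltages [p.u.]
    thd = 100 * np.sqrt((v_h ** 2).sum(axis=1))      # THD in % of fundamental
    q5, q50, q95 = np.percentile(thd, [5, 50, 95])
    print(f"{name:6s}: THD median {q50:.2f}%  (5-95%: {q5:.2f}-{q95:.2f}%)")
```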
Fusar-Poli, P; Schultze-Lutter, F
2016-02-01
Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet, probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review the risk or cumulative incidence of psychosis, the person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy (sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values), Bayes' theorem, likelihood ratios, and the potential and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment. Published by the BMJ Publishing Group Limited.
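The core of this reasoning is compact enough to state as a worked example (illustrative numbers only, not CHR instrument data): sensitivity and specificity give likelihood ratios, and Bayes' theorem in odds form converts a baseline risk into predictive values.

```python
# Worked example of the probabilistic reasoning the authors review: converting
# a test's sensitivity and specificity into likelihood ratios and predictive
# values at a given baseline risk. The numbers are illustrative, not CHR data.
def predictive_values(sens, spec, prevalence):
    lr_pos = sens / (1 - spec)                 # positive likelihood ratio
    lr_neg = (1 - sens) / spec                 # negative likelihood ratio
    # Bayes' theorem in odds form: posterior odds = prior odds * LR
    prior_odds = prevalence / (1 - prevalence)
    ppv = (prior_odds * lr_pos) / (1 + prior_odds * lr_pos)
    npv = 1 / (1 + prior_odds * lr_neg)        # P(no transition | negative)
    return lr_pos, lr_neg, ppv, npv

# Illustrative values: sensitivity 0.96, specificity 0.47, 20% baseline risk.
lr_pos, lr_neg, ppv, npv = predictive_values(0.96, 0.47, 0.20)
print(f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}, PPV {ppv:.2%}, NPV {npv:.2%}")
```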
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious task of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include chord length and height for the blade, and inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
Quantification of uncertainties in the performance of smart composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1993-01-01
A composite wing with spars, bulkheads, and built-in control devices is evaluated using a method for the probabilistic assessment of smart composite structures. Structural responses (such as change in angle of attack, vertical displacements, and stresses in regular plies with traditional materials and in control plies with mixed traditional and actuation materials) are probabilistically assessed to quantify their respective scatter. Probabilistic sensitivity factors are computed to identify those parameters that have a significant influence on a specific structural response. Results show that the uncertainties in the responses of smart composite structures can be quantified. Responses such as structural deformation, ply stresses, frequencies, and buckling loads in the presence of defects can be reliably controlled to satisfy specified design requirements.
Stenehjem, David D; Bellows, Brandon K; Yager, Kraig M; Jones, Joshua; Kaldate, Rajesh; Siebert, Uwe; Brixner, Diana I
2016-02-01
A prognostic test was developed to guide adjuvant chemotherapy (ACT) decisions in early-stage non-small cell lung cancer (NSCLC) adenocarcinomas. The objective of this study was to compare the cost-utility of the prognostic test with the current standard of care (SoC) in patients with early-stage NSCLC. Lifetime costs (2014 U.S. dollars) and effectiveness (quality-adjusted life-years [QALYs]) of ACT treatment decisions were examined using a Markov microsimulation model from a U.S. third-party payer perspective. Cancer stage distribution and the probability of receiving ACT with the SoC were based on data from an academic cancer center. The probability of receiving ACT with the prognostic test was estimated from a physician survey. Risk classification was based on the 5-year predicted NSCLC-related mortality. Treatment benefit with ACT was based on the prognostic score. Discounting at a 3% annual rate was applied to costs and QALYs. Deterministic one-way and probabilistic sensitivity analyses examined parameter uncertainty. Lifetime costs and effectiveness were $137,403 and 5.45 QALYs with the prognostic test and $127,359 and 5.17 QALYs with the SoC. The resulting incremental cost-effectiveness ratio for the prognostic test versus the SoC was $35,867/QALY gained. One-way sensitivity analyses indicated the model was most sensitive to the utility of patients without recurrence after ACT and the ACT treatment benefit. Probabilistic sensitivity analysis indicated the prognostic test was cost-effective in 65.5% of simulations at a willingness to pay of $50,000/QALY. The study suggests that using a prognostic test to guide ACT decisions in early-stage NSCLC is potentially cost-effective compared with using the SoC, based on globally accepted willingness-to-pay thresholds. Providing prognostic information to decision makers may help some patients with high-risk early-stage non-small cell lung cancer receive appropriate adjuvant chemotherapy while avoiding the associated toxicities and costs in patients with low-risk disease. This study used an economic model to assess the effectiveness and costs associated with using a prognostic test to guide adjuvant chemotherapy decisions compared with the current standard of care in patients with non-small cell lung cancer. When compared with the current standard of care, the prognostic test was potentially cost-effective at commonly accepted thresholds in the U.S. This study can be used to help inform decision makers who are considering using prognostic tests. ©AlphaMed Press.
Sangchan, Apichat; Chaiyakunapruk, Nathorn; Supakankunti, Siripen; Pugkhem, Ake; Mairiang, Pisaln
2014-01-01
Endoscopic biliary drainage using metal and plastic stents in unresectable hilar cholangiocarcinoma (HCA) is widely used, but little is known about its cost-effectiveness. This study evaluated the cost-utility of endoscopic metal and plastic stent drainage in unresectable complex (Bismuth type II-IV) HCA patients. A decision-analytic Markov model was used to evaluate the costs and quality-adjusted life-years (QALYs) of endoscopic biliary drainage in unresectable HCA. Costs of treatment were retrieved from hospital charges, and utilities of each Markov state were elicited from unresectable HCA patients at a tertiary care hospital in Thailand. Transition probabilities were derived from the international literature. Base-case analyses and sensitivity analyses were performed. Under the base-case analysis, the metal stent is more effective but more expensive than the plastic stent; the incremental cost per additional QALY gained is 192,650 baht (US$ 6,318). In the probabilistic sensitivity analysis, at willingness-to-pay thresholds of one and three times GDP per capita, or 158,000 baht (US$ 5,182) and 474,000 baht (US$ 15,546), the probability of the metal stent being cost-effective is 26.4% and 99.8%, respectively. Based on the WHO recommendation regarding cost-effectiveness threshold criteria, endoscopic metal stent drainage is cost-effective compared with plastic stent drainage in unresectable complex HCA.
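Those two probabilities are points on a cost-effectiveness acceptability curve, which can be computed from PSA draws as in the sketch below (the incremental cost and QALY distributions are invented, chosen only so the pattern echoes the abstract: marginal at one times GDP per capita, near-certain at three times):

```python
# Sketch of a cost-effectiveness acceptability curve (CEAC): the probability
# that a strategy is cost-effective as a function of the willingness-to-pay
# (WTP) threshold, computed from PSA draws. Draws are invented.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
d_cost = rng.normal(115_000, 15_000, n)   # incremental cost [baht], invented
d_qaly = rng.normal(0.60, 0.10, n)        # incremental QALYs, invented

for wtp in (158_000, 316_000, 474_000):   # 1x, 2x, 3x GDP per capita
    # Net monetary benefit > 0 <=> cost-effective at this threshold
    p_ce = np.mean(wtp * d_qaly - d_cost > 0)
    print(f"WTP {wtp:>7,} baht/QALY: P(cost-effective) = {p_ce:.1%}")
```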
A probabilistic maintenance model for diesel engines
NASA Astrophysics Data System (ADS)
Pathirana, Shan; Abeygunawardane, Saranga Kumudu
2018-02-01
In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on practical model concepts discussed in the literature. The developed model is solved using real data obtained from the inspection and maintenance histories of diesel engines, together with experts' views. Reliability indices and costs are calculated for the present maintenance policy of diesel engines, and a sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life-cycle cost of diesel engines.
Cost-Utility of Elbasvir/Grazoprevir in Patients with Chronic Hepatitis C Genotype 1 Infection.
Corman, Shelby; Elbasha, Elamin H; Michalopoulos, Steven N; Nwankwo, Chizoba
2017-09-01
To evaluate the cost-utility of treatment with elbasvir/grazoprevir (EBR/GZR) regimens compared with ledipasvir/sofosbuvir (LDV/SOF), ombitasvir/paritaprevir/ritonavir + dasabuvir ± ribavirin (3D ± RBV), and sofosbuvir/velpatasvir (SOF/VEL) in patients with chronic hepatitis C genotype (GT) 1 infection. A Markov cohort state-transition model was constructed to evaluate the cost-utility of EBR/GZR ± RBV over a lifetime time horizon from the payer perspective. The target population was patients infected with chronic hepatitis C GT1 subtypes a or b (GT1a or GT1b), stratified by treatment history (treatment-naive [TN] or treatment-experienced), presence of cirrhosis, baseline hepatitis C virus RNA (< or ≥6 million IU/mL), and presence of NS5A resistance-associated variants. The primary outcome was the incremental cost-utility ratio for EBR/GZR ± RBV versus available oral direct-acting antiviral agents. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model. EBR/GZR ± RBV was economically dominant versus LDV/SOF in all patient populations. EBR/GZR ± RBV was also less costly than SOF/VEL and 3D ± RBV, but produced fewer quality-adjusted life-years in select populations. In the remaining populations, EBR/GZR ± RBV was economically dominant. One-way sensitivity analyses showed that varying the sustained virologic response rates of the EBR/GZR ± RBV regimens commonly changed the model conclusions when lower-bound values were used and, at the upper bound, resulted in dominance over SOF/VEL in GT1a cirrhotic and GT1b TN noncirrhotic patients. Results of the probabilistic sensitivity analysis showed that EBR/GZR ± RBV was cost-effective in more than 99% of iterations in GT1a and GT1b noncirrhotic patients and more than 69% of iterations in GT1b cirrhotic patients. Compared with other oral direct-acting antiviral agents, EBR/GZR ± RBV was the economically dominant regimen for treating GT1a noncirrhotic and GT1b TN cirrhotic patients, and was cost saving in all other populations. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Probabilistic Simulation of Combined Thermo-Mechanical Cyclic Fatigue in Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2011-01-01
A methodology to compute the probabilistically-combined thermo-mechanical fatigue life of polymer matrix laminated composites has been developed and is demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor-interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability-integration method is used to compute the probabilistic distribution of response. Sensitivities of the fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, and ply thickness) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/-45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm has been studied with respect to impending failure modes. The results show that, at low mechanical-cyclic loads and low thermal-cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical-cyclic loads and high thermal-cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
Probabilistic Simulation for Combined Cycle Fatigue in Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A methodology to compute the probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress-dependent multifactor interaction relationship developed at NASA Glenn Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of response. Sensitivities of the fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, and ply thickness) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/- 45/90)s graphite/epoxy laminate with a ply thickness of 0.127 mm has been studied with respect to impending failure modes. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne
2005-11-01
A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses. At least one hundred realizations were simulated for each scenario defined in the performance assessment. Conservative values and assumptions were used to define values and distributions of uncertain input parameters when site data were not available. Results showed that exposure to tritium via the air pathway exceeded the regulatory metric of 10 mrem/year in about 2% of the simulated realizations when the receptor was located at the MWL (continuously exposed to the air directly above the MWL). Simulations showed that peak radon gas fluxes exceeded the design standard of 20 pCi/m²/s in about 3% of the realizations if up to 1% of the containers of sealed radium-226 sources were assumed to completely degrade in the future. If up to 100% of the containers of radium-226 sources were assumed to completely degrade, 30% of the realizations yielded radon surface fluxes that exceeded the design standard. For the groundwater pathway, simulations showed that none of the radionuclides or heavy metals (lead and cadmium) reached the groundwater during the 1,000-year evaluation period. Tetrachloroethylene (PCE) was used as a proxy for other VOCs because of its mobility and potential to exceed maximum contaminant levels in the groundwater relative to other VOCs. Simulations showed that PCE reached the groundwater, but only 1% of the realizations yielded aquifer concentrations that exceeded the regulatory metric of 5 µg/L. Based on these results, monitoring triggers have been proposed for the air, surface soil, vadose zone, and groundwater at the MWL. Specific triggers include numerical thresholds for radon concentrations in the air, tritium concentrations in surface soil, infiltration through the vadose zone, and uranium and select VOC concentrations in groundwater. The proposed triggers are based on U.S. Environmental Protection Agency and Department of Energy regulatory standards. If a trigger is exceeded, then a trigger evaluation process will be initiated which will allow sufficient data to be collected to assess trends and recommend corrective actions, if necessary.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
NUCLEAR REGULATORY COMMISSION [NRC-2013-0047]: The U.S. Nuclear Regulatory Commission (NRC) has issued for public comment a document entitled "Compendium of Analyses To Investigate Select Level 1..."
Cost-effectiveness of EOB-MRI for Hepatocellular Carcinoma in Japan.
Nishie, Akihiro; Goshima, Satoshi; Haradome, Hiroki; Hatano, Etsuro; Imai, Yasuharu; Kudo, Masatoshi; Matsuda, Masanori; Motosugi, Utaroh; Saitoh, Satoshi; Yoshimitsu, Kengo; Crawford, Bruce; Kruger, Eliza; Ball, Graeme; Honda, Hiroshi
2017-04-01
The objective of the study was to evaluate the cost-effectiveness of gadoxetic acid-enhanced magnetic resonance imaging (EOB-MRI) in the diagnosis and treatment of hepatocellular carcinoma (HCC) in Japan compared with extracellular contrast media-enhanced MRI (ECCM-MRI) and contrast media-enhanced computed tomography (CE-CT) scanning. A 6-stage Markov model was developed to estimate lifetime direct costs and clinical outcomes associated with EOB-MRI. Diagnostic sensitivity and specificity, along with clinical data on HCC survival, recurrence, treatment patterns, costs, and health state utility values, were derived from predominantly Japanese publications. Parameters unavailable from publications were estimated in a Delphi panel of Japanese clinical experts who also confirmed the structure and overall approach of the model. Sensitivity analyses, including one-way, probabilistic, and scenario analyses, were conducted to account for uncertainty in the results. Over a lifetime horizon, EOB-MRI was associated with lower direct costs (¥2,174,869) and generated a greater number of quality-adjusted life years (QALYs) (9.502) than either ECCM-MRI (¥2,365,421, 9.303 QALYs) or CE-CT (¥2,482,608, 9.215 QALYs). EOB-MRI was superior to the other diagnostic strategies considered, and this finding was robust over sensitivity and scenario analyses. A majority of the direct costs associated with HCC in Japan were found to be costs of treatment. The model results revealed the superior cost-effectiveness of the EOB-MRI diagnostic strategy compared with ECCM-MRI and CE-CT. EOB-MRI could be the first-choice imaging modality for medical care of HCC among patients with hepatitis or liver cirrhosis in Japan. Widespread implementation of EOB-MRI could reduce health care expenditures, particularly downstream treatment costs, associated with HCC. Copyright © 2017 Elsevier HS Journals, Inc. All rights reserved.
Stevanović, Jelena; Pompen, Marjolein; Le, Hoa H.; Rozenbaum, Mark H.; Tieleman, Robert G.; Postma, Maarten J.
2014-01-01
Background Stroke prevention is the main goal of treating patients with atrial fibrillation (AF). Vitamin-K antagonists (VKAs) are an effective treatment for stroke prevention; however, the risk of bleeding and the requirement for regular coagulation monitoring limit their use. Apixaban is a novel oral anticoagulant associated with significantly lower hazard rates for stroke, major bleedings, and treatment discontinuations compared with VKAs. Objective To estimate the cost-effectiveness of apixaban compared with VKAs in non-valvular AF patients in the Netherlands. Methods A previously published lifetime Markov model using efficacy data from the ARISTOTLE and AVERROES trials was modified to reflect the use of oral anticoagulants in the Netherlands. Dutch-specific costs, baseline population stroke risk, and coagulation monitoring levels were incorporated. Univariate and probabilistic sensitivity analyses, as well as scenario analyses on the impact of different coagulation monitoring levels, were performed on the incremental cost-effectiveness ratio (ICER). Results Treatment with apixaban compared with VKAs resulted in an ICER of €10,576 per quality-adjusted life year (QALY). These findings correspond to the lower numbers of strokes and bleedings associated with the use of apixaban compared with VKAs. Univariate sensitivity analyses revealed model sensitivity to the absolute stroke risk with apixaban and to the treatment discontinuation risks with apixaban and VKAs. The probability that apixaban is cost-effective at a willingness-to-pay threshold of €20,000/QALY was 68%. Results of the scenario analyses on the impact of different coagulation monitoring levels were quite robust. Conclusions In patients with non-valvular AF, apixaban is likely to be a cost-effective alternative to VKAs in the Netherlands. PMID:25093723
Cost-effectiveness of pharmacist-participated warfarin therapy management in Thailand.
Saokaew, Surasak; Permsuwan, Unchalee; Chaiyakunapruk, Nathorn; Nathisuwan, Surakit; Sukonthasarn, Apichard; Jeanpeerapong, Napawan
2013-10-01
Although pharmacist-participated warfarin therapy management (PWTM) is well established, economic evaluation of PWTM is still lacking, particularly in the Asia-Pacific region. The objective of this study was to estimate the cost-effectiveness of PWTM in Thailand using local data where available. A Markov model was used to compare lifetime costs and quality-adjusted life years (QALYs) accrued to patients receiving warfarin therapy through PWTM or usual care (UC). The model was populated with relevant information from both health care system and societal perspectives. Input data were obtained from the literature and database analyses. Incremental cost-effectiveness ratios (ICERs) were presented as year 2012 values. A base-case analysis was performed for patients aged 45 years. One-way and probabilistic sensitivity analyses were conducted to determine the robustness of the findings. From the societal perspective, PWTM and UC result in 39.5 and 38.7 QALYs, respectively; thus PWTM increases QALYs by 0.79 at an additional cost of 92,491 THB (3,083 USD) compared with UC (ICER 116,468 THB [3,882.3 USD] per QALY gained). From the health care system perspective, PWTM likewise gains 0.79 QALYs at an additional cost of 92,788 THB (3,093 USD) compared with UC (ICER 116,842 THB [3,894.7 USD] per QALY gained). Thus, PWTM was cost-effective compared with usual care, assuming a willingness-to-pay (WTP) threshold of 150,000 THB/QALY. Results were sensitive to the discount rate and the cost of clinic set-up. Our findings suggest that PWTM is a cost-effective intervention. Policy-makers may consider this finding as part of the information for their decision-making on implementing this strategy in the healthcare benefit package. Further updates are needed when additional data become available. © 2013.
Medialization thyroplasty versus injection laryngoplasty: a cost minimization analysis.
Tam, Samantha; Sun, Hongmei; Sarma, Sisira; Siu, Jennifer; Fung, Kevin; Sowerby, Leigh
2017-02-20
Medialization thyroplasty and injection laryngoplasty are widely accepted treatment options for unilateral vocal fold paralysis. Although both procedures result in similar clinical outcomes, little is known about the corresponding medical care costs. Medialization thyroplasty requires expensive operating room resources, while injection laryngoplasty (IL) uses outpatient resources but may require repeated procedures. The purpose of this study, therefore, was to quantify the cost differences in adult patients with unilateral vocal fold paralysis undergoing medialization thyroplasty versus injection laryngoplasty. A cost minimization analysis was conducted using a decision tree model. The decision tree was constructed to capture clinical scenarios for medialization thyroplasty and injection laryngoplasty. Probabilities for various events were obtained from a retrospective cohort from the London Health Sciences Centre, Canada. Costs were derived from the published literature and the London Health Sciences Centre. All costs were reported in 2014 Canadian dollars. The time horizon was 5 years. The study was conducted from an academic hospital perspective in Canada. Various sensitivity analyses were conducted to assess differences in procedure-specific costs and probabilities of key events. Sixty-three patients underwent medialization thyroplasty and 41 underwent injection laryngoplasty. The cost of medialization thyroplasty was C$2499.10 per patient, whereas injection laryngoplasty cost C$943.19 per patient, for cost savings with IL of C$1555.91. Deterministic and probabilistic sensitivity analyses suggested cost savings ranging from C$596 to C$3626. Treatment with injection laryngoplasty results in cost savings of C$1555.91 per patient. Our extensive sensitivity analyses suggest that switching from medialization thyroplasty to injection laryngoplasty will lead to a minimum cost saving of C$596 per patient. Considering the significant cost savings and similar effectiveness, injection laryngoplasty should be strongly considered as a preferred treatment option for patients diagnosed with unilateral vocal fold paralysis.
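The shape of such a decision-tree comparison can be sketched in a few lines: a one-time procedure cost on one branch, and on the other a repeat-injection chain whose expected cost follows from the repeat probability. All probabilities and per-procedure costs below are invented; only the qualitative structure follows the study.

```python
# Minimal decision-tree sketch for a cost-minimization comparison: a one-time
# procedure versus a cheaper procedure that may need repeating. Probabilities
# and per-procedure costs are invented; only the flavor follows the abstract.
COST_MT = 2100.0          # per-procedure cost, medialization thyroplasty (assumed)
COST_IL = 650.0           # per-procedure cost, injection laryngoplasty (assumed)
P_REPEAT = 0.45           # probability an injection must be repeated (assumed)
MAX_INJECTIONS = 4        # cap on repeats within the 5-year horizon (assumed)

def expected_cost_il(n_done=1):
    """Expected IL cost: each injection may be followed by another repeat."""
    if n_done == MAX_INJECTIONS:
        return COST_IL
    return COST_IL + P_REPEAT * expected_cost_il(n_done + 1)

e_il, e_mt = expected_cost_il(), COST_MT
print(f"E[cost IL] = C${e_il:.2f}, E[cost MT] = C${e_mt:.2f}, "
      f"saving with IL = C${e_mt - e_il:.2f}")
```

Sensitivity analysis then amounts to re-running this calculation over ranges of the repeat probability and the two procedure costs.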
Whitty, Jennifer A; Crosland, Paul; Hewson, Kaye; Narula, Rajan; Nathan, Timothy R; Campbell, Peter A; Keller, Andrew; Scuffham, Paul A
2014-03-01
To compare the costs of photoselective vaporisation (PVP) and transurethral resection of the prostate (TURP) for management of symptomatic benign prostatic hyperplasia (BPH) from the perspective of a Queensland public hospital provider. A decision-analytic model was used to compare the costs of PVP and TURP. Cost inputs were sourced from an audit of patients undergoing PVP or TURP across three hospitals. The probability of re-intervention was obtained from secondary literature sources. Probabilistic and multi-way sensitivity analyses were used to account for uncertainty and test the impact of varying key assumptions. In the base case analysis, which included equipment, training and re-intervention costs, PVP was AU$ 739 (95% credible interval [CrI] -12 187 to 14 516) more costly per patient than TURP. The estimate was most sensitive to changes in procedural costs, fibre costs and the probability of re-intervention. Sensitivity analyses based on data from the most favourable site or excluding equipment and training costs reduced the point estimate to favour PVP (incremental cost AU$ -684, 95% CrI -8319 to 5796 and AU$ -100, 95% CrI -13 026 to 13 678, respectively). However, CrIs were wide for all analyses. In this cost minimisation analysis, there was no significant cost difference between PVP and TURP, after accounting for equipment, training and re-intervention costs. However, PVP was associated with a shorter length of stay and lower procedural costs during audit, indicating PVP potentially provides comparatively good value for money once the technology is established. © 2013 The Authors. BJU International © 2013 BJU International.
Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. Previous work verified the feasibility of the PDS method on a simple seven-degree-of-freedom spring-mass system. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially considering the lack of alternatives for obtaining quantitative output for probabilistic structures.
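The Monte Carlo branch of such an approach can be illustrated with a deliberately simplified sketch: measured modal parameters (natural frequencies and damping ratios) are treated as random variables and propagated to a forced-response amplitude via modal magnification factors. The modal statistics, the two-mode model, and the direct summation of modal contributions are all simplifying assumptions for illustration, not the PDS formulation itself.

```python
# Sketch of the Monte Carlo branch of a modal-statistics-based probabilistic
# analysis: measured modal frequencies and damping ratios are treated as the
# random inputs and propagated to a forced-response amplitude. The modal
# statistics and the two-mode model below are invented for illustration.
import numpy as np

rng = np.random.default_rng(11)
n = 20_000
f_exc = 95.0                                   # excitation frequency [Hz]

# Modal data "from testing": mean frequency [Hz], CoV, mean damping ratio.
modes = [(90.0, 0.03, 0.02), (140.0, 0.05, 0.015)]

resp = np.zeros(n)
for f_mean, cov, zeta_mean in modes:
    f_n = rng.normal(f_mean, cov * f_mean, n)          # natural frequency
    zeta = rng.lognormal(np.log(zeta_mean), 0.2, n)    # damping ratio
    r = f_exc / f_n
    # Unit-modal-force magnification factor, summed crudely across modes
    resp += 1.0 / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

print(f"response amplitude: median {np.median(resp):.2f}, "
      f"99th percentile {np.percentile(resp, 99):.2f}")
```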
Application of the Probabilistic Dynamic Synthesis Method to Realistic Structures
NASA Technical Reports Server (NTRS)
Brown, Andrew M.; Ferri, Aldo A.
1998-01-01
The Probabilistic Dynamic Synthesis method is a technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. In previous work, the feasibility of the PDS method applied to a simple seven degree-of-freedom spring-mass system was verified. In this paper, extensive issues involved with applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.
Lutz, Manfred A; Lovato, Pedro; Cuesta, Genaro
2012-02-01
In Central American countries, the economic burden of tobacco has not been assessed. In Costa Rica, a study demonstrated that tobacco-related diseases represent high costs for the health care system. The aim of this study was to assess the cost-effectiveness of varenicline compared with other existing strategies for smoking cessation within a 10-year time horizon in an adult population cohort from Central American and Caribbean countries, using the health care payer's perspective. The Benefits of Smoking Cessation on Outcomes simulation model was used for an adult cohort in Costa Rica (n = 2 474 029), Panama (n = 2 249 676), Nicaragua (n = 3 639 948), El Salvador (n = 4 537 803), and the Dominican Republic (n = 6 528 125) (N = 19 429 581). The smoking cessation therapies compared were varenicline (0.5-2 mg/day) versus bupropion (300 mg/day), nicotine replacement therapy (5-15 mg/day), and unaided cessation. Effectiveness measures were life-years (LYs) gained and quality-adjusted life-years (QALYs) gained. Resource use and cost data were obtained from each country's Ministry of Health and/or Social Security Institutions (2008-2010). The model used a 5% discount rate for costs (expressed in 2010 US$) and health outcomes. Probabilistic sensitivity analyses were conducted and acceptability curves were constructed. Varenicline reduced smoking-related morbidity, mortality, and health care costs in each country included in the study. Cumulatively, mortality in the varenicline arm was reduced by 1190, 1538, and 2902 smoking-related deaths compared with bupropion, nicotine replacement therapy, and unaided cessation, respectively. The net average cost per additional quitter showed that varenicline was cost saving when compared with the competing alternatives. Regarding LYs and QALYs gained in 10 years, varenicline obtained the greatest number of QALYs and LYs in each country, while unaided cessation obtained the fewest. Cost-effectiveness analyses in all 5 countries showed that varenicline was the dominant strategy. Acceptability curves showed that, independent of the willingness to pay, the probability that varenicline is cost-effective was 99% for this region. The results of the probabilistic sensitivity analyses support the robustness of the findings. Smoking cessation therapy with varenicline is cost saving for the Central American and Caribbean countries included. These results could help to reduce the tobacco-related disease burden and align cost-containment policies.
Wang, Xiao Jun; Saha, Ashwini; Zhang, Xu-Hao
2017-01-01
Currently, two pediatric pneumococcal conjugate vaccines are available in the private market of Malaysia: the 13-valent pneumococcal conjugate vaccine (PCV13) and the pneumococcal polysaccharide and non-typeable Haemophilus influenzae protein D conjugate vaccine (PHiD-CV). This study aimed to evaluate the cost-effectiveness of a universal mass vaccination program with a PHiD-CV 2+1 schedule versus no vaccination or a PCV13 2+1 schedule in Malaysia. A published Markov cohort model was adapted to evaluate the epidemiological and economic consequences of programs with no vaccination, a PHiD-CV 2+1 schedule, or a PCV13 2+1 schedule over a 10-year time horizon. Disease cases, deaths, direct medical costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) were estimated. Locally published epidemiology and cost data were used whenever possible. Vaccine effectiveness and disutility data were based on the best available published data. All data inputs and assumptions were validated by local clinical and health economics experts. Analyses were conducted from the perspective of the Malaysian government for a birth cohort of 508,774. Costs and QALYs were discounted at 3% per annum. One-way and probabilistic sensitivity analyses were performed. Compared with no vaccination, a PHiD-CV 2+1 program was projected to prevent 1109 invasive pneumococcal disease (IPD), 24,679 pneumonia and 72,940 acute otitis media (AOM) cases and 103 IPD/pneumonia deaths over 10 years, with additional costs and QALYs of United States dollars (USD) 30.9 million and 1084 QALYs, respectively, at an ICER of USD 28,497/QALY. Compared with a PCV13 2+1 program, PHiD-CV 2+1 was projected to result in similar reductions in IPD cases (40 cases more) but significantly fewer AOM cases (30,001 cases fewer), with cost savings and additional QALYs gained of USD 5.2 million and 116 QALYs, respectively, demonstrating dominance over PCV13. Results were robust to variations in one-way and probabilistic sensitivity analyses. A PHiD-CV 2+1 universal mass vaccination program could substantially reduce pneumococcal disease burden versus no vaccination, and was expected to be cost-effective in Malaysia. A PHiD-CV 2+1 program was also expected to be a dominant choice over a PCV13 2+1 program in Malaysia.
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
ERIC Educational Resources Information Center
Little, Daniel R.; Lewandowsky, Stephan
2009-01-01
Despite the fact that categories are often composed of correlated features, the evidence that people detect and use these correlations during intentional category learning has been overwhelmingly negative to date. Nonetheless, on other categorization tasks, such as feature prediction, people show evidence of correlational sensitivity. A…
Durán, I; Beiras, R
2013-10-01
Acute water quality criteria (WQC) for the protection of coastal ecosystems are developed on the basis of short-term ecotoxicological data using the most sensitive life stages of representative species from the main taxa of marine water-column organisms. A probabilistic approach based on species sensitivity distribution (SSD) curves was chosen and compared with the WQC obtained by applying an assessment factor to the critical toxicity values, i.e., the 'deterministic' approach. The criteria obtained from HC5 values (the 5th percentile of the SSD) were 1.01 μg/l for Hg, 1.39 μg/l for Cu, 3.83 μg/l for Cd, 25.3 μg/l for Pb and 8.24 μg/l for Zn. Using sensitive early life stages and very sensitive endpoints allowed calculation of WQC for marine coastal ecosystems. These probabilistic WQC, intended to protect 95% of the species in 95% of the cases, were calculated on the basis of a limited ecotoxicological dataset, avoiding the use of large and uncertain assessment factors. Copyright © 2013 Elsevier B.V. All rights reserved.
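The HC5 computation at the heart of the SSD approach fits in a few lines; the sketch below fits a log-normal SSD to invented EC50 values and reads off its 5th percentile (the paper's dataset and any confidence-limit treatment behind the "95% of cases" clause are not reproduced):

```python
# Sketch of the probabilistic SSD approach: fit a log-normal species
# sensitivity distribution to acute toxicity values and take its 5th
# percentile (HC5) as the basis for a water quality criterion. The toxicity
# values below are invented, not the paper's dataset.
import numpy as np
from scipy import stats

# Hypothetical EC50s [ug/L] for one metal across water-column taxa
ec50 = np.array([3.2, 5.8, 9.5, 14.0, 22.0, 35.0, 60.0, 110.0])

log_vals = np.log10(ec50)
mu, sigma = log_vals.mean(), log_vals.std(ddof=1)

# HC5: the concentration below which only 5% of species' EC50s fall
hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)
print(f"HC5 = {hc5:.2f} ug/L")
```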
NASA Astrophysics Data System (ADS)
Khabbazan, Mohammad Mohammadi; Roshan, Elnaz; Held, Hermann
2017-04-01
In principle, solar radiation management (SRM) offers an option to ameliorate anthropogenic temperature rise. However, it cannot be expected to simultaneously compensate for anthropogenic changes in further climate variables in a perfect manner. Here, we ask to what extent a proponent of the 2°C temperature target would apply SRM in conjunction with mitigation in view of global or regional disparities in precipitation changes. We apply cost-risk analysis (CRA), a decision-analytic framework that trades off the expected welfare loss from climate policy costs against the climate risks from transgressing a climate target. In both global-scale and 'Giorgi'-regional-scale analyses, we evaluate the optimal mixture of SRM and mitigation under probabilistic information about climate sensitivity. To do so, we generalize CRA to include not only temperature risk, but also globally aggregated and regionally disaggregated precipitation risks. Social welfare is maximized for three valuation scenarios: temperature-risk-only, precipitation-risk-only, and both risks equally weighted. For now, the Giorgi regions are weighted equally. We find that for regionally differentiated precipitation targets, the use of SRM is comparably more restricted. In the course of time, a cooling of up to 1.3°C can be attributed to SRM for the latter scenario and for a median climate sensitivity of 3°C (for a global target only, this number is reduced by 0.5°C). Our results indicate that although SRM would almost completely substitute for mitigation in the globally aggregated analysis, it saves only 70% to 75% of the welfare loss compared with a purely mitigation-based analysis (from economic costs and climate risks, approximately 4% in terms of BGE) when considering regional precipitation risks in the precipitation-risk-only and both-risks scenarios. It remains to be shown how the inclusion of further risks or different regional weights would change that picture.
Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...
2015-07-01
In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.
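The calibration machinery described here can be miniaturized into a self-contained sketch: a random-walk Metropolis sampler infers one parameter of a toy exponential "simulator" together with the standard deviation of an additive model/observation error term from synthetic data. Everything below (the toy model, priors, step sizes) is an assumption for illustration, not the DALEC setup.

```python
# Minimal sketch of Bayesian calibration with an additive statistical model
# error term: random-walk Metropolis over (k, log sigma) on synthetic data.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 50)
model = lambda k: np.exp(-k * t)                 # toy stand-in for the simulator
y_obs = model(1.7) + rng.normal(0.0, 0.05, t.size)

def log_post(theta):
    k, log_sig = theta
    if not (0.0 < k < 10.0):                     # flat prior box on k
        return -np.inf
    sig = np.exp(log_sig)
    resid = y_obs - model(k)
    # Gaussian likelihood with unknown model/observation error sigma
    return -0.5 * np.sum(resid**2) / sig**2 - t.size * log_sig

chain, theta = [], np.array([1.0, np.log(0.1)])
lp = log_post(theta)
for _ in range(20_000):
    prop = theta + rng.normal(0.0, [0.05, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:      # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

samples = np.array(chain[5_000:])                # discard burn-in
print(f"k: {samples[:, 0].mean():.2f} +/- {samples[:, 0].std():.2f}; "
      f"sigma: {np.exp(samples[:, 1]).mean():.3f}")
```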
Coupled Multi-Disciplinary Optimization for Structural Reliability and Affordability
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
A computational simulation method is presented for Non-Deterministic Multidisciplinary Optimization of engine composite materials and structures. A hypothetical engine duct made with ceramic matrix composites (CMC) is evaluated probabilistically in the presence of combined thermo-mechanical loading. The structure is tailored by quantifying the uncertainties in all relevant design variables such as fabrication, material, and loading parameters. The probabilistic sensitivities are used to select critical design variables for optimization. In this paper, two approaches for non-deterministic optimization are presented. The non-deterministic minimization of combined failure stress criterion is carried out by: (1) performing probabilistic evaluation first and then optimization and (2) performing optimization first and then probabilistic evaluation. The first approach shows that the optimization feasible region can be bounded by a set of prescribed probability limits and that the optimization follows the cumulative distribution function between those limits. The second approach shows that the optimization feasible region is bounded by 0.50 and 0.999 probabilities.
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh
1998-01-01
An integrated probabilistic approach has been developed to assess composites for high temperature applications. This approach was used to determine thermal and mechanical properties and their probabilistic distributions of a 5-harness 0/90 Sylramic fiber/CVI-SiC/Mi-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. This approach quantified the influences of uncertainties inherent in constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 F. The quantitative information is presented in the form of Cumulative Distribution Functions (CDFs), Probability Density Functions (PDFs), and primitive variable sensitivities on response. Results indicate that the scatters in response variables were reduced by 30-50% when the uncertainties in the primitive variables that showed the most influence were reduced by 50%.
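A minimal sketch of the kind of Monte Carlo propagation behind such results: primitive variables are sampled, a response is computed, and the response scatter is compared after shrinking the most influential input uncertainties by 50%. The response function and the distributions are illustrative assumptions, not the composite micromechanics used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def response(E_f, V_f, k):
    # Toy stand-in for a CMC response mixing fiber modulus, fiber volume
    # ratio, and a matrix property (rule-of-mixtures flavor, illustrative only).
    return E_f * V_f + k * (1.0 - V_f)

def draw(scale=1.0):
    # scale < 1 shrinks the scatter of the primitive variables.
    E_f = rng.normal(380.0, scale * 38.0, N)   # fiber modulus, GPa (assumed)
    V_f = rng.normal(0.40, scale * 0.04, N)    # fiber volume ratio (assumed)
    k   = rng.normal(90.0, scale * 9.0, N)     # matrix property (assumed)
    return response(E_f, V_f, k)

base, tight = draw(1.0), draw(0.5)
spread = lambda x: np.percentile(x, 97.5) - np.percentile(x, 2.5)
print(f"response scatter reduced by {100 * (1 - spread(tight) / spread(base)):.0f}%")
# An empirical CDF of `base` is simply np.sort(base) vs np.linspace(0, 1, N).
```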
Non-Deterministic Dynamic Instability of Composite Shells
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2004-01-01
A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is an incrementally updated Lagrangian formulation. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, fiber volume ratio, fiber longitudinal modulus, dynamic load, and loading rate are the dominant uncertainties, in that order.
Guerriero, Carla; Cairns, John; Roberts, Ian; Rodgers, Anthony; Whittaker, Robyn; Free, Caroline
2013-10-01
The txt2stop trial has shown that mobile-phone-based smoking cessation support doubles biochemically validated quitting at 6 months. This study examines the cost-effectiveness of smoking cessation support delivered by mobile phone text messaging. The lifetime incremental costs and benefits of adding text-based support to current practice are estimated from a UK NHS perspective using a Markov model. The cost-effectiveness was measured in terms of cost per quitter, cost per life-year gained, and cost per QALY gained. As in previous studies, smokers are assumed to face a higher risk of experiencing the following five diseases: lung cancer, stroke, myocardial infarction, chronic obstructive pulmonary disease, and coronary heart disease (i.e. the main fatal or disabling, but by no means the only, adverse effects of prolonged smoking). The treatment costs and health state values associated with these diseases were identified from the literature. The analysis was based on the age and gender distribution observed in the txt2stop trial. Effectiveness and cost parameters were varied in deterministic sensitivity analyses, and a probabilistic sensitivity analysis was also performed. The cost of text-based support per 1,000 enrolled smokers is £16,120, which, given an estimated 58 additional quitters at 6 months, equates to £278 per quitter. However, when the future NHS costs saved (as a result of reduced smoking) are included, text-based support would be cost saving. It is estimated that 18 LYs are gained per 1,000 smokers (0.3 LYs per quitter) receiving text-based support, and 29 QALYs are gained (0.5 QALYs per quitter). The deterministic sensitivity analysis indicated that changes in individual model parameters did not alter the conclusion that this is a cost-effective intervention. Similarly, the probabilistic sensitivity analysis indicated a >90% chance that the intervention will be cost saving. This study shows that under a wide variety of conditions, personalised smoking cessation advice and support by mobile phone message is both beneficial for health and cost saving to a health system.
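The headline cost-per-quitter figure follows directly from the numbers reported above; a two-line check using only the abstract's figures:

```python
# Quick check of the headline arithmetic (figures from the abstract above).
cost_per_1000 = 16_120.0        # GBP, programme cost per 1,000 enrolled smokers
extra_quitters = 58             # additional quitters at 6 months
print(f"cost per quitter: GBP {cost_per_1000 / extra_quitters:,.0f}")   # -> 278
# Per quitter this corresponds to 0.3 life-years (18 LYs/1,000) and
# 0.5 QALYs (29 QALYs/1,000) gained, as reported.
```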
B-value and slip rate sensitivity analysis for PGA value in Lembang fault and Cimandiri fault area
NASA Astrophysics Data System (ADS)
Pratama, Cecep; Ito, Takeo; Meilano, Irwan; Nugraha, Andri Dian
2017-07-01
We examine the contributions of slip rate and b-value to the peak ground acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, i.e., a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi and Bandung using probabilistic seismic hazard analysis (PSHA). We observe that crustal faults have the greatest influence on the hazard estimate. A Monte Carlo approach has been developed to assess the sensitivity. The uncertainty and coefficient of variation (COV) of the slip rate and b-value in the Lembang and Cimandiri Fault areas have been calculated. We observe that seismic hazard estimates are sensitive to the fault slip rate and b-value, with uncertainties of 0.25 g and 0.1-0.2 g, respectively. For specific sites, we found seismic hazard estimates of 0.49 ± 0.13 g (COV 27%) for Sukabumi and 0.39 ± 0.05 g (COV 13%) for Bandung.
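A minimal sketch of the Monte Carlo sensitivity idea: sample the uncertain inputs, push them through a hazard relation, and report the mean, uncertainty, and COV. The input distributions and the PGA relation below are placeholders, not the study's PSHA chain.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

# Hypothetical stand-ins for the uncertain PSHA inputs.
slip = rng.normal(2.0, 0.5, N)          # fault slip rate, mm/yr (assumed)
b    = rng.normal(1.0, 0.1, N)          # Gutenberg-Richter b-value (assumed)

# Toy hazard relation: PGA at 10%-in-50-years as a smooth function of inputs.
pga = 0.25 * np.sqrt(np.clip(slip, 0.1, None)) * np.exp(-(b - 1.0))

mean, sd = pga.mean(), pga.std()
print(f"PGA = {mean:.2f} +/- {sd:.2f} g, COV = {100 * sd / mean:.0f}%")
```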
Monte Carlo simulation for slip rate sensitivity analysis in Cimandiri fault area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratama, Cecep, E-mail: great.pratama@gmail.com; Meilano, Irwan; Nugraha, Andri Dian
Slip rate is used to estimate the earthquake recurrence relationship, which has the greatest influence on the hazard level. We examine the contribution of slip rate to the peak ground acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, i.e., a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi using probabilistic seismic hazard analysis (PSHA). We observe that crustal faults have the greatest influence on the hazard estimate. A Monte Carlo approach has been developed to assess the sensitivity, and the properties of the Monte Carlo simulations have been assessed. The uncertainty and coefficient of variation (COV) of the slip rate for the Cimandiri Fault area have been calculated. We observe that the seismic hazard estimate is sensitive to the fault slip rate, with an uncertainty of about 0.25 g. For specific sites, we found seismic hazard estimates for Sukabumi between 0.4904 and 0.8465 g, with uncertainties between 0.0847 and 0.2389 g and COV between 17.7% and 29.8%.
Simulation of probabilistic wind loads and building analysis
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Chamis, Christos C.
1991-01-01
Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.
NASA Astrophysics Data System (ADS)
Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.
2003-04-01
Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy, and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates, to 2100 or beyond, annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures; and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF: one reflects year-to-year fluctuations around the expected value of a given variable (e.g., the standard error of annual GDP growth), and another is fixed within each CCRAF variant and represents essential constants of the "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed from historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and with quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making," which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As the uncertainty for some variables evolves over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
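The two types of random components described above can be sketched in a few lines: one draw fixed per variant (a "world constant" such as climate sensitivity) and one redrawn every simulated year. All numbers and the link between growth and warming below are illustrative assumptions, not CCRAF equations.

```python
import numpy as np

rng = np.random.default_rng(3)
n_variants, n_years = 1000, 100

# Type 2 component: fixed within a variant ("world constant"),
# e.g. climate sensitivity drawn once per variant (assumed distribution).
clim_sens = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n_variants)

gdp_growth = np.empty((n_variants, n_years))
for v in range(n_variants):
    # Type 1 component: year-to-year fluctuation around an expected path.
    gdp_growth[v] = 0.02 + rng.normal(0.0, 0.01, n_years)

# Toy link from a cumulative-activity proxy to warming, scaled by sensitivity.
warming = clim_sens[:, None] * np.cumsum(gdp_growth, axis=1) / 3.0
print("end-of-century warming, 5/50/95 percentiles across variants:",
      np.percentile(warming[:, -1], [5, 50, 95]).round(2))
```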
Seismic, high wind, tornado, and probabilistic risk assessments of the High Flux Isotope Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, S.P.; Stover, R.L.; Hashimoto, P.S.
1989-01-01
Natural phenomena analyses were performed on the High Flux Isotope Reactor (HFIR). Deterministic and probabilistic evaluations were made to determine the risks resulting from earthquakes, high winds, and tornadoes. Analytic methods, in conjunction with field evaluations and an earthquake experience database evaluation method, were used to provide more realistic results in a shorter amount of time. Plant modifications completed in preparation for HFIR restart and potential future enhancements are discussed.
Probabilistic tsunami hazard analysis: Multiple sources and global applications
Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie
2017-01-01
Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should in principle be evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
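Under the common Poisson assumption, the core PTHA computation, aggregating exceedance rates over multiple source types and converting them to a probability of exceedance within an exposure time, can be sketched as follows. The two sources, their rates, and their run-up distributions are invented for illustration.

```python
import numpy as np
from scipy import stats

# Two hypothetical tsunami sources with mean recurrence rates (1/yr) and
# log-normal run-up distributions at one target site (illustrative values).
sources = [
    {"rate": 1 / 100.0,  "mu": np.log(1.0), "sigma": 0.8},  # regional earthquakes
    {"rate": 1 / 2000.0, "mu": np.log(4.0), "sigma": 0.6},  # rare landslide events
]
T = 50.0                                   # exposure time, years
levels = np.array([0.5, 1.0, 2.0, 4.0])    # run-up thresholds, metres

# Total rate of exceeding each level: sum of per-source exceedance rates.
rate_exceed = sum(
    s["rate"] * (1 - stats.lognorm.cdf(levels, s["sigma"], scale=np.exp(s["mu"])))
    for s in sources
)
# Poisson assumption: P(at least one exceedance within T years).
p_exceed = 1 - np.exp(-rate_exceed * T)
for h, p in zip(levels, p_exceed):
    print(f"P(run-up > {h:.1f} m in {T:.0f} yr) = {p:.3f}")
```

Plotting `p_exceed` against `levels` gives exactly the hazard curve the abstract describes; epistemic uncertainty is typically handled by repeating this over alternative source models.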
Casciano, Roman; Chulikavit, Maruit; Di Lorenzo, Giuseppe; Liu, Zhimei; Baladi, Jean-Francois; Wang, Xufang; Robertson, Justin; Garrison, Lou
2011-01-01
A recent indirect comparison study showed that sunitinib-refractory metastatic renal cell carcinoma (mRCC) patients treated with everolimus are expected to have improved overall survival outcomes compared to patients treated with sorafenib. This analysis examines the likely cost-effectiveness of everolimus versus sorafenib in this setting from a US payer perspective. A Markov model was developed to simulate a cohort of sunitinib-refractory mRCC patients and to estimate the cost per incremental life-years gained (LYG) and quality-adjusted life-years (QALYs) gained. Markov states included are stable disease without adverse events, stable disease with adverse events, disease progression, and death. Transition probabilities were estimated using a subset of the RECORD-1 patient population receiving everolimus after sunitinib, and a comparable population receiving sorafenib in a single-arm phase II study. Costs of antitumor therapies were based on wholesale acquisition cost. Health state costs accounted for physician visits, tests, adverse events, postprogression therapy, and end-of-life care. The model extrapolated beyond the trial time horizon for up to 6 years based on published trial data. Deterministic and probabilistic sensitivity analyses were conducted. The estimated gain over sorafenib treatment was 1.273 LYs (0.916 QALYs) at an incremental cost of $81,643. The deterministic analysis resulted in an incremental cost-effectiveness ratio (ICER) of $64,155/LYG ($89,160/QALY). The probabilistic sensitivity analysis demonstrated that results were highly consistent across simulations. As the ICER fell within the cost per QALY range for many other widely used oncology medicines, everolimus is projected to be a cost-effective treatment relative to sorafenib for sunitinib-refractory mRCC. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
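The reported ICERs follow from the incremental cost and effect; a quick check using the abstract's figures (agreement is up to rounding of the published inputs):

```python
# Incremental cost and effect of everolimus vs sorafenib (from the abstract).
d_cost, d_ly, d_qaly = 81_643.0, 1.273, 0.916
print(f"ICER = ${d_cost / d_ly:,.0f}/LYG, ${d_cost / d_qaly:,.0f}/QALY")
# -> about $64,134/LYG and $89,130/QALY, close to the reported
#    $64,155/LYG and $89,160/QALY.
```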
Baro, Emilie; Galperine, Tatiana; Denies, Fanette; Lannoy, Damien; Lenne, Xavier; Odou, Pascal; Guery, Benoit; Dervaux, Benoit
2017-01-01
Clostridium difficile infection (CDI) is characterized by high rates of recurrence, resulting in substantial health care costs. The aim of this study was to analyze the cost-effectiveness of treatments for the management of second recurrence of community-onset CDI in France. We developed a decision-analytic simulation model to compare 5 treatments for the management of second recurrence of community-onset CDI: pulsed-tapered vancomycin, fidaxomicin, fecal microbiota transplantation (FMT) via colonoscopy, FMT via duodenal infusion, and FMT via enema. The model outcome was the incremental cost-effectiveness ratio (ICER), expressed as cost per quality-adjusted life year (QALY) among the 5 treatments. ICERs were interpreted using a willingness-to-pay threshold of €32,000/QALY. Uncertainty was evaluated through deterministic and probabilistic sensitivity analyses. Three strategies were on the efficiency frontier: pulsed-tapered vancomycin, FMT via enema, and FMT via colonoscopy, in order of increasing effectiveness. FMT via duodenal infusion and fidaxomicin were dominated (i.e. less effective and costlier) by FMT via colonoscopy and FMT via enema. FMT via enema compared with pulsed-tapered vancomycin had an ICER of €18,092/QALY. The ICER for FMT via colonoscopy versus FMT via enema was €73,653/QALY. Probabilistic sensitivity analysis with 10,000 Monte Carlo simulations showed that FMT via enema was the most cost-effective strategy in 58% of simulations and FMT via colonoscopy was favored in 19% at a willingness-to-pay threshold of €32,000/QALY. FMT via enema is the most cost-effective initial strategy for the management of second recurrence of community-onset CDI at a willingness-to-pay threshold of €32,000/QALY.
Wan, Xiao Min; Peng, Liu Bao; Ma, Jin An; Li, Yuan Jian
2017-07-15
Nivolumab is a new standard of care for patients with metastatic renal cell carcinoma (mRCC) and provides an overall survival benefit of 5.40 months in comparison with everolimus. This study evaluated the cost-effectiveness of nivolumab for the second-line treatment of mRCC from the perspective of US payers and identified the range of drug costs for which the addition of nivolumab to standard therapy could be considered cost-effective from a Chinese perspective. A partitioned survival model was constructed to estimate lifetime costs, life-years, and quality-adjusted life-years (QALYs). Costs were estimated for the US and Chinese health care systems. One-way and probabilistic sensitivity analyses were performed. Nivolumab provided an additional 0.29 QALYs at a cost of $151,676/QALY in the United States. The probabilistic sensitivity analysis showed that at a willingness-to-pay threshold of $100,000/QALY, at the current cost of nivolumab, the chance of nivolumab being cost-effective was 3.10%. For China, when nivolumab cost less than $7.90 or $9.70/mg, there was a nearly 90% likelihood that the incremental cost-effectiveness ratio for nivolumab would be less than $22,785 or $48,838/QALY, respectively. For the United States, nivolumab is unlikely to be a high-value treatment for mRCC at the current price, and a price reduction appears to be justified. In China, value-based prices for nivolumab are $7.90 and $9.70/mg for the country and Beijing City, respectively. This study could and should inform the multilateral drug-price negotiations in China that may be upcoming for nivolumab. Cancer 2017;123:2634-41. © 2017 American Cancer Society.
Mittmann, Nicole; Chan, Brian C; Craven, B Cathy; Isogai, Pierre K; Houghton, Pamela
2011-06-01
To evaluate the incremental cost-effectiveness of electrical stimulation (ES) plus standard wound care (SWC) compared with SWC alone in a spinal cord injury (SCI) population with grade III/IV pressure ulcers (PUs), from the public payer perspective. A decision-analytic model was constructed for a 1-year time horizon to determine the incremental cost-effectiveness of ES plus SWC versus SWC in a cohort of participants with SCI and grade III/IV PUs. Model inputs, namely clinical probabilities and direct health system and medical resource use, were based on the published literature and on a randomized controlled trial of ES plus SWC versus SWC. Costs (Can $) included outpatient (clinic, home care, health professional) and inpatient management (surgery, complications). One-way and probabilistic sensitivity analyses (1,000 Monte Carlo iterations) were conducted. The perspective of this analysis is that of a Canadian public health system payer. The model target population was an SCI cohort with grade III/IV PUs. The outcome measure was the incremental cost per PU healed. ES plus SWC was associated with better outcomes and lower costs: a 16.4% increase in PUs healed and a cost savings of $224 at 1 year. ES plus SWC was thus considered a dominant economic comparator. Probabilistic sensitivity analysis resulted in economic dominance for ES plus SWC in 62% of iterations, with another 35% having incremental cost-effectiveness ratios of $50,000 or less per PU healed. The largest driver of the economic model was the percentage of PUs healed with ES plus SWC. The addition of ES to SWC improved healing of grade III/IV PUs and reduced costs in an SCI population. Copyright © 2011 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
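A minimal sketch of the probabilistic sensitivity analysis pattern used here: simulate pairs of incremental cost and incremental effect, then count the fraction of iterations that are dominant (cheaper and better) or acceptable at a willingness-to-pay threshold. The distributions below are assumed for illustration, not the trial data.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 1000  # Monte Carlo iterations, as in the abstract

# Assumed distributions for incremental cost (Can$) and incremental healing
# probability of ES+SWC vs SWC; parameters are illustrative only.
d_cost = rng.normal(-224.0, 400.0, n)    # negative values = cost saving
d_heal = rng.normal(0.164, 0.06, n)      # extra PUs healed per patient

dominant = np.mean((d_cost < 0) & (d_heal > 0))
wtp = 50_000.0                           # willingness to pay per PU healed
acceptable = np.mean(wtp * d_heal - d_cost > 0)   # positive net benefit
print(f"dominant in {dominant:.0%} of iterations; "
      f"cost-effective at Can$50k/PU in {acceptable:.0%}")
```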
Potential cost-effectiveness of universal access to modern contraceptives in Uganda.
Babigumira, Joseph B; Stergachis, Andy; Veenstra, David L; Gardner, Jacqueline S; Ngonzi, Joseph; Mukasa-Kivunike, Peter; Garrison, Louis P
2012-01-01
Over two thirds of women who need contraception in Uganda lack access to modern effective methods. This study was conducted to estimate the potential cost-effectiveness of achieving universal access to modern contraceptives in Uganda by implementing a hypothetical new contraceptive program (NCP) from both societal and governmental (Ministry of Health (MoH)) perspectives. A Markov model was developed to compare the NCP to the status quo or current contraceptive program (CCP). The model followed a hypothetical cohort of 15-year-old girls over a lifetime horizon. Data were obtained from the Uganda National Demographic and Health Survey and from published and unpublished sources. Costs, life expectancy, disability-adjusted life expectancy, pregnancies, fertility, and incremental cost-effectiveness measured as cost per life-year (LY) gained, cost per disability-adjusted life-year (DALY) averted, cost per pregnancy averted, and cost per unit of fertility reduction were calculated. Univariate and probabilistic sensitivity analyses were performed to examine the robustness of results. Mean discounted life expectancy and disability-adjusted life expectancy (DALE) were higher under the NCP vs. CCP (28.74 vs. 28.65 years and 27.38 vs. 27.01, respectively). Mean pregnancies and live births per woman were lower under the NCP (7.90 vs. 9.51 and 5.79 vs. 6.92, respectively). Mean lifetime societal costs per woman were lower for the NCP from the societal perspective ($1,949 vs. $1,987) and the MoH perspective ($636 vs. $685). In the incremental analysis, the NCP dominated the CCP, i.e. it was both less costly and more effective. The results were robust to univariate and probabilistic sensitivity analyses. Universal access to modern contraceptives in Uganda appears to be highly cost-effective. Increasing contraceptive coverage should be considered among Uganda's public health priorities.
Saint-Laurent Thibault, Catherine; Moorjaney, Divya; Ganz, Michael L; Sill, Bruce; Hede, Shalini; Yuan, Yong; Gorsh, Boris
2017-07-01
A phase III trial evaluated the efficacy and safety of Daklinza (daclatasvir or DCV) in combination with sofosbuvir (SOF) for treatment of genotype (GT) 3 hepatitis C virus (HCV) patients. This study evaluated the cost-effectiveness of DCV + SOF vs SOF in combination with ribavirin (RBV) over a 20-year time horizon from the perspective of a United States (US) payer. A published Markov model was adapted to reflect US demographic characteristics, treatment patterns, costs of drug acquisition, monitoring, disease and adverse event management, and mortality risks. Clinical inputs came from the ALLY-3 and VALENCE trials. The primary outcome was the incremental cost-utility ratio. Life-years, incidence of complications, number of patients achieving sustained virological response (SVR), and the total cost per SVR were secondary outcomes. Costs (2014 USD) and quality-adjusted life years (QALYs) were discounted at 3% per year. Deterministic, probabilistic, and scenario sensitivity analyses were conducted. DCV + SOF was associated with lower costs and better effectiveness than SOF + RBV in the base case and in almost all scenarios (i.e. treatment-experienced, non-cirrhotic, time horizons of 5, 10, and 80 years). DCV + SOF was less costly, but also slightly less effective than SOF + RBV in the cirrhotic and treatment-naïve population scenarios. Results were sensitive to variations in the probability of achieving SVR for both treatment arms. DCV + SOF costs less than $50,000 per QALY gained in 79% of all probabilistic iterations compared with SOF + RBV. DCV + SOF is a dominant option compared with SOF + RBV in the US for the overall GT 3 HCV patient population.
Patel, Dipen A; Shorr, Andrew F; Chastre, Jean; Niederman, Michael; Simor, Andrew; Stephens, Jennifer M; Charbonneau, Claudie; Gao, Xin; Nathwani, Dilip
2014-07-22
We compared the economic impacts of linezolid and vancomycin for the treatment of hospitalized patients with methicillin-resistant Staphylococcus aureus (MRSA)-confirmed nosocomial pneumonia. We used a 4-week decision tree model incorporating published data and expert opinion on clinical parameters, resource use and costs (in 2012 US dollars), such as efficacy, mortality, serious adverse events, treatment duration and length of hospital stay. The results presented are from a US payer perspective. The base case first-line treatment duration for patients with MRSA-confirmed nosocomial pneumonia was 10 days. Clinical treatment success (used for the cost-effectiveness ratio) and failure due to lack of efficacy, serious adverse events or mortality were possible clinical outcomes that could impact costs. Cost of treatment and incremental cost-effectiveness per successfully treated patient were calculated for linezolid versus vancomycin. Univariate (one-way) and probabilistic sensitivity analyses were conducted. The model allowed us to calculate the total base case inpatient costs as $46,168 (linezolid) and $46,992 (vancomycin). The incremental cost-effectiveness ratio favored linezolid (versus vancomycin), with lower costs ($824 less) and greater efficacy (+2.7% absolute difference in the proportion of patients successfully treated for MRSA nosocomial pneumonia). Approximately 80% of the total treatment costs were attributed to hospital stay (primarily in the intensive care unit). The results of our probabilistic sensitivity analysis indicated that linezolid is the cost-effective alternative under varying willingness to pay thresholds. These model results show that linezolid has a favorable incremental cost-effectiveness ratio compared to vancomycin for MRSA-confirmed nosocomial pneumonia, largely attributable to the higher clinical trial response rate of patients treated with linezolid. The higher drug acquisition cost of linezolid was offset by lower treatment failure-related costs and fewer days of hospitalization.
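The decision-tree logic common to analyses like this one reduces to expected-value arithmetic over branch probabilities; a minimal sketch with placeholder probabilities and costs (not the published model inputs):

```python
# Minimal decision-tree sketch for a two-arm antibiotic comparison:
# expected cost and cost per successfully treated patient.
# All probabilities and costs below are placeholders for illustration.
arms = {
    "linezolid":  {"p_success": 0.60, "cost_success": 40_000, "cost_fail": 55_000},
    "vancomycin": {"p_success": 0.57, "cost_success": 41_000, "cost_fail": 55_000},
}
for name, a in arms.items():
    p = a["p_success"]
    e_cost = p * a["cost_success"] + (1 - p) * a["cost_fail"]  # expected cost
    print(f"{name}: expected cost = ${e_cost:,.0f}, "
          f"cost per success = ${e_cost / p:,.0f}")
```

With numbers of this shape, a small efficacy advantage can offset a higher acquisition cost, which is the mechanism the abstract reports for linezolid.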
Candia, Roberto; Naimark, David; Sander, Beate; Nguyen, Geoffrey C
2017-11-01
Postoperative recurrence of Crohn's disease is common. This study sought to assess whether the postoperative management should be based on biological therapy alone or combined with thiopurines and whether the therapy should be started immediately after surgery or guided by either endoscopic or clinical recurrence. A Markov model was developed to estimate expected health outcomes in quality-adjusted life years (QALYs) and costs in Canadian dollars (CAD$) accrued by hypothetical patients with high recurrence risk after ileocolic resection. Eight strategies of postoperative management were evaluated. A lifetime time horizon, an annual discount rate of 5%, a societal perspective, and a cost-effectiveness threshold of 50,000 CAD$/QALY were assumed. Deterministic and probabilistic sensitivity analyses were conducted. The model was validated against randomized trials and historical cohorts. Three strategies dominated the others: endoscopy-guided full step-up therapy (14.80 QALYs, CAD$ 462,180), thiopurines immediately post-surgery plus endoscopy-guided biological step-up therapy (14.89 QALYs, CAD$ 464,099) and combination therapy immediately post-surgery (14.94 QALYs, CAD$ 483,685). The second strategy was the most cost-effective, assuming a cost-effectiveness threshold of 50,000 CAD$/QALY. Probabilistic sensitivity analysis showed that the second strategy has the highest probability of being the optimal alternative in all comparisons at cost-effectiveness thresholds from 30,000 to 100,000 CAD$/QALY. The strategies guided only by clinical recurrence and those using biologics alone were dominated. According to this decision analysis, thiopurines immediately after surgery and addition of biologics guided by endoscopic recurrence is the optimal strategy of postoperative management in patients with Crohn's disease with high risk of recurrence (see Video Abstract, Supplemental Digital Content 1, http://links.lww.com/IBD/B654).
Zhang, Li E; Huang, Daizheng; Yang, Jie; Wei, Xiao; Qin, Jian; Ou, Songfeng; Zhang, Zhiyong; Zou, Yunfeng
2017-03-01
Studies have yet to evaluate the effects of water improvement on fluoride concentrations in drinking water and the corresponding health risks to Chinese residents in endemic fluorosis areas (EFAs) at a national level. This paper summarized available data in the published literature (2008-2016) on water fluoride from the EFAs in China before and after water quality was improved. Based on these data, a health risk assessment of Chinese residents' exposure to fluoride in improved drinking water was performed by means of a probabilistic approach. The uncertainties in the risk estimates were quantified using Monte Carlo simulation and sensitivity analysis. Our results showed that, in general, the average fluoride levels (0.10-2.24 mg/L) in the improved drinking water in the EFAs of China were lower than the pre-intervention levels (0.30-15.24 mg/L). The highest fluoride levels were detected in North and Southwest China. The mean non-carcinogenic risks associated with consumption of the improved drinking water for Chinese residents were mostly acceptable (hazard quotient < 1), but the non-carcinogenic risk for children in most of the EFAs at the 95th percentile exceeded the safe level of 1, indicating potential non-cancer health effects in this fluoride-exposed population. Sensitivity analyses indicated that the fluoride concentration in drinking water, the ingestion rate of water, and the exposure time in the shower were the most relevant variables in the model; therefore, efforts should focus mainly on the definition of their probability distributions for a more accurate risk assessment. Copyright © 2016 Elsevier Ltd. All rights reserved.
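A minimal sketch of the probabilistic risk-assessment pattern described above: Monte Carlo sampling of exposure inputs, a hazard-quotient computation, and a Spearman rank-correlation sensitivity analysis. All distributions and the reference dose are illustrative assumptions, not the paper's fitted inputs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
N = 50_000

# Assumed input distributions for a standard non-carcinogenic risk model.
C   = rng.lognormal(np.log(0.8), 0.5, N)    # fluoride in water, mg/L
IR  = rng.lognormal(np.log(1.5), 0.3, N)    # water ingestion rate, L/day
BW  = rng.normal(55.0, 10.0, N)             # body weight, kg
RfD = 0.06                                  # reference dose, mg/kg-day (assumed)

HQ = C * IR / (BW * RfD)                    # hazard quotient
print(f"mean HQ = {HQ.mean():.2f}, P95 = {np.percentile(HQ, 95):.2f}")

# Sensitivity: Spearman rank correlation of each input with the output HQ.
for name, x in [("C", C), ("IR", IR), ("BW", BW)]:
    rho = stats.spearmanr(x, HQ).correlation
    print(f"rho({name}, HQ) = {rho:+.2f}")
```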
Chidi, Alexis P.; Bryce, Cindy L.; Donohue, Julie; Fine, Michael J.; Landsittel, Doug; Myaskovsky, Larissa; Rogal, Shari; Switzer, Galen; Tsung, Allan; Smith, Kenneth
2016-01-01
INTRODUCTION Interferon-free hepatitis C treatment regimens are effective but very costly. The cost-effectiveness, budget, and public health impacts of current Medicaid treatment policies restricting treatment to patients with advanced disease remain unknown. METHODS Using a Markov model, we compared two strategies for 45–55 year-old Medicaid beneficiaries: (1) Current Practice – only advanced disease is treated before Medicare eligibility; and (2) Full Access – both early-stage and advanced disease are treated before Medicare eligibility. Patients could develop progressive fibrosis, cirrhosis, or hepatocellular carcinoma, undergo transplantation, or die each year. Morbidity was reduced after successful treatment. We calculated the incremental cost-effectiveness ratio and compared the costs and public health effects of each strategy from the perspective of Medicare alone as well as from the Centers for Medicare and Medicaid Services (CMS) perspective. We varied model inputs in one-way and probabilistic sensitivity analyses. RESULTS Full Access was less costly and more effective than Current Practice for all cohorts and perspectives, with differences in cost from $5,369 to $11,960 and in effectiveness from 0.82 to 3.01 quality-adjusted life-years. In a probabilistic sensitivity analysis, Full Access was cost saving in 93% of model iterations. Compared to Current Practice, Full Access averted 5,994 hepatocellular carcinoma cases and 121 liver transplants per 100,000 patients. CONCLUSIONS Current Medicaid policies restricting hepatitis C treatment to patients with advanced disease are more costly and less effective than unrestricted, full access strategies. Collaboration between state and federal payers may be needed to realize the full public health impact of recent innovations in hepatitis C treatment. PMID:27325324
Pan, Yuesong; Wang, Anxin; Liu, Gaifen; Zhao, Xingquan; Meng, Xia; Zhao, Kun; Liu, Liping; Wang, Chunxue; Johnston, S Claiborne; Wang, Yilong; Wang, Yongjun
2014-06-05
Treatment with the combination of clopidogrel and aspirin taken soon after a transient ischemic attack (TIA) or minor stroke was shown to reduce the 90-day risk of stroke in a large trial in China, but the cost-effectiveness is unknown. This study sought to estimate the cost-effectiveness of the clopidogrel-aspirin regimen for acute TIA or minor stroke. A Markov model was created to determine the cost-effectiveness of treating acute TIA or minor stroke patients with clopidogrel-aspirin compared with aspirin alone. Inputs for the model were obtained from clinical trial data, claims databases, and the published literature. The main outcome measure was cost per quality-adjusted life-year (QALY) gained. One-way and multivariable probabilistic sensitivity analyses were performed to test the robustness of the findings. Compared with aspirin alone, clopidogrel-aspirin resulted in a lifetime gain of 0.037 QALYs at an additional cost of CNY 1250 (US$ 192), yielding an incremental cost-effectiveness ratio of CNY 33 800 (US$ 5200) per QALY gained. Probabilistic sensitivity analysis showed that clopidogrel-aspirin therapy was more cost-effective in 95.7% of the simulations at a willingness-to-pay threshold, recommended by the World Health Organization, of CNY 105 000 (US$ 16 200) per QALY. An early 90-day clopidogrel-aspirin regimen for acute TIA or minor stroke is highly cost-effective in China. Although clopidogrel is available generically, Plavix is a branded product in China; if Plavix were generic, treatment with clopidogrel-aspirin would have been cost saving. © 2014 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Kim, Dong-Jin; Kim, Ho-Sook; Oh, Minkyung; Kim, Eun-Young; Shin, Jae-Gook
2017-10-01
Although studies assessing the cost effectiveness of genotype-guided warfarin dosing for the management of atrial fibrillation, deep vein thrombosis, and pulmonary embolism have been reported, no publications have addressed genotype-guided warfarin therapy in mechanical heart valve replacement (MHVR) patients or genotype-guided warfarin therapy under the fee-for-service (FFS) insurance system. The aim of this study was to evaluate the cost effectiveness of genotype-guided warfarin dosing in patients with MHVR under the FFS system from the Korea healthcare sector perspective. A decision-analytic Markov model was developed to evaluate the cost effectiveness of genotype-guided warfarin dosing compared with standard dosing. Estimates of clinical adverse event rates and health state utilities were derived from the published literature. The outcome measure was the incremental cost-effectiveness ratio (ICER) per quality-adjusted life-year (QALY). One-way and probabilistic sensitivity analyses were performed to explore the range of plausible results. In a base-case analysis, genotype-guided warfarin dosing was associated with marginally higher QALYs than standard warfarin dosing (6.088 vs. 6.083, respectively), at a slightly higher cost (US$6.8) (year 2016 values). The ICER was US$1356.2 per QALY gained. In probabilistic sensitivity analysis, there was an 82.7% probability that genotype-guided dosing was dominant compared with standard dosing, and a 99.8% probability that it was cost effective at a willingness-to-pay threshold of US$50,000 per QALY gained. Compared with only standard warfarin therapy, genotype-guided warfarin dosing was cost effective in MHVR patients under the FFS insurance system.
Cost-effectiveness of thrombolysis within 4.5 hours of acute ischemic stroke in China.
Pan, Yuesong; Chen, Qidong; Zhao, Xingquan; Liao, Xiaoling; Wang, Chunjuan; Du, Wanliang; Liu, Gaifen; Liu, Liping; Wang, Chunxue; Wang, Yilong; Wang, Yongjun
2014-01-01
Previous economic studies conducted in developed countries showed intravenous tissue-type plasminogen activator (tPA) is cost-effective for acute ischemic stroke. The present study aimed to determine the cost-effectiveness of tPA treatment in China, the largest developing country. A combination of decision tree and Markov model was developed to determine the cost-effectiveness of tPA treatment versus non-tPA treatment within 4.5 hours after stroke onset. Outcomes and costs data were derived from the database of the Thrombolysis Implementation and Monitor of acute ischemic Stroke in China (TIMS-China) study. Efficacy data were derived from a pooled analysis of the ECASS, ATLANTIS, NINDS, and EPITHET trials. Costs and quality-adjusted life-years (QALYs) were compared in both the short term (2 years) and the long term (30 years). One-way and probabilistic sensitivity analyses were performed to test the robustness of the results. Compared with non-tPA treatment, tPA treatment within 4.5 hours led to a short-term gain of 0.101 QALYs at an additional cost of CNY 9,520 (US$ 1,460), yielding an incremental cost-effectiveness ratio (ICER) of CNY 94,300 (US$ 14,500) per QALY gained over 2 years; and to a long-term gain of 0.422 QALYs at an additional cost of CNY 6,530 (US$ 1,000), yielding an ICER of CNY 15,500 (US$ 2,380) per QALY gained over 30 years. Probabilistic sensitivity analysis showed that tPA treatment is cost-effective in 98.7% of the simulations at a willingness-to-pay threshold of CNY 105,000 (US$ 16,200) per QALY. Intravenous tPA treatment within 4.5 hours is highly cost-effective for acute ischemic stroke in China.
Groundwater Remediation using Bayesian Information-Gap Decision Theory
NASA Astrophysics Data System (ADS)
O'Malley, D.; Vesselinov, V. V.
2016-12-01
Probabilistic analyses of groundwater remediation scenarios frequently fail because the probability of an adverse, unanticipated event occurring is often high. In general, models of flow and transport in contaminated aquifers are always simpler than reality. Further, when a probabilistic analysis is performed, probability distributions are usually chosen more for convenience than correctness. Bayesian Information-Gap Decision Theory (BIGDT) was designed to mitigate the shortcomings of the models and probabilistic decision analyses by leveraging a non-probabilistic decision theory: information-gap decision theory. BIGDT considers possible models that have not been explicitly enumerated and does not require us to commit to a particular probability distribution for model and remediation-design parameters. Both the set of possible models and the set of possible probability distributions grow as the degree of uncertainty increases. The fundamental question that BIGDT asks is "How large can these sets be before a particular decision results in an undesirable outcome?". The decision that allows these sets to be the largest is considered to be the best option. In this way, BIGDT enables robust decision support for groundwater remediation problems. Here we apply BIGDT to a representative groundwater remediation scenario where different options for hydraulic containment and pump & treat are being considered. BIGDT requires many model runs, and for complex models high-performance computing resources are needed. These analyses are carried out on synthetic problems but are applicable to real-world problems such as LANL site contaminations. BIGDT is implemented in Julia (a high-level, high-performance dynamic programming language for technical computing) and is part of the MADS framework (http://mads.lanl.gov/ and https://github.com/madsjulia/Mads.jl).
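The core info-gap question, how large the horizon of uncertainty can grow before a design fails its performance requirement, can be sketched in a few lines. The fractional-error uncertainty model, the candidate designs, and the limit below are invented for illustration (BIGDT itself is implemented in Julia within MADS).

```python
import numpy as np

# A remediation design is judged by a worst-case contaminant concentration
# that must stay below a regulatory limit. c0 is the model's best estimate;
# the horizon of uncertainty h lets the true value wander (assumed model).
def worst_case(c0, h):
    return c0 * (1 + h)   # fractional-error info-gap uncertainty model

def robustness(c0, limit, h_grid):
    # Largest h for which even the worst case still meets the requirement.
    ok = [h for h in h_grid if worst_case(c0, h) <= limit]
    return max(ok) if ok else 0.0

h_grid = np.linspace(0.0, 5.0, 501)
designs = {"pump&treat A": 6.0, "containment B": 4.0}   # predicted c0, ug/L
limit = 10.0                                            # regulatory limit
for name, c0 in designs.items():
    print(f"{name}: robustness h* = {robustness(c0, limit, h_grid):.2f}")
# The design tolerating the larger h* (containment B here) is preferred,
# i.e. the decision that stays acceptable under the widest uncertainty.
```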
Kamboj, Sunita; Yu, Charley; Johnson, Robert
2013-05-01
The Derived Concentration Guideline Levels for two building areas previously used in waste processing and storage at Argonne National Laboratory were developed using both probabilistic and deterministic radiological environmental pathway analysis. Four scenarios were considered. The two current uses considered were on-site industrial use and off-site residential use with farming. The two future uses (i.e., after an institutional control period of 100 y) were on-site recreational use and on-site residential use with farming. The RESRAD-OFFSITE code was used for the current-use off-site residential/farming scenario and RESRAD (onsite) was used for the other three scenarios. Contaminants of concern were identified from the past operations conducted in the buildings and the actual characterization done at the site. Derived Concentration Guideline Levels were developed for all four scenarios using deterministic and probabilistic approaches, which include both "peak-of-the-means" and "mean-of-the-peaks" analyses. The future-use on-site residential/farming scenario resulted in the most restrictive Derived Concentration Guideline Levels for most radionuclides.
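The distinction between "peak-of-the-means" and "mean-of-the-peaks" is easy to miss. The sketch below computes both from a synthetic ensemble of dose-versus-time curves (the curve shape and distributions are assumptions, not RESRAD output) and illustrates that the mean-of-the-peaks is always at least as large.

```python
import numpy as np

rng = np.random.default_rng(9)
n_real, n_t = 500, 200
t = np.linspace(0, 1000, n_t)                      # years

# Ensemble of dose-vs-time curves from sampled pathway parameters (toy shape).
peak_time = rng.lognormal(np.log(300), 0.4, n_real)
amp = rng.lognormal(np.log(1.0), 0.5, n_real)
dose = amp[:, None] * np.exp(-0.5 * ((t - peak_time[:, None]) / 80.0) ** 2)

peak_of_means = dose.mean(axis=0).max()   # peak of the ensemble-mean curve
mean_of_peaks = dose.max(axis=1).mean()   # mean of each realization's own peak
print(f"peak-of-the-means = {peak_of_means:.2f}, "
      f"mean-of-the-peaks = {mean_of_peaks:.2f}")
# mean-of-the-peaks >= peak-of-the-means always holds, because the maximum
# of an average can never exceed the average of the per-curve maxima.
```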
Design and analysis of DNA strand displacement devices using probabilistic model checking
Lakin, Matthew R.; Parker, David; Cardelli, Luca; Kwiatkowska, Marta; Phillips, Andrew
2012-01-01
Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker prism, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs. PMID:22219398
van Boven, Job FM; Kocks, Janwillem WH; Postma, Maarten J
2016-01-01
Purpose The fixed-dose dual bronchodilator combination (FDC) of tiotropium and olodaterol showed increased effectiveness regarding lung function and health-related quality of life in patients with chronic obstructive pulmonary disease (COPD) compared with the use of its mono-components. Yet, while effectiveness and safety have been shown, the health economic implication of this treatment is still unknown. The aim of this study was to assess the cost–utility and budget impact of tiotropium–olodaterol FDC in patients with moderate to very severe COPD in the Netherlands. Patients and methods A cost–utility study was performed, using an individual-level Markov model. To populate the model, individual patient-level data (age, height, sex, COPD duration, baseline forced expiratory volume in 1 second) were obtained from the tiotropium–olodaterol TOnado trial. In the model, forced expiratory volume in 1 second and patient-level data were extrapolated to utility and survival, and treatment with tiotropium–olodaterol FDC was compared with tiotropium. Cost–utility analysis was performed from the Dutch health care payer’s perspective using a 15-year time horizon in the base-case analysis. The standard Dutch discount rates were applied (costs: 4.0%; effects: 1.5%). Both univariate and probabilistic sensitivity analyses were performed. Budget impact was annually assessed over a 5-year time horizon, taking into account different levels of medication adherence. Results As a result of cost increases, combined with quality-adjusted life-year (QALY) gains, results showed that tiotropium–olodaterol FDC had an incremental cost-effectiveness ratio of €7,004/QALY. Without discounting, the incremental cost-effectiveness ratio was €5,981/QALY. Results were robust in univariate and probabilistic sensitivity analyses. Budget impact was estimated at €4.3 million over 5 years assuming 100% medication adherence. Scenarios with 40%, 60%, and 80% adherence resulted in lower 5-year incremental cost increases of €1.7, €2.6, and €3.4 million, respectively. Conclusion Tiotropium–olodaterol FDC can be considered a cost-effective treatment under current Dutch cost-effectiveness thresholds. PMID:27703341
Saito, Shota; Muneoka, Yusuke; Ishikawa, Takashi; Akazawa, Kouhei
2017-12-01
The combination of paclitaxel + ramucirumab is a standard second-line treatment in patients with advanced gastric cancer. This therapy has been associated with increased median overall survival and progression-free survival compared with paclitaxel monotherapy. We evaluated the cost-effectiveness of paclitaxel + ramucirumab combination therapy in patients with advanced gastric cancer from the perspective of health care payers in Japan. We constructed a Markov model to compare, over a time horizon of 3 years, the costs and effectiveness of paclitaxel + ramucirumab combination therapy and paclitaxel alone as second-line therapies for advanced gastric cancer in Japan. Health outcomes were measured in life-years (LYs) and quality-adjusted life-years (QALYs) gained. Costs were calculated using year-2016 Japanese yen (US $1 = ¥117.79) according to the social insurance reimbursement schedule and drug tariff of the fee-for-service system in Japan. Model robustness was addressed through one-way and probabilistic sensitivity analyses. Costs and QALYs were discounted at a rate of 2% per year. The willingness-to-pay threshold was set at the World Health Organization's criterion of ¥12 million, because no consensus exists regarding the threshold for acceptable cost-per-QALY ratios in Japan's health policy. Paclitaxel + ramucirumab combination therapy was estimated to provide an additional 0.09 QALYs (0.10 LYs) at a cost of ¥3,870,077, resulting in an incremental cost-effectiveness ratio of ¥43,010,248/QALY. The incremental cost-effectiveness ratio for the combination therapy was >¥12 million/QALY in all of the one-way and probabilistic sensitivity analyses. Adding ramucirumab to a regimen of paclitaxel in the second-line treatment of advanced gastric cancer is expected to provide a minimal incremental benefit at a high incremental cost per QALY. Based on our findings, adjustments in the price of ramucirumab, as well as improvements in other clinical parameters such as survival time and adverse events in advanced gastric cancer therapy, are needed. Copyright © 2017 Elsevier HS Journals, Inc. All rights reserved.
Influences of geological parameters to probabilistic assessment of slope stability of embankment
NASA Astrophysics Data System (ADS)
Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr
2018-04-01
This article considers the influence of geological parameters on the slope stability of an embankment in probabilistic analysis using the SLOPE/W computational system. The stability of a simple slope is evaluated with and without pore-water pressure on the basis of variation in soil properties. Normal distributions of unit weight, cohesion, and internal friction angle are assumed. The Monte Carlo simulation technique is employed to analyze the critical slip surface. Sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the safety factor of the slope.
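The abstract does not give its slope geometry, so the sketch below substitutes a textbook infinite-slope factor of safety for the SLOPE/W critical-slip-surface search; the distributions mirror the stated assumption of normally distributed unit weight, cohesion, and friction angle, with all numeric values invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
gamma = rng.normal(18.0, 1.0, n)              # unit weight, kN/m^3
c = rng.normal(10.0, 2.0, n)                  # cohesion, kPa
phi = np.radians(rng.normal(30.0, 3.0, n))    # internal friction angle

beta = np.radians(25.0)   # slope angle
h = 5.0                   # slip-plane depth, m
u = 0.0                   # pore-water pressure, kPa (dry case)

tau = gamma * h * np.sin(beta) * np.cos(beta)    # driving shear stress
sigma_n = gamma * h * np.cos(beta) ** 2          # normal stress on slip plane
fs = (c + (sigma_n - u) * np.tan(phi)) / tau     # factor of safety

print(f"mean FS = {fs.mean():.2f}, P(FS < 1) = {(fs < 1.0).mean():.4f}")
```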
Evidence synthesis for decision making 7: a reviewer's checklist.
Ades, A E; Caldwell, Deborah M; Reken, Stefanie; Welton, Nicky J; Sutton, Alex J; Dias, Sofia
2013-07-01
This checklist is for the review of evidence syntheses for treatment efficacy used in decision making based on either efficacy or cost-effectiveness. It is intended to be used for pairwise meta-analysis, indirect comparisons, and network meta-analysis, without distinction. It does not generate a quality rating and is not prescriptive. Instead, it focuses on a series of questions aimed at revealing the assumptions that the authors of the synthesis are expecting readers to accept, the adequacy of the arguments authors advance in support of their position, and the need for further analyses or sensitivity analyses. The checklist is intended primarily for those who review evidence syntheses, including indirect comparisons and network meta-analyses, in the context of decision making but will also be of value to those submitting syntheses for review, whether to decision-making bodies or journals. The checklist has 4 main headings: A) definition of the decision problem, B) methods of analysis and presentation of results, C) issues specific to network synthesis, and D) embedding the synthesis in a probabilistic cost-effectiveness model. The headings and implicit advice follow directly from the other tutorials in this series. A simple table is provided that could serve as a pro forma checklist.
Nshimyumukiza, Léon; Douville, Xavier; Fournier, Diane; Duplantie, Julie; Daher, Rana K; Charlebois, Isabelle; Longtin, Jean; Papenburg, Jesse; Guay, Maryse; Boissinot, Maurice; Bergeron, Michel G; Boudreau, Denis; Gagné, Christian; Rousseau, François; Reinharz, Daniel
2016-03-01
A point-of-care rapid test (POCRT) may help early and targeted use of antiviral drugs for the management of influenza A infection. (i) To determine whether antiviral treatment based on a POCRT for influenza A is cost-effective and (ii) to determine the thresholds of key test parameters (sensitivity, specificity, and cost) at which a POCRT-based strategy appears to be cost-effective. A hybrid "susceptible, infected, recovered" (SIR) compartmental transmission and Markov decision analytic model was used to simulate the cost-effectiveness of antiviral treatment based on a POCRT for influenza A from a societal perspective. Input parameters were retrieved from peer-reviewed published studies and government databases. The outcome considered was the incremental cost per life-year saved for one seasonal influenza season. In the base-case analysis, antiviral treatment based on the POCRT saves 2 lives/100,000 person-years and costs $7600 less than empirical antiviral treatment based on clinical judgment alone, which demonstrates that the POCRT-based strategy is dominant. In one- and two-way sensitivity analyses, results were sensitive to POCRT accuracy and cost, to vaccination coverage, and to the prevalence of influenza A. In probabilistic sensitivity analyses, the POCRT strategy is cost-effective in 66% of cases at a commonly accepted threshold of $50,000 per life-year saved. Influenza antiviral treatment based on a POCRT could be cost-effective under specific conditions of performance, price, and disease prevalence. © 2015 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.
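A minimal sketch of the SIR transmission component the hybrid model describes; the transmission rate, recovery rate, and population size are placeholder values, and the Markov decision layer of the full model is omitted.

```python
def sir(beta, gamma, s0, i0, days, dt=0.1):
    """Forward-Euler integration of the classic SIR equations."""
    s, i, r = s0, i0, 0.0
    n = s0 + i0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# R0 = beta/gamma = 2 here; all values are illustrative assumptions
s, i, r = sir(beta=0.4, gamma=0.2, s0=99_990.0, i0=10.0, days=150)
print(f"seasonal attack rate: {r / 100_000:.1%}")
```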
Précis of bayesian rationality: The probabilistic approach to human reasoning.
Oaksford, Mike; Chater, Nick
2009-02-01
According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.
Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Williams, Paul
This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the decision-making process relating to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
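A toy Monte Carlo sketch of crack initiation over a flaw population, standing in for the Grizzly/RAVEN workflow described above; the flaw-depth distribution, transient stress, geometry factor, and embrittled toughness values are all assumptions, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_flaws = 5_000, 500

a = rng.exponential(3e-3, (n_trials, n_flaws))       # flaw depth, m
sigma = rng.normal(150e6, 15e6, (n_trials, 1))       # transient stress, Pa
k_i = 1.1 * sigma * np.sqrt(np.pi * a)               # mode-I stress intensity
k_ic = rng.normal(50e6, 5e6, (n_trials, n_flaws))    # embrittled toughness

any_init = (k_i > k_ic).any(axis=1)                  # does any flaw initiate?
print(f"P(crack initiation during transient) ≈ {any_init.mean():.3f}")
```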
Finley, B; Paustenbach, D
1994-02-01
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
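A sketch of the point-estimate versus Monte Carlo comparison the paper discusses, using the standard intake equation (dose = concentration × ingestion rate / body weight); the distributions and upper-bound point values below are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
conc = rng.lognormal(np.log(5.0), 0.5, n)     # tapwater concentration, mg/L
intake = rng.lognormal(np.log(1.4), 0.3, n)   # ingestion rate, L/day
bw = rng.normal(70.0, 12.0, n)                # body weight, kg

dose_mc = conc * intake / bw                  # mg/kg-day, full distribution

# Deterministic "point" estimate built from compounded upper-bound inputs
dose_point = 10.0 * 2.0 / 60.0

pctl = (dose_mc < dose_point).mean() * 100.0
print(f"point estimate falls at the {pctl:.1f}th percentile of the MC doses")
```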
Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina
2014-01-01
This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk--that is, in contexts in which information on the choice options (gains, losses, and probabilities) were explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability that acted as a mediator. Specifically, fluid intelligence may enhance ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by incrementing the mathematical prerequisites necessary to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.
NASA Astrophysics Data System (ADS)
Warner, Thomas T.; Sheu, Rong-Shyang; Bowers, James F.; Sykes, R. Ian; Dodd, Gregory C.; Henn, Douglas S.
2002-05-01
Ensemble simulations made using a coupled atmospheric dynamic model and a probabilistic Lagrangian puff dispersion model were employed in a forensic analysis of the transport and dispersion of a toxic gas that may have been released near Al Muthanna, Iraq, during the Gulf War. The ensemble study had two objectives, the first of which was to determine the sensitivity of the calculated dosage fields to the choices that must be made about the configuration of the atmospheric dynamic model. In this test, various choices were used for model physics representations and for the large-scale analyses that were used to construct the model initial and boundary conditions. The second study objective was to examine the dispersion model's ability to use ensemble inputs to predict dosage probability distributions. Here, the dispersion model was used with the ensemble mean fields from the individual atmospheric dynamic model runs, including the variability in the individual wind fields, to generate dosage probabilities. These are compared with the explicit dosage probabilities derived from the individual runs of the coupled modeling system. The results demonstrate that the specific choices made about the dynamic-model configuration and the large-scale analyses can have a large impact on the simulated dosages. For example, the area near the source that is exposed to a selected dosage threshold varies by up to a factor of 4 among members of the ensemble. The agreement between the explicit and ensemble dosage probabilities is relatively good for both low and high dosage levels. Although only one ensemble was considered in this study, the encouraging results suggest that a probabilistic dispersion model may be of value in quantifying the effects of uncertainties in a dynamic-model ensemble on dispersion model predictions of atmospheric transport and dispersion.
Mulla, Mubashir; Schulte, Klaus-Martin
2012-01-01
Cervical lymph nodes (CLNs) are the most common site of metastases in papillary thyroid cancer (PTC). Ultrasound scan (US) is the most commonly used imaging modality in the evaluation of CLNs in PTC. Computerised tomography (CT) and 18fluorodeoxyglucose positron emission tomography (18FDG PET–CT) are used less commonly. It is widely believed that the above imaging techniques should guide the surgical approach to the patient with PTC. Methods We performed a systematic review of imaging studies from the literature assessing the usefulness for the detection of metastatic CLNs in PTC. We evaluated the authors' interpretation of their numeric findings specifically with regard to ‘sensitivity’ and ‘negative predictive value’ (NPV) by comparing their use against standard definitions of these terms in probabilistic statistics. Results A total of 16 studies used probabilistic terms to describe the value of US for the detection of LN metastases. Only 6 (37.5%) calculated sensitivity and NPV correctly. For CT, out of the eight studies, only 1 (12.5%) used correct terms to describe analytical results. One study looked at magnetic resonance imaging, while three assessed 18FDG PET–CT, none of which provided correct calculations for sensitivity and NPV. Conclusion Imaging provides high specificity for the detection of cervical metastases of PTC. However, sensitivity and NPV are low. The majority of studies reporting on a high sensitivity have not used key terms according to standard definitions of probabilistic statistics. Against common opinion, there is no current evidence that failure to find LN metastases on ultrasound or cross-sectional imaging can be used to guide surgical decision making. PMID:23781308
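The standard probabilistic definitions the review checks reported values against, as a short sketch; the confusion-matrix counts below are made up for illustration.

```python
def diagnostics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # P(test+ | metastasis present)
        "specificity": tn / (tn + fp),   # P(test- | metastasis absent)
        "ppv": tp / (tp + fp),           # P(metastasis | test+)
        "npv": tn / (tn + fn),           # P(no metastasis | test-)
    }

# Hypothetical counts for one imaging study
print(diagnostics(tp=40, fp=5, fn=35, tn=120))
```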
Benchmarking Exercises To Validate The Updated ELLWF GoldSim Slit Trench Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, G. A.; Hiergesell, R. A.
2013-11-12
The Savannah River National Laboratory (SRNL) results of the 2008 Performance Assessment (PA) (WSRC, 2008) sensitivity/uncertainty analyses conducted for the trenches located in the E-Area Low-Level Waste Facility (ELLWF) were subject to review by the United States Department of Energy (U.S. DOE) Low-Level Waste Disposal Facility Federal Review Group (LFRG) (LFRG, 2008). LFRG comments were generally approving of the use of probabilistic modeling in GoldSim to support the quantitative sensitivity analysis. A recommendation was made, however, that the probabilistic models be revised and updated to bolster their defensibility. SRS committed to addressing those comments and, in response, contracted with Neptune and Company to rewrite the three GoldSim models. The initial portion of this work, development of Slit Trench (ST), Engineered Trench (ET) and Components-in-Grout (CIG) trench GoldSim models, has been completed. The work described in this report utilizes these revised models to test and evaluate the results against the 2008 PORFLOW model results. This was accomplished by first performing a rigorous code-to-code comparison of the PORFLOW and GoldSim codes and then performing a deterministic comparison of the two-dimensional (2D) unsaturated zone and three-dimensional (3D) saturated zone PORFLOW Slit Trench models against results from the one-dimensional (1D) GoldSim Slit Trench model. The results of the code-to-code comparison indicate that when the mechanisms of radioactive decay, partitioning of contaminants between solid and fluid, implementation of specific boundary conditions and the imposition of solubility controls were all tested using identical flow fields, GoldSim and PORFLOW produced nearly identical results. It is also noted that GoldSim has an advantage over PORFLOW in that it simulates all radionuclides simultaneously, thus avoiding a potential problem as demonstrated in the Case Study (see Section 2.6). Hence, it was concluded that the follow-on work using GoldSim to develop 1D equivalent models of the PORFLOW multi-dimensional models was justified. The comparison of GoldSim 1D equivalent models to PORFLOW multi-dimensional models was made at two locations in the model domains: at the unsaturated-saturated zone interface and at the 100-m point of compliance. PORFLOW model results from the 2008 PA were utilized to investigate the comparison. By making iterative adjustments to certain water flux terms in the GoldSim models, it was possible to produce contaminant mass fluxes and water concentrations that were highly similar to the PORFLOW model results at the two locations where comparisons were made. Based on the ability of the GoldSim 1D trench models to produce mass flux and concentration curves that are sufficiently similar to the multi-dimensional PORFLOW models for all of the evaluated radionuclides and their progeny, it is concluded that the use of the GoldSim 1D equivalent Slit and Engineered Trench models for further probabilistic sensitivity and uncertainty analysis of ELLWF trench units is justified.
A revision to the original report was undertaken to correct mislabeling on the y-axes of the compliance point concentration graphs, to modify the terminology used to define the 'blended' source-term case for the saturated zone to make it consistent with terminology used in the 2008 PA, and to make a more definitive statement regarding the justification of the use of the GoldSim 1D equivalent trench models for follow-on probabilistic sensitivity and uncertainty analysis.
GRIZZLY/FAVOR Interface Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, Terry L; Williams, Paul T; Yin, Shengjun
As part of the Light Water Reactor Sustainability (LWRS) Program, the objective of the GRIZZLY/FAVOR Interface project is to create the capability to apply GRIZZLY 3-D finite element (thermal and stress) analysis results as input to FAVOR probabilistic fracture mechanics (PFM) analyses. The key benefit FAVOR brings to Grizzly is its probabilistic capability. This document describes the implementation of the GRIZZLY/FAVOR Interface, the preliminary verification and test results, and a user guide that provides detailed step-by-step instructions to run the program.
Jain, Siddharth; Kilgore, Meredith; Edwards, Rodney K; Owen, John
2016-07-01
Preterm birth (PTB) is a significant cause of neonatal morbidity and mortality. Studies have shown that vaginal progesterone therapy for women diagnosed with shortened cervical length can reduce the risk of PTB. However, published cost-effectiveness analyses of vaginal progesterone for short cervix have not considered an appropriate range of clinically important parameters. To evaluate the cost-effectiveness of universal cervical length screening in women without a history of spontaneous PTB, assuming that all women with shortened cervical length receive progesterone to reduce the likelihood of PTB. A decision analysis model was developed to compare universal screening and no-screening strategies. The primary outcome was the cost-effectiveness ratio of both strategies, defined as the estimated patient cost per quality-adjusted life-year (QALY) realized by the children. One-way sensitivity analyses were performed by varying progesterone efficacy to prevent PTB. A probabilistic sensitivity analysis was performed to address uncertainties in model parameter estimates. In our base-case analysis, assuming that progesterone reduces the likelihood of PTB by 11%, the incremental cost-effectiveness ratio for screening was $158,000/QALY. Sensitivity analyses show that these results are highly sensitive to the presumed efficacy of progesterone to prevent PTB. In a 1-way sensitivity analysis, screening results in cost savings if progesterone can reduce PTB by 36%. Additionally, for screening to be cost-effective at a willingness-to-pay threshold of $60,000/QALY in three clinical scenarios, progesterone therapy would have to reduce PTB by 60%, 34%, and 93%, respectively. Screening is never cost-saving in the worst-case scenario or when serial ultrasounds are employed, but could be cost-saving with a two-day hospitalization only if progesterone were 64% effective. Cervical length screening and treatment with progesterone is not a dominant, cost-effective strategy unless progesterone is more effective than has been suggested by available data for US women. Until future trials demonstrate greater progesterone efficacy, and effectiveness studies confirm a benefit from screening and treatment, the cost-effectiveness of universal cervical length screening in the United States remains questionable. Copyright © 2016 Elsevier Inc. All rights reserved.
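A one-way sensitivity sketch in the spirit of the analysis above: sweep progesterone efficacy and watch the ICER cross from unattractive to cost-saving. Every parameter value here is an invented placeholder, not a value from the study.

```python
def icer_at(efficacy,
            screen_cost=50.0,      # $ per woman screened (assumed)
            short_cervix=0.01,     # prevalence of short cervix (assumed)
            ptb_risk=0.30,         # PTB risk given short cervix (assumed)
            ptb_cost=60_000.0,     # incremental cost of one PTB, $ (assumed)
            qaly_loss=0.5):        # QALYs lost per PTB (assumed)
    averted = short_cervix * ptb_risk * efficacy   # PTBs averted per woman
    d_cost = screen_cost - averted * ptb_cost
    d_qaly = averted * qaly_loss
    return d_cost / d_qaly      # negative => screening is cost-saving (dominant)

for eff in (0.11, 0.20, 0.36, 0.60):
    print(f"efficacy {eff:.0%}: ICER = ${icer_at(eff):,.0f}/QALY")
```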
Prike, Toby; Arnold, Michelle M; Williamson, Paul
2017-08-01
A growing body of research has shown people who hold anomalistic (e.g., paranormal) beliefs may differ from nonbelievers in their propensity to make probabilistic reasoning errors. The current study explored the relationship between these beliefs and performance through the development of a new measure of anomalistic belief, called the Anomalistic Belief Scale (ABS). One key feature of the ABS is that it includes a balance of both experiential and theoretical belief items. Another aim of the study was to use the ABS to investigate the relationship between belief and probabilistic reasoning errors on conjunction fallacy tasks. As expected, results showed there was a relationship between anomalistic belief and propensity to commit the conjunction fallacy. Importantly, regression analyses on the factors that make up the ABS showed that the relationship between anomalistic belief and probabilistic reasoning occurred only for beliefs about having experienced anomalistic phenomena, and not for theoretical anomalistic beliefs. Copyright © 2017 Elsevier Inc. All rights reserved.
Probabilistic Exposure Analysis for Chemical Risk Characterization
Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.
2009-01-01
This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
Campbell, Kieran R; Yau, Christopher
2017-03-15
Modeling bifurcations in single-cell transcriptomics data has become an increasingly popular field of research. Several methods have been proposed to infer bifurcation structure from such data, but all rely on heuristic non-probabilistic inference. Here we propose the first generative, fully probabilistic model for such inference, based on a Bayesian hierarchical mixture of factor analyzers. Our model exhibits competitive performance on large datasets despite implementing full Markov-Chain Monte Carlo sampling, and its unique hierarchical prior structure enables automatic determination of genes driving the bifurcation process. We additionally propose an Empirical-Bayes-like extension that deals with the high levels of zero-inflation in single-cell RNA-seq data and quantify when such models are useful. We apply our model to both real and simulated single-cell gene expression data and compare the results to existing pseudotime methods. Finally, we discuss both the merits and weaknesses of such a unified, probabilistic approach in the context of practical bioinformatics analyses.
Probabilistic Analysis of Gas Turbine Field Performance
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2002-01-01
A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
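A toy probabilistic cycle calculation in the spirit of the study above: an ideal-Brayton thermal efficiency with a random pressure ratio and a lumped component-loss factor, summarized as distribution percentiles. All values and the simplified efficiency formula are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
r = rng.normal(30.0, 1.5, n)            # overall pressure ratio
derate = rng.normal(0.97, 0.01, n)      # lumped component-loss factor
gamma = 1.4                             # specific-heat ratio of air

eta = (1.0 - r ** (-(gamma - 1.0) / gamma)) * derate   # thermal efficiency
for p in (5, 50, 95):
    print(f"{p:>2}th percentile efficiency: {np.percentile(eta, p):.3f}")
```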
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Hermanto, Didik
2017-08-01
This is a developmental research study of probabilistic thinking-oriented learning tools for probability material at the ninth-grade level, aimed at producing good probabilistic thinking-oriented learning tools. The subjects were IX-A students of MTs Model Bangkalan. The stages of this development research followed the 4-D development model, modified into define, design, and develop. The teaching and learning tools consist of a lesson plan, students' worksheets, teaching media, and a students' achievement test. The research instruments were a learning-tools validation sheet, a teachers' activities sheet, a students' activities sheet, a students' response questionnaire, and a students' achievement test. The results from those instruments were analyzed descriptively to answer the research objectives. The result was a valid set of teaching and learning tools oriented to probabilistic thinking about probability for ninth-grade students. After the tools were revised based on validation and tried out in class, teachers' ability in managing the class was effective, students' activities were good, students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity, and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.
Arcella, D; Soggiu, M E; Leclercq, C
2003-10-01
For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers have become more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers with the use of the software, and to evaluate the impact of the inclusion/exclusion of indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the concentrations of intense sweeteners in their sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
NASA Technical Reports Server (NTRS)
Bast, Callie Corinne Scheidt
1994-01-01
This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
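A heavily hedged sketch of a randomized multiplicative multifactor strength model of the general form the thesis describes: each effect contributes a ratio term raised to an empirical exponent, and randomizing the exponents yields a lifetime-strength distribution. All numeric values are illustrative, not the calibrated Inconel 718 constants, and the fatigue variable is taken in log10 cycles by assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000

def term(x, x0, xu, a):
    # one multiplicative effect term: unity at the reference x0, zero at xu
    return ((xu - x) / (xu - x0)) ** a

temp = term(x=900.0, x0=20.0, xu=1300.0, a=rng.normal(0.50, 0.05, n))   # deg C
fatigue = term(x=5.0, x0=0.0, xu=7.0, a=rng.normal(0.25, 0.03, n))      # log10 cycles
ratio = temp * fatigue   # S/S0, lifetime strength as a fraction of reference

print(f"median S/S0 = {np.median(ratio):.3f}, "
      f"5th pct = {np.percentile(ratio, 5):.3f}")
```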
NASA Astrophysics Data System (ADS)
Knowlton, R. G.; Arnold, B. W.; Mattie, P. D.; Kuo, M.; Tien, N.
2006-12-01
For several years now, Taiwan has been engaged in a process to select a low-level radioactive waste (LLW) disposal site. Taiwan is generating LLW from operational and decommissioning wastes associated with nuclear power reactors, as well as research, industrial, and medical radioactive wastes. The preliminary selection process has narrowed the search to four potential candidate sites. These sites are to be evaluated in a performance assessment analysis to determine the likelihood of meeting the regulatory criteria for disposal. Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research have been working together to develop the necessary performance assessment methodology and associated computer models to perform these analyses. The methodology utilizes both deterministic (e.g., single run) and probabilistic (e.g., multiple statistical realizations) analyses to achieve the goals. The probabilistic approach provides a means of quantitatively evaluating uncertainty in the model predictions and a more robust basis for performing sensitivity analyses to better understand what is driving the dose predictions from the models. Two types of disposal configurations are under consideration: a shallow land burial concept and a cavern disposal concept. The shallow land burial option includes a protective cover to limit infiltration potential to the waste. Both conceptual designs call for the disposal of 55-gallon waste drums within concrete-lined trenches or tunnels, backfilled with grout. Waste emplaced in the drums may be solidified. Both types of sites are underlain by or placed within saturated fractured bedrock material. These factors have influenced the conceptual model development of each site, as well as the selection of the models to employ for the performance assessment analyses. Several existing codes were integrated in order to facilitate a comprehensive performance assessment methodology to evaluate the potential disposal sites. First, a need existed to simulate the failure processes of the waste containers, with subsequent leaching of the waste form to the underlying host rock. The Breach, Leach, and Transport Multiple Species (BLT-MS) code was selected to meet these needs. BLT-MS also has a 2-D finite-element advective-dispersive transport module, with radionuclide in-growth and decay. BLT-MS does not solve the groundwater flow equation, but instead requires the input of Darcy flow velocity terms. These terms were abstracted from a groundwater flow model using the FEHM code. For the shallow land burial site, the HELP code was also used to evaluate the performance of the protective cover. The GoldSim code was used for two purposes: quantifying uncertainties in the predictions, and providing a platform to evaluate an alternative conceptual model involving matrix-diffusion transport. Results of the preliminary performance assessment analyses using examples to illustrate the computational framework will be presented. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
Comparing five alternative methods of breast reconstruction surgery: a cost-effectiveness analysis.
Grover, Ritwik; Padula, William V; Van Vliet, Michael; Ridgway, Emily B
2013-11-01
The purpose of this study was to assess the cost-effectiveness of five standardized procedures for breast reconstruction to delineate the best reconstructive approach in postmastectomy patients in the settings of nonirradiated and irradiated chest walls. A decision tree was used to model five breast reconstruction procedures from the provider perspective to evaluate cost-effectiveness. Procedures included autologous flaps with pedicled tissue, autologous flaps with free tissue, latissimus dorsi flaps with breast implants, expanders with implant exchange, and immediate implant placement. All methods were compared with a "do-nothing" alternative. Data for model parameters were collected through a systematic review, and patient health utilities were calculated from an ad hoc survey of reconstructive surgeons. Results were measured in cost (2011 U.S. dollars) per quality-adjusted life-year. Univariate sensitivity analyses and Bayesian multivariate probabilistic sensitivity analysis were conducted. Pedicled autologous tissue and free autologous tissue reconstruction were cost-effective compared with the do-nothing alternative. Pedicled autologous tissue was the slightly more cost-effective of the two. The other procedures were not found to be cost-effective. The results were robust to a number of sensitivity analyses, although the margin between pedicled and free autologous tissue reconstruction is small and affected by some parameter values. Autologous pedicled tissue was slightly more cost-effective than free tissue reconstruction in irradiated and nonirradiated patients. Implant-based techniques were not cost-effective. This is in agreement with the growing trend at academic institutions to encourage autologous tissue reconstruction because of its natural recreation of the breast contour, suppleness, and resiliency in the setting of irradiated recipient beds.
Kim, H; Rajagopalan, M S; Beriwal, S; Smith, K J
2017-10-01
Stereotactic radiosurgery (SRS) alone or upfront whole brain radiation therapy (WBRT) plus SRS are the most commonly used treatment options for one to three brain oligometastases. The most recent randomised clinical trial comparing SRS alone with upfront WBRT plus SRS (NCCTG N0574) favoured SRS alone for neurocognitive function, whereas treatment options remain controversial in terms of cognitive decline and local control. The aim of this study was to conduct a cost-effectiveness analysis of these two competing treatments. A Markov model was constructed for patients treated with SRS alone or SRS plus upfront WBRT, based largely on randomised clinical trial data. Costs were based on 2016 Medicare reimbursement. Strategies were compared using the incremental cost-effectiveness ratio (ICER), and effectiveness was measured in quality-adjusted life years (QALYs). One-way and probabilistic sensitivity analyses were carried out. Strategies were evaluated from the healthcare payer's perspective with a willingness-to-pay threshold of $100 000 per QALY gained. In the base case analysis, the median survival was 9 months for both arms. SRS alone resulted in an ICER of $9917 per QALY gained. In one-way sensitivity analyses, results were most sensitive to variation in cognitive decline rates for both groups and median survival rates, but SRS alone remained cost-effective for most parameter ranges. Based on the current available evidence, SRS alone was found to be cost-effective for patients with one to three brain metastases compared with upfront WBRT plus SRS. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
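A probabilistic sensitivity analysis sketch of the kind run above, using the common net-monetary-benefit formulation: incremental QALYs drawn from a beta distribution and incremental costs from a gamma distribution, with a cost-effectiveness acceptability probability at each threshold. The distribution parameters are assumed, chosen only so the mean ICER lands near the reported $9917/QALY.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
d_qaly = rng.beta(20, 180, n)          # incremental QALYs, mean ~0.10
d_cost = rng.gamma(4.0, 250.0, n)      # incremental cost, mean ~$1,000

for wtp in (10_000.0, 100_000.0):
    nmb = wtp * d_qaly - d_cost        # incremental net monetary benefit
    print(f"P(cost-effective at ${wtp:,.0f}/QALY) = {(nmb > 0).mean():.2f}")
print(f"mean ICER ≈ ${d_cost.mean() / d_qaly.mean():,.0f}/QALY")
```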
Zhan, Mei; Zheng, Hanrui; Xu, Ting; Yang, Yu; Li, Qiu
2017-08-01
Malignant pleural mesothelioma (MPM) is a rare malignancy, and pemetrexed/cisplatin (PC) is the gold-standard first-line regimen. This study evaluated the cost-effectiveness of the addition of bevacizumab to PC (with maintenance bevacizumab) for unresectable MPM based on a phase III trial that showed a survival benefit compared with chemotherapy alone. To estimate the incremental cost-effectiveness ratio (ICER) of the incorporation of bevacizumab, a Markov model based on the MAPS trial, including the disease states of progression-free survival, progressive disease and death, was used. Total costs were calculated from a Chinese payer perspective, and health outcomes were converted into quality-adjusted life years (QALYs). Model robustness was explored in sensitivity analyses. The addition of bevacizumab to PC was estimated to increase the cost by $81,446.69, with a gain of 0.112 QALYs, resulting in an ICER of $727,202.59 per QALY. In both one-way and probabilistic sensitivity analyses, the ICER exceeded the commonly accepted willingness-to-pay threshold of 3 times the gross domestic product per capita of China ($23,970.00 per QALY). The cost of bevacizumab had the most important impact on the ICER. The combination of bevacizumab with PC chemotherapy is not a cost-effective treatment option for MPM in China. Given its positive clinical value and the extremely low incidence of MPM, an appropriate price discount, assistance programs and medical insurance should be considered to make bevacizumab more affordable for this rare patient population. Copyright © 2017 Elsevier B.V. All rights reserved.
Economic Evaluation of Frequent Home Nocturnal Hemodialysis Based on a Randomized Controlled Trial
Tonelli, Marcello; Pauly, Robert; Walsh, Michael; Culleton, Bruce; So, Helen; Hemmelgarn, Brenda; Manns, Braden
2014-01-01
Provider and patient enthusiasm for frequent home nocturnal hemodialysis (FHNHD) has been renewed; however, the cost-effectiveness of this technique is unknown. We performed a cost-utility analysis of FHNHD compared with conventional hemodialysis (CvHD; 4 hours three times per week) from a health payer perspective over a lifetime horizon using patient information from the Alberta NHD randomized controlled trial. Costs, including training costs, were obtained using microcosting and administrative data (CAN$2012). We determined the incremental cost per quality-adjusted life year (QALY) gained. Robustness was assessed using scenario, sensitivity, and probabilistic sensitivity analyses. Compared with CvHD (61% in-center, 14% satellite, and 25% home dialysis), FHNHD led to incremental cost savings (−$6700) and an additional 0.38 QALYs. In sensitivity analyses, when the annual probability of technique failure with FHNHD increased from 7.6% (reference case) to ≥19%, FHNHD became unattractive (>$75,000/QALY). The cost/QALY gained became $13,000 if average training time for FHNHD increased from 3.7 to 6 weeks. In scenarios with alternate comparator modalities, FHNHD remained dominant compared with in-center CvHD; cost/QALYs gained were $18,500, $198,000, and $423,000 compared with satellite CvHD, home CvHD, and peritoneal dialysis, respectively. In summary, FHNHD is attractive compared with in-center CvHD in this cohort. However, the attractiveness of FHNHD varies by technique failure rate, training time, and dialysis modalities from which patients are drawn, and these variables should be considered when establishing FHNHD programs. PMID:24231665
Lazzaro, Carlo; Barone, Carlo; Caprioni, Francesco; Cascinu, Stefano; Falcone, Alfredo; Maiello, Evaristo; Milella, Michele; Pinto, Carmine; Reni, Michele; Tortora, Giampaolo
2018-04-20
The APICE study evaluates the cost-effectiveness of nanoparticle albumin-bound paclitaxel (nab-paclitaxel; Nab-P) + gemcitabine (G) vs G alone in metastatic pancreatic cancer (MPC) from the Italian National Health Service (INHS) standpoint. A 4-year Markov model with four health states (progression-free; progressed; end of life; death), based on the MPACT trial, was developed to estimate costs (euro [€], 2017 values) and quality-adjusted life years (QALYs). Patients were assumed to receive intravenously Nab-P 125 mg/m² + G 1000 mg/m² on days 1, 8, and 15 every 4 weeks, or G alone 1000 mg/m² weekly for 7 out of 8 weeks (cycle 1) and then on days 1, 8, and 15 every 4 weeks (cycle 2 and subsequent cycles), until progression. One-way and probabilistic sensitivity analyses explored the uncertainty surrounding the baseline incremental cost-utility ratio (ICUR). Nab-P + G yields 0.154 incremental QALYs at €7,082.68 incremental costs vs G alone. The ICUR (€46,021.58) is lower than the informal threshold value of €87,330 adopted by the Italian Medicines Agency during 2010-2013 for reimbursing oncological drugs. Sensitivity analyses confirmed the robustness of the baseline findings. Nab-P + G in MPC patients can therefore be considered cost-effective for the INHS.
Cost-effectiveness of bedaquiline in MDR and XDR tuberculosis in Italy
Codecasa, Luigi R.; Toumi, Mondher; D’Ausilio, Anna; Aiello, Andrea; Damele, Francesco; Termini, Roberta; Uglietti, Alessia; Hettle, Robert; Graziano, Giorgio; De Lorenzo, Saverio
2017-01-01
Objective: To evaluate the cost-effectiveness of bedaquiline plus background drug regimens (BR) for multidrug-resistant tuberculosis (MDR-TB) and extensively drug-resistant tuberculosis (XDR-TB) in Italy. Methods: A Markov model was adapted to the Italian setting to estimate the incremental cost-effectiveness ratio (ICER) of bedaquiline plus BR (BBR) versus BR in the treatment of MDR-TB and XDR-TB over 10 years, from both the National Health Service (NHS) and societal perspective. Cost-effectiveness was evaluated in terms of life-years gained (LYG). Clinical data were sourced from trials; resource consumption for compared treatments was modelled according to advice from an expert clinician panel. NHS tariffs for inpatient and outpatient resource consumption were retrieved from published Italian sources. Drug costs were provided by reference centres for disease treatment in Italy. A 3% annual discount was applied to both cost and effectiveness. Deterministic and probabilistic sensitivity analyses were conducted. Results: Over 10 years, BBR vs. BR alone is cost-effective, with ICERs of €16,639/LYG and €4,081/LYG for the NHS and society, respectively. The sensitivity analyses confirmed the robustness of the results from both considered perspectives. Conclusion: In Italy, BBR vs. BR alone has proven to be cost-effective in the treatment of MDR-TB and XDR-TB under a range of scenarios. PMID:28265350
Economic impact of Tegaderm chlorhexidine gluconate (CHG) dressing in critically ill patients.
Thokala, Praveen; Arrowsmith, Martin; Poku, Edith; Martyn-St James, Marissa; Anderson, Jeff; Foster, Steve; Elliott, Tom; Whitehouse, Tony
2016-09-01
To estimate the economic impact of a Tegaderm™ chlorhexidine gluconate (CHG) gel dressing compared with a standard intravenous (i.v.) dressing (defined as a non-antimicrobial transparent film dressing), used for insertion site care of short-term central venous and arterial catheters (intravascular catheters) in adult critical care patients, using a cost-consequence model populated with data from published sources. A decision-analytic cost-consequence model was developed that assigned each patient with an indwelling intravascular catheter and a standard dressing a baseline risk of associated dermatitis, local infection at the catheter insertion site, and catheter-related bloodstream infection (CRBSI), estimated from published secondary sources. The risks of these events for patients with a Tegaderm CHG were estimated by applying the effectiveness parameters from the clinical review to the baseline risks. Costs accrued through the cost of the intervention (i.e. Tegaderm CHG or standard i.v. dressing) and hospital treatment costs, which depended on whether the patients had local dermatitis, local infection, or CRBSI. Total costs were estimated as mean values of 10,000 probabilistic sensitivity analysis (PSA) runs. Tegaderm CHG resulted in an average cost saving of £77 per patient in an intensive care unit. Tegaderm CHG also has a 98.5% probability of being cost-saving compared to standard i.v. dressings. The analyses suggest that Tegaderm CHG is a cost-saving strategy to reduce CRBSI, and the results were robust to sensitivity analyses.
Cost-Effectiveness of Dapagliflozin versus Acarbose as a Monotherapy in Type 2 Diabetes in China
Gu, Shuyan; Mu, Yiming; Zhai, Suodi; Zeng, Yuhang; Zhen, Xuemei; Dong, Hengjin
2016-01-01
Objective To estimate the long-term cost-effectiveness of dapagliflozin versus acarbose as monotherapy in treatment-naïve patients with type 2 diabetes mellitus (T2DM) in China. Methods The Cardiff Diabetes Model, an economic model designed to evaluate the cost-effectiveness of comparator therapies in diabetes, was used to simulate disease progression and estimate the long-term effect of treatments on patients. Systematic literature reviews, hospital surveys, meta-analysis and indirect treatment comparison were conducted to obtain model-required patient profiles, clinical data and costs. Health insurance costs (2015¥) were estimated over 40 years from a healthcare payer perspective. Univariate and probabilistic sensitivity analyses were performed. Results The model predicted that dapagliflozin had lower incidences of cardiovascular events, hypoglycemia and mortality events, was associated with a mean incremental benefit of 0.25 quality-adjusted life-years (QALYs), and had a lower cost of ¥8,439 compared with acarbose. This resulted in a cost saving of ¥33,786 per QALY gained with dapagliflozin. Sensitivity analyses determined that the results are robust. Conclusion Dapagliflozin is dominant compared with acarbose as monotherapy for Chinese T2DM patients, with a small QALY gain and lower costs. Dapagliflozin offers a well-tolerated and cost-effective alternative medication for treatment-naïve patients in China, and may have a direct impact in reducing the disease burden of T2DM. PMID:27806087
Objectively combining AR5 instrumental period and paleoclimate climate sensitivity evidence
NASA Astrophysics Data System (ADS)
Lewis, Nicholas; Grünwald, Peter
2018-03-01
Combining instrumental period evidence regarding equilibrium climate sensitivity with largely independent paleoclimate proxy evidence should enable a more constrained sensitivity estimate to be obtained. Previous, subjective Bayesian approaches involved selection of a prior probability distribution reflecting the investigators' beliefs about climate sensitivity. Here a recently developed approach employing two different statistical methods—objective Bayesian and frequentist likelihood-ratio—is used to combine instrumental period and paleoclimate evidence based on data presented and assessments made in the IPCC Fifth Assessment Report. Probabilistic estimates from each source of evidence are represented by posterior probability density functions (PDFs) of physically-appropriate form that can be uniquely factored into a likelihood function and a noninformative prior distribution. The three-parameter form is shown accurately to fit a wide range of estimated climate sensitivity PDFs. The likelihood functions relating to the probabilistic estimates from the two sources are multiplicatively combined and a prior is derived that is noninformative for inference from the combined evidence. A posterior PDF that incorporates the evidence from both sources is produced using a single-step approach, which avoids the order-dependency that would arise if Bayesian updating were used. Results are compared with an alternative approach using the frequentist signed root likelihood ratio method. Results from these two methods are effectively identical, and provide a 5-95% range for climate sensitivity of 1.1-4.05 K (median 1.87 K).
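A grid-based sketch of the multiplicative likelihood combination described above, with two Gaussian-in-feedback likelihoods standing in for the instrumental and paleoclimate estimates; the likelihood parameters are invented, not the AR5 assessments, and the noninformative-prior derivation is reduced to a uniform prior in sensitivity for brevity.

```python
import numpy as np

s = np.linspace(0.5, 8.0, 2000)     # climate sensitivity grid, K
lam = 3.7 / s                       # feedback parameter, W/m^2/K

def gauss(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2)

like_inst = gauss(lam, 2.0, 0.7)    # instrumental-period likelihood (assumed)
like_paleo = gauss(lam, 1.6, 0.8)   # paleoclimate likelihood (assumed)

post = like_inst * like_paleo       # multiplicative combination
post /= np.trapz(post, s)           # normalise (uniform prior in S here)

cdf = np.cumsum(post) * (s[1] - s[0])
lo, med, hi = (s[np.searchsorted(cdf, q)] for q in (0.05, 0.50, 0.95))
print(f"combined 5-95% range: {lo:.2f}-{hi:.2f} K (median {med:.2f} K)")
```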
The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pilch, Martin M.
Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.
Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue
NASA Astrophysics Data System (ADS)
Kree, P.; Soize, C.
The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, is analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.
Trimming a hazard logic tree with a new model-order-reduction technique
Porter, Keith; Field, Edward; Milner, Kevin R
2017-01-01
The size of the logic tree within the Uniform California Earthquake Rupture Forecast Version 3, Time-Dependent (UCERF3-TD) model can challenge risk analyses of large portfolios. An insurer or catastrophe risk modeler concerned with losses to a California portfolio might have to evaluate a portfolio 57,600 times to estimate risk in light of the hazard possibility space. Which branches of the logic tree matter most, and which can one ignore? We employed two model-order-reduction techniques to simplify the model. We sought a subset of parameters that must vary, and the specific fixed values for the remaining parameters, to produce approximately the same loss distribution as the original model. The techniques are (1) a tornado-diagram approach we employed previously for UCERF2, and (2) an apparently novel probabilistic sensitivity approach that seems better suited to functions of nominal random variables. The new approach produces a reduced-order model with only 60 of the original 57,600 leaves. One can use the results to reduce computational effort in loss analyses by orders of magnitude.
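A tornado-diagram sketch of the kind used above to rank which branches must keep varying: swing each parameter across its range with the others held at base values, then rank by output swing. The loss function and parameter ranges are invented stand-ins for the UCERF3-TD logic tree.

```python
def loss(a=1.0, b=2.0, c=0.5):
    # stand-in portfolio loss metric; the real model has 57,600 branches
    return 100.0 * a + 20.0 * b ** 2 + 5.0 * c

ranges = {"a": (0.8, 1.2), "b": (1.5, 2.5), "c": (0.0, 1.0)}

swings = {name: abs(loss(**{name: hi}) - loss(**{name: lo}))
          for name, (lo, hi) in ranges.items()}

# Rank by swing: parameters with small swings can be fixed at base values
for name, swing in sorted(swings.items(), key=lambda kv: -kv[1]):
    print(f"{name}: swing = {swing:.1f}")
```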
NASA Technical Reports Server (NTRS)
Carson, William; Lindemuth, Kathleen; Mich, John; White, K. Preston; Parker, Peter A.
2009-01-01
Probabilistic engineering design enhances safety and reduces costs by incorporating risk assessment directly into the design process. In this paper, we assess the format of the quantitative metrics for the vehicle which will replace the Space Shuttle, the Ares I rocket. Specifically, we address the metrics for in-flight measurement error in the vector position of the motor nozzle, dictated by limits on guidance, navigation, and control systems. Analyses include the propagation of error from measured to derived parameters, the time-series of dwell points for the duty cycle during static tests, and commanded versus achieved yaw angle during tests. Based on these analyses, we recommend a probabilistic template for specifying the maximum error in angular displacement and radial offset for the nozzle-position vector. Criteria for evaluating individual tests and risky decisions also are developed.
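The propagation-of-error analysis mentioned above can be illustrated with a small Monte Carlo sketch: noise on the measured nozzle-position components propagates into the derived radial offset and angular displacement. All numbers below (offsets, error sigmas) are illustrative assumptions, not Ares I values.

```python
# Hedged sketch of Monte Carlo error propagation from measured to
# derived parameters; geometry and noise levels are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(0.010, 0.002, n)       # measured x-offset (assumed units and sigma)
y = rng.normal(0.005, 0.002, n)       # measured y-offset

radial = np.hypot(x, y)               # derived radial offset
angle = np.degrees(np.arctan2(y, x))  # derived angular displacement

for name, d in [("radial offset", radial), ("angle [deg]", angle)]:
    print(f"{name}: mean={d.mean():.4f}, 99th pct={np.percentile(d, 99):.4f}")
```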
Bamrungsawad, Naruemon; Upakdee, Nilawan; Pratoomsoot, Chayanin; Sruamsiri, Rosarin; Dilokthornsakul, Piyameth; Dechanont, Supinya; Wu, David Bin-Chia; Dejthevaporn, Charungthai; Chaiyakunapruk, Nathorn
2016-07-01
Intravenous immunoglobulin (IVIG) has been recommended for steroid-resistant chronic inflammatory demyelinating polyradiculoneuropathy (CIDP). The treatment, however, is very costly to the healthcare system, and there remains no evidence of its economic justifiability. This study aimed to conduct an economic evaluation (EE) of IVIG plus corticosteroids in steroid-resistant CIDP in Thailand. A Markov model was constructed to estimate the lifetime costs and outcomes of IVIG plus corticosteroids in comparison with immunosuppressants plus corticosteroids in steroid-resistant CIDP patients from a societal perspective. Efficacy and utility data were obtained from clinical literature, meta-analyses, medical record reviews, and patient interviews. Cost data were obtained from list prices, an electronic hospital database, published sources, and patient interviews. All costs [in 2015 US dollars (US$)] and outcomes were discounted at 3% annually. One-way and probabilistic sensitivity analyses were conducted. In the base case, the incremental costs and quality-adjusted life years (QALYs) of IVIG plus corticosteroids versus immunosuppressants plus corticosteroids were US$2112.02 and 1.263 QALYs, respectively, resulting in an incremental cost-effectiveness ratio (ICER) of US$1672.71 per QALY gained. Sensitivity analyses revealed that the utility value of disabled patients had the greatest influence on the ICER. At a societal willingness-to-pay threshold in Thailand of US$4672 per QALY gained, IVIG plus corticosteroids had a 92.1% probability of being cost effective and is therefore considered a cost-effective treatment for steroid-resistant CIDP patients in Thailand.
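For readers unfamiliar with the mechanics, the base-case ICER and the probability of cost-effectiveness are computed roughly as follows; the PSA samples here are synthetic stand-ins with assumed spreads, loosely anchored to the reported point estimates.

```python
# Synthetic PSA sketch; spreads are assumptions, not the study's model.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
d_cost = rng.normal(2112.02, 600.0, n)   # incremental cost, US$
d_qaly = rng.normal(1.263, 0.45, n)      # incremental QALYs

# Base-case ICER from the point estimates:
print(f"ICER = US${2112.02 / 1.263:,.0f} per QALY gained")

# Probability of cost-effectiveness at the Thai threshold, via the
# net monetary benefit NMB = WTP * dQALY - dCost:
wtp = 4672.0
nmb = wtp * d_qaly - d_cost
print(f"P(cost-effective at US${wtp:,.0f}/QALY) = {(nmb > 0).mean():.1%}")
```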
Constraining ozone-precursor responsiveness using ambient measurements
This study develops probabilistic estimates of ozone (O3) sensitivities to precursor emissions by incorporating uncertainties in photochemical modeling and evaluating model performance based on ground-level observations of O3 and oxides of nitrogen (NOx). Uncertainties in model form...
Corcoran, R; Rowse, G; Moore, R; Blackwood, N; Kinderman, P; Howard, R; Cummins, S; Bentall, R P
2008-11-01
A tendency to make hasty decisions on probabilistic reasoning tasks and a difficulty attributing mental states to others are key cognitive features of persecutory delusions (PDs) in the context of schizophrenia. This study examines whether these same psychological anomalies characterize PDs when they present in the context of psychotic depression. Performance on measures of probabilistic reasoning and theory of mind (ToM) was examined in five subgroups differing in diagnostic category and current illness status. The tendency to make hasty decisions in probabilistic settings and poor performance on story-based ToM tasks both feature in PDs irrespective of diagnosis. Furthermore, performance on the ToM story task correlated with the degree of distress caused by, and preoccupation with, the current PDs in the currently deluded groups. By contrast, performance on the non-verbal ToM task appears to be more sensitive to diagnosis, as patients with schizophrenia spectrum disorders perform worse on this task than those with depression, irrespective of the presence of PDs. The psychological anomalies associated with PDs examined here are transdiagnostic, but different measures of ToM may be more or less sensitive to indices of severity of the PDs, diagnosis, and trait- or state-related cognitive effects.
NASA Technical Reports Server (NTRS)
Lyle, Karen H.
2014-01-01
Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology validation via flight testing. This paper explores the implementation of probabilistic methods in the sensitivity analysis of the structural response of a Hypersonic Inflatable Aerodynamic Decelerator (HIAD). HIAD architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during re-entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. In the example presented here, the structural parameters of an existing HIAD model have been varied to illustrate the design approach utilizing uncertainty-based methods. Surrogate models have been used to reduce computational expense by several orders of magnitude. The suitability of the design is based on assessing variation in the resulting cone angle; the acceptable cone-angle variation would depend on the aerodynamic requirements.
Quantifying Uncertainties in the Thermo-Mechanical Properties of Particulate Reinforced Composites
NASA Technical Reports Server (NTRS)
Mital, Subodh K.; Murthy, Pappu L. N.
1999-01-01
The present paper reports results from a computational simulation of probabilistic particulate-reinforced composite behavior. The approach consists of the use of simplified micromechanics of particulate-reinforced composites together with a Fast Probability Integration (FPI) technique. Sample results are presented for an Al/SiC(sub p) (silicon carbide particles in an aluminum matrix) composite. The probability density functions for the composite moduli, thermal expansion coefficient, and thermal conductivities, along with their sensitivity factors, are computed. The effect of different assumed distributions and the effect of reducing scatter in constituent properties on the thermal expansion coefficient are also evaluated. The variations in the constituent properties that directly affect these composite properties are accounted for by assumed probabilistic distributions. The results show that the present technique provides valuable information about the scatter in composite properties and sensitivity factors, which is useful to test or design engineers.
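The flavor of the calculation can be conveyed with plain Monte Carlo in place of FPI: assumed scatter in constituent properties is pushed through simplified micromechanics, here the Voigt and Reuss rule-of-mixtures bounds as stand-ins for the paper's relations. Property values and scatter are nominal Al/SiC figures, not the paper's inputs.

```python
# Hedged Monte Carlo sketch of probabilistic micromechanics
# (rule-of-mixtures bounds; constituent statistics are nominal).
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
E_m = rng.normal(70e9, 3.5e9, n)    # aluminum matrix modulus, Pa
E_p = rng.normal(410e9, 20e9, n)    # SiC particle modulus, Pa
vf  = rng.normal(0.25, 0.01, n)     # particle volume fraction

E_upper = vf * E_p + (1 - vf) * E_m            # Voigt (upper) bound
E_lower = 1.0 / (vf / E_p + (1 - vf) / E_m)    # Reuss (lower) bound

for name, E in [("Voigt", E_upper), ("Reuss", E_lower)]:
    print(f"{name}: mean = {E.mean()/1e9:.1f} GPa, CoV = {E.std()/E.mean():.2%}")
```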
Connectome sensitivity or specificity: which is more important?
Zalesky, Andrew; Fornito, Alex; Cocchi, Luca; Gollo, Leonardo L; van den Heuvel, Martijn P; Breakspear, Michael
2016-11-15
Connectomes with high sensitivity and high specificity are unattainable with current axonal fiber reconstruction methods, particularly at the macro-scale afforded by magnetic resonance imaging. Tensor-guided deterministic tractography yields sparse connectomes that are incomplete and contain false negatives (FNs), whereas probabilistic methods steered by crossing-fiber models yield dense connectomes, often with low specificity due to false positives (FPs). Densely reconstructed probabilistic connectomes are typically thresholded to improve specificity at the cost of a reduction in sensitivity. What is the optimal tradeoff between connectome sensitivity and specificity? We show empirically and theoretically that specificity is paramount. Our evaluations of the impact of FPs and FNs on empirical connectomes indicate that specificity is at least twice as important as sensitivity when estimating key properties of brain networks, including topological measures of network clustering, network efficiency and network modularity. Our asymptotic analysis of small-world networks with idealized modular structure reveals that as the number of nodes grows, specificity becomes exactly twice as important as sensitivity to the estimation of the clustering coefficient. For the estimation of network efficiency, the relative importance of specificity grows linearly with the number of nodes. The greater importance of specificity is due to FPs occurring more prevalently between network modules rather than within them. These spurious inter-modular connections have a dramatic impact on network topology. We argue that efforts to maximize the sensitivity of connectome reconstruction should be realigned with the need to map brain networks with high specificity. Copyright © 2016 Elsevier Inc. All rights reserved.
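The asymmetry between FPs and FNs is easy to probe empirically. The sketch below (not the authors' code) perturbs an idealized modular graph with equal numbers of added and deleted edges and compares the effect on the average clustering coefficient; random FPs tend to land between modules and so distort clustering more.

```python
# Illustrative FP-vs-FN perturbation of a modular network.
import random
import networkx as nx

random.seed(3)
G = nx.connected_caveman_graph(20, 8)   # idealized modular network
c0 = nx.average_clustering(G)
n_err = 100

# False negatives: delete random true edges.
G_fn = G.copy()
G_fn.remove_edges_from(random.sample(list(G_fn.edges()), n_err))

# False positives: add random spurious edges (mostly inter-modular).
G_fp = G.copy()
G_fp.add_edges_from(random.sample(list(nx.non_edges(G_fp)), n_err))

for name, H in [("FN", G_fn), ("FP", G_fp)]:
    c = nx.average_clustering(H)
    print(f"{name}: clustering {c:.3f} (deviation from {c0:.3f}: {abs(c - c0):.3f})")
```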
Design for cyclic loading endurance of composites
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.
1993-01-01
The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft-wing-type structures is described. The code performs a complete probabilistic analysis for composites, taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of the cumulative distribution function (CDF) and probability density function (PDF) of the fatigue life of a wing-type composite structure under different hygrothermal environments subjected to random pressure loading. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.
Probabilistic analysis of bladed turbine disks and the effect of mistuning
NASA Technical Reports Server (NTRS)
Shah, A. R.; Nagpal, V. K.; Chamis, Christos C.
1990-01-01
Probabilistic assessment of the maximum blade response on a mistuned rotor disk is performed using the computer code NESSUS. The uncertainties in natural frequency, excitation frequency, amplitude of excitation, and damping are included to obtain the cumulative distribution function (CDF) of blade responses. Advanced mean value first-order analysis is used to compute the CDF. The sensitivities of the different random variables are identified. The effect of the number of blades on rotor mistuning is evaluated. It is shown that the uncertainties associated with the forcing-function parameters have a significant effect on the response distribution of the bladed rotor.
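As a rough sketch of the mean-value idea behind such analyses (NESSUS's advanced mean value method adds iterative corrections beyond this), one can linearize the response at the input means and read the CDF from the resulting normal approximation. The response function and input statistics below are hypothetical, not the study's model.

```python
# Mean-value first-order sketch: linearize at the means, approximate
# the response as normal. Inputs and response are hypothetical.
import numpy as np
from scipy import stats

def response(x):
    # Stand-in for blade response amplitude vs. (natural freq, forcing freq, damping).
    fn, f, zeta = x
    r = f / fn
    return 1.0 / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

mu = np.array([100.0, 80.0, 0.03])     # input means
sigma = np.array([5.0, 4.0, 0.005])    # input standard deviations

eps = 1e-6                             # finite-difference gradient at the means
grad = np.array([(response(mu + eps * np.eye(3)[i]) - response(mu)) / eps
                 for i in range(3)])

m = response(mu)
s = np.sqrt(np.sum((grad * sigma) ** 2))   # first-order std (independent inputs)
print(f"P(response < {1.5 * m:.2f}) ~ {stats.norm.cdf(1.5 * m, m, s):.3f}")
# The normalized terms (grad * sigma / s)**2 serve as sensitivity factors.
```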
NASA Astrophysics Data System (ADS)
Maiti, Saumen; Tiwari, Ram Krishna
2010-10-01
A new probabilistic approach based on the concept of Bayesian neural network (BNN) learning theory is proposed for decoding litho-facies boundaries from well-log data. We show how a multi-layer-perceptron neural network model can be employed in a Bayesian framework to classify changes in litho-log successions. The method is then applied to the German Continental Deep Drilling Program (KTB) well-log data for classification and uncertainty estimation of the litho-facies boundaries. In this framework, the a posteriori distribution of network parameters is estimated via the principles of Bayesian probabilistic theory, and an objective function is minimized following the scaled conjugate gradient optimization scheme. For the model development, we impose a suitable criterion, which provides probabilistic information by emulating different combinations of synthetic data. Uncertainty in the relationship between the data and the model space is appropriately taken care of by assuming a Gaussian a priori distribution of network parameters (e.g., synaptic weights and biases). Prior to applying the new method to the real KTB data, we tested the proposed method on synthetic examples to examine the sensitivity of the neural network hyperparameters in prediction. Within this framework, we examine the stability and efficiency of this new probabilistic approach using different kinds of synthetic data with different levels of correlated noise. Our data analysis suggests that the designed network topology based on the Bayesian paradigm is stable up to nearly 40% correlated noise; adding more noise (~50% or more) degrades the results. We perform uncertainty analyses on training, validation, and test data sets with and without intrinsic noise by making the Gaussian approximation of the a posteriori distribution about the peak model. We present a standard deviation error map at the network output corresponding to the three types of litho-facies present over the entire litho-section of the KTB. Comparisons of the maximum a posteriori geological sections constructed here with the available geological information and the existing geophysical findings suggest that the BNN results reveal some additional finer details in the KTB borehole data at certain depths, which appear to be of some geological significance. We also demonstrate that the proposed BNN approach is superior to the conventional artificial neural network in terms of both avoiding "over-fitting" and aiding uncertainty estimation, which are vital for meaningful interpretation of geophysical records. Our analyses demonstrate that the BNN-based approach renders a robust means for the classification of complex changes in litho-facies successions and thus could provide a useful guide for understanding the crustal inhomogeneity and the structural discontinuity in many other tectonically complex regions.
Probabilistic Analysis of Large-Scale Composite Structures Using the IPACS Code
NASA Technical Reports Server (NTRS)
Lemonds, Jeffrey; Kumar, Virendra
1995-01-01
An investigation was performed to ascertain the feasibility of using IPACS (Integrated Probabilistic Assessment of Composite Structures) for probabilistic analysis of a composite fan blade, the development of which is being pursued by various industries for the next generation of aircraft engines. A model representative of the class of fan blades used in the GE90 engine has been chosen as the structural component to be analyzed with IPACS. In this study, typical uncertainties are assumed at the ply level, and structural responses for ply stresses and frequencies are evaluated in the form of cumulative probability density functions. Because of the geometric complexity of the blade, the number of plies varies from several hundred at the root to about a hundred at the tip. This represents an extremely complex composites application for the IPACS code. A sensitivity study with respect to various random variables is also performed.
Gofer-Levi, M; Silberg, T; Brezner, A; Vakil, E
2014-09-01
Children learn to engage their surroundings skillfully, acquiring implicit knowledge of complex regularities and associations. Probabilistic classification learning (PCL) is a type of cognitive procedural learning in which different cues are probabilistically associated with specific outcomes. Little is known about the effects of developmental disorders on cognitive skill acquisition. Twenty-four children and adolescents with cerebral palsy (CP) were compared to 24 typically developing (TD) youth in their ability to learn probabilistic associations. Performance was examined in relation to general cognitive abilities, level of motor impairment, and age. Improvement in PCL was observed for all participants, with no relation to IQ. An age effect was found only among TD children. Learning curves of children with CP on a cognitive procedural learning task differ from those of TD peers and do not appear to be age-sensitive. Copyright © 2014 Elsevier Ltd. All rights reserved.
te Beest, Dennis; de Bruin, Erwin; Imholz, Sandra; Wallinga, Jacco; Teunis, Peter; Koopmans, Marion; van Boven, Michiel
2014-01-01
Reliable discrimination of recent influenza A infection from previous exposure using hemagglutination inhibition (HI) or virus neutralization tests is currently not feasible. This is due to low sensitivity of the tests and the interference of antibody responses generated by previous infections. Here we investigate the diagnostic characteristics of a newly developed antibody (HA1) protein microarray using data from cross-sectional serological studies carried out before and after the pandemic of 2009. The data are analysed by mixture models, providing a probabilistic classification of sera (susceptible, prior-exposed, recently infected). Estimated sensitivity and specificity for identifying A/2009 infections are low using HI (66% and 51%), and high when using A/2009 microarray data alone or together with A/1918 microarray data (96% and 95%). As a heuristic, a high A/2009 to A/1918 antibody ratio (>1.05) is indicative of recent infection, while a low ratio is indicative of a pre-existing response, even if the A/2009 titer is high. We conclude that highly sensitive and specific classification of individual sera is possible using the protein microarray, thereby enabling precise estimation of age-specific infection attack rates in the population even if sample sizes are small. PMID:25405997
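The mixture-model classification step can be sketched with a one-dimensional Gaussian mixture on synthetic log antibody responses; the real analysis works with the bivariate A/2009 and A/1918 array signals and a more carefully specified model.

```python
# Hedged sketch of probabilistic classification of sera via a
# three-component Gaussian mixture; data are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
log_titer = np.concatenate([
    rng.normal(1.0, 0.4, 300),   # susceptible
    rng.normal(3.0, 0.5, 200),   # prior-exposed
    rng.normal(5.0, 0.4, 100),   # recently infected
]).reshape(-1, 1)

gm = GaussianMixture(n_components=3, random_state=0).fit(log_titer)
post = gm.predict_proba(log_titer)      # per-serum class probabilities
order = np.argsort(gm.means_.ravel())   # map components to rising titer
print("P(recently infected), first 5 sera:", post[:5, order[2]].round(3))
```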
Noradrenergic modulation of risk/reward decision making.
Montes, David R; Stopper, Colin M; Floresco, Stan B
2015-08-01
Catecholamine transmission modulates numerous cognitive and reward-related processes that can subserve more complex functions such as cost/benefit decision making. Dopamine has been shown to play an integral role in decisions involving reward uncertainty, yet there is a paucity of research investigating the contributions of noradrenaline (NA) transmission to these functions. The present study was designed to elucidate the contribution of NA to risk/reward decision making in rats, assessed with a probabilistic discounting task. We examined the effects of reducing noradrenergic transmission with the α2 agonist clonidine (10-100 μg/kg), and increasing activity at α2A receptor sites with the agonist guanfacine (0.1-1 mg/kg), the α2 antagonist yohimbine (1-3 mg/kg), and the noradrenaline transporter (NET) inhibitor atomoxetine (0.3-3 mg/kg) on probabilistic discounting. Rats chose between a small/certain reward and a larger/risky reward, wherein the probability of obtaining the larger reward either decreased (100-12.5 %) or increased (12.5-100 %) over a session. In well-trained rats, clonidine reduced risky choice by decreasing reward sensitivity, whereas guanfacine did not affect choice behavior. Yohimbine impaired adjustments in decision biases as reward probability changed within a session by altering negative feedback sensitivity. In a subset of rats that displayed prominent discounting of probabilistic rewards, the lowest dose of atomoxetine increased preference for the large/risky reward when this option had greater long-term utility. These data highlight an important and previously uncharacterized role for noradrenergic transmission in mediating different aspects of risk/reward decision making and mediating reward and negative feedback sensitivity.
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means (KM) approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself, and the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM, Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM- and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements, from a publicly available data set of 109 subjects. Although FCM yielded group template maps almost topographically identical to those from KM, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to real, motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain-computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural-network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural-network-driven approach to microstate analysis are likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110
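A compact FCM implementation conveys what "probabilistic analog of K-means" means in practice: each observation receives graded memberships across clusters instead of a hard label. The sketch below runs on synthetic stand-ins for GFP-peak topographies; real microstate pipelines additionally handle polarity invariance and map normalization.

```python
# Minimal fuzzy C-means sketch (rows = observations, columns = channels).
import numpy as np

def fcm(X, k, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(k), size=len(X))       # soft memberships
    for _ in range(iters):
        W = U ** m
        C = (W.T @ X) / W.sum(axis=0)[:, None]       # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - C[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))               # standard FCM update
        U /= U.sum(axis=1, keepdims=True)            # rows sum to 1
    return U, C

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(c, 0.3, (50, 4)) for c in (0.0, 1.0, 2.0, 3.0)])
U, C = fcm(X, k=4)
print("membership of first observation:", U[0].round(2))  # probabilistic label
```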
NASA Astrophysics Data System (ADS)
Setiawan, R.
2018-05-01
In this paper, the Economic Order Quantity (EOQ) of a vendor-buyer supply-chain model under probabilistic conditions with imperfect-quality items is analysed. The analysis uses two game-theoretic concepts, Stackelberg equilibrium and Pareto optimality, under non-cooperative and cooperative games, respectively. The optimal results of the integrated scheme and of the game-theoretic approaches are then compared analytically and numerically using appropriate simulation data.
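For orientation, the classic deterministic EOQ below is only the starting point; the paper's vendor-buyer model layers probabilistic demand, imperfect-quality items, and game-theoretic coordination on top of it. Figures are illustrative.

```python
# Classic EOQ = sqrt(2DS/H); inputs are illustrative.
import math

D = 12_000   # annual demand, units
S = 150.0    # ordering (setup) cost per order
H = 2.5      # holding cost per unit per year

eoq = math.sqrt(2 * D * S / H)
print(f"EOQ = {eoq:.0f} units, orders per year = {D / eoq:.1f}")
```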
Generating probabilistic Boolean networks from a prescribed transition probability matrix.
Ching, W-K; Chen, X; Tsing, N-K
2009-11-01
Probabilistic Boolean networks (PBNs) have received much attention in modeling genetic regulatory networks. A PBN can be regarded as a Markov chain process and is characterised by a transition probability matrix. In this study, the authors propose efficient algorithms for constructing a PBN when its transition probability matrix is given. The complexities of the algorithms are also analysed. This is an interesting inverse problem in network inference using steady-state data. The problem is important as most microarray data sets are assumed to be obtained from sampling the steady-state.
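The forward direction of the mapping the authors invert is straightforward to sketch: given candidate Boolean predictors and their selection probabilities, the PBN's transition probability matrix follows by enumeration. The two-gene network below is hypothetical.

```python
# Forward construction of a PBN transition matrix (hypothetical network).
import itertools
import numpy as np

# Candidate predictor functions f(x1, x2) per gene, with selection probabilities.
preds = [
    [(lambda x: x[0] and x[1], 0.6), (lambda x: x[1], 0.4)],       # gene 1
    [(lambda x: x[0] or x[1], 0.7), (lambda x: not x[0], 0.3)],    # gene 2
]

states = list(itertools.product([0, 1], repeat=2))
P = np.zeros((4, 4))
for i, s in enumerate(states):
    for (f1, p1), (f2, p2) in itertools.product(*preds):
        t = (int(f1(s)), int(f2(s)))
        P[i, states.index(t)] += p1 * p2

print(P)   # each row sums to 1: a Markov chain transition matrix
```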
Probabilistic Characterization of Adversary Behavior in Cyber Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyers, C A; Powers, S S; Faissol, D M
2009-10-08
The objective of this SMS effort is to provide a probabilistic characterization of adversary behavior in cyber security. This includes both quantitative (data analysis) and qualitative (literature review) components. A set of real LLNL email data was obtained for this study, consisting of several years' worth of unfiltered traffic sent to a selection of addresses at ciac.org. The email data were subjected to three interrelated analyses: a textual study of the header data and subject matter, an examination of threats present in message attachments, and a characterization of the maliciousness of embedded URLs.
ENDANGERED AQUATIC VERTEBRATES: COMPARATIVE AND PROBABILISTIC-BASED TOXICOLOGY
It has previously been assumed that endangered, threatened, and candidate endangered species (collectively known as “listed” species) are uniquely sensitive to chemicals. The purpose of this cooperative research effort (U.S. Environmental Protection Agency, U.S. Geological Surve...
NASA Astrophysics Data System (ADS)
Barani, S.; Mascandola, C.; Massa, M.; Spallarossa, D.
2017-12-01
The 2012 Emilia seismic sequence (Northern Italy), with a main shock of Mw 6.1 at the end of the first half of 2012, highlighted the importance of studying site effects in the Po Plain, the largest and deepest sedimentary basin in Italy. As has long been known, long-period amplification related to deep sedimentary basins can significantly affect the characteristics of the ground motion induced by strong earthquakes. It follows that the effects of deep sedimentary deposits on ground shaking require special attention during the definition of the design seismic action. The work presented here analyzes the impact of deep-soil discontinuities on ground-motion amplification, with particular focus on long-period probabilistic seismic-hazard assessment. The study focuses on the site of Castelleone, where a seismic station of the Italian National Seismic Network has been recording since 2009. Our study includes both experimental and numerical site response analyses. Specifically, extensive active and passive geophysical measurements were carried out in order to define a detailed shear-wave velocity (VS) model to be used in the numerical analyses, which are needed to assess the site-specific ground-motion hazard. Besides classical seismic refraction profiles and multichannel analysis of surface waves, we analyzed ambient vibration measurements in both single-station and array configurations. The VS profile was determined via joint inversion of the experimental phase-velocity dispersion curve with the ellipticity curve derived from horizontal-to-vertical spectral ratios. The profile shows two main discontinuities, at depths of around 160 and 1350 m, respectively. The probabilistic site-specific hazard was assessed in terms of both spectral acceleration and displacement, adopting a partially non-ergodic approach. We found that the spectral acceleration hazard is barely sensitive to long-period (up to 10 s) amplification related to the deeper discontinuity, whereas the displacement hazard is strongly affected. Our results show that neglecting the effects of the deeper discontinuity implies an underestimation of the hazard of up to about 49% for a mean return period (MRP) of 475 years and 57% for an MRP of 2475 years, with possible consequences for the design of very tall buildings and large bridges.
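For reference, the two mean return periods quoted above correspond to the familiar exceedance probabilities under the usual Poisson assumption:

```python
# MRP <-> exceedance probability under a Poisson occurrence model.
import math

for mrp in (475.0, 2475.0):
    p50 = 1.0 - math.exp(-50.0 / mrp)   # P(exceedance in 50 years)
    print(f"MRP {mrp:.0f} yr -> {p50:.1%} in 50 years")
```

This reproduces the conventional 10%-in-50-years and 2%-in-50-years hazard levels.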
Probabilistically modeling lava flows with MOLASSES
NASA Astrophysics Data System (ADS)
Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.
2017-12-01
Modeling lava flows through cellular automata methods enables a computationally inexpensive means to quickly forecast lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point-source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that will produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion, where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results, and MOLASSES outputs a probability map of inundation instead of a determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against the real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the positive predictive value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulation space that was actually inundated by lava.
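Both goodness-of-fit metrics reduce to counts over boolean inundation grids; a minimal sketch (with synthetic grids, not MOLASSES output) is:

```python
# Sensitivity = TP/(TP+FN); positive predictive value = TP/(TP+FP).
import numpy as np

rng = np.random.default_rng(6)
observed = rng.random((200, 200)) < 0.10                  # mapped lava extent
simulated = observed ^ (rng.random((200, 200)) < 0.03)    # imperfect model

tp = np.sum(simulated & observed)
fp = np.sum(simulated & ~observed)
fn = np.sum(~simulated & observed)

print(f"sensitivity = {tp / (tp + fn):.2f}")
print(f"PPV         = {tp / (tp + fp):.2f}")
```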
Probabilistic reversal learning is impaired in Parkinson's disease
Peterson, David A.; Elliott, Christian; Song, David D.; Makeig, Scott; Sejnowski, Terrence J.; Poizner, Howard
2009-01-01
In many everyday settings, the relationship between our choices and their potentially rewarding outcomes is probabilistic and dynamic. In addition, the difficulty of the choices can vary widely. Although a large body of theoretical and empirical evidence suggests that dopamine mediates rewarded learning, the influence of dopamine in probabilistic and dynamic rewarded learning remains unclear. We adapted a probabilistic rewarded learning task originally used to study firing rates of dopamine cells in primate substantia nigra pars compacta (Morris et al. 2006) for use as a reversal learning task with humans. We sought to investigate how the dopamine depletion in Parkinson's disease (PD) affects probabilistic reward learning and adaptation to a reversal in reward contingencies. Over the course of 256 trials subjects learned to choose the more favorable from among pairs of images with small or large differences in reward probabilities. During a subsequent otherwise identical reversal phase, the reward probability contingencies for the stimuli were reversed. Seventeen PD patients of mild to moderate severity were studied off their dopaminergic medications and compared to 15 age-matched controls. Compared to controls, PD patients had distinct pre- and post-reversal deficiencies depending upon the difficulty of the choices they had to learn. The patients also exhibited compromised adaptability to the reversal. A computational model of the subjects' trial-by-trial choices demonstrated that the adaptability was sensitive to the gain with which patients weighted pre-reversal feedback. Collectively, the results implicate the nigral dopaminergic system in learning to make choices in environments with probabilistic and dynamic reward contingencies. PMID:19628022
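Models of this general kind are typically simple reinforcement-learning schemes; the sketch below is a generic example (not the authors' model) in which a feedback gain updates action values and a softmax rule generates choices, with a mid-session reversal of the reward contingencies.

```python
# Generic trial-by-trial RL sketch for a probabilistic reversal task.
import numpy as np

rng = np.random.default_rng(7)
p_reward = np.array([0.8, 0.2])       # pre-reversal contingencies
alpha, beta = 0.3, 3.0                # feedback gain, choice sharpness
Q = np.zeros(2)

for t in range(256):
    if t == 128:
        p_reward = p_reward[::-1]     # reversal phase
    p_choice = np.exp(beta * Q) / np.exp(beta * Q).sum()
    a = rng.choice(2, p=p_choice)
    r = float(rng.random() < p_reward[a])
    Q[a] += alpha * (r - Q[a])        # prediction-error update

print("final action values:", Q.round(2))
```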
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)
2001-01-01
This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor, and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978, using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type, except for a change from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material; these two factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate the research and education components of this project, resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.
Vanderveldt, Ariana; Green, Leonard; Myerson, Joel
2014-01-01
The value of an outcome is affected both by the delay until its receipt (delay discounting) and by the likelihood of its receipt (probability discounting). Despite being well-described by the same hyperboloid function, delay and probability discounting involve fundamentally different processes, as revealed, for example, by the differential effects of reward amount. Previous research has focused on the discounting of delayed and probabilistic rewards separately, with little research examining more complex situations in which rewards are both delayed and probabilistic. In two experiments, participants made choices between smaller rewards that were both immediate and certain and larger rewards that were both delayed and probabilistic. Analyses revealed significant interactions between delay and probability factors inconsistent with an additive model. In contrast, a hyperboloid discounting model in which delay and probability were combined multiplicatively provided an excellent fit to the data. These results suggest that the hyperboloid is a good descriptor of decision making in complicated monetary choice situations like those people encounter in everyday life. PMID:24933696
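A multiplicative hyperboloid of the form commonly used in this literature combines the two discounting factors as shown below; parameter names and values are illustrative, not the fitted estimates.

```python
# Multiplicative hyperboloid discounting sketch:
#   V = A / [(1 + k*D)^s1 * (1 + h*theta)^s2],
# with odds-against theta = (1 - p) / p for probability p.
def discounted_value(A, delay, p, k=0.05, h=1.5, s1=0.8, s2=0.9):
    theta = (1 - p) / p
    return A / (((1 + k * delay) ** s1) * ((1 + h * theta) ** s2))

# A $100 reward, 30 days away, with a 50% chance of receipt:
print(f"subjective value ~ ${discounted_value(100, 30, 0.5):.2f}")
```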
Incorporating uncertainty in watershed management decision-making: A mercury TMDL case study
Labiosa, W.; Leckie, J.; Shachter, R.; Freyberg, D.; Rytuba, J.
2005-01-01
Water quality impairment due to high mercury fish tissue concentrations and high mercury aqueous concentrations is a widespread problem in several sub-watersheds that are major sources of mercury to the San Francisco Bay. Several mercury Total Maximum Daily Load regulations are currently being developed to address this problem. Decisions about control strategies are being made despite very large uncertainties about current mercury loading behavior, relationships between total mercury loading and methyl mercury formation, and relationships between potential controls and mercury fish tissue levels. To deal with the issues of very large uncertainties, data limitations, knowledge gaps, and very limited State agency resources, this work proposes a decision analytical alternative for mercury TMDL decision support. The proposed probabilistic decision model is Bayesian in nature and is fully compatible with a "learning while doing" adaptive management approach. Strategy evaluation, sensitivity analysis, and information collection prioritization are examples of analyses that can be performed using this approach.
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster-than-real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor-Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C++, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
Probabilistic, meso-scale flood loss modelling
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2016-04-01
Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, all the more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models, and on the other hand against official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
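The key practical point, that bagged trees deliver a loss distribution rather than a point estimate, can be sketched with scikit-learn: the predictions of the individual trees in the ensemble form an empirical predictive distribution. Features and loss values below are synthetic stand-ins for the flood-loss predictors.

```python
# Bagged decision trees as a probabilistic loss model (synthetic data).
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(8)
X = rng.random((500, 3))   # e.g., water depth, duration, precaution indicator
y = 50 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 5, 500)   # relative loss, %

model = BaggingRegressor(n_estimators=200, random_state=0).fit(X, y)

x_new = np.array([[0.7, 0.5, 0.2]])
per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
print(f"loss: median = {np.median(per_tree):.1f}%, "
      f"90% interval = ({np.percentile(per_tree, 5):.1f}, "
      f"{np.percentile(per_tree, 95):.1f})")
```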
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balkey, K.; Witt, F.J.; Bishop, B.A.
1995-06-01
Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine the conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.
Screen or not to screen for peripheral arterial disease: guidance from a decision model.
Vaidya, Anil; Joore, Manuela A; Ten Cate-Hoek, Arina J; Ten Cate, Hugo; Severens, Johan L
2014-01-29
Asymptomatic peripheral arterial disease (PAD) is associated with a greater risk of acute cardiovascular events. This study aims to determine the cost-effectiveness of one-time PAD screening using the Ankle Brachial Index (ABI) test and subsequent antiplatelet preventive treatment (low-dose aspirin or clopidogrel) in individuals at high risk for acute cardiovascular events, compared to no screening and no treatment, using decision-analytic modelling. A probabilistic Markov model was developed to evaluate the lifetime cost-effectiveness of the strategy of selective PAD screening and consequent preventive treatment compared to no screening and no preventive treatment. The analysis was conducted from the Dutch societal perspective, and to address decision uncertainty, probabilistic sensitivity analysis was performed. Results were based on average values of 1000 Monte Carlo simulations, using discount rates of 1.5% and 4% for effects and costs, respectively. One-way sensitivity analyses were performed to identify the two most influential model parameters affecting model outputs; a two-way sensitivity analysis was then conducted for combinations of values tested for these two parameters. For the PAD screening strategy, life years and quality-adjusted life years gained were 21.79 and 15.66 respectively, at a lifetime cost of 26,548 Euros. Compared to no screening and treatment (20.69 life years, 15.58 QALYs, 28,052 Euros), these results indicate that PAD screening and treatment is a dominant strategy. The cost-effectiveness acceptability curves show an 88% probability of PAD screening being cost effective at a willingness-to-pay (WTP) threshold of 40,000 Euros. In a scenario analysis using clopidogrel as an alternative antiplatelet drug, the PAD screening strategy remained dominant. This decision analysis suggests that targeted ABI screening and consequent secondary prevention of cardiovascular events using low-dose aspirin or clopidogrel in the identified patients is a cost-effective strategy. Implementation of targeted PAD screening and subsequent treatment in primary care practices and in public health programs is likely to improve societal health and to save health care costs by reducing catastrophic cardiovascular events.
Grau, Santiago; Lozano, Virginia; Valladares, Amparo; Cavanillas, Rafael; Xie, Yang; Nocea, Gonzalo
2014-01-01
Background: Clinical efficacy of antibiotics may be affected by changes in the susceptibility of microorganisms to antimicrobial agents. The purpose of this study is to assess how these changes could affect the initial efficacy of ertapenem and ceftriaxone in the treatment of community-acquired pneumonia (CAP) in elderly patients, and the potential consequences this may have for health care costs. Methods: Initial efficacy in the elderly was obtained from a combined analysis of two multicenter, randomized studies. An alternative scenario was carried out using initial efficacy data according to the pneumonia severity index (PSI). Country-specific pathogen distributions were obtained from a national epidemiological study, and microbiological susceptibilities to first- and second-line therapies were obtained from Spanish or European surveillance studies. A decision-analytic model was used to compare ertapenem versus ceftriaxone for CAP inpatient treatment. Inputs of the model were the expected effectiveness previously estimated and resource use, considering a Spanish national health system perspective. Outcomes include the difference in the proportion of successfully treated patients and the difference in total costs between ertapenem and ceftriaxone. The model performed one-way and probabilistic sensitivity analyses. Results: First-line treatment of CAP with ertapenem led to a higher proportion of successfully treated patients compared with ceftriaxone in Spain. One-way sensitivity analysis showed that length of stay was the key parameter of the model. Probabilistic sensitivity analysis showed that ertapenem can be a cost-saving strategy compared with ceftriaxone, with a 59% probability of being dominant (lower costs with additional health benefits) for both elderly patients (>65 years) and patients with PSI >3. Conclusion: The incorporation of current antimicrobial susceptibility into the initial clinical efficacy has a significant impact on outcomes and costs in CAP treatment. Treatment with ertapenem compared with ceftriaxone resulted in better clinical outcomes and lower treatment costs for two segments of the Spanish population: elderly patients and patients with severe pneumonia (PSI >3). PMID:24611019
Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.
Zhao, Yuchao; Frey, H Christopher
2004-11-01
Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
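The parametric-bootstrap step for an uncensored data set can be sketched as follows: fit a lognormal emission-factor distribution by MLE, resample from the fit, and refit to obtain an uncertainty interval on the mean emission factor. The data here are synthetic.

```python
# Parametric bootstrap for a lognormal emission factor (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
data = rng.lognormal(mean=-1.0, sigma=0.8, size=30)    # emission factors

shape, loc, scale = stats.lognorm.fit(data, floc=0)    # MLE, loc fixed at 0
boot_means = []
for _ in range(2000):
    resample = stats.lognorm.rvs(shape, loc=0, scale=scale,
                                 size=len(data), random_state=rng)
    s, _, sc = stats.lognorm.fit(resample, floc=0)
    boot_means.append(stats.lognorm.mean(s, loc=0, scale=sc))

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean emission factor, 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
```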
Malekpour, Shirin; Langeveld, Jeroen; Letema, Sammy; Clemens, François; van Lier, Jules B
2013-03-30
This paper introduces the probabilistic evaluation framework, to enable transparent and objective decision-making in technology selection for sanitation solutions in low-income countries. The probabilistic framework recognizes the often poor quality of the available data for evaluations. Within this framework, the evaluations will be done based on the probabilities that the expected outcomes occur in practice, considering the uncertainties in evaluation parameters. Consequently, the outcome of evaluations will not be single point estimates; but there exists a range of possible outcomes. A first trial application of this framework for evaluation of sanitation options in the Nyalenda settlement in Kisumu, Kenya, showed how the range of values that an evaluation parameter may obtain in practice would influence the evaluation outcomes. In addition, as the probabilistic evaluation requires various site-specific data, sensitivity analysis was performed to determine the influence of each data set quality on the evaluation outcomes. Based on that, data collection activities could be (re)directed, in a trade-off between the required investments in those activities and the resolution of the decisions that are to be made. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Boyce, Lola
1995-01-01
This report presents the results of both the fifth- and sixth-year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA). The research included ongoing development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for five variables, namely high temperature, high-cycle and low-cycle mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using an updated version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of high-cycle mechanical fatigue, creep, and thermal fatigue was performed. Then, using the current version, PROMISS94, a second sensitivity study was performed that added the effect of low-cycle mechanical fatigue to the three previous effects. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to the current value of each effect. In addition, verification studies were conducted comparing model predictions for the combined effects of high-cycle mechanical fatigue and high temperature with experiment. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated; results from this limited verification study strongly supported the assumption.
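Multifactor strength-degradation equations of this kind are commonly written as a product of normalized effect terms; the sketch below uses that general form with illustrative constants, not the calibrated Inconel 718 values, and should be read as an assumption about the model family rather than the PROMISS implementation.

```python
# Hedged sketch of a postulated multifactor degradation equation:
#   S/S0 = prod_i [(A_iF - A_i) / (A_iF - A_i0)]**a_i
# where A_i is the current value of effect i, A_i0 a reference value,
# A_iF the value at which strength vanishes, and a_i an empirical exponent.
import numpy as np

def lifetime_strength(S0, effects):
    ratio = np.prod([((aF - a) / (aF - a0)) ** exponent
                     for a, a0, aF, exponent in effects])
    return S0 * ratio

effects = [
    # (current, reference, final, exponent) -- illustrative values
    (900.0, 20.0, 1300.0, 0.5),   # temperature, deg C
    (1e5, 1.0, 1e7, 0.1),         # high-cycle fatigue, cycles
]
print(f"degraded strength ~ {lifetime_strength(1000.0, effects):.0f} MPa")
```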
Westerhout, K Y; Verheggen, B G; Schreder, C H; Augustin, M
2012-01-01
An economic evaluation was conducted to assess the outcomes and costs as well as cost-effectiveness of the following grass-pollen immunotherapies: OA (Oralair; Stallergenes S.A., Antony, France) vs GRZ (Grazax; ALK-Abelló, Hørsholm, Denmark), and ALD (Alk Depot SQ; ALK-Abelló) (immunotherapy agents alongside symptomatic medication) and symptomatic treatment alone for grass pollen allergic rhinoconjunctivitis. The costs and outcomes of 3-year treatment were assessed for a period of 9 years using a Markov model. Treatment efficacy was estimated using an indirect comparison of available clinical trials with placebo as a common comparator. Estimates for immunotherapy discontinuation, occurrence of asthma, health state utilities, drug costs, resource use, and healthcare costs were derived from published sources. The analysis was conducted from the insurant's perspective including public and private health insurance payments and co-payments by insurants. Outcomes were reported as quality-adjusted life years (QALYs) and symptom-free days. The uncertainty around incremental model results was tested by means of extensive deterministic univariate and probabilistic multivariate sensitivity analyses. In the base case analysis the model predicted a cost-utility ratio of OA vs symptomatic treatment of €14,728 per QALY; incremental costs were €1356 (95%CI: €1230; €1484) and incremental QALYs 0.092 (95%CI: 0.052; 0.140). OA was the dominant strategy compared to GRZ and ALD, with estimated incremental costs of -€1142 (95%CI: -€1255; -€1038) and -€54 (95%CI: -€188; €85) and incremental QALYs of 0.015 (95%CI: -0.025; 0.056) and 0.027 (95%CI: -0.022; 0.075), respectively. At a willingness-to-pay threshold of €20,000, the probability of OA being the most cost-effective treatment was predicted to be 79%. Univariate sensitivity analyses show that incremental outcomes were moderately sensitive to changes in efficacy estimates. The main study limitation was the requirement of an indirect comparison involving several steps to assess relative treatment effects. The analysis suggests OA to be cost-effective compared to GRZ and ALD, and a symptomatic treatment. Sensitivity analyses showed that uncertainty surrounding treatment efficacy estimates affected the model outcomes.
Probabilistic simulation of uncertainties in composite uniaxial strengths
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Stock, T. A.
1990-01-01
Probabilistic composite micromechanics methods are developed that simulate uncertainties in unidirectional fiber composite strengths. These methods take the form of computational procedures combining composite mechanics with Monte Carlo simulation. The variables for which uncertainties are accounted include constituent strengths and their respective scatter. A graphite/epoxy unidirectional composite (ply) is studied to illustrate the procedure and its effectiveness in formally estimating the probable scatter in the composite uniaxial strengths. The results show that ply longitudinal tensile and compressive, transverse compressive, and intralaminar shear strengths are not sensitive to single-fiber anomalies (breaks, interfacial disbonds, matrix microcracks); however, the ply transverse tensile strength is.
Multi-disciplinary coupling effects for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions which govern the accurate response of propulsion systems. Results are presented for propulsion system responses including multi-disciplinary coupling effects using coupled multi-discipline thermal, structural, and acoustic tailoring; an integrated system of multi-disciplinary simulators; coupled material behavior/fabrication process tailoring; sensitivities using a probabilistic simulator; and coupled materials, structures, fracture, and probabilistic behavior simulator. The results demonstrate that superior designs can be achieved if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated coupled multi-discipline numerical propulsion system simulator.
Multi-disciplinary coupling for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions for determining the true response of propulsion systems. Results are presented for propulsion system responses including multi-discipline coupling effects via (1) coupled multi-discipline tailoring, (2) an integrated system of multidisciplinary simulators, (3) coupled material-behavior/fabrication-process tailoring, (4) sensitivities using a probabilistic simulator, and (5) coupled materials/structures/fracture/probabilistic behavior simulator. The results show that the best designs can be determined if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated interactive multi-discipline numerical propulsion system simulator.
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Stutts, Richard; Huang, Zhaofeng
2015-01-01
The objective of this presentation is to discuss the PRA process and the reliability engineering discipline, their differences and similarities, and how they are used as complementary analyses to support design and flight decisions.
Economic Evaluation of Telemedicine for Patients in ICUs.
Yoo, Byung-Kwang; Kim, Minchul; Sasaki, Tomoko; Melnikow, Joy; Marcin, James P
2016-02-01
Despite telemedicine's potential to improve patients' health outcomes and reduce costs in the ICU, hospitals have been slow to introduce telemedicine in the ICU due to high up-front costs and mixed evidence on effectiveness. This study's first aim was to conduct a cost-effectiveness analysis to estimate the incremental cost-effectiveness ratio of telemedicine in the ICU, compared with ICU without telemedicine, from the healthcare system perspective. The second aim was to examine potential cost saving of telemedicine in the ICU through probabilistic analyses and break-even analyses. Simulation analyses were performed using standard decision models in a hypothetical ICU with hypothetical adult ICU patients, both defined from the U.S. literature. The intervention was the introduction of telemedicine in the ICU, which was assumed to affect per-patient per-hospital-stay ICU cost and hospital mortality. Telemedicine in the ICU operation costs included the telemedicine equipment-installation (start-up) costs with 5-year depreciation, maintenance costs, and clinician staffing costs. Telemedicine in the ICU effectiveness was measured by cumulative quality-adjusted life years for 5 years after ICU discharge. The base case cost-effectiveness analysis estimated telemedicine in the ICU to extend 0.011 quality-adjusted life years with an incremental cost of $516 per patient compared with ICU without telemedicine, resulting in an incremental cost-effectiveness ratio of $45,320 per additional quality-adjusted life year (= $516/0.011). The probabilistic cost-effectiveness analysis estimated an incremental cost-effectiveness ratio of $50,265 with a wide 95% CI from a negative value (suggesting cost savings) to $375,870. These probabilistic analyses projected that cost saving is achieved in 37% of 1,000 iterations. Cost saving is also feasible if the per-patient per-hospital-stay operational cost and physician cost were less than $422 and less than $155, respectively, based on break-even analyses. Our analyses suggest that telemedicine in the ICU is cost-effective in most cases and cost saving in some cases. The thresholds of cost and effectiveness, estimated by break-even analyses, help hospitals determine the impact of telemedicine in the ICU and potential cost saving.
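As a minimal sketch of the probabilistic logic described above, the snippet below draws hypothetical incremental costs and QALYs (the distributions are assumptions, loosely tuned so that roughly 37% of iterations show cost saving) and evaluates cost-effectiveness through the net-benefit criterion rather than the unstable ICER ratio.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000  # iterations, as in the study

# Hypothetical input distributions; the cost SD is tuned so that roughly
# 37% of draws are cost saving, as the study reports.
d_cost = rng.normal(516.0, 1550.0, n)    # incremental cost per patient ($)
d_qaly = rng.normal(0.011, 0.004, n)     # incremental QALYs per patient

p_saving = (d_cost < 0).mean()                     # fraction of cost-saving draws
wtp = 100_000.0                                    # $/QALY threshold (assumed)
p_cost_effective = (d_cost < wtp * d_qaly).mean()  # net-benefit criterion avoids
                                                   # dividing by near-zero QALYs
print(f"P(cost saving) = {p_saving:.0%}; "
      f"P(cost-effective at ${wtp:,.0f}/QALY) = {p_cost_effective:.0%}")
```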
Tu, Hong Anh; Palimaka, Stefan; Sehatzadeh, Shayan; Blackhouse, Gord; Yap, Belinda; Tsoi, Bernice; Bowen, Jim; O'Reilly, Daria; Holubowich, Corinne; Kaulback, Kellee; Campbell, Kaitryn
2016-01-01
Background Major depressive disorder (MDD; lifetime prevalence of approximately 10%) is common and costly to the health system. Unfortunately, many MDD cases are resistant to treatment with antidepressant drugs and require other treatment to reduce or eliminate depression. Electroconvulsive therapy (ECT) has long been used to treat persons with treatment-resistant depression (TRD). Despite its effectiveness, ECT has side effects that lead some patients to discontinue treatment or to refuse it altogether. Repetitive transcranial magnetic stimulation (rTMS) has been developed to treat TRD; it has fewer side effects than ECT and might be an alternative for patients who are ineligible for or unwilling to undergo ECT. Objectives This analysis evaluates the cost-effectiveness of rTMS for patients with TRD compared with ECT or sham rTMS and estimates the potential budgetary impact of various levels of implementation of rTMS in Ontario. Review Methods A cost-utility analysis compared the costs and health outcomes of two treatments for persons with TRD in Ontario: rTMS alone compared with ECT alone and rTMS alone compared with sham rTMS. We calculated the six-month incremental costs and quality-adjusted life-years (QALYs) for these treatments. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model's results. A 1-year budget impact analysis estimated the costs of providing funding for rTMS. The base-case analysis examined the additional costs for funding six centres where rTMS infrastructure is in place. Sensitivity and scenario analyses explored the impact of increasing diffusion of rTMS to centres with existing ECT infrastructure. All analyses were conducted from the Ontario health care payer perspective. Results ECT was cost-effective compared to rTMS when the willingness to pay was greater than $37,640.66 per QALY. In the base-case analysis, which had a six-month time horizon, the cost and effectiveness for rTMS were $5,272 and 0.31 QALYs. The cost and effectiveness for ECT were $5,960 and 0.32 QALYs. This translates into an incremental cost-effectiveness ratio (ICER) of $37,640.66 per QALY gained for ECT compared to rTMS. When rTMS was compared with sham rTMS, an additional $2,154.33 would be spent to gain 0.02 QALYs, which translates into an ICER of $98,242.37 per QALY gained. Probabilistic sensitivity analysis showed that the probability of rTMS being cost-effective compared to sham rTMS was 2% and 45% at the thresholds of $50,000 and $100,000 per QALY gained, respectively. Conclusions Repetitive transcranial magnetic stimulation may be cost-effective compared to sham treatment in patients with treatment-resistant depression, depending on the willingness-to-pay threshold. PMID:27110317
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hayeon, E-mail: kimh2@upmc.edu; Gill, Beant; Beriwal, Sushil
Purpose: To conduct a cost-effectiveness analysis to determine whether stereotactic body radiation therapy (SBRT) is a cost-effective therapy compared with radiofrequency ablation (RFA) for patients with unresectable colorectal cancer (CRC) liver metastases. Methods and Materials: A cost-effectiveness analysis was conducted using a Markov model and 1-month cycle over a lifetime horizon. Transition probabilities, quality of life utilities, and costs associated with SBRT and RFA were captured in the model on the basis of a comprehensive literature review and Medicare reimbursements in 2014. Strategies were compared using the incremental cost-effectiveness ratio, with effectiveness measured in quality-adjusted life years (QALYs). To account for model uncertainty, 1-way and probabilistic sensitivity analyses were performed. Strategies were evaluated with a willingness-to-pay threshold of $100,000 per QALY gained. Results: In base case analysis, treatment costs for 3 fractions of SBRT and 1 RFA procedure were $13,000 and $4397, respectively. Median survival was assumed the same for both strategies (25 months). SBRT cost $8202 more than RFA while gaining 0.05 QALYs, resulting in an incremental cost-effectiveness ratio of $164,660 per QALY gained. In 1-way sensitivity analyses, results were most sensitive to variation of median survival from both treatments. Stereotactic body radiation therapy was economically reasonable if better survival was presumed (>1 month gain) or if used for large tumors (>4 cm). Conclusions: If equal survival is assumed, SBRT is not cost-effective compared with RFA for inoperable colorectal liver metastases. However, if better local control leads to small survival gains with SBRT, this strategy becomes cost-effective. Ideally, these results should be confirmed with prospective comparative data.
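A minimal sketch of the Markov machinery such models use, assuming a hypothetical three-state structure (stable, progressed, dead) with made-up monthly transition probabilities, utilities, and costs; it is not the authors' SBRT/RFA model, but it shows how 1-month cycles accumulate discounted QALYs and costs over a long horizon.

```python
import numpy as np

# Hypothetical 3-state monthly Markov model: 0=stable, 1=progressed, 2=dead.
P = np.array([[0.96, 0.03, 0.01],
              [0.00, 0.93, 0.07],
              [0.00, 0.00, 1.00]])       # transition matrix; rows sum to 1

utilities = np.array([0.80, 0.60, 0.0])  # annual QALY weight per state (assumed)
monthly_cost = np.array([300.0, 900.0, 0.0])  # cost per cycle per state (assumed)
disc_month = (1 + 0.03) ** (1 / 12) - 1  # 3% annual discount rate, monthly

state = np.array([1.0, 0.0, 0.0])        # whole cohort starts in 'stable'
qalys = costs = 0.0
for t in range(12 * 40):                 # 40-year (~lifetime) horizon
    df = 1.0 / (1 + disc_month) ** t     # discount factor for this cycle
    qalys += df * (state @ utilities) / 12.0  # annual weight -> monthly slice
    costs += df * (state @ monthly_cost)
    state = state @ P                    # advance the cohort one cycle

print(f"discounted QALYs = {qalys:.2f}, discounted cost = ${costs:,.0f}")
```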
Pang, Y-K; Ip, M; You, J H S
2017-01-01
Early initiation of antifungal treatment for invasive candidiasis is associated with reduced mortality. Beta-D-glucan (BDG) is a fungal cell wall component and a serum diagnostic biomarker of fungal infection. Clinical findings suggested an association between reduced invasive candidiasis incidence in intensive care units (ICUs) and BDG-guided preemptive antifungal therapy. We evaluated the potential cost-effectiveness of active BDG surveillance with preemptive antifungal therapy in patients admitted to adult ICUs from the perspective of Hong Kong healthcare providers. A Markov model was designed to simulate the outcomes of active BDG surveillance with preemptive therapy (surveillance group) and no surveillance (standard care group). Candidiasis-associated outcome measures included mortality rate, quality-adjusted life year (QALY) loss, and direct medical cost. Model inputs were derived from the literature. Sensitivity analyses were conducted to evaluate the robustness of model results. In base-case analysis, the surveillance group was more costly (1387 USD versus 664 USD) (1 USD = 7.8 HKD), with a lower candidiasis-associated mortality rate (0.653 versus 1.426 per 100 ICU admissions) and QALY loss (0.116 versus 0.254) than the standard care group. The incremental cost per QALY saved by the surveillance group was 5239 USD/QALY. One-way sensitivity analyses found base-case results to be robust to variations of all model inputs. In probabilistic sensitivity analysis, the surveillance group was cost-effective in 50% and 100% of 10,000 Monte Carlo simulations at willingness-to-pay (WTP) thresholds of 7200 USD/QALY and ≥27,800 USD/QALY, respectively. Active BDG surveillance with preemptive therapy appears to be highly cost-effective to reduce the candidiasis-associated mortality rate and save QALYs in the ICU setting.
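The willingness-to-pay results above are what a cost-effectiveness acceptability curve (CEAC) summarizes. Below is a sketch under assumed gamma distributions for the incremental cost and QALYs saved, scaled only so the mean lands near the reported 5239 USD/QALY base case; it is not the published model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # Monte Carlo simulations, as in the study

# Hypothetical PSA draws: incremental cost (USD) and incremental QALYs saved,
# scaled so the mean ICER lands near the reported 5239 USD/QALY.
d_cost = rng.gamma(shape=16.0, scale=45.0, size=n)     # mean ~720 USD
d_qaly = rng.gamma(shape=16.0, scale=0.0086, size=n)   # mean ~0.138 QALYs

# CEAC: probability of cost-effectiveness across a grid of WTP thresholds
wtp_grid = np.linspace(0, 30_000, 61)
ceac = [(d_cost < wtp * d_qaly).mean() for wtp in wtp_grid]

for wtp, p in zip(wtp_grid[::12], ceac[::12]):
    print(f"WTP {wtp:>8,.0f} USD/QALY -> P(cost-effective) = {p:.2f}")
```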
Lee, Lawrence; Saleem, Abdulaziz; Landry, Tara; Latimer, Eric; Chaudhury, Prosanto; Feldman, Liane S
2014-01-01
Parastomal hernia (PSH) is common after stoma formation. Studies have reported that mesh prophylaxis reduces PSH, but there are no cost-effectiveness data. Our objective was to determine the cost effectiveness of mesh prophylaxis vs no prophylaxis to prevent PSH in patients undergoing abdominoperineal resection with permanent colostomy for rectal cancer. Using a cohort Markov model, we modeled the costs and effectiveness of mesh prophylaxis vs no prophylaxis at the index operation in a cohort of 60-year-old patients undergoing abdominoperineal resection for rectal cancer during a time horizon of 5 years. Costs were expressed in 2012 Canadian dollars (CAD$) and effectiveness in quality-adjusted life years. Deterministic and probabilistic sensitivity analyses were performed. In patients with stage I to III rectal cancer, prophylactic mesh was dominant (less costly and more effective) compared with no mesh. In patients with stage IV disease, mesh prophylaxis was associated with higher cost (CAD$495 more) and minimally increased effectiveness (0.05 additional quality-adjusted life years), resulting in an incremental cost-effectiveness ratio of CAD$10,818 per quality-adjusted life year. In sensitivity analyses, the decision was sensitive to the probability of mesh infection, the cost of the mesh, and the method of diagnosing PSH. In patients undergoing abdominoperineal resection with permanent colostomy for rectal cancer, mesh prophylaxis might be the less costly and more effective strategy compared with no mesh to prevent PSH in patients with stage I to III disease, and might be cost effective in patients with stage IV disease. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Wilson, Michele R; Bergman, Annika; Chevrou-Severac, Helene; Selby, Ross; Smyth, Michael; Kerrigan, Matthew C
2018-03-01
To examine the clinical and economic impact of vedolizumab compared with infliximab, adalimumab, and golimumab in the treatment of moderately to severely active ulcerative colitis (UC) in the United Kingdom (UK). A decision analytic model in Microsoft Excel was used to compare vedolizumab with other biologic treatments (infliximab, adalimumab, and golimumab) for the treatment of biologic-naïve patients with UC in the UK. Efficacy data were obtained from a network meta-analysis using placebo as the common comparator. Other inputs (e.g., unit costs, adverse-event disutilities, probability of surgery, mortality) were obtained from published literature. Costs were presented in 2012/2013 British pounds. Outcomes included quality-adjusted life-years (QALYs). Costs and outcomes were discounted by 3.5% per year. Incremental cost-effectiveness ratios were presented for vedolizumab compared with other biologics. Univariate and multivariate probabilistic sensitivity analyses were conducted to assess model robustness to parameter uncertainty. The model predicted that anti-tumour necrosis factor-naïve patients on vedolizumab would accrue more QALYs than patients on other biologics. The incremental results suggest that vedolizumab is a cost-effective treatment compared with adalimumab (incremental cost-effectiveness ratio of £22,735/QALY) and dominant compared with infliximab and golimumab. Sensitivity analyses suggest that results are most sensitive to treatment response and transition probabilities. However, vedolizumab is cost-effective irrespective of variation in any of the input parameters. Our model predicted that treatment with vedolizumab improves QALYs, increases time in remission and response, and is a cost-effective treatment option compared with all other biologics for biologic-naïve patients with moderately to severely active UC.
Wilson, Michele R; Azzabi Zouraq, Ismail; Chevrou-Severac, Helene; Selby, Ross; Kerrigan, Matthew C
2017-01-01
To examine the clinical and economic impact of vedolizumab compared with conventional therapy in the treatment of moderately-to-severely active ulcerative colitis (UC) in the UK based on results of the GEMINI I trial. A decision-analytic model in Microsoft Excel was used to compare vedolizumab with conventional therapy (aminosalicylates, corticosteroids, immunomodulators) for the treatment of patients with UC in the UK. We considered the following three populations: the overall intent-to-treat population from the GEMINI I trial, patients naïve to anti-TNF therapy, and those who had failed anti-TNF therapy. Population characteristics and efficacy data were obtained from the GEMINI I trial. Other inputs (eg, unit costs, probability of surgery, mortality) were obtained from published literature. The model used a lifetime horizon, with costs and outcomes discounted by 3.5% per year. One-way and probabilistic sensitivity analyses were conducted to measure the impact of parameter uncertainty. Vedolizumab had incremental cost-effectiveness ratios of £4,095/quality-adjusted life-year (QALY), £4,423/QALY, and £5,972/QALY compared with conventional therapy in the intent-to-treat, anti-TNF-naïve, and anti-TNF-failure populations, respectively. Patients on vedolizumab accrued more QALYs while incurring more costs than patients on conventional therapy. The sensitivity analyses showed that the results were most sensitive to induction response and transition probabilities for each treatment. The results suggest that vedolizumab results in more QALYs and may be a cost-effective treatment option compared with conventional therapy for both anti-TNF-naïve and anti-TNF-failure patients with moderately-to-severely active UC.
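Probabilistic sensitivity analyses of the kind these models describe typically draw probabilities from beta distributions and costs from gamma distributions. The sketch below shows that standard parameterization with hypothetical inputs; the event counts and cost moments are assumptions, not the published model's values.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000

# Beta for probabilities: parameterized from r events in m trials (hypothetical)
p_response = rng.beta(47 + 1, 225 - 47 + 1, n)   # e.g. 47/225 induction responders

# Gamma for right-skewed costs: match a mean and standard error (hypothetical)
mean_c, se_c = 12_000.0, 1500.0
shape = (mean_c / se_c) ** 2      # shape*scale = mean, shape*scale^2 = se^2
scale = se_c ** 2 / mean_c
annual_cost = rng.gamma(shape, scale, n)

# Each PSA iteration would feed one (p_response, annual_cost, ...) draw
# through the full decision model; here we just summarize the draws.
lo, hi = np.quantile(p_response, [0.025, 0.975])
print(f"p_response: mean={p_response.mean():.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
print(f"annual cost: mean={annual_cost.mean():,.0f}")
```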
Zimovetz, Evelina A; Beard, Stephen M; Hodgkins, Paul; Bischof, Matthias; Mauskopf, Josephine A; Setyawan, Juliana
2016-10-01
An economic analysis from the perspective of the UK National Health Service (NHS) evaluated the cost effectiveness of lisdexamfetamine dimesylate (LDX) compared with atomoxetine in children and adolescents with attention-deficit/hyperactivity disorder who have had an inadequate response to methylphenidate. A 1-year decision-analytic model was constructed, with the health outcomes "response", "nonresponse", and "unable to tolerate". Clinical data were taken from a head-to-head, randomized controlled trial in inadequate responders to methylphenidate. Response to treatment was defined as a score of 1 (very much improved) or 2 (much improved) on the Clinical Global Impression-Improvement subscale. Tolerability was assessed by discontinuation rates owing to adverse events. Utility weights were identified via a systematic literature review. Healthcare resource use estimates were obtained via a survey of clinicians. Daily drug costs were derived from British National Formulary 2012 costs and mean doses reported in the trial. One-way and probabilistic sensitivity analyses (PSAs) were performed. The comparison of LDX with atomoxetine yielded an estimated incremental cost-effectiveness ratio of £1802 per quality-adjusted life-year (QALY). The result was robust in a wide range of sensitivity analyses; results were most sensitive to changes in drug costs and efficacy. In the PSA, assuming a maximum willingness to pay of £20,000 per QALY, LDX versus atomoxetine had an 86% probability of being cost effective. In 38% of PSA runs, LDX was more effective and less costly than atomoxetine. From the perspective of the UK NHS, LDX provides a cost-effective treatment option for children and adolescents who are inadequate responders to methylphenidate.
Suh, Hae Sun; Song, Hyun Jin; Jang, Eun Jin; Kim, Jung-Sun; Choi, Donghoon; Lee, Sang Moo
2013-07-01
The goal of this study was to perform an economic analysis of primary stenting with drug-eluting stents (DES) compared with bare-metal stents (BMS) in patients with acute myocardial infarction (AMI) admitted through an emergency room (ER) visit in Korea, using population-based data. We employed a cost-minimization method using a decision analytic model with a two-year time period. Model probabilities and costs were obtained from a published systematic review and population-based data, for which a retrospective database analysis of the national reimbursement database of Health Insurance Review and Assessment covering 2006 through 2010 was performed. Uncertainty was evaluated using one-way sensitivity analyses and probabilistic sensitivity analyses. Among 513 979 cases with AMI during 2007 and 2008, 24 742 cases underwent stenting procedures, and 20 320 patients admitted through an ER visit with primary stenting were identified in the base model. The transition probabilities of DES-to-DES, DES-to-BMS, DES-to-coronary artery bypass graft, and DES-to-balloon were 59.7%, 0.6%, 4.3%, and 35.3%, respectively, among these patients. The average two-year costs of DES and BMS in 2011 Korean won were 11 065 528 won/person and 9 647 647 won/person, respectively. DES resulted in higher costs than BMS by 1 417 882 won/person. The model was highly sensitive to the probability and costs of having no revascularization. Primary stenting with BMS for AMI with an ER visit was shown to be a cost-saving procedure compared with DES in Korea. Caution is needed when applying this finding to patients with a higher level of severity in health status.
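A quick check of the cost-minimization arithmetic reported above, plus an illustration of how the reported transition probabilities would weight follow-up procedure costs; the per-procedure unit costs in the second part are hypothetical.

```python
# Cost-minimization arithmetic from the abstract (2011 Korean won per person).
cost_des = 11_065_528
cost_bms = 9_647_647
# Prints 1,417,881; the abstract reports 1,417,882, presumably from
# unrounded per-person averages.
print(f"incremental cost of DES vs BMS: {cost_des - cost_bms:,} won/person")

# Expected cost of one follow-up revascularization after initial DES, using
# the reported transition probabilities and *hypothetical* unit costs.
p = {"DES": 0.597, "BMS": 0.006, "CABG": 0.043, "balloon": 0.353}
unit_cost = {"DES": 3_000_000, "BMS": 2_000_000,
             "CABG": 12_000_000, "balloon": 1_500_000}  # assumed, won
expected = sum(p[k] * unit_cost[k] for k in p)
print(f"expected cost per repeat revascularization: {expected:,.0f} won")
```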
Different Imaging Strategies in Patients With Possible Basilar Artery Occlusion
Beyer, Sebastian E.; Hunink, Myriam G.; Schöberl, Florian; von Baumgarten, Louisa; Petersen, Steffen E.; Dichgans, Martin; Janssen, Hendrik; Ertl-Wagner, Birgit; Reiser, Maximilian F.
2015-01-01
Background and Purpose— This study evaluated the cost-effectiveness of different noninvasive imaging strategies in patients with possible basilar artery occlusion. Methods— A Markov decision analytic model was used to evaluate long-term outcomes resulting from strategies using computed tomographic angiography (CTA), magnetic resonance imaging, nonenhanced CT, or duplex ultrasound with intravenous (IV) thrombolysis being administered after positive findings. The analysis was performed from the societal perspective based on US recommendations. Input parameters were derived from the literature. Costs were obtained from United States costing sources and published literature. Outcomes were lifetime costs, quality-adjusted life-years (QALYs), incremental cost-effectiveness ratios, and net monetary benefits, with a willingness-to-pay threshold of $80 000 per QALY. The strategy with the highest net monetary benefit was considered the most cost-effective. Extensive deterministic and probabilistic sensitivity analyses were performed to explore the effect of varying parameter values. Results— In the reference case analysis, CTA dominated all other imaging strategies. CTA yielded 0.02 QALYs more than magnetic resonance imaging and 0.04 QALYs more than duplex ultrasound followed by CTA. At a willingness-to-pay threshold of $80 000 per QALY, CTA yielded the highest net monetary benefits. The probability that CTA is cost-effective was 96% at a willingness-to-pay threshold of $80 000/QALY. Sensitivity analyses showed that duplex ultrasound was cost-effective only for a prior probability of ≤0.02 and that these results were only minimally influenced by duplex ultrasound sensitivity and specificity. Nonenhanced CT and magnetic resonance imaging never became the most cost-effective strategy. Conclusions— Our results suggest that CTA in patients with possible basilar artery occlusion is cost-effective. PMID:26022634
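The decision rule in this analysis, picking the strategy with the highest net monetary benefit (NMB = WTP x QALYs - cost), is easy to make concrete. In the sketch below the QALY gaps match those reported above, while the lifetime costs are placeholders I made up for illustration.

```python
# Net monetary benefit ranking at the study's willingness-to-pay threshold.
wtp = 80_000.0  # $/QALY, as in the study

strategies = {
    # name:          (lifetime cost ($, hypothetical), QALYs)
    "CTA":           (90_000.0, 10.00),
    "MRI":           (91_000.0,  9.98),  # 0.02 QALYs fewer than CTA, as reported
    "DUS then CTA":  (89_500.0,  9.96),  # 0.04 QALYs fewer than CTA, as reported
}

nmb = {name: wtp * q - c for name, (c, q) in strategies.items()}
for name, value in sorted(nmb.items(), key=lambda kv: -kv[1]):
    print(f"{name:>13}: NMB = ${value:,.0f}")
print(f"most cost-effective: {max(nmb, key=nmb.get)}")
```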
A Robust Approach to Risk Assessment Based on Species Sensitivity Distributions.
Monti, Gianna S; Filzmoser, Peter; Deutsch, Roland C
2018-05-03
The guidelines for setting environmental quality standards are increasingly based on probabilistic risk assessment due to a growing general awareness of the need for probabilistic procedures. One of the commonly used tools in probabilistic risk assessment is the species sensitivity distribution (SSD), which represents the proportion of species affected belonging to a biological assemblage as a function of exposure to a specific toxicant. Our focus is on the inverse use of the SSD curve with the aim of estimating the concentration, HCp, of a toxic compound that is hazardous to p% of the biological community under study. Toward this end, we propose the use of robust statistical methods in order to take into account the presence of outliers or apparent skew in the data, which may occur without any ecological basis. A robust approach exploits the full neighborhood of a parametric model, enabling the analyst to account for the typical real-world deviations from ideal models. We examine two classic HCp estimation approaches and consider robust versions of these estimators. In addition, we also use data transformations in conjunction with robust estimation methods in case of heteroscedasticity. Different scenarios using real data sets as well as simulated data are presented in order to illustrate and compare the proposed approaches. These scenarios illustrate that the use of robust estimation methods enhances HCp estimation. © 2018 Society for Risk Analysis.
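As an illustration of the inverse use of an SSD, the sketch below estimates HC5 from hypothetical species endpoints, once with the classical mean/SD fit on log10 data and once with a simple robust (median/MAD) variant; this is a stand-in for, not a reproduction of, the estimators the paper studies.

```python
import numpy as np
from scipy import stats

# Hypothetical species-level toxicity endpoints (e.g., EC50s in mg/L),
# including one outlying, apparently insensitive species.
ec50 = np.array([0.8, 1.2, 1.9, 2.4, 3.1, 4.0, 5.5, 7.2, 9.8, 60.0])
x = np.log10(ec50)

p = 0.05  # protect 95% of species -> HC5

# Classical estimate: normal fit to the log10 data
hc5_ml = 10 ** (x.mean() + stats.norm.ppf(p) * x.std(ddof=1))

# Simple robust variant: median and MAD replace mean and SD,
# limiting the influence of the outlying species.
mad = stats.median_abs_deviation(x, scale="normal")
hc5_rob = 10 ** (np.median(x) + stats.norm.ppf(p) * mad)

print(f"HC5 (classical) = {hc5_ml:.2f} mg/L, HC5 (robust) = {hc5_rob:.2f} mg/L")
```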
LaWen T. Hollingsworth; Laurie L. Kurth; Bernard R. Parresol; Roger D. Ottmar; Susan J. Prichard
2012-01-01
Landscape-scale fire behavior analyses are important to inform decisions on resource management projects that meet land management objectives and protect values from adverse consequences of fire. Deterministic and probabilistic geospatial fire behavior analyses are conducted with various modeling systems including FARSITE, FlamMap, FSPro, and Large Fire Simulation...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha
Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.
Whittington, Melanie D; Atherly, Adam J; Curtis, Donna J; Lindrooth, Richard C; Bradley, Cathy J; Campbell, Jonathan D
2017-08-01
Patients in the ICU are at the greatest risk of contracting healthcare-associated infections like methicillin-resistant Staphylococcus aureus. This study calculates the cost-effectiveness of methicillin-resistant S aureus prevention strategies and recommends specific strategies based on screening test implementation. A cost-effectiveness analysis using a Markov model from the hospital perspective was conducted to determine if the implementation costs of methicillin-resistant S aureus prevention strategies are justified by associated reductions in methicillin-resistant S aureus infections and improvements in quality-adjusted life years. Univariate and probabilistic sensitivity analyses determined the influence of input variation on the cost-effectiveness. ICU. Hypothetical cohort of adults admitted to the ICU. Three prevention strategies were evaluated, including universal decolonization, targeted decolonization, and screening and isolation. Because prevention strategies have a screening component, the screening test in the model was varied to reflect commonly used screening test categories, including conventional culture, chromogenic agar, and polymerase chain reaction. Universal and targeted decolonization are less costly and more effective than screening and isolation. This is consistent for all screening tests. When compared with targeted decolonization, universal decolonization is cost-saving to cost-effective, with maximum cost savings occurring when a hospital uses more expensive screening tests like polymerase chain reaction. Results were robust to sensitivity analyses. As compared with screening and isolation, the current standard practice in ICUs, targeted decolonization, and universal decolonization are less costly and more effective. This supports updating the standard practice to a decolonization approach.
Joshi, Ashish V; Stephens, Jennifer M; Munro, Vicki; Mathew, Prasad; Botteman, Marc F
2006-01-01
To compare the cost-effectiveness of three treatment regimens using recombinant activated Factor VII (rFVIIa), NovoSeven, and activated prothrombin-complex concentrate (APCC), FEIBA VH, for home treatment of minor-to-moderate bleeds in hemophilia patients with inhibitors. A literature-based, decision-analytic model was developed to compare three treatment regimens. The regimens consisting of first-, second-, and third-line treatments were: rFVIIa-rFVIIa-rFVIIa; APCC-rFVIIa-rFVIIa; and APCC-APCC-rFVIIa. Patients not responding to first-line treatment were administered second-line treatment, and those failing second-line received third-line treatment. Using literature and expert opinion, the model structure and base-case inputs were adapted to the US from a previously published analysis. The percentage of evaluable bleeds controlled with rFVIIa and APCC were obtained from published literature. Drug costs (2005 US$) based on average wholesale price were included in the base-case model. Univariate and probabilistic sensitivity analyses (second-order Monte Carlo simulation) were conducted by varying the efficacy, re-bleeding rates, patient weight, and dosing to ascertain robustness of the model. In the base-case analysis, the average cost per resolved bleed using rFVIIa as first-, second-, and third-line treatment was $28 076. Using APCC as first-line and rFVIIa as second- and third-line treatment resulted in an average cost per resolved bleed of $30 883, whereas the regimen using APCC as first- and second-line, and rFVIIa as third-line treatment was the most expensive, with an average cost per resolved bleed of $32 150. Cost offsets occurred for the rFVIIa-only regimen through avoidance of second and third lines of treatment. In probabilistic sensitivity analyses, the rFVIIa-only strategy was the least expensive strategy more than 68% of the time. The management of minor-to-moderate bleeds extends beyond the initial line of treatment, and should include the economic impact of re-bleeding and failures over multiple lines of treatment. In the majority of cases, the rFVIIa-only regimen appears to be a less expensive treatment option in inhibitor patients with minor-to-moderate bleeds over three lines of treatment.
Necitumumab in Metastatic Squamous Cell Lung Cancer: Establishing a Value-Based Cost.
Goldstein, Daniel A; Chen, Qiushi; Ayer, Turgay; Howard, David H; Lipscomb, Joseph; Ramalingam, Suresh S; Khuri, Fadlo R; Flowers, Christopher R
2015-12-01
The SQUIRE trial demonstrated that adding necitumumab to chemotherapy for patients with metastatic squamous cell lung cancer (mSqCLC) increased median overall survival by 1.6 months (hazard ratio, 0.84). However, the costs and value associated with this intervention remains unclear. Value-based pricing links the price of a drug to the benefit that it provides and is a novel method to establish prices for new treatments. To evaluate the range of drug costs for which adding necitumumab to chemotherapy could be considered cost-effective. We developed a Markov model using data from multiple sources, including the SQUIRE trial, which compared standard chemotherapy with and without necitumumab as first-line treatment of mSqCLC, to evaluate the costs and patient life expectancies associated with each regimen. In the analysis, patients were modeled to receive gemcitabine and cisplatin for 6 cycles or gemcitabine, cisplatin, and necitumumab for 6 cycles followed by maintenance necitumumab. Our model's clinical inputs were the survival estimates and frequency of adverse events (AEs) described in the SQUIRE trial. Log-logistic models were fitted to the survival distributions in the SQUIRE trial. The cost inputs included drug costs, based on the Medicare average sale prices, and costs for drug administration and management of AEs, based on Medicare reimbursement rates (all in 2014 US dollars). We evaluated incremental cost-effectiveness ratios (ICERs) for the use of necitumumab across a range of values for its cost. Model robustness was assessed by probabilistic sensitivity analyses, based on 10 000 Monte Carlo simulations, sampling values from the distributions of all model parameters. In the base case analysis, the addition of necitumumab to the treatment regimen produced an incremental survival benefit of 0.15 life-years and 0.11 quality-adjusted life-years (QALYs). The probabilistic sensitivity analyses established that when necitumumab cost less than $563 and less than $1309 per cycle, there was 90% confidence that the ICER for adding necitumumab would be less than $100 000 per QALY and less than $200 000 per QALY, respectively. These findings provide a value-based range for the cost of necitumumab from $563 to $1309 per cycle. This study provides a framework for establishing value-based pricing for new oncology drugs entering the US marketplace.
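Because the ICER is linear in the drug price, a deterministic break-even price at a given willingness-to-pay threshold has a closed form. In the sketch below the non-drug incremental cost and cycle count are assumptions chosen so the $100,000/QALY case lands near the reported $563; the study's own figures come from a probabilistic analysis (90% confidence), which this deterministic solve does not reproduce.

```python
# Break-even drug price at a WTP threshold:
#   ICER(price) = (d_cost_other + n_cycles * price) / d_qaly = WTP
#   => price = (WTP * d_qaly - d_cost_other) / n_cycles
d_qaly = 0.11           # incremental QALYs (reported)
n_cycles = 8.0          # mean necitumumab cycles delivered (assumed)
d_cost_other = 6_500.0  # incremental non-drug costs, $ (assumed)

wtp = 100_000.0
price = (wtp * d_qaly - d_cost_other) / n_cycles
print(f"break-even price at ${wtp:,.0f}/QALY: ${price:,.0f} per cycle")
```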
Vaidya, Anil; Vaidya, Param; Both, Brigitte; Brew-Graves, Chris; Bulsara, Max; Vaidya, Jayant S
2017-08-17
The clinical effectiveness of targeted intraoperative radiotherapy (TARGIT-IORT) has been confirmed in the randomised TARGIT-A (targeted intraoperative radiotherapy-alone) trial to be similar to a several weeks' course of whole-breast external-beam radiation therapy (EBRT) in patients with early breast cancer. This study aims to determine the cost-effectiveness of TARGIT-IORT to inform policy decisions about its wider implementation. The TARGIT-A randomised clinical trial (ISRCTN34086741), which compared TARGIT with traditional EBRT, found similar breast cancer control, particularly when TARGIT was given simultaneously with lumpectomy. Cost-utility analysis using decision analytic modelling by a Markov model. A cost-effectiveness Markov model was developed using TreeAge Pro V.2015. The decision analytic model compared two strategies of radiotherapy for breast cancer in a hypothetical cohort of patients with early breast cancer based on the published health state transition probability data from the TARGIT-A trial. Analysis was performed for the UK setting and the National Health Service (NHS) healthcare payer's perspective using NHS cost data, and treatment outcomes were simulated for both strategies for a time horizon of 10 years. Model health state utilities were drawn from the published literature. Future costs and effects were discounted at the rate of 3.5%. To address uncertainty, one-way and probabilistic sensitivity analyses were performed. Quality-adjusted life-years (QALYs). In the base case analysis, TARGIT-IORT was a highly cost-effective strategy, yielding health gain at a lower cost than its comparator EBRT. Discounted TARGIT-IORT and EBRT costs for the time horizon of 10 years were £12 455 and £13 280, respectively. TARGIT-IORT gained 0.18 incremental QALYs, as the discounted QALYs gained by TARGIT-IORT were 8.15 and by EBRT 7.97, showing TARGIT-IORT to be a dominant strategy over EBRT. Model outputs were robust to one-way and probabilistic sensitivity analyses. TARGIT-IORT is a dominant strategy over EBRT, being less costly and producing higher QALY gain. ISRCTN34086741; post results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
A Cost-Minimization Analysis of Tissue-Engineered Constructs for Corneal Endothelial Transplantation
Tan, Tien-En; Peh, Gary S. L.; George, Benjamin L.; Cajucom-Uy, Howard Y.; Dong, Di; Finkelstein, Eric A.; Mehta, Jodhbir S.
2014-01-01
Corneal endothelial transplantation or endothelial keratoplasty has become the preferred choice of transplantation for patients with corneal blindness due to endothelial dysfunction. Currently, there is a worldwide shortage of transplantable tissue, and demand is expected to increase further with aging populations. Tissue-engineered alternatives are being developed, and are likely to be available soon. However, the cost of these constructs may impair their widespread use. A cost-minimization analysis comparing tissue-engineered constructs to donor tissue procured from eye banks for endothelial keratoplasty was performed. Both initial investment costs and recurring costs were considered in the analysis to arrive at a final tissue cost per transplant. The clinical outcomes of endothelial keratoplasty with tissue-engineered constructs and with donor tissue procured from eye banks were assumed to be equivalent. One-way and probabilistic sensitivity analyses were performed to simulate various possible scenarios, and to determine the robustness of the results. A tissue engineering strategy was cheaper in both investment cost and recurring cost. Tissue-engineered constructs for endothelial keratoplasty could be produced at a cost of US$880 per transplant. In contrast, utilizing donor tissue procured from eye banks for endothelial keratoplasty required US$3,710 per transplant. Sensitivity analyses performed further support the results of this cost-minimization analysis across a wide range of possible scenarios. The use of tissue-engineered constructs for endothelial keratoplasty could potentially increase the supply of transplantable tissue and bring the costs of corneal endothelial transplantation down, making this intervention accessible to a larger group of patients. Tissue-engineering strategies for corneal epithelial constructs or other tissue types, such as pancreatic islet cells, should also be subject to similar pharmacoeconomic analyses. PMID:24949869
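A cost-minimization comparison of this kind reduces to amortizing the up-front investment over the expected number of transplants and adding recurring costs. The decomposition below is hypothetical, chosen only to reproduce the headline US$880 figure; the study reports the end results, not these inputs.

```python
# Cost per transplant = amortized investment + recurring cost per graft.
investment = 600_000.0      # facility + equipment, US$ (assumed)
useful_life_years = 10      # depreciation period (assumed)
transplants_per_year = 100  # throughput (assumed)
recurring_per_graft = 280.0 # media, reagents, labor per construct (assumed)

amortized = investment / (useful_life_years * transplants_per_year)
cost_per_transplant = amortized + recurring_per_graft
print(f"cost per transplant = US${cost_per_transplant:,.0f}")  # US$880 here
```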
Goldstein, Daniel A.; Chen, Qiushi; Ayer, Turgay; Howard, David H.; Lipscomb, Joseph; El-Rayes, Bassel F.; Flowers, Christopher R.
2015-01-01
Purpose The addition of bevacizumab to fluorouracil-based chemotherapy is a standard of care for previously untreated metastatic colorectal cancer. Continuation of bevacizumab beyond progression is an accepted standard of care based on a 1.4-month increase in median overall survival observed in a randomized trial. No United States–based cost-effectiveness modeling analyses are currently available addressing the use of bevacizumab in metastatic colorectal cancer. Our objective was to determine the cost effectiveness of bevacizumab in the first-line setting and when continued beyond progression from the perspective of US payers. Methods We developed two Markov models to compare the cost and effectiveness of fluorouracil, leucovorin, and oxaliplatin with or without bevacizumab in the first-line treatment and subsequent fluorouracil, leucovorin, and irinotecan with or without bevacizumab in the second-line treatment of metastatic colorectal cancer. Model robustness was addressed by univariable and probabilistic sensitivity analyses. Health outcomes were measured in life-years and quality-adjusted life-years (QALYs). Results Using bevacizumab in first-line therapy provided an additional 0.10 QALYs (0.14 life-years) at a cost of $59,361. The incremental cost-effectiveness ratio was $571,240 per QALY. Continuing bevacizumab beyond progression provided an additional 0.11 QALYs (0.16 life-years) at a cost of $39,209. The incremental cost-effectiveness ratio was $364,083 per QALY. In univariable sensitivity analyses, the variables with the greatest influence on the incremental cost-effectiveness ratio were bevacizumab cost, overall survival, and utility. Conclusion Bevacizumab provides minimal incremental benefit at high incremental cost per QALY in both the first- and second-line settings of metastatic colorectal cancer treatment. PMID:25691669
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebinger, M.H.; Beckman, R.J.; Myers, O.B.
1996-09-01
The purpose of this study was to evaluate the immediate and long-term consequences of depleted uranium (DU) in the environment at Aberdeen Proving Ground (APG) and Yuma Proving Ground (YPG) for the Test and Evaluation Command (TECOM) of the US Army. Specifically, we examined the potential for adverse radiological and toxicological effects to humans and ecosystems caused by exposure to DU at both installations. We developed contaminant transport models of aquatic and terrestrial ecosystems at APG and terrestrial ecosystems at YPG to assess potential adverse effects from DU exposure. Sensitivity and uncertainty analyses of the initial models showed the portions of the models that most influenced predicted DU concentrations, and the results of the sensitivity analyses were fundamental tools in designing field sampling campaigns at both installations. Results of uranium (U) isotope analyses of field samples provided data to evaluate the source of U in the environment and the toxicological and radiological doses to different ecosystem components and to humans. Probabilistic doses were estimated from the field data, and DU was identified in several components of the food chain at APG and YPG. Dose estimates from APG data indicated that U or DU uptake was insufficient to cause adverse toxicological or radiological effects. Dose estimates from YPG data indicated that U or DU uptake is insufficient to cause radiological effects in ecosystem components or in humans, but toxicological effects in small mammals (e.g., kangaroo rats and pocket mice) may occur from U or DU ingestion. The results of this study were used to modify environmental radiation monitoring plans at APG and YPG to ensure collection of adequate data for ongoing ecological and human health risk assessments.
Bamrungsawad, Naruemon; Chaiyakunapruk, Nathorn; Upakdee, Nilawan; Pratoomsoot, Chayanin; Sruamsiri, Rosarin; Dilokthornsakul, Piyameth
2015-05-01
Intravenous immunoglobulin (IVIG) has been shown to be effective in treating steroid-refractory dermatomyositis (DM). There remains no evidence of its cost-effectiveness in Thailand. Our objective was to estimate the cost utility of IVIG as a second-line therapy in steroid-refractory DM in Thailand. A Markov model was developed to estimate the relevant costs and health benefits for IVIG plus corticosteroids in comparison with immunosuppressant plus corticosteroids in steroid-refractory DM from a societal perspective over a patient's lifetime. The effectiveness and utility parameters were obtained from clinical literature, meta-analyses, medical record reviews, and patient interviews, whereas cost data were obtained from an electronic hospital database and patient interviews. Costs are presented in $US, year 2012 values. All future costs and outcomes were discounted at a rate of 3% per annum. One-way and probabilistic sensitivity analyses were also performed. Over a lifetime horizon, the model estimated treatment with IVIG plus corticosteroids to be cost saving compared with immunosuppressant plus corticosteroids, with cost savings of $US4738.92 and an incremental gain of 1.96 quality-adjusted life-years (QALYs). Sensitivity analyses revealed that the probability of response to immunosuppressant plus corticosteroids was the most influential parameter on incremental QALYs and costs. At a societal willingness-to-pay threshold in Thailand of $US5148 per QALY gained, the probability of IVIG being cost effective was 97.6%. The use of IVIG plus corticosteroids is cost saving compared with treatment with immunosuppressant plus corticosteroids in Thai patients with steroid-refractory DM. Policy makers should consider using our findings in their decision-making process for adding IVIG to corticosteroids as the second-line therapy for steroid-refractory DM patients.
Sud, Sachin; Mittmann, Nicole; Cook, Deborah J; Geerts, William; Chan, Brian; Dodek, Peter; Gould, Michael K; Guyatt, Gordon; Arabi, Yaseen; Fowler, Robert A
2011-12-01
Venous thromboembolism is difficult to diagnose in critically ill patients and may increase morbidity and mortality. To evaluate the cost-effectiveness of strategies to reduce morbidity from venous thromboembolism in critically ill patients. A Markov decision analytic model was used to compare weekly compression ultrasound screening (screening) plus investigation for clinically suspected deep vein thrombosis (DVT) (case finding) versus case finding alone, and a hypothetical program to increase adherence to DVT prevention. Probabilities were derived from a systematic review of venous thromboembolism in medical-surgical intensive care unit patients. Costs (in 2010 $US) were obtained from hospitals in Canada, Australia, and the United States, and the medical literature. Analyses were conducted from a societal perspective over a lifetime horizon. Outcomes included costs, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios. In the base case, the rate of proximal DVT was 85 per 1,000 patients. Screening resulted in three fewer pulmonary emboli than case finding alone but also two additional bleeding episodes, and cost $223,801 per QALY gained. In sensitivity analyses, screening cost less than $50,000 per QALY only if the probability of proximal DVT increased from the baseline of 8.5% to 16%. By comparison, increasing adherence to appropriate pharmacologic thromboprophylaxis by 10% resulted in 16 fewer DVTs, one fewer pulmonary embolus, and one additional heparin-induced thrombocytopenia and bleeding event, and cost $27,953 per QALY gained. Programs achieving increased adherence to best-practice venous thromboembolism prevention were cost-effective over a wide range of program costs and were robust in probabilistic sensitivity analyses. Appropriate prophylaxis provides better value in terms of costs and health gains than routine screening for DVT. Resources should be targeted at optimizing thromboprophylaxis.
Paixão, Enny S; Harron, Katie; Andrade, Kleydson; Teixeira, Maria Glória; Fiaccone, Rosemeire L; Costa, Maria da Conceição N; Rodrigues, Laura C
2017-07-17
Due to the increasing availability of individual-level information across different electronic datasets, record linkage has become an efficient and important research tool. High-quality linkage is essential for producing robust results. The objective of this study was to describe the process of preparing and linking national Brazilian datasets, and to compare the accuracy of different linkage methods for assessing the risk of stillbirth due to dengue in pregnancy. We linked mothers and stillbirths in two routinely collected datasets from Brazil for 2009-2010: for dengue in pregnancy, notifications of infectious diseases (SINAN); for stillbirths, mortality (SIM). Since there was no unique identifier, we used probabilistic linkage based on maternal name, age, and municipality. We compared two probabilistic approaches, each with two thresholds: 1) a bespoke linkage algorithm; and 2) standard linkage software widely used in Brazil (ReclinkIII). Manual review was used to identify further links. Sensitivity and positive predictive value (PPV) were estimated using a subset of gold-standard data created through manual review. We examined the characteristics of false-matches and missed-matches to identify any sources of bias. From records of 678,999 dengue cases and 62,373 stillbirths, the gold-standard linkage identified 191 cases. The bespoke linkage algorithm with a conservative threshold produced 131 links, with sensitivity = 64.4% (68 missed-matches) and PPV = 92.5% (8 false-matches). Manual review of uncertain links identified an additional 37 links, increasing sensitivity to 83.7%. The bespoke algorithm with a relaxed threshold identified 132 true matches (sensitivity = 69.1%), but introduced 61 false-matches (PPV = 68.4%). ReclinkIII produced lower sensitivity and PPV than the bespoke linkage algorithm. Linkage error was not associated with any recorded study variables. Despite a lack of unique identifiers for linking mothers and stillbirths, we demonstrate a high standard of linkage of large routine databases from a middle-income country. Probabilistic linkage and manual review were essential for accurately identifying cases for a case-control study, but this approach may not be feasible for larger databases or for linkage of more common outcomes.
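The reported accuracy metrics follow directly from the published counts; the small calculation below re-derives them. Note the derived PPV is slightly above the reported 92.5%, so the paper's PPV denominator presumably differs a little from the simple link count.

```python
# Sensitivity and PPV for the conservative bespoke algorithm, from the
# counts reported above (191 gold-standard matches, 131 links, 8 false).
gold = 191
links = 131
false_matches = 8

tp = links - false_matches   # 123 true links found
fn = gold - tp               # 68 missed matches, matching the abstract
sensitivity = tp / gold      # 64.4%, matching the abstract
ppv = tp / links             # 93.9%; the abstract reports 92.5%

print(f"sensitivity = {sensitivity:.1%}, PPV = {ppv:.1%}, missed = {fn}")
```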
Sensitivity Analysis for Probabilistic Neural Network Structure Reduction.
Kowalski, Piotr A; Kusy, Maciej
2018-05-01
In this paper, we propose the use of local sensitivity analysis (LSA) for the structure simplification of the probabilistic neural network (PNN). Three algorithms are introduced. The first algorithm applies LSA to the PNN input layer reduction by selecting significant features of input patterns. The second algorithm utilizes LSA to remove redundant pattern neurons of the network. The third algorithm combines the proposed two and constitutes the solution of how they can work together. PNN with a product kernel estimator is used, where each multiplicand computes a one-dimensional Cauchy function. Therefore, the smoothing parameter is separately calculated for each dimension by means of the plug-in method. The classification qualities of the reduced and full structure PNN are compared. Furthermore, we evaluate the performance of PNN, for which global sensitivity analysis (GSA) and the common reduction methods are applied, both in the input layer and the pattern layer. The models are tested on the classification problems of eight repository data sets. A 10-fold cross validation procedure is used to determine the prediction ability of the networks. Based on the obtained results, it is shown that the LSA can be used as an alternative PNN reduction approach.
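A minimal sketch of the idea behind the first algorithm: estimate each input feature's local sensitivity by numerically differentiating a PNN-style decision function, then rank features for removal. It uses a Gaussian product kernel with a single smoothing parameter rather than the paper's per-dimension Cauchy kernels and plug-in bandwidths, and runs on toy data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-class data: only the first two of five features are informative.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.8 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

h = 0.5  # single smoothing parameter (the paper tunes one per dimension)

def decision(x):
    """PNN-style decision function: class-1 minus class-0 Parzen score."""
    scores = []
    for c in (0, 1):
        Z = (x - X[y == c]) / h
        scores.append(np.exp(-0.5 * np.sum(Z**2, axis=1)).mean())
    return scores[1] - scores[0]

# Local sensitivity: mean absolute central-difference derivative of the
# decision function with respect to each input feature.
eps = 1e-4
sens = np.zeros(d)
for x in X[:50]:                      # subsample of training points for speed
    for j in range(d):
        e = np.zeros(d)
        e[j] = eps
        sens[j] += abs(decision(x + e) - decision(x - e)) / (2 * eps)
sens /= 50

print("mean |dg/dx_j| per feature:", np.round(sens, 3))
# Features with near-zero sensitivity are candidates for input-layer removal.
```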
Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I.; Marcotte, Edward M.
2011-01-01
Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for all possible PSMs and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for all detected proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652
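MSblender's actual model converts and combines raw search scores while accounting for their correlation. As a simpler, self-contained illustration of the error-rate control it targets, the sketch below estimates a score threshold at 1% FDR by the standard target-decoy ratio on toy scores; this is not MSblender's method.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy PSM scores: targets mix correct (high-scoring) and incorrect matches;
# decoys sample the incorrect-match score distribution.
target = np.concatenate([rng.normal(3.0, 1.0, 600),   # correct PSMs
                         rng.normal(0.0, 1.0, 400)])  # incorrect PSMs
decoy = rng.normal(0.0, 1.0, 1000)

def fdr_at(t):
    """Target-decoy FDR estimate: decoys above t / targets above t."""
    return (decoy >= t).sum() / max((target >= t).sum(), 1)

t_star = 5.0
for t in np.linspace(5.0, -2.0, 141):   # scan thresholds downward
    if fdr_at(t) > 0.01:
        break
    t_star = t

print(f"score threshold ~ {t_star:.2f} at 1% FDR; "
      f"accepted target PSMs = {(target >= t_star).sum()}")
```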
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Barshes, Neal R; Flores, Everardo; Belkin, Michael; Kougias, Panos; Armstrong, David G; Mills, Joseph L
2016-12-01
Patients with diabetic foot ulcers (DFUs) should be evaluated for peripheral artery disease (PAD). We sought to estimate the overall diagnostic accuracy for various strategies that are used to identify PAD in this population. A Markov model with probabilistic and deterministic sensitivity analyses was used to simulate the clinical events in a population of 10,000 patients with diabetes. One of 14 different diagnostic strategies was applied to those who developed DFUs. Baseline data on diagnostic accuracy of individual noninvasive tests were based on a meta-analysis of previously reported studies. The overall sensitivity and cost-effectiveness of the 14 strategies were then compared. The overall sensitivity of various combinations of diagnostic testing strategies ranged from 32.6% to 92.6%. Cost-effective strategies included ankle-brachial indices for all patients; skin perfusion pressures (SPPs) or toe-brachial indices (TBIs) for all patients; and SPPs or TBIs to corroborate normal pulse examination findings, a strategy that lowered leg amputation rates by 36%. Strategies that used noninvasive vascular testing to investigate only abnormal pulse examination results had low overall diagnostic sensitivity and were weakly dominated in cost-effectiveness evaluations. Population prevalence of PAD did not alter strategy ordering by diagnostic accuracy or cost-effectiveness. TBIs or SPPs used uniformly or to corroborate a normal pulse examination finding are among the most sensitive and cost-effective strategies to improve the identification of PAD among patients presenting with DFUs. These strategies may significantly reduce leg amputation rates with only modest increases in cost. Published by Elsevier Inc.
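The overall sensitivity of a corroborating two-stage strategy follows from elementary probability: disease is flagged if either stage flags it. A hedged sketch of the probabilistic sensitivity analysis idea, with invented Beta distributions rather than the study's meta-analytic estimates:

```python
# Probabilistic sensitivity sketch for a two-stage strategy: a noninvasive
# test (e.g., SPP) corroborates a normal pulse exam, so disease is flagged
# if either stage flags it. Beta parameters are invented, not the study's.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
sens_pulse = rng.beta(70, 30, n)  # pulse exam sensitivity, centred near 0.70
sens_spp = rng.beta(85, 15, n)    # follow-on test sensitivity, near 0.85
overall = sens_pulse + (1 - sens_pulse) * sens_spp
print(np.percentile(overall, [2.5, 50, 97.5]))
```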
Potential Cost-Effectiveness of Universal Access to Modern Contraceptives in Uganda
Babigumira, Joseph B.; Stergachis, Andy; Veenstra, David L.; Gardner, Jacqueline S.; Ngonzi, Joseph; Mukasa-Kivunike, Peter; Garrison, Louis P.
2012-01-01
Background Over two thirds of women who need contraception in Uganda lack access to modern effective methods. This study was conducted to estimate the potential cost-effectiveness of achieving universal access to modern contraceptives in Uganda by implementing a hypothetical new contraceptive program (NCP), from both societal and governmental (Ministry of Health (MoH)) perspectives. Methodology/Principal Findings A Markov model was developed to compare the NCP to the status quo or current contraceptive program (CCP). The model followed a hypothetical cohort of 15-year-old girls over a lifetime horizon. Data were obtained from the Uganda National Demographic and Health Survey and from published and unpublished sources. Costs, life expectancy, disability-adjusted life expectancy, pregnancies, fertility, and incremental cost-effectiveness measured as cost per life-year (LY) gained, cost per disability-adjusted life-year (DALY) averted, cost per pregnancy averted, and cost per unit of fertility reduction were calculated. Univariate and probabilistic sensitivity analyses were performed to examine the robustness of the results. Mean discounted life expectancy and disability-adjusted life expectancy (DALE) were higher under the NCP vs. CCP (28.74 vs. 28.65 years and 27.38 vs. 27.01, respectively). Mean pregnancies and live births per woman were lower under the NCP (9.51 under the CCP vs. 7.90 under the NCP, and 6.92 vs. 5.79, respectively). Mean lifetime costs per woman were lower for the NCP from both the societal perspective ($1,949 vs. $1,987) and the MoH perspective ($636 vs. $685). In the incremental analysis, the NCP dominated the CCP, i.e., it was both less costly and more effective. The results were robust to univariate and probabilistic sensitivity analysis. Conclusion/Significance Universal access to modern contraceptives in Uganda appears to be highly cost-effective. Increasing contraceptive coverage should be considered among Uganda's public health priorities. PMID:22363480
Contreras-Hernández, Iris; Mould-Quevedo, Joaquín F; Torres-González, Rubén; Goycochea-Robles, María Victoria; Pacheco-Domínguez, Reyna Lizette; Sánchez-García, Sergio; Mejía-Aranguré, Juan Manuel; Garduño-Espinosa, Juan
2008-11-12
Osteoarthritis (OA) is one of the main causes of disability worldwide, especially in persons >55 years of age. Currently, controversy remains about the best therapeutic alternative for this disease when evaluated from a cost-effectiveness viewpoint. For Social Security Institutions in developing countries, it is very important to assess which drugs may decrease the subsequent use of medical care resources, considering that their adverse events are known to significantly increase the medical care costs of patients with OA. Three treatment alternatives were compared: celecoxib (200 mg twice daily), non-selective NSAIDs (naproxen, 500 mg twice daily; diclofenac, 100 mg twice daily; and piroxicam, 20 mg/day), and acetaminophen, 1000 mg twice daily. The aim of this study was to identify the most cost-effective first-choice pharmacological treatment for the control of joint pain secondary to OA in patients treated at the Instituto Mexicano del Seguro Social (IMSS). A cost-effectiveness assessment was carried out. A systematic review of the literature was performed to obtain transition probabilities. To evaluate the robustness of the analysis, one-way and probabilistic sensitivity analyses were conducted. Estimations were done for a 6-month period. The treatment demonstrating the best cost-effectiveness results [lowest cost-effectiveness ratio, $17.5 pesos/patient ($1.75 USD)] was celecoxib. According to the one-way sensitivity analysis, celecoxib would need to markedly decrease its effectiveness in order for it not to be the optimal treatment option. In the probabilistic analysis, both in the construction of the acceptability curves and in the estimation of net economic benefits, the most cost-effective option was celecoxib. From a Mexican institutional perspective, and probably in other Social Security Institutions in similar developing countries, the most cost-effective option for the treatment of knee and/or hip OA would be celecoxib.
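Acceptability curves and net economic benefits of the kind mentioned here are conventionally computed from probabilistic sensitivity analysis draws. A generic sketch with made-up incremental cost and effect samples, not the study's data:

```python
# Generic net-monetary-benefit / acceptability-curve calculation from PSA
# draws; the incremental cost and effect samples below are made up.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
d_cost = rng.normal(15.0, 5.0, n)    # incremental cost per patient
d_eff = rng.normal(0.9, 0.4, n)      # incremental effectiveness
for wtp in (10, 20, 50, 100):        # willingness-to-pay per effect unit
    nmb = wtp * d_eff - d_cost       # incremental net monetary benefit
    print(wtp, (nmb > 0).mean())     # P(cost-effective) at this threshold
```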
Monahan, M; Ensor, J; Moore, D; Fitzmaurice, D; Jowett, S
2017-08-01
Essentials: The correct duration of treatment after a first unprovoked venous thromboembolism (VTE) is unknown. We assessed when restarting anticoagulation was worthwhile based on patient risk of recurrent VTE. When the risk over a one-year period is 17.5%, restarting is cost-effective. However, sensitivity analyses indicate large uncertainty in the estimates. Background Following at least 3 months of anticoagulation therapy after a first unprovoked VTE, there is uncertainty about the duration of therapy. Further anticoagulation therapy reduces the risk of a potentially fatal recurrent VTE, but at the expense of a higher risk of bleeding, which can also be fatal. Objective An economic evaluation sought to estimate the long-term cost-effectiveness of using a decision rule for restarting anticoagulation therapy vs. no extension of therapy, based on patients' risk of a further unprovoked VTE. Methods A Markov patient-level simulation model was developed, which adopted a lifetime horizon with monthly time cycles, from a UK National Health Service (NHS)/Personal Social Services (PSS) perspective. Results Base-case model results suggest that treating patients with a predicted 1-year VTE risk of 17.5% or higher may be cost-effective if decision makers are willing to pay up to £20,000 per quality-adjusted life-year (QALY) gained. However, probabilistic sensitivity analysis shows that the model was highly sensitive to overall parameter uncertainty, and caution is warranted in selecting the optimal decision rule on cost-effectiveness grounds. Univariate sensitivity analyses indicate that variables such as anticoagulation therapy disutility and mortality risks were very influential in driving model results. Conclusion This represents the first economic model to consider the use of a decision rule for restarting therapy for unprovoked VTE patients. Better data are required to predict long-term bleeding risks during therapy in this patient group. © 2017 International Society on Thrombosis and Haemostasis.
Dopamine neurons learn relative chosen value from probabilistic rewards
Lak, Armin; Stauffer, William R; Schultz, Wolfram
2016-01-01
Economic theories posit reward probability as one of the factors defining reward value. Individuals learn the value of cues that predict probabilistic rewards from experienced reward frequencies. Building on the notion that responses of dopamine neurons increase with reward probability and expected value, we asked how dopamine neurons in monkeys acquire this value signal, which may represent an economic decision variable. We found in a Pavlovian learning task that reward probability-dependent value signals arose from experienced reward frequencies. We then assessed neuronal response acquisition during choices among probabilistic rewards. Here, dopamine responses became sensitive to the value of both chosen and unchosen options. Both experiments also showed novelty responses of dopamine neurons that decreased as learning advanced. These results show that dopamine neurons acquire predictive value signals from the frequency of experienced rewards. This flexible and fast signal reflects a specific decision variable and could update neuronal decision mechanisms. DOI: http://dx.doi.org/10.7554/eLife.18044.001 PMID:27787196
Probabilistic Assessment of Fracture Progression in Composite Structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Minnetyan, Levon; Mauget, Bertrand; Huang, Dade; Addi, Frank
1999-01-01
This report describes methods and corresponding computer codes that are used to evaluate progressive damage and fracture and to perform probabilistic assessment in built-up composite structures. Structural response is assessed probabilistically during progressive fracture, and the effects of design variable uncertainties on structural fracture progression are quantified. The fast probability integrator (FPI) is used to assess the response scatter in the composite structure at damage initiation, and the sensitivity of the damage response to design variables is computed. The methods are general purpose and are applicable to stitched and unstitched composites in all types of structures and fracture processes, from damage initiation to unstable propagation and global structure collapse. The methods are demonstrated for a polymer matrix composite stiffened panel subjected to pressure. The results indicated that composite constituent properties, fabrication parameters, and their respective uncertainties have a significant effect on structural durability and reliability. Design implications with regard to damage progression, damage tolerance, and reliability of composite structures are examined.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Boyce, Lola
1995-01-01
The development of a methodology for probabilistic material strength degradation is described. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep, and thermal fatigue. Results, in the form of cumulative distribution functions, illustrate the sensitivity of lifetime strength to the current value of each effect. In addition, verification studies comparing predictions for high-cycle mechanical fatigue and high-temperature effects with experiments are presented. Results from this limited verification study strongly support the representation of material degradation by randomized multifactor interaction models.
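The randomized multifactor equation itself is not reproduced in this summary. Below is a hedged sketch of a multifactor interaction form often associated with this line of work; the exact PROMISS formulation may differ, and all numeric values are invented:

```python
# Hedged sketch of a multifactor strength-degradation term of the form
#   S/S0 = prod_i ((A_ult_i - A_i) / (A_ult_i - A0_i)) ** a_i,
# with the exponents a_i randomized per Monte Carlo draw. All values are
# invented; the exact PROMISS formulation may differ.
import numpy as np

rng = np.random.default_rng(2)

def lifetime_strength(S0, A, A0, A_ult, a):
    return S0 * np.prod(((A_ult - A) / (A_ult - A0)) ** a)

a_nom = np.array([0.5, 0.3, 0.2])                  # nominal exponents
samples = [
    lifetime_strength(100.0,
                      np.array([600.0, 1e5, 1e3]),    # current effect values
                      np.array([20.0, 0.0, 0.0]),     # reference values
                      np.array([1200.0, 1e7, 1e6]),   # ultimate values
                      a_nom * rng.lognormal(0.0, 0.1, 3))
    for _ in range(10_000)
]
# The empirical CDF of `samples` plays the role of the paper's CDFs.
print(np.percentile(samples, [5, 50, 95]))
```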
Optimization of Adaptive Intraply Hybrid Fiber Composites with Reliability Considerations
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1994-01-01
The reliability with bounded distribution parameters (mean, standard deviation) was maximized and the reliability-based cost was minimized for adaptive intraply hybrid fiber composites by using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties, including those in constituent material properties, fabrication variables, structure geometry, and control-related parameters. Probabilistic sensitivity factors were computed and used in the optimization procedures. For actuated change in the angle of attack of an airfoil-like composite shell structure with an adaptive torque plate, the reliability was maximized to a 0.9999 probability, with constraints on the mean and standard deviation of the actuation material volume ratio (percentage of actuation composite material in a ply) and the actuation strain coefficient. The reliability-based cost was minimized for an airfoil-like composite shell structure with an adaptive skin and a mean actuation material volume ratio as the design parameter. At a 0.9 mean actuation material volume ratio, the minimum cost was obtained.
Probabilistic flood damage modelling at the meso-scale
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2014-05-01
Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, all the more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice in flood risk assessments. Most damage models describe complex damaging processes with simple, deterministic approaches such as stage-damage functions. Novel probabilistic, multivariate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities that were affected by the 2002 flood of the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken, on the one hand, via a comparison with eight other damage models, including stage-damage functions as well as multivariate models; on the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. The significant advantage of the probabilistic flood loss estimation model BT-FLEMO is thus that it inherently provides quantitative information about the uncertainty of its predictions. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
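A minimal sketch of a bagging-decision-tree loss model in the spirit of BT-FLEMO, using scikit-learn and synthetic data (the model's actual predictors and training data are not given in this summary):

```python
# Bagging-decision-tree loss model sketch on synthetic data; per-tree
# predictions give a predictive distribution rather than a single
# deterministic loss estimate.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
X = rng.random((500, 3))                   # e.g., depth, duration, precaution
y = 0.5 * X[:, 0] + 0.1 * rng.random(500)  # synthetic relative loss

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                         random_state=0).fit(X, y)
per_tree = np.stack([t.predict(X[:5]) for t in model.estimators_])
print(per_tree.mean(axis=0))               # point estimate per building
print(per_tree.std(axis=0))                # spread = prediction uncertainty
```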
Kelly, V; Sagili, K D; Satyanarayana, S; Reza, L W; Chadha, S S; Wilson, N C
2015-06-01
With support from the Stop TB Partnership's TB REACH Wave 2 Grant, diagnostic microscopy services for tuberculosis (TB) were upgraded from conventional Ziehl-Neelsen (ZN) sputum microscopy to light-emitting diode fluorescence microscopy (LED FM) in 200 high-workload microscopy centres in India as a pilot intervention. The objective was to evaluate the cost-effectiveness of LED FM over conventional ZN microscopy to inform further scale-up. A decision-tree model was constructed to assess the cost-utility of LED FM over ZN microscopy. The results were summarised using the incremental cost-effectiveness ratio (ICER); one-way and probabilistic sensitivity analyses were also conducted to address uncertainty within the model. Data were analysed from 200 medical colleges in 2011 and 2012, before and after the introduction of LED microscopes. A full costing analysis was carried out from the perspective of the national TB programme. The ICER was calculated at US$14.64 per disability-adjusted life-year, with an 82% probability of being cost-effective at a willingness-to-pay threshold equivalent to the Indian gross domestic product per capita. LED FM is a cost-effective intervention for detecting TB cases in India in high-workload medical college settings.
Henry, Thea L; De Brouwer, Bonnie F E; Van Keep, Marjolijn M L; Blankestijn, Peter J; Bots, Michiel L; Koffijberg, Hendrik
2015-01-01
Safety and efficacy data for catheter-based renal denervation (RDN) in the treatment of resistant hypertension have been used to estimate the cost-effectiveness of this approach. However, there are no Dutch-specific analyses. This study examined the cost-effectiveness of RDN from the perspective of the healthcare payer in The Netherlands. A previously constructed Markov state-transition model was adapted and updated with costs and utilities relevant to the Dutch setting. The cost-effectiveness of RDN was compared with standard of care (SoC) for patients with resistant hypertension. The efficacy of RDN treatment was modeled as a reduction in the risk of cardiovascular events associated with a lower systolic blood pressure (SBP). Treatment with RDN compared to SoC gave an incremental quality-adjusted life year (QALY) gain of 0.89 at an additional cost of €1315 over a patient's lifetime, resulting in a base case incremental cost-effectiveness ratio (ICER) of €1474. Deterministic and probabilistic sensitivity analyses (PSA) showed that treatment with RDN therapy was cost-effective at conventional willingness-to-pay thresholds (€10,000-80,000/QALY). RDN is a cost-effective intervention for patients with resistant hypertension in The Netherlands.
Linke, Julia; Wessa, Michèle
2017-09-01
High reward sensitivity and wanting of rewarding stimuli help to identify and motivate repetition of pleasant activities. This behavioral activation is thought to increase positive emotions. Both mechanisms are therefore highly relevant for resilience against depressive symptoms, yet they have not been targeted by psychotherapeutic interventions. In the present study, we tested a mental imagery training comprising eight 10-minute sessions delivered every second day via the Internet to healthy volunteers (N = 30, 21 female, mean age 23.8 years, Caucasian) who were preselected for low reward sensitivity. Participants were paired according to age, sex, reward sensitivity, and mental imagery ability; members of each pair were then randomly assigned to either the intervention or the wait condition. Ratings of wanting and response bias toward probabilistic reward cues (Probabilistic Reward Task) served as primary outcomes. We further tested whether training effects extended to approach behavior (Approach Avoidance Task) and depressive symptoms (Beck Depression Inventory). The intervention led to an increase in wanting (p < .001, ηp² = .45) and reward sensitivity (p = .004, ηp² = .27). Further, the training group displayed faster approach toward positive edibles and activities (p = .025, ηp² = .18) and reductions in depressive symptoms (p = .028, ηp² = .16). These results extend the existing literature by showing that mental imagery training can increase wanting of rewarding stimuli and reward sensitivity. The training also appears to reduce depressive symptoms, and may thus foster the successful implementation of existing treatments for depression, such as behavioral activation, and increase resilience against depressive symptoms. Copyright © 2017. Published by Elsevier Ltd.
High-Resolution Underwater Mapping Using Side-Scan Sonar
2016-01-01
The goal of this study is to generate high-resolution sea-floor maps using a side-scan sonar (SSS). This is achieved by explicitly taking the SSS operation into account, as follows. First, the raw sensor data are corrected by means of a physics-based SSS model. Second, the data are projected onto the sea floor; the errors involved in this projection are thoroughly analysed. Third, a probabilistic SSS model is defined and used to estimate the probability of each sea-floor region being observed. This probabilistic information is then used to weight the contribution of each SSS measurement to the map. Thanks to these models, arbitrary map resolutions can be achieved, even beyond the sensor resolution. Finally, a geometric map-building method is presented and combined with the probabilistic approach. The resulting map is composed of two layers: the echo-intensity layer holds the most likely echo intensities at each point on the sea floor, and the probabilistic layer contains information about how confident the user or higher control layers can be about the echo-intensity layer data. Experiments were conducted in a large subsea region. PMID:26821379
NASA Astrophysics Data System (ADS)
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
2017-08-01
While advanced numerical techniques are successfully used in deterministic slope stability analysis, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing stability analysis that accounts for the uncertainties in the analysis parameters. However, FORM cannot be used directly in numerical slope stability evaluations, because it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method: an explicit performance function is developed from the results of numerical simulations and then used in FORM. The proposed methodology is applied to a large potential rock wedge at the Sumela Monastery, Turkey. The accuracy with which the developed performance function represents the limit state surface is evaluated by monitoring the slope behavior, and the calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of reduced accuracy, with an error of 24%.
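For orientation, FORM reduces to a closed form for a linear limit state with normal variables; the toy check below contrasts it with Monte Carlo (the study instead fits a response surface to numerical simulations, where no closed form exists):

```python
# Toy FORM-vs-Monte-Carlo check for the linear limit state g = R - S with
# independent normal resistance R and load S; FORM is exact in this case.
import numpy as np
from scipy.stats import norm

mu_R, sd_R = 10.0, 1.5
mu_S, sd_S = 6.0, 2.0
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)   # Hasofer-Lind reliability index
pf_form = norm.cdf(-beta)

rng = np.random.default_rng(4)
n = 1_000_000
pf_mc = ((rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)) < 0).mean()
print(beta, pf_form, pf_mc)
```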
Shih, Ya-Chen Tina; Chien, Chun-Ru; Moguel, Rocio; Hernandez, Mike; Hajek, Richard A; Jones, Lovell A
2016-04-01
To assess the cost-effectiveness of implementing a patient navigation (PN) program with capitated payment for Medicare beneficiaries diagnosed with lung cancer. Cost-effectiveness analysis. A Markov model was used to capture the disease progression of lung cancer and to characterize the clinical benefits of PN services in terms of timeliness of treatment and care coordination. Taking a payer's perspective, we estimated lifetime costs, life-years (LYs), and quality-adjusted life-years (QALYs) and addressed uncertainties in one-way and probabilistic sensitivity analyses. Model inputs were extracted from the literature, supplemented with data from a Centers for Medicare and Medicaid Services demonstration project. Compared with usual care, PN services incurred higher costs but also yielded better outcomes. The incremental cost and effectiveness were $9,145 and 0.47 QALYs, respectively, resulting in an incremental cost-effectiveness ratio of $19,312/QALY. One-way sensitivity analysis indicated that findings were most sensitive to a parameter capturing the PN survival benefit for local-stage patients. The cost-effectiveness acceptability curve showed that the probability the PN program was cost-effective was 0.80 and 0.91 at societal willingness-to-pay thresholds of $50,000 and $100,000/QALY, respectively. Instituting a capitated PN program is cost-effective for lung cancer patients in Medicare. Future research should evaluate whether the same conclusion holds in other cancers. © Health Research and Educational Trust.
Lexical Frequency Profiles and Zipf's Law
ERIC Educational Resources Information Center
Edwards, Roderick; Collins, Laura
2011-01-01
Laufer and Nation (1995) proposed that the Lexical Frequency Profile (LFP) can estimate the size of a second-language writer's productive vocabulary. Meara (2005) questioned the sensitivity and the reliability of LFPs for estimating vocabulary sizes, based on the results obtained from probabilistic simulations of LFPs. However, the underlying…
Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2003-01-01
The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.
Modeling Array Stations in SIG-VISA
NASA Astrophysics Data System (ADS)
Ding, N.; Moore, D.; Russell, S.
2013-12-01
We add support for array stations to SIG-VISA, a system for nuclear monitoring using probabilistic inference on seismic signals. Array stations comprise a large portion of the IMS network; they can provide increased sensitivity and more accurate directional information compared with single-component stations. Our existing model assumed that signals were independent at each station, which is false when many stations are close together, as in an array. The new model removes that assumption by jointly modeling signals across array elements. This is done by extending our existing Gaussian process (GP) regression models, also known as kriging, from a 3-dimensional single-component space of events to a 6-dimensional space of station-event pairs. For each array and each event attribute (including coda decay, coda height, amplitude transfer, and travel time), we model the joint distribution across array elements using a Gaussian process that learns the correlation lengthscale across the array, thereby incorporating information from array stations into the probabilistic inference framework. To evaluate the effectiveness of our model, we perform 'probabilistic beamforming' on new events using our GP model, i.e., we compute the event azimuth with the highest posterior probability under the model, conditioned on the signals at the array elements. We compare the results from our probabilistic inference model to the beamforming currently performed by IMS station processing.
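A sketch of GP regression with a learned lengthscale, standing in for the joint model over array elements (synthetic one-dimensional element coordinates; the paper's model operates over a 6-dimensional station-event space):

```python
# GP regression with a learned lengthscale over a toy 1-D "array element
# coordinate"; the paper fits GPs over a 6-D station-event space.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 40)[:, None]               # element coordinate
y = np.sin(x[:, 0]) + 0.1 * rng.normal(size=40)   # e.g., travel-time residual

gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(0.1),
                              normalize_y=True).fit(x, y)
print(gp.kernel_)                      # fitted correlation lengthscale
mean, sd = gp.predict(np.array([[5.0]]), return_std=True)
print(mean, sd)                        # prediction with uncertainty
```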
Probabilistic estimates of drought impacts on agricultural production
NASA Astrophysics Data System (ADS)
Madadgar, Shahrbanou; AghaKouchak, Amir; Farahmand, Alireza; Davis, Steven J.
2017-08-01
Increases in the severity and frequency of drought in a warming climate may negatively impact agricultural production and food security. Unlike previous studies that have estimated the agricultural impacts of climate conditions using single-crop yield distributions, we develop a multivariate probabilistic model that uses projected climatic conditions (e.g., precipitation amount or soil moisture) throughout a growing season to estimate the probability distribution of crop yields. We demonstrate the model with an analysis of the historical period 1980-2012, including the Millennium Drought in Australia (2001-2009). We find that precipitation and soil moisture deficits in dry growing seasons reduced the average annual yield of the five largest crops in Australia (wheat, broad beans, canola, lupine, and barley) by 25-45% relative to wet growing seasons. Our model can thus produce region- and crop-specific agricultural sensitivities to climate conditions and variability, showing how the probability distribution of crop yield changes in response to drought. Such probabilistic estimates of yield may help decision-makers in government and business to quantitatively assess the vulnerability of agriculture to climate variations.
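As a toy illustration of conditioning yield on a climate variable: the paper's model is multivariate and fitted to data, so the bivariate Gaussian below, with invented parameters, is only a stand-in for the idea of a climate-conditional yield distribution:

```python
# Toy bivariate-Gaussian conditioning of yield on seasonal precipitation;
# all parameters invented. The paper's fitted multivariate model need not
# be Gaussian.
import numpy as np

mu = np.array([500.0, 2.0])             # mean precip (mm), mean yield (t/ha)
rho, sd_p, sd_y = 0.7, 120.0, 0.4
cov_py = rho * sd_p * sd_y

precip = 350.0                          # a dry growing season
cond_mean = mu[1] + cov_py / sd_p**2 * (precip - mu[0])
cond_sd = np.sqrt(sd_y**2 - cov_py**2 / sd_p**2)
print(cond_mean, cond_sd)               # probabilistic yield forecast
```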
Rygula, Rafal; Clarke, Hannah F.; Cardinal, Rudolf N.; Cockcroft, Gemma J.; Xia, Jing; Dalley, Jeff W.; Robbins, Trevor W.; Roberts, Angela C.
2015-01-01
Understanding the role of serotonin (or 5-hydroxytryptamine, 5-HT) in aversive processing has been hampered by the contradictory findings, across studies, of increased sensitivity to punishment in terms of subsequent response choice but decreased sensitivity to punishment-induced response suppression following gross depletion of central 5-HT. To address this apparent discrepancy, the present study determined whether both effects could be found in the same animals by performing localized 5-HT depletions in the amygdala or orbitofrontal cortex (OFC) of a New World monkey, the common marmoset. 5-HT depletion in the amygdala impaired response choice on a probabilistic visual discrimination task by increasing the effectiveness of misleading, or false, punishment and reward, and decreased response suppression in a variable interval test of punishment sensitivity that employed the same reward and punisher. 5-HT depletion in the OFC also disrupted probabilistic discrimination learning and decreased response suppression. Computational modeling of behavior on the discrimination task showed that the lesions reduced reinforcement sensitivity. A novel, unitary account of the findings in terms of the causal role of 5-HT in the anticipation of both negative and positive motivational outcomes is proposed and discussed in relation to current theories of 5-HT function and our understanding of mood and anxiety disorders. PMID:24879752
Probabilistic Simulation of Progressive Fracture in Bolted-Joint Composite Laminates
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Singhal, S. N.; Chamis, C. C.
1996-01-01
This report describes computational methods to probabilistically simulate fracture in bolted composite structures. An innovative approach that is independent of stress intensity factors and fracture toughness was used to simulate progressive fracture. The effect of design variable uncertainties on structural damage was also quantified. A fast probability integrator assessed the scatter in the composite structure response before and after damage. Then the sensitivity of the response to design variables was computed. General-purpose methods, which are applicable to bolted joints in all types of structures and in all fracture processes-from damage initiation to unstable propagation and global structure collapse-were used. These methods were demonstrated for a bolted joint of a polymer matrix composite panel under edge loads. The effects of the fabrication process were included in the simulation of damage in the bolted panel. Results showed that the most effective way to reduce end displacement at fracture is to control both the load and the ply thickness. The cumulative probability for longitudinal stress in all plies was most sensitive to the load; in the 0 deg. plies it was very sensitive to ply thickness. The cumulative probability for transverse stress was most sensitive to the matrix coefficient of thermal expansion. In addition, fiber volume ratio and fiber transverse modulus both contributed significantly to the cumulative probability for the transverse stresses in all the plies.
Alshreef, Abualbishr; Wailoo, Allan J; Brown, Steven R; Tiernan, James P; Watson, Angus J M; Biggs, Katie; Bradburn, Mike; Hind, Daniel
2017-09-01
Haemorrhoids are a common condition, with nearly 30,000 procedures carried out in England in 2014/15, and result in a significant quality-of-life burden to patients and a financial burden to the healthcare system. This study examined the cost effectiveness of haemorrhoidal artery ligation (HAL) compared with rubber band ligation (RBL) in the treatment of grade II-III haemorrhoids. The analysis used data from the HubBLe study, a multicentre, open-label, parallel-group, randomised controlled trial conducted in 17 acute UK hospitals between September 2012 and August 2015. A full economic evaluation, including long-term cost effectiveness, was conducted from the UK National Health Service (NHS) perspective. Main outcomes included healthcare costs, quality-adjusted life-years (QALYs) and recurrence. Cost-effectiveness results were presented in terms of incremental cost per QALY gained and cost per recurrence avoided. An extrapolation analysis for 3 years beyond the trial follow-up, two subgroup analyses (by grade of haemorrhoids and recurrence following RBL at baseline), and various sensitivity analyses were undertaken. In the primary base-case within-trial analysis, the incremental total mean cost per patient for HAL compared with RBL was £1027 (95% confidence interval [CI] £782-£1272, p < 0.001). The incremental gain was 0.01 QALYs (95% CI -0.02 to 0.04, p = 0.49). This generated an incremental cost-effectiveness ratio (ICER) of £104,427 per QALY. In the extrapolation analysis, the estimated probabilistic ICER was £21,798 per QALY. Results from all subgroup and sensitivity analyses did not materially change the base-case result. Under all assessed scenarios, the HAL procedure was not cost effective compared with RBL for the treatment of grade II-III haemorrhoids at a cost-effectiveness threshold of £20,000 per QALY; therefore, economically, its use in the NHS should be questioned.
NASA Astrophysics Data System (ADS)
Ishibashi, Yoshihiro; Fukui, Minoru
2018-03-01
The effect of a probabilistic delayed start on one-dimensional traffic flow is investigated on the basis of several models. An analogy with the degeneracy of states and its resolution, as well as with the mathematical procedures adopted for them, is utilized. The perturbation is assumed to be proportional to the probability of the delayed start, and the perturbation function is determined so that the imposed conditions are fulfilled. The formulas obtained coincide with those previously derived from mean-field analyses of the Nagel-Schreckenberg and Fukui-Ishibashi models, and they reproduce the cellular automaton simulation results.
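For reference, the Nagel-Schreckenberg cellular automaton mentioned above can be simulated in a few lines. The dawdling probability p here is the standard NaSch random-slowdown parameter, a close cousin of the delayed-start probability discussed in the abstract:

```python
# Minimal Nagel-Schreckenberg cellular automaton on a ring road.
import numpy as np

rng = np.random.default_rng(6)
L, N, vmax, p, steps = 100, 30, 5, 0.3, 200
pos = np.sort(rng.choice(L, N, replace=False))   # car positions
vel = np.zeros(N, dtype=int)                     # car speeds

for _ in range(steps):
    gaps = (np.roll(pos, -1) - pos - 1) % L          # headway to car ahead
    vel = np.minimum(vel + 1, vmax)                  # accelerate
    vel = np.minimum(vel, gaps)                      # brake (no passing)
    vel = np.where(rng.random(N) < p,
                   np.maximum(vel - 1, 0), vel)      # random slowdown
    pos = (pos + vel) % L                            # move

print("mean speed:", vel.mean())   # flow = density * mean speed
```

Because cars never pass, the cyclic ordering of the position array is preserved, so the roll-based headway computation remains valid after cars wrap around the ring.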
Mihajlović, Jovan; Pechlivanoglou, Petros; Sabo, Ana; Tomić, Zdenko; Postma, Maarten J
2013-12-01
New targeted therapeutics for metastatic renal cell carcinoma (mRCC) provide an increase in progression-free survival (PFS) ranging from 2 to 6 months. Compared with best supportive care, everolimus demonstrated an additional PFS of 3 months in patients with mRCC whose disease had progressed on sunitinib and/or sorafenib. The only targeted therapy for mRCC currently reimbursed in Serbia is sunitinib. The aim of this study was to estimate the cost-effectiveness and budget impact of introducing everolimus in Serbia, in comparison with best supportive care, for mRCC patients refractory to sunitinib. A Markov model was designed corresponding to Serbian treatment protocols. A healthcare payer perspective was taken, including direct costs only. Treated and untreated cohorts were followed over 18 cycles, each cycle lasting 8 weeks, which covered the lifetime horizon of mRCC patients refractory to first-line treatment. Annual discount rates of 1.5% for effectiveness and 3% for costs were applied. Transitions between health states were modeled by time-dependent probabilities extracted from published Kaplan-Meier curves of PFS and overall survival (OS). Utility values were obtained from appraisals of other mRCC treatments. One-way and probabilistic sensitivity analyses were performed to test the robustness and uncertainty of the base-case estimate. Lastly, the potential impact of everolimus on overall healthcare expenditures, on annual and 4-year bases, was estimated in a budget-impact analysis. The incremental cost-effectiveness ratio for everolimus was estimated at €86,978 per quality-adjusted life-year. Sensitivity analysis identified the hazard multiplier, a statistical approximator of OS gain, as the main driver of everolimus cost-effectiveness. Furthermore, probabilistic sensitivity analyses revealed a wide 95% CI around the base-case incremental cost-effectiveness ratio estimate (€32,594-€425,258 per quality-adjusted life-year). Finally, the average annual budgetary impact of everolimus in the first 4 years after its potential reimbursement would be around €270,000, contributing <1% of the total Serbian oncology budget. Everolimus as a second-line treatment of mRCC is not likely to be a cost-effective option under present conditions in Serbia, and would have a relatively limited impact on the oncology budget. A major constraint on the estimation of the cost-effectiveness of everolimus is the uncertainty around its effect on extending OS. However, prior to a final decision on the acceptance or rejection of everolimus, reassessment of the whole therapeutic group might be needed to construct an economically rational treatment strategy within the mRCC field. © 2013 Elsevier HS Journals, Inc. All rights reserved.
Dynamic Stability of Uncertain Laminated Beams Under Subtangential Loads
NASA Technical Reports Server (NTRS)
Goyal, Vijay K.; Kapania, Rakesh K.; Adelman, Howard (Technical Monitor); Horta, Lucas (Technical Monitor)
2002-01-01
Because of the inherent complexity of fiber-reinforced laminated composites, it can be challenging to manufacture composite structures to their exact design specifications, resulting in unwanted material and geometric uncertainties. In this research, we focus on the deterministic and probabilistic stability analysis of laminated structures subject to subtangential loading, a combination of conservative and nonconservative tangential loads, using the dynamic criterion. A shear-deformable laminated beam element, including warping effects, is derived to study the deterministic and probabilistic response of laminated beams. This twenty-one-degree-of-freedom element can be used to solve both static and dynamic problems. In the first-order shear-deformable model used here, a more accurate method is employed to obtain the transverse shear correction factor. The dynamic version of the principle of virtual work for laminated composites is expressed in nondimensional form, and the element tangent stiffness and mass matrices are obtained by analytical integration. Stability is studied by giving the structure a small disturbance about an equilibrium configuration and observing whether the resulting response remains small. To include uncertainties in the study of dynamic behavior, three models were developed: exact Monte Carlo simulation, sensitivity-based Monte Carlo simulation, and probabilistic FEA. These methods were integrated into the developed finite element analysis. Perturbation and sensitivity analyses were also used to study nonconservative problems, as well as the stability analysis using the dynamic criterion.
A model-based test for treatment effects with probabilistic classifications.
Cavagnaro, Daniel R; Davis-Stober, Clintin P
2018-05-21
Within modern psychology, computational and statistical models play an important role in describing a wide variety of human behavior. Model selection analyses are typically used to classify individuals according to the model(s) that best describe their behavior. These classifications are inherently probabilistic, which presents challenges for performing group-level analyses, such as quantifying the effect of an experimental manipulation. We answer this challenge by presenting a method for quantifying treatment effects in terms of distributional changes in model-based (i.e., probabilistic) classifications across treatment conditions. The method uses hierarchical Bayesian mixture modeling to incorporate classification uncertainty at the individual level into the test for a treatment effect at the group level. We illustrate the method with several worked examples, including a reanalysis of the data from Kellen, Mata, and Davis-Stober (2017), and analyze its performance more generally through simulation studies. Our simulations show that the method is both more powerful and less prone to Type I errors than Fisher's exact test when classifications are uncertain. In the special case where classifications are deterministic, we find a near-perfect power-law relationship between the Bayes factor derived from our method and the p value obtained from Fisher's exact test. We provide code in an online supplement that allows researchers to apply the method to their own data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
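The baseline against which the Bayesian mixture method is compared is Fisher's exact test on deterministic classification counts. For concreteness, with invented counts:

```python
# Fisher's exact test on deterministic classification counts, the baseline
# the hierarchical Bayesian method is compared against (counts invented).
from scipy.stats import fisher_exact

# rows: treatment vs. control; columns: classified as model A vs. model B
table = [[18, 7], [9, 16]]
odds, p = fisher_exact(table)
print(odds, p)
```

The paper's point is that when each classification carries uncertainty, such hard counts discard information; the mixture model propagates that uncertainty into the group-level test instead.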
Slob, Wout
2006-07-01
Probabilistic dietary exposure assessments that are fully based on Monte Carlo sampling from the raw intake data may not be appropriate. This paper shows that the data should first be analysed by using a statistical model that is able to take the various dimensions of food consumption patterns into account. A (parametric) model is discussed that takes into account the interindividual variation in (daily) consumption frequencies, as well as in amounts consumed. Further, the model can be used to include covariates, such as age, sex, or other individual attributes. Some illustrative examples show how this model may be used to estimate the probability of exceeding an (acute or chronic) exposure limit. These results are compared with the results based on directly counting the fraction of observed intakes exceeding the limit value. This comparison shows that the latter method is not adequate, in particular for the acute exposure situation. A two-step approach for probabilistic (acute) exposure assessment is proposed: first analyse the consumption data by a (parametric) statistical model as discussed in this paper, and then use Monte Carlo techniques for combining the variation in concentrations with the variation in consumption (by sampling from the statistical model). This approach results in an estimate of the fraction of the population as a function of the fraction of days at which the exposure limit is exceeded by the individual.
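A hedged sketch of the two-step approach described: a parametric consumption model per individual (frequency and amount, with invented parameters), followed by Monte Carlo combination with residue-concentration variability:

```python
# Two-step dietary exposure sketch: parametric consumption model, then
# Monte Carlo combination with concentration variability. All parameters
# are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)
n_ind, n_days = 2_000, 365
freq = rng.beta(2, 8, n_ind)            # individual daily consumption prob.
mu_amt = rng.normal(4.0, 0.3, n_ind)    # individual log-mean amount (g)

limit = 1.0                             # illustrative exposure limit
exceed_days = np.zeros(n_ind)
for i in range(n_ind):
    eats = rng.random(n_days) < freq[i]
    amount = np.where(eats, rng.lognormal(mu_amt[i], 0.5, n_days), 0.0)
    conc = rng.lognormal(-6.0, 1.0, n_days)   # concentration per day
    exceed_days[i] = (amount * conc > limit).mean()

# fraction of the population exceeding the limit on more than 1% of days
print((exceed_days > 0.01).mean())
```

This reproduces the quantity the paper argues for: the fraction of the population as a function of the fraction of days on which an individual exceeds the limit, rather than a naive count over pooled raw intakes.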
Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean
2018-05-01
A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Owing to the complex design and the inherent characteristics of the system's input material, a number of stochastic variables may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling engine (SE) power output in the range of 61.5-73 W at a 95% confidence interval (CI). In addition, there is a high probability (at 95% CI) that the NMT system can achieve a positive net power output between 15.8 and 35 W. A sensitivity study reveals that the system's power performance is mostly affected by the SE heater temperature. Investigation of the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared with conventional systems. The results of the probabilistic analysis can better inform future improvements to the system design and operational strategy, and this probabilistic assessment framework can also be applied to similar complex engineering systems.
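The non-intrusive surrogate idea can be sketched generically: fit an ANN to a small set of deterministic runs, then run cheap Monte Carlo on the surrogate. A toy function and scikit-learn stand in for the study's thermodynamic simulator:

```python
# Non-intrusive surrogate sketch: ANN fitted to a few deterministic runs,
# then Monte Carlo propagation on the surrogate (toy simulator only).
import numpy as np
from sklearn.neural_network import MLPRegressor

def simulator(x):                       # placeholder process model
    return 60 + 10 * x[:, 0] - 5 * x[:, 1] ** 2

rng = np.random.default_rng(8)
X_design = rng.random((50, 2))          # design-of-experiments runs
ann = MLPRegressor((32, 32), max_iter=5000,
                   random_state=0).fit(X_design, simulator(X_design))

X_mc = rng.random((100_000, 2))         # cheap MCS on the surrogate
power = ann.predict(X_mc)
print(np.percentile(power, [2.5, 50, 97.5]))   # e.g., SE power output band
```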
Probabilistic methods for sensitivity analysis and calibration in the NASA challenge problem
Safta, Cosmin; Sargsyan, Khachik; Najm, Habib N.; ...
2015-01-01
In this study, a series of algorithms are proposed to address the problems in the NASA Langley Research Center Multidisciplinary Uncertainty Quantification Challenge. A Bayesian approach is employed to characterize and calibrate the epistemic parameters based on the available data, whereas a variance-based global sensitivity analysis is used to rank the epistemic and aleatory model parameters. A nested sampling of the aleatory–epistemic space is proposed to propagate uncertainties from model parameters to output quantities of interest.
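The nested aleatory-epistemic sampling can be sketched as a two-level loop; the toy model below stands in for the challenge problem's richer parameter structure:

```python
# Nested aleatory-epistemic sampling: the outer loop draws epistemic
# parameters, the inner loop draws aleatory variables, giving one output
# CDF per epistemic draw (a "horsetail" of CDFs). Toy model only.
import numpy as np

rng = np.random.default_rng(9)

def model(e, a):
    return e * a + a ** 2               # stand-in for the challenge model

cdfs = []
for _ in range(100):                    # epistemic loop
    e = rng.uniform(0.5, 1.5)
    a = rng.normal(0.0, 1.0, 10_000)    # aleatory loop (vectorized)
    cdfs.append(np.sort(model(e, a)))   # empirical CDF for this draw
band = np.percentile(np.array(cdfs), [5, 95], axis=0)   # CDF envelope
print(band[:, ::2000])
```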
NASA Astrophysics Data System (ADS)
Baer, P.; Mastrandrea, M.
2006-12-01
Simple probabilistic models that attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Policymakers will thus be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward use in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question of what it is reasonable to believe, and to act as if we believe. As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated with regard to the reasonableness of the outputs it calculates from different input PDFs. We suggest further that where there is insufficient evidence to clearly favor one range of probabilistic projections over another, the choice of results on which to base policy must necessarily involve ethical considerations, as it has inevitable consequences for the distribution of risk. In particular, the choice to use a more "optimistic" PDF for climate sensitivity (or other components of the causal chain) allows higher emissions consistent with any specified risk-reduction goal, and thus leads to higher climate impacts in exchange for lower mitigation costs.
Using probabilistic theory to develop interpretation guidelines for Y-STR profiles.
Taylor, Duncan; Bright, Jo-Anne; Buckleton, John
2016-03-01
Y-STR profiling makes up a small but important proportion of forensic DNA casework. Y-STR profiles are often used when autosomal profiling has failed to yield an informative result; consequently, they are often obtained from the most challenging samples. In addition, Y-STR loci are linked, meaning that evaluations of haplotype probabilities are based either on overly simplified counting methods or on computationally costly genetic models, neither of which extends well to the evaluation of mixed Y-STR data. For all of these reasons, Y-STR data analysis has not seen the same advances as autosomal STR data analysis. We present here a probabilistic model for the interpretation of Y-STR data. Because probabilistic systems for Y-STR data are still some way from reaching active casework, we also describe how data can be analysed in a continuous way to generate interpretational thresholds and guidelines. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Identifying a Probabilistic Boolean Threshold Network From Samples.
Melkman, Avraham A; Cheng, Xiaoqing; Ching, Wai-Ki; Akutsu, Tatsuya
2018-04-01
This paper studies the problem of exactly identifying the structure of a probabilistic Boolean network (PBN) from a given set of samples, where PBNs are probabilistic extensions of Boolean networks. Cheng et al. studied the problem while focusing on PBNs consisting of pairs of AND/OR functions. This paper considers PBNs consisting of Boolean threshold functions, focusing on those threshold functions that have unit coefficients. The treatment of Boolean threshold functions, and of triplets and larger tuples of such functions, necessitates a deepening of the theoretical analyses. It is shown that wide classes of PBNs with such threshold functions can be exactly identified from samples under reasonable constraints, which include: 1) PBNs in which any number of threshold functions can be assigned, provided that all have the same number of input variables; and 2) PBNs consisting of pairs of threshold functions with different numbers of input variables. It is also shown that the problem of deciding the equivalence of two Boolean threshold functions is solvable in pseudo-polynomial time but remains co-NP-complete.
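For concreteness, a unit-coefficient Boolean threshold function and a brute-force consistency check against samples; this is a toy version of the identification problem, not the paper's algorithm:

```python
# Unit-coefficient Boolean threshold function and a brute-force search for
# (input set, threshold) pairs consistent with observed samples.
import numpy as np
from itertools import combinations

def threshold_fn(bits, theta):
    # fires iff at least theta of its inputs are 1
    return int(bits.sum() >= theta)

def consistent_rules(X, y, k):
    # all (input set of size k, theta) pairs consistent with samples (X, y)
    n_vars = X.shape[1]
    return [(idx, theta)
            for idx in combinations(range(n_vars), k)
            for theta in range(k + 1)
            if all(threshold_fn(x[list(idx)], theta) == t
                   for x, t in zip(X, y))]

X = np.array([[1, 0, 1], [1, 1, 1], [0, 0, 1], [0, 1, 0]])
y = np.array([1, 1, 0, 0])
print(consistent_rules(X, y, 2))   # recovers ((0, 2), 2), i.e. x0 AND x2
```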
Henderson, Sarah B; Gauld, Jillian S; Rauch, Stephen A; McLean, Kathleen E; Krstic, Nikolas; Hondula, David M; Kosatsky, Tom
2016-11-15
Most excess deaths that occur during extreme hot weather events do not have natural heat recorded as an underlying or contributing cause. This study aims to identify the specific individuals who died because of hot weather using only secondary data. A novel approach was developed in which the expected number of deaths was repeatedly sampled from all deaths that occurred during a hot weather event, and compared with deaths during a control period. The deaths were compared with respect to five factors known to be associated with hot weather mortality. Individuals were ranked by their presence in significant models over 100 trials of 10,000 repetitions. Those with the highest rankings were identified as probable excess deaths. Sensitivity analyses were performed on a range of model combinations. These methods were applied to a 2009 hot weather event in greater Vancouver, Canada. The excess deaths identified were sensitive to differences in model combinations, particularly between univariate and multivariate approaches. One multivariate and one univariate combination were chosen as the best models for further analyses. The individuals identified by multiple combinations suggest that marginalized populations in greater Vancouver are at higher risk of death during hot weather. This study proposes novel methods for classifying specific deaths as expected or excess during a hot weather event. Further work is needed to evaluate performance of the methods in simulation studies and against clinically identified cases. If confirmed, these methods could be applied to a wide range of populations and events of interest.
Placental alpha-microglobulin-1 and combined traditional diagnostic test: a cost-benefit analysis.
Echebiri, Nelson C; McDoom, M Maya; Pullen, Jessica A; Aalto, Meaghan M; Patel, Natasha N; Doyle, Nora M
2015-01-01
We sought to evaluate whether the placental alpha-microglobulin (PAMG)-1 test, vs the combined traditional diagnostic test (CTDT) of pooling, nitrazine, and ferning, would be a cost-beneficial screening strategy in the setting of potential preterm premature rupture of membranes. A decision analysis model was used to estimate the economic impact of the PAMG-1 test vs the CTDT on preterm delivery costs from a societal perspective. Our primary outcome was the annual net cost-benefit per person tested. Baseline probabilities and cost assumptions were derived from the published literature. We conducted sensitivity analyses using both deterministic and probabilistic models. Cost estimates reflect 2013 US dollars. The annual net benefit from PAMG-1 was $20,014 per person tested, while the CTDT had a net benefit of $15,757 per person tested. If the probability of rupture is <38%, PAMG-1 will be cost-beneficial, with an annual net benefit of $16,000-37,000 per person tested, while the CTDT will have an annual net benefit of $16,000-19,500 per person tested. If the probability of rupture is >38%, the CTDT is more cost-beneficial. Monte Carlo simulations of 1 million trials selected PAMG-1 as the optimal strategy with a frequency of 89%, while the CTDT was selected as the optimal strategy with a frequency of only 11%. Sensitivity analyses were robust. Our cost-benefit analysis provides economic evidence for the adoption of PAMG-1 in diagnosing preterm premature rupture of membranes in uncertain presentations, and when the CTDT is equivocal, at 34 to <37 weeks' gestation. Copyright © 2015 Elsevier Inc. All rights reserved.
Gu, Shuyan; Deng, Jing; Shi, Lizheng; Mu, Yiming; Dong, Hengjin
2015-01-01
This study aims to estimate the long-term cost-effectiveness of saxagliptin + metformin (SAXA + MET) vs glimepiride + metformin (GLI + MET) in patients with Type 2 diabetes mellitus (T2DM) inadequately controlled with MET in China. The Cardiff Model was used to simulate disease progression and estimate the long-term effect of treatments on patients. Systematic literature reviews and hospital surveys were conducted to obtain patient profiles, clinical data, and costs. Health insurance costs (2014 ¥) were estimated over a 40-year period. One-way and probabilistic sensitivity analyses were performed. SAXA + MET had lower predicted incidences of cardiovascular and hypoglycemia events and a decreased total cost compared with GLI + MET (¥241,072,807 vs ¥285,455,177). There were increased numbers of quality-adjusted life-years (QALYs; 1.01/patient) and life-years (LYs; 0.03/patient) gained with SAXA + MET compared with GLI + MET, and the incremental cost of SAXA + MET vs GLI + MET (-¥44,382) resulted in -¥43,883/QALY and -¥1,710,926/LY gained with SAXA + MET. Sensitivity analyses confirmed that the results were robust. In patients with T2DM in China, SAXA + MET was more cost-effective and was well tolerated with fewer adverse effects (AEs) compared with GLI + MET. As a second-line therapy for T2DM, SAXA may address some of the unmet medical needs attributable to AEs in the treatment of T2DM.
Saito, Shota; Shimizu, Utako; Nan, Zhang; Mandai, Nozomu; Yokoyama, Junji; Terajima, Kenshi; Akazawa, Kouhei
2013-03-01
Combination therapy with infliximab (IFX) and azathioprine (AZA) is significantly more effective for treatment of active Crohn's disease (CD) than IFX monotherapy. However, AZA is associated with an increased risk of lymphoma in patients with inflammatory bowel disease. To evaluate the cost-effectiveness of combination therapy with IFX plus AZA for drug-refractory CD, a decision analysis model is constructed to compare, over a time horizon of 1 year, the cost-effectiveness of combination therapy with IFX plus AZA and that of IFX monotherapy for CD patients refractory to conventional non-anti-TNF-α therapy. The treatment efficacy, adverse effects, quality-of-life scores, and treatment costs are derived from published data. One-way and probabilistic sensitivity analyses are performed to estimate the uncertainty in the results. The incremental cost-effectiveness ratio (ICER) of combination therapy with IFX plus AZA is 24,917 GBP/QALY when compared with IFX monotherapy. The sensitivity analyses reveal that the utility score of nonresponding active disease has the strongest influence on the cost-effectiveness, with ICERs ranging from 17,147 to 45,564 GBP/QALY. Assuming that policy makers are willing to pay 30,000 GBP/QALY, the probability that combination therapy with IFX plus AZA is cost-effective is 0.750. Combination therapy with IFX plus AZA appears to be a cost-effective treatment for drug-refractory CD when compared with IFX monotherapy. Furthermore, the additional lymphoma risk of combination therapy has little effect on its cost-effectiveness. Copyright © 2012 European Crohn's and Colitis Organisation. Published by Elsevier B.V. All rights reserved.
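The two headline quantities in abstracts such as this one, the ICER and the probability of cost-effectiveness at a willingness-to-pay threshold, fall directly out of probabilistic sensitivity analysis output. A hedged sketch with simulated placeholder draws, not the trial-derived distributions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# hypothetical PSA draws for combination therapy vs monotherapy
d_cost = rng.normal(5_000, 1_500, n)   # incremental cost, GBP
d_qaly = rng.normal(0.20, 0.08, n)     # incremental QALYs

icer = d_cost.mean() / d_qaly.mean()   # ICER = mean(dC) / mean(dE)

# probability cost-effective at WTP = 30,000 GBP/QALY: fraction of draws
# with positive incremental net monetary benefit, NMB = WTP*dE - dC
wtp = 30_000
p_ce = np.mean(wtp * d_qaly - d_cost > 0)

print(f"ICER = {icer:,.0f} GBP/QALY; P(cost-effective at 30k) = {p_ce:.3f}")
```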
Matter-Walstra, K; Ruhstaller, T; Klingbiel, D; Schwenkglenks, M; Dedes, K J
2016-07-01
Endocrine therapy continues to be the optimal systemic treatment for metastatic ER(+)HER2(-) breast cancer. The CDK4/6 inhibitor palbociclib combined with letrozole has recently been shown to significantly improve progression-free survival. Here we examined the cost-effectiveness of this regimen for the Swiss healthcare system. A Markov cohort simulation based on the PALOMA-1 trial (Finn et al. in Lancet Oncol 16:25-35, 2015) was used to model the clinical course. Input parameters were based on summary trial data. Costs were assessed from the Swiss healthcare system perspective. Adding palbociclib to letrozole (PALLET) compared to letrozole monotherapy was estimated to cost an additional CHF342,440 and gain 1.14 quality-adjusted life years, resulting in an incremental cost-effectiveness ratio (ICER) of CHF301,227/QALY gained. In univariate sensitivity analyses, no tested variation in key parameters resulted in an ICER below a willingness-to-pay threshold of CHF100,000/QALY. PALLET had a 0% probability of being cost-effective in probabilistic sensitivity analyses. Lowering PALLET's price by 75% resulted in an ICER of CHF73,995/QALY and a 73% probability of being cost-effective. At current prices, PALLET would cost the Swiss healthcare system an additional CHF155 million/year. Palbociclib plus letrozole cannot be considered cost-effective for the first-line treatment of patients with metastatic breast cancer in the Swiss healthcare system.
Barkun, Alan N; Adam, Viviane; Martel, Myriam; AlNaamani, Khalid; Moses, Peter L
2015-01-01
BACKGROUND/OBJECTIVE: Partially covered self-expandable metal stents (SEMS) and polyethylene stents (PES) are both commonly used in the palliation of malignant biliary obstruction. Although SEMS are significantly more expensive, they are more efficacious than PES. Accordingly, a cost-effectiveness analysis was performed. METHODS: A cost-effectiveness analysis compared the approach of initial placement of PES versus SEMS for the study population. Patients with malignant biliary obstruction underwent an endoscopic retrograde cholangiopancreatography to insert the initial stent. If the insertion failed, a percutaneous transhepatic cholangiogram was performed. If stent occlusion occurred, a PES was inserted at repeat endoscopic retrograde cholangiopancreatography, either in an outpatient setting or after admission to hospital if cholangitis was present. A third-party payer perspective was adopted. Effectiveness was expressed as the likelihood of no occlusion over the adopted one-year time horizon. Probabilities were based on a contemporary randomized clinical trial, and costs were drawn from national references. Deterministic and probabilistic sensitivity analyses were performed. RESULTS: A PES-first strategy was both more expensive and less efficacious than an SEMS-first approach. The mean per-patient costs were US$6,701 for initial SEMS and US$20,671 for initial PES, which were associated with effectiveness probabilities of 65.6% and 13.9%, respectively. Sensitivity analyses confirmed the robustness of these results. CONCLUSION: At the time of initial endoscopic drainage for patients with malignant biliary obstruction undergoing palliative stenting, an initial SEMS insertion approach was both more effective and less costly than a PES-first strategy. PMID:26125107
Legleye, Stéphane
2018-06-01
The Cannabis Abuse Screening Test (CAST) aims to screen for problematic cannabis use. It has never been validated against the Diagnostic and Statistical Manual of Mental Disorders (DSM)-5, and its relationship with the latter has never been studied. We used a probabilistic telephone survey collected in 2014 (1351 past-year cannabis users aged 15-64) implementing the CAST and a DSM-5 adaptation of the Munich Composite International Diagnostic Interview assessing cannabis use disorders. Data were weighted, and CAST items were treated as categorical. Factorial structures were assessed with confirmatory factor analyses; the relationships between the instruments were studied with multiple factor analysis (MFA). One factor for the DSM-5 and two correlated factors for the CAST were the best confirmatory factor analysis solutions. The CAST thresholds for screening moderate/severe and severe cannabis use disorders were 5 (sensitivity = 78.2% and specificity = 79.6%) and 8 (sensitivity = 86.0% and specificity = 86.7%), respectively. The MFA identified two orthogonal dimensions: the first was equally shared by both instruments; the second was the second CAST dimension (extreme frequencies of use before midday and alone, memory problems, and reproaches from friends/family). The CAST structure and screening properties were confirmed. The MFA explains the CAST's screening performance through its first dimension and identifies the problematic patterns (the second dimension) that are not captured by the DSM-5. Copyright © 2017 John Wiley & Sons, Ltd.
Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls
NASA Astrophysics Data System (ADS)
Guha Ray, A.; Baidya, D. K.
2012-09-01
Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall highlights the fact that high sensitivity of a particular variable for a particular mode of failure does not necessarily imply a remarkable contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (R_f) for each random variable based on the combined effects of the failure probability (P_f) of each mode of failure of a gravity retaining wall and the sensitivity of each of the random variables to these failure modes. P_f is calculated by Monte Carlo simulation, and the sensitivity analysis of each random variable is carried out by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that R_f for the friction angle of the backfill soil (φ_1) increases and that for the cohesion of the foundation soil (c_2) decreases with an increase in the variation of φ_1, while R_f for the unit weights (γ_1 and γ_2) of both soils and for the friction angle of the foundation soil (φ_2) remains almost constant under variation of the soil properties. The results compared well with some existing deterministic and probabilistic methods, and the approach was found to be cost-effective. It is seen that if the variation of φ_1 remains within 5%, a significant reduction in cross-sectional area can be achieved, but if the variation is more than 7-8%, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.
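A rough illustration of the Monte Carlo step, estimating P_f for a single sliding-type limit state of a toy wall; all distributions, geometry, and the limit-state function below are invented for illustration and are not the paper's design values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# hypothetical random variables (means and coefficients of variation invented)
phi1 = rng.normal(34.0, 34.0 * 0.07, n)    # backfill friction angle, deg
gamma1 = rng.normal(18.0, 18.0 * 0.03, n)  # backfill unit weight, kN/m^3
c2 = rng.normal(20.0, 20.0 * 0.10, n)      # foundation cohesion, kPa

# toy sliding limit state g = resistance - demand; g < 0 means failure
ka = np.tan(np.radians(45.0 - phi1 / 2.0)) ** 2  # active earth pressure coeff.
demand = 0.5 * ka * gamma1 * 8.0 ** 2            # active thrust on an 8 m wall
resistance = 4.0 * c2 + 100.0                    # base adhesion + friction term

pf = np.mean(resistance - demand < 0)            # Monte Carlo estimate of P_f
print(f"P_f = {pf:.4f} from {n} samples")
```

Repeating this estimate for each failure mode and crossing it with the per-variable F-test sensitivities yields the combined risk factor R_f described above.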
Nonequilibrium Probabilistic Dynamics of the Logistic Map at the Edge of Chaos
NASA Astrophysics Data System (ADS)
Borges, Ernesto P.; Tsallis, Constantino; Añaños, Garín F.; de Oliveira, Paulo Murilo
2002-12-01
We consider nonequilibrium probabilistic dynamics in logistic-like maps x_{t+1} = 1 - a|x_t|^z (z > 1) at their chaos threshold: we first introduce many initial conditions within one among W >> 1 intervals partitioning the phase space and focus on the unique value q_sen < 1 for which the entropic form S_q ≡ (1 - ∑_{i=1}^{W} p_i^q)/(q - 1) …
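The entropic form is cut off by extraction; for reference, the standard nonextensive (Tsallis) entropy it denotes is, in clean notation (the q → 1 limit, added here for context, recovers the usual Boltzmann-Gibbs-Shannon form):

```latex
S_q \equiv \frac{1 - \sum_{i=1}^{W} p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -\sum_{i=1}^{W} p_i \ln p_i .
```

In this literature, q_sen is characterized as the value of q for which S_q grows linearly with time at the edge of chaos; that characterization is supplied here from the general nonextensive-dynamics literature rather than from the truncated text itself.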
Probabilistic direct counterfactual quantum communication
NASA Astrophysics Data System (ADS)
Zhang, Sheng
2017-02-01
It is striking that the quantum Zeno effect can be used to launch a direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality comes at a price. First, the transmission time is much longer than that of a classical transmission. Second, the chained-cycle structure makes such protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol could evolve to a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).
The Importance of Calibration in Clinical Psychology.
Lindhiem, Oliver; Petersen, Isaac T; Mentch, Lucas K; Youngstrom, Eric A
2018-02-01
Accuracy has several elements, not all of which have received equal attention in the field of clinical psychology. Calibration, the degree to which a probabilistic estimate of an event reflects the true underlying probability of the event, has largely been neglected in the field of clinical psychology in favor of other components of accuracy such as discrimination (e.g., sensitivity, specificity, area under the receiver operating characteristic curve). Although it is frequently overlooked, calibration is a critical component of accuracy with particular relevance for prognostic models and risk-assessment tools. With advances in personalized medicine and the increasing use of probabilistic (0% to 100%) estimates and predictions in mental health research, the need for careful attention to calibration has become increasingly important.
Trastuzumab in early stage breast cancer: a cost-effectiveness analysis for Belgium.
Neyt, Mattias; Huybrechts, Michel; Hulstaert, Frank; Vrijens, France; Ramaekers, Dirk
2008-08-01
Although trastuzumab is traditionally used in metastatic breast cancer treatment, studies have reported on the efficacy and safety of trastuzumab in the adjuvant setting for the treatment of early stage breast cancer in HER2+ tumors. We estimated the cost-effectiveness and budget impact of reimbursing trastuzumab in this indication from a payer's perspective. We constructed a health economic model. Long-term consequences of preventing patients from progressing to metastatic breast cancer and side effects such as congestive heart failure were taken into account. Uncertainty was handled by applying probabilistic modeling and through probabilistic sensitivity analyses. In the HERA scenario, applying an arbitrary threshold of €30,000 per life-year gained, early stage breast cancer treatment with trastuzumab is cost-effective for 9 out of 15 analyzed subgroups (according to age and stage). In contrast, treatment according to the FinHer scenario is cost-effective in 14 subgroups. Furthermore, the FinHer regimen is most of the time cost saving, with average incremental costs of €668, -€1,045, and -€6,869 for stage I, II, and III breast cancer patients, respectively, whereas the HERA regimen is never cost saving due to the higher initial treatment costs. The model shows better cost-effectiveness for the 9-week initial treatment (FinHer) compared to no trastuzumab treatment than for the 1-year post-chemotherapy treatment (HERA). Both from a medical and an economic point of view, the 9-week initial treatment regimen with trastuzumab shows promising results and justifies the initiation of a large comparative trial with a 1-year regimen.
Reliability and Creep/Fatigue Analysis of a CMC Component
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.; Gyekenyesi, John Z.; Gyekenyesi, John P.
2007-01-01
High temperature ceramic matrix composites (CMC) are being explored as viable candidate materials for hot section gas turbine components. These advanced composites can potentially lead to reduced weight and enable higher operating temperatures requiring less cooling; thus leading to increased engine efficiencies. There is a need for convenient design tools that can accommodate various loading conditions and material data with their associated uncertainties to estimate the minimum predicted life as well as the failure probabilities of a structural component. This paper presents a review of the life prediction and probabilistic analyses performed for a CMC turbine stator vane. A computer code, NASALife, is used to predict the life of a 2-D woven silicon carbide fiber reinforced silicon carbide matrix (SiC/SiC) turbine stator vane due to a mission cycle which induces low cycle fatigue and creep. The output from this program includes damage from creep loading, damage due to cyclic loading and the combined damage due to the given loading cycle. Results indicate that the trends predicted by NASALife are as expected for the loading conditions used for this study. In addition, a combination of woven composite micromechanics, finite element structural analysis and Fast Probability Integration (FPI) techniques has been used to evaluate the maximum stress and its probabilistic distribution in a CMC turbine stator vane. Input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Results indicate that reducing the scatter in proportional limit strength of the vane material has the greatest effect in improving the overall reliability of the CMC vane.
NASA Technical Reports Server (NTRS)
Duda, David P.; Minnis, Patrick
2009-01-01
Previous studies have shown that probabilistic forecasting may be a useful method for predicting persistent contrail formation. A probabilistic forecast to accurately predict contrail formation over the contiguous United States (CONUS) is created by using meteorological data based on hourly meteorological analyses from the Advanced Regional Prediction System (ARPS) and from the Rapid Update Cycle (RUC) as well as GOES water vapor channel measurements, combined with surface and satellite observations of contrails. Two groups of logistic models were created. The first group of models (SURFACE models) is based on surface-based contrail observations supplemented with satellite observations of contrail occurrence. The second group of models (OUTBREAK models) is derived from a selected subgroup of satellite-based observations of widespread persistent contrails. The mean accuracies for both the SURFACE and OUTBREAK models typically exceeded 75 percent when based on the RUC or ARPS analysis data, but decreased when the logistic models were derived from ARPS forecast data.
What's in a Name? Probabilistic Inference of Religious Community from South Asian Names
ERIC Educational Resources Information Center
Susewind, Raphael
2015-01-01
Fine-grained data on religious communities are often considered sensitive in South Asia and consequently remain inaccessible. Yet without such data, statistical research on communal relations and group-based inequality remains superficial, hampering the development of appropriate policy measures to prevent further social exclusion on the basis of…
Population viability assessment of salmonids by using probabilistic networks
Danny C. Lee; Bruce E. Rieman
1997-01-01
Public agencies are being asked to quantitatively assess the impact of land management activities on sensitive populations of salmonids. To aid in these assessments, we developed a Bayesian viability assessment procedure (BayVAM) to help characterize land use risks to salmonids in the Pacific Northwest. This procedure incorporates a hybrid approach to viability...
Reliability and performance evaluation of systems containing embedded rule-based expert systems
NASA Technical Reports Server (NTRS)
Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.
1989-01-01
A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
NASA Technical Reports Server (NTRS)
1971-01-01
The optimal allocation of resources to the national space program over an extended time period requires the solution of a large combinatorial problem in which the program elements are interdependent. The computer model uses an accelerated search technique to solve this problem. The model contains a large number of options selectable by the user to provide flexible input and a broad range of output for use in sensitivity analyses of all entering elements. Examples of these options are budget smoothing under varied appropriation levels, entry of inflation and discount effects, and probabilistic output which provides quantified degrees of certainty that program costs will remain within planned budget. Criteria and related analytic procedures were established for identifying potential new space program directions. Used in combination with the optimal resource allocation model, new space applications can be analyzed in realistic perspective, including the advantage gain from existing space program plant and on-going programs such as the space transportation system.
Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Bryant, Larry
2014-01-01
Command File Errors (CFEs) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing CFE rates. More recently, probabilistic modeling techniques have been used for more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, composed of the hardware, software, and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root causes and sensitivities of the various components of the mission system and to develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.
Probabilistic techniques for obtaining accurate patient counts in Clinical Data Warehouses
Myers, Risa B.; Herskovic, Jorge R.
2011-01-01
Proposal and execution of clinical trials, computation of quality measures and discovery of correlation between medical phenomena are all applications where an accurate count of patients is needed. However, existing sources of this type of patient information, including Clinical Data Warehouses (CDW), may be incomplete or inaccurate. This research explores applying probabilistic techniques, supported by the MayBMS probabilistic database, to obtain accurate patient counts from a clinical data warehouse containing synthetic patient data. We present a synthetic clinical data warehouse (CDW), and populate it with simulated data using a custom patient data generation engine. We then implement, evaluate, and compare different techniques for obtaining patient counts. We model billing as a test for the presence of a condition. We compute billing's sensitivity and specificity both by conducting a "Simulated Expert Review," where a representative sample of records is reviewed and labeled by experts, and by obtaining the ground truth for every record. We compute the posterior probability of a patient having a condition through a "Bayesian Chain," using Bayes' Theorem to calculate the probability of a patient having a condition after each visit. The second method is a "one-shot" approach that computes the probability of a patient having a condition based on whether the patient is ever billed for the condition. Our results demonstrate the utility of probabilistic approaches, which improve on the accuracy of raw counts. In particular, the simulated review paired with a single application of Bayes' Theorem produces the best results, with an average error rate of 2.1% compared to 43.7% for the straightforward billing counts. Overall, this research demonstrates that Bayesian probabilistic approaches improve patient counts on simulated patient populations. We believe that total patient counts based on billing data are one of the many possible applications of our Bayesian framework. Use of these probabilistic techniques will enable more accurate patient counts and better results for applications requiring this metric. PMID:21986292
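A hedged sketch of the per-visit update the abstract calls a "Bayesian Chain," treating a billing code as a diagnostic test with known sensitivity and specificity; the numeric values and visit pattern are placeholders, not the paper's estimates.

```python
def bayes_chain(prior, visits_billed, sensitivity=0.85, specificity=0.95):
    """Update P(condition) after each visit via Bayes' theorem, treating a
    billing code as a test with the given sensitivity/specificity."""
    p = prior
    for billed in visits_billed:
        if billed:   # positive "test": patient was billed for the condition
            num = sensitivity * p
            den = sensitivity * p + (1 - specificity) * (1 - p)
        else:        # negative "test": no billing at this visit
            num = (1 - sensitivity) * p
            den = (1 - sensitivity) * p + specificity * (1 - p)
        p = num / den
    return p

# patient billed on 2 of 3 visits, starting from a 10% prevalence prior
p_condition = bayes_chain(0.10, [True, False, True])
print(f"posterior P(condition) = {p_condition:.3f}")

# a probabilistic patient count is then the sum of posteriors over patients
```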
Integrated Risk-Informed Decision-Making for an ALMR PRISM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, Michael David; Belles, Randy; Denning, Richard S.
Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent process, or a decision-making process. The overall objective of this work is that the generalized framework be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory. These allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates its position within the challenge state, its trajectory, and its margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of deterministic calculations using multi-physics analyses with probabilistic safety calculations allows for the examination and quantification of margin recovery strategies; the thermal-hydraulics analyses thereby validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic display panels at a modern nuclear power plant.
Wali, Arvin R; Park, Charlie C; Santiago-Dieppa, David R; Vaida, Florin; Murphy, James D; Khalessi, Alexander A
2017-06-01
OBJECTIVE Rupture of large or giant intracranial aneurysms leads to significant morbidity, mortality, and health care costs. Both coiling and the Pipeline embolization device (PED) have been shown to be safe and clinically effective for the treatment of unruptured large and giant intracranial aneurysms; however, the relative cost-to-outcome ratio is unknown. The authors present the first cost-effectiveness analysis to compare the economic impact of the PED compared with coiling or no treatment for the endovascular management of large or giant intracranial aneurysms. METHODS A Markov model was constructed to simulate a 60-year-old woman with a large or giant intracranial aneurysm considering a PED, endovascular coiling, or no treatment in terms of neurological outcome, angiographic outcome, retreatment rates, procedural and rehabilitation costs, and rupture rates. Transition probabilities were derived from prior literature reporting outcomes and costs of PED, coiling, and no treatment for the management of aneurysms. Incremental cost-effectiveness ratios (ICERs) were defined as the difference in costs divided by the difference in quality-adjusted life years (QALYs); ICERs below $50,000/QALY gained were considered cost-effective. To study parameter uncertainty, 1-way, 2-way, and probabilistic sensitivity analyses were performed. RESULTS The base-case model demonstrated lifetime QALYs of 12.72 for patients in the PED cohort, 12.89 for the endovascular coiling cohort, and 9.7 for patients in the no-treatment cohort. Lifetime rehabilitation and treatment costs were $59,837.52 for PED; $79,025.42 for endovascular coiling; and $193,531.29 in the no-treatment cohort. Patients who did not undergo elective treatment were subject to increased rates of aneurysm rupture and high treatment and rehabilitation costs. One-way sensitivity analysis demonstrated that the model was most sensitive to assumptions about the costs and mortality risks for PED and coiling. Probabilistic sampling demonstrated that PED was the cost-effective strategy in 58.4% of iterations, coiling was the cost-effective strategy in 41.4% of iterations, and the no-treatment option was the cost-effective strategy in only 0.2% of iterations. CONCLUSIONS The authors' cost-effectiveness model demonstrated that elective endovascular techniques such as PED and endovascular coiling are cost-effective strategies for improving health outcomes and lifetime quality-of-life measures in patients with large or giant unruptured intracranial aneurysms.
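The reported percentages of iterations in which each strategy is cost-effective come from comparing net monetary benefit across strategies draw by draw. A sketch with invented PSA distributions loosely centered on the base-case values above (the spreads and distributional forms are assumptions, not the authors' model):

```python
import numpy as np

rng = np.random.default_rng(3)
n, wtp = 10_000, 50_000

# hypothetical PSA draws of (cost in $, QALYs) per strategy
draws = {
    "PED":     (rng.normal(59_838, 8_000, n),   rng.normal(12.72, 0.60, n)),
    "coiling": (rng.normal(79_025, 10_000, n),  rng.normal(12.89, 0.60, n)),
    "none":    (rng.normal(193_531, 25_000, n), rng.normal(9.70, 0.60, n)),
}

# net monetary benefit per draw; the strategy with the highest NMB "wins"
nmb = np.vstack([wtp * q - c for c, q in draws.values()])
winners = np.argmax(nmb, axis=0)
for i, name in enumerate(draws):
    print(f"{name}: cost-effective in {np.mean(winners == i):.1%} of iterations")
```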
Chowdhury, Enayet K.; Ademi, Zanfina; Moss, John R.; Wing, Lindon M.H.; Reid, Christopher M.
2015-01-01
Abstract The objective of this study was to examine the cost-effectiveness of angiotensin-converting enzyme inhibitor (ACEI)-based treatment compared with thiazide diuretic-based treatment for hypertension in elderly Australians, considering diabetes as an outcome along with cardiovascular outcomes, from the Australian government's perspective. We used a cost-utility analysis to estimate the incremental cost-effectiveness ratio (ICER) per quality-adjusted life-year (QALY) gained. Data on cardiovascular events and new onset of diabetes were taken from the Second Australian National Blood Pressure Study, a randomized clinical trial comparing diuretic-based (hydrochlorothiazide) versus ACEI-based (enalapril) treatment in 6083 elderly (age ≥65 years) hypertensive patients over a median 4.1-year period. For this economic analysis, the total study population was stratified into 2 groups. Group A was restricted to participants diabetes-free at baseline (n = 5642); group B was restricted to participants with preexisting diabetes mellitus (type 1 or type 2) at baseline (n = 441). Utility scores for different events were taken from the published literature, whereas treatment and adverse event management costs were calculated from direct health care costs available from Australian government reimbursement data. Costs and QALYs were discounted at 5% per annum. One-way and probabilistic sensitivity analyses were performed to assess the uncertainty around utility and cost data. After a treatment period of 5 years, for group A, the ICER was Australian dollars (AUD) 27,698 (€18,004; AUD 1 = €0.65) per QALY gained comparing ACEI-based treatment with diuretic-based treatment (sensitive to the utility value for new-onset diabetes). In group B, ACEI-based treatment was a dominant strategy (both more effective and cost-saving). On probabilistic sensitivity analysis, the ICERs per QALY gained were always below AUD 50,000 for group B, whereas for group A, the probability of being below AUD 50,000 was 85%. Although the dispensed price of diuretic-based treatment of hypertension in the elderly is lower, upon considering the potential enhanced likelihood of the development of diabetes in addition to the costs of treating cardiovascular disease, ACEI-based treatment may be a more cost-effective strategy in this population. PMID:25738481
Improved Point-source Detection in Crowded Fields Using Probabilistic Cataloging
NASA Astrophysics Data System (ADS)
Portillo, Stephen K. N.; Lee, Benjamin C. G.; Daylan, Tansu; Finkbeiner, Douglas P.
2017-10-01
Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (~0.1 sources per pixel brighter than 22nd mag in F606W) Sloan Digital Sky Survey r-band image from M2. Probabilistic cataloging returns an ensemble of catalogs inferred from the image and thus can capture source-source covariance and deblending ambiguities. By comparing to a traditional catalog of the same image and a Hubble Space Telescope catalog of the same region, we show that our catalog ensemble better recovers sources from the image. It goes more than a magnitude deeper than the traditional catalog while having a lower false-discovery rate brighter than 20th mag. We also present an algorithm for reducing this catalog ensemble to a condensed catalog that is similar to a traditional catalog, except that it explicitly marginalizes over source-source covariances and nuisance parameters. We show that this condensed catalog has a similar completeness and false-discovery rate to the catalog ensemble. Future telescopes will be more sensitive, and thus more of their images will be crowded. Probabilistic cataloging performs better than existing software in crowded fields and so should be considered when creating photometric pipelines in the Large Synoptic Survey Telescope era.
Clinical Benefits, Costs, and Cost-Effectiveness of Neonatal Intensive Care in Mexico
Profit, Jochen; Lee, Diana; Zupancic, John A.; Papile, LuAnn; Gutierrez, Cristina; Goldie, Sue J.; Gonzalez-Pier, Eduardo; Salomon, Joshua A.
2010-01-01
Background Neonatal intensive care improves survival, but is associated with high costs and disability amongst survivors. Recent health reform in Mexico launched a new subsidized insurance program, necessitating informed choices on the different interventions that might be covered by the program, including neonatal intensive care. The purpose of this study was to estimate the clinical outcomes, costs, and cost-effectiveness of neonatal intensive care in Mexico. Methods and Findings A cost-effectiveness analysis was conducted using a decision analytic model of health and economic outcomes following preterm birth. Model parameters governing health outcomes were estimated from Mexican vital registration and hospital discharge databases, supplemented with meta-analyses and systematic reviews from the published literature. Costs were estimated on the basis of data provided by the Ministry of Health in Mexico and World Health Organization price lists, supplemented with published studies from other countries as needed. The model estimated changes in clinical outcomes, life expectancy, disability-free life expectancy, lifetime costs, disability-adjusted life years (DALYs), and incremental cost-effectiveness ratios (ICERs) for neonatal intensive care compared to no intensive care. Uncertainty around the results was characterized using one-way sensitivity analyses and a multivariate probabilistic sensitivity analysis. In the base-case analysis, neonatal intensive care for infants born at 24–26, 27–29, and 30–33 weeks gestational age prolonged life expectancy by 28, 43, and 34 years and averted 9, 15, and 12 DALYs, at incremental costs per infant of US$11,400, US$9,500, and US$3,000, respectively, compared to an alternative of no intensive care. The ICERs of neonatal intensive care at 24–26, 27–29, and 30–33 weeks were US$1,200, US$650, and US$240, per DALY averted, respectively. The findings were robust to variation in parameter values over wide ranges in sensitivity analyses. Conclusions Incremental cost-effectiveness ratios for neonatal intensive care imply very high value for money on the basis of conventional benchmarks for cost-effectiveness analysis. Please see later in the article for the Editors' Summary PMID:21179496
Economic evaluation of DNA ploidy analysis vs liquid-based cytology for cervical screening.
Nghiem, V T; Davies, K R; Beck, J R; Follen, M; MacAulay, C; Guillaud, M; Cantor, S B
2015-06-09
DNA ploidy analysis involves automated quantification of chromosomal aneuploidy, a potential marker of progression toward cervical carcinoma. We evaluated the cost-effectiveness of this method for cervical screening, comparing five ploidy strategies (using different numbers of aneuploid cells as cut points) with liquid-based Papanicolaou smear and no screening. A state-transition Markov model simulated the natural history of HPV infection and possible progression into cervical neoplasia in a cohort of 12-year-old females. The analysis evaluated cost in 2012 US$ and effectiveness in quality-adjusted life-years (QALYs) from a health-system perspective throughout a lifetime horizon in the US setting. We calculated incremental cost-effectiveness ratios (ICERs) to determine the best strategy. The robustness of optimal choices was examined in deterministic and probabilistic sensitivity analyses. In the base-case analysis, the ploidy 4 cell strategy was cost-effective, yielding an increase of 0.032 QALY and an ICER of $18 264/QALY compared to no screening. For most scenarios in the deterministic sensitivity analysis, the ploidy 4 cell strategy was the only cost-effective strategy. Cost-effectiveness acceptability curves showed that this strategy was more likely to be cost-effective than the Papanicolaou smear. Compared to the liquid-based Papanicolaou smear, screening with a DNA ploidy strategy appeared less costly and comparably effective.
Rivera, Fernando; Valladares, Manuel; Gea, Salvador; López-Martínez, Noemí
2017-06-01
To assess the cost-effectiveness of panitumumab in combination with mFOLFOX6 (oxaliplatin, 5-fluorouracil, and leucovorin) vs bevacizumab in combination with mFOLFOX6 as first-line treatment of patients with wild-type RAS metastatic colorectal cancer (mCRC) in Spain, a semi-Markov model was developed including the following health states: Progression free; Progressive disease: Treat with best supportive care; Progressive disease: Treat with subsequent active therapy; Attempted resection of metastases; Disease free after metastases resection; Progressive disease: after resection and relapse; and Death. Parametric survival analyses of patient-level progression-free survival and overall survival data from the PEAK Phase II clinical trial were used to estimate health state transitions. Additional data from the PEAK trial were considered for the dose and duration of therapy, the use of subsequent therapy, the occurrence of adverse events, and the incidence and probability of time to metastasis resection. Utility weightings were calculated from patient-level data from panitumumab trials evaluating first-, second-, and third-line treatments. The study was performed from the Spanish National Health System (NHS) perspective including only direct costs. A lifetime horizon was applied. Probabilistic sensitivity analyses and scenario sensitivity analyses were performed to assess the robustness of the model. Based on the PEAK trial, which demonstrated greater efficacy of panitumumab vs bevacizumab, both in combination with mFOLFOX6, as first-line treatment in wild-type RAS mCRC patients, the estimated incremental cost per life-year gained was €16,567 and the estimated incremental cost per quality-adjusted life-year gained was €22,794. The sensitivity analyses showed the model was robust to alternative parameters and assumptions. The analysis was based on a simulation model and, therefore, the results should be interpreted cautiously. Based on the PEAK Phase II clinical trial and taking into account Spanish costs, the results of the analysis showed that first-line treatment of mCRC with panitumumab + mFOLFOX6 could be considered a cost-effective option compared with bevacizumab + mFOLFOX6 for the Spanish NHS.
Risk-Based Probabilistic Approach to Aeropropulsion System Assessment
NASA Technical Reports Server (NTRS)
Tong, Michael T.
2002-01-01
In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines) with the fast probability integration technique (FPI). FPI was developed by Southwest Research Institute under contract with the NASA Glenn Research Center. The results were plotted in the form of cumulative distribution functions and sensitivity analyses and were compared with results from the traditional deterministic approach. The comparison showed that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system. The current work addressed the application of the probabilistic approach to assess specific fuel consumption, engine thrust, and weight. Similarly, the approach can be used to assess other aspects of aeropropulsion system performance, such as cost, acoustic noise, and emissions. Additional information is included in the original extended abstract.
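A toy version of the probabilistic assessment workflow: propagate component-level uncertainty through a performance surrogate and summarize the output as percentiles of the empirical cumulative distribution function. The power-law surrogate below is invented for illustration; it stands in for, and is in no way equivalent to, the NEPP/WATE/FPI tool chain described above.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50_000

# hypothetical component uncertainty distributions
eta_c = rng.triangular(0.86, 0.88, 0.90, n)  # compressor efficiency
eta_t = rng.triangular(0.88, 0.90, 0.92, n)  # turbine efficiency
t4 = rng.normal(1600.0, 25.0, n)             # turbine inlet temperature, K

# invented surrogate for thrust-specific fuel consumption (not a cycle code)
tsfc = 0.55 * (0.88 / eta_c) ** 0.8 * (0.90 / eta_t) ** 0.8 * (1600.0 / t4) ** 0.3

q05, q50, q95 = np.percentile(tsfc, [5, 50, 95])  # points on the empirical CDF
print(f"TSFC 5th/50th/95th percentiles: {q05:.4f} / {q50:.4f} / {q95:.4f}")
```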
Lesage, Elise; Aronson, Sarah E; Sutherland, Matthew T; Ross, Thomas J; Salmeron, Betty Jo; Stein, Elliot A
2017-06-01
Withdrawal from nicotine is an important contributor to smoking relapse. Understanding how reward-based decision making is affected by abstinence and by pharmacotherapies such as nicotine replacement therapy and varenicline tartrate may aid cessation treatment. To independently assess the effects of nicotine dependence and stimulation of the nicotinic acetylcholine receptor on the ability to interpret valence information (reward sensitivity) and subsequently alter behavior as reward contingencies change (cognitive flexibility) in a probabilistic reversal learning task. Nicotine-dependent smokers and nonsmokers completed a probabilistic reversal learning task during acquisition of functional magnetic resonance imaging (fMRI) in a 2-drug, double-blind placebo-controlled crossover design conducted from January 21, 2009, to September 29, 2011. Smokers were abstinent from cigarette smoking for 12 hours for all sessions. In a fully Latin square fashion, participants in both groups underwent MRI twice while receiving varenicline and twice while receiving a placebo pill, wearing either a nicotine or a placebo patch. Imaging analysis was performed from June 15, 2015, to August 10, 2016. A well-established computational model captured effects of smoking status and administration of nicotine and varenicline on probabilistic reversal learning choice behavior. Neural effects of smoking status, nicotine, and varenicline were tested for on MRI contrasts that captured reward sensitivity and cognitive flexibility. The study included 24 nicotine-dependent smokers (12 women and 12 men; mean [SD] age, 35.8 [9.9] years) and 20 nonsmokers (10 women and 10 men; mean [SD] age, 30.4 [7.2] years). Computational modeling indicated that abstinent smokers were biased toward response shifting and that their decisions were less sensitive to the available evidence, suggesting increased impulsivity during withdrawal. These behavioral impairments were mitigated with nicotine and varenicline. Similarly, decreased mesocorticolimbic activity associated with cognitive flexibility in abstinent smokers was restored to the level of nonsmokers following stimulation of nicotinic acetylcholine receptors (familywise error-corrected P < .05). Conversely, neural signatures of decreased reward sensitivity in smokers (vs nonsmokers; familywise error-corrected P < .05) in the dorsal striatum and anterior cingulate cortex were not mitigated by nicotine or varenicline. There was a double dissociation between the effects of chronic nicotine dependence on neural representations of reward sensitivity and acute effects of stimulation of nicotinic acetylcholine receptors on behavioral and neural signatures of cognitive flexibility in smokers. These chronic and acute pharmacologic effects were observed in overlapping mesocorticolimbic regions, suggesting that available pharmacotherapies may alleviate deficits in the same circuitry for certain mental computations but not for others. clinicaltrials.gov Identifier: NCT00830739.
NASA Astrophysics Data System (ADS)
Ghotbi, Abdoul R.
2014-09-01
The seismic behavior of skewed bridges has not been well studied compared to straight bridges. Skewed bridges have shown extensive damage, especially due to deck rotation, shear key failure, abutment unseating, and column-bent drift. This research therefore aims to study the behavior of skewed and straight highway overpass bridges, both with and without the effects of soil-structure interaction (SSI), under near-fault ground motions. Because of the several sources of uncertainty associated with the ground motions, soil, and structure, a probabilistic approach is needed. Thus, a probabilistic methodology similar to the one developed by the Pacific Earthquake Engineering Research Center (PEER) has been utilized to assess the probability of damage at various levels of shaking, using appropriate intensity measures with minimum dispersion. The probabilistic analyses were performed for various bridge configurations and site conditions, including sand ranging from loose to dense and clay ranging from soft to stiff, in order to evaluate the effects. The results showed that skewed bridges are considerably susceptible to deck rotation and shear key displacement. It was also found that SSI decreased the damage probability for various demands compared to the fixed-base model without SSI; however, deck rotation for all soil types, and abutment unseating for very loose sand and soft clay, showed an increase in damage probability compared to the fixed-base model. The damage probability for various demands was also found to decrease with increasing soil strength for both sandy and clayey sites. An increase in skew angle amplified the seismic response for various demands; deck rotation was particularly sensitive, increasing as the skew angle increased. Furthermore, abutment unseating showed an increasing trend with skew angle for both fixed-base and SSI models.
Diagnostic Validity of an Automated Probabilistic Tractography in Amnestic Mild Cognitive Impairment
Jung, Won Sang; Um, Yoo Hyun; Kang, Dong Woo; Lee, Chang Uk; Woo, Young Sup; Bahk, Won-Myong
2018-01-01
Objective Although several prior works have shown white matter (WM) integrity changes in amnestic mild cognitive impairment (aMCI) and Alzheimer's disease, the diagnostic accuracy of WM integrity measurements using diffusion tensor imaging (DTI) in discriminating aMCI from normal controls remains unclear. The aim of this study is to explore the diagnostic validity of whole-brain automated probabilistic tractography in discriminating aMCI from normal controls. Methods One hundred two subjects (50 aMCI and 52 normal controls) were included and underwent DTI scans. Whole-brain WM tracts were reconstructed with automated probabilistic tractography. Fractional anisotropy (FA) and mean diffusivity (MD) values of memory-related WM tracts were measured and compared between the aMCI and normal control groups. In addition, the diagnostic validities of these WM tracts were evaluated. Results Decreased FA and increased MD values of memory-related WM tracts were observed in the aMCI group compared with the control group. Among the FA and MD values of each tract, the FA value of the left cingulum angular bundle showed the highest area under the curve (AUC) of 0.85, with a sensitivity of 88.2% and a specificity of 76.9%, in differentiating aMCI patients from control subjects. Furthermore, combining the FA values of the memory-related WM tracts yielded an AUC of 0.98, a sensitivity of 96%, and a specificity of 94.2%. Conclusion Our results, with good diagnostic validity of WM integrity measurements, suggest that DTI might be a promising neuroimaging tool for early detection of aMCI and AD patients. PMID:29739127
Myers, Casey A.; Laz, Peter J.; Shelburne, Kevin B.; Davidson, Bradley S.
2015-01-01
Uncertainty that arises from measurement error and parameter estimation can significantly affect the interpretation of musculoskeletal simulations; however, these effects are rarely addressed. The objective of this study was to develop an open-source probabilistic musculoskeletal modeling framework to assess how measurement error and parameter uncertainty propagate through a gait simulation. A baseline gait simulation was performed for a male subject using OpenSim for three stages: inverse kinematics, inverse dynamics, and muscle force prediction. A series of Monte Carlo simulations were performed that considered intrarater variability in marker placement, movement artifacts in each phase of gait, variability in body segment parameters, and variability in muscle parameters calculated from cadaveric investigations. Propagation of uncertainty was performed by also using the output distributions from one stage as input distributions to subsequent stages. Confidence bounds (5–95%) and sensitivity of outputs to model input parameters were calculated throughout the gait cycle. The combined impact of uncertainty resulted in mean bounds that ranged from 2.7° to 6.4° in joint kinematics, 2.7 to 8.1 N m in joint moments, and 35.8 to 130.8 N in muscle forces. The impact of movement artifact was 1.8 times larger than any other propagated source. Sensitivity to specific body segment parameters and muscle parameters were linked to where in the gait cycle they were calculated. We anticipate that through the increased use of probabilistic tools, researchers will better understand the strengths and limitations of their musculoskeletal simulations and more effectively use simulations to evaluate hypotheses and inform clinical decisions. PMID:25404535
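The 5-95% confidence bounds reported above are pointwise percentiles across Monte Carlo runs. A compact sketch with a synthetic stand-in for the repeated gait simulations (the curve shape and noise magnitude are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
n_mc, n_frames = 500, 101   # Monte Carlo runs x samples over the gait cycle

# synthetic stand-in for repeated simulations: each run perturbs a nominal
# joint-moment curve by marker, artifact, and parameter noise
t = np.linspace(0.0, 1.0, n_frames)
nominal = 40.0 * np.sin(2.0 * np.pi * t) * np.exp(-2.0 * t)
runs = nominal + rng.normal(0.0, 4.0, (n_mc, n_frames))

# pointwise 5-95% confidence bounds at every instant of the gait cycle
lo, hi = np.percentile(runs, [5, 95], axis=0)
print(f"mean bound width: {np.mean(hi - lo):.1f} N m")
```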
Brandsch, Rainer
2017-10-01
Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters like diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability because they highly overestimated migration. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as other model inputs. With respect to a functional barrier, the most important parameters among others are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (i.e., diffusion coefficient and layer thickness), predicts migration results with related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented in view of three case studies: (1) sensitivity analysis, (2) functional barrier efficiency, and (3) validation by experimental testing. Based on the migration predicted by probabilistic modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible, and associated migration risks and potential safety concerns can be identified early in packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
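A hedged sketch of probabilistic migration modelling in this spirit: sample the diffusion coefficient and layer thickness from input distributions and push them through the common short-time slab approximation m/A ≈ 2 c_P,0 ρ_P √(Dt/π), capped at the total amount available in the layer. All parameter values and distributions below are illustrative assumptions, not validated material data.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# hypothetical input distributions (illustrative values only)
D = 10.0 ** rng.normal(-12.5, 0.5, n)      # diffusion coefficient, cm^2/s
L = rng.normal(50e-4, 5e-4, n)             # layer thickness, cm (~50 um)

c0, rho, t = 500e-6, 1.0, 10 * 24 * 3600   # g/g, g/cm^3, 10-day contact

# short-time slab approximation for area-specific migration (g/cm^2),
# capped at the total migrant content of the layer
m_area = np.minimum(2.0 * c0 * rho * np.sqrt(D * t / np.pi), c0 * rho * L)

q95 = np.percentile(m_area, 95)            # upper confidence estimate
print(f"95th-percentile migration: {q95 * 1e6:.3f} ug/cm^2")
```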
Influential input classification in probabilistic multimedia models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddalena, Randy L.; McKone, Thomas E.; Hsieh, Dennis P.H.
1999-05-01
Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the uncertainty and/or variability associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution. The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently allocate resources for constructing distributions, one should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model and an even smaller set determines the shape of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site-specific models and improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.
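One simple way to produce the kind of influence ranking described here is to correlate each stochastic input with the Monte Carlo outcome on ranks; inputs with negligible rank correlation are candidates for being frozen at point values. A sketch with an invented toy model, not the paper's multimedia fate model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 5_000

# invented stochastic inputs with deliberately different influence
inputs = {
    "emission_rate":    rng.lognormal(0.0, 0.8, n),
    "partition_coeff":  rng.lognormal(2.0, 0.4, n),
    "degradation_rate": rng.lognormal(-1.0, 0.2, n),
    "intake_rate":      rng.normal(1.0, 0.05, n),
}

# toy outcome standing in for the fate-and-exposure model
y = (inputs["emission_rate"] * inputs["partition_coeff"]
     / inputs["degradation_rate"] * inputs["intake_rate"])

# rank inputs by squared Spearman rank correlation with the outcome; values
# near zero suggest the input can be fixed with little loss of variance
for name, x in inputs.items():
    rho, _ = stats.spearmanr(x, y)
    print(f"{name:16s} rho^2 = {rho ** 2:.3f}")
```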
Mao, Ningying; Lesher, Beth; Liu, Qifa; Qin, Lei; Chen, Yixi; Gao, Xin; Earnshaw, Stephanie R; McDade, Cheryl L; Charbonneau, Claudie
2016-01-01
Invasive fungal infections (IFIs) require rapid diagnosis and treatment. A decision-analytic model was used to estimate total costs and survival associated with a diagnostic-driven (DD) or an empiric treatment approach in neutropenic patients with hematological malignancies receiving chemotherapy or autologous/allogeneic stem cell transplants in Shanghai, Beijing, Chengdu, and Guangzhou, the People's Republic of China. Treatment initiation for the empiric approach occurred after clinical suspicion of an IFI; treatment initiation for the DD approach occurred after clinical suspicion and a positive IFI diagnostic test result. Model inputs were obtained from the literature; treatment patterns and resource use were based on clinical opinion. Total costs were lower for the DD versus the empiric approach in Shanghai (¥3,232 vs ¥4,331), Beijing (¥3,894 vs ¥4,864), Chengdu (¥4,632 vs ¥5,795), and Guangzhou (¥8,489 vs ¥9,795). Antifungal administration was lower with the DD (5.7%) than the empiric (9.8%) approach, with similar survival rates. Results from one-way and probabilistic sensitivity analyses were most sensitive to changes in diagnostic test sensitivity and IFI incidence; the DD approach dominated the empiric approach in 88% of scenarios. These results suggest that a DD compared to an empiric treatment approach in the People's Republic of China may be cost saving, with similar overall survival in immunocompromised patients with suspected IFIs.
Rajan, Prashant V; Qudsi, Rameez A; Dyer, George S M; Losina, Elena
2018-02-07
There is no consensus on the optimal fixation method for patients who require a surgical procedure for distal radial fractures. We used cost-effectiveness analyses to determine which of 3 modalities offers the best value: closed reduction and percutaneous pinning, open reduction and internal fixation, or external fixation. We developed a Markov model that projected short-term and long-term health benefits and costs in patients undergoing a surgical procedure for a distal radial fracture. Simulations began at the patient age of 50 years and were run over the patient's lifetime. The analysis was conducted from health-care payer and societal perspectives. We estimated transition probabilities and quality-of-life values from the literature and determined costs from Medicare reimbursement schedules in 2016 U.S. dollars. Suboptimal postoperative outcomes were determined by rates of reduction loss (4% for closed reduction and percutaneous pinning, 1% for open reduction and internal fixation, and 11% for external fixation) and rates of orthopaedic complications. Procedural costs were $7,638 for closed reduction and percutaneous pinning, $10,170 for open reduction and internal fixation, and $9,886 for external fixation. Outputs were total costs and quality-adjusted life-years (QALYs), discounted at 3% per year. We considered willingness-to-pay thresholds of $50,000 and $100,000. We conducted deterministic and probabilistic sensitivity analyses to evaluate the impact of data uncertainty. From the health-care payer perspective, closed reduction and percutaneous pinning dominated (i.e., produced greater QALYs at lower costs than) both open reduction and internal fixation and external fixation. From the societal perspective, the incremental cost-effectiveness ratio for closed reduction and percutaneous pinning compared with open reduction and internal fixation was $21,058 per QALY, and external fixation was dominated. In probabilistic sensitivity analysis, open reduction and internal fixation was cost-effective roughly 50% of the time compared with roughly 45% for closed reduction and percutaneous pinning. When considering data uncertainty, there is only a 5% to 10% difference in the frequency of probability combinations that find open reduction and internal fixation to be more cost-effective. The current degree of uncertainty in the data makes it difficult to distinguish either strategy as more cost-effective overall, and thus the choice may be left to shared decision-making between surgeon and patient. Economic Level III. See Instructions for Authors for a complete description of levels of evidence.
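The cost-effectiveness bookkeeping used here (incremental ratios plus dominance checks) can be written compactly; the helper below is a generic sketch with made-up numbers, not the study's model:

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness of strategy A versus B.

    Reports dominance when one strategy is both cheaper and more
    effective (no ratio is meaningful in that case); otherwise returns
    the incremental cost per QALY gained.
    """
    d_cost, d_qaly = cost_a - cost_b, qaly_a - qaly_b
    if d_cost <= 0 and d_qaly >= 0:
        return "A dominates B"
    if d_cost >= 0 and d_qaly <= 0:
        return "B dominates A"
    return f"{d_cost / d_qaly:,.0f} per QALY"

# Illustrative values only (not the article's data):
print(icer(cost_a=9_000, qaly_a=20.1, cost_b=10_000, qaly_b=20.0))
```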
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; van Andel, Schalk Jan
2014-05-01
Part of recent research in ensemble and probabilistic hydro-meteorological forecasting analyses which probabilistic information decision makers require and how it can be visualised most effectively. This work, in addition, analyses whether decision making in flood early warning is also influenced by the way the decision question is posed. For this purpose, the decision-making game "Do probabilistic forecasts lead to better decisions?", which Ramos et al (2012) conducted at the EGU General Assembly 2012 in Vienna, has been repeated with a small group and expanded. In that game, decision makers had to decide whether or not to open a flood release gate on the basis of flood forecasts, with and without uncertainty information. A conclusion of that game was that, in the absence of uncertainty information, decision makers are compelled towards a more risk-averse attitude. To explore to what extent the answers were driven by the way the questions were framed, a second variant was introduced in addition to the original experiment, in which participants were asked to choose between a sure value and a gamble (either losing or winning with a given probability). This set-up is based on Kahneman and Tversky (1979). Results indicate that the way the questions are posed may play an important role in decision making and that Prospect Theory provides promising concepts to further understand how this works.
Sørensen, Sabrina Storgaard; Pedersen, Kjeld Møller; Weinreich, Ulla Møller; Ehlers, Lars
2017-06-01
To analyse the cost effectiveness of community-based case management for patients suffering from chronic obstructive pulmonary disease (COPD). The study took place in the third-largest municipality in Denmark and was conducted as a randomised controlled trial with 12 months of follow-up. A total of 150 patients with COPD were randomised into two groups receiving either usual care or case management in addition to usual care. Case management included, among other things, self-care proficiency, medication compliance, and care coordination. The outcome measure for the analysis was the incremental cost-effectiveness ratio (ICER) as cost per quality-adjusted life year (QALY) from the perspective of the healthcare sector. Costs were valued in British pounds (£) at 2016 price levels. Scenario analyses and probabilistic sensitivity analyses were conducted to assess the uncertainty of the ICER estimate. The intervention resulted in a QALY improvement of 0.0146 (95% CI -0.0216; 0.0585) and a cost increase of £494 (95% CI -1778; 2766) per patient. No statistically significant difference was observed in either costs or effects. The ICER was £33,865 per QALY gained. Scenario analyses confirmed the robustness of the result and revealed slightly lower ICERs of £28,100-£31,340 per QALY. The analysis revealed that case management led to a positive incremental QALY but was more costly than usual care. The highly uncertain ICER somewhat exceeds, for instance, the threshold value used by the National Institute for Health and Care Excellence (NICE); no formally established Danish threshold value exists. ClinicalTrials.gov Identifier: NCT01512836.
NASA Astrophysics Data System (ADS)
Cui, Tao; Moore, Catherine; Raiber, Matthias
2018-05-01
Modelling cumulative impacts of basin-scale coal seam gas (CSG) extraction is challenging due to the long time frames and spatial extent over which impacts occur combined with the need to consider local-scale processes. The computational burden of such models limits the ability to undertake calibration and sensitivity and uncertainty analyses. A framework is presented that integrates recently developed methods and tools to address the computational burdens of an assessment of drawdown impacts associated with rapid CSG development in the Surat Basin, Australia. The null space Monte Carlo method combined with singular value decomposition (SVD)-assisted regularisation was used to analyse the uncertainty of simulated drawdown impacts. The study also describes how the computational burden of assessing local-scale impacts was mitigated by adopting a novel combination of a nested modelling framework which incorporated a model emulator of drawdown in dual-phase flow conditions, and a methodology for representing local faulting. This combination provides a mechanism to support more reliable estimates of regional CSG-related drawdown predictions. The study indicates that uncertainties associated with boundary conditions are reduced significantly when expressing differences between scenarios. The results are analysed and distilled to enable the easy identification of areas where the simulated maximum drawdown impacts could exceed trigger points associated with legislative `make good' requirements; trigger points require that either an adjustment in the development scheme or other measures are implemented to remediate the impact. This report contributes to the currently small body of work that describes modelling and uncertainty analyses of CSG extraction impacts on groundwater.
Németh, Bertalan; Józwiak-Hagymásy, Judit; Kovács, Gábor; Kovács, Attila; Demjén, Tibor; Huber, Manuel B; Cheung, Kei-Long; Coyle, Kathryn; Lester-George, Adam; Pokhrel, Subhash; Vokó, Zoltán
2018-01-25
To evaluate potential health and economic returns from implementing smoking cessation interventions in Hungary. The EQUIPTMOD, a Markov-based economic model, was used to assess the cost-effectiveness of three implementation scenarios: (a) introducing a social marketing campaign; (b) doubling the reach of existing group-based behavioural support therapies and proactive telephone support; and (c) a combination of the two scenarios. All three scenarios were compared with current practice. The scenarios were chosen as feasible options available for Hungary based on the outcome of interviews with local stakeholders. Life-time costs and quality-adjusted life years (QALYs) were calculated from a health-care perspective. The analyses used various return on investment (ROI) estimates, including incremental cost-effectiveness ratios (ICERs), to compare the scenarios. Probabilistic sensitivity analyses assessed the extent to which the estimated mean ICERs were sensitive to the model input values. Introducing a social marketing campaign resulted in 0.3014 additional quitters per 1000 smokers, translating to health-care cost-savings of €0.6495 per smoker compared with current practice. When the value of QALY gains was considered, cost-savings increased to €14.1598 per smoker. Doubling the reach of existing group-based behavioural support therapies and proactive telephone support resulted in health-care savings of €0.2539 per smoker (€3.9620 with the value of QALY gains), compared with current practice. The respective figures for the combined scenario were €0.8960 and €18.0062. Results were sensitive to model input values. According to the EQUIPTMOD modelling tool, it would be cost-effective for the Hungarian authorities to introduce a social marketing campaign and double the reach of existing group-based behavioural support therapies and proactive telephone support. Such policies would more than pay for themselves in the long term. © 2018 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
Ferko, Nicole; Ferrante, Giuseppe; Hasegawa, James T; Schikorr, Tanya; Soleas, Ireena M; Hernandez, John B; Sabaté, Manel; Kaiser, Christoph; Brugaletta, Salvatore; de la Torre Hernandez, Jose Maria; Galatius, Soeren; Cequier, Angel; Eberli, Franz; de Belder, Adam; Serruys, Patrick W; Valgimigli, Marco
2017-05-01
Second-generation drug-eluting stents (DES) may reduce costs and improve clinical outcomes compared to first-generation DES, with improved cost-effectiveness compared to bare metal stents (BMS). We aimed to conduct a cost-effectiveness analysis (CEA) of a cobalt-chromium everolimus-eluting stent (Co-Cr EES) versus BMS in percutaneous coronary intervention (PCI). A Markov state transition model with a 2-year time horizon was applied from a US Medicare setting with patients undergoing PCI with Co-Cr EES or BMS. Baseline characteristics, treatment effects, and safety measures were taken from a patient-level meta-analysis of 5 RCTs (n = 4,896). The base-case analysis evaluated stent-related outcomes; a secondary analysis considered the broader set of outcomes reported in the meta-analysis. The base-case and secondary analyses reported an additional 0.018 and 0.013 quality-adjusted life years (QALYs) and cost savings of $236 and $288, respectively, with Co-Cr EES versus BMS. Results were robust to sensitivity analyses and were most sensitive to the price of clopidogrel. In the probabilistic sensitivity analysis, Co-Cr EES was associated with a greater than 99% chance of being cost saving or cost effective (at a cost per QALY threshold of $50,000) versus BMS. Using data from a recent patient-level meta-analysis and contemporary cost data, this analysis found that PCI with Co-Cr EES is more effective and less costly than PCI with BMS. © 2016 The Authors. Catheterization and Cardiovascular Interventions Published by Wiley Periodicals, Inc.
Cost-Effectiveness of Peer Counselling for the Promotion of Exclusive Breastfeeding in Uganda.
Chola, Lumbwe; Fadnes, Lars T; Engebretsen, Ingunn M S; Nkonki, Lungiswa; Nankabirwa, Victoria; Sommerfelt, Halvor; Tumwine, James K; Tylleskar, Thorkild; Robberstad, Bjarne
2015-01-01
Community-based breastfeeding promotion programmes have been shown to be effective in increasing breastfeeding prevalence. However, there are limited data on the cost-effectiveness of these programmes in sub-Saharan Africa. This paper evaluates the cost-effectiveness of a breastfeeding promotion intervention targeting mothers and their 0- to 6-month-old children. Data were obtained from a community randomized trial conducted in Uganda between 2006 and 2008, and supplemented with evidence from several studies in sub-Saharan Africa. In the trial, peer counselling was offered to women in intervention clusters. In the control and intervention clusters, women could access standard health facility breastfeeding promotion services (HFP). Thus, two methods of breastfeeding promotion were compared: community-based peer counselling (in addition to HFP) and standard HFP alone. A Markov model was used to calculate incremental cost-effectiveness ratios between the two strategies. The model estimated changes in breastfeeding prevalence and disability-adjusted life years. Costs were estimated from a provider perspective. Uncertainty around the results was characterized using one-way sensitivity analyses and a probabilistic sensitivity analysis. Peer counselling more than doubled the breastfeeding prevalence as reported by mothers, but there was no observable impact on diarrhoea prevalence. Estimated incremental cost-effectiveness ratios were US$68 per month of exclusive or predominant breastfeeding and US$11,353 per disability-adjusted life year (DALY) averted. The findings were robust to parameter variations in the sensitivity analyses. Our strategy to promote community-based peer counselling is unlikely to be cost-effective in reducing diarrhoea prevalence and mortality in Uganda, because its cost per DALY averted far exceeds the commonly assumed willingness-to-pay threshold of three times Uganda's GDP per capita (US$1653). However, since the intervention significantly increases prevalence of exclusive or predominant breastfeeding, it could be adopted in Uganda if benefits other than reducing the occurrence of diarrhoea are believed to be important.
Mori, T; Crandall, C J; Ganz, D A
2017-02-01
We developed a Markov microsimulation model among hypothetical cohorts of community-dwelling US white women without prior major osteoporotic fractures over a lifetime horizon. At ages 75 and 80, adding 1 year of exercise to 5 years of oral bisphosphonate therapy is cost-effective at a conventionally accepted threshold compared with bisphosphonates alone. The purpose of this study was to examine the cost-effectiveness of the combined strategy of oral bisphosphonate therapy for 5 years and falls prevention exercise for 1 year compared with either strategy in isolation. We calculated incremental cost-effectiveness ratios [ICERs] (2014 US dollars per quality-adjusted life year [QALY]), using a Markov microsimulation model among hypothetical cohorts of community-dwelling US white women with different starting ages (65, 70, 75, and 80) without prior history of hip, vertebral, or wrist fractures over a lifetime horizon from the societal perspective. At ages 65, 70, 75, and 80, the combined strategy had ICERs of $202,020, $118,460, $46,870, and $17,640 per QALY, respectively, compared with oral bisphosphonate therapy alone. The combined strategy provided better health at lower cost than falls prevention exercise alone at ages 70, 75, and 80. In deterministic sensitivity analyses, results were particularly sensitive to the change in the opportunity cost of participants' time spent exercising. In probabilistic sensitivity analyses, the probabilities of the combined strategy being cost-effective compared with the next best alternative increased with age, ranging from 35 % at age 65 to 48 % at age 80 at a willingness-to-pay of $100,000 per QALY. Among community-dwelling US white women ages 75 and 80, adding 1 year of exercise to 5 years of oral bisphosphonate therapy is cost-effective at a willingness-to-pay of $100,000 per QALY, compared with oral bisphosphonate therapy only. This analysis will help clinicians and policymakers make better decisions about treatment options to reduce fracture risk.
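The following is a minimal sketch of the Markov microsimulation pattern described above, with an illustrative four-state fracture model; all transition probabilities, utilities, and the state space are invented for the example:

```python
import numpy as np

# Individual-level Markov microsimulation: each simulated woman moves
# yearly between states and discounted QALYs are accumulated per
# person. All numbers below are illustrative placeholders.
rng = np.random.default_rng(2)

STATES = ["well", "fracture", "post_fracture", "dead"]
P = np.array([  # rows: from-state, cols: to-state (toy values)
    [0.93, 0.04, 0.00, 0.03],
    [0.00, 0.00, 0.95, 0.05],
    [0.00, 0.02, 0.94, 0.04],
    [0.00, 0.00, 0.00, 1.00],
])
UTILITY = np.array([0.85, 0.60, 0.75, 0.0])  # QALY weight per state-year
DISCOUNT = 0.03

def simulate_one(horizon=40):
    state, qalys = 0, 0.0
    for year in range(horizon):
        qalys += UTILITY[state] / (1 + DISCOUNT) ** year
        state = rng.choice(4, p=P[state])   # sample next state
    return qalys

qalys = [simulate_one() for _ in range(5_000)]
print(f"mean discounted QALYs: {np.mean(qalys):.2f}")
```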
Probabilistic assessment of roadway departure risk in a curve
NASA Astrophysics Data System (ADS)
Rey, G.; Clair, D.; Fogli, M.; Bernardin, F.
2011-10-01
Roadway departure while cornering accounts for a major share of car accidents and casualties in France. Even though strict policies against speeding have contributed to reducing accidents, other factors clearly remain. This article presents the construction of a probabilistic strategy for assessing the risk of roadway departure. A specific vehicle dynamics model is developed in which some parameters are modelled as random variables. These parameters are selected through a sensitivity analysis to ensure an efficient representation of the inherent uncertainties of the system. Structural reliability methods are then employed to assess the roadway departure risk as a function of the initial conditions measured at the entrance of the curve. This study is conducted within the French national road safety project SARI, which aims to implement a warning system alerting the driver to dangerous situations.
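A crude Monte Carlo version of the reliability computation reads as follows; the limit state below is a deliberately simplified stand-in (friction capacity versus lateral acceleration demand), not the article's vehicle-dynamics model:

```python
import numpy as np

# Crude Monte Carlo reliability sketch: departure is assumed to occur
# when the lateral acceleration demand v**2 / R exceeds the available
# friction capacity mu * g. Distributions are illustrative.
rng = np.random.default_rng(3)
n = 1_000_000
g = 9.81

v = rng.normal(22.0, 2.0, n)              # entry speed (m/s)
R = rng.normal(120.0, 5.0, n)             # effective curve radius (m)
mu = rng.lognormal(np.log(0.7), 0.1, n)   # tyre-road friction

# Limit state g_x < 0 <=> departure (capacity minus demand).
g_x = mu * g - v**2 / R
p_failure = np.mean(g_x < 0)
print(f"estimated departure probability: {p_failure:.2e}")
```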
Chapman, Tara; Lefevre, Philippe; Semal, Patrick; Moiseev, Fedor; Sholukha, Victor; Louryan, Stéphane; Rooze, Marcel; Van Sint Jan, Serge
2014-01-01
The hip bone is one of the most reliable indicators of sex in the human body because it is the most dimorphic bone. Probabilistic Sex Diagnosis (DSP: Diagnose Sexuelle Probabiliste), developed by Murail et al. in 2005, is a sex determination method based on a worldwide hip bone metrical database. Sex is determined by comparing specific measurements taken from each specimen using sliding callipers and computing the probability of the specimen being female or male. In forensic science it is sometimes not possible to sex a body owing to corpse decay or injury, and skeletalization and dissection of a body is a laborious process that desecrates the body. This study had two aims. The first was to examine the accuracy of the DSP method in comparison with a current visual sexing method. A further aim was to see whether the DSP method could be applied virtually to both the hip bone and the pelvic girdle, so that it could be used in the forensic sciences. For the first part of the study, forty-nine dry hip bones of unknown sex were obtained from the Body Donation Programme of the Université Libre de Bruxelles (ULB). A comparison was made between DSP analysis and visual sexing on dry bone by two researchers. CT scans of the bones were then analysed to obtain three-dimensional (3D) virtual models, and the DSP method was applied virtually by importing the models into a customised software programme called lhpFusionBox, developed at ULB. The software enables DSP distances to be measured via virtually palpated bony landmarks. There was 100% agreement on sex between the manual and virtual DSP methods. The second part of the study further validated the method through blind analysis of thirty-nine additional pelvic girdles of known sex. A 100% accuracy rate was found, further demonstrating that the virtual DSP method is robust. Statistically significant differences between researchers were found in the identification of sex with the visual sexing method, although both researchers identified the same sex in all cases with the manual and virtual DSP methods for both the hip bones and the pelvic girdles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Effect of Cyclic Thermo-Mechanical Loads on Fatigue Reliability in Polymer Matrix Composites
NASA Technical Reports Server (NTRS)
Shah, A. R.; Murthy, P. L. N.; Chamis, C. C.
1996-01-01
A methodology to compute the probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long-term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multi-factor interaction relationship developed at NASA Lewis Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute the probabilistic distribution of the response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) are computed, and their significance in reliability-based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on fatigue reliability for a (0/+/- 45/90)(sub s) graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes, has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness, whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of the matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David
2015-07-01
Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where it is within the challenge state, its trajectory, and margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc. The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies; the thermal-hydraulics analyses thereby validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.
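The multi-attribute evaluation step can be sketched as a weighted utility ranking of candidate control options; attribute names, weights, and utility shapes below are illustrative assumptions, not the report's functions:

```python
import numpy as np

# Weighted multi-attribute utility ranking of control options: each
# attribute is mapped to [0, 1] by a utility term and combined with
# weights. All names, weights, and shapes are invented for the sketch.
WEIGHTS = {"stability": 0.4, "cost": 0.2, "time": 0.2, "power": 0.2}

def utility(option):
    # Each term maps an attribute to [0, 1]; higher is better.
    u = {
        "stability": option["margin"],                 # already in [0, 1]
        "cost": 1.0 - min(option["cost"] / 1e6, 1.0),  # cheaper is better
        "time": np.exp(-option["minutes"] / 30.0),     # faster is better
        "power": option["power_fraction"],             # keep power high
    }
    return sum(WEIGHTS[k] * u[k] for k in WEIGHTS)

options = [
    {"name": "reduce power 10%", "margin": 0.8, "cost": 2e5,
     "minutes": 10, "power_fraction": 0.9},
    {"name": "trip one pump", "margin": 0.9, "cost": 6e5,
     "minutes": 45, "power_fraction": 0.6},
]
for opt in sorted(options, key=utility, reverse=True):
    print(f"{opt['name']:20s} U = {utility(opt):.3f}")
```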
O'Day, Ken; Meyer, Kellie; Stafkey-Mailey, Dana; Watson, Crystal
2015-04-01
To assess the cost-effectiveness of natalizumab vs fingolimod over 2 years in relapsing-remitting multiple sclerosis (RRMS) patients and patients with rapidly evolving severe disease in Sweden. A decision analytic model was developed to estimate the incremental cost per relapse avoided of natalizumab and fingolimod from the perspective of the Swedish healthcare system. Modeled 2-year costs in Swedish kronor of treating RRMS patients included drug acquisition costs, administration and monitoring costs, and costs of treating MS relapses. Effectiveness was measured in terms of MS relapses avoided using data from the AFFIRM and FREEDOMS trials for all patients with RRMS and from post-hoc sub-group analyses for patients with rapidly evolving severe disease. Probabilistic sensitivity analyses were conducted to assess uncertainty. The analysis showed that, in all patients with MS, treatment with fingolimod costs less (440,463 Kr vs 444,324 Kr), but treatment with natalizumab results in more relapses avoided (0.74 vs 0.59), resulting in an incremental cost-effectiveness ratio (ICER) of 25,448 Kr per relapse avoided. In patients with rapidly evolving severe disease, natalizumab dominated fingolimod. Results of the sensitivity analysis demonstrate the robustness of the model results. At a willingness-to-pay (WTP) threshold of 500,000 Kr per relapse avoided, natalizumab is cost-effective in >80% of simulations in both patient populations. Limitations include absence of data from direct head-to-head studies comparing natalizumab and fingolimod, use of relapse rate reduction rather than sustained disability progression as the primary model outcome, assumption of 100% adherence to MS treatment, and exclusion of adverse event costs in the model. Natalizumab remains a cost-effective treatment option for patients with MS in Sweden. In the RRMS patient population, the incremental cost per relapse avoided is well below a 500,000 Kr WTP threshold per relapse avoided. In the rapidly evolving severe disease patient population, natalizumab dominates fingolimod.
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
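The contrast between interval analysis and sampling-based propagation can be illustrated on a toy multiplicative risk model (all bounds and distributions below are invented for the example):

```python
import numpy as np

# Toy risk model risk = C * IR * SF / BW: interval analysis gives hard
# bounds, Monte Carlo gives a distribution. Values are illustrative.
rng = np.random.default_rng(4)

# Interval analysis: propagate [lo, hi] bounds through the monotone
# model (max risk uses min body weight, and vice versa).
C, IR, SF, BW = (0.05, 0.15), (1.0, 2.5), (0.005, 0.02), (60.0, 80.0)
risk_lo = C[0] * IR[0] * SF[0] / BW[1]
risk_hi = C[1] * IR[1] * SF[1] / BW[0]
print(f"interval bounds: [{risk_lo:.2e}, {risk_hi:.2e}]")

# Monte Carlo: sample each factor and report percentiles.
c = rng.uniform(0.05, 0.15, 100_000)
ir = rng.lognormal(np.log(1.5), 0.25, 100_000)
sf = rng.uniform(0.005, 0.02, 100_000)
bw = rng.normal(70.0, 5.0, 100_000)
risk = c * ir * sf / bw
print("MC 5th/50th/95th:", np.round(np.percentile(risk, [5, 50, 95]), 8))
```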
Han, Guangjie; Li, Shanshan; Zhu, Chunsheng; Jiang, Jinfang; Zhang, Wenbo
2017-02-08
Marine environmental monitoring provides crucial information and support for the exploitation, utilization, and protection of marine resources. With the rapid development of information technology, three-dimensional underwater acoustic sensor networks (3D UASNs) provide a novel strategy to acquire marine environment information conveniently, efficiently and accurately. However, the propagation characteristics of the acoustic communication channel cause the probability of successful information delivery to decrease with distance. Therefore, we investigate two probabilistic neighborhood-based data collection algorithms for 3D UASNs which are based on a probabilistic acoustic communication model instead of the traditional deterministic model. An autonomous underwater vehicle (AUV) is employed to traverse a designed path and collect data from neighborhoods. For 3D UASNs without prior deployment knowledge, partitioning the network into grids allows the AUV to visit the central location of each grid for data collection. For 3D UASNs in which the deployment knowledge is known in advance, the AUV only needs to visit several selected locations, determined by constructing a minimum probabilistic neighborhood covering set, which reduces data latency. In addition, by increasing the number of transmission rounds, the proposed algorithms can trade data collection latency against information gain. These algorithms are compared with a basic nearest-neighbor heuristic algorithm via simulations. Simulation analyses show that our proposed algorithms can efficiently reduce the average data collection completion time, corresponding to a decrease in data latency.
Adaptive predictors based on probabilistic SVM for real time disruption mitigation on JET
NASA Astrophysics Data System (ADS)
Murari, A.; Lungaroni, M.; Peluso, E.; Gaudio, P.; Vega, J.; Dormido-Canto, S.; Baruzzo, M.; Gelfusa, M.; Contributors, JET
2018-05-01
Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, mitigation or avoidance. Traditional predictors based on machine learning techniques can perform very well if properly optimised, but they do not provide a natural estimate of the quality of their outputs and they typically age very quickly. In this paper a new set of tools, based on probabilistic extensions of support vector machines (SVM), are introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. An adaptive training strategy 'from scratch' has also been devised, which allows the performance to be preserved even when the experimental conditions change significantly. Large JET databases of disruptions, covering entire campaigns and thousands of discharges, have been analysed, both for the graphite wall and the ITER-Like Wall. Performance significantly better than that of any previous predictor using adaptive training has been achieved, satisfying even the requirements of the next generation of devices. The adaptive approach to training has also provided unique information about the evolution of the operational space. The fact that the developed tools give the probability of disruption improves the interpretability of the results, provides an estimate of the predictor quality and gives new insights into the physics. Moreover, the probabilistic treatment makes it easier to insert these classifiers into general decision support and control systems.
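A generic probabilistic SVM can be obtained with Platt-scaled outputs, as in the scikit-learn sketch below; the synthetic features stand in for diagnostic signals, and none of this reproduces the JET pipeline:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Platt-scaled SVM: probability=True calibrates a sigmoid on the SVM
# scores so the classifier returns class probabilities. Data are
# synthetic stand-ins for diagnostic signals and disruption labels.
rng = np.random.default_rng(5)
n = 2_000
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)

proba = clf.predict_proba(X_te)[:, 1]   # calibrated "disruption" probability
# An alarm policy can threshold the probability, trading missed alarms
# against false alarms; low-confidence outputs can be flagged separately.
alarm = proba > 0.9
print(f"fraction of high-confidence alarms: {alarm.mean():.2f}")
```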
Limited-scope probabilistic safety analysis for the Los Alamos Meson Physics Facility (LAMPF)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharirli, M.; Rand, J.L.; Sasser, M.K.
1992-01-01
The reliability of instrumentation and safety systems is a major issue in the operation of accelerator facilities. A probabilistic safety analysis was performed for the key safety and instrumentation systems at the Los Alamos Meson Physics Facility (LAMPF). In Phase I of this unique study, the Personnel Safety System (PSS) and the Current Limiters (XLs) were analyzed through the use of fault tree analyses, failure modes and effects analysis, and criticality analysis. Phase II of the program was carried out to update and reevaluate the safety systems after the Phase I recommendations were implemented. This paper provides a brief review of the studies involved in Phases I and II of the program.
A multilevel probabilistic beam search algorithm for the shortest common supersequence problem.
Gallardo, José E
2012-01-01
The shortest common supersequence problem is a classical problem with many applications in different fields such as planning, Artificial Intelligence and especially Bioinformatics. Due to its NP-hardness, we cannot expect to solve this problem efficiently using conventional exact techniques. This paper presents a heuristic to tackle the problem based on the use, at different levels, of a probabilistic variant of a classical heuristic known as Beam Search. The proposed algorithm is empirically analysed and compared to current approaches in the literature. Experiments show that it provides better quality solutions in a reasonable time for medium and large instances of the problem. For very large instances, our heuristic also provides better solutions, but the required execution times may increase considerably.
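The core idea, replacing the deterministic top-k selection of beam search with score-proportional sampling, can be sketched generically (this skeleton is not the paper's exact heuristic, and the toy demo problem is invented):

```python
import numpy as np

# Probabilistic beam search skeleton: instead of keeping the
# deterministic top-k extensions, candidates are sampled with
# probability proportional to their heuristic score, preserving
# diversity in the beam.
rng = np.random.default_rng(6)

def probabilistic_beam_search(start, expand, score, is_goal, k=10, steps=50):
    beam = [start]
    for _ in range(steps):
        candidates = [c for s in beam for c in expand(s)]
        if not candidates:
            break
        done = [c for c in candidates if is_goal(c)]
        if done:
            return max(done, key=score)
        w = np.array([score(c) for c in candidates], dtype=float)
        w = w - w.min() + 1e-9          # shift scores to positive weights
        idx = rng.choice(len(candidates), size=min(k, len(candidates)),
                         replace=False, p=w / w.sum())
        beam = [candidates[i] for i in idx]
    return max(beam, key=score)

# Toy demo: rebuild a target string by appending characters.
target = "ACGTACGT"
res = probabilistic_beam_search(
    "",
    lambda s: [s + ch for ch in "ACGT"] if len(s) < len(target) else [],
    lambda s: sum(a == b for a, b in zip(s, target)),
    lambda s: s == target,
    k=8)
print(res)
```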
NASA Astrophysics Data System (ADS)
Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.
2005-05-01
Integrated volcanological-probabilistic approaches have been used to simulate pyroclastic density currents and fallout and to produce hazard maps for the Campi Flegrei and Somma-Vesuvius areas. On the basis of analyses of all types of pyroclastic flows, surges, secondary pyroclastic density currents and fallout events that occurred in the volcanological history of the two volcanic areas, and the evaluation of the probability of each type of event, matrices of input parameters for numerical simulation have been assembled. The multi-dimensional input matrices include the main parameters controlling pyroclast transport, deposition and dispersion, as well as the set of possible eruptive vents used in the simulation program. Probabilistic hazard maps provide, for each point of the Campanian area, the yearly probability of being affected by a given event of a given intensity and the resulting damage. Probabilities of a few events per thousand years are typical of most areas within a range of ca. 10 km around the volcanoes, including Naples. The results provide constraints for the emergency plans in the Neapolitan area.
A quantitative model of optimal data selection in Wason's selection task.
Hattori, Masasi
2002-10-01
The optimal data selection model proposed by Oaksford and Chater (1994) successfully formalized Wason's selection task (Wason, 1966). The model, however, involved some questionable assumptions and was also not sufficient as a model of the task because it could not provide quantitative predictions of the card selection frequencies. In this paper, the model was revised to provide quantitative fits to the data. The model can predict the selection frequencies of cards based on a selection tendency function (STF), or conversely, it enables the estimation of subjective probabilities from data. Past experimental data were first re-analysed based on the model. In Experiment 1, the superiority of the revised model was shown. However, when the relationship between antecedent and consequent was forced to deviate from the biconditional form, the model was not supported. In Experiment 2, it was shown that sufficient emphasis on probabilistic information can affect participants' performance. A detailed experimental method to sort participants by probabilistic strategies was introduced. Here, the model was supported by a subgroup of participants who used the probabilistic strategy. Finally, the results were discussed from the viewpoint of adaptive rationality.
On the skill of various ensemble spread estimators for probabilistic short range wind forecasting
NASA Astrophysics Data System (ADS)
Kann, A.
2012-05-01
A variety of applications, ranging from civil protection associated with severe weather to economic interests, are heavily dependent on meteorological information. For example, precise planning of an energy supply with a high share of renewables requires detailed meteorological information at high temporal and spatial resolution. With respect to wind power, detailed analyses and forecasts of wind speed are of crucial interest for energy management. Although the applicability and the current skill of state-of-the-art probabilistic short-range forecasts have increased during the last years, ensemble systems still show systematic deficiencies which limit their practical use. This paper presents methods to improve the ensemble skill of 10-m wind speed forecasts by combining deterministic information from a nowcasting system at very high horizontal resolution with uncertainty estimates from a limited-area ensemble system. It is shown for a one-month validation period that a statistical post-processing procedure (a modified non-homogeneous Gaussian regression) adds further skill to the probabilistic forecasts, especially beyond the nowcasting range after +6 h.
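A minimal sketch of non-homogeneous Gaussian regression fits N(a + b*mean, c + d*var) by minimizing the closed-form CRPS of a Gaussian; the data below are synthetic, and this plain parameterization is one common variant, not necessarily the modified form used in the paper:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# NGR post-processing sketch: predictive distribution
# N(a + b*ens_mean, c + d*ens_var), parameters fit by minimizing the
# mean CRPS of a Gaussian. Training data below are synthetic.
rng = np.random.default_rng(7)
n = 500
ens_mean = rng.uniform(2, 12, n)        # ensemble mean wind speed (m/s)
ens_var = rng.uniform(0.5, 4.0, n)      # ensemble variance
obs = 0.5 + 0.9 * ens_mean + rng.normal(0, np.sqrt(0.4 + 1.2 * ens_var))

def crps_gaussian(y, mu, sigma):
    # Closed-form CRPS of a normal distribution evaluated at y.
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                    - 1 / np.sqrt(np.pi))

def objective(params):
    a, b, c, d = params
    sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))
    return np.mean(crps_gaussian(obs, a + b * ens_mean, sigma))

res = minimize(objective, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead")
print("fitted (a, b, c, d):", np.round(res.x, 2))
```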
A lifetime Markov model for the economic evaluation of chronic obstructive pulmonary disease.
Menn, Petra; Leidl, Reiner; Holle, Rolf
2012-09-01
Chronic obstructive pulmonary disease (COPD) is currently the fourth leading cause of death worldwide. It has serious health effects and causes substantial costs for society. The aim of the present paper was to develop a state-of-the-art decision-analytic model of COPD whereby the cost effectiveness of interventions in Germany can be estimated. To demonstrate the applicability of the model, a smoking cessation programme was evaluated against usual care. A seven-stage Markov model (disease stages I to IV according to the GOLD [Global Initiative for Chronic Obstructive Lung Disease] classification, states after lung-volume reduction surgery and lung transplantation, death) was developed to conduct a cost-utility analysis from the societal perspective over a time horizon of 10, 40 and 60 years. Patients entered the cohort model at the age of 45 with mild COPD. Exacerbations were classified into three levels: mild, moderate and severe. Estimation of stage-specific probabilities (for smokers and quitters), utilities and costs was based on German data where possible. Data on effectiveness of the intervention was retrieved from the literature. A discount rate of 3% was applied to costs and effects. Probabilistic sensitivity analysis was used to assess the robustness of the results. The smoking cessation programme was the dominant strategy compared with usual care, and the intervention resulted in an increase in health effects of 0.54 QALYs and a cost reduction of €1115 per patient (year 2007 prices) after 60 years. In the probabilistic analysis, the intervention dominated in about 95% of the simulations. Sensitivity analyses showed that uncertainty primarily originated from data on disease progression and treatment cost in the early stages of disease. The model developed allows the long-term cost effectiveness of interventions to be estimated, and has been adapted to Germany. The model suggests that the smoking cessation programme evaluated was more effective than usual care as well as being cost-saving. Most patients had mild or moderate COPD, stages for which parameter uncertainty was found to be high. This raises the need to improve data on the early stages of COPD.
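The cohort propagation at the heart of such a model is compact; the three-state matrix, costs, and utilities below are illustrative placeholders rather than the COPD model's inputs:

```python
import numpy as np

# Markov cohort sketch: a cohort vector is propagated through yearly
# transition probabilities while per-cycle costs and utilities are
# discounted and accumulated. All numbers are illustrative.
P = np.array([            # toy yearly transitions: mild, severe, dead
    [0.90, 0.08, 0.02],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],
])
COST = np.array([800.0, 5_000.0, 0.0])   # cost per person-year
UTILITY = np.array([0.80, 0.55, 0.0])    # QALY weight per person-year
r = 0.03                                 # yearly discount rate

cohort = np.array([1.0, 0.0, 0.0])       # everyone starts in 'mild'
total_cost = total_qaly = 0.0
for year in range(60):
    disc = (1 + r) ** -year
    total_cost += disc * cohort @ COST
    total_qaly += disc * cohort @ UTILITY
    cohort = cohort @ P                  # advance the cohort one cycle

print(f"discounted cost/person: {total_cost:,.0f}")
print(f"discounted QALYs/person: {total_qaly:.2f}")
```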
Schackman, Bruce R; Leff, Jared A; Polsky, Daniel; Moore, Brent A; Fiellin, David A
2012-06-01
Primary care physicians with appropriate training may prescribe buprenorphine-naloxone (bup/nx) to treat opioid dependence in US office-based settings, where many patients prefer to be treated. Bup/nx is off patent but not available as a generic. We evaluated the cost-effectiveness of long-term office-based bup/nx treatment for clinically stable opioid-dependent patients compared to no treatment. A decision analytic model simulated a hypothetical cohort of clinically stable opioid-dependent individuals who have already completed 6 months of office-based bup/nx treatment. Data were from a published cohort study that collected treatment retention, opioid use, and costs for this population, and published quality-of-life weights. Uncertainties in estimated monthly costs and quality-of-life weights were evaluated in probabilistic sensitivity analyses, and the economic value of additional research to reduce these uncertainties was also evaluated. Bup/nx, provider, and patient costs in 2010 US dollars, quality-adjusted life years (QALYs), and incremental cost-effectiveness (CE) ratios ($/QALY); costs and QALYs are discounted at 3% annually. In the base case, office-based bup/nx for clinically stable patients has a CE ratio of $35,100/QALY compared to no treatment after 24 months, with 64% probability of being < $100,000/QALY in probabilistic sensitivity analysis. With a 50% bup/nx price reduction the CE ratio is $23,000/QALY with 69% probability of being < $100,000/QALY. Alternative quality-of-life weights result in CE ratios of $138,000/QALY and $90,600/QALY. The value of research to reduce quality-of-life uncertainties for 24-month results is $6,400 per person eligible for treatment at the current bup/nx price and $5,100 per person with a 50% bup/nx price reduction. Office-based bup/nx for clinically stable patients may be a cost-effective alternative to no treatment at a threshold of $100,000/QALY depending on assumptions about quality-of-life weights. Additional research about quality-of-life benefits and broader health system and societal cost savings of bup/nx therapy is needed.
Chidi, Alexis P; Bryce, Cindy L; Donohue, Julie M; Fine, Michael J; Landsittel, Douglas P; Myaskovsky, Larissa; Rogal, Shari S; Switzer, Galen E; Tsung, Allan; Smith, Kenneth J
2016-06-01
Interferon-free hepatitis C treatment regimens are effective but very costly. The cost-effectiveness, budget, and public health impacts of current Medicaid treatment policies restricting treatment to patients with advanced disease remain unknown. To evaluate the cost-effectiveness of current Medicaid policies restricting hepatitis C treatment to patients with advanced disease compared with a strategy providing unrestricted access to hepatitis C treatment, assess the budget and public health impact of each strategy, and estimate the feasibility and long-term effects of increased access to treatment for patients with hepatitis C. Using a Markov model, we compared two strategies for 45- to 55-year-old Medicaid beneficiaries: 1) Current Practice: only advanced disease is treated before Medicare eligibility; and 2) Full Access: both early-stage and advanced disease are treated before Medicare eligibility. Patients could develop progressive fibrosis, cirrhosis, or hepatocellular carcinoma, undergo transplantation, or die each year. Morbidity was reduced after successful treatment. We calculated the incremental cost-effectiveness ratio and compared the costs and public health effects of each strategy from the perspective of Medicare alone as well as the Centers for Medicare & Medicaid Services perspective. We varied model inputs in one-way and probabilistic sensitivity analyses. Full Access was less costly and more effective than Current Practice for all cohorts and perspectives, with differences in cost ranging from $5,369 to $11,960 and in effectiveness from 0.82 to 3.01 quality-adjusted life-years. In a probabilistic sensitivity analysis, Full Access was cost saving in 93% of model iterations. Compared with Current Practice, Full Access averted 5,994 hepatocellular carcinoma cases and 121 liver transplants per 100,000 patients. Current Medicaid policies restricting hepatitis C treatment to patients with advanced disease are more costly and less effective than unrestricted, full-access strategies. Collaboration between state and federal payers may be needed to realize the full public health impact of recent innovations in hepatitis C treatment. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Martínez-Velázquez, Eduardo S; Ramos-Loyo, Julieta; González-Garrido, Andrés A; Sequeira, Henrique
2015-01-21
Feedback-related negativity (FRN) is a negative deflection over frontocentral regions that appears around 250 ms after feedback about gains or losses associated with the chosen alternatives in a gambling task. A few studies have reported FRN enhancement in adolescents compared with adults in gambling tasks without probabilistic reinforcement learning, despite the fact that learning from positive or negative consequences is crucial for decision-making during adolescence. Therefore, the aim of the present research was to identify differences in FRN amplitude and latency between adolescents and adults on a gambling task with favorable and unfavorable probabilistic reinforcement learning conditions, in addition to a nonlearning condition with monetary gains and losses. Higher rates of high-magnitude choices during the final 30 trials compared with the first 30 trials were observed during the favorable condition, whereas lower rates were observed during the unfavorable condition in both groups. Higher FRN amplitude in all conditions, and longer latency in the nonlearning condition, were observed in adolescents compared with adults and in relation to losses. The results indicate that both the adolescents and the adults improved their performance in response to positive and negative feedback. However, the FRN findings suggest an increased sensitivity to external feedback about losses in adolescents compared with adults, irrespective of the presence or absence of probabilistic reinforcement learning. These results reflect processing differences in the neural monitoring system and provide new perspectives on the dynamic development of the adolescent brain.
Hohl, Corinne Michèle; Nosyk, Bohdan; Sadatsafavi, Mohsen; Anis, Aslam Hayat
2008-01-01
To determine the incremental cost-effectiveness of using propofol versus midazolam for procedural sedation (PS) in adults in the emergency department (ED). The authors conducted a cost-effectiveness analysis from the perspective of the health care provider. The primary outcome was the incremental cost (or savings) to achieve one additional successful sedation with propofol compared to midazolam. A decision model was developed in which the clinical effectiveness and cost of a PS strategy using either agent was estimated. The authors derived estimates of clinical effectiveness and risk of adverse events (AEs) from a systematic review. The cost of each clinical outcome was determined by incorporating the baseline cost of the ED visit, the cost of the drug, the cost of labor of physicians and nurses, the cost and probability of an AE, and the cost and probability of a PS failure. A standard meta-analytic technique was used to calculate the weighted mean difference in recovery times and obtain mean drug doses from patient-level data from a randomized controlled trial. Probabilistic sensitivity analyses were conducted to examine the uncertainty around the estimated incremental cost-effectiveness ratio using Monte Carlo simulation. Choosing a sedation strategy with propofol resulted in average savings of $17.33 (95% confidence interval [CI] = $24.13 to $10.44) per sedation performed. This resulted in an incremental cost-effectiveness ratio of -$597.03 (95% credibility interval -$6,434.03 to $6,113.57) indicating savings of $597.03 per additional successful sedation performed with propofol. This result was driven by shorter recovery times and was robust to all sensitivity analyses performed. These results indicate that using propofol for PS in the ED is a cost-saving strategy.
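The probabilistic sensitivity analysis pattern, sampling the uncertain inputs and summarizing the simulated distribution of incremental cost, can be sketched as follows (all distributions are illustrative, not the study's fitted inputs):

```python
import numpy as np

# Monte Carlo PSA sketch: sample uncertain inputs, recompute the
# incremental cost and effect per draw, and summarize the resulting
# distribution. All distributions below are invented for the example.
rng = np.random.default_rng(8)
n = 10_000

d_recovery = rng.normal(25.0, 8.0, n)        # minutes saved per sedation
nurse_cost_per_min = rng.gamma(9.0, 0.1, n)  # ~0.9 $/min labour cost
d_drug_cost = rng.normal(4.0, 1.0, n)        # extra drug cost ($)
d_success = rng.normal(0.03, 0.02, n)        # extra successful sedations

d_cost = d_drug_cost - d_recovery * nurse_cost_per_min  # negative = saving
p_dominant = np.mean((d_cost < 0) & (d_success > 0))

lo, hi = np.percentile(d_cost, [2.5, 97.5])
print(f"mean incremental cost: {d_cost.mean():+.2f} "
      f"(95% interval [{lo:.2f}, {hi:.2f}])")
print(f"P(cheaper and more effective) = {p_dominant:.2f}")
```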
Shankaran, Veena; Ortendahl, Jesse D; Purdum, Anna G; Bolinder, Bjorn; Anene, Ayanna M; Sun, Gordon H; Bentley, Tanya G K
2018-01-01
We conducted a cost-effectiveness analysis incorporating recent phase III clinical trial (FIRE-3) data to evaluate clinical and economic tradeoffs associated with first-line treatments of KRAS wild-type (WT) metastatic colorectal cancer (mCRC). A cost-effectiveness model was developed using FIRE-3 data to project survival and lifetime costs of FOLFIRI plus either cetuximab or bevacizumab. Hypothetical KRAS-WT mCRC patients initiated first-line treatment and could experience adverse events, disease progression warranting second-line treatment, or clinical response and hepatic metastasectomy. Model inputs were derived from FIRE-3 and published literature. Incremental cost-effectiveness ratios (ICERs) were reported as US$ per life year (LY) and quality-adjusted life year (QALY). Scenario analyses considered patients with extended RAS mutations and CALGB/SWOG 80405 data; 1-way and probabilistic sensitivity analyses were conducted. Compared with bevacizumab, KRAS-WT patients receiving first-line cetuximab gained 5.7 months of life at a cost of $46,266, for an ICER of $97,223/LY ($122,610/QALY). For extended RAS-WT patients, the ICER was $77,339/LY ($99,584/QALY). Cetuximab treatment was cost-effective 80.3% of the time, given a willingness-to-pay threshold of $150,000/LY. Results were sensitive to changes in survival, treatment duration, and product costs. Our analysis of FIRE-3 data suggests that first-line treatment with cetuximab and FOLFIRI in KRAS (and extended RAS) WT mCRC patients may improve health outcomes and use financial resources more efficiently than bevacizumab and FOLFIRI. This information, in combination with other studies investigating comparative effectiveness of first-line options, can be useful to clinicians, payers, and policymakers in making treatment and resource allocation decisions for mCRC patients.
Schackman, Bruce R.; Leff, Jared A.; Barter, Devra M.; DiLorenzo, Madeline A.; Feaster, Daniel J.; Metsch, Lisa R.; Freedberg, Kenneth A.; Linas, Benjamin P.
2014-01-01
Aims: To evaluate the cost-effectiveness of rapid hepatitis C virus (HCV) and simultaneous HCV/HIV antibody testing in substance abuse treatment programs. Design: We used a decision analytic model to compare the cost-effectiveness of no HCV testing referral or offer, off-site HCV testing referral, on-site rapid HCV testing offer, and on-site rapid HCV and HIV testing offer. Base case inputs included 11% undetected chronic HCV, 0.4% undetected HIV, 35% HCV co-infection among HIV-infected, 53% linked to HCV care after testing antibody positive, and 67% linked to HIV care. Disease outcomes were estimated from established computer simulation models of HCV (HEP-CE) and HIV (CEPAC). Setting and Participants: Data on test acceptance and costs were from a national randomized trial of HIV testing strategies conducted at 12 substance abuse treatment programs in the USA. Measurements: Lifetime costs (2011 US dollars) and quality-adjusted life years (QALYs) discounted at 3% annually; incremental cost-effectiveness ratios (ICERs). Findings: On-site rapid HCV testing had an ICER of $18,300/QALY compared with no testing, and was more efficient than (dominated) off-site HCV testing referral. On-site rapid HCV and HIV testing had an ICER of $64,500/QALY compared with on-site rapid HCV testing alone. In one- and two-way sensitivity analyses, the ICER of on-site rapid HCV and HIV testing remained <$100,000/QALY, except when undetected HIV prevalence was <0.1% or when we assumed frequent HIV testing elsewhere. The ICER remained <$100,000/QALY in approximately 90% of probabilistic sensitivity analyses. Conclusions: On-site rapid hepatitis C virus and HIV testing in substance abuse treatment programs is cost-effective at a <$100,000/quality-adjusted life year threshold. PMID:25291977
Cost-effectiveness of screening for abdominal aortic aneurysm in the Netherlands and Norway.
Spronk, S; van Kempen, B J H; Boll, A P M; Jørgensen, J J; Hunink, M G M; Kristiansen, I S
2011-11-01
The aim of this study was to determine the cost-effectiveness of ultrasound screening for abdominal aortic aneurysm (AAA) in men aged 65 years, for both the Netherlands and Norway. A Markov model was developed to simulate life expectancy, quality-adjusted life-years, net health benefits, lifetime costs and incremental cost-effectiveness ratios for both screening and no screening for AAA. The best available evidence was retrieved from the literature and combined with primary data from the two countries separately, and analysed from a national perspective. A threshold willingness-to-pay (WTP) of €20,000 and €62,500 was used for data from the Netherlands and Norway respectively. The additional costs of the screening strategy compared with no screening were €421 (95 per cent confidence interval 33 to 806) per person in the Netherlands, and the additional life-years were 0.097 (-0.180 to 0.365), representing €4340 per life-year. For Norway, the values were €562 (59 to 1078), 0.057 (-0.135 to 0.253) life-years and €9860 per life-year respectively. In Norway the results were sensitive to a decrease in the prevalence of AAA in 65-year-old men to 1 per cent, or lower. Probabilistic sensitivity analyses indicated that AAA screening has a 70 per cent probability of being cost-effective in the Netherlands with a WTP threshold of €20,000, and 70 per cent in Norway with a threshold of €62,500. Using this model, screening for AAA in 65-year-old men would be highly cost-effective in both the Netherlands and Norway. Copyright © 2011 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.
Mori, Amani T; Ngalesoni, Frida; Norheim, Ole F; Robberstad, Bjarne
2014-09-15
Dihydroartemisinin-piperaquine (DhP) is highly recommended for the treatment of uncomplicated malaria. This study aimed to compare the costs, health benefits and cost-effectiveness of DhP and artemether-lumefantrine (AL), alongside "do-nothing" as a baseline comparator, in order to assess the appropriateness of DhP as a first-line anti-malarial drug for children in Tanzania. A cost-effectiveness analysis was performed using a Markov decision model, from a provider's perspective. The study used cost data from Tanzania and secondary effectiveness data from a review of articles from sub-Saharan Africa. Probabilistic sensitivity analysis was used to incorporate uncertainties in the model parameters. In addition, sensitivity analyses were used to test plausible variations of key parameters, and the key assumptions were tested in scenario analyses. The model predicts that DhP is more cost-effective than AL, with an incremental cost-effectiveness ratio (ICER) of US$ 12.40 per DALY averted. This result relies on the assumption that compliance with treatment is higher for DhP than for AL because of its relatively simple once-a-day dosage regimen. When compliance was assumed to be identical for the two drugs, AL was more cost-effective than DhP, with an ICER of US$ 12.54 per DALY averted. DhP is, however, slightly more likely than AL to be cost-effective at a willingness-to-pay threshold of US$ 150 per DALY averted. Dihydroartemisinin-piperaquine is a very cost-effective anti-malarial drug. The findings support its use as an alternative first-line drug for treatment of uncomplicated malaria in children in Tanzania and other sub-Saharan African countries with similar healthcare infrastructures and epidemiology of malaria.
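The pivotal role of the compliance assumption can be made concrete with a one-way sensitivity sketch. Everything below (costs, efficacy, DALY weight, and the dalys_averted helper) is a hypothetical illustration, not the authors' model.

```python
# One-way sensitivity sketch: how the preferred drug can flip with compliance.
# All inputs are illustrative placeholders, not the Tanzanian model values.

COST_AL, COST_DHP = 1.40, 1.60    # USD per treatment course (assumed)
BASE_EFFICACY = 0.95              # cure probability at full compliance (assumed)
DALY_PER_FAILURE = 0.02           # DALYs accrued per treatment failure (assumed)

def dalys_averted(compliance):
    # DALYs averted (vs doing nothing) scale with the effective cure rate,
    # expressed here per 1,000 treated children.
    return BASE_EFFICACY * compliance * DALY_PER_FAILURE * 1000

AL_COMPLIANCE = 0.80              # assumed lower with the 6-dose regimen
for dhp_compliance in (0.80, 0.90, 0.95):
    d_cost = (COST_DHP - COST_AL) * 1000
    d_daly = dalys_averted(dhp_compliance) - dalys_averted(AL_COMPLIANCE)
    if d_daly > 0:
        print(f"DhP compliance {dhp_compliance:.2f}: "
              f"ICER = ${d_cost / d_daly:,.2f}/DALY averted vs AL")
    else:
        print(f"DhP compliance {dhp_compliance:.2f}: DhP not more effective")
```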
Kuznik, Andreas; Bégo-Le-Bagousse, Gaëlle; Eckert, Laurent; Gadkari, Abhijit; Simpson, Eric; Graham, Christopher N; Miles, LaStella; Mastey, Vera; Mahajan, Puneet; Sullivan, Sean D
2017-12-01
Dupilumab significantly improves signs and symptoms of atopic dermatitis (AD), including pruritus, symptoms of anxiety and depression, and health-related quality of life versus placebo in adults with moderate-to-severe AD. Since the cost-effectiveness of dupilumab has not been evaluated, the objective of this analysis was to estimate a value-based price range in which dupilumab would be considered cost-effective compared with supportive care (SC) for treatment of moderate-to-severe AD in an adult population. A health economic model was developed to evaluate, from the US payer perspective, the long-term costs and benefits of dupilumab treatment administered every other week (q2w). Dupilumab q2w was compared with SC; robustness of assumptions and results was tested using sensitivity and scenario analyses. Clinical data were derived from the dupilumab LIBERTY AD SOLO trials; healthcare use and cost data were from health insurance claims histories of adult patients with AD. The annual price of maintenance therapy at which dupilumab would be considered cost-effective was estimated for decision thresholds of US$100,000 and $150,000 per quality-adjusted life-year (QALY) gained. In the base case, the annual maintenance price at which dupilumab therapy would be considered cost-effective was $28,770 at a $100,000 per QALY gained threshold, and $39,940 at a $150,000 threshold. Results were generally robust to parameter variations in one-way and probabilistic sensitivity analyses. Dupilumab q2w compared with SC is cost-effective for the treatment of moderate-to-severe AD in US adults at an annual maintenance-therapy price in the range of $29,000-$40,000 at the $100,000-$150,000 per QALY thresholds. Funding: Sanofi and Regeneron Pharmaceuticals, Inc.
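The value-based pricing logic is a one-line rearrangement: choose the annual price at which the ICER equals the decision threshold. A sketch with assumed QALY gains, non-drug cost offsets, and therapy duration (none of these are the trial-derived inputs):

```python
# Back-solving a value-based annual price: a sketch of the threshold-pricing
# idea described above, with illustrative (not trial-derived) inputs.

D_QALY = 1.5             # lifetime QALYs gained vs supportive care (assumed)
OTHER_D_COST = -5_000    # non-drug incremental lifetime cost, USD (assumed)
YEARS_ON_THERAPY = 5     # assumed discounted years of maintenance therapy

for threshold in (100_000, 150_000):
    # Solve (OTHER_D_COST + p * YEARS_ON_THERAPY) / D_QALY == threshold for p.
    p = (threshold * D_QALY - OTHER_D_COST) / YEARS_ON_THERAPY
    print(f"At ${threshold:,}/QALY the value-based annual price is ${p:,.0f}")
```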
A Risk-Based Approach for Aerothermal/TPS Analysis and Testing
NASA Technical Reports Server (NTRS)
Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak
2007-01-01
The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte Carlo-based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of the uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low-risk, high-reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
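A toy version of such a Monte Carlo uncertainty analysis can be sketched with a Sutton-Graves-like stagnation-point heating correlation; the uncertainty magnitudes and the correlation-based sensitivity ranking below are illustrative assumptions, not the program's models.

```python
import numpy as np

# Sketch of Monte Carlo uncertainty propagation and sensitivity ranking for
# a toy stagnation-point heating relation (Sutton-Graves-like, q ~
# sqrt(rho/Rn) * V^3). All uncertainty magnitudes are assumed.

rng = np.random.default_rng(1)
N = 20_000
rho = rng.lognormal(np.log(3e-4), 0.05, N)  # freestream density, kg/m^3
V   = rng.normal(7500.0, 50.0, N)           # entry velocity, m/s
Rn  = 1.0                                   # nose radius, m (held fixed)
k   = rng.normal(1.0, 0.10, N)              # model-form uncertainty factor

q = 1.74e-4 * np.sqrt(rho / Rn) * V**3 * k  # heat flux, W/m^2

print(f"heat flux: mean {q.mean() / 1e4:.1f} W/cm^2, "
      f"99.9th pct {np.percentile(q, 99.9) / 1e4:.1f} W/cm^2")

# Rank input contributions by their correlation with the output.
for name, x in [("density", rho), ("velocity", V), ("model factor", k)]:
    r = np.corrcoef(x, q)[0, 1]
    print(f"{name:12s} correlation with q: {r:+.2f}")
```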
Kaambwa, Billingsley; Bryan, Stirling; Jowett, Sue; Mant, Jonathan; Bray, Emma P; Hobbs, F D Richard; Holder, Roger; Jones, Miren I; Little, Paul; Williams, Bryan; McManus, Richard J
2014-12-01
Self-monitoring and self-titration of antihypertensives (self-management) is a novel intervention which improves blood pressure control. However, little evidence exists regarding the cost-effectiveness of self-monitoring of blood pressure in general and self-management in particular. This study aimed to evaluate whether self-management of hypertension was cost-effective. A cohort Markov model-based probabilistic cost-effectiveness analysis was undertaken, extrapolating up to 35 years from cost and outcome data collected in the telemonitoring and self-management in hypertension trial (TASMINH2). Self-management of hypertension was compared with usual care in terms of lifetime costs, quality-adjusted life-years and cost-effectiveness from a UK health service perspective. Sensitivity analyses examined the effect of different time horizons and of reduced effectiveness of self-management over time. In the long term, when compared with usual care, self-management was more effective by 0.24 and 0.12 quality-adjusted life-years (QALYs) gained per patient for men and women, respectively. The resultant incremental cost-effectiveness ratio for self-management was £1624 per QALY for men and £4923 per QALY for women. There was at least a 99% chance of the intervention being cost-effective for both sexes at a willingness-to-pay threshold of £20,000 per QALY gained. These results were robust to sensitivity analyses around the assumptions made, provided that the effects of self-management lasted at least two years for men and five years for women. Self-monitoring with self-titration of antihypertensives and telemonitoring of blood pressure measurements not only reduces blood pressure, compared with usual care, but also represents a cost-effective use of health care resources. © The European Society of Cardiology 2013.
Cost-effectiveness analysis of oral fentanyl formulations for breakthrough cancer pain treatment
Cortesi, Paolo Angelo; D’Angiolella, Lucia Sara; Vellucci, Renato; Allegri, Massimo; Casale, Giuseppe; Favaretti, Carlo; Kheiraoui, Flavia; Cesana, Giancarlo; Mantovani, Lorenzo Giovanni
2017-01-01
Breakthrough cancer pain (BTcP) has a high prevalence in the cancer population. Patients with BTcP report substantial health care costs and poor quality of life. This study assessed the cost-effectiveness of the available oral fentanyl formulations (OFFs) for BTcP in Italy. A decision-analytic model was developed to estimate costs and benefits associated with treatments, from the Italian NHS perspective. Expected reductions in pain intensity per BTcP episode were translated into percentage of BTcP episodes avoided, resource use, and quality-adjusted life-years (QALYs). Relative efficacy, resource use and unit cost data were derived from the literature and validated by clinical experts. Probabilistic and deterministic sensitivity analyses were performed. In the base-case analysis, sublingual fentanyl citrate (FCSL) had a lower cost per patient (€1,960.8) and higher efficacy (18.7% of BTcP episodes avoided and 0.0507 QALYs gained) than the other oral formulations. The sensitivity analyses confirmed the main results in all tested scenarios, with the largest impact coming from BTcP duration and health care resource consumption parameters. Among OFFs, FCSL is the cost-effective option owing to its faster reduction of pain intensity. However, new research is needed to better understand the economic and epidemiologic impact of BTcP, and to collect more robust data on the economic and quality-of-life impact of the different fentanyl formulations. Different fentanyl formulations are available to manage BTcP in the cancer population. This study is the first to assess the costs and effectiveness of the different OFFs, providing new information to better allocate the resources available to treat BTcP and highlighting the need for better data. PMID:28654672
Smith, Kenneth J; Kuo, Shihchen; Zgibor, Janice C; McTigue, Kathleen M; Hess, Rachel; Bhargava, Tina; Bryce, Cindy L
2016-06-01
To assess the cost-effectiveness of an online adaptation of the diabetes prevention program (ODPP) lifestyle intervention. ODPP was a before-after evaluation of a weight loss intervention comprising 16 weekly and 8 monthly lessons, incorporating behavioral tools and regular, brief, web-based individualized counseling in an overweight/obese cohort (mean age 52, 76% female, 92% white, 28% with diabetes). A Markov model was developed to estimate ODPP cost-effectiveness compared with usual care (UC) to reduce metabolic risk over 10 years. Intervention costs and weight change outcomes were obtained from the study; other model parameters were based on published reports. In the model, diabetes risk was a function of weight change with and without the intervention. Compared to UC, the ODPP in our cohort cost $14,351 and $29,331 per quality-adjusted life-year (QALY) gained from the health care system and societal perspectives, respectively. In a hypothetical cohort without diabetes, the ODPP cost $7777 and $18,263 per QALY gained, respectively. Results were robust in sensitivity analyses, but enrolling cohorts with a lower annual risk of developing diabetes (≤1.8%), enrolling fewer participants (≤15), or increasing the hourly cost (≥$91.20) or annual per-participant time (≥1.45 h) required for technical support could increase ODPP cost to >$20,000 per QALY gained. In probabilistic sensitivity analyses, ODPP was cost-effective in 20-58% of model iterations using an acceptability threshold of $20,000, 73-92% at $50,000, and 95-99% at $100,000 per QALY gained. The ODPP may offer an economical approach to combating overweight and obesity. Copyright © 2016 Elsevier Inc. All rights reserved.
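Acceptability percentages like those quoted above come from evaluating net monetary benefit across PSA draws at each willingness-to-pay threshold. A minimal sketch, with assumed (not ODPP-derived) distributions for incremental cost and QALYs:

```python
import numpy as np

# Cost-effectiveness acceptability sketch: given paired PSA draws of
# incremental cost and incremental QALYs, the probability of being
# cost-effective at a threshold is the share of draws with positive net
# monetary benefit. The draw distributions here are illustrative.

rng = np.random.default_rng(7)
n = 10_000
d_cost = rng.normal(1_500, 600, n)   # incremental cost vs usual care, USD
d_qaly = rng.normal(0.10, 0.06, n)   # incremental QALYs

for wtp in (20_000, 50_000, 100_000):
    nmb = wtp * d_qaly - d_cost      # net monetary benefit per draw
    print(f"WTP ${wtp:>7,}/QALY: P(cost-effective) = {(nmb > 0).mean():.2f}")
```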
Klinghoffer, Zachary; Tarride, Jean-Eric; Novara, Giacomo; Ficarra, Vincenzo; Kapoor, Anil; Shayegan, Bobby; Braga, Luis H.
2013-01-01
Objectives: We compare the cost-utility of laparoscopic radical nephrectomy (LRN), laparoscopic partial nephrectomy (LPN) and open partial nephrectomy (OPN) in the management of small renal masses (SRMs) when the impact of ensuing chronic kidney disease (CKD) is considered. Methods: We designed a Markov decision analysis model with a 10-year time horizon. Estimates of costs, utilities, complication rates and probabilities of developing CKD were derived from the literature. The base case patient was assumed to be a 65-year-old patient with a <4-cm unilateral renal mass, a normal contralateral kidney and a normal preoperative serum creatinine. Univariate and probabilistic sensitivity analyses were conducted to address the uncertainty associated with the study parameters. Results: OPN was the least costly strategy at $25,941 USD and generated 7.161 quality-adjusted life years (QALYs) over 10 years. LPN yielded 0.098 additional QALYs at an additional cost of $888 for an incremental cost-effectiveness ratio of $9057 per QALY, well below a commonly cited willingness-to-pay threshold of $50,000 per QALY. LRN was more costly and yielded fewer QALYs than OPN and LPN. Sensitivity analyses demonstrated our model to be robust to changes to key parameters. Age had no effect on the preferred strategy. Conclusions: Partial nephrectomy (PN) is the preferred treatment strategy for SRMs. In centres where LPN is not available, OPN remains considerably more cost-effective than LRN. Furthermore, our study demonstrates that there is no age at which PN is not preferred to LRN. Our study provides additional evidence to advocate PN for the management of all amenable SRMs. PMID:23671525
Biasutti, Maria; Dufour, Natacha; Ferroud, Clotilde; Dab, William; Temime, Laura
2012-01-01
Used as contrast agents for brain magnetic resonance imaging (MRI), markers for beta-amyloid deposits might allow early diagnosis of Alzheimer's disease (AD). We evaluated the cost-effectiveness of such a diagnostic test, MRI+CLP (contrastophore-linker-pharmacophore), should it become clinically available. We compared the cost-effectiveness of MRI+CLP to that of standard diagnosis using currently available cognition tests and of standard MRI, and investigated the impact of a hypothetical treatment effective in early AD. The primary analysis was based on the current French context for 70-year-old patients with Mild Cognitive Impairment (MCI). In alternative "screen and treat" scenarios, we analyzed the consequences of systematic screening of over-60 individuals (either population-wide or restricted to the ApoE4 genotype population). We used a Markov model of AD progression; model parameters, as well as incurred costs and quality-of-life weights in France, were taken from the literature. We performed univariate and probabilistic multivariate sensitivity analyses. The base-case preferred strategy was the standard MRI diagnosis strategy. In the primary analysis, however, MRI+CLP could become the preferred strategy under a wide array of scenarios involving lower cost and/or higher sensitivity or specificity. By contrast, in the "screen and treat" analyses, the probability of MRI+CLP becoming the preferred strategy remained lower than 5%. It is thought that anti-beta-amyloid compounds might halt the development of dementia in early-stage patients. This study suggests that, even should such treatments become available, systematically screening the over-60 population for AD would only become cost-effective with highly specific tests able to diagnose early stages of the disease. However, offering a new diagnostic test based on beta-amyloid markers to elderly patients with MCI might prove cost-effective.
Modelling the healthcare costs of skin cancer in South Africa.
Gordon, Louisa G; Elliott, Thomas M; Wright, Caradee Y; Deghaye, Nicola; Visser, Willie
2016-04-02
Skin cancer is a growing public health problem in South Africa due to its high ambient ultraviolet radiation environment. The purpose of this study was to estimate the annual health system costs of cutaneous melanoma, squamous cell carcinoma (SCC) and basal cell carcinoma (BCC) in South Africa, incorporating both the public and private sectors. A cost-of-illness study with a 'bottom-up' micro-costing approach was used to measure the economic burden of skin cancer. Clinicians provided data on the patterns of care and treatments, while national costing reports and clinician fees provided cost estimates. The mean costs per melanoma and per SCC/BCC were extrapolated to national costs using published incidence data and official population statistics. One-way and probabilistic sensitivity analyses were undertaken to address the uncertainty of the parameters used in the model. The estimated total annual cost of treating skin cancers in South Africa was ZAR 92.4 million (2015) (or US$15.7 million). Sensitivity analyses showed that the total cost could vary between ZAR 89.7 and 94.6 million (US$15.2 to $16.1 million) when melanoma-related variables were changed, and between ZAR 78.4 and 113.5 million ($13.3 to $19.3 million) when non-melanoma-related variables were changed. The primary drivers of overall costs were the cost of excisions, follow-up care, radical lymph node dissection, cryotherapy and radiation therapy. The cost of managing skin cancer in South Africa is sizable. Since skin cancer is largely preventable through improvements to sun-protection awareness and skin cancer prevention programs, this study highlights that these healthcare resources could be freed for other pressing public health problems in South Africa.
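The 'bottom-up' extrapolation step is straightforward to sketch: per-case costs times national incidence, with parameter uncertainty propagated by simulation. All inputs below are placeholders rather than the published South African figures.

```python
import numpy as np

# 'Bottom-up' cost-of-illness sketch: mean treatment cost per case times
# national incidence, with simple probabilistic uncertainty. Inputs are
# illustrative placeholders, not the published estimates.

rng = np.random.default_rng(3)
n = 10_000
cases_melanoma = rng.normal(1_500, 150, n)     # annual incident melanoma cases
cases_nmsc     = rng.normal(20_000, 2_000, n)  # annual SCC + BCC cases
cost_melanoma  = rng.gamma(25, 1_200, n)       # ZAR per melanoma case
cost_nmsc      = rng.gamma(16, 200, n)         # ZAR per SCC/BCC case

total = cases_melanoma * cost_melanoma + cases_nmsc * cost_nmsc
lo, mid, hi = np.percentile(total, [2.5, 50, 97.5]) / 1e6
print(f"Annual cost: ZAR {mid:.1f}M (95% interval {lo:.1f}-{hi:.1f}M)")
```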
Granados-García, Víctor; Contreras, Ana M; García-Peña, Carmen; Salinas-Escudero, Guillermo; Thein, Hla-Hla; Flores, Yvonne N
2016-01-01
We conducted a cost-effectiveness analysis of seven hepatitis C virus (HCV) testing strategies in blood donors. Three of the seven strategies were based on HCV diagnosis and reporting guidelines in Mexico and four were from previous and current recommendations outlined by the CDC. The strategies that were evaluated determine antibody levels according to the signal-to-cut-off (S/CO) ratio and use reflex immunoblot (IMB) or HCV RNA tests to confirm true-positive (TP) cases of chronic HCV infection. Costs were calculated from the perspective of the Mexican Institute of Social Security (IMSS). A decision tree model was developed to estimate the expected number of true-positive cases and costs for the base-case scenarios and for the sensitivity analyses. Base-case findings indicate extended dominance of the CDC-USA2 and CDC-USA4 options by the IMSS-Mexico3 and IMSS-Mexico1 alternatives. The probabilistic sensitivity analysis results suggest that for a willingness-to-pay (WTP) range of $0-9,000 USD the IMSS-Mexico1 strategy is the most cost-effective of all strategies ($5,000 USD per TP). The IMSS-Mexico3, IMSS-Mexico2, and CDC-USA3 strategies are also cost-effective, costing between $7,800 and $8,800 USD per TP case detected. The CDC-USA1 strategy was very expensive and not cost-effective. HCV antibody testing strategies based on the classification of two or three levels of the S/CO ratio are cost-effective procedures to identify patients who require reflex IMB or HCV RNA testing to confirm chronic HCV infection.
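Extended dominance, which drives the base-case ranking above, is easy to miss without the frontier algebra. The sketch below screens a set of strategies for strict and extended dominance; the costs and true-positive yields are invented, with the strategy labels kept only for flavor.

```python
# Sketch of strict and extended dominance screening. Costs (USD per 10,000
# donors) and true positives detected are illustrative numbers, not the
# published IMSS/CDC values.

strategies = [  # (name, cost, true positives detected)
    ("no_test",      0,       0.0),
    ("IMSS-Mexico1", 55_000, 11.0),
    ("CDC-USA2",     70_000, 11.5),
    ("IMSS-Mexico3", 78_000, 14.0),
    ("CDC-USA1",    160_000, 14.5),
]

frontier = sorted(strategies, key=lambda s: s[1])
# Strict dominance: drop anything no more effective than a cheaper option.
frontier = [s for s in frontier
            if not any(o[1] <= s[1] and o[2] >= s[2] and o != s
                       for o in frontier)]
# Extended dominance: drop options whose ICER exceeds that of the next
# more effective option, then re-check until the frontier is convex.
changed = True
while changed:
    changed = False
    for i in range(1, len(frontier) - 1):
        icer_here = ((frontier[i][1] - frontier[i - 1][1]) /
                     (frontier[i][2] - frontier[i - 1][2]))
        icer_next = ((frontier[i + 1][1] - frontier[i][1]) /
                     (frontier[i + 1][2] - frontier[i][2]))
        if icer_here > icer_next:        # extendedly dominated
            del frontier[i]
            changed = True
            break

for prev, cur in zip(frontier, frontier[1:]):
    icer = (cur[1] - prev[1]) / (cur[2] - prev[2])
    print(f"{cur[0]}: ${icer:,.0f} per extra true positive vs {prev[0]}")
```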
Taylor, Richard Andrew; Singh Gill, Harman; Marcolini, Evie G; Meyers, H Pendell; Faust, Jeremy Samuel; Newman, David H
2016-10-01
The objective was to determine the testing threshold for lumbar puncture (LP) in the evaluation of aneurysmal subarachnoid hemorrhage (SAH) after a negative head computed tomography (CT). As a secondary aim we sought to identify clinical variables that have the greatest impact on this threshold. A decision analytic model was developed to estimate the testing threshold for patients with normal neurologic findings, being evaluated for SAH, after a negative CT of the head. The testing threshold was calculated as the pretest probability of disease where the two strategies (LP or no LP) are balanced in terms of quality-adjusted life-years. Two-way and probabilistic sensitivity analyses (PSAs) were performed. For the base-case scenario the testing threshold for performing an LP after negative head CT was 4.3%. Results for the two-way sensitivity analyses demonstrated that the test threshold ranged from 1.9% to 15.6%, dominated by the uncertainty in the probability of death from initial missed SAH. In the PSA the mean testing threshold was 4.3% (95% confidence interval = 1.4% to 9.3%). Other significant variables in the model included probability of aneurysmal versus nonaneurysmal SAH after negative head CT, probability of long-term morbidity from initial missed SAH, and probability of renal failure from contrast-induced nephropathy. Our decision analysis results suggest a testing threshold for LP after negative CT to be approximately 4.3%, with a range of 1.4% to 9.3% on robust PSA. In light of these data, and considering the low probability of aneurysmal SAH after a negative CT, classical teaching and current guidelines addressing testing for SAH should be revisited. © 2016 by the Society for Academic Emergency Medicine.
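The testing-threshold calculation itself is a small piece of algebra: set the expected QALYs of the two strategies equal and solve for the pretest probability. A sketch with illustrative values tuned to land near the reported 4.3% (they are not the published model inputs):

```python
# Testing-threshold sketch: the pretest probability of SAH at which expected
# QALYs with and without lumbar puncture (LP) are equal. All QALY values
# and harms below are illustrative assumptions.

Q_TREATED   = 20.0    # QALYs if SAH is detected by LP and treated
Q_MISSED    = 19.0    # QALYs if SAH is initially missed
LP_BURDEN   = 0.003   # QALY loss from the LP itself (pain, post-LP headache)
P_FALSE_POS = 0.10    # probability an LP triggers a false-positive workup
FP_WORKUP   = 0.40    # QALY loss of that downstream workup (e.g. angiography)

# The QALYs of disease-free patients cancel, leaving:
#   p * (Q_TREATED - Q_MISSED) = expected harm per test
harm_per_test = LP_BURDEN + P_FALSE_POS * FP_WORKUP
benefit_if_disease = Q_TREATED - Q_MISSED
p_star = harm_per_test / benefit_if_disease
print(f"Testing threshold: {100 * p_star:.1f}%")  # LP worthwhile above this
```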
Meier, Sandra L; Charleston, Alison J; Tippett, Lynette J
2010-11-01
Amyotrophic lateral sclerosis, a progressive disease affecting motor neurons, may variably affect cognition and behaviour. We tested the hypothesis that functions associated with orbitomedial prefrontal cortex are affected by evaluating the behavioural and cognitive performance of 18 participants with amyotrophic lateral sclerosis without dementia and 18 healthy, matched controls. We measured Theory of Mind (Faux Pas Task), emotional prosody recognition (Aprosodia Battery), reversal of behaviour in response to changes in reward (Probabilistic Reversal Learning Task), decision making without risk (Holiday Apartment Task) and aberrant behaviour (Neuropsychiatric Inventory). We also assessed dorsolateral prefrontal function, using verbal and written fluency and planning (One-touch Stockings of Cambridge), to determine whether impairments in tasks sensitive to these two prefrontal regions co-occur. The patient group was significantly impaired at identifying social faux pas, recognizing emotions and decision-making, indicating mild, but consistent impairment on most measures sensitive to orbitomedial prefrontal cortex. Significant levels of aberrant behaviour were present in 50% of patients. Patients were also impaired on verbal fluency and planning. Individual subject analyses involved computing classical dissociations between tasks sensitive to different prefrontal regions. These revealed heterogeneous patterns of impaired and spared cognitive abilities: 33% of participants had classical dissociations involving orbitomedial prefrontal tasks, 17% had classical dissociations involving dorsolateral prefrontal tasks, 22% had classical dissociations between tasks of both regions, and 28% had no classical dissociations. These data indicate subtle changes in behaviour, emotional processing, decision-making and altered social awareness, associated with orbitomedial prefrontal cortex, may be present in a significant proportion of individuals with amyotrophic lateral sclerosis without dementia, some with no signs of dysfunction in tasks sensitive to other regions of prefrontal cortex. This demonstration of variability in cognitive integrity supports previous research indicating amyotrophic lateral sclerosis is a heterogeneous disease.
NASA Technical Reports Server (NTRS)
Onwubiko, Chin-Yere; Onyebueke, Landon
1996-01-01
The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is therefore inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor-of-safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted, so a high factor of safety is used; this approach may be adequate where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
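The contrast with a fixed safety factor can be shown with a classic stress-strength interference calculation; the normal distributions below are assumptions chosen for illustration.

```python
import numpy as np

# Stress-strength interference sketch of the probabilistic design method:
# instead of reporting a single safety factor, estimate the probability
# that the load effect exceeds the capacity. Distributions are assumed.

rng = np.random.default_rng(42)
n = 1_000_000
stress   = rng.normal(300.0, 40.0, n)   # applied stress, MPa
strength = rng.normal(450.0, 35.0, n)   # material strength, MPa

p_fail = np.mean(stress >= strength)
print(f"Estimated probability of failure: {p_fail:.2e}")

# The deterministic view would report only the mean safety factor,
# hiding the tail overlap that drives the failure probability:
print(f"Mean safety factor: {450.0 / 300.0:.2f}")
```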
Barsky, Murray M.; Tucker, Matthew A.; Stickgold, Robert
2015-01-01
During wakefulness the brain creates meaningful relationships between disparate stimuli in ways that escape conscious awareness. Processes active during sleep can strengthen these relationships, leading to more adaptive use of those stimuli when encountered during subsequent wake. Performance on the weather prediction task (WPT), a well-studied measure of implicit probabilistic learning, has been shown to improve significantly following a night of sleep, with stronger initial learning predicting more nocturnal REM sleep. We investigated this relationship further, studying the effect on WPT performance of a daytime nap containing REM sleep. We also added an interference condition after the nap/wake period as an additional probe of memory strength. Our results show that a nap significantly boosts WPT performance, and that this improvement is correlated with the amount of REM sleep obtained during the nap. When interference training is introduced following the nap, however, this REM-sleep benefit vanishes. In contrast, following an equal period of wake, performance is both unchanged from training and unaffected by interference training. Thus, while the true probabilistic relationships between WPT stimuli are strengthened by sleep, these changes are selectively susceptible to the destructive effects of retroactive interference, at least in the short term. PMID:25769506
Probing the Small-scale Structure in Strongly Lensed Systems via Transdimensional Inference
NASA Astrophysics Data System (ADS)
Daylan, Tansu; Cyr-Racine, Francis-Yan; Diaz Rivero, Ana; Dvorkin, Cora; Finkbeiner, Douglas P.
2018-02-01
Strong lensing is a sensitive probe of the small-scale density fluctuations in the Universe. We implement a pipeline to model strongly lensed systems using probabilistic cataloging, which is a transdimensional, hierarchical, and Bayesian framework to sample from a metamodel (union of models with different dimensionality) consistent with observed photon count maps. Probabilistic cataloging allows one to robustly characterize modeling covariances within and across lens models with different numbers of subhalos. Unlike traditional cataloging of subhalos, it does not require model subhalos to improve the goodness of fit above the detection threshold. Instead, it allows the exploitation of all information contained in the photon count maps—for instance, when constraining the subhalo mass function. We further show that, by not including these small subhalos in the lens model, fixed-dimensional inference methods can significantly mismodel the data. Using a simulated Hubble Space Telescope data set, we show that the subhalo mass function can be probed even when many subhalos in the sample catalogs are individually below the detection threshold and would be absent in a traditional catalog. The implemented software, Probabilistic Cataloger (PCAT), is made publicly available at https://github.com/tdaylan/pcat.
Probabilistic Dynamic Buckling of Smart Composite Shells
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enable the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10 percent at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.
Probabilistic Dynamic Buckling of Smart Composite Shells
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2007-01-01
A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of intraply hybrid composite mechanics, finite element computer codes, and probabilistic analysis enable the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right next to the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.
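The "universal plot" idea in the buckling assessments above, load as a function of probability level, can be sketched by propagating assumed uncertainties through a toy stiffness scaling law and reading off percentiles; the relation and uncertainty magnitudes below are illustrative, not the reported composite mechanics models.

```python
import numpy as np

# Probabilistic buckling sketch: propagate uncertainty in fiber volume
# ratio, ply thickness, and modulus through a toy buckling relation and
# report the load at chosen probability levels. All inputs are assumed.

rng = np.random.default_rng(0)
n = 50_000
fvr = rng.normal(0.60, 0.03, n)       # fiber volume ratio
t   = rng.normal(1.25e-4, 5e-6, n)    # ply thickness, m
E   = rng.normal(150e9, 5e9, n)       # effective modulus, Pa

# Toy relation: buckling load scales with bending stiffness (E * t^3)
# and increases with fiber volume ratio.
load = E * t**3 * (fvr / 0.60)

for p in (0.001, 0.5, 0.999):
    print(f"P = {p:>5}: buckling load parameter "
          f"{np.percentile(load, 100 * p):.3e}")
```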
Steven P. Norman; Danny C. Lee; Sandra Jacobson; Christine Damiani
2010-01-01
The tradeoffs that surround forest management are inherently complex, often involving multiple temporal and spatial scales. For example, conflicts may result when fuel treatments are designed to mediate long-term fuel hazards, but activities could impair sensitive aquatic habitat or degrade wildlife habitat in the short term. This complexity makes it hard for managers...
ERIC Educational Resources Information Center
Paulsen, David J.; Woldorff, Marty G.; Brannon, Elizabeth M.
2010-01-01
The current study investigated the neural activity patterns associated with numerical sensitivity in adults. Event-related potentials (ERPs) were recorded while adults observed sequentially presented display arrays (S1 and S2) of non-symbolic numerical stimuli (dots) and made same/different judgments of these stimuli by pressing a button only when…
The Dynamics of Scaling: A Memory-Based Anchor Model of Category Rating and Absolute Identification
ERIC Educational Resources Information Center
Petrov, Alexander A.; Anderson, John R.
2005-01-01
A memory-based scaling model--ANCHOR--is proposed and tested. The perceived magnitude of the target stimulus is compared with a set of anchors in memory. Anchor selection is probabilistic and sensitive to similarity, base-level strength, and recency. The winning anchor provides a reference point near the target and thereby converts the global…
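A toy rendering of that selection rule: score each anchor by weighted similarity, base-level strength, and recency plus noise, and respond with the winner. The weights, noise level, and anchor set below are invented for illustration, not the published ANCHOR parameters.

```python
import numpy as np

# Toy sketch of ANCHOR-style probabilistic anchor selection: each anchor's
# activation combines similarity to the target stimulus, base-level
# strength, and recency; noise makes selection probabilistic.

rng = np.random.default_rng(5)
anchors  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # anchor magnitudes
strength = np.array([0.2, 0.5, 0.9, 0.5, 0.2])   # base-level strengths
recency  = np.array([0.0, 0.0, 0.0, 1.0, 0.0])   # 1 = used on the last trial

def respond(target, w_sim=2.0, w_str=1.0, w_rec=0.5, noise=0.3):
    similarity = -np.abs(anchors - target)       # closer => more similar
    activation = (w_sim * similarity + w_str * strength + w_rec * recency
                  + rng.normal(0.0, noise, anchors.size))
    return int(np.argmax(activation))            # index of the winning anchor

counts = np.bincount([respond(3.4) for _ in range(1000)], minlength=5)
print("response distribution over categories 1-5:", counts)
```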
2016-11-09
the model does not become a full probabilistic attack graph analysis of the network, whose data requirements are currently unrealistic. The second...flow. – Untrustworthy persons may intentionally try to exfiltrate known sensitive data to external networks. People may also unintentionally leak...section will provide details on the components, procedures, data requirements, and parameters required to instantiate the network porosity model. These
NASA Astrophysics Data System (ADS)
Olivia, G.; Santoso, A.; Prayogo, D. N.
2017-11-01
Nowadays, competition between supply chains is getting tighter, and a good coordination system between supply chain members is crucial in addressing it. This paper focuses on the development of a coordination model between a single supplier and multiple buyers in a supply chain. The proposed optimization model determines the optimal number of deliveries from the supplier to the buyers in order to minimize the total cost over a planning horizon. The components of the total supply chain cost are transportation costs, handling costs of the supplier and buyers, and stock-out costs. In the proposed model, the supplier can supply various types of items to retailers whose item demand patterns are probabilistic. Sensitivity analysis of the proposed model was conducted to test the effect of changes in transportation costs, handling costs and the production capacity of the supplier. The results showed that changes in transportation cost, handling costs and production capacity have a significant influence on the optimal number of deliveries for each item to the buyers.
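The delivery-frequency trade-off in such models is EOQ-like: more deliveries raise transport cost but cut cycle inventory and stock-out exposure. A sketch with assumed single-item, single-buyer inputs, including a fixed per-cycle stock-out probability as a simplification of the probabilistic demand in the paper:

```python
# Delivery-frequency optimization sketch: enumerate candidate numbers of
# deliveries over the horizon and pick the one minimizing transport +
# handling + expected stock-out cost. All inputs are assumptions.

DEMAND = 12_000      # units demanded over the planning horizon
COST_TRIP = 250.0    # transportation cost per delivery
HOLD = 2.0           # handling/holding cost per unit per horizon
STOCKOUT = 40.0      # penalty per stock-out event
P_STOCKOUT = 0.05    # assumed per-cycle stock-out probability

def total_cost(n):
    lot = DEMAND / n                       # units per delivery
    transport = COST_TRIP * n
    holding = HOLD * lot / 2               # average cycle inventory over horizon
    stockout = STOCKOUT * P_STOCKOUT * n   # one stock-out exposure per cycle
    return transport + holding + stockout

best = min(range(1, 101), key=total_cost)
print(f"Optimal number of deliveries: {best} "
      f"(total cost {total_cost(best):,.0f})")
```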
Reliability and Maintainability Data for Lead Lithium Cooling Systems
Cadwallader, Lee
2016-11-16
This article presents component failure rate data for use in assessments of lead lithium cooling systems. Best-estimate data applicable to this liquid metal coolant are presented. Repair times for similar components are also referenced in this work. These data support probabilistic safety assessment and reliability, availability, maintainability and inspectability analyses.
Predicting Improvement among University Counseling Center Clients.
ERIC Educational Resources Information Center
Lichtenberg, James W.; Hummel, Thomas J.
The fundamental question to which most clients want and deserve an answer is, "Am I going to get better (as a result of counseling)?" Although meta-analyses provide strong evidence supporting the efficacy of counseling in general, if one wants to make probabilistic statements about individual client outcomes--rather than about the more generalized…