Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-11-26
Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided.
Diaby, Vakaramoko; Goeree, Ron
2014-02-01
In recent years, the quest for more comprehensiveness, structure and transparency in reimbursement decision-making in healthcare has prompted the research into alternative decision-making frameworks. In this environment, multi-criteria decision analysis (MCDA) is arising as a valuable tool to support healthcare decision-making. In this paper, we present the main MCDA decision support methods (elementary methods, value-based measurement models, goal programming models and outranking models) using a case study approach. For each family of methods, an example of how an MCDA model would operate in a real decision-making context is presented from a critical perspective, highlighting the parameters setting, the selection of the appropriate evaluation model as well as the role of sensitivity and robustness analyses. This study aims to provide a step-by-step guide on how to use MCDA methods for reimbursement decision-making in healthcare.
Methods for Conducting Cognitive Task Analysis for a Decision Making Task.
1996-01-01
Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... Role of Risk Analysis in Decision-Making AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... documents entitled, ``Using Probabilistic Methods to Enhance the Role of Risk Analysis in Decision- Making... Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making, with Case Study Examples'' and...
Multicriteria decision analysis: Overview and implications for environmental decision making
Hermans, Caroline M.; Erickson, Jon D.; Erickson, Jon D.; Messner, Frank; Ring, Irene
2007-01-01
Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
Thokala, Praveen; Devlin, Nancy; Marsh, Kevin; Baltussen, Rob; Boysen, Meindert; Kalo, Zoltan; Longrenn, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Ijzerman, Maarten
2016-01-01
Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting, objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making and a set of techniques, known under the collective heading multiple criteria decision analysis (MCDA), are useful for this purpose. MCDA methods are widely used in other sectors, and recently there has been an increase in health care applications. In 2014, ISPOR established an MCDA Emerging Good Practices Task Force. It was charged with establishing a common definition for MCDA in health care decision making and developing good practice guidelines for conducting MCDA to aid health care decision making. This initial ISPOR MCDA task force report provides an introduction to MCDA - it defines MCDA; provides examples of its use in different kinds of decision making in health care (including benefit risk analysis, health technology assessment, resource allocation, portfolio decision analysis, shared patient clinician decision making and prioritizing patients' access to services); provides an overview of the principal methods of MCDA; and describes the key steps involved. Upon reviewing this report, readers should have a solid overview of MCDA methods and their potential for supporting health care decision making. Copyright © 2016. Published by Elsevier Inc.
Analytical methods for Multi-Criteria Decision Analysis (MCDA) support the non-monetary valuation of ecosystem services for environmental decision making. Many published case studies transform ecosystem service outcomes into a common metric and aggregate the outcomes to set land ...
Advancing Alternative Analysis: Integration of Decision Science
Zaunbrecher, Virginia M.; Batteate, Christina M.; Blake, Ann; Carroll, William F.; Corbett, Charles J.; Hansen, Steffen Foss; Lempert, Robert J.; Linkov, Igor; McFadden, Roger; Moran, Kelly D.; Olivetti, Elsa; Ostrom, Nancy K.; Romero, Michelle; Schoenung, Julie M.; Seager, Thomas P.; Sinsheimer, Peter; Thayer, Kristina A.
2017-01-01
Background: Decision analysis—a systematic approach to solving complex problems—offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. Objectives: We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. Methods: A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups’ findings. Results: We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. Conclusions: We advance four recommendations: a) engaging the systematic development and evaluation of decision approaches and tools; b) using case studies to advance the integration of decision analysis into alternatives analysis; c) supporting transdisciplinary research; and d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483 PMID:28669940
Decision Making Methods in Space Economics and Systems Engineering
NASA Technical Reports Server (NTRS)
Shishko, Robert
2006-01-01
This viewgraph presentation reviews various methods of decision making and the impact that they have on space economics and systems engineering. Some of the methods discussed are: Present Value and Internal Rate of Return (IRR); Cost-Benefit Analysis; Real Options; Cost-Effectiveness Analysis; Cost-Utility Analysis; Multi-Attribute Utility Theory (MAUT); and Analytic Hierarchy Process (AHP).
Decision modeling for fire incident analysis
Donald G. MacGregor; Armando González-Cabán
2009-01-01
This paper reports on methods for representing and modeling fire incidents based on concepts and models from the decision and risk sciences. A set of modeling techniques are used to characterize key fire management decision processes and provide a basis for incident analysis. The results of these methods can be used to provide insights into the structure of fire...
Decision Support Methods and Tools
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde r.; Sorokach, Michael R.; Burg, Cecile M.
2006-01-01
This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at the NASA LaRC. Where applicable, commercially available, or government developed software tools are also discussed
Risk analysis theory applied to fishing operations: A new approach on the decision-making problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cunha, J.C.S.
1994-12-31
In the past the decisions concerning whether to continue or interrupt a fishing operation were based primarily on the operator`s previous experience. This procedure often led to wrong decisions and unnecessary loss of money and time. This paper describes a decision-making method based on risk analysis theory and previous operation results from a field under study. The method leads to more accurate decisions on a daily basis allowing the operator to verify each day of the operation if the decision being carried out is the one with the highest probability to conduct to the best economical result. An example ofmore » the method application is provided at the end of the paper.« less
A fuzzy decision analysis method for integrating ecological indicators is developed. This is a combination of a fuzzy ranking method and the Analytic Hierarchy Process (AHP). The method is capable ranking ecosystems in terms of environmental conditions and suggesting cumula...
Real options analysis for land use management: Methods, application, and implications for policy.
Regan, Courtney M; Bryan, Brett A; Connor, Jeffery D; Meyer, Wayne S; Ostendorf, Bertram; Zhu, Zili; Bao, Chenming
2015-09-15
Discounted cash flow analysis, including net present value is an established way to value land use and management investments which accounts for the time-value of money. However, it provides a static view and assumes passive commitment to an investment strategy when real world land use and management investment decisions are characterised by uncertainty, irreversibility, change, and adaptation. Real options analysis has been proposed as a better valuation method under uncertainty and where the opportunity exists to delay investment decisions, pending more information. We briefly review the use of discounted cash flow methods in land use and management and discuss their benefits and limitations. We then provide an overview of real options analysis, describe the main analytical methods, and summarize its application to land use investment decisions. Real options analysis is largely underutilized in evaluating land use decisions, despite uncertainty in policy and economic drivers, the irreversibility and sunk costs involved. New simulation methods offer the potential for overcoming current technical challenges to implementation as demonstrated with a real options simulation model used to evaluate an agricultural land use decision in South Australia. We conclude that considering option values in future policy design will provide a more realistic assessment of landholder investment decision making and provide insights for improved policy performance. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Feinberg, A.; Miles, R. F., Jr.
1978-01-01
The principal concepts of the Keeney and Raiffa approach to multiattribute decision analysis are described. Topics discussed include the concepts of decision alternatives, outcomes, objectives, attributes and their states, attribute utility functions, and the necessary independence properties for the attribute states to be aggregated into a numerical representation of the preferences of the decision maker for the outcomes and decision alternatives.
ERIC Educational Resources Information Center
Knight, Jennifer L.
This paper considers some decisions that must be made by the researcher conducting an exploratory factor analysis. The primary purpose is to aid the researcher in making informed decisions during the factor analysis instead of relying on defaults in statistical programs or traditions of previous researchers. Three decision areas are addressed.…
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance and optimize rate determination of EGR, which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956
Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance and optimize rate determination of EGR, which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization.
Kriston, Levente; Meister, Ramona
2014-03-01
Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
2015-10-01
capability to meet the task to the standard under the condition, nothing more or less, else the funding is wasted . Also, that funding for the...bin to segregate gaps qualitatively before the gap value model determined preference among gaps within the bins. Computation of a gap’s...for communication, interpretation, or processing by humans or by automatic means (as it pertains to modeling and simulation). Delphi Method -- a
Fuzzy decision-making framework for treatment selection based on the combined QUALIFLEX-TODIM method
NASA Astrophysics Data System (ADS)
Ji, Pu; Zhang, Hong-yu; Wang, Jian-qiang
2017-10-01
Treatment selection is a multi-criteria decision-making problem of significant concern in the medical field. In this study, a fuzzy decision-making framework is established for treatment selection. The framework mitigates information loss by introducing single-valued trapezoidal neutrosophic numbers to denote evaluation information. Treatment selection has multiple criteria that remarkably exceed the alternatives. In consideration of this characteristic, the framework utilises the idea of the qualitative flexible multiple criteria method. Furthermore, it considers the risk-averse behaviour of a decision maker by employing a concordance index based on TODIM (an acronym in Portuguese of interactive and multi-criteria decision-making) method. A sensitivity analysis is performed to illustrate the robustness of the framework. Finally, a comparative analysis is conducted to compare the framework with several extant methods. Results indicate the advantages of the framework and its better performance compared with the extant methods.
Characterizing uncertain sea-level rise projections to support investment decisions.
Sriver, Ryan L; Lempert, Robert J; Wikman-Svahn, Per; Keller, Klaus
2018-01-01
Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly-localized and context dependent nature of applying Robust Decision Making methods to inform investment decisions.
Characterizing uncertain sea-level rise projections to support investment decisions
Lempert, Robert J.; Wikman-Svahn, Per; Keller, Klaus
2018-01-01
Many institutions worldwide are considering how to include uncertainty about future changes in sea-levels and storm surges into their investment decisions regarding large capital infrastructures. Here we examine how to characterize deeply uncertain climate change projections to support such decisions using Robust Decision Making analysis. We address questions regarding how to confront the potential for future changes in low probability but large impact flooding events due to changes in sea-levels and storm surges. Such extreme events can affect investments in infrastructure but have proved difficult to consider in such decisions because of the deep uncertainty surrounding them. This study utilizes Robust Decision Making methods to address two questions applied to investment decisions at the Port of Los Angeles: (1) Under what future conditions would a Port of Los Angeles decision to harden its facilities against extreme flood scenarios at the next upgrade pass a cost-benefit test, and (2) Do sea-level rise projections and other information suggest such conditions are sufficiently likely to justify such an investment? We also compare and contrast the Robust Decision Making methods with a full probabilistic analysis. These two analysis frameworks result in similar investment recommendations for different idealized future sea-level projections, but provide different information to decision makers and envision different types of engagement with stakeholders. In particular, the full probabilistic analysis begins by aggregating the best scientific information into a single set of joint probability distributions, while the Robust Decision Making analysis identifies scenarios where a decision to invest in near-term response to extreme sea-level rise passes a cost-benefit test, and then assembles scientific information of differing levels of confidence to help decision makers judge whether or not these scenarios are sufficiently likely to justify making such investments. Results highlight the highly-localized and context dependent nature of applying Robust Decision Making methods to inform investment decisions. PMID:29414978
Decision curve analysis: a novel method for evaluating prediction models.
Vickers, Andrew J; Elkin, Elena B
2006-01-01
Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences,requires only the data set on which the models are tested,and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities in which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies that has advantages over other commonly used measures and techniques.
Liem T. Tran; C. Gregory Knight; Robert V. O' Neill; Elizabeth R. Smith; Kurt H. Riitters; James D. Wickham
2002-01-01
A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams,...
Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E.; Harper, Martin
2015-01-01
This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451
Decerns: A framework for multi-criteria decision analysis
Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...
2015-02-27
A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems on risk management is introduced. Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. As a result, these MCDA methods are described along with a case study on analysis of multicriteria location problem.
Value of information analysis in healthcare: a review of principles and applications.
Tuffaha, Haitham W; Gordon, Louisa G; Scuffham, Paul A
2014-06-01
Economic evaluations are increasingly utilized to inform decisions in healthcare; however, decisions remain uncertain when they are not based on adequate evidence. Value of information (VOI) analysis has been proposed as a systematic approach to measure decision uncertainty and assess whether there is sufficient evidence to support new technologies. The objective of this paper is to review the principles and applications of VOI analysis in healthcare. Relevant databases were systematically searched to identify VOI articles. The findings from the selected articles were summarized and narratively presented. Various VOI methods have been developed and applied to inform decision-making, optimally designing research studies and setting research priorities. However, the application of this approach in healthcare remains limited due to technical and policy challenges. There is a need to create more awareness about VOI analysis, simplify its current methods, and align them with the needs of decision-making organizations.
A method for studying decision-making by guideline development groups.
Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan
2009-08-05
Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best-suited to capturing influences on GDG decision-making. A research team comprised of three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. This method is currently being applied to study the meetings of three of NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.
Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis
2016-03-01
Comparative decision making process is widely used to identify which option (system, product, service, etc.) has smaller environmental footprints and for providing recommendations that help stakeholders take future decisions. However, the uncertainty problem complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability which expresses the probability of a option to have a smaller environmental impact than the one of another option. Here we apply the reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders as well as it reduces the computational time.
The effect of uncertainties in distance-based ranking methods for multi-criteria decision making
NASA Astrophysics Data System (ADS)
Jaini, Nor I.; Utyuzhnikov, Sergei V.
2017-08-01
Data in the multi-criteria decision making are often imprecise and changeable. Therefore, it is important to carry out sensitivity analysis test for the multi-criteria decision making problem. The paper aims to present a sensitivity analysis for some ranking techniques based on the distance measures in multi-criteria decision making. Two types of uncertainties are considered for the sensitivity analysis test. The first uncertainty is related to the input data, while the second uncertainty is towards the Decision Maker preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance and trade-off ranking methods. TOPSIS and the relative distance method measure a distance from an alternative to the ideal and antiideal solutions. In turn, the trade-off ranking calculates a distance of an alternative to the extreme solutions and other alternatives. Several test cases are considered to study the performance of each ranking technique in both types of uncertainties.
1987-12-01
were presented. The second part of the thesis proposed the alternative methods of decision analysis and PROMETHEE to solve TAF’s . prioritization...of decision analysis (DA) and Preference Ranking Orqanization Method for Enrichment Evaluations ( PROMETHEE ) will be explained. First, the...dollars. However, once this task is successfully accomplished, TAF would be able to use DA to prioritize their mods. The PROMETHEE is a "new class of
A Chaotic Ordered Hierarchies Consistency Analysis Performance Evaluation Model
NASA Astrophysics Data System (ADS)
Yeh, Wei-Chang
2013-02-01
The Hierarchies Consistency Analysis (HCA) is proposed by Guh in-cooperated along with some case study on a Resort to reinforce the weakness of Analytical Hierarchy Process (AHP). Although the results obtained enabled aid for the Decision Maker to make more reasonable and rational verdicts, the HCA itself is flawed. In this paper, our objective is to indicate the problems of HCA, and then propose a revised method called chaotic ordered HCA (COH in short) which can avoid problems. Since the COH is based upon Guh's method, the Decision Maker establishes decisions in a way similar to that of the original method.
Development of the Expert System Domain Advisor and Analysis Tool
1991-09-01
analysis. Typical of the current methods in use at this time is the " tarot metric". This method defines a decision rule whose output is whether to go...B - TAROT METRIC B. ::TTRODUCTION The system chart of ESEM, Figure 1, shows the following three risk-based decision points: i. At prolect initiation...34 decisions. B-I 201 PRELIMINARY T" B-I. Evaluais Factan for ES Deyelopsineg FACTORS POSSIBLE VALUE RATINGS TAROT metric (overall suitability) Poor, Fair
Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.
Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor
2011-09-01
Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology to face available technical information and stakeholder values to support decisions in many fields and can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area, decision or intervention type. In addition, the papers were also classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there is a significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of the specific methods and tools varies in different application areas and geographic regions, our review of a few papers where several methods were used in parallel with the same problem indicates that recommended course of action does not vary significantly with the method applied. Published by Elsevier B.V.
Issue a Boil-Water Advisory or Wait for Definitive Information? A Decision Analysis
Wagner, Michael M.; Wallstrom, Garrick L.; Onisko, Agnieszka
2005-01-01
Objective Study the decision to issue a boil-water advisory in response to a spike in sales of diarrhea remedies or wait 72 hours for the results of definitive testing of water and people. Methods Decision analysis. Results In the base-case analysis, the optimal decision is test-and-wait. If the cost of issuing a boil-water advisory is less than 13.92 cents per person per day, the optimal decision is to issue the boil-water advisory immediately. Conclusions Decisions based on surveillance data that are suggestive but not conclusive about the existence of a disease outbreak can be modeled. PMID:16779145
Decision problems in management of construction projects
NASA Astrophysics Data System (ADS)
Szafranko, E.
2017-10-01
In a construction business, one must oftentimes make decisions during all stages of a building process, from planning a new construction project through its execution to the stage of using a ready structure. As a rule, the decision making process is made more complicated due to certain conditions specific for civil engineering. With such diverse decision situations, it is recommended to apply various decision making support methods. Both, literature and hands-on experience suggest several methods based on analytical and computational procedures, some less and some more complex. This article presents the methods which can be helpful in supporting decision making processes in the management of civil engineering projects. These are multi-criteria methods, such as MCE, AHP or indicator methods. Because the methods have different advantages and disadvantages, whereas decision situations have their own specific nature, a brief summary of the methods alongside some recommendations regarding their practical applications has been given at the end of the paper. The main aim of this article is to review the methods of decision support and their analysis for possible use in the construction industry.
Decision-Making Phenomena Described by Expert Nurses Working in Urban Community Health Settings.
ERIC Educational Resources Information Center
Watkins, Mary P.
1998-01-01
Expert community health nurses (n=28) described crucial clinical situations. Content analysis revealed that decision making was both rational and intuitive. Eight themes were identified: decision-making focus, type, purpose, decision-maker characteristics, sequencing of events, data collection methods, facilitators/barriers, and decision-making…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowe, M.D.; Pierce, B.L.
This report presents results of tests of different final site selection methods used for siting large-scale facilities such as nuclear power plants. Test data are adapted from a nuclear power plant siting study conducted on Long Island, New York. The purpose of the tests is to determine whether or not different final site selection methods produce different results, and to obtain some understanding of the nature of any differences found. Decision rules and weighting methods are included. Decision rules tested are Weighting Summation, Power Law, Decision Analysis, Goal Programming, and Goal Attainment; weighting methods tested are Categorization, Ranking, Rating Ratiomore » Estimation, Metfessel Allocation, Indifferent Tradeoff, Decision Analysis lottery, and Global Evaluation. Results show that different methods can, indeed, produce different results, but that the probability that they will do so is controlled by the structure of differences among the sites being evaluated. Differences in weights and suitability scores attributable to methods have reduced significance if the alternatives include one or two sites that are superior to all others in many attributes. The more tradeoffs there are among good and bad levels of different attributes at different sites, the more important are the specifics of methods to the final decision. 5 refs., 14 figs., 19 tabs.« less
Finnveden, Göran; Björklund, Anna; Moberg, Asa; Ekvall, Tomas
2007-06-01
A large number of methods and approaches that can be used for supporting waste management decisions at different levels in society have been developed. In this paper an overview of methods is provided and preliminary guidelines for the choice of methods are presented. The methods introduced include: Environmental Impact Assessment, Strategic Environmental Assessment, Life Cycle Assessment, Cost-Benefit Analysis, Cost-effectiveness Analysis, Life-cycle Costing, Risk Assessment, Material Flow Accounting, Substance Flow Analysis, Energy Analysis, Exergy Analysis, Entropy Analysis, Environmental Management Systems, and Environmental Auditing. The characteristics used are the types of impacts included, the objects under study and whether the method is procedural or analytical. The different methods can be described as systems analysis methods. Waste management systems thinking is receiving increasing attention. This is, for example, evidenced by the suggested thematic strategy on waste by the European Commission where life-cycle analysis and life-cycle thinking get prominent positions. Indeed, life-cycle analyses have been shown to provide policy-relevant and consistent results. However, it is also clear that the studies will always be open to criticism since they are simplifications of reality and include uncertainties. This is something all systems analysis methods have in common. Assumptions can be challenged and it may be difficult to generalize from case studies to policies. This suggests that if decisions are going to be made, they are likely to be made on a less than perfect basis.
Sensitivity Analysis in Sequential Decision Models.
Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet
2017-02-01
Sequential decision problems are frequently encountered in medical decision making, which are commonly solved using Markov decision processes (MDPs). Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against the uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically in the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in that for a given willingness to pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
Kleinhans, Sonja; Herrmann, Eva; Kohnen, Thomas; Bühren, Jens
2017-08-15
Background Iatrogenic keratectasia is one of the most dreaded complications of refractive surgery. In most cases, keratectasia develops after refractive surgery of eyes suffering from subclinical stages of keratoconus with few or no signs. Unfortunately, there has been no reliable procedure for the early detection of keratoconus. In this study, we used binary decision trees (recursive partitioning) to assess their suitability for discrimination between normal eyes and eyes with subclinical keratoconus. Patients and Methods The method of decision tree analysis was compared with discriminant analysis which has shown good results in previous studies. Input data were 32 eyes of 32 patients with newly diagnosed keratoconus in the contralateral eye and preoperative data of 10 eyes of 5 patients with keratectasia after laser in-situ keratomileusis (LASIK). The control group was made up of 245 normal eyes after LASIK and 12-month follow-up without any signs of iatrogenic keratectasia. Results Decision trees gave better accuracy and specificity than did discriminant analysis. The sensitivity of decision trees was lower than the sensitivity of discriminant analysis. Conclusion On the basis of the patient population of this study, decision trees did not prove to be superior to linear discriminant analysis for the detection of subclinical keratoconus. Georg Thieme Verlag KG Stuttgart · New York.
Application of Grey Relational Analysis to Decision-Making during Product Development
ERIC Educational Resources Information Center
Hsiao, Shih-Wen; Lin, Hsin-Hung; Ko, Ya-Chuan
2017-01-01
A multi-attribute decision-making (MADM) approach was proposed in this study as a prediction method that differs from the conventional production and design methods for a product. When a client has different dimensional requirements, this approach can quickly provide a company with design decisions for each product. The production factors of a…
21 CFR 2.19 - Methods of analysis.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...
21 CFR 2.19 - Methods of analysis.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...
Developing a Value Framework: The Need to Reflect the Opportunity Costs of Funding Decisions.
Sculpher, Mark; Claxton, Karl; Pearson, Steven D
2017-02-01
A growing number of health care systems internationally use formal economic evaluation methods to support health care funding decisions. Recently, a range of organizations have been advocating forms of analysis that have been termed "value frameworks." There has also been a push for analytical methods to reflect a fuller range of benefits of interventions through multicriteria decision analysis. A key principle that is invariably neglected in current and proposed frameworks is the need to reflect evidence on the opportunity costs that health systems face when making funding decisions. The mechanisms by which opportunity costs are realized vary depending on the system's financial arrangements, but they always mean that a decision to fund a specific intervention for a particular patient group has the potential to impose costs on others in terms of forgone benefits. These opportunity costs are rarely explicitly reflected in analysis to support decisions, but recent developments to quantify benefits forgone make more appropriate analyses feasible. Opportunity costs also need to be reflected in decisions if a broader range of attributes of benefit is considered, and opportunity costs are a key consideration in determining the appropriate level of total expenditure in a system. The principles by which opportunity costs can be reflected in analysis are illustrated in this article by using the example of the proposed methods for value-based pricing in the United Kingdom. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Advancing Alternative Analysis: Integration of Decision Science.
Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A
2017-06-13
Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups' findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: a ) engaging the systematic development and evaluation of decision approaches and tools; b ) using case studies to advance the integration of decision analysis into alternatives analysis; c ) supporting transdisciplinary research; and d ) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.
75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... 2010 version of the Causal Analysis/Diagnosis Decision Information System (CADDIS). This Web site was... methods; information on basic and advanced data analyses; downloadable software tools; and an online... ENVIRONMENTAL PROTECTION AGENCY [FRL-9206-7] 2010 Release of CADDIS (Causal Analysis/Diagnosis...
Dolan, James G
2010-01-01
Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients along with health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers.Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine "hard data" with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings.The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP).
Dolan, James G.
2010-01-01
Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients along with health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine “hard data” with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP) PMID:21394218
21 CFR 2.19 - Methods of analysis.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...
21 CFR 2.19 - Methods of analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...
21 CFR 2.19 - Methods of analysis.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Methods of analysis. 2.19 Section 2.19 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis...
A decision-analytic approach to predict state regulation of hydraulic fracturing.
Linkov, Igor; Trump, Benjamin; Jin, David; Mazurczak, Marcin; Schreurs, Miranda
2014-01-01
The development of horizontal drilling and hydraulic fracturing methods has dramatically increased the potential for the extraction of previously unrecoverable natural gas. Nonetheless, the potential risks and hazards associated with such technologies are not without controversy and are compounded by frequently changing information and an uncertain landscape of international politics and laws. Where each nation has its own energy policies and laws, predicting how a state with natural gas reserves that require hydraulic fracturing will regulate the industry is of paramount importance for potential developers and extractors. We present a method for predicting hydraulic fracturing decisions using multiple-criteria decision analysis. The case study evaluates the decisions of five hypothetical countries with differing political, social, environmental, and economic priorities, choosing among four policy alternatives: open hydraulic fracturing, limited hydraulic fracturing, completely banned hydraulic fracturing, and a cap and trade program. The result is a model that identifies the preferred policy alternative for each archetypal country and demonstrates the sensitivity the decision to particular metrics. Armed with such information, observers can predict each country's likely decisions related to natural gas exploration as more data become available or political situations change. Decision analysis provides a method to manage uncertainty and address forecasting concerns where rich and objective data may be lacking. For the case of hydraulic fracturing, the various political pressures and extreme uncertainty regarding the technology's risks and benefits serve as a prime platform to demonstrate how decision analysis can be used to predict future behaviors.
French, Rebecca S; Cowan, Frances M; Wellings, Kaye; Dowie, Jack
2014-04-01
My Contraception Tool (MCT) applies the principles of multi-criteria decision analysis to the choice of contraceptive method. Its purpose is to make the decision-making process transparent to the user and to suggest a method to them based on their own preferences. The contraceptive option that emerges as optimal from the analysis takes account of the probability of a range of outcomes and the relative weight ascribed to them by the user. The development of MCT was a collaborative project between London School of Hygiene & Tropical Medicine, Brook, FPA and Maldaba Ltd. MCT is available online via the Brook and FPA websites. In this article we describe MCT's development and how it works. Further work is needed to assess the impact it has on decision quality and contraceptive behaviour.
The Potential for Meta-Analysis to Support Decision Analysis in Ecology
ERIC Educational Resources Information Center
Mengersen, Kerrie; MacNeil, M. Aaron; Caley, M. Julian
2015-01-01
Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable…
Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction
NASA Astrophysics Data System (ADS)
Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad
2018-03-01
In making bearing procurement analysis, price and its reliability must be considered as decision criteria, since price determines the direct cost as acquisition cost and reliability of bearing determine the indirect cost such as maintenance cost. Despite the indirect cost is hard to identify and measured, it has high contribution to overall cost that will be incurred. So, the indirect cost of reliability must be considered when making bearing procurement analysis. This paper tries to explain bearing evaluation method with the total cost of ownership analysis to consider price and maintenance cost as decision criteria. Furthermore, since there is a lack of failure data when bearing evaluation phase is conducted, reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, bearing with a higher price but has higher reliability is preferable for long-term planning. But for short-term planning the cheaper one but has lower reliability is preferable. This contextuality can give rise to conflict between stakeholders. Thus, the planning horizon needs to be agreed by all stakeholder before making a procurement decision.
Strategic planning decision making using fuzzy SWOT-TOPSIS with reliability factor
NASA Astrophysics Data System (ADS)
Mohamad, Daud; Afandi, Nur Syamimi; Kamis, Nor Hanimah
2015-10-01
Strategic planning is a process of decision making and action for long-term activities in an organization. The Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis has been commonly used to help organizations in strategizing their future direction by analyzing internal and external environment. However, SWOT analysis has some limitations as it is unable to prioritize appropriately the multiple alternative strategic decisions. Some efforts have been made to solve this problem by incorporating Multi Criteria Decision Making (MCDM) methods. Nevertheless, another important aspect has raised concerns on obtaining the decision that is the reliability of the information. Decision makers evaluate differently depending on their level of confidence or sureness in the evaluation. This study proposes a decision making procedure for strategic planning using SWOT-TOPSIS method by incorporating the reliability factor of the evaluation based on Z-number. An example using a local authority in the east coast of Malaysia is illustrated to determine the strategic options ranking and to prioritize factors in each SWOT category.
Volk, Michael L; Lok, Anna S F; Ubel, Peter A; Vijan, Sandeep
2008-01-01
The utilitarian foundation of decision analysis limits its usefulness for many social policy decisions. In this study, the authors examine a method to incorporate competing ethical principles in a decision analysis of liver transplantation for a patient with acute liver failure (ALF). A Markov model was constructed to compare the benefit of transplantation for a patient with ALF versus the harm caused to other patients on the waiting list and to determine the lowest acceptable 5-y posttransplant survival for the ALF patient. The weighting of the ALF patient and other patients was then adjusted using a multiattribute variable incorporating utilitarianism, urgency, and other principles such as fair chances. In the base-case analysis, the strategy of transplanting the ALF patient resulted in a 0.8% increase in the risk of death and a utility loss of 7.8 quality-adjusted days of life for each of the other patients on the waiting list. These harms cumulatively outweighed the benefit of transplantation for an ALF patient having a posttransplant survival of less than 48% at 5 y. However, the threshold for an acceptable posttransplant survival for the ALF patient ranged from 25% to 56% at 5 y, depending on the ethical principles involved. The results of the decision analysis vary depending on the ethical perspective. This study demonstrates how competing ethical principles can be numerically incorporated in a decision analysis.
Alves-Pinto, A.; Sollini, J.; Sumner, C.J.
2012-01-01
Signal detection theory (SDT) provides a framework for interpreting psychophysical experiments, separating the putative internal sensory representation and the decision process. SDT was used to analyse ferret behavioural responses in a (yes–no) tone-in-noise detection task. Instead of measuring the receiver-operating characteristic (ROC), we tested SDT by comparing responses collected using two common psychophysical data collection methods. These (Constant Stimuli, Limits) differ in the set of signal levels presented within and across behavioural sessions. The results support the use of SDT as a method of analysis: SDT sensory component was unchanged between the two methods, even though decisions depended on the stimuli presented within a behavioural session. Decision criterion varied trial-by-trial: a ‘yes’ response was more likely after a correct rejection trial than a hit trial. Simulation using an SDT model with several decision components reproduced the experimental observations accurately, leaving only ∼10% of the variance unaccounted for. The model also showed that trial-by-trial dependencies were unlikely to influence measured psychometric functions or thresholds. An additional model component suggested that inattention did not contribute substantially. Further analysis showed that ferrets were changing their decision criteria, almost optimally, to maximise the reward obtained in a session. The data suggest trial-by-trial reward-driven optimization of the decision process. Understanding the factors determining behavioural responses is important for correlating neural activity and behaviour. SDT provides a good account of animal psychoacoustics, and can be validated using standard psychophysical methods and computer simulations, without recourse to ROC measurements. PMID:22698686
A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Lund, Jay R.
2011-05-01
Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used to prescribe non-dominated solutions and also can be used as a method to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods which assume perfect cooperation among decision makers. To deal with the uncertainty in input variables a Monte-Carlo Game Theory (MCGT) approach is suggested which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions and the results include possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, developing a compound decision objective, and accurate quantitative (cardinal) information as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, reducing the computational burden substantially. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
2011-01-01
Background Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and into a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. Methods We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. Results We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome, and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Eventually, we expose that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. Conclusions We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application. PMID:21696604
Fuzzy rationality and parameter elicitation in decision analysis
NASA Astrophysics Data System (ADS)
Nikolova, Natalia D.; Tenekedjiev, Kiril I.
2010-07-01
It is widely recognised by decision analysts that real decision-makers always make estimates in an interval form. An overview of techniques to find an optimal alternative among such with imprecise and interval probabilities is presented. Scalarisation methods are outlined as most appropriate. A proper continuation of such techniques is fuzzy rational (FR) decision analysis. A detailed representation of the elicitation process influenced by fuzzy rationality is given. The interval character of probabilities leads to the introduction of ribbon functions, whose general form and special cases are compared with the p-boxes. As demonstrated, approximation of utilities in FR decision analysis does not depend on the probabilities, but the approximation of probabilities is dependent on preferences.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
Pieterse, Arwen H; de Vries, Marieke
2013-09-01
Increasingly, patient decision aids and values clarification methods (VCMs) are being developed to support patients in making preference-sensitive health-care decisions. Many VCMs encourage extensive deliberation about options, without solid theoretical or empirical evidence showing that deliberation is advantageous. Research suggests that simple, fast and frugal heuristic decision strategies sometimes result in better judgments and decisions. Durand et al. have developed two fast and frugal heuristic-based VCMs. To critically analyse the suitability of the 'take the best' (TTB) and 'tallying' fast and frugal heuristics in the context of patient decision making. Analysis of the structural similarities between the environments in which the TTB and tallying heuristics have been proven successful and the context of patient decision making and of the potential of these heuristic decision processes to support patient decision making. The specific nature of patient preference-sensitive decision making does not seem to resemble environments in which the TTB and tallying heuristics have proven successful. Encouraging patients to consider less rather than more relevant information potentially even deteriorates their values clarification process. Values clarification methods promoting the use of more intuitive decision strategies may sometimes be more effective. Nevertheless, we strongly recommend further theoretical thinking about the expected value of such heuristics and of other more intuitive decision strategies in this context, as well as empirical assessments of the mechanisms by which inducing such decision strategies may impact the quality and outcome of values clarification. © 2011 John Wiley & Sons Ltd.
Pieterse, Arwen H.; de Vries, Marieke
2011-01-01
Abstract Background Increasingly, patient decision aids and values clarification methods (VCMs) are being developed to support patients in making preference‐sensitive health‐care decisions. Many VCMs encourage extensive deliberation about options, without solid theoretical or empirical evidence showing that deliberation is advantageous. Research suggests that simple, fast and frugal heuristic decision strategies sometimes result in better judgments and decisions. Durand et al. have developed two fast and frugal heuristic‐based VCMs. Objective To critically analyse the suitability of the ‘take the best’ (TTB) and ‘tallying’ fast and frugal heuristics in the context of patient decision making. Strategy Analysis of the structural similarities between the environments in which the TTB and tallying heuristics have been proven successful and the context of patient decision making and of the potential of these heuristic decision processes to support patient decision making. Conclusion The specific nature of patient preference‐sensitive decision making does not seem to resemble environments in which the TTB and tallying heuristics have proven successful. Encouraging patients to consider less rather than more relevant information potentially even deteriorates their values clarification process. Values clarification methods promoting the use of more intuitive decision strategies may sometimes be more effective. Nevertheless, we strongly recommend further theoretical thinking about the expected value of such heuristics and of other more intuitive decision strategies in this context, as well as empirical assessments of the mechanisms by which inducing such decision strategies may impact the quality and outcome of values clarification. PMID:21902770
Restoring and Managing Gulf of Mexico Fisheries: A Path Toward Creative Decision-Making
This chapter introduces decision analysis concepts with examples for managing fisheries. Decision analytic methods provide useful tools for structuring environmental management problems and separating technical judgments from preference judgments to better weigh the prospects fro...
Decision Analysis for a Sustainable Environment, Economy & Society
Environmental decisions are often made without consideration of the roles that ecosystem services play. Most decision-makers do not currently have access to useful or usable methods and approaches when they are presented with choices that will have significant ecosystem impacts. ...
Decision Analysis For A Sustainable Environment, Economy, & Society
Environmental decisions are often made without consideration of the roles that ecosystem services play. Most decision-makers do not currently have access to useful or usable methods and approaches when they are presented with choices that will have significant ecosystem impacts....
The potential for meta-analysis to support decision analysis in ecology.
Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian
2015-06-01
Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.
EEG feature selection method based on decision tree.
Duan, Lijuan; Ge, Hui; Ma, Wei; Miao, Jun
2015-01-01
This paper aims to solve automated feature selection problem in brain computer interface (BCI). In order to automate feature selection process, we proposed a novel EEG feature selection method based on decision tree (DT). During the electroencephalogram (EEG) signal processing, a feature extraction method based on principle component analysis (PCA) was used, and the selection process based on decision tree was performed by searching the feature space and automatically selecting optimal features. Considering that EEG signals are a series of non-linear signals, a generalized linear classifier named support vector machine (SVM) was chosen. In order to test the validity of the proposed method, we applied the EEG feature selection method based on decision tree to BCI Competition II datasets Ia, and the experiment showed encouraging results.
Psychophysical Models for Signal Detection with Time Varying Uncertainty. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gai, E.
1975-01-01
Psychophysical models for the behavior of the human operator in detection tasks which include change in detectability, correlation between observations and deferred decisions are developed. Classical Signal Detection Theory (SDT) is discussed and its emphasis on the sensory processes is contrasted to decision strategies. The analysis of decision strategies utilizes detection tasks with time varying signal strength. The classical theory is modified to include such tasks and several optimal decision strategies are explored. Two methods of classifying strategies are suggested. The first method is similar to the analysis of ROC curves, while the second is based on the relation between the criterion level (CL) and the detectability. Experiments to verify the analysis of tasks with changes of signal strength are designed. The results show that subjects are aware of changes in detectability and tend to use strategies that involve changes in the CL's.
Analysis of complex decisionmaking processes. [with application to jet engine development
NASA Technical Reports Server (NTRS)
Hill, J. D.; Ollila, R. G.
1978-01-01
The analysis of corporate decisionmaking processes related to major system developments is unusually difficult because of the number of decisionmakers involved in the process and the long development cycle. A method for analyzing such decision processes is developed and illustrated through its application to the analysis of the commercial jet engine development process. The method uses interaction matrices as the key tool for structuring the problem, recording data, and analyzing the data to establish the rank order of the major factors affecting development decisions. In the example, the use of interaction matrices permitted analysts to collect and analyze approximately 50 factors that influenced decisions during the four phases of the development cycle, and to determine the key influencers of decisions at each development phase. The results of this study indicate that the cost of new technology installed on an aircraft is the prime concern of the engine manufacturer.
NASA Technical Reports Server (NTRS)
Greenberg, Marc W.; Laing, William
2013-01-01
An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.
Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard
2015-01-01
Objective In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. Methods The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers’ (DMs) preferences using the principal component analysis; and the estimation of criteria weights and their descriptive statistics using the variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. Results The criteria were ranked from 1–5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated as well as the standard deviation and 95% credibility interval. Conclusions ELICIT is appropriate in situations where only ordinal DMs’ preferences are available to elicit decision criteria weights. PMID:26361235
Tools, information sources, and methods used in deciding on drug availability in HMOs.
Barner, J C; Thomas, J
1998-01-01
The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.
Analysis of methods of processing of expert information by optimization of administrative decisions
NASA Astrophysics Data System (ADS)
Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.
2018-03-01
In the real operation the measure definition methodology in case of expert estimation of quality and reliability of application-oriented software products is offered. In operation methods of aggregation of expert estimates on the example of a collective choice of an instrumental control projects in case of software development of a special purpose for needs of institutions are described. Results of operation of dialogue decision making support system are given an algorithm of the decision of the task of a choice on the basis of a method of the analysis of hierarchies and also. The developed algorithm can be applied by development of expert systems to the solution of a wide class of the tasks anyway connected to a multicriteria choice.
Combining conversation analysis and event sequencing to study health communication.
Pecanac, Kristen E
2018-06-01
Good communication is essential in patient-centered care. The purpose of this paper is to describe conversation analysis and event sequencing and explain how integrating these methods strengthened the analysis in a study of communication between clinicians and surrogate decision makers in an intensive care unit. Conversation analysis was first used to determine how clinicians introduced the need for decision-making regarding life-sustaining treatment and how surrogate decision makers responded. Event sequence analysis then was used to determine the transitional probability (probability of one event leading to another in the interaction) that a given type of clinician introduction would lead to surrogate resistance or alignment. Conversation analysis provides a detailed analysis of the interaction between participants in a conversation. When combined with a quantitative analysis of the patterns of communication in an interaction, these data add information on the communication strategies that produce positive outcomes. Researchers can apply this mixed-methods approach to identify beneficial conversational practices and design interventions to improve health communication. © 2018 Wiley Periodicals, Inc.
Constantinou, Anthony Costa; Yet, Barbaros; Fenton, Norman; Neil, Martin; Marsh, William
2016-01-01
Inspired by real-world examples from the forensic medical sciences domain, we seek to determine whether a decision about an interventional action could be subject to amendments on the basis of some incomplete information within the model, and whether it would be worthwhile for the decision maker to seek further information prior to suggesting a decision. The method is based on the underlying principle of Value of Information to enhance decision analysis in interventional and counterfactual Bayesian networks. The method is applied to two real-world Bayesian network models (previously developed for decision support in forensic medical sciences) to examine the average gain in terms of both Value of Information (average relative gain ranging from 11.45% and 59.91%) and decision making (potential amendments in decision making ranging from 0% to 86.8%). We have shown how the method becomes useful for decision makers, not only when decision making is subject to amendments on the basis of some unknown risk factors, but also when it is not. Knowing that a decision outcome is independent of one or more unknown risk factors saves us from the trouble of seeking information about the particular set of risk factors. Further, we have also extended the assessment of this implication to the counterfactual case and demonstrated how answers about interventional actions are expected to change when some unknown factors become known, and how useful this becomes in forensic medical science. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Wei; Reddy, T. A.; Gurian, Patrick
2007-01-31
A companion paper to Jiang and Reddy that presents a general and computationally efficient methodology for dyanmic scheduling and optimal control of complex primary HVAC&R plants using a deterministic engineering optimization approach.
Development of an evidence-based decision pathway for vestibular schwannoma treatment options.
Linkov, Faina; Valappil, Benita; McAfee, Jacob; Goughnour, Sharon L; Hildrew, Douglas M; McCall, Andrew A; Linkov, Igor; Hirsch, Barry; Snyderman, Carl
To integrate multiple sources of clinical information with patient feedback to build evidence-based decision support model to facilitate treatment selection for patients suffering from vestibular schwannomas (VS). This was a mixed methods study utilizing focus group and survey methodology to solicit feedback on factors important for making treatment decisions among patients. Two 90-minute focus groups were conducted by an experienced facilitator. Previously diagnosed VS patients were recruited by clinical investigators at the University of Pittsburgh Medical Center (UPMC). Classical content analysis was used for focus group data analysis. Providers were recruited from practices within the UPMC system and were surveyed using Delphi methods. This information can provide a basis for multi-criteria decision analysis (MCDA) framework to develop a treatment decision support system for patients with VS. Eight themes were derived from these data (focus group + surveys): doctor/health care system, side effects, effectiveness of treatment, anxiety, mortality, family/other people, quality of life, and post-operative symptoms. These data, as well as feedback from physicians were utilized in building a multi-criteria decision model. The study illustrated steps involved in the development of a decision support model that integrates evidence-based data and patient values to select treatment alternatives. Studies focusing on the actual development of the decision support technology for this group of patients are needed, as decisions are highly multifactorial. Such tools have the potential to improve decision making for complex medical problems with alternate treatment pathways. Copyright © 2016 Elsevier Inc. All rights reserved.
Decision Making About Method of Delivery on the U.S.–Mexico Border
DESISTO, CARLA L.; McDONALD, JILL A.; ROCHAT, ROGER; DIAZ-APODACA, BEATRIZ A.; DECLERCQ, EUGENE
2015-01-01
We explored how low-risk, nulliparous pregnant women and their doctors in two contiguous U.S.–Mexico border communities communicate about methods of delivery and how they perceive that the delivery method decision is made. We recruited 18 women through obstetricians in El Paso, Texas (n = 10), and prenatal care providers in Ciudad Juárez, Mexico (n = 8). We observed prenatal care visits, interviewed women prenatally and postpartum, and interviewed the El Paso obstetricians. Qualitative analysis demonstrated that birthing decisions are complex and involve multiple influences, including women's level of knowledge about birth, doctor–patient communication, and women's participation in decision making. PMID:25364879
McCoy, S; Blayney-Chandramouli, J; Mutnick, A
1998-12-15
A formulary decision at a health care institution was studied by using two pharmacoeconomic methods. A pharmacoeconomic study was undertaken to assess the impact of a 1995 formulary decision to designate cimetidine as the primary histamine H2-receptor antagonist (H2RA) and to restrict the use of famotidine. Consecutive patients receiving either i.v. cimetidine or famotidine for stress ulcer prophylaxis were reviewed during a two-month period in 1997, and information on demographics, dosage and duration of H2RA therapy, admission date, laboratory test values, and adverse drug reactions was collected. Data for 62 patients (43 cimetidine recipients and 19 famotidine recipients) were evaluated. Therapy was categorized as successful or failed, and the data were then evaluated by decision analysis to evaluate the cost-effectiveness of the agents and by multiattribute utility theory (MAUT) to incorporate a humanistic evaluation of the treatments, namely, the number of doses administered and the number of times dosages were changed. The decision tree revealed that the average cost of receiving cimetidine was $82.01 and the average cost of famotidine therapy was $92.45. The MAUT analysis showed that cimetidine was the preferred agent as long as cost was valued at greater than 60% of the decision-making process and efficacy remained equal between the two agents. Two pharmacoeconomic methods lent support to a formulary decision at a health care institution.
Halim, Isa; Arep, Hambali; Kamat, Seri Rahayu; Abdullah, Rohana; Omar, Abdul Rahman; Ismail, Ahmad Rasdan
2014-01-01
Background Prolonged standing has been hypothesized as a vital contributor to discomfort and muscle fatigue in the workplace. The objective of this study was to develop a decision support system that could provide systematic analysis and solutions to minimize the discomfort and muscle fatigue associated with prolonged standing. Methods The integration of object-oriented programming and a Model Oriented Simultaneous Engineering System were used to design the architecture of the decision support system. Results Validation of the decision support system was carried out in two manufacturing companies. The validation process showed that the decision support system produced reliable results. Conclusion The decision support system is a reliable advisory tool for providing analysis and solutions to problems related to the discomfort and muscle fatigue associated with prolonged standing. Further testing of the decision support system is suggested before it is used commercially. PMID:25180141
Fast Image Texture Classification Using Decision Trees
NASA Technical Reports Server (NTRS)
Thompson, David R.
2011-01-01
Texture analysis would permit improved autonomous, onboard science data interpretation for adaptive navigation, sampling, and downlink decisions. These analyses would assist with terrain analysis and instrument placement in both macroscopic and microscopic image data products. Unfortunately, most state-of-the-art texture analysis demands computationally expensive convolutions of filters involving many floating-point operations. This makes them infeasible for radiation- hardened computers and spaceflight hardware. A new method approximates traditional texture classification of each image pixel with a fast decision-tree classifier. The classifier uses image features derived from simple filtering operations involving integer arithmetic. The texture analysis method is therefore amenable to implementation on FPGA (field-programmable gate array) hardware. Image features based on the "integral image" transform produce descriptive and efficient texture descriptors. Training the decision tree on a set of training data yields a classification scheme that produces reasonable approximations of optimal "texton" analysis at a fraction of the computational cost. A decision-tree learning algorithm employing the traditional k-means criterion of inter-cluster variance is used to learn tree structure from training data. The result is an efficient and accurate summary of surface morphology in images. This work is an evolutionary advance that unites several previous algorithms (k-means clustering, integral images, decision trees) and applies them to a new problem domain (morphology analysis for autonomous science during remote exploration). Advantages include order-of-magnitude improvements in runtime, feasibility for FPGA hardware, and significant improvements in texture classification accuracy.
Applications of Formal Methods to Specification and Safety of Avionics Software
NASA Technical Reports Server (NTRS)
Hoover, D. N.; Guaspari, David; Humenn, Polar
1996-01-01
This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations fo decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
Agapova, Maria; Bresnahan, Brian B; Higashi, Mitchell; Kessler, Larry; Garrison, Louis P; Devine, Beth
2017-02-01
The American College of Radiology develops evidence-based practice guidelines to aid appropriate utilization of radiological procedures. Panel members use expert opinion to weight trade-offs and consensus methods to rate appropriateness of imaging tests. These ratings include an equivocal range, assigned when there is disagreement about a technology's appropriateness and the evidence base is weak or for special circumstances. It is not clear how expert consensus merges with the evidence base to arrive at an equivocal rating. Quantitative benefit-risk assessment (QBRA) methods may assist decision makers in this capacity. However, many methods exist and it is not clear which methods are best suited for this application. We perform a critical appraisal of QBRA methods and propose several steps that may aid in making transparent areas of weak evidence and barriers to consensus in guideline development. We identify QBRA methods with potential to facilitate decision making in guideline development and build a decision aid for selecting among these methods. This study identified 2 families of QBRA methods suited to guideline development when expert opinion is expected to contribute substantially to decision making. Key steps to deciding among QBRA methods involve identifying specific benefit-risk criteria and developing a state-of-evidence matrix. For equivocal ratings assigned for reasons other than disagreement or weak evidence base, QBRA may not be needed. In the presence of disagreement but the absence of a weak evidence base, multicriteria decision analysis approaches are recommended; and in the presence of weak evidence base and the absence of disagreement, incremental net health benefit alone or combined with multicriteria decision analysis is recommended. Our critical appraisal further extends investigation of the strengths and limitations of select QBRA methods in facilitating diagnostic radiology clinical guideline development. The process of using the decision aid exposes and makes transparent areas of weak evidence and barriers to consensus. © 2016 John Wiley & Sons, Ltd.
Child Custody Decisions: Content Analysis of a Judicial Survey.
ERIC Educational Resources Information Center
Settle, Shirley A; Lowery, Carol R.
1982-01-01
Surveyed judges and trial commissioners (N=80) regarding child custody decisions in divorce. The content analysis described the responents' comments which clarified their reasons for attaching greater or lesser importance to a particular consideration or the method using in assessing a particular consideration during a court proceeding. (JAC)
Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry
Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum designmore » concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.« less
Building a maintenance policy through a multi-criterion decision-making model
NASA Astrophysics Data System (ADS)
Faghihinia, Elahe; Mollaverdi, Naser
2012-08-01
A major competitive advantage of production and service systems is establishing a proper maintenance policy. Therefore, maintenance managers should make maintenance decisions that best fit their systems. Multi-criterion decision-making methods can take into account a number of aspects associated with the competitiveness factors of a system. This paper presents a multi-criterion decision-aided maintenance model with three criteria that have more influence on decision making: reliability, maintenance cost, and maintenance downtime. The Bayesian approach has been applied to confront maintenance failure data shortage. Therefore, the model seeks to make the best compromise between these three criteria and establish replacement intervals using Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE II), integrating the Bayesian approach with regard to the preference of the decision maker to the problem. Finally, using a numerical application, the model has been illustrated, and for a visual realization and an illustrative sensitivity analysis, PROMETHEE GAIA (the visual interactive module) has been used. Use of PROMETHEE II and PROMETHEE GAIA has been made with Decision Lab software. A sensitivity analysis has been made to verify the robustness of certain parameters of the model.
Development of Decision Analysis Specifically for Arctic Offshore Drilling Islands.
1985-12-01
the decision analysis method will - give tradeoffs between costs and design wave height, production and depth • :of water for an oil platform , etc...optimizing the type of platform that is best suited for a particular site has become an extremely difficult decision. Over fifty- one different types of...drilling and production platforms have been identified for the Arctic environment, with new concepts being developed - every year, Boslov et al (198j
NASA Astrophysics Data System (ADS)
Tacnet, Jean-Marc; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille
2017-04-01
Mountain natural phenomena (e.g. torrential floods) put people and buildings at risk. Civil engineering protection works such as torrent check-dams are designed to mitigate those natural risks. Protection works act on both causes and effects of phenomena to reduce consequences and therefore risks. For instance, check-dams control sediment production and liquid/solid flow of torrential floods: several series of dams are located in the headwaters of a watershed, each having specific functions. All those works are damaged by time passing and flood impacts. Effectiveness assessment is needed to define, compare or choose strategies for investment and maintenance which are essential issues in risk management process. Decision support tools are expected to analyze at different scales both their technical effectiveness (related to their structural state and functional effects on phenomena such as stopping, braking, guiding, etc.) and their economic efficiency through comparison between benefits and costs. Several methods, often based on expert knowledge, have already been developed to care about decision under risk. But uncertainty has also to be considered, since decisions are indeed often taken in a context of lack of information and knowledge on natural phenomena, heterogeneity of available information and, finally, reliability of sources. First methods derived from classical industrial contexts, such as dependability analysis, are used to formalize expert knowledge used for decision-making. After having defined the concept of effectiveness, dependability analysis are used to identify decision contexts and problems: criteria and indicators are identified in relation with structural or functional features. Then, innovative and multi-scales multi-criteria decision-making methods (MCDMs) and frameworks are proposed to help assessing protection works effectiveness. They combine classical MCDM approaches, belief function, fuzzy sets and possibility theories. Those methods allow to make decisions based on heterogeneous, imprecise and uncertain evaluation of criteria provided by more or less reliable sources in an uncertain context: COWA-ER (Cautious Ordered Weighted Averaging with Evidential Reasoning), Fuzzy-Cautious OWA or ER-MCDA (Evidential Reasoning for Multi Criteria Decision Analysis) are thus applied to several scales of torrent check-dams' effectiveness assessment. Those methods are then improved for a better knowledge representation and final decision. Enhanced methods are then associated together. Finally, individual problems and associated methods are integrated in a generic methodology to move from torrential protective single measure effectiveness assessment to complete protection systems at watershed scale.
2010-01-01
Background Decision curve analysis (DCA) has been proposed as an alternative method for evaluation of diagnostic tests, prediction models, and molecular markers. However, DCA is based on expected utility theory, which has been routinely violated by decision makers. Decision-making is governed by intuition (system 1), and analytical, deliberative process (system 2), thus, rational decision-making should reflect both formal principles of rationality and intuition about good decisions. We use the cognitive emotion of regret to serve as a link between systems 1 and 2 and to reformulate DCA. Methods First, we analysed a classic decision tree describing three decision alternatives: treat, do not treat, and treat or no treat based on a predictive model. We then computed the expected regret for each of these alternatives as the difference between the utility of the action taken and the utility of the action that, in retrospect, should have been taken. For any pair of strategies, we measure the difference in net expected regret. Finally, we employ the concept of acceptable regret to identify the circumstances under which a potentially wrong strategy is tolerable to a decision-maker. Results We developed a novel dual visual analog scale to describe the relationship between regret associated with "omissions" (e.g. failure to treat) vs. "commissions" (e.g. treating unnecessary) and decision maker's preferences as expressed in terms of threshold probability. We then proved that the Net Expected Regret Difference, first presented in this paper, is equivalent to net benefits as described in the original DCA. Based on the concept of acceptable regret we identified the circumstances under which a decision maker tolerates a potentially wrong decision and expressed it in terms of probability of disease. Conclusions We present a novel method for eliciting decision maker's preferences and an alternative derivation of DCA based on regret theory. Our approach may be intuitively more appealing to a decision-maker, particularly in those clinical situations when the best management option is the one associated with the least amount of regret (e.g. diagnosis and treatment of advanced cancer, etc). PMID:20846413
Decision Trajectories in Dementia Care Networks: Decisions and Related Key Events.
Groen-van de Ven, Leontine; Smits, Carolien; Oldewarris, Karen; Span, Marijke; Jukema, Jan; Eefsting, Jan; Vernooij-Dassen, Myrra
2017-10-01
This prospective multiperspective study provides insight into the decision trajectories of people with dementia by studying the decisions made and related key events. This study includes three waves of interviews, conducted between July 2010 and July 2012, with 113 purposefully selected respondents (people with beginning to advanced stages of dementia and their informal and professional caregivers) completed in 12 months (285 interviews). Our multilayered qualitative analysis consists of content analysis, timeline methods, and constant comparison. Four decision themes emerged-managing daily life, arranging support, community living, and preparing for the future. Eight key events delineate the decision trajectories of people with dementia. Decisions and key events differ between people with dementia living alone and living with a caregiver. Our study clarifies that decisions relate not only to the disease but to living with the dementia. Individual differences in decision content and sequence may effect shared decision-making and advance care planning.
A comparison of two methods for expert elicitation in health technology assessments.
Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken
2016-07-26
When data needed to inform parameters in decision models are lacking, formal elicitation of expert judgement can be used to characterise parameter uncertainty. Although numerous methods for eliciting expert opinion as probability distributions exist, there is little research to suggest whether one method is more useful than any other method. This study had three objectives: (i) to obtain subjective probability distributions characterising parameter uncertainty in the context of a health technology assessment; (ii) to compare two elicitation methods by eliciting the same parameters in different ways; (iii) to collect subjective preferences of the experts for the different elicitation methods used. Twenty-seven clinical experts were invited to participate in an elicitation exercise to inform a published model-based cost-effectiveness analysis of alternative treatments for prostate cancer. Participants were individually asked to express their judgements as probability distributions using two different methods - the histogram and hybrid elicitation methods - presented in a random order. Individual distributions were mathematically aggregated across experts with and without weighting. The resulting combined distributions were used in the probabilistic analysis of the decision model and mean incremental cost-effectiveness ratios and the expected values of perfect information (EVPI) were calculated for each method, and compared with the original cost-effectiveness analysis. Scores on the ease of use of the two methods and the extent to which the probability distributions obtained from each method accurately reflected the expert's opinion were also recorded. Six experts completed the task. Mean ICERs from the probabilistic analysis ranged between £162,600-£175,500 per quality-adjusted life year (QALY) depending on the elicitation and weighting methods used. Compared to having no information, use of expert opinion decreased decision uncertainty: the EVPI value at the £30,000 per QALY threshold decreased by 74-86 % from the original cost-effectiveness analysis. Experts indicated that the histogram method was easier to use, but attributed a perception of more accuracy to the hybrid method. Inclusion of expert elicitation can decrease decision uncertainty. Here, choice of method did not affect the overall cost-effectiveness conclusions, but researchers intending to use expert elicitation need to be aware of the impact different methods could have.
Hospital site selection using fuzzy AHP and its derivatives.
Vahidnia, Mohammad H; Alesheikh, Ali A; Alimohammadi, Abbas
2009-07-01
Environmental managers are commonly faced with sophisticated decisions, such as choosing the location of a new facility subject to multiple conflicting criteria. This paper considers the specific problem of creating a well-distributed network of hospitals that delivers its services to the target population with minimal time, pollution and cost. We develop a Multi-Criteria Decision Analysis process that combines Geographical Information System (GIS) analysis with the Fuzzy Analytical Hierarchy Process (FAHP), and use this process to determine the optimum site for a new hospital in the Tehran urban area. The GIS was used to calculate and classify governing criteria, while FAHP was used to evaluate the decision factors and their impacts on alternative sites. Three methods were used to estimate the total weights and priorities of the candidate sites: fuzzy extent analysis, center-of-area defuzzification, and the alpha-cut method. The three methods yield identical priorities for the five alternatives considered. Fuzzy extent analysis provides less discriminating power, but is simpler to implement and compute than the other two methods. The alpha-cut method is more complicated, but integrates the uncertainty and overall attitude of the decision-maker. The usefulness of the new hospital site is evaluated by computing an accessibility index for each pixel in the GIS, defined as the ratio of population density to travel time. With the addition of a new hospital at the optimum site, this index improved over about 6.5 percent of the geographical area.
Whitty, Jennifer A; Rundle-Thiele, Sharyn R; Scuffham, Paul A
2012-03-01
Discrete choice experiments (DCEs) and the Juster scale are accepted methods for the prediction of individual purchase probabilities. Nevertheless, these methods have seldom been applied to a social decision-making context. To gain an overview of social decisions for a decision-making population through data triangulation, these two methods were used to understand purchase probability in a social decision-making context. We report an exploratory social decision-making study of pharmaceutical subsidy in Australia. A DCE and selected Juster scale profiles were presented to current and past members of the Australian Pharmaceutical Benefits Advisory Committee and its Economic Subcommittee. Across 66 observations derived from 11 respondents for 6 different pharmaceutical profiles, there was a small overall median difference of 0.024 in the predicted probability of public subsidy (p = 0.003), with the Juster scale predicting the higher likelihood. While consistency was observed at the extremes of the probability scale, the funding probability differed over the mid-range of profiles. There was larger variability in the DCE than Juster predictions within each individual respondent, suggesting the DCE is better able to discriminate between profiles. However, large variation was observed between individuals in the Juster scale but not DCE predictions. It is important to use multiple methods to obtain a complete picture of the probability of purchase or public subsidy in a social decision-making context until further research can elaborate on our findings. This exploratory analysis supports the suggestion that the mixed logit model, which was used for the DCE analysis, may fail to adequately account for preference heterogeneity in some contexts.
Goulart Coelho, Lineker M; Lange, Liséte C; Coelho, Hosmanny Mg
2017-01-01
Solid waste management is a complex domain involving the interaction of several dimensions; thus, its analysis and control impose continuous challenges for decision makers. In this context, multi-criteria decision-making models have become important and convenient supporting tools for solid waste management because they can handle problems involving multiple dimensions and conflicting criteria. However, the selection of the multi-criteria decision-making method is a hard task since there are several multi-criteria decision-making approaches, each one with a large number of variants whose applicability depends on information availability and the aim of the study. Therefore, to support researchers and decision makers, the objectives of this article are to present a literature review of multi-criteria decision-making applications used in solid waste management, offer a critical assessment of the current practices, and provide suggestions for future works. A brief review of fundamental concepts on this topic is first provided, followed by the analysis of 260 articles related to the application of multi-criteria decision making in solid waste management. These studies were investigated in terms of the methodology, including specific steps such as normalisation, weighting, and sensitivity analysis. In addition, information related to waste type, the study objective, and aspects considered was recorded. From the articles analysed it is noted that studies using multi-criteria decision making in solid waste management are predominantly addressed to problems related to municipal solid waste involving facility location or management strategy.
Park, Ji Hyun; Kim, Hyeon-Young; Lee, Hanna; Yun, Eun Kyoung
2015-12-01
This study compares the performance of the logistic regression and decision tree analysis methods for assessing the risk factors for infection in cancer patients undergoing chemotherapy. The subjects were 732 cancer patients who were receiving chemotherapy at K university hospital in Seoul, Korea. The data were collected between March 2011 and February 2013 and were processed for descriptive analysis, logistic regression and decision tree analysis using the IBM SPSS Statistics 19 and Modeler 15.1 programs. The most common risk factors for infection in cancer patients receiving chemotherapy were identified as alkylating agents, vinca alkaloid and underlying diabetes mellitus. The logistic regression explained 66.7% of the variation in the data in terms of sensitivity and 88.9% in terms of specificity. The decision tree analysis accounted for 55.0% of the variation in the data in terms of sensitivity and 89.0% in terms of specificity. As for the overall classification accuracy, the logistic regression explained 88.0% and the decision tree analysis explained 87.2%. The logistic regression analysis showed a higher degree of sensitivity and classification accuracy. Therefore, logistic regression analysis is concluded to be the more effective and useful method for establishing an infection prediction model for patients undergoing chemotherapy. Copyright © 2015 Elsevier Ltd. All rights reserved.
Using the weighted area under the net benefit curve for decision curve analysis.
Talluri, Rajesh; Shete, Sanjay
2016-07-18
Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given or range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent for using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need of additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
Decision tree and PCA-based fault diagnosis of rotating machinery
NASA Astrophysics Data System (ADS)
Sun, Weixiang; Chen, Jin; Li, Jiaqing
2007-04-01
After analysing the flaws of conventional fault diagnosis methods, data mining technology is introduced to fault diagnosis field, and a new method based on C4.5 decision tree and principal component analysis (PCA) is proposed. In this method, PCA is used to reduce features after data collection, preprocessing and feature extraction. Then, C4.5 is trained by using the samples to generate a decision tree model with diagnosis knowledge. At last the tree model is used to make diagnosis analysis. To validate the method proposed, six kinds of running states (normal or without any defect, unbalance, rotor radial rub, oil whirl, shaft crack and a simultaneous state of unbalance and radial rub), are simulated on Bently Rotor Kit RK4 to test C4.5 and PCA-based method and back-propagation neural network (BPNN). The result shows that C4.5 and PCA-based diagnosis method has higher accuracy and needs less training time than BPNN.
The use of decision analysis to examine ethical decision making by critical care nurses.
Hughes, K K; Dvorak, E M
1997-01-01
To examine the extent to which critical care staff nurses make ethical decisions that coincide with those recommended by a decision analytic model. Nonexperimental, ex post facto. Midwestern university-affiliated 500 bed tertiary care medical center. One hundred critical care staff nurses randomly selected from seven critical care units. Complete responses were obtained from 82 nurses (for a final response rate of 82%). The dependent variable--consistent decision making--was measured as staff nurses' abilities to make ethical decisions that coincided with those prescribed by the decision model. Subjects completed two instruments, the Ethical Decision Analytic Model, a computer-administered instrument designed to measure staff nurses' abilities to make consistent decisions about a chemically-impaired colleague; and a Background Inventory. The results indicate marked consensus among nurses when informal methods were used. However, there was little consistency between the nurses' informal decisions and those recommended by the decision analytic model. Although 50% (n = 41) of all nurses chose a course of action that coincided with the model's least optimal alternative, few nurses agreed with the model as to the most optimal course of action. The findings also suggest that consistency was unrelated (p > 0.05) to the nurses' educational background or years of clinical experience; that most subjects reported receiving little or no education in decision making during their basic nursing education programs; but that exposure to decision-making strategies was related to years of nursing experience (p < 0.05). The findings differ from related studies that have found a moderate degree of consistency between nurses and decision analytic models for strictly clinical decision tasks, especially when those tasks were less complex. However, the findings partially coincide with other findings that decision analysis may not be particularly well-suited to the critical care environment. Additional research is needed to determine whether critical care nurses use the same decision-making methods as do other nurses; and to clarify the effects of decision task (clinical versus ethical) on nurses' decision making. It should not be assumed that methods used to study nurses' clinical decision making are applicable for all nurses or all types of decisions, including ethical decisions.
Using real options analysis to support strategic management decisions
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan
2013-12-01
Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve decision making process. Usual cases of use of real options are analyzed with elaborate study on the applications and advantages that company management can derive from their application. A numeric results based on extending simple binomial tree approach for multiple sources of uncertainty are provided to demonstrate the improvement effects on management decisions.
Decisions and Reasons: Examining Preservice Teacher Decision-Making through Video Self-Analysis
ERIC Educational Resources Information Center
Rich, Peter J.; Hannafin, Michael J.
2008-01-01
Methods used to study teacher thinking have both provided insight into the cognitive aspects of teaching and resulted in new, as yet unresolved, relationships between practice and theory. Recent developments in video-analysis tools have allowed preservice teachers to analyze both their practices and thinking, providing important feedback for…
[Scenario analysis--a method for long-term planning].
Stavem, K
2000-01-10
Scenarios are known from the film industry, as detailed descriptions of films. This has given name to scenario analysis, a method for long term planning using descriptions of composite future pictures. This article is an introduction to the scenario method. Scenarios describe plausible, not necessarily probable, developments. They focus on problems and questions that decision makers must be aware of and prepare to deal with, and the consequences of alternative decisions. Scenarios are used in corporate and governmental planning, and they can be useful and complementary to traditional planning and extrapolation of past experience. The method is particularly useful in a rapidly changing world with shifting external conditions.
Automatic rule generation for high-level vision
NASA Technical Reports Server (NTRS)
Rhee, Frank Chung-Hoon; Krishnapuram, Raghu
1992-01-01
A new fuzzy set based technique that was developed for decision making is discussed. It is a method to generate fuzzy decision rules automatically for image analysis. This paper proposes a method to generate rule-based approaches to solve problems such as autonomous navigation and image understanding automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.
Disclosing Sexual Assault Within Social Networks: A Mixed-Method Investigation.
Dworkin, Emily R; Pittenger, Samantha L; Allen, Nicole E
2016-03-01
Most survivors of sexual assault disclose their experiences within their social networks, and these disclosure decisions can have important implications for their entry into formal systems and well-being, but no research has directly examined these networks as a strategy to understand disclosure decisions. Using a mixed-method approach that combined survey data, social network analysis, and interview data, we investigate whom, among potential informal responders in the social networks of college students who have experienced sexual assault, survivors contact regarding their assault, and how survivors narrate the role of networks in their decisions about whom to contact. Quantitative results suggest that characteristics of survivors, their social networks, and members of these networks are associated with disclosure decisions. Using data from social network analysis, we identified that survivors tended to disclose to a smaller proportion of their network when many network members had relationships with each other or when the network had more subgroups. Our qualitative analysis helps to contextualize these findings. © Society for Community Research and Action 2016.
In search of tools to aid logical thinking and communicating about medical decision making.
Hunink, M G
2001-01-01
To have real-time impact on medical decision making, decision analysts need a wide variety of tools to aid logical thinking and communication. Decision models provide a formal framework to integrate evidence and values, but they are commonly perceived as complex and difficult to understand by those unfamiliar with the methods, especially in the context of clinical decision making. The theory of constraints, introduced by Eliyahu Goldratt in the business world, provides a set of tools for logical thinking and communication that could potentially be useful in medical decision making. The author used the concept of a conflict resolution diagram to analyze the decision to perform carotid endarterectomy prior to coronary artery bypass grafting in a patient with both symptomatic coronary and asymptomatic carotid artery disease. The method enabled clinicians to visualize and analyze the issues, identify and discuss the underlying assumptions, search for the best available evidence, and use the evidence to make a well-founded decision. The method also facilitated communication among those involved in the care of the patient. Techniques from fields other than decision analysis can potentially expand the repertoire of tools available to support medical decision making and to facilitate communication in decision consults.
NASA Astrophysics Data System (ADS)
Chen, Ting-Yu
2012-06-01
This article presents a useful method for relating anchor dependency and accuracy functions to multiple attribute decision-making (MADM) problems in the context of Atanassov intuitionistic fuzzy sets (A-IFSs). Considering anchored judgement with displaced ideals and solution precision with minimal hesitation, several auxiliary optimisation models have proposed to obtain the optimal weights of the attributes and to acquire the corresponding TOPSIS (the technique for order preference by similarity to the ideal solution) index for alternative rankings. Aside from the TOPSIS index, as a decision-maker's personal characteristics and own perception of self may also influence the direction in the axiom of choice, the evaluation of alternatives is conducted based on distances of each alternative from the positive and negative ideal alternatives, respectively. This article originates from Li's [Li, D.-F. (2005), 'Multiattribute Decision Making Models and Methods Using Intuitionistic Fuzzy Sets', Journal of Computer and System Sciences, 70, 73-85] work, which is a seminal study of intuitionistic fuzzy decision analysis using deduced auxiliary programming models, and deems it a benchmark method for comparative studies on anchor dependency and accuracy functions. The feasibility and effectiveness of the proposed methods are illustrated by a numerical example. Finally, a comparative analysis is illustrated with computational experiments on averaging accuracy functions, TOPSIS indices, separation measures from positive and negative ideal alternatives, consistency rates of ranking orders, contradiction rates of the top alternative and average Spearman correlation coefficients.
Goltz, Sonia M.
2000-01-01
Decision fiascoes such as escalation of commitment, the tendency of decision makers to “throw good money after bad,” can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis. PMID:22478347
2012-06-01
Military Operational Research , with special theme ‘The use of ‘soft’ methods in OR’. OR52 (7 – 9 September 2010, Royal Holloway University of London...on human judgement. Judgement-based OA applies the methods of ‘Soft Operational Research ’ developed in academia. It has appeared, however, that the...similarity between judgemental methods in operational research practice and a number of other modes of professional analytical practice. The closest
Decision Making Analysis: Critical Factors-Based Methodology
2010-04-01
the pitfalls associated with current wargaming methods such as assuming a western view of rational values in decision - making regardless of the cultures...Utilization theory slightly expands the rational decision making model as it states that “actors try to maximize their expected utility by weighing the...items to categorize the decision - making behavior of political leaders which tend to demonstrate either a rational or cognitive leaning. Leaders
A Compact Review of Multi-criteria Decision Analysis Uncertainty Techniques
2013-02-01
9 3.4 PROMETHEE -GAIA Method...obtained (74). 3.4 PROMETHEE -GAIA Method Preference Ranking Organization Method for Enrichment Evaluation ( PROMETHEE ) and Geometrical Analysis for...greater understanding of the importance of their selections. The PROMETHEE method was designed to perform MCDA while accounting for each of these
Child Protection Decision Making: A Factorial Analysis Using Case Vignettes
ERIC Educational Resources Information Center
Stokes, Jacqueline; Schmidt, Glen
2012-01-01
This study explored decision making by child protection social workers in the province of British Columbia, Canada. A factorial survey method was used in which case vignettes were constructed by randomly assigning a number of key characteristics associated with decision making in child protection. Child protection social workers (n = 118) assessed…
Use of multicriteria decision analysis to address conservation conflicts.
Davies, A L; Bryce, R; Redpath, S M
2013-10-01
Conservation conflicts are increasing on a global scale and instruments for reconciling competing interests are urgently needed. Multicriteria decision analysis (MCDA) is a structured, decision-support process that can facilitate dialogue between groups with differing interests and incorporate human and environmental dimensions of conflict. MCDA is a structured and transparent method of breaking down complex problems and incorporating multiple objectives. The value of this process for addressing major challenges in conservation conflict management is that MCDA helps in setting realistic goals; entails a transparent decision-making process; and addresses mistrust, differing world views, cross-scale issues, patchy or contested information, and inflexible legislative tools. Overall we believe MCDA provides a valuable decision-support tool, particularly for increasing awareness of the effects of particular values and choices for working toward negotiated compromise, although an awareness of the effect of methodological choices and the limitations of the method is vital before applying it in conflict situations. © 2013 Society for Conservation Biology.
A method to harness global crowd-sourced data to understand travel behavior in avalanche terrain.
NASA Astrophysics Data System (ADS)
Hendrikx, J.; Johnson, J.
2015-12-01
To date, most studies of the human dimensions of decision making in avalanche terrain has focused on two areas - post-accident analysis using accident reports/interviews and, the development of tools as decision forcing aids. We present an alternate method using crowd-sourced citizen science, for understanding decision-making in avalanche terrain. Our project combines real-time GPS tracking via a smartphone application, with internet based surveys of winter backcountry users as a method to describe and quantify travel practices in concert with group decision-making dynamics, and demographic data of participants during excursions. Effectively, we use the recorded GPS track taken within the landscape as an expression of the decision making processes and terrain usage by the group. Preliminary data analysis shows that individual experience levels, gender, avalanche hazard, and group composition all influence the ways in which people travel in avalanche terrain. Our results provide the first analysis of coupled real-time GPS tracking of the crowd while moving in avalanche terrain combined with psychographic and demographic correlates. This research will lead to an improved understanding of real-time decision making in avalanche terrain. In this paper we will specifically focus on the presentation of the methods used to solicit, and then harness the crowd to obtain data in a unique and innovative application of citizen science where the movements within the terrain are the desired output data (Figure 1). Figure 1: Example GPS tracks sourced from backcountry winter users in the Teton Pass area (Wyoming), from the 2014-15 winter season, where tracks in red represent those recorded as self-assessed experts (as per our survey), and where tracks in blue represent those recorded as self-assessed intermediates. All tracks shown were obtained under similar avalanche conditions. Statistical analysis of terrain metrics showed that the experts used steeper terrain than the intermediate users under similar avalanche conditions, demonstrating different terrain choice and use as a function of experience rather than hazard level.
2014-01-01
Background To improve quality of care and patient outcomes, health system decision-makers need to identify and implement effective interventions. An increasing number of systematic reviews document the effects of quality improvement programs to assist decision-makers in developing new initiatives. However, limitations in the reporting of primary studies and current meta-analysis methods (including approaches for exploring heterogeneity) reduce the utility of existing syntheses for health system decision-makers. This study will explore the role of innovative meta-analysis approaches and the added value of enriched and updated data for increasing the utility of systematic reviews of complex interventions. Methods/Design We will use the dataset from our recent systematic review of 142 randomized trials of diabetes quality improvement programs to evaluate novel approaches for exploring heterogeneity. These will include exploratory methods, such as multivariate meta-regression analyses and all-subsets combinatorial meta-analysis. We will then update our systematic review to include new trials and enrich the dataset by surveying authors of all included trials. In doing so, we will explore the impact of variables not, reported in previous publications, such as details of study context, on the effectiveness of the intervention. We will use innovative analytical methods on the enriched and updated dataset to identify key success factors in the implementation of quality improvement interventions for diabetes. Decision-makers will be involved throughout to help identify and prioritize variables to be explored and to aid in the interpretation and dissemination of results. Discussion This study will inform future systematic reviews of complex interventions and describe the value of enriching and updating data for exploring heterogeneity in meta-analysis. It will also result in an updated comprehensive systematic review of diabetes quality improvement interventions that will be useful to health system decision-makers in developing interventions to improve outcomes for people with diabetes. Systematic review registration PROSPERO registration no. CRD42013005165 PMID:25115289
Bean, Nigel G.; Ruberu, Ravi P.
2017-01-01
Background The external validity, or generalizability, of trials and guidelines has been considered poor in the context of multiple morbidity. How multiple morbidity might affect the magnitude of benefit of a given treatment, and thereby external validity, has had little study. Objective To provide a method of decision analysis to quantify the effects of age and comorbidity on the probability of deriving a given magnitude of treatment benefit. Design We developed a method to calculate probabilistically the effect of all of a patient’s comorbidities on their underlying utility, or well-being, at a future time point. From this, we derived a distribution of possible magnitudes of treatment benefit at that future time point. We then expressed this distribution as the probability of deriving at least a given magnitude of treatment benefit. To demonstrate the applicability of this method of decision analysis, we applied it to the treatment of hypercholesterolaemia in a geriatric population of 50 individuals. We highlighted the results of four of these individuals. Results This method of analysis provided individualized quantifications of the effect of age and comorbidity on the probability of treatment benefit. The average probability of deriving a benefit, of at least 50% of the magnitude of benefit available to an individual without comorbidity, was only 0.8%. Conclusion The effects of age and comorbidity on the probability of deriving significant treatment benefits can be quantified for any individual. Even without consideration of other factors affecting external validity, these effects may be sufficient to guide decision-making. PMID:29090189
Material selection and assembly method of battery pack for compact electric vehicle
NASA Astrophysics Data System (ADS)
Lewchalermwong, N.; Masomtob, M.; Lailuck, V.; Charoenphonphanich, C.
2018-01-01
Battery packs become the key component in electric vehicles (EVs). The main costs of which are battery cells and assembling processes. The battery cell is indeed priced from battery manufacturers while the assembling cost is dependent on battery pack designs. Battery pack designers need overall cost as cheap as possible, but it still requires high performance and more safety. Material selection and assembly method as well as component design are very important to determine the cost-effectiveness of battery modules and battery packs. Therefore, this work presents Decision Matrix, which can aid in the decision-making process of component materials and assembly methods for a battery module design and a battery pack design. The aim of this study is to take the advantage of incorporating Architecture Analysis method into decision matrix methods by capturing best practices for conducting design architecture analysis in full account of key design components critical to ensure efficient and effective development of the designs. The methodology also considers the impacts of choice-alternatives along multiple dimensions. Various alternatives for materials and assembly techniques of battery pack are evaluated, and some sample costs are presented. Due to many components in the battery pack, only seven components which are positive busbar and Z busbar are represented in this paper for using decision matrix methods.
Analysis And Assistant Planning System Ofregional Agricultural Economic Inform
NASA Astrophysics Data System (ADS)
Han, Jie; Zhang, Junfeng
For the common problems existed in regional development and planning, we try to design a decision support system for assisting regional agricultural development and alignment as a decision-making tool for local government and decision maker. The analysis methods of forecast, comparative advantage, liner programming and statistical analysis are adopted. According to comparative advantage theory, the regional advantage can be determined by calculating and comparing yield advantage index (YAI), Scale advantage index (SAI), Complicated advantage index (CAI). Combining with GIS, agricultural data are presented as a form of graph such as area, bar and pie to uncover the principle and trend for decision-making which can't be found in data table. This system provides assistant decisions for agricultural structure adjustment, agro-forestry development and planning, and can be integrated to information technologies such as RS, AI and so on.
Fuzzy MCDM Technique for Planning the Environment Watershed
NASA Astrophysics Data System (ADS)
Chen, Yi-Chun; Lien, Hui-Pang; Tzeng, Gwo-Hshiung; Yang, Lung-Shih; Yen, Leon
In the real word, the decision making problems are very vague and uncertain in a number of ways. The most criteria have interdependent and interactive features so they cannot be evaluated by conventional measures method. Such as the feasibility, thus, to approximate the human subjective evaluation process, it would be more suitable to apply a fuzzy method in environment-watershed plan topic. This paper describes the design of a fuzzy decision support system in multi-criteria analysis approach for selecting the best plan alternatives or strategies in environmentwatershed. The Fuzzy Analytic Hierarchy Process (FAHP) method is used to determine the preference weightings of criteria for decision makers by subjective perception. A questionnaire was used to find out from three related groups comprising fifteen experts. Subjectivity and vagueness analysis is dealt with the criteria and alternatives for selection process and simulation results by using fuzzy numbers with linguistic terms. Incorporated the decision makers’ attitude towards preference, overall performance value of each alternative can be obtained based on the concept of Fuzzy Multiple Criteria Decision Making (FMCDM). This research also gives an example of evaluating consisting of five alternatives, solicited from a environmentwatershed plan works in Taiwan, is illustrated to demonstrate the effectiveness and usefulness of the proposed approach.
Systems Analysis - a new paradigm and decision support tools for the water framework directive
NASA Astrophysics Data System (ADS)
Bruen, M.
2008-05-01
In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness.
Entropy Methods For Univariate Distributions in Decision Analysis
NASA Astrophysics Data System (ADS)
Abbas, Ali E.
2003-03-01
One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints, (FMED). We also discuss the problems with the FMED in that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution if in addition to its fractiles, we also know it is continuous and work through full examples to illustrate the approach.
Staged decision making based on probabilistic forecasting
NASA Astrophysics Data System (ADS)
Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris
2016-04-01
Flood forecasting systems reduce, but cannot eliminate uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, as - compared to deterministic forecasts - a dimension is added ('probability' or 'likelihood'), with this added dimension decision making is made slightly more complicated. A technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (risk-based method). With the cost-loss method a warning will be issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates based on only economic values and is a technique that is relatively static (no reasoning, yes/no decision). Nevertheless it has high potential to improve risk-based decision making based on probabilistic flood forecasting because there are no other methods known that deal with probabilities in decision making. The main aim of this research was to explore the ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions were taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts of dealing with situations and responses were analysed and possible applicable concepts where chosen. Out of this analysis the concepts of flexibility and robustness appeared to be fitting to the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut-up into smaller pieces and finally the decision to implement is made based on economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in flood event management, the more damage can be reduced. And with decisions based on probabilistic forecasts, partial decisions can be made earlier in time (with a lower probability) and can be scaled up or down later in time when there is more certainty; whether the event takes place or not. Partial decisions are often more cheap, or shorten the final mitigation-time at the moment when there is more certainty. The proposed method is tested on Stonehaven, on the Carron River in Scotland. Decisions to implement demountable defences in the town are currently made based on a very short lead-time due to the absence of certainty. Application showed that staged decision making is possible and gives the decision maker more time to respond to a situation. The decision maker is able to take a lower regret decision with higher uncertainty and less related negative consequences. Although it is not possible to quantify intangible effects, it is part of the analysis to reduce these effects. Above all, the proposed approach has shown to be a possible improvement in economic terms and opens up possibilities of more flexible and robust decision making.
NASA Astrophysics Data System (ADS)
Rouillon, M.; Taylor, M. P.; Dong, C.
2016-12-01
This research assesses the advantages of integrating field portable X-ray Fluorescence (pXRF) technology for reducing the risk and increase confidence of decision making for metal-contaminated site assessments. Metal-contaminated sites are often highly heterogeneous and require a high sampling density to accurately characterize the distribution and concentration of contaminants. The current regulatory assessment approaches rely on a small number of samples processed using standard wet-chemistry methods. In New South Wales (NSW), Australia, the current notification trigger for characterizing metal-contaminated sites require the upper 95% confidence interval of the site mean to equal or exceed the relevant guidelines. The method's low `minimum' sampling requirements can misclassify sites due to the heterogeneous nature of soil contamination, leading to inaccurate decision making. To address this issue, we propose integrating infield pXRF analysis with the established sampling method to overcome sampling limitations. This approach increases the minimum sampling resolution and reduces the 95% CI of the site mean. Infield pXRF analysis at contamination hotspots enhances sample resolution efficiently and without the need to return to the site. In this study, the current and proposed pXRF site assessment methods are compared at five heterogeneous metal-contaminated sites by analysing the spatial distribution of contaminants, 95% confidence intervals of site means, and the sampling and analysis uncertainty associated with each method. Finally, an analysis of costs associated with both the current and proposed methods is presented to demonstrate the advantages of incorporating pXRF into metal-contaminated site assessments. The data shows that pXRF integrated site assessments allows for faster, cost-efficient, characterisation of metal-contaminated sites with greater confidence for decision making.
Structured decision making: Chapter 5
Runge, Michael C.; Grand, James B.; Mitchell, Michael S.; Krausman, Paul R.; Cain, James W. III
2013-01-01
Wildlife management is a decision-focused discipline. It needs to integrate traditional wildlife science and social science to identify actions that are most likely to achieve the array of desires society has surrounding wildlife populations. Decision science, a vast field with roots in economics, operations research, and psychology, offers a rich set of tools to help wildlife managers frame, decompose, analyze, and synthesize their decisions. The nature of wildlife management as a decision science has been recognized since the inception of the field, but formal methods of decision analysis have been underused. There is tremendous potential for wildlife management to grow further through the use of formal decision analysis. First, the wildlife science and human dimensions of wildlife disciplines can be readily integrated. Second, decisions can become more efficient. Third, decisions makers can communicate more clearly with stakeholders and the public. Fourth, good, intuitive wildlife managers, by explicitly examining how they make decisions, can translate their art into a science that is readily used by the next generation.
Van Norman, Ethan R; Christ, Theodore J
2016-10-01
Curriculum based measurement of oral reading (CBM-R) is used to monitor the effects of academic interventions for individual students. Decisions to continue, modify, or terminate these interventions are made by interpreting time series CBM-R data. Such interpretation is founded upon visual analysis or the application of decision rules. The purpose of this study was to compare the accuracy of visual analysis and decision rules. Visual analysts interpreted 108 CBM-R progress monitoring graphs one of three ways: (a) without graphic aids, (b) with a goal line, or (c) with a goal line and a trend line. Graphs differed along three dimensions, including trend magnitude, variability of observations, and duration of data collection. Automated trend line and data point decision rules were also applied to each graph. Inferential analyses permitted the estimation of the probability of a correct decision (i.e., the student is improving - continue the intervention, or the student is not improving - discontinue the intervention) for each evaluation method as a function of trend magnitude, variability of observations, and duration of data collection. All evaluation methods performed better when students made adequate progress. Visual analysis and decision rules performed similarly when observations were less variable. Results suggest that educators should collect data for more than six weeks, take steps to control measurement error, and visually analyze graphs when data are variable. Implications for practice and research are discussed. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Basye, Austin T.
A matrix element method analysis of the Standard Model Higgs boson, produced in association with two top quarks decaying to the lepton-plus-jets channel is presented. Based on 20.3 fb--1 of s=8 TeV data, produced at the Large Hadron Collider and collected by the ATLAS detector, this analysis utilizes multiple advanced techniques to search for ttH signatures with a 125 GeV Higgs boson decaying to two b -quarks. After categorizing selected events based on their jet and b-tag multiplicities, signal rich regions are analyzed using the matrix element method. Resulting variables are then propagated to two parallel multivariate analyses utilizing Neural Networks and Boosted Decision Trees respectively. As no significant excess is found, an observed (expected) limit of 3.4 (2.2) times the Standard Model cross-section is determined at 95% confidence, using the CLs method, for the Neural Network analysis. For the Boosted Decision Tree analysis, an observed (expected) limit of 5.2 (2.7) times the Standard Model cross-section is determined at 95% confidence, using the CLs method. Corresponding unconstrained fits of the Higgs boson signal strength to the observed data result in the measured signal cross-section to Standard Model cross-section prediction of mu = 1.2 +/- 1.3(total) +/- 0.7(stat.) for the Neural Network analysis, and mu = 2.9 +/- 1.4(total) +/- 0.8(stat.) for the Boosted Decision Tree analysis.
Tran, Liem T; Knight, C Gregory; O'Neill, Robert V; Smith, Elizabeth R; Riitters, Kurt H; Wickham, James
2002-06-01
A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams, air pollution, and topography of the Mid-Atlantic region, we were able to point out areas that were in relatively poor condition and/or vulnerable to future deterioration. The method offered an easy and comprehensive way to combine the strengths of fuzzy set theory and the AHP for ecological assessment. Furthermore, the suggested method can serve as a building block for the evaluation of environmental policies.
Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.
Merrick, Jason R W; Leclerc, Philip
2016-04-01
Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek for the rational decision, but we use prospect theory to solve for the attacker's decision to descriptively model the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. © 2014 Society for Risk Analysis.
Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles
2004-01-01
The initial step for the computerization of guidelines is the knowledge specification from the prose text of guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of text allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Edition tools are also necessary to facilitate the process of validation and workflow between expert physicians who will validate the specified knowledge and computer scientist who will encode the specified knowledge in a guide-line model. Applied to eleven different guidelines issued by an official agency, the method allows a quick and valid computerization and integration in a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the text guidelines is however still to be developed further. The method used for computerization could help to define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.
Munro, Sarah; Stacey, Dawn; Lewis, Krystina B; Bansback, Nick
2016-04-01
To understand how well patients make value congruent decisions with and without patient decision aids (PtDAs) for screening and treatment options, and identify issues with its measurement and evaluation. A sub-analysis of trials included in the 2014 Cochrane Review of Decision Aids. Eligible trials measured value congruence with chosen option. Two reviewers independently screened 115 trials. Among 18 included trials, 8 (44%) measured value congruence using the Multidimensional Measure of Informed Choice (MMIC), 7 (39%) used heterogeneous methods, and 3 (17%) used unclear methods. Pooled results of trials that used heterogeneous measures were statistically non-significant (n=3). Results from trials that used the MMIC suggest patients are 48% more likely to make value congruent decisions when exposed to a PtDA for a screening decision (RR 1.48, 95% CI 1.01 to 2.16, n=8). Patients struggle to make value congruent decisions, but PtDAs may help. While the absolute improvement is relatively small it may be underestimated due to sample size issues, definitions, and heterogeneity of measures. Current approaches are inadequate to support patients making decisions that are consistent with their values. There is some evidence that PtDAs support patients with achieving values congruent decisions for screening choices. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Application fuzzy multi-attribute decision analysis method to prioritize project success criteria
NASA Astrophysics Data System (ADS)
Phong, Nguyen Thanh; Quyen, Nguyen Le Hoang Thuy To
2017-11-01
Project success is a foundation for project owner to manage and control not only for the current project but also for future potential projects in construction companies. However, identifying the key success criteria for evaluating a particular project in real practice is a challenging task. Normally, it depends on a lot of factors, such as the expectation of the project owner and stakeholders, triple constraints of the project (cost, time, quality), and company's mission, vision, and objectives. Traditional decision-making methods for measuring the project success are usually based on subjective opinions of panel experts, resulting in irrational and inappropriate decisions. Therefore, this paper introduces a multi-attribute decision analysis method (MADAM) for weighting project success criteria by using fuzzy Analytical Hierarchy Process approach. It is found that this method is useful when dealing with imprecise and uncertain human judgments in evaluating project success criteria. Moreover, this research also suggests that although cost, time, and quality are three project success criteria projects, the satisfaction of project owner and acceptance of project stakeholders with the completed project criteria is the most important criteria for project success evaluation in Vietnam.
Data-driven freeway performance evaluation framework for project prioritization and decision making.
DOT National Transportation Integrated Search
2017-01-01
This report describes methods that potentially can be incorporated into the performance monitoring and planning processes for freeway performance evaluation and decision making. Reliability analysis was conducted on the selected I-15 corridor by empl...
Data-driven freeway performance evaluation framework for project prioritization and decision making.
DOT National Transportation Integrated Search
2015-03-01
This report describes methods that potentially can be incorporated into the performance monitoring and planning : processes for freeway performance evaluation and decision making. Reliability analysis is conducted on the selected : I-15 corridor by e...
Use of Cost-Utility Decision Models in Business Education.
ERIC Educational Resources Information Center
Lewis, Darrell R.
1989-01-01
Explains how cost-utility analysis can be applied to the selection of curriculum and instructional methods. Describes the use of multiattribute utility models of decision making as a tool for more informed judgment in educational administration. (SK)
Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region
NASA Astrophysics Data System (ADS)
Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad
2016-04-01
More frequent and intense hydrologic events under climate change are expected to enhance water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathway,s an array of future strategies towards climate robustness are developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development threatened by increased flood risks. Through the case study, we will demonstrate how a stakeholder driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision making process, specifically in regards to the level of robustness and flexibility in the selected strategy. This work will equip practitioners and decision makers with an example of a structured process for decision making under climate uncertainty that can be scaled as needed to the problem at hand. This presentation builds further on another submitted abstract "Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning" by Jeuken et al.
Advanced Productivity Analysis Methods for Air Traffic Control Operations
1976-12-01
Routine Work ............................... 37 4.2.2. Surveillance Work .......................... 40 4.2.3. Conflict Prcessing Work ................... 41...crossing and overtake conflicts) includes potential- conflict recognition, assessment, and resolution decision making and A/N voice communications...makers to utilize £ .quantitative and dynamic analysis as a tool for decision - making. 1.1.3 Types of Simulation Models Although there are many ways to
A bayesian approach to classification criteria for spectacled eiders
Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.
1996-01-01
To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield basically the same results, and yield results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
Tsalatsanis, Athanasios; Hozo, Iztok; Vickers, Andrew; Djulbegovic, Benjamin
2010-09-16
Decision curve analysis (DCA) has been proposed as an alternative method for evaluation of diagnostic tests, prediction models, and molecular markers. However, DCA is based on expected utility theory, which has been routinely violated by decision makers. Decision-making is governed by intuition (system 1), and analytical, deliberative process (system 2), thus, rational decision-making should reflect both formal principles of rationality and intuition about good decisions. We use the cognitive emotion of regret to serve as a link between systems 1 and 2 and to reformulate DCA. First, we analysed a classic decision tree describing three decision alternatives: treat, do not treat, and treat or no treat based on a predictive model. We then computed the expected regret for each of these alternatives as the difference between the utility of the action taken and the utility of the action that, in retrospect, should have been taken. For any pair of strategies, we measure the difference in net expected regret. Finally, we employ the concept of acceptable regret to identify the circumstances under which a potentially wrong strategy is tolerable to a decision-maker. We developed a novel dual visual analog scale to describe the relationship between regret associated with "omissions" (e.g. failure to treat) vs. "commissions" (e.g. treating unnecessary) and decision maker's preferences as expressed in terms of threshold probability. We then proved that the Net Expected Regret Difference, first presented in this paper, is equivalent to net benefits as described in the original DCA. Based on the concept of acceptable regret we identified the circumstances under which a decision maker tolerates a potentially wrong decision and expressed it in terms of probability of disease. We present a novel method for eliciting decision maker's preferences and an alternative derivation of DCA based on regret theory. Our approach may be intuitively more appealing to a decision-maker, particularly in those clinical situations when the best management option is the one associated with the least amount of regret (e.g. diagnosis and treatment of advanced cancer, etc).
Vetter, Jeffrey S.
2005-02-01
The method and system described herein presents a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classifies individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
Using Decision Analysis to Improve Malaria Control Policy Making
Kramer, Randall; Dickinson, Katherine L.; Anderson, Richard M.; Fowler, Vance G.; Miranda, Marie Lynn; Mutero, Clifford M.; Saterson, Kathryn A.; Wiener, Jonathan B.
2013-01-01
Malaria and other vector-borne diseases represent a significant and growing burden in many tropical countries. Successfully addressing these threats will require policies that expand access to and use of existing control methods, such as insecticide-treated bed nets and artemesinin combination therapies for malaria, while weighing the costs and benefits of alternative approaches over time. This paper argues that decision analysis provides a valuable framework for formulating such policies and combating the emergence and re-emergence of malaria and other diseases. We outline five challenges that policy makers and practitioners face in the struggle against malaria, and demonstrate how decision analysis can help to address and overcome these challenges. A prototype decision analysis framework for malaria control in Tanzania is presented, highlighting the key components that a decision support tool should include. Developing and applying such a framework can promote stronger and more effective linkages between research and policy, ultimately helping to reduce the burden of malaria and other vector-borne diseases. PMID:19356821
A framework for sensitivity analysis of decision trees.
Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław
2018-01-01
In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
M&S Decision/Role-Behavior Decompositions
2007-10-17
M &S Decision/Role-Behavior Decompositions Wargaming and Analysis Workshop Military Operations Research Society 17 October 2007 Paul Works, Methods...number. 1. REPORT DATE 17 OCT 2007 2. REPORT TYPE 3. DATES COVERED 00-00-2007 to 00-00-2007 4. TITLE AND SUBTITLE M &S Decision/Role-Behavior...transmission. • Combat models and simulations ( M &S) continue, in most cases, to model “effects-level” representations of SA, decisions, and behaviors. – M &S
Miller, W B; Pasta, D J
2001-01-01
In this study we develop and then test a couple model of contraceptive method choice decision-making following a pregnancy scare. The central constructs in our model are satisfaction with one's current method and confidence in the use of it. Downstream in the decision sequence, satisfaction and confidence predict desires and intentions to change methods. Upstream they are predicted by childbearing motivations, contraceptive attitudes, and the residual effects of the couples' previous method decisions. We collected data from 175 mostly unmarried and racially/ethnically diverse couples who were seeking pregnancy tests. We used LISREL and its latent variable capacity to estimate a structural equation model of the couple decision-making sequence leading to a change (or not) in contraceptive method. Results confirm most elements in our model and demonstrate a number of important cross-partner effects. Almost one-half of the sample had positive pregnancy tests and the base model fitted to this subsample indicates less accuracy in partner perception and greater influence of the female partner on method change decision-making. The introduction of some hypothesis-generating exogenous variables to our base couple model, together with some unexpected findings for the contraceptive attitude variables, suggest interesting questions that require further exploration.
Dionne-Odom, J. Nicholas; Willis, Danny G.; Bakitas, Marie; Crandall, Beth; Grace, Pamela J.
2014-01-01
Background Surrogate decision-makers (SDMs) face difficult decisions at end of life (EOL) for decisionally incapacitated intensive care unit (ICU) patients. Purpose Identify and describe the underlying psychological processes of surrogate decision-making for adults at EOL in the ICU. Method Qualitative case study design using a cognitive task analysis (CTA) interviewing approach. Participants were recruited from October 2012 to June 2013 from an academic tertiary medical center’s ICU located in the rural Northeastern United States. Nineteen SDMs for patients who had died in the ICU completed in-depth semi-structured CTA interviews. Discussion The conceptual framework formulated from data analysis reveals that three underlying, iterative, psychological dimensions: gist impressions, distressing emotions, and moral intuitions impact a SDM’s judgment about the acceptability of either the patient’s medical treatments or his or her condition. Conclusion The framework offers initial insights about the underlying psychological processes of surrogate decision-making and may facilitate enhanced decision support for SDMs. PMID:25982772
Aghajani Mir, M; Taherei Ghazvinei, P; Sulaiman, N M N; Basri, N E A; Saheri, S; Mahmood, N Z; Jahan, A; Begum, R A; Aghamohammadi, N
2016-01-15
Selecting a suitable Multi Criteria Decision Making (MCDM) method is a crucial stage to establish a Solid Waste Management (SWM) system. Main objective of the current study is to demonstrate and evaluate a proposed method using Multiple Criteria Decision Making methods (MCDM). An improved version of Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) applied to obtain the best municipal solid waste management method by comparing and ranking the scenarios. Applying this method in order to rank treatment methods is introduced as one contribution of the study. Besides, Viekriterijumsko Kompromisno Rangiranje (VIKOR) compromise solution method applied for sensitivity analyses. The proposed method can assist urban decision makers in prioritizing and selecting an optimized Municipal Solid Waste (MSW) treatment system. Besides, a logical and systematic scientific method was proposed to guide an appropriate decision-making. A modified TOPSIS methodology as a superior to existing methods for first time was applied for MSW problems. Applying this method in order to rank treatment methods is introduced as one contribution of the study. Next, 11 scenarios of MSW treatment methods are defined and compared environmentally and economically based on the waste management conditions. Results show that integrating a sanitary landfill (18.1%), RDF (3.1%), composting (2%), anaerobic digestion (40.4%), and recycling (36.4%) was an optimized model of integrated waste management. An applied decision-making structure provides the opportunity for optimum decision-making. Therefore, the mix of recycling and anaerobic digestion and a sanitary landfill with Electricity Production (EP) are the preferred options for MSW management. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied with considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for the Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analysis were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
Dolan, James G.; Boohaker, Emily; Allison, Jeroan; Imperiale, Thomas F.
2013-01-01
Background Current US colorectal cancer screening guidelines that call for shared decision making regarding the choice among several recommended screening options are difficult to implement. Multi-criteria decision analysis (MCDA) is an established methodology well suited for supporting shared decision making. Our study goal was to determine if a streamlined form of MCDA using rank order based judgments can accurately assess patients’ colorectal cancer screening priorities. Methods We converted priorities for four decision criteria and three sub-criteria regarding colorectal cancer screening obtained from 484 average risk patients using the Analytic Hierarchy Process (AHP) in a prior study into rank order-based priorities using rank order centroids. We compared the two sets of priorities using Spearman rank correlation and non-parametric Bland-Altman limits of agreement analysis. We assessed the differential impact of using the rank order-based versus the AHP-based priorities on the results of a full MCDA comparing three currently recommended colorectal cancer screening strategies. Generalizability of the results was assessed using Monte Carlo simulation. Results Correlations between the two sets of priorities for the seven criteria ranged from 0.55 to 0.92. The proportions of absolute differences between rank order-based and AHP-based priorities that were more than ± 0.15 ranged from 1% to 16%. Differences in the full MCDA results were minimal and the relative rankings of the three screening options were identical more than 88% of the time. The Monte Carlo simulation results were similar. Conclusion Rank order-based MCDA could be a simple, practical way to guide individual decisions and assess population decision priorities regarding colorectal cancer screening strategies. Additional research is warranted to further explore the use of these methods for promoting shared decision making. PMID:24300851
Prescott, Jeffrey William
2013-02-01
The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharamaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.
[Effect of occupational stress on mental health].
Yu, Shan-fa; Zhang, Rui; Ma, Liang-qing; Gu, Gui-zhen; Yang, Yan; Li, Kui-rong
2003-02-01
To study the effect of job psychological demands and job control on mental health and their interaction. 93 male freight train dispatchers were evaluated by using revised Job Demand-Control Scale and 7 strain scales. Stepwise regression analysis, Univariate ANOVA, Kruskal-Wallis H and Modian methods were used in statistic analysis. Kruskal-Wallis H and Modian methods analysis revealed the difference in mental health scores among groups of decision latitude (mean rank 55.57, 47.95, 48.42, 33.50, P < 0.05), the differences in scores of mental health (37.45, 40.01, 58.35), job satisfaction (53.18, 46.91, 32.43), daily life strains (33.00, 44.96, 56.12) and depression (36.45, 42.25, 53.61) among groups of job time demands (P < 0.05) were all statistically significant. ANOVA showed that job time demands and decision latitude had interaction effects on physical complains (R(2) = 0.24), state-anxiety (R(2) = 0.26), and daytime fatigue (R(2) = 0.28) (P < 0.05). Regression analysis revealed a significant job time demands and job decision latitude interaction effect as well as significant main effects of the some independent variables on different job strains (R(2) > 0.05). Job time demands and job decision latitude have direct and interactive effects on psychosomatic health, the more time demands, the more psychological strains, the effect of job time demands is greater than that of job decision latitude.
A Decision Support System for Evaluating and Selecting Information Systems Projects
NASA Astrophysics Data System (ADS)
Deng, Hepu; Wibowo, Santoso
2009-01-01
This chapter presents a decision support system (DSS) for effectively solving the information systems (IS) project selection problem. The proposed DSS recognizes the multidimensional nature of the IS project selection problem, the availability of multicriteria analysis (MA) methods, and the preferences of the decision-maker (DM) on the use of specific MA methods in a given situation. A knowledge base consisting of IF-THEN production rules is developed for assisting the DM with a systematic adoption of the most appropriate method with the efficient use of the powerful reasoning and explanation capabilities of intelligent DSS. The idea of letting the problem to be solved determines the method to be used is incorporated into the proposed DSS. As a result, effective decisions can be made for solving the IS project selection problem. An example is presented to demonstrate the applicability of the proposed DSS for solving the problem of selecting IS projects in real world situations.
An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty
Langlotz, Curtis P.; Shortliffe, Edward H.
1988-01-01
Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and non-monotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
Cost-effectiveness Analysis with Influence Diagrams.
Arias, M; Díez, F J
2015-01-01
Cost-effectiveness analysis (CEA) is used increasingly in medicine to determine whether the health benefit of an intervention is worth the economic cost. Decision trees, the standard decision modeling technique for non-temporal domains, can only perform CEA for very small problems. To develop a method for CEA in problems involving several dozen variables. We explain how to build influence diagrams (IDs) that explicitly represent cost and effectiveness. We propose an algorithm for evaluating cost-effectiveness IDs directly, i.e., without expanding an equivalent decision tree. The evaluation of an ID returns a set of intervals for the willingness to pay - separated by cost-effectiveness thresholds - and, for each interval, the cost, the effectiveness, and the optimal intervention. The algorithm that evaluates the ID directly is in general much more efficient than the brute-force method, which is in turn more efficient than the expansion of an equivalent decision tree. Using OpenMarkov, an open-source software tool that implements this algorithm, we have been able to perform CEAs on several IDs whose equivalent decision trees contain millions of branches. IDs can perform CEA on large problems that cannot be analyzed with decision trees.
NASA Astrophysics Data System (ADS)
Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.
2015-08-01
This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard
2016-01-01
In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using the principal component analysis; and the estimation of criteria weights and their descriptive statistics using the variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. The criteria were ranked from 1-5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated as well as the standard deviation and 95% credibility interval. ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights.
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
Learning from examples - Generation and evaluation of decision trees for software resource analysis
NASA Technical Reports Server (NTRS)
Selby, Richard W.; Porter, Adam A.
1988-01-01
A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
Oakley, Jeremy E.; Brennan, Alan; Breeze, Penny
2015-01-01
Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method. PMID:25810269
The reliability of the pass/fail decision for assessments comprised of multiple components
Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana
2015-01-01
Objective: The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When “conjunctively” combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. Method: The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg’s Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Results: Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts is relatively low with κ=0.49 or κ=0.47, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. Conclusion: The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, it is not necessarily so that a reliable pass/fail decision has been reached – for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees that do not fulfill the minimum requirements. PMID:26483855
ERIC Educational Resources Information Center
Cathcart, Stephen Michael
2016-01-01
This mixed method study examines HRD professionals' decision-making processes when making an organizational purchase of training. The study uses a case approach with a degrees of freedom analysis. The data to analyze will examine how HRD professionals in manufacturing select outside vendors human resource development programs for training,…
Ghandour, Rula; Shoaibi, Azza; Khatib, Rana; Abu Rmeileh, Niveen; Unal, Belgin; Sözmen, Kaan; Kılıç, Bülent; Fouad, Fouad; Al Ali, Radwan; Ben Romdhane, Habiba; Aissi, Wafa; Ahmad, Balsam; Capewell, Simon; Critchley, Julia; Husseini, Abdullatif
2015-01-01
To explore the feasibility of using a simple multi-criteria decision analysis method with policy makers/key stakeholders to prioritize cardiovascular disease (CVD) policies in four Mediterranean countries: Palestine, Syria, Tunisia and Turkey. A simple multi-criteria decision analysis (MCDA) method was piloted. A mixed methods study was used to identify a preliminary list of policy options in each country. These policies were rated by different policymakers/stakeholders against pre-identified criteria to generate a priority score for each policy and then rank the policies. Twenty-five different policies were rated in the four countries to create a country-specific list of CVD prevention and control policies. The response rate was 100% in each country. The top policies were mostly population level interventions and health systems' level policies. Successful collaboration between policy makers/stakeholders and researchers was established in this small pilot study. MCDA appeared to be feasible and effective. Future applications should aim to engage a larger, representative sample of policy makers, especially from outside the health sector. Weighting the selected criteria might also be assessed.
Mayhorn, Christopher B; Fisk, Arthur D; Whittle, Justin D
2002-01-01
Decision making in uncertain environments is a daily challenge faced by adults of all ages. Framing decision options as either gains or losses is a common method of altering decision-making behavior. In the experiment reported here, benchmark decision-making data collected in the 1970s by Tversky and Kahneman (1981, 1988) were compared with data collected from current samples of young and older adults to determine whether behavior was consistent across time. Although differences did emerge between the benchmark and the present samples, the effect of framing on decision behavior was relatively stable. The present findings suggest that adults of all ages are susceptible to framing effects. Results also indicated that apparent age differences might be better explained by an analysis of cohort and time-of-testing effects. Actual or potential applications of this research include an understanding of how framing might influence the decision-making behavior of people of all ages in a number of applied contexts, such as product warning interactions and medical decision scenarios.
[Analyzing consumer preference by using the latest semantic model for verbal protocol].
Tamari, Yuki; Takemura, Kazuhisa
2012-02-01
This paper examines consumers' preferences for competing brands by using a preference model of verbal protocols. Participants were 150 university students, who reported their opinions and feelings about McDonalds and Mos Burger (competing hamburger restaurants in Japan). Their verbal protocols were analyzed by using the singular value decomposition method, and the latent decision frames were estimated. The verbal protocols having a large value in the decision frames could be interpreted as showing attributes that consumers emphasize. Based on the estimated decision frames, we predicted consumers' preferences using the logistic regression analysis method. The results indicate that the decision frames projected from the verbal protocol data explained consumers' preferences effectively.
Content Analysis as a Best Practice in Technical Communication Research
ERIC Educational Resources Information Center
Thayer, Alexander; Evans, Mary; McBride, Alicia; Queen, Matt; Spyridakis, Jan
2007-01-01
Content analysis is a powerful empirical method for analyzing text, a method that technical communicators can use on the job and in their research. Content analysis can expose hidden connections among concepts, reveal relationships among ideas that initially seem unconnected, and inform the decision-making processes associated with many technical…
Determining the optimal forensic DNA analysis procedure following investigation of sample quality.
Hedell, Ronny; Hedman, Johannes; Mostad, Petter
2018-07-01
Crime scene traces of various types are routinely sent to forensic laboratories for analysis, generally with the aim of addressing questions about the source of the trace. The laboratory may choose to analyse the samples in different ways depending on the type and quality of the sample, the importance of the case and the cost and performance of the available analysis methods. Theoretically well-founded guidelines for the choice of analysis method are, however, lacking in most situations. In this paper, it is shown how such guidelines can be created using Bayesian decision theory. The theory is applied to forensic DNA analysis, showing how the information from the initial qPCR analysis can be utilized. It is assumed the alternatives for analysis are using a standard short tandem repeat (STR) DNA analysis assay, using the standard assay and a complementary assay, or the analysis may be cancelled following quantification. The decision is based on information about the DNA amount and level of DNA degradation of the forensic sample, as well as case circumstances and the cost for analysis. Semi-continuous electropherogram models are used for simulation of DNA profiles and for computation of likelihood ratios. It is shown how tables and graphs, prepared beforehand, can be used to quickly find the optimal decision in forensic casework.
EEG Classification with a Sequential Decision-Making Method in Motor Imagery BCI.
Liu, Rong; Wang, Yongxuan; Newman, Geoffrey I; Thakor, Nitish V; Ying, Sarah
2017-12-01
To develop subject-specific classifier to recognize mental states fast and reliably is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this paper, a sequential decision-making strategy is explored in conjunction with an optimal wavelet analysis for EEG classification. The subject-specific wavelet parameters based on a grid-search method were first developed to determine evidence accumulative curve for the sequential classifier. Then we proposed a new method to set the two constrained thresholds in the sequential probability ratio test (SPRT) based on the cumulative curve and a desired expected stopping time. As a result, it balanced the decision time of each class, and we term it balanced threshold SPRT (BTSPRT). The properties of the method were illustrated on 14 subjects' recordings from offline and online tests. Results showed the average maximum accuracy of the proposed method to be 83.4% and the average decision time of 2.77[Formula: see text]s, when compared with 79.2% accuracy and a decision time of 3.01[Formula: see text]s for the sequential Bayesian (SB) method. The BTSPRT method not only improves the classification accuracy and decision speed comparing with the other nonsequential or SB methods, but also provides an explicit relationship between stopping time, thresholds and error, which is important for balancing the speed-accuracy tradeoff. These results suggest that BTSPRT would be useful in explicitly adjusting the tradeoff between rapid decision-making and error-free device control.
Uncovering the requirements of cognitive work.
Roth, Emilie M
2008-06-01
In this article, the author provides an overview of cognitive analysis methods and how they can be used to inform system analysis and design. Human factors has seen a shift toward modeling and support of cognitively intensive work (e.g., military command and control, medical planning and decision making, supervisory control of automated systems). Cognitive task analysis and cognitive work analysis methods extend traditional task analysis techniques to uncover the knowledge and thought processes that underlie performance in cognitively complex settings. The author reviews the multidisciplinary roots of cognitive analysis and the variety of cognitive task analysis and cognitive work analysis methods that have emerged. Cognitive analysis methods have been used successfully to guide system design, as well as development of function allocation, team structure, and training, so as to enhance performance and reduce the potential for error. A comprehensive characterization of cognitive work requires two mutually informing analyses: (a) examination of domain characteristics and constraints that define cognitive requirements and challenges and (b) examination of practitioner knowledge and strategies that underlie both expert and error-vulnerable performance. A variety of specific methods can be adapted to achieve these aims within the pragmatic constraints of particular projects. Cognitive analysis methods can be used effectively to anticipate cognitive performance problems and specify ways to improve individual and team cognitive performance (be it through new forms of training, user interfaces, or decision aids).
van der Burg, Max Post; Tyre, Andrew J
2011-01-01
Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. So too, robust population management methods were developed to deal with uncertainties in multiple-model parameters. However, the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Our results showed that matrix sensitivities suggest that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was a more robust decision to maintain a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS
A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...
Luker, Kali R; Sullivan, Maura E; Peyre, Sarah E; Sherman, Randy; Grunwald, Tiffany
2008-01-01
The aim of this study was to compare the surgical knowledge of residents before and after receiving a cognitive task analysis-based multimedia teaching module. Ten plastic surgery residents were evaluated performing flexor tendon repair on 3 occasions. Traditional learning occurred between the first and second trial and served as the control. A teaching module was introduced as an intervention between the second and third trial using cognitive task analysis to illustrate decision-making skills. All residents showed improvement in their decision-making ability when performing flexor tendon repair after each surgical procedure. The group improved through traditional methods as well as exposure to our talk-aloud protocol (P > .01). After being trained using the cognitive task analysis curriculum the group displayed a statistically significant knowledge expansion (P < .01). Residents receiving cognitive task analysis-based multimedia surgical curriculum instruction achieved greater command of problem solving and are better equipped to make correct decisions in flexor tendon repair.
Sheehan, Barbara; Kaufman, David; Stetson, Peter; Currie, Leanne M.
2009-01-01
Computerized decision support systems have been used to help ensure safe medication prescribing. However, the acceptance of these types of decision support has been reported to be low. It has been suggested that decreased acceptance may be due to lack of clinical relevance. Additionally, cognitive fit between the user interface and clinical task may impact the response of clinicians as they interact with the system. In order to better understand clinician responses to such decision support, we used cognitive task analysis methods to evaluate clinical alerts for antibiotic prescribing in a neonatal intensive care unit. Two methods were used: 1) a cognitive walkthrough; and 2) usability testing with a ‘think-aloud’ protocol. Data were analyzed for impact on cognitive effort according to categories of cognitive distance. We found that responses to alerts may be context specific and that lack of screen cues often increases cognitive effort required to use a system. PMID:20351922
Mixture-based gatekeeping procedures in adaptive clinical trials.
Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji
2018-01-01
Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
Gilabert-Perramon, Antoni; Torrent-Farnell, Josep; Catalan, Arancha; Prat, Alba; Fontanet, Manel; Puig-Peiró, Ruth; Merino-Montero, Sandra; Khoury, Hanane; Goetghebeur, Mireille M; Badia, Xavier
2017-01-01
The aim of this study was to adapt and assess the value of a Multi-Criteria Decision Analysis (MCDA) framework (EVIDEM) for the evaluation of Orphan drugs in Catalonia (Catalan Health Service). The standard evaluation and decision-making procedures of CatSalut were compared with the EVIDEM methodology and contents. The EVIDEM framework was adapted to the Catalan context, focusing on the evaluation of Orphan drugs (PASFTAC program), during a Workshop with sixteen PASFTAC members. The criteria weighting was done using two different techniques (nonhierarchical and hierarchical). Reliability was assessed by re-test. The EVIDEM framework and methodology was found useful and feasible for Orphan drugs evaluation and decision making in Catalonia. All the criteria considered for the development of the CatSalut Technical Reports and decision making were considered in the framework. Nevertheless, the framework could improve the reporting of some of these criteria (i.e., "unmet needs" or "nonmedical costs"). Some Contextual criteria were removed (i.e., "Mandate and scope of healthcare system", "Environmental impact") or adapted ("population priorities and access") for CatSalut purposes. Independently of the weighting technique considered, the most important evaluation criteria identified for orphan drugs were: "disease severity", "unmet needs" and "comparative effectiveness", while the "size of the population" had the lowest relevance for decision making. Test-retest analysis showed weight consistency among techniques, supporting reliability overtime. MCDA (EVIDEM framework) could be a useful tool to complement the current evaluation methods of CatSalut, contributing to standardization and pragmatism, providing a method to tackle ethical dilemmas and facilitating discussions related to decision making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soltani, Atousa; Hewage, Kasun; Reza, Bahareh
2015-01-15
Highlights: • We review Municipal Solid Waste Management studies with focus on multiple stakeholders. • We focus on studies with multi-criteria decision analysis methods and discover their trends. • Most studies do not offer solutions for situations where stakeholders compete for more benefits or have unequal voting powers. • Governments and experts are the most participated stakeholders and AHP is the most dominant method. - Abstract: Municipal Solid Waste Management (MSWM) is a complicated process that involves multiple environmental and socio-economic criteria. Decision-makers look for decision support frameworks that can guide in defining alternatives, relevant criteria and their weights, andmore » finding a suitable solution. In addition, decision-making in MSWM problems such as finding proper waste treatment locations or strategies often requires multiple stakeholders such as government, municipalities, industries, experts, and/or general public to get involved. Multi-criteria Decision Analysis (MCDA) is the most popular framework employed in previous studies on MSWM; MCDA methods help multiple stakeholders evaluate the often conflicting criteria, communicate their different preferences, and rank or prioritize MSWM strategies to finally agree on some elements of these strategies and make an applicable decision. This paper reviews and brings together research on the application of MCDA for solving MSWM problems with more focus on the studies that have considered multiple stakeholders and offers solutions for such problems. Results of this study show that AHP is the most common approach in consideration of multiple stakeholders and experts and governments/municipalities are the most common participants in these studies.« less
Thompson, Rachel; Manski, Ruth; Donnelly, Kyla Z; Stevens, Gabrielle; Agusti, Daniela; Banach, Michelle; Boardman, Maureen B; Brady, Pearl; Colón Bradt, Christina; Foster, Tina; Johnson, Deborah J; Li, Zhongze; Norsigian, Judy; Nothnagle, Melissa; Olson, Ardis L; Shepherd, Heather L; Stern, Lisa F; Tosteson, Tor D; Trevena, Lyndal; Upadhya, Krishna K; Elwyn, Glyn
2017-01-01
Introduction Despite the observed and theoretical advantages of shared decision-making in a range of clinical contexts, including contraceptive care, there remains a paucity of evidence on how to facilitate its adoption. This paper describes the protocol for a study to assess the comparative effectiveness of patient-targeted and provider-targeted interventions for facilitating shared decision-making about contraceptive methods. Methods and analysis We will conduct a 2×2 factorial cluster randomised controlled trial with four arms: (1) video+prompt card, (2) decision aids+training, (3) video+prompt card and decision aids+training and (4) usual care. The clusters will be clinics in USA that deliver contraceptive care. The participants will be people who have completed a healthcare visit at a participating clinic, were assigned female sex at birth, are aged 15–49 years, are able to read and write English or Spanish and have not previously participated in the study. The primary outcome will be shared decision-making about contraceptive methods. Secondary outcomes will be the occurrence of a conversation about contraception in the healthcare visit, satisfaction with the conversation about contraception, intended contraceptive method(s), intention to use a highly effective method, values concordance of the intended method(s), decision regret, contraceptive method(s) used, use of a highly effective method, use of the intended method(s), adherence, satisfaction with the method(s) used, unintended pregnancy and unwelcome pregnancy. We will collect study data via longitudinal patient surveys administered immediately after the healthcare visit, four weeks later and six months later. Ethics and dissemination We will disseminate results via presentations at scientific and professional conferences, papers published in peer-reviewed, open-access journals and scientific and lay reports. We will also make an anonymised copy of the final participant-level dataset available to others for research purposes. Trial registration number ClinicalTrials.gov Identifier: NCT02759939. PMID:29061624
Multivariate analysis of flow cytometric data using decision trees.
Simon, Svenja; Guthke, Reinhard; Kamradt, Thomas; Frey, Oliver
2012-01-01
Characterization of the response of the host immune system is important in understanding the bidirectional interactions between the host and microbial pathogens. For research on the host site, flow cytometry has become one of the major tools in immunology. Advances in technology and reagents allow now the simultaneous assessment of multiple markers on a single cell level generating multidimensional data sets that require multivariate statistical analysis. We explored the explanatory power of the supervised machine learning method called "induction of decision trees" in flow cytometric data. In order to examine whether the production of a certain cytokine is depended on other cytokines, datasets from intracellular staining for six cytokines with complex patterns of co-expression were analyzed by induction of decision trees. After weighting the data according to their class probabilities, we created a total of 13,392 different decision trees for each given cytokine with different parameter settings. For a more realistic estimation of the decision trees' quality, we used stratified fivefold cross validation and chose the "best" tree according to a combination of different quality criteria. While some of the decision trees reflected previously known co-expression patterns, we found that the expression of some cytokines was not only dependent on the co-expression of others per se, but was also dependent on the intensity of expression. Thus, for the first time we successfully used induction of decision trees for the analysis of high dimensional flow cytometric data and demonstrated the feasibility of this method to reveal structural patterns in such data sets.
Cox, Ruth; Sanchez, Javier; Revie, Crawford W
2013-01-01
Global climate change is known to result in the emergence or re-emergence of some infectious diseases. Reliable methods to identify the infectious diseases of humans and animals and that are most likely to be influenced by climate are therefore required. Since different priorities will affect the decision to address a particular pathogen threat, decision makers need a standardised method of prioritisation. Ranking methods and Multi-Criteria Decision approaches provide such a standardised method and were employed here to design two different pathogen prioritisation tools. The opinion of 64 experts was elicited to assess the importance of 40 criteria that could be used to prioritise emerging infectious diseases of humans and animals in Canada. A weight was calculated for each criterion according to the expert opinion. Attributes were defined for each criterion as a transparent and repeatable method of measurement. Two different Multi-Criteria Decision Analysis tools were tested, both of which used an additive aggregation approach. These were an Excel spreadsheet tool and a tool developed in software 'M-MACBETH'. The tools were trialed on nine 'test' pathogens. Two different methods of criteria weighting were compared, one using fixed weighting values, the other using probability distributions to account for uncertainty and variation in expert opinion. The ranking of the nine pathogens varied according to the weighting method that was used. In both tools, using both weighting methods, the diseases that tended to rank the highest were West Nile virus, Giardiasis and Chagas, while Coccidioidomycosis tended to rank the lowest. Both tools are a simple and user friendly approach to prioritising pathogens according to climate change by including explicit scoring of 40 criteria and incorporating weighting methods based on expert opinion. They provide a dynamic interactive method that can help to identify pathogens for which a full risk assessment should be pursued.
Cox, Ruth; Sanchez, Javier; Revie, Crawford W.
2013-01-01
Global climate change is known to result in the emergence or re-emergence of some infectious diseases. Reliable methods to identify the infectious diseases of humans and animals and that are most likely to be influenced by climate are therefore required. Since different priorities will affect the decision to address a particular pathogen threat, decision makers need a standardised method of prioritisation. Ranking methods and Multi-Criteria Decision approaches provide such a standardised method and were employed here to design two different pathogen prioritisation tools. The opinion of 64 experts was elicited to assess the importance of 40 criteria that could be used to prioritise emerging infectious diseases of humans and animals in Canada. A weight was calculated for each criterion according to the expert opinion. Attributes were defined for each criterion as a transparent and repeatable method of measurement. Two different Multi-Criteria Decision Analysis tools were tested, both of which used an additive aggregation approach. These were an Excel spreadsheet tool and a tool developed in software ‘M-MACBETH’. The tools were trialed on nine ‘test’ pathogens. Two different methods of criteria weighting were compared, one using fixed weighting values, the other using probability distributions to account for uncertainty and variation in expert opinion. The ranking of the nine pathogens varied according to the weighting method that was used. In both tools, using both weighting methods, the diseases that tended to rank the highest were West Nile virus, Giardiasis and Chagas, while Coccidioidomycosis tended to rank the lowest. Both tools are a simple and user friendly approach to prioritising pathogens according to climate change by including explicit scoring of 40 criteria and incorporating weighting methods based on expert opinion. They provide a dynamic interactive method that can help to identify pathogens for which a full risk assessment should be pursued. PMID:23950868
ERIC Educational Resources Information Center
Boyle, Gregory J.; Furedy, John J.; Neumann, David L.; Westbury, H. Rae; Reiestad, Magnus
2010-01-01
The wording of university academic job advertisements can reflect a commitment to equity (affirmative action) as opposed to academic merit in hiring decisions. The method of judgemental content analysis was applied by having three judges rate 810 Australian tenure-stream advertisements on seven-point magnitude scales of equity and merit. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.; Podmore, Robin
2008-11-17
The focus of the present study is on improved training approaches to accelerate learning and improved methods for analyzing effectiveness of tools within a high-fidelity power grid simulated environment. A theory-based model has been developed to document and understand the mental processes that an expert power system operator uses when making critical decisions. The theoretical foundation for the method is based on the concepts of situation awareness, the methods of cognitive task analysis, and the naturalistic decision making (NDM) approach of Recognition Primed Decision Making. The method has been systematically explored and refined as part of a capability demonstration ofmore » a high-fidelity real-time power system simulator under normal and emergency conditions. To examine NDM processes, we analyzed transcripts of operator-to-operator conversations during the simulated scenario to reveal and assess NDM-based performance criteria. The results of the analysis indicate that the proposed framework can be used constructively to map or assess the Situation Awareness Level of the operators at each point in the scenario. We can also identify the mental models and mental simulations that the operators employ at different points in the scenario. This report documents the method, describes elements of the model, and provides appendices that document the simulation scenario and the associated mental models used by operators in the scenario.« less
NASA Astrophysics Data System (ADS)
Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza
2012-06-01
It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision-making in structural health monitoring systems. It is arguably true nonetheless, that decision-level is equally beneficial when applied to integrated health monitoring systems. Several decisions at low-levels of abstraction may be produced by different decision-makers; however, decision-level fusion is required at the final stage of the process to provide accurate assessment about the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. Thorough understanding of the characteristics of the decision-fusion methodologies is a crucial step for successful implementation of such decision-fusion systems. In this paper, we have presented the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial intelligent based methods. The theoretical basis and characteristics of these methodologies are explained and their performances are analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms are selected and their applications are extended to decisions fusion. Finally, fusion algorithms are developed based on the selected fusion methods and their performance are tested on decisions generated from synthetic data and from experimental data. Also in this paper, a modeling methodology, i.e. cloud model, for generating synthetic decisions is presented and used. Using the cloud model, both types of uncertainties; randomness and fuzziness, involved in real decision-making are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide for fair performance comparison of the selected decision-fusion algorithms. For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are reported in this paper.
Gunay, Osman; Toreyin, Behçet Ugur; Kose, Kivanc; Cetin, A Enis
2012-05-01
In this paper, an entropy-functional-based online adaptive decision fusion (EADF) framework is developed for image analysis and computer vision applications. In this framework, it is assumed that the compound algorithm consists of several subalgorithms, each of which yields its own decision as a real number centered around zero, representing the confidence level of that particular subalgorithm. Decision values are linearly combined with weights that are updated online according to an active fusion method based on performing entropic projections onto convex sets describing subalgorithms. It is assumed that there is an oracle, who is usually a human operator, providing feedback to the decision fusion method. A video-based wildfire detection system was developed to evaluate the performance of the decision fusion algorithm. In this case, image data arrive sequentially, and the oracle is the security guard of the forest lookout tower, verifying the decision of the combined algorithm. The simulation results are presented.
Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.
Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony
2005-04-01
Recently the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is to require the use of probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into a broader context of the requirements for decision making; to explain the general approach that was taken in its development; and to address each of the issues which have been raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis which can best meet these needs can then be identified. It will be argued that the guidance on dealing with uncertainty and, in particular, the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need fully to characterise decision uncertainty and to inform the research agenda will be identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.
More Than a Destination: Contraceptive Decision Making as a Journey.
Downey, Margaret Mary; Arteaga, Stephanie; Villaseñor, Elodia; Gomez, Anu Manchikanti
Contraceptive use is widely recognized as a means of reducing adverse health-related outcomes. However, dominant paradigms of contraceptive counseling may rely on a narrow definition of "evidence" (i.e., scientifically accurate but exclusive of individual women's experiences). Given increased enthusiasm for long-acting, reversible contraceptive methods, such paradigms may reinforce counseling that overprivileges effectiveness, particularly for groups considered at high risk of unintended pregnancy. This study investigates where and how women's experiences fit into the definition of evidence these counseling protocols use. Using a qualitative approach, this analysis draws on semistructured interviews with 38 young (ages 18-24) Black and Latina women. We use a qualitative content analysis approach, with coding categories derived directly from the textual data. Our analysis suggests that contraceptive decision making is an iterative, relational, reflective journey. Throughout contraceptive histories, participants described experiences evolving to create a foundation from which decision-making power was drawn. The same contraceptive-related decisions were revisited repeatedly, with knowledge accrued along the way. The cumulative experience of using, assigning meanings to, and developing values around contraception meant that young women experienced contraceptive decision making as a dynamic process. This journey creates a rich body of evidence that informs contraceptive decision making. To provide appropriate, acceptable, patient-centered family planning care, providers must engage with evidence grounded in women's expertise on their contraceptive use in addition to medically accurate data on method effectiveness, side effects, and contraindications. Copyright © 2017 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.
Direct Allocation Costing: Informed Management Decisions in a Changing Environment.
ERIC Educational Resources Information Center
Mancini, Cesidio G.; Goeres, Ernest R.
1995-01-01
It is argued that colleges and universities can use direct allocation costing to provide quantitative information needed for decision making. This method of analysis requires institutions to modify traditional ideas of costing, looking to the private sector for examples of accurate costing techniques. (MSE)
Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.
Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon
2013-04-15
The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria and often with conflicting objectives. However, an existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA analysis involves processing inputs to the PROMETHEE methods that are identifying the alternatives, defining the criteria, defining the criteria weights using analytical hierarchical process (AHP), defining the probability distribution of criteria weights, and conducting Monte Carlo Simulation (MCS); running the PROMETHEE methods using these inputs; and conducting a sensitivity analysis. A case study was presented to demonstrate the improved framework at a mine site. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on selecting alternatives. Copyright © 2013 Elsevier Ltd. All rights reserved.
2012-01-01
Background The importance of respecting women’s wishes to give birth close to their local community is supported by policy in many developed countries. However, persistent concerns about the quality and safety of maternity care in rural communities have been expressed. Safe childbirth in rural communities depends on good risk assessment and decision making as to whether and when the transfer of a woman in labour to an obstetric led unit is required. This is a difficult decision. Wide variation in transfer rates between rural maternity units have been reported suggesting different decision making criteria may be involved; furthermore, rural midwives and family doctors report feeling isolated in making these decisions and that staff in urban centres do not understand the difficulties they face. In order to develop more evidence based decision making strategies greater understanding of the way in which maternity care providers currently make decisions is required. This study aimed to examine how midwives working in urban and rural settings and obstetricians make intrapartum transfer decisions, and describe sources of variation in decision making. Methods The study was conducted in three stages. 1. 20 midwives and four obstetricians described factors influencing transfer decisions. 2. Vignettes depicting an intrapartum scenario were developed based on stage one data. 3. Vignettes were presented to 122 midwives and 12 obstetricians who were asked to assess the level of risk in each case and decide whether to transfer or not. Social judgment analysis was used to identify the factors and factor weights used in assessment. Signal detection analysis was used to identify participants’ ability to distinguish high and low risk cases and personal decision thresholds. Results When reviewing the same case information in vignettes midwives in different settings and obstetricians made very similar risk assessments. Despite this, a wide range of transfer decisions were still made, suggesting that the main source of variation in decision making and transfer rates is not in the assessment but the personal decision thresholds of clinicians. Conclusions Currently health care practice focuses on supporting or improving decision making through skills training and clinical guidelines. However, these methods alone are unlikely to be effective in improving consistency of decision making. PMID:23114289
NASA Astrophysics Data System (ADS)
Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong
2016-08-01
A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with inexactness of input data represented as intervals, (2) mitigating computational time due to the introduction of Monte Carlo sampling method, (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives are considered in each duration on the basis of four criteria. Results indicated that the most desirable management strategy lied in action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management.
Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin
2016-01-01
Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60% with a range of 3.47%–40.00% in different subgroup characteristics. The CHAID decision tree model has demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by logistic regression model including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified the fourth risk factor, the maternal educational level, with higher overall classification accuracy and larger area below the receiver operating characteristic curve. Conclusions: The infant anemic status in metropolis is complex and should be carefully considered by the basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of population with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
Green material selection for sustainability: A hybrid MCDM approach.
Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng
2017-01-01
Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern to the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines decision making and evaluation laboratory (DEMATEL), analytical network process (ANP), grey relational analysis (GRA) and technique for order performance by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application of rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the obtained final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection.
Green material selection for sustainability: A hybrid MCDM approach
Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng
2017-01-01
Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern to the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines decision making and evaluation laboratory (DEMATEL), analytical network process (ANP), grey relational analysis (GRA) and technique for order performance by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application of rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the obtained final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection. PMID:28498864
Probabilistic Exposure Analysis for Chemical Risk Characterization
Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.
2009-01-01
This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
Using Boosting Decision Trees in Gravitational Wave Searches triggered by Gamma-ray Bursts
NASA Astrophysics Data System (ADS)
Zuraw, Sarah; LIGO Collaboration
2015-04-01
The search for gravitational wave bursts requires the ability to distinguish weak signals from background detector noise. Gravitational wave bursts are characterized by their transient nature, making them particularly difficult to detect as they are similar to non-Gaussian noise fluctuations in the detector. The Boosted Decision Tree method is a powerful machine learning algorithm which uses Multivariate Analysis techniques to explore high-dimensional data sets in order to distinguish between gravitational wave signal and background detector noise. It does so by training with known noise events and simulated gravitational wave events. The method is tested using waveform models and compared with the performance of the standard gravitational wave burst search pipeline for Gamma-ray Bursts. It is shown that the method is able to effectively distinguish between signal and background events under a variety of conditions and over multiple Gamma-ray Burst events. This example demonstrates the usefulness and robustness of the Boosted Decision Tree and Multivariate Analysis techniques as a detection method for gravitational wave bursts. LIGO, UMass, PREP, NEGAP.
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. All human-error accidents are not caused by faulty decision-problem resolutions, but it appears to be one of the major areas of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of accident into a scenario which may used for review or the development of a training simulation.
Affordability Engineering: Bridging the Gap Between Design and Cost
NASA Technical Reports Server (NTRS)
Reeves, J. D.; DePasquale, Dominic; Lim, Evan
2010-01-01
Affordability is a commonly used term that takes on numerous meanings depending on the context used. Within conceptual design of complex systems, the term generally implies comparisons between expected costs and expected resources. This characterization is largely correct, but does not convey the many nuances and considerations that are frequently misunderstood and underappreciated. In the most fundamental sense, affordability and cost directly relate to engineering and programmatic decisions made throughout development programs. Systems engineering texts point out that there is a temporal aspect to this relationship, for decisions made earlier in a program dictate design implications much more so than those made during latter phases. This paper explores affordability engineering and its many sub-disciplines by discussing how it can be considered an additional engineering discipline to be balanced throughout the systems engineering and systems analysis processes. Example methods of multidisciplinary design analysis with affordability as a key driver will be discussed, as will example methods of data visualization, probabilistic analysis, and other ways of relating design decisions to affordability results.
NASA Astrophysics Data System (ADS)
Marović, Ivan; Hanak, Tomaš
2017-10-01
In the management of construction projects special attention should be given to the planning as the most important phase of decision-making process. Quality decision-making based on adequate and comprehensive collaboration of all involved stakeholders is crucial in project’s early stages. Fundamental reasons for existence of this problem arise from: specific conditions of construction industry (final products are inseparable from the location i.e. location has a strong influence of building design and its structural characteristics as well as technology which will be used during construction), investors’ desires and attitudes, and influence of socioeconomic and environment aspects. Considering all mentioned reasons one can conclude that selection of adequate construction site location for future investment is complex, low structured and multi-criteria problem. To take into account all the dimensions, the proposed model for selection of adequate site location is devised. The model is based on AHP (for designing the decision-making hierarchy) and PROMETHEE (for pairwise comparison of investment locations) methods. As a result of mixing basis feature of both methods, operational synergies can be achieved in multi-criteria decision analysis. Such gives the decision-maker a sense of assurance, knowing that if the procedure proposed by the presented model has been followed, it will lead to a rational decision, carefully and systematically thought out.
Portfolio Decisions and Brain Reactions via the CEAD method.
Majer, Piotr; Mohr, Peter N C; Heekeren, Hauke R; Härdle, Wolfgang K
2016-09-01
Decision making can be a complex process requiring the integration of several attributes of choice options. Understanding the neural processes underlying (uncertain) investment decisions is an important topic in neuroeconomics. We analyzed functional magnetic resonance imaging (fMRI) data from an investment decision study for stimulus-related effects. We propose a new technique for identifying activated brain regions: cluster, estimation, activation, and decision method. Our analysis is focused on clusters of voxels rather than voxel units. Thus, we achieve a higher signal-to-noise ratio within the unit tested and a smaller number of hypothesis tests compared with the often used General Linear Model (GLM). We propose to first conduct the brain parcellation by applying spatially constrained spectral clustering. The information within each cluster can then be extracted by the flexible dynamic semiparametric factor model (DSFM) dimension reduction technique and finally be tested for differences in activation between conditions. This sequence of Cluster, Estimation, Activation, and Decision admits a model-free analysis of the local fMRI signal. Applying a GLM on the DSFM-based time series resulted in a significant correlation between the risk of choice options and changes in fMRI signal in the anterior insula and dorsomedial prefrontal cortex. Additionally, individual differences in decision-related reactions within the DSFM time series predicted individual differences in risk attitudes as modeled with the framework of the mean-variance model.
Pattern Analysis and Decision Support for Cancer through Clinico-Genomic Profiles
NASA Astrophysics Data System (ADS)
Exarchos, Themis P.; Giannakeas, Nikolaos; Goletsis, Yorgos; Papaloukas, Costas; Fotiadis, Dimitrios I.
Advances in genome technology are playing a growing role in medicine and healthcare. With the development of new technologies and opportunities for large-scale analysis of the genome, genomic data have a clear impact on medicine. Cancer prognostics and therapeutics are among the first major test cases for genomic medicine, given that all types of cancer are related with genomic instability. In this paper we present a novel system for pattern analysis and decision support in cancer. The system integrates clinical data from electronic health records and genomic data. Pattern analysis and data mining methods are applied to these integrated data and the discovered knowledge is used for cancer decision support. Through this integration, conclusions can be drawn for early diagnosis, staging and cancer treatment.
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) computational (in this case agent-based) Model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.
Rousson, Valentin; Zumbrunn, Thomas
2011-06-22
Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and into a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome, and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Eventually, we expose that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
Evaluation and selection of decision-making methods to assess landfill mining projects.
Hermann, Robert; Baumgartner, Rupert J; Vorbach, Stefan; Ragossnig, Arne; Pomberger, Roland
2015-09-01
For the first time in Austria, fundamental technological and economic studies on recovering secondary raw materials from large landfills have been carried out, based on the 'LAMIS - Landfill Mining Austria' pilot project. A main focus of the research - and the subject of this article - was to develop an assessment or decision-making procedure that allows landfill owners to thoroughly examine the feasibility of a landfill mining project in advance. Currently there are no standard procedures that would sufficiently cover all the multiple-criteria requirements. The basic structure of the multiple attribute decision making process was used to narrow down on selection, conceptual design and assessment of suitable procedures. Along with a breakdown into preliminary and main assessment, the entire foundation required was created, such as definitions of requirements to an assessment method, selection and accurate description of the various assessment criteria and classification of the target system for the present 'landfill mining' vs. 'retaining the landfill in after-care' decision-making problem. Based on these studies, cost-utility analysis and the analytical-hierarchy process were selected from the range of multiple attribute decision-making procedures and examined in detail. Overall, both methods have their pros and cons with regard to their use for assessing landfill mining projects. Merging these methods or connecting them with single-criteria decision-making methods (like the net present value method) may turn out to be reasonable and constitute an appropriate assessment method. © The Author(s) 2015.
Dai, Huanping; Micheyl, Christophe
2012-11-01
Psychophysical "reverse-correlation" methods allow researchers to gain insight into the perceptual representations and decision weighting strategies of individual subjects in perceptual tasks. Although these methods have gained momentum, until recently their development was limited to experiments involving only two response categories. Recently, two approaches for estimating decision weights in m-alternative experiments have been put forward. One approach extends the two-category correlation method to m > 2 alternatives; the second uses multinomial logistic regression (MLR). In this article, the relative merits of the two methods are discussed, and the issues of convergence and statistical efficiency of the methods are evaluated quantitatively using Monte Carlo simulations. The results indicate that, for a range of values of the number of trials, the estimated weighting patterns are closer to their asymptotic values for the correlation method than for the MLR method. Moreover, for the MLR method, weight estimates for different stimulus components can exhibit strong correlations, making the analysis and interpretation of measured weighting patterns less straightforward than for the correlation method. These and other advantages of the correlation method, which include computational simplicity and a close relationship to other well-established psychophysical reverse-correlation methods, make it an attractive tool to uncover decision strategies in m-alternative experiments.
Applied Behavior Analysis and Statistical Process Control?
ERIC Educational Resources Information Center
Hopkins, B. L.
1995-01-01
Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
Employing Conjoint Analysis in Making Compensation Decisions.
ERIC Educational Resources Information Center
Kienast, Philip; And Others
1983-01-01
Describes a method employing conjoint analysis that generates utility/cost ratios for various elements of the compensation package. Its superiority to simple preference surveys is examined. Results of a study of the use of this method in fringe benefit planning in a large financial institution are reported. (Author/JAC)
A method for scenario-based risk assessment for robust aerospace systems
NASA Astrophysics Data System (ADS)
Thomas, Victoria Katherine
In years past, aircraft conceptual design centered around creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, allowing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time-period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created. The four steps involved in completing the modeling and simulation are: Alternative Solution Modeling, Uncertainty Quantification, Risk Assessment, and Risk Mitigation. Focus area three consists of Decision Support. In this area a decision support interface is created that allows for game playing between solution alternatives and risk mitigation. A multi-attribute decision making process is also implemented to aid in decision making. A demonstration problem inspired by Airbus' mid 1980s decision to break into the widebody long-range market was developed to illustrate the use of this method. The results showed that the method is able to capture additional types of risk than previous analysis methods, particularly at the early stages of aircraft design. It was also shown that the method can be used to help create a system that is robust to external environmental factors. The addition of an external environment risk analysis in the early stages of conceptual design can add another dimension to the analysis of feasibility and viability. The ability to take risk into account during the early stages of the design process can allow for the elimination of potentially feasible and viable but too-risky alternatives. The addition of a scenario-based analysis instead of a traditional probabilistic analysis enabled uncertainty to be effectively bound and examined over a variety of potential futures instead of only a single future. There is also potential for a product to be groomed for a specific future that one believes is likely to happen, or for a product to be steered during design as the future unfolds.
Gavilán, Rosa Elvira; Nebot, Carolina; Patyra, Ewelina; Miranda, Jose Manuel; Franco, Carlos Manuel; Cepeda, Alberto
2018-05-02
Taking into consideration the maximum level for coccidiostats included in the European Regulation 574/2011 and the fact that the presence of residues of sulfonamides in non-target feed is forbidden, the aim of this article is to present an analytical method based on HPLC-MS/MS for the identification and quantification of sulfonamides and coccidiostats in non-target feeds. The method was validated following Decision 2002/657/EC and recovery, repeatability, and reproducibility were within the limits stablished in the Decision. For coccidiostats, the decision limit and detection capability were calculated for the different species taking into account the maximum level allowed in Regulation 574/2011. The applicability of the method was investigated in 50 feed samples collected from dairy farms, 50 obtained from feed mills, and 10 interlaboratory feed samples.
Multiobjective Decision Analysis With Engineering and Business Applications
NASA Astrophysics Data System (ADS)
Wood, Eric
The last 15 years have witnessed the development of a large number of multiobjective decision techniques. Applying these techniques to environmental, engineering, and business problems has become well accepted. Multiobjective Decision Analysis With Engineering and Business Applications attempts to cover the main multiobjective techniques both in their mathematical treatment and in their application to real-world problems.The book is divided into 12 chapters plus three appendices. The main portion of the book is represented by chapters 3-6, Where the various approaches are identified, classified, and reviewed. Chapter 3 covers methods for generating nondominated solutions; chapter 4, continuous methods with prior preference articulation; chapter 5, discrete methods with prior preference articulation; and chapter 6, methods of progressive articulation of preferences. In these four chapters, close to 20 techniques are discussed with over 20 illustrative examples. This is both a strength and a weakness; the breadth of techniques and examples provide comprehensive coverage, but it is in a style too mathematically compact for most readers. By my count, the presentation of the 20 techniques in chapters 3-6 covered 85 pages, an average of about 4.5 pages each; therefore, a sound basis in linear algebra and linear programing is required if the reader hopes to follow the material. Chapter 2, “Concepts in Multiobjective Analysis,” also assumes such a background.
Mapping Their Road to University: First-Generation Students' Choice and Decision of University
ERIC Educational Resources Information Center
Kutty, Faridah Mydin
2014-01-01
This paper describes a qualitative case study that investigated the aspirations and decision-making process of first-generation students concerning university education. The participants comprised of 16 first-generation students at a research university. Data were obtained through interviews and analyzed using thematic analysis method. The…
Teacher Decision Making and the Implementation of an Integrated Arts Curriculum
ERIC Educational Resources Information Center
Devono, Mary K.
2009-01-01
This study examines the impact of teacher decision making upon implementing an integrated arts curriculum. Qualitative research methods, including interviews, lesson plan document analysis, and teacher discussion of student artwork comprise the research data from eight elementary classroom teachers. This study is designed to add to the descriptive…
Exploratory Honors Students: Academic Major and Career Decision Making
ERIC Educational Resources Information Center
Carduner, Jessie; Padak, Gary M.; Reynolds, Jamie
2011-01-01
In this qualitative study, we investigated the academic major and career decision-making processes of honors college students who were declared as "exploratory" students in their freshman year at a large, public, midwestern university. We used semistandardized interviews and document analysis as primary data collection methods to answer…
The Regional Vulnerability Assessment (ReV A) Program is an applied research program t,1at is focusing on using spatial information and model results to support environmental decision-making at regional- down to local-scales. Re VA has developed analysis and assessment methods to...
Marsh, Kevin; Caro, J Jaime; Zaiser, Erica; Heywood, James; Hamed, Alaa
2018-01-01
Patient preferences should be a central consideration in healthcare decision making. However, stories of patients challenging regulatory and reimbursement decisions has led to questions on whether patient voices are being considered sufficiently during those decision making processes. This has led some to argue that it is necessary to quantify patient preferences before they can be adequately considered. This study considers the lessons from the use of multi-criteria decision analysis (MCDA) for efforts to quantify patient preferences. It defines MCDA and summarizes the benefits it can provide to decision makers, identifies examples of MCDAs that have involved patients, and summarizes good practice guidelines as they relate to quantifying patient preferences. The guidance developed to support the use of MCDA in healthcare provide some useful considerations for the quantification of patient preferences, namely that researchers should give appropriate consideration to: the heterogeneity of patient preferences, and its relevance to decision makers; the cognitive challenges posed by different elicitation methods; and validity of the results they produce. Furthermore, it is important to consider how the relevance of these considerations varies with the decision being supported. The MCDA literature holds important lessons for how patient preferences should be quantified to support healthcare decision making.
Systems analysis - a new paradigm and decision support tools for the water framework directive
NASA Astrophysics Data System (ADS)
Bruen, M.
2007-06-01
In the early days of Systems Analysis the focus was on providing tools for optimisation, modelling and simulation for use by experts. Now there is a recognition of the need to develop and disseminate tools to assist in making decisions, negotiating compromises and communicating preferences that can easily be used by stakeholders without the need for specialist training. The Water Framework Directive (WFD) requires public participation and thus provides a strong incentive for progress in this direction. This paper places the new paradigm in the context of the classical one and discusses some of the new approaches which can be used in the implementation of the WFD. These include multi-criteria decision support methods suitable for environmental problems, adaptive management, cognitive mapping, social learning and cooperative design and group decision-making. Concordance methods (such as ELECTRE) and the Analytical Hierarchy Process (AHP) are identified as multi-criteria methods that can be readily integrated into Decision Support Systems (DSS) that deal with complex environmental issues with very many criteria, some of which are qualitative. The expanding use of the new paradigm provides an opportunity to observe and learn from the interaction of stakeholders with the new technology and to assess its effectiveness. This is best done by trained sociologists fully integrated into the processes. The WINCOMS research project is an example applied to the implementation of the WFD in Ireland.
A customisable framework for the assessment of therapies in the solution of therapy decision tasks.
Manjarrés Riesco, A; Martínez Tomás, R; Mira Mira, J
2000-01-01
In current medical research, a growing interest can be observed in the definition of a global therapy-evaluation framework which integrates considerations such as patients preferences and quality-of-life results. In this article, we propose the use of the research results in this domain as a source of knowledge in the design of support systems for therapy decision analysis, in particular with a view to application in oncology. We discuss the incorporation of these considerations in the definition of the therapy-assessment methods involved in the solution of a generic therapy decision task, described in the context of AI software development methodologies such as CommonKADS. The goal of the therapy decision task is to identify the ideal therapy, for a given patient, in accordance with a set of objectives of a diverse nature. The assessment methods applied are based either on data obtained from statistics or on the specific idiosyncrasies of each patient, as identified from their responses to a suite of psychological tests. In the analysis of the therapy decision task we emphasise the importance, from a methodological perspective, of using a rigorous approach to the modelling of domain ontologies and domain-specific data. To this aim we make extensive use of the semi-formal object oriented analysis notation UML to describe the domain level.
NASA Astrophysics Data System (ADS)
Radomski, Bartosz; Ćwiek, Barbara; Mróz, Tomasz M.
2017-11-01
The paper presents multicriteria decision aid analysis of the choice of PV installation providing electric energy to a public utility building. From the energy management point of view electricity obtained by solar radiation has become crucial renewable energy source. Application of PV installations may occur a profitable solution from energy, economic and ecologic point of view for both existing and newly erected buildings. Featured variants of PV installations have been assessed by multicriteria analysis based on ANP (Analytic Network Process) method. Technical, economical, energy and environmental criteria have been identified as main decision criteria. Defined set of decision criteria has an open character and can be modified in the dialog process between the decision-maker and the expert - in the present case, an expert in planning of development of energy supply systems. The proposed approach has been used to evaluate three variants of PV installation acceptable for existing educational building located in Poznań, Poland - the building of Faculty of Chemical Technology, Poznań University of Technology. Multi-criteria analysis based on ANP method and the calculation software Super Decisions has proven to be an effective tool for energy planning, leading to the indication of the recommended variant of PV installation in existing and newly erected public buildings. Achieved results show prospects and possibilities of rational renewable energy usage as complex solution to public utility buildings.
QTest: Quantitative Testing of Theories of Binary Choice.
Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
Castro Jaramillo, Hector Eduardo; Goetghebeur, Mireille; Moreno-Mattar, Ornella
2016-01-01
In 2012, Colombia experienced an important institutional transformation after the establishment of the Health Technology Assessment Institute (IETS), the disbandment of the Regulatory Commission for Health and the reassignment of reimbursement decision-making powers to the Ministry of Health and Social Protection (MoHSP). These dynamic changes provided the opportunity to test Multi-Criteria Decision Analysis (MCDA) for systematic and more transparent resource-allocation decision-making. During 2012 and 2013, the MCDA framework Evidence and Value: Impact on Decision Making (EVIDEM) was tested in Colombia. This consisted of a preparatory stage in which the investigators conducted literature searches and produced HTA reports for four interventions of interest, followed by a panel session with decision makers. This method was contrasted with a current approach used in Colombia for updating the publicly financed benefits package (POS), where narrative health technology assessment (HTA) reports are presented alongside comprehensive budget impact analyses (BIAs). Disease severity, size of population, and efficacy ranked at the top among fifteen preselected relevant criteria. MCDA estimates of technologies of interest ranged between 71 to 90 percent of maximum value. The ranking of technologies was sensitive to the methods used. Participants considered that a two-step approach including an MCDA template, complemented by a detailed BIA would be the best approach to assist decision-making in this context. Participants agreed that systematic priority setting should take place in Colombia. This work may serve as the basis to the MoHSP on its interest of setting up a systematic and more transparent process for resource-allocation decision-making.
Clayman, Marla L.; Makoul, Gregory; Harper, Maya M.; Koby, Danielle G.; Williams, Adam R.
2012-01-01
Objectives Describe the development and refinement of a scheme, Detail of Essential Elements and Participants in Shared Decision Making (DEEP-SDM), for coding Shared Decision Making (SDM) while reporting on the characteristics of decisions in a sample of patients with metastatic breast cancer. Methods The Evidence-Based Patient Choice instrument was modified to reflect Makoul and Clayman’s Integrative Model of SDM. Coding was conducted on video recordings of 20 women at the first visit with their medical oncologists after suspicion of disease progression. Noldus Observer XT v.8, a video coding software platform, was used for coding. Results The sample contained 80 decisions (range: 1-11), divided into 150 decision making segments. Most decisions were physician-led, although patients and physicians initiated similar numbers of decision-making conversations. Conclusion DEEP-SDM facilitates content analysis of encounters between women with metastatic breast cancer and their medical oncologists. Despite the fractured nature of decision making, it is possible to identify decision points and to code each of the Essential Elements of Shared Decision Making. Further work should include application of DEEP-SDM to non-cancer encounters. Practice Implications: A better understanding of how decisions unfold in the medical encounter can help inform the relationship of SDM to patient-reported outcomes. PMID:22784391
Blasco, H; Błaszczyński, J; Billaut, J C; Nadal-Desbarats, L; Pradat, P F; Devos, D; Moreau, C; Andres, C R; Emond, P; Corcia, P; Słowiński, R
2015-02-01
Metabolomics is an emerging field that includes ascertaining a metabolic profile from a combination of small molecules, and which has health applications. Metabolomic methods are currently applied to discover diagnostic biomarkers and to identify pathophysiological pathways involved in pathology. However, metabolomic data are complex and are usually analyzed by statistical methods. Although the methods have been widely described, most have not been either standardized or validated. Data analysis is the foundation of a robust methodology, so new mathematical methods need to be developed to assess and complement current methods. We therefore applied, for the first time, the dominance-based rough set approach (DRSA) to metabolomics data; we also assessed the complementarity of this method with standard statistical methods. Some attributes were transformed in a way allowing us to discover global and local monotonic relationships between condition and decision attributes. We used previously published metabolomics data (18 variables) for amyotrophic lateral sclerosis (ALS) and non-ALS patients. Principal Component Analysis (PCA) and Orthogonal Partial Least Square-Discriminant Analysis (OPLS-DA) allowed satisfactory discrimination (72.7%) between ALS and non-ALS patients. Some discriminant metabolites were identified: acetate, acetone, pyruvate and glutamine. The concentrations of acetate and pyruvate were also identified by univariate analysis as significantly different between ALS and non-ALS patients. DRSA correctly classified 68.7% of the cases and established rules involving some of the metabolites highlighted by OPLS-DA (acetate and acetone). Some rules identified potential biomarkers not revealed by OPLS-DA (beta-hydroxybutyrate). We also found a large number of common discriminating metabolites after Bayesian confirmation measures, particularly acetate, pyruvate, acetone and ascorbate, consistent with the pathophysiological pathways involved in ALS. DRSA provides a complementary method for improving the predictive performance of the multivariate data analysis usually used in metabolomics. This method could help in the identification of metabolites involved in disease pathogenesis. Interestingly, these different strategies mostly identified the same metabolites as being discriminant. The selection of strong decision rules with high value of Bayesian confirmation provides useful information about relevant condition-decision relationships not otherwise revealed in metabolomics data. Copyright © 2014 Elsevier Inc. All rights reserved.
The reliability of the pass/fail decision for assessments comprised of multiple components.
Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana
2015-01-01
The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When "conjunctively" combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg's Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts is relatively low with κ=0.49 or κ=0.47, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, it is not necessarily so that a reliable pass/fail decision has been reached - for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees that do not fulfill the minimum requirements.
Islam, Roosan; Weir, Charlene R; Jones, Makoto; Del Fiol, Guilherme; Samore, Matthew H
2015-11-30
Clinical experts' cognitive mechanisms for managing complexity have implications for the design of future innovative healthcare systems. The purpose of the study is to examine the constituents of decision complexity and explore the cognitive strategies clinicians use to control and adapt to their information environment. We used Cognitive Task Analysis (CTA) methods to interview 10 Infectious Disease (ID) experts at the University of Utah and Salt Lake City Veterans Administration Medical Center. Participants were asked to recall a complex, critical and vivid antibiotic-prescribing incident using the Critical Decision Method (CDM), a type of Cognitive Task Analysis (CTA). Using the four iterations of the Critical Decision Method, questions were posed to fully explore the incident, focusing in depth on the clinical components underlying the complexity. Probes were included to assess cognitive and decision strategies used by participants. The following three themes emerged as the constituents of decision complexity experienced by the Infectious Diseases experts: 1) the overall clinical picture does not match the pattern, 2) a lack of comprehension of the situation and 3) dealing with social and emotional pressures such as fear and anxiety. All these factors contribute to decision complexity. These factors almost always occurred together, creating unexpected events and uncertainty in clinical reasoning. Five themes emerged in the analyses of how experts deal with the complexity. Expert clinicians frequently used 1) watchful waiting instead of over- prescribing antibiotics, engaged in 2) theory of mind to project and simulate other practitioners' perspectives, reduced very complex cases into simple 3) heuristics, employed 4) anticipatory thinking to plan and re-plan events and consulted with peers to share knowledge, solicit opinions and 5) seek help on patient cases. The cognitive strategies to deal with decision complexity found in this study have important implications for design future decision support systems for the management of complex patients.
NASA Astrophysics Data System (ADS)
Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.
2015-04-01
This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving the complex multi-objective reservoir operation problems.
Multiattribute evaluation in formulary decision making as applied to calcium-channel blockers.
Schumacher, G E
1991-02-01
The use of multiattribute utility theory (MAUT) to make a formulary decision involving calcium-channel blockers (CCBs) is described. The MAUT method is a procedure for identifying, characterizing, and comparing the many variables that may affect a decision. Although applications in pharmacy have been infrequent, MAUT should be particularly appealing to formulary committees. The steps of the MAUT method are (1) determine the viewpoint of the decision makers, (2) identify the decision alternatives, (3) identify the attributes to be evaluated, (4) identify the factors to be used in evaluating the attributes, (5) establish a utility scale for scoring each factor, (6) transform the values for each factor to its utility scale, (7) determine weights for each attribute and factor, (8) calculate the total utility score for each decision alternative, (9) determine which decision alternative has the greatest total score, and (10) perform a sensitivity analysis. The viewpoint of a formulary committee in a health maintenance organization was simulated to develop a model for using the MAUT method to compare CCBs for single-agent therapy of chronic stable angina in ambulatory patients for one year. The attributes chosen were effectiveness, safety, patient acceptance, and cost and weighted 36%, 29%, 21%, and 14%, respectively, as contributions to the evaluation. The rank order of the decision alternatives was (1) generic verapamil, (2) brand-name verapamil, (3) diltiazem, (4) nicardipine, and (5) nifedipine. The MAUT method provides a standardized yet flexible format for comparing and selecting among formulary alternatives.
TREATMENT SWITCHING: STATISTICAL AND DECISION-MAKING CHALLENGES AND APPROACHES.
Latimer, Nicholas R; Henshall, Chris; Siebert, Uwe; Bell, Helen
2016-01-01
Treatment switching refers to the situation in a randomized controlled trial where patients switch from their randomly assigned treatment onto an alternative. Often, switching is from the control group onto the experimental treatment. In this instance, a standard intention-to-treat analysis does not identify the true comparative effectiveness of the treatments under investigation. We aim to describe statistical methods for adjusting for treatment switching in a comprehensible way for nonstatisticians, and to summarize views on these methods expressed by stakeholders at the 2014 Adelaide International Workshop on Treatment Switching in Clinical Trials. We describe three statistical methods used to adjust for treatment switching: marginal structural models, two-stage adjustment, and rank preserving structural failure time models. We draw upon discussion heard at the Adelaide International Workshop to explore the views of stakeholders on the acceptability of these methods. Stakeholders noted that adjustment methods are based on assumptions, the validity of which may often be questionable. There was disagreement on the acceptability of adjustment methods, but consensus that when these are used, they should be justified rigorously. The utility of adjustment methods depends upon the decision being made and the processes used by the decision-maker. Treatment switching makes estimating the true comparative effect of a new treatment challenging. However, many decision-makers have reservations with adjustment methods. These, and how they affect the utility of adjustment methods, require further exploration. Further technical work is required to develop adjustment methods to meet real world needs, to enhance their acceptability to decision-makers.
Lessard, Chantale; Contandriopoulos, André-Pierre; Beaulieu, Marie-Dominique
2009-01-01
Background A considerable amount of resource allocation decisions take place daily at the point of the clinical encounter; especially in primary care, where 80 percent of health problems are managed. Ignoring economic evaluation evidence in individual clinical decision-making may have a broad impact on the efficiency of health services. To date, almost all studies on the use of economic evaluation in decision-making used a quantitative approach, and few investigated decision-making at the clinical level. An important question is whether economic evaluations affect clinical practice. The project is an intervention research study designed to understand the role of economic evaluation in the decision-making process of family physicians (FPs). The contributions of the project will be from the perspective of Pierre Bourdieu's sociological theory. Methods/design A qualitative research strategy is proposed. We will conduct an embedded multiple-case study design. Ten case studies will be performed. The FPs will be the unit of analysis. The sampling strategies will be directed towards theoretical generalization. The 10 selected cases will be intended to reflect a diversity of FPs. There will be two embedded units of analysis: FPs (micro-level of analysis) and field of family medicine (macro-level of analysis). The division of the determinants of practice/behaviour into two groups, corresponding to the macro-structural level and the micro-individual level, is the basis for Bourdieu's mode of analysis. The sources of data collection for the micro-level analysis will be 10 life history interviews with FPs, documents and observational evidence. The sources of data collection for the macro-level analysis will be documents and 9 open-ended, focused interviews with key informants from medical associations and academic institutions. The analytic induction approach to data analysis will be used. A list of codes will be generated based on both the original framework and new themes introduced by the participants. We will conduct within-case and cross-case analyses of the data. Discussion The question of the role of economic evaluation in FPs' decision-making is of great interest to scientists, health care practitioners, managers and policy-makers, as well as to consultants, industry, and society. It is believed that the proposed research approach will make an original contribution to the development of knowledge, both empirical and theoretical. PMID:19210787
Methods Used to Support a Life Cycle of Complex Engineering Products
NASA Astrophysics Data System (ADS)
Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.; Eremenko, Andrey O.
2016-08-01
Management of companies involved in the design, development and operation of complex engineering products recognize the relevance of creating systems for product lifecycle management. A system of methods is proposed to support life cycles of complex engineering products, based on fuzzy set theory and hierarchical analysis. The system of methods serves to demonstrate the grounds for making strategic decisions in an environment of uncertainty, allows the use of expert knowledge, and provides interconnection of decisions at all phases of strategic management and all stages of a complex engineering product lifecycle.
Lindahl, Jonas; Danell, Rickard
The aim of this study was to provide a framework to evaluate bibliometric indicators as decision support tools from a decision making perspective and to examine the information value of early career publication rate as a predictor of future productivity. We used ROC analysis to evaluate a bibliometric indicator as a tool for binary decision making. The dataset consisted of 451 early career researchers in the mathematical sub-field of number theory. We investigated the effect of three different definitions of top performance groups-top 10, top 25, and top 50 %; the consequences of using different thresholds in the prediction models; and the added prediction value of information on early career research collaboration and publications in prestige journals. We conclude that early career performance productivity has an information value in all tested decision scenarios, but future performance is more predictable if the definition of a high performance group is more exclusive. Estimated optimal decision thresholds using the Youden index indicated that the top 10 % decision scenario should use 7 articles, the top 25 % scenario should use 7 articles, and the top 50 % should use 5 articles to minimize prediction errors. A comparative analysis between the decision thresholds provided by the Youden index which take consequences into consideration and a method commonly used in evaluative bibliometrics which do not take consequences into consideration when determining decision thresholds, indicated that differences are trivial for the top 25 and the 50 % groups. However, a statistically significant difference between the methods was found for the top 10 % group. Information on early career collaboration and publication strategies did not add any prediction value to the bibliometric indicator publication rate in any of the models. The key contributions of this research is the focus on consequences in terms of prediction errors and the notion of transforming uncertainty into risk when we are choosing decision thresholds in bibliometricly informed decision making. The significance of our results are discussed from the point of view of a science policy and management.
2014-01-01
Background Shared decision making represents a clinical consultation model where both clinician and service user are conceptualised as experts; information is shared bilaterally and joint treatment decisions are reached. Little previous research has been conducted to assess experience of this model in psychiatric practice. The current project therefore sought to explore the attitudes and experiences of consultant psychiatrists relating to shared decision making in the prescribing of antipsychotic medications. Methods A qualitative research design allowed the experiences and beliefs of participants in relation to shared decision making to be elicited. Purposive sampling was used to recruit participants from a range of clinical backgrounds and with varying length of clinical experience. A semi-structured interview schedule was utilised and was adapted in subsequent interviews to reflect emergent themes. Data analysis was completed in parallel with interviews in order to guide interview topics and to inform recruitment. A directed analysis method was utilised for interview analysis with themes identified being fitted to a framework identified from the research literature as applicable to the practice of shared decision making. Examples of themes contradictory to, or not adequately explained by, the framework were sought. Results A total of 26 consultant psychiatrists were interviewed. Participants expressed support for the shared decision making model, but also acknowledged that it was necessary to be flexible as the clinical situation dictated. A number of potential barriers to the process were perceived however: The commonest barrier was the clinician’s beliefs regarding the service users’ insight into their mental disorder, presented in some cases as an absolute barrier to shared decision making. In addition factors external to the clinician - service user relationship were identified as impacting on the decision making process, including; environmental factors, financial constraints as well as societal perceptions of mental disorder in general and antipsychotic medication in particular. Conclusions This project has allowed identification of potential barriers to shared decision making in psychiatric practice. Further work is necessary to observe the decision making process in clinical practice and also to identify means in which the identified barriers, in particular ‘lack of insight’, may be more effectively managed. PMID:24886121
Maimoun, Mousa; Madani, Kaveh; Reinhart, Debra
2016-04-15
Historically, the U.S. waste collection fleet was dominated by diesel-fueled waste collection vehicles (WCVs); the growing need for sustainable waste collection has urged decision makers to incorporate economically efficient alternative fuels, while mitigating environmental impacts. The pros and cons of alternative fuels complicate the decisions making process, calling for a comprehensive study that assesses the multiple factors involved. Multi-criteria decision analysis (MCDA) methods allow decision makers to select the best alternatives with respect to selection criteria. In this study, two MCDA methods, Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW), were used to rank fuel alternatives for the U.S. waste collection industry with respect to a multi-level environmental and financial decision matrix. The environmental criteria consisted of life-cycle emissions, tail-pipe emissions, water footprint (WFP), and power density, while the financial criteria comprised of vehicle cost, fuel price, fuel price stability, and fueling station availability. The overall analysis showed that conventional diesel is still the best option, followed by hydraulic-hybrid WCVs, landfill gas (LFG) sourced natural gas, fossil natural gas, and biodiesel. The elimination of the WFP and power density criteria from the environmental criteria ranked biodiesel 100 (BD100) as an environmentally better alternative compared to other fossil fuels (diesel and natural gas). This result showed that considering the WFP and power density as environmental criteria can make a difference in the decision process. The elimination of the fueling station and fuel price stability criteria from the decision matrix ranked fossil natural gas second after LFG-sourced natural gas. This scenario was found to represent the status quo of the waste collection industry. A sensitivity analysis for the status quo scenario showed the overall ranking of diesel and fossil natural gas to be more sensitive to changing fuel prices as compared to other alternatives. Copyright © 2016 Elsevier B.V. All rights reserved.
Mühlbacher, Axel C; Kaczynski, Anika
2016-02-01
Healthcare decision making is usually characterized by a low degree of transparency. The demand for transparent decision processes can be fulfilled only when assessment, appraisal and decisions about health technologies are performed under a systematic construct of benefit assessment. The benefit of an intervention is often multidimensional and, thus, must be represented by several decision criteria. Complex decision problems require an assessment and appraisal of various criteria; therefore, a decision process that systematically identifies the best available alternative and enables an optimal and transparent decision is needed. For that reason, decision criteria must be weighted and goal achievement must be scored for all alternatives. Methods of multi-criteria decision analysis (MCDA) are available to analyse and appraise multiple clinical endpoints and structure complex decision problems in healthcare decision making. By means of MCDA, value judgments, priorities and preferences of patients, insurees and experts can be integrated systematically and transparently into the decision-making process. This article describes the MCDA framework and identifies potential areas where MCDA can be of use (e.g. approval, guidelines and reimbursement/pricing of health technologies). A literature search was performed to identify current research in healthcare. The results showed that healthcare decision making is addressing the problem of multiple decision criteria and is focusing on the future development and use of techniques to weight and score different decision criteria. This article emphasizes the use and future benefit of MCDA.
Dynamic decision making for dam-break emergency management - Part 1: Theoretical framework
NASA Astrophysics Data System (ADS)
Peng, M.; Zhang, L. M.
2013-02-01
An evacuation decision for dam breaks is a very serious issue. A late decision may lead to loss of lives and properties, but a very early evacuation will incur unnecessary expenses. This paper presents a risk-based framework of dynamic decision making for dam-break emergency management (DYDEM). The dam-break emergency management in both time scale and space scale is introduced first to define the dynamic decision problem. The probability of dam failure is taken as a stochastic process and estimated using a time-series analysis method. The flood consequences are taken as functions of warning time and evaluated with a human risk analysis model (HURAM) based on Bayesian networks. A decision criterion is suggested to decide whether to evacuate the population at risk (PAR) or to delay the decision. The optimum time for evacuating the PAR is obtained by minimizing the expected total loss, which integrates the time-related probabilities and flood consequences. When a delayed decision is chosen, the decision making can be updated with available new information. A specific dam-break case study is presented in a companion paper to illustrate the application of this framework to complex dam-breaching problems.
Siebert, Uwe; Rochau, Ursula; Claxton, Karl
2013-01-01
Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now given the best available evidence (and its uncertainty)?; 2) Subsequent to the current decision and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the currently best available evidence, VoI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton et al. summarise: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VoI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis. To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples for the use of the decision-analytic VoI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders involved or interested in how to determine whether or not and, if so, which additional evidence is needed to make decisions. Copyright © 2013. Published by Elsevier GmbH.
Sullivan, Maura E; Ortega, Adrian; Wasserberg, Nir; Kaufman, Howard; Nyquist, Julie; Clark, Richard
2008-01-01
The purpose of this study was to determine if a cognitive task analysis (CTA) could capture steps and decision points that were not articulated during traditional teaching of a colonoscopy. Three expert colorectal surgeons were videotaped performing a colonoscopy. After the videotapes were transcribed, the experts participated in a CTA. A 26-step procedural checklist and a 16-step cognitive demands table was created by using information obtained in the CTA. The videotape transcriptions were transposed onto the procedural checklist and cognitive demands table to identify steps and decision points that were omitted during traditional teaching. Surgeon A described 50% of "how-to" steps and 43% of decision points. Surgeon B described 30% of steps and 25% of decisions. Surgeon C described 26% of steps and 38% of cognitive decisions. By using CTA, we were able to identify relevant steps and decision points that were omitted during traditional teaching by all 3 experts.
Decision support systems in water and wastewater treatment process selection and design: a review.
Hamouda, M A; Anderson, W B; Huck, P M
2009-01-01
The continuously changing drivers of the water treatment industry, embodied by rigorous environmental and health regulations and the challenge of emerging contaminants, necessitates the development of decision support systems for the selection of appropriate treatment trains. This paper explores a systematic approach to developing decision support systems, which includes the analysis of the treatment problem(s), knowledge acquisition and representation, and the identification and evaluation of criteria controlling the selection of optimal treatment systems. The objective of this article is to review approaches and methods used in decision support systems developed to aid in the selection, sequencing of unit processes and design of drinking water, domestic wastewater, and industrial wastewater treatment systems. Not surprisingly, technical considerations were found to dominate the logic of the developed systems. Most of the existing decision-support tools employ heuristic knowledge. It has been determined that there is a need to develop integrated decision support systems that are generic, usable and consider a system analysis approach.
Decision Making in Nursing Practice: A Concept Analysis.
Johansen, Mary L; O'Brien, Janice L
2016-01-01
The study aims to gain an understanding of the concept of decision making as it relates to the nurse practice environment. Rodgers' evolutionary method on concept analysis was used as a framework for the study of the concept. Articles from 1952 to 2014 were reviewed from PsycINFO, Medline, Cumulative Index to Nursing and Allied Health Literature (CINAHL), JSTOR, PubMed, and Science Direct. Findings suggest that decision making in the nurse practice environment is a complex process, integral to the nursing profession. The definition of decision making, and the attributes, antecedents, and consequences, are discussed. Contextual factors that influence the process are also discussed. An exemplar is presented to illustrate the concept. Decision making in the nurse practice environment is a dynamic conceptual process that may affect patient outcomes. Nurses need to call upon ways of knowing to make sound decisions and should be self-reflective in order to develop the process further in the professional arena. The need for further research is discussed. © 2015 Wiley Periodicals, Inc.
A qualitative analysis of parental decision making for childhood immunisation.
Marshall, S; Swerissen, H
1999-10-01
Achieving high rates of childhood immunisation is an important public health aim. Currently, however, immunisation uptake in Australia is disappointing. This qualitative study investigated the factors that influence parental decision making for childhood immunisation, and whether parents' experiences were better conceptualised in terms of static subjective expected utility models or in terms of a more dynamic process. Semi-structured in-depth interviews were conducted with 20 predominantly middle-class mothers--17 immunizers and three non-immunizers, in Melbourne, Victoria, in 1997. The data were then examined using thematic analysis. The results suggested that for these participants the decision regarding childhood immunization was better conceptualized as a dynamic process. The decision required initial consideration, implementation then maintenance. If a better understanding of immunization decision making is to be achieved, future studies must look beyond static frameworks. Clearer insight into the dynamic nature of immunization decision making should assist in the identification of more effective methods of promoting childhood immunization to groups at risk of non-compliance.
Ayyub, Bilal M
2014-02-01
The United Nations Office for Disaster Risk Reduction reported that the 2011 natural disasters, including the earthquake and tsunami that struck Japan, resulted in $366 billion in direct damages and 29,782 fatalities worldwide. Storms and floods accounted for up to 70% of the 302 natural disasters worldwide in 2011, with earthquakes producing the greatest number of fatalities. Average annual losses in the United States amount to about $55 billion. Enhancing community and system resilience could lead to massive savings through risk reduction and expeditious recovery. The rational management of such reduction and recovery is facilitated by an appropriate definition of resilience and associated metrics. In this article, a resilience definition is provided that meets a set of requirements with clear relationships to the metrics of the relevant abstract notions of reliability and risk. Those metrics also meet logically consistent requirements drawn from measure theory, and provide a sound basis for the development of effective decision-making tools for multihazard environments. Improving the resiliency of a system to meet target levels requires the examination of system enhancement alternatives in economic terms, within a decision-making framework. Relevant decision analysis methods would typically require the examination of resilience based on its valuation by society at large. The article provides methods for valuation and benefit-cost analysis based on concepts from risk analysis and management. © 2013 Society for Risk Analysis.
A multichannel decision-level fusion method for T wave alternans detection
NASA Astrophysics Data System (ADS)
Ye, Changrong; Zeng, Xiaoping; Li, Guojun; Shi, Chenyuan; Jian, Xin; Zhou, Xichuan
2017-09-01
Sudden cardiac death (SCD) is one of the most prominent causes of death among patients with cardiac diseases. Since ventricular arrhythmia is the main cause of SCD and it can be predicted by T wave alternans (TWA), the detection of TWA in the body-surface electrocardiograph (ECG) plays an important role in the prevention of SCD. But due to the multi-source nature of TWA, the nonlinear propagation through thorax, and the effects of the strong noises, the information from different channels is uncertain and competitive with each other. As a result, the single-channel decision is one-sided while the multichannel decision is difficult to reach a consensus on. In this paper, a novel multichannel decision-level fusion method based on the Dezert-Smarandache Theory is proposed to address this issue. Due to the redistribution mechanism for highly competitive information, higher detection accuracy and robustness are achieved. It also shows promise to low-cost instruments and portable applications by reducing demands for the synchronous sampling. Experiments on the real records from the Physikalisch-Technische Bundesanstalt diagnostic ECG database indicate that the performance of the proposed method improves by 12%-20% compared with the one-dimensional decision method based on the periodic component analysis.
NASA Astrophysics Data System (ADS)
Sampson, Enrique, Jr.
Many aerospace workers believe transferring work projects abroad has an erosive effect on the U.S. aerospace industry (Pritchard, 2002). This qualitative phenomenological study examines factors for outsourcing decisions and the perceived effects of outsourcing on U.S. aerospace workers. The research sample consists of aerospace industry leaders and nonleaders from the East Coast, Midwest, and West Coast of the United States. Moustakas' modified van Kaam methods of analysis (1994) and Decision Explorer analysis software were applied to the interview transcripts. Resultant data identified five core themes: communication, best value, opportunities, cost, and offset consideration. The themes provided the framework for a model designed to assist leaders in making effective decisions and communicating the benefits of those decisions when considering outsourcing of work projects.
Stang, Paul E; Ryan, Patrick B; Overhage, J Marc; Schuemie, Martijn J; Hartzema, Abraham G; Welebob, Emily
2013-10-01
Researchers using observational data to understand drug effects must make a number of analytic design choices that suit the characteristics of the data and the subject of the study. Review of the published literature suggests that there is a lack of consistency even when addressing the same research question in the same database. To characterize the degree of similarity or difference in the method and analysis choices made by observational database research experts when presented with research study scenarios. On-line survey using research scenarios on drug-effect studies to capture method selection and analysis choices that follow a dependency branching based on response to key questions. Voluntary participants experienced in epidemiological study design solicited for participation through registration on the Observational Medical Outcomes Partnership website, membership in particular professional organizations, or links in relevant newsletters. Description (proportion) of respondents selecting particular methods and making specific analysis choices based on individual drug-outcome scenario pairs. The number of questions/decisions differed based on stem questions of study design, time-at-risk, outcome definition, and comparator. There is little consistency across scenarios, by drug or by outcome of interest, in the decisions made for design and analyses in scenarios using large healthcare databases. The most consistent choice was the cohort study design but variability in the other critical decisions was common. There is great variation among epidemiologists in the design and analytical choices that they make when implementing analyses in observational healthcare databases. These findings confirm that it will be important to generate empiric evidence to inform these decisions and to promote a better understanding of the impact of standardization on research implementation.
Marsh, Kevin; Lanitis, Tereza; Neasham, David; Orfanos, Panagiotis; Caro, Jaime
2014-04-01
The objective of this study is to support those undertaking a multi-criteria decision analysis (MCDA) by reviewing the approaches adopted in healthcare MCDAs to date, how these varied with the objective of the study, and the lessons learned from this experience. Searches of EMBASE and MEDLINE identified 40 studies that provided 41 examples of MCDA in healthcare. Data were extracted on the objective of the study, methods employed, and decision makers' and study authors' reflections on the advantages and disadvantages of the methods. The recent interest in MCDA in healthcare is mirrored in an increase in the application of MCDA to evaluate healthcare interventions. Of the studies identified, the first was published in 1990, but more than half were published since 2011. They were undertaken in 18 different countries, and were designed to support investment (coverage and reimbursement), authorization, prescription, and research funding allocation decisions. Many intervention types were assessed: pharmaceuticals, public health interventions, screening, surgical interventions, and devices. Most used the value measurement approach and scored performance using predefined scales. Beyond these similarities, a diversity of different approaches were adopted, with only limited correspondence between the approach and the type of decision or product. Decision makers consulted as part of these studies, as well as the authors of the studies are positive about the potential of MCDA to improve decision making. Further work is required, however, to develop guidance for those undertaking MCDA.
The once and future application of cost-effectiveness analysis.
Berger, M L
1999-09-01
Cost-effectiveness analysis (CEA) is used by payers to make coverage decisions, by providers to make formulary decisions, and by large purchasers/employers and policymakers to choose health care performance measures. However, it continues to be poorly utilized in the marketplace because of overriding financial imperatives to control costs and a low apparent willingness to pay for quality. There is no obvious relationship between the cost-effectiveness of life-saving interventions and their application. Health care decision makers consider financial impact, safety, and effectiveness before cost-effectiveness. WHY IS CEA NOT MORE WIDELY APPLIED? Most health care providers have a short-term parochial financial perspective, whereas CEA takes a long-term view that captures all costs, benefits, and hazards, regardless of to whom they accrue. In addition, a history of poor standardization of methods, unrealistic expectations that CEA could answer fundamental ethical and political issues, and society's failure to accept the need for allocating scarce resources more judiciously, have contributed to relatively little use of the method by decision makers. HOW WILL CEA FIND GREATER UTILITY IN THE FUTURE? As decision makers take a longer-term view and understand that CEA can provide a quantitative perspective on important resource allocation decisions, including the distributional consequences of alternative choices, CEA is likely to find greater use. However, it must be embedded within a framework that promotes confidence in the social justice of health care decision making through ongoing dialogue about how the value of health and health care are defined.
A Z-number-based decision making procedure with ranking fuzzy numbers method
NASA Astrophysics Data System (ADS)
Mohamad, Daud; Shaharani, Saidatull Akma; Kamis, Nor Hanimah
2014-12-01
The theory of fuzzy set has been in the limelight of various applications in decision making problems due to its usefulness in portraying human perception and subjectivity. Generally, the evaluation in the decision making process is represented in the form of linguistic terms and the calculation is performed using fuzzy numbers. In 2011, Zadeh has extended this concept by presenting the idea of Z-number, a 2-tuple fuzzy numbers that describes the restriction and the reliability of the evaluation. The element of reliability in the evaluation is essential as it will affect the final result. Since this concept can still be considered as new, available methods that incorporate reliability for solving decision making problems is still scarce. In this paper, a decision making procedure based on Z-numbers is proposed. Due to the limitation of its basic properties, Z-numbers will be first transformed to fuzzy numbers for simpler calculations. A method of ranking fuzzy number is later used to prioritize the alternatives. A risk analysis problem is presented to illustrate the effectiveness of this proposed procedure.
Community resilience and decision theory challenges for catastrophic events.
Cox, Louis Anthony
2012-11-01
Extreme and catastrophic events pose challenges for normative models of risk management decision making. They invite development of new methods and principles to complement existing normative decision and risk analysis. Because such events are rare, it is difficult to learn about them from experience. They can prompt both too little concern before the fact, and too much after. Emotionally charged and vivid outcomes promote probability neglect and distort risk perceptions. Aversion to acting on uncertain probabilities saps precautionary action; moral hazard distorts incentives to take care; imperfect learning and social adaptation (e.g., herd-following, group-think) complicate forecasting and coordination of individual behaviors and undermine prediction, preparation, and insurance of catastrophic events. Such difficulties raise substantial challenges for normative decision theories prescribing how catastrophe risks should be managed. This article summarizes challenges for catastrophic hazards with uncertain or unpredictable frequencies and severities, hard-to-envision and incompletely described decision alternatives and consequences, and individual responses that influence each other. Conceptual models and examples clarify where and why new methods are needed to complement traditional normative decision theories for individuals and groups. For example, prospective and retrospective preferences for risk management alternatives may conflict; procedures for combining individual beliefs or preferences can produce collective decisions that no one favors; and individual choices or behaviors in preparing for possible disasters may have no equilibrium. Recent ideas for building "disaster-resilient" communities can complement traditional normative decision theories, helping to meet the practical need for better ways to manage risks of extreme and catastrophic events. © 2012 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Subagadis, Y. H.; Schütze, N.; Grundmann, J.
2014-09-01
The conventional methods used to solve multi-criteria multi-stakeholder problems are less strongly formulated, as they normally incorporate only homogeneous information at a time and suggest aggregating objectives of different decision-makers avoiding water-society interactions. In this contribution, Multi-Criteria Group Decision Analysis (MCGDA) using a fuzzy-stochastic approach has been proposed to rank a set of alternatives in water management decisions incorporating heterogeneous information under uncertainty. The decision making framework takes hydrologically, environmentally, and socio-economically motivated conflicting objectives into consideration. The criteria related to the performance of the physical system are optimized using multi-criteria simulation-based optimization, and fuzzy linguistic quantifiers have been used to evaluate subjective criteria and to assess stakeholders' degree of optimism. The proposed methodology is applied to find effective and robust intervention strategies for the management of a coastal hydrosystem affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. Preliminary results show that the MCGDA based on a fuzzy-stochastic approach gives useful support for robust decision-making and is sensitive to the decision makers' degree of optimism.
Bryan, Stirling; Williams, Iestyn; McIver, Shirley
2007-02-01
Resource scarcity is the raison d'être for the discipline of economics. Thus, the primary purpose of economic analysis is to help decision-makers when addressing problems arising due to the scarcity problem. The research reported here was concerned with how cost-effectiveness information is used by the National Institute for Health & Clinical Excellence (NICE) in national technology coverage decisions in the UK, and how its impact might be increased. The research followed a qualitative case study methodology with semi-structured interviews, supported by observation and analysis of secondary sources. Our research highlights that the technology appraisal function of NICE represents an important progression for the UK health economics community: new cost-effectiveness work is commissioned for each technology and that work directly informs national health policy. However, accountability in policy decisions necessitates that the information upon which decisions are based (including cost-effectiveness analysis, CEA) is accessible. This was found to be a serious problem and represents one of the main ongoing challenges. Other issues highlighted include perceived weaknesses in analysis methods and the poor alignment between the health maximisation objectives assumed in economic analyses and the range of other objectives facing decision-makers in reality. Copyright (c) 2006 John Wiley & Sons, Ltd.
2010-01-01
Background Decision support in health systems is a highly difficult task, due to the inherent complexity of the process and structures involved. Method This paper introduces a new hybrid methodology Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge in data analysis methods, and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) Bench-marking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) Case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases comparisons towards classical procedures using qualitative explicit prior knowledge were made. Bayesian predictive validity measures were used for comparison with expert panels results. Overall agreement was tested by Intraclass Correlation Coefficient in case "1" and kappa in both cases. Results EbCA is a new methodology composed by 6 steps:. 1) Data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here Implicit Knowledg -IK- might be elicited); 5) incorporation of elicited IK in PKB and repeat till a satisfactory solution; 6) post-processing results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess technical efficiency of small mental health areas and for case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA and had major implications for the decision making in both cases. Discussion This paper presents EbCA and shows the convenience of completing classical data analysis with PEK as a mean to extract relevant knowledge in complex health domains. One of the major benefits of EbCA is iterative elicitation of IK.. Both explicit and tacit or implicit expert knowledge are critical to guide the scientific analysis of very complex decisional problems as those found in health system research. PMID:20920289
Value focused rationality in AIDS policy.
Wenstøp, F; Magnus, P
2001-07-01
A health policy analysis to contain the effects of the HIV epidemic in Norway has been carried out. It was performed as a Multi Criteria Decision Analysis where participants in a decision panel used personal values to weight benefits and costs of alternative policies. The analysis is of particular interest since Norway afterwards adopted a controversial HIV policy: the authorities warned the general population against sexual relations with immigrants from countries south of Sahara. The policy might reap benefits, but a certain cost was to stigmatise that group. This paper describes the analysis and defends the underlying consequentialistic ethics against other approaches involving rule-based ethics and benefit-cost analysis. The main argument is based on Hume's insight that reason alone does not prompt action; values will always be involved and should therefore be more explicitly focused on. The paper concludes that we need an extended notion of rationality that includes well-foundedness of values. Decision-makers should try to reach an emotional equilibrium where their values concerning the issue at hand become stable. The paradigm of decision analysis provides useful methods to approach this situation, although it must be considered only an input to policy rather than something producing a final answer.
Malakooti, Behnam; Yang, Ziyong
2004-02-01
In many real-world problems, the range of consequences of different alternatives are considerably different. In addition, sometimes, selection of a group of alternatives (instead of only one best alternative) is necessary. Traditional decision making approaches treat the set of alternatives with the same method of analysis and selection. In this paper, we propose clustering alternatives into different groups so that different methods of analysis, selection, and implementation for each group can be applied. As an example, consider the selection of a group of functions (or tasks) to be processed by a group of processors. The set of tasks can be grouped according to their similar criteria, and hence, each cluster of tasks to be processed by a processor. The selection of the best alternative for each clustered group can be performed using existing methods; however, the process of selecting groups is different than the process of selecting alternatives within a group. We develop theories and procedures for clustering discrete multiple criteria alternatives. We also demonstrate how the set of alternatives is clustered into mutually exclusive groups based on 1) similar features among alternatives; 2) ideal (or most representative) alternatives given by the decision maker; and 3) other preferential information of the decision maker. The clustering of multiple criteria alternatives also has the following advantages. 1) It decreases the set of alternatives to be considered by the decision maker (for example, different decision makers are assigned to different groups of alternatives). 2) It decreases the number of criteria. 3) It may provide a different approach for analyzing multiple decision makers problems. Each decision maker may cluster alternatives differently, and hence, clustering of alternatives may provide a basis for negotiation. The developed approach is applicable for solving a class of telecommunication networks problems where a set of objects (such as routers, processors, or intelligent autonomous vehicles) are to be clustered into similar groups. Objects are clustered based on several criteria and the decision maker's preferences.
Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin
2012-05-30
This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool functions based on the combination of existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision maker's preferences. The tool's user-friendly interface helps guide the decision maker through the sequential steps for site selection, those steps namely being constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to increase the trade-off between ease-of-use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. For demonstrating the robustness of the new tool, a case study was planned and executed at Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features, which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to the selection process for the best locations of the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multicriteria analysis tool has already been implemented within the GIS based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
An Evaluation of Health Impact Assessments in the United States, 2011–2014
Charbonneau, Diana; Cahill, Carol; Dannenberg, Andrew L.
2015-01-01
Introduction The Center for Community Health and Evaluation conducted a 3-year evaluation to assess results of health impact assessments (HIAs) in the United States and to identify elements critical for their success. Methods The study used a retrospective, mixed-methods comparative case study design, including a literature review; site visits; interviews with investigators, stakeholders, and decision makers for 23 HIAs in 16 states that were completed from 2005 through 2013; and a Web-based survey of 144 HIA practitioners. Results Analysis of interviews with decision makers suggests HIAs can directly influence decisions in nonhealth-related sectors. HIAs may also influence changes beyond the decision target, build consensus and relationships among decision makers and their constituents, and give community members a stronger voice in decisions that affect them. Factors that may increase HIA success include care in choosing a project or policy to be examined’ selecting an appropriate team to conduct the HIA; engaging stakeholders and decision makers throughout the process; crafting clear, actionable recommendations; delivering timely, compelling messages to appropriate audiences; and using multiple dissemination methods. Challenges to successful HIAs include underestimating the level of effort required, political changes during the conduct of the HIA, accessing relevant local data, engaging vulnerable populations, and following up on recommendations. Conclusion Results of this study suggest HIAs are a useful tool to promote public health because they can influence decisions in nonhealth-related sectors, strengthen cross-sector collaborations, and raise awareness of health issues among decision makers. PMID:25695261
Topical Interface between Managerial Finance and Managerial Accounting.
ERIC Educational Resources Information Center
Williams, Norman C.; Swanson, G. A.
1988-01-01
The authors present a method to examine the interfaces between business courses for redundancy. The method is demonstrated by examining the content in managerial finance and managerial accounting courses. A decision model application of analysis, expert judgment, and synthesis are incorporated in this method. (CH)
NASA Astrophysics Data System (ADS)
Zhang, Wancheng; Xu, Yejun; Wang, Huimin
2016-01-01
The aim of this paper is to put forward a consensus reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of 2-tuple linguistic label are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes respectively. To eliminate the conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus reaching process. After conflict elimination, the final group preference can be obtained which will give the ranking of the alternatives. The model can effectively avoid information distortion which is occurred regularly in the linguistic information processing. Finally, an illustrative example is given to illustrate the application of the proposed method and comparative analysis with the existing methods are offered to show the advantages of the proposed method.
Kwon, Sun-Hong; Park, Sun-Kyeong; Byun, Ji-Hye; Lee, Eui-Kyung
2017-08-01
In order to look beyond the cost-effectiveness analysis, this study used a multi-criteria decision analysis (MCDA), which reflects societal values with regard to reimbursement decisions. This study aims to elicit societal preferences of the reimbursement decision criteria for anti cancer drugs from public and healthcare professionals. Eight criteria were defined based on a literature review and focus group sessions: disease severity, disease population size, pediatrics targets, unmet needs, innovation, clinical benefits, cost-effectiveness, and budget impacts. Using quota sampling and purposive sampling, 300 participants from the Korean public and 30 healthcare professionals were selected for the survey. Preferences were elicited using an analytic hierarchy process. Both groups rated clinical benefits the highest, followed by cost-effectiveness and disease severity, but differed with regard to disease population size and unmet needs. Innovation was the least preferred criteria. Clinical benefits and other social values should be reflected appropriately with cost-effectiveness in healthcare coverage. MCDA can be used to assess decision priorities for complicated health policy decisions, including reimbursement decisions. It is a promising method for making logical and transparent drug reimbursement decisions that consider a broad range of factors, which are perceived as important by relevant stakeholders.
Hospice decision making: diagnosis makes a difference.
Waldrop, Deborah P; Meeker, Mary Ann
2012-10-01
This study explored the process of decision making about hospice enrollment and identified factors that influence the timing of that decision. This study employed an exploratory, descriptive, cross-sectional design and was conducted using qualitative methods. In-depth in-person semistructured interviews were conducted with 36 hospice patients and 55 caregivers after 2 weeks of hospice care. The study was guided by Janis and Mann's conflict theory model (CTM) of decision making. Qualitative data analysis involved a directed content analysis using concepts from the CTM. A model of hospice enrollment decision making is presented. Concepts from the CTM (appraisal, surveying and weighing the alternatives, deliberations, adherence) were used as an organizing framework to illustrate the dynamics. Distinct differences were found by diagnosis (cancer vs. other chronic illness, e.g., heart and lung diseases) during the pre-encounter phase or before the hospice referral but no differences emerged during the post-encounter phase. Differences in decision making by diagnosis suggest the need for research about effective means for tailored communication in end-of-life decision making by type of illness. Recognition that decision making about hospice admission varies is important for clinicians who aim to provide person-centered and family-focused care.
A Multicriteria Decision Making Approach for Estimating the Number of Clusters in a Data Set
Peng, Yi; Zhang, Yong; Kou, Gang; Shi, Yong
2012-01-01
Determining the number of clusters in a data set is an essential yet difficult step in cluster analysis. Since this task involves more than one criterion, it can be modeled as a multiple criteria decision making (MCDM) problem. This paper proposes a multiple criteria decision making (MCDM)-based approach to estimate the number of clusters for a given data set. In this approach, MCDM methods consider different numbers of clusters as alternatives and the outputs of any clustering algorithm on validity measures as criteria. The proposed method is examined by an experimental study using three MCDM methods, the well-known clustering algorithm–k-means, ten relative measures, and fifteen public-domain UCI machine learning data sets. The results show that MCDM methods work fairly well in estimating the number of clusters in the data and outperform the ten relative measures considered in the study. PMID:22870181
NASA Astrophysics Data System (ADS)
Przybyła-Kasperek, M.; Wakulicz-Deja, A.
2017-05-01
Issues related to decision making based on dispersed knowledge are discussed in the paper. A dispersed decision-making system, which was proposed by the authors in previous articles, is used in this paper. In the system, a process of combining classifiers into coalitions with a negotiation stage is realized. The novelty that is proposed in this article involves the use of six different methods of conflict analysis that are known from the literature.The main purpose of the tests, which were performed, was to compare the methods from the two groups - the abstract level and the rank level. An additional aim was to investigate the efficiency of the fusion methods used in a dispersed system with a dynamic structure with the efficiency that is obtained when no structure is used. Conclusions were drawn that, in most cases, the use of a dispersed system improves the efficiency of inference.
Extraction of decision rules via imprecise probabilities
NASA Astrophysics Data System (ADS)
Abellán, Joaquín; López, Griselda; Garach, Laura; Castellano, Javier G.
2017-05-01
Data analysis techniques can be applied to discover important relations among features. This is the main objective of the Information Root Node Variation (IRNV) technique, a new method to extract knowledge from data via decision trees. The decision trees used by the original method were built using classic split criteria. The performance of new split criteria based on imprecise probabilities and uncertainty measures, called credal split criteria, differs significantly from the performance obtained using the classic criteria. This paper extends the IRNV method using two credal split criteria: one based on a mathematical parametric model, and other one based on a non-parametric model. The performance of the method is analyzed using a case study of traffic accident data to identify patterns related to the severity of an accident. We found that a larger number of rules is generated, significantly supplementing the information obtained using the classic split criteria.
[Screening for cancer - economic consideration and cost-effectiveness].
Kjellberg, Jakob
2014-06-09
Cost-effectiveness analysis has become an accepted method to evaluate medical technology and allocate scarce health-care resources. Published decision analyses show that screening for cancer in general is cost-effective. However, cost-effectiveness analyses are only as good as the clinical data and the results are sensitive to the chosen methods and perspective of the analysis.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A
2016-03-01
Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Using service data: tools for taking action.
1992-01-01
Program performance can be improved through use of a simple information system. The focus of the discussion is on analysis of service data, decision making, and program improvement. Clinic managers must collect and analyze their own data and not wait for supervisors from central or district offices to conduct thorough examination. Local decision making has the advantage of providing monitoring and modification of services in a timely way and in a way responsive to client needs. Information can be shared throughout all levels of local and central administration. The model for decision making is based on data collection, data analysis, decision making, action, evaluation, information dissemination, and feedback. Data need to be collected on types of clients (new acceptor or continuing user), type of contraceptive method and quantity dispensed, and how the client learned about the clinic. Supply data also needs to be collected on methods of contraceptives on hand, number dispensed by method to clients, and projected supplies; requests for additional supplies can thus be made in a timely and appropriate way. The basic clinic forms are the family planning (FP), client record, the client referral card, an appointment card, a complication card, a daily FP activity register, a FP activities worksheet, a monthly summary of FP activities, and a commodities request/receipt form. A suggestion sheet from users addresses issues about performance targets, continuing users, dropouts, staff motivation, and setting up a system. Suggestions are also provided on the importance of staff training in data collection and analysis and in creating awareness of the program's objectives. Discussion is directed to how to interpret new acceptor data and to look for patterns. A sample chart is provided of a summary of FP activities, possible interpretations, and possible actions to take. Analysis is given for new acceptor trends, contraceptive method mix, and sources of information. A short example illustrates how client card data and bar graphs of method mix by desire for no more children or for more children revealed that couples childbearing desires did not affect method choice.
Risk Decision Making Model for Reservoir Floodwater resources Utilization
NASA Astrophysics Data System (ADS)
Huang, X.
2017-12-01
Floodwater resources utilization(FRU) can alleviate the shortage of water resources, but there are risks. In order to safely and efficiently utilize the floodwater resources, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding safety discharge are estimated. Based on the principle of the minimum risk and the maximum benefit of FRU, a multi-objective risk decision making model for FRU is constructed. Probability theory and mathematical statistics method is selected to calculate the risk rate; C-D production function method and emergy analysis method is selected to calculate the risk benefit; the risk loss is related to flood inundation area and unit area loss; the multi-objective decision making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution of FRU of the Shilianghe reservoir is found by using the risk decision making model, and the validity and applicability of the model are verified.
NASA Astrophysics Data System (ADS)
Song, Jae Yeol; Chung, Eun-Sung
2017-04-01
This study developed a multi-criteria decision analysis framework to prioritize sites and types of low impact development (LID) practices. This framework was systemized as a web-based system coupled with the Storm Water Management Model (SWMM) from the Environmental Protection Agency (EPA). Using the technique for order of preference by similarity to ideal solution (TOPSIS), which is a type of multi-criteria decision-making (MCDM) method, multiple types and sites of designated LID practices are prioritized. This system is named the Water Management Prioritization Module (WMPM) and is an improved version of the Water Management Analysis Module (WMAM) that automatically generates and simulates multiple scenarios of LID design and planning parameters for a single LID type. WMPM can simultaneously determine the priority of multiple LID types and sites. In this study, an infiltration trench and permeable pavement were considered for multiple sub-catchments in South Korea to demonstrate the WMPM procedures. The TOPSIS method was manually incorporated to select the vulnerable target sub-catchments and to prioritize the LID planning scenarios for multiple types and sites considering socio-economic, hydrologic and physical-geometric factors. In this application, the Delphi method and entropy theory were used to determine the subjective and objective weights, respectively. Comparing the ranks derived by this system, two sub-catchments, S16 and S4, out of 18 were considered to be the most suitable places for installing an infiltration trench and porous pavement to reduce the peak and total flow, respectively, considering both socio-economic factors and hydrological effectiveness. WMPM can help policy-makers to objectively develop urban water plans for sustainable development. Keywords: Low Impact Development, Multi-Criteria Decision Analysis, SWMM, TOPSIS, Water Management Prioritization Module (WMPM)
"Knowledge" in English Primary Schools' Decision-Making about Sex and Relationships Education
ERIC Educational Resources Information Center
Wilder, Rachel
2018-01-01
Objective: To assess what kinds of knowledge policymakers in a sample of English primary schools utilised to make decisions about their school's sex and relationships education policy. Method: Semi-structured interviews were conducted with policymakers at three primary schools in the southwest of England, and documentary analysis of the schools'…
Designing Species Translocation Strategies When Populaton Growth and Future Funding Are Uncertain
Robert G. Haight; Katherine Ralls; Anthony M. Starfield
2000-01-01
When translocating individuals to found new populations, managers must allocate limited funds among release and monitoring activities that differ in method, cost, and probable result. In addition, managers are increasingly expected to justify the funding decisions they have made. Within the framework of decision analysis, we used robust optimization to formulate and...
Multicriteria decision analysis applied to Glen Canyon Dam
Flug, M.; Seitz, H.L.H.; Scott, J.F.
2000-01-01
Conflicts in water resources exist because river-reservoir systems are managed to optimize traditional benefits (e.g., hydropower and flood control), which are historically quantified in economic terms, whereas natural and environmental resources, including in-stream and riparian resources, are more difficult or impossible to quantify in economic terms. Multicriteria decision analysis provides a quantitative approach to evaluate resources subject to river basin management alternatives. This objective quantification method includes inputs from special interest groups, the general public, and concerned individuals, as well as professionals for each resource considered in a trade-off analysis. Multicriteria decision analysis is applied to resources and flow alternatives presented in the environmental impact statement for Glen Canyon Dam on the Colorado River. A numeric rating and priority-weighting scheme is used to evaluate 29 specific natural resource attributes, grouped into seven main resource objectives, for nine flow alternatives enumerated in the environmental impact statement.
NASA Astrophysics Data System (ADS)
Uhde, Britta; Andreas Hahn, W.; Griess, Verena C.; Knoke, Thomas
2015-08-01
Multi-criteria decision analysis (MCDA) is a decision aid frequently used in the field of forest management planning. It includes the evaluation of multiple criteria such as the production of timber and non-timber forest products and tangible as well as intangible values of ecosystem services (ES). Hence, it is beneficial compared to those methods that take a purely financial perspective. Accordingly, MCDA methods are increasingly popular in the wide field of sustainability assessment. Hybrid approaches allow aggregating MCDA and, potentially, other decision-making techniques to make use of their individual benefits and leading to a more holistic view of the actual consequences that come with certain decisions. This review is providing a comprehensive overview of hybrid approaches that are used in forest management planning. Today, the scientific world is facing increasing challenges regarding the evaluation of ES and the trade-offs between them, for example between provisioning and regulating services. As the preferences of multiple stakeholders are essential to improve the decision process in multi-purpose forestry, participatory and hybrid approaches turn out to be of particular importance. Accordingly, hybrid methods show great potential for becoming most relevant in future decision making. Based on the review presented here, the development of models for the use in planning processes should focus on participatory modeling and the consideration of uncertainty regarding available information.
Uhde, Britta; Hahn, W Andreas; Griess, Verena C; Knoke, Thomas
2015-08-01
Multi-criteria decision analysis (MCDA) is a decision aid frequently used in the field of forest management planning. It includes the evaluation of multiple criteria such as the production of timber and non-timber forest products and tangible as well as intangible values of ecosystem services (ES). Hence, it is beneficial compared to those methods that take a purely financial perspective. Accordingly, MCDA methods are increasingly popular in the wide field of sustainability assessment. Hybrid approaches allow aggregating MCDA and, potentially, other decision-making techniques to make use of their individual benefits and leading to a more holistic view of the actual consequences that come with certain decisions. This review is providing a comprehensive overview of hybrid approaches that are used in forest management planning. Today, the scientific world is facing increasing challenges regarding the evaluation of ES and the trade-offs between them, for example between provisioning and regulating services. As the preferences of multiple stakeholders are essential to improve the decision process in multi-purpose forestry, participatory and hybrid approaches turn out to be of particular importance. Accordingly, hybrid methods show great potential for becoming most relevant in future decision making. Based on the review presented here, the development of models for the use in planning processes should focus on participatory modeling and the consideration of uncertainty regarding available information.
NASA Astrophysics Data System (ADS)
Mohammed, Habiba Ibrahim; Majid, Zulkepli; Yusof, Norhakim Bin; Bello Yamusa, Yamusa
2018-03-01
Landfilling remains the most common systematic technique of solid waste disposal in most of the developed and developing countries. Finding a suitable site for landfill is a very challenging task. Landfill site selection process aims to provide suitable areas that will protect the environment and public health from pollution and hazards. Therefore, various factors such as environmental, physical, socio-economic, and geological criteria must be considered before siting any landfill. This makes the site selection process vigorous and tedious because it involves the processing of large amount of spatial data, rules and regulations from different agencies and also policy from decision makers. This allows the incorporation of conflicting objectives and decision maker preferences into spatial decision models. This paper particularly analyzes the multi-criteria evaluation (MCE) method of landfill site selection for solid waste management by means of literature reviews and surveys. The study will help the decision makers and waste management authorities to choose the most effective method when considering landfill site selection.
Development and initial evaluation of a treatment decision dashboard
2013-01-01
Background For many healthcare decisions, multiple alternatives are available with different combinations of advantages and disadvantages across several important dimensions. The complexity of current healthcare decisions thus presents a significant barrier to informed decision making, a key element of patient-centered care. Interactive decision dashboards were developed to facilitate decision making in Management, a field marked by similarly complicated choices. These dashboards utilize data visualization techniques to reduce the cognitive effort needed to evaluate decision alternatives and a non-linear flow of information that enables users to review information in a self-directed fashion. Theoretically, both of these features should facilitate informed decision making by increasing user engagement with and understanding of the decision at hand. We sought to determine if the interactive decision dashboard format can be successfully adapted to create a clinically realistic prototype patient decision aid suitable for further evaluation and refinement. Methods We created a computerized, interactive clinical decision dashboard and performed a pilot test of its clinical feasibility and acceptability using a multi-method analysis. The dashboard summarized information about the effectiveness, risks of side effects and drug-drug interactions, out-of-pocket costs, and ease of use of nine analgesic treatment options for knee osteoarthritis. Outcome evaluations included observations of how study participants utilized the dashboard, questionnaires to assess usability, acceptability, and decisional conflict, and an open-ended qualitative analysis. Results The study sample consisted of 25 volunteers - 7 men and 18 women - with an average age of 51 years. The mean time spent interacting with the dashboard was 4.6 minutes. Mean evaluation scores on scales ranging from 1 (low) to 7 (high) were: mechanical ease of use 6.1, cognitive ease of use 6.2, emotional difficulty 2.7, decision-aiding effectiveness 5.9, clarification of values 6.5, reduction in decisional uncertainty 6.1, and provision of decision-related information 6.0. Qualitative findings were similarly positive. Conclusions Interactive decision dashboards can be adapted for clinical use and have the potential to foster informed decision making. Additional research is warranted to more rigorously test the effectiveness and efficiency of patient decision dashboards for supporting informed decision making and other aspects of patient-centered care, including shared decision making. PMID:23601912
Singh, Sonal
2013-01-01
Background: Regulatory decision-making involves assessment of risks and benefits of medications at the time of approval or when relevant safety concerns arise with a medication. The Analytic Hierarchy Process (AHP) facilitates decision-making in complex situations involving tradeoffs by considering risks and benefits of alternatives. The AHP allows a more structured method of synthesizing and understanding evidence in the context of importance assigned to outcomes. Our objective is to evaluate the use of an AHP in a simulated committee setting selecting oral medications for type 2 diabetes. Methods: This study protocol describes the AHP in five sequential steps using a small group of diabetes experts representing various clinical disciplines. The first step will involve defining the goal of the decision and developing the AHP model. In the next step, we will collect information about how well alternatives are expected to fulfill the decision criteria. In the third step, we will compare the ability of the alternatives to fulfill the criteria and judge the importance of eight criteria relative to the decision goal of the optimal medication choice for type 2 diabetes. We will use pairwise comparisons to sequentially compare the pairs of alternative options regarding their ability to fulfill the criteria. In the fourth step, the scales created in the third step will be combined to create a summary score indicating how well the alternatives met the decision goal. The resulting scores will be expressed as percentages and will indicate the alternative medications' relative abilities to fulfill the decision goal. The fifth step will consist of sensitivity analyses to explore the effects of changing the estimates. We will also conduct a cognitive interview and process evaluation. Discussion: Multi-criteria decision analysis using the AHP will aid, support and enhance the ability of decision makers to make evidence-based informed decisions consistent with their values and preferences. PMID:24555077
2014-12-26
additive value function, which assumes mutual preferential independence (Gregory S. Parnell, 2013). In other words, this method can be used if the... additive value function method to calculate the aggregate value of multiple objectives. Step 9 : Sensitivity Analysis Once the global values are...gravity metric, the additive method will be applied using equal weights for each axis value function. Pilot Satisfaction (Usability) As expressed
Insurance Contract Analysis for Company Decision Support in Acquisition Management
NASA Astrophysics Data System (ADS)
Chernovita, H. P.; Manongga, D.; Iriani, A.
2017-01-01
One of company activities to retain their business is marketing the products which include in acquisition management to get new customers. Insurance contract analysis using ID3 to produce decision tree and rules to be decision support for the insurance company. The decision tree shows 13 rules that lead to contract termination claim. This could be a guide for the insurance company in acquisition management to prevent contract binding with these contract condition because it has a big chance for the customer to terminate their insurance contract before its expired date. As the result, there are several strong points that could be the determinant of contract termination such as: 1) customer age whether too young or too old, 2) long insurance period (above 10 years), 3) big insurance amount, 4) big amount of premium charges, and 5) payment method.
Jank, Louise; Martins, Magda Targa; Arsand, Juliana Bazzan; Campos Motta, Tanara Magalhães; Hoff, Rodrigo Barcellos; Barreto, Fabiano; Pizzolato, Tânia Mara
2015-11-01
A fast and simple method for residue analysis of the antibiotics classes of macrolides (erythromycin, azithromycin, tylosin, tilmicosin and spiramycin) and lincosamides (lincomycin and clindamycin) was developed and validated for cattle, swine and chicken muscle and for bovine milk. Sample preparation consists in a liquid-liquid extraction (LLE) with acetonitrile, followed by liquid chromatography-electrospray-tandem mass spectrometry analysis (LC-ESI-MS/MS), without the need of any additional clean-up steps. Chromatographic separation was achieved using a C18 column and a mobile phase composed by acidified acetonitrile and water. The method was fully validated according the criteria of the Commission Decision 2002/657/EC. Validation parameters such as limit of detection, limit of quantification, linearity, accuracy, repeatability, specificity, reproducibility, decision limit (CCα) and detection capability (CCβ) were evaluated. All calculated values met the established criteria. Reproducibility values, expressed as coefficient of variation, were all lower than 19.1%. Recoveries range from 60% to 107%. Limits of detection were from 5 to 25 µg kg(-1).The present method is able to be applied in routine analysis, with adequate time of analysis, low cost and a simple sample preparation protocol. Copyright © 2015. Published by Elsevier B.V.
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
QTest: Quantitative Testing of Theories of Binary Choice
Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495
Espinoza, Manuel Antonio; Manca, Andrea; Claxton, Karl; Sculpher, Mark
2018-02-01
Evidence about cost-effectiveness is increasingly being used to inform decisions about the funding of new technologies that are usually implemented as guidelines from centralized decision-making bodies. However, there is also an increasing recognition for the role of patients in determining their preferred treatment option. This paper presents a method to estimate the value of implementing a choice-based decision process using the cost-effectiveness analysis toolbox. This value is estimated for 3 alternative scenarios. First, it compares centralized decisions, based on population average cost-effectiveness, against a decision process based on patient choice. Second, it compares centralized decision based on patients' subgroups versus an individual choice-based decision process. Third, it compares a centralized process based on average cost-effectiveness against a choice-based process where patients choose according to a different measure of outcome to that used by the centralized decision maker. The methods are applied to a case study for the management of acute coronary syndrome. It is concluded that implementing a choice-based process of treatment allocation may be an option in collectively funded health systems. However, its value will depend on the specific health problem and the social values considered relevant to the health system. Copyright © 2017 John Wiley & Sons, Ltd.
Chen, Xiao Yu; Ma, Li Zhuang; Chu, Na; Zhou, Min; Hu, Yiyang
2013-01-01
Chronic hepatitis B (CHB) is a serious public health problem, and Traditional Chinese Medicine (TCM) plays an important role in the control and treatment for CHB. In the treatment of TCM, zheng discrimination is the most important step. In this paper, an approach based on CFS-GA (Correlation based Feature Selection and Genetic Algorithm) and C5.0 boost decision tree is used for zheng classification and progression in the TCM treatment of CHB. The CFS-GA performs better than the typical method of CFS. By CFS-GA, the acquired attribute subset is classified by C5.0 boost decision tree for TCM zheng classification of CHB, and C5.0 decision tree outperforms two typical decision trees of NBTree and REPTree on CFS-GA, CFS, and nonselection in comparison. Based on the critical indicators from C5.0 decision tree, important lab indicators in zheng progression are obtained by the method of stepwise discriminant analysis for expressing TCM zhengs in CHB, and alterations of the important indicators are also analyzed in zheng progression. In conclusion, all the three decision trees perform better on CFS-GA than on CFS and nonselection, and C5.0 decision tree outperforms the two typical decision trees both on attribute selection and nonselection.
Stock and option portfolio using fuzzy logic approach
NASA Astrophysics Data System (ADS)
Sumarti, Novriana; Wahyudi, Nanang
2014-03-01
Fuzzy Logic in decision-making process has been widely implemented in various problems in industries. It is the theory of imprecision and uncertainty that was not based on probability theory. Fuzzy Logic adds values of degree between absolute true and absolute false. It starts with and builds on a set of human language rules supplied by the user. The fuzzy systems convert these rules to their mathematical equivalents. This could simplify the job of the system designer and the computer, and results in much more accurate representations of the way systems behave in the real world. In this paper we examine the decision making process of stock and option trading by the usage of MACD (Moving Average Convergence Divergence) technical analysis and Option Pricing with Fuzzy Logic approach. MACD technical analysis is for the prediction of the trends of underlying stock prices, such as bearish (going downward), bullish (going upward), and sideways. By using Fuzzy C-Means technique and Mamdani Fuzzy Inference System, we define the decision output where the value of MACD is high then decision is "Strong Sell", and the value of MACD is Low then the decision is "Strong Buy". We also implement the fuzzification of the Black-Scholes option-pricing formula. The stock and options methods are implemented on a portfolio of one stock and its options. Even though the values of input data, such as interest rates, stock price and its volatility, cannot be obtain accurately, these fuzzy methods can give a belief degree of the calculated the Black-Scholes formula so we can make the decision on option trading. The results show the good capability of the methods in the prediction of stock price trends. The performance of the simulated portfolio for a particular period of time also shows good return.
Intelligent data analysis: the best approach for chronic heart failure (CHF) follow up management.
Mohammadzadeh, Niloofar; Safdari, Reza; Baraani, Alireza; Mohammadzadeh, Farshid
2014-08-01
Intelligent data analysis has ability to prepare and present complex relations between symptoms and diseases, medical and treatment consequences and definitely has significant role in improving follow-up management of chronic heart failure (CHF) patients, increasing speed and accuracy in diagnosis and treatments; reducing costs, designing and implementation of clinical guidelines. The aim of this article is to describe intelligent data analysis methods in order to improve patient monitoring in follow and treatment of chronic heart failure patients as the best approach for CHF follow up management. Minimum data set (MDS) requirements for monitoring and follow up of CHF patient designed in checklist with six main parts. All CHF patients that discharged in 2013 from Tehran heart center have been selected. The MDS for monitoring CHF patient status were collected during 5 months in three different times of follow up. Gathered data was imported in RAPIDMINER 5 software. Modeling was based on decision trees methods such as C4.5, CHAID, ID3 and k-Nearest Neighbors algorithm (K-NN) with k=1. Final analysis was based on voting method. Decision trees and K-NN evaluate according to Cross-Validation. Creating and using standard terminologies and databases consistent with these terminologies help to meet the challenges related to data collection from various places and data application in intelligent data analysis. It should be noted that intelligent analysis of health data and intelligent system can never replace cardiologists. It can only act as a helpful tool for the cardiologist's decisions making.
2012-01-01
Background Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Methods Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. Results After modification by dropping two indicators that showed poor measures in the measurement models’ quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of ‘transparency’, ‘participation’, ‘scientific rigour’ and ‘reasonableness’. Conclusions The structural equation model was among the first applications of PLS-PM to coverage decision-making. It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies. PMID:22856325
Gagnon, Marie-Pierre; Légaré, France; Fortin, Jean-Paul; Lamothe, Lise; Labrecque, Michel; Duplantie, Julie
2008-01-01
Background E-health is increasingly valued for supporting: 1) access to quality health care services for all citizens; 2) information flow and exchange; 3) integrated health care services and 4) interprofessional collaboration. Nevertheless, several questions remain on the factors allowing an optimal integration of e-health in health care policies, organisations and practices. An evidence-based integrated strategy would maximise the efficacy and efficiency of e-health implementation. However, decisions regarding e-health applications are usually not evidence-based, which can lead to a sub-optimal use of these technologies. This study aims at understanding factors influencing the application of scientific knowledge for an optimal implementation of e-health in the health care system. Methods A three-year multi-method study is being conducted in the Province of Quebec (Canada). Decision-making at each decisional level (political, organisational and clinical) are analysed based on specific approaches. At the political level, critical incidents analysis is being used. This method will identify how decisions regarding the implementation of e-health could be influenced or not by scientific knowledge. Then, interviews with key-decision-makers will look at how knowledge was actually used to support their decisions, and what factors influenced its use. At the organisational level, e-health projects are being analysed as case studies in order to explore the use of scientific knowledge to support decision-making during the implementation of the technology. Interviews with promoters, managers and clinicians will be carried out in order to identify factors influencing the production and application of scientific knowledge. At the clinical level, questionnaires are being distributed to clinicians involved in e-health projects in order to analyse factors influencing knowledge application in their decision-making. Finally, a triangulation of the results will be done using mixed methodologies to allow a transversal analysis of the results at each of the decisional levels. Results This study will identify factors influencing the use of scientific evidence and other types of knowledge by decision-makers involved in planning, financing, implementing and evaluating e-health projects. Conclusion These results will be highly relevant to inform decision-makers who wish to optimise the implementation of e-health in the Quebec health care system. This study is extremely relevant given the context of major transformations in the health care system where e-health becomes a must. PMID:18435853
A study on spatial decision support systems for HIV/AIDS prevention based on COM GIS technology
NASA Astrophysics Data System (ADS)
Yang, Kun; Luo, Huasong; Peng, Shungyun; Xu, Quanli
2007-06-01
Based on the deeply analysis of the current status and the existing problems of GIS technology applications in Epidemiology, this paper has proposed the method and process for establishing the spatial decision support systems of AIDS epidemic prevention by integrating the COM GIS, Spatial Database, GPS, Remote Sensing, and Communication technologies, as well as ASP and ActiveX software development technologies. One of the most important issues for constructing the spatial decision support systems of AIDS epidemic prevention is how to integrate the AIDS spreading models with GIS. The capabilities of GIS applications in the AIDS epidemic prevention have been described here in this paper firstly. Then some mature epidemic spreading models have also been discussed for extracting the computation parameters. Furthermore, a technical schema has been proposed for integrating the AIDS spreading models with GIS and relevant geospatial technologies, in which the GIS and model running platforms share a common spatial database and the computing results can be spatially visualized on Desktop or Web GIS clients. Finally, a complete solution for establishing the decision support systems of AIDS epidemic prevention has been offered in this paper based on the model integrating methods and ESRI COM GIS software packages. The general decision support systems are composed of data acquisition sub-systems, network communication sub-systems, model integrating sub-systems, AIDS epidemic information spatial database sub-systems, AIDS epidemic information querying and statistical analysis sub-systems, AIDS epidemic dynamic surveillance sub-systems, AIDS epidemic information spatial analysis and decision support sub-systems, as well as AIDS epidemic information publishing sub-systems based on Web GIS.
NASA Astrophysics Data System (ADS)
Malczewski, Jacek; Rinner, Claus
2005-06-01
Commonly used GIS combination operators such as Boolean conjunction/disjunction and weighted linear combination can be generalized to the ordered weighted averaging (OWA) family of operators. This multicriteria evaluation method allows decision-makers to define a decision strategy on a continuum between pessimistic and optimistic strategies. Recently, OWA has been introduced to GIS-based decision support systems. We propose to extend a previous implementation of OWA with linguistic quantifiers to simplify the definition of decision strategies and to facilitate an exploratory analysis of multiple criteria. The linguistic quantifier-guided OWA procedure is illustrated using a dataset for evaluating residential quality of neighborhoods in London, Ontario.
Generalisability in economic evaluation studies in healthcare: a review and case studies.
Sculpher, M J; Pang, F S; Manca, A; Drummond, M F; Golder, S; Urdahl, H; Davies, L M; Eastwood, A
2004-12-01
To review, and to develop further, the methods used to assess and to increase the generalisability of economic evaluation studies. Electronic databases. Methodological studies relating to economic evaluation in healthcare were searched. This included electronic searches of a range of databases, including PREMEDLINE, MEDLINE, EMBASE and EconLit, and manual searches of key journals. The case studies of a decision analytic model involved highlighting specific features of previously published economic studies related to generalisability and location-related variability. The case-study involving the secondary analysis of cost-effectiveness analyses was based on the secondary analysis of three economic studies using data from randomised trials. The factor most frequently cited as generating variability in economic results between locations was the unit costs associated with particular resources. In the context of studies based on the analysis of patient-level data, regression analysis has been advocated as a means of looking at variability in economic results across locations. These methods have generally accepted that some components of resource use and outcomes are exchangeable across locations. Recent studies have also explored, in cost-effectiveness analysis, the use of tests of heterogeneity similar to those used in clinical evaluation in trials. The decision analytic model has been the main means by which cost-effectiveness has been adapted from trial to non-trial locations. Most models have focused on changes to the cost side of the analysis, but it is clear that the effectiveness side may also need to be adapted between locations. There have been weaknesses in some aspects of the reporting in applied cost-effectiveness studies. These may limit decision-makers' ability to judge the relevance of a study to their specific situations. The case study demonstrated the potential value of multilevel modelling (MLM). Where clustering exists by location (e.g. centre or country), MLM can facilitate correct estimates of the uncertainty in cost-effectiveness results, and also a means of estimating location-specific cost-effectiveness. The review of applied economic studies based on decision analytic models showed that few studies were explicit about their target decision-maker(s)/jurisdictions. The studies in the review generally made more effort to ensure that their cost inputs were specific to their target jurisdiction than their effectiveness parameters. Standard sensitivity analysis was the main way of dealing with uncertainty in the models, although few studies looked explicitly at variability between locations. The modelling case study illustrated how effectiveness and cost data can be made location-specific. In particular, on the effectiveness side, the example showed the separation of location-specific baseline events and pooled estimates of relative treatment effect, where the latter are assumed exchangeable across locations. A large number of factors are mentioned in the literature that might be expected to generate variation in the cost-effectiveness of healthcare interventions across locations. Several papers have demonstrated differences in the volume and cost of resource use between locations, but few studies have looked at variability in outcomes. In applied trial-based cost-effectiveness studies, few studies provide sufficient evidence for decision-makers to establish the relevance or to adjust the results of the study to their location of interest. Very few studies utilised statistical methods formally to assess the variability in results between locations. In applied economic studies based on decision models, most studies either stated their target decision-maker/jurisdiction or provided sufficient information from which this could be inferred. There was a greater tendency to ensure that cost inputs were specific to the target jurisdiction than clinical parameters. Methods to assess generalisability and variability in economic evaluation studies have been discussed extensively in the literature relating to both trial-based and modelling studies. Regression-based methods are likely to offer a systematic approach to quantifying variability in patient-level data. In particular, MLM has the potential to facilitate estimates of cost-effectiveness, which both reflect the variation in costs and outcomes between locations and also enable the consistency of cost-effectiveness estimates between locations to be assessed directly. Decision analytic models will retain an important role in adapting the results of cost-effectiveness studies between locations. Recommendations for further research include: the development of methods of evidence synthesis which model the exchangeability of data across locations and allow for the additional uncertainty in this process; assessment of alternative approaches to specifying multilevel models to the analysis of cost-effectiveness data alongside multilocation randomised trials; identification of a range of appropriate covariates relating to locations (e.g. hospitals) in multilevel models; and further assessment of the role of econometric methods (e.g. selection models) for cost-effectiveness analysis alongside observational datasets, and to increase the generalisability of randomised trials.
Fuzzy methods in decision making process - A particular approach in manufacturing systems
NASA Astrophysics Data System (ADS)
Coroiu, A. M.
2015-11-01
We are living in a competitive environment, so we can see and understand that the most of manufacturing firms do the best in order to accomplish meeting demand, increasing quality, decreasing costs, and delivery rate. In present a stake point of interest is represented by the development of fuzzy technology. A particular approach for this is represented through the development of methodologies to enhance the ability to managed complicated optimization and decision making aspects involving non-probabilistic uncertainty with the reason to understand, development, and practice the fuzzy technologies to be used in fields such as economic, engineering, management, and societal problems. Fuzzy analysis represents a method for solving problems which are related to uncertainty and vagueness; it is used in multiple areas, such as engineering and has applications in decision making problems, planning and production. As a definition for decision making process we can use the next one: result of mental processes based upon cognitive process with a main role in the selection of a course of action among several alternatives. Every process of decision making can be represented as a result of a final choice and the output can be represented as an action or as an opinion of choice. Different types of uncertainty can be discovered in a wide variety of optimization and decision making problems related to planning and operation of power systems and subsystems. The mixture of the uncertainty factor in the construction of different models serves for increasing their adequacy and, as a result, the reliability and factual efficiency of decisions based on their analysis. Another definition of decision making process which came to illustrate and sustain the necessity of using fuzzy method: the decision making is an approach of choosing a strategy among many different projects in order to achieve some purposes and is formulated as three different models: high risk decision, usual risk decision and low risk decision - some specific formulas of fuzzy logic. The fuzzy set concepts has some certain parameterization features which are certain extensions of crisp and fuzzy relations respectively and have a rich potential for application to the decision making problems. The proposed approach from this paper presents advantages of fuzzy approach, in comparison with other paradigm and presents a particular way in which fuzzy logic can emerge in decision making process and planning process with implication, as a simulation, in manufacturing - involved in measuring performance of advanced manufacturing systems. Finally, an example is presented to illustrate our simulation.
Function allocation for humans and automation in the context of team dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey C. Joe; John O'Hara; Jacques Hugo
Within Human Factors Engineering, a decision-making process called function allocation (FA) is used during the design life cycle of complex systems to distribute the system functions, often identified through a functional requirements analysis, to all human and automated machine agents (or teammates) involved in controlling the system. Most FA methods make allocation decisions primarily by comparing the capabilities of humans and automation, but then also by considering secondary factors such as cost, regulations, and the health and safety of workers. The primary analysis of the strengths and weaknesses of humans and machines, however, is almost always considered in terms ofmore » individual human or machine capabilities. Yet, FA is fundamentally about teamwork in that the goal of the FA decision-making process is to determine what are the optimal allocations of functions among agents. Given this framing of FA, and the increasing use of and sophistication of automation, there are two related social psychological issues that current FA methods need to address more thoroughly. First, many principles for effective human teamwork are not considered as central decision points or in the iterative hypothesis and testing phase in most FA methods, when it is clear that social factors have numerous positive and negative effects on individual and team capabilities. Second, social psychological factors affecting team performance and can be difficult to translate to automated agents, and most FA methods currently do not account for this effect. The implications for these issues are discussed.« less
Profiling a Periodicals Collection
ERIC Educational Resources Information Center
Bolgiano, Christina E.; King, Mary Kathryn
1978-01-01
Libraries need solid information upon which to base collection development decisions. Specific evaluative methods for determining scope, access, and usefullness are described. Approaches used for data collection include analysis of interlibrary loan requests, comparison with major bibliographies, and analysis of accessibility through available…
Polisena, Julie; Garritty, Chantelle; Kamel, Chris; Stevens, Adrienne; Abou-Setta, Ahmed M
2015-03-14
Health care decision makers often need to make decisions in limited timeframes and cannot await the completion of a full evidence review. Rapid reviews (RRs), utilizing streamlined systematic review methods, are increasingly being used to synthesize the evidence with a shorter turnaround time. Our primary objective was to describe the processes and methods used internationally to produce RRs. In addition, we sought to understand the underlying themes associated with these programs. We contacted representatives of international RR programs from a broad realm in health care to gather information about the methods and processes used to produce RRs. The responses were summarized narratively to understand the characteristics associated with their processes and methods. The summaries were compared and contrasted to highlight potential themes and trends related to the different RR programs. Twenty-nine international RR programs were included in our sample with a broad organizational representation from academia, government, research institutions, and non-for-profit organizations. Responses revealed that the main objectives for RRs were to inform decision making with regards to funding health care technologies, services and policy, and program development. Central themes that influenced the methods used by RR programs, and report type and dissemination were the imposed turnaround time to complete a report, resources available, the complexity and sensitivity of the research topics, and permission from the requestor. Our study confirmed that there is no standard approach to conduct RRs. Differences in processes and methods across programs may be the result of the novelty of RR methods versus other types of evidence syntheses, customization of RRs for various decision makers, and definition of 'rapid' by organizations, since it impacts both the timelines and the evidence synthesis methods. Future research should investigate the impact of current RR methods and reporting to support informed health care decision making, the effects of potential biases that may be introduced with streamlined methods, and the effectiveness of RR reporting guidelines on transparency.
Owen, Rhiannon K; Cooper, Nicola J; Quinn, Terence J; Lees, Rosalind; Sutton, Alex J
2018-07-01
Network meta-analyses (NMA) have extensively been used to compare the effectiveness of multiple interventions for health care policy and decision-making. However, methods for evaluating the performance of multiple diagnostic tests are less established. In a decision-making context, we are often interested in comparing and ranking the performance of multiple diagnostic tests, at varying levels of test thresholds, in one simultaneous analysis. Motivated by an example of cognitive impairment diagnosis following stroke, we synthesized data from 13 studies assessing the efficiency of two diagnostic tests: Mini-Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA), at two test thresholds: MMSE <25/30 and <27/30, and MoCA <22/30 and <26/30. Using Markov chain Monte Carlo (MCMC) methods, we fitted a bivariate network meta-analysis model incorporating constraints on increasing test threshold, and accounting for the correlations between multiple test accuracy measures from the same study. We developed and successfully fitted a model comparing multiple tests/threshold combinations while imposing threshold constraints. Using this model, we found that MoCA at threshold <26/30 appeared to have the best true positive rate, whereas MMSE at threshold <25/30 appeared to have the best true negative rate. The combined analysis of multiple tests at multiple thresholds allowed for more rigorous comparisons between competing diagnostics tests for decision making. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Rule Extracting based on MCG with its Application in Helicopter Power Train Fault Diagnosis
NASA Astrophysics Data System (ADS)
Wang, M.; Hu, N. Q.; Qin, G. J.
2011-07-01
In order to extract decision rules for fault diagnosis from incomplete historical test records for knowledge-based damage assessment of helicopter power train structure. A method that can directly extract the optimal generalized decision rules from incomplete information based on GrC was proposed. Based on semantic analysis of unknown attribute value, the granule was extended to handle incomplete information. Maximum characteristic granule (MCG) was defined based on characteristic relation, and MCG was used to construct the resolution function matrix. The optimal general decision rule was introduced, with the basic equivalent forms of propositional logic, the rules were extracted and reduction from incomplete information table. Combined with a fault diagnosis example of power train, the application approach of the method was present, and the validity of this method in knowledge acquisition was proved.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention
Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-01-01
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background : Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods : Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results : The GA methodology triggered the emergence of ' common g round ' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusion s : Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ' common ground' among diverse stakeholders about health data and their implications.
Breast cancer patients' use of health information in decision making and coping.
Radina, M Elise; Ginter, Amanda C; Brandt, Julie; Swaney, Jan; Longo, Daniel R
2011-01-01
Breast cancer patients are some of today's most proactive healthcare consumers. Given how the media has highlighted the many issues involved in breast cancer, the unprecedented rise of consumerism in general, and the rise of healthcare consumerism specifically, a plethora of information on breast cancer has emerged in both scientific and popular media. It is timely and appropriate to consider breast cancer patients' perspectives regarding their search for health-related information and its use for treatment decision making and coping. The present study explores health information-seeking behaviors (passive and active), use of health information, sources of health information, and how such information is or is not used in patients' decision making about their treatment. This study used a secondary analysis of data regarding health information-seeking behaviors and treatment decisions from 2 separate but compatible qualitative data sets based on in-depth interviews with a total of 35 breast cancer survivors. Data were analyzed using thematic analysis. The majority of participating women were active information seekers (n = 26). Of the subsets of women who described their level of involvement in treatment decision making, the largest number (n = 13) reported a shared responsibility for decision making with their physician, and the next largest subset (n = 9) reported making the final decision themselves. These findings provide an enhanced understanding of the preferred source and method of delivery of information given health information-seeking behaviors and decision-making strategies. How health information is delivered in the future given these findings is discussed with specific attention to matching patient preferences with delivery methods to potentially enhance patients' sense of agency with regard to treatment, which has been shown to improve patients' psychosocial outcomes.
Research on AHP decision algorithms based on BP algorithm
NASA Astrophysics Data System (ADS)
Ma, Ning; Guan, Jianhe
2017-10-01
Decision making is the thinking activity that people choose or judge, and scientific decision-making has always been a hot issue in the field of research. Analytic Hierarchy Process (AHP) is a simple and practical multi-criteria and multi-objective decision-making method that combines quantitative and qualitative and can show and calculate the subjective judgment in digital form. In the process of decision analysis using AHP method, the rationality of the two-dimensional judgment matrix has a great influence on the decision result. However, in dealing with the real problem, the judgment matrix produced by the two-dimensional comparison is often inconsistent, that is, it does not meet the consistency requirements. BP neural network algorithm is an adaptive nonlinear dynamic system. It has powerful collective computing ability and learning ability. It can perfect the data by constantly modifying the weights and thresholds of the network to achieve the goal of minimizing the mean square error. In this paper, the BP algorithm is used to deal with the consistency of the two-dimensional judgment matrix of the AHP.
Saigal, Christopher S; Lambrechts, Sylvia I; Seenu Srinivasan, V; Dahan, Ely
2017-06-01
Many guidelines advocate the use of shared decision making for men with newly diagnosed prostate cancer. Decision aids can facilitate the process of shared decision making. Implicit in this approach is the idea that physicians understand which elements of treatment matter to patients. Little formal work exists to guide physicians or developers of decision aids in identifying these attributes. We use a mixed-methods technique adapted from marketing science, the 'Voice of the Patient', to describe and identify treatment elements of value for men with localized prostate cancer. We conducted semi-structured interviews with 30 men treated for prostate cancer in the urology clinic of the West Los Angeles Veteran Affairs Medical Center. We used a qualitative analysis to generate themes in patient narratives, and a quantitative approach, agglomerative hierarchical clustering, to identify attributes of treatment that were most relevant to patients making decisions about prostate cancer. We identified five 'traditional' prostate cancer treatment attributes: sexual dysfunction, bowel problems, urinary problems, lifespan, and others' opinions. We further identified two novel treatment attributes: a treatment's ability to validate a sense of proactivity and the need for an incision (separate from risks of surgery). Application of a successful marketing technique, the 'Voice of the Customer', in a clinical setting elicits non-obvious attributes that highlight unique patient decision-making concerns. Use of this method in the development of decision aids may result in more effective decision support.
Scan statistics with local vote for target detection in distributed system
NASA Astrophysics Data System (ADS)
Luo, Junhai; Wu, Qi
2017-12-01
Target detection has occupied a pivotal position in distributed system. Scan statistics, as one of the most efficient detection methods, has been applied to a variety of anomaly detection problems and significantly improves the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong, or the signal emitted by the target is weak. The local vote algorithm can also achieve higher target detection rate. After the local vote, the counting rule is always adopted for decision fusion. The counting rule does not use the information about the contiguity of sensors but takes all sensors' data into consideration, which makes the result undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method. This method combines scan statistics with local vote decision. Before scan statistics, each sensor executes local vote decision according to the data of its neighbors and its own. By combining the advantages of both, our method can obtain higher detection rate in low signal-to-noise ratio environment than the scan statistics. After the local vote decision, the distribution of sensors which have detected the target becomes more intensive. To make full use of local vote decision, we introduce a variable-step-parameter for the SSLV. It significantly shortens the scan period especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
Jovanovic, Sasa; Savic, Slobodan; Jovicic, Nebojsa; Boskovic, Goran; Djordjevic, Zorica
2016-09-01
Multi-criteria decision making (MCDM) is a relatively new tool for decision makers who deal with numerous and often contradictory factors during their decision making process. This paper presents a procedure to choose the optimal municipal solid waste (MSW) management system for the area of the city of Kragujevac (Republic of Serbia) based on the MCDM method. Two methods of multiple attribute decision making, i.e. SAW (simple additive weighting method) and TOPSIS (technique for order preference by similarity to ideal solution), respectively, were used to compare the proposed waste management strategies (WMS). Each of the created strategies was simulated using the software package IWM2. Total values for eight chosen parameters were calculated for all the strategies. Contribution of each of the six waste treatment options was valorized. The SAW analysis was used to obtain the sum characteristics for all the waste management treatment strategies and they were ranked accordingly. The TOPSIS method was used to calculate the relative closeness factors to the ideal solution for all the alternatives. Then, the proposed strategies were ranked in form of tables and diagrams obtained based on both MCDM methods. As shown in this paper, the results were in good agreement, which additionally confirmed and facilitated the choice of the optimal MSW management strategy. © The Author(s) 2016.
Allyn, Jérôme; Allou, Nicolas; Augustin, Pascal; Philip, Ivan; Martinet, Olivier; Belghiti, Myriem; Provenchere, Sophie; Montravers, Philippe; Ferdynus, Cyril
2017-01-01
Background The benefits of cardiac surgery are sometimes difficult to predict and the decision to operate on a given individual is complex. Machine Learning and Decision Curve Analysis (DCA) are recent methods developed to create and evaluate prediction models. Methods and finding We conducted a retrospective cohort study using a prospective collected database from December 2005 to December 2012, from a cardiac surgical center at University Hospital. The different models of prediction of mortality in-hospital after elective cardiac surgery, including EuroSCORE II, a logistic regression model and a machine learning model, were compared by ROC and DCA. Of the 6,520 patients having elective cardiac surgery with cardiopulmonary bypass, 6.3% died. Mean age was 63.4 years old (standard deviation 14.4), and mean EuroSCORE II was 3.7 (4.8) %. The area under ROC curve (IC95%) for the machine learning model (0.795 (0.755–0.834)) was significantly higher than EuroSCORE II or the logistic regression model (respectively, 0.737 (0.691–0.783) and 0.742 (0.698–0.785), p < 0.0001). Decision Curve Analysis showed that the machine learning model, in this monocentric study, has a greater benefit whatever the probability threshold. Conclusions According to ROC and DCA, machine learning model is more accurate in predicting mortality after elective cardiac surgery than EuroSCORE II. These results confirm the use of machine learning methods in the field of medical prediction. PMID:28060903
Strategies for Efficient Computation of the Expected Value of Partial Perfect Information
Madan, Jason; Ades, Anthony E.; Price, Malcolm; Maitland, Kathryn; Jemutai, Julie; Revill, Paul; Welton, Nicky J.
2014-01-01
Expected value of information methods evaluate the potential health benefits that can be obtained from conducting new research to reduce uncertainty in the parameters of a cost-effectiveness analysis model, hence reducing decision uncertainty. Expected value of partial perfect information (EVPPI) provides an upper limit to the health gains that can be obtained from conducting a new study on a subset of parameters in the cost-effectiveness analysis and can therefore be used as a sensitivity analysis to identify parameters that most contribute to decision uncertainty and to help guide decisions around which types of study are of most value to prioritize for funding. A common general approach is to use nested Monte Carlo simulation to obtain an estimate of EVPPI. This approach is computationally intensive, can lead to significant sampling bias if an inadequate number of inner samples are obtained, and incorrect results can be obtained if correlations between parameters are not dealt with appropriately. In this article, we set out a range of methods for estimating EVPPI that avoid the need for nested simulation: reparameterization of the net benefit function, Taylor series approximations, and restricted cubic spline estimation of conditional expectations. For each method, we set out the generalized functional form that net benefit must take for the method to be valid. By specifying this functional form, our methods are able to focus on components of the model in which approximation is required, avoiding the complexities involved in developing statistical approximations for the model as a whole. Our methods also allow for any correlations that might exist between model parameters. We illustrate the methods using an example of fluid resuscitation in African children with severe malaria. PMID:24449434
Hermans, C.; Erickson, J.; Noordewier, T.; Sheldon, A.; Kline, M.
2007-01-01
Multicriteria decision analysis (MCDA) provides a well-established family of decision tools to aid stakeholder groups in arriving at collective decisions. MCDA can also function as a framework for the social learning process, serving as an educational aid in decision problems characterized by a high level of public participation. In this paper, the framework and results of a structured decision process using the outranking MCDA methodology preference ranking organization method of enrichment evaluation (PROMETHEE) are presented. PROMETHEE is used to frame multi-stakeholder discussions of river management alternatives for the Upper White River of Central Vermont, in the northeastern United States. Stakeholders met over 10 months to create a shared vision of an ideal river and its services to communities, develop a list of criteria by which to evaluate river management alternatives, and elicit preferences to rank and compare individual and group preferences. The MCDA procedure helped to frame a group process that made stakeholder preferences explicit and substantive discussions about long-term river management possible. ?? 2006 Elsevier Ltd. All rights reserved.
Hermans, Caroline; Erickson, Jon; Noordewier, Tom; Sheldon, Amy; Kline, Mike
2007-09-01
Multicriteria decision analysis (MCDA) provides a well-established family of decision tools to aid stakeholder groups in arriving at collective decisions. MCDA can also function as a framework for the social learning process, serving as an educational aid in decision problems characterized by a high level of public participation. In this paper, the framework and results of a structured decision process using the outranking MCDA methodology preference ranking organization method of enrichment evaluation (PROMETHEE) are presented. PROMETHEE is used to frame multi-stakeholder discussions of river management alternatives for the Upper White River of Central Vermont, in the northeastern United States. Stakeholders met over 10 months to create a shared vision of an ideal river and its services to communities, develop a list of criteria by which to evaluate river management alternatives, and elicit preferences to rank and compare individual and group preferences. The MCDA procedure helped to frame a group process that made stakeholder preferences explicit and substantive discussions about long-term river management possible.
Patel, Vaishali N; Riley, Anne W
2007-10-01
A multiple case study was conducted to examine how staff in child out-of-home care programs used data from an Outcomes Management System (OMS) and other sources to inform decision-making. Data collection consisted of thirty-seven semi-structured interviews with clinicians, managers, and directors from two treatment foster care programs and two residential treatment centers, and individuals involved with developing the OMS; and observations of clinical and quality management meetings. Case study and grounded theory methodology guided analyses. The application of qualitative data analysis software is described. Results show that although staff rarely used data from the OMS, they did rely on other sources of systematically collected information to inform clinical, quality management, and program decisions. Analyses of how staff used these data suggest that improving the utility of OMS will involve encouraging staff to participate in data-based decision-making, and designing and implementing OMS in a manner that reflects how decision-making processes operate.
Azadeh, Ali; Zarrin, Mansour; Hamid, Mehdi
2016-02-01
Road accidents can be caused by different factors such as human factors. Quality of the decision-making process of drivers could have a considerable impact on preventing disasters. The main objective of this study is the analysis of factors affecting road accidents by considering the severity of accidents and decision-making styles of drivers. To this end, a novel framework is proposed based on data envelopment analysis (DEA) and statistical methods (SMs) to assess the factors affecting road accidents. In this study, for the first time, dominant decision-making styles of drivers with respect to severity of injuries are identified. To show the applicability of the proposed framework, this research employs actual data of more than 500 samples in Tehran, Iran. The empirical results indicate that the flexible decision style is the dominant style for both minor and severe levels of accident injuries. Copyright © 2015 Elsevier Ltd. All rights reserved.
Multi-criteria decision making--an approach to setting priorities in health care.
Nobre, F F; Trotta, L T; Gomes, L F
1999-12-15
The objective of this paper is to present a multi-criteria decision making (MCDM) approach to support public health decision making that takes into consideration the fuzziness of the decision goals and the behavioural aspect of the decision maker. The approach is used to analyse the process of health technology procurement in a University Hospital in Rio de Janeiro, Brazil. The method, known as TODIM, relies on evaluating alternatives with a set of decision criteria assessed using an ordinal scale. Fuzziness in generating criteria scores and weights or conflicts caused by dealing with different viewpoints of a group of decision makers (DMs) are solved using fuzzy set aggregation rules. The results suggested that MCDM models, incorporating fuzzy set approaches, should form a set of tools for public health decision making analysis, particularly when there are polarized opinions and conflicting objectives from the DM group. Copyright 1999 John Wiley & Sons, Ltd.
78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
... Citations on Methods for Cumulative Risk Assessment AGENCY: Office of the Science Advisor, Environmental... requesting information and citations on approaches and methods for the planning, analysis, assessment, and... approaches to understanding risks to human health and the environment. For example, in Science & Decisions...
Lifecycle analysis for automobiles: Uses and limitations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaines, L.; Stodolsky, F.
There has been a recent trend toward the use of lifecycle analysis (LCA) as a decision-making tool for the automotive industry. However, the different practitioners` methods and assumptions vary widely, as do the interpretations put on the results. The lack of uniformity has been addressed by such groups as the Society of Environmental Toxicology and Chemistry (SETAC) and the International Organization for Standardization (ISO), but standardization of methodology assures neither meaningful results nor appropriate use of the results. This paper examines the types of analysis that are possible for automobiles, explains possible pitfalls to be avoided, and suggests ways thatmore » LCA can be used as part of a rational decision-making procedure. The key to performing a useful analysis is identification of the factors that will actually be used in making the decision. It makes no sense to analyze system energy use in detail if direct financial cost is to be the decision criterion. Criteria may depend on who is making the decision (consumer, producer, regulator). LCA can be used to track system performance for a variety of criteria, including emissions, energy use, and monetary costs, and these can have spatial and temporal distributions. Because optimization of one parameter is likely to worsen another, identification of trade-offs is an important function of LCA.« less
The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism
NASA Technical Reports Server (NTRS)
Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.
2006-01-01
This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.
NASA Astrophysics Data System (ADS)
He, Xin; Frey, Eric C.
2007-03-01
Binary ROC analysis has solid decision-theoretic foundations and a close relationship to linear discriminant analysis (LDA). In particular, for the case of Gaussian equal covariance input data, the area under the ROC curve (AUC) value has a direct relationship to the Hotelling trace. Many attempts have been made to extend binary classification methods to multi-class. For example, Fukunaga extended binary LDA to obtain multi-class LDA, which uses the multi-class Hotelling trace as a figure-of-merit, and we have previously developed a three-class ROC analysis method. This work explores the relationship between conventional multi-class LDA and three-class ROC analysis. First, we developed a linear observer, the three-class Hotelling observer (3-HO). For Gaussian equal covariance data, the 3- HO provides equivalent performance to the three-class ideal observer and, under less strict conditions, maximizes the signal to noise ratio for classification of all pairs of the three classes simultaneously. The 3-HO templates are not the eigenvectors obtained from multi-class LDA. Second, we show that the three-class Hotelling trace, which is the figureof- merit in the conventional three-class extension of LDA, has significant limitations. Third, we demonstrate that, under certain conditions, there is a linear relationship between the eigenvectors obtained from multi-class LDA and 3-HO templates. We conclude that the 3-HO based on decision theory has advantages both in its decision theoretic background and in the usefulness of its figure-of-merit. Additionally, there exists the possibility of interpreting the two linear features extracted by the conventional extension of LDA from a decision theoretic point of view.
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these have been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis,more » and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.« less
Dotson, G Scott; Hudson, Naomi L; Maier, Andrew
2015-01-01
Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management.
Dotson, G. Scott; Hudson, Naomi L.; Maier, Andrew
2016-01-01
Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management. PMID:26312660
Holt, S; Bertelli, G; Humphreys, I; Valentine, W; Durrani, S; Pudney, D; Rolles, M; Moe, M; Khawaja, S; Sharaiha, Y; Brinkworth, E; Whelan, S; Jones, S; Bennett, H; Phillips, C J
2013-01-01
Background: Tumour gene expression analysis is useful in predicting adjuvant chemotherapy benefit in early breast cancer patients. This study aims to examine the implications of routine Oncotype DX testing in the UK. Methods: Women with oestrogen receptor positive (ER+), pNO or pN1mi breast cancer were assessed for adjuvant chemotherapy and subsequently offered Oncotype DX testing, with changes in chemotherapy decisions recorded. A subset of patients completed questionnaires about their uncertainties regarding chemotherapy decisions pre- and post-testing. All patients were asked to complete a diary of medical interactions over the next 6 months, from which economic data were extracted to model the cost-effectiveness of testing. Results: Oncotype DX testing resulted in changes in chemotherapy decisions in 38 of 142 (26.8%) women, with 26 of 57 (45.6%) spared chemotherapy and 12 of 85 (14.1%) requiring chemotherapy when not initially recommended (9.9% reduction overall). Decision conflict analysis showed that Oncotype DX testing increased patients' confidence in treatment decision making. Economic analysis showed that routine Oncotype DX testing costs £6232 per quality-adjusted life year gained. Conclusion: Oncotype DX decreased chemotherapy use and increased confidence in treatment decision making in patients with ER+ early-stage breast cancer. Based on these findings, Oncotype DX is cost-effective in the UK setting. PMID:23695023
Vego, Goran; Kucar-Dragicević, Savka; Koprivanac, Natalija
2008-11-01
The efficiency of providing a waste management system in the coastal part of Croatia consisting of four Dalmatian counties has been modelled. Two multi-criteria decision-making (MCDM) methods, PROMETHEE and GAIA, were applied to assist with the systematic analysis and evaluation of the alternatives. The analysis covered two levels; first, the potential number of waste management centres resulting from possible inter-county cooperation; and second, the relative merits of siting of waste management centres in the coastal or hinterland zone was evaluated. The problem was analysed according to several criteria; and ecological, economic, social and functional criteria sets were identified as relevant to the decision-making process. The PROMETHEE and GAIA methods were shown to be efficient tools for analysing the problem considered. Such an approach provided new insights to waste management planning at the strategic level, and gave a reason for rethinking some of the existing strategic waste management documents in Croatia.
Boutkhoum, Omar; Hanine, Mohamed; Agouti, Tarik; Tikniouine, Abdessadek
2015-01-01
In this paper, we examine the issue of strategic industrial location selection in uncertain decision making environments for implanting new industrial corporation. In fact, the industrial location issue is typically considered as a crucial factor in business research field which is related to many calculations about natural resources, distributors, suppliers, customers, and most other things. Based on the integration of environmental, economic and social decisive elements of sustainable development, this paper presents a hybrid decision making model combining fuzzy multi-criteria analysis with analytical capabilities that OLAP systems can provide for successful and optimal industrial location selection. The proposed model mainly consists in three stages. In the first stage, a decision-making committee has been established to identify the evaluation criteria impacting the location selection process. In the second stage, we develop fuzzy AHP software based on the extent analysis method to assign the importance weights to the selected criteria, which allows us to model the linguistic vagueness, ambiguity, and incomplete knowledge. In the last stage, OLAP analysis integrated with multi-criteria analysis employs these weighted criteria as inputs to evaluate, rank and select the strategic industrial location for implanting new business corporation in the region of Casablanca, Morocco. Finally, a sensitivity analysis is performed to evaluate the impact of criteria weights and the preferences given by decision makers on the final rankings of strategic industrial locations.
ERIC Educational Resources Information Center
Pampaloni, Andrea M.
2010-01-01
Colleges and universities rely on their image to attract new members. This study focuses on the decision-making process of students preparing to apply to college. High school students were surveyed at college open houses to identify the factors most influential to their college application decision-making. A multi-methods analysis found that…
Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses
ERIC Educational Resources Information Center
Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…
Enrollment Projection within a Decision-Making Framework.
ERIC Educational Resources Information Center
Armstrong, David F.; Nunley, Charlene Wenckowski
1981-01-01
Two methods used to predict enrollment at Montgomery College in Maryland are compared and evaluated, and the administrative context in which they are used is considered. The two methods involve time series analysis (curve fitting) and indicator techniques (yield from components). (MSE)
Simplified web-based decision support method for traffic management and work zone analysis.
DOT National Transportation Integrated Search
2017-01-01
Traffic congestion mitigation is one of the key challenges that transportation planners and operations engineers face when planning for construction and maintenance activities. There is a wide variety of approaches and methods that address work zone ...
Simplified web-based decision support method for traffic management and work zone analysis.
DOT National Transportation Integrated Search
2015-06-01
Traffic congestion mitigation is one of the key challenges that transportation planners and operations engineers face when : planning for construction and maintenance activities. There is a wide variety of approaches and methods that address work : z...
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what the metrics qualify information as being "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken in account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theoryas a practical method to apply formal scientific decision theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance.We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
Marsh, Kevin; IJzerman, Maarten; Thokala, Praveen; Baltussen, Rob; Boysen, Meindert; Kaló, Zoltán; Lönngren, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Devlin, Nancy
2016-01-01
Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making. A set of techniques, known under the collective heading, multiple criteria decision analysis (MCDA), are useful for this purpose. In 2014, ISPOR established an Emerging Good Practices Task Force. The task force's first report defined MCDA, provided examples of its use in health care, described the key steps, and provided an overview of the principal methods of MCDA. This second task force report provides emerging good-practice guidance on the implementation of MCDA to support health care decisions. The report includes: a checklist to support the design, implementation and review of an MCDA; guidance to support the implementation of the checklist; the order in which the steps should be implemented; illustrates how to incorporate budget constraints into an MCDA; provides an overview of the skills and resources, including available software, required to implement MCDA; and future research directions. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Mudali, D; Teune, L K; Renken, R J; Leenders, K L; Roerdink, J B T M
2015-01-01
Medical imaging techniques like fluorodeoxyglucose positron emission tomography (FDG-PET) have been used to aid in the differential diagnosis of neurodegenerative brain diseases. In this study, the objective is to classify FDG-PET brain scans of subjects with Parkinsonian syndromes (Parkinson's disease, multiple system atrophy, and progressive supranuclear palsy) compared to healthy controls. The scaled subprofile model/principal component analysis (SSM/PCA) method was applied to FDG-PET brain image data to obtain covariance patterns and corresponding subject scores. The latter were used as features for supervised classification by the C4.5 decision tree method. Leave-one-out cross validation was applied to determine classifier performance. We carried out a comparison with other types of classifiers. The big advantage of decision tree classification is that the results are easy to understand by humans. A visual representation of decision trees strongly supports the interpretation process, which is very important in the context of medical diagnosis. Further improvements are suggested based on enlarging the number of the training data, enhancing the decision tree method by bagging, and adding additional features based on (f)MRI data.
Caird, Jeff K; Edwards, Christopher J; Creaser, Janet I; Horrey, William J
2005-01-01
A modified version of the flicker technique to induce change blindness was used to examine the effects of time constraints on decision-making accuracy at intersections on a total of 62 young (18-25 years), middle-aged (26-64 years), young-old (65-73 years), and old-old (74+ years) drivers. Thirty-six intersection photographs were manipulated so that one object (i.e., pedestrian, vehicle, sign, or traffic control device) in the scene would change when the images were alternated for either 5 or 8 s using the modified flicker method. Young and middle-aged drivers made significantly more correct decisions than did young-old and old-old drivers. Logistic regression analysis of the data indicated that age and/or time were significant predictors of decision performance in 14 of the 36 intersections. Actual or potential applications of this research include driving assessment and crash investigation.
NASA Astrophysics Data System (ADS)
Gumilar, I.; Rizal, A.; Sriati; Setiawan Putra, R.
2018-04-01
This research aim was to analyzed process of decision making of purchasing ornamental freshwater fish at Peta Street, Bandung City and Analyzed what factors are driving consumers to buy freshwater fish Peta Street. The method used in this research is case study with rating scale and rank spearman analysis. The sampling technique is the accidental random sampling method consist of 30 respondents. The consumer’s decision making process consist of five stages, namely the recognition of needs, information searching, alternative evaluation, process of purchasing, and the evaluation of results. The results showed that at the stage of recognition of needs the motivation of purchasing freshwater fish because respondents are very fond of ornamental freshwater fish, at the stage of information search, the information sources are from the print media and friends or neighborhood. At the stage of alternative evaluation, the reason consumers buy ornamental freshwater fish because the quality of good products. The stage of purchasing decision process consumers bought 1-5 fish with frequency of purchase 1 time per month. The evaluation of results of post-purchasing consumers feel very satisfied with the fish products and the price is very affordable. To observe the factors that influence purchasing motivation of consumers, spearman rank test is the method. The results showed that the quality and price of the product are the factors that most influence the purchase decision of ornamental freshwater fish with the range of student-t value 3,968 and 2,107.
Evidence synthesis for decision making 7: a reviewer's checklist.
Ades, A E; Caldwell, Deborah M; Reken, Stefanie; Welton, Nicky J; Sutton, Alex J; Dias, Sofia
2013-07-01
This checklist is for the review of evidence syntheses for treatment efficacy used in decision making based on either efficacy or cost-effectiveness. It is intended to be used for pairwise meta-analysis, indirect comparisons, and network meta-analysis, without distinction. It does not generate a quality rating and is not prescriptive. Instead, it focuses on a series of questions aimed at revealing the assumptions that the authors of the synthesis are expecting readers to accept, the adequacy of the arguments authors advance in support of their position, and the need for further analyses or sensitivity analyses. The checklist is intended primarily for those who review evidence syntheses, including indirect comparisons and network meta-analyses, in the context of decision making but will also be of value to those submitting syntheses for review, whether to decision-making bodies or journals. The checklist has 4 main headings: A) definition of the decision problem, B) methods of analysis and presentation of results, C) issues specific to network synthesis, and D) embedding the synthesis in a probabilistic cost-effectiveness model. The headings and implicit advice follow directly from the other tutorials in this series. A simple table is provided that could serve as a pro forma checklist.
Satomi, Junichiro; Ghaibeh, A Ammar; Moriguchi, Hiroki; Nagahiro, Shinji
2015-07-01
The severity of clinical signs and symptoms of cranial dural arteriovenous fistulas (DAVFs) are well correlated with their pattern of venous drainage. Although the presence of cortical venous drainage can be considered a potential predictor of aggressive DAVF behaviors, such as intracranial hemorrhage or progressive neurological deficits due to venous congestion, accurate statistical analyses are currently not available. Using a decision tree data mining method, the authors aimed at clarifying the predictability of the future development of aggressive behaviors of DAVF and at identifying the main causative factors. Of 266 DAVF patients, 89 were eligible for analysis. Under observational management, 51 patients presented with intracranial hemorrhage/infarction during the follow-up period. The authors created a decision tree able to assess the risk for the development of aggressive DAVF behavior. Evaluated by 10-fold cross-validation, the decision tree's accuracy, sensitivity, and specificity were 85.28%, 88.33%, and 80.83%, respectively. The tree shows that the main factor in symptomatic patients was the presence of cortical venous drainage. In its absence, the lesion location determined the risk of a DAVF developing aggressive behavior. Decision tree analysis accurately predicts the future development of aggressive DAVF behavior.
Park, Myonghwa; Choi, Sora; Shin, A Mi; Koo, Chul Hoi
2013-02-01
The purpose of this study was to develop a prediction model for the characteristics of older adults with depression using the decision tree method. A large dataset from the 2008 Korean Elderly Survey was used and data of 14,970 elderly people were analyzed. Target variable was depression and 53 input variables were general characteristics, family & social relationship, economic status, health status, health behavior, functional status, leisure & social activity, quality of life, and living environment. Data were analyzed by decision tree analysis, a data mining technique using SPSS Window 19.0 and Clementine 12.0 programs. The decision trees were classified into five different rules to define the characteristics of older adults with depression. Classification & Regression Tree (C&RT) showed the best prediction with an accuracy of 80.81% among data mining models. Factors in the rules were life satisfaction, nutritional status, daily activity difficulty due to pain, functional limitation for basic or instrumental daily activities, number of chronic diseases and daily activity difficulty due to disease. The different rules classified by the decision tree model in this study should contribute as baseline data for discovering informative knowledge and developing interventions tailored to these individual characteristics.
MacGillivray, Brian H
2017-08-01
In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.
Service users' experiences of participation in decision making in mental health services.
Dahlqvist Jönsson, P; Schön, U-K; Rosenberg, D; Sandlund, M; Svedberg, P
2015-11-01
Despite the potential positive impact of shared decision making on service users knowledge and experience of decisional conflict, there is a lack of qualitative research on how participation in decision making is promoted from the perspective of psychiatric service users. This study highlights the desire of users to participate more actively in decision making and demonstrates that persons with SMI struggle to be seen as competent and equal partners in decision-making situations. Those interviewed did not feel that their strengths, abilities and needs were being recognized, which resulted in a feeling of being omitted from involvement in decision-making situations. The service users describe some essential conditions that could work to promote participation in decision making. These included having personal support, having access to knowledge, being involved in a dialogue and clarity about responsibilities. Mental health nurses can play an essential role for developing and implementing shared decision making as a tool to promote recovery-oriented mental health services. Service user participation in decision making is considered an essential component of recovery-oriented mental health services. Despite the potential of shared decision making to impact service users knowledge and positively influence their experience of decisional conflict, there is a lack of qualitative research on how participation in decision making is promoted from the perspective of psychiatric service users. In order to develop concrete methods that facilitate shared decision making, there is a need for increased knowledge regarding the users' own perspective. The aim of this study was to explore users' experiences of participation in decisions in mental health services in Sweden, and the kinds of support that may promote participation. Constructivist Grounded Theory (CGT) was utilized to analyse group and individual interviews with 20 users with experience of serious mental illness. The core category that emerged in the analysis described a 'struggle to be perceived as a competent and equal person' while three related categories including being the underdog, being controlled and being omitted described the difficulties of participating in decisions. The data analysis resulted in a model that describes internal and external conditions that influence the promotion of participation in decision making. The findings offer new insights from a user perspective and these can be utilized to develop and investigate concrete methods in order to promote user's participation in decisions. © 2015 John Wiley & Sons Ltd.
Jansen, Jeroen P; Fleurence, Rachael; Devine, Beth; Itzler, Robbin; Barrett, Annabel; Hawkins, Neil; Lee, Karen; Boersma, Cornelis; Annemans, Lieven; Cappelleri, Joseph C
2011-06-01
Evidence-based health-care decision making requires comparisons of all relevant competing interventions. In the absence of randomized, controlled trials involving a direct comparison of all treatments of interest, indirect treatment comparisons and network meta-analysis provide useful evidence for judiciously selecting the best choice(s) of treatment. Mixed treatment comparisons, a special case of network meta-analysis, combine direct and indirect evidence for particular pairwise comparisons, thereby synthesizing a greater share of the available evidence than a traditional meta-analysis. This report from the ISPOR Indirect Treatment Comparisons Good Research Practices Task Force provides guidance on the interpretation of indirect treatment comparisons and network meta-analysis to assist policymakers and health-care professionals in using its findings for decision making. We start with an overview of how networks of randomized, controlled trials allow multiple treatment comparisons of competing interventions. Next, an introduction to the synthesis of the available evidence with a focus on terminology, assumptions, validity, and statistical methods is provided, followed by advice on critically reviewing and interpreting an indirect treatment comparison or network meta-analysis to inform decision making. We finish with a discussion of what to do if there are no direct or indirect treatment comparisons of randomized, controlled trials possible and a health-care decision still needs to be made. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Hosseinzade, Zeinab; Pagsuyoin, Sheree A; Ponnambalam, Kumaraswamy; Monem, Mohammad J
2017-12-01
The stiff competition for water between agriculture and non-agricultural production sectors makes it necessary to have effective management of irrigation networks in farms. However, the process of selecting flow control structures in irrigation networks is highly complex and involves different levels of decision makers. In this paper, we apply multi-attribute decision making (MADM) methodology to develop a decision analysis (DA) framework for evaluating, ranking and selecting check and intake structures for irrigation canals. The DA framework consists of identifying relevant attributes for canal structures, developing a robust scoring system for alternatives, identifying a procedure for data quality control, and identifying a MADM model for the decision analysis. An application is illustrated through an analysis for automation purposes of the Qazvin irrigation network, one of the oldest and most complex irrigation networks in Iran. A survey questionnaire designed based on the decision framework was distributed to experts, managers, and operators of the Qazvin network and to experts from the Ministry of Power in Iran. Five check structures and four intake structures were evaluated. A decision matrix was generated from the average scores collected from the survey, and was subsequently solved using TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) method. To identify the most critical structure attributes for the selection process, optimal attribute weights were calculated using Entropy method. For check structures, results show that the duckbill weir is the preferred structure while the pivot weir is the least preferred. Use of the duckbill weir can potentially address the problem with existing Amil gates where manual intervention is required to regulate water levels during periods of flow extremes. For intake structures, the Neyrpic® gate and constant head orifice are the most and least preferred alternatives, respectively. Some advantages of the Neyrpic® gate are ease of operation and capacity to measure discharge flows. Overall, the application to the Qazvin irrigation network demonstrates the utility of the proposed DA framework in selecting appropriate structures for regulating water flows in irrigation canals. This framework systematically aids the decision process by capturing decisions made at various levels (individual farmers to high-level management). It can be applied to other cases where a new irrigation network is being designed, or where changes in irrigation structures need to be identified to improve flow control in existing networks. Copyright © 2017 Elsevier B.V. All rights reserved.
Allen, Peter J.; Dorozenko, Kate P.; Roberts, Lynne D.
2016-01-01
Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these “experts” were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid should function as a teaching tool, which engages the user with each choice-point in the decision making process, rather than simply providing an “answer.” Based on these findings, we offer suggestions for tools and strategies that could be deployed in the research methods classroom to facilitate and strengthen students' statistical decision making abilities. PMID:26909064
Allen, Peter J; Dorozenko, Kate P; Roberts, Lynne D
2016-01-01
Quantitative research methods are essential to the development of professional competence in psychology. They are also an area of weakness for many students. In particular, students are known to struggle with the skill of selecting quantitative analytical strategies appropriate for common research questions, hypotheses and data types. To begin understanding this apparent deficit, we presented nine psychology undergraduates (who had all completed at least one quantitative methods course) with brief research vignettes, and asked them to explicate the process they would follow to identify an appropriate statistical technique for each. Thematic analysis revealed that all participants found this task challenging, and even those who had completed several research methods courses struggled to articulate how they would approach the vignettes on more than a very superficial and intuitive level. While some students recognized that there is a systematic decision making process that can be followed, none could describe it clearly or completely. We then presented the same vignettes to 10 psychology academics with particular expertise in conducting research and/or research methods instruction. Predictably, these "experts" were able to describe a far more systematic, comprehensive, flexible, and nuanced approach to statistical decision making, which begins early in the research process, and pays consideration to multiple contextual factors. They were sensitive to the challenges that students experience when making statistical decisions, which they attributed partially to how research methods and statistics are commonly taught. This sensitivity was reflected in their pedagogic practices. When asked to consider the format and features of an aid that could facilitate the statistical decision making process, both groups expressed a preference for an accessible, comprehensive and reputable resource that follows a basic decision tree logic. For the academics in particular, this aid should function as a teaching tool, which engages the user with each choice-point in the decision making process, rather than simply providing an "answer." Based on these findings, we offer suggestions for tools and strategies that could be deployed in the research methods classroom to facilitate and strengthen students' statistical decision making abilities.
Donovan, Sarah-Louise; Salmon, Paul M; Lenné, Michael G; Horberry, Tim
2017-10-01
Safety leadership is an important factor in supporting safety in high-risk industries. This article contends that applying systems-thinking methods to examine safety leadership can support improved learning from incidents. A case study analysis was undertaken of a large-scale mining landslide incident in which no injuries or fatalities were incurred. A multi-method approach was adopted, in which the Critical Decision Method, Rasmussen's Risk Management Framework and Accimap method were applied to examine the safety leadership decisions and actions which enabled the safe outcome. The approach enabled Rasmussen's predictions regarding safety and performance to be examined in the safety leadership context, with findings demonstrating the distribution of safety leadership across leader and system levels, and the presence of vertical integration as key to supporting the successful safety outcome. In doing so, the findings also demonstrate the usefulness of applying systems-thinking methods to examine and learn from incidents in terms of what 'went right'. The implications, including future research directions, are discussed. Practitioner Summary: This paper presents a case study analysis, in which systems-thinking methods are applied to the examination of safety leadership decisions and actions during a large-scale mining landslide incident. The findings establish safety leadership as a systems phenomenon, and furthermore, demonstrate the usefulness of applying systems-thinking methods to learn from incidents in terms of what 'went right'. Implications, including future research directions, are discussed.
Elumalai, Vetrimurugan; Brindha, K; Sithole, Bongani; Lakshmanan, Elango
2017-04-01
Mapping groundwater contaminants and identifying the sources are the initial steps in pollution control and mitigation. Due to the availability of different mapping methods and the large number of emerging pollutants, these methods need to be used together in decision making. The present study aims to map the contaminated areas in Richards Bay, South Africa and compare the results of ordinary kriging (OK) and inverse distance weighted (IDW) interpolation techniques. Statistical methods were also used for identifying contamination sources. Na-Cl groundwater type was dominant followed by Ca-Mg-Cl. Data analysis indicate that silicate weathering, ion exchange and fresh water-seawater mixing are the major geochemical processes controlling the presence of major ions in groundwater. Factor analysis also helped to confirm the results. Overlay analysis by OK and IDW gave different results. Areas where groundwater was unsuitable as a drinking source were 419 and 116 km 2 for OK and IDW, respectively. Such diverse results make decision making difficult, if only one method was to be used. Three highly contaminated zones within the study area were more accurately identified by OK. If large areas are identified as being contaminated such as by IDW in this study, the mitigation measures will be expensive. If these areas were underestimated, then even though management measures are taken, it will not be effective for a longer time. Use of multiple techniques like this study will help to avoid taking harsh decisions. Overall, the groundwater quality in this area was poor, and it is essential to identify alternate drinking water source or treat the groundwater before ingestion.
Robustness analysis of a green chemistry-based model for the ...
This paper proposes a robustness analysis based on Multiple Criteria Decision Aiding (MCDA). The ensuing model was used to assess the implementation of green chemistry principles in the synthesis of silver nanoparticles. Its recommendations were also compared to an earlier developed model for the same purpose to investigate concordance between the models and potential decision support synergies. A three-phase procedure was adopted to achieve the research objectives. Firstly, an ordinal ranking of the evaluation criteria used to characterize the implementation of green chemistry principles was identified through relative ranking analysis. Secondly, a structured selection process for an MCDA classification method was conducted, which ensued in the identification of Stochastic Multi-Criteria Acceptability Analysis (SMAA). Lastly, the agreement of the classifications by the two MCDA models and the resulting synergistic role of decision recommendations were studied. This comparison showed that the results of the two models agree between 76% and 93% of the simulation set-ups and it confirmed that different MCDA models provide a more inclusive and transparent set of recommendations. This integrative research confirmed the beneficial complementary use of MCDA methods to aid responsible development of nanosynthesis, by accounting for multiple objectives and helping communication of complex information in a comprehensive and traceable format, suitable for stakeholders and
Radio astronomy Explorer-B in-flight mission control system development effort
NASA Technical Reports Server (NTRS)
Lutsky, D. A.; Bjorkman, W. S.; Uphoff, C.
1973-01-01
A description is given of the development for the Mission Analysis Evaluation and Space Trajectory Operations (MAESTRO) program to be used for the in-flight decision making process during the translunar and lunar orbit adjustment phases of the flight of the Radio Astronomy Explorer-B. THe program serves two functions: performance and evaluation of preflight mission analysis, and in-flight support for the midcourse and lunar insertion command decisions that must be made by the flight director. The topics discussed include: analysis of program and midcourse guidance capabilities; methods for on-line control; printed displays of the MAESTRO program; and in-flight operational logistics and testing.
Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz
2016-01-01
Objectives 1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; 2) To exploit expert knowledge in the BN development since further data acquisition is usually not possible; 3) To ensure the BN model can be used for interventional analysis; 4) To demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data is available. Method The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recent validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. Results When employed with the same datasets, the DSVM-MSS demonstrated competitive to superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. Conclusions This development process is applicable to any application domain which involves large-scale decision analysis based on such complex information, rather than based on data with hard facts, and in conjunction with the incorporation of expert knowledge for decision support via intervention. The novelty extends to challenging the decision scientists to reason about building models based on what information is really required for inference, rather than based on what data is available and hence, forces decision scientists to use available data in a much smarter way. PMID:26830286
Cost-Effectiveness and Cost-Benefit Analysis: Confronting the Problem of Choice.
ERIC Educational Resources Information Center
Clardy, Alan
Cost-effectiveness analysis and cost-benefit analysis are two related yet distinct methods to help decision makers choose the best course of action from among competing alternatives. For both types of analysis, costs are computed similarly. Costs may be reduced to present value amounts for multi-year programs, and parameters may be altered to show…
INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING
Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong
2017-01-01
Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
Postmus, Douwe; Tervonen, Tommi; van Valkenhoef, Gert; Hillege, Hans L; Buskens, Erik
2014-09-01
A standard practice in health economic evaluation is to monetize health effects by assuming a certain societal willingness-to-pay per unit of health gain. Although the resulting net monetary benefit (NMB) is easy to compute, the use of a single willingness-to-pay threshold assumes expressibility of the health effects on a single non-monetary scale. To relax this assumption, this article proves that the NMB framework is a special case of the more general stochastic multi-criteria acceptability analysis (SMAA) method. Specifically, as SMAA does not restrict the number of criteria to two and also does not require the marginal rates of substitution to be constant, there are problem instances for which the use of this more general method may result in a better understanding of the trade-offs underlying the reimbursement decision-making problem. This is illustrated by applying both methods in a case study related to infertility treatment.
Aydin, Ilhan; Karakose, Mehmet; Akin, Erhan
2014-03-01
Although reconstructed phase space is one of the most powerful methods for analyzing a time series, it can fail in fault diagnosis of an induction motor when the appropriate pre-processing is not performed. Therefore, boundary analysis based a new feature extraction method in phase space is proposed for diagnosis of induction motor faults. The proposed approach requires the measurement of one phase current signal to construct the phase space representation. Each phase space is converted into an image, and the boundary of each image is extracted by a boundary detection algorithm. A fuzzy decision tree has been designed to detect broken rotor bars and broken connector faults. The results indicate that the proposed approach has a higher recognition rate than other methods on the same dataset. © 2013 ISA Published by ISA All rights reserved.
Application of effective discharge analysis to environmental flow decision-making
McKay, S. Kyle; Freeman, Mary C.; Covich, A.P.
2016-01-01
Well-informed river management decisions rely on an explicit statement of objectives, repeatable analyses, and a transparent system for assessing trade-offs. These components may then be applied to compare alternative operational regimes for water resource infrastructure (e.g., diversions, locks, and dams). Intra- and inter-annual hydrologic variability further complicates these already complex environmental flow decisions. Effective discharge analysis (developed in studies of geomorphology) is a powerful tool for integrating temporal variability of flow magnitude and associated ecological consequences. Here, we adapt the effectiveness framework to include multiple elements of the natural flow regime (i.e., timing, duration, and rate-of-change) as well as two flow variables. We demonstrate this analytical approach using a case study of environmental flow management based on long-term (60 years) daily discharge records in the Middle Oconee River near Athens, GA, USA. Specifically, we apply an existing model for estimating young-of-year fish recruitment based on flow-dependent metrics to an effective discharge analysis that incorporates hydrologic variability and multiple focal taxa. We then compare three alternative methods of environmental flow provision. Percentage-based withdrawal schemes outcompete other environmental flow methods across all levels of water withdrawal and ecological outcomes.
Application of Effective Discharge Analysis to Environmental Flow Decision-Making.
McKay, S Kyle; Freeman, Mary C; Covich, Alan P
2016-06-01
Well-informed river management decisions rely on an explicit statement of objectives, repeatable analyses, and a transparent system for assessing trade-offs. These components may then be applied to compare alternative operational regimes for water resource infrastructure (e.g., diversions, locks, and dams). Intra- and inter-annual hydrologic variability further complicates these already complex environmental flow decisions. Effective discharge analysis (developed in studies of geomorphology) is a powerful tool for integrating temporal variability of flow magnitude and associated ecological consequences. Here, we adapt the effectiveness framework to include multiple elements of the natural flow regime (i.e., timing, duration, and rate-of-change) as well as two flow variables. We demonstrate this analytical approach using a case study of environmental flow management based on long-term (60 years) daily discharge records in the Middle Oconee River near Athens, GA, USA. Specifically, we apply an existing model for estimating young-of-year fish recruitment based on flow-dependent metrics to an effective discharge analysis that incorporates hydrologic variability and multiple focal taxa. We then compare three alternative methods of environmental flow provision. Percentage-based withdrawal schemes outcompete other environmental flow methods across all levels of water withdrawal and ecological outcomes.
Analysis for Non-Traditional Security Challenges: Methods and Tools
2006-11-20
PMESII Modeling Challenges modeling or where data is not available to support the model, would aid decision Domain is large, nebulous, complex, and...traditional challenges . This includes enlisting the aid of the inter-agency and alliance/coalition communities. Second, we need to realize this...20 November 2006 MILITARY OPERATIONS RESEARCH SOCIETY MIFh MORS Workshop Analysis for Non-Traditional Security Challenges : Methods and Tools 21-23
Decision Makers' Allocation of Home-Care Therapy Services: A Process Map
Poss, Jeff; Egan, Mary; Rappolt, Susan; Berg, Katherine
2013-01-01
ABSTRACT Purpose: To explore decision-making processes currently used in allocating occupational and physical therapy services in home care for complex long-stay clients in Ontario. Method: An exploratory study using key-informant interviews and client vignettes was conducted with home-care decision makers (case managers and directors) from four home-care regions in Ontario. The interview data were analyzed using the framework analysis method. Results: The decision-making process for allocating therapy services has four stages: intake, assessment, referral to service provider, and reassessment. There are variations in the management processes deployed at each stage. The major variation is in the process of determining the volume of therapy services across home-care regions, primarily as a result of financial constraints affecting the home-care programme. Government funding methods and methods of information sharing also significantly affect home-care therapy allocation. Conclusion: Financial constraints in home care are the primary contextual factor affecting allocation of therapy services across home-care regions. Given the inflation of health care costs, new models of funding and service delivery need to be developed to ensure that the right person receives the right care before deteriorating and requiring more costly long-term care. PMID:24403672
Method for matching customer and manufacturer positions for metal product parameters standardization
NASA Astrophysics Data System (ADS)
Polyakova, Marina; Rubin, Gennadij; Danilova, Yulija
2018-04-01
Decision making is the main stage of regulation the relations between customer and manufacturer during the design the demands of norms in standards. It is necessary to match the positions of the negotiating sides in order to gain the consensus. In order to take into consideration the differences of customer and manufacturer estimation of the object under standardization process it is obvious to use special methods of analysis. It is proposed to establish relationships between product properties and its functions using functional-target analysis. The special feature of this type of functional analysis is the consideration of the research object functions and properties. It is shown on the example of hexagonal head crew the possibility to establish links between its functions and properties. Such approach allows obtaining a quantitative assessment of the closeness the positions of customer and manufacturer at decision making during the standard norms establishment.
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
Methods for the evaluation of alternative disaster warning systems
NASA Technical Reports Server (NTRS)
Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.
1977-01-01
For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.
External Dependencies-Driven Architecture Discovery and Analysis of Implemented Systems
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ron, Monica
2014-01-01
A method for architecture discovery and analysis of implemented systems (AIS) is disclosed. The premise of the method is that architecture decisions are inspired and influenced by the external entities that the software system makes use of. Examples of such external entities are COTS components, frameworks, and ultimately even the programming language itself and its libraries. Traces of these architecture decisions can thus be found in the implemented software and is manifested in the way software systems use such external entities. While this fact is often ignored in contemporary reverse engineering methods, the AIS method actively leverages and makes use of the dependencies to external entities as a starting point for the architecture discovery. The AIS method is demonstrated using the NASA's Space Network Access System (SNAS). The results show that, with abundant evidence, the method offers reusable and repeatable guidelines for discovering the architecture and locating potential risks (e.g. low testability, decreased performance) that are hidden deep in the implementation. The analysis is conducted by using external dependencies to identify, classify and review a minimal set of key source code files. Given the benefits of analyzing external dependencies as a way to discover architectures, it is argued that external dependencies deserve to be treated as first-class citizens during reverse engineering. The current structure of a knowledge base of external entities and analysis questions with strategies for getting answers is also discussed.
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable to identify, analyze and quantify properties of the complex system as a whole and to model explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
ERIC Educational Resources Information Center
Eisenhardt, Alyson; Ninassi, Susanne Bruno
2016-01-01
Many pedagogy experts suggest the use of real world scenarios and simulations as a means of teaching students to apply decision analysis concepts to their field of study. These methods allow students an opportunity to synthesize knowledge, skills, and abilities by presenting a field-based dilemma. The use of real world scenarios and simulations…
Linguistic hesitant fuzzy multi-criteria decision-making method based on evidential reasoning
NASA Astrophysics Data System (ADS)
Zhou, Huan; Wang, Jian-qiang; Zhang, Hong-yu; Chen, Xiao-hong
2016-01-01
Linguistic hesitant fuzzy sets (LHFSs), which can be used to represent decision-makers' qualitative preferences as well as reflect their hesitancy and inconsistency, have attracted a great deal of attention due to their flexibility and efficiency. This paper focuses on a multi-criteria decision-making approach that combines LHFSs with the evidential reasoning (ER) method. After reviewing existing studies of LHFSs, a new order relationship and Hamming distance between LHFSs are introduced and some linguistic scale functions are applied. Then, the ER algorithm is used to aggregate the distributed assessment of each alternative. Subsequently, the set of aggregated alternatives on criteria are further aggregated to get the overall value of each alternative. Furthermore, a nonlinear programming model is developed and genetic algorithms are used to obtain the optimal weights of the criteria. Finally, two illustrative examples are provided to show the feasibility and usability of the method, and comparison analysis with the existing method is made.
Wang, Shen-Tsu; Li, Meng-Hua
2014-01-01
When an enterprise has thousands of varieties in its inventory, the use of a single management method could not be a feasible approach. A better way to manage this problem would be to categorise inventory items into several clusters according to inventory decisions and to use different management methods for managing different clusters. The present study applies DPSO (dynamic particle swarm optimisation) to a problem of clustering of inventory items. Without the requirement of prior inventory knowledge, inventory items are automatically clustered into near optimal clustering number. The obtained clustering results should satisfy the inventory objective equation, which consists of different objectives such as total cost, backorder rate, demand relevance, and inventory turnover rate. This study integrates the above four objectives into a multiobjective equation, and inputs the actual inventory items of the enterprise into DPSO. In comparison with other clustering methods, the proposed method can consider different objectives and obtain an overall better solution to obtain better convergence results and inventory decisions.
NASA Astrophysics Data System (ADS)
Hadjimichael, A.; Corominas, L.; Comas, J.
2017-12-01
With sustainable development as their overarching goal, urban wastewater system (UWS) managers need to take into account multiple social, economic, technical and environmental facets related to their decisions. In this complex decision-making environment, uncertainty can be formidable. It is present both in the ways the system is interpreted stochastically, but also in its natural ever-shifting behavior. This inherent uncertainty suggests that wiser decisions would be made under an adaptive and iterative decision-making regime. No decision-support framework has been presented in the literature to effectively addresses all these needs. The objective of this work is to describe such a conceptual framework to evaluate and compare alternative solutions for various UWS challenges within an adaptive management structure. Socio-economic aspects such as externalities are taken into account, along with other traditional criteria as necessary. Robustness, reliability and resilience analyses test the performance of the system against present and future variability. A valuation uncertainty analysis incorporates uncertain valuation assumptions in the decision-making process. The framework is demonstrated with an application to a case study presenting a typical problem often faced by managers: poor river water quality, increasing population, and more stringent water quality legislation. The application of the framework made use of: i) a cost-benefit analysis including monetized environmental benefits and damages; ii) a robustness analysis of system performance against future conditions; iii) reliability and resilience analyses of the system given contextual variability; and iv) a valuation uncertainty analysis of model parameters. The results suggest that the installation of bigger volumes would give rise to increased benefits despite larger capital costs, as well as increased robustness and resilience. Population numbers appear to affect the estimated benefits most, followed by electricity prices and climate change projections. The presented framework is expected to be a valuable tool for the next generation of UWS decision-making and the application demonstrates a novel and valuable integration of metrics and methods for UWS analysis.
Chen, Hai; Liang, Xiaoying; Li, Rui
2013-01-01
Multi-Agent Systems (MAS) offer a conceptual approach to include multi-actor decision making into models of land use change. Through the simulation based on the MAS, this paper tries to show the application of MAS in the micro scale LUCC, and reveal the transformation mechanism of difference scale. This paper starts with a description of the context of MAS research. Then, it adopts the Nested Spatial Choice (NSC) method to construct the multi-scale LUCC decision-making model. And a case study for Mengcha village, Mizhi County, Shaanxi Province is reported. Finally, the potentials and drawbacks of the following approach is discussed and concluded. From our design and implementation of the MAS in multi-scale model, a number of observations and conclusions can be drawn on the implementation and future research directions. (1) The use of the LUCC decision-making and multi-scale transformation framework provides, according to us, a more realistic modeling of multi-scale decision making process. (2) By using continuous function, rather than discrete function, to construct the decision-making of the households is more realistic to reflect the effect. (3) In this paper, attempts have been made to give a quantitative analysis to research the household interaction. And it provides the premise and foundation for researching the communication and learning among the households. (4) The scale transformation architecture constructed in this paper helps to accumulate theory and experience for the interaction research between the micro land use decision-making and the macro land use landscape pattern. Our future research work will focus on: (1) how to rational use risk aversion principle, and put the rule on rotation between household parcels into model. (2) Exploring the methods aiming at researching the household decision-making over a long period, it allows us to find the bridge between the long-term LUCC data and the short-term household decision-making. (3) Researching the quantitative method and model, especially the scenario analysis model which may reflect the interaction among different household types.
Graeden, Ellie; Kerr, Justin; Sorrell, Erin M.; Katz, Rebecca
2018-01-01
Managing infectious disease requires rapid and effective response to support decision making. The decisions are complex and require understanding of the diseases, disease intervention and control measures, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions, the complexity of current models presents a significant barrier to community-level decision makers in using the outputs of the most scientifically robust methods to support pragmatic decisions about implementing a public health response effort, even for endemic diseases with which they are already familiar. Here, we describe the development of an application available on the internet, including from mobile devices, with a simple user interface, to support on-the-ground decision-making for integrating disease control programs, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap, and which result in significant morbidity and mortality in affected regions. Working with data from countries across sub-Saharan Africa and the Middle East, we present a proof-of-principle method and corresponding prototype tool to provide guidance on how to optimize integration of vertical disease control programs. This method and tool demonstrate significant progress in effectively translating the best available scientific models to support practical decision making on the ground with the potential to significantly increase the efficacy and cost-effectiveness of disease control. Author summary Designing and implementing effective programs for infectious disease control requires complex decision-making, informed by an understanding of the diseases, the types of disease interventions and control measures available, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions and support decision-making, the complexity of current models presents a significant barrier to on-the-ground end users. The picture is further complicated when considering approaches for integration of different disease control programs, where co-infection dynamics, treatment interactions, and other variables must also be taken into account. Here, we describe the development of an application available on the internet with a simple user interface, to support on-the-ground decision-making for integrating disease control, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap. This proof-of-concept method and tool demonstrate significant progress in effectively translating the best available scientific models to support pragmatic decision-making on the ground, with the potential to significantly increase the impact and cost-effectiveness of disease control. PMID:29649260
NASA Astrophysics Data System (ADS)
Madruga de Brito, Mariana; Evers, Mariele
2016-04-01
Multi-Criteria Decision Making (MCDM) methods have received much attention from researchers and practitioners for solving flood risk management problems in the last decades due to its capacity to deal with multiple criteria, conflicting objectives as well as the knowledge arising from the participation of several actors. In order to consolidate recent research conducted in this area, this study presents a state-of-the-art literature review of MCDM applications to flood risk management, seeking to provide a better understanding of the current status of how participatory MCDM is being conducted and the way uncertainties are included in the decision-making process. Totally, 128 peer-reviewed papers published from 1995 to June 2015 in 72 different journals were systematically analyzed. Results indicated that the number of flood MCDM publications has exponentially grown during this period, with over 82% of all papers published since 2009. A wide range of application areas was identified, with most papers focusing on ranking alternatives for flood mitigation (22.78% of the total) followed by risk (21.11%) and vulnerability assessment (15%). The Analytical Hierarchy Process (AHP) was the most popular MCDM method (42.72%) followed by Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) (13.33%) and Weighted Sum Method (WSM) (12.73%). Although significant improvements have been made over the last decades, shortcomings remain in handling the uncertainty. Only eight papers (6.25%) have conducted uncertainty analysis, suggesting that a general procedure for performing it in MCDM does not yet exist. Researchers have applied the Monte Carlo simulation, Taylor's series error propagation method or assessed the uncertainty in qualitative ways, by describing its main sources or analyzing the stakeholders' degree of confidence. In addition, 35 articles (27.34%) have performed a sensitivity analysis of the criteria weights. Three distinct approaches were identified: one-way, global, and probabilistic sensitivity analysis. About half of the studies have acknowledged the involvement of multiple stakeholders. However, participation was fragmented and focused on particular stages of the decision-making process such as the elicitation of criteria weights. This segmentation may be related to methodological and time constraints since participatory decision making is time-consuming and costly. Policy makers and experts were the most participated stakeholders, with few papers considering the involvement of local community members. Another issue is that only four studies seek to obtain consensus and that decisions were often made by majority vote or averaging approaches. Therefore, greater rigor in addressing the uncertainties around stakeholders' judgments as well as in endorsing an active participation in all stages of the decision-making process should be undertaken in future applications. This could help to increase the quality of decisions and subsequent implementation of chosen measures.
System for decision analysis support on complex waste management issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shropshire, D.E.
1997-10-01
A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts to waste slows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on a specific objectives may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs,more » or combinations of objectives. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years.« less
Estimating Most Productive Scale Size in Data Envelopment Analysis with Integer Value Data
NASA Astrophysics Data System (ADS)
Dwi Sari, Yunita; Angria S, Layla; Efendi, Syahril; Zarlis, Muhammad
2018-01-01
The most productive scale size (MPSS) is a measurement that states how resources should be organized and utilized to achieve optimal results. The most productive scale size (MPSS) can be used as a benchmark for the success of an industry or company in producing goods or services. To estimate the most productive scale size (MPSS), each decision making unit (DMU) should pay attention the level of input-output efficiency, by data envelopment analysis (DEA) method decision making unit (DMU) can identify units used as references that can help to find the cause and solution from inefficiencies can optimize productivity that main advantage in managerial applications. Therefore, data envelopment analysis (DEA) is chosen to estimating most productive scale size (MPSS) that will focus on the input of integer value data with the CCR model and the BCC model. The purpose of this research is to find the best solution for estimating most productive scale size (MPSS) with input of integer value data in data envelopment analysis (DEA) method.
To Spray or Not to Spray: A Decision Analysis of Coffee Berry Borer in Hawaii
2017-01-01
Integrated pest management strategies were adopted to combat the coffee berry borer (CBB) after its arrival in Hawaii in 2010. A decision tree framework is used to model the CBB integrated pest management recommendations, for potential use by growers and to assist in developing and evaluating management strategies and policies. The model focuses on pesticide spraying (spray/no spray) as the most significant pest management decision within each period over the entire crop season. The main result from the analysis suggests the most important parameter to maximize net benefit is to ensure a low initial infestation level. A second result looks at the impact of a subsidy for the cost of pesticides and shows a typical farmer receives a positive net benefit of $947.17. Sensitivity analysis of parameters checks the robustness of the model and further confirms the importance of a low initial infestation level vis-a-vis any level of subsidy. The use of a decision tree is shown to be an effective method for understanding integrated pest management strategies and solutions. PMID:29065464
Hagbaghery, Mohsen Adib; Salsali, Mahvash; Ahmadi, Fazlolah
2004-01-01
Background Nurses' practice takes place in a context of ongoing advances in research and technology. The dynamic and uncertain nature of health care environment requires nurses to be competent decision-makers in order to respond to clients' needs. Recently, the public and the government have criticized Iranian nurses because of poor quality of patient care. However nurses' views and experiences on factors that affect their clinical function and clinical decision-making have rarely been studied. Methods Grounded theory methodology was used to analyze the participants' lived experiences and their viewpoints regarding the factors affecting their clinical function and clinical decision-making. Semi-structured interviews and participant observation methods were used to gather the data. Thirty-eight participants were interviewed and twelve sessions of observation were carried out. Constant comparative analysis method was used to analyze the data. Results Five main themes emerged from the data. From the participants' points of view, "feeling competent", "being self-confident", "organizational structure", "nursing education", and "being supported" were considered as important factors in effective clinical decision-making. Conclusion As participants in this research implied, being competent and self-confident are the most important personal factors influencing nurses clinical decision-making. Also external factors such as organizational structure, access to supportive resources and nursing education have strengthening or inhibiting effects on the nurses' decisions. Individual nurses, professional associations, schools of nursing, nurse educators, organizations that employ nurses and government all have responsibility for developing and finding strategies that facilitate nurses' effective clinical decision-making. They are responsible for identifying barriers and enhancing factors within the organizational structure that facilitate nurses' clinical decision-making. PMID:15068484
How do community pharmacists make decisions? Results of an exploratory qualitative study in Ontario.
Gregory, Paul A M; Whyte, Brenna; Austin, Zubin
2016-03-01
As the complexity of pharmacy practice increases, pharmacists are required to make more decisions under ambiguous or information-deficient conditions. There is scant literature examining how pharmacists make decisions and what factors or values influence their choices. The objective of this exploratory research was to characterize decision-making patterns in the clinical setting of community pharmacists in Ontario. The think-aloud decision-making method was used for this study. Community pharmacists with 3 or more years' experience were presented with 2 clinical case studies dealing with challenging situations and were asked to verbally reason through their decision-making process while being probed by an interviewer for clarification, justification and further explication. Verbatim transcripts were analyzed using a protocol analysis method. A total of 12 pharmacists participated in this study. Participants experienced cognitive dissonance in attempting to reconcile their desire for a clear and confrontation-free conclusion to the case discussion and the reality of the challenge presented within each case. Strategies for resolving this cognitive dissonance included strong emphasis on the educational (rather than decision-making) role of the pharmacist, the value of strong interpersonal relationships as a way to avoid conflict and achieve desired outcomes, the desire to seek external advice or defer to others' authority to avoid making a decision and the use of strict interpretations of rules to avoid ambiguity and contextual interpretation. This research was neither representative nor generalizable but was indicative of patterns of decisional avoidance and fear of assuming responsibility for outcomes that warrant further investigation. The think-aloud method functioned effectively in this context and provided insights into pharmacists' decision-making patterns in the clinical setting. Can Pharm J (Ott) 2016;149:90-98.
Perceived risks around choice and decision making at end-of-life: a literature review.
Wilson, F; Gott, M; Ingleton, C
2013-01-01
the World Health Organization identifies meeting patient choice for care as central to effective palliative care delivery. Little is known about how choice, which implies an objective balancing of options and risks, is understood and enacted through decision making at end-of-life. to explore how perceptions of 'risk' may inform decision-making processes at end-of-life. an integrative literature review was conducted between January and February 2010. Papers were reviewed using Hawker et al.'s criteria and evaluated according to clarity of methods, analysis and evidence of ethical consideration. All literature was retained as background data, but given the significant international heterogeneity the final analysis specifically focused on the UK context. the databases Medline, PsycINFO, Assia, British Nursing Index, High Wire Press and CINAHL were explored using the search terms decision*, risk, anxiety, hospice and palliative care, end-of-life care and publication date of 1998-2010. thematic analysis of 25 papers suggests that decision making at end-of-life is multifactorial, involving a balancing of risks related to caregiver support; service provider resources; health inequalities and access; challenges to information giving; and perceptions of self-identity. Overall there is a dissonance in understandings of choice and decision making between service providers and service users. the concept of risk acknowledges the factors that shape and constrain end-of-life choices. Recognition of perceived risks as a central factor in decision making would be of value in acknowledging and supporting meaningful decision making processes for patients with palliative care needs and their families.
Decision aids for multiple-decision disease management as affected by weather input errors.
Pfender, W F; Gent, D H; Mahaffee, W F; Coop, L B; Fox, A D
2011-06-01
Many disease management decision support systems (DSSs) rely, exclusively or in part, on weather inputs to calculate an indicator for disease hazard. Error in the weather inputs, typically due to forecasting, interpolation, or estimation from off-site sources, may affect model calculations and management decision recommendations. The extent to which errors in weather inputs affect the quality of the final management outcome depends on a number of aspects of the disease management context, including whether management consists of a single dichotomous decision, or of a multi-decision process extending over the cropping season(s). Decision aids for multi-decision disease management typically are based on simple or complex algorithms of weather data which may be accumulated over several days or weeks. It is difficult to quantify accuracy of multi-decision DSSs due to temporally overlapping disease events, existence of more than one solution to optimizing the outcome, opportunities to take later recourse to modify earlier decisions, and the ongoing, complex decision process in which the DSS is only one component. One approach to assessing importance of weather input errors is to conduct an error analysis in which the DSS outcome from high-quality weather data is compared with that from weather data with various levels of bias and/or variance from the original data. We illustrate this analytical approach for two types of DSS, an infection risk index for hop powdery mildew and a simulation model for grass stem rust. Further exploration of analysis methods is needed to address problems associated with assessing uncertainty in multi-decision DSSs.
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
2014-01-01
Background Zambia’s fertility rate and unmet need for family planning are still high. This is in spite of the progress reported from 1992 to 2007 of the increase in contraceptive prevalence rate from 15% to 41% and use of modern methods of family planning from 9% to 33%. However, partner disapproval of family planning has been cited by many women in many countries including Zambia. Given the effectiveness of long-acting and permanent methods of family planning (ILAPMs) in fertility regulation, this paper sought to examine the relationship between contraceptive decision-making and use of ILAPMs among married women in Zambia. Methods This paper uses data from the 2007 Zambia Demographic and Health Survey. The analysis is based on married women (15–49) who reported using a method of family planning at the time of the survey. Out of the 7,146 women interviewed, only 1,630 women were valid for this analysis. Cross-tabulations and binary logistic regressions with Chi-square were used to analyse associations and the predictors of use of ILAPMs of contraception, respectively. A confidence interval of .95 was used in determining relationships between independent and dependent variables. Results Two thirds of women made joint decisions regarding contraception and 29% of the women were using ILAPMs. Women who made joint contraceptive decisions are significantly more likely to use ILAPMs than women who did not involve their husband in contraceptive decisions. However, the most significant predictor is the wealth index. Women from rich households are more likely to use ILAPMs than women from medium rich and poor households. Results also show that women of North Western ethnicities and those from Region 3 had higher odds of using ILAPMs than Tonga women and women from Region 2, respectively. Conclusion Joint contraceptive decision-making between spouses is key to use of ILAPMs in Zambia. Our findings have also shown that the wealth index is actually the strongest factor determining use of these methods. As such, family planning programmes directed at increasing use of LAPMs ought to not only encourage spousal communication but should also consider rolling out interventions that incorporate economic empowerment. PMID:24993034
[Value-based medicine in ophthalmology].
Hirneiss, C; Neubauer, A S; Tribus, C; Kampik, A
2006-06-01
Value-based medicine (VBM) unifies costs and patient-perceived value (improvement in quality of life, length of life, or both) of an intervention. Value-based ophthalmology is of increasing importance for decisions in eye care. The methods of VBM are explained and definitions for a specific terminology in this field are given. The cost-utility analysis as part of health care economic analyses is explained. VBM exceeds evidence-based medicine by incorporating parameters of cost and benefits from an ophthalmological intervention. The benefit of the intervention is defined as an increase or maintenance of visual quality of life and can be determined by utility analysis. The time trade-off method is valid and reliable for utility analysis. The resources expended for the value gained in VBM are measured with cost-utility analysis in terms of cost per quality-adjusted life years gained (euros/QALY). Numerous cost-utility analyses of different ophthalmological interventions have been published. The fundamental instrument of VBM is cost-utility analysis. The results in cost per QALY allow estimation of cost effectiveness of an ophthalmological intervention. Using the time trade-off method for utility analysis allows the comparison of ophthalmological cost-utility analyses with those of other medical interventions. VBM is important for individual medical decision making and for general health care.
Yin, Kedong; Wang, Pengyu; Li, Xuemei
2017-12-13
With respect to multi-attribute group decision-making (MAGDM) problems, where attribute values take the form of interval grey trapezoid fuzzy linguistic variables (IGTFLVs) and the weights (including expert and attribute weight) are unknown, improved grey relational MAGDM methods are proposed. First, the concept of IGTFLV, the operational rules, the distance between IGTFLVs, and the projection formula between the two IGTFLV vectors are defined. Second, the expert weights are determined by using the maximum proximity method based on the projection values between the IGTFLV vectors. The attribute weights are determined by the maximum deviation method and the priorities of alternatives are determined by improved grey relational analysis. Finally, an example is given to prove the effectiveness of the proposed method and the flexibility of IGTFLV.
A qualitative analysis of how advanced practice nurses use clinical decision support systems.
Weber, Scott
2007-12-01
The purpose of this study was to generate a grounded theory that will reflect the experiences of advanced practice nurses (APNs) working as critical care nurse practitioners (NPs) and clinical nurse specialists (CNS) with computer-based decision-making systems. A study design using grounded theory qualitative research methods and convenience sampling was employed in this study. Twenty-three APNs (13 CNS and 10 NPs) were recruited from 16 critical care units located in six large urban medical centers in the U.S. Midwest. Single-structured in-depth interviews with open-ended audio-taped questions were conducted with each APN. Through this process, APNs defined what they consider to be relevant themes and patterns of clinical decision system use in their critical care practices, and they identified the interrelatedness of the conceptual categories that emerged from the results. Data were analyzed using the constant comparative analysis method of qualitative research. APN participants were predominantly female, white/non-Hispanic, had a history of access to the clinical decision system used in their critical care settings for an average of 14 months, and had attended a formal training program to learn how to use clinical decision systems. "Forecasting decision outcomes," which was defined as the voluntary process employed to forecast the outcomes of patient care decisions in critical care prior to actual decision making, was the core variable describing system use that emerged from the responses. This variable consisted of four user constructs or components: (a) users' perceptions of their initial system learning experience, (b) users' sense of how well they understand how system technology works, (c) users' understanding of how system inferences are created or derived, and (d) users' relative trust of system-derived data. Each of these categories was further described through the grounded theory research process, and the relationships between the categories were identified. The findings of this study suggest that the main reason critical care APNs choose to integrate clinical decision systems into their practices is to provide an objective, scientifically derived, technology-based backup for human forecasting of the outcomes of patient care decisions prior to their actual decision making. Implications for nursing, health care, and technology research are presented.
NASA Astrophysics Data System (ADS)
Kucharski, John; Tkach, Mark; Olszewski, Jennifer; Chaudhry, Rabia; Mendoza, Guillermo
2016-04-01
This presentation demonstrates the application of Climate Risk Informed Decision Analysis (CRIDA) at Zambia's principal water treatment facility, The Iolanda Water Treatment Plant. The water treatment plant is prone to unacceptable failures during periods of low hydropower production at the Kafue Gorge Dam Hydroelectric Power Plant. The case study explores approaches of increasing the water treatment plant's ability to deliver acceptable levels of service under the range of current and potential future climate states. The objective of the study is to investigate alternative investments to build system resilience that might have been informed by the CRIDA process, and to evaluate the extra resource requirements by a bilateral donor agency to implement the CRIDA process. The case study begins with an assessment of the water treatment plant's vulnerability to climate change. It does so by following general principals described in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework". By utilizing relatively simple bootstrapping methods a range of possible future climate states is generated while avoiding the use of more complex and costly downscaling methodologies; that are beyond the budget and technical capacity of many teams. The resulting climate vulnerabilities and uncertainty in the climate states that produce them are analyzed as part of a "Level of Concern" analysis. CRIDA principals are then applied to this Level of Concern analysis in order to arrive at a set of actionable water management decisions. The principal goals of water resource management is to transform variable, uncertain hydrology into dependable services (e.g. water supply, flood risk reduction, ecosystem benefits, hydropower production, etc…). Traditional approaches to climate adaptation require the generation of predicted future climate states but do little guide decision makers how this information should impact decision making. In this context it is not surprising that the increased hydrologic variability and uncertainty produced by many climate risk analyses bedevil water resource decision making. The Climate Risk Informed Decision Analysis (CRIDA) approach builds on work found in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework" which provide guidance of vulnerability assessments. It guides practitioners through a "Level of Concern" analysis where climate vulnerabilities are analyzed to produce actionable alternatives and decisions.
Democracy and sustainable development--what is the alternative to cost-benefit analysis?
Söderbaum, Peter
2006-04-01
Cost-benefit analysis (CBA) is part of neoclassical economics, a specific paradigm, or theoretical perspective. In searching for alternatives to CBA, competing theoretical frameworks in economics appear to be a natural starting point. Positional analysis (PA) as an alternative to CBA is built on institutional theory and a different set of assumptions about human beings, organizations, markets, etc. Sustainable development (SD) is a multidimensional concept that includes social and ecological dimensions in addition to monetary aspects. If the political commitment to SD in the European Union and elsewhere is taken seriously, then approaches to decision making should be chosen that 1st open the door for multidimensional analysis rather than close it. Sustainable development suggests a direction for development in a broad sense but is still open to different interpretations. Each such interpretation is political in kind, and a 2nd criterion for judging different approaches is whether they are ideologically open rather than closed. Although methods for decision making have traditionally been connected with mathematical objective functions and optimization, the purpose of PA is to illuminate a decision situation in a many-sided way with respect to possibly relevant ideological orientations, alternatives, and consequences. Decisions are understood in terms of matching the ideological orientation of each decision maker with the expected effects profile of each alternative considered. Appropriateness and pattern recognition are other concepts in understanding this process.
Factors Which Influence The Fish Purchasing Decision: A study on Traditional Market in Riau Mainland
NASA Astrophysics Data System (ADS)
Siswati, Latifa; Putri, Asgami
2018-05-01
The purposes of the research are to analyze and assess the factors which influence fish purchasing by the community at Tenayan Raya district Pekanbaru.Research methodology which used is survey method, especially interview and observation technique or direct supervision on the market which located at Tenayan Raya district. Determination technique of sampling location/region is done by purposive sampling. The sampling method is done by accidental sampling. Technique analysis of factors which used using the data that derived from the respondent opinion to various fish variable. The result of this research are the factors which influence fish purchasing decision done in a traditional market which located at Tenayan Raya district are product factor, price factors, social factor and individual factor. Product factor which influences fish purchasing decision as follows: the eyelets condition, the nutrition of fresh fish, the diversity of sold fish. Price factors influence the fish purchasing decision, such as: the price of fresh fish, the convincing price and the suitability price and benefits of the fresh fish. Individual factors which influence a fish purchasing decision, such as education and income levels. Social factors which influence a fish purchasing decision, such as family, colleagues and feeding habits of fish.
Application of decision science to resilience management in Jamaica Bay
Eaton, Mitchell; Fuller, Angela K.; Johnson, Fred A.; Hare, M. P.; Stedman, Richard C.; Sanderson, E.W.; Solecki, W. D.; Waldman, J.R.; Paris, A. S.
2016-01-01
This book highlights the growing interest in management interventions designed to enhance the resilience of the Jamaica Bay socio-ecological system. Effective management, whether the focus is on managing biological processes or human behavior or (most likely) both, requires decision makers to anticipate how the managed system will respond to interventions (i.e., via predictions or projections). In systems characterized by many interacting components and high uncertainty, making probabilistic predictions is often difficult and requires careful thinking not only about system dynamics, but also about how management objectives are specified and the analytic method used to select the preferred action(s). Developing a clear statement of the problem(s) and articulation of management objectives is often best achieved by including input from managers, scientists and other stakeholders affected by the decision through a process of joint problem framing (Marcot and others 2012; Keeney and others 1990). Using a deliberate, coherent and transparent framework for deciding among management alternatives to best meet these objectives then ensures a greater likelihood for successful intervention. Decision science provides the theoretical and practical basis for developing this framework and applying decision analysis methods for making complex decisions under uncertainty and risk.
Study on optimized decision-making model of offshore wind power projects investment
NASA Astrophysics Data System (ADS)
Zhao, Tian; Yang, Shangdong; Gao, Guowei; Ma, Li
2018-02-01
China’s offshore wind energy is of great potential and plays an important role in promoting China’s energy structure adjustment. However, the current development of offshore wind power in China is inadequate, and is much less developed than that of onshore wind power. On the basis of considering all kinds of risks faced by offshore wind power development, an optimized model of offshore wind power investment decision is established in this paper by proposing the risk-benefit assessment method. To prove the practicability of this method in improving the selection of wind power projects, python programming is used to simulate the investment analysis of a large number of projects. Therefore, the paper is dedicated to provide decision-making support for the sound development of offshore wind power industry.
A big data analysis of the relationship between future thinking and decision-making.
Thorstad, Robert; Wolff, Phillip
2018-02-20
We use big data methods to investigate how decision-making might depend on future sightedness (that is, on how far into the future people's thoughts about the future extend). In study 1, we establish a link between future thinking and decision-making at the population level in showing that US states with citizens having relatively far future sightedness, as reflected in their tweets, take fewer risks than citizens in states having relatively near future sightedness. In study 2, we analyze people's tweets to confirm a connection between future sightedness and decision-making at the individual level in showing that people with long future sightedness are more likely to choose larger future rewards over smaller immediate rewards. In study 3, we show that risk taking decreases with increases in future sightedness as reflected in people's tweets. The ability of future sightedness to predict decisions suggests that future sightedness is a relatively stable cognitive characteristic. This implication was supported in an analysis of tweets by over 38,000 people that showed that future sightedness has both state and trait characteristics (study 4). In study 5, we provide evidence for a potential mechanism by which future sightedness can affect decisions in showing that far future sightedness can make the future seem more connected to the present, as reflected in how people refer to the present, past, and future in their tweets over the course of several minutes. Our studies show how big data methods can be applied to naturalistic data to reveal underlying psychological properties and processes.
Monte Carlo decision curve analysis using aggregate data.
Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin
2017-02-01
Decision curve analysis (DCA) is an increasingly used method for evaluating diagnostic tests and predictive models, but its application requires individual patient data. The Monte Carlo (MC) method can be used to simulate probabilities and outcomes of individual patients and offers an attractive option for application of DCA. We constructed a MC decision model to simulate individual probabilities of outcomes of interest. These probabilities were contrasted against the threshold probability at which a decision-maker is indifferent between key management strategies: treat all, treat none or use predictive model to guide treatment. We compared the results of DCA with MC simulated data against the results of DCA based on actual individual patient data for three decision models published in the literature: (i) statins for primary prevention of cardiovascular disease, (ii) hospice referral for terminally ill patients and (iii) prostate cancer surgery. The results of MC DCA and patient data DCA were identical. To the extent that patient data DCA were used to inform decisions about statin use, referral to hospice or prostate surgery, the results indicate that MC DCA could have also been used. As long as the aggregate parameters on distribution of the probability of outcomes and treatment effects are accurately described in the published reports, the MC DCA will generate indistinguishable results from individual patient data DCA. We provide a simple, easy-to-use model, which can facilitate wider use of DCA and better evaluation of diagnostic tests and predictive models that rely only on aggregate data reported in the literature. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis
Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillancedefines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what the metrics qualify information as being “essential”. Thequestion of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken in account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theoryas a practical method to apply formal scientific decision theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance.We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748
Ren, Jingzheng
2018-01-01
This objective of this study is to develop a generic multi-attribute decision analysis framework for ranking the technologies for ballast water treatment and determine their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies for ballast water treatment. The Best-Worst method, which is a subjective weighting method and Criteria importance through inter-criteria correlation method, which is an objective weighting method, were combined to determine the weights of the evaluation criteria. The extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case including four technologies for ballast water treatment, i.e. Alfa Laval (T 1 ), Hyde (T 2 ), Unitor (T 3 ), and NaOH (T 4 ), were studied by the proposed method, and the Hyde (T 2 ) was recognized as the best technology. Sensitivity analysis was also carried to investigate the effects of the combined coefficients and the weights of the evaluation criteria on the final priority order of the four technologies for ballast water treatment. The sum weighted method and the TOPSIS was also employed to rank the four technologies, and the results determined by these two methods are consistent to that determined by the proposed method in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
Smart Aquifer Characterisation validated using Information Theory and Cost benefit analysis
NASA Astrophysics Data System (ADS)
Moore, Catherine
2016-04-01
The field data acquisition required to characterise aquifer systems are time consuming and expensive. Decisions regarding field testing, the type of field measurements to make and the spatial and temporal resolution of measurements have significant cost repercussions and impact the accuracy of various predictive simulations. The Smart Aquifer Characterisation (SAC) research programme (New Zealand (NZ)) addresses this issue by assembling and validating a suite of innovative methods for characterising groundwater systems at the large, regional and national scales. The primary outcome is a suite of cost effective tools and procedures provided to resource managers to advance the understanding and management of groundwater systems and thereby assist decision makers and communities in the management of their groundwater resources, including the setting of land use limits that protect fresh water flows and quality and the ecosystems dependent on that fresh water. The programme has focused novel investigation approaches including the use of geophysics, satellite remote sensing, temperature sensing and age dating. The SMART (Save Money And Reduce Time) aspect of the programme emphasises techniques that use these passive cost effective data sources to characterise groundwater systems at both the aquifer and the national scale by: • Determination of aquifer hydraulic properties • Determination of aquifer dimensions • Quantification of fluxes between ground waters and surface water • Groundwater age dating These methods allow either a lower cost method for estimating these properties and fluxes, or a greater spatial and temporal coverage for the same cost. To demonstrate the cost effectiveness of the methods a 'data worth' analysis is undertaken. The data worth method involves quantification of the utility of observation data in terms of how much it reduces the uncertainty of model parameters and decision focussed predictions which depend on these parameters. Such decision focussed predictions can include many aspects of system behaviour which underpin management decisions e.g., drawdown of groundwater levels, salt water intrusion, stream depletion, or wetland water level. The value of a data type or an observation location (e.g. remote sensing data (Westerhoff 2015) or a distributed temperature sensing measurement) is greater the more it enhances the certainty with which the model is able to predict such environmental behaviour. By comparing the difference in predictive uncertainty with or without such data, the value of potential observations is assessed. This can easily be achieved using rapid linear predictive uncertainty analysis methods (Moore 2005, Moore and Doherty 2006). By assessing the tension between the cost of data acquisition and the predictive accuracy achieved by gathering these observations in a pareto analysis, the relative cost effectiveness of these novel methods can be compared with more traditional measurements (e.g. bore logs, aquifer pumping tests, and simultaneous stream loss gaugings) for a suite of pertinent groundwater management decisions (Wallis et al 2014). This comparison illuminates those field data acquisition methods which offer the best value for the specific issues managers face in any region, and also indicates the diminishing returns of increasingly large and expensive data sets. References: Wallis I, Moore C, Post V, Wolf L, Martens E, Prommer. Using predictive uncertainty analysis to optimise tracer test design and data acquisition. Journal of Hydrology 515 (2014) 191-204. Moore, C. (2005). The use of regularized inversion in groundwater model calibration and prediction uncertainty analysis. Thesis submitted for the degree of Doctor of Philosophy at The University of Queensland, Australia. Moore, C., and Doherty, D. (2005). Role of the calibration process in reducing model predictive error. Water Resources Research 41, no.5 W05050. Westerhoff RS. Using uncertainty of Penman and Penman-Monteith methods in combined satellite and ground-based evapotranspiration estimates. Remote Sensing of Environment 169, 102-112
Mendoza, G A; Prabhu, R
2000-12-01
This paper describes an application of multiple criteria analysis (MCA) in assessing criteria and indicators adapted for a particular forest management unit. The methods include: ranking, rating, and pairwise comparisons. These methods were used in a participatory decision-making environment where a team representing various stakeholders and professionals used their expert opinions and judgements in assessing different criteria and indicators (C&I) on the one hand, and how suitable and applicable they are to a forest management unit on the other. A forest concession located in Kalimantan, Indonesia, was used as the site for the case study. Results from the study show that the multicriteria methods are effective tools that can be used as structured decision aids to evaluate, prioritize, and select sets of C&I for a particular forest management unit. Ranking and rating approaches can be used as a screening tool to develop an initial list of C&I. Pairwise comparison, on the other hand, can be used as a finer filter to further reduce the list. In addition to using these three MCA methods, the study also examines two commonly used group decision-making techniques, the Delphi method and the nominal group technique. Feedback received from the participants indicates that the methods are transparent, easy to implement, and provide a convenient environment for participatory decision-making.
Meta-analysis in evidence-based healthcare: a paradigm shift away from random effects is overdue.
Doi, Suhail A R; Furuya-Kanamori, Luis; Thalib, Lukman; Barendregt, Jan J
2017-12-01
Each year up to 20 000 systematic reviews and meta-analyses are published whose results influence healthcare decisions, thus making the robustness and reliability of meta-analytic methods one of the world's top clinical and public health priorities. The evidence synthesis makes use of either fixed-effect or random-effects statistical methods. The fixed-effect method has largely been replaced by the random-effects method as heterogeneity of study effects led to poor error estimation. However, despite the widespread use and acceptance of the random-effects method to correct this, it too remains unsatisfactory and continues to suffer from defective error estimation, posing a serious threat to decision-making in evidence-based clinical and public health practice. We discuss here the problem with the random-effects approach and demonstrate that there exist better estimators under the fixed-effect model framework that can achieve optimal error estimation. We argue for an urgent return to the earlier framework with updates that address these problems and conclude that doing so can markedly improve the reliability of meta-analytical findings and thus decision-making in healthcare.
The modified "Rockfall Hazard Rating System": a new tool for roads risk assessment
NASA Astrophysics Data System (ADS)
Budetta, P.
2003-04-01
This paper contains a modified method for the analysis of rockfall hazard along roads and motorways. The method is derived from that one developed by Pierson et alii at the Oregon State Highway Division. The Rockfall Hazard Rating System (RHRS) provides a rational way to make informed decisions on where and how to spend construction funds. An exponential scoring graph is used to represent the increase in hazard that is reflected in the nine categories forming the classification (slope height, ditch effectiveness, average vehicle risk, percent of decision site distance, roadway width, geological character, quantity of rockfall/event, climate and rock fall history). The resulting total score contains the essential elements regarding the evaluation of the consequences ("cost of failure"). In the modified method, the rating for the categories "ditch effectiveness", "decision sight distance", "rodway width", "geologic characteristic" and "climate and water circulation" have been rendered more easy and objective. The main modifications regard the introduction of the Romana's Slope Mass Rating improving the estimate of the geologic characteristics, of the volume of the potentially unstable blocks and underground water circulation. Other modifications regard the scoring determination for the categories "decision sight distance" and "road geometry". For these categories, the Italian National Council's standards (CNR) have been used. The method must be applied in both the traffic directions because the percentage of reduction in the "decision sight distance" greatly affects the results. An application of the method to a 2-km-long section of the Sorrentine road (n° 145) in Southern Italy was pointed out. A high traffic intensity affects the entire section of the road and rockfalls periodically cause casualties, as well as a large amount of damage and traffic interruptions. The method was applied on seven cross section traces of slopes adjacent to the Sorrentine road and the total final scores range between 275 and 450. For these slopes, the analysis shows that the risk is unacceptable and it must reduced using urgent remedial works. Further applications in other geological environments are welcomed.
Opening the Black Box: Cognitive Strategies in Family Practice
Christensen, Robert E.; Fetters, Michael D.; Green, Lee A.
2005-01-01
PURPOSE We wanted to describe the cognitive strategies used by family physicians when structuring the decision-making tasks of an outpatient visit. METHODS This qualitative study used cognitive task analysis, a structured interview method in which a trained interviewer works individually with expert decision makers to capture their stages and elements of information processing. RESULTS Eighteen family physicians of varying levels of experience participated. Three dominant themes emerged: time pressure, a high degree of variation in task structuring, and varying degrees of task automatization. Based on these data and previous research from the cognitive sciences, we developed a model of novice and expert approaches to decision making in primary care. The model illustrates differences in responses to unexpected opportunity in practice, particularly the expert’s use of attentional surplus (reserve capacity to handle problems) vs the novice’s choice between taking more time or displacing another task. CONCLUSIONS Family physicians have specific, highly individualized cognitive task-structuring approaches and show the decision behavior features typical of expert decision makers in other fields. This finding places constraints on and suggests useful approaches for improving practice. PMID:15798041
Supporting multi-stakeholder environmental decisions.
Hajkowicz, Stefan A
2008-09-01
This paper examines how multiple criteria analysis (MCA) can be used to support multi-stakeholder environmental management decisions. It presents a study through which 48 stakeholders from environmental, primary production and community interest groups used MCA to prioritise 30 environmental management problems in the Mackay-Whitsunday region of Queensland, Australia. The MCA model, with procedures for aggregating multi-stakeholder output, was used to inform a final decision on the priority of the region's environmental management problems. The result was used in the region's environmental management plan as required under Australia's Natural Heritage Trust programme. The study shows how relatively simple MCA methods can help stakeholders make group decisions, even when they hold strongly conflicting preferences.
Effect of Women's Decision-Making Autonomy on Infant's Birth Weight in Rural Bangladesh
Sharma, Arpana
2013-01-01
Background. Low birth weight (LBW), an outcome of maternal undernutrition, is a major public health concern in Bangladesh where the problem is most prominent. Women's decision-making autonomy is likely an important factor influencing maternal and child health outcomes. The aim of the study was to assess the effect of women's decision-making autonomy on infant's birth weight (BW). Methods. The study included data of 2175 enrolled women (14–45 years of age) from the Maternal and Infant Nutritional Intervention in Matlab (MINIMat-study) in Bangladesh. Pearson's chi-square test, analysis of covariance (ANCOVA), and logistic regression analysis were applied at the collected data. Results. Women with lowest decision-making autonomy were significantly more likely to have a low birth weight (LBW) child, after controlling for maternal age, education (woman's and her husband's), socioeconomic status (SES) (odds ratio (OR) = 1.4; 95% confidence interval (CI) 1.0, 1.8). BW was decreased significantly among women with lowest decision making autonomy after adjusting for all confounders. Conclusion. Women's decision-making autonomy has an independent effect on BW and LBW outcome. In addition, there is a need for further exploration to identify sociocultural attributes and gender related determinants of women decision-making autonomy in this study setting. PMID:24575305
Structural analysis at aircraft conceptual design stage
NASA Astrophysics Data System (ADS)
Mansouri, Reza
In the past 50 years, computers have helped by augmenting human efforts with tremendous pace. The aircraft industry is not an exception. Aircraft industry is more than ever dependent on computing because of a high level of complexity and the increasing need for excellence to survive a highly competitive marketplace. Designers choose computers to perform almost every analysis task. But while doing so, existing effective, accurate and easy to use classical analytical methods are often forgotten, which can be very useful especially in the early phases of the aircraft design where concept generation and evaluation demands physical visibility of design parameters to make decisions [39, 2004]. Structural analysis methods have been used by human beings since the very early civilization. Centuries before computers were invented; the pyramids were designed and constructed by Egyptians around 2000 B.C, the Parthenon was built by the Greeks, around 240 B.C, Dujiangyan was built by the Chinese. Persepolis, Hagia Sophia, Taj Mahal, Eiffel tower are only few more examples of historical buildings, bridges and monuments that were constructed before we had any advancement made in computer aided engineering. Aircraft industry is no exception either. In the first half of the 20th century, engineers used classical method and designed civil transport aircraft such as Ford Tri Motor (1926), Lockheed Vega (1927), Lockheed 9 Orion (1931), Douglas DC-3 (1935), Douglas DC-4/C-54 Skymaster (1938), Boeing 307 (1938) and Boeing 314 Clipper (1939) and managed to become airborne without difficulty. Evidencing, while advanced numerical methods such as the finite element analysis is one of the most effective structural analysis methods; classical structural analysis methods can also be as useful especially during the early phase of a fixed wing aircraft design where major decisions are made and concept generation and evaluation demands physical visibility of design parameters to make decisions. Considering the strength and limitations of both methodologies, the question to be answered in this thesis is: How valuable and compatible are the classical analytical methods in today's conceptual design environment? And can these methods complement each other? To answer these questions, this thesis investigates the pros and cons of classical analytical structural analysis methods during the conceptual design stage through the following objectives: Illustrate structural design methodology of these methods within the framework of Aerospace Vehicle Design (AVD) lab's design lifecycle. Demonstrate the effectiveness of moment distribution method through four case studies. This will be done by considering and evaluating the strength and limitation of these methods. In order to objectively quantify the limitation and capabilities of the analytical method at the conceptual design stage, each case study becomes more complex than the one before.
Yousuf, Naveed; Violato, Claudio; Zuberi, Rukhsana W
2015-01-01
CONSTRUCT: Authentic standard setting methods will demonstrate high convergent validity evidence of their outcomes, that is, cutoff scores and pass/fail decisions, with most other methods when compared with each other. The objective structured clinical examination (OSCE) was established for valid, reliable, and objective assessment of clinical skills in health professions education. Various standard setting methods have been proposed to identify objective, reliable, and valid cutoff scores on OSCEs. These methods may identify different cutoff scores for the same examinations. Identification of valid and reliable cutoff scores for OSCEs remains an important issue and a challenge. Thirty OSCE stations administered at least twice in the years 2010-2012 to 393 medical students in Years 2 and 3 at Aga Khan University are included. Psychometric properties of the scores are determined. Cutoff scores and pass/fail decisions of Wijnen, Cohen, Mean-1.5SD, Mean-1SD, Angoff, borderline group and borderline regression (BL-R) methods are compared with each other and with three variants of cluster analysis using repeated measures analysis of variance and Cohen's kappa. The mean psychometric indices on the 30 OSCE stations are reliability coefficient = 0.76 (SD = 0.12); standard error of measurement = 5.66 (SD = 1.38); coefficient of determination = 0.47 (SD = 0.19), and intergrade discrimination = 7.19 (SD = 1.89). BL-R and Wijnen methods show the highest convergent validity evidence among other methods on the defined criteria. Angoff and Mean-1.5SD demonstrated least convergent validity evidence. The three cluster variants showed substantial convergent validity with borderline methods. Although there was a high level of convergent validity of Wijnen method, it lacks the theoretical strength to be used for competency-based assessments. The BL-R method is found to show the highest convergent validity evidences for OSCEs with other standard setting methods used in the present study. We also found that cluster analysis using mean method can be used for quality assurance of borderline methods. These findings should be further confirmed by studies in other settings.
Sheehan, Joanne; Sherman, Kerry A; Lam, Thomas; Boyages, John
2008-01-01
This study investigated the influence of psychosocial and surgical factors on decision regret among 123 women diagnosed with breast cancer who had undergone immediate (58%) or delayed (42%) breast reconstruction following mastectomy. The majority of participants (52.8%, n = 65) experienced no decision regret, 27.6% experienced mild regret and 19.5% moderate to strong regret. Bivariate analyses indicated that decision regret was associated with negative body image and psychological distress - intrusion and avoidance. There were no differences in decision regret either with respect to methods or timing patterns of reconstructive surgery. Multinominal logistic regression analysis showed that, when controlling for mood state and time since last reconstructive procedure, increases in negative body image were associated with increased likelihood of experiencing decision regret. These findings highlight the need for optimal input from surgeons and therapists in order to promote realistic expectations regarding the outcome of breast reconstruction and to reduce the likelihood of women experiencing decision regret.
1982-05-01
Raiffa (831, LaValle [891, and other books on decision analysis. 4.2 Risk Attitudes Much recent research has focused on the investigation of various risk...Issacs, G.L., Hamer, R., Chen, J., Chuang, D., Woodworth, G., Molenaar , I., Lewis C., and Libby, D., Manual for the Computer-Assisted Data Analysis (CADA
Automated pattern analysis: A newsilent partner in insect acoustic detection studies
USDA-ARS?s Scientific Manuscript database
This seminar reviews methods that have been developed for automated analysis of field-collected sounds used to estimate pest populations and guide insect pest management decisions. Several examples are presented of successful usage of acoustic technology to map insect distributions in field environ...
2016-12-01
chosen rather than complex ones , and responds to the criticism of the DTA approach. Chapter IV provides three separate case studies in defense R&D...defense R&D projects. To this end, the first section describes the case study method and the advantages of using simple models over more complex ones ...the analysis lacked empirical data and relied on subjective data, the analysis successfully combined the DTA approach with the case study method and
Bozic, Kevin J; Chenok, Kate Eresian; Schindel, Jennifer; Chan, Vanessa; Huddleston, James I; Braddock, Clarence; Belkora, Jeffrey
2014-08-31
Despite evidence that decision and communication aids are effective for enhancing the quality of preference-sensitive decisions, their adoption in the field of orthopaedic surgery has been limited. The purpose of this mixed-methods study was to evaluate the perceived value of decision and communication aids among different healthcare stakeholders. Patients with hip or knee arthritis, orthopaedic surgeons who perform hip and knee replacement procedures, and a group of large, self-insured employers (healthcare purchasers) were surveyed regarding their views on the value of decision and communication aids in orthopaedics. Patients with hip or knee arthritis who participated in a randomized controlled trial involving decision and communication aids were asked to complete an online survey about what was most and least beneficial about each of the tools they used, the ideal mode of administration of these tools and services, and their interest in receiving comparable materials and services in the future. A subset of these patients were invited to participate in a telephone interview, where there were asked to rank and attribute a monetary value to the interventions. These interviews were analyzed using a qualitative and mixed methods analysis software. Members of the American Hip and Knee Surgeons (AAHKS) were surveyed on their perceptions and usage of decision and communication aids in orthopaedic practice. Healthcare purchasers were interviewed about their perspectives on patient-oriented decision support. All stakeholders saw value in decision and communication aids, with the major barrier to implementation being cost. Both patients and surgeons would be willing to bear at least part of the cost of implementing these tools, while employers felt health plans should be responsible for shouldering the costs. Decision and communication aids can be effective tools for incorporating patients preferences and values into preference-sensitive decisions in orthopaedics. Future efforts should be aimed at assessing strategies for efficient implementation of these tools into widespread orthopaedic practice.
Puts, Martine T E; Sattar, Schroder; McWatters, Kara; Lee, Katherine; Kulik, Michael; MacDonald, Mary-Ellen; Jang, Raymond; Amir, Eitan; Krzyzanowska, Monika K; Leighl, Natasha; Fitch, Margaret; Joshua, Anthony M; Warde, Padraig; Tourangeau, Ann E; Alibhai, Shabbir M H
2017-03-01
Although comorbidities, frailty, and functional impairment are common in older adults (OA) with cancer, little is known about how these factors are considered during the treatment decision-making process by OAs, their families, and health care providers. Our aim was to better understand the treatment decision process from all these perspectives. A mixed methods multi-perspective longitudinal study using semi-structured interviews and surveys with 29 OAs aged ≥70 years with advanced prostate, breast, colorectal, or lung cancer, 24 of their family members,13 oncologists, and 15 family physicians was conducted. The sample was stratified on age (70-79 and 80+). All interviews were analyzed using thematic analysis. There was no difference in the treatment decision-making experience based on age. Most OAs felt that they should have the final say in the treatment decision, but strongly valued their oncologists' opinion. "Trust in my oncologist" and "chemotherapy as the last resort to prolong life" were the most important reasons to accept treatment. Families indicated a need to improve communication between them, the patient and the specialist, particularly around goals of treatment. Comorbidity and potential side-effects did not play a major role in the treatment decision-making for patients, families, or oncologists. Family physicians reported no involvement in decisions but desired to be more involved. This first study using multiple perspectives showed neither frailty nor comorbidity played a role in the treatment decision-making process. Efforts to improve communication were identified as an opportunity that may enhance quality of care. In a mixed methods study multiple perspective study with older adults with cancer, their family members, their oncologist and their family physician we explored the treatment decision making process and found that most older adults were satisfied with their decision. Comorbidity, functional status and frailty did not impact the older adult's or their family members' decision.
Health economics and outcomes methods in risk-based decision-making for blood safety.
Custer, Brian; Janssen, Mart P
2015-08-01
Analytical methods appropriate for health economic assessments of transfusion safety interventions have not previously been described in ways that facilitate their use. Within the context of risk-based decision-making (RBDM), health economics can be important for optimizing decisions among competing interventions. The objective of this review is to address key considerations and limitations of current methods as they apply to blood safety. Because a voluntary blood supply is an example of a public good, analyses should be conducted from the societal perspective when possible. Two primary study designs are recommended for most blood safety intervention assessments: budget impact analysis (BIA), which measures the cost to implement an intervention both to the blood operator but also in a broader context, and cost-utility analysis (CUA), which measures the ratio between costs and health gain achieved, in terms of reduced morbidity and mortality, by use of an intervention. These analyses often have important limitations because data that reflect specific aspects, for example, blood recipient population characteristics or complication rates, are not available. Sensitivity analyses play an important role. The impact of various uncertain factors can be studied conjointly in probabilistic sensitivity analyses. The use of BIA and CUA together provides a comprehensive assessment of the costs and benefits from implementing (or not) specific interventions. RBDM is multifaceted and impacts a broad spectrum of stakeholders. Gathering and analyzing health economic evidence as part of the RBDM process enhances the quality, completeness, and transparency of decision-making. © 2015 AABB.
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim
2017-10-01
Multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in multi-modeling approach for socio-ecological risks assessment is formally defined. Observation, measurement and modeling data utilization method in the framework of multi-model approach is described. Methodology and models of risk assessment in framework of decision support approach are defined and described. Method of water quality assessment using satellite observation data is described. Method is based on analysis of spectral reflectance of aquifers. Spectral signatures of freshwater bodies and offshores are analyzed. Correlations between spectral reflectance, pollutions and selected water quality parameters are analyzed and quantified. Data of MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized verified by in-field spectrometry and lab measurements. Fuzzy logic based approach for decision support in field of water quality degradation risk is discussed. Decision on water quality category is making based on fuzzy algorithm using limited set of uncertain parameters. Data from satellite observations, field measurements and modeling is utilizing in the framework of the approach proposed. It is shown that this algorithm allows estimate water quality degradation rate and pollution risks. Problems of construction of spatial and temporal distribution of calculated parameters, as well as a problem of data regularization are discussed. Using proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
Allyn, Jérôme; Allou, Nicolas; Augustin, Pascal; Philip, Ivan; Martinet, Olivier; Belghiti, Myriem; Provenchere, Sophie; Montravers, Philippe; Ferdynus, Cyril
2017-01-01
The benefits of cardiac surgery are sometimes difficult to predict and the decision to operate on a given individual is complex. Machine Learning and Decision Curve Analysis (DCA) are recent methods developed to create and evaluate prediction models. We conducted a retrospective cohort study using a prospective collected database from December 2005 to December 2012, from a cardiac surgical center at University Hospital. The different models of prediction of mortality in-hospital after elective cardiac surgery, including EuroSCORE II, a logistic regression model and a machine learning model, were compared by ROC and DCA. Of the 6,520 patients having elective cardiac surgery with cardiopulmonary bypass, 6.3% died. Mean age was 63.4 years old (standard deviation 14.4), and mean EuroSCORE II was 3.7 (4.8) %. The area under ROC curve (IC95%) for the machine learning model (0.795 (0.755-0.834)) was significantly higher than EuroSCORE II or the logistic regression model (respectively, 0.737 (0.691-0.783) and 0.742 (0.698-0.785), p < 0.0001). Decision Curve Analysis showed that the machine learning model, in this monocentric study, has a greater benefit whatever the probability threshold. According to ROC and DCA, machine learning model is more accurate in predicting mortality after elective cardiac surgery than EuroSCORE II. These results confirm the use of machine learning methods in the field of medical prediction.
Extracting decision rules from police accident reports through decision trees.
de Oña, Juan; López, Griselda; Abellán, Joaquín
2013-01-01
Given the current number of road accidents, the aim of many road safety analysts is to identify the main factors that contribute to crash severity. To pinpoint those factors, this paper shows an application that applies some of the methods most commonly used to build decision trees (DTs), which have not been applied to the road safety field before. An analysis of accidents on rural highways in the province of Granada (Spain) between 2003 and 2009 (both inclusive) showed that the methods used to build DTs serve our purpose and may even be complementary. Applying these methods has enabled potentially useful decision rules to be extracted that could be used by road safety analysts. For instance, some of the rules may indicate that women, contrary to men, increase their risk of severity under bad lighting conditions. The rules could be used in road safety campaigns to mitigate specific problems. This would enable managers to implement priority actions based on a classification of accidents by types (depending on their severity). However, the primary importance of this proposal is that other databases not used here (i.e. other infrastructure, roads and countries) could be used to identify unconventional problems in a manner easy for road safety managers to understand, as decision rules. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shen, Jing; Lu, Hongwei; Zhang, Yang; Song, Xinshuang; He, Li
2016-05-01
As ecosystem management is a hotspot and urgent topic with increasing population growth and resource depletion. This paper develops an urban ecosystem vulnerability assessment method representing a new vulnerability paradigm for decision makers and environmental managers, as it's an early warning system to identify and prioritize the undesirable environmental changes in terms of natural, human, economic and social elements. The whole idea is to decompose a complex problem into sub-problem, and analyze each sub-problem, and then aggregate all sub-problems to solve this problem. This method integrates spatial context of Geographic Information System (GIS) tool, multi-criteria decision analysis (MCDA) method, ordered weighted averaging (OWA) operators, and socio-economic elements. Decision makers can find out relevant urban ecosystem vulnerability assessment results with different vulnerable attitude. To test the potential of the vulnerability methodology, it has been applied to a case study area in Beijing, China, where it proved to be reliable and consistent with the Beijing City Master Plan. The results of urban ecosystem vulnerability assessment can support decision makers in evaluating the necessary of taking specific measures to preserve the quality of human health and environmental stressors for a city or multiple cities, with identifying the implications and consequences of their decisions.
Etchells, Edward; Ferrari, Michel; Kiss, Alex; Martyn, Nikki; Zinman, Deborah; Levinson, Wendy
2011-06-01
Prior studies show significant gaps in the informed decision-making process, a central goal of surgical care. These studies have been limited by their focus on low-risk decisions, single visits rather than entire consultations, or both. Our objectives were, first, to rate informed decision-making for major elective vascular surgery based on audiotapes of actual physician-patient conversations and, second, to compare ratings of informed decision-making for first visits to ratings for multiple visits by the same patient over time. We prospectively enrolled patients for whom vascular surgical treatment was a potential option at a tertiary care outpatient vascular surgery clinic. We audio-taped all surgeon-patient conversations, including multiple visits when necessary, until a decision was made. Using an existing method, we evaluated the transcripts for elements of decision-making, including basic elements (e.g., an explanation of the clinical condition), intermediate elements (e.g., risks and benefits) and complex elements (e.g., uncertainty around the decision). We analyzed 145 surgeon-patient consultations. Overall, 45% of consultations contained complex elements, whereas 23% did not contain the basic elements of decision-making. For the 67 consultations that involved multiple visits, ratings were significantly higher when evaluating all visits (50% complex elements) compared with evaluating only the first visit (33% complex elements, p < 0.001.) We found that 45% of consultations contained complex elements, which is higher than prior studies with similar methods. Analyzing decision-making over multiple visits yielded different results than analyzing decision-making for single visits.
Humphries Choptiany, John Michael; Pelot, Ronald
2014-09-01
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
Loeffert, Sabine; Ommen, Oliver; Kuch, Christine; Scheibler, Fueloep; Woehrmann, Andrej; Baldamus, Conrad; Pfaff, Holger
2010-09-11
Numerous studies examined factors in promoting a patient preference for active participation in treatment decision making with only modest success. The purpose of this study was to identify types of patients wishing to participate in treatment decisions as well as those wishing to play a completely active or passive role based on a Germany-wide survey of dialysis patients; using a prediction typal analysis method that defines types as configurations of categories belonging to different attributes and takes particularly higher order interactions between variables into account. After randomly splitting the original patient sample into two halves, an exploratory prediction configural frequency analysis (CFA) was performed on one-half of the sample (n = 1969) and the identified types were considered as hypotheses for an inferential prediction CFA for the second half (n = 1914). 144 possible prediction types were tested by using five predictor variables and control preferences as criterion. An α-adjustment (0.05) for multiple testing was performed by the Holm procedure. 21 possible prediction types were identified as hypotheses in the exploratory prediction CFA; four patient types were confirmed in the confirmatory prediction CFA: patients preferring a passive role show low information seeking preference, above average trust in their physician, perceive their physician's participatory decision-making (PDM)-style positive, have a lower educational level, and are 56-75 years old (Type 1; p < 0.001) or > 76 years old (Type 2; p < 0.001). Patients preferring an active role show high information seeking preference, a higher educational level, and are < 55 years old. They have either below average trust, perceive the PDM-style negative (Type 3; p < 0.001) or above average trust and perceive the PDM-style positive (Type 4; p < 0.001). The method prediction configural frequency analysis was newly introduced to the research field of patient participation and could demonstrate how a particular control preference role is determined by an association of five variables.
Dexter, Franklin; Ledolter, Johannes
2003-07-01
Surgeons using the same amount of operating room (OR) time differ in their achieved hospital contribution margins (revenue minus variable costs) by >1000%. Thus, to improve the financial return from perioperative facilities, OR strategic decisions should selectively focus additional OR capacity and capital purchasing on a few surgeons or subspecialties. These decisions use estimates of each surgeon's and/or subspecialty's contribution margin per OR hour. The estimates are subject to uncertainty (e.g., from outliers). We account for the uncertainties by using mean-variance portfolio analysis (i.e., quadratic programming). This method characterizes the problem of selectively expanding OR capacity based on the expected financial return and risk of different portfolios of surgeons. The assessment reveals whether the choices, of which surgeons have their OR capacity expanded, are sensitive to the uncertainties in the surgeons' contribution margins per OR hour. Thus, mean-variance analysis reduces the chance of making strategic decisions based on spurious information. We also assess the financial benefit of using mean-variance portfolio analysis when the planned expansion of OR capacity is well diversified over at least several surgeons or subspecialties. Our results show that, in such circumstances, there may be little benefit from further changing the portfolio to reduce its financial risk. Surgeon and subspecialty specific hospital financial data are uncertain, a fact that should be taken into account when making decisions about expanding operating room capacity. We show that mean-variance portfolio analysis can incorporate this uncertainty, thereby guiding operating room management decision-making and reducing the chance of a strategic decision being made based on spurious information.
Anagnostou, Despina; Sivell, Stephanie; Noble, Simon; Lester, Jason; Byrne, Anthony; Sampson, Catherine; Longo, Mirella; Nelson, Annmarie
2017-07-12
Patient-centred care is essential to the delivery of healthcare; however, this necessitates direct patient involvement in clinical decision-making and can be challenging for patients diagnosed with advanced non-small cell lung cancer where there may be misunderstanding of the extent of disease, prognosis and aims of treatment. In this context, decisions are complex and there is a need to balance the risks and benefits, including treatment with palliative intent. The aim of the PACT study is to identify the information and decision support needs of patients, leading to the development of an intervention to support patients with advanced lung cancer when considering treatment options. PACT is a five-stage, multimethod and multicentre study. Participants : Patients and health professionals will be recruited from three health boards. Methods : Non-participant observation of multidisciplinary team meetings (n=12) will be used to determine patients' allocation to treatment pathways (stage I). Non-participant observation of patient-clinician consultations (n=20-30) will be used to explore communication of treatment options and decision-making. Extent of participation in decision-making will be assessed using the Observing Patient Involvement in Shared Decision-Making tool. Interviews with patients (stage III) and their clinicians (stage IV) will explore the perception of treatment options and involvement in decision-making. Based on stages I-IV, an expert consensus meeting will finalise the content and format of the intervention. Cognitive interviews with patients will then determine the face validity of the intervention (stage V). Analysis : analysis will be according to data type and research question and will include mediated discourse analysis, thematic analysis, framework analysis and interpretative phenomenological analysis. Ethical approval has been granted. The study findings will contribute to and promote shared and informed decision-making in the best interest of patients and prudent healthcare. We therefore aim to disseminate results via relevant respiratory, oncology and palliative care journals and conferences. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Decision making in asthma exacerbation: a clinical judgement analysis
Jenkins, John; Shields, Mike; Patterson, Chris; Kee, Frank
2007-01-01
Background Clinical decisions which impact directly on patient safety and quality of care are made during acute asthma attacks by individual doctors based on their knowledge and experience. Decisions include administration of systemic corticosteroids (CS) and oral antibiotics, and admission to hospital. Clinical judgement analysis provides a methodology for comparing decisions between practitioners with different training and experience, and improving decision making. Methods Stepwise linear regression was used to select clinical cues based on visual analogue scale assessments of the propensity of 62 clinicians to prescribe a short course of oral CS (decision 1), a course of antibiotics (decision 2), and/or admit to hospital (decision 3) for 60 “paper” patients. Results When compared by specialty, paediatricians' models for decision 1 were more likely to include level of alertness as a cue (54% vs 16%); for decision 2 they were more likely to include presence of crepitations (49% vs 16%) and less likely to include inhaled CS (8% vs 40%), respiratory rate (0% vs 24%) and air entry (70% vs 100%). When compared to other grades, the models derived for decision 3 by consultants/general practitioners were more likely to include wheeze severity as a cue (39% vs 6%). Conclusions Clinicians differed in their use of individual cues and the number included in their models. Patient safety and quality of care will benefit from clarification of decision‐making strategies as general learning points during medical training, in the development of guidelines and care pathways, and by clinicians developing self‐awareness of their own preferences. PMID:17428817
Multicriteria analysis of ontologically represented information
NASA Astrophysics Data System (ADS)
Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.
2014-11-01
Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we have considered the Analytical Hierarchy Process (AHP), which is well suited for the hierarchical data structures (e.g., such that have been formulated in terms of ontologies). However, due to its well-known shortcomings, we have decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
Paraconsistent Annotated Logic in Viability Analysis: an Approach to Product Launching
NASA Astrophysics Data System (ADS)
Romeu de Carvalho, Fábio; Brunstein, Israel; Abe, Jair Minoro
2004-08-01
In this paper we present an application of the Para-analyzer, a logical analyzer based on the Paraconsistent Annotated Logic Pτ, introduced by Da Silva Filho and Abe in the decision-making systems. An example is analyzed in detail showing how uncertainty, inconsistency and paracompleteness can be elegantly handled with this logical system. As application for the Para-analyzer in decision-making, we developed the BAM — Baricenter Analysis Method. In order to make the presentation easier, we present the BAM applied in the viability analysis of product launching. Some of the techniques of Paraconsistent Annotated Logic have been applied in Artificial Intelligence, Robotics, Information Technolgy (Computer Sciences), etc..
ERIC Educational Resources Information Center
Hollands, Fiona M.; Kieffer, Michael J.; Shand, Robert; Pan, Yilin; Cheng, Henan; Levin, Henry M.
2016-01-01
We review the value of cost-effectiveness analysis for evaluation and decision making with respect to educational programs and discuss its application to early reading interventions. We describe the conditions for a rigorous cost-effectiveness analysis and illustrate the challenges of applying the method in practice, providing examples of programs…
Teaching Data Analysis with Interactive Visual Narratives
ERIC Educational Resources Information Center
Saundage, Dilal; Cybulski, Jacob L.; Keller, Susan; Dharmasena, Lasitha
2016-01-01
Data analysis is a major part of business analytics (BA), which refers to the skills, methods, and technologies that enable managers to make swift, quality decisions based on large amounts of data. BA has become a major component of Information Systems (IS) courses all over the world. The challenge for IS educators is to teach data analysis--the…
Urban Rain Gauge Siting Selection Based on Gis-Multicriteria Analysis
NASA Astrophysics Data System (ADS)
Fu, Yanli; Jing, Changfeng; Du, Mingyi
2016-06-01
With the increasingly rapid growth of urbanization and climate change, urban rainfall monitoring as well as urban waterlogging has widely been paid attention. In the light of conventional siting selection methods do not take into consideration of geographic surroundings and spatial-temporal scale for the urban rain gauge site selection, this paper primarily aims at finding the appropriate siting selection rules and methods for rain gauge in urban area. Additionally, for optimization gauge location, a spatial decision support system (DSS) aided by geographical information system (GIS) has been developed. In terms of a series of criteria, the rain gauge optimal site-search problem can be addressed by a multicriteria decision analysis (MCDA). A series of spatial analytical techniques are required for MCDA to identify the prospective sites. With the platform of GIS, using spatial kernel density analysis can reflect the population density; GIS buffer analysis is used to optimize the location with the rain gauge signal transmission character. Experiment results show that the rules and the proposed method are proper for the rain gauge site selection in urban areas, which is significant for the siting selection of urban hydrological facilities and infrastructure, such as water gauge.
Initiating decision-making conversations in palliative care: an ethnographic discourse analysis.
Bélanger, Emmanuelle; Rodríguez, Charo; Groleau, Danielle; Légaré, France; Macdonald, Mary Ellen; Marchand, Robert
2014-01-01
Conversations about end-of-life care remain challenging for health care providers. The tendency to delay conversations about care options represents a barrier that impedes the ability of terminally-ill patients to participate in decision-making. Family physicians with a palliative care practice are often responsible for discussing end-of-life care preferences with patients, yet there is a paucity of research directly observing these interactions. In this study, we sought to explore how patients and family physicians initiated decision-making conversations in the context of a community hospital-based palliative care service. This qualitative study combined discourse analysis with ethnographic methods. The field research lasted one year, and data were generated through participant observation and audio-recordings of consultations. A total of 101 consultations were observed longitudinally between 18 patients, 6 family physicians and 2 pivot nurses. Data analysis consisted in exploring the different types of discourses initiating decision-making conversations and how these discourses were affected by the organizational context in which they took place. The organization of care had an impact on decision-making conversations. The timing and origin of referrals to palliative care shaped whether patients were still able to participate in decision-making, and the decisions that remained to be made. The type of decisions to be made also shaped how conversations were initiated. Family physicians introduced decision-making conversations about issues needing immediate attention, such as symptom management, by directly addressing or eliciting patients' complaints. When decisions involved discussing impending death, decision-making conversations were initiated either indirectly, by prompting the patients to express their understanding of the disease and its progression, or directly, by providing a justification for broaching a difficult topic. Decision-making conversations and the initiation thereof were framed by the organization of care and the referral process prior to initial encounters. While symptom management was taken for granted as part of health care professionals' expected role, engaging in decisions regarding preparation for death implicitly remained under patients' control. This work makes important clinical contributions by exposing the rhetorical function of family physicians' discourse when introducing palliative care decisions.
Klein, Kelly R.; Burkle Jr., Frederick M.; Swienton, Raymond; King, Richard V.; Lehman, Thomas; North, Carol S.
2016-01-01
Introduction: After all large-scale disasters multiple papers are published describing the shortcomings of the triage methods utilized. This paper uses medical provider input to help describe attributes and patient characteristics that impact triage decisions. Methods: A survey distributed electronically to medical providers with and without disaster experience. Questions asked included what disaster experiences they had, and to rank six attributes in order of importance regarding triage. Results: 403 unique completed surveys were analyzed. 92% practiced a structural triage approach with the rest reporting they used “gestalt”.(gut feeling) Twelve per cent were identified as having placed patients in an expectant category during triage. Respiratory status, ability to speak, perfusion/pulse were all ranked in the top three. Gut feeling regardless of statistical analysis was fourth. Supplies were ranked in the top four when analyzed for those who had placed patients in the expectant category. Conclusion: Primary triage decisions in a mass casualty scenario are multifactorial and encompass patient mobility, life saving interventions, situational instincts, and logistics. PMID:27651979
NASA Astrophysics Data System (ADS)
El-Gafy, Mohamed Anwar
Transportation projects will have impact on the environment. The general environmental pollution and damage caused by roads is closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) are dependent on geo-spatial information in order to make an assessment, there are no rules per se how to conduct an environmental assessment. Also, the particular objective of each assessment is dictated case-by-case, based on what information and analyses are required. The conventional way of Environmental Impact Assessment (EIA) study is a time consuming process because it has large number of dependent and independent variables which have to be taken into account, which also have different consequences. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the Environmental Impact Assessment (EIA) for transportation projects based on the integration between remote sensing technology, geographic information systems, and spatial modeling. By integrating the merits of the map overlay method and the matrix method, the framework analyzes comprehensively the environmental vulnerability around the road and its impact on the environment. This framework is expected to: (1) improve the quality of the decision making process, (2) be applied both to urban and inter-urban projects, regardless of transport mode, and (3) present the data and make the appropriate analysis to support the decision of the decision-makers and allow them to present these data to the public hearings in a simple manner. Case studies, transportation projects in the State of Florida, were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities. This cohesive and integrated system will facilitate rational decisions through cost effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
Data Analysis and Data Mining: Current Issues in Biomedical Informatics
Bellazzi, Riccardo; Diomidous, Marianna; Sarkar, Indra Neil; Takabayashi, Katsuhiko; Ziegler, Andreas; McCray, Alexa T.
2011-01-01
Summary Background Medicine and biomedical sciences have become data-intensive fields, which, at the same time, enable the application of data-driven approaches and require sophisticated data analysis and data mining methods. Biomedical informatics provides a proper interdisciplinary context to integrate data and knowledge when processing available information, with the aim of giving effective decision-making support in clinics and translational research. Objectives To reflect on different perspectives related to the role of data analysis and data mining in biomedical informatics. Methods On the occasion of the 50th year of Methods of Information in Medicine a symposium was organized, that reflected on opportunities, challenges and priorities of organizing, representing and analysing data, information and knowledge in biomedicine and health care. The contributions of experts with a variety of backgrounds in the area of biomedical data analysis have been collected as one outcome of this symposium, in order to provide a broad, though coherent, overview of some of the most interesting aspects of the field. Results The paper presents sections on data accumulation and data-driven approaches in medical informatics, data and knowledge integration, statistical issues for the evaluation of data mining models, translational bioinformatics and bioinformatics aspects of genetic epidemiology. Conclusions Biomedical informatics represents a natural framework to properly and effectively apply data analysis and data mining methods in a decision-making context. In the future, it will be necessary to preserve the inclusive nature of the field and to foster an increasing sharing of data and methods between researchers. PMID:22146916
ON-SITE MERCURY ANALYSIS OF SOIL AT HAZARDOUS WASTE SITES BY IMMUNOASSAY AND ASV
Two field methods for Hg, immunoassay and anodic stripping voltammetry (ASV), that can provide onsite results for quick decisions at hazardous waste sites were evaluated. Each method was applied to samples from two Superfund sites that contain high levels of Hg; Sulphur Bank Me...
A number of investigators have recently examined the utility of applying probabilistic techniques in the derivation of toxic equivalency factors (TEFs) for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (...
Lin, Zi-Jing; Li, Lin; Cazzell, Mary; Liu, Hanli
2014-08-01
Diffuse optical tomography (DOT) is a variant of functional near infrared spectroscopy and has the capability of mapping or reconstructing three dimensional (3D) hemodynamic changes due to brain activity. Common methods used in DOT image analysis to define brain activation have limitations because the selection of activation period is relatively subjective. General linear model (GLM)-based analysis can overcome this limitation. In this study, we combine the atlas-guided 3D DOT image reconstruction with GLM-based analysis (i.e., voxel-wise GLM analysis) to investigate the brain activity that is associated with risk decision-making processes. Risk decision-making is an important cognitive process and thus is an essential topic in the field of neuroscience. The Balloon Analog Risk Task (BART) is a valid experimental model and has been commonly used to assess human risk-taking actions and tendencies while facing risks. We have used the BART paradigm with a blocked design to investigate brain activations in the prefrontal and frontal cortical areas during decision-making from 37 human participants (22 males and 15 females). Voxel-wise GLM analysis was performed after a human brain atlas template and a depth compensation algorithm were combined to form atlas-guided DOT images. In this work, we wish to demonstrate the excellence of using voxel-wise GLM analysis with DOT to image and study cognitive functions in response to risk decision-making. Results have shown significant hemodynamic changes in the dorsal lateral prefrontal cortex (DLPFC) during the active-choice mode and a different activation pattern between genders; these findings correlate well with published literature in functional magnetic resonance imaging (fMRI) and fNIRS studies. Copyright © 2014 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.
Giordano, R; Passarella, G; Uricchio, V F; Vurro, M
2007-07-01
The importance of shared decision processes in water management derives from the awareness of the inadequacy of traditional--i.e. engineering--approaches in dealing with complex and ill-structured problems. It is becoming increasingly obvious that traditional problem solving and decision support techniques, based on optimisation and factual knowledge, have to be combined with stakeholder based policy design and implementation. The aim of our research is the definition of an integrated decision support system for consensus achievement (IDSS-C) able to support a participative decision-making process in all its phases: problem definition and structuring, identification of the possible alternatives, formulation of participants' judgments, and consensus achievement. Furthermore, the IDSS-C aims at structuring, i.e. systematising the knowledge which has emerged during the participative process in order to make it comprehensible for the decision-makers and functional for the decision process. Problem structuring methods (PSM) and multi-group evaluation methods (MEM) have been integrated in the IDSS-C. PSM are used to support the stakeholders in providing their perspective of the problem and to elicit their interests and preferences, while MEM are used to define not only the degree of consensus for each alternative, highlighting those where the agreement is high, but also the consensus label for each alternative and the behaviour of individuals during the participative decision-making. The IDSS-C is applied experimentally to a decision process regarding the use of treated wastewater for agricultural irrigation in the Apulia Region (southern Italy).
Drake, Julia I.; de Hart, Juan Carlos Trujillo; Monleón, Clara; Toro, Walter; Valentim, Joice
2017-01-01
ABSTRACT Background and objectives: MCDA is a decision-making tool with increasing use in the healthcare sector, including HTA (Health Technology Assessment). By applying multiple criteria, including innovation, in a comprehensive, structured and explicit manner, MCDA fosters a transparent, participative, consistent decision-making process taking into consideration values of all stakeholders. This paper by FIFARMA (Latin American Federation of Pharmaceutical Industry) proposes the deliberative (partial) MCDA as a more pragmatic, agile approach, especially when newly implemented. Methods: Literature review including real-world examples of effective MCDA implementation in healthcare decision making in both the public and private sector worldwide and in LA. Results and conclusion: It is the view of FIFARMA that MCDA should strongly be considered as a tool to support HTA and broader healthcare decision making such as the contracts and tenders process in order to foster transparency, fairness, and collaboration amongst stakeholders. PMID:29081919
Hutchinson, Marie; Hurley, John; Kozlowski, Desirée; Whitehair, Leeann
2018-02-01
To explore clinical nurses' experiences of using emotional intelligence capabilities during clinical reasoning and decision-making. There has been little research exploring whether, or how, nurses employ emotional intelligence (EI) in clinical reasoning and decision-making. Qualitative phase of a larger mixed-methods study. Semistructured qualitative interviews with a purposive sample of registered nurses (n = 12) following EI training and coaching. Constructivist thematic analysis was employed to analyse the narrative transcripts. Three themes emerged: the sensibility to engage EI capabilities in clinical contexts, motivation to actively engage with emotions in clinical decision-making and incorporating emotional and technical perspectives in decision-making. Continuing to separate cognition and emotion in research, theorising and scholarship on clinical reasoning is counterproductive. Understanding more about nurses' use of EI has the potential to improve the calibre of decisions, and the safety and quality of care delivered. © 2017 John Wiley & Sons Ltd.
2008-06-01
capacity planning; • Electrical generation capacity planning; • Machine scheduling; • Freight scheduling; • Dairy farm expansion planning...Support Systems and Multi Criteria Decision Analysis Products A.2.11.2.2.1 ELECTRE IS ELECTRE IS is a generalization of ELECTRE I. It is a...criteria, ELECTRE IS supports the user in the process of selecting one alternative or a subset of alternatives. The method consists of two parts
Background / Question / Methods Planning for the recovery of threatened species is increasingly informed by spatially-explicit population models. However, using simulation model results to guide land management decisions can be difficult due to the volume and complexity of model...
Gibert, Karina; García-Alonso, Carlos; Salvador-Carulla, Luis
2010-09-30
Decision support in health systems is a highly difficult task, due to the inherent complexity of the process and structures involved. This paper introduces a new hybrid methodology Expert-based Cooperative Analysis (EbCA), which incorporates explicit prior expert knowledge in data analysis methods, and elicits implicit or tacit expert knowledge (IK) to improve decision support in healthcare systems. EbCA has been applied to two different case studies, showing its usability and versatility: 1) Bench-marking of small mental health areas based on technical efficiency estimated by EbCA-Data Envelopment Analysis (EbCA-DEA), and 2) Case-mix of schizophrenia based on functional dependency using Clustering Based on Rules (ClBR). In both cases comparisons towards classical procedures using qualitative explicit prior knowledge were made. Bayesian predictive validity measures were used for comparison with expert panels results. Overall agreement was tested by Intraclass Correlation Coefficient in case "1" and kappa in both cases. EbCA is a new methodology composed by 6 steps:. 1) Data collection and data preparation; 2) acquisition of "Prior Expert Knowledge" (PEK) and design of the "Prior Knowledge Base" (PKB); 3) PKB-guided analysis; 4) support-interpretation tools to evaluate results and detect inconsistencies (here Implicit Knowledg -IK- might be elicited); 5) incorporation of elicited IK in PKB and repeat till a satisfactory solution; 6) post-processing results for decision support. EbCA has been useful for incorporating PEK in two different analysis methods (DEA and Clustering), applied respectively to assess technical efficiency of small mental health areas and for case-mix of schizophrenia based on functional dependency. Differences in results obtained with classical approaches were mainly related to the IK which could be elicited by using EbCA and had major implications for the decision making in both cases. This paper presents EbCA and shows the convenience of completing classical data analysis with PEK as a mean to extract relevant knowledge in complex health domains. One of the major benefits of EbCA is iterative elicitation of IK.. Both explicit and tacit or implicit expert knowledge are critical to guide the scientific analysis of very complex decisional problems as those found in health system research.
Patient or physician preferences for decision analysis: the prenatal genetic testing decision.
Heckerling, P S; Verp, M S; Albert, N
1999-01-01
The choice between amniocentesis and chorionic villus sampling for prenatal genetic testing involves tradeoffs of the benefits and risks of the tests. Decision analysis is a method of explicitly weighing such tradeoffs. The authors examined the relationship between prenatal test choices made by patients and the choices prescribed by decision-analytic models based on their preferences, and separate models based on the preferences of their physicians. Preferences were assessed using written scenarios describing prenatal testing outcomes, and were recorded on linear rating scales. After adjustment for sociodemographic and obstetric confounders, test choice was significantly associated with the choice of decision models based on patient preferences (odds ratio 4.44; Cl, 2.53 to 7.78), but not with the choice of models based on the preferences of the physicians (odds ratio 1.60; Cl, 0.79 to 3.26). Agreement between decision analyses based on patient preferences and on physician preferences was little better than chance (kappa = 0.085+/-0.063). These results were robust both to changes in the decision-analytic probabilities and to changes in the model structure itself to simulate non-expected utility decision rules. The authors conclude that patient but not physician preferences, incorporated in decision models, correspond to the choice of amniocentesis or chorionic villus sampling made by the patient. Nevertheless, because patient preferences were assessed after referral for genetic testing, prospective preference-assessment studies will be necessary to confirm this association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prindle, N.H.; Mendenhall, F.T.; Trauth, K.
1996-05-01
The Systems Prioritization Method (SPM) is a decision-aiding tool developed by Sandia National Laboratories (SNL). SPM provides an analytical basis for supporting programmatic decisions for the Waste Isolation Pilot Plant (WIPP) to meet selected portions of the applicable US EPA long-term performance regulations. The first iteration of SPM (SPM-1), the prototype for SPM< was completed in 1994. It served as a benchmark and a test bed for developing the tools needed for the second iteration of SPM (SPM-2). SPM-2, completed in 1995, is intended for programmatic decision making. This is Volume II of the three-volume final report of the secondmore » iteration of the SPM. It describes the technical input and model implementation for SPM-2, and presents the SPM-2 technical baseline and the activities, activity outcomes, outcome probabilities, and the input parameters for SPM-2 analysis.« less
Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz
2016-02-01
(1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; (2) To exploit expert knowledge in the BN development since further data acquisition is usually not possible; (3) To ensure the BN model can be used for interventional analysis; (4) To demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data is available. The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recent validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. When employed with the same datasets, the DSVM-MSS demonstrated competitive to superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. This development process is applicable to any application domain which involves large-scale decision analysis based on such complex information, rather than based on data with hard facts, and in conjunction with the incorporation of expert knowledge for decision support via intervention. The novelty extends to challenging the decision scientists to reason about building models based on what information is really required for inference, rather than based on what data is available and hence, forces decision scientists to use available data in a much smarter way. Copyright © 2016 Elsevier B.V. All rights reserved.
Naturalistic Decision Making for Power System Operators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.; Podmore, Robin; Robinson, Marck
2010-02-01
Motivation – Investigations of large-scale outages in the North American interconnected electric system often attribute the causes to three T’s: Trees, Training and Tools. To document and understand the mental processes used by expert operators when making critical decisions, a naturalistic decision making (NDM) model was developed. Transcripts of conversations were analyzed to reveal and assess NDM-based performance criteria. Findings/Design – An item analysis indicated that the operators’ Situation Awareness Levels, mental models, and mental simulations can be mapped at different points in the training scenario. This may identify improved training methods or analytical/ visualization tools. Originality/Value – This studymore » applies for the first time, the concepts of Recognition Primed Decision Making, Situation Awareness Levels and Cognitive Task Analysis to training of electric power system operators. Take away message – The NDM approach provides a viable framework for systematic training management to accelerate learning in simulator-based training scenarios for power system operators and teams.« less
Directional Slack-Based Measure for the Inverse Data Envelopment Analysis
Abu Bakar, Mohd Rizam; Lee, Lai Soon; Jaafar, Azmi B.; Heydar, Maryam
2014-01-01
A novel technique has been introduced in this research which lends its basis to the Directional Slack-Based Measure for the inverse Data Envelopment Analysis. In practice, the current research endeavors to elucidate the inverse directional slack-based measure model within a new production possibility set. On one occasion, there is a modification imposed on the output (input) quantities of an efficient decision making unit. In detail, the efficient decision making unit in this method was omitted from the present production possibility set but substituted by the considered efficient decision making unit while its input and output quantities were subsequently modified. The efficiency score of the entire DMUs will be retained in this approach. Also, there would be an improvement in the efficiency score. The proposed approach was investigated in this study with reference to a resource allocation problem. It is possible to simultaneously consider any upsurges (declines) of certain outputs associated with the efficient decision making unit. The significance of the represented model is accentuated by presenting numerical examples. PMID:24883350
Eckman, Mark H.; Alonso-Coello, Pablo; Guyatt, Gordon H.; Ebrahim, Shanil; Tikkinen, Kari A.O.; Lopes, Luciane Cruz; Neumann, Ignacio; McDonald, Sarah D.; Zhang, Yuqing; Zhou, Qi; Akl, Elie A.; Jacobsen, Ann Flem; Santamaría, Amparo; Annichino-Bizzacchi, Joyce Maria; Bitar, Wael; Sandset, Per Morten; Bates, Shannon M.
2016-01-01
Background Women with a history of venous thromboembolism (VTE) have an increased recurrence risk during pregnancy. Low molecular weight heparin (LMWH) reduces this risk, but is costly, burdensome, and may increase risk of bleeding. The decision to start thromboprophylaxis during pregnancy is sensitive to women's values and preferences. Our objective was to compare women's choices using a holistic approach in which they were presented all of the relevant information (direct-choice) versus a personalized decision analysis in which a mathematical model incorporated their preferences and VTE risk to make a treatment recommendation. Methods Multicenter, international study. Structured interviews were on women with a history of VTE who were pregnant, planning, or considering pregnancy. Women indicated their willingness to receive thromboprophylaxis based on scenarios using personalized estimates of VTE recurrence and bleeding risks. We also obtained women's values for health outcomes using a visual analog scale. We performed individualized decision analyses for each participant and compared model recommendations to decisions made when presented with the direct-choice exercise. Results Of the 123 women in the study, the decision model recommended LMWH for 51 women and recommended against LMWH for 72 women. 12% (6/51) of women for whom the decision model recommended thromboprophylaxis chose not to take LMWH; 72% (52/72) of women for whom the decision model recommended against thromboprophylaxis chose LMWH. Conclusions We observed a high degree of discordance between decisions in the direct-choice exercise and decision model recommendations. Although which approach best captures individuals’ true values remains uncertain, personalized decision support tools presenting results based on personalized risks and values may improve decision making. PMID:26033397
NASA Astrophysics Data System (ADS)
Cunningham, Jessica D.
Newton's Universe (NU), an innovative teacher training program, strives to obtain measures from rural, middle school science teachers and their students to determine the impact of its distance learning course on understanding of temperature. No consensus exists on the most appropriate and useful method of analysis to measure change in psychological constructs over time. Several item response theory (IRT) models have been deemed useful in measuring change, which makes the choice of an IRT model not obvious. The appropriateness and utility of each model, including a comparison to a traditional analysis of variance approach, was investigated using middle school science student performance on an assessment over an instructional period. Predetermined criteria were outlined to guide model selection based on several factors including research questions, data properties, and meaningful interpretations to determine the most appropriate model for this study. All methods employed in this study reiterated one common interpretation of the data -- specifically, that the students of teachers with any NU course experience had significantly greater gains in performance over the instructional period. However, clear distinctions were made between an analysis of variance and the racked and stacked analysis using the Rasch model. Although limited research exists examining the usefulness of the Rasch model in measuring change in understanding over time, this study applied these methods and detailed plausible implications for data-driven decisions based upon results for NU and others. Being mindful of the advantages and usefulness of each method of analysis may help others make informed decisions about choosing an appropriate model to depict changes to evaluate other programs. Results may encourage other researchers to consider the meaningfulness of using IRT for this purpose. Results have implications for data-driven decisions for future professional development courses, in science education and other disciplines. KEYWORDS: Item Response Theory, Rasch Model, Racking and Stacking, Measuring Change in Student Performance, Newton's Universe teacher training
Smart algorithms and adaptive methods in computational fluid dynamics
NASA Astrophysics Data System (ADS)
Tinsley Oden, J.
1989-05-01
A review is presented of the use of smart algorithms which employ adaptive methods in processing large amounts of data in computational fluid dynamics (CFD). Smart algorithms use a rationally based set of criteria for automatic decision making in an attempt to produce optimal simulations of complex fluid dynamics problems. The information needed to make these decisions is not known beforehand and evolves in structure and form during the numerical solution of flow problems. Once the code makes a decision based on the available data, the structure of the data may change, and criteria may be reapplied in order to direct the analysis toward an acceptable end. Intelligent decisions are made by processing vast amounts of data that evolve unpredictably during the calculation. The basic components of adaptive methods and their application to complex problems of fluid dynamics are reviewed. The basic components of adaptive methods are: (1) data structures, that is what approaches are available for modifying data structures of an approximation so as to reduce errors; (2) error estimation, that is what techniques exist for estimating error evolution in a CFD calculation; and (3) solvers, what algorithms are available which can function in changing meshes. Numerical examples which demonstrate the viability of these approaches are presented.
Vulnerability assessment of atmospheric environment driven by human impacts.
Zhang, Yang; Shen, Jing; Ding, Feng; Li, Yu; He, Li
2016-11-15
Atmospheric environment quality worsening is a substantial threat to public health worldwide, and in many places, air pollution due to the intensification of the human activity is increasing dramatically. However, no studies have been investigated the integration of vulnerability assessment and atmospheric environment driven by human impacts. The objective of this study was to identify and prioritize the undesirable environmental changes as an early warning system for environment managers and decision makers in term of human, atmospheric environment, and social economic elements. We conduct a vulnerability assessment method of atmospheric environment associated with human impact, this method integrates spatial context of Geographic Information System (GIS) tool, multi-criteria decision analysis (MCDA) method, ordered weighted averaging (OWA) operators under the Exposure-Sensitivity- Adaptive Capacity (ESA) framework. Decision makers can find out relevant vulnerability assessment results with different vulnerable attitudes. In the Beijing-Tianjin-Hebei (BTH) region, China, we further applied this developed method and proved it to be reliable and consistent with the China Environmental Status Bulletin. Results indicate that the vulnerability of atmospheric environment in the BTH region is not optimistic, and environment managers should do more about air pollution. Thus, the most appropriate strategic decision and development program of city or state can be picked out assisting by the vulnerable results. Copyright © 2016 Elsevier B.V. All rights reserved.
A new approach to enhance the performance of decision tree for classifying gene expression data.
Hassan, Md; Kotagiri, Ramamohanarao
2013-12-20
Gene expression data classification is a challenging task due to the large dimensionality and very small number of samples. Decision tree is one of the popular machine learning approaches to address such classification problems. However, the existing decision tree algorithms use a single gene feature at each node to split the data into its child nodes and hence might suffer from poor performance specially when classifying gene expression dataset. By using a new decision tree algorithm where, each node of the tree consists of more than one gene, we enhance the classification performance of traditional decision tree classifiers. Our method selects suitable genes that are combined using a linear function to form a derived composite feature. To determine the structure of the tree we use the area under the Receiver Operating Characteristics curve (AUC). Experimental analysis demonstrates higher classification accuracy using the new decision tree compared to the other existing decision trees in literature. We experimentally compare the effect of our scheme against other well known decision tree techniques. Experiments show that our algorithm can substantially boost the classification performance of the decision tree.
Gebresenbet, Girma
2018-01-01
Consumers’ demand for locally produced and organic foods has increased in Sweden. This paper presents the results obtained from the analysis of data acquired from 100 consumers in Sweden who participated in an online survey during March to June 2016. The objective was to identify consumers’ demand in relation to organic food and sustainable food production, and to understand how the consumers evaluate food quality and make buying decisions. Qualitative descriptions, descriptive statistics and Pearson’s Chi-square test (with alpha value of p < 0.05 as level of significance), and Pearson’s correlation coefficient were used for analysis. About 72% of participants have the perception that organic food production method is more sustainable than conventional methods. Female consumers have more positive attitudes than men towards organic food. However, age difference, household size and income level do not significantly influence the consumers’ perception of sustainable food production concepts. Regionality, sustainable methods of production and organic production are the most important parameters to characterize the food as high quality and make buying decisions. On the other hand, product uniformity, appearance, and price were found to be relatively less important parameters. Food buying decisions and food quality were found to be highly related with Pearson’s correlation coefficient of r = 0.99. PMID:29614785
Bosona, Techane; Gebresenbet, Girma
2018-04-01
Consumers' demand for locally produced and organic foods has increased in Sweden. This paper presents the results obtained from the analysis of data acquired from 100 consumers in Sweden who participated in an online survey during March to June 2016. The objective was to identify consumers' demand in relation to organic food and sustainable food production, and to understand how the consumers evaluate food quality and make buying decisions. Qualitative descriptions, descriptive statistics and Pearson's Chi-square test (with alpha value of p < 0.05 as level of significance), and Pearson's correlation coefficient were used for analysis. About 72% of participants have the perception that organic food production method is more sustainable than conventional methods. Female consumers have more positive attitudes than men towards organic food. However, age difference, household size and income level do not significantly influence the consumers' perception of sustainable food production concepts. Regionality, sustainable methods of production and organic production are the most important parameters to characterize the food as high quality and make buying decisions. On the other hand, product uniformity, appearance, and price were found to be relatively less important parameters. Food buying decisions and food quality were found to be highly related with Pearson's correlation coefficient of r = 0.99.
Fundamentals of health risk assessment. Use, derivation, validity and limitations of safety indices.
Putzrath, R M; Wilson, J D
1999-04-01
We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the "NAS paradigm." Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard, Safety Indices such as "Acceptable Daily Intake," "Reference Dose," and so on, usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these Safety Indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example, in California's "Proposition 65," where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: this kind characterizes risk as likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, such as in EPA's implementation of "conventional air pollutants." These estimates can not be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and how what is done is depicted.
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
Carbogim, Fábio da Costa; de Oliveira, Larissa Bertacchini; Püschel, Vilanice Alves de Araújo
2016-01-01
ABSTRACT Objective: to analyze the concept of critical thinking (CT) in Rodger's evolutionary perspective. Method: documentary research undertaken in the Cinahl, Lilacs, Bdenf and Dedalus databases, using the keywords of 'critical thinking' and 'Nursing', without limitation based on year of publication. The data were analyzed in accordance with the stages of Rodger's conceptual model. The following were included: books and articles in full, published in Portuguese, English or Spanish, which addressed CT in the teaching and practice of Nursing; articles which did not address aspects related to the concept of CT were excluded. Results: the sample was made up of 42 works. As a substitute term, emphasis is placed on 'analytical thinking', and, as a related factor, decision-making. In order, the most frequent preceding and consequent attributes were: ability to analyze, training of the student nurse, and clinical decision-making. As the implications of CT, emphasis is placed on achieving effective results in care for the patient, family and community. Conclusion: CT is a cognitive skill which involves analysis, logical reasoning and clinical judgment, geared towards the resolution of problems, and standing out in the training and practice of the nurse with a view to accurate clinical decision-making and the achieving of effective results. PMID:27598376
Structured decision making for managing pneumonia epizootics in bighorn sheep
Sells, Sarah N.; Mitchell, Michael S.; Edwards, Victoria L.; Gude, Justin A.; Anderson, Neil J.
2016-01-01
Good decision-making is essential to conserving wildlife populations. Although there may be multiple ways to address a problem, perfect solutions rarely exist. Managers are therefore tasked with identifying decisions that will best achieve desired outcomes. Structured decision making (SDM) is a method of decision analysis used to identify the most effective, efficient, and realistic decisions while accounting for values and priorities of the decision maker. The stepwise process includes identifying the management problem, defining objectives for solving the problem, developing alternative approaches to achieve the objectives, and formally evaluating which alternative is most likely to accomplish the objectives. The SDM process can be more effective than informal decision-making because it provides a transparent way to quantitatively evaluate decisions for addressing multiple management objectives while incorporating science, uncertainty, and risk tolerance. To illustrate the application of this process to a management need, we present an SDM-based decision tool developed to identify optimal decisions for proactively managing risk of pneumonia epizootics in bighorn sheep (Ovis canadensis) in Montana. Pneumonia epizootics are a major challenge for managers due to long-term impacts to herds, epistemic uncertainty in timing and location of future epizootics, and consequent difficulty knowing how or when to manage risk. The decision tool facilitates analysis of alternative decisions for how to manage herds based on predictions from a risk model, herd-specific objectives, and predicted costs and benefits of each alternative. Decision analyses for 2 example herds revealed that meeting management objectives necessitates specific approaches unique to each herd. The analyses showed how and under what circumstances the alternatives are optimal compared to other approaches and current management. Managers can be confident that these decisions are effective, efficient, and realistic because they explicitly account for important considerations managers implicitly weigh when making decisions, including competing management objectives, uncertainty in potential outcomes, and risk tolerance.
Mutombo, Namuunda; Bakibinga, Pauline
2014-07-03
Zambia's fertility rate and unmet need for family planning are still high. This is in spite of the progress reported from 1992 to 2007 of the increase in contraceptive prevalence rate from 15% to 41% and use of modern methods of family planning from 9% to 33%. However, partner disapproval of family planning has been cited by many women in many countries including Zambia. Given the effectiveness of long-acting and permanent methods of family planning (ILAPMs) in fertility regulation, this paper sought to examine the relationship between contraceptive decision-making and use of ILAPMs among married women in Zambia. This paper uses data from the 2007 Zambia Demographic and Health Survey. The analysis is based on married women (15-49) who reported using a method of family planning at the time of the survey. Out of the 7,146 women interviewed, only 1,630 women were valid for this analysis. Cross-tabulations and binary logistic regressions with Chi-square were used to analyse associations and the predictors of use of ILAPMs of contraception, respectively. A confidence interval of .95 was used in determining relationships between independent and dependent variables. Two thirds of women made joint decisions regarding contraception and 29% of the women were using ILAPMs. Women who made joint contraceptive decisions are significantly more likely to use ILAPMs than women who did not involve their husband in contraceptive decisions. However, the most significant predictor is the wealth index. Women from rich households are more likely to use ILAPMs than women from medium rich and poor households. Results also show that women of North Western ethnicities and those from Region 3 had higher odds of using ILAPMs than Tonga women and women from Region 2, respectively. Joint contraceptive decision-making between spouses is key to use of ILAPMs in Zambia. Our findings have also shown that the wealth index is actually the strongest factor determining use of these methods. As such, family planning programmes directed at increasing use of LAPMs ought to not only encourage spousal communication but should also consider rolling out interventions that incorporate economic empowerment.
Classifying Nanomaterial Risks Using Multi-Criteria Decision Analysis
NASA Astrophysics Data System (ADS)
Linkov, I.; Steevens, J.; Chappell, M.; Tervonen, T.; Figueira, J. R.; Merad, M.
There is rapidly growing interest by regulatory agencies and stakeholders in the potential toxicity and other risks associated with nanomaterials throughout the different stages of the product life cycle (e.g., development, production, use and disposal). Risk assessment methods and tools developed and applied to chemical and biological material may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material because of the variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as promote the safe use/handling of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. The stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different risk categories based on our current knowledge of nanomaterial's physico-chemical characteristics, variation in produced material, and best professional judgement. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.1,2
Russ, Alissa L; Militello, Laura G; Glassman, Peter A; Arthur, Karen J; Zillich, Alan J; Weiner, Michael
2017-05-03
Cognitive task analysis (CTA) can yield valuable insights into healthcare professionals' cognition and inform system design to promote safe, quality care. Our objective was to adapt CTA-the critical decision method, specifically-to investigate patient safety incidents, overcome barriers to implementing this method, and facilitate more widespread use of cognitive task analysis in healthcare. We adapted CTA to facilitate recruitment of healthcare professionals and developed a data collection tool to capture incidents as they occurred. We also leveraged the electronic health record (EHR) to expand data capture and used EHR-stimulated recall to aid reconstruction of safety incidents. We investigated 3 categories of medication-related incidents: adverse drug reactions, drug-drug interactions, and drug-disease interactions. Healthcare professionals submitted incidents, and a subset of incidents was selected for CTA. We analyzed several outcomes to characterize incident capture and completed CTA interviews. We captured 101 incidents. Eighty incidents (79%) met eligibility criteria. We completed 60 CTA interviews, 20 for each incident category. Capturing incidents before interviews allowed us to shorten the interview duration and reduced reliance on healthcare professionals' recall. Incorporating the EHR into CTA enriched data collection. The adapted CTA technique was successful in capturing specific categories of safety incidents. Our approach may be especially useful for investigating safety incidents that healthcare professionals "fix and forget." Our innovations to CTA are expected to expand the application of this method in healthcare and inform a wide range of studies on clinical decision making and patient safety.
Decision support methods for the environmental assessment of contamination at mining sites.
Jordan, Gyozo; Abdaal, Ahmed
2013-09-01
Polluting mine accidents and widespread environmental contamination associated with historic mining in Europe and elsewhere has triggered the improvement of related environmental legislation and of the environmental assessment and management methods for the mining industry. Mining has some unique features such as natural background pollution associated with natural mineral deposits, industrial activities and contamination located in the three-dimensional sub-surface space, the problem of long-term remediation after mine closure, problem of secondary contaminated areas around mine sites and abandoned mines in historic regions like Europe. These mining-specific problems require special tools to address the complexity of the environmental problems of mining-related contamination. The objective of this paper is to review and evaluate some of the decision support methods that have been developed and applied to mining contamination. In this paper, only those methods that are both efficient decision support tools and provide a 'holistic' approach to the complex problem as well are considered. These tools are (1) landscape ecology, (2) industrial ecology, (3) landscape geochemistry, (4) geo-environmental models, (5) environmental impact assessment, (6) environmental risk assessment, (7) material flow analysis and (8) life cycle assessment. This unique inter-disciplinary study should enable both the researcher and the practitioner to obtain broad view on the state-of-the-art of decision support methods for the environmental assessment of contamination at mine sites. Documented examples and abundant references are also provided.
NASA Astrophysics Data System (ADS)
Chen, W.; Bauer, J.; Kurz, C.; Tessonnier, T.; Handrack, J.; Haberer, T.; Debus, J.; Parodi, K.
2017-01-01
We present the workflow of the offline-PET based range verification method used at the Heidelberg Ion Beam Therapy Center, detailing the functionalities of an in-house developed software application, SimInterface14, with which range analysis is performed. Moreover, we introduce the design of a decision support system assessing uncertainties and facilitating physicians in decisions making for plan adaptation.
Human versus automation in responding to failures: an expected-value analysis
NASA Technical Reports Server (NTRS)
Sheridan, T. B.; Parasuraman, R.
2000-01-01
A simple analytical criterion is provided for deciding whether a human or automation is best for a failure detection task. The method is based on expected-value decision theory in much the same way as is signal detection. It requires specification of the probabilities of misses (false negatives) and false alarms (false positives) for both human and automation being considered, as well as factors independent of the choice--namely, costs and benefits of incorrect and correct decisions as well as the prior probability of failure. The method can also serve as a basis for comparing different modes of automation. Some limiting cases of application are discussed, as are some decision criteria other than expected value. Actual or potential applications include the design and evaluation of any system in which either humans or automation are being considered.
New Methods for Crafting Locally Decision-Relevant Scenarios
NASA Astrophysics Data System (ADS)
Lempert, R. J.
2015-12-01
Scenarios can play an important role in helping decision makers to imagine future worlds, both good and bad, different than the one with which we are familiar and to take concrete steps now to address the risks generated by climate change. At their best, scenarios can effectively represent deep uncertainty; integrate over multiple domains; and enable parties with different expectation and values to expand the range of futures they consider, to see the world from different points of view, and to grapple seriously with the potential implications of surprising or inconvenient futures. These attributes of scenario processes can prove crucial in helping craft effective responses to climate change. But traditional scenario methods can also fail to overcome difficulties related to choosing, communicating, and using scenarios to identify, evaluate, and reach consensus on appropriate policies. Such challenges can limit scenario's impact in broad public discourse. This talk will demonstrate how new decision support approaches can employ new quantitative tools that allow scenarios to emerge from a process of deliberation with analysis among stakeholders, rather than serve as inputs to it, thereby increasing the impacts of scenarios on decision making. This talk will demonstrate these methods in the design of a decision support tool to help residents of low lying coastal cities grapple with the long-term risks of sea level rise. In particular, this talk will show how information from the IPCC SSP's can be combined with local information to provide a rich set of locally decision-relevant information.
Economou, Anastasios; Petraki, Olympia; Tsipi, Despina; Botitsi, Eleni
2012-08-15
This work reports a sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for identification and quantification of seven sulfonamides, trimethoprim and dapsone in honey. The method is based on a solid-phase extraction (SPE) step of the target analytes with Oasis HLB cartridges after acidic hydrolysis of the honey sample to liberate the sugar-bound sulfonamides. Analysis was performed using liquid chromatography-tandem mass spectrometry (LC-MS/MS) in the positive electro-spray ionization (ESI) mode with two different isotopically labeled internal standards with the view to improve the quantitative performance of the method. The method validation has been performed according to the Commission Decision 2002/657/EC; the average recoveries, measured at three concentration levels (1.5, 2.5 and 5.0 μg kg(-1)), have been estimated in the range 70 to 106% while the respective % relative standard deviations of the within-laboratory reproducibility ranged from 6 to 18%. Mean values of the expanded uncertainties calculated were in the range 22-41% at the 99% confidence level. Decision limit (CCα) and detection capability (CCβ) values were in the ranges 0.4-0.9 and 0.7-1.4 μg kg(-1), respectively. Matrix effects have been investigated demonstrating a moderate signal suppression/enhancement for most of the target compounds. The method described has been successfully applied to the analysis of honey samples; sulfamethoxazole, sulfathiazole and trimethoprim were detected in some cases. Copyright © 2012 Elsevier B.V. All rights reserved.
[Impact of shared-decision making on patient satisfaction].
Suh, Won S; Lee, Chae Kyung
2010-01-01
The purpose of this research is to analyze the impact of shared-decision making on patient satisfaction. The study is significant since it focuses on developing appropriate methodologies and analyzing data to identify patient preferences, with the goals of optimizing treatment selection, and substantiating the relationship between such preferences and their impact on outcomes. A thorough literature review that developed the framework illustrating key dimensions of shared decision making was followed by a quantitative assessment and regression analysis of patient-perceived satisfaction, and the degree of shared-decision making. A positive association was evident between shared-decision making and patient satisfaction. The impact of shared decision making on patient satisfaction was greater than other variable including gender, education, and number of visits. Patients who participate in care-related decisions and who are given an explanation of their health problems are more likely to be satisfied with their care. It would benefit health care organizations to train their medical professionals in this communication method, and to include it in their practice guidelines.
Financial maturity of paper birch
Joseph J. Mendel
1969-01-01
One objective in forestry is to earn the greatest possible return on the capital invested in growing timber. To do this, the forester not only must know which silvicultural methods to use, but also ought to know the methods of economic analysis that will help him make the decisions that will lead to the greatest return. The financial maturity concept provides a method...
Monte Carlo simulation: Its status and future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murtha, J.A.
1997-04-01
Monte Carlo simulation is a statistics-based analysis tool that yields probability-vs.-value relationships for key parameters, including oil and gas reserves, capital exposure, and various economic yardsticks, such as net present value (NPV) and return on investment (ROI). Monte Carlo simulation is a part of risk analysis and is sometimes performed in conjunction with or as an alternative to decision [tree] analysis. The objectives are (1) to define Monte Carlo simulation in a more general context of risk and decision analysis; (2) to provide some specific applications, which can be interrelated; (3) to respond to some of the criticisms; (4) tomore » offer some cautions about abuses of the method and recommend how to avoid the pitfalls; and (5) to predict what the future has in store.« less
Study on bayes discriminant analysis of EEG data.
Shi, Yuan; He, DanDan; Qin, Fang
2014-01-01
In this paper, we have done Bayes Discriminant analysis to EEG data of experiment objects which are recorded impersonally come up with a relatively accurate method used in feature extraction and classification decisions. In accordance with the strength of α wave, the head electrodes are divided into four species. In use of part of 21 electrodes EEG data of 63 people, we have done Bayes Discriminant analysis to EEG data of six objects. Results In use of part of EEG data of 63 people, we have done Bayes Discriminant analysis, the electrode classification accuracy rates is 64.4%. Bayes Discriminant has higher prediction accuracy, EEG features (mainly αwave) extract more accurate. Bayes Discriminant would be better applied to the feature extraction and classification decisions of EEG data.
Boesch, I
2013-04-01
This study aimed to determine key attributes of milk that drive a processor's supply decisions and possibilities for differentiation based on these product attributes. Feedback-driven exploration was applied to derive product attributes relevant to the buying decision. Conjoint analysis with hierarchical Bayes estimation methods was used to determine the relative importance of attributes. Results show that the technical aspects of milk, as well as the price and country of origin, dominate the buying decision. Potential for differentiation was found for environmental and societal attributes as well as freedom from genetically modified products. Product and supplier criteria also provide the potential to segment the market if the price premium is held within limits. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Wang, Mingming; Sweetapple, Chris; Fu, Guangtao; Farmani, Raziyeh; Butler, David
2017-10-01
This paper presents a new framework for decision making in sustainable drainage system (SuDS) scheme design. It integrates resilience, hydraulic performance, pollution control, rainwater usage, energy analysis, greenhouse gas (GHG) emissions and costs, and has 12 indicators. The multi-criteria analysis methods of entropy weight and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) were selected to support SuDS scheme selection. The effectiveness of the framework is demonstrated with a SuDS case in China. Indicators used include flood volume, flood duration, a hydraulic performance indicator, cost and resilience. Resilience is an important design consideration, and it supports scheme selection in the case study. The proposed framework will help a decision maker to choose an appropriate design scheme for implementation without subjectivity. Copyright © 2017 Elsevier Ltd. All rights reserved.
Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making
Williams, B.K.; Nichols, J.D.; Conroy, M.J.
2002-01-01
This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. KEY FEATURES * Integrates population modeling, parameter estimation and * decision-theoretic approaches to management in a single, cohesive framework * Provides authoritative, state-of-the-art descriptions of quantitative * approaches to modeling, estimation and decision-making * Emphasizes the role of mathematical modeling in the conduct of science * and management * Utilizes a unifying biological context, consistent mathematical notation, * and numerous biological examples
Bal, Mert; Amasyali, M Fatih; Sever, Hayri; Kose, Guven; Demirhan, Ayse
2014-01-01
The importance of the decision support systems is increasingly supporting the decision making process in cases of uncertainty and the lack of information and they are widely used in various fields like engineering, finance, medicine, and so forth, Medical decision support systems help the healthcare personnel to select optimal method during the treatment of the patients. Decision support systems are intelligent software systems that support decision makers on their decisions. The design of decision support systems consists of four main subjects called inference mechanism, knowledge-base, explanation module, and active memory. Inference mechanism constitutes the basis of decision support systems. There are various methods that can be used in these mechanisms approaches. Some of these methods are decision trees, artificial neural networks, statistical methods, rule-based methods, and so forth. In decision support systems, those methods can be used separately or a hybrid system, and also combination of those methods. In this study, synthetic data with 10, 100, 1000, and 2000 records have been produced to reflect the probabilities on the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of medical decision support system is compared on various data sets.
Computation and measurement of cell decision making errors using single cell data
Habibi, Iman; Cheong, Raymond; Levchenko, Andre; Emamian, Effat S.; Abdi, Ali
2017-01-01
In this study a new computational method is developed to quantify decision making errors in cells, caused by noise and signaling failures. Analysis of tumor necrosis factor (TNF) signaling pathway which regulates the transcription factor Nuclear Factor κB (NF-κB) using this method identifies two types of incorrect cell decisions called false alarm and miss. These two events represent, respectively, declaring a signal which is not present and missing a signal that does exist. Using single cell experimental data and the developed method, we compute false alarm and miss error probabilities in wild-type cells and provide a formulation which shows how these metrics depend on the signal transduction noise level. We also show that in the presence of abnormalities in a cell, decision making processes can be significantly affected, compared to a wild-type cell, and the method is able to model and measure such effects. In the TNF—NF-κB pathway, the method computes and reveals changes in false alarm and miss probabilities in A20-deficient cells, caused by cell’s inability to inhibit TNF-induced NF-κB response. In biological terms, a higher false alarm metric in this abnormal TNF signaling system indicates perceiving more cytokine signals which in fact do not exist at the system input, whereas a higher miss metric indicates that it is highly likely to miss signals that actually exist. Overall, this study demonstrates the ability of the developed method for modeling cell decision making errors under normal and abnormal conditions, and in the presence of transduction noise uncertainty. Compared to the previously reported pathway capacity metric, our results suggest that the introduced decision error metrics characterize signaling failures more accurately. This is mainly because while capacity is a useful metric to study information transmission in signaling pathways, it does not capture the overlap between TNF-induced noisy response curves. PMID:28379950
Computation and measurement of cell decision making errors using single cell data.
Habibi, Iman; Cheong, Raymond; Lipniacki, Tomasz; Levchenko, Andre; Emamian, Effat S; Abdi, Ali
2017-04-01
In this study a new computational method is developed to quantify decision making errors in cells, caused by noise and signaling failures. Analysis of tumor necrosis factor (TNF) signaling pathway which regulates the transcription factor Nuclear Factor κB (NF-κB) using this method identifies two types of incorrect cell decisions called false alarm and miss. These two events represent, respectively, declaring a signal which is not present and missing a signal that does exist. Using single cell experimental data and the developed method, we compute false alarm and miss error probabilities in wild-type cells and provide a formulation which shows how these metrics depend on the signal transduction noise level. We also show that in the presence of abnormalities in a cell, decision making processes can be significantly affected, compared to a wild-type cell, and the method is able to model and measure such effects. In the TNF-NF-κB pathway, the method computes and reveals changes in false alarm and miss probabilities in A20-deficient cells, caused by cell's inability to inhibit TNF-induced NF-κB response. In biological terms, a higher false alarm metric in this abnormal TNF signaling system indicates perceiving more cytokine signals which in fact do not exist at the system input, whereas a higher miss metric indicates that it is highly likely to miss signals that actually exist. Overall, this study demonstrates the ability of the developed method for modeling cell decision making errors under normal and abnormal conditions, and in the presence of transduction noise uncertainty. Compared to the previously reported pathway capacity metric, our results suggest that the introduced decision error metrics characterize signaling failures more accurately. This is mainly because while capacity is a useful metric to study information transmission in signaling pathways, it does not capture the overlap between TNF-induced noisy response curves.
A hybrid method for classifying cognitive states from fMRI data.
Parida, S; Dehuri, S; Cho, S-B; Cacha, L A; Poznanski, R R
2015-09-01
Functional magnetic resonance imaging (fMRI) makes it possible to detect brain activities in order to elucidate cognitive-states. The complex nature of fMRI data requires under-standing of the analyses applied to produce possible avenues for developing models of cognitive state classification and improving brain activity prediction. While many models of classification task of fMRI data analysis have been developed, in this paper, we present a novel hybrid technique through combining the best attributes of genetic algorithms (GAs) and ensemble decision tree technique that consistently outperforms all other methods which are being used for cognitive-state classification. Specifically, this paper illustrates the combined effort of decision-trees ensemble and GAs for feature selection through an extensive simulation study and discusses the classification performance with respect to fMRI data. We have shown that our proposed method exhibits significant reduction of the number of features with clear edge classification accuracy over ensemble of decision-trees.
ERIC Educational Resources Information Center
Mosier, Nancy R.
Financial analysis techniques are tools that help managers make sound financial decisions that contribute to general corporate objectives. A literature review reveals that the most commonly used financial analysis techniques are payback time, average rate of return, present value or present worth, and internal rate of return. Despite the success…
Li, Grace; Chandrasekharan, Subhashini; Allyse, Megan
2017-02-01
The introduction of cell-free DNA prenatal genetic screening has rekindled discussion of ethical and social questions surrounding prenatal testing, perceptions of disability, and abortion. The growing use of prenatal genetic screening presents a unique opportunity to assess decision-making around new methods of prenatal testing; especially as there is little available research comparing individual and cultural differences that affect a pregnant woman's decision-making on prenatal testing. We performed a content analysis of online pregnancy forums in the United States and Mainland China. Content from January 2012 to December 2013 was identified through search methodologies and refined to remove duplication. China-based content was translated by a native Mandarin speaker. We used qualitative analysis methods to identify common themes in the dataset. There were 333 English responses and 519 Mandarin responses. Three main themese were identified in the data: decision making factors, attitude towards the pregnancy, and attitudes towards abortion. Women's narratives reflected how broader social forces can have an impact on intimate personal decision-making. Women in the Mandarin dataset evoked stronger narratives of community and/or family decision-making in pregnancy and were more accepting of the possibility of abortion in the event of a finding of fetal abnormality. Narrative in the English dataset more frequently evoked ideas of unconditional love, regardless of fetal diagnosis, but also acknowledged much stronger support services for individuals with disability and less awareness of stigma. These results highlight the necessity of awareness around how broader cultural and social factors can consciously or unconsciously impact women's decisions and highlight potential focus areas for future counseling efforts.
Li, Meng-Hua
2014-01-01
When an enterprise has thousands of varieties in its inventory, the use of a single management method could not be a feasible approach. A better way to manage this problem would be to categorise inventory items into several clusters according to inventory decisions and to use different management methods for managing different clusters. The present study applies DPSO (dynamic particle swarm optimisation) to a problem of clustering of inventory items. Without the requirement of prior inventory knowledge, inventory items are automatically clustered into near optimal clustering number. The obtained clustering results should satisfy the inventory objective equation, which consists of different objectives such as total cost, backorder rate, demand relevance, and inventory turnover rate. This study integrates the above four objectives into a multiobjective equation, and inputs the actual inventory items of the enterprise into DPSO. In comparison with other clustering methods, the proposed method can consider different objectives and obtain an overall better solution to obtain better convergence results and inventory decisions. PMID:25197713
Feizizadeh, Bakhtiar; Blaschke, Thomas
2014-03-04
GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation yielded poor results.
Mendlinger, Sheryl; Cwikel, Julie
2008-02-01
A double helix spiral model is presented which demonstrates how to combine qualitative and quantitative methods of inquiry in an interactive fashion over time. Using findings on women's health behaviors (e.g., menstruation, breast-feeding, coping strategies), we show how qualitative and quantitative methods highlight the theory of knowledge acquisition in women's health decisions. A rich data set of 48 semistructured, in-depth ethnographic interviews with mother-daughter dyads from six ethnic groups (Israeli, European, North African, Former Soviet Union [FSU], American/Canadian, and Ethiopian), plus seven focus groups, provided the qualitative sources for analysis. This data set formed the basis of research questions used in a quantitative telephone survey of 302 Israeli women from the ages of 25 to 42 from four ethnic groups. We employed multiple cycles of data analysis from both data sets to produce a more detailed and multidimensional picture of women's health behavior decisions through a spiraling process.
NASA Technical Reports Server (NTRS)
Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.
1993-01-01
Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA"s proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for the develpoment of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.
NASA Technical Reports Server (NTRS)
Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.
1993-01-01
Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.
Periodic benefit-risk assessment using Bayesian stochastic multi-criteria acceptability analysis
Li, Kan; Yuan, Shuai Sammy; Wang, William; Wan, Shuyan Sabrina; Ceesay, Paulette; Heyse, Joseph F.; Mt-Isa, Shahrul; Luo, Sheng
2018-01-01
Benefit-risk (BR) assessment is essential to ensure the best decisions are made for a medical product in the clinical development process, regulatory marketing authorization, post-market surveillance, and coverage and reimbursement decisions. One challenge of BR assessment in practice is that the benefit and risk profile may keep evolving while new evidence is accumulating. Regulators and the International Conference on Harmonization (ICH) recommend performing periodic benefit-risk evaluation report (PBRER) through the product's lifecycle. In this paper, we propose a general statistical framework for periodic benefit-risk assessment, in which Bayesian meta-analysis and stochastic multi-criteria acceptability analysis (SMAA) will be combined to synthesize the accumulating evidence. The proposed approach allows us to compare the acceptability of different drugs dynamically and effectively and accounts for the uncertainty of clinical measurements and imprecise or incomplete preference information of decision makers. We apply our approaches to two real examples in a post-hoc way for illustration purpose. The proposed method may easily be modified for other pre and post market settings, and thus be an important complement to the current structured benefit-risk assessment (sBRA) framework to improve the transparent and consistency of the decision-making process. PMID:29505866
Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.
Haddaway, Neal R; Rytwinski, Trina
2018-05-01
Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. We believe it is vital to tackle the possible subjectivity in quantitative synthesis described herein to ensure that the extensive efforts expended in producing systematic reviews and other evidence synthesis products is not wasted because of a lack of rigour or reliability in the final synthesis step. Copyright © 2018 Elsevier Ltd. All rights reserved.
Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin
2016-05-20
In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. The prevalence of anemia was 12.60% with a range of 3.47%-40.00% in different subgroup characteristics. The CHAID decision tree model has demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by logistic regression model including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified the fourth risk factor, the maternal educational level, with higher overall classification accuracy and larger area below the receiver operating characteristic curve. The infant anemic status in metropolis is complex and should be carefully considered by the basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of population with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities.
[Clinical reasoning in nursing, concept analysis].
Côté, Sarah; St-Cyr Tribble, Denise
2012-12-01
Nurses work in situations of complex care requiring great clinical reasoning abilities. In literature, clinical reasoning is often confused with other concepts and it has no consensual definition. To conduct a concept analysis of a nurse's clinical reasoning in order to clarify, define and distinguish it from the other concepts as well as to better understand clinical reasoning. Rodgers's method of concept analysis was used, after literature was retrieved with the use of clinical reasoning, concept analysis, nurse, intensive care and decision making as key-words. The use of cognition, cognitive strategies, a systematic approach of analysis and data interpretation, generating hypothesis and alternatives are attributes of clinical reasoning. The antecedents are experience, knowledge, memory, cues, intuition and data collection. The consequences are decision making, action, clues and problem resolution. This concept analysis helped to define clinical reasoning, to distinguish it from other concepts used synonymously and to guide future research.
Grey situation group decision-making method based on prospect theory.
Zhang, Na; Fang, Zhigeng; Liu, Xiaqing
2014-01-01
This paper puts forward a grey situation group decision-making method on the basis of prospect theory, in view of the grey situation group decision-making problems that decisions are often made by multiple decision experts and those experts have risk preferences. The method takes the positive and negative ideal situation distance as reference points, defines positive and negative prospect value function, and introduces decision experts' risk preference into grey situation decision-making to make the final decision be more in line with decision experts' psychological behavior. Based on TOPSIS method, this paper determines the weight of each decision expert, sets up comprehensive prospect value matrix for decision experts' evaluation, and finally determines the optimal situation. At last, this paper verifies the effectiveness and feasibility of the method by means of a specific example.
Grey Situation Group Decision-Making Method Based on Prospect Theory
Zhang, Na; Fang, Zhigeng; Liu, Xiaqing
2014-01-01
This paper puts forward a grey situation group decision-making method on the basis of prospect theory, in view of the grey situation group decision-making problems that decisions are often made by multiple decision experts and those experts have risk preferences. The method takes the positive and negative ideal situation distance as reference points, defines positive and negative prospect value function, and introduces decision experts' risk preference into grey situation decision-making to make the final decision be more in line with decision experts' psychological behavior. Based on TOPSIS method, this paper determines the weight of each decision expert, sets up comprehensive prospect value matrix for decision experts' evaluation, and finally determines the optimal situation. At last, this paper verifies the effectiveness and feasibility of the method by means of a specific example. PMID:25197706
Siirala, Eriikka; Peltonen, Laura-Maria; Lundgrén-Laine, Heljä; Salanterä, Sanna; Junttila, Kristiina
2016-09-01
To describe the tactical and the operational decisions made by nurse managers when managing the daily unit operation in peri-operative settings. Management is challenging as situations change rapidly and decisions are constantly made. Understanding decision-making in this complex environment helps to develop decision support systems to support nurse managers' operative and tactical decision-making. Descriptive cross-sectional design. Data were collected from 20 nurse managers with the think-aloud method during the busiest working hours and analysed using thematic content analysis. Nurse managers made over 700 decisions; either ad hoc (n = 289), near future (n = 268) or long-term (n = 187) by nature. Decisions were often made simultaneously with many interruptions. Ad hoc decisions covered staff allocation, ensuring adequate staff, rescheduling surgical procedures, confirmation tangible resources and following-up the daily unit operation. Decisions in the near future were: planning of surgical procedures and tangible resources, and planning staff allocation. Long-term decisions were: human recourses, nursing development, supplies and equipment, and finances in the unit. Decision-making was vulnerable to interruptions, which sometimes complicated the managing tasks. The results can be used when planning decision support systems and when defining the nurse managers' tasks in peri-operative settings. © 2016 John Wiley & Sons Ltd.
Batterham, Philip J; Christensen, Helen; Mackinnon, Andrew J
2009-11-22
Relative to physical health conditions such as cardiovascular disease, little is known about risk factors that predict the prevalence of depression. The present study investigates the expected effects of a reduction of these risks over time, using the decision tree method favoured in assessing cardiovascular disease risk. The PATH through Life cohort was used for the study, comprising 2,105 20-24 year olds, 2,323 40-44 year olds and 2,177 60-64 year olds sampled from the community in the Canberra region, Australia. A decision tree methodology was used to predict the presence of major depressive disorder after four years of follow-up. The decision tree was compared with a logistic regression analysis using ROC curves. The decision tree was found to distinguish and delineate a wide range of risk profiles. Previous depressive symptoms were most highly predictive of depression after four years, however, modifiable risk factors such as substance use and employment status played significant roles in assessing the risk of depression. The decision tree was found to have better sensitivity and specificity than a logistic regression using identical predictors. The decision tree method was useful in assessing the risk of major depressive disorder over four years. Application of the model to the development of a predictive tool for tailored interventions is discussed.
Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models
2002-03-01
such as weighted sum method, weighted 5 product method, and the Analytic Hierarchy Process ( AHP ). This research focuses on only weighted sum...different groups. They can be termed as deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the...weighted product model (WPM), and analytic hierarchy process ( AHP ). His method attempts to identify the most important criteria weight and the most
Rehm, Markus; Prehn, Jochen H M
2013-06-01
Systems biology and systems medicine, i.e. the application of systems biology in a clinical context, is becoming of increasing importance in biology, drug discovery and health care. Systems biology incorporates knowledge and methods that are applied in mathematics, physics and engineering, but may not be part of classical training in biology. We here provide an introduction to basic concepts and methods relevant to the construction and application of systems models for apoptosis research. We present the key methods relevant to the representation of biochemical processes in signal transduction models, with a particular reference to apoptotic processes. We demonstrate how such models enable a quantitative and temporal analysis of changes in molecular entities in response to an apoptosis-inducing stimulus, and provide information on cell survival and cell death decisions. We introduce methods for analyzing the spatial propagation of cell death signals, and discuss the concepts of sensitivity analyses that enable a prediction of network responses to disturbances of single or multiple parameters. Copyright © 2013 Elsevier Inc. All rights reserved.
Multi-criteria decision analysis and environmental risk assessment for nanomaterials
NASA Astrophysics Data System (ADS)
Linkov, Igor; Satterstrom, F. Kyle; Steevens, Jeffery; Ferguson, Elizabeth; Pleus, Richard C.
2007-08-01
Nanotechnology is a broad and complex discipline that holds great promise for innovations that can benefit mankind. Yet, one must not overlook the wide array of factors involved in managing nanomaterial development, ranging from the technical specifications of the material to possible adverse effects in humans. Other opportunities to evaluate benefits and risks are inherent in environmental health and safety (EHS) issues related to nanotechnology. However, there is currently no structured approach for making justifiable and transparent decisions with explicit trade-offs between the many factors that need to be taken into account. While many possible decision-making approaches exist, we believe that multi-criteria decision analysis (MCDA) is a powerful and scientifically sound decision analytical framework for nanomaterial risk assessment and management. This paper combines state-of-the-art research in MCDA methods applicable to nanotechnology with a hypothetical case study for nanomaterial management. The example shows how MCDA application can balance societal benefits against unintended side effects and risks, and how it can also bring together multiple lines of evidence to estimate the likely toxicity and risk of nanomaterials given limited information on physical and chemical properties. The essential contribution of MCDA is to link this performance information with decision criteria and weightings elicited from scientists and managers, allowing visualization and quantification of the trade-offs involved in the decision-making process.
NASA Astrophysics Data System (ADS)
Eni, Yuli; Aryanto, Rudy
2014-03-01
There are problems being experienced by the Ministry of cooperatives and SME (Small and Medium Enterprise) including the length of time in the decision by the Government to establish a policy that should be taken for local cooperatives across the province of Indonesia. The decision-making process is still analyzed manually, so that sometimes the decisions taken are also less appropriate, effective and efficient. The second problem is the lack of monitoring data cooperative process province that is too much, making it difficult for the analysis of dynamic information to be useful. Therefore the authors want to fix the system that runs by using digital dashboard management system supported by the modeling of system dynamics. In addition, the author also did the design of a system that can support the system. Design of this system is aimed to ease the experts, head, and the government to decide (DSS - Decision Support System) accurately effectively and efficiently, because in the system are raised alternative simulation in a description of the decision to be taken and the result from the decision. The system is expected to be designed dan simulated can ease and expedite the decision making. The design of dynamic digital dashboard management conducted by method of OOAD (Objects Oriented Analysis and Design) complete with UML notation.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
Bayesian imperfect information analysis for clinical recurrent data
Chang, Chih-Kuang; Chang, Chi-Chang
2015-01-01
In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect information-value analysis to realistic situations to produce likelihood functions and posterior distributions, to a clinical decision-making problem for recurrent events. In this study, three kinds of failure models are considered, and our methods illustrated with an analysis of imperfect information from a trial of immunotherapy in the treatment of chronic granulomatous disease. In addition, we present evidence toward a better understanding of the differing behaviors along with concomitant variables. Based on the results of simulations, the imperfect information value of the concomitant variables was evaluated and different realistic situations were compared to see which could yield more accurate results for medical decision-making. PMID:25565853
Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin IV, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W
2018-01-01
Background Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. Objective The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. Methods This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who have chronic medical conditions, their primary care providers (PCPs), and clinic staff, as well as field observations of practice activities. Quantitative survey data will be summarized with simple descriptive statistics and relationships between participant characteristics and contraceptive recommendations (for PCPs), and current contraceptive use (for patients) will be examined using Fisher exact test. We will conduct thematic analysis of qualitative data from interviews and field observations. The integration of data will occur by comparing, contrasting, and synthesizing qualitative and quantitative findings to inform the future development and implementation of the intervention. Results We are currently enrolling practices and anticipate study completion in 15 months. Conclusions This protocol describes the first phase of a multiphase mixed methods study to develop and implement a Web-based decision tool that is customized to meet the needs of women with chronic medical conditions in primary care settings. Study findings will promote contraceptive counseling via shared decision making and reflect evidence-based guidelines for contraceptive selection. Trial Registration ClinicalTrials.gov NCT03153644; https://clinicaltrials.gov/ct2/show/NCT03153644 (Archived by WebCite at http://www.webcitation.org/6yUkA5lK8) PMID:29669707
Error Ratio Analysis: Alternate Mathematics Assessment for General and Special Educators.
ERIC Educational Resources Information Center
Miller, James H.; Carr, Sonya C.
1997-01-01
Eighty-seven elementary students in grades four, five, and six, were administered a 30-item multiplication instrument to assess performance in computation across grade levels. An interpretation of student performance using error ratio analysis is provided and the use of this method with groups of students for instructional decision making is…
A Systems Analysis of the MDTA Institutional Training Program. Final Report.
ERIC Educational Resources Information Center
North American Rockwell Information Systems Co., Anaheim, CA.
An industrial study group was contracted to perform a systems analysis of institutional training conducted under the Manpower Development and Training Act (MDTA) of 1962, as amended, in order to: (1) illuminate management decisions in the areas of program priorities, alternative methods of administration, and allocation of resources, and (2)…
A Study about Placement Support Using Semantic Similarity
ERIC Educational Resources Information Center
Katz, Marco; van Bruggen, Jan; Giesbers, Bas; Waterink, Wim; Eshuis, Jannes; Koper, Rob
2014-01-01
This paper discusses Latent Semantic Analysis (LSA) as a method for the assessment of prior learning. The Accreditation of Prior Learning (APL) is a procedure to offer learners an individualized curriculum based on their prior experiences and knowledge. The placement decisions in this process are based on the analysis of student material by domain…
The US EPA’s ToxCastTM program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
Scott, Ryan P; Cullen, Alison C; Fox-Lent, Cate; Linkov, Igor
2016-10-01
In emergent photovoltaics, nanoscale materials hold promise for optimizing device characteristics; however, the related impacts remain uncertain, resulting in challenges to decisions on strategic investment in technology innovation. We integrate multi-criteria decision analysis (MCDA) and life-cycle assessment (LCA) results (LCA-MCDA) as a method of incorporating values of a hypothetical federal acquisition manager into the assessment of risks and benefits of emerging photovoltaic materials. Specifically, we compare adoption of copper zinc tin sulfide (CZTS) devices with molybdenum back contacts to alternative devices employing graphite or graphene instead of molybdenum. LCA impact results are interpreted alongside benefits of substitution including cost reductions and performance improvements through application of multi-attribute utility theory. To assess the role of uncertainty we apply Monte Carlo simulation and sensitivity analysis. We find that graphene or graphite back contacts outperform molybdenum under most scenarios and assumptions. The use of decision analysis clarifies potential advantages of adopting graphite as a back contact while emphasizing the importance of mitigating conventional impacts of graphene production processes if graphene is used in emerging CZTS devices. Our research further demonstrates that a combination of LCA and MCDA increases the usability of LCA in assessing product sustainability. In particular, this approach identifies the most influential assumptions and data gaps in the analysis and the areas in which either engineering controls or further data collection may be necessary. © 2016 Society for Risk Analysis.
Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos
2015-08-01
Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
Amatchmethod Based on Latent Semantic Analysis for Earthquakehazard Emergency Plan
NASA Astrophysics Data System (ADS)
Sun, D.; Zhao, S.; Zhang, Z.; Shi, X.
2017-09-01
The structure of the emergency plan on earthquake is complex, and it's difficult for decision maker to make a decision in a short time. To solve the problem, this paper presents a match method based on Latent Semantic Analysis (LSA). After the word segmentation preprocessing of emergency plan, we carry out keywords extraction according to the part-of-speech and the frequency of words. Then through LSA, we map the documents and query information to the semantic space, and calculate the correlation of documents and queries by the relation between vectors. The experiments results indicate that the LSA can improve the accuracy of emergency plan retrieval efficiently.
Exploration of how women make treatment decisions after a breast cancer diagnosis.
Spittler, Cheryl A; Pallikathayil, Leonie; Bott, Marjorie
2012-09-01
To examine the information needs of women after receiving a diagnosis of breast cancer, investigate how decisions about treatment options are made, and assess personal responses to the decisions made. Mixed-methods approach using quantitative and qualitative data. The University of Kansas Medical Center and Quinn Plastic Surgery Center, both in the midwestern United States. 102 breast cancer survivors who had completed all forms of treatment for at least three months and less than five years. Phase I participants completed five questionnaires about informational needs, confidence and satisfaction with the decision, decisional regret, and conflict. In phase II, 15 participants were purposively sampled from the 102 survivors to participate in a focus group session. Data analysis included frequencies and multiple regression for phase I and qualitative content analysis for phase II. Informational needs, confidence and satisfaction with the decision, and decisional regret and conflict. The variables (widowed, confidence and satisfaction with decision, and decisional conflict and regret) significantly (p = 0.01) accounted for 14% of the variance in informational needs. Two themes emerged from the study: (a) feelings, thoughts, and essential factors that impact treatment considerations, and (b) tips for enhancing treatment consideration options. The study's results show that women viewed informational needs as very important in making treatment decisions after being diagnosed with breast cancer. The treatment team should provide the information, with consideration of the patient's personal preferences, that will assist women to make informed, confident, and satisfied decisions about treatment choices.
Clarifying values: an updated review
2013-01-01
Background Consensus guidelines have recommended that decision aids include a process for helping patients clarify their values. We sought to examine the theoretical and empirical evidence related to the use of values clarification methods in patient decision aids. Methods Building on the International Patient Decision Aid Standards (IPDAS) Collaboration’s 2005 review of values clarification methods in decision aids, we convened a multi-disciplinary expert group to examine key definitions, decision-making process theories, and empirical evidence about the effects of values clarification methods in decision aids. To summarize the current state of theory and evidence about the role of values clarification methods in decision aids, we undertook a process of evidence review and summary. Results Values clarification methods (VCMs) are best defined as methods to help patients think about the desirability of options or attributes of options within a specific decision context, in order to identify which option he/she prefers. Several decision making process theories were identified that can inform the design of values clarification methods, but no single “best” practice for how such methods should be constructed was determined. Our evidence review found that existing VCMs were used for a variety of different decisions, rarely referenced underlying theory for their design, but generally were well described in regard to their development process. Listing the pros and cons of a decision was the most common method used. The 13 trials that compared decision support with or without VCMs reached mixed results: some found that VCMs improved some decision-making processes, while others found no effect. Conclusions Values clarification methods may improve decision-making processes and potentially more distal outcomes. However, the small number of evaluations of VCMs and, where evaluations exist, the heterogeneity in outcome measures makes it difficult to determine their overall effectiveness or the specific characteristics that increase effectiveness. PMID:24625261
Marsh, Kevin; Caro, J Jaime; Hamed, Alaa; Zaiser, Erica
2017-04-01
Qualitative methods tend to be used to incorporate patient preferences into healthcare decision making. However, for patient preferences to be given adequate consideration by decision makers they need to be quantified. Multi-criteria decision analysis (MCDA) is one way to quantify and capture the patient voice. The objective of this review was to report on existing MCDAs involving patients to support the future use of MCDA to capture the patient voice. MEDLINE and EMBASE were searched in June 2014 for English-language papers with no date restriction. The following search terms were used: 'multi-criteria decision*', 'multiple criteria decision*', 'MCDA', 'benefit risk assessment*', 'risk benefit assessment*', 'multicriteri* decision*', 'MCDM', 'multi-criteri* decision*'. Abstracts were included if they reported the application of MCDA to assess healthcare interventions where patients were the source of weights. Abstracts were excluded if they did not apply MCDA, such as discussions of how MCDA could be used; or did not evaluate healthcare interventions, such as MCDAs to assess the level of health need in a locality. Data were extracted on weighting method, variation in patient and expert preferences, and discussion on different weighting techniques. The review identified ten English-language studies that reported an MCDA to assess healthcare interventions and involved patients as a source of weights. These studies reported 12 applications of MCDA. Different methods of preference elicitation were employed: direct weighting in workshops; discrete choice experiment surveys; and the analytical hierarchy process using both workshops and surveys. There was significant heterogeneity in patient responses and differences between patients, who put greater weight on disease characteristics and treatment convenience, and experts, who put more weight on efficacy. The studies highlighted cognitive challenges associated with some weighting methods, though patients' views on their ability to undertake weighting tasks was positive. This review identified several recent examples of MCDA used to elicit patient preferences, which support the feasibility of using MCDA to capture the patient voice. Challenges identified included, how best to reflect the heterogeneity of patient preferences in decision making and how to manage the cognitive burden associated with some MCDA tasks.
NASA Astrophysics Data System (ADS)
Chung-Wei, Li; Gwo-Hshiung, Tzeng
To deal with complex problems, structuring them through graphical representations and analyzing causal influences can aid in illuminating complex issues, systems, or concepts. The DEMATEL method is a methodology which can be used for researching and solving complicated and intertwined problem groups. The end product of the DEMATEL process is a visual representation—the impact-relations map—by which respondents organize their own actions in the world. The applicability of the DEMATEL method is widespread, ranging from analyzing world problematique decision making to industrial planning. The most important property of the DEMATEL method used in the multi-criteria decision making (MCDM) field is to construct interrelations between criteria. In order to obtain a suitable impact-relations map, an appropriate threshold value is needed to obtain adequate information for further analysis and decision-making. In this paper, we propose a method based on the entropy approach, the maximum mean de-entropy algorithm, to achieve this purpose. Using real cases to find the interrelationships between the criteria for evaluating effects in E-learning programs as an examples, we will compare the results obtained from the respondents and from our method, and discuss that the different impact-relations maps from these two methods.
Balk, Benjamin; Elder, Kelly
2000-01-01
We model the spatial distribution of snow across a mountain basin using an approach that combines binary decision tree and geostatistical techniques. In April 1997 and 1998, intensive snow surveys were conducted in the 6.9‐km2 Loch Vale watershed (LVWS), Rocky Mountain National Park, Colorado. Binary decision trees were used to model the large‐scale variations in snow depth, while the small‐scale variations were modeled through kriging interpolation methods. Binary decision trees related depth to the physically based independent variables of net solar radiation, elevation, slope, and vegetation cover type. These decision tree models explained 54–65% of the observed variance in the depth measurements. The tree‐based modeled depths were then subtracted from the measured depths, and the resulting residuals were spatially distributed across LVWS through kriging techniques. The kriged estimates of the residuals were added to the tree‐based modeled depths to produce a combined depth model. The combined depth estimates explained 60–85% of the variance in the measured depths. Snow densities were mapped across LVWS using regression analysis. Snow‐covered area was determined from high‐resolution aerial photographs. Combining the modeled depths and densities with a snow cover map produced estimates of the spatial distribution of snow water equivalence (SWE). This modeling approach offers improvement over previous methods of estimating SWE distribution in mountain basins.
Standage, Dominic; You, Hongzhi; Wang, Da-Hui; Dorris, Michael C.
2011-01-01
The speed–accuracy trade-off (SAT) is ubiquitous in decision tasks. While the neural mechanisms underlying decisions are generally well characterized, the application of decision-theoretic methods to the SAT has been difficult to reconcile with experimental data suggesting that decision thresholds are inflexible. Using a network model of a cortical decision circuit, we demonstrate the SAT in a manner consistent with neural and behavioral data and with mathematical models that optimize speed and accuracy with respect to one another. In simulations of a reaction time task, we modulate the gain of the network with a signal encoding the urgency to respond. As the urgency signal builds up, the network progresses through a series of processing stages supporting noise filtering, integration of evidence, amplification of integrated evidence, and choice selection. Analysis of the network's dynamics formally characterizes this progression. Slower buildup of urgency increases accuracy by slowing down the progression. Faster buildup has the opposite effect. Because the network always progresses through the same stages, decision-selective firing rates are stereotyped at decision time. PMID:21415911
Standage, Dominic; You, Hongzhi; Wang, Da-Hui; Dorris, Michael C
2011-01-01
The speed-accuracy trade-off (SAT) is ubiquitous in decision tasks. While the neural mechanisms underlying decisions are generally well characterized, the application of decision-theoretic methods to the SAT has been difficult to reconcile with experimental data suggesting that decision thresholds are inflexible. Using a network model of a cortical decision circuit, we demonstrate the SAT in a manner consistent with neural and behavioral data and with mathematical models that optimize speed and accuracy with respect to one another. In simulations of a reaction time task, we modulate the gain of the network with a signal encoding the urgency to respond. As the urgency signal builds up, the network progresses through a series of processing stages supporting noise filtering, integration of evidence, amplification of integrated evidence, and choice selection. Analysis of the network's dynamics formally characterizes this progression. Slower buildup of urgency increases accuracy by slowing down the progression. Faster buildup has the opposite effect. Because the network always progresses through the same stages, decision-selective firing rates are stereotyped at decision time.
Development and initial evaluation of a treatment decision dashboard.
Dolan, James G; Veazie, Peter J; Russ, Ann J
2013-04-21
For many healthcare decisions, multiple alternatives are available with different combinations of advantages and disadvantages across several important dimensions. The complexity of current healthcare decisions thus presents a significant barrier to informed decision making, a key element of patient-centered care.Interactive decision dashboards were developed to facilitate decision making in Management, a field marked by similarly complicated choices. These dashboards utilize data visualization techniques to reduce the cognitive effort needed to evaluate decision alternatives and a non-linear flow of information that enables users to review information in a self-directed fashion. Theoretically, both of these features should facilitate informed decision making by increasing user engagement with and understanding of the decision at hand. We sought to determine if the interactive decision dashboard format can be successfully adapted to create a clinically realistic prototype patient decision aid suitable for further evaluation and refinement. We created a computerized, interactive clinical decision dashboard and performed a pilot test of its clinical feasibility and acceptability using a multi-method analysis. The dashboard summarized information about the effectiveness, risks of side effects and drug-drug interactions, out-of-pocket costs, and ease of use of nine analgesic treatment options for knee osteoarthritis. Outcome evaluations included observations of how study participants utilized the dashboard, questionnaires to assess usability, acceptability, and decisional conflict, and an open-ended qualitative analysis. The study sample consisted of 25 volunteers - 7 men and 18 women - with an average age of 51 years. The mean time spent interacting with the dashboard was 4.6 minutes. Mean evaluation scores on scales ranging from 1 (low) to 7 (high) were: mechanical ease of use 6.1, cognitive ease of use 6.2, emotional difficulty 2.7, decision-aiding effectiveness 5.9, clarification of values 6.5, reduction in decisional uncertainty 6.1, and provision of decision-related information 6.0. Qualitative findings were similarly positive. Interactive decision dashboards can be adapted for clinical use and have the potential to foster informed decision making. Additional research is warranted to more rigorously test the effectiveness and efficiency of patient decision dashboards for supporting informed decision making and other aspects of patient-centered care, including shared decision making.
The art of maturity modeling. Part 2. Alternative models and sensitivity analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waples, D.W.; Suizu, Masahiro; Kamata, Hiromi
1992-01-01
The sensitivity of exploration decisions to variations in several input parameters for maturity modeling was examined for the MITI Rumoi well, Hokkaido, Japan. Decisions were almost completely insensitive to uncertainties about formation age and erosional removal across some unconformities, but were more sensitive to changes in removal during unconformities which occurred near maximum paleotemperatures. Exploration decisions were not very sensitive to the choice of a particular kinetic model for hydrocarbon generation. Uncertainties in kerogen type and the kinetics of different kerogen types are more serious than differences among the various kinetic models. Results of modeling using the TTI method weremore » unsatisfactory. Thermal history and timing and amount of hydrocarbon generation estimated or calculated using the TTI method were greatly different from those obtained using a purely kinetic model. The authors strongly recommend use of the kinetic R{sub o} method instead of the TTI method. If they had lacked measured R{sub o} data, subsurface temperature data, or both, their confidence in the modeling results would have been sharply reduced. Conceptual models for predicting heat flow and thermal conductivity are simply too weak at present to allow one to carry out highly meaningful modeling unless the input is constrained by measured data. Maturity modeling therefore requires the use of more, not fewer, measured temperature and maturity data. The use of sensitivity analysis in maturity modeling is very important for understanding the geologic system, for knowing what level of confidence to place on the results, and for determining what new types of data would be most necessary to improve confidence. Sensitivity analysis can be carried out easily using a rapid, interactive maturity-modeling program.« less