Wu, Jun; Li, Chengbing; Huo, Yueying
2014-01-01
Safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. To address the high accident rate and severe harm of dangerous goods logistics, this paper recasts a group decision-making problem based on the idea of integration and coordination as a multi-agent, multi-objective group decision-making problem; a two-stage decision model is established and applied to the safety assessment of dangerous goods transport enterprises. First, dynamic multi-valued background and entropy theory are used to build the first-level multi-objective decision model. Second, experts are weighted according to the principle of cluster analysis and, combined with relative entropy theory, a second-stage aggregation optimization model based on relative entropy in group decision making is established, and its solution is discussed. Then, after investigation and analysis, a safety evaluation index system for dangerous goods transport enterprises is established. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of the model for assessing dangerous goods transport enterprises, providing a vital decision-making basis for their recognition. PMID:25477954
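Entropy-based weighting of the kind this abstract invokes is commonly implemented as the entropy weight method: a criterion (or expert) whose scores vary little across alternatives carries little information and receives a small weight. A minimal sketch; the score matrix below is hypothetical, not data from the paper:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: criteria whose values vary more across
    alternatives carry more information and receive larger weights.
    X: (m alternatives x n criteria) matrix of non-negative scores."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    # Normalise each criterion column to a probability distribution.
    P = X / X.sum(axis=0)
    # Shannon entropy per criterion, scaled to [0, 1]; 0*log(0) treated as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(m)
    # Degree of divergence becomes the (normalised) weight.
    d = 1.0 - e
    return d / d.sum()

# Hypothetical safety scores of 3 enterprises on 3 indicators.
scores = [[0.9, 0.4, 0.6],
          [0.7, 0.4, 0.9],
          [0.8, 0.4, 0.3]]
w = entropy_weights(scores)
```

Note that the second indicator is identical for all alternatives, so it receives zero weight, which is the defining behaviour of the method.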
Decision modeling for fire incident analysis
Donald G. MacGregor; Armando González-Cabán
2009-01-01
This paper reports on methods for representing and modeling fire incidents based on concepts and models from the decision and risk sciences. A set of modeling techniques are used to characterize key fire management decision processes and provide a basis for incident analysis. The results of these methods can be used to provide insights into the structure of fire...
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) Computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model. Copyright © 2014 Elsevier Ltd. All rights reserved.
Patient or physician preferences for decision analysis: the prenatal genetic testing decision.
Heckerling, P S; Verp, M S; Albert, N
1999-01-01
The choice between amniocentesis and chorionic villus sampling for prenatal genetic testing involves tradeoffs of the benefits and risks of the tests. Decision analysis is a method of explicitly weighing such tradeoffs. The authors examined the relationship between prenatal test choices made by patients and the choices prescribed by decision-analytic models based on their preferences, and separate models based on the preferences of their physicians. Preferences were assessed using written scenarios describing prenatal testing outcomes, and were recorded on linear rating scales. After adjustment for sociodemographic and obstetric confounders, test choice was significantly associated with the choice of decision models based on patient preferences (odds ratio 4.44; CI, 2.53 to 7.78), but not with the choice of models based on the preferences of the physicians (odds ratio 1.60; CI, 0.79 to 3.26). Agreement between decision analyses based on patient preferences and on physician preferences was little better than chance (kappa = 0.085 ± 0.063). These results were robust both to changes in the decision-analytic probabilities and to changes in the model structure itself to simulate non-expected utility decision rules. The authors conclude that patient but not physician preferences, incorporated in decision models, correspond to the choice of amniocentesis or chorionic villus sampling made by the patient. Nevertheless, because patient preferences were assessed after referral for genetic testing, prospective preference-assessment studies will be necessary to confirm this association.
A novel computer based expert decision making model for prostate cancer disease management.
Richman, Martin B; Forman, Ernest H; Bayazit, Yildirim; Einstein, Douglas B; Resnick, Martin I; Stovsky, Mark D
2005-12-01
We propose a strategic, computer based, prostate cancer decision making model based on the analytic hierarchy process. We developed a model that improves physician-patient joint decision making and enhances the treatment selection process by making this critical decision rational and evidence based. Two groups (patient and physician-expert) completed a clinical study comparing an initial disease management choice with the highest ranked option generated by the computer model. Participants made pairwise comparisons to derive priorities for the objectives and subobjectives related to the disease management decision. The weighted comparisons were then applied to treatment options to yield prioritized rank lists that reflect the likelihood that a given alternative will achieve the participant treatment goal. Aggregate data were evaluated by inconsistency ratio analysis and sensitivity analysis, which assessed the influence of individual objectives and subobjectives on the final rank list of treatment options. Inconsistency ratios less than 0.05 were reliably generated, indicating that judgments made within the model were mathematically rational. The aggregate prioritized list of treatment options was tabulated for the patient and physician groups with similar outcomes for the 2 groups. Analysis of the major defining objectives in the treatment selection decision demonstrated the same rank order for the patient and physician groups with cure, survival and quality of life being more important than controlling cancer, preventing major complications of treatment, preventing blood transfusion complications and limiting treatment cost. Analysis of subobjectives, including quality of life and sexual dysfunction, produced similar priority rankings for the patient and physician groups. 
Concordance between initial treatment choice and the highest weighted model option differed between the groups with the patient group having 59% concordance and the physician group having only 42% concordance. This study successfully validated the usefulness of a computer based prostate cancer management decision making model to produce individualized, rational, clinically appropriate disease management decisions without physician bias.
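The analytic hierarchy process underlying this model derives priority weights from pairwise comparisons via the principal eigenvector and checks the rationality of judgments with a consistency (inconsistency) ratio, as the abstract's 0.05 threshold suggests. A sketch under a hypothetical 3-criterion comparison; the example matrix and Saaty's random-index values are assumptions, not the study's data:

```python
import numpy as np

def ahp_priorities(A):
    """Derive priority weights from a pairwise-comparison matrix A
    (A[i][j] = how much more important criterion i is than j) via the
    principal eigenvector, plus Saaty's consistency ratio CR = CI/RI."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)   # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # random index
    return w, ci / ri

# Hypothetical comparison of cure vs quality of life vs treatment cost.
A = [[1,   2,   6],
     [1/2, 1,   3],
     [1/6, 1/3, 1]]
w, cr = ahp_priorities(A)
```

Because the example matrix is perfectly consistent (2 × 3 = 6), the consistency ratio comes out near zero, well under the 0.05 bar the study reports.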
Sensitivity Analysis in Sequential Decision Models.
Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet
2017-02-01
Sequential decision problems are frequently encountered in medical decision making, which are commonly solved using Markov decision processes (MDPs). Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against the uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically in the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in that for a given willingness to pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
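The policy-acceptability idea can be sketched on a toy MDP: solve for the base-case optimal action by value iteration, then resample an uncertain parameter and record how often the base-case action remains optimal. The 2-state model, costs, and Beta distribution below are illustrative assumptions, not the paper's case study:

```python
import numpy as np

rng = np.random.default_rng(0)

def optimal_action(p_success, gamma=0.95, treat_cost=10.0):
    """Toy 2-state MDP (state 0 = sick, state 1 = healthy, absorbing with
    reward 1 per step). In 'sick', choose 'treat' (pay treat_cost, cure
    with probability p_success) or 'wait' (free, never cures). Returns
    the action that value iteration finds optimal in the sick state."""
    V = np.zeros(2)
    for _ in range(300):
        q_treat = -treat_cost + gamma * (p_success * V[1] + (1 - p_success) * V[0])
        q_wait = gamma * V[0]
        V = np.array([max(q_treat, q_wait), 1.0 + gamma * V[1]])
    q_treat = -treat_cost + gamma * (p_success * V[1] + (1 - p_success) * V[0])
    return "treat" if q_treat > gamma * V[0] else "wait"

base = optimal_action(0.6)                 # base-case optimal policy
draws = rng.beta(6, 4, size=500)           # stand-in for parameter uncertainty
acceptability = np.mean([optimal_action(p) == base for p in draws])
```

`acceptability` is the fraction of parameter draws under which the base-case policy stays optimal, i.e. a single point of the policy acceptability curve described above.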
Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate EGR performance and to determine the optimal EGR rate, a choice that currently lacks clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956
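The grey relational step of such a method is typically computed as grey relational coefficients of each candidate against an ideal reference sequence, averaged into a grade per alternative. A sketch with hypothetical normalised scores (not the TBD234V12 data); the distinguishing coefficient rho = 0.5 is the conventional default:

```python
import numpy as np

def grey_relational_grades(X, ideal, rho=0.5):
    """Grey relational analysis: closeness of each alternative (row of X)
    to an ideal reference sequence. Per-criterion coefficients
    xi = (d_min + rho*d_max) / (d + rho*d_max) are averaged into a grade."""
    X = np.asarray(X, dtype=float)
    delta = np.abs(X - np.asarray(ideal, dtype=float))
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)  # relational coefficients
    return xi.mean(axis=1)                           # equal-weight grades

# Hypothetical normalised NOx / fuel-consumption / smoke scores at 3 EGR rates.
X = [[0.9, 0.6, 0.8],
     [0.7, 0.9, 0.7],
     [0.4, 0.5, 0.6]]
grades = grey_relational_grades(X, ideal=[1.0, 1.0, 1.0])
best_egr = int(np.argmax(grades))
```

In the full method the equal-weight average would be replaced by the entropy-optimized target weights the abstract describes.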
Seismic slope-performance analysis: from hazard map to decision support system
Miles, Scott B.; Keefer, David K.; Ho, Carlton L.
1999-01-01
In response to the growing recognition of engineers and decision-makers of the regional effects of earthquake-induced landslides, this paper presents a general approach to conducting seismic landslide zonation, based on the popular Newmark's sliding block analogy for modeling coherent landslides. Four existing models based on the sliding block analogy are compared. The comparison shows that the models forecast notably different levels of slope performance. Considering this discrepancy along with the limitations of static maps as a decision tool, a spatial decision support system (SDSS) for seismic landslide analysis is proposed, which will support investigations over multiple scales for any number of earthquake scenarios and input conditions. Most importantly, the SDSS will allow use of any seismic landslide analysis model and zonation approach. Developments associated with the SDSS will produce an object-oriented model for encapsulating spatial data, an object-oriented specification to allow construction of models using modular objects, and a direct-manipulation, dynamic user-interface that adapts to the particular seismic landslide model configuration.
An Intelligent Decision Support System for Workforce Forecast
2011-01-01
An auto-regressive, integrated, moving-average (ARIMA) model is used to forecast the demand for construction skills in Hong Kong. The surviving fragments of this record list the forecasting techniques surveyed: decision trees, ARIMA, rule-based forecasting, segmentation forecasting, regression analysis, simulation modeling, input-output models, LP and NLP, and Markovian models; rule-based methods are noted as suitable when results are needed as a set of easily interpretable rules.
Diaby, Vakaramoko; Goeree, Ron
2014-02-01
In recent years, the quest for more comprehensiveness, structure and transparency in reimbursement decision-making in healthcare has prompted the research into alternative decision-making frameworks. In this environment, multi-criteria decision analysis (MCDA) is arising as a valuable tool to support healthcare decision-making. In this paper, we present the main MCDA decision support methods (elementary methods, value-based measurement models, goal programming models and outranking models) using a case study approach. For each family of methods, an example of how an MCDA model would operate in a real decision-making context is presented from a critical perspective, highlighting the parameters setting, the selection of the appropriate evaluation model as well as the role of sensitivity and robustness analyses. This study aims to provide a step-by-step guide on how to use MCDA methods for reimbursement decision-making in healthcare.
The need for consumer behavior analysis in health care coverage decisions.
Thompson, A M; Rao, C P
1990-01-01
Demographic analysis has been the primary form of analysis connected with health care coverage decisions. This paper reviews past demographic research and shows the need to use behavioral analyses for health care coverage policy decisions. A behavioral model based research study is presented and a case is made for integrated study into why consumers make health care coverage decisions.
Issue a Boil-Water Advisory or Wait for Definitive Information? A Decision Analysis
Wagner, Michael M.; Wallstrom, Garrick L.; Onisko, Agnieszka
2005-01-01
Objective: Study the decision to issue a boil-water advisory in response to a spike in sales of diarrhea remedies or wait 72 hours for the results of definitive testing of water and people. Methods: Decision analysis. Results: In the base-case analysis, the optimal decision is test-and-wait. If the cost of issuing a boil-water advisory is less than 13.92 cents per person per day, the optimal decision is to issue the boil-water advisory immediately. Conclusions: Decisions based on surveillance data that are suggestive but not conclusive about the existence of a disease outbreak can be modeled. PMID:16779145
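The threshold logic of such an analysis can be sketched as a two-branch expected-cost comparison: issue the advisory now, or wait for testing and bear the expected illness cost of an undetected outbreak. All parameter values below are illustrative assumptions, not the paper's inputs:

```python
def expected_costs(p_outbreak, advisory_cost_per_person_day, illness_cost,
                   population=1_000_000, delay_days=3):
    """Hypothetical two-branch decision tree. Returns the expected cost of
    (a) issuing a boil-water advisory immediately and (b) waiting
    delay_days for definitive test results."""
    # Issue now: everyone bears the advisory cost during the delay window.
    issue = advisory_cost_per_person_day * population * delay_days
    # Wait: with probability p_outbreak, a share of the population exposed
    # during the delay falls ill (assumed daily attack rate).
    attack_rate_per_day = 0.005
    wait = p_outbreak * illness_cost * population * attack_rate_per_day * delay_days
    return issue, wait

issue, wait = expected_costs(p_outbreak=0.02,
                             advisory_cost_per_person_day=0.25,
                             illness_cost=1000.0)
best = "issue advisory" if issue < wait else "test and wait"
```

With these made-up numbers the cheaper branch is test-and-wait; dropping the advisory cost per person-day below the break-even point flips the decision, which is exactly the threshold structure the abstract reports.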
The BCD of response time analysis in experimental economics.
Spiliopoulos, Leonidas; Ortmann, Andreas
2018-01-01
For decisions in the wild, time is of the essence. Available decision time is often cut short through natural or artificial constraints, or is impinged upon by the opportunity cost of time. Experimental economists have only recently begun to conduct experiments with time constraints and to analyze response time (RT) data, in contrast to experimental psychologists. RT analysis has proven valuable for the identification of individual and strategic decision processes including identification of social preferences in the latter case, model comparison/selection, and the investigation of heuristics that combine speed and performance by exploiting environmental regularities. Here we focus on the benefits, challenges, and desiderata of RT analysis in strategic decision making. We argue that unlocking the potential of RT analysis requires the adoption of process-based models instead of outcome-based models, and discuss how RT in the wild can be captured by time-constrained experiments in the lab. We conclude that RT analysis holds considerable potential for experimental economics, deserves greater attention as a methodological tool, and promises important insights on strategic decision making in naturally occurring environments.
Hunt, Randall J.
2012-01-01
Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.
Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe
2011-05-30
Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
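The core SMAA computation can be sketched as Monte Carlo sampling of preference weight vectors and tallying how often each alternative ranks first (its first-rank acceptability index). The alternatives and normalised scores below are hypothetical stand-ins for the antidepressant data:

```python
import numpy as np

rng = np.random.default_rng(1)

def smaa_rank_acceptability(criteria, n_samples=5_000):
    """Stochastic multicriteria acceptability analysis (SMAA) sketch:
    with unknown preferences, sample weight vectors uniformly from the
    simplex and record how often each alternative scores highest.
    criteria: (alternatives x criteria) matrix, higher = better."""
    C = np.asarray(criteria, dtype=float)
    wins = np.zeros(C.shape[0])
    for _ in range(n_samples):
        w = rng.dirichlet(np.ones(C.shape[1]))  # uniform over the simplex
        wins[np.argmax(C @ w)] += 1
    return wins / n_samples

# Hypothetical normalised scores: response, nausea, insomnia, headache
# (adverse-reaction columns already inverted so that higher is better).
scores = [[0.8, 0.6, 0.5, 0.6],   # fluoxetine
          [0.9, 0.4, 0.4, 0.5],   # venlafaxine
          [0.3, 0.9, 0.9, 0.9]]   # placebo
acc = smaa_rank_acceptability(scores)
```

Non-zero acceptability for more than one alternative is the "clear trade-offs" situation the abstract describes: which treatment wins depends on how the decision maker weighs efficacy against adverse reactions.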
Barbieri, Christopher E; Cha, Eugene K; Chromecki, Thomas F; Dunning, Allison; Lotan, Yair; Svatek, Robert S; Scherr, Douglas S; Karakiewicz, Pierre I; Sun, Maxine; Mazumdar, Madhu; Shariat, Shahrokh F
2012-03-01
• To employ decision curve analysis to determine the impact of nuclear matrix protein 22 (NMP22) on clinical decision making in the detection of bladder cancer using data from a prospective trial. • The study included 1303 patients at risk for bladder cancer who underwent cystoscopy, urine cytology and measurement of urinary NMP22 levels. • We constructed several prediction models to estimate risk of bladder cancer. The base model was generated using patient characteristics (age, gender, race, smoking and haematuria); cytology and NMP22 were added to the base model to determine effects on predictive accuracy. • Clinical net benefit was calculated by summing the benefits and subtracting the harms and weighting these by the threshold probability at which a patient or clinician would opt for cystoscopy. • In all, 72 patients were found to have bladder cancer (5.5%). In univariate analyses, NMP22 was the strongest predictor of bladder cancer presence (predictive accuracy 71.3%), followed by age (67.5%) and cytology (64.3%). • In multivariable prediction models, NMP22 improved the predictive accuracy of the base model by 8.2% (area under the curve 70.2-78.4%) and of the base model plus cytology by 4.2% (area under the curve 75.9-80.1%). • Decision curve analysis revealed that adding NMP22 to other models increased clinical benefit, particularly at higher threshold probabilities. • NMP22 is a strong, independent predictor of bladder cancer. • Addition of NMP22 improves the accuracy of standard predictors by a statistically and clinically significant margin. • Decision curve analysis suggests that integration of NMP22 into clinical decision making helps avoid unnecessary cystoscopies, with minimal increased risk of missing a cancer. © 2011 THE AUTHORS. BJU INTERNATIONAL © 2011 BJU INTERNATIONAL.
Criteria for assessing problem solving and decision making in complex environments
NASA Technical Reports Server (NTRS)
Orasanu, Judith
1993-01-01
Training crews to cope with unanticipated problems in high-risk, high-stress environments requires models of effective problem solving and decision making. Existing decision theories use the criteria of logical consistency and mathematical optimality to evaluate decision quality. While these approaches are useful under some circumstances, the assumptions underlying these models frequently are not met in dynamic time-pressured operational environments. Also, applying formal decision models is both labor and time intensive, a luxury often lacking in operational environments. Alternate approaches and criteria are needed. Given that operational problem solving and decision making are embedded in ongoing tasks, evaluation criteria must address the relation between those activities and satisfaction of broader task goals. Effectiveness and efficiency become relevant for judging reasoning performance in operational environments. New questions must be addressed: What is the relation between the quality of decisions and overall performance by crews engaged in critical high risk tasks? Are different strategies most effective for different types of decisions? How can various decision types be characterized? A preliminary model of decision types found in air transport environments will be described along with a preliminary performance model based on an analysis of 30 flight crews. The performance analysis examined behaviors that distinguish more and less effective crews (based on performance errors). Implications for training and system design will be discussed.
Analysis of the decision-making process of nurse managers: a collective reflection.
Eduardo, Elizabete Araujo; Peres, Aida Maris; de Almeida, Maria de Lourdes; Roglio, Karina de Dea; Bernardino, Elizabeth
2015-01-01
Objective: to analyze the decision-making model adopted by nurses from the perspective of decision-making process theories. Method: qualitative approach, based on action research. Semi-structured questionnaires and seminars were conducted from April to June 2012 in order to understand the nature of decisions and the decision-making process of nine nurses in management positions at a public hospital in Southern Brazil. Data were subjected to content analysis. Results: data were classified in two categories: the current situation of decision-making, which showed a lack of systematization; and the collective construction of decision-making, which emphasizes the need to develop a decision-making model. Conclusion: the decision-making model used by nurses is limited because it does not consider two important factors: the limits of human rationality, and the external and internal organizational environments that influence and determine right decisions.
Theory of the decision/problem state
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A theory of the decision-problem state was introduced and elaborated. Starting with the basic model of a decision-problem condition, an attempt was made to explain how a major decision-problem may consist of subsets of decision-problem conditions composing different condition sequences. In addition, the basic classical decision-tree model was modified to allow for the introduction of a series of characteristics that may be encountered in an analysis of a decision-problem state. The resulting hierarchical model reflects the unique attributes of the decision-problem state. The basic model of a decision-problem condition was used as a base to evolve a more complex model that is more representative of the decision-problem state and may be used to initiate research on decision-problem states.
Development of an evidence-based decision pathway for vestibular schwannoma treatment options.
Linkov, Faina; Valappil, Benita; McAfee, Jacob; Goughnour, Sharon L; Hildrew, Douglas M; McCall, Andrew A; Linkov, Igor; Hirsch, Barry; Snyderman, Carl
To integrate multiple sources of clinical information with patient feedback into an evidence-based decision support model that facilitates treatment selection for patients suffering from vestibular schwannomas (VS). This was a mixed-methods study utilizing focus group and survey methodology to solicit feedback on the factors important to patients when making treatment decisions. Two 90-minute focus groups were conducted by an experienced facilitator. Previously diagnosed VS patients were recruited by clinical investigators at the University of Pittsburgh Medical Center (UPMC). Classical content analysis was used for focus group data analysis. Providers were recruited from practices within the UPMC system and were surveyed using Delphi methods. This information can provide a basis for a multi-criteria decision analysis (MCDA) framework to develop a treatment decision support system for patients with VS. Eight themes were derived from these data (focus group + surveys): doctor/health care system, side effects, effectiveness of treatment, anxiety, mortality, family/other people, quality of life, and post-operative symptoms. These data, as well as feedback from physicians, were utilized in building a multi-criteria decision model. The study illustrated the steps involved in the development of a decision support model that integrates evidence-based data and patient values to select treatment alternatives. Studies focusing on the actual development of the decision support technology for this group of patients are needed, as decisions are highly multifactorial. Such tools have the potential to improve decision making for complex medical problems with alternate treatment pathways. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heo, Yeonsook; Augenbroe, Godfried; Graziano, Diane
2015-05-01
The increasing interest in retrofitting of existing buildings is motivated by the need to make a major contribution to enhancing building energy efficiency and reducing energy consumption and CO2 emission by the built environment. This paper examines the relevance of calibration in model-based analysis to support decision-making for energy and carbon efficiency retrofits of individual buildings and portfolios of buildings. The authors formulate a set of real retrofit decision-making situations and evaluate the role of calibration by using a case study that compares predictions and decisions from an uncalibrated model with those of a calibrated model. The case study illustrates both the mechanics and outcomes of a practical alternative to the expert- and time-intense application of dynamic energy simulation models for large-scale retrofit decision-making under uncertainty.
A study on spatial decision support systems for HIV/AIDS prevention based on COM GIS technology
NASA Astrophysics Data System (ADS)
Yang, Kun; Luo, Huasong; Peng, Shungyun; Xu, Quanli
2007-06-01
Based on an in-depth analysis of the current status and existing problems of GIS technology applications in epidemiology, this paper proposes a method and process for establishing spatial decision support systems for AIDS epidemic prevention by integrating COM GIS, spatial database, GPS, remote sensing, and communication technologies, as well as ASP and ActiveX software development technologies. One of the most important issues in constructing such a system is how to integrate AIDS spreading models with GIS. The paper first describes the capabilities of GIS applications in AIDS epidemic prevention, then discusses several mature epidemic spreading models in order to extract their computation parameters. Furthermore, a technical schema is proposed for integrating AIDS spreading models with GIS and relevant geospatial technologies, in which the GIS and model-running platforms share a common spatial database and the computing results can be spatially visualized on desktop or Web GIS clients. Finally, a complete solution for establishing a decision support system for AIDS epidemic prevention is offered, based on the model-integration methods and ESRI COM GIS software packages. The overall system comprises sub-systems for data acquisition, network communication, model integration, an AIDS epidemic information spatial database, information querying and statistical analysis, dynamic epidemic surveillance, spatial analysis and decision support, and Web GIS-based information publishing.
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
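The net-benefit calculation that underlies a decision curve can be sketched in a few lines. The formula NB(pt) = TP/n − (FP/n) · pt/(1 − pt) is the standard one from decision curve analysis; the outcome labels and predicted probabilities below are hypothetical toy data, not the paper's.

```python
def net_benefit(y_true, y_prob, pt):
    """Net benefit of treating every patient whose predicted risk is >= pt."""
    n = len(y_true)
    treated = [p >= pt for p in y_prob]
    tp = sum(1 for t, y in zip(treated, y_true) if t and y == 1)
    fp = sum(1 for t, y in zip(treated, y_true) if t and y == 0)
    return tp / n - (fp / n) * pt / (1 - pt)

# Hypothetical outcomes and model-predicted probabilities.
y = [1, 0, 1, 1, 0, 0, 0, 1]
p = [0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.3, 0.8]

# A decision curve is net benefit evaluated over a range of thresholds.
curve = [(pt, net_benefit(y, p, pt)) for pt in (0.1, 0.3, 0.5)]
```

Plotting net benefit against pt, alongside the treat-all and treat-none reference strategies, yields the decision curve.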
A multicriteria decision making model for assessment and selection of an ERP in a logistics context
NASA Astrophysics Data System (ADS)
Pereira, Teresa; Ferreira, Fernanda A.
2017-07-01
The aim of this work is to apply a decision-support methodology based on a multicriteria decision analysis (MCDA) model that allows the assessment and selection of an Enterprise Resource Planning (ERP) system in a Portuguese logistics company by a group decision maker (GDM). A decision support system (DSS) implementing an MCDA method, the Multicriteria Methodology for the Assessment and Selection of Information Systems / Information Technologies (MMASSI/IT), is used, chosen for its features and the ease with which the model can be changed and adapted to a given scope. Using this DSS, the information system best suited to the decision context was identified, and the result was evaluated through a sensitivity and robustness analysis.
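The abstract does not spell out the MMASSI/IT scoring mechanics, but many MCDA assessments reduce to a weighted-sum aggregation over criteria. A minimal sketch with hypothetical criteria, weights, and scores (not the MMASSI/IT model itself):

```python
# Hypothetical criteria weights elicited from the group decision maker.
weights = {"functionality": 0.4, "cost": 0.25, "vendor_support": 0.2, "adaptability": 0.15}

# Hypothetical 0-10 performance scores per ERP alternative.
alternatives = {
    "ERP_A": {"functionality": 8, "cost": 6, "vendor_support": 7, "adaptability": 9},
    "ERP_B": {"functionality": 7, "cost": 9, "vendor_support": 6, "adaptability": 6},
}

def score(alt):
    """Weighted-sum aggregation of one alternative's criterion scores."""
    return sum(weights[c] * alt[c] for c in weights)

ranked = sorted(alternatives, key=lambda a: score(alternatives[a]), reverse=True)
```

A sensitivity analysis then perturbs the weights and checks whether `ranked` changes.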
Hernandez, Jonathan M; Tsalatsanis, Athanasios; Humphries, Leigh Ann; Miladinovic, Branko; Djulbegovic, Benjamin; Velanovich, Vic
2014-06-01
To use regret decision theory methodology to assess three treatment strategies in pancreatic adenocarcinoma. Pancreatic adenocarcinoma is uniformly fatal without operative intervention. Resection can prolong survival in some patients; however, it is associated with significant morbidity and mortality. Regret theory serves as a novel framework linking both rationality and intuition to determine the optimal course for physicians facing difficult decisions related to treatment. We used the Cox proportional hazards model to predict survival of patients with pancreatic adenocarcinoma and generated a decision model using regret-based decision curve analysis, which integrates both the patient's prognosis and the physician's preferences expressed in terms of regret associated with a certain action. A physician's treatment preferences are indicated by a threshold probability, which is the probability of death/survival at which the physician is uncertain whether or not to perform surgery. The analysis modeled 3 possible choices: perform surgery on all patients; never perform surgery; and act according to the prediction model. The records of 156 consecutive patients with pancreatic adenocarcinoma were retrospectively evaluated by a single surgeon at a tertiary referral center. Significant independent predictors of overall survival included preoperative stage [P = 0.005; 95% confidence interval (CI), 1.19-2.27], vitality (P < 0.001; 95% CI, 0.96-0.98), daily physical function (P < 0.001; 95% CI, 0.97-0.99), and pathological stage (P < 0.001; 95% CI, 3.06-16.05). Compared with the "always aggressive" or "always passive" surgical treatment strategies, the survival model was associated with the least amount of regret for a wide range of threshold probabilities. 
Regret-based decision curve analysis provides a novel perspective for making treatment-related decisions by incorporating the decision maker's preferences expressed as his or her estimates of benefits and harms associated with the treatment considered.
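The threshold probability at the heart of regret-based analysis follows from equating the expected regrets of the two actions: pt/(1 − pt) = harm/benefit, so pt = harm/(harm + benefit). A minimal sketch, with hypothetical harm and benefit magnitudes rather than the study's elicited values:

```python
def threshold_probability(harm, benefit):
    """Probability at which the decision maker is indifferent between
    acting and not acting: pt / (1 - pt) = harm / benefit."""
    return harm / (harm + benefit)

def choose(p_benefit, pt):
    """Act (e.g., operate) when the predicted probability of benefiting
    from the action exceeds the threshold probability."""
    return "treat" if p_benefit >= pt else "do not treat"

# Hypothetical: harms of unnecessary surgery rated 1/3 of the benefit
# of surgery in a patient who would profit from it.
pt = threshold_probability(harm=1.0, benefit=3.0)
```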
Jiao, Y; Chen, R; Ke, X; Cheng, L; Chu, K; Lu, Z; Herskovits, E H
2011-01-01
Autism spectrum disorder (ASD) is a neurodevelopmental disorder, of which Asperger syndrome and high-functioning autism are subtypes. Our goal is: 1) to determine whether a diagnostic model based on single-nucleotide polymorphisms (SNPs), brain regional thickness measurements, or brain regional volume measurements can distinguish Asperger syndrome from high-functioning autism; and 2) to compare the SNP, thickness, and volume-based diagnostic models. Our study included 18 children with ASD: 13 subjects with high-functioning autism and 5 subjects with Asperger syndrome. For each child, we obtained 25 SNPs for 8 ASD-related genes; we also computed regional cortical thicknesses and volumes for 66 brain structures, based on structural magnetic resonance (MR) examination. To generate diagnostic models, we employed five machine-learning techniques: decision stump, alternating decision trees, multi-class alternating decision trees, logistic model trees, and support vector machines. For SNP-based classification, three decision-tree-based models performed better than the other two machine-learning models. The performance metrics for three decision-tree-based models were similar: decision stump was modestly better than the other two methods, with accuracy = 90%, sensitivity = 0.95 and specificity = 0.75. All thickness and volume-based diagnostic models performed poorly. The SNP-based diagnostic models were superior to those based on thickness and volume. For SNP-based classification, rs878960 in GABRB3 (gamma-aminobutyric acid A receptor, beta 3) was selected by all tree-based models. Our analysis demonstrated that SNP-based classification was more accurate than morphometry-based classification in ASD subtype classification. Also, we found that one SNP--rs878960 in GABRB3--distinguishes Asperger syndrome from high-functioning autism.
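A decision stump, the best-performing model above, is simply a one-level decision tree that classifies by thresholding a single feature. A self-contained sketch on hypothetical toy data (not the study's SNP data):

```python
def fit_stump(X, y):
    """Exhaustively pick the (feature, threshold, left-label) split that
    minimizes misclassifications on binary-labeled data."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            for label_left in (0, 1):
                preds = [label_left if row[j] <= t else 1 - label_left for row in X]
                errors = sum(p != yi for p, yi in zip(preds, y))
                if best is None or errors < best[0]:
                    best = (errors, j, t, label_left)
    return best[1:]  # (feature index, threshold, label for the left branch)

def predict(stump, x):
    j, t, label_left = stump
    return label_left if x[j] <= t else 1 - label_left

# Hypothetical data: feature 0 separates the classes perfectly.
X = [[0, 5], [1, 3], [0, 8], [1, 9]]
y = [0, 1, 0, 1]
stump = fit_stump(X, y)
```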
He, Xin; Frey, Eric C
2006-08-01
Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate the three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption will not apply for all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that by making assumptions for both MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and the resulting ROC surface contains the maximum likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
Trusted Advisors, Decision Models and Other Keys to Communicating Science to Decision Makers
NASA Astrophysics Data System (ADS)
Webb, E.
2006-12-01
Water resource management decisions often involve multiple parties engaged in contentious negotiations that try to navigate through complex combinations of legal, social, hydrologic, financial, and engineering considerations. The standard approach for resolving these issues is some form of multi-party negotiation, a formal court decision, or a combination of the two. In all these cases, the role of the decision maker(s) is to choose and implement the best option that fits the needs and wants of the community. However, each path to a decision carries the risk of technical and/or financial infeasibility as well as the possibility of unintended consequences. To help reduce this risk, decision makers often rely on some type of predictive analysis from which they can evaluate the projected consequences of their decisions. Typically, decision makers are supported in the analysis process by trusted advisors who engage in the analysis as well as the day to day tasks associated with multi-party negotiations. In the case of water resource management, the analysis is frequently a numerical model or set of models that can simulate various management decisions across multiple systems and output results that illustrate the impact on areas of concern. Thus, in order to communicate scientific knowledge to the decision makers, the quality of the communication between the analysts, the trusted advisor, and the decision maker must be clear and direct. To illustrate this concept, a multi-attribute decision analysis matrix will be used to outline the value of computer model-based collaborative negotiation approaches to guide water resources decision making and communication with decision makers. In addition, the critical role of the trusted advisor and other secondary participants in the decision process will be discussed using examples from recent water negotiations.
History matching through dynamic decision-making
Maschio, Célio; Santos, Antonio Alberto; Schiozer, Denis; Rocha, Anderson
2017-01-01
History matching is the process of modifying the uncertain attributes of a reservoir model to reproduce the real reservoir performance. It is a classical reservoir engineering problem and plays an important role in reservoir management, since the resulting models are used to support decisions in other tasks such as economic analysis and production strategy. This work introduces a dynamic decision-making optimization framework for history matching problems in which new models are generated based on, and guided by, the dynamic analysis of the data of available solutions. The optimization framework follows a ‘learning-from-data’ approach, and includes two optimizer components that use machine learning techniques, such as unsupervised learning and statistical analysis, to uncover patterns of input attributes that lead to good output responses. These patterns are used to support the decision-making process while generating new, and better, history-matched solutions. The proposed framework is applied to a benchmark model (UNISIM-I-H) based on the Namorado field in Brazil. Results show the potential of the dynamic decision-making optimization framework for improving the quality of history matching solutions using a substantially smaller number of simulations compared with a previous work on the same benchmark. PMID:28582413
Boutkhoum, Omar; Hanine, Mohamed; Agouti, Tarik; Tikniouine, Abdessadek
2015-01-01
In this paper, we examine the issue of strategic industrial location selection in uncertain decision making environments for establishing a new industrial corporation. The industrial location issue is typically considered a crucial factor in the business research field, as it involves many considerations regarding natural resources, distributors, suppliers, customers, and other matters. Based on the integration of the environmental, economic and social decisive elements of sustainable development, this paper presents a hybrid decision making model combining fuzzy multi-criteria analysis with the analytical capabilities that OLAP systems can provide for successful and optimal industrial location selection. The proposed model consists of three stages. In the first stage, a decision-making committee is established to identify the evaluation criteria impacting the location selection process. In the second stage, we develop fuzzy AHP software based on the extent analysis method to assign importance weights to the selected criteria, which allows us to model linguistic vagueness, ambiguity, and incomplete knowledge. In the last stage, OLAP analysis integrated with multi-criteria analysis employs these weighted criteria as inputs to evaluate, rank and select the strategic industrial location for establishing a new business corporation in the region of Casablanca, Morocco. Finally, a sensitivity analysis is performed to evaluate the impact of the criteria weights and the preferences given by decision makers on the final rankings of strategic industrial locations.
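The paper uses Chang's fuzzy extent analysis for the AHP weighting step; as a simpler illustration of how pairwise comparisons become criterion weights, here is the crisp geometric-mean AHP variant on a hypothetical comparison matrix (three criteria: environmental, economic, social):

```python
import math

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale;
# A[i][j] is the judged importance of criterion i relative to j.
A = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

# Geometric mean of each row, then normalize to obtain the weights.
gm = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]
```

The fuzzy extent analysis method replaces the crisp entries with triangular fuzzy numbers but serves the same purpose: turning judgments into criterion weights.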
Tsalatsanis, Athanasios; Barnes, Laura E; Hozo, Iztok; Djulbegovic, Benjamin
2011-12-23
Despite the well documented advantages of hospice care, most terminally ill patients do not reap the maximum benefit from hospice services, with the majority of them receiving hospice care either prematurely or delayed. Decision systems to improve the hospice referral process are sorely needed. We present a novel theoretical framework that is based on well-established methodologies of prognostication and decision analysis to assist with the hospice referral process for terminally ill patients. We linked the SUPPORT statistical model, widely regarded as one of the most accurate models for prognostication of terminally ill patients, with the recently developed regret based decision curve analysis (regret DCA). We extend the regret DCA methodology to consider harms associated with the prognostication test as well as harms and effects of the management strategies. In order to enable patients and physicians in making these complex decisions in real-time, we developed an easily accessible web-based decision support system available at the point of care. The web-based decision support system facilitates the hospice referral process in three steps. First, the patient or surrogate is interviewed to elicit his/her personal preferences regarding the continuation of life-sustaining treatment vs. palliative care. Then, regret DCA is employed to identify the best strategy for the particular patient in terms of threshold probability at which he/she is indifferent between continuation of treatment and of hospice referral. Finally, if necessary, the probabilities of survival and death for the particular patient are computed based on the SUPPORT prognostication model and contrasted with the patient's threshold probability. The web-based design of the CDSS enables patients, physicians, and family members to participate in the decision process from anywhere internet access is available. We present a theoretical framework to facilitate the hospice referral process. 
Further rigorous clinical evaluation including testing in a prospective randomized controlled trial is required and planned.
Faults Discovery By Using Mined Data
NASA Technical Reports Server (NTRS)
Lee, Charles
2005-01-01
Fault discovery in complex systems draws on model-based reasoning, fault tree analysis, rule-based inference, and other approaches. Model-based reasoning builds models of the system either from mathematical formulations or from experimental data. Fault tree analysis shows the possible causes of a system malfunction by enumerating the suspect components and their respective failure modes that may have induced the problem. Rule-based inference builds the model from expert knowledge. These models and methods have one thing in common: they presume certain prior conditions. Complex systems often use fault trees to analyze faults. When an error occurs, fault diagnosis is performed by engineers and analysts through extensive examination of all data gathered during the mission. The International Space Station (ISS) control center operates on data fed back from the system, and decisions are made against threshold values using fault trees. Since these decision-making tasks are safety critical and must be done promptly, the engineers who manually analyze the data face a time challenge. To automate this process, this paper presents an approach that uses decision trees to discover faults from data in real time and captures the contents of fault trees as the initial state of the trees.
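A fault tree of the kind described, with AND/OR gates over threshold exceedances, can be evaluated in a few lines. The sensor names and limits below are hypothetical illustrations, not actual ISS telemetry:

```python
# Hypothetical sensor limits (leaf events of the fault tree).
LIMITS = {"pump_temp": 85.0, "tank_pressure": 120.0, "coolant_flow": 2.0}

def exceeded(readings):
    """Leaf events: each reading checked against its threshold."""
    return {
        "pump_temp": readings["pump_temp"] > LIMITS["pump_temp"],
        "tank_pressure": readings["tank_pressure"] > LIMITS["tank_pressure"],
        "coolant_flow": readings["coolant_flow"] < LIMITS["coolant_flow"],
    }

def cooling_fault(readings):
    """AND gate: overheating together with loss of coolant flow."""
    e = exceeded(readings)
    return e["pump_temp"] and e["coolant_flow"]

def system_alarm(readings):
    """OR gate at the tree root: any branch fault raises the alarm."""
    return exceeded(readings)["tank_pressure"] or cooling_fault(readings)
```

The decision-tree approach in the paper learns structures of this shape from mission data instead of having engineers encode them by hand.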
Eckman, Mark H.; Alonso-Coello, Pablo; Guyatt, Gordon H.; Ebrahim, Shanil; Tikkinen, Kari A.O.; Lopes, Luciane Cruz; Neumann, Ignacio; McDonald, Sarah D.; Zhang, Yuqing; Zhou, Qi; Akl, Elie A.; Jacobsen, Ann Flem; Santamaría, Amparo; Annichino-Bizzacchi, Joyce Maria; Bitar, Wael; Sandset, Per Morten; Bates, Shannon M.
2016-01-01
Background Women with a history of venous thromboembolism (VTE) have an increased recurrence risk during pregnancy. Low molecular weight heparin (LMWH) reduces this risk, but is costly, burdensome, and may increase risk of bleeding. The decision to start thromboprophylaxis during pregnancy is sensitive to women's values and preferences. Our objective was to compare women's choices using a holistic approach in which they were presented with all of the relevant information (direct-choice) versus a personalized decision analysis in which a mathematical model incorporated their preferences and VTE risk to make a treatment recommendation. Methods Multicenter, international study. Structured interviews were conducted with women with a history of VTE who were pregnant, planning, or considering pregnancy. Women indicated their willingness to receive thromboprophylaxis based on scenarios using personalized estimates of VTE recurrence and bleeding risks. We also obtained women's values for health outcomes using a visual analog scale. We performed individualized decision analyses for each participant and compared model recommendations to decisions made when presented with the direct-choice exercise. Results Of the 123 women in the study, the decision model recommended LMWH for 51 women and recommended against LMWH for 72 women. 12% (6/51) of women for whom the decision model recommended thromboprophylaxis chose not to take LMWH; 72% (52/72) of women for whom the decision model recommended against thromboprophylaxis chose LMWH. Conclusions We observed a high degree of discordance between decisions in the direct-choice exercise and decision model recommendations. Although which approach best captures individuals’ true values remains uncertain, personalized decision support tools presenting results based on personalized risks and values may improve decision making. PMID:26033397
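A personalized decision analysis of this kind can be caricatured as an expected-utility comparison of prophylaxis versus no prophylaxis. All probabilities, the relative risk, and the utilities below are hypothetical placeholders, not values from the study's model:

```python
def expected_utility(p_vte, rr_lmwh, p_bleed, u_vte, u_bleed, lmwh):
    """Expected utility starting from full health (1.0), discounted by
    the disutility of a VTE event and, if on LMWH, of a bleed."""
    u = 1.0
    p_event = p_vte * (rr_lmwh if lmwh else 1.0)
    u -= p_event * (1.0 - u_vte)
    if lmwh:
        u -= p_bleed * (1.0 - u_bleed)
    return u

def recommend(p_vte, rr_lmwh=0.3, p_bleed=0.02, u_vte=0.8, u_bleed=0.9):
    """Recommend LMWH when its expected utility exceeds no prophylaxis."""
    with_lmwh = expected_utility(p_vte, rr_lmwh, p_bleed, u_vte, u_bleed, True)
    without = expected_utility(p_vte, rr_lmwh, p_bleed, u_vte, u_bleed, False)
    return "LMWH" if with_lmwh > without else "no LMWH"
```

Feeding each woman's personal recurrence risk and elicited utilities into such a model is what produces the individualized recommendations compared against the direct-choice exercise.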
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
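The bootstrap-based probabilistic sensitivity analysis the authors describe can be sketched as follows: resample patients with replacement, recompute the cost-effectiveness statistic each time, and read off a percentile interval. The per-patient cost and effect data below are hypothetical, not the H. pylori model:

```python
import random

random.seed(0)

# Hypothetical per-patient incremental costs and effects.
costs = [120, 340, 95, 410, 220, 180, 305, 150, 260, 130]
effects = [0.02, 0.05, 0.01, 0.06, 0.04, 0.03, 0.05, 0.02, 0.04, 0.02]

def bootstrap_icer(n_rep=1000):
    """Resample patients with replacement and return the distribution of
    the incremental cost-effectiveness ratio (mean cost / mean effect)."""
    n = len(costs)
    icers = []
    for _ in range(n_rep):
        sample = [random.randrange(n) for _ in range(n)]
        mean_cost = sum(costs[i] for i in sample) / n
        mean_eff = sum(effects[i] for i in sample) / n
        icers.append(mean_cost / mean_eff)
    return icers

icers = sorted(bootstrap_icer())
ci = (icers[25], icers[974])  # approximate 95% percentile interval
```

Because the resamples come from the observed data, no theoretical distribution for costs or effects has to be assumed, which is the advantage the authors note.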
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Daniel; Vesselinov, Velimir V.
MADSpython (Model analysis and decision support tools in Python) is a Python code that streamlines the process of using data and models for analysis and decision support with the code MADS. MADS is open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). MADS can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. The Python scripts in MADSpython facilitate the generation of the input and output files needed by MADS as well as by the external simulators, which include FEHM and PFLOTRAN. MADSpython enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. MADSpython will be released under the GPL v3 license and distributed as a Git repo at gitlab.com and github.com. The MADSpython manual and documentation will be posted at http://madspy.lanl.gov.
Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling
2018-04-01
Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involved an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weight based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the ranking of the top five and lowest five drugs remains unchanged, suggesting this model is generally robust. An evidence-based drug evaluation model based on AHP was successfully developed. 
The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
Colas, Jaron T
2017-01-01
In principle, formal dynamical models of decision making hold the potential to represent fundamental computations underpinning value-based (i.e., preferential) decisions in addition to perceptual decisions. Sequential-sampling models such as the race model and the drift-diffusion model that are grounded in simplicity, analytical tractability, and optimality remain popular, but some of their more recent counterparts have instead been designed with an aim for more feasibility as architectures to be implemented by actual neural systems. Connectionist models are proposed herein at an intermediate level of analysis that bridges mental phenomena and underlying neurophysiological mechanisms. Several such models drawing elements from the established race, drift-diffusion, feedforward-inhibition, divisive-normalization, and competing-accumulator models were tested with respect to fitting empirical data from human participants making choices between foods on the basis of hedonic value rather than a traditional perceptual attribute. Even when considering performance at emulating behavior alone, more neurally plausible models were set apart from more normative race or drift-diffusion models both quantitatively and qualitatively despite remaining parsimonious. To best capture the paradigm, a novel six-parameter computational model was formulated with features including hierarchical levels of competition via mutual inhibition as well as a static approximation of attentional modulation, which promotes "winner-take-all" processing. Moreover, a meta-analysis encompassing several related experiments validated the robustness of model-predicted trends in humans' value-based choices and concomitant reaction times. These findings have yet further implications for analysis of neurophysiological data in accordance with computational modeling, which is also discussed in this new light.
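The sequential-sampling mechanism at the core of these models can be sketched in a few lines. Below is a minimal, illustrative drift-diffusion simulation; all parameter values are assumptions for the example, not the study's fitted six-parameter model:

```python
import random

def simulate_ddm(drift, threshold=1.0, noise=1.0, dt=0.001, max_t=5.0, rng=None):
    """Simulate one drift-diffusion trial: evidence accumulates with a
    constant drift plus Gaussian noise until it crosses +threshold
    (choice 1) or -threshold (choice 0). Returns (choice, reaction time)."""
    rng = rng or random.Random()
    x, t = 0.0, 0.0
    step_sd = noise * dt ** 0.5  # Euler-Maruyama noise scale
    while t < max_t:
        x += drift * dt + rng.gauss(0.0, step_sd)
        t += dt
        if x >= threshold:
            return 1, t
        if x <= -threshold:
            return 0, t
    return (1 if x > 0 else 0), t  # timeout: guess from current evidence

rng = random.Random(42)
trials = [simulate_ddm(drift=0.8, rng=rng) for _ in range(500)]
p_upper = sum(c for c, _ in trials) / len(trials)   # choice proportion
mean_rt = sum(t for _, t in trials) / len(trials)   # mean reaction time
```

With a positive drift the upper (preferred) boundary is hit on most trials, and stronger drift (higher value difference) yields both more consistent choices and faster reaction times, the qualitative pattern such models are fit against.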
2018 Military Retirement Options: An Expected Net Present Value Decision Analysis Model
2017-03-23
Witham, Bret N. (Captain, USAF). Master of Science in Operations Research thesis, Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, March 2017. Available at https://scholar.afit.edu/etd. Distribution Statement A: approved for public release; distribution unlimited.
NASA Astrophysics Data System (ADS)
Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.
2017-12-01
For the past few years, natural disaster has been a subject of debate in disaster management, especially flood disaster. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management is essential to ensure that life-saving efforts are not wasted. The aim of this article is to examine the relationship of approach, decision maker, influence factor, result, and ethics to decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were identified from the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia in the states of Kelantan and Pahang, and a total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement model indicates that approach, decision maker, influence factor, result, and ethics have a significant and direct effect on decision making during disaster. The results from this study show that decision making during disaster is an important element of disaster management and necessitates successful collaborative decision making. The measurement model is accepted for further analysis using Structural Equation Modeling (SEM) and can be assessed in future research.
Applicability of aquifer impact models to support decisions at CO2 sequestration sites
Keating, Elizabeth; Bacon, Diana; Carroll, Susan; ...
2016-07-25
The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites. This capability includes polynomial or look-up-table-based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014a; Carroll et al., 2014b; Dai et al., 2014; Keating et al., 2016). In this paper, we seek to demonstrate the applicability of ROM-based analysis by considering what types of decisions and aquifer types would benefit from ROM analysis. We present four hypothetical examples where applying ROMs, in ensemble mode, could support decisions during a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so may have broader applicability. Based on analysis of the nine water quality metrics (pH, TDS, 4 trace metals, 3 organic compounds), we conclude that the pH and TDS predictions are the most transferable to other aquifers. Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
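The surrogate-modeling idea behind such ROMs can be illustrated with a toy example: fit a cheap polynomial to a handful of runs of an "expensive" model, then propagate an uncertain leak rate through the surrogate in ensemble mode to get decision-relevant probabilities. The forward model and all numbers below are hypothetical stand-ins, not the NRAP reactive-transport simulations:

```python
import numpy as np

# Toy stand-in for an expensive reactive-transport simulation:
# aquifer pH as a (hypothetical) function of CO2 leak rate.
def expensive_model(leak_rate):
    return 7.8 - 1.2 * np.log1p(leak_rate)

# Build a polynomial ROM from a handful of "training" runs.
train_x = np.linspace(0.0, 10.0, 12)
train_y = expensive_model(train_x)
rom = np.poly1d(np.polyfit(train_x, train_y, deg=3))

# Ensemble mode: propagate an uncertain leak rate through the cheap ROM
# and derive the probability of exceeding a water-quality threshold.
rng = np.random.default_rng(0)
leak_samples = rng.uniform(0.0, 10.0, size=10_000)
ph_samples = rom(leak_samples)
prob_below_6_5 = float(np.mean(ph_samples < 6.5))
```

The 10,000 surrogate evaluations cost a fraction of a single "full" simulation, which is what makes ensemble-mode probabilistic analysis practical.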
Characterizing Decision-Analysis Performances of Risk Prediction Models Using ADAPT Curves.
Lee, Wen-Chung; Wu, Yun-Chun
2016-01-01
The area under the receiver operating characteristic curve is a widely used index to characterize the performance of diagnostic tests and prediction models. However, the index does not explicitly acknowledge the utilities of risk predictions. Moreover, for most clinical settings, what counts is whether a prediction model can guide therapeutic decisions in a way that improves patient outcomes, rather than to simply update probabilities. Based on decision theory, the authors propose an alternative index, the "average deviation about the probability threshold" (ADAPT). An ADAPT curve (a plot of ADAPT value against the probability threshold) neatly characterizes the decision-analysis performances of a risk prediction model. Several prediction models can be compared for their ADAPT values at a chosen probability threshold, for a range of plausible threshold values, or for the whole ADAPT curves. This should greatly facilitate the selection of diagnostic tests and prediction models.
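The abstract does not reproduce the ADAPT formula itself; as a generic illustration of threshold-based, utility-aware evaluation of a risk model, the following computes decision-curve net benefit at a chosen probability threshold (toy data, and net benefit rather than the authors' ADAPT index):

```python
def net_benefit(y_true, y_prob, pt):
    """Decision-curve net benefit of treating every patient whose
    predicted risk is >= probability threshold pt. The pt/(1-pt) factor
    encodes the harm-benefit trade-off implied by choosing pt."""
    n = len(y_true)
    treated = [(y, p) for y, p in zip(y_true, y_prob) if p >= pt]
    tp = sum(1 for y, _ in treated if y == 1)
    fp = len(treated) - tp
    return tp / n - fp / n * (pt / (1 - pt))

# Toy outcomes and model-predicted risks for eight patients:
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_prob = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.1, 0.1]

nb_model = net_benefit(y_true, y_prob, pt=0.5)               # 0.125
nb_treat_all = net_benefit(y_true, [1.0] * len(y_true), pt=0.5)
```

Unlike the area under the ROC curve, this quantity changes with the probability threshold, so the same model can be clinically useful at one threshold and harmful at another, which is the motivation for plotting performance against the threshold as the ADAPT curve does.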
ERIC Educational Resources Information Center
Hilbig, Benjamin E.; Pohl, Rudiger F.
2009-01-01
According to part of the adaptive toolbox notion of decision making known as the recognition heuristic (RH), the decision process in comparative judgments--and its duration--is determined by whether recognition discriminates between objects. By contrast, some recently proposed alternative models predict that choices largely depend on the amount of…
Azadeh, A; Mokhtari, Z; Sharahi, Z Jiryaei; Zarrin, M
2015-12-01
Decision making failure is a predominant human error in emergency situations. To demonstrate the proposed model, operators of an oil refinery were asked to complete a health, safety and environment (HSE)-decision styles (DS) questionnaire. To this end, qualitative indicators in the HSE and ergonomics domains were collected. The decision styles associated with the questions were selected based on the Driver taxonomy of human decision making. Teamwork efficiency was assessed for different decision style combinations and ranked based on HSE performance. Results revealed that the efficient decision styles identified by the data envelopment analysis (DEA) optimization model are consistent with the plant's dominant styles. Therefore, system performance could be improved by assigning the best operators to critical posts or team arrangements. This is the first study that identifies the best decision styles with respect to HSE and ergonomics factors. Copyright © 2015 Elsevier Ltd. All rights reserved.
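The DEA optimization behind such an efficiency ranking can be sketched as an input-oriented CCR envelopment linear program; the data below are a toy one-input, one-output example, not the study's actual HSE indicators:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, dmu):
    """Input-oriented CCR (constant returns to scale) DEA efficiency of
    one decision making unit, via the envelopment LP:
        min theta  s.t.  X'lam <= theta * x_o,  Y'lam >= y_o,  lam >= 0."""
    X = np.asarray(inputs, float)   # shape (n_dmus, n_inputs)
    Y = np.asarray(outputs, float)  # shape (n_dmus, n_outputs)
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimise theta
    # Input rows:  sum_j lam_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[dmu].reshape(m, 1), X.T])
    # Output rows: -sum_j lam_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[dmu]]),
                  bounds=[(0, None)] * (1 + n))
    return res.fun  # efficiency score theta in (0, 1]

# Two hypothetical teams: same output, team 1 uses twice the input.
inputs = [[2.0], [4.0]]
outputs = [[2.0], [2.0]]
eff = [ccr_efficiency(inputs, outputs, j) for j in range(2)]  # [1.0, 0.5]
```

A score of 1.0 marks an efficient unit on the frontier; scores below 1.0 quantify how much a unit could proportionally shrink its inputs while keeping its outputs.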
Reflections in the clinical practice.
Borrell-Carrió, F; Hernández-Clemente, J C
2014-03-01
The purpose of this article is to analyze some models of expert decision making and their impact on clinical practice. We analyze decision making considering cognitive aspects (explanatory models, perceptual skills, analysis of the variability of a phenomenon, creation of habits and inertia of reasoning, and declarative models based on criteria). We add the importance of emotions in decision making within highly complex situations, such as those occurring in clinical practice. The quality of the reflective act depends, among other factors, on the ability of metacognition (thinking about what we think). Finally, we propose an educational strategy based on having a task supervisor and rectification scenarios to improve the quality of medical decision making. Copyright © 2013 Elsevier España, S.L. All rights reserved.
Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz
2016-01-01
Objectives 1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; 2) To exploit expert knowledge in the BN development since further data acquisition is usually not possible; 3) To ensure the BN model can be used for interventional analysis; 4) To demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data is available. Method The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recent validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. Results When employed with the same datasets, the DSVM-MSS demonstrated competitive to superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. 
Conclusions This development process is applicable to any application domain which involves large-scale decision analysis based on such complex information, rather than based on data with hard facts, and in conjunction with the incorporation of expert knowledge for decision support via intervention. The novelty extends to challenging the decision scientists to reason about building models based on what information is really required for inference, rather than based on what data is available and hence, forces decision scientists to use available data in a much smarter way. PMID:26830286
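The core mechanism that lets a Bayesian network answer questions "based on unobserved evidence", summing out the unobserved nodes, can be shown with a toy three-node network; the structure and probabilities below are hypothetical illustrations, not the DSVM models:

```python
from itertools import product

# Toy network: latent risk R influences an observed symptom S and a
# documented history H. All probabilities are invented for the example.
P_R = {1: 0.2, 0: 0.8}
P_S = {1: {1: 0.7, 0: 0.3}, 0: {1: 0.1, 0: 0.9}}  # P(S=s | R=r) -> P_S[r][s]
P_H = {1: {1: 0.6, 0: 0.4}, 0: {1: 0.2, 0: 0.8}}  # P(H=h | R=r)

def posterior_risk(evidence):
    """P(R=1 | evidence) by brute-force enumeration; any node missing
    from `evidence` (an unanswered question) is simply summed out."""
    num = den = 0.0
    for r, s, h in product([0, 1], repeat=3):
        if evidence.get("S", s) != s or evidence.get("H", h) != h:
            continue  # inconsistent with the observed evidence
        p = P_R[r] * P_S[r][s] * P_H[r][h]
        den += p
        num += p if r == 1 else 0.0
    return num / den

p_symptom = posterior_risk({"S": 1})        # history unobserved
p_both = posterior_risk({"S": 1, "H": 1})   # both observed
```

Real medical BNs have far more nodes and use smarter inference algorithms than enumeration, but the ability to reason coherently with whichever responses happen to be available is exactly this marginalization.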
An integrated GIS-based, multi-attribute decision model deployed in a web-based platform is presented enabling an iterative, spatially explicit and collaborative analysis of relevant and available information for repurposing vacant land. The process incorporated traditional and ...
Demeter, Sandor J
2016-12-21
Health care providers (HCP) and clinical scientists (CS) are generally most comfortable using evidence-based rational decision-making models. They become very frustrated when policymakers make decisions that, on the surface, seem irrational and unreasonable. However, such decisions usually make sense when analysed properly. The goal of this paper is to provide a basic theoretical understanding of major policy models, to illustrate which models are most prevalent in publicly funded health care systems, and to propose a policy analysis framework to better understand the elements that drive policy decision-making. The proposed policy framework will also assist HCP and CS in achieving greater success with their own proposals.
Analysis of a decision model in the context of equilibrium pricing and order book pricing
NASA Astrophysics Data System (ADS)
Wagner, D. C.; Schmitt, T. A.; Schäfer, R.; Guhr, T.; Wolf, D. E.
2014-12-01
An agent-based model for financial markets has to incorporate two aspects: decision making and price formation. We introduce a simple decision model and consider its implications in two different pricing schemes. First, we study its parameter dependence within a supply-demand balance setting and find realistic behavior in a wide parameter range. Second, we embed our decision model in an order book setting. Here, we observe interesting features which are not present in the equilibrium pricing scheme. In particular, we find a nontrivial behavior of the order book volumes which is reminiscent of a trend-switching phenomenon. Thus, the decision making model alone does not realistically reproduce trading and the stylized facts; the order book mechanism is crucial.
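The two ingredients named here, a decision rule and a price-formation scheme, can be combined in a minimal sketch. The herding/noise decision rule and the supply-demand (equilibrium) pricing below are illustrative assumptions, not the paper's model:

```python
import random

def simulate(n_agents=100, n_steps=1000, herding=0.3, noise=0.1,
             impact=0.01, seed=1):
    """Minimal agent-based market: each step, agents either follow the
    crowd (herding) or flip their buy/sell opinion at random (noise);
    the log-price then moves in proportion to net excess demand."""
    rng = random.Random(seed)
    opinions = [rng.choice([-1, 1]) for _ in range(n_agents)]  # +1 buy, -1 sell
    log_price, returns = 0.0, []
    for _ in range(n_steps):
        mean_op = sum(opinions) / n_agents
        for i in range(n_agents):
            if rng.random() < herding:           # follow the majority
                opinions[i] = 1 if mean_op >= 0 else -1
            elif rng.random() < noise:           # idiosyncratic flip
                opinions[i] = -opinions[i]
        excess_demand = sum(opinions)
        r = impact * excess_demand               # supply-demand price update
        log_price += r
        returns.append(r)
    return returns

returns = simulate()
```

An order book setting would replace the single price-update line with explicit limit/market orders and queue matching, which is where the richer volume dynamics the authors describe come from.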
Dhukaram, Anandhi Vivekanandan; Baber, Chris
2015-06-01
Patients make various healthcare decisions on a daily basis. Such day-to-day decision making can have significant consequences for their own health, treatment, care, and costs. While decision aids (DAs) provide effective support in enhancing patients' decision making, to date there have been few studies examining patients' decision making processes or exploring how an understanding of such decision processes can aid in extracting requirements for the design of DAs. This paper applies Cognitive Work Analysis (CWA) to analyse patients' decision making in order to inform requirements for supporting self-care decision making. The study uses focus groups to elicit information from elderly cardiovascular disease (CVD) patients concerning a range of decision situations they face on a daily basis. Specifically, the focus groups addressed issues related to CVD decision making in terms of medication compliance, pain, diet and exercise. The results of these focus groups are used to develop high-level views using CWA. The CWA framework decomposes the complex decision making problem to inform three approaches to DA design: one based on high-level requirements; one based on a normative model of decision making for patients; and a third based on a range of heuristics that patients seem to use. CWA helps in extracting and synthesising decision making from different perspectives: decision processes, work organisation, patient competencies and strategies used in decision making. As decision making can be influenced by human behaviour such as skills, rules and knowledge, it is argued that patients require support for different types of decision making. This paper also provides insights for designers in using the CWA framework for the design of effective DAs to support patients in self-management. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
[Parameter of evidence-based medicine in health care economics].
Wasem, J; Siebert, U
1999-08-01
In view of the scarcity of resources, economic evaluations in health care, in which not only the effects but also the costs of a medical intervention are examined and an incremental cost-outcome ratio is built, are an important supplement to the program of evidence-based medicine. Outcomes of a medical intervention can be measured by clinical effectiveness, quality-adjusted life years, or monetary valuation of benefits. As far as costs are concerned, direct medical costs, direct non-medical costs and indirect costs have to be considered in an economic evaluation. Data can be drawn from primary studies or secondary analyses; meta-analysis may be appropriate for synthesizing data. For the calculation of incremental cost-benefit ratios, decision-analytic models (decision tree models, Markov models) are often necessary. Methodological and ethical limits to the application of the results of economic evaluations in resource allocation decisions in health care have to be regarded: economic evaluations and the calculation of cost-outcome ratios should only support decision making, not replace it.
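A minimal decision-tree cost-effectiveness calculation of the kind described, producing an incremental cost-outcome ratio, might look as follows. All probabilities, costs and QALY values are hypothetical:

```python
def expected(tree):
    """Expected (cost, QALYs) of a decision-tree branch, where the branch
    is a list of terminal nodes (probability, cost, QALYs) summing to p=1."""
    cost = sum(p * c for p, c, _ in tree)
    qaly = sum(p * q for p, _, q in tree)
    return cost, qaly

# Hypothetical comparison of a new intervention vs. standard care:
standard = [(0.6, 1000.0, 0.70),   # 60%: mild course
            (0.4, 5000.0, 0.40)]   # 40%: complication
new_drug = [(0.8, 3000.0, 0.75),
            (0.2, 7000.0, 0.45)]

c0, q0 = expected(standard)
c1, q1 = expected(new_drug)
icer = (c1 - c0) / (q1 - q0)   # incremental cost per QALY gained
```

The resulting ICER (here roughly 10,900 cost units per QALY) is then compared against a willingness-to-pay threshold; Markov models generalize this by letting the cohort cycle through health states over time rather than resolving in one step.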
Impact of model-based risk analysis for liver surgery planning.
Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K
2014-05-01
A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of the International Symposium on Computer Assisted Radiology and Surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To examine whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, time to complete the planning task and participants' confidence were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.
Cai, Hao; Long, Weiding; Li, Xianting; Kong, Lingjuan; Xiong, Shuang
2010-06-15
When hazardous contaminants are suddenly released indoors, prompt and proper emergency responses are critical to protect occupants. This paper aims to provide a framework for determining the optimal combination of ventilation and evacuation strategies by considering the uncertainty of source locations. The certainty of source locations is classified as complete certainty, incomplete certainty, or complete uncertainty to cover all possible situations. According to this classification, three types of decision analysis models are presented. A new concept, efficiency factor of contaminant source (EFCS), is incorporated in these models to evaluate the payoffs of the ventilation and evacuation strategies. A procedure of decision-making based on these models is proposed and demonstrated by numerical studies of one hundred scenarios with ten ventilation modes, two evacuation modes, and five source locations. The results show that the models can be useful in directing the decision analysis of both ventilation and evacuation strategies. In addition, the certainty of the source locations has an important effect on the outcomes of the decision-making. Copyright 2010 Elsevier B.V. All rights reserved.
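The three certainty regimes map naturally onto classical decision criteria: pick the best strategy for a known location (complete certainty), maximize expected payoff under a probability distribution over locations (incomplete certainty), or maximin the worst case (complete uncertainty). A sketch with hypothetical effectiveness scores standing in for EFCS-based payoffs:

```python
# Rows: candidate ventilation/evacuation strategies (names invented);
# columns: three possible source locations. Higher payoff is better.
payoff = {
    "exhaust_A": [0.9, 0.2, 0.5],
    "exhaust_B": [0.4, 0.8, 0.6],
    "purge_all": [0.6, 0.6, 0.6],
}

def best_known(location):
    """Complete certainty: the source location is known exactly."""
    return max(payoff, key=lambda s: payoff[s][location])

def best_expected(probs):
    """Incomplete certainty: a probability per candidate location."""
    return max(payoff, key=lambda s: sum(p * v for p, v in zip(probs, payoff[s])))

def best_maximin():
    """Complete uncertainty: maximize the worst-case payoff."""
    return max(payoff, key=lambda s: min(payoff[s]))

choice_known = best_known(0)                   # 'exhaust_A'
choice_exp = best_expected([0.2, 0.5, 0.3])    # 'exhaust_B'
choice_robust = best_maximin()                 # 'purge_all'
```

Note how the recommended strategy changes with the certainty regime, which mirrors the paper's finding that the certainty of source locations has an important effect on the decision outcome.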
Directional Slack-Based Measure for the Inverse Data Envelopment Analysis
Abu Bakar, Mohd Rizam; Lee, Lai Soon; Jaafar, Azmi B.; Heydar, Maryam
2014-01-01
A novel technique is introduced in this research based on the directional slack-based measure for inverse data envelopment analysis. The research elucidates the inverse directional slack-based measure model within a new production possibility set, in which the output (input) quantities of an efficient decision making unit are modified. Specifically, the efficient decision making unit is removed from the present production possibility set and substituted by the same unit with its input and output quantities modified. In this approach, the efficiency scores of all DMUs are retained, and the efficiency score may even improve. The proposed approach is investigated with reference to a resource allocation problem, in which increases (decreases) of certain outputs associated with the efficient decision making unit can be considered simultaneously. The significance of the presented model is illustrated by numerical examples. PMID:24883350
Robustness analysis of a green chemistry-based model for the ...
This paper proposes a robustness analysis based on Multiple Criteria Decision Aiding (MCDA). The ensuing model was used to assess the implementation of green chemistry principles in the synthesis of silver nanoparticles. Its recommendations were also compared to an earlier developed model for the same purpose to investigate concordance between the models and potential decision support synergies. A three-phase procedure was adopted to achieve the research objectives. Firstly, an ordinal ranking of the evaluation criteria used to characterize the implementation of green chemistry principles was identified through relative ranking analysis. Secondly, a structured selection process for an MCDA classification method was conducted, which ensued in the identification of Stochastic Multi-Criteria Acceptability Analysis (SMAA). Lastly, the agreement of the classifications by the two MCDA models and the resulting synergistic role of decision recommendations were studied. This comparison showed that the results of the two models agree between 76% and 93% of the simulation set-ups and it confirmed that different MCDA models provide a more inclusive and transparent set of recommendations. This integrative research confirmed the beneficial complementary use of MCDA methods to aid responsible development of nanosynthesis, by accounting for multiple objectives and helping communication of complex information in a comprehensive and traceable format, suitable for stakeholders and
Abe, James; Lobo, Jennifer M; Trifiletti, Daniel M; Showalter, Timothy N
2017-08-24
Despite the emergence of genomics-based risk prediction tools in oncology, there is not yet an established framework for communicating test results to cancer patients to support shared decision-making. We report findings from a stakeholder engagement program that aimed to develop a framework for using Markov models with individualized model inputs, including genomics-based estimates of cancer recurrence probability, to generate personalized decision aids for prostate cancer patients faced with radiation therapy treatment decisions after prostatectomy. We engaged a total of 22 stakeholders, including prostate cancer patients, urological surgeons, radiation oncologists, genomic testing industry representatives, and biomedical informatics faculty. Slides were presented at each meeting to provide background information regarding the analytical framework. Participants were invited to provide feedback during the meeting, including revising the overall project aims. Stakeholder meeting content was reviewed and summarized by stakeholder group and by theme. The majority of stakeholder suggestions focused on aspects of decision aid design and formatting. Stakeholders were enthusiastic about the potential value of using decision analysis modeling with personalized model inputs for cancer recurrence risk, as well as competing risks from age and comorbidities, to generate a patient-centered tool to assist decision-making. Stakeholders did not view privacy considerations as a major barrier to the proposed decision aid program. A common theme was that decision aids should be portable across multiple platforms (electronic and paper), should allow the user to adjust model inputs iteratively, and should be available to patients both before and during consult appointments. Emphasis was placed on the challenge of explaining the model's composite result of quality-adjusted life years.
A range of stakeholders provided valuable insights regarding the design of a personalized decision aid program, based upon Markov modeling with individualized model inputs, to provide a patient-centered framework that supports genomics-based treatment decisions for cancer patients. The guidance provided by our stakeholders may be broadly applicable to the communication of genomic test results to patients in a patient-centered fashion that supports effective shared decision-making and represents a spectrum of personal factors such as age, medical comorbidities, and individual priorities and values.
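A Markov cohort model with an individualized, genomics-based recurrence probability as its input can be sketched as follows. The states, transition probabilities and utilities below are hypothetical, not the program's actual model:

```python
import numpy as np

def qaly_markov(p_recur, n_cycles=20, u_well=0.9, u_recur=0.6, p_death=0.02):
    """Expected QALYs from a three-state Markov cohort model
    (well -> recurrence -> dead), one-year cycles. p_recur is the
    individualized (e.g., genomics-based) annual recurrence probability."""
    # States: 0 = well, 1 = recurrence, 2 = dead. Rows sum to 1.
    T = np.array([
        [1 - p_recur - p_death, p_recur, p_death],
        [0.0, 1 - 5 * p_death, 5 * p_death],   # higher mortality after recurrence
        [0.0, 0.0, 1.0],                        # dead is absorbing
    ])
    state = np.array([1.0, 0.0, 0.0])           # whole cohort starts well
    utilities = np.array([u_well, u_recur, 0.0])
    total = 0.0
    for _ in range(n_cycles):
        state = state @ T                       # advance the cohort one cycle
        total += float(state @ utilities)       # accrue quality-adjusted time
    return total

# Two patients differing only in their genomic recurrence-risk estimate:
qaly_low = qaly_markov(p_recur=0.05)
qaly_high = qaly_markov(p_recur=0.25)
```

Running the same model per treatment strategy (e.g., with radiation reducing p_recur at some utility and cost penalty) yields the personalized QALY comparison that the stakeholders found hard to explain and that the decision aid must communicate.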
NASA Technical Reports Server (NTRS)
2002-01-01
Under a Phase II SBIR contract, Kennedy and Lumina Decision Systems, Inc., jointly developed the Schedule and Cost Risk Analysis Modeling (SCRAM) system, based on a version of Lumina's flagship software product, Analytica(R). Acclaimed as "the best single decision-analysis program yet produced" by MacWorld magazine, Analytica is a "visual" tool used in decision-making environments worldwide to build, revise, and present business models, minus the time-consuming difficulty commonly associated with spreadsheets. With Analytica as their platform, Kennedy and Lumina created the SCRAM system in response to NASA's need to identify the importance of major delays in Shuttle ground processing, a critical function in project management and process improvement. As part of the SCRAM development project, Lumina designed a version of Analytica called the Analytica Design Engine (ADE) that can be easily incorporated into larger software systems. ADE was commercialized and utilized in many other developments, including web-based decision support.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo
2016-07-01
Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code can also use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo, and Bayesian Information-Gap Decision Theory. The code is implemented in the Julia programming language and has high-performance (parallel) and memory management capabilities. The code uses a series of third-party modules developed by others. The code development will also include contributions to the existing third-party modules written in Julia; these contributions will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of LANL-developed modules written by Dan O'Malley; these modules will also be part of the Mads.jl release. Mads.jl will be released under the GPL V3 license. The code will be distributed as a Git repo at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.
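Mads.jl itself is Julia code; as a language-neutral illustration of the model-calibration step such frameworks automate, the following fits a toy exponential-decay transport model to noisy observations by least squares. The forward model, parameters, and data are invented for the example and are not Mads.jl's API:

```python
import numpy as np
from scipy.optimize import least_squares

# Toy forward model: contaminant concentration c(t) = c0 * exp(-k t).
def forward(params, t):
    c0, k = params
    return c0 * np.exp(-k * t)

# Synthetic "field" observations from known true parameters plus noise.
t_obs = np.linspace(0.0, 10.0, 20)
true_params = np.array([5.0, 0.3])
rng = np.random.default_rng(7)
c_obs = forward(true_params, t_obs) + rng.normal(0.0, 0.05, t_obs.size)

# Calibration: minimize the residual between model and observations.
fit = least_squares(lambda p: forward(p, t_obs) - c_obs,
                    x0=[1.0, 1.0], bounds=([0.0, 0.0], [np.inf, np.inf]))
c0_hat, k_hat = fit.x
```

Sensitivity analysis and uncertainty quantification then build on the same forward-model interface, e.g., by perturbing parameters or sampling their posterior rather than point-optimizing.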
Background / Question / Methods Planning for the recovery of threatened species is increasingly informed by spatially-explicit population models. However, using simulation model results to guide land management decisions can be difficult due to the volume and complexity of model...
Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model.
Reyna, Valerie F; Brainerd, Charles J
2011-09-01
From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals-that reasoning biases emerge with development -have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts both improvement and developmental reversals in reasoning and decision making. Drawing on research on logical and quantitative reasoning, as well as on risky decision making in the laboratory and in life, we illustrate how the same small set of theoretical principles apply to typical neurodevelopment, encompassing childhood, adolescence, and adulthood, and to neurological conditions such as autism and Alzheimer's disease. For example, framing effects-that risk preferences shift when the same decisions are phrases in terms of gains versus losses-emerge in early adolescence as gist-based intuition develops. In autistic individuals, who rely less on gist-based intuition and more on verbatim-based analysis, framing biases are attenuated (i.e., they outperform typically developing control subjects). In adults, simple manipulations based on fuzzy-trace theory can make framing effects appear and disappear depending on whether gist-based intuition or verbatim-based analysis is induced. These theoretical principles are summarized and integrated in a new mathematical model that specifies how dual modes of reasoning combine to produce predictable variability in performance. In particular, we show how the most popular and extensively studied model of decision making-prospect theory-can be derived from fuzzy-trace theory by combining analytical (verbatim-based) and intuitive (gist-based) processes.
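The framing effect the abstract describes can be reproduced with the standard prospect-theory value function. The sketch below is an illustration, not the paper's fuzzy-trace model: the parameters are Tversky and Kahneman's published 1992 estimates, the "lives saved/lost" scenario is the classic framing example, and probability weighting is omitted for brevity.

```python
# Prospect-theory value function with Tversky & Kahneman (1992) parameter
# estimates (assumed here purely for illustration).
ALPHA = 0.88   # value-function curvature
LAMBDA = 2.25  # loss-aversion coefficient

def value(x):
    """Concave over gains, convex and steeper over losses."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** ALPHA)

# Framing: "200 of 600 saved" (gain frame) vs "400 of 600 die" (loss frame).
# Probability weighting is omitted; curvature alone produces the reversal.
sure_gain = value(200)            # sure option, gain frame
risky_gain = (1 / 3) * value(600) # 1/3 chance all 600 saved
sure_loss = value(-400)           # sure option, loss frame
risky_loss = (2 / 3) * value(-600)# 2/3 chance all 600 die

print(sure_gain > risky_gain)  # risk aversion in the gain frame
print(sure_loss < risky_loss)  # risk seeking in the loss frame
```

Concavity over gains makes the sure option more attractive in the gain frame, while convexity over losses flips the preference in the loss frame, which is the shift in risk preference the abstract refers to.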
Air Force Nuclear Enterprise Organization: A Case Study
2016-09-15
will improve the performance of the AFNE. Based on analysis of commercial and industrial business models, what organizational structure, or...Business Dictionary 2015). Organizational structures will be developed based on decisions made with regard to design. The core of an...work flows. Based on design parameter decisions, senior leaders will establish an organizational structure that includes the layout of the
Demographics of reintroduced populations: estimation, modeling, and decision analysis
Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.
2013-01-01
Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.
Devaluation and sequential decisions: linking goal-directed and model-based behavior
Friedel, Eva; Koch, Stefan P.; Wendt, Jean; Heinz, Andreas; Deserno, Lorenz; Schlagenhauf, Florian
2014-01-01
In experimental psychology, different experiments have been developed to assess goal-directed as compared to habitual control over instrumental decisions. Similar to animal studies, selective devaluation procedures have been used. More recently, sequential decision-making tasks have been designed to assess the degree of goal-directed vs. habitual choice behavior in terms of an influential computational theory of model-based compared to model-free behavioral control. As recently suggested, different measurements are thought to reflect the same construct. Yet, there has been no attempt to directly assess the construct validity of these different measurements. In the present study, we used a devaluation paradigm and a sequential decision-making task to address this question of construct validity in a sample of 18 healthy male human participants. Correlational analysis revealed a positive association between model-based choices during sequential decisions and goal-directed behavior after devaluation, suggesting a single framework underlying both operationalizations and speaking in favor of the construct validity of both measurement approaches. Up to now, this has been merely assumed but never directly tested in humans. PMID:25136310
Generalisability in economic evaluation studies in healthcare: a review and case studies.
Sculpher, M J; Pang, F S; Manca, A; Drummond, M F; Golder, S; Urdahl, H; Davies, L M; Eastwood, A
2004-12-01
To review, and to develop further, the methods used to assess and to increase the generalisability of economic evaluation studies. Electronic databases. Methodological studies relating to economic evaluation in healthcare were searched. This included electronic searches of a range of databases, including PREMEDLINE, MEDLINE, EMBASE and EconLit, and manual searches of key journals. The case studies of a decision analytic model involved highlighting specific features of previously published economic studies related to generalisability and location-related variability. The case-study involving the secondary analysis of cost-effectiveness analyses was based on the secondary analysis of three economic studies using data from randomised trials. The factor most frequently cited as generating variability in economic results between locations was the unit costs associated with particular resources. In the context of studies based on the analysis of patient-level data, regression analysis has been advocated as a means of looking at variability in economic results across locations. These methods have generally accepted that some components of resource use and outcomes are exchangeable across locations. Recent studies have also explored, in cost-effectiveness analysis, the use of tests of heterogeneity similar to those used in clinical evaluation in trials. The decision analytic model has been the main means by which cost-effectiveness has been adapted from trial to non-trial locations. Most models have focused on changes to the cost side of the analysis, but it is clear that the effectiveness side may also need to be adapted between locations. There have been weaknesses in some aspects of the reporting in applied cost-effectiveness studies. These may limit decision-makers' ability to judge the relevance of a study to their specific situations. The case study demonstrated the potential value of multilevel modelling (MLM). Where clustering exists by location (e.g. 
centre or country), MLM can facilitate correct estimates of the uncertainty in cost-effectiveness results, and also a means of estimating location-specific cost-effectiveness. The review of applied economic studies based on decision analytic models showed that few studies were explicit about their target decision-maker(s)/jurisdictions. The studies in the review generally made more effort to ensure that their cost inputs were specific to their target jurisdiction than their effectiveness parameters. Standard sensitivity analysis was the main way of dealing with uncertainty in the models, although few studies looked explicitly at variability between locations. The modelling case study illustrated how effectiveness and cost data can be made location-specific. In particular, on the effectiveness side, the example showed the separation of location-specific baseline events and pooled estimates of relative treatment effect, where the latter are assumed exchangeable across locations. A large number of factors are mentioned in the literature that might be expected to generate variation in the cost-effectiveness of healthcare interventions across locations. Several papers have demonstrated differences in the volume and cost of resource use between locations, but few studies have looked at variability in outcomes. In applied trial-based cost-effectiveness studies, few studies provide sufficient evidence for decision-makers to establish the relevance or to adjust the results of the study to their location of interest. Very few studies utilised statistical methods formally to assess the variability in results between locations. In applied economic studies based on decision models, most studies either stated their target decision-maker/jurisdiction or provided sufficient information from which this could be inferred. There was a greater tendency to ensure that cost inputs were specific to the target jurisdiction than clinical parameters. 
Methods to assess generalisability and variability in economic evaluation studies have been discussed extensively in the literature relating to both trial-based and modelling studies. Regression-based methods are likely to offer a systematic approach to quantifying variability in patient-level data. In particular, MLM has the potential to facilitate estimates of cost-effectiveness, which both reflect the variation in costs and outcomes between locations and also enable the consistency of cost-effectiveness estimates between locations to be assessed directly. Decision analytic models will retain an important role in adapting the results of cost-effectiveness studies between locations. Recommendations for further research include: the development of methods of evidence synthesis which model the exchangeability of data across locations and allow for the additional uncertainty in this process; assessment of alternative approaches to specifying multilevel models to the analysis of cost-effectiveness data alongside multilocation randomised trials; identification of a range of appropriate covariates relating to locations (e.g. hospitals) in multilevel models; and further assessment of the role of econometric methods (e.g. selection models) for cost-effectiveness analysis alongside observational datasets, and to increase the generalisability of randomised trials.
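The clustering problem the review raises can be illustrated with a small simulation. All numbers below (centre counts, cost means, variance components) are invented; the point is only that a naive standard error that treats patients as independent understates uncertainty when costs cluster by centre, which is what multilevel modelling is meant to correct.

```python
import random
import statistics

random.seed(1)

# Hypothetical multicentre cost data: each centre has its own mean cost
# (a centre-level random effect), mimicking clustering by location.
n_centres, patients_per_centre = 20, 30
costs, centre_means = [], []
for c in range(n_centres):
    centre_effect = random.gauss(0, 300)   # between-centre variability
    centre_costs = [1000 + centre_effect + random.gauss(0, 100)
                    for _ in range(patients_per_centre)]
    costs.extend(centre_costs)
    centre_means.append(statistics.mean(centre_costs))

n = len(costs)
# Naive SE: ignores clustering, treats all patients as independent
naive_se = statistics.stdev(costs) / n ** 0.5
# Cluster-level SE: treats centre means as the independent units
cluster_se = statistics.stdev(centre_means) / n_centres ** 0.5

print(naive_se < cluster_se)  # ignoring clustering understates uncertainty
```

A multilevel model estimates the between-centre and within-centre variance components jointly, which also enables the location-specific cost-effectiveness estimates the review describes.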
Decision Making Analysis: Critical Factors-Based Methodology
2010-04-01
the pitfalls associated with current wargaming methods such as assuming a western view of rational values in decision-making regardless of the cultures...Utilization theory slightly expands the rational decision making model as it states that “actors try to maximize their expected utility by weighing the...items to categorize the decision-making behavior of political leaders which tend to demonstrate either a rational or cognitive leaning. Leaders
MacGillivray, Brian H
2017-08-01
In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. 
directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.
Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.
Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U
2015-05-01
The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
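Of the modeling approaches the review lists, Markov state-transition modeling is the most common and the easiest to sketch. The three-state cohort model below uses invented transition probabilities and utilities (not taken from any multiple myeloma study) to show how quality-adjusted life years accumulate over annual cycles.

```python
import numpy as np

# Hypothetical 3-state Markov cohort model; states are
# progression-free, progressed, dead. Probabilities and utilities
# are illustrative only.
P = np.array([[0.75, 0.20, 0.05],   # from progression-free
              [0.00, 0.80, 0.20],   # from progressed
              [0.00, 0.00, 1.00]])  # dead is absorbing
utility = np.array([0.80, 0.55, 0.0])  # quality weight per state

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts progression-free
qalys = 0.0
for year in range(20):        # 20 annual cycles
    qalys += cohort @ utility  # QALYs accrued this cycle
    cohort = cohort @ P        # advance the cohort one cycle

print(round(qalys, 2))  # undiscounted QALYs per patient
```

Comparing treatment strategies then amounts to re-running the model with strategy-specific transition probabilities and costs; real analyses add discounting and half-cycle correction.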
Decision making in asthma exacerbation: a clinical judgement analysis
Jenkins, John; Shields, Mike; Patterson, Chris; Kee, Frank
2007-01-01
Background Clinical decisions which impact directly on patient safety and quality of care are made during acute asthma attacks by individual doctors based on their knowledge and experience. Decisions include administration of systemic corticosteroids (CS) and oral antibiotics, and admission to hospital. Clinical judgement analysis provides a methodology for comparing decisions between practitioners with different training and experience, and improving decision making. Methods Stepwise linear regression was used to select clinical cues based on visual analogue scale assessments of the propensity of 62 clinicians to prescribe a short course of oral CS (decision 1), a course of antibiotics (decision 2), and/or admit to hospital (decision 3) for 60 “paper” patients. Results When compared by specialty, paediatricians' models for decision 1 were more likely to include level of alertness as a cue (54% vs 16%); for decision 2 they were more likely to include presence of crepitations (49% vs 16%) and less likely to include inhaled CS (8% vs 40%), respiratory rate (0% vs 24%) and air entry (70% vs 100%). When compared to other grades, the models derived for decision 3 by consultants/general practitioners were more likely to include wheeze severity as a cue (39% vs 6%). Conclusions Clinicians differed in their use of individual cues and the number included in their models. Patient safety and quality of care will benefit from clarification of decision‐making strategies as general learning points during medical training, in the development of guidelines and care pathways, and by clinicians developing self‐awareness of their own preferences. PMID:17428817
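The stepwise cue-selection procedure used in the study can be sketched as forward selection over simulated "paper patient" cues. Everything below is hypothetical: the cue data, the effect sizes, and the stopping threshold are invented, and real clinical judgement analysis would use each clinician's visual analogue ratings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cues for 60 "paper patients"; only cues 0 and 2 actually
# drive the simulated clinician's rating in this example.
n, n_cues = 60, 5
X = rng.normal(size=(n, n_cues))
y = 2.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(scale=0.3, size=n)

def rss(cols):
    """Residual sum of squares of an OLS fit on the selected cue columns."""
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

# Forward stepwise selection: repeatedly add the cue giving the largest
# RSS drop; stop when relative improvement falls below an arbitrary 5%.
selected, remaining = [], list(range(n_cues))
current = rss(selected)
while remaining:
    best = min(remaining, key=lambda c: rss(selected + [c]))
    new = rss(selected + [best])
    if current - new < 0.05 * current:  # stopping rule (arbitrary)
        break
    selected.append(best)
    remaining.remove(best)
    current = new

print(sorted(selected))  # should include the informative cues 0 and 2
```

Each clinician's fitted model then reveals which cues (alertness, crepitations, wheeze severity, and so on) entered their decision, which is the basis for the between-specialty comparisons in the abstract.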
Role of scientific data in health decisions.
Samuels, S W
1979-01-01
The distinction between reality and models or methodological assumptions is necessary for an understanding of the use of data--economic, technical or biological--in decision-making. The traditional modes of analysis used in decisions are discussed historically and analytically. Utilitarian-based concepts such as cost-benefit analysis and cannibalistic concepts such as "acceptable risk" are rejected on logical and moral grounds. Historical reality suggests the concept of socially necessary risk determined through the dialectic process in democracy. PMID:120251
Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel
2012-11-01
Biological incidents jeopardising public health require decision-making that consists of one dominant feature: complexity. Therefore, public health decision-makers necessitate appropriate support. Based on the analogy with business intelligence (BI) principles, the contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, the analysis of potential inputs to the framework is conducted and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Consequently, three prototypes were developed, tested and evaluated by a group of experts. Their selection was based on the overall framework scheme. Subsequently, an ontology prototype linked with an inference engine, multi-agent-based model focusing on the simulation of an environment, and expert-system prototypes were created. All prototypes proved to be utilisable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public health focused researchers and practitioners.
A dynamic model of reasoning and memory.
Hawkins, Guy E; Hayes, Brett K; Heit, Evan
2016-02-01
Previous models of category-based induction have neglected how the process of induction unfolds over time. We conceive of induction as a dynamic process and provide the first fine-grained examination of the distribution of response times observed in inductive reasoning. We used these data to develop and empirically test the first major quantitative modeling scheme that simultaneously accounts for inductive decisions and their time course. The model assumes that knowledge of similarity relations among novel test probes and items stored in memory drive an accumulation-to-bound sequential sampling process: Test probes with high similarity to studied exemplars are more likely to trigger a generalization response, and more rapidly, than items with low exemplar similarity. We contrast data and model predictions for inductive decisions with a recognition memory task using a common stimulus set. Hierarchical Bayesian analyses across 2 experiments demonstrated that inductive reasoning and recognition memory primarily differ in the threshold to trigger a decision: Observers required less evidence to make a property generalization judgment (induction) than an identity statement about a previously studied item (recognition). Experiment 1 and a condition emphasizing decision speed in Experiment 2 also found evidence that inductive decisions use lower quality similarity-based information than recognition. The findings suggest that induction might represent a less cautious form of recognition. We conclude that sequential sampling models grounded in exemplar-based similarity, combined with hierarchical Bayesian analysis, provide a more fine-grained and informative analysis of the processes involved in inductive reasoning than is possible solely through examination of choice data. PsycINFO Database Record (c) 2016 APA, all rights reserved.
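The accumulation-to-bound idea, and the paper's key threshold finding, can be illustrated with a minimal random-walk simulation. The parameters below (drift, noise, both thresholds) are invented for illustration and are not the fitted model; the point is only that a lower decision threshold yields faster responses, as reported for induction relative to recognition.

```python
import random

random.seed(42)

def simulate_rt(threshold, drift=0.05, noise=1.0, dt=1.0):
    """Steps until accumulated evidence reaches +/- threshold."""
    evidence, t = 0.0, 0
    while abs(evidence) < threshold:
        evidence += drift * dt + random.gauss(0, noise) * dt ** 0.5
        t += 1
    return t

# Induction modeled with a lower bound than recognition (hypothetical values)
induction_rts = [simulate_rt(threshold=5.0) for _ in range(500)]
recognition_rts = [simulate_rt(threshold=10.0) for _ in range(500)]

mean = lambda xs: sum(xs) / len(xs)
print(mean(induction_rts) < mean(recognition_rts))  # lower bound -> faster
```

In the actual modeling scheme the drift would be driven by exemplar similarity and the parameters estimated hierarchically; this sketch shows only the threshold mechanism.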
1995-03-01
advisory system provides a decision framework for selecting an appropriate model from the numerous available transport models conditioned on... Groundwater Modeling, Contaminant Transport, Optimization, Total Reliability, Remediation... Even with the choice of an appropriate transport model, considerable uncertainty is likely to be present in the analysis of
Hilbig, Benjamin E; Pohl, Rüdiger F
2009-09-01
According to part of the adaptive toolbox notion of decision making known as the recognition heuristic (RH), the decision process in comparative judgments—and its duration—is determined by whether recognition discriminates between objects. By contrast, some recently proposed alternative models predict that choices largely depend on the amount of evidence speaking for each of the objects and that decision times thus depend on the evidential difference between objects, or the degree of conflict between options. This article presents 3 experiments that tested predictions derived from the RH against those from alternative models. All experiments used naturally recognized objects without teaching participants any information and thus provided optimal conditions for application of the RH. However, results supported the alternative, evidence-based models and often conflicted with the RH. Recognition was not the key determinant of decision times, whereas differences between objects with respect to (both positive and negative) evidence predicted effects well. In sum, alternative models that allow for the integration of different pieces of information may well provide a better account of comparative judgments. (c) 2009 APA, all rights reserved.
Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn
2006-09-01
Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.
An uncertainty analysis of wildfire modeling [Chapter 13
Karin Riley; Matthew Thompson
2017-01-01
Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...
NASA Astrophysics Data System (ADS)
Marović, Ivan; Hanak, Tomaš
2017-10-01
In the management of construction projects, special attention should be given to planning as the most important phase of the decision-making process. Quality decision-making based on adequate and comprehensive collaboration of all involved stakeholders is crucial in a project's early stages. Fundamental reasons for the existence of this problem arise from: the specific conditions of the construction industry (final products are inseparable from their location, i.e. location has a strong influence on building design and its structural characteristics, as well as on the technology used during construction), investors' desires and attitudes, and the influence of socioeconomic and environmental aspects. Considering all of these reasons, one can conclude that selection of an adequate construction site location for a future investment is a complex, loosely structured, multi-criteria problem. To take all of these dimensions into account, a model for the selection of an adequate site location is devised. The model is based on the AHP (for designing the decision-making hierarchy) and PROMETHEE (for pairwise comparison of investment locations) methods. By combining the basic features of both methods, operational synergies can be achieved in multi-criteria decision analysis. This gives the decision-maker a sense of assurance, knowing that if the procedure proposed by the presented model has been followed, it will lead to a rational decision, carefully and systematically thought out.
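The AHP/PROMETHEE combination can be sketched in a few lines. Everything below is invented for illustration: the criteria, the pairwise comparison matrix, and the location scores are hypothetical, and the "usual" (strict) preference function stands in for whichever preference function the authors actually used.

```python
import numpy as np

# --- AHP: criteria weights from a pairwise comparison matrix ---
# Hypothetical criteria: cost, accessibility, environmental impact.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
gm = A.prod(axis=1) ** (1 / A.shape[1])  # geometric mean of each row
weights = gm / gm.sum()                  # approximates the principal eigenvector

# --- PROMETHEE II: net outranking flows over candidate locations ---
# Rows = locations, columns = criteria scores (all to be maximized; invented).
scores = np.array([[0.7, 0.4, 0.9],
                   [0.5, 0.8, 0.6],
                   [0.9, 0.3, 0.4]])
m = scores.shape[0]
phi = np.zeros(m)
for a in range(m):
    for b in range(m):
        if a == b:
            continue
        # "Usual" preference function: 1 if a beats b on a criterion, else 0
        pref_ab = (weights * (scores[a] > scores[b])).sum()
        pref_ba = (weights * (scores[b] > scores[a])).sum()
        phi[a] += (pref_ab - pref_ba) / (m - 1)

ranking = np.argsort(-phi)  # net-flow ranking, best location first
print(ranking)
```

AHP supplies the criteria weights from the decision hierarchy, and PROMETHEE uses them to rank the locations by net outranking flow; the flows always sum to zero across alternatives, which is a useful sanity check.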
Lin, Zi-Jing; Li, Lin; Cazzell, Mary; Liu, Hanli
2014-08-01
Diffuse optical tomography (DOT) is a variant of functional near infrared spectroscopy and has the capability of mapping or reconstructing three dimensional (3D) hemodynamic changes due to brain activity. Common methods used in DOT image analysis to define brain activation have limitations because the selection of the activation period is relatively subjective. General linear model (GLM)-based analysis can overcome this limitation. In this study, we combine atlas-guided 3D DOT image reconstruction with GLM-based analysis (i.e., voxel-wise GLM analysis) to investigate the brain activity that is associated with risk decision-making processes. Risk decision-making is an important cognitive process and thus an essential topic in the field of neuroscience. The Balloon Analog Risk Task (BART) is a valid experimental model and has been commonly used to assess human risk-taking actions and tendencies while facing risks. We have used the BART paradigm with a blocked design to investigate brain activations in the prefrontal and frontal cortical areas during decision-making from 37 human participants (22 males and 15 females). Voxel-wise GLM analysis was performed after a human brain atlas template and a depth compensation algorithm were combined to form atlas-guided DOT images. In this work, we demonstrate the utility of voxel-wise GLM analysis with DOT for imaging and studying cognitive functions during risk decision-making. Results have shown significant hemodynamic changes in the dorsal lateral prefrontal cortex (DLPFC) during the active-choice mode and a different activation pattern between genders; these findings correlate well with published literature in functional magnetic resonance imaging (fMRI) and fNIRS studies. Copyright © 2014 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.
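The voxel-wise GLM step can be sketched on simulated data: a blocked design regressor is convolved with a hemodynamic response function, and ordinary least squares is run independently at each voxel. All dimensions, parameters, and the gamma-shaped HRF below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated blocked design (BART-like), 200 scans, 50 voxels
n_scans, n_voxels = 200, 50
t = np.arange(n_scans, dtype=float)

boxcar = ((t // 20) % 2 == 1).astype(float)  # alternating 20-scan blocks
hrf_t = np.arange(0, 30, 1.0)
hrf = (hrf_t ** 5) * np.exp(-hrf_t)          # simple gamma-shaped HRF
hrf /= hrf.sum()
regressor = np.convolve(boxcar, hrf)[:n_scans]

X = np.column_stack([np.ones(n_scans), regressor])  # design matrix

# Simulate data: the first 10 voxels respond to the task, the rest are noise
Y = rng.normal(size=(n_scans, n_voxels))
Y[:, :10] += 2.0 * regressor[:, None]

# OLS at every voxel at once, then a t-statistic for the task contrast
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)        # shape (2, n_voxels)
resid = Y - X @ beta
dof = n_scans - X.shape[1]
sigma2 = (resid ** 2).sum(axis=0) / dof
c = np.array([0.0, 1.0])                            # contrast: task effect
var_c = c @ np.linalg.inv(X.T @ X) @ c
t_stats = (c @ beta) / np.sqrt(sigma2 * var_c)

active = np.flatnonzero(t_stats > 4.0)  # crude, uncorrected threshold
print(active)  # should recover the responsive voxels
```

This objectivity is the advantage the abstract claims: activation is defined by the fitted task regressor rather than by a hand-picked activation period. A real analysis would add motion and drift regressors and multiple-comparison correction.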
Henriques, Justin J; Louis, Garrick E
2011-01-01
Capacity Factor Analysis is a decision support system for selection of appropriate technologies for municipal sanitation services in developing communities. Developing communities are those that lack the capability to provide adequate access to one or more essential services, such as water and sanitation, to their residents. This research developed two elements of Capacity Factor Analysis: a capacity factor based classification for technologies using requirements analysis, and a matching policy for choosing technology options. First, requirements analysis is used to develop a ranking for drinking water supply and greywater reuse technologies. Second, using the Capacity Factor Analysis approach, a matching policy is developed to guide decision makers in selecting the appropriate drinking water supply or greywater reuse technology option for their community. Finally, a scenario-based informal hypothesis test is developed to assist in qualitative model validation through case study. Capacity Factor Analysis is then applied in Cimahi, Indonesia, as a form of validation. The completed Capacity Factor Analysis model will allow developing communities to select drinking water supply and greywater reuse systems that are safe, affordable, able to be built and managed by the community using local resources, and are amenable to expansion as the community's management capacity increases. Copyright © 2010 Elsevier Ltd. All rights reserved.
Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support
NASA Astrophysics Data System (ADS)
Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.
2016-12-01
Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. 
We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.
Decision tree and PCA-based fault diagnosis of rotating machinery
NASA Astrophysics Data System (ADS)
Sun, Weixiang; Chen, Jin; Li, Jiaqing
2007-04-01
After analysing the flaws of conventional fault diagnosis methods, data mining technology is introduced to the fault diagnosis field, and a new method based on the C4.5 decision tree and principal component analysis (PCA) is proposed. In this method, PCA is used to reduce features after data collection, preprocessing and feature extraction. Then, C4.5 is trained on the samples to generate a decision tree model with diagnosis knowledge. Finally, the tree model is used to perform diagnosis. To validate the proposed method, six kinds of running states (normal or without any defect, unbalance, rotor radial rub, oil whirl, shaft crack, and a simultaneous state of unbalance and radial rub) are simulated on a Bently Rotor Kit RK4 to test the C4.5 and PCA-based method against a back-propagation neural network (BPNN). The result shows that the C4.5 and PCA-based diagnosis method has higher accuracy and needs less training time than BPNN.
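The core of this pipeline, PCA for feature reduction followed by an entropy-based split, can be sketched on simulated data. A full C4.5 tree recurses and uses the gain ratio; the sketch below shows only a single information-gain split on the first principal component, with invented vibration-like features standing in for the real measurements.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two simulated machine states with 6 correlated features (hypothetical
# stand-ins for vibration measurements), so PCA can compress them.
n = 100
base0 = rng.normal(0.0, 1.0, size=(n, 1))
base1 = rng.normal(3.0, 1.0, size=(n, 1))
X = np.vstack([base0, base1]) @ rng.normal(size=(1, 6)) \
    + rng.normal(scale=0.2, size=(2 * n, 6))
y = np.array([0] * n + [1] * n)

# PCA via SVD: project onto the first principal component
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]

def entropy(labels):
    """Shannon entropy of a label array (the impurity measure C4.5 uses)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Choose the split threshold on PC1 that maximizes information gain
best_gain, best_thr = -1.0, None
for thr in np.unique(pc1):
    left, right = y[pc1 <= thr], y[pc1 > thr]
    if len(left) == 0 or len(right) == 0:
        continue
    gain = entropy(y) - (len(left) * entropy(left)
                         + len(right) * entropy(right)) / len(y)
    if gain > best_gain:
        best_gain, best_thr = gain, thr

pred = (pc1 > best_thr).astype(int)
# PC sign is arbitrary, so score the better of the two labelings
accuracy = max((pred == y).mean(), ((1 - pred) == y).mean())
print(round(best_gain, 3), round(accuracy, 3))
```

In the paper's method this split criterion is applied recursively over all reduced features to grow the full diagnosis tree; the six simulated fault classes would simply appear as six label values.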
Balk, Benjamin; Elder, Kelly
2000-01-01
We model the spatial distribution of snow across a mountain basin using an approach that combines binary decision tree and geostatistical techniques. In April 1997 and 1998, intensive snow surveys were conducted in the 6.9‐km2 Loch Vale watershed (LVWS), Rocky Mountain National Park, Colorado. Binary decision trees were used to model the large‐scale variations in snow depth, while the small‐scale variations were modeled through kriging interpolation methods. Binary decision trees related depth to the physically based independent variables of net solar radiation, elevation, slope, and vegetation cover type. These decision tree models explained 54–65% of the observed variance in the depth measurements. The tree‐based modeled depths were then subtracted from the measured depths, and the resulting residuals were spatially distributed across LVWS through kriging techniques. The kriged estimates of the residuals were added to the tree‐based modeled depths to produce a combined depth model. The combined depth estimates explained 60–85% of the variance in the measured depths. Snow densities were mapped across LVWS using regression analysis. Snow‐covered area was determined from high‐resolution aerial photographs. Combining the modeled depths and densities with a snow cover map produced estimates of the spatial distribution of snow water equivalence (SWE). This modeling approach offers improvement over previous methods of estimating SWE distribution in mountain basins.
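The two-stage depth model above (tree for large-scale structure, spatial interpolation of residuals) can be sketched in a few lines. Inverse-distance weighting is used here as a simple stand-in for kriging, and the terrain variables and coordinates are synthetic, not LVWS survey data.

```python
# Two-stage snow-depth sketch: regression tree trend + interpolated residuals.
# IDW replaces kriging for brevity; all inputs are invented placeholders.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 1, size=(n, 2))          # survey point locations
terrain = rng.normal(size=(n, 3))                # radiation, elevation, slope
depth = terrain @ np.array([0.5, 1.0, -0.3]) + rng.normal(scale=0.2, size=n)

# Stage 1: tree explains depth from physically based predictors.
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(terrain, depth)
residuals = depth - tree.predict(terrain)

def idw(query, pts, vals, power=2.0, eps=1e-9):
    """Inverse-distance-weighted interpolation of residuals at query points."""
    d = np.linalg.norm(query[:, None, :] - pts[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * vals).sum(axis=1) / w.sum(axis=1)

# Stage 2: combined estimate = tree trend + interpolated residual.
query = rng.uniform(0, 1, size=(50, 2))
query_terrain = rng.normal(size=(50, 3))
combined = tree.predict(query_terrain) + idw(query, coords, residuals)
```

A real implementation would fit a variogram and krige the residuals, which additionally yields an estimation variance that IDW does not provide.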
Capalbo, Susan M; Antle, John M; Seavert, Clark
2017-07-01
Research on next-generation agricultural systems models shows that the most important current limitation is data, both for on-farm decision support and for research investment and policy decision making. One of the greatest data challenges is to obtain reliable data on farm management decision making, both for current conditions and under scenarios of changed bio-physical and socio-economic conditions. This paper presents a framework for the use of farm-level and landscape-scale models and data to provide analysis that could be used in NextGen knowledge products, such as mobile applications or personal computer data analysis and visualization software. We describe two analytical tools - AgBiz Logic and TOA-MD - that demonstrate the current capability of farm-level and landscape-scale models. The use of these tools is explored with a case study of an oilseed crop, Camelina sativa, which could be used to produce jet aviation fuel. We conclude with a discussion of innovations needed to facilitate the use of farm and policy-level models to generate data and analysis for improved knowledge products.

Humphries Choptiany, John Michael; Pelot, Ronald
2014-09-01
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
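The combination of MCDA, utility curves, and Monte Carlo simulation described above can be illustrated with a small sketch. The candidate sites, criterion weights, score means, and uncertainties below are invented assumptions, standing in for the model's published inputs.

```python
# Hedged MCDA + Monte Carlo sketch: uncertain criterion scores for two
# hypothetical CO2 storage sites are sampled, passed through a utility
# curve, and aggregated with weights. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(42)
criteria = ["environmental", "social", "engineering", "economic"]
weights = np.array([0.3, 0.2, 0.3, 0.2])

# Mean score and uncertainty (std dev) per site, on a 0-1 scale.
sites = {
    "site_A": (np.array([0.7, 0.6, 0.8, 0.5]), 0.05),
    "site_B": (np.array([0.6, 0.8, 0.6, 0.7]), 0.10),
}

def utility(x):
    # Concave utility curve: risk-averse valuation of each criterion score.
    return np.sqrt(np.clip(x, 0.0, 1.0))

n_draws = 10_000
results = {}
for name, (mean, sd) in sites.items():
    draws = rng.normal(mean, sd, size=(n_draws, len(criteria)))
    results[name] = (utility(draws) @ weights).mean()

best = max(results, key=results.get)
```

Because the utility curve is concave, a site with larger uncertainty is penalized even at equal mean scores, which is how the Monte Carlo layer lets risk attitudes influence the ranking.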
Psychophysical Models for Signal Detection with Time Varying Uncertainty. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gai, E.
1975-01-01
Psychophysical models for the behavior of the human operator in detection tasks which include change in detectability, correlation between observations and deferred decisions are developed. Classical Signal Detection Theory (SDT) is discussed and its emphasis on the sensory processes is contrasted to decision strategies. The analysis of decision strategies utilizes detection tasks with time varying signal strength. The classical theory is modified to include such tasks and several optimal decision strategies are explored. Two methods of classifying strategies are suggested. The first method is similar to the analysis of ROC curves, while the second is based on the relation between the criterion level (CL) and the detectability. Experiments to verify the analysis of tasks with changes of signal strength are designed. The results show that subjects are aware of changes in detectability and tend to use strategies that involve changes in the CL's.
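The relation between the criterion level and detectability discussed above can be made concrete with the standard SDT computation of sensitivity d' and criterion c from hit and false-alarm rates. The rates below are illustrative, not thesis data: the two conditions have nearly equal detectability but different criterion placements.

```python
# Signal-detection sketch: d' and criterion c from hit / false-alarm rates,
# showing a criterion shift at (approximately) fixed detectability.
from statistics import NormalDist

z = NormalDist().inv_cdf  # probit transform

def dprime_and_criterion(hit_rate, fa_rate):
    d = z(hit_rate) - z(fa_rate)          # sensitivity
    c = -0.5 * (z(hit_rate) + z(fa_rate)) # criterion (0 = unbiased)
    return d, c

# Same detectability, different (neutral vs liberal) criterion levels.
d1, c1 = dprime_and_criterion(0.69, 0.31)   # neutral criterion
d2, c2 = dprime_and_criterion(0.84, 0.50)   # more liberal criterion
```

An observer tracking changes in signal strength would be expected to shift c while d' follows the stimulus, which is the pattern the thesis's second classification method examines.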
An index-based robust decision making framework for watershed management in a changing climate.
Kim, Yeonjoo; Chung, Eun-Sung
2014-03-01
This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) understand the watershed components and processes using the HSPF model; 2) identify the spatial vulnerability ranking using two indices, potential streamflow depletion (PSD) and potential water quality deterioration (PWQD); 3) quantify the residents' preferences on water management demands and calculate the watershed evaluation index, a weighted combination of PSD and PWQD; 4) set the quantitative targets for water quantity and quality; 5) develop a list of feasible alternatives; and 6) eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios; 8) quantify the alternative evaluation index, including social and hydrologic criteria, utilizing multi-criteria decision analysis methods; and 9) prioritize all options based on a minimax regret strategy for a robust decision. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision-making strategy for deep uncertainty, and thus derives a robust prioritization based on the multiple utilities of alternatives from various scenarios. In this study, the proposed procedure was applied to a Korean urban watershed which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management.
Copyright © 2013 Elsevier B.V. All rights reserved.
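The minimax-regret prioritization in step 9 of the framework above reduces to a small matrix computation. The utility matrix below is hypothetical, standing in for the alternative evaluation index scored under several climate scenarios.

```python
# Minimax-regret sketch: pick the alternative whose worst-case regret
# across climate scenarios is smallest. The utilities are invented.
import numpy as np

# Rows: alternatives; columns: climate scenarios (utilities, higher better).
U = np.array([
    [0.80, 0.55, 0.60],   # alternative 1
    [0.70, 0.65, 0.62],   # alternative 2
    [0.90, 0.40, 0.55],   # alternative 3
])

# Regret = shortfall from the best alternative under each scenario.
regret = U.max(axis=0) - U
max_regret = regret.max(axis=1)           # worst-case regret per alternative
robust_choice = int(np.argmin(max_regret))
```

Note that alternative 3 is best under the first scenario yet loses under minimax regret: robustness rewards alternatives that are never far from the best, rather than those that excel in one scenario.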
Naturalistic Decision Making for Power System Operators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.; Podmore, Robin; Robinson, Marck
2010-02-01
Motivation – Investigations of large-scale outages in the North American interconnected electric system often attribute the causes to three T’s: Trees, Training and Tools. To document and understand the mental processes used by expert operators when making critical decisions, a naturalistic decision making (NDM) model was developed. Transcripts of conversations were analyzed to reveal and assess NDM-based performance criteria. Findings/Design – An item analysis indicated that the operators’ Situation Awareness Levels, mental models, and mental simulations can be mapped at different points in the training scenario. This may identify improved training methods or analytical/visualization tools. Originality/Value – This study applies, for the first time, the concepts of Recognition Primed Decision Making, Situation Awareness Levels and Cognitive Task Analysis to the training of electric power system operators. Take-away message – The NDM approach provides a viable framework for systematic training management to accelerate learning in simulator-based training scenarios for power system operators and teams.
Multicriteria decision model for retrofitting existing buildings
NASA Astrophysics Data System (ADS)
Bostenaru Dan, B.
2003-04-01
In this paper a model to decide which buildings in an urban area should be retrofitted is presented. The model has been situated among existing ones by choosing the decision rule, criterion weighting and decision support system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multiattributive and multiobjective decision methods and especially collaborative issues. Due to the participative character of the group decision problem "retrofitting existing buildings", the decision making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively will be shown by detailing the two phases, judgemental and computational: in this case site analysis, collection and evaluation of the unmodified data, and converting survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damage in case of an earthquake, and are decisive for better seismic behaviour in case of retrofitting. The paper describes the architectural and engineering characteristics as well as the structural damage for constructions of different building ages, using the example of building types in Bucharest, Romania, in comprehensible and interdependent charts, based on field observation, reports from the 1977 earthquake and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.
NASA Astrophysics Data System (ADS)
Lin, Zi-Jing; Li, Lin; Cazzell, Marry; Liu, Hanli
2013-03-01
Functional near-infrared spectroscopy (fNIRS) is a non-invasive imaging technique which measures the hemodynamic changes that reflect brain activity. Diffuse optical tomography (DOT), a variant of fNIRS with multi-channel NIRS measurements, has demonstrated the capability of three-dimensional (3D) reconstruction of hemodynamic changes due to brain activity. The conventional method of DOT image analysis for defining brain activation is based on a paired t-test between two different states, such as resting state versus task state. However, it has a limitation because the selection of the activation and post-activation periods is relatively subjective. General linear model (GLM)-based analysis can overcome this limitation. In this study, we combine 3D DOT image reconstruction with GLM-based analysis (i.e., voxel-wise GLM analysis) to investigate the brain activity that is associated with the risk-decision-making process. Risk decision-making is an important cognitive process and thus an essential topic in the field of neuroscience. The balloon analogue risk task (BART) is a valid experimental model and has been commonly used in behavioral measures to assess human risk-taking tendency while facing risks. We have utilized the BART paradigm with a blocked design to investigate brain activations in the prefrontal and frontal cortical areas during decision-making. Voxel-wise GLM analysis was performed on 18 human participants (10 males and 8 females). In this work, we wish to demonstrate the feasibility of using voxel-wise GLM analysis to image and study cognitive functions in response to risk decision-making with DOT. Results have shown significant changes in the dorsolateral prefrontal cortex (DLPFC) during the active choice mode and a different hemodynamic pattern between genders, in good agreement with published functional magnetic resonance imaging (fMRI) and fNIRS studies.
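The voxel-wise GLM idea above amounts to fitting the same regression, voxel by voxel, of each time series on a block-design predictor. The sketch below uses synthetic data (not DOT reconstructions) and a plain boxcar regressor rather than a convolved hemodynamic response, which is a simplifying assumption.

```python
# Minimal voxel-wise GLM: regress every voxel's time series on a block
# design at once and read activation off the task beta. Data are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_t, n_vox = 120, 50

# Block design: alternating 20-sample rest/task blocks, plus an intercept.
task = np.tile(np.r_[np.zeros(20), np.ones(20)], 3)
X = np.column_stack([np.ones(n_t), task])       # design matrix (n_t x 2)

# Synthetic voxels: the first 10 "activate" with amplitude 1.0.
beta_true = np.zeros(n_vox)
beta_true[:10] = 1.0
Y = X @ np.vstack([np.zeros(n_vox), beta_true]) \
    + rng.normal(scale=0.5, size=(n_t, n_vox))

# Ordinary least squares for all voxels simultaneously.
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
task_betas = beta_hat[1]                        # estimated task effect per voxel
```

A full analysis would convolve the design with a hemodynamic response function and form t-statistics from the residual variance; the estimation step itself is the same least-squares solve.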
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim
2017-10-01
A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-modeling approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement and modeling data in the framework of the multi-model approach is described. The methodology and models for risk assessment in the framework of a decision support approach are defined and described. A method of water quality assessment using satellite observation data, based on analysis of the spectral reflectance of aquifers, is described. Spectral signatures of freshwater bodies and offshore waters are analyzed. Correlations between spectral reflectance, pollution and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized, verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support in the field of water quality degradation risk is discussed. The decision on water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimating the water quality degradation rate and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
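The fuzzy water-quality categorization described above can be sketched with triangular membership functions over a reflectance-derived pollution index, choosing the category by maximum membership. The category names and breakpoints below are invented for illustration, not the paper's calibrated values.

```python
# Fuzzy classification sketch: triangular memberships over a pollution
# index; the category with the largest membership wins. Values invented.
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def water_quality_category(index):
    memberships = {
        "good":     tri(index, -0.2, 0.0, 0.5),
        "degraded": tri(index, 0.2, 0.5, 0.8),
        "polluted": tri(index, 0.5, 1.0, 1.2),
    }
    return max(memberships, key=memberships.get), memberships

category, mu = water_quality_category(0.55)
```

The overlap between adjacent membership functions is what lets the algorithm make a graded decision from uncertain inputs instead of a hard threshold.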
NASA Astrophysics Data System (ADS)
Estuar, Maria Regina Justina; Victorino, John Noel; Coronel, Andrei; Co, Jerelyn; Tiausas, Francis; Señires, Chiara Veronica
2017-09-01
The use of wireless sensor networks and smartphone integration to monitor environmental parameters surrounding plantations is made possible by readily available and affordable sensors. Providing low-cost monitoring devices would be beneficial, especially to small farm owners, in a developing country like the Philippines, where agriculture covers a significant share of the labor market. This study discusses the integration of wireless soil sensor devices and smartphones to create an application that uses multidimensional analysis to detect the presence or absence of plant disease. Specifically, soil sensors are designed to collect soil quality parameters in a sink node, from which the smartphone collects data via Bluetooth. Given these, there is a need to develop a classification model on the mobile phone that will report the infection status of a soil. Though tree classification is the most appropriate approach for continuous parameter-based datasets, there is a need to determine whether tree models will yield coherent results. Soil sensor data residing on the phone are modeled using several variations of decision tree, namely: decision tree (DT), best-fit (BF) decision tree, functional tree (FT), Naive Bayes (NB) decision tree, J48, J48graft and LAD tree, where the decision tree approaches the problem by considering all sensor nodes as one. Results show that there are significant differences among soil sensor parameters, indicating variance in scores between the infected and uninfected sites. Furthermore, analysis of variance in accuracy, recall, precision and F1-measure scores shows homogeneity among the NBTree, J48graft and J48 tree classification models.
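The model-comparison step above can be sketched with scikit-learn stand-ins, since the named variants (J48, BFTree, NBTree, etc.) are Weka-specific. Two tree configurations and naive Bayes are scored with cross-validated F1 on synthetic soil-sensor readings labeled infected/uninfected; the data and model set are assumptions for illustration.

```python
# Comparing classifier variants by cross-validated F1, as a stand-in for
# the Weka tree comparison described above. Data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 240
X = rng.normal(size=(n, 5))                 # soil-quality parameters
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n) > 0).astype(int)

models = {
    "tree_full":   DecisionTreeClassifier(random_state=0),
    "tree_pruned": DecisionTreeClassifier(max_depth=3, random_state=0),
    "naive_bayes": GaussianNB(),
}
f1 = {name: cross_val_score(m, X, y, cv=5, scoring="f1").mean()
      for name, m in models.items()}
```

An analysis-of-variance test on the per-fold scores (e.g. `scipy.stats.f_oneway`) would then tell whether the models' performances are statistically homogeneous, which is the paper's reported finding for its tree variants.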
The management of patients with T1 adenocarcinoma of the low rectum: a decision analysis.
Johnston, Calvin F; Tomlinson, George; Temple, Larissa K; Baxter, Nancy N
2013-04-01
Decision making for patients with T1 adenocarcinoma of the low rectum, when treatment options are limited to a transanal local excision or abdominoperineal resection, is challenging. The aim of this study was to develop a contemporary decision analysis to assist patients and clinicians in balancing the goals of maximizing life expectancy and quality of life in this situation. We constructed a Markov-type microsimulation in open-source software. Recurrence rates and quality-of-life parameters were elicited by systematic literature reviews. Sensitivity analyses were performed on key model parameters. Our base case for analysis was a 65-year-old man with low-lying T1N0 rectal cancer. We determined the sensitivity of our model for sex, age up to 80, and T stage. The main outcome measured was quality-adjusted life-years. In the base case, selecting transanal local excision over abdominoperineal resection resulted in a loss of 0.53 years of life expectancy but a gain of 0.97 quality-adjusted life-years. One-way sensitivity analysis demonstrated a health state utility value threshold for permanent colostomy of 0.93. This value ranged from 0.88 to 1.0 based on tumor recurrence risk. There were no other model sensitivities. Some model parameter estimates were based on weak data. In our model, transanal local excision was found to be the preferable approach for most patients. An abdominoperineal resection has a 3.5% longer life expectancy, but this advantage is lost when the quality-of-life reduction reported by stoma patients is weighed in. The minority group in whom abdominoperineal resection is preferred are those who are unwilling to sacrifice 7% of their life expectancy to avoid a permanent stoma. This is estimated to be approximately 25% of all patients. The threshold increases to 12% of life expectancy in high-risk tumors. No other factors are found to be relevant to the decision.
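The Markov-type microsimulation above can be illustrated with a toy cohort model: annual transitions among well / recurrence / dead states, with state utilities weighting each year lived. All probabilities and utilities below are invented placeholders, not the paper's systematically reviewed estimates.

```python
# Toy Markov cohort sketch for one (hypothetical) treatment strategy:
# accumulate life-years and QALYs over annual cycles. Numbers invented.
import numpy as np

states = ["well", "recurrence", "dead"]
P = np.array([
    [0.93, 0.04, 0.03],   # from well
    [0.00, 0.70, 0.30],   # from recurrence
    [0.00, 0.00, 1.00],   # dead is absorbing
])
utilities = np.array([0.90, 0.60, 0.0])   # quality weight per state-year

cohort = np.array([1.0, 0.0, 0.0])        # start everyone in "well"
life_years = 0.0
qalys = 0.0
for _ in range(35):                        # 35 annual cycles from age 65
    cohort = cohort @ P
    life_years += cohort[:2].sum()
    qalys += cohort @ utilities
```

Running such a model once per strategy (local excision vs abdominoperineal resection, each with its own transition probabilities and stoma-related utilities) and comparing QALYs is exactly the trade-off the abstract quantifies: one strategy can win on life-years yet lose on QALYs.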
MRI-based decision tree model for diagnosis of biliary atresia.
Kim, Yong Hee; Kim, Myung-Joon; Shin, Hyun Joo; Yoon, Haesung; Han, Seok Joo; Koh, Hong; Roh, Yun Ho; Lee, Mi-Jung
2018-02-23
To evaluate MRI findings and to generate a decision tree model for diagnosis of biliary atresia (BA) in infants with jaundice. We retrospectively reviewed features of MRI and ultrasonography (US) performed in infants with jaundice between January 2009 and June 2016 under approval of the institutional review board, including the maximum diameter of periportal signal change on MRI (MR triangular cord thickness, MR-TCT) or US (US-TCT), visibility of common bile duct (CBD) and abnormality of gallbladder (GB). Hepatic subcapsular flow was reviewed on Doppler US. We performed conditional inference tree analysis using MRI findings to generate a decision tree model. A total of 208 infants were included, 112 in the BA group and 96 in the non-BA group. Mean age at the time of MRI was 58.7 ± 36.6 days. Visibility of CBD, abnormality of GB and MR-TCT were good discriminators for the diagnosis of BA and the MRI-based decision tree using these findings with MR-TCT cut-off 5.1 mm showed 97.3 % sensitivity, 94.8 % specificity and 96.2 % accuracy. MRI-based decision tree model reliably differentiates BA in infants with jaundice. MRI can be an objective imaging modality for the diagnosis of BA. • MRI-based decision tree model reliably differentiates biliary atresia in neonatal cholestasis. • Common bile duct, gallbladder and periportal signal changes are the discriminators. • MRI has comparable performance to ultrasonography for diagnosis of biliary atresia.
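A decision rule built from the abstract's three discriminators can be rendered as nested conditions. Note this is a hypothetical encoding: the abstract gives the discriminators and the 5.1 mm MR-TCT cut-off but not the fitted tree's structure, so the rule ordering and the toy validation cases below are assumptions.

```python
# Hypothetical MRI decision rule for biliary atresia (BA) using the three
# reported discriminators. The actual conditional-inference tree structure
# is not specified in the abstract; ordering and cases are invented.
def predict_ba(cbd_visible, gb_abnormal, tct_mm):
    """Return True if findings suggest biliary atresia (BA)."""
    if not cbd_visible:
        return True
    if gb_abnormal:
        return True
    return tct_mm >= 5.1

# (cbd_visible, gb_abnormal, tct_mm, true_label) -- toy validation cases.
cases = [
    (False, True, 6.0, True),
    (False, False, 4.0, True),
    (True, False, 5.5, True),
    (True, False, 3.2, False),
    (True, False, 4.9, False),
    (True, True, 2.0, True),
]
preds = [predict_ba(c, g, t) for c, g, t, _ in cases]
labels = [lab for *_, lab in cases]
tp = sum(p and l for p, l in zip(preds, labels))
tn = sum((not p) and (not l) for p, l in zip(preds, labels))
sensitivity = tp / sum(labels)
specificity = tn / (len(labels) - sum(labels))
```

On a real validation set, computing sensitivity and specificity this way from the confusion matrix is how the reported 97.3 %/94.8 % figures would be obtained.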
Decision Aid Use in Primary Care: An Overview and Theory-Based Framework.
Shultz, Cameron G; Jimbo, Masahito
2015-10-01
Increasing patients' participation in health care is a commonly cited goal. While patient decision aids can promote participation, they remain underutilized. Theory-based models that assess barriers and facilitators to sustained decision aid use are needed. The ready, willing, and able model specifies three preconditions for behavioral change. We present a descriptive analysis of the uptake of patient decision aids in the primary care setting and show how the ready, willing, and able model can be used to identify potential barriers and facilitators. An Ovid Medline literature search from January 2004 to November 2014 was used; additional sources were identified from reference lists and through peer consultations. Barriers and facilitators to decision aid use were identified and grouped into salient themes. The ready, willing, and able model provided a simple yet practical framework for identifying the mechanisms that facilitate (or work against) the adoption of patient decision aids within primary care. While time was a prominent barrier, additional barriers such as perceived legitimacy, clinic capacity, processes of care, and the overarching health care environment were also noted. The ready, willing, and able model posits that several preconditions must first be satisfied before sustained use of patient decision aids can take hold. By pinpointing bottlenecks, the model can inform policies and tailored interventions to target identified problems. Using the model to troubleshoot for bottlenecks prior to the implementation of a decision aid could help to improve uptake and sustained use within the primary care setting.
Controlling Chronic Diseases Through Evidence-Based Decision Making: A Group-Randomized Trial.
Brownson, Ross C; Allen, Peg; Jacob, Rebekah R; deRuyter, Anna; Lakshman, Meenakshi; Reis, Rodrigo S; Yan, Yan
2017-11-30
Although practitioners in state health departments are ideally positioned to implement evidence-based interventions, few studies have examined how to build their capacity to do so. The objective of this study was to explore how to increase the use of evidence-based decision-making processes at both the individual and organization levels. We conducted a 2-arm, group-randomized trial with baseline data collection and follow-up at 18 to 24 months. Twelve state health departments were paired and randomly assigned to intervention or control condition. In the 6 intervention states, a multiday training on evidence-based decision making was conducted from March 2014 through March 2015 along with a set of supplemental capacity-building activities. Individual-level outcomes were evidence-based decision making skills of public health practitioners; organization-level outcomes were access to research evidence and participatory decision making. Mixed analysis-of-covariance models were used to evaluate the intervention effect, accounting for the cluster-randomized trial design. Analysis was performed from March through May 2017. Participation 18 to 24 months after initial training was 73.5%. In mixed models adjusted for participant and state characteristics, the intervention group improved significantly in the overall skill gap (P = .01) and in 6 skill areas. Among the 4 organizational variables, only access to evidence and skilled staff showed an intervention effect (P = .04). Tailored and active strategies are needed to build capacity at the individual and organization levels for evidence-based decision making. Our study suggests several dissemination interventions for consideration by leaders seeking to improve public health practice.
Use of Inverse Reinforcement Learning for Identity Prediction
NASA Technical Reports Server (NTRS)
Hayes, Roy; Bao, Jonathan; Beling, Peter; Horowitz, Barry
2011-01-01
We adopt Markov Decision Processes (MDP) to model sequential decision problems, which have the characteristic that the current decision made by a human decision maker has an uncertain impact on future opportunity. We hypothesize that the individuality of decision makers can be modeled as differences in the reward function under a common MDP model. A machine learning technique, Inverse Reinforcement Learning (IRL), was used to learn an individual's reward function based on limited observation of his or her decision choices. This work serves as an initial investigation for using IRL to analyze decision making, conducted through a human experiment in a cyber shopping environment. Specifically, the ability to determine the demographic identity of users is conducted through prediction analysis and supervised learning. The results show that IRL can be used to correctly identify participants, at a rate of 68% for gender and 66% for one of three college major categories.
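The premise above, that individuality lives in the reward function of a shared MDP, can be shown with a minimal example: the same transition dynamics produce different optimal policies (hence different observable choices) under different reward weights. The 2-state, 2-action MDP is invented for illustration; full IRL, recovering the rewards from observed choices, is beyond this snippet.

```python
# Shared MDP, individual rewards: value iteration yields a different
# optimal policy for each reward vector. Dynamics are invented.
import numpy as np

n_states, n_actions, gamma = 2, 2, 0.9
# P[a, s, s'] -- transition probabilities, shared by all decision makers.
P = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # action 0
    [[0.4, 0.6], [0.6, 0.4]],   # action 1
])

def optimal_policy(reward):
    """Value iteration; reward is per-state, individuality lives here."""
    V = np.zeros(n_states)
    for _ in range(500):
        Q = reward + gamma * (P @ V)        # Q[a, s]
        V = Q.max(axis=0)
    return Q.argmax(axis=0)                  # best action per state

policy_a = optimal_policy(np.array([1.0, 0.0]))   # values state 0
policy_b = optimal_policy(np.array([0.0, 1.0]))   # values state 1
```

IRL runs this logic in reverse: given observed state-action pairs, it searches for reward parameters under which the observed policy is (near-)optimal, which is how the study distinguishes participants.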
Miller, W B; Pasta, D J
2001-01-01
In this study we develop and then test a couple model of contraceptive method choice decision-making following a pregnancy scare. The central constructs in our model are satisfaction with one's current method and confidence in the use of it. Downstream in the decision sequence, satisfaction and confidence predict desires and intentions to change methods. Upstream they are predicted by childbearing motivations, contraceptive attitudes, and the residual effects of the couples' previous method decisions. We collected data from 175 mostly unmarried and racially/ethnically diverse couples who were seeking pregnancy tests. We used LISREL and its latent variable capacity to estimate a structural equation model of the couple decision-making sequence leading to a change (or not) in contraceptive method. Results confirm most elements in our model and demonstrate a number of important cross-partner effects. Almost one-half of the sample had positive pregnancy tests and the base model fitted to this subsample indicates less accuracy in partner perception and greater influence of the female partner on method change decision-making. The introduction of some hypothesis-generating exogenous variables to our base couple model, together with some unexpected findings for the contraceptive attitude variables, suggest interesting questions that require further exploration.
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greitzer, Frank L.; Podmore, Robin
2008-11-17
The focus of the present study is on improved training approaches to accelerate learning and improved methods for analyzing effectiveness of tools within a high-fidelity power grid simulated environment. A theory-based model has been developed to document and understand the mental processes that an expert power system operator uses when making critical decisions. The theoretical foundation for the method is based on the concepts of situation awareness, the methods of cognitive task analysis, and the naturalistic decision making (NDM) approach of Recognition Primed Decision Making. The method has been systematically explored and refined as part of a capability demonstration of a high-fidelity real-time power system simulator under normal and emergency conditions. To examine NDM processes, we analyzed transcripts of operator-to-operator conversations during the simulated scenario to reveal and assess NDM-based performance criteria. The results of the analysis indicate that the proposed framework can be used constructively to map or assess the Situation Awareness Level of the operators at each point in the scenario. We can also identify the mental models and mental simulations that the operators employ at different points in the scenario. This report documents the method, describes elements of the model, and provides appendices that document the simulation scenario and the associated mental models used by operators in the scenario.
An Agent-Based Model of Farmer Decision Making in Jordan
NASA Astrophysics Data System (ADS)
Selby, Philip; Medellin-Azuara, Josue; Harou, Julien; Klassert, Christian; Yoon, Jim
2016-04-01
We describe an agent-based hydro-economic model of groundwater-irrigated agriculture in the Jordan Highlands. The model employs a Multi-Agent Simulation (MAS) framework and is designed to evaluate direct and indirect outcomes of climate change scenarios and policy interventions on farmer decision making, including annual land use, groundwater use for irrigation, and water sales to a water tanker market. Land use and water use decisions are simulated for groups of farms, grouped by location and by behavioural and economic similarity. Decreasing groundwater levels, and the associated increase in pumping costs, are important drivers of change within Jordan's agricultural sector. We describe how this is considered by coupling agricultural and groundwater models. The agricultural production model employs Positive Mathematical Programming (PMP), a method for calibrating agricultural production functions to observed planted areas. PMP has successfully been used with disaggregate models for policy analysis. We adapt the PMP approach to allow explicit evaluation of the impact of pumping costs, groundwater purchase fees and a water tanker market. The work demonstrates the applicability of agent-based agricultural decision-making assessment in the Jordan Highlands and its integration with agricultural model calibration methods. The proposed approach is designed and implemented in software such that it can be used to evaluate a variety of physical and human influences on decision making in agricultural water management.
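The PMP idea above can be illustrated in its simplest self-calibrating form: quadratic cost terms are chosen so that each crop's profit-maximizing area reproduces the observed planted area, after which a pumping-cost increase shifts the optimum. The crops, prices, costs, and areas below are invented, not Jordan Highlands data, and a real PMP model would add land and water constraints.

```python
# Minimal PMP-style calibration sketch: pick quadratic cost slopes b so
# observed areas satisfy the first-order conditions, then perturb costs.
import numpy as np

crops = ["tomato", "olive"]
price = np.array([900.0, 600.0])       # revenue per hectare
lin_cost = np.array([400.0, 250.0])    # linear cost per hectare
x_obs = np.array([120.0, 80.0])        # observed planted areas (ha)

# Calibration: choose b so that  price - lin_cost - b * x = 0  at x = x_obs.
b = (price - lin_cost) / x_obs

def optimal_area(water_cost_per_ha=0.0):
    """Unconstrained profit-maximizing area per crop, given pumping cost."""
    return np.maximum((price - lin_cost - water_cost_per_ha) / b, 0.0)

x_base = optimal_area(0.0)             # reproduces observed areas
x_pumping = optimal_area(100.0)        # higher pumping cost shrinks areas
```

This is the mechanism that lets the coupled model respond smoothly to falling groundwater levels: rising pumping cost enters the first-order condition and contracts irrigated area crop by crop.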
A diffusion decision model analysis of evidence variability in the lexical decision task.
Tillman, Gabriel; Osth, Adam F; van Ravenzwaaij, Don; Heathcote, Andrew
2017-12-01
The lexical-decision task is among the most commonly used paradigms in psycholinguistics. In both the signal-detection theory and Diffusion Decision Model (DDM; Ratcliff, Gomez, & McKoon, Psychological Review, 111, 159-182, 2004) frameworks, lexical-decisions are based on a continuous source of word-likeness evidence for both words and non-words. The Retrieving Effectively from Memory model of Lexical-Decision (REM-LD; Wagenmakers et al., Cognitive Psychology, 48(3), 332-367, 2004) provides a comprehensive explanation of lexical-decision data and makes the prediction that word-likeness evidence is more variable for words than non-words and that higher frequency words are more variable than lower frequency words. To test these predictions, we analyzed five lexical-decision data sets with the DDM. For all data sets, drift-rate variability changed across word frequency and non-word conditions. For the most part, REM-LD's predictions about the ordering of evidence variability across stimuli in the lexical-decision task were confirmed.
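The drift-rate variability at issue above can be made concrete with a small Euler-Maruyama simulation of the DDM: evidence accumulates toward word/non-word bounds, with the trial-level drift drawn from a normal distribution. Parameter values are illustrative, not fitted to any of the five data sets.

```python
# DDM simulation sketch with across-trial drift variability (eta).
# Parameters are illustrative; bounds at 0 and a, start point z*a.
import numpy as np

rng = np.random.default_rng(11)

def simulate_ddm(drift_mean, drift_sd, n_trials=500,
                 a=1.0, z=0.5, s=1.0, dt=0.005, t_max=2.0):
    """Return (choices, rts); choice 1 = upper ("word") bound."""
    choices, rts = [], []
    for _ in range(n_trials):
        v = rng.normal(drift_mean, drift_sd)   # trial-level drift
        x, t = z * a, 0.0
        while 0.0 < x < a and t < t_max:
            x += v * dt + s * np.sqrt(dt) * rng.normal()
            t += dt
        choices.append(1 if x >= a else 0)
        rts.append(t)
    return np.array(choices), np.array(rts)

# Higher drift variability (as REM-LD predicts for higher-frequency words)
# should lower accuracy relative to a low-variability condition.
ch_low, rt_low = simulate_ddm(drift_mean=2.0, drift_sd=0.5)
ch_high, rt_high = simulate_ddm(drift_mean=2.0, drift_sd=2.0)
```

Fitting the DDM to data inverts this generative process, estimating the drift-variability parameter per condition, which is how the paper tests REM-LD's ordering predictions.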
Applicability of aquifer impact models to support decisions at CO2 sequestration sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keating, Elizabeth; Bacon, Diana; Carroll, Susan
2016-09-01
The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites (www.netldoe.gov/nrap). This capability includes polynomial- or look-up-table-based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014; Dai et al., 2014; Keating et al., 2015). The ROMs reproduce the ensemble behavior of large numbers of simulations and are well suited to applications that consider a large number of scenarios to understand parameter sensitivity and the uncertainty in the risk of CO2 leakage to groundwater quality. In this paper, we seek to demonstrate the applicability of ROM-based ensemble analysis by considering which types of decisions and aquifer types would benefit from the ROM analysis. We present four hypothetical examples where applying ROMs, in ensemble mode, could support decisions in the early stages of a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that the derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so may have broader applicability. We conclude that the pH and TDS predictions are the most transferable to other aquifers, based on the analysis of the nine water quality metrics (pH, TDS, 4 trace metals, 3 organic compounds). Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
Mo, Shaobo; Dai, Weixing; Xiang, Wenqiang; Li, Qingguo; Wang, Renjie; Cai, Guoxiang
2018-05-03
The objective of this study was to summarize the clinicopathological and molecular features of synchronous colorectal peritoneal metastases (CPM). We then combined clinical and pathological variables associated with synchronous CPM into a nomogram and confirmed its utility using decision curve analysis. Synchronous metastatic colorectal cancer (mCRC) patients who received primary tumor resection and underwent KRAS, NRAS, and BRAF gene mutation detection at our center from January 2014 to September 2015 were included in this retrospective study. An analysis was performed to investigate the clinicopathological and molecular features for independent risk factors of synchronous CPM and to subsequently develop a nomogram for synchronous CPM based on multivariate logistic regression. Model performance was quantified in terms of calibration and discrimination. We studied the utility of the nomogram using decision curve analysis. In total, 226 patients were diagnosed with synchronous mCRC, of whom 50 patients (22.1%) presented with CPM. After uni- and multivariate analysis, a nomogram was built based on tumor site, histological type, age, and T4 status. The model had good discrimination, with an area under the curve (AUC) of 0.777 (95% CI 0.703-0.850), and adequate calibration. By decision curve analysis, the model was shown to be relevant between thresholds of 0.10 and 0.66. Synchronous CPM is more likely to occur in patients aged ≤60 years and in those with right-sided primary lesions, signet ring cell cancer, or T4-stage tumors. This is the first nomogram to predict synchronous CPM. To ensure generalizability, this model needs to be externally validated. Copyright © 2018 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Quinta-Nova, Luis; Fernandez, Paulo; Pedro, Nuno
2017-12-01
This work focuses on developing a decision support system based on multicriteria spatial analysis to assess the potential for generating biomass residues from forestry sources in a region of Portugal (Beira Baixa). A set of environmental, economic and social criteria was defined, evaluated and weighted in the context of Saaty's analytic hierarchies. The best alternatives were obtained after applying the Analytic Hierarchy Process (AHP). The model was applied to the central region of Portugal, where forest and agriculture are the most representative land uses. Finally, a sensitivity analysis of the set of factors and their associated weights was performed to test the robustness of the model. The proposed evaluation model provides a valuable reference for decision makers in establishing a standardized means of selecting the optimal location for new biomass plants.
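The AHP weighting step can be sketched as follows. The pairwise comparison matrix below is an invented example on Saaty's 1-9 scale and does not reproduce the study's expert judgments; the criterion weights come from the principal eigenvector, and the consistency ratio checks the coherence of the judgments.

```python
import numpy as np

# Pairwise comparison matrix for three criteria (environmental,
# economic, social) on Saaty's 1-9 scale. Values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# The principal eigenvector of A gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()  # normalise to sum to 1

# Consistency ratio: CI = (lambda_max - n)/(n - 1), divided by
# Saaty's random index RI (0.58 for n = 3). CR < 0.1 is acceptable.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = 0.58
CR = CI / RI

print(w)         # largest weight goes to the first criterion
print(CR < 0.1)  # judgments are acceptably consistent
```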
Wolfslehner, Bernhard; Seidl, Rupert
2010-12-01
The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.
Van Dessel, E; Fierens, K; Pattyn, P; Van Nieuwenhove, Y; Berrevoet, F; Troisi, R; Ceelen, W
2009-01-01
Approximately 5%-20% of colorectal cancer (CRC) patients present with synchronous, potentially resectable liver metastatic disease. Preclinical and clinical studies suggest a benefit of the 'liver first' approach, i.e. resection of the liver metastasis followed by resection of the primary tumour. A formal decision analysis may support a rational choice between the several therapy options. Survival and morbidity data were retrieved from relevant clinical studies identified by a Web of Science search. Data were entered into decision analysis software (TreeAge Pro 2009, Williamstown, MA, USA). Transition probabilities, including the risk of death from complications or disease progression associated with individual therapy options, were entered into the model. Sensitivity analysis was performed to evaluate the model's validity under a variety of assumptions. The result of the decision analysis confirms the superiority of the 'liver first' approach. Sensitivity analysis demonstrated that this conclusion is valid on condition that the mortality associated with hepatectomy first is < 4.5%, and that the mortality of colectomy performed after hepatectomy is < 3.2%. The results of this decision analysis suggest that, in patients with synchronous resectable colorectal liver metastases, the 'liver first' approach is to be preferred. Randomized trials will be needed to confirm the results of this simulation-based outcome.
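The rollback computation that a package like TreeAge performs can be sketched directly. All probabilities and payoffs below are invented placeholders, not the pooled values from the review; the point is only the expected-value comparison of two strategies at a chance node.

```python
# Expected-value rollback of a two-option decision tree
# ('liver first' vs. 'primary first'). Probabilities and payoffs
# (here, months of life expectancy) are illustrative assumptions.

def expected_value(branches):
    """branches: list of (probability, payoff) pairs for one chance node."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * v for p, v in branches)

liver_first = expected_value([
    (0.04, 0.0),   # perioperative mortality
    (0.60, 40.0),  # both resections completed
    (0.36, 18.0),  # progression prevents the second resection
])
primary_first = expected_value([
    (0.03, 0.0),
    (0.50, 38.0),
    (0.47, 16.0),
])

# The strategy with the higher expected payoff is preferred.
best = max([('liver first', liver_first), ('primary first', primary_first)],
           key=lambda t: t[1])
print(best[0], round(best[1], 1))
```

Sensitivity analysis, as in the abstract, amounts to re-running this rollback while sweeping the mortality probabilities and recording where the preferred strategy flips.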
Exploring model based engineering for large telescopes: getting started with descriptive models
NASA Astrophysics Data System (ADS)
Karban, R.; Zamparelli, M.; Bauvir, B.; Koehler, B.; Noethe, L.; Balestra, A.
2008-07-01
Large telescopes pose a continuous challenge to systems engineering due to their complexity in terms of requirements, operational modes, long duty lifetime, interfaces and number of components. A multitude of decisions must be taken throughout the life cycle of a new system, and a prime means of coping with complexity and uncertainty is using models as one decision aid. The potential of descriptive models based on the OMG Systems Modeling Language (OMG SysML™) is examined in different areas: building a comprehensive model serves as the basis for subsequent activities of soliciting and review for requirements, analysis and design alike. Furthermore, a model is an effective communication instrument against misinterpretation pitfalls which are typical of cross-disciplinary activities when using natural language only or free-format diagrams. Modeling the essential characteristics of the system, such as interfaces, system structure and behavior, addresses important system-level issues. Also shown is how to use a model as an analysis tool to describe the relationships among disturbances, opto-mechanical effects and control decisions and to refine the control use cases. Considerations on the scalability of the model structure and organization, its impact on the development process, the relation to document-centric structures, style and usage guidelines and the required tool chain are presented.
Tsalatsanis, Athanasios; Hozo, Iztok; Vickers, Andrew; Djulbegovic, Benjamin
2010-09-16
Decision curve analysis (DCA) has been proposed as an alternative method for evaluation of diagnostic tests, prediction models, and molecular markers. However, DCA is based on expected utility theory, which has been routinely violated by decision makers. Decision making is governed by intuition (system 1) and by an analytical, deliberative process (system 2); thus, rational decision making should reflect both formal principles of rationality and intuition about good decisions. We use the cognitive emotion of regret to serve as a link between systems 1 and 2 and to reformulate DCA. First, we analysed a classic decision tree describing three decision alternatives: treat, do not treat, and treat or do not treat based on a predictive model. We then computed the expected regret for each of these alternatives as the difference between the utility of the action taken and the utility of the action that, in retrospect, should have been taken. For any pair of strategies, we measured the difference in net expected regret. Finally, we employed the concept of acceptable regret to identify the circumstances under which a potentially wrong strategy is tolerable to a decision maker. We developed a novel dual visual analog scale to describe the relationship between regret associated with "omissions" (e.g. failure to treat) vs. "commissions" (e.g. treating unnecessarily) and the decision maker's preferences as expressed in terms of threshold probability. We then proved that the Net Expected Regret Difference, first presented in this paper, is equivalent to net benefit as described in the original DCA. Based on the concept of acceptable regret, we identified the circumstances under which a decision maker tolerates a potentially wrong decision and expressed them in terms of probability of disease. We present a novel method for eliciting decision makers' preferences and an alternative derivation of DCA based on regret theory. 
Our approach may be intuitively more appealing to a decision-maker, particularly in those clinical situations when the best management option is the one associated with the least amount of regret (e.g. diagnosis and treatment of advanced cancer, etc).
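Since the paper shows its Net Expected Regret Difference to be equivalent to the net benefit of conventional DCA, the underlying quantity is easy to sketch. The counts below are invented: net benefit at threshold probability pt weighs true positives against false positives at the exchange rate pt/(1 - pt).

```python
# Net benefit at threshold probability pt, as used in decision curve
# analysis: NB = TP/n - (FP/n) * pt / (1 - pt). Counts are illustrative.

def net_benefit(tp, fp, n, pt):
    return tp / n - (fp / n) * pt / (1.0 - pt)

n = 1000
# Hypothetical model: at pt = 0.2 it flags 300 patients, 150 truly diseased.
nb_model = net_benefit(tp=150, fp=150, n=n, pt=0.2)
# 'Treat all' in a cohort with 200 diseased patients:
nb_all = net_benefit(tp=200, fp=800, n=n, pt=0.2)
# 'Treat none' has net benefit zero by definition.
nb_none = 0.0

print(round(nb_model, 4), round(nb_all, 4))
```

Comparing these three curves across a range of pt values is exactly the plot a decision curve analysis produces; the regret formulation ranks the same strategies in the same order.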
Portfolio Decisions and Brain Reactions via the CEAD method.
Majer, Piotr; Mohr, Peter N C; Heekeren, Hauke R; Härdle, Wolfgang K
2016-09-01
Decision making can be a complex process requiring the integration of several attributes of choice options. Understanding the neural processes underlying (uncertain) investment decisions is an important topic in neuroeconomics. We analyzed functional magnetic resonance imaging (fMRI) data from an investment decision study for stimulus-related effects. We propose a new technique for identifying activated brain regions: cluster, estimation, activation, and decision method. Our analysis is focused on clusters of voxels rather than voxel units. Thus, we achieve a higher signal-to-noise ratio within the unit tested and a smaller number of hypothesis tests compared with the often used General Linear Model (GLM). We propose to first conduct the brain parcellation by applying spatially constrained spectral clustering. The information within each cluster can then be extracted by the flexible dynamic semiparametric factor model (DSFM) dimension reduction technique and finally be tested for differences in activation between conditions. This sequence of Cluster, Estimation, Activation, and Decision admits a model-free analysis of the local fMRI signal. Applying a GLM on the DSFM-based time series resulted in a significant correlation between the risk of choice options and changes in fMRI signal in the anterior insula and dorsomedial prefrontal cortex. Additionally, individual differences in decision-related reactions within the DSFM time series predicted individual differences in risk attitudes as modeled with the framework of the mean-variance model.
ERIC Educational Resources Information Center
Hall, John S.
This review analyzes the trend in educational decision making to replace hierarchical authority structures with more rational models for decision making drawn from management science. Emphasis is also placed on alternatives to a hierarchical decision-making model, including governing models, union models, and influence models. A 54-item…
Modeling Hospital Discharge and Placement Decision Making: Whither the Elderly.
ERIC Educational Resources Information Center
Clark, William F.; Pelham, Anabel O.
This paper examines the hospital discharge decision making process for elderly patients, based on observations of the operations of a long term care agency, the California Multipurpose Senior Services Project. The analysis is divided into four components: actors, factors, processes, and strategy critique. The first section discusses the major…
Risk-Based Prioritization of Research for Aviation Security Using Logic-Evolved Decision Analysis
NASA Technical Reports Server (NTRS)
Eisenhawer, S. W.; Bott, T. F.; Sorokach, M. R.; Jones, F. P.; Foggia, J. R.
2004-01-01
The National Aeronautics and Space Administration is developing advanced technologies to reduce terrorist risk for the air transportation system. Decision support tools are needed to help allocate assets to the most promising research. An approach to rank ordering technologies (using logic-evolved decision analysis), with risk reduction as the metric, is presented. The development of a spanning set of scenarios using a logic-gate tree is described. Baseline risk for these scenarios is evaluated with an approximate reasoning model. Illustrative risk and risk reduction results are presented.
Ontology based decision system for breast cancer diagnosis
NASA Astrophysics Data System (ADS)
Trabelsi Ben Ameur, Soumaya; Cloppet, Florence; Wendling, Laurent; Sellami, Dorra
2018-04-01
In this paper, we focus on the analysis and diagnosis of breast masses inspired by expert concepts and rules. Accordingly, a Bag of Words is built based on the ontology of breast cancer diagnosis, as accurately described in the Breast Imaging Reporting and Data System. To fill the gap between low-level knowledge and expert concepts, a semantic annotation is developed using a machine learning tool. Breast masses are then classified as benign or malignant according to expert rules implicitly modeled with a set of classifiers (KNN, ANN, SVM and Decision Tree). This semantic context of analysis offers a frame in which we can include external factors and other meta-knowledge, such as patient risk factors, as well as exploit more than one modality. Based on the MRI and DECEDM modalities, our system achieves a recognition rate of 99.7% with the Decision Tree classifier, an improvement of 24.7% owing to the semantic analysis.
Baptista, Sofia; Teles Sampaio, Elvira; Heleno, Bruno; Azevedo, Luís Filipe; Martins, Carlos
2018-06-26
Prostate cancer is a leading cause of cancer among men. Because screening for prostate cancer is a controversial issue, many experts in the field have defended the use of shared decision making using validated decision aids, which can be presented in different formats (eg, written, multimedia, Web). Recent studies have concluded that decision aids improve knowledge and reduce decisional conflict. This meta-analysis aimed to investigate the impact of using Web-based decision aids to support men's prostate cancer screening decisions in comparison with usual care and other formats of decision aids. We searched PubMed, CINAHL, PsycINFO, and Cochrane CENTRAL databases up to November 2016. This search identified randomized controlled trials, which assessed Web-based decision aids for men making a prostate cancer screening decision and reported quality of decision-making outcomes. Two reviewers independently screened citations for inclusion criteria, extracted data, and assessed risk of bias. Using a random-effects model, meta-analyses were conducted pooling results using mean differences (MD), standardized mean differences (SMD), and relative risks (RR). Of 2406 unique citations, 7 randomized controlled trials met the inclusion criteria. For risk of bias, selective outcome reporting and participant/personnel blinding were mostly rated as unclear due to inadequate reporting. Based on seven items, two studies had high risk of bias for one item. Compared to usual care, Web-based decision aids increased knowledge (SMD 0.46; 95% CI 0.18-0.75), reduced decisional conflict (MD -7.07%; 95% CI -9.44 to -4.71), and reduced the practitioner control role in the decision-making process (RR 0.50; 95% CI 0.31-0.81). Web-based decision aids compared to printed decision aids yielded no differences in knowledge, decisional conflict, and participation in decision or screening behaviors. 
Compared to video decision aids, Web-based decision aids showed lower average knowledge scores (SMD -0.50; 95% CI -0.88 to -0.12) and a slight increase in prostate-specific antigen screening (RR 1.12; 95% CI 1.01-1.25). According to this analysis, Web-based decision aids performed similarly to alternative formats (ie, printed, video) for the assessed decision-quality outcomes. The low cost, readiness, availability, and anonymity of the Web can be an advantage for increasing access to decision aids that support prostate cancer screening decisions among men. ©Sofia Baptista, Elvira Teles Sampaio, Bruno Heleno, Luís Filipe Azevedo, Carlos Martins. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.06.2018.
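The random-effects pooling used in such a meta-analysis can be sketched with the DerSimonian-Laird estimator. The study-level mean differences and variances below are invented stand-ins, not the trial data from this review.

```python
import math

# DerSimonian-Laird random-effects pooling of mean differences.
# Study estimates (e.g. decisional-conflict change) are illustrative.
y = [-9.0, -3.0, -8.0, -4.0]  # per-study mean differences
v = [1.2, 2.0, 1.5, 2.5]      # within-study variances

# Fixed-effect weights and Cochran's Q heterogeneity statistic.
w = [1 / vi for vi in v]
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
df = len(y) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / c)  # between-study variance estimate

# Random-effects weights add tau^2 to each within-study variance.
wstar = [1 / (vi + tau2) for vi in v]
pooled = sum(wi * yi for wi, yi in zip(wstar, y)) / sum(wstar)
se = math.sqrt(1 / sum(wstar))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(round(pooled, 2), [round(x, 2) for x in ci])
```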
NASA Astrophysics Data System (ADS)
Chen, Ting-Yu
2012-06-01
This article presents a useful method for relating anchor dependency and accuracy functions to multiple attribute decision-making (MADM) problems in the context of Atanassov intuitionistic fuzzy sets (A-IFSs). Considering anchored judgement with displaced ideals and solution precision with minimal hesitation, several auxiliary optimisation models have been proposed to obtain the optimal weights of the attributes and to acquire the corresponding TOPSIS (the technique for order preference by similarity to the ideal solution) index for alternative rankings. Aside from the TOPSIS index, as a decision-maker's personal characteristics and own perception of self may also influence the direction of choice, the evaluation of alternatives is conducted based on the distances of each alternative from the positive and negative ideal alternatives, respectively. This article originates from Li's [Li, D.-F. (2005), 'Multiattribute Decision Making Models and Methods Using Intuitionistic Fuzzy Sets', Journal of Computer and System Sciences, 70, 73-85] work, which is a seminal study of intuitionistic fuzzy decision analysis using deduced auxiliary programming models, and deems it a benchmark method for comparative studies on anchor dependency and accuracy functions. The feasibility and effectiveness of the proposed methods are illustrated by a numerical example. Finally, a comparative analysis is conducted with computational experiments on averaging accuracy functions, TOPSIS indices, separation measures from positive and negative ideal alternatives, consistency rates of ranking orders, contradiction rates of the top alternative and average Spearman correlation coefficients.
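The TOPSIS index itself is compact enough to sketch; the version below is the crisp form (the article works with intuitionistic fuzzy values). The decision matrix and weights are invented, and all criteria are treated as benefit criteria.

```python
import numpy as np

# TOPSIS ranking sketch: rows are alternatives, columns are criteria.
# Decision matrix and weights are illustrative assumptions.
X = np.array([
    [7.0, 9.0, 9.0],
    [8.0, 7.0, 8.0],
    [9.0, 6.0, 8.0],
    [6.0, 7.0, 8.0],
])
w = np.array([0.5, 0.3, 0.2])

R = X / np.sqrt((X ** 2).sum(axis=0))  # vector-normalise each criterion
V = R * w                              # weighted normalised matrix

v_pos = V.max(axis=0)                  # positive ideal solution
v_neg = V.min(axis=0)                  # negative ideal solution

d_pos = np.sqrt(((V - v_pos) ** 2).sum(axis=1))
d_neg = np.sqrt(((V - v_neg) ** 2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)    # TOPSIS index in [0, 1]

ranking = np.argsort(-closeness)       # best alternative first
print(ranking)
```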
Kondo, M; Nagao, Y; Mahbub, M H; Tanabe, T; Tanizawa, Y
2018-04-29
To identify factors predicting early postpartum glucose intolerance in Japanese women with gestational diabetes mellitus, using decision-curve analysis. A retrospective cohort study was performed. The participants were 123 Japanese women with gestational diabetes who underwent 75-g oral glucose tolerance tests at 8-12 weeks after delivery. They were divided into a glucose intolerance and a normal glucose tolerance group based on postpartum oral glucose tolerance test results. Analysis of the pregnancy oral glucose tolerance test results showed predictive factors for postpartum glucose intolerance. We also evaluated the clinical usefulness of the prediction model based on decision-curve analysis. Of 123 women, 78 (63.4%) had normoglycaemia and 45 (36.6%) had glucose intolerance. Multivariable logistic regression analysis showed insulinogenic index/fasting immunoreactive insulin and summation of glucose levels, assessed during pregnancy oral glucose tolerance tests (total glucose), to be independent risk factors for postpartum glucose intolerance. Evaluating the regression models, the best discrimination (area under the curve 0.725) was obtained using the basic model (i.e. age, family history of diabetes, BMI ≥25 kg/m², and use of insulin during pregnancy) plus insulinogenic index/fasting immunoreactive insulin <1.1. Decision-curve analysis showed that combining insulinogenic index/fasting immunoreactive insulin <1.1 with basic clinical information resulted in superior net benefits for prediction of postpartum glucose intolerance. Insulinogenic index/fasting immunoreactive insulin calculated using oral glucose tolerance test results during pregnancy is potentially useful for predicting early postpartum glucose intolerance in Japanese women with gestational diabetes. © 2018 Diabetes UK.
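The reported discrimination (AUC 0.725) has a simple rank-based reading: it is the probability that a randomly chosen woman who developed glucose intolerance was assigned a higher predicted risk than a randomly chosen woman who did not. A sketch with invented risk scores:

```python
# AUC as the Mann-Whitney probability that a randomly chosen positive
# case outranks a randomly chosen negative case. Scores are illustrative.

def auc(pos, neg):
    # Count pairwise wins; ties count half.
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

pos = [0.82, 0.65, 0.58, 0.40]        # predicted risks, glucose intolerance
neg = [0.70, 0.35, 0.30, 0.22, 0.15]  # predicted risks, normoglycaemia

print(auc(pos, neg))  # 0.85 for these scores
```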
NASA Technical Reports Server (NTRS)
Christie, Vanessa L.; Landess, David J.
2012-01-01
In the international arena, decision makers are often swayed away from fact-based analysis by their own individual cultural and political biases. Modeling and simulation-based training can raise awareness of individual predisposition and improve the quality of decision making by focusing solely on fact rather than perception. This improved decision-making methodology will support the multinational collaborative efforts of military and civilian leaders to solve challenges more effectively. The intent of this experimental research is to create a framework that allows decision makers to "come to the table" with the latest and most significant facts necessary to determine an appropriate solution for any given contingency.
Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.
Park, Eun-Jun; Park, Mihyun
2015-11-01
The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.
Risk-based decision making for terrorism applications.
Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas
2009-03-01
This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.
Considerations for Reporting Finite Element Analysis Studies in Biomechanics
Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason; Tadepalli, Srinivas C.; Morrison, Tina M.
2012-01-01
Simulation-based medicine and the development of complex computer models of biological structures is becoming ubiquitous for advancing biomedical engineering and clinical research. Finite element analysis (FEA) has been widely used in the last few decades to understand and predict biomechanical phenomena. Modeling and simulation approaches in biomechanics are highly interdisciplinary, involving novice and skilled developers in all areas of biomedical engineering and biology. While recent advances in model development and simulation platforms offer a wide range of tools to investigators, the decision making process during modeling and simulation has become more opaque. Hence, reliability of such models used for medical decision making and for driving multiscale analysis comes into question. Establishing guidelines for model development and dissemination is a daunting task, particularly with the complex and convoluted models used in FEA. Nonetheless, if better reporting can be established, researchers will have a better understanding of a model’s value and the potential for reusability through sharing will be bolstered. Thus, the goal of this document is to identify resources and considerate reporting parameters for FEA studies in biomechanics. These entail various levels of reporting parameters for model identification, model structure, simulation structure, verification, validation, and availability. While we recognize that it may not be possible to provide and detail all of the reporting considerations presented, it is possible to establish a level of confidence with selective use of these parameters. More detailed reporting, however, can establish an explicit outline of the decision-making process in simulation-based analysis for enhanced reproducibility, reusability, and sharing. PMID:22236526
Spatial planning using probabilistic flood maps
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano
2015-04-01
Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that the effect on decision making is most pronounced when gains and losses are high, implying higher payoffs and penalties and therefore a higher-stakes gamble. Thus the methodology may be appropriately considered when making decisions based on uncertain information.
da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Senna, Kátia Marie Simões e.; Tura, Bernardo Rangel; Goulart, Marcelo Correia
2014-01-01
Objectives The study performs a cost-effectiveness analysis of procedures for atrial septal defect occlusion, comparing conventional surgery to septal percutaneous implant. Methods An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature, and validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defect occlusion at each branch was considered as the effectiveness outcome. Direct medical costs and probabilities for each event were inserted in the model using data available from the Brazilian public sector database system and information extracted from the literature review, using a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. Results The results obtained from the decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34 with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. Conclusions The proposed decision model seeks to fill a void in the academic literature. The decision model includes the outcomes that have the greatest impact on the overall costs of the procedure. Atrial septal defect occlusion using a percutaneous implant reduces physical and psychological distress to patients relative to conventional surgery, which represents intangible costs in the context of economic evaluation. PMID:25302806
A Computational Model of Reasoning from the Clinical Literature
Rennels, Glenn D.
1986-01-01
This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.
Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.
Trepel, Christopher; Fox, Craig R; Poldrack, Russell A
2005-04-01
Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk, in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica, 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory that include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage towards a fuller understanding of the cognitive neuroscience of decision making.
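The value and weighting functions described in the abstract have standard parametric forms. A minimal sketch follows, using the functional forms and median parameter estimates reported by Tversky and Kahneman (1992); note that it uses a simplified separable combination of weights and values rather than the full rank-dependent cumulative weighting, so it is an illustration of the shapes involved, not a faithful CPT implementation.

```python
# Prospect-theory value and weighting functions (Tversky & Kahneman, 1992
# functional forms and median parameter estimates).

ALPHA = 0.88   # curvature for gains (concave value function)
BETA = 0.88    # curvature for losses (convex value function)
LAMBDA = 2.25  # loss aversion: losses loom ~2.25x larger than gains
GAMMA = 0.61   # probability-weighting curvature (gains)

def value(x: float) -> float:
    """Concave for gains, convex and steeper for losses."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

def weight(p: float) -> float:
    """Overweights low probabilities, underweights moderate-to-high ones."""
    num = p ** GAMMA
    return num / ((num + (1 - p) ** GAMMA) ** (1 / GAMMA))

def prospect_value(outcomes) -> float:
    """Simplified (non-cumulative) value of (probability, outcome) pairs."""
    return sum(weight(p) * value(x) for p, x in outcomes)
```

With these parameters a symmetric 50/50 gamble over +100/-100 has negative prospect value, reflecting loss aversion.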
NASA Astrophysics Data System (ADS)
Basye, Austin T.
A matrix element method analysis of the Standard Model Higgs boson, produced in association with two top quarks decaying to the lepton-plus-jets channel, is presented. Based on 20.3 fb⁻¹ of √s = 8 TeV data, produced at the Large Hadron Collider and collected by the ATLAS detector, this analysis utilizes multiple advanced techniques to search for ttH signatures with a 125 GeV Higgs boson decaying to two b-quarks. After categorizing selected events based on their jet and b-tag multiplicities, signal-rich regions are analyzed using the matrix element method. Resulting variables are then propagated to two parallel multivariate analyses utilizing Neural Networks and Boosted Decision Trees, respectively. As no significant excess is found, an observed (expected) limit of 3.4 (2.2) times the Standard Model cross-section is determined at 95% confidence, using the CLs method, for the Neural Network analysis. For the Boosted Decision Tree analysis, an observed (expected) limit of 5.2 (2.7) times the Standard Model cross-section is determined at 95% confidence, using the CLs method. Corresponding unconstrained fits of the Higgs boson signal strength to the observed data yield measured ratios of the signal cross-section to the Standard Model prediction of μ = 1.2 ± 1.3 (total) ± 0.7 (stat.) for the Neural Network analysis, and μ = 2.9 ± 1.4 (total) ± 0.8 (stat.) for the Boosted Decision Tree analysis.
The Value of Information in Decision-Analytic Modeling for Malaria Vector Control in East Africa.
Kim, Dohyeong; Brown, Zachary; Anderson, Richard; Mutero, Clifford; Miranda, Marie Lynn; Wiener, Jonathan; Kramer, Randall
2017-02-01
Decision analysis tools and mathematical modeling are increasingly emphasized in malaria control programs worldwide to improve resource allocation and address ongoing challenges with sustainability. However, such tools require substantial scientific evidence, which is costly to acquire. The value of information (VOI) has been proposed as a metric for gauging the value of reduced model uncertainty. We apply this concept to an evidence-based Malaria Decision Analysis Support Tool (MDAST) designed for application in East Africa. In developing MDAST, substantial gaps in the scientific evidence base were identified regarding insecticide resistance in malaria vector control and the effectiveness of alternative mosquito control approaches, including larviciding. We identify four entomological parameters in the model (two for insecticide resistance and two for larviciding) that involve high levels of uncertainty and to which outputs in MDAST are sensitive. We estimate and compare a VOI for combinations of these parameters in evaluating three policy alternatives relative to a status quo policy. We find that having perfect information on the uncertain parameters could improve program net benefits by 5-21%, with the highest VOI associated with jointly eliminating uncertainty about the reproductive speed of malaria-transmitting mosquitoes and the initial efficacy of larviciding at reducing the emergence of new adult mosquitoes. Future research on parameter uncertainty in decision analysis of malaria control policy should investigate the VOI with respect to other aspects of malaria transmission (such as antimalarial resistance), the costs of reducing uncertainty in these parameters, and the extent to which imperfect information about these parameters can improve payoffs. © 2016 Society for Risk Analysis.
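For a discrete set of parameter scenarios, the value of perfect information reduces to the gap between deciding after uncertainty is resolved and deciding before. A minimal sketch follows; the payoff and probability numbers in the test are invented for illustration and have nothing to do with the MDAST model itself.

```python
# Expected value of perfect information (EVPI) for a discrete decision
# problem: EVPI = E[best payoff per scenario] - best expected payoff.

def evpi(payoffs, probs):
    """payoffs[a][s]: net benefit of action a under parameter scenario s;
    probs[s]: probability of scenario s. Returns the EVPI (always >= 0)."""
    # Decide now, under uncertainty: pick the action with best expected payoff.
    best_expected = max(sum(p * u for p, u in zip(probs, row))
                        for row in payoffs)
    # Decide after learning the true scenario: take the best action each time.
    expected_best = sum(p * max(col) for p, col in zip(probs, zip(*payoffs)))
    return expected_best - best_expected
```

If one action dominates in every scenario, learning the scenario changes nothing and the EVPI is zero; when the best action flips across scenarios, the EVPI is positive.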
RELATING ACCUMULATOR MODEL PARAMETERS AND NEURAL DYNAMICS
Purcell, Braden A.; Palmeri, Thomas J.
2016-01-01
Accumulator models explain decision-making as an accumulation of evidence to a response threshold. Specific model parameters are associated with specific model mechanisms, such as the time when accumulation begins, the average rate of evidence accumulation, and the threshold. These mechanisms determine both the within-trial dynamics of evidence accumulation and the predicted behavior. Cognitive modelers usually infer what mechanisms vary during decision-making by seeing what parameters vary when a model is fitted to observed behavior. The recent identification of neural activity with evidence accumulation suggests that it may be possible to directly infer what mechanisms vary from an analysis of how neural dynamics vary. However, evidence accumulation is often noisy, and noise complicates the relationship between accumulator dynamics and the underlying mechanisms leading to those dynamics. To understand what kinds of inferences can be made about decision-making mechanisms based on measures of neural dynamics, we measured simulated accumulator model dynamics while systematically varying model parameters. In some cases, decision-making mechanisms can be directly inferred from dynamics, allowing us to distinguish between models that make identical behavioral predictions. In other cases, however, different parameterized mechanisms produce surprisingly similar dynamics, limiting the inferences that can be made based on measuring dynamics alone. Analyzing neural dynamics can provide a powerful tool to resolve model mimicry at the behavioral level, but we caution against drawing inferences based solely on neural analyses. Instead, simultaneous modeling of behavior and neural dynamics provides the most powerful approach to understand decision-making and likely other aspects of cognition and perception. PMID:28392584
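The mechanisms the abstract names (accumulation onset, drift rate, threshold) can be illustrated with a toy racing-accumulator simulation. The parameter values and function below are our own illustrative choices, not the authors' model: several noisy accumulators race to a common threshold, the first to cross determines the choice, and the crossing step plus the onset period gives the response time.

```python
# Toy racing-accumulator model: onset delay, per-step drift, noise, threshold.
import random

def simulate_trial(drifts=(0.6, 0.4), threshold=30.0, onset_steps=50,
                   noise=1.0, max_steps=10000, rng=random):
    """Race one noisy accumulator per element of `drifts` to `threshold`.

    Returns (winning_accumulator_index, rt_in_steps), where rt includes the
    non-accumulation onset period; (None, max_steps) on timeout."""
    evidence = [0.0] * len(drifts)
    for step in range(1, max_steps + 1):
        if step > onset_steps:  # accumulation begins only after the onset
            for i, d in enumerate(drifts):
                evidence[i] += d + noise * rng.gauss(0.0, 1.0)
            winners = [i for i, e in enumerate(evidence) if e >= threshold]
            if winners:
                return winners[0], step
    return None, max_steps
```

Raising the threshold slows responses while leaving the onset and drift mechanisms untouched, which is exactly the kind of parameter-to-dynamics mapping the study probes.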
Archetypes for Organisational Safety
NASA Technical Reports Server (NTRS)
Marais, Karen; Leveson, Nancy G.
2003-01-01
We propose a framework using system dynamics to model the dynamic behavior of organizations in accident analysis. Most current accident analysis techniques are event-based and do not adequately capture the dynamic complexity and non-linear interactions that characterize accidents in complex systems. In this paper we propose a set of system safety archetypes that model common safety culture flaws in organizations, i.e., the dynamic behavior of organizations that often leads to accidents. As accident analysis and investigation tools, the archetypes can be used to develop dynamic models that describe the systemic and organizational factors contributing to the accident. The archetypes help clarify why safety-related decisions do not always result in the desired behavior, and how independent decisions in different parts of the organization can combine to impact safety.
Development of a model-based flood emergency management system in Yujiang River Basin, South China
NASA Astrophysics Data System (ADS)
Zeng, Yong; Cai, Yanpeng; Jia, Peng; Mao, Jiansu
2014-06-01
Flooding is the most frequent disaster in China. It affects people's lives and properties, causing considerable economic loss. Flood forecasting and reservoir operation are important in flood emergency management. Although great progress has been achieved in flood forecasting and reservoir operation through the use of computers, network technology, and geographic information system technology in China, the prediction accuracy of models is not satisfactory due to the unavailability of real-time monitoring data. Also, real-time flood control scenario analysis is not effective in many regions and can seldom provide an online decision support function. In this research, a decision support system for real-time flood forecasting in Yujiang River Basin, South China (DSS-YRB) is introduced. The system is based on hydrological and hydraulic mathematical models. The conceptual framework and detailed components of the proposed DSS-YRB are illustrated; the system employs real-time rainfall data conversion, model-driven hydrologic forecasting, model calibration, data assimilation methods, and reservoir operational scenario analysis. Its multi-tiered architecture offers great flexibility, portability, reusability, and reliability. The case study results show that the development and application of a decision support system for real-time flood forecasting and operation are beneficial for flood control.
Orhan, U.; Erdogmus, D.; Roark, B.; Oken, B.; Purwar, S.; Hild, K. E.; Fowler, A.; Fried-Oken, M.
2013-01-01
RSVP Keyboard™ is an electroencephalography (EEG) based brain computer interface (BCI) typing system, designed as an assistive technology for the communication needs of people with locked-in syndrome (LIS). It relies on rapid serial visual presentation (RSVP) and does not require precise eye gaze control. Existing BCI typing systems that use event-related potentials (ERPs) in EEG suffer from low accuracy due to a low signal-to-noise ratio. Hence, RSVP Keyboard™ utilizes context-based decision making, incorporating a language model to improve the accuracy of letter decisions. To further improve the contribution of the language model, we propose recursive Bayesian estimation, which relies on non-committing string decisions, and conduct an offline analysis comparing it with the existing naïve Bayesian fusion approach. The results indicate the superiority of recursive Bayesian fusion, and we plan to incorporate this new approach in the next generation of RSVP Keyboard™. PMID:23366432
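The core idea of recursive Bayesian fusion, combining a language-model prior over letters with successive EEG classifier likelihoods without committing to a hard decision between epochs, can be sketched as below. This is a hypothetical illustration, not the RSVP Keyboard™ implementation; all probabilities and the letter set are invented.

```python
# Recursive Bayesian fusion of a language-model prior with per-epoch
# EEG-classifier likelihoods over candidate letters.

def bayes_update(prior, likelihood):
    """prior: dict letter -> probability; likelihood: dict letter -> P(EEG | letter).
    Returns the normalised posterior."""
    posterior = {c: prior[c] * likelihood.get(c, 1e-9) for c in prior}
    z = sum(posterior.values())
    return {c: p / z for c, p in posterior.items()}

def fuse(lm_prior, epochs):
    """Fold a sequence of per-epoch likelihoods into the prior, keeping a
    full probability distribution (no hard letter commitment) throughout."""
    belief = dict(lm_prior)
    for likelihood in epochs:
        belief = bayes_update(belief, likelihood)
    return belief
```

Because the posterior after each epoch becomes the prior for the next, weak evidence accumulated over several presentations can overturn an initially favoured letter, which is the advantage over committing after each epoch.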
An Analysis of the EPA Report on Pipeline Renewal Decision Making Tools and Approaches
Few DSS are commercially available for technology selection as most utilities make decisions based on in-house and consultant expertise (Matthews et al., 2011). This review presents some of the models proposed over the past 15 years for selecting technologies in the U.S. and wor...
Rieger, Marc Oliver; Wang, Mei
2008-01-01
Comments on the article by E. Brandstätter, G. Gigerenzer, and R. Hertwig. The authors discuss the priority heuristic, a recent model for decisions under risk. They reanalyze the experimental validity of this approach and discuss how these results compare with cumulative prospect theory, currently the most established model in behavioral economics. They also discuss how general models for decisions under risk based on a heuristic approach can be understood mathematically to gain some insight into their limitations. They finally consider whether the priority heuristic model can lead to some understanding of the decision process of individuals or whether it is better seen as an as-if model. (c) 2008 APA, all rights reserved
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and provides a more rational, objective, and unbiased tool for flood susceptibility evaluation.
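The two ingredients the abstract combines, OWA aggregation (order weights encode the analyst's risk attitude) and Monte Carlo sampling of criterion weights, can be sketched for a single map cell as follows. This is a deliberately simplified illustration: the combination of criterion and order weights here is cruder than full GIS-OWA formulations, and all weight values in the test are invented.

```python
# OWA aggregation of criteria for one cell, with Monte Carlo sampling of
# criterion weights to propagate weight uncertainty into the score.
import random

def owa(values, order_weights):
    """Ordered weighted average: weights attach to rank positions
    (largest value first), not to particular criteria."""
    ranked = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(order_weights, ranked))

def mc_susceptibility(criteria, base_weights, order_weights, n=1000, seed=0):
    """Perturb criterion weights (here: Gaussian noise, sd = 10% of each
    weight), renormalise, aggregate with OWA; return mean score and spread."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        w = [max(0.0, rng.gauss(b, 0.1 * b)) for b in base_weights]
        s = sum(w)
        w = [x / s for x in w]                      # renormalise sampled weights
        weighted = [wi * ci for wi, ci in zip(w, criteria)]
        scores.append(owa(weighted, order_weights))
    mean = sum(scores) / n
    var = sum((x - mean) ** 2 for x in scores) / n
    return mean, var ** 0.5
```

An order-weight vector concentrated on the first rank yields an optimistic ("or-like") aggregation, while weight on the last rank yields a pessimistic ("and-like") one; the Monte Carlo spread indicates how sensitive a cell's score is to the criterion weights.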
Dynamic decision making for dam-break emergency management - Part 1: Theoretical framework
NASA Astrophysics Data System (ADS)
Peng, M.; Zhang, L. M.
2013-02-01
An evacuation decision for dam breaks is a very serious issue. A late decision may lead to loss of lives and property, but a very early evacuation will incur unnecessary expenses. This paper presents a risk-based framework of dynamic decision making for dam-break emergency management (DYDEM). Dam-break emergency management on both the time scale and the space scale is introduced first to define the dynamic decision problem. The probability of dam failure is taken as a stochastic process and estimated using a time-series analysis method. The flood consequences are taken as functions of warning time and evaluated with a human risk analysis model (HURAM) based on Bayesian networks. A decision criterion is suggested to decide whether to evacuate the population at risk (PAR) or to delay the decision. The optimum time for evacuating the PAR is obtained by minimizing the expected total loss, which integrates the time-related probabilities and flood consequences. When a delayed decision is chosen, the decision making can be updated with available new information. A specific dam-break case study is presented in a companion paper to illustrate the application of this framework to complex dam-breaching problems.
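The decision criterion described above, choosing the evacuation time that minimises an expected total loss combining time-varying failure probability with warning-time-dependent consequences, can be sketched over discrete time steps. The probability, loss, and holding-cost functions below are invented placeholders, not HURAM or the DYDEM formulation.

```python
# Pick the evacuation time minimising expected total loss: early evacuation
# costs more to sustain, late evacuation shortens the warning time if the
# dam fails. All functions are illustrative placeholders.

def expected_total_loss(t_evac, times, p_fail, life_loss, hold_cost):
    """times: discrete future time steps; p_fail(t): failure probability at t;
    life_loss(w): consequence given warning time w; hold_cost: cost per time
    step of keeping the population evacuated."""
    horizon = max(times) + 1
    loss = hold_cost * (horizon - t_evac)   # earlier evacuation -> longer, costlier
    for t in times:
        warning = max(0, t - t_evac)        # warning time if failure occurs at t
        loss += p_fail(t) * life_loss(warning)
    return loss

def best_evacuation_time(candidates, times, p_fail, life_loss, hold_cost):
    return min(candidates, key=lambda te: expected_total_loss(
        te, times, p_fail, life_loss, hold_cost))
```

The trade-off behaves as the abstract describes: a high failure probability pushes the optimum toward immediate evacuation, while a low one favours delaying (and, in the full framework, re-deciding as new information arrives).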
Emergent collective decision-making: Control, model and behavior
NASA Astrophysics Data System (ADS)
Shen, Tian
In this dissertation we study emergent collective decision-making in social groups with time-varying interactions and heterogeneously informed individuals. First we analyze a nonlinear dynamical systems model motivated by animal collective motion with heterogeneously informed subpopulations, to examine the role of uninformed individuals. We find through formal analysis that adding uninformed individuals in a group increases the likelihood of a collective decision. Secondly, we propose a model for human shared decision-making with continuous-time feedback and where individuals have little information about the true preferences of other group members. We study model equilibria using bifurcation analysis to understand how the model predicts decisions based on the critical threshold parameters that represent an individual's tradeoff between social and environmental influences. Thirdly, we analyze continuous-time data of pairs of human subjects performing an experimental shared tracking task using our second proposed model in order to understand transient behavior and the decision-making process. We fit the model to data and show that it reproduces a wide range of human behaviors surprisingly well, suggesting that the model may have captured the mechanisms of observed behaviors. Finally, we study human behavior from a game-theoretic perspective by modeling the aforementioned tracking task as a repeated game with incomplete information. We show that the majority of the players are able to converge to playing Nash equilibrium strategies. We then suggest with simulations that the mean field evolution of strategies in the population resembles replicator dynamics, indicating that the individual strategies may be myopic. Decisions form the basis of control, and problems involving deciding collectively among alternatives are ubiquitous in nature and in engineering.
Understanding how multi-agent systems make decisions among alternatives also provides insight for designing decentralized control laws for engineering applications from mobile sensor networks for environmental monitoring to collective construction robots. With this dissertation we hope to provide additional methodology and mathematical models for understanding the behavior and control of collective decision-making in multi-agent systems.
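The replicator dynamics mentioned above, in which a strategy's frequency grows in proportion to how its payoff compares with the population average, can be sketched in a discrete-time form. The payoff matrix in the test is an invented example with a dominant strategy, not drawn from the dissertation's tracking game.

```python
# Discrete-time replicator dynamics: x_i grows when strategy i's fitness
# (expected payoff against the current population) exceeds the average.

def replicator_step(x, payoff, dt=0.1):
    """x: strategy frequencies (summing to 1); payoff[i][j]: payoff of
    strategy i against strategy j. Returns the updated frequencies."""
    n = len(x)
    fitness = [sum(payoff[i][j] * x[j] for j in range(n)) for i in range(n)]
    avg = sum(x[i] * fitness[i] for i in range(n))
    x_new = [xi + dt * xi * (fi - avg) for xi, fi in zip(x, fitness)]
    z = sum(x_new)                      # renormalise against rounding drift
    return [xi / z for xi in x_new]
```

Iterating the step drives a strictly dominant strategy toward fixation, which is the myopic, imitation-like population behaviour the simulations are said to resemble.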
Heckbert, Scott; Wilson, Jeffrey J.; Vandenbroeck, Andrew J. K.; Cranston, Jerome; Farr, Daniel R.
2016-01-01
The science of ecosystem service (ES) mapping has become increasingly sophisticated over the past 20 years, and examples of successfully integrating ES into management decisions at national and sub-national scales have begun to emerge. However, increasing model sophistication and accuracy, and therefore complexity, may trade off against ease of use and applicability to real-world decision-making contexts, so it is vital to incorporate the lessons learned from implementation efforts into new model development. Using successful implementation efforts for guidance, we developed an integrated ES modelling system to quantify several ecosystem services: forest timber production and carbon storage, water purification, pollination, and biodiversity. The system is designed to facilitate uptake of ES information into land-use decisions through three principal considerations: (1) using relatively straightforward models that can be readily deployed and interpreted without specialized expertise; (2) using an agent-based modelling framework to enable the incorporation of human decision-making directly within the model; and (3) integration among all ES models to simultaneously demonstrate the effects of a single land-use decision on multiple ES. We present an implementation of the model for a major watershed in Alberta, Canada, and highlight the system’s capabilities to assess a suite of ES under future management decisions, including forestry activities under two alternative timber harvest strategies, and through a scenario modelling analysis exploring different intensities of hypothetical agricultural expansion. By using a modular approach, the modelling system can be readily expanded to evaluate additional ecosystem services or management questions of interest in order to guide land-use decisions to achieve socioeconomic and environmental objectives. PMID:28028479
Web-services-based spatial decision support system to facilitate nuclear waste siting
NASA Astrophysics Data System (ADS)
Huang, L. Xinglai; Sheng, Grant
2006-10-01
The availability of spatial web services enables data sharing among managers, decision and policy makers, and other stakeholders in much simpler ways than before and has subsequently created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment to explore the decision problem, understand and refine the problem definition, and generate and evaluate multiple alternatives for decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is comprised of distributed web services that either have their own functions or provide different geospatial data and may reside in different computers and locations. The WSDSS includes six key components, namely: database management system, catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual, and methodological challenges and issues associated with developing a web-services-based spatial decision support system are described.
Toward an operational model of decision making, emotional regulation, and mental health impact.
Collura, Thomas Francis; Zalaquett, Carlos P.; Bonnstetter, Ronald Joyce; Chatters, Seria J.
2014-01-01
Current brain research increasingly reveals the underlying mechanisms and processes of human behavior, cognition, and emotion. In addition to being of interest to a wide range of scientists, educators, and professionals, as well as laypeople, brain-based models are of particular value in a clinical setting. Psychiatrists, psychologists, counselors, and other mental health professionals are in need of operational models that integrate recent findings in the physical, cognitive, and emotional domains, and offer a common language for interdisciplinary understanding and communication. Based on individual traits, predispositions, and responses to stimuli, we can begin to identify emotional and behavioral pathways and mental processing patterns. The purpose of this article is to present a brain-path activation model to understand individual differences in decision making and psychopathology. The first section discusses the role of frontal lobe electroencephalography (EEG) asymmetry, summarizes state- and trait-based models of decision making, and provides a more complex analysis that supplements the traditional simple left-right brain model. Key components of the new model are the introduction of right hemisphere parallel and left hemisphere serial scanning in rendering decisions, and the proposition of pathways that incorporate both past experiences as well as future implications into the decision process. Main attributes of each decision-making mechanism are provided. The second section applies the model within the realm of clinical mental health as a tool to understand specific human behavior and pathology. Applications include general and chronic anxiety, depression, paranoia, risk taking, and the pathways employed when well-functioning operational integration is observed. Finally, specific applications such as meditation and mindfulness are offered to facilitate positive functioning.
Monitoring and decision making by people in man machine systems
NASA Technical Reports Server (NTRS)
Johannsen, G.
1979-01-01
The analysis of human monitoring and decision-making behavior, as well as its modeling, is described. Classical and optimal-control-theoretic monitoring models are surveyed. The relationship between attention allocation and eye movements is discussed. As an example of applications, the evaluation of predictor displays by means of the optimal control model is explained. Fault detection involving continuous signals and the decision-making behavior of a human operator engaged in fault diagnosis during different operation and maintenance situations are illustrated. Computer-aided decision making is considered as a queueing problem. It is shown to what extent computer aids can be based on the state of human activity as measured by psychophysiological quantities. Finally, management information systems for different application areas are mentioned. The possibilities of mathematical modeling of human behavior in complex man-machine systems are also critically assessed.
Review of early assessment models of innovative medical technologies.
Fasterholdt, Iben; Krahn, Murray; Kidholm, Kristian; Yderstræde, Knud Bonnet; Pedersen, Kjeld Møller
2017-08-01
Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models for early assessment in different health organisations and discusses which models hold the most promise for hospital decision makers. A scoping review of published studies between 1996 and 2015 was performed using nine databases. The following information was collected: decision context, decision problem, and a description of the early assessment model. 2362 articles were identified and 12 studies fulfilled the inclusion criteria. An additional 12 studies were identified and included in the review by searching reference lists. The majority of the 24 early assessment studies were variants of traditional cost-effectiveness analysis. Around one fourth of the studies presented an evaluation model with a broader focus than cost-effectiveness. Uncertainty was mostly handled by simple sensitivity or scenario analysis. This review shows that evaluation models using known methods for assessing cost-effectiveness are most prevalent in early assessment but seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
Chen, Hai; Liang, Xiaoying; Li, Rui
2013-01-01
Multi-Agent Systems (MAS) offer a conceptual approach for including multi-actor decision making in models of land use change. Through MAS-based simulation, this paper demonstrates the application of MAS to micro-scale land use and cover change (LUCC) and reveals the transformation mechanism across scales. The paper starts with a description of the context of MAS research. It then adopts the Nested Spatial Choice (NSC) method to construct a multi-scale LUCC decision-making model, and a case study of Mengcha village, Mizhi County, Shaanxi Province is reported. Finally, the potentials and drawbacks of the approach are discussed. From our design and implementation of the MAS in the multi-scale model, a number of observations and conclusions can be drawn on the implementation and future research directions. (1) The LUCC decision-making and multi-scale transformation framework provides, in our view, a more realistic model of the multi-scale decision-making process. (2) Using a continuous function, rather than a discrete one, to model household decision making reflects its effects more realistically. (3) Attempts have been made here to analyse household interaction quantitatively, which provides the premise and foundation for researching communication and learning among households. (4) The scale-transformation architecture constructed in this paper helps accumulate theory and experience for studying the interaction between micro-level land use decision making and the macro-level land use landscape pattern. Our future research will focus on: (1) how to make rational use of the risk-aversion principle and incorporate rotation rules between household parcels into the model; (2) exploring methods for studying household decision making over a long period, allowing us to bridge long-term LUCC data and short-term household decision making; (3) researching quantitative methods and models, especially scenario-analysis models that can reflect the interaction among different household types.
ERIC Educational Resources Information Center
Zwick, Rebecca; Lenaburg, Lubella
2009-01-01
In certain data analyses (e.g., multiple discriminant analysis and multinomial log-linear modeling), classification decisions are made based on the estimated posterior probabilities that individuals belong to each of several distinct categories. In the Bayesian network literature, this type of classification is often accomplished by assigning…
Growth Dynamics of Information Search Services
ERIC Educational Resources Information Center
Lindquist, Mats G.
1978-01-01
An analysis of computer-based information search services (ISSs) from a systems viewpoint, using a continuous simulation model to reveal the growth and stagnation of a typical system, is presented, together with an analysis of decision making for an ISS. (Author/MBR)
Stochastic Watershed Models for Risk Based Decision Making
NASA Astrophysics Data System (ADS)
Vogel, R. M.
2017-12-01
Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
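The SWM recipe described in this commentary, a deterministic watershed model driven by stochastic meteorology, stochastic parameters and stochastic model errors, can be sketched in toy form. The linear runoff model, the noise magnitudes and all function names below are illustrative assumptions, not taken from the commentary:

```python
import random

def runoff_model(precip, coeff):
    """Toy deterministic watershed model: runoff = coeff * precipitation."""
    return [coeff * p for p in precip]

def stochastic_watershed_ensemble(precip, coeff=0.4, n_traces=100, seed=1):
    """Generate an ensemble of synthetic streamflow traces by perturbing
    the meteorological input, the model parameter, and the model error."""
    rng = random.Random(seed)
    ensemble = []
    for _ in range(n_traces):
        # stochastic meteorological series (multiplicative lognormal noise)
        p_stoch = [p * rng.lognormvariate(0.0, 0.3) for p in precip]
        # stochastic model parameter
        c_stoch = coeff * rng.uniform(0.8, 1.2)
        # deterministic model output plus multiplicative model error
        trace = [q * rng.lognormvariate(0.0, 0.1)
                 for q in runoff_model(p_stoch, c_stoch)]
        ensemble.append(trace)
    return ensemble
```

The resulting traces could then feed any risk-based decision procedure that expects an ensemble of plausible future streamflows.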
Klijn, Sven L; Weijenberg, Matty P; Lemmens, Paul; van den Brandt, Piet A; Lima Passos, Valéria
2017-10-01
Background and objective: Group-based trajectory modelling is a model-based clustering technique applied for the identification of latent patterns of temporal change. Despite its manifold applications in clinical and health sciences, potential problems of the model selection procedure are often overlooked. The choice of the number of latent trajectories (class enumeration), for instance, is to a large degree based on statistical criteria that are not fail-safe. Moreover, the process as a whole is not transparent. To facilitate class enumeration, we introduce a graphical summary display of several fit and model adequacy criteria: the fit-criteria assessment plot. Methods: An R script that accepts universal data input is presented. The programme condenses relevant group-based trajectory modelling output information on model fit indices into automated graphical displays. Examples based on real and simulated data are provided to illustrate, assess and validate the fit-criteria assessment plot's utility. Results: The fit-criteria assessment plot provides an overview of fit criteria on a single page, placing users in an informed position to make a decision. It does not automatically select the most appropriate model but eases the model assessment procedure. Conclusions: The fit-criteria assessment plot is an exploratory visualisation tool that can be employed to assist decisions in the initial and decisive phases of group-based trajectory modelling analysis. Considering the method's widespread resonance in medical and epidemiological sciences, a more comprehensive, easily interpretable and transparent display of the iterative process of class enumeration may foster its adequate use.
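The class-enumeration step that the fit-criteria assessment plot summarises can be illustrated with a minimal tabular stand-in: compute standard fit indices for each candidate number of classes and compare. The AIC/BIC formulas are standard; the log-likelihoods and parameter counts below are made up for illustration:

```python
import math

def fit_criteria_table(loglik, n_params, n_obs):
    """Condense model-fit indices for candidate class numbers (k = 1..K)
    into one summary, in the spirit of a fit-criteria assessment plot."""
    table = []
    for k, (ll, p) in enumerate(zip(loglik, n_params), start=1):
        aic = -2 * ll + 2 * p                 # Akaike information criterion
        bic = -2 * ll + p * math.log(n_obs)   # Bayesian information criterion
        table.append({"classes": k, "AIC": round(aic, 1), "BIC": round(bic, 1)})
    return table
```

Scanning such a table (or its plotted form) across k shows where the criteria stop improving, which is the decision the plot is meant to support.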
Enhancing Consumer Choice: Are We Making Appropriate Recommendations?
ERIC Educational Resources Information Center
Lee, Jinkook; Geistfeld, Loren V.
1998-01-01
This study used conjoint analysis to identify consumer choice models. Results suggest a need to base choice-making aids on ideal choice models if the aid is to lead consumers to decisions consistent with true preferences. (Author/JOW)
A Chaotic Ordered Hierarchies Consistency Analysis Performance Evaluation Model
NASA Astrophysics Data System (ADS)
Yeh, Wei-Chang
2013-02-01
The Hierarchies Consistency Analysis (HCA) was proposed by Guh, together with a resort case study, to reinforce a weakness of the Analytic Hierarchy Process (AHP). Although the results helped decision makers reach more reasonable and rational judgements, the HCA itself is flawed. In this paper, our objective is to point out the problems of HCA and then propose a revised method, called chaotic ordered HCA (COH for short), which avoids them. Since COH is based on Guh's method, the decision maker establishes decisions in a way similar to that of the original method.
Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis
NASA Astrophysics Data System (ADS)
Gluhih, I. N.; Akhmadulin, R. K.
2017-07-01
The creation and use of corporate knowledge bases is one of the most promising directions for enhancing the efficiency of production processes and enterprise management. The article suggests a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and serves as a tool for making and implementing decisions in such situations. For knowledge representation in a PO CKB, a case-based reasoning approach is recommended. Under this approach, the content of a case as a knowledge base component has been defined; based on a situation tree, a PO CKB knowledge model has been developed, in which knowledge about typical situations as well as specific examples of situations and solutions is represented. A generalized structural chart of a problem-oriented corporate knowledge base and possible modes of its operation are suggested. The resulting models allow corporate knowledge bases to be created and used for supporting decision making and implementation, training, staff skill upgrading, and analysis of the decisions taken. The universal interpretation of the terms “situation” and “solution” adopted in this work allows the suggested models to be used to develop problem-oriented corporate knowledge bases in different subject domains. It is suggested that the developed models be used to build corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.
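The retrieve step of a case-based reasoning PO CKB can be sketched as weighted attribute matching over stored problem situations. The attributes, weights and cases below are hypothetical illustrations, not taken from the article:

```python
def retrieve_case(case_base, situation, weights):
    """Nearest-case retrieval: score each stored case by the summed
    weights of situation attributes it matches, and return the best."""
    def similarity(case):
        return sum(w for attr, w in weights.items()
                   if case["situation"].get(attr) == situation.get(attr))
    return max(case_base, key=similarity)

# Hypothetical PO CKB fragment: problem situations with tried solutions.
case_base = [
    {"situation": {"subsystem": "pump", "symptom": "vibration"},
     "solution": "rebalance impeller"},
    {"situation": {"subsystem": "pipeline", "symptom": "pressure drop"},
     "solution": "inspect for leaks"},
]
best = retrieve_case(case_base,
                     {"subsystem": "pipeline", "symptom": "pressure drop"},
                     weights={"subsystem": 0.5, "symptom": 0.5})
```

In a full system the retrieved solution would be adapted to the current situation and, once applied, stored back as a new case.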
[Analyzing consumer preference by using the latest semantic model for verbal protocol].
Tamari, Yuki; Takemura, Kazuhisa
2012-02-01
This paper examines consumers' preferences for competing brands by using a preference model of verbal protocols. Participants were 150 university students, who reported their opinions and feelings about McDonald's and Mos Burger (competing hamburger restaurant chains in Japan). Their verbal protocols were analyzed by the singular value decomposition method, and the latent decision frames were estimated. Protocol terms with large values in the decision frames can be interpreted as the attributes that consumers emphasize. Based on the estimated decision frames, we predicted consumers' preferences using logistic regression. The results indicate that the decision frames projected from the verbal protocol data explained consumers' preferences effectively.
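The frame-estimation step can be approximated in a few lines: the leading singular vector of a term-by-protocol count matrix ranks the terms consumers emphasize. Here it is obtained by power iteration on A A^T rather than a full SVD, and the matrix and terms are invented for illustration:

```python
def leading_frame(term_doc, terms, iters=100):
    """Estimate the dominant latent 'decision frame' of a term-by-protocol
    count matrix A: power iteration on B = A A^T converges to the leading
    left singular vector, whose largest entries mark emphasized terms."""
    n, m = len(term_doc), len(term_doc[0])
    B = [[sum(term_doc[i][k] * term_doc[j][k] for k in range(m))
          for j in range(n)] for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(B[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    ranked = sorted(range(n), key=lambda i: -abs(v[i]))
    return [terms[i] for i in ranked], v
```

Each protocol's projection onto this vector would then serve as a predictor in the downstream logistic regression step.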
Conjoint analysis: using a market-based research model for healthcare decision making.
Mele, Nancy L
2008-01-01
Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. The aim here is to describe the conjoint analysis methodology and explore value-added applications in nursing research. The methodology is described using examples from the healthcare and business literature, along with personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
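As a concrete illustration of the method's core, part-worth utilities for a tiny full-factorial rating study can be computed as level means minus the grand mean, which is equivalent to OLS main effects under an orthogonal design. The attributes, levels and ratings are invented:

```python
from statistics import mean

def part_worths(profiles, ratings):
    """Estimate conjoint part-worth utilities from a full-factorial
    rating study: each level's part-worth is the mean rating of profiles
    containing that level, minus the grand mean."""
    grand = mean(ratings)
    worths = {}
    for attr in profiles[0]:
        for lev in {p[attr] for p in profiles}:
            rs = [r for p, r in zip(profiles, ratings) if p[attr] == lev]
            worths[(attr, lev)] = mean(rs) - grand
    return worths
```

For example, with profiles crossing price (low/high) and brand (A/B) and ratings 9, 7, 5, 3, the part-worth of low price is +2 and of brand A is +1, so price matters twice as much as brand in this toy study.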
Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model
ERIC Educational Resources Information Center
Sandaire, Johnny
2009-01-01
A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…
Peters, Jaime L; Cooper, Chris; Buchanan, James
2015-01-01
Introduction: Decision models can be used to conduct economic evaluations of new pharmacogenetic and pharmacogenomic tests to ensure they offer value for money to healthcare systems. These models require a great deal of evidence, yet research suggests the evidence used is diverse and of uncertain quality. By conducting a systematic review, we aim to investigate the test-related evidence used to inform decision models developed for the economic evaluation of genetic tests. Methods and analysis: We will search electronic databases including MEDLINE, EMBASE and NHS EED to identify model-based economic evaluations of pharmacogenetic and pharmacogenomic tests. The search will not be limited by language or date. Title and abstract screening will be conducted independently by 2 reviewers, with screening of full texts and data extraction conducted by 1 reviewer and checked by another. Characteristics of the decision problem, the decision model and the test evidence used to inform the model will be extracted. Specifically, we will identify the reported sources of the test-related evidence used, describe the study design and how the evidence was identified. A checklist developed specifically for decision analytic models will be used to critically appraise the models described in these studies. Variations in the test evidence used in the decision models will be explored across the included studies, and we will identify gaps in the evidence in terms of both quantity and quality. Dissemination: The findings of this work will be disseminated via a peer-reviewed journal publication and at national and international conferences. PMID:26560056
Modelling the risk-benefit impact of H1N1 influenza vaccines.
Phillips, Lawrence D; Fasolo, Barbara; Zafiropoulous, Nikolaos; Eichler, Hans-Georg; Ehmann, Falk; Jekerle, Veronika; Kramarz, Piotr; Nicoll, Angus; Lönngren, Thomas
2013-08-01
Shortly after the H1N1 influenza virus reached pandemic status in June 2009, the benefit-risk project team at the European Medicines Agency recognized that this presented a research opportunity for testing the usefulness of a decision analysis model in deliberations about approving vaccines soon, based on limited data, or waiting for more data. Undertaken purely as a research exercise, the model was not connected to the ongoing assessment by the European Medicines Agency, which approved the H1N1 vaccines on 25 September 2009. A decision tree model constructed initially on 1 September 2009, and slightly revised subsequently as new data were obtained, represented an end-of-September or end-of-October approval of vaccines. The model showed combinations of uncertain events, the severity of the disease and the vaccines' efficacy and safety, leading to estimates of numbers of deaths and serious disabilities. The group based their probability assessments on available information and background knowledge about vaccines and similar pandemics in the past. Weighting the numbers by their joint probabilities for all paths through the decision tree gave a weighted average for a September decision of 216,500 deaths and serious disabilities, and for a decision delayed to October of 291,547, showing that an early decision was preferable. The process of constructing the model facilitated communications among the group's members and led to new insights for several participants, while its robustness built confidence in the decision. These findings suggest that models might be helpful to regulators, as they form their preferences during the process of deliberation and debate, and more generally, for public health issues when decision makers face considerable uncertainty.
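The rollback arithmetic behind those weighted averages is straightforward; here it is with purely illustrative probabilities and outcome counts, not the EMA group's actual assessments:

```python
def expected_outcome(branches):
    """Roll back a decision branch: weight each path's outcome (deaths
    plus serious disabilities) by its joint probability and sum."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * outcome for p, outcome in branches)

# Hypothetical scenario probabilities and outcomes for each decision date.
september = expected_outcome([(0.3, 100_000), (0.5, 220_000), (0.2, 380_000)])
october = expected_outcome([(0.3, 150_000), (0.5, 290_000), (0.2, 500_000)])
preferred = "September" if september < october else "October"
```

Comparing the two expectations is exactly the comparison the group made between an end-of-September and an end-of-October approval.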
Hierarchical Bayes approach for subgroup analysis.
Hsu, Yu-Yi; Zalkikar, Jyoti; Tiwari, Ram C
2017-01-01
In clinical data analysis, both treatment effect estimation and consistency assessment are important for a better understanding of the drug efficacy for the benefit of subjects in individual subgroups. The linear mixed-effects model has been used for subgroup analysis to describe treatment differences among subgroups with great flexibility. The hierarchical Bayes approach has been applied to linear mixed-effects model to derive the posterior distributions of overall and subgroup treatment effects. In this article, we discuss the prior selection for variance components in hierarchical Bayes, estimation and decision making of the overall treatment effect, as well as consistency assessment of the treatment effects across the subgroups based on the posterior predictive p-value. Decision procedures are suggested using either the posterior probability or the Bayes factor. These decision procedures and their properties are illustrated using a simulated example with normally distributed response and repeated measurements.
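The shrinkage at the heart of the hierarchical Bayes approach can be shown in closed form for a simplified normal-normal model with known sampling variances and a fixed between-subgroup standard deviation tau. This is a sketch of the idea only, not the article's full prior-selection and posterior-predictive machinery:

```python
def shrunken_subgroup_effects(estimates, se, tau):
    """Posterior means under a normal-normal hierarchical model: each
    subgroup estimate is shrunk toward the precision-weighted overall
    effect, with between-subgroup SD tau and a flat prior on the mean."""
    # precision-weighted overall treatment effect
    w = [1.0 / (s * s + tau * tau) for s in se]
    overall = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    # shrinkage factor per subgroup: B = se^2 / (se^2 + tau^2)
    post = []
    for e, s in zip(estimates, se):
        b = s * s / (s * s + tau * tau)
        post.append(b * overall + (1 - b) * e)
    return overall, post
```

Noisier subgroups (large se relative to tau) are pulled more strongly toward the overall effect, which is what makes the approach useful for consistency assessment across subgroups.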
Yang, Z Janet; McComas, Katherine A; Gay, Geri K; Leonard, John P; Dannenberg, Andrew J; Dillon, Hildy
2012-01-01
This study extends a risk information seeking and processing model to explore the relative effect of cognitive processing strategies, positive and negative emotions, and normative beliefs on individuals' decision making about potential health risks. Most previous research based on this theoretical framework has examined environmental risks. Applying this risk communication model to study health decision making presents an opportunity to explore theoretical boundaries of the model, while also bringing this research to bear on a pressing medical issue: low enrollment in clinical trials. Comparative analysis of data gathered from 2 telephone surveys of a representative national sample (n = 500) and a random sample of cancer patients (n = 411) indicated that emotions played a more substantive role in cancer patients' decisions to enroll in a potential trial, whereas cognitive processing strategies and normative beliefs had greater influences on the decisions of respondents from the national sample.
Angelis, Aris; Kanavos, Panos
2017-09-01
Escalating drug prices have catalysed the generation of numerous "value frameworks" with the aim of informing payers, clinicians and patients on the assessment and appraisal process of new medicines for the purpose of coverage and treatment selection decisions. Although this is an important step towards a more inclusive Value Based Assessment (VBA) approach, aspects of these frameworks are based on weak methodologies and could potentially result in misleading recommendations or decisions. In this paper, a Multiple Criteria Decision Analysis (MCDA) methodological process, based on Multi Attribute Value Theory (MAVT), is adopted for building a multi-criteria evaluation model. A five-stage model-building process is followed, using a top-down "value-focused thinking" approach, involving literature reviews and expert consultations. A generic value tree is structured capturing decision-makers' concerns for assessing the value of new medicines in the context of Health Technology Assessment (HTA) and in alignment with decision theory. The resulting value tree (Advance Value Tree) consists of three levels of criteria (top level criteria clusters, mid-level criteria, bottom level sub-criteria or attributes) relating to five key domains that can be explicitly measured and assessed: (a) burden of disease, (b) therapeutic impact, (c) safety profile (d) innovation level and (e) socioeconomic impact. A number of MAVT modelling techniques are introduced for operationalising (i.e. estimating) the model, for scoring the alternative treatment options, assigning relative weights of importance to the criteria, and combining scores and weights. Overall, the combination of these MCDA modelling techniques for the elicitation and construction of value preferences across the generic value tree provides a new value framework (Advance Value Framework) enabling the comprehensive measurement of value in a structured and transparent way. 
Given its flexibility to meet diverse requirements and its ready adaptability across different settings, the Advance Value Framework could be offered as a decision-support tool for evaluators and payers to aid coverage and reimbursement decisions on new medicines. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
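The MAVT combination step described above reduces to a weighted additive value function. Here is a minimal version over the five Advance Value Tree domains, with hypothetical weights and 0-100 partial value scores for one candidate medicine:

```python
def additive_value(scores, weights):
    """Multi-attribute value: V(a) = sum_i w_i * v_i(a), with weights
    normalised to sum to 1 and each partial value score on a 0-100 scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical weights and scores; real weights would be elicited
# from decision makers with MAVT weighting techniques such as swing weighting.
weights = {"burden of disease": 0.25, "therapeutic impact": 0.25,
           "safety profile": 0.2, "innovation level": 0.2,
           "socioeconomic impact": 0.1}
scores = {"burden of disease": 80, "therapeutic impact": 60,
          "safety profile": 70, "innovation level": 50,
          "socioeconomic impact": 40}
value = additive_value(scores, weights)
```

Alternative treatments scored on the same tree can then be ranked by their overall values.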
Bayesian outcome-based strategy classification.
Lee, Michael D
2016-03-01
Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely-used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principles that balances both the goodness-of-fit and complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provide a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
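A drastically simplified, non-hierarchical version of outcome-based strategy classification goes like this: each candidate strategy predicts every choice, an application error flips a choice with fixed probability eps, and Bayes' rule yields a posterior over strategies. The JAGS model in the paper infers eps and uses richer priors; the trial data below are invented:

```python
def strategy_posterior(choices, predictions, eps=0.1, prior=None):
    """Posterior over decision strategies: a choice matches a strategy's
    prediction with probability 1 - eps and mismatches with probability eps."""
    strategies = list(predictions)
    if prior is None:
        prior = {s: 1.0 / len(strategies) for s in strategies}
    unnorm = {}
    for s in strategies:
        like = 1.0
        for c, p in zip(choices, predictions[s]):
            like *= (1 - eps) if c == p else eps
        unnorm[s] = prior[s] * like
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

# Invented data: five forced choices and the predictions of WADD and TTB.
post = strategy_posterior(
    choices=["A", "A", "B", "A", "A"],
    predictions={"WADD": ["A", "A", "B", "A", "A"],
                 "TTB": ["B", "B", "B", "A", "A"]})
```

Even with only five trials, perfect agreement with WADD against two mismatches for TTB produces a sharply peaked posterior.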
NASA Astrophysics Data System (ADS)
Zachary, Wayne; Eggleston, Robert; Donmoyer, Jason; Schremmer, Serge
2003-09-01
Decision-making is strongly shaped and influenced by the work context in which decisions are embedded. This suggests that decision support needs to be anchored by a model (implicit or explicit) of the work process, in contrast to traditional approaches that anchor decision support to either context free decision models (e.g., utility theory) or to detailed models of the external (e.g., battlespace) environment. An architecture for cognitively-based, work centered decision support called the Work-centered Informediary Layer (WIL) is presented. WIL separates decision support into three overall processes that build and dynamically maintain an explicit context model, use the context model to identify opportunities for decision support and tailor generic decision-support strategies to the current context and offer them to the system-user/decision-maker. The generic decision support strategies include such things as activity/attention aiding, decision process structuring, work performance support (selective, contextual automation), explanation/ elaboration, infosphere data retrieval, and what if/action-projection and visualization. A WIL-based application is a work-centered decision support layer that provides active support without intent inferencing, and that is cognitively based without requiring classical cognitive task analyses. Example WIL applications are detailed and discussed.
NASA Astrophysics Data System (ADS)
Chen, Yizhong; Lu, Hongwei; Li, Jing; Ren, Lixia; He, Li
2017-05-01
This study presents the mathematical formulation and implementations of a synergistic optimization framework based on an understanding of water availability and reliability together with the characteristics of multiple water demands. This framework simultaneously integrates a set of leader-followers-interactive objectives established by different decision makers during the synergistic optimization. The upper-level model (leader's one) determines the optimal pollutants discharge to satisfy the environmental target. The lower-level model (follower's one) accepts the dispatch requirement from the upper-level one and determines the optimal water-allocation strategy to maximize economic benefits on behalf of the regional authority. The complicated bi-level model significantly improves upon the conventional programming methods through the mutual influence and restriction between the upper- and lower-level decision processes, particularly when limited water resources are available for multiple competing users. To solve the problem, a bi-level interactive solution algorithm based on satisfactory degree is introduced into the decision-making process for measuring to what extent the constraints are met and the objective reaches its optima. The capabilities of the proposed model are illustrated through a real-world case study of the water resources management system in the district of Fengtai located in Beijing, China. Feasible decisions in association with water resources allocation, wastewater emission and pollutants discharge would be sequentially generated for balancing the objectives subject to the given water-related constraints, which can enable stakeholders to grasp the inherent conflicts and trade-offs between the environmental and economic interests. The performance of the developed bi-level model is enhanced by comparing with single-level models. 
Moreover, in consideration of the uncertainty in water demand and availability, sensitivity analysis and policy analysis are employed for identifying their impacts on the final decisions and improving the practical applications.
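The leader-follower structure can be caricatured in a few lines: the upper level searches discharge caps for the smallest one whose induced lower-level, benefit-maximising allocation still meets a regional benefit target. All numbers and names are invented, and the real model is far richer (satisfactory degrees, uncertainty, multiple constraints):

```python
def follower(cap, users):
    """Lower level: allocate water to maximise economic benefit subject to
    the leader's total discharge cap. Allocation is fractional, so a
    greedy sweep by benefit per unit of discharge is optimal here."""
    benefit, remaining = 0.0, float(cap)
    # each user: (benefit per unit water, discharge per unit water, demand)
    for name, (b, d, demand) in sorted(users.items(),
                                       key=lambda kv: -kv[1][0] / kv[1][1]):
        x = min(demand, remaining / d)  # water granted to this user
        benefit += b * x
        remaining -= d * x
    return benefit

def leader(users, caps, benefit_target):
    """Upper level: choose the smallest pollutant-discharge cap whose
    induced follower response still meets the regional benefit target."""
    for cap in sorted(caps):
        if follower(cap, users) >= benefit_target:
            return cap
    return None
```

The mutual restriction is visible even in this toy: the leader cannot evaluate a cap without first solving the follower's allocation problem under it.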
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ascough, II, James Clifford
1992-05-01
The capability to objectively evaluate design performance of shallow landfill burial (SLB) systems is of great interest to diverse scientific disciplines, including hydrologists, engineers, environmental scientists, and SLB regulators. The goal of this work was to develop and validate a procedure for the nonsubjective evaluation of SLB designs under actual or simulated environmental conditions. A multiobjective decision module (MDM) based on scoring functions (Wymore, 1988) was implemented to evaluate SLB design performance. Input values to the MDM are provided by hydrologic models. The MDM assigns a total score to each SLB design alternative, thereby allowing rapid and repeatable design performance evaluation. The MDM was validated for a wide range of SLB designs under different climatic conditions. Rigorous assessment of SLB performance also requires incorporation of hydrologic probabilistic analysis and hydrologic risk into the overall design. This was accomplished through the development of a frequency analysis module, which allows SLB design event magnitudes to be calculated based on the hydrologic return period. The multiobjective decision and frequency analysis modules were integrated in a decision support system (DSS) framework, SLEUTH (Shallow Landfill Evaluation Using Transport and Hydrology). SLEUTH is a Microsoft Windows application written in the Knowledge Pro Windows (Knowledge Garden, Inc., 1991) development language.
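The scoring-function idea can be sketched with a linear stand-in for Wymore's S-shaped standard scoring functions. The criteria, weights and ranges below are invented, and real SSFs are smooth rather than piecewise linear:

```python
def score_increasing(x, lo, hi):
    """Simplified scoring function: map a raw criterion value onto [0, 1],
    higher is better (a linear stand-in for an S-shaped Wymore SSF)."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def total_design_score(design, criteria):
    """Multiobjective decision module: score each hydrologic output of an
    SLB design alternative and combine the scores with importance weights."""
    return sum(w * score_increasing(design[c], lo, hi)
               for c, (w, lo, hi) in criteria.items())
```

Criteria where lower is better (e.g., percolation through the cover) would use a decreasing scoring function, i.e. `1 - score_increasing(...)`. Competing designs then get comparable total scores, which is what makes the evaluation rapid and repeatable.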
Cognitive mapping tools: review and risk management needs.
Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor
2012-08-01
Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding the perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models (decision-analysis-based mental modeling, concept mapping, and semantic web analysis) and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.
2016-12-01
As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources that effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merge topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.
Use of Knowledge Base Systems (EMDS) in Strategic and Tactical Forest Planning
NASA Astrophysics Data System (ADS)
Jensen, M. E.; Reynolds, K.; Stockmann, K.
2008-12-01
The USDA Forest Service 2008 Planning Rule requires Forest plans to provide a strategic vision for maintaining the sustainability of ecological, economic, and social systems across USFS lands through the identification of desired conditions and objectives. In this paper we show how knowledge-based systems can be efficiently used to evaluate disparate natural resource information to assess desired conditions and related objectives in Forest planning. We use the Ecosystem Management Decision Support (EMDS) system (http://www.institute.redlands.edu/emds/), which facilitates development of both logic-based models for evaluating ecosystem sustainability (desired conditions) and decision models to identify priority areas for integrated landscape restoration (objectives). The study area for our analysis spans 1,057 subwatersheds within western Montana and northern Idaho. Results of our study suggest that knowledge-based systems such as EMDS are well suited to both strategic and tactical planning and that the following points merit consideration in future National Forest (and other land management) planning efforts: 1) Logic models provide a consistent, transparent, and reproducible method for evaluating broad propositions about ecosystem sustainability such as: are watershed integrity, ecosystem and species diversity, social opportunities, and economic integrity in good shape across a planning area? The ability to evaluate such propositions in a formal logic framework also allows users the opportunity to evaluate statistical changes in outcomes over time, which could be very useful for regional and national reporting purposes and for addressing litigation; 2) The use of logic and decision models in strategic and tactical Forest planning provides a repository for expert knowledge (corporate memory) that is critical to the evaluation and management of ecosystem sustainability over time. 
This is especially true for the USFS and other federal resource agencies, which are likely to experience rapid turnover in tenured resource specialist positions within the next five years due to retirements; 3) Use of logic model output in decision models is an efficient method for synthesizing the typically large amounts of information needed to support integrated landscape restoration. Moreover, use of logic and decision models to design customized scenarios for integrated landscape restoration, as we have demonstrated with EMDS, offers substantial improvements to traditional GIS-based procedures such as suitability analysis. To our knowledge, this study represents the first attempt to link evaluations of desired conditions for ecosystem sustainability in strategic planning to tactical planning regarding the location of subwatersheds that best meet the objectives of integrated landscape restoration. The basic knowledge-based approach implemented in EMDS, with its logic (NetWeaver) and decision (Criterion Decision Plus) engines, is well suited both to multi-scale strategic planning and to multi-resource tactical planning.
Good modeling practice guidelines for applying multimedia models in chemical assessments.
Buser, Andreas M; MacLeod, Matthew; Scheringer, Martin; Mackay, Don; Bonnell, Mark; Russell, Mark H; DePinto, Joseph V; Hungerbühler, Konrad
2012-10-01
Multimedia mass balance models of chemical fate in the environment have been used for over 3 decades in a regulatory context to assist decision making. As these models become more comprehensive, reliable, and accepted, there is a need to recognize and adopt principles of Good Modeling Practice (GMP) to ensure that multimedia models are applied with transparency and adherence to accepted scientific principles. We propose and discuss 6 principles of GMP for applying existing multimedia models in a decision-making context, namely 1) specification of the goals of the model assessment, 2) specification of the model used, 3) specification of the input data, 4) specification of the output data, 5) conduct of a sensitivity and possibly also uncertainty analysis, and finally 6) specification of the limitations and limits of applicability of the analysis. These principles are justified and discussed with a view to enhancing the transparency and quality of model-based assessments. Copyright © 2012 SETAC.
Kawano, Shingo; Komai, Yoshinobu; Ishioka, Junichiro; Sakai, Yasuyuki; Fuse, Nozomu; Ito, Masaaki; Kihara, Kazunori; Saito, Norio
2016-10-01
The aim of this study was to determine risk factors for survival after retrograde placement of ureteral stents and to develop a prognostic model for advanced gastrointestinal tract (GIT: esophagus, stomach, colon and rectum) cancer patients. We examined the clinical records of 122 patients who underwent retrograde placement of a ureteral stent for malignant extrinsic ureteral obstruction. A prediction model for survival after stenting was developed. We compared its clinical usefulness with our previous model, based on the results from nephrostomy cases, by decision curve analysis. Median follow-up was 201 days (range 8-1490) and 97 deaths occurred. The 1-year survival rate in this cohort was 29%. On multivariate analysis, a primary site of colon origin, absence of retroperitoneal lymph node metastasis, and serum albumin >3 g/dL were significantly associated with prolonged survival. To develop a prognostic model, we divided the patients into three risk groups: favorable (0-1 risk factors, n=53), intermediate (2 risk factors, n=54), and poor (3 risk factors, n=15). There were significant differences in the survival profiles of these three risk groups (P<0.0001). Decision curve analyses revealed that the current model offers a greater net benefit than our previous model for most of the examined probabilities. We have developed a novel prognostic model for GIT cancer patients treated with retrograde placement of a ureteral stent. The current model should help urologists and medical oncologists predict survival in cases of malignant extrinsic ureteral obstruction.
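The three-factor stratification described in this abstract amounts to counting adverse factors and mapping the count to a risk group. A minimal sketch, assuming one point per adverse factor (function name and argument names are hypothetical, paraphrased from the abstract):

```python
def risk_group(colon_primary, rp_node_metastasis, albumin_g_dl):
    """Count adverse factors and map to the abstract's three risk groups.

    Adverse factors (one point each): non-colon primary site, presence of
    retroperitoneal lymph node metastasis, and serum albumin <= 3 g/dL.
    """
    score = (not colon_primary) + rp_node_metastasis + (albumin_g_dl <= 3.0)
    if score <= 1:
        return "favorable"      # 0-1 adverse factors
    elif score == 2:
        return "intermediate"   # 2 adverse factors
    return "poor"               # all 3 adverse factors
```

For example, a patient with a colon primary, no retroperitoneal nodal disease, and albumin of 3.5 g/dL would fall into the favorable group.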
Advanced Computational Framework for Environmental Management ZEM, Version 1.x
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin
2016-11-04
Environmental management problems typically require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models of varying complexity for various types of model analyses, where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and models are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed ZEM, an integrated framework for real-time data and model analyses for environmental decision-making. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced optimization methods capable of dealing with a large number of unknown model parameters, and surrogate (reduced-order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
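The surrogate (reduced-order) modeling idea mentioned in the abstract can be illustrated with a small kernel-based regressor. ZEM itself uses support vector regression in Julia; the sketch below is not ZEM's API but a dependency-free stand-in (an RBF kernel ridge regression) that conveys the same principle: replace an expensive simulator with a cheap data-driven approximation fitted to a handful of simulator runs. All names and data here are hypothetical:

```python
import numpy as np

def expensive_model(x):
    # Stand-in for a costly environmental simulation run.
    return np.sin(3 * x) + 0.5 * x

def rbf_kernel(a, b, gamma=10.0):
    # Radial basis function kernel between two 1-D sample arrays.
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

# Fit the surrogate on a small number of simulator evaluations.
x_train = np.linspace(0.0, 1.0, 15)
y_train = expensive_model(x_train)
lam = 1e-6                                   # ridge regularization
K = rbf_kernel(x_train, x_train)
alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)

def surrogate(x_new):
    # Cheap prediction in place of running the expensive model.
    return rbf_kernel(x_new, x_train) @ alpha
```

Once fitted, `surrogate` can be queried thousands of times inside an optimization or decision analysis loop at negligible cost, which is the role surrogates play in frameworks like ZEM.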
NASA Astrophysics Data System (ADS)
Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Chang, Kyung Hwan; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie
2017-08-01
The aim of this study was to develop an integrated text-based data mining and toxicity prediction modeling system for a clinical decision support system based on big data in radiation oncology, as a preliminary study. Structured data were prepared from treatment plans, while unstructured dose-volume data for prostate cancer were extracted by image pattern recognition from research articles crawled from the internet. We modeled an artificial neural network to build a predictor system for toxicity of organs at risk. We used a text-based data mining approach to build the artificial neural network model for bladder and rectum complication predictions. The pattern recognition method was used to mine the unstructured dose-volume toxicity data with a detection accuracy of 97.9%. The confusion matrix and training model of the neural network were obtained with 50 modeled plans (n = 50) for validation. The toxicity level was analyzed and the risk factors for 25% bladder, 50% bladder, 20% rectum, and 50% rectum were calculated by the artificial neural network algorithm. As a result, 32 of the 50 modeled plans were predicted to cause complications, while 18 were predicted to be complication-free. We integrated data mining and a toxicity modeling method for toxicity prediction using prostate cancer cases. These results show that a preprocessing analysis using text-based data mining and prediction modeling can be extended to personalized patient treatment decision support based on big data.
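The kind of feedforward network used for such complication prediction can be sketched as follows. The study's actual architecture and data are not given in the abstract, so this is a minimal one-hidden-layer network trained on synthetic stand-in dose-volume features with an invented labeling rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 4 dose-volume features per plan, binary toxicity label.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # hypothetical labeling rule

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden tanh layer, trained by full-batch gradient descent on cross-entropy.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(500):
    H = np.tanh(X @ W1 + b1)                 # hidden activations
    p = sigmoid(H @ W2 + b2).ravel()         # predicted toxicity probability
    grad_out = (p - y)[:, None] / len(y)     # cross-entropy output gradient
    gW2 = H.T @ grad_out; gb2 = grad_out.sum(0)
    gH = grad_out @ W2.T * (1 - H ** 2)      # backprop through tanh
    gW1 = X.T @ gH; gb1 = gH.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Training-set accuracy of the fitted network.
H = np.tanh(X @ W1 + b1)
p = sigmoid(H @ W2 + b2).ravel()
accuracy = ((p > 0.5) == (y > 0.5)).mean()
```

In a real pipeline the features would come from the mined dose-volume data and the labels from recorded bladder/rectum complications, with held-out plans used for validation.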
NASA Astrophysics Data System (ADS)
Clark, Martyn; Essery, Richard
2017-04-01
When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., which approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., is there sufficient data to force and evaluate models). Consequently, the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. 
Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.
Neural systems analysis of decision making during goal-directed navigation.
Penner, Marsha R; Mizumori, Sheri J Y
2012-01-01
The ability to make adaptive decisions during goal-directed navigation is a fundamental and highly evolved behavior that requires continual coordination of perceptions, learning and memory processes, and the planning of behaviors. Here, a neurobiological account for such coordination is provided by integrating current literatures on spatial context analysis and decision-making. This integration includes discussions of our current understanding of the role of the hippocampal system in experience-dependent navigation, how hippocampal information comes to impact midbrain and striatal decision making systems, and finally the role of the striatum in the implementation of behaviors based on recent decisions. These discussions extend across cellular to neural systems levels of analysis. Not only are key findings described, but also fundamental organizing principles within and across neural systems, as well as between neural systems functions and behavior, are emphasized. It is suggested that studying decision making during goal-directed navigation is a powerful model for studying interactive brain systems and their mediation of complex behaviors. Copyright © 2011. Published by Elsevier Ltd.
A Decision Fusion Framework for Treatment Recommendation Systems.
Mei, Jing; Liu, Haifeng; Li, Xiang; Xie, Guotong; Yu, Yiqin
2015-01-01
Treatment recommendation is a nontrivial task: it requires not only domain knowledge from evidence-based medicine, but also data insights from descriptive, predictive, and prescriptive analysis. A single treatment recommendation system is usually trained or modeled with a source of limited size or quality. This paper proposes a decision fusion framework, combining both knowledge-driven and data-driven decision engines for treatment recommendation. End users (e.g. on a clinician workstation or mobile app) get a comprehensive view of the various engines' opinions, as well as the final decision after fusion. For implementation, we leverage several well-known fusion algorithms, such as decision templates and meta classifiers (e.g. logistic regression and SVM). Using an outcome-driven evaluation metric, we compare the fusion engine with the base engines, and our experimental results show that decision fusion is a promising way towards more valuable treatment recommendations.
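The fusion step can be illustrated with a minimal sketch that combines per-treatment scores from a knowledge-driven and a data-driven engine by weighted soft vote. The paper's decision-template and meta-classifier methods are more elaborate; engine names, treatments, and weights below are hypothetical:

```python
def fuse_recommendations(engine_scores, weights):
    """Weighted soft-vote fusion of per-treatment scores from several engines.

    engine_scores: {engine_name: {treatment: score in [0, 1]}}
    weights:       {engine_name: non-negative reliability weight}
    Returns the fused score per treatment and the winning treatment.
    """
    treatments = {t for scores in engine_scores.values() for t in scores}
    total = sum(weights.values())
    fused = {
        t: sum(w * engine_scores[e].get(t, 0.0) for e, w in weights.items()) / total
        for t in treatments
    }
    best = max(fused, key=fused.get)
    return fused, best

# A knowledge-driven engine favors drug_A; a data-driven engine favors drug_B.
scores = {
    "guideline_engine": {"drug_A": 0.9, "drug_B": 0.4},
    "predictive_engine": {"drug_A": 0.3, "drug_B": 0.8},
}
fused, best = fuse_recommendations(
    scores, {"guideline_engine": 0.6, "predictive_engine": 0.4}
)
```

With these (invented) weights the guideline engine's opinion dominates, but both engines' scores remain visible to the end user, mirroring the "comprehensive view" the framework aims for.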
Models-3 is a flexible system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric...
NASA Astrophysics Data System (ADS)
Li, Qi
As a potential substitute for petroleum-based fuels, second-generation biofuels are playing an increasingly important role due to their economic, environmental, and social benefits. With the rapid development of the biofuel industry, there has been a growing literature on techno-economic analysis and supply chain design for biofuel production based on a variety of production pathways. A recently proposed production pathway for advanced biofuel is to convert biomass to bio-oil at widely distributed small-scale fast pyrolysis plants, then gasify the bio-oil to syngas and upgrade the syngas to transportation fuels in a centralized biorefinery. This thesis investigates two types of assessments of this bio-oil gasification pathway: techno-economic analysis based on process modeling and literature data, and supply chain design with a focus on optimal decisions for the number of facilities to build, facility capacities, and logistic decisions under uncertainty. A detailed process model with corn stover as feedstock and liquid fuels as the final products is presented. Techno-economic analysis of the bio-oil gasification pathway is also discussed to assess economic feasibility. Preliminary results show a capital investment of $438 million and a minimum fuel selling price (MSP) of $5.6 per gallon of gasoline equivalent. The sensitivity analysis finds that MSP is most sensitive to the internal rate of return (IRR), biomass feedstock cost, and fixed capital cost. A two-stage stochastic program is formulated to solve the supply chain design problem considering uncertainties in biomass availability, technology advancement, and biofuel price. The first stage makes the capital investment decisions, including the locations and capacities of the decentralized fast pyrolysis plants and the centralized biorefinery, while the second stage determines the biomass and biofuel flows. 
The numerical results and case study illustrate that considering uncertainties can be pivotal in this supply chain design and optimization problem. Also, farmers' participation has a significant effect on the decision making process.
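The two-stage structure described above can be sketched on a toy instance: the first stage fixes a plant capacity before biomass supply is known, and the second stage chooses how much to process under each realized scenario. A scenario-enumeration sketch with all numbers invented for illustration (the thesis's actual model is a far larger mixed-integer program):

```python
# Second-stage scenarios: (probability, available biomass in kilotonnes).
scenarios = [(0.3, 60.0), (0.5, 100.0), (0.2, 140.0)]
capital_cost_per_kt = 5.0     # first-stage cost per unit of built capacity
profit_per_kt = 8.0           # second-stage margin per unit actually processed

def expected_profit(capacity):
    # Second stage (recourse): process min(capacity, supply) in each scenario.
    recourse = sum(p * profit_per_kt * min(capacity, supply)
                   for p, supply in scenarios)
    return recourse - capital_cost_per_kt * capacity

# First stage: pick the capacity maximizing expected profit (grid search here;
# real instances use a stochastic-programming solver).
best_capacity = max(range(0, 201, 10), key=expected_profit)
```

Note how the optimum hedges between scenarios: building for the highest-supply scenario is not worthwhile because the extra capacity is only used with low probability.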
NASA Astrophysics Data System (ADS)
Haer, Toon; Botzen, Wouter; de Moel, Hans; Aerts, Jeroen
2015-04-01
In the period 1998-2009, floods triggered roughly 52 billion euro in insured economic losses, making floods the most costly natural hazard in Europe. Climate change and socioeconomic trends are expected to further aggravate flood losses in many regions. Research shows that flood risk can be significantly reduced if households install protective measures, and that the implementation of such measures can be stimulated through flood insurance schemes and subsidies. However, the effectiveness of such incentives to stimulate implementation of loss-reducing measures greatly depends on the decision processes of individuals and has hardly been studied. In our study, we developed an agent-based model that integrates flood damage models, insurance mechanisms, subsidies, and household behaviour models to assess the effectiveness of different economic tools in stimulating households to invest in loss-reducing measures. Since this effectiveness depends on the decision-making process of individuals, the study compares different household decision models, ranging from standard economic models, to economic models of decision making under risk, to more complex decision models integrating economic models with risk perceptions, opinion dynamics, and the influence of flood experience. The results show the effectiveness of incentives to stimulate investment in loss-reducing measures for different household behaviour types under climate change scenarios. They show how complex decision models can better reproduce observed real-world behaviour than traditional economic models. Furthermore, since flood events are included in the simulations, the results provide an analysis of the dynamics in insured and uninsured losses for households, the costs of reducing risk by implementing loss-reducing measures, the capacity of the insurance market, and the cost of government subsidies under different scenarios. The model has been applied to the City of Rotterdam in The Netherlands.
NASA Astrophysics Data System (ADS)
Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza
2012-06-01
It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision-making in structural health monitoring systems. It is arguably true, nonetheless, that decision-level fusion is equally beneficial when applied to integrated health monitoring systems. Several decisions at low levels of abstraction may be produced by different decision-makers; however, decision-level fusion is required at the final stage of the process to provide an accurate assessment of the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. Thorough understanding of the characteristics of the decision-fusion methodologies is a crucial step for successful implementation of such decision-fusion systems. In this paper, we first present the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial intelligence-based methods. The theoretical basis and characteristics of these methodologies are explained and their performances analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms, are selected and their applications are extended to decision fusion. Finally, fusion algorithms are developed based on the selected fusion methods and their performance is tested on decisions generated from synthetic data and from experimental data. Also in this paper, a modeling methodology, i.e. the cloud model, for generating synthetic decisions is presented and used. Using the cloud model, both types of uncertainty involved in real decision-making, randomness and fuzziness, are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide a fair performance comparison of the selected decision-fusion algorithms. 
For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are reported in this paper.
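Of the fusion methods named in this abstract, Dempster's rule of combination is compact enough to sketch. A hedged example with two decision sources assigning mass over hypothetical health states (the paper's own implementation and data are not shown in the abstract):

```python
def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts of frozenset -> mass)
    using Dempster's rule, normalizing out the conflicting mass."""
    fused, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                fused[inter] = fused.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2          # mass falling on disjoint sets
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    return {s: v / (1.0 - conflict) for s, v in fused.items()}

theta = frozenset({"damaged", "healthy"})    # frame of discernment
# Two monitors each partially support "damaged", leaving the rest to ignorance.
m1 = {frozenset({"damaged"}): 0.6, theta: 0.4}
m2 = {frozenset({"damaged"}): 0.7, theta: 0.3}
fused = dempster_combine(m1, m2)
```

Here two moderately confident sources reinforce each other: the combined mass on "damaged" (0.88) exceeds either source's individual mass, which is the behavior that makes evidential fusion attractive for decision-level combination.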
IT vendor selection model by using structural equation model & analytical hierarchy process
NASA Astrophysics Data System (ADS)
Maitra, Sarit; Dominic, P. D. D.
2012-11-01
Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can undermine an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process to support better decision making. The proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model has been designed after a thorough literature study. The proposed hybrid model will be applied in a real-life case study to assess its effectiveness. In addition, the what-if analysis technique will be used for model validation.
Broekhuizen, Henk; IJzerman, Maarten J; Hauber, A Brett; Groothuis-Oudshoorn, Catharina G M
2017-03-01
The need for patient engagement has been recognized by regulatory agencies, but there is no consensus about how to operationalize this. One approach is the formal elicitation and use of patient preferences for weighing clinical outcomes. The aim of this study was to demonstrate how patient preferences can be used to weigh clinical outcomes when both preferences and clinical outcomes are uncertain, by applying a probabilistic value-based multi-criteria decision analysis (MCDA) method. Probability distributions were used to model random variation and parameter uncertainty in preferences, and parameter uncertainty in clinical outcomes. The posterior value distributions and rank probabilities for each treatment were obtained using Monte Carlo simulations. The probability of achieving the first rank is the probability that a treatment represents the highest value to patients. We illustrated our methodology with a simplified case on six HIV treatments. Preferences were modeled with normal distributions and clinical outcomes were modeled with beta distributions. The treatment value distributions showed the rank order of treatments according to patients and illustrate the remaining decision uncertainty. This study demonstrated how patient preference data can be used to weigh clinical evidence using MCDA. The model takes into account uncertainty in preferences and clinical outcomes. The model can support decision makers during the aggregation step of the MCDA process and provides a first step toward preference-based personalized medicine, yet requires further testing regarding its appropriate use in real-world settings.
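The Monte Carlo step this abstract describes can be sketched directly: preference weights drawn from normal distributions, clinical outcomes from beta distributions, treatment values computed per draw, and first-rank probabilities estimated from the simulated value distributions. All distribution parameters below are invented for illustration (the study used six HIV treatments; three suffice here):

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 20_000

# Preference weights for two criteria (normal), clipped and renormalized per draw.
w = rng.normal(loc=[0.6, 0.4], scale=0.05, size=(n_sim, 2))
w = np.clip(w, 0.0, None)
w /= w.sum(axis=1, keepdims=True)

# Clinical outcomes for three treatments on the two criteria (beta distributions).
treatments = {
    "T1": (rng.beta(80, 20, n_sim), rng.beta(30, 70, n_sim)),
    "T2": (rng.beta(60, 40, n_sim), rng.beta(60, 40, n_sim)),
    "T3": (rng.beta(40, 60, n_sim), rng.beta(80, 20, n_sim)),
}

# Weighted value of each treatment in every simulated draw.
values = np.column_stack(
    [w[:, 0] * o1 + w[:, 1] * o2 for o1, o2 in treatments.values()]
)

# Probability that each treatment ranks first across the simulations.
first = values.argmax(axis=1)
rank1_prob = {name: (first == i).mean() for i, name in enumerate(treatments)}
```

The spread of `rank1_prob` across treatments is the "remaining decision uncertainty" the authors refer to: a treatment with the highest mean value may still rank first in well under 100% of draws.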
NASA Astrophysics Data System (ADS)
Butchart-Kuhlmann, Daniel; Kralisch, Sven; Meinhardt, Markus; Fleischer, Melanie
2017-04-01
Assessing the quantity and quality of water available in water-stressed environments under various potential climate and land-use changes is necessary for good water and environmental resources management and governance. Within the region covered by the Southern African Science Service Centre for Climate Change and Adaptive Land Management (SASSCAL) project, such areas are common. One goal of the SASSCAL project is to develop and provide an integrated decision support system (DSS) with which decision makers (DMs) within a given catchment can obtain objective information regarding potential changes in water flow quantity and timing. The SASSCAL DSS builds upon existing data storage and distribution capability, through the SASSCAL Information System (IS), as well as the J2000 hydrological model. Using output from validated J2000 models, the SASSCAL DSS incorporates the calculation of a range of hydrological indicators based upon Indicators of Hydrological Alteration/Environmental Flow Components (IHA/EFC), calculated for a historic time series (pre-impact) and a set of model simulations based upon a selection of possible climate and land-use change scenarios (post-impact). These indicators, obtained using the IHA software package, are then used as input for a multi-criteria decision analysis (MCDA) undertaken using the open-source diviz software package. The results of these analyses will provide DMs with an indication of how various hydrological indicators within a catchment may be altered under different future scenarios, as well as providing a ranking of the scenarios according to different DM preferences. Scenarios are represented through a combination of model input data and parameter settings in J2000, and preferences are represented through criteria weighting in the MCDA. 
Here, the methodology is presented and applied to the J2000 Luanginga model results using a set of hypothetical decision maker preference values as input for an MCDA based on the PROMETHEE II outranking method. Future work on the SASSCAL DSS will entail automation of this process, as well as its application to other hydrological models and land-use and/or climate change scenarios.
Willis, Michael; Persson, Ulf; Zoellner, York; Gradl, Birgit
2010-01-01
Value-based pricing (VBP), whereby prices are set according to the perceived benefits offered to the consumer at a time when costs and benefits are characterized by considerable uncertainty and are then reviewed ex post, is a much discussed topic in pharmaceutical reimbursement. It is usually combined with coverage with evidence development (CED), a tool in which manufacturers are granted temporary reimbursement but are required to collect and submit additional health economic data at review. Many countries, including the UK, are signalling shifts in this direction. Several countries, including Sweden, have already adopted this approach and offer good insight into the benefits and pitfalls in actual practice. To describe VBP reimbursement decision making using CED in actual practice in Sweden. Decision making by The Dental and Pharmaceutical Benefits Agency (TLV) in Sweden was reviewed using a case study of continuous intraduodenal infusion of levodopa/carbidopa (Duodopa®) in the treatment of advanced Parkinson's disease (PD) with severe motor fluctuations. The manufacturer of Duodopa® applied for reimbursement in late 2003. While the proper economic data were not included in the submission, TLV granted reimbursement until early 2005 to provide time for the manufacturer to submit a formal economic evaluation. The re-submission with economic data was considered inadequate to judge cost effectiveness, so TLV granted an additional extension of reimbursement until August 2007, at which time conclusive data were expected. The manufacturer initiated a 3-year, prospective health economic study and a formal economic model. Data from a pre-planned interim analysis of the data were loaded into the model and the cost-effectiveness ratio was the basis of the next re-submission. 
TLV concluded that the data were suitable for making a definite decision and that the drug was not cost effective, deciding to discontinue reimbursement for any new patients (current patients were unaffected). The manufacturer continued to collect data and to improve the economic model and re-submitted in 2008. New data and the improved model resulted in reduced uncertainty and a lower cost-effectiveness ratio in the range of Swedish kronor (SEK)430,000 per QALY gained in the base-case analysis, ranging up to SEK900,000 in the most conservative sensitivity analysis, resulting in reimbursement being granted. The case of Duodopa® provides excellent insight into VBP reimbursement decision making in combination with CED and ex post review in actual practice. Publicly available decisions document the rigorous, time-consuming process (four iterations were required before a final decision could be reached). The data generated as part of the risk-sharing agreement proved correct the initial decision to grant limited coverage despite lack of economic data. Access was provided to 100 patients while evidence was generated. Economic appraisal differs from clinical assessment, and decision makers benefit from analysis of naturalistic, actual practice data. Despite reviewing the initial trial-based, 'piggy-back' economic analysis, TLV was uncertain of the cost effectiveness in actual practice and deferred a final decision until observational data from the DAPHNE study became available. Second, acceptance of economic modelling and use of temporary reimbursement conditional on additional evidence development provide a mechanism for risk sharing between TLV and manufacturers, which enabled patient access to a drug with proven clinical benefit while necessary evidence to support claims of cost effectiveness could be generated.
Of goals and habits: age-related and individual differences in goal-directed decision-making.
Eppinger, Ben; Walter, Maik; Heekeren, Hauke R; Li, Shu-Chen
2013-01-01
In this study we investigated age-related and individual differences in habitual (model-free) and goal-directed (model-based) decision-making. Specifically, we were interested in three questions. First, does age affect the balance between model-based and model-free decision mechanisms? Second, are these age-related changes due to age differences in working memory (WM) capacity? Third, can model-based behavior be affected by manipulating the distinctiveness of the reward value of choice options? To answer these questions we used a two-stage Markov decision task in combination with computational modeling to dissociate model-based and model-free decision mechanisms. To affect model-based behavior in this task we manipulated the distinctiveness of reward probabilities of choice options. The results show age-related deficits in model-based decision-making, which are particularly pronounced if unexpected reward indicates the need for a shift in decision strategy. In this situation younger adults explore the task structure, whereas older adults show perseverative behavior. Consistent with previous findings, these results indicate that older adults have deficits in the representation and updating of expected reward value. We also observed substantial individual differences in model-based behavior. In younger adults high WM capacity is associated with greater model-based behavior and this effect is further elevated when reward probabilities are more distinct. However, in older adults we found no effect of WM capacity. Moreover, age differences in model-based behavior remained statistically significant, even after controlling for WM capacity. Thus, factors other than decline in WM, such as deficits in the integration of expected reward value into strategic decisions, may contribute to the observed impairments in model-based behavior in older adults.
DOT National Transportation Integrated Search
1996-11-01
The Highway Economic Requirements System (HERS) is a computer model designed to simulate improvement selection decisions based on the relative benefit-cost merits of alternative improvement options. HERS is intended to estimate national level investm...
The Three Gorges Project: How sustainable?
NASA Astrophysics Data System (ADS)
Kepa Brian Morgan, Te Kipa; Sardelic, Daniel N.; Waretini, Amaria F.
2012-08-01
In 1984 the Government of China approved the decision to construct the Three Gorges Dam Project, the largest project since the Great Wall. The project had many barriers to overcome, and the decision was made at a time when sustainability was a relatively unknown concept. The decision to construct the Three Gorges Project remains contentious today, especially since Deputy Director of the Three Gorges Project Construction Committee, Wang Xiaofeng, stated that "We absolutely cannot relax our guard against ecological and environmental security problems sparked by the Three Gorges Project" (Bristow, 2007; McCabe, 2007). The question therefore was posed: how sustainable is the Three Gorges Project? Conventional approaches to sustainability assessment tend to use monetary based assessment aligned to triple bottom line thinking. That is, projects are evaluated as trade-offs between economic, environmental and social costs and benefits. The question of sustainability is considered using such a traditional Cost-Benefit Analysis approach, as undertaken in 1988 by a CIPM-Yangtze Joint Venture, and the Mauri Model Decision Making Framework (MMDMF). The Mauri Model differs from other approaches in that sustainability performance indicators are considered independently from any particular stakeholder bias. Bias is then introduced subsequently as a sensitivity analysis on the raw results obtained. The MMDMF is unique in that it is based on the Māori concept of Mauri, the binding force between the physical and the spiritual attributes of something, or the capacity to support life in the air, soil, and water. This concept of Mauri is analogous to the Chinese concept of Qi, and there are many analogous concepts in other cultures. It is the universal relevance of Mauri that allows its use to assess sustainability. This research identified that the MMDMF was a strong complement to Cost-Benefit Analysis, which is not designed as a sustainability assessment tool in itself. 
The MMDMF does have relevance in identifying areas of conflict, and it can support the Cost-Benefit Analysis in assessing sustainability, as a Decision Support Tool. The research concluded that, based on both models, the Three Gorges Project as understood in 1988, and incorporating more recent sustainability analysis is contributing to enhanced sustainability.
NASA Astrophysics Data System (ADS)
Huang, Wei; Zhang, Xingnan; Li, Chenming; Wang, Jianying
Management of group decision-making is an important issue in water resources management. To overcome the lack of effective communication and cooperation in existing decision-making models, this paper proposes a multi-layer dynamic model for coordinating group decision-making in water resource allocation and scheduling. By introducing a scheme-recognized cooperative satisfaction index and a scheme-adjusted rationality index, the proposed model resolves the poor convergence of the multi-round decision-making process in water resource allocation and scheduling. Furthermore, the coordination problem in group decision-making over limited resources can be addressed through distance-based resolution of conflicts within the group. The simulation results show that the proposed model converges better than existing models.
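A satisfaction index of the kind described above can be illustrated with a minimal distance-based sketch: each expert's satisfaction is taken as one minus the normalized distance between their preference vector and the collective preference, and another negotiation round is triggered only for experts below a threshold. The index form, threshold, and all numbers are assumptions for illustration, not the paper's definitions:

```python
def collective(preferences):
    """Collective preference as the component-wise mean of expert vectors."""
    n, k = len(preferences), len(preferences[0])
    return [sum(p[j] for p in preferences) / n for j in range(k)]

def satisfaction(pref, group):
    """1 - mean absolute deviation; preferences assumed normalized to [0, 1]."""
    k = len(pref)
    return 1 - sum(abs(a - b) for a, b in zip(pref, group)) / k

prefs = [[0.6, 0.4], [0.5, 0.5], [0.7, 0.3]]   # three experts, two schemes
g = collective(prefs)
scores = [satisfaction(p, g) for p in prefs]
# iterate another round only for experts whose satisfaction is below threshold
needs_adjustment = [i for i, s in enumerate(scores) if s < 0.95]
```

Raising the threshold forces more rounds of adjustment; convergence of such a loop is exactly the property the paper's indices are designed to guarantee.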
Extraction of decision rules via imprecise probabilities
NASA Astrophysics Data System (ADS)
Abellán, Joaquín; López, Griselda; Garach, Laura; Castellano, Javier G.
2017-05-01
Data analysis techniques can be applied to discover important relations among features. This is the main objective of the Information Root Node Variation (IRNV) technique, a new method to extract knowledge from data via decision trees. The decision trees used by the original method were built using classic split criteria. The performance of new split criteria based on imprecise probabilities and uncertainty measures, called credal split criteria, differs significantly from the performance obtained using the classic criteria. This paper extends the IRNV method using two credal split criteria: one based on a mathematical parametric model, and the other based on a non-parametric model. The performance of the method is analyzed using a case study of traffic accident data to identify patterns related to the severity of an accident. We found that a larger number of rules is generated, significantly supplementing the information obtained using the classic split criteria.
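One widely used credal split criterion is based on the Imprecise Dirichlet Model (IDM): with s pseudo-observations, each class probability becomes an interval, and a split is scored by the maximum entropy attainable inside that credal set. The sketch below (binary case only; function names and s = 1 are illustrative assumptions, not the paper's exact criterion) shows why even a pure split retains residual uncertainty:

```python
import math

def idm_intervals(counts, s=1.0):
    """IDM probability intervals [n_i/(N+s), (n_i+s)/(N+s)] per class."""
    n = sum(counts)
    return [(c / (n + s), (c + s) / (n + s)) for c in counts]

def max_entropy_binary(counts, s=1.0):
    """Upper entropy over the credal set; for two classes, entropy is
    maximized at the feasible probability closest to 0.5."""
    lo0, hi0 = idm_intervals(counts, s)[0]
    p = min(max(0.5, lo0), hi0)
    q = 1 - p
    return -sum(x * math.log2(x) for x in (p, q) if x > 0)

# Classic entropy of a pure node [8, 0] is 0, but the credal score is not:
h = max_entropy_binary([8, 0])   # evaluated at p = 8/9
```

This penalization of small samples is what makes credal criteria behave differently from classic entropy or Gini splits.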
Dolan, James G.
2010-01-01
Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients and health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine "hard data" with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP). PMID:21394218
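The analytic hierarchy process (AHP) mentioned above derives criterion weights from a pairwise comparison matrix, conventionally via its principal eigenvector. A minimal sketch using power iteration (the comparison judgments below are invented for illustration):

```python
def ahp_weights(m, iters=100):
    """Approximate the principal eigenvector of a pairwise comparison
    matrix by power iteration, normalized to sum to 1."""
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# "Criterion A is 3x as important as B and 5x as important as C", etc.
# Reciprocal entries encode the reverse comparisons.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_weights(pairwise)   # dominant weight goes to criterion A
```

In a clinical setting the criteria might be efficacy, side effects, and cost; the resulting weights then score each management option.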
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry of over 10,000 plating plants nationwide is one of the major waste generators in the industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously for waste treatment and disposal and hinders the further development of the industry. It becomes, therefore, an urgent need for the industry to identify technically most effective and economically most attractive methodologies and technologies to minimize the waste, while the production competitiveness can be still maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, and an intelligent decision support tool. The WM methodology consists of two parts: the heuristic knowledge-based qualitative WM decision analysis and support methodology and fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation that are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. 
The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
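The rinsing analysis described above can be illustrated with a standard counter-current rinse approximation: at steady state, an n-stage rinse dilutes drag-out roughly by the rinse ratio R = Q/D per stage. The model form, function names, and all numbers below are assumptions for illustration, not the dissertation's validated models:

```python
def rinse_concentration(c_bath, rinse_ratio, stages):
    """Approximate steady-state concentration leaving the last rinse tank."""
    return c_bath / (rinse_ratio ** stages)

def water_needed(c_bath, c_target, dragout_lpm, stages):
    """Rinse water flow (L/min) needed to hit a target rinse concentration."""
    ratio = (c_bath / c_target) ** (1.0 / stages)
    return ratio * dragout_lpm

# 100 g/L bath, 0.05 L/min drag-out, target 0.01 g/L in the final rinse:
two_stage = water_needed(100.0, 0.01, 0.05, 2)
three_stage = water_needed(100.0, 0.01, 0.05, 3)
```

The sharp drop in required water when adding a rinse stage is the classic waste-minimization lever such process models are used to quantify.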
Nash Equilibria in Theory of Reasoned Action
NASA Astrophysics Data System (ADS)
Almeida, Leando; Cruz, José; Ferreira, Helena; Pinto, Alberto Adrego
2009-08-01
Game theory and Decision Theory have been applied to many different areas such as Physics, Economics, Biology, etc. In its application to Psychology, we introduce into the literature a Game Theoretical Model of Planned Behavior or Reasoned Action by establishing an analogy between two specific theories. In this study we take into account that individual decision-making is an outcome of a process where group decisions can determine individual probabilistic behavior. Using Game Theory concepts, we describe how intentions can be transformed into behavior; according to the Nash Equilibrium, this process will correspond to the best individual decision/response taking into account the collective response. This analysis can be extended to several examples based on the Game Theoretical Model of Planned Behavior or Reasoned Action.
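The equilibrium idea above, each individual's best response given the collective response, can be sketched as a pure-strategy Nash check for a two-player game. The payoff numbers are invented; this is a generic illustration, not the paper's model:

```python
def pure_nash(payoff_a, payoff_b):
    """Enumerate cells where each player's action is a best response
    to the other's: payoff_a[i][j] is player A's payoff at (i, j)."""
    n, m = len(payoff_a), len(payoff_a[0])
    eq = []
    for i in range(n):
        for j in range(m):
            best_a = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n))
            best_b = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(m))
            if best_a and best_b:
                eq.append((i, j))
    return eq

# A coordination game: both players prefer matching behaviour,
# loosely analogous to intentions aligning with a group norm.
a = [[2, 0], [0, 1]]
b = [[2, 0], [0, 1]]
equilibria = pure_nash(a, b)
```

Both matched outcomes survive as equilibria, which mirrors the paper's point that group-conditioned behavior can settle into more than one stable pattern.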
Using Decision Structures for Policy Analysis in Software Product-line Evolution - A Case Study
NASA Astrophysics Data System (ADS)
Sarang, Nita; Sanglikar, Mukund A.
Project management decisions are the primary basis for project success (or failure). Such decisions are mostly based on an intuitive understanding of the underlying software engineering and management process, and are therefore liable to misjudgment. Our problem domain is product-line evolution. We model the dynamics of the process by incorporating feedback loops appropriate to two decision structures: staffing policy, and the forces of growth associated with long-term software evolution. The model is executable and helps project managers assess the long-term effects of possible actions. Our work also corroborates results from earlier studies of E-type systems, in particular the FEAST project and the rules for software evolution, planning and management.
1998-04-28
be discussed. 2.1 ECONOMIC REPLACEMENT THEORY: Decisions about heavy equipment should be made based on sound economic principles, not emotions... Life) will be less than L*. The converse is also true. 2.1.3 The Repair Limit Theory: A different way of looking at the economic replacement decision... Summary: Three different economic models have been reviewed in this section. The output of each is distinct. One seeks to minimize costs, one seeks to
ERIC Educational Resources Information Center
Sambodo, Leonardo A. A. T.; Nuthall, Peter L.
2010-01-01
Purpose: This study traced the origins of subsistence Farmers' technology adoption attitudes and extracted the critical elements in their decision making systems. Design/Methodology/Approach: The analysis was structured using a model based on the Theory of Planned Behaviour (TPB). The role of a "bargaining process" was particularly…
Two-Stage Fracturing Wastewater Management in Shale Gas Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiaodong; Sun, Alexander Y.; Duncan, Ian J.
2017-01-19
Here, management of shale gas wastewater treatment, disposal, and reuse has become a significant environmental challenge, driven by an ongoing boom in development of U.S. shale gas reservoirs. Systems-analysis based decision support is helpful for effective management of wastewater, and provision of cost-effective decision alternatives from a whole-system perspective. Uncertainties are inherent in many modeling parameters, affecting the generated decisions. In order to effectively deal with the recourse issue in decision making, in this work a two-stage stochastic fracturing wastewater management model, named TSWM, is developed to provide decision support for wastewater management planning in shale plays. Using the TSWM model, probabilistic and nonprobabilistic uncertainties are effectively handled. The TSWM model provides flexibility in generating shale gas wastewater management strategies, in which the first-stage decision predefined by decision makers before uncertainties are unfolded is corrected in the second stage to achieve the whole-system's optimality. Application of the TSWM model to a comprehensive synthetic example demonstrates its practical applicability and feasibility. Optimal results are generated for allowable wastewater quantities, excess wastewater, and capacity expansions of hazardous wastewater treatment plants to achieve the minimized total system cost. The obtained interval solutions encompass both optimistic and conservative decisions. Trade-offs between economic and environmental objectives are made depending on decision makers' knowledge and judgment, as well as site-specific information. In conclusion, the proposed model is helpful in forming informed decisions for wastewater management associated with shale gas development.
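The two-stage structure with recourse described above can be sketched as a toy scenario-enumeration problem: commit to a treatment capacity before the wastewater volume is known, then pay a pricier second-stage recourse (e.g. hauling) for any excess. All scenarios, costs, and names are invented for illustration and are not the TSWM model:

```python
# (probability, wastewater volume) scenarios revealed after stage one
SCENARIOS = [(0.3, 60.0), (0.5, 100.0), (0.2, 150.0)]
CAP_COST = 2.0       # $ per unit of pre-built treatment capacity
RECOURSE_COST = 7.0  # $ per unit of excess wastewater handled in stage two

def expected_cost(capacity):
    """First-stage capacity cost plus expected second-stage recourse cost."""
    recourse = sum(p * max(0.0, vol - capacity) * RECOURSE_COST
                   for p, vol in SCENARIOS)
    return CAP_COST * capacity + recourse

def best_capacity(candidates):
    """Here-and-now decision minimizing expected total cost."""
    return min(candidates, key=expected_cost)

cap = best_capacity(range(0, 201, 10))
```

The optimum covers the middle scenario but not the worst case: beyond that point, an extra unit of capacity costs more than the expected recourse it saves, which is exactly the trade-off a two-stage stochastic program formalizes.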
Diagnostic classification scheme in Iranian breast cancer patients using a decision tree.
Malehi, Amal Saki
2014-01-01
The objective of this study was to determine a diagnostic classification scheme using a decision tree based model. The study was conducted as a retrospective case-control study in Imam Khomeini hospital in Tehran during 2001 to 2009. Data, including demographic and clinical-pathological characteristics, were uniformly collected from 624 females, 312 of them referred with a positive diagnosis of breast cancer (cases) and 312 healthy women (controls). The decision tree was implemented to develop a diagnostic classification scheme using CART 6.0 Software. The AUC (area under the curve) was measured as the overall performance of diagnostic classification of the decision tree. Five variables as main risk factors of breast cancer and six subgroups as high risk were identified. The results indicated that increasing age, low age at menarche, single and divorced statuses, irregular menarche pattern and family history of breast cancer are the important diagnostic factors in Iranian breast cancer patients. The sensitivity and specificity of the analysis were 66% and 86.9% respectively. The high AUC (0.82) also showed an excellent classification and diagnostic performance of the model. The decision tree based model appears to be suitable for identifying risk factors and high or low risk subgroups. It can also assist clinicians in making a decision, since it can identify underlying prognostic relationships and the model is very explicit to understand.
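The reported sensitivity (66%) and specificity (86.9%) come directly from the classifier's confusion table. A minimal sketch of the computation; the cell counts below are illustrative values chosen to reproduce the reported rates, not the study's actual table:

```python
def sensitivity(tp, fn):
    """Fraction of true cases the classifier flags (true positive rate)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of true controls the classifier clears (true negative rate)."""
    return tn / (tn + fp)

# e.g. 206 of 312 cases flagged, 271 of 312 controls cleared:
sens = sensitivity(tp=206, fn=106)
spec = specificity(tn=271, fp=41)
```

The AUC of 0.82 summarizes the same trade-off across all classification thresholds rather than at a single operating point.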
Capoccia, Massimo; Marconi, Silvia; Singh, Sanjeet Avtaar; Pisanelli, Domenico M; De Lazzari, Claudio
2018-05-02
Modelling and simulation may become clinically applicable tools for detailed evaluation of the cardiovascular system and clinical decision-making to guide therapeutic intervention. Models based on the pressure-volume relationship and zero-dimensional representation of the cardiovascular system may be a suitable choice given their simplicity and versatility. This approach has great potential for application in heart failure, where left ventricular assist devices have played a significant role as a bridge to transplant and, more recently, as a long-term solution for non-eligible candidates. We sought to investigate the value of simulation in the context of three heart failure patients with a view to predicting or guiding further management. CARDIOSIM© was the software used for this purpose. The study was based on retrospective analysis of haemodynamic data previously discussed at a multidisciplinary meeting. The outcome of the simulations addressed the value of a more quantitative approach in the clinical decision process. Although previous experience, co-morbidities and the risk of potentially fatal complications play a role in clinical decision-making, patient-specific modelling may become a daily approach for selection and optimisation of device-based treatment for heart failure patients. Willingness to adopt this integrated approach may be the key to further progress.
Patterns of out-of-home placement decision-making in child welfare.
Chor, Ka Ho Brian; McClelland, Gary M; Weiner, Dana A; Jordan, Neil; Lyons, John S
2013-10-01
Out-of-home placement decision-making in child welfare is founded on the best interest of the child in the least restrictive setting. After a child is removed from home, however, little is known about the mechanism of placement decision-making. This study aims to systematically examine the patterns of out-of-home placement decisions made in a state's child welfare system by comparing two models of placement decision-making: a multidisciplinary team decision-making model and a clinically based decision support algorithm. Based on records of 7816 placement decisions representing 6096 children over a 4-year period, hierarchical log-linear modeling characterized concordance or agreement, and discordance or disagreement when comparing the two models and accounting for age-appropriate placement options. Children aged below 16 had an overall concordance rate of 55.7%, most apparent in the least restrictive (20.4%) and the most restrictive placement (18.4%). Older youth showed greater discordant distributions (62.9%). Log-linear analysis confirmed the overall robustness of concordance (odds ratios [ORs] range: 2.9-442.0), though discordance was most evident from small deviations from the decision support algorithm, such as one-level under-placement in group home (OR=5.3) and one-level over-placement in residential treatment center (OR=4.8). Concordance should be further explored using child-level clinical and placement stability outcomes. Discordance might be explained by dynamic factors such as availability of placements, caregiver preferences, or policy changes and could be justified by positive child-level outcomes. Empirical placement decision-making is critical to a child's journey in child welfare and should be continuously improved to effect positive child welfare outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.
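The odds ratios reported above (e.g. OR = 5.3 for one-level under-placement) are the basic quantities behind log-linear analysis of a 2x2 agreement table. A minimal sketch; the cell counts are invented, not the study's data:

```python
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table [[a, b], [c, d]]; a 0.5 continuity correction
    is applied only when some cell is zero, to keep the ratio finite."""
    if 0 in (a, b, c, d):
        a, b, c, d = (x + 0.5 for x in (a, b, c, d))
    return (a * d) / (b * c)

# hypothetical concordant vs discordant counts for two decision levels
table = (106, 20, 30, 60)
ratio = odds_ratio(*table)
```

A hierarchical log-linear model generalizes this to many decision levels at once, which is why the study can report a whole range of ORs (2.9-442.0) from one fit.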
Zhang, Jiaxiang; Rittman, Timothy; Nombela, Cristina; Fois, Alessandro; Coyle-Gilchrist, Ian; Barker, Roger A.; Hughes, Laura E.; Rowe, James B.
2016-01-01
Progressive supranuclear palsy and Parkinson's disease have distinct underlying neuropathology, but both diseases affect cognitive function in addition to causing a movement disorder. They impair response inhibition and may lead to impulsivity, which can occur even in the presence of profound akinesia and rigidity. The current study examined the mechanisms of cognitive impairments underlying disinhibition, using horizontal saccadic latencies that obviate the impact of limb slowness on executing response decisions. Nineteen patients with clinically diagnosed progressive supranuclear palsy (Richardson's syndrome), 24 patients with clinically diagnosed Parkinson's disease and 26 healthy control subjects completed a saccadic Go/No-Go task with a head-mounted infrared saccadometer. Participants were cued on each trial to make a pro-saccade to a horizontal target or withhold their responses. Both patient groups had impaired behavioural performance, with more commission errors than controls. Mean saccadic latencies were similar between all three groups. We analysed behavioural responses as a binary decision between Go and No-Go choices. By using Bayesian parameter estimation, we fitted a hierarchical drift-diffusion model to individual participants' single trial data. The model decomposes saccadic latencies into parameters for the decision process: decision boundary, drift rate of accumulation, decision bias, and non-decision time. In a leave-one-out three-way classification analysis, the model parameters provided better discrimination between patients and controls than raw behavioural measures. Furthermore, the model revealed disease-specific deficits in the Go/No-Go decision process. Both patient groups had slower drift rate of accumulation, and shorter non-decision time than controls. But patients with progressive supranuclear palsy were strongly biased towards a pro-saccade decision boundary compared to Parkinson's patients and controls. 
This indicates a prepotency of responding in combination with a reduction in further accumulation of evidence, which provides a parsimonious explanation for the apparently paradoxical combination of disinhibition and severe akinesia. The combination of the well-tolerated oculomotor paradigm and the sensitivity of the model-based analysis provides a valuable approach for interrogating decision-making processes in neurodegenerative disorders. The mechanistic differences underlying participants' poor performance were not observable from classical analysis of behavioural data, but were clearly revealed by modelling. These differences provide a rational basis on which to develop and assess new therapeutic strategies for cognition and behaviour in these disorders. PMID:26582559
Kimber, Melissa; Couturier, Jennifer; Jack, Susan; Niccols, Alison; Van Blyderveen, Sherry; McVey, Gail
2014-01-01
To explore the decision-making processes involved in the uptake and implementation of evidence-based treatments (EBTs), namely, family-based treatment (FBT), among therapists and their administrators within publicly funded eating disorder treatment programs in Ontario, Canada. Fundamental qualitative description guided sampling, data collection, and analytic decisions. Forty therapists and 11 administrators belonging to a network of clinicians treating eating disorders completed an in-depth interview regarding the decision-making processes involved in EBT uptake and implementation within their organizations. Content analysis and the constant comparative technique were used to analyze interview transcripts, with 20% of the data independently double-coded by a second coder. Therapists and their administrators identified the importance of an inclusive change culture in evidence-based practice (EBP) decision-making. Each group indicated reluctance to make EBP decisions in isolation from the other. Additionally, participants identified seven stages of decision-making involved in EBT adoption, beginning with exposure to the EBT model and ending with evaluating the impact of the EBT on patient outcomes. Support for a stage-based decision-making process came from participants' indication that the stages were needed to demonstrate that they had considered the costs and benefits of making a practice change. Participants indicated that EBTs endorsed by the Provincial Network for Eating Disorders or the Academy for Eating Disorders would more likely be adopted. Future work should focus on integrating the important decision-making processes identified in this study with known implementation models to increase the use of low-cost and effective treatments, such as FBT, within eating disorder treatment programs. Copyright © 2013 Wiley Periodicals, Inc.
DOT National Transportation Integrated Search
1978-01-01
A system analysis was completed of the general deterrence of driving while intoxicated (DWI). Elements which influence DWI decisions were identified and interrelated in a system model; then, potential countermeasures which might be employed in DWI ge...
NASA Astrophysics Data System (ADS)
Cunningham, Jessica D.
Newton's Universe (NU), an innovative teacher training program, strives to obtain measures from rural, middle school science teachers and their students to determine the impact of its distance learning course on understanding of temperature. No consensus exists on the most appropriate and useful method of analysis to measure change in psychological constructs over time. Several item response theory (IRT) models have been deemed useful in measuring change, which makes the choice of an IRT model not obvious. The appropriateness and utility of each model, including a comparison to a traditional analysis of variance approach, was investigated using middle school science student performance on an assessment over an instructional period. Predetermined criteria were outlined to guide model selection based on several factors including research questions, data properties, and meaningful interpretations to determine the most appropriate model for this study. All methods employed in this study reiterated one common interpretation of the data -- specifically, that the students of teachers with any NU course experience had significantly greater gains in performance over the instructional period. However, clear distinctions were made between an analysis of variance and the racked and stacked analysis using the Rasch model. Although limited research exists examining the usefulness of the Rasch model in measuring change in understanding over time, this study applied these methods and detailed plausible implications for data-driven decisions based upon results for NU and others. Being mindful of the advantages and usefulness of each method of analysis may help others make informed decisions about choosing an appropriate model to depict changes to evaluate other programs. Results may encourage other researchers to consider the meaningfulness of using IRT for this purpose. 
Results have implications for data-driven decisions for future professional development courses, in science education and other disciplines. KEYWORDS: Item Response Theory, Rasch Model, Racking and Stacking, Measuring Change in Student Performance, Newton's Universe teacher training
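The Rasch model at the center of this analysis is compact enough to state directly: the probability of a correct response depends only on the difference between person ability θ and item difficulty b on a shared logit scale. A minimal sketch in Python (the θ and b values are hypothetical illustrations, not the Newton's Universe data):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model:
    P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical ability/difficulty values on the same logit scale.
# A person whose ability equals the item difficulty has P = 0.5.
print(rasch_p(0.0, 0.0))   # 0.5
print(rasch_p(1.2, 0.3))   # ability above difficulty -> P > 0.5
```

In a racked or stacked analysis of change, ability estimates θ from before and after instruction are compared on this common scale.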
Darisi, Tanya; Thorne, Sarah; Iacobelli, Carolyn
2005-09-01
Research was conducted to gain insight into potential clients' decisions to undergo plastic surgery, their perception of benefits and risks, their judgment of outcomes, and their selection of a plastic surgeon. Semistructured, open-ended interviews were conducted with 60 people who expressed interest in plastic surgery. Qualitative analysis revealed their "mental models" regarding influences on their decision to undergo plastic surgery and their choice of a surgeon. Interview results were used to design a Web-based survey in which 644 individuals considering plastic surgery responded. The desire for change was the most direct motivator to undergo plastic surgery. Improvements to physical well-being were related to emotional and social benefits. When prompted about risks, participants mentioned physical, emotional, and social risks. Surgeon selection was a critical influence on decisions to undergo plastic surgery. Participants gave considerable weight to personal consultation and believed that finding the "right" plastic surgeon would minimize potential risks. Findings from the Web-based survey were similar to the mental models interviews in terms of benefit ratings but differed in risk ratings and surgeon selection criteria. The mental models interviews revealed that interview participants were thoughtful about their decision to undergo plastic surgery and focused on finding the right plastic surgeon.
Optimal policy for value-based decision-making.
Tajima, Satohiro; Drugowitsch, Jan; Pouget, Alexandre
2016-08-18
For decades now, normative theories of perceptual decisions, and their implementation as drift diffusion models, have driven and significantly improved our understanding of human and animal behaviour and the underlying neural processes. While similar processes seem to govern value-based decisions, we still lack the theoretical understanding of why this ought to be the case. Here, we show that, similar to perceptual decisions, drift diffusion models implement the optimal strategy for value-based decisions. Such optimal decisions require the models' decision boundaries to collapse over time, and to depend on the a priori knowledge about reward contingencies. Diffusion models only implement the optimal strategy under specific task assumptions, and cease to be optimal once we start relaxing these assumptions, by, for example, using non-linear utility functions. Our findings thus provide the much-needed theory for value-based decisions, explain the apparent similarity to perceptual decisions, and predict conditions under which this similarity should break down. PMID:27535638
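The drift diffusion process with collapsing boundaries that this abstract identifies as optimal can be sketched as a simple simulation. All parameter values below (drift, noise, boundary height, collapse rate) are hypothetical illustrations, not the authors' fitted values:

```python
import random

def ddm_trial(drift, noise=1.0, b0=2.0, collapse=0.05, dt=0.01, t_max=10.0, rng=random):
    """Simulate one drift-diffusion trial with decision boundaries that
    collapse linearly over time, b(t) = b0 - collapse * t, as the optimal
    policy requires.  Returns (choice, reaction_time)."""
    x, t = 0.0, 0.0
    while t < t_max:
        bound = max(b0 - collapse * t, 0.0)
        if x >= bound:
            return +1, t   # upper boundary crossed: choose option 1
        if x <= -bound:
            return -1, t   # lower boundary crossed: choose option 2
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return (1 if x >= 0 else -1), t  # forced choice at the deadline

rng = random.Random(0)
choices = [ddm_trial(drift=0.8, rng=rng)[0] for _ in range(500)]
print(sum(c == 1 for c in choices) / len(choices))  # positive drift favours option 1
```

With positive drift (the higher-value option), most trials terminate at the upper boundary; the collapsing bound guarantees increasingly fast decisions as time runs out.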
Training Decisions Technology Analysis
1992-06-01
[Extraction residue from the report's table of contents and body; recoverable fragments: 4.5.1 Relational Data Base Management; 4.5.2 TASCS Data Content; 4.5.3 Relationships with TDS; 4.6 Other Air Force Modeling R&D. One recoverable passage notes that systems for executive decision making were first developed by M. S. Scott Morton in the early 1970s, who at that time termed them "management decision systems" (Scott Morton); other fragments concern task allocations to training settings, managers' preferences, and times required to train tasks in various settings.]
Present-value analysis: A systems approach to public decisionmaking for cost effectiveness
NASA Technical Reports Server (NTRS)
Herbert, T. T.
1971-01-01
Decision makers within Governmental agencies and Congress must evaluate competing (and sometimes conflicting) proposals which seek funding and implementation. Present value analysis can be an effective decision making tool by enabling the formal evaluation of the effects of competing proposals on efficient national resource utilization. A project's costs are not only its direct disbursements, but its social costs as well. How much does it cost to have those funds diverted from their use and economic benefit by the private sector to the public project? Comparisons of competing projects' social costs allow decision makers to expand their decision bases by quantifying the projects' impacts upon the economy and the efficient utilization of the country's limited national resources. A conceptual model is established for choosing the appropriate discount rate to be used in evaluations made with the technique.
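The core computation behind present-value analysis is discounting. A minimal sketch with invented cash flows and an assumed 7% social discount rate (the projects and figures are hypothetical):

```python
def present_value(cash_flows, rate):
    """Discount a stream of future cash flows (one per year, year 1..n)
    back to today at the given annual discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical competing public projects: same nominal total, different timing.
project_a = [100, 100, 100]   # benefits arrive evenly over three years
project_b = [0, 0, 300]       # benefits arrive only in year three
rate = 0.07                   # assumed social discount rate

print(round(present_value(project_a, rate), 2))
print(round(present_value(project_b, rate), 2))
```

Although both projects pay 300 in nominal terms, the earlier stream is worth more in present-value terms; this is exactly the kind of comparison of competing proposals the abstract describes.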
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
Monte Carlo decision curve analysis using aggregate data.
Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin
2017-02-01
Decision curve analysis (DCA) is an increasingly used method for evaluating diagnostic tests and predictive models, but its application requires individual patient data. The Monte Carlo (MC) method can be used to simulate probabilities and outcomes of individual patients and offers an attractive option for application of DCA. We constructed a MC decision model to simulate individual probabilities of outcomes of interest. These probabilities were contrasted against the threshold probability at which a decision-maker is indifferent between key management strategies: treat all, treat none or use predictive model to guide treatment. We compared the results of DCA with MC simulated data against the results of DCA based on actual individual patient data for three decision models published in the literature: (i) statins for primary prevention of cardiovascular disease, (ii) hospice referral for terminally ill patients and (iii) prostate cancer surgery. The results of MC DCA and patient data DCA were identical. To the extent that patient data DCA were used to inform decisions about statin use, referral to hospice or prostate surgery, the results indicate that MC DCA could have also been used. As long as the aggregate parameters on distribution of the probability of outcomes and treatment effects are accurately described in the published reports, the MC DCA will generate indistinguishable results from individual patient data DCA. We provide a simple, easy-to-use model, which can facilitate wider use of DCA and better evaluation of diagnostic tests and predictive models that rely only on aggregate data reported in the literature. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
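The net-benefit calculation at the heart of decision curve analysis, fed by Monte Carlo-simulated individual patients rather than real patient data, can be sketched as follows. The Beta risk distribution and all counts are hypothetical illustrations, not taken from the three published models:

```python
import random

def net_benefit(probs, outcomes, pt):
    """Net benefit of 'use model to guide treatment' at threshold pt:
    NB = TP/N - FP/N * pt / (1 - pt), treating patients with p >= pt."""
    n = len(probs)
    tp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 1)
    fp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 0)
    return tp / n - fp / n * pt / (1.0 - pt)

# Monte Carlo simulation from aggregate data only: draw each patient's
# event probability from an assumed Beta distribution, then draw the
# binary outcome from that probability.
rng = random.Random(42)
probs = [rng.betavariate(2, 8) for _ in range(5000)]      # mean risk = 0.2
outcomes = [1 if rng.random() < p else 0 for p in probs]
prev = sum(outcomes) / len(outcomes)

for pt in (0.1, 0.2, 0.3):
    nb_model = net_benefit(probs, outcomes, pt)
    nb_all = prev - (1 - prev) * pt / (1 - pt)             # "treat all" strategy
    print(pt, round(nb_model, 4), round(nb_all, 4))        # "treat none" is always 0
```

Comparing the model-guided net benefit against "treat all" and "treat none" across thresholds reproduces the decision curve from aggregate parameters alone.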
Abidi, Samina
2017-10-26
Clinical management of comorbidities is a challenge, especially in a clinical decision support setting, as it requires the safe and efficient reconciliation of multiple disease-specific clinical procedures to formulate a comorbid therapeutic plan that is both effective and safe for the patient. In this paper we pursue the integration of multiple disease-specific Clinical Practice Guidelines (CPG) in order to manage comorbidities within a computerized Clinical Decision Support System (CDSS). We present a CPG integration framework, termed COMET (Comorbidity Ontological Modeling & ExecuTion), that manifests a knowledge management approach to model, computerize and integrate multiple CPG to yield a comorbid CPG knowledge model that upon execution can provide evidence-based recommendations for handling comorbid patients. COMET exploits semantic web technologies to achieve (a) CPG knowledge synthesis to translate a paper-based CPG to disease-specific clinical pathways (CP) that include specialized comorbidity management procedures based on input from domain experts; (b) CPG knowledge modeling to computerize the disease-specific CP using a Comorbidity CPG ontology; (c) CPG knowledge integration by aligning multiple ontologically-modeled CP to develop a unified comorbid CPG knowledge model; and (d) CPG knowledge execution using reasoning engines to derive CPG-mediated recommendations for managing patients with comorbidities. We present a web-accessible COMET CDSS that provides family physicians with CPG-mediated comorbidity decision support to manage Atrial Fibrillation and Chronic Heart Failure. We present our qualitative and quantitative analysis of the knowledge content and usability of COMET CDSS.
Oakley, Jeremy E.; Brennan, Alan; Breeze, Penny
2015-01-01
Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method. PMID:25810269
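The regression-based EVSI idea can be illustrated with a toy decision model: regress each strategy's net benefit from the probabilistic sensitivity analysis (PSA) sample on a summary statistic of simulated study data, then compare the expected maximum of the fitted values with the maximum of their expectations. Everything below is an invented illustration: the parameter distribution, net-benefit scale, study size, and the simple binning regression standing in for a flexible nonparametric smoother.

```python
import random
import statistics

rng = random.Random(7)

# --- PSA sample for a toy model: one uncertain parameter theta (treatment
# effect); two strategies: d0 = "no treatment" (NB = 0) and d1 = "treat".
M = 4000
theta = [rng.gauss(0.05, 0.1) for _ in range(M)]
nb = {"d0": [0.0] * M, "d1": [20.0 * t for t in theta]}   # assumed value scale

# --- Proposed study: n observations per PSA draw; summary statistic = mean.
n = 50
summary = [statistics.fmean(rng.gauss(t, 0.5) for _ in range(n)) for t in theta]

# --- Nonparametric regression of NB on the summary statistic via binning.
def bin_fit(x, y, bins=40):
    order = sorted(range(len(x)), key=lambda i: x[i])
    fitted = [0.0] * len(x)
    size = len(x) // bins
    for b in range(bins):
        idx = order[b * size:(b + 1) * size] if b < bins - 1 else order[(bins - 1) * size:]
        m = statistics.fmean(y[i] for i in idx)
        for i in idx:
            fitted[i] = m
    return fitted

ghat = {d: bin_fit(summary, nb[d]) for d in nb}
evsi = statistics.fmean(max(ghat[d][i] for d in ghat) for i in range(M)) - \
       max(statistics.fmean(ghat[d]) for d in ghat)
print(round(evsi, 3))   # per-patient EVSI on the assumed net-benefit scale
```

Note that only the PSA sample and the ability to simulate data sets are needed; there is no inner Bayesian updating loop and the decision model is never rerun, which is the efficiency gain the abstract describes.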
Multi-modal management of acromegaly: a value perspective.
Kimmell, Kristopher T; Weil, Robert J; Marko, Nicholas F
2015-10-01
The Acromegaly Consensus Group recently released updated guidelines for medical management of acromegaly patients. We subjected these guidelines to a cost analysis. We conducted a cost analysis of the recommendations based on published efficacy rates as well as publicly available cost data. The results were compared to findings from a previously reported comparative effectiveness analysis of acromegaly treatments. Using decision tree software, two models were created based on the Acromegaly Consensus Group's recommendations and the comparative effectiveness analysis. The decision tree for the Consensus Group's recommendations was subjected to multi-way tornado analysis to identify variables that most impacted the value analysis of the decision tree. The value analysis confirmed the Consensus Group's recommendations of somatostatin analogs as first line therapy for medical management. Our model also demonstrated significant value in using dopamine agonist agents as upfront therapy as well. Sensitivity analysis identified the cost of somatostatin analogs and growth hormone receptor antagonists as having the most significant impact on the cost effectiveness of medical therapies. Our analysis confirmed the value of surgery as first-line therapy for patients with surgically accessible lesions. Surgery provides the greatest value for management of patients with acromegaly. However, in accordance with the Acromegaly Consensus Group's recent recommendations, somatostatin analogs provide the greatest value and should be used as first-line therapy for patients who cannot be managed surgically. At present, the substantial cost is the most significant negative factor in the value of medical therapies for acromegaly.
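The basic fold-back step of such a decision-tree value analysis is a probability-weighted average at each chance node. A minimal sketch with invented probabilities and values (not the figures from the published analysis):

```python
# Hypothetical first-line options for a surgically accessible lesion:
# each chance node lists (probability, value) pairs; numbers are invented.
therapies = {
    "surgery":             [(0.60, 1.0), (0.40, 0.3)],  # (p(remission), value), ...
    "somatostatin_analog": [(0.45, 0.8), (0.55, 0.2)],
}

def expected_value(branches):
    """Fold back a chance node: probability-weighted value of its outcomes."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * v for p, v in branches)

best = max(therapies, key=lambda t: expected_value(therapies[t]))
print(best)
```

A tornado-style sensitivity analysis then repeats this fold-back while sweeping one input (e.g., drug cost) over its plausible range to see which variables move the answer most.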
Clark, Renee M; Besterfield-Sacre, Mary E
2009-03-01
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
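One building block shared by the loglinear and Bayesian-network parts of such an analysis is estimating conditional probabilities from categorical incident records. A toy sketch with invented hazmat records (field names and counts are hypothetical):

```python
# Hypothetical categorical incident records, simplified from the kind of
# highly categorical hazmat database the study describes.
records = [
    {"container_failure": "yes", "loss": "high"},
    {"container_failure": "yes", "loss": "high"},
    {"container_failure": "yes", "loss": "low"},
    {"container_failure": "no",  "loss": "low"},
    {"container_failure": "no",  "loss": "low"},
    {"container_failure": "no",  "loss": "high"},
    {"container_failure": "no",  "loss": "low"},
    {"container_failure": "no",  "loss": "low"},
]

def conditional(records, target, given):
    """Estimate P(target_field = value | given_field = value) by counting --
    the kind of conditional probability a Bayesian network node encodes."""
    (tf, tv), (gf, gv) = target, given
    matching = [r for r in records if r[gf] == gv]
    return sum(r[tf] == tv for r in matching) / len(matching)

p_high_given_fail = conditional(records, ("loss", "high"), ("container_failure", "yes"))
p_high_given_ok = conditional(records, ("loss", "high"), ("container_failure", "no"))
print(p_high_given_fail, p_high_given_ok)
```

In this toy data, container failure raises the estimated probability of a high dollar loss, echoing the study's finding that container-failure variables were the most influential.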
Tools and techniques for developing policies for complex and uncertain systems.
Bankes, Steven C
2002-05-14
Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana
2017-02-01
In this study, we aimed: 1) to conceptualize the theoretical challenges facing health information systems (HIS) to represent patients' decisions about health and medical treatments in everyday life; 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists and health professionals working in quality management and primary and secondary prevention of chronic diseases of the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists and e-health stakeholders. HIS are facing the need and challenge to represent social human processes based on constructivist and complexity theories, which are the current frameworks of human sciences for understanding human learning and socio-cultural changes. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, analysis of social impact through community trials and modeling of complexity with system simulation tools. This analysis suggested the need to complement the traditional linear causal explanations of disease onset (and treatments) that are the bases for models of analysis of HIS with constructivist and complexity frameworks. Both may enlighten the complex interrelationships among patients, health services and the health system. The aim of this strategy is to clarify people's decision making processes to improve the efficiency, quality and equity of the health services and the health system.
NASA Astrophysics Data System (ADS)
Liu, Z.; Li, Y.
2018-04-01
From the perspective of neighbourhood cellular space, this paper proposes a new urban spatial expansion model based on a new multi-objective grey decision method and cellular automata (CA). The model addresses the difficulty that traditional cellular automata transition rules have in meeting the needs of spatio-temporal analysis of urban change, and overcomes the uncertainty involved in combining urban driving factors with urban cellular automata. Taking Pidu District as the study area, the model was used for urban spatial simulation, prediction and analysis, with the following conclusions: (1) The design idea of the proposed urban spatial expansion model is to tightly couple the urban driving factors and the neighbourhood function through a multi-objective grey decision method based on geographical conditions. The simulation results show that the simulation error of urban spatial expansion is less than 5.27% and the Kappa coefficient is 0.84, indicating that the model captures the inner transformation mechanism of the city well. (2) A simulation prediction was made for Pidu District of Chengdu, treating the district as a system instance, and the urban growth tendency of the area was analyzed. The area presents a contiguous increasing mode, called "urban intensive development"; this expansion mode accords with sustainable development theory and ecological urbanization design theory.
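The Kappa coefficient reported for the simulation measures agreement between simulated and observed land-use maps beyond what chance would produce. A sketch of the computation with an invented 2-class confusion matrix (not the Pidu District data):

```python
def kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: observed classes, columns: simulated classes)."""
    total = sum(sum(row) for row in confusion)
    po = sum(confusion[i][i] for i in range(len(confusion))) / total
    pe = sum(sum(row) * sum(col) for row, col in
             zip(confusion, zip(*confusion))) / total ** 2
    return (po - pe) / (1.0 - pe)

# Hypothetical 2-class (urban / non-urban) comparison of simulated vs observed cells.
conf = [[820, 60],    # observed urban:     820 matched, 60 missed by the model
        [55, 1065]]   # observed non-urban: 55 false urban, 1065 matched
print(round(kappa(conf), 2))
```

Kappa near 1 indicates near-perfect agreement; values around 0.8 or above, like the 0.84 reported, are conventionally read as strong agreement.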
Competent Systems: Effective, Efficient, Deliverable.
ERIC Educational Resources Information Center
Abramson, Bruce
Recent developments in artificial intelligence and decision analysis suggest reassessing the approaches commonly taken to the design of knowledge-based systems. Competent systems are based on models known as influence diagrams, which graphically capture a domain's basic objects and their interrelationships. Among the benefits offered by influence…
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.
Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana
2016-01-01
The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterising a BPSS tool is a challenging task due to complex selection criteria that include the quality of visual aspects, simulation capabilities, statistical facilities, reporting quality, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with respect to currently available results.
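A DEX-style model aggregates qualitative attribute values through explicit rule tables rather than numeric formulas, with a QQ step mapping the qualitative classes onto numbers for ranking. A toy two-attribute sketch (scales, rules, and tool names are invented; the real methodology uses complete rule tables over a full criteria hierarchy):

```python
# Hypothetical two-level DEX-style model: qualitative scores for two
# sub-criteria are combined by an explicit rule table, not a formula.
SCALE = ("poor", "fair", "good")

RULES = {  # (visual_quality, simulation_power) -> overall qualitative class
    ("good", "good"): "good",
    ("good", "fair"): "good",
    ("fair", "good"): "good",
    ("fair", "fair"): "fair",
}

def evaluate(visual_quality, simulation_power):
    """DEX-style aggregation: look up the rule table; any 'poor' input
    caps the overall class at 'poor' (an assumed default for this toy)."""
    if "poor" in (visual_quality, simulation_power):
        return "poor"
    return RULES[(visual_quality, simulation_power)]

tools = {"Tool A": ("good", "fair"), "Tool B": ("fair", "fair"), "Tool C": ("poor", "good")}
# QQ step (sketch): map qualitative classes to numbers to produce a ranking.
qq = {name: SCALE.index(evaluate(*attrs)) for name, attrs in tools.items()}
print(sorted(qq, key=qq.get, reverse=True))
```

The rule-table form is what makes the model easy to extend: adding a criterion means adding a column and new rules, not re-deriving a weighting formula.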
Strategic rehabilitation planning of piped water networks using multi-criteria decision analysis.
Scholten, Lisa; Scheidegger, Andreas; Reichert, Peter; Maurer, Max; Lienert, Judit
2014-02-01
To overcome the difficulties of strategic asset management of water distribution networks, a pipe failure and a rehabilitation model are combined to predict the long-term performance of rehabilitation strategies. Bayesian parameter estimation is performed to calibrate the failure and replacement model based on a prior distribution inferred from three large water utilities in Switzerland. Multi-criteria decision analysis (MCDA) and scenario planning build the framework for evaluating 18 strategic rehabilitation alternatives under future uncertainty. Outcomes for three fundamental objectives (low costs, high reliability, and high intergenerational equity) are assessed. Exploitation of stochastic dominance concepts helps to identify twelve non-dominated alternatives and local sensitivity analysis of stakeholder preferences is used to rank them under four scenarios. Strategies with annual replacement of 1.5-2% of the network perform reasonably well under all scenarios. In contrast, the commonly used reactive replacement is not recommendable unless cost is the only relevant objective. Exemplified for a small Swiss water utility, this approach can readily be adapted to support strategic asset management for any utility size and based on objectives and preferences that matter to the respective decision makers. Copyright © 2013 Elsevier Ltd. All rights reserved.
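A simplified additive stand-in for the multi-criteria evaluation described here scores each strategy on the three fundamental objectives and ranks by weighted sum (scores and weights are invented; the study itself uses a full MCDA with scenario planning and stochastic dominance):

```python
def additive_value(scores, weights):
    """Simple additive multi-criteria value model: criteria scores are
    already normalised to [0, 1] with 1 = best; weights sum to 1."""
    return sum(weights[c] * s for c, s in scores.items())

# Hypothetical strategies scored on the three fundamental objectives
# named in the abstract (all values illustrative only).
strategies = {
    "reactive replacement": {"low_costs": 0.9, "reliability": 0.3, "equity": 0.2},
    "replace 1.5%/yr":      {"low_costs": 0.6, "reliability": 0.8, "equity": 0.7},
    "replace 2%/yr":        {"low_costs": 0.5, "reliability": 0.9, "equity": 0.8},
}
weights = {"low_costs": 0.4, "reliability": 0.4, "equity": 0.2}

ranking = sorted(strategies, key=lambda s: additive_value(strategies[s], weights),
                 reverse=True)
print(ranking)
```

Setting the weight on low_costs to 1.0 (and the others to 0) makes reactive replacement rank first in this toy, mirroring the abstract's caveat that reactive replacement is only recommendable when cost is the sole objective.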
Addy, Nii Antiaye; Shaban-Nejad, Arash; Buckeridge, David L; Dubé, Laurette
2015-01-23
Multi-stakeholder partnerships (MSPs) have become a widespread means for deploying policies in a whole of society strategy to address the complex problem of childhood obesity. However, decision-making in MSPs is fraught with challenges, as decision-makers are faced with complexity, and have to reconcile disparate conceptualizations of knowledge across multiple sectors with diverse sets of indicators and data. These challenges can be addressed by supporting MSPs with innovative tools for obtaining, organizing and using data to inform decision-making. The purpose of this paper is to describe and analyze the development of a knowledge-based infrastructure to support MSP decision-making processes. The paper emerged from a study to define specifications for a knowledge-based infrastructure to provide decision support for community-level MSPs in the Canadian province of Quebec. As part of the study, a process assessment was conducted to understand the needs of communities as they collect, organize, and analyze data to make decisions about their priorities. The result of this process is a "portrait", which is an epidemiological profile of health and nutrition in their community. Portraits inform strategic planning and development of interventions, and are used to assess the impact of interventions. Our key findings indicate ambiguities and disagreement among MSP decision-makers regarding causal relationships between actions and outcomes, and the relevant data needed for making decisions. MSP decision-makers expressed a desire for easy-to-use tools that facilitate the collection, organization, synthesis, and analysis of data, to enable decision-making in a timely manner. Findings inform conceptual modeling and ontological analysis to capture the domain knowledge and specify relationships between actions and outcomes. This modeling and analysis provide the foundation for an ontology, encoded using OWL 2 Web Ontology Language. 
The ontology is developed to provide semantic support for the MSP process, defining objectives, strategies, actions, indicators, and data sources. In the future, software interacting with the ontology can facilitate interactive browsing by decision-makers in the MSP in the form of concepts, instances, relationships, and axioms. Our ontology also facilitates the integration and interpretation of community data, and can help in managing semantic interoperability between different knowledge sources. Future work will focus on defining specifications for the development of a database of indicators and an information system to help decision-makers to view, analyze and organize indicators for their community. This work should improve MSP decision-making in the development of interventions to address childhood obesity. PMID:25625409
NASA Astrophysics Data System (ADS)
Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia
2007-12-01
To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure alternatively employs both an a priori systems view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps where child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.
Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin
2016-01-01
Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60% with a range of 3.47%–40.00% in different subgroup characteristics. The CHAID decision tree model has demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by logistic regression model including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified the fourth risk factor, the maternal educational level, with higher overall classification accuracy and larger area below the receiver operating characteristic curve. Conclusions: The infant anemic status in metropolis is complex and should be carefully considered by the basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of population with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
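As a rough illustration of the CHAID mechanism the abstract describes, the sketch below scores candidate predictors with a Pearson chi-squared statistic on 2x2 contingency tables and splits on the most significant one. The counts are hypothetical, and real CHAID additionally merges categories and applies Bonferroni-adjusted significance tests.

```python
from typing import Dict, List

def chi2_2x2(table: List[List[float]]) -> float:
    """Pearson chi-squared statistic for a 2x2 contingency table."""
    a, b = table[0]
    c, d = table[1]
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts: rows = predictor present/absent,
# columns = anemic / not anemic.
tables: Dict[str, List[List[float]]] = {
    "maternal_anemia":     [[20, 20], [5, 55]],
    "floating_population": [[12, 28], [13, 47]],
}

# CHAID grows the tree by splitting on the most significant predictor first.
best_split = max(tables, key=lambda k: chi2_2x2(tables[k]))
```

With these made-up counts, maternal anemia is by far the stronger split, which is how a CHAID tree would surface it at the root.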
Decision-making for foot-and-mouth disease control: Objectives matter
Probert, William J. M.; Shea, Katriona; Fonnesbeck, Christopher J.; Runge, Michael C.; Carpenter, Tim E.; Durr, Salome; Garner, M. Graeme; Harvey, Neil; Stevenson, Mark A.; Webb, Colleen T.; Werkman, Marleen; Tildesley, Michael J.; Ferrari, Matthew J.
2016-01-01
Formal decision-analytic methods can be used to frame disease control problems, the first step of which is to define a clear and specific objective. We demonstrate the imperative of framing clearly defined management objectives in finding optimal control actions for disease outbreaks. We illustrate an analysis that can be applied rapidly at the start of an outbreak, when multiple stakeholders with potentially multiple objectives are involved and when there are multiple disease models upon which to compare control actions. The output of our analysis frames subsequent discourse between policy-makers, modellers and other stakeholders by highlighting areas of discord among different management objectives and also among different models used in the analysis. We illustrate this approach in the context of a hypothetical foot-and-mouth disease (FMD) outbreak in Cumbria, UK, using outputs from five rigorously studied simulation models of FMD spread. We present both relative rankings and relative performance of controls within each model and across a range of objectives. Results illustrate how optimal control actions change across both the base metric used to measure management success and the statistic used to rank control actions according to that metric. This work represents a first step towards reconciling the extensive modelling work on disease control problems with frameworks for structured decision making.
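The central point of the abstract above — that the optimal action depends on which objective you optimise — can be sketched with a toy cross-model comparison. Models, actions, and outcome numbers are invented, and the real analysis uses richer ranking statistics than a simple mean across models.

```python
# outcomes[model][action] = (epidemic_duration_days, cattle_culled); illustrative.
outcomes = {
    "model_A": {"ring_culling": (45, 9000), "vaccination": (60, 3000)},
    "model_B": {"ring_culling": (50, 8000), "vaccination": (70, 2500)},
}

DURATION, CULLED = 0, 1  # objective indices

def best_action(outcomes, objective):
    """Average each action's metric across models, then pick the minimiser."""
    actions = next(iter(outcomes.values())).keys()
    def mean_metric(action):
        vals = [outcomes[m][action][objective] for m in outcomes]
        return sum(vals) / len(vals)
    return min(actions, key=mean_metric)
```

Here minimising epidemic duration and minimising cattle culled recommend different controls — objectives matter.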
NASA Astrophysics Data System (ADS)
Rosenberg, D. E.
2008-12-01
Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinct from, and more difficult than, non-collaborative efforts because of the large number of users with different backgrounds; disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods; and evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle of collaborative model making and identifies some key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are 1) modular development of data-aware input, storage, manipulation, results-recording and presentation components, plus ways to couple and link to other models and tools; 2) explicitly structuring both input data and the metadata that describes data sources, who acquired the data, gaps, and modifications or translations made to put the data in a form usable by the model; 3) providing in-line documentation on model inputs, assumptions, calculations, and results, plus ways for stakeholders to document their own model use and share results with others; and 4) flexibly programming with graphical object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway.
The presentation concludes by identifying some future directions for collaborative modeling including geo-spatial display and analysis, real-time operations, and internet-based tools plus the design and programming needed to implement these capabilities.
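The second best practice above — structuring metadata alongside input data — could be captured with a small provenance record attached to each model input. Field names and the example values (including the gauge identifier) are purely illustrative, not taken from the presentation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ModelInputMetadata:
    """Provenance record attached to each collaborative-model input."""
    name: str
    source: str
    acquired_by: str
    gaps: List[str] = field(default_factory=list)
    modifications: List[str] = field(default_factory=list)

streamflow = ModelInputMetadata(
    name="monthly_streamflow",
    source="USGS stream gauge (hypothetical site)",
    acquired_by="project hydrologist",
    gaps=["1998-03 to 1998-05 missing"],
    modifications=["unit conversion cfs -> m^3/s"],
)
```

Keeping such records machine-readable lets stakeholders audit where every input came from and what was done to it.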
Distributed collaborative environments for virtual capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2003-09-01
Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.
Levin, Lia; Schwartz-Tayri, Talia
2017-06-01
Partnerships between service users and social workers are complex in nature and can be driven by both personal and contextual circumstances. This study sought to explore the relationship between social workers' involvement in shared decision making with service users, their attitudes towards service users in poverty, moral standards and health and social care organizations' policies towards shared decision making. Based on the responses of 225 licensed social workers from health and social care agencies in the public, private and third sectors in Israel, path analysis was used to test a hypothesized model. Structural attributions for poverty contributed to attitudes towards people who live in poverty, which led to shared decision making. Also, organizational support for shared decision making, and professional moral identity, contributed to ethical behaviour, which led to shared decision making. The results of this analysis revealed that shared decision making may be a scion of branched roots planted in the relationship between ethics, organizations and stigma. © 2016 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
Moreau, Alain; Carol, Laurent; Dedianne, Marie Cécile; Dupraz, Christian; Perdrix, Corinne; Lainé, Xavier; Souweine, Gilbert
2012-05-01
To understand patients' perceptions of decision making and identify relationships among decision-making models. This qualitative study was made up of four focus group interviews (elderly persons, users of health support groups, students, and rural inhabitants). Participants were asked to report their perceptions of decision making in three written clinical scenarios (hypertension, breast cancer, prostate cancer). The analysis was based on the principles of grounded theory. Most patients perceived decision making as shared decision making, a deliberative question-response interaction with the physician that allowed patients to be experts in obtaining clearer information, participating in the care process, and negotiating compromises with physician preferences. Requesting second opinions allowed patients to maintain control, even within the paternalistic model preferred by elderly persons. Facilitating factors (trust, qualitative non-verbal communication, time to think) and obstacles (serious/emergency situations, perceived inadequate scientific competence, problems making requests, fear of knowing) were also part of shared decision making. In the global concept of patient-centered care, shared decision making can be flexible and can integrate paternalistic and informative models. Physicians' expertise should be associated with biomedical and relational skills through listening to, informing, and advising patients, and by supporting patients' choices. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Exploratory reconstructability analysis of accident TBI data
NASA Astrophysics Data System (ADS)
Zwick, Martin; Carney, Nancy; Nettleton, Rosemary
2018-02-01
This paper describes the use of reconstructability analysis to perform a secondary study of traumatic brain injury data from automobile accidents. Neutral searches were done and their results displayed with a hypergraph. Directed searches, using both variable-based and state-based models, were applied to predict performance on two cognitive tests and one neurological test. Very simple state-based models gave large uncertainty reductions for all three DVs and sizeable improvements in percent correct for the two cognitive test DVs which were equally sampled. Conditional probability distributions for these models are easily visualized with simple decision trees. Confounding variables and counter-intuitive findings are also reported.
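The "uncertainty reductions" reported above are information-theoretic. A minimal sketch of that quantity — the fractional reduction in the Shannon entropy of a dependent variable (DV) given one predictor — is below; the joint counts are hypothetical, and reconstructability analysis itself works with far richer model structures than a single IV.

```python
import math
from collections import defaultdict

def uncertainty_reduction(joint):
    """Fractional reduction in Shannon entropy of the DV given the IV:
    (H(DV) - H(DV|IV)) / H(DV)."""
    n = sum(joint.values())
    p_x = defaultdict(float)
    p_y = defaultdict(float)
    for (x, y), c in joint.items():
        p_x[x] += c / n
        p_y[y] += c / n
    h_y = -sum(p * math.log2(p) for p in p_y.values())
    h_y_given_x = 0.0
    for x, px in p_x.items():
        cond = [c / n / px for (xx, y), c in joint.items() if xx == x]
        h_y_given_x += px * -sum(p * math.log2(p) for p in cond)
    return 1.0 - h_y_given_x / h_y

# Hypothetical joint counts: (model state, cognitive-test outcome) -> count.
joint = {("low", "poor"): 30, ("low", "good"): 10,
         ("high", "poor"): 10, ("high", "good"): 50}
reduction = uncertainty_reduction(joint)
```

For these counts the predictor removes about 26% of the outcome's uncertainty.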
NASA Astrophysics Data System (ADS)
Ng, Tze Ling; Eheart, J. Wayland; Cai, Ximing; Braden, John B.
2011-09-01
An agent-based model of farmers' crop and best management practice (BMP) decisions is developed and linked to a hydrologic-agronomic model of a watershed to examine farmer behavior, and the attendant effects on stream nitrate load, under the influence of markets for conventional crops, carbon allowances, and a second-generation biofuel crop. The agent-based approach introduces interactions among farmers about new technologies and market opportunities, and includes the updating of forecast expectations and uncertainties using Bayesian inference. The model is applied to a semi-hypothetical example case of farmers in the Salt Creek Watershed in Central Illinois, and a sensitivity analysis is performed to provide a first-order assessment of the plausibility of the results. The results show that the most influential factors affecting farmers' decisions are crop prices, production costs, and yields. The results also show that different farmer behavioral profiles can lead to different predictions of farmer decisions. The farmers who are predicted to be more likely to adopt new practices are those who interact more with other farmers, are less risk averse, quick to adjust their expectations, and slow to reduce their forecast confidence. The decisions of farmers have direct water quality consequences, especially those pertaining to the adoption of the second-generation biofuel crop, which are estimated to lead to reductions in stream nitrate load. The results, though empirically untested, appear plausible and consistent with general farmer behavior. The results demonstrate the usefulness of the coupled agent-based and hydrologic-agronomic models for normative research on watershed management at the water-energy nexus.
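The Bayesian updating of forecast expectations and uncertainties mentioned above can be sketched with the standard conjugate normal-normal update; the paper's actual likelihoods and priors are not specified, so the function and numbers below are only a minimal stand-in.

```python
def update_price_belief(prior_mean, prior_var, observed, obs_var):
    """Conjugate normal-normal update of a farmer agent's price forecast.
    Returns the posterior mean and variance."""
    precision = 1.0 / prior_var + 1.0 / obs_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + observed / obs_var)
    return post_mean, post_var

# Hypothetical: agent expects $4/bu with variance 1, then observes $6/bu.
mean, var = update_price_belief(4.0, 1.0, 6.0, 1.0)
```

The posterior mean lands between prior and observation, and the variance shrinks — exactly the "adjust expectations, gain or lose confidence" behavior the agents exhibit.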
Real-Time Analysis of a Sensor's Data for Automated Decision Making in an IoT-Based Smart Home.
Khan, Nida Saddaf; Ghani, Sayeed; Haider, Sajjad
2018-05-25
IoT devices frequently generate large volumes of streaming data, and in order to take advantage of these data, their temporal patterns must be learned and identified. Streaming data analysis has become popular after being successfully used in many applications, including forecasting of electricity load, stock market prices, and weather conditions. Artificial Neural Networks (ANNs) have been successfully utilized in learning the interesting patterns/behaviors embedded in the data and forecasting future values based on them. One such pattern is modelled and learned in the present study to identify the occurrence of a specific pattern in a Water Management System (WMS). This prediction supports an automated decision system that switches OFF a hydraulic suction pump at the appropriate time. Three types of ANN, namely Multi-Input Multi-Output (MIMO), Multi-Input Single-Output (MISO), and Recurrent Neural Network (RNN), have been compared for multi-step-ahead forecasting on a sensor's streaming data. Experiments have shown that the RNN has the best performance among the three models, and based on its predictions, a system can be implemented to make the best decision with 86% accuracy.
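Multi-step-ahead forecasting of the recursive kind used here feeds each prediction back in as the next input, and the pump decision then triggers when the forecast crosses a threshold. In the sketch below a toy AR(1) decay stands in for the trained MIMO/MISO/RNN predictor; all numbers and the threshold are invented.

```python
def recursive_forecast(one_step, x0, horizon):
    """Multi-step-ahead forecasting: feed each prediction back as input."""
    preds, x = [], x0
    for _ in range(horizon):
        x = one_step(x)
        preds.append(x)
    return preds

def steps_until_below(preds, threshold):
    """First forecast step at which the level drops below the threshold."""
    for i, v in enumerate(preds, start=1):
        if v < threshold:
            return i
    return None

# Stand-in one-step model (AR(1) decay); a trained ANN would take its place.
level_model = lambda level: 0.8 * level
preds = recursive_forecast(level_model, 10.0, 8)
switch_off_at = steps_until_below(preds, 3.0)  # schedule the pump OFF here
```

Note that recursive forecasts compound the one-step model's error, which is one reason the paper compares direct multi-output (MIMO) architectures against single-output ones.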
Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Kawamoto, Masaru
This paper proposed a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, this paper proposed a method using prospect theory, from behavioral finance, to set psychological bias for profit and loss, and attempted to select the appropriate strike price of the option for higher investment efficiency. As a result, this technique produced good results and demonstrated the effectiveness of the trading model through the optimized trading strategy.
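The prospect-theory bias mentioned above rests on a value function that is concave for gains and steeper (loss-averse) for losses. The sketch below uses the standard Tversky-Kahneman parameters (alpha = beta = 0.88, lambda = 2.25) since the paper's own parameterization is not given, and omits probability weighting for brevity.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function: concave over gains,
    convex and loss-averse over losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def evaluate(prospect):
    """Subjective value of a prospect given as [(probability, payoff), ...]."""
    return sum(p * prospect_value(x) for p, x in prospect)

# A sure $10 gain versus a 50/50 gamble on $20 or nothing.
sure_gain = [(1.0, 10.0)]
gamble = [(0.5, 20.0), (0.5, 0.0)]
choice = "sure" if evaluate(sure_gain) > evaluate(gamble) else "gamble"
```

Concavity over gains makes the sure thing preferred despite equal expected payoff — the kind of bias a GP-evolved strategy tree can exploit when ranking strike prices.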
Optimal management of colorectal liver metastases in older patients: a decision analysis
Yang, Simon; Alibhai, Shabbir MH; Kennedy, Erin D; El-Sedfy, Abraham; Dixon, Matthew; Coburn, Natalie; Kiss, Alex; Law, Calvin HL
2014-01-01
Background: Comparative trials evaluating management strategies for colorectal cancer liver metastases (CLM) are lacking, especially for older patients. This study developed a decision-analytic model to quantify outcomes associated with treatment strategies for CLM in older patients. Methods: A Markov decision model was built to examine the effect on life expectancy (LE) and quality-adjusted life expectancy (QALE) of best supportive care (BSC), systemic chemotherapy (SC), radiofrequency ablation (RFA) and hepatic resection (HR). The baseline patient cohort assumptions included healthy 70-year-old CLM patients after a primary cancer resection. Event and transition probabilities and utilities were derived from a literature review. Deterministic and probabilistic sensitivity analyses were performed on all study parameters. Results: In the base case analysis, BSC, SC, RFA and HR yielded LEs of 11.9, 23.1, 34.8 and 37.0 months, and QALEs of 7.8, 13.2, 22.0 and 25.0 months, respectively. Model results were sensitive to age, comorbidity, length of model simulation and utility after HR. Probabilistic sensitivity analysis showed increasing preference for RFA over HR with increasing patient age. Conclusions: HR may be optimal for healthy 70-year-old patients with CLM. In older patients with comorbidities, RFA may provide better LE and QALE. Treatment decisions in older cancer patients should account for patient age, comorbidities, local expertise and individual values. PMID:24961482
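A Markov cohort model computes LE and QALE by cycling state-occupancy probabilities through a transition matrix and accumulating (utility-weighted) time alive. The two-state toy below is only a sketch of that mechanism; the published model's states, monthly probabilities, and utilities differ.

```python
def markov_expectancy(P, utilities, start, cycles):
    """Cycle a Markov cohort model; returns expected cycles alive (LE)
    and utility-weighted cycles (QALE) per starting cohort member."""
    occ = list(start)
    le = qale = 0.0
    n = len(P)
    for _ in range(cycles):
        occ = [sum(occ[i] * P[i][j] for i in range(n)) for j in range(n)]
        le += sum(o for o, u in zip(occ, utilities) if u > 0)
        qale += sum(o * u for o, u in zip(occ, utilities))
    return le, qale

# Toy monthly model: alive (utility 0.75) -> dead (utility 0),
# with a 5% monthly mortality risk. Hypothetical numbers.
P = [[0.95, 0.05],
     [0.00, 1.00]]
le, qale = markov_expectancy(P, [0.75, 0.0], [1.0, 0.0], 3000)
```

With a constant 5% monthly hazard the closed-form LE is 0.95/0.05 = 19 months, which the simulation reproduces; swapping in per-strategy matrices and utilities yields the LE/QALE comparisons the abstract reports.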
Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J
2017-06-01
In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation; stochastic characteristic of nutrient loading can be investigated which provides the inputs for the decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries and the associated system risk through incorporating the concept of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results can not only facilitate identification of optimal effluent-trading schemes, but also gain insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that decision maker's preference towards risk would affect decision alternatives on trading scheme as well as system benefit. 
Compared with conventional optimization methods, BESMA is advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties in nutrient transport behaviors to improve the accuracy of water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision alternatives. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Rodriguez Lucatero, C.; Schaum, A.; Alarcon Ramos, L.; Bernal-Jaquez, R.
2014-07-01
In this study, the dynamics of decisions in complex networks subject to external fields are studied within a Markov process framework using nonlinear dynamical systems theory. A mathematical discrete-time model is derived using a set of basic assumptions regarding the convincement mechanisms associated with two competing opinions. The model is analyzed with respect to the multiplicity of critical points and the stability of extinction states. Sufficient conditions for extinction are derived in terms of the convincement probabilities and the maximum eigenvalues of the associated connectivity matrices. The influences of exogenous (e.g., mass media-based) effects on decision behavior are analyzed qualitatively. The current analysis predicts: (i) the presence of fixed-point multiplicity (with a maximum number of four different fixed points), multi-stability, and sensitivity with respect to the process parameters; and (ii) the bounded but significant impact of exogenous perturbations on the decision behavior. These predictions were verified using a set of numerical simulations based on a scale-free network topology.
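The extinction conditions described above couple convincement probabilities to the maximum eigenvalues of the connectivity matrices. The sketch below illustrates the general form of such a spectral threshold on a single-layer network; the network, the probability values, and the simplified condition p * lambda_max < 1 are illustrative, not the paper's exact sufficient conditions.

```python
def spectral_radius(A, iters=500):
    """Largest eigenvalue of a symmetric nonnegative adjacency matrix,
    via power iteration on A + I (the shift avoids bipartite oscillation)."""
    n = len(A)
    B = [[A[i][j] + (1.0 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(B[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return lam - 1.0

# Star network: one hub connected to four leaves; spectral radius is exactly 2.
A = [[0, 1, 1, 1, 1],
     [1, 0, 0, 0, 0],
     [1, 0, 0, 0, 0],
     [1, 0, 0, 0, 0],
     [1, 0, 0, 0, 0]]
lam = spectral_radius(A)

def opinion_dies_out(p_convince, lam):
    """Illustrative threshold: extinction when convincement probability
    times the spectral radius stays below one."""
    return p_convince * lam < 1.0
```

On this star graph an opinion spread with convincement probability 0.4 falls below the threshold, while 0.6 exceeds it.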
A farm-level precision land management framework based on integer programming
Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar
2017-01-01
Farmland management involves several planning and decision making tasks, including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of the size and shape of the decision scale as well as special irrigation patterns. The authors illustrate the model with a case study of a farm in the state of California in the U.S. and show that the model can capture the impact of precision farm management on profitability. The results show that a threefold increase in annual net profit could be achieved by carefully choosing irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if farmers apply precision management to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits, as well as a quantitative tool to explore the impact of precision agriculture. PMID:28346499
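The structure of such an integer program — discrete seed and irrigation choices per plot, maximising profit under a water budget — can be sketched on a toy instance small enough to solve by exhaustive enumeration in place of an MILP solver. Crops, profits, and water figures are invented.

```python
from itertools import product

# Hypothetical per-plot options: (crop, irrigation) -> (profit, water use).
options = {
    ("corn",   "low"):  (100, 1),
    ("corn",   "high"): (130, 3),
    ("tomato", "low"):  (80,  1),
    ("tomato", "high"): (200, 3),
}
WATER_LIMIT = 4   # total seasonal water budget
N_PLOTS = 2

best_profit, best_plan = 0, None
for plan in product(options, repeat=N_PLOTS):
    profit = sum(options[o][0] for o in plan)
    water = sum(options[o][1] for o in plan)
    if water <= WATER_LIMIT and profit > best_profit:
        best_profit, best_plan = profit, plan
```

The optimum mixes a high-value irrigated crop with a low-water one — precision management of seed and irrigation together, as the abstract notes, beats optimising either alone.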
Chaisangmongkon, Warasinee; Swaminathan, Sruthi K.; Freedman, David J.; Wang, Xiao-Jing
2017-01-01
Summary: Decision making involves dynamic interplay between internal judgements and external perception, which has been investigated in delayed match-to-category (DMC) experiments. Our analysis of neural recordings shows that, during DMC tasks, LIP and PFC neurons demonstrate mixed, time-varying, and heterogeneous selectivity, but previous theoretical work has not established the link between these neural characteristics and population-level computations. We trained a recurrent network model to perform DMC tasks and found that the model can remarkably reproduce key features of neuronal selectivity at the single-neuron and population levels. Analysis of the trained networks elucidates that robust transient trajectories of the neural population are the key driver of sequential categorical decisions. The directions of trajectories are governed by the network's self-organized connectivity, defining a 'neural landscape' consisting of a task-tailored arrangement of slow states and dynamical tunnels. With this model, we can identify functionally relevant circuit motifs and generalize the framework to solve other categorization tasks. PMID:28334612
NASA Astrophysics Data System (ADS)
Antle, J. M.; Valdivia, R. O.; Jones, J.; Rosenzweig, C.; Ruane, A. C.
2013-12-01
This presentation provides an overview of the new methods developed by researchers in the Agricultural Model Inter-comparison and Improvement Project (AgMIP) for regional climate impact assessment and analysis of adaptation in agricultural systems. This approach represents a departure from approaches in the literature in several dimensions. First, the approach is based on the analysis of agricultural systems (not individual crops) and is inherently trans-disciplinary: it is based on a deep collaboration among a team of climate scientists, agricultural scientists and economists to design and implement regional integrated assessments of agricultural systems. Second, in contrast to previous approaches that have imposed future climate on models based on current socio-economic conditions, this approach combines bio-physical and economic models with a new type of pathway analysis (Representative Agricultural Pathways) to parameterize models consistent with a plausible future world in which climate change would be occurring. Third, adaptation packages for the agricultural systems in a region are designed by the research team with a level of detail that is useful to decision makers, such as research administrators and donors, who are making agricultural R&D investment decisions. The approach is illustrated with examples from AgMIP's projects currently being carried out in Africa and South Asia.
From guideline modeling to guideline execution: defining guideline-based decision-support services.
Tu, S. W.; Musen, M. A.
2000-01-01
We describe our task-based approach to defining the guideline-based decision-support services that the EON system provides. We categorize uses of guidelines in patient-specific decision support into a set of generic tasks--making of decisions, specification of work to be performed, interpretation of data, setting of goals, and issuance of alerts and reminders--that can be solved using various techniques. Our model includes constructs required for representing the knowledge used by these techniques. These constructs form a toolkit from which developers can select modeling solutions for guideline tasks. Based on the tasks and the guideline model, we define a guideline-execution architecture and a model of interactions between a decision-support server and clients that invoke services provided by the server. These services use generic interfaces derived from guideline tasks and their associated modeling constructs. We describe two implementations of these decision-support services and discuss how this work can be generalized. We argue that a well-defined specification of guideline-based decision-support services will facilitate sharing of tools that implement computable clinical guidelines. PMID:11080007
Model-based choices involve prospective neural activity
Doll, Bradley B.; Duncan, Katherine D.; Simon, Dylan A.; Shohamy, Daphna; Daw, Nathaniel D.
2015-01-01
Decisions may arise via “model-free” repetition of previously reinforced actions, or by “model-based” evaluation, which is widely thought to follow from prospective anticipation of action consequences using a learned map or model. While choices and neural correlates of decision variables sometimes reflect knowledge of their consequences, it remains unclear whether this actually arises from prospective evaluation. Using functional MRI and a sequential reward-learning task in which paths contained decodable object categories, we found that humans’ model-based choices were associated with neural signatures of future paths observed at decision time, suggesting a prospective mechanism for choice. Prospection also covaried with the degree of model-based influences on neural correlates of decision variables, and was inversely related to prediction error signals thought to underlie model-free learning. These results dissociate separate mechanisms underlying model-based and model-free evaluation and support the hypothesis that model-based influences on choices and neural decision variables result from prospection. PMID:25799041
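"Model-based" evaluation in the sense discussed above means prospectively rolling a learned model of the task forward at decision time and backing up the best achievable return, rather than replaying cached ("model-free") action values. A minimal deterministic sketch, with a made-up two-step task:

```python
# Learned task model: (state, action) -> (next_state, reward). "T" = terminal.
model = {
    ("s0", "left"):  ("s1", 0.0),
    ("s0", "right"): ("s2", 0.0),
    ("s1", "go"):    ("T", 1.0),
    ("s2", "go"):    ("T", 0.2),
}

def plan_value(state, model):
    """Model-based (prospective) evaluation: search the model forward
    and back up the best achievable return from this state."""
    actions = [a for (s, a) in model if s == state]
    if not actions:          # terminal state
        return 0.0
    return max(r + plan_value(s2, model)
               for a in actions
               for (s2, r) in [model[(state, a)]])

def greedy_action(state, model):
    actions = [a for (s, a) in model if s == state]
    return max(actions, key=lambda a: model[(state, a)][1]
               + plan_value(model[(state, a)][0], model))
```

The planner chooses "left" at s0 purely because it anticipates the downstream reward — the prospective signature the fMRI study decodes as reinstatement of future path categories.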
NASA Astrophysics Data System (ADS)
Hamedianfar, Alireza; Shafri, Helmi Zulhaidi Mohd
2016-04-01
This paper integrates decision tree-based data mining (DM) and object-based image analysis (OBIA) to provide a transferable model for the detailed characterization of urban land-cover classes using WorldView-2 (WV-2) satellite images. Many articles on DM-based OBIA have been published in recent years for different applications. However, less attention has been paid to the generation of a transferable model for characterizing detailed urban land cover features. Three subsets of WV-2 images were used in this paper to generate transferable OBIA rule-sets. Many features were explored using a DM algorithm, which created the classification rules as a decision tree (DT) structure from the first study area. The developed DT algorithm was applied to object-based classification in the first study area. After this process, we validated the capability and transferability of the classification rules on the second and third subsets. Detailed ground truth samples were collected to assess the classification results. The first, second, and third study areas achieved 88%, 85%, and 85% overall accuracies, respectively. Results from the investigation indicate that DM was an efficient method to provide optimal and transferable classification rules for OBIA, which accelerates the rule-set creation stage in the OBIA classification domain.
Proposed Clinical Decision Rules to Diagnose Acute Rhinosinusitis Among Adults in Primary Care.
Ebell, Mark H; Hansen, Jens Georg
2017-07-01
To reduce inappropriate antibiotic prescribing, we sought to develop a clinical decision rule for the diagnosis of acute rhinosinusitis and acute bacterial rhinosinusitis. Multivariate analysis and classification and regression tree (CART) analysis were used to develop clinical decision rules for the diagnosis of acute rhinosinusitis, defined using 3 different reference standards (purulent antral puncture fluid or abnormal finding on a computed tomographic (CT) scan; for acute bacterial rhinosinusitis, we used a positive bacterial culture of antral fluid). Signs, symptoms, C-reactive protein (CRP), and reference standard tests were prospectively recorded in 175 Danish patients aged 18 to 65 years seeking care for suspected acute rhinosinusitis. For each reference standard, we developed 2 clinical decision rules: a point score based on a logistic regression model and an algorithm based on a CART model. We identified low-, moderate-, and high-risk groups for acute rhinosinusitis or acute bacterial rhinosinusitis for each clinical decision rule. The point scores each had between 5 and 6 predictors, and an area under the receiver operating characteristic curve (AUROCC) between 0.721 and 0.767. For positive bacterial culture as the reference standard, low-, moderate-, and high-risk groups had a 16%, 49%, and 73% likelihood of acute bacterial rhinosinusitis, respectively. CART models had an AUROCC ranging from 0.783 to 0.827. For positive bacterial culture as the reference standard, low-, moderate-, and high-risk groups had a likelihood of acute bacterial rhinosinusitis of 6%, 31%, and 59% respectively. We have developed a series of clinical decision rules integrating signs, symptoms, and CRP to diagnose acute rhinosinusitis and acute bacterial rhinosinusitis with good accuracy. They now require prospective validation and an assessment of their effect on clinical and process outcomes. © 2017 Annals of Family Medicine, Inc.
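A logistic-model point score of the kind described above is typically built by scaling each regression coefficient against the smallest one, rounding to integer points, summing the points present in a patient, and mapping the total to a risk band. The predictors, weights, and cut-offs below are hypothetical, not the published rule.

```python
def to_points(coefficients):
    """Convert logistic regression coefficients to integer points by
    scaling against the smallest coefficient magnitude."""
    smallest = min(abs(c) for c in coefficients.values())
    return {k: round(c / smallest) for k, c in coefficients.items()}

def risk_group(score):
    """Map a total point score to a risk band (illustrative cut-offs)."""
    if score <= 1:
        return "low"
    if score <= 3:
        return "moderate"
    return "high"

# Hypothetical coefficients for three sign/symptom/CRP predictors.
coefs = {"purulent_secretion": 0.9, "elevated_CRP": 1.3, "double_sickening": 0.7}
points = to_points(coefs)

patient = {"purulent_secretion": True, "elevated_CRP": True,
           "double_sickening": False}
score = sum(points[k] for k, present in patient.items() if present)
```

The resulting score places this hypothetical patient in the moderate-risk band, the group for which further testing (rather than immediate antibiotics) would typically be considered.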
Clarity versus complexity: land-use modeling as a practical tool for decision-makers
Sohl, Terry L.; Claggett, Peter
2013-01-01
The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
2011-01-01
Background A real-time clinical decision support system (RTCDSS) with interactive diagrams enables clinicians to instantly and efficiently track patients' clinical records (PCRs) and improve the quality of clinical care. We propose an RTCDSS to process online clinical informatics from multiple databases for clinical decision making in the treatment of prostate cancer based on Web Model-View-Controller (MVC) architecture, by which the system can easily be adapted to different diseases and applications. Methods We designed a framework upon the Web MVC-based architecture in which the reusable and extractable models can be conveniently adapted to other hospital information systems and which allows for efficient database integration. Then, we determined the clinical variables of the prostate cancer treatment based on participating clinicians' opinions and developed a computational model to determine the pretreatment parameters. Furthermore, the components of the RTCDSS integrated PCRs and decision factors for real-time analysis to provide evidence-based diagrams upon the clinician-oriented interface for visualization of treatment guidance and health risk assessment. Results The resulting system can improve the quality of clinical treatment by allowing clinicians to concurrently analyze and evaluate the clinical markers of prostate cancer patients with instantaneous clinical data and evidence-based diagrams which can automatically identify pretreatment parameters. Moreover, the proposed RTCDSS can aid interactions between patients and clinicians. Conclusions Our proposed framework supports online clinical informatics, evaluates treatment risks, offers interactive guidance, and provides real-time reference for decision making in the treatment of prostate cancer.
The developed clinician-oriented interface can assist clinicians in conveniently presenting evidence-based information to patients and can be readily adapted to an existing hospital information system and be easily applied in other chronic diseases. PMID:21385459
Lin, Hsueh-Chun; Wu, Hsi-Chin; Chang, Chih-Hung; Li, Tsai-Chung; Liang, Wen-Miin; Wang, Jong-Yi
2011-03-08
A real-time clinical decision support system (RTCDSS) with interactive diagrams enables clinicians to instantly and efficiently track patients' clinical records (PCRs) and improve the quality of clinical care. We propose an RTCDSS to process online clinical informatics from multiple databases for clinical decision making in the treatment of prostate cancer based on Web Model-View-Controller (MVC) architecture, by which the system can easily be adapted to different diseases and applications. We designed a framework upon the Web MVC-based architecture in which the reusable and extractable models can be conveniently adapted to other hospital information systems and which allows for efficient database integration. Then, we determined the clinical variables of the prostate cancer treatment based on participating clinicians' opinions and developed a computational model to determine the pretreatment parameters. Furthermore, the components of the RTCDSS integrated PCRs and decision factors for real-time analysis to provide evidence-based diagrams upon the clinician-oriented interface for visualization of treatment guidance and health risk assessment. The resulting system can improve the quality of clinical treatment by allowing clinicians to concurrently analyze and evaluate the clinical markers of prostate cancer patients with instantaneous clinical data and evidence-based diagrams which can automatically identify pretreatment parameters. Moreover, the proposed RTCDSS can aid interactions between patients and clinicians. Our proposed framework supports online clinical informatics, evaluates treatment risks, offers interactive guidance, and provides real-time reference for decision making in the treatment of prostate cancer. The developed clinician-oriented interface can assist clinicians in conveniently presenting evidence-based information to patients and can be readily adapted to an existing hospital information system and be easily applied in other chronic diseases.
Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus
2017-09-05
Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. © 2017 Society for Risk Analysis.
Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R
2018-04-25
Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions among their structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the DSS model was judged feasible by the domain experts and reached level 7 of the TRL (system prototype demonstration in operational environment).
This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) for analysing the complexity of health systems. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1992-01-01
Various papers on control paradigms and data structures in sensor fusion are presented. The general topics addressed include: decision models and computational methods, sensor modeling and data representation, active sensing strategies, geometric planning and visualization, task-driven sensing, motion analysis, models motivated by biology and psychology, decentralized detection and distributed decision, data fusion architectures, robust estimation of shapes and features, application and implementation. Some of the individual subjects considered are: the Firefly experiment on neural networks for distributed sensor data fusion, manifold traversing as a model for learning control of autonomous robots, choice of coordinate systems for multiple sensor fusion, continuous motion using task-directed stereo vision, interactive and cooperative sensing and control for advanced teleoperation, knowledge-based imaging for terrain analysis, physical and digital simulations for IVA robotics.
Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling
USDA-ARS?s Scientific Manuscript database
We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...
Publishing web-based guidelines using interactive decision models.
Sanders, G D; Nease, R F; Owens, D K
2001-05-01
Commonly used methods for guideline development and dissemination do not enable developers to tailor guidelines systematically to specific patient populations and update guidelines easily. We developed a web-based system, ALCHEMIST, that uses decision models and automatically creates evidence-based guidelines that can be disseminated, tailored and updated over the web. Our objective was to demonstrate the use of this system with clinical scenarios that provide challenges for guideline development. We used the ALCHEMIST system to develop guidelines for three clinical scenarios: (1) Chlamydia screening for adolescent women, (2) antiarrhythmic therapy for the prevention of sudden cardiac death; and (3) genetic testing for the BRCA breast-cancer mutation. ALCHEMIST uses information extracted directly from the decision model, combined with the additional information from the author of the decision model, to generate global guidelines. ALCHEMIST generated electronic web-based guidelines for each of the three scenarios. Using ALCHEMIST, we demonstrate that tailoring a guideline for a population at high-risk for Chlamydia changes the recommended policy for control of Chlamydia from contact tracing of reported cases to a population-based screening programme. We used ALCHEMIST to incorporate new evidence about the effectiveness of implantable cardioverter defibrillators (ICD) and demonstrate that the cost-effectiveness of use of ICDs improves from $74 400 per quality-adjusted life year (QALY) gained to $34 500 per QALY gained. Finally, we demonstrate how a clinician could use ALCHEMIST to incorporate a woman's utilities for relevant health states and thereby develop patient-specific recommendations for BRCA testing; the patient-specific recommendation improved quality-adjusted life expectancy by 37 days. The ALCHEMIST system enables guideline developers to publish both a guideline and an interactive decision model on the web. 
This web-based tool enables guideline developers to tailor guidelines systematically, to update guidelines easily, and to make the underlying evidence and analysis transparent for users.
ERIC Educational Resources Information Center
Harrington, Robert; Jenkins, Peter; Marzke, Carolyn; Cohen, Carol
Prominent among the new models of social service delivery are organizations providing comprehensive, community-based supports and services (CCBSS) to children and their families. A needs analysis explored CCBSS sites' interest in and readiness to use a software tool designed to help them make more effective internal resource allocation decisions…
Multiscale modelling and analysis of collective decision making in swarm robotics.
Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey
2014-01-01
We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.
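The core idea in this abstract, tracking the symmetry parameter of a two-option swarm decision as a continuous-time Markov process, can be sketched as follows. The swarm size, imitation rates, and seed below are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)

# State: number k of robots favouring option A out of N; the symmetry
# parameter s = (2k - N) / N measures progress towards a decision.
N = 20

def simulate(t_end=50.0):
    """Gillespie-style simulation of a toy two-option decision process."""
    k, t = N // 2, 0.0  # start undecided
    while t < t_end:
        up = (N - k) * k / N    # rate: a B-robot switches to option A
        down = k * (N - k) / N  # rate: an A-robot switches to option B
        total = up + down
        if total == 0:          # consensus reached (k = 0 or k = N)
            break
        t += rng.exponential(1.0 / total)  # exponential waiting time
        k += 1 if rng.random() < up / total else -1
    return (2 * k - N) / N

s = simulate()  # symmetry parameter in [-1, 1]; |s| = 1 at consensus
```

With these symmetric rates the walk is a martingale, so repeated runs drift to either consensus with equal probability; the paper's contribution is deriving when such a one-dimensional Markov description is valid.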
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
Spatially explicit multi-criteria decision analysis for managing vector-borne diseases
2011-01-01
The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular. 
PMID:22206355
Liu, Pei-Yang
2014-01-01
Metabolic syndrome (MetS) in young adults (age 20–39) is often undiagnosed. A simple screening tool using a surrogate measure might be invaluable in the early detection of MetS. Methods. A chi-squared automatic interaction detection (CHAID) decision tree analysis with waist circumference user-specified as the first level was used to detect MetS in young adults using data from the National Health and Nutrition Examination Survey (NHANES) 2009-2010 Cohort as a representative sample of the United States population (n = 745). Results. Twenty percent of the sample met the National Cholesterol Education Program Adult Treatment Panel III (NCEP) classification criteria for MetS. The user-specified CHAID model was compared with both a CHAID model with no user-specified first level and a logistic regression-based model. This analysis identified waist circumference as a strong predictor in the MetS diagnosis. The accuracy of the final model with waist circumference user-specified as the first level was 92.3%, with a MetS detection rate of 71.8%, which outperformed the comparison models. Conclusions. Preliminary findings suggest that young adults at risk for MetS could be identified for further follow-up based on their waist circumference. Decision tree methods show promise for the development of a preliminary detection algorithm for MetS. PMID:24817904
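CHAID with a user-specified first split is not available off the shelf in common Python libraries, but the two-stage idea, forcing waist circumference as the first level and then using a chi-squared test to judge a candidate second-level split, can be sketched on synthetic data. The 102 cm cutoff, the second predictor, and the data below are illustrative assumptions, not NHANES values:

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)

# Hypothetical screening data: waist circumference (cm), a second
# binary risk factor, and MetS status (illustrative values only).
n = 400
waist = rng.normal(95, 12, n)
factor = rng.integers(0, 2, n)
mets = ((waist > 102) & (factor == 1)) | (rng.random(n) < 0.05)

# User-specified first level: split on a waist-circumference cutoff,
# mirroring the study's design choice.
CUTOFF = 102.0
high_waist = waist > CUTOFF

# CHAID-style second level: chi-squared test of the candidate
# predictor against MetS status within the high-waist branch.
table = np.zeros((2, 2))
for f in (0, 1):
    for m in (0, 1):
        table[f, m] = np.sum((factor == f) & (mets == m) & high_waist)
chi2, p, _, _ = chi2_contingency(table)
# A small p-value would justify splitting this branch on the factor;
# CHAID repeats such tests (with merging and Bonferroni adjustment)
# over all candidate predictors at every node.
```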
Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio
2016-11-01
Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are yet not readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.
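The margin-of-exposure ranking at the core of this kind of tool can be sketched in a few lines. The benchmark and exposure numbers below are made-up placeholders, not the study's measurements:

```python
# Hypothetical hazard benchmarks (mg/m^3) and modelled workplace
# exposure concentrations (mg/m^3); values are illustrative only.
hazard_benchmark = {"TiO2": 10.0, "ZnO": 0.5, "Ag": 0.1, "MWCNT": 0.05}
exposure = {"TiO2": 0.2, "ZnO": 0.1, "Ag": 0.001, "MWCNT": 0.0002}

# Margin of exposure: benchmark divided by exposure; a smaller MoE
# means the scenario sits closer to the hazardous dose.
moe = {mn: hazard_benchmark[mn] / exposure[mn] for mn in hazard_benchmark}

# Prioritise by ascending MoE (highest-risk scenario first).
priority = sorted(moe, key=moe.get)
```

With these placeholder numbers ZnO comes out first, echoing the abstract's observation that its in-vivo hazard potential dominates the ranking; the actual tool layers a multi-criteria decision analysis algorithm and Monte Carlo uncertainty analysis on top of this ratio.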
Multi-criteria comparative evaluation of spallation reaction models
NASA Astrophysics Data System (ADS)
Andrianov, Andrey; Andrianova, Olga; Konobeev, Alexandr; Korovin, Yury; Kuptsov, Ilya
2017-09-01
This paper presents an approach to a comparative evaluation of the predictive ability of spallation reaction models based on widely used, well-proven multiple-criteria decision analysis methods (MAVT/MAUT, AHP, TOPSIS, PROMETHEE) and the results of such a comparison for 17 spallation reaction models in the presence of the interaction of high-energy protons with natPb.
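Of the methods listed, TOPSIS is the most compact to illustrate. A minimal sketch of ranking candidate models on weighted criteria, with hypothetical scores rather than the paper's spallation benchmarks, might look like:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (TOPSIS).

    matrix:  (n_alternatives, n_criteria) raw scores
    weights: criterion weights summing to 1
    benefit: True where higher is better, False where lower is better
    """
    M = np.asarray(matrix, dtype=float)
    # 1. Vector-normalise each criterion column.
    V = (M / np.linalg.norm(M, axis=0)) * np.asarray(weights)
    # 2. Ideal and anti-ideal points, respecting criterion direction.
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 3. Closeness coefficient: distance to anti-ideal over total distance.
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical scores for three models on two criteria
# (predictive accuracy: higher is better; error spread: lower is better).
scores = topsis([[0.9, 10.0], [0.8, 2.0], [0.6, 1.0]],
                weights=[0.7, 0.3], benefit=[True, False])
ranking = np.argsort(-scores)  # best alternative first
```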
Constantinou, Anthony Costa; Yet, Barbaros; Fenton, Norman; Neil, Martin; Marsh, William
2016-01-01
Inspired by real-world examples from the forensic medical sciences domain, we seek to determine whether a decision about an interventional action could be subject to amendments on the basis of some incomplete information within the model, and whether it would be worthwhile for the decision maker to seek further information prior to suggesting a decision. The method is based on the underlying principle of Value of Information to enhance decision analysis in interventional and counterfactual Bayesian networks. The method is applied to two real-world Bayesian network models (previously developed for decision support in forensic medical sciences) to examine the average gain in terms of both Value of Information (average relative gain ranging from 11.45% to 59.91%) and decision making (potential amendments in decision making ranging from 0% to 86.8%). We have shown how the method becomes useful for decision makers, not only when decision making is subject to amendments on the basis of some unknown risk factors, but also when it is not. Knowing that a decision outcome is independent of one or more unknown risk factors saves us from the trouble of seeking information about the particular set of risk factors. Further, we have also extended the assessment of this implication to the counterfactual case and demonstrated how answers about interventional actions are expected to change when some unknown factors become known, and how useful this becomes in forensic medical science. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Nowak, W.; Schöniger, A.; Wöhling, T.; Illman, W. A.
2016-12-01
Model-based decision support requires justifiable models with good predictive capabilities. This, in turn, calls for a fine adjustment between predictive accuracy (small systematic model bias that can be achieved with rather complex models), and predictive precision (small predictive uncertainties that can be achieved with simpler models with fewer parameters). The implied complexity/simplicity trade-off depends on the availability of informative data for calibration. If not available, additional data collection can be planned through optimal experimental design. We present a model justifiability analysis that can compare models of vastly different complexity. It rests on Bayesian model averaging (BMA) to investigate the complexity/performance trade-off dependent on data availability. Then, we disentangle the complexity component from the performance component. We achieve this by replacing actually observed data by realizations of synthetic data predicted by the models. This results in a "model confusion matrix". Based on this matrix, the modeler can identify the maximum model complexity that can be justified by the available (or planned) amount and type of data. As a side product, the matrix quantifies model (dis-)similarity. We apply this analysis to aquifer characterization via hydraulic tomography, comparing four models with a vastly different number of parameters (from a homogeneous model to geostatistical random fields). As a testing scenario, we consider hydraulic tomography data. Using subsets of these data, we determine model justifiability as a function of data set size. The test case shows that geostatistical parameterization requires a substantial amount of hydraulic tomography data to be justified, while a zonation-based model can be justified with more limited data set sizes. The actual model performance (as opposed to model justifiability), however, depends strongly on the quality of prior geological information.
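The justifiability analysis can be illustrated with a toy version of the "model confusion matrix": generate synthetic data from each candidate model, then compute approximate BMA weights for every candidate on each synthetic data set. This sketch uses a BIC approximation to the model weights rather than full Bayesian integration, and two Gaussian "models" as hypothetical stand-ins for the paper's homogeneous and zonation parameterisations:

```python
import numpy as np

rng = np.random.default_rng(1)

def bic(log_lik, k, n):
    """Bayesian information criterion for a fitted model."""
    return k * np.log(n) - 2.0 * log_lik

def gaussian_loglik(data, mean, sd):
    return np.sum(-0.5 * np.log(2 * np.pi * sd**2)
                  - (data - mean) ** 2 / (2 * sd**2))

def fit_and_score(data):
    """Approximate BMA weights for two candidates of differing complexity."""
    n = len(data)
    half = n // 2
    # M0: single mean (homogeneous stand-in, 1 parameter, fixed sd).
    l0 = gaussian_loglik(data, data.mean(), 1.0)
    # M1: separate means for two zones (zonation stand-in, 2 parameters).
    l1 = (gaussian_loglik(data[:half], data[:half].mean(), 1.0)
          + gaussian_loglik(data[half:], data[half:].mean(), 1.0))
    b = np.array([bic(l0, 1, n), bic(l1, 2, n)])
    w = np.exp(-0.5 * (b - b.min()))
    return w / w.sum()

# "Model confusion matrix": row = generating model, columns = BMA
# weights assigned to each candidate given that synthetic data set.
n = 200
data_m0 = rng.normal(0.0, 1.0, n)                         # homogeneous truth
data_m1 = np.concatenate([rng.normal(0.0, 1.0, n // 2),
                          rng.normal(2.0, 1.0, n // 2)])  # zoned truth
confusion = np.vstack([fit_and_score(data_m0), fit_and_score(data_m1)])
```

A strong diagonal indicates that the available data volume can discriminate the models, i.e. the more complex model is justified; an off-diagonal smear signals that the data cannot support the extra parameters.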
NASA Technical Reports Server (NTRS)
Tian, Jianhui; Porter, Adam; Zelkowitz, Marvin V.
1992-01-01
Identification of high cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their share of problems. A decision tree model was used to identify such modules. In this paper, a previously developed axiomatic model of program complexity is merged with the previously developed decision tree process to improve the ability to identify such modules. This improvement was tested using data from the NASA Software Engineering Laboratory.
An Analysis of the President’s Budgetary Proposals for Fiscal Year 2005
2004-03-01
productivity —which represents the state of technological know-how. The model is not forward-looking—people base their decisions entirely on... Bottles: A Meta-Analysis of Ricardian Equivalence,” Southern Economic Journal, vol. 64, no. 3 (January 1998), pp. 713-727. APPENDIX B THE MODELS ... Under the President’s Budget and Under CBO’s Baseline Policy Assumptions ... How Fiscal Policy Affects the Economy ... A Description of CBO’s Models and
2012-01-01
Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). 
Results The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. Conclusions The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint. PMID:22962944
Adrion, Christine; Mansmann, Ulrich
2012-09-10
A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). 
The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.
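The use of the mean logarithmic score for ranking count-data models can be sketched on synthetic overdispersed counts. This is a simplified in-sample, moment-matched version, not the paper's INLA-based leave-one-out computation, and the data-generating parameters are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical overdispersed attack counts (negative-binomial truth),
# standing in for longitudinal count data such as vertigo attacks.
counts = rng.negative_binomial(2.0, 0.4, size=300)

# Moment-matched candidate observation models.
mean = counts.mean()
var = counts.var(ddof=1)

# Candidate 1: Poisson, a single rate parameter (no overdispersion).
log_score_pois = -np.mean(stats.poisson.logpmf(counts, mean))

# Candidate 2: negative binomial via moment matching, var = mu + mu^2 / r.
r = mean**2 / max(var - mean, 1e-9)
p = r / (r + mean)
log_score_nb = -np.mean(stats.nbinom.logpmf(counts, r, p))

# Smaller mean logarithmic score = better predictive calibration; on
# overdispersed data the negative binomial should beat the Poisson,
# mirroring the paper's warning about the naive Poisson choice.
```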
Data Clustering and Evolving Fuzzy Decision Tree for Data Base Classification Problems
NASA Astrophysics Data System (ADS)
Chang, Pei-Chann; Fan, Chin-Yuan; Wang, Yen-Wen
Database classification suffers from two well-known difficulties: high dimensionality and non-stationary variations within large historic data. This paper presents a hybrid classification model that integrates a case-based reasoning technique, a Fuzzy Decision Tree (FDT), and Genetic Algorithms (GA) to construct a decision-making system for data classification in various database applications. The model is chiefly based on the idea that the historic database can be transformed into a smaller case base together with a group of fuzzy decision rules. As a result, the model can respond more accurately to the data being classified, using inductions from the fuzzy decision trees built on these smaller case bases. Hit rate is applied as a performance measure, and the effectiveness of the proposed model is demonstrated by experimental comparison with other approaches on different database classification applications. The average hit rate of the proposed model is the highest among those compared.
NASA Astrophysics Data System (ADS)
Cheng, Fen; Hu, Wanxin
2017-05-01
Based on an analysis of domestic and international experience with parking policy, we design a process for analyzing the impact of parking strategies. First, group decision theory is used to create a parking strategy index system and calculate its weights; the index system covers the government, parking operators, and travelers. Then, multi-level extension theory is used to analyze the CBD parking strategy, assessing it by calculating the correlation of each indicator. Finally, a parking charge strategy is assessed through a case study, providing a scientific and reasonable basis for evaluating parking strategies. The results show that the model can effectively evaluate multi-target, multi-attribute parking policies.
Analysis of stock investment selection based on CAPM using covariance and genetic algorithm approach
NASA Astrophysics Data System (ADS)
Sukono; Susanti, D.; Najmia, M.; Lesmana, E.; Napitupulu, H.; Supian, S.; Putra, A. S.
2018-03-01
Investment is one of the factors of economic growth in many countries, especially Indonesia. Stocks are a liquid form of investment. In making stock investment decisions, investors need to choose stocks that can generate maximum returns at a minimum level of risk, and therefore need to know how to allocate capital for optimal benefit. This study discusses stock investment based on the CAPM, estimated using a covariance approach and a genetic algorithm approach. It is assumed that the stocks analyzed follow the CAPM. The beta parameter of the CAPM equation is estimated in two ways: first with the covariance approach, and second with genetic algorithm optimization. As a numerical illustration, ten stocks traded on the Indonesian capital market are analyzed. The results show that estimating the beta parameter with the covariance approach and the genetic algorithm approach yields the same decision: six underpriced stocks with a buy decision and four overpriced stocks with a sell decision. Based on the analysis, the results can be used as a consideration for investors to buy the six underpriced stocks and sell the four overpriced stocks.
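The covariance approach to the beta parameter has a simple closed form, beta = Cov(Ri, Rm) / Var(Rm), and the buy/sell decision follows from comparing the realized mean return with the CAPM-required return. A minimal sketch (the toy return series and risk-free rate are invented for illustration; the paper's genetic algorithm estimation is not shown):

```python
def mean(xs):
    return sum(xs) / len(xs)

def covariance(xs, ys):
    """Sample covariance (n - 1 denominator)."""
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

def capm_beta(stock_returns, market_returns):
    """Beta via the covariance approach: Cov(Ri, Rm) / Var(Rm)."""
    return (covariance(stock_returns, market_returns)
            / covariance(market_returns, market_returns))

def capm_decision(realized_mean, risk_free, market_mean, beta):
    """Buy if the realized mean return exceeds the CAPM-required return."""
    required = risk_free + beta * (market_mean - risk_free)
    return "buy (underpriced)" if realized_mean > required else "sell (overpriced)"

# Toy monthly returns (invented for illustration).
ri = [0.03, -0.02, 0.05, 0.02]   # stock
rm = [0.02, -0.01, 0.03, 0.01]   # market index
beta = capm_beta(ri, rm)
print(capm_decision(mean(ri), 0.005, mean(rm), beta))  # buy (underpriced)
```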
Energy-Water Nexus: Balancing the Tradeoffs between Two-Level Decision Makers
Zhang, Xiaodong; Vesselinov, Velimir Valentinov
2016-09-03
The energy-water nexus has gained substantially increased importance in recent years. Synergistic approaches based on systems analysis and mathematical models are critical for helping decision makers better understand the interrelationships and tradeoffs between energy and water. In energy-water nexus management, various decision makers with different, often conflicting, goals and preferences are involved. These decision makers may have different controlling power over the management objectives and the decisions, and they make decisions sequentially from the upper level to the lower level, challenging decision making in the energy-water nexus. To address such planning issues, a bi-level decision model is developed, which improves upon existing studies by integrating bi-level programming into energy-water nexus management. The developed model represents a methodological contribution to the challenge of sequential decision making in the energy-water nexus through the provision of an integrated modeling framework/tool. An interactive fuzzy optimization methodology is introduced to seek a satisfactory solution that meets the overall satisfaction of the two-level decision makers. The tradeoffs between the two-level decision makers in energy-water nexus management are effectively addressed and quantified. Application of the proposed model to a synthetic example problem has demonstrated its applicability in practical energy-water nexus management. Optimal solutions are generated for electricity generation, fuel supply, water supply (including groundwater, surface water, and recycled water), capacity expansion of the power plants, and GHG emission control. In conclusion, these analyses can help decision makers or stakeholders adjust their tolerances to make informed decisions and achieve overall satisfaction in energy-water nexus management where a bi-level sequential decision-making process is involved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The neural representation of unexpected uncertainty during value-based decision making.
Payzan-LeNestour, Elise; Dunne, Simon; Bossaerts, Peter; O'Doherty, John P
2013-07-10
Uncertainty is an inherent property of the environment and a central feature of models of decision-making and learning. Theoretical propositions suggest that one form, unexpected uncertainty, may be used to rapidly adapt to changes in the environment, while being influenced by two other forms: risk and estimation uncertainty. While previous studies have reported neural representations of estimation uncertainty and risk, relatively little is known about unexpected uncertainty. Here, participants performed a decision-making task while undergoing functional magnetic resonance imaging (fMRI), which, in combination with a Bayesian model-based analysis, enabled us to separately examine each form of uncertainty. We found representations of unexpected uncertainty in multiple cortical areas, as well as the noradrenergic brainstem nucleus locus coeruleus. Other unique cortical regions were found to encode risk, estimation uncertainty, and learning rate. Collectively, these findings support theoretical models in which several formally separable uncertainty computations determine the speed of learning. Copyright © 2013 Elsevier Inc. All rights reserved.
Racial Labor Market Gaps: The Role of Abilities and Schooling Choices
ERIC Educational Resources Information Center
Urzua, Sergio
2008-01-01
This paper studies the relationship between abilities, schooling choices, and black-white differentials in labor market outcomes. The analysis is based on a model of endogenous schooling choices. Agents' schooling decisions are based on expected future earnings, family background, and unobserved abilities. Earnings are also determined by…
Decision aids for multiple-decision disease management as affected by weather input errors.
Pfender, W F; Gent, D H; Mahaffee, W F; Coop, L B; Fox, A D
2011-06-01
Many disease management decision support systems (DSSs) rely, exclusively or in part, on weather inputs to calculate an indicator of disease hazard. Error in the weather inputs, typically due to forecasting, interpolation, or estimation from off-site sources, may affect model calculations and management decision recommendations. The extent to which errors in weather inputs affect the quality of the final management outcome depends on a number of aspects of the disease management context, including whether management consists of a single dichotomous decision or of a multi-decision process extending over the cropping season(s). Decision aids for multi-decision disease management typically are based on simple or complex algorithms applied to weather data, which may be accumulated over several days or weeks. It is difficult to quantify the accuracy of multi-decision DSSs due to temporally overlapping disease events, the existence of more than one solution for optimizing the outcome, opportunities to take later recourse to modify earlier decisions, and the ongoing, complex decision process in which the DSS is only one component. One approach to assessing the importance of weather input errors is to conduct an error analysis in which the DSS outcome from high-quality weather data is compared with that from weather data with various levels of bias and/or variance from the original data. We illustrate this analytical approach for two types of DSS: an infection risk index for hop powdery mildew and a simulation model for grass stem rust. Further exploration of analysis methods is needed to address problems associated with assessing uncertainty in multi-decision DSSs.
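The error analysis described above, comparing DSS decisions from high-quality weather data with decisions from biased and/or noisy versions of the same data, can be sketched as below. The risk index is a deliberately simplified stand-in (the real hop powdery mildew index is more involved), and all thresholds and data are invented for illustration:

```python
import random

def infection_risk_index(temps, low=15.0, high=30.0):
    """Toy daily risk index: fraction of hours in a favorable temperature range."""
    favorable = sum(1 for t in temps if low <= t <= high)
    return favorable / len(temps)

def spray_decision(temps, action_threshold=0.5):
    """Dichotomous management decision derived from the risk index."""
    return infection_risk_index(temps) >= action_threshold

def decision_agreement(true_temps, bias, noise_sd, trials=200, seed=1):
    """Fraction of trials in which biased/noisy weather inputs lead to the
    same management decision as the high-quality record."""
    rng = random.Random(seed)
    baseline = spray_decision(true_temps)
    same = sum(
        1 for _ in range(trials)
        if spray_decision([t + bias + rng.gauss(0.0, noise_sd)
                           for t in true_temps]) == baseline
    )
    return same / trials

temps = [10.0, 16.0, 20.0, 25.0, 14.0, 18.0]  # hourly temperatures (invented)
print(decision_agreement(temps, bias=0.0, noise_sd=0.0))   # 1.0: no input error
print(decision_agreement(temps, bias=-3.0, noise_sd=2.0))  # agreement under a cold bias
```

Sweeping `bias` and `noise_sd` over a grid reproduces the spirit of the paper's analysis: the outcome is a map of how robust the recommendation is to weather-input error.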
Practical example of game theory application for production route selection
NASA Astrophysics Data System (ADS)
Olender, M.; Krenczyk, D.
2017-08-01
The opportunity which opens before manufacturers on the dynamic market, especially before those from the sector of the small and medium-sized enterprises, is associated with the use of the virtual organizations concept. The planning stage of such organizations could be based on supporting decision-making tasks using the tools and formalisms taken from the game theory. In the paper the model of the virtual manufacturing network, along with the practical example of decision-making situation as two person game and the decision strategies with an analysis of calculation results are presented.
An Agent-Based Model of Evolving Community Flood Risk.
Tonn, Gina L; Guikema, Seth D
2018-06-01
Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent-based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near-miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high-risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in-depth behavioral and decision rules at the individual and community level. © 2017 Society for Risk Analysis.
Acquisition and production of skilled behavior in dynamic decision-making tasks
NASA Technical Reports Server (NTRS)
Kirlik, Alex
1993-01-01
Summaries of the four projects completed during the performance of this research are included. The four projects described are: Perceptual Augmentation Aiding for Situation Assessment, Perceptual Augmentation Aiding for Dynamic Decision-Making and Control, Action Advisory Aiding for Dynamic Decision-Making and Control, and Display Design to Support Time-Constrained Route Optimization. Papers based on each of these projects are currently in preparation. The theoretical framework upon which the first three projects are based, Ecological Task Analysis, was also developed during the performance of this research, and is described in a previous report. A project concerned with modeling strategies in human control of a dynamic system was also completed during the performance of this research.
Multiple attribute decision making model and application to food safety risk evaluation.
Ma, Lihua; Chen, Hong; Yan, Huizhe; Yang, Lifeng; Wu, Lifeng
2017-01-01
Decision making for supermarket food purchases is characterized by network relationships. This paper analyzes factors that influence supermarket food selection and proposes a supplier evaluation index system based on the whole process of food production. The authors established an interval-valued intuitionistic fuzzy set evaluation model based on the characteristics of the network relationships among decision makers, and validated it with a multiple attribute decision making case study. The proposed model thus provides a reliable, accurate method for multiple attribute decision making.
Modeling, simulation, and analysis at Sandia National Laboratories for health care systems
NASA Astrophysics Data System (ADS)
Polito, Joseph
1994-12-01
Modeling, Simulation, and Analysis are special competencies of the Department of Energy (DOE) National Laboratories which have been developed and refined through years of national defense work. Today, many of these skills are being applied to the problem of understanding the performance of medical devices and treatments. At Sandia National Laboratories we are developing models at all three levels of health care delivery: (1) phenomenology models for Observation and Test, (2) model-based outcomes simulations for Diagnosis and Prescription, and (3) model-based design and control simulations for the Administration of Treatment. A sampling of specific applications includes non-invasive sensors for blood glucose, ultrasonic scanning for the development of prosthetics, automated breast cancer diagnosis, laser burn debridement, surgical staple deformation, minimally invasive control for the administration of a photodynamic drug, and human-friendly decision support aids for computer-aided diagnosis. These and other projects are being performed at Sandia with support from the DOE and in cooperation with medical research centers and private companies. Our objective is to leverage government engineering, modeling, and simulation skills with the biotechnical expertise of the health care community to create a more knowledge-rich environment for decision making and treatment.
Decision curve analysis: a novel method for evaluating prediction models.
Vickers, Andrew J; Elkin, Elena B
2006-01-01
Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences, requires only the data set on which the models are tested, and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities in which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies that has advantages over other commonly used measures and techniques.
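The net benefit calculation underlying decision curve analysis follows directly from the description above: at a threshold probability pt, net benefit = TP/n - (FP/n) * pt/(1 - pt), compared against default strategies such as "treat all". A small sketch with invented toy data:

```python
def net_benefit(y_true, y_prob, pt):
    """Net benefit of a prediction model at threshold probability pt.

    Patients with predicted probability >= pt are treated:
    net benefit = TP/n - (FP/n) * pt / (1 - pt).
    """
    n = len(y_true)
    tp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 1)
    fp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 0)
    return tp / n - (fp / n) * pt / (1 - pt)

def treat_all_net_benefit(y_true, pt):
    """Net benefit of the default 'treat everyone' strategy."""
    prevalence = sum(y_true) / len(y_true)
    return prevalence - (1 - prevalence) * pt / (1 - pt)

# Toy data (invented): true outcomes and model-predicted probabilities.
y = [1, 1, 0, 0, 0]
p = [0.8, 0.6, 0.3, 0.2, 0.7]
print(net_benefit(y, p, 0.5))          # ~0.2
print(treat_all_net_benefit(y, 0.5))   # ~-0.2: the model adds value at pt = 0.5
```

Evaluating `net_benefit` over a range of pt values and plotting the results yields the decision curve the abstract describes.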
Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin IV, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W
2018-01-01
Background Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. Objective The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. Methods This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. 
We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who have chronic medical conditions, their primary care providers (PCPs), and clinic staff, as well as field observations of practice activities. Quantitative survey data will be summarized with simple descriptive statistics and relationships between participant characteristics and contraceptive recommendations (for PCPs), and current contraceptive use (for patients) will be examined using Fisher exact test. We will conduct thematic analysis of qualitative data from interviews and field observations. The integration of data will occur by comparing, contrasting, and synthesizing qualitative and quantitative findings to inform the future development and implementation of the intervention. Results We are currently enrolling practices and anticipate study completion in 15 months. Conclusions This protocol describes the first phase of a multiphase mixed methods study to develop and implement a Web-based decision tool that is customized to meet the needs of women with chronic medical conditions in primary care settings. Study findings will promote contraceptive counseling via shared decision making and reflect evidence-based guidelines for contraceptive selection. Trial Registration ClinicalTrials.gov NCT03153644; https://clinicaltrials.gov/ct2/show/NCT03153644 (Archived by WebCite at http://www.webcitation.org/6yUkA5lK8) PMID:29669707
Measuring sustainable development using a multi-criteria model: a case study.
Boggia, Antonio; Cortina, Carla
2010-11-01
This paper shows how Multi-criteria Decision Analysis (MCDA) can help in a complex process such as the assessment of the level of sustainability of a certain area. The paper presents the results of a study in which a model for measuring sustainability was implemented to better aid public policy decisions regarding sustainability. In order to assess sustainability in specific areas, a methodological approach based on multi-criteria analysis has been developed. The aim is to rank areas in order to understand the specific technical and/or financial support that they need to develop sustainable growth. The case study presented is an assessment of the level of sustainability in different areas of an Italian Region using the MCDA approach. Our results show that MCDA is a proper approach for sustainability assessment. The results are easy to understand, and the evaluation path is clear and transparent; this is what decision makers need to support their decisions. The multi-criteria evaluation model has been developed in accordance with the economic theory of sustainable development, so that the final results have a clear meaning in terms of sustainability. Copyright 2010 Elsevier Ltd. All rights reserved.
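The abstract does not specify which MCDA aggregation the study used; as a hedged illustration, a simple weighted-sum ranking over min-max normalized criteria, one common MCDA scheme for ranking areas, might look like this (the area names, criteria scores, and weights are invented):

```python
def mcda_rank(alternatives, weights):
    """Rank alternatives by a weighted sum of min-max normalized criteria.

    alternatives: dict mapping name -> list of criterion scores, where
    higher raw scores are assumed to be better on every criterion.
    """
    names = list(alternatives)
    n_criteria = len(weights)
    columns = [[alternatives[name][j] for name in names]
               for j in range(n_criteria)]

    def norm(value, column):
        lo, hi = min(column), max(column)
        return 0.0 if hi == lo else (value - lo) / (hi - lo)

    scores = {
        name: sum(w * norm(alternatives[name][j], columns[j])
                  for j, w in enumerate(weights))
        for name in names
    }
    return sorted(names, key=scores.get, reverse=True)

# Invented example: three areas scored on three sustainability criteria.
areas = {"A": [0.9, 0.2, 0.5], "B": [0.4, 0.8, 0.6], "C": [0.1, 0.5, 0.9]}
print(mcda_rank(areas, [0.5, 0.3, 0.2]))  # ['B', 'A', 'C']
```

The resulting ranking is what would drive the paper's goal of directing technical or financial support to specific areas.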
Decision-relevant evaluation of climate models: A case study of chill hours in California
NASA Astrophysics Data System (ADS)
Jagannathan, K. A.; Jones, A. D.; Kerr, A. C.
2017-12-01
The past decade has seen a proliferation of climate datasets, with over 60 climate models currently in use. Comparative evaluation and validation of models can assist practitioners in choosing the most appropriate models for adaptation planning. However, such assessments are usually conducted for `climate metrics' such as seasonal temperature, while sectoral decisions are often based on `decision-relevant outcome metrics' such as growing degree days or chill hours. Since climate models predict different metrics with varying skill, the goal of this research is to conduct a bottom-up evaluation of model skill for `outcome-based' metrics. Using chill hours (the number of hours in winter months where the temperature is less than 45 °F) in Fresno, CA, as a case, we assess how well different GCMs predict the historical mean and slope of chill hours, and whether and to what extent projections differ based on model selection. We then compare our results with other climate-based evaluations of the region to identify similarities and differences. For the model skill evaluation, historically observed chill hours were compared with simulations from 27 GCMs (and multiple ensembles). Model skill scores were generated based on a statistical hypothesis test of the comparative assessment. Future projections from RCP 8.5 runs were evaluated, and a simple bias correction was also conducted. Our analysis indicates that model skill in predicting the chill hour slope is dependent on skill in predicting mean chill hours, which results from the non-linear nature of the chill metric. However, there was no clear relationship between the models that performed well for the chill hour metric and those that performed well in other temperature-based evaluations (such as winter minimum temperature or diurnal temperature range).
Further, contrary to conclusions from other studies, we also found that the multi-model mean or large ensemble mean results may not always be the most appropriate for this outcome metric. Our assessment sheds light on key differences between global and local skill, and between the broad and specific skill of climate models, highlighting that decision-relevant model evaluation may be crucial for providing practitioners with the best available climate information for their specific needs.
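The outcome metric itself is simple to compute from hourly temperature records: a chill hour is any winter-month hour with a temperature below 45 °F. A minimal sketch (the Nov-Feb month set is an assumption for illustration; the study's own winter-month definition is not given in the abstract):

```python
def chill_hours(hourly_temps_f, threshold_f=45.0):
    """Chill hours: count of hours with temperature below 45 degrees F."""
    return sum(1 for t in hourly_temps_f if t < threshold_f)

def winter_chill_hours(records, threshold_f=45.0, winter_months=(11, 12, 1, 2)):
    """Chill hours restricted to winter months.

    records: iterable of (month, temp_f) pairs. The Nov-Feb month set is
    an assumption for illustration.
    """
    return sum(1 for month, t in records
               if month in winter_months and t < threshold_f)

print(chill_hours([44.0, 45.0, 46.0, 30.0]))  # 2 (45.0 itself does not count)
print(winter_chill_hours([(12, 40.0), (12, 50.0), (6, 30.0), (1, 44.9)]))  # 2
```

The hard threshold is what makes the metric non-linear in temperature, which is consistent with the abstract's observation that skill in the chill hour slope depends on skill in the mean.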
NASA Astrophysics Data System (ADS)
Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie
2015-08-01
The business failure of numerous companies results in financial crises. The high social costs associated with such crises have led people to search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with support vector machines. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among different modelling means. We reviewed research on forecasting business risk with support vector machines, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, support vector machines, random sampling, and group decision under the general frame for forecasting business risk. Under the APMF-WHAE frame with the support vector machine as the base predictive model, four specific predictive models were produced: a pure support vector machine; a hybrid support vector machine involving principal components analysis; a support vector machine ensemble involving random sampling and group decision; and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines produced dominating performance over the pure support vector machine and the support vector machine ensemble.
NASA Astrophysics Data System (ADS)
Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.
2015-12-01
While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of the impacts of climate variability on agroecosystems involves uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, the standard model calibration, validation, and uncertainty interval estimation techniques used for physically based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics model as well as of the overall coupled model. This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model
2016-01-01
The omnipresent need for optimisation requires constant improvement of companies’ business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually achieved by simulating the newly developed BP under various initial conditions and “what-if” scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterising a BPSS tool is a challenging task due to the complex selection criteria, which include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for the ranking of BPSS tools based on their technical characteristics, employing the DEX and qualitative-to-quantitative (QQ) methodologies. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694
Mehrotra, Sanjay; Kim, Kibaek
2011-12-01
We consider the problem of outcomes-based budget allocation to chronic disease prevention programs across the United States (US) to achieve greater geographical healthcare equity. We use the Diabetes Prevention and Control Programs (DPCP) of the Centers for Disease Control and Prevention (CDC) as an example. We present a multi-criteria robust weighted sum model for such multi-criteria decision making in a group decision setting. Principal component analysis and inverse linear programming techniques are presented and used to study the actual 2009 budget allocation by the CDC. Our results show that the CDC budget allocation process for the DPCPs is likely not model-based. In our empirical study, the relative weights for different prevalence and comorbidity factors and the corresponding budgets obtained under different weight regions are discussed. Parametric analysis suggests that money should be allocated to states to promote diabetes education and to increase patient-healthcare provider interactions to reduce disparity across the US.
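The weighted-sum idea underlying such a model can be sketched as follows. The state names, criterion values, weights, and the proportional allocation rule are all hypothetical; the paper's robust model optimizes over a region of weights rather than fixing a single weight vector.

```python
# Illustrative sketch of a weighted-sum multi-criteria score driving a
# proportional budget allocation. All numbers are hypothetical.

states = {
    # state: (diabetes prevalence, comorbidity index) -- invented values
    "A": (0.12, 0.30),
    "B": (0.09, 0.45),
    "C": (0.15, 0.25),
}
weights = (0.6, 0.4)    # relative importance of the two criteria
budget = 1_000_000.0

# Composite need score per state: weighted sum over the criteria.
scores = {s: sum(w * c for w, c in zip(weights, crit))
          for s, crit in states.items()}

# Allocate the budget in proportion to the composite scores.
total = sum(scores.values())
allocation = {s: budget * v / total for s, v in scores.items()}
```

A robust variant would report the range of allocations obtained as the weight vector varies over an agreed region, rather than a single split.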
Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H
2017-10-01
Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, in our case study an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
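The contrast between decision models (1) and (2) can be sketched numerically. In this illustration the flood probability, damage, measure cost, and damage reduction are invented, and the prospect theory parameters follow commonly cited estimates; it is not the article's calibrated model.

```python
# Sketch comparing a risk-neutral expected-value rule with a prospect
# theory valuation of a flood-mitigation investment. The flood
# probability, damage, cost, and damage reduction are hypothetical;
# alpha, lambda, and gamma follow commonly cited estimates.

p_flood, damage, cost, reduction = 0.01, 100_000.0, 1_500.0, 0.8

def pt_value(x, alpha=0.88, lam=2.25):
    """Value function: concave for gains, steeper (loss-averse) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def pt_weight(p, gamma=0.69):
    """Inverse-S probability weighting: small probabilities are overweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Risk-neutral expected values (reference point: no flood, no spending).
ev_no_invest = -p_flood * damage
ev_invest = -cost - p_flood * damage * (1 - reduction)

# Prospect theory valuation of the same two options.
pt_no_invest = pt_weight(p_flood) * pt_value(-damage)
pt_invest = pt_value(-cost) + pt_weight(p_flood) * pt_value(-damage * (1 - reduction))
```

With these numbers the risk-neutral agent declines the measure, while the prospect-theoretic agent, who overweights the small flood probability and is loss averse, prefers to invest — one mechanism by which the behavioral assumptions change aggregate risk.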
Kernel-Based Approximate Dynamic Programming Using Bellman Residual Elimination
2010-02-01
…framework is the ability to utilize stochastic system models, thereby allowing the system to make sound decisions even if there is randomness in the system … approximate policy when a system model is unavailable. We present theoretical analysis of all BRE algorithms, proving convergence to the optimal policy in … policies based on MDPs is that there may be parameters of the system model that are poorly known and/or vary with time as the system operates. System …
NASA Astrophysics Data System (ADS)
Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.
2007-12-01
Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. 
Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different management decisions. Our research results indicate that the (often large) observed differences between MPN and CFU values for the same water body are well within the ranges predicted by our probabilistic model. Our research also indicates that the probability of violating current water quality guidelines at specified true fecal coliform concentrations depends on the laboratory procedure used. As a result, quality-based management decisions, such as opening or closing a shellfishing area, may also depend on the laboratory procedure used.
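The MPN-as-MLE idea described above can be sketched directly: each inoculated tube turns positive with probability 1 − exp(−c·v) for true concentration c and aliquot volume v, and the MPN is the concentration maximizing the resulting likelihood. The three-dilution, five-tube design and the observed positives below are illustrative, not data from the study.

```python
import math

# Sketch of the MPN as a maximum likelihood estimate of the true fecal
# coliform concentration c (organisms/mL) from serial-dilution tube
# counts. The dilution design and observed positives are illustrative.

tubes = [
    # (aliquot volume in mL, tubes inoculated, tubes positive)
    (0.1,   5, 5),
    (0.01,  5, 3),
    (0.001, 5, 1),
]

def log_lik(c):
    """Binomial log-likelihood of concentration c over all dilutions."""
    ll = 0.0
    for v, n, pos in tubes:
        p = 1.0 - math.exp(-c * v)                       # P(tube positive)
        ll += pos * math.log(p) + (n - pos) * (-c * v)   # log(1 - p) = -c*v
    return ll

# Crude grid search for the MLE; a real implementation would use a
# proper one-dimensional optimizer.
grid = [x / 10 for x in range(1, 20001)]   # 0.1 .. 2000.0 organisms/mL
mpn = max(grid, key=log_lik)
```

For this 5/3/1 positive pattern the maximizer lands close to the classical tabulated MPN index for the same combination, illustrating why the MPN inherits the (often wide) sampling variability of the tube counts.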
Eppinger, Ben; Walter, Maik; Li, Shu-Chen
2017-04-01
In this study, we investigated the interplay of habitual (model-free) and goal-directed (model-based) decision processes by using a two-stage Markov decision task in combination with event-related potentials (ERPs) and computational modeling. To manipulate the demands on model-based decision making, we applied two experimental conditions with different probabilities of transitioning from the first to the second stage of the task. As we expected, when the stage transitions were more predictable, participants showed greater model-based (planning) behavior. Consistent with this result, we found that stimulus-evoked parietal (P300) activity at the second stage of the task increased with the predictability of the state transitions. However, the parietal activity also reflected model-free information about the expected values of the stimuli, indicating that at this stage of the task both types of information are integrated to guide decision making. Outcome-related ERP components only reflected reward-related processes: Specifically, a medial prefrontal ERP component (the feedback-related negativity) was sensitive to negative outcomes, whereas a component that is elicited by reward (the feedback-related positivity) increased as a function of positive prediction errors. Taken together, our data indicate that stimulus-locked parietal activity reflects the integration of model-based and model-free information during decision making, whereas feedback-related medial prefrontal signals primarily reflect reward-related decision processes.
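The hybrid valuation probed in such two-stage tasks is often modeled as a weighted mixture of model-based lookahead through the transition probabilities and cached model-free values. Below is a minimal sketch; the transition probabilities, second-stage values, learning rate, and mixing weight are illustrative choices, not the study's fitted parameters.

```python
# Sketch of hybrid model-free / model-based valuation in a two-stage
# Markov task. All numbers (transitions, second-stage values, alpha, w)
# are illustrative, not fitted to data.

alpha, w = 0.3, 0.6                  # learning rate, model-based weight
T = {0: (0.7, 0.3), 1: (0.3, 0.7)}   # P(second-stage state | first-stage action)
q_mf = [0.0, 0.0]                    # cached model-free first-stage values
q2 = [0.2, 0.8]                      # second-stage state values (held fixed here)

def hybrid_values():
    """Mix model-based planning (transition-weighted lookahead) with
    cached model-free values, weighted by w."""
    q_mb = [sum(p * q for p, q in zip(T[a], q2)) for a in (0, 1)]
    return [w * mb + (1 - w) * mf for mb, mf in zip(q_mb, q_mf)]

def update_mf(action, reward):
    """Temporal-difference update of the cached value after feedback."""
    q_mf[action] += alpha * (reward - q_mf[action])

vals = hybrid_values()
```

Making the transitions more predictable (T closer to deterministic) widens the gap between the two first-stage values, mirroring the greater planning behavior reported when transitions were predictable.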
A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis
Song, Zhiming; Wang, Maocai; Dai, Guangming; Vasile, Massimiliano
2015-01-01
It is known that the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m − 1)-dimensional manifold in the decision space under some mild conditions. How to exploit this regularity in the design of multiobjective optimization algorithms has become a research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, the centroid of which is an (m − 1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on nondominated sorting is used to choose the individuals for the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The results show that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper. PMID:25874246
Use of multiattribute utility theory for formulary management in a health system.
Chung, Seonyoung; Kim, Sooyon; Kim, Jeongmee; Sohn, Kieho
2010-01-15
The application, utility, and flexibility of the multiattribute utility theory (MAUT) when used as a formulary decision methodology in a Korean medical center were evaluated. A drug analysis model using MAUT, consisting of 10 steps, was designed for two drug classes: dihydropyridine calcium channel blockers (CCBs) and angiotensin II receptor blockers (ARBs). These two drug classes contain the most diverse agents among cardiovascular drugs on Samsung Medical Center's drug formulary. The attributes identified for inclusion in the drug analysis model were effectiveness, safety, patient convenience, and cost, with relative weights of 50%, 30%, 10%, and 10%, respectively. Factors were incorporated into the model to quantify the contribution of each attribute. For each factor, a utility scale of 0-100 was established, and the total utility score for each alternative was calculated. An attempt was made to make the model adaptable to changing health care and regulatory circumstances. The analysis revealed amlodipine besylate to be the alternative with the highest total utility score among the dihydropyridine CCBs, while barnidipine hydrochloride had the lowest score. For ARBs, losartan potassium had the greatest total utility score, while olmesartan medoxomil had the lowest. A drug analysis model based on the MAUT was successfully developed and used in making formulary decisions for dihydropyridine CCBs and ARBs for a Korean health system. The model incorporates sufficient utility and flexibility of a drug's attributes and can be used as an alternative decision-making tool for formulary management in health systems.
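The scoring described above is a weighted sum of per-attribute utilities. Here is a minimal sketch using the stated weights (effectiveness 50%, safety 30%, convenience 10%, cost 10%) but with hypothetical drug names and utility scores, not the study's data.

```python
# Sketch of the weighted-sum MAUT scoring: each drug receives a 0-100
# utility per attribute, combined with the stated relative weights.
# The drugs and their utility scores are hypothetical.

weights = {"effectiveness": 0.5, "safety": 0.3, "convenience": 0.1, "cost": 0.1}

drugs = {
    "drug_x": {"effectiveness": 90, "safety": 80, "convenience": 70, "cost": 60},
    "drug_y": {"effectiveness": 70, "safety": 90, "convenience": 80, "cost": 90},
}

def total_utility(scores):
    """Weighted sum of per-attribute utilities on the 0-100 scale."""
    return sum(weights[a] * u for a, u in scores.items())

# Rank alternatives by total utility, best first.
ranking = sorted(drugs, key=lambda d: total_utility(drugs[d]), reverse=True)
```

Note how the heavy effectiveness weight lets drug_x win despite drug_y scoring higher on three of the four attributes; sensitivity of the ranking to the weights is exactly what makes such a model adaptable to changing circumstances.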
Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E.; Harper, Martin
2015-01-01
This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. A prior distribution based on expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. The applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451
Decision making under uncertainty in a spiking neural network model of the basal ganglia.
Héricé, Charlotte; Khalil, Radwa; Moftah, Marie; Boraud, Thomas; Guthrie, Martin; Garenne, André
2016-12-01
The mechanisms of decision-making and action selection are generally thought to be under the control of parallel cortico-subcortical loops connecting back to distinct areas of cortex through the basal ganglia and processing motor, cognitive and limbic modalities of decision-making. We have used these properties to develop and extend a connectionist model at a spiking neuron level based on a previous rate model approach. This model is demonstrated on decision-making tasks that have been studied in primates, where the electrophysiology has been interpreted to show that the decision is made in two steps. To model this, we have used two parallel loops, each of which performs decision-making based on interactions between positive and negative feedback pathways. This model is able to perform two-level decision-making as in primates. We show here that, before learning, synaptic noise is sufficient to drive the decision-making process and that, after learning, the decision is based on the choice that has proven most likely to be rewarded. The model is then submitted to lesion tests, reversal learning and extinction protocols. We show that, under these conditions, it behaves in a consistent manner and provides predictions in accordance with observed experimental data.
Advancements in Risk-Informed Performance-Based Asset Management for Commercial Nuclear Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liming, James K.; Ravindra, Mayasandra K.
2006-07-01
Over the past several years, ABSG Consulting Inc. (ABS Consulting) and the South Texas Project Nuclear Operating Company (STPNOC) have developed a decision support process and associated software for risk-informed, performance-based asset management (RIPBAM) of nuclear power plant facilities. RIPBAM applies probabilistic risk assessment (PRA) tools and techniques in the realm of plant physical and financial asset management. The RIPBAM process applies a tiered set of models and supporting performance measures (or metrics) that can ultimately be applied to support decisions affecting the allocation and management of plant resources (e.g., funding, staffing, scheduling, etc.). In general, the ultimate goal of the RIPBAM process is to continually support decision-making to maximize a facility's net present value (NPV) and long-term profitability for its owners. While the initial applications of RIPBAM have been for nuclear power stations, the methodology can easily be adapted to other types of power station or complex facility decision-making support. RIPBAM can also be designed to focus on performance metrics other than NPV and profitability (e.g., mission reliability, operational availability, probability of mission success per dollar invested, etc.). Recent advancements in the RIPBAM process focus on expanding the scope of previous RIPBAM applications to include not only operations, maintenance, and safety issues, but also broader risk perception components affecting plant owner (stockholder), operator, and regulator biases. Conceptually, RIPBAM is a comprehensive risk-informed cash flow model for decision support. It originated as a tool to help manage plant refueling outage scheduling, and was later expanded to include the full spectrum of operations and maintenance decision support.
However, it differs from conventional business modeling tools in that it employs a systems engineering approach with broadly based probabilistic analysis of organizational 'value streams'. The scope of value stream inclusion in the process can be established by the user, but in its broadest applications, RIPBAM can be used to address how risk perceptions of plant owners and regulators are impacted by plant performance. Plant staffs can expand and refine the scope of RIPBAM models via a phased program of activities over time. This paper shows how the multi-metric uncertainty analysis feature of RIPBAM can apply a wide spectrum of decision-influencing factors to support decisions designed to maximize the probability of achieving, maintaining, and improving upon plant goals and objectives. In this paper, the authors show how this approach can be extremely valuable to plant owners and operators in supporting plant value-impacting decision-making processes. (authors)
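The core of a risk-informed cash-flow model of this kind is a Monte Carlo simulation of uncertain value streams discounted to an NPV distribution. The following is a minimal sketch; the revenue and cost distributions, discount rate, and horizon are invented for illustration and have nothing to do with RIPBAM's actual inputs.

```python
import random

# Sketch of a probabilistic cash-flow model: simulate uncertain annual
# value streams and summarize the resulting NPV distribution. All
# distributions and parameters are invented for illustration.

random.seed(42)
rate, years, n_sims = 0.07, 10, 5000

def simulate_npv():
    """One Monte Carlo draw of the discounted net cash flow."""
    npv = 0.0
    for t in range(1, years + 1):
        revenue = random.gauss(100.0, 15.0)   # uncertain annual revenue
        cost = random.gauss(60.0, 10.0)       # uncertain annual O&M cost
        npv += (revenue - cost) / (1 + rate) ** t
    return npv

npvs = sorted(simulate_npv() for _ in range(n_sims))
mean_npv = sum(npvs) / n_sims
p05 = npvs[int(0.05 * n_sims)]   # 5th-percentile downside outcome
```

The downside percentile is the kind of risk metric that can be weighed against the mean NPV when comparing resource-allocation alternatives.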
NASA Astrophysics Data System (ADS)
Benedict, K. K.
2008-12-01
Since 2004 the Earth Data Analysis Center, in collaboration with the researchers at the University of Arizona and George Mason University, with funding from NASA, has been developing a services oriented architecture (SOA) that acquires remote sensing, meteorological forecast, and observed ground level particulate data (EPA AirNow) from NASA, NOAA, and DataFed through a variety of standards-based service interfaces. These acquired data are used to initialize and set boundary conditions for the execution of the Dust Regional Atmospheric Model (DREAM) to generate daily 48-hour dust forecasts, which are then published via a combination of Open Geospatial Consortium (OGC) services (WMS and WCS), basic HTTP request-based services, and SOAP services. The goal of this work has been to develop services that can be integrated into existing public health decision support systems (DSS) to provide enhanced environmental data (i.e. ground surface particulate concentration estimates) for use in epidemiological analysis, public health warning systems, and syndromic surveillance systems. While the project has succeeded in deploying these products into the target systems, there has been differential adoption of the different service interface products, with the simple OGC and HTTP interfaces generating much greater interest by DSS developers and researchers than the more complex SOAP service interfaces. This paper reviews the SOA developed as part of this project and provides insights into how different service models may have a significant impact on the infusion of Earth science products into decision making processes and systems.
Konovalov, Arkady; Krajbich, Ian
2016-01-01
Organisms appear to learn and make decisions using different strategies known as model-free and model-based learning; the former is mere reinforcement of previously rewarded actions and the latter is a forward-looking strategy that involves evaluation of action-state transition probabilities. Prior work has used neural data to argue that both model-based and model-free learners implement a value comparison process at trial onset, but model-based learners assign more weight to forward-looking computations. Here using eye-tracking, we report evidence for a different interpretation of prior results: model-based subjects make their choices prior to trial onset. In contrast, model-free subjects tend to ignore model-based aspects of the task and instead seem to treat the decision problem as a simple comparison process between two differentially valued items, consistent with previous work on sequential-sampling models of decision making. These findings illustrate a problem with assuming that experimental subjects make their decisions at the same prescribed time. PMID:27511383
Sojda, Richard S.; Chen, Serena H.; El Sawah, Sondoss; Guillaume, Joseph H.A.; Jakeman, A.J.; Lautenbach, Sven; McIntosh, Brian S.; Rizzoli, A.E.; Seppelt, Ralf; Struss, Peter; Voinov, Alexey; Volk, Martin
2012-01-01
Two of the basic tenets of decision support system efforts are to help identify and structure the decisions to be supported, and to then provide analysis of how those decisions might best be made. One example from wetland management would be that wildlife biologists must decide when to draw down water levels to optimise aquatic invertebrates as food for breeding ducks. Once such a decision is identified, a system or tool to help them make that decision in the face of current and projected climate conditions could be developed. We examined a random sample of 100 papers published from 2001-2011 in Environmental Modelling and Software that used the phrase “decision support system” or “decision support tool”, and which are characteristic of different sectors. In our review, 41% of the systems and tools related to the water resources sector, 34% were related to agriculture, and 22% to the conservation of fish, wildlife, and protected area management. Only 60% of the papers were deemed to be reporting on DSS, as the remainder did not directly identify a specific decision to be supported. We also report on the techniques that were used to identify the decisions, such as formal survey, focus group, expert opinion, or sole judgment of the author(s). The primary underlying modelling system, e.g., expert system, agent-based model, Bayesian belief network, geographical information system (GIS), and the like, was categorised next. Finally, since decision support typically should target some aspect of unstructured decisions, we subjectively determined to what degree this was the case. In only 23% of the papers reviewed did the system appear to tackle unstructured decisions. This knowledge should be useful in helping workers in the field develop more effective systems and tools, especially by being exposed to the approaches in different, but related, disciplines.
We propose that a standard blueprint for reporting on DSS be developed for consideration by journal editors to aid them in filtering papers that use the term “decision support”.
Using structured decision making to manage disease risk for Montana wildlife
Mitchell, Michael S.; Gude, Justin A.; Anderson, Neil J.; Ramsey, Jennifer M.; Thompson, Michael J.; Sullivan, Mark G.; Edwards, Victoria L.; Gower, Claire N.; Cochrane, Jean Fitts; Irwin, Elise R.; Walshe, Terry
2013-01-01
We used structured decision-making to develop a 2-part framework to assist managers in the proactive management of disease outbreaks in Montana, USA. The first part of the framework is a model to estimate the probability of disease outbreak given field observations available to managers. The second part of the framework is decision analysis that evaluates likely outcomes of management alternatives based on the estimated probability of disease outbreak, and applies managers' values for different objectives to indicate a preferred management strategy. We used pneumonia in bighorn sheep (Ovis canadensis) as a case study for our approach, applying it to 2 populations in Montana that differed in their likelihood of a pneumonia outbreak. The framework provided credible predictions of both probability of disease outbreaks, as well as biological and monetary consequences of management actions. The structured decision-making approach to this problem was valuable for defining the challenges of disease management in a decentralized agency where decisions are generally made at the local level in cooperation with stakeholders. Our approach provides local managers with the ability to tailor management planning for disease outbreaks to local conditions. Further work is needed to refine our disease risk models and decision analysis, including robust prediction of disease outbreaks and improved assessment of management alternatives.
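The second part of such a framework, the decision analysis, can be sketched as an expected-loss comparison that combines the estimated outbreak probability with the consequences of each alternative and the managers' objective weights. The alternative names and all numbers below are invented placeholders, not the study's inputs.

```python
# Sketch of a decision analysis over management alternatives given an
# estimated disease-outbreak probability. All values are hypothetical.

p_outbreak = 0.3   # output of the (separate) outbreak-probability model

alternatives = {
    # alternative: (monetary cost, expected sheep lost if outbreak,
    #               expected sheep lost if no outbreak)
    "do_nothing":     (0.0,   150.0, 0.0),
    "reduce_contact": (40.0,   60.0, 0.0),
    "cull_herd":      (120.0,  10.0, 5.0),
}
weight_money, weight_sheep = 1.0, 2.0   # managers' relative values

def expected_loss(cost, loss_out, loss_no):
    """Probability-weighted biological loss plus weighted monetary cost."""
    sheep = p_outbreak * loss_out + (1 - p_outbreak) * loss_no
    return weight_money * cost + weight_sheep * sheep

best = min(alternatives, key=lambda a: expected_loss(*alternatives[a]))
```

Because the preferred strategy shifts with both `p_outbreak` and the weights, the same machinery lets local managers tailor the recommendation to local conditions and values.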
[The role of research-based evidence in health system policy decision-making].
Patiño, Daniel; Lavis, John N; Moat, Kaelan
2013-01-01
Different models may be used for explaining how research-based evidence is used in healthcare system policy-making. It is argued that models arising from a clinical setting (i.e. the evidence-based policy-making model) could be useful regarding some types of healthcare system decision-making. However, such models are "silent" concerning the influence of political contextual factors on healthcare policy-making and are thus inconsistent with decision-making regarding the modification of healthcare system arrangements. Other political science-based models would seem to be more useful for understanding that research is just one factor affecting decision-making and that different types of research-based evidence can be used instrumentally, conceptually, or strategically during different policy-making stages.
Operational Plan Ontology Model for Interconnection and Interoperability
NASA Astrophysics Data System (ADS)
Long, F.; Sun, Y. K.; Shi, H. Q.
2017-03-01
Aiming at the assistant decision-making system's bottleneck in processing operational plan data and information, this paper starts from an analysis of the problems with traditional representations and the technical advantages of ontology. It then defines the elements of the operational plan ontology model and determines the basis for its construction, and builds a semi-knowledge-level operational plan ontology model. Finally, it explores operational plan representation based on the ontology model and its use in application software. The work thus has theoretical significance and application value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.
Risk Decision Making Model for Reservoir Floodwater Resources Utilization
NASA Astrophysics Data System (ADS)
Huang, X.
2017-12-01
Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but it carries risks. In order to utilize floodwater resources safely and efficiently, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safety discharge are estimated. Based on the principle of minimum risk and maximum benefit of FRU, a multi-objective risk decision making model for FRU is constructed. Probability theory and mathematical statistics methods are used to calculate the risk rate; the C-D production function method and the emergy analysis method are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution of FRU of the Shilianghe reservoir is found by using the risk decision making model, and the validity and applicability of the model are verified.
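The constraint method for a two-objective model of this type can be sketched as an epsilon-constraint step: treat the risk rate as a constraint and maximize benefit among the feasible candidates. The candidate storage levels, benefits, and risk rates below are invented, not values for the Shilianghe reservoir.

```python
# Sketch of the epsilon-constraint method for the two-objective
# (risk vs. benefit) FRU model. All candidate values are hypothetical.

candidates = [
    # (floodwater storage level in m, benefit, risk rate of exceeding
    #  the design flood water level)
    (23.0, 10.0, 0.001),
    (23.5, 14.0, 0.004),
    (24.0, 17.0, 0.010),
    (24.5, 19.0, 0.030),
]

def best_under_risk_cap(eps):
    """Epsilon-constraint step: keep risk as a constraint (risk <= eps)
    and maximize the remaining benefit objective."""
    feasible = [c for c in candidates if c[2] <= eps]
    return max(feasible, key=lambda c: c[1]) if feasible else None

level, benefit, risk = best_under_risk_cap(eps=0.01)
```

Sweeping `eps` over a range of acceptable risk rates traces out the risk-benefit trade-off curve from which the equilibrium solution is chosen.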
Xia, Shang; Liu, Jiming
2013-01-01
In modeling individuals' vaccination decision making, existing studies have typically used payoff-based (e.g., game-theoretical) approaches that evaluate the risks and benefits of vaccination. In reality, whether an individual takes a vaccine or not is also influenced by the decisions of others, i.e., by the impact of social influence. In this regard, we present a dual-perspective view on individuals' decision making that incorporates both the cost analysis of vaccination and the impact of social influence. In doing so, we consider a group of individuals making their vaccination decisions by both minimizing the associated costs and evaluating the decisions of others. We apply social impact theory (SIT) to characterize the impact of social influence with respect to individuals' interaction relationships. By doing so, we propose a novel modeling framework that integrates an extended SIT-based characterization of social influence with a game-theoretical analysis of cost minimization. We consider the scenario of voluntary vaccination against an influenza-like disease through a series of simulations. We investigate the steady state of individuals' decision making and thus assess the impact of social influence by evaluating the coverage of vaccination for infectious disease control. Our simulation results suggest that individuals' high conformity to social influence will increase the vaccination coverage if the cost of vaccination is low and, conversely, will decrease it if the cost is high. Interestingly, if individuals are social followers, the resulting vaccination coverage converges to a certain level, depending on individuals' initial level of vaccination willingness rather than on the associated costs. We conclude that social influence will have an impact on the control of an infectious disease, as it can affect the vaccination coverage.
In this respect, our work can provide a means for modeling the impact of social influence as well as for estimating the effectiveness of a voluntary vaccination program. PMID:23585835
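The dual-perspective update can be sketched as a blend of an individual cost analysis with imitation of the group. The costs, response steepness, influence weight, and the crude well-mixed risk model below are illustrative assumptions, not the paper's SIT-based formulation.

```python
import math

# Sketch of a vaccination-willingness update blending cost analysis
# with social influence. All parameters are illustrative assumptions.

c_vac, c_inf = 0.2, 1.0   # relative costs of vaccination and infection
beta = 0.5                # weight on social influence vs. own cost analysis
steep = 10.0              # how sharply cost differences drive the decision

coverage = 0.5            # initial fraction willing to vaccinate
for _ in range(50):
    risk = 1.0 - coverage   # crude infection risk: falls as coverage rises
    # Probability of vaccinating from the cost analysis alone (logistic
    # response to the expected-loss difference).
    cost_signal = 1.0 / (1.0 + math.exp(-steep * (c_inf * risk - c_vac)))
    # Social followers partly copy the current group behaviour.
    coverage = (1 - beta) * cost_signal + beta * coverage
```

With these settings the willingness settles near 71% coverage, where the perceived infection loss roughly balances the vaccine cost; raising `c_vac` lowers the equilibrium, mirroring the cost effect reported above.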
NED-IIS: An Intelligent Information System for Forest Ecosystem Management
W.D. Potter; S. Somasekar; R. Kommineni; H.M. Rauscher
1999-01-01
We view an Intelligent Information System (IIS) as composed of a unified knowledge base, database, and model base. The model base includes decision support models, forecasting models, and visualization models, for example. In addition, we feel that the model base should include domain-specific problem-solving modules as well as decision support models. This, then,...
Volk, Michael L; Lok, Anna S F; Ubel, Peter A; Vijan, Sandeep
2008-01-01
The utilitarian foundation of decision analysis limits its usefulness for many social policy decisions. In this study, the authors examine a method to incorporate competing ethical principles in a decision analysis of liver transplantation for a patient with acute liver failure (ALF). A Markov model was constructed to compare the benefit of transplantation for a patient with ALF versus the harm caused to other patients on the waiting list and to determine the lowest acceptable 5-y posttransplant survival for the ALF patient. The weighting of the ALF patient and other patients was then adjusted using a multiattribute variable incorporating utilitarianism, urgency, and other principles such as fair chances. In the base-case analysis, the strategy of transplanting the ALF patient resulted in a 0.8% increase in the risk of death and a utility loss of 7.8 quality-adjusted days of life for each of the other patients on the waiting list. These harms cumulatively outweighed the benefit of transplantation for an ALF patient having a posttransplant survival of less than 48% at 5 y. However, the threshold for an acceptable posttransplant survival for the ALF patient ranged from 25% to 56% at 5 y, depending on the ethical principles involved. The results of the decision analysis vary depending on the ethical perspective. This study demonstrates how competing ethical principles can be numerically incorporated in a decision analysis.
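The trade-off at the heart of the base case can be sketched in a few lines: the expected life-years gained by the ALF patient versus the small harm summed over the waiting list. Only the 7.8 quality-adjusted days of harm per listed patient comes from the abstract; the list size, no-transplant survival, and time horizon are invented placeholders, so the resulting threshold differs from the study's full Markov model.

```python
# Toy sketch of the utilitarian trade-off: benefit to one ALF patient
# versus small harms summed over the waiting list. Only the 7.8-day
# per-patient harm is from the abstract; everything else is invented.

waitlist_size = 100       # hypothetical number of listed patients
harm_days_each = 7.8      # utility loss per listed patient (days, from abstract)
p_no_tx = 0.10            # assumed 5-y survival without transplant
horizon_years = 5.0

total_harm_years = waitlist_size * harm_days_each / 365.0

def net_benefit(p_tx):
    """Expected life-years gained by the ALF patient minus the summed
    harm to the waiting list, over the 5-year horizon."""
    return (p_tx - p_no_tx) * horizon_years - total_harm_years

# Lowest acceptable post-transplant survival: where net benefit is zero.
threshold = p_no_tx + total_harm_years / horizon_years
```

Reweighting the two sides of `net_benefit` (e.g., counting the ALF patient's urgency more heavily) moves the threshold, which is exactly the sensitivity to ethical perspective that the study reports.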
NASA Astrophysics Data System (ADS)
Booth, N. L.; Everman, E.; Kuo, I.; Sprague, L.; Murphy, L.
2011-12-01
A new web-based decision support system has been developed as part of the U.S. Geological Survey (USGS) National Water Quality Assessment Program's (NAWQA) effort to provide ready access to Spatially Referenced Regressions On Watershed attributes (SPARROW) results of stream water-quality conditions and to offer sophisticated scenario testing capabilities for research and water-quality planning via an intuitive graphical user interface with a map-based display. The SPARROW Decision Support System (DSS) is delivered through a web browser over an Internet connection, making it widely accessible to the public in a format that allows users to easily display water-quality conditions, distribution of nutrient sources, nutrient delivery to downstream waterbodies, and simulations of altered nutrient inputs including atmospheric and agricultural sources. The DSS offers other features for analysis including various background map layers, model output exports, and the ability to save and share prediction scenarios. SPARROW models currently supported by the DSS are based on the modified digital versions of the 1:500,000-scale River Reach File (RF1) and 1:100,000-scale National Hydrography Dataset (medium-resolution, NHDPlus) stream networks. The underlying modeling framework and server infrastructure illustrate innovations in the information technology and geosciences fields for delivering SPARROW model predictions over the web by performing intensive model computations and map visualizations of the predicted conditions within the stream network.
Clayman, Marla L.; Makoul, Gregory; Harper, Maya M.; Koby, Danielle G.; Williams, Adam R.
2012-01-01
Objectives Describe the development and refinement of a scheme, Detail of Essential Elements and Participants in Shared Decision Making (DEEP-SDM), for coding Shared Decision Making (SDM) while reporting on the characteristics of decisions in a sample of patients with metastatic breast cancer. Methods The Evidence-Based Patient Choice instrument was modified to reflect Makoul and Clayman’s Integrative Model of SDM. Coding was conducted on video recordings of 20 women at the first visit with their medical oncologists after suspicion of disease progression. Noldus Observer XT v.8, a video coding software platform, was used for coding. Results The sample contained 80 decisions (range: 1-11), divided into 150 decision making segments. Most decisions were physician-led, although patients and physicians initiated similar numbers of decision-making conversations. Conclusion DEEP-SDM facilitates content analysis of encounters between women with metastatic breast cancer and their medical oncologists. Despite the fractured nature of decision making, it is possible to identify decision points and to code each of the Essential Elements of Shared Decision Making. Further work should include application of DEEP-SDM to non-cancer encounters. Practice Implications: A better understanding of how decisions unfold in the medical encounter can help inform the relationship of SDM to patient-reported outcomes. PMID:22784391
[Breast cancer and pregnancy: decision making and the point of view of the mother].
Eisinger, François; Noizet, Agnès
2002-09-01
For the treatment of breast cancer, modifications of decision making related to pregnancy can be assessed through three questions. First, why was a given decision chosen? Here the hypothesis is that decisions are based on expected utility. The theory assumes weighting and computation of the complete set of possibilities with their associated probabilities and values. However, values exhibit a wide range of inter-individual variation, so the predictability of choice based on this model is in fact very low. Furthermore, it is likely that the wish for pregnancy after breast cancer contains, beyond the classic components of the appeal of motherhood, a specific meaning of recovery of both health and femininity. The second question: who is in charge of the decision? Under the paradigm of autonomy, the woman's decision is, merely by itself, the right decision. The last question is how? In some situations where foreseeing outcomes is quite complex, the value of the process itself is increased and can support an end-oriented or self-determined decision. Casuistic analysis could therefore improve women's decisions. The issue is not only about the decision itself but also about the patient-physician relationship, on a question that is not solely a biomedical problem.
Structural analysis consultation using artificial intelligence
NASA Technical Reports Server (NTRS)
Melosh, R. J.; Marcal, P. V.; Berke, L.
1978-01-01
The primary goal of consultation is definition of the best strategy to deal with a structural engineering analysis objective. The knowledge base to meet this need is designed to identify the type of numerical analysis, the needed modeling detail, and the specific analysis data required. Decisions are constructed on the basis of the data in the knowledge base - material behavior, relations between geometry and structural behavior, measures of the importance of time and temperature changes - and user-supplied specific characteristics of the spectrum of analysis types, the relation between accuracy and model detail, the structure, its mechanical loadings, and its temperature states. Existing software demonstrated the feasibility of the approach, encompassing the 36 analysis classes spanning nonlinear, temperature-affected, incremental analyses which track the behavior of structural systems.
Rousson, Valentin; Zumbrunn, Thomas
2011-06-22
Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and into a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome, and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Eventually, we expose that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
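The "net benefit" at a given risk threshold has a simple closed form, with false positives weighted by the odds of the threshold probability. A minimal sketch of the treated and untreated variants discussed above; the counts in the test are illustrative, not taken from the paper:

```python
def net_benefit_treated(tp, fp, n, pt):
    """Net benefit of treating model-positive subjects at risk threshold pt:
    benefit from true positives minus odds-weighted harm from false positives."""
    return tp / n - (fp / n) * (pt / (1.0 - pt))

def net_benefit_untreated(tn, fn, n, pt):
    """Mirror quantity for the 'do not treat' decision, weighting missed
    cases (false negatives) by the inverse odds of the threshold."""
    return tn / n - (fn / n) * ((1.0 - pt) / pt)
```

Plotting either quantity against a range of thresholds pt yields the decision curve; the paper's "overall net benefit" combines the two perspectives into a single coding-invariant measure.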
Models based on value and probability in health improve shared decision making.
Ortendahl, Monica
2008-10-01
Diagnostic reasoning and treatment decisions are a key competence of doctors. A model based on values and probability provides a conceptual framework for clinical judgments and decisions, and also facilitates the integration of clinical and biomedical knowledge into a diagnostic decision. Both value and probability are usually estimated values in clinical decision making. Therefore, model assumptions and parameter estimates should be continually assessed against data, and models should be revised accordingly. Introducing parameter estimates for both value and probability, which usually pertain in clinical work, gives the model labelled subjective expected utility. Estimated values and probabilities are involved sequentially for every step in the decision-making process. Introducing decision-analytic modelling gives a more complete picture of variables that influence the decisions carried out by the doctor and the patient. A model revised for perceived values and probabilities by both the doctor and the patient could be used as a tool for engaging in a mutual and shared decision-making process in clinical work.
Semi-supervised Machine Learning for Analysis of Hydrogeochemical Data and Models
NASA Astrophysics Data System (ADS)
Vesselinov, Velimir; O'Malley, Daniel; Alexandrov, Boian; Moore, Bryan
2017-04-01
Data- and model-based analyses such as uncertainty quantification, sensitivity analysis, and decision support using complex physics models with numerous model parameters typically require a huge number of model evaluations (on the order of 10^6). Furthermore, model simulations of complex physics may require substantial computational time. For example, accounting for simultaneously occurring physical processes such as fluid flow and biogeochemical reactions in a heterogeneous porous medium may require several hours of wall-clock computational time. To address these issues, we have developed a novel methodology for semi-supervised machine learning based on Non-negative Matrix Factorization (NMF) coupled with customized k-means clustering. The algorithm allows for automated, robust Blind Source Separation (BSS) of groundwater types (contamination sources) based on model-free analyses of observed hydrogeochemical data. We have also developed reduced order modeling tools, which couple support vector regression (SVR), genetic algorithms (GA), and artificial and convolutional neural networks (ANN/CNN). SVR is applied to predict the model behavior within prior uncertainty ranges associated with the model parameters. ANN and CNN procedures are applied to upscale heterogeneity of the porous medium. In the upscaling process, fine-scale high-resolution models of heterogeneity are applied to inform coarse-resolution models, which have improved computational efficiency while capturing the impact of fine-scale effects at the coarse scale of interest. These techniques are tested independently on a series of synthetic problems. We also present a decision analysis related to contaminant remediation where the developed reduced order models are applied to reproduce groundwater flow and contaminant transport in a synthetic heterogeneous aquifer. The tools are coded in Julia and are a part of the MADS high-performance computational framework (https://github.com/madsjulia/Mads.jl).
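The NMF step at the heart of such blind source separation can be written with the classic multiplicative update rules. This is a generic Lee-Seung-style sketch in Python, not the authors' Julia/MADS implementation, and the matrix sizes are arbitrary:

```python
import numpy as np

def nmf(V, k, iters=2000, seed=0):
    """Factor a non-negative matrix V ~= W @ H with W, H >= 0,
    using multiplicative updates that preserve non-negativity."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 1e-6
    H = rng.random((k, m)) + 1e-6
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)  # update mixing coefficients
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)  # update source signatures
    return W, H
```

In a hydrogeochemical setting the rows of V would be samples and the columns chemical species; the k rows of H then play the role of candidate groundwater types, which the authors further group with customized k-means clustering.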
Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar
2017-03-01
In the context of the underground coal mining industry, increased economic pressures regarding the implementation of additional safety measure systems, along with growing public awareness demanding a high level of worker safety, have put great pressure on managers to find solutions that are both safe and economically viable. A risk-based decision support system plays an important role in finding such solutions amongst candidate alternatives with respect to multiple decision criteria. Therefore, in this paper, a unified risk-based decision-making methodology is proposed for selecting an appropriate safety measure system for an underground coal mining operation with respect to multiple risk criteria such as financial risk, operating risk, and maintenance risk. The proposed methodology uses interval-valued fuzzy set theory for modelling vagueness and subjectivity in the estimates of fuzzy risk ratings in order to make appropriate decisions. The methodology is based on aggregative fuzzy risk analysis and multi-criteria decision making. Selection decisions are made in light of the total integrated risk likely to be incurred in adopting a particular safety system alternative. The effectiveness of the proposed methodology has been validated through a real-time case study, and the resulting final priority ranking appears fairly consistent.
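One common building block of such interval-valued fuzzy risk analysis is weighted aggregation of interval ratings across criteria, followed by defuzzification for ranking. The sketch below uses a simple weighted-mean aggregation and midpoint defuzzification as stand-ins for the paper's (unspecified here) operators; the ratings and weights are hypothetical:

```python
def aggregate_interval_risk(ratings, weights):
    """Weighted mean of interval-valued risk ratings, endpoint by endpoint.
    ratings: list of (lower, upper) pairs; weights: criterion weights."""
    total = sum(weights)
    lo = sum(w * a for (a, _), w in zip(ratings, weights)) / total
    hi = sum(w * b for (_, b), w in zip(ratings, weights)) / total
    return lo, hi

def defuzzify(interval):
    """Midpoint defuzzification, giving a crisp score for ranking."""
    return (interval[0] + interval[1]) / 2.0
```

Each safety system alternative would be scored this way over the financial, operating, and maintenance risk criteria, and the alternative with the lowest defuzzified total risk would be preferred.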
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirley, Rachel; Smidts, Carol; Boring, Ronald
Information-Decision-Action Crew (IDAC) operator model simulations of a Steam Generator Tube Rupture are compared to student operator performance in studies conducted in the Ohio State University's Nuclear Power Plant Simulator Facility. This study is presented as a prototype for conducting simulator studies to validate key aspects of Human Reliability Analysis (HRA) methods. Seven student operator crews are compared to simulation results for crews designed to demonstrate three different decision-making strategies. The IDAC model used in the simulations is modified slightly to capture novice behavior rather than expert operators. Operator actions and scenario pacing are compared. A preliminary review of available performance shaping factors (PSFs) is presented. After the scenario in the NPP Simulator Facility, student operators review a video of the scenario and evaluate six PSFs at pre-determined points in the scenario. This provides a dynamic record of the PSFs experienced by the OSU student operators. In this preliminary analysis, Time Constraint Load (TCL) calculated in the IDAC simulations is compared to TCL reported by student operators. We identify potential modifications to the IDAC model to develop an "IDAC Student Operator Model." This analysis provides insights into how similar experiments could be conducted using expert operators to improve the fidelity of IDAC simulations.
Multiscale Modelling and Analysis of Collective Decision Making in Swarm Robotics
Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey
2014-01-01
We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable. PMID:25369026
Decision-Making in National Security Affairs: Toward a Typology.
1985-06-07
decisional model, and thus provide the necessary linkage between observation and application of theory in explaining and/or predicting policy decisions. ...examines theories and models of decision-making processes from an interdisciplinary perspective, with a view toward deriving means by which the behavior of... processes, game theory, linear programming, network and graph theory, time series analysis, and the like. The discipline of decision analysis is a relatively
Team-Based Simulations: Learning Ethical Conduct in Teacher Trainee Programs
ERIC Educational Resources Information Center
Shapira-Lishchinsky, Orly
2013-01-01
This study aimed to identify the learning aspects of team-based simulations (TBS) through the analysis of ethical incidents experienced by 50 teacher trainees. A four-dimensional model emerged: learning to make decisions in a "supportive-forgiving" environment; learning to develop standards of care; learning to reduce misconduct; and learning to…
Modelling technological process of ion-exchange filtration of fluids in porous media
NASA Astrophysics Data System (ADS)
Ravshanov, N.; Saidov, U. M.
2018-05-01
Solution of an actual problem related to the process of filtration and dehydration of liquid and ionic solutions from gel particles and heavy ionic compounds is considered in the paper. This technological process is realized during the preparation and cleaning of chemical solutions, drinking water, pharmaceuticals, liquid fuels, products for public use, etc. For the analysis, research, determination of the main parameters of the technological process and operating modes of filter units and for support in managerial decision-making, a mathematical model is developed. Using the developed model, a series of computational experiments on a computer is carried out. The results of numerical calculations are illustrated in the form of graphs. Based on the analysis of numerical experiments, the conclusions are formulated that serve as the basis for making appropriate managerial decisions.
A decision-based perspective for the design of methods for systems design
NASA Technical Reports Server (NTRS)
Mistree, Farrokh; Muster, Douglas; Shupe, Jon A.; Allen, Janet K.
1989-01-01
Organization of material, a definition of decision based design, a hierarchy of decision based design, the decision support problem technique, a conceptual model design that can be manufactured and maintained, meta-design, computer-based design, action learning, and the characteristics of decisions are among the topics covered.
Smart Grid as Multi-layer Interacting System for Complex Decision Makings
NASA Astrophysics Data System (ADS)
Bompard, Ettore; Han, Bei; Masera, Marcelo; Pons, Enrico
This chapter presents an approach to the analysis of Smart Grids based on a multi-layer representation of their technical, cyber, social and decision-making aspects, as well as the related environmental constraints. In the Smart Grid paradigm, self-interested active customers (prosumers), system operators and market players interact among themselves making use of an extensive cyber infrastructure. In addition, policy decision makers define regulations, incentives and constraints to drive the behavior of the competing operators and prosumers, with the objective of ensuring the desired global performance (e.g. system stability, fair prices). For these reasons, policy decision making is more complicated than in traditional power systems, and needs proper modeling and simulation tools for assessing "in vitro" and ex ante the possible impacts of the decisions taken. In this chapter, we consider smart grids as multi-layered interacting complex systems. The intricacy of the framework, characterized by several interacting layers, cannot be captured by closed-form mathematical models. Therefore, a new approach using Multi Agent Simulation is described. With case studies we provide some indications about how to develop agent-based simulation tools, presenting some preliminary examples.
Sangchan, Apichat; Chaiyakunapruk, Nathorn; Supakankunti, Siripen; Pugkhem, Ake; Mairiang, Pisaln
2014-01-01
Endoscopic biliary drainage using metal and plastic stents in unresectable hilar cholangiocarcinoma (HCA) is widely used, but little is known about their cost-effectiveness. This study evaluated the cost-utility of endoscopic metal and plastic stent drainage in unresectable complex (Bismuth type II-IV) HCA patients. A decision analytic model (a Markov model) was used to evaluate the costs and quality-adjusted life years (QALYs) of endoscopic biliary drainage in unresectable HCA. Costs of treatment and utilities of each Markov state were retrieved from hospital charges and from unresectable HCA patients at a tertiary care hospital in Thailand, respectively. Transition probabilities were derived from the international literature. Base-case analyses and sensitivity analyses were performed. Under the base-case analysis, the metal stent is more effective but more expensive than the plastic stent. The incremental cost per additional QALY gained is 192,650 baht (US$ 6,318). From probabilistic sensitivity analysis, at willingness-to-pay thresholds of one and three times GDP per capita, or 158,000 baht (US$ 5,182) and 474,000 baht (US$ 15,546), the probability of the metal stent being cost-effective is 26.4% and 99.8%, respectively. Based on the WHO recommendation regarding cost-effectiveness threshold criteria, endoscopic metal stent drainage is cost-effective compared to plastic stent drainage in unresectable complex HCA.
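The headline numbers of such a cost-utility analysis reduce to an incremental cost-effectiveness ratio (ICER) compared against a willingness-to-pay threshold. A minimal sketch; the cost and QALY inputs in the test are hypothetical values chosen only to reproduce the arithmetic of the reported 192,650 baht per QALY:

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY
    of the new strategy relative to the reference strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

def is_cost_effective(icer_value, wtp_threshold):
    """Threshold rule: accept if the ICER is at or below willingness to pay."""
    return icer_value <= wtp_threshold
```

With this rule, an ICER of 192,650 baht per QALY falls above a one-times-GDP threshold (158,000 baht) but below a three-times-GDP threshold (474,000 baht), matching the direction of the abstract's conclusion.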
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of newly-developed optimization technique based on coupling of Particle Swarm and Levenberg-Marquardt optimization methods which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. 
The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
An approach to and web-based tool for infectious disease outbreak intervention analysis
NASA Astrophysics Data System (ADS)
Daughton, Ashlynn R.; Generous, Nicholas; Priedhorsky, Reid; Deshpande, Alina
2017-04-01
Infectious diseases are a leading cause of death globally. Decisions surrounding how to control an infectious disease outbreak currently rely on a subjective process involving surveillance and expert opinion. However, there are many situations where neither may be available. Modeling can fill gaps in the decision making process by using available data to provide quantitative estimates of outbreak trajectories. Effective reduction of the spread of infectious diseases can be achieved through collaboration between the modeling community and public health policy community. However, such collaboration is rare, resulting in a lack of models that meet the needs of the public health community. Here we show a Susceptible-Infectious-Recovered (SIR) model modified to include control measures that allows parameter ranges, rather than parameter point estimates, and includes a web user interface for broad adoption. We apply the model to three diseases, measles, norovirus and influenza, to show the feasibility of its use and describe a research agenda to further promote interactions between decision makers and the modeling community.
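The modified Susceptible-Infectious-Recovered (SIR) dynamic described above can be sketched with a simple forward-Euler integrator. The control parameter, which scales down the transmission rate, is a generic stand-in for the intervention measures in the authors' model; the rates, populations, and time step are illustrative:

```python
def sir_step(s, i, r, beta, gamma, dt, control=0.0):
    """One Euler step of SIR dynamics; control in [0, 1] reduces transmission."""
    new_inf = (1.0 - control) * beta * s * i * dt  # S -> I flow
    new_rec = gamma * i * dt                       # I -> R flow
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate(s0, i0, beta, gamma, days, dt=0.1, control=0.0):
    """Integrate the SIR model over time, tracking the epidemic peak."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        s, i, r = sir_step(s, i, r, beta, gamma, dt, control)
        peak = max(peak, i)
    return s, i, r, peak
```

Sweeping `beta`, `gamma`, and `control` over ranges rather than point estimates, as the web tool does, turns a single trajectory into a family of plausible outbreak outcomes for decision makers to compare.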
Activity-based costing and its application in a Turkish university hospital.
Yereli, Ayşe Necef
2009-03-01
Resource management in hospitals is of increasing importance in today's global economy. Traditional accounting systems have become inadequate for managing hospital resources and accurately determining service costs. Conversely, the activity-based costing approach to hospital accounting is an effective cost management model that determines costs and evaluates financial performance across departments. Obtaining costs that are more accurate can enable hospitals to analyze and interpret costing decisions and make more accurate budgeting decisions. Traditional and activity-based costing approaches were compared using a cost analysis of gall bladder surgeries in the general surgery department of one university hospital in Manisa, Turkey. Copyright (c) AORN, Inc, 2009.
Khan, Md Mohib-Ul-Haque; Jain, Siddharth; Vaezi, Mahdi; Kumar, Amit
2016-02-01
Economic competitiveness is one of the key factors in making decisions towards the development of waste conversion facilities and devising a sustainable waste management strategy. The goal of this study is to develop a framework, as well as to develop and demonstrate a comprehensive techno-economic model to help county and municipal decision makers in establishing waste conversion facilities. The user-friendly data-intensive model, called the FUNdamental ENgineering PrinciplEs-based ModeL for Estimation of Cost of Energy and Fuels from MSW (FUNNEL-Cost-MSW), compares nine different waste management scenarios, including landfilling and composting, in terms of economic parameters such as gate fees and return on investment. In addition, a geographic information system (GIS) model was developed to determine suitable locations for waste conversion facilities and landfill sites based on integration of environmental, social, and economic factors. Finally, a case study on Parkland County and its surrounding counties in the province of Alberta, Canada, was conducted and a sensitivity analysis was performed to assess the influence of the key technical and economic parameters on the calculated results. Copyright © 2015 Elsevier Ltd. All rights reserved.
Using Trust to Establish a Secure Routing Model in Cognitive Radio Network.
Zhang, Guanghua; Chen, Zhenguo; Tian, Liqin; Zhang, Dongwen
2015-01-01
To counter the selective forwarding attack on routing in cognitive radio networks, this paper proposes a trust-based secure routing model. By monitoring nodes' forwarding behaviors, trusts of nodes are constructed to identify malicious nodes. Considering that routing selection must work in close collaboration with spectrum allocation, a route request piggybacking available spectrum opportunities is sent to non-malicious nodes. In the routing decision phase, nodes' trusts are used to construct available path trusts, which are combined with delay measurements to make routing decisions. At the same time, according to the trust classification, different responses are made to nodes' service requests. By punishing malicious behaviors from non-trusted nodes more strictly, the cooperation of nodes in routing can be stimulated. Simulation results and analysis indicate that this model performs well in terms of network throughput and end-to-end delay under the selective forwarding attack.
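The path-trust construction can be illustrated by taking a route's trust as the product of its nodes' trusts and combining it with a delay measure. The weighted-sum scoring below is an assumed combination rule (the abstract does not specify one), and the route tables and weight `alpha` are hypothetical:

```python
def path_trust(node_trusts):
    """Available path trust as the product of per-node trust values in [0, 1]."""
    t = 1.0
    for x in node_trusts:
        t *= x
    return t

def select_route(routes, delays, alpha=0.7):
    """Score each route by trust plus normalized inverse delay; pick the best.
    routes: {name: [node trusts]}; delays: {name: end-to-end delay}."""
    max_delay = max(delays.values())
    best, best_score = None, float("-inf")
    for name, trusts in routes.items():
        score = alpha * path_trust(trusts) + \
                (1.0 - alpha) * (1.0 - delays[name] / max_delay)
        if score > best_score:
            best, best_score = name, score
    return best
```

A single low-trust node (e.g. a suspected selective forwarder) collapses the product for its whole path, which is why such routes lose to slightly slower but fully trusted alternatives.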
A Bayesian sequential processor approach to spectroscopic portal system decisions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sale, K; Candy, J; Breitfeller, E
The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal, the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, physics and signal processing models, and decision functions are discussed along with the first results of our research.
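The sequential flavor of the processor, declaring a detection as soon as the data justify it, can be sketched with a two-hypothesis Bayesian update on Poisson counts. This is a simplification of the full model-based processor (which updates a posterior over physical model parameters, not just source presence or absence); the rates, prior, and threshold are hypothetical:

```python
import math

def sequential_detect(counts, rate_bg, rate_src, prior=0.5, threshold=0.95):
    """Update P(source present) after each detector count; return the first
    time index at which the posterior crosses the decision threshold."""
    def pois(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)
    p = prior
    for t, k in enumerate(counts, start=1):
        like_src = pois(k, rate_bg + rate_src)  # source plus background
        like_bg = pois(k, rate_bg)              # background only
        p = p * like_src / (p * like_src + (1.0 - p) * like_bg)
        if p >= threshold:
            return t, p  # detect as soon as statistically justified
    return None, p
```

Because the posterior is updated per datum, a strong source triggers a detection after very few counts, while background-only data never reach the threshold, which is exactly the advantage over fixed-interval counting.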
A Conceptual Modeling Approach for OLAP Personalization
NASA Astrophysics Data System (ADS)
Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan
Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large and multidimensional structures become increasingly complex to be understood at a glance. Even if a departmental data warehouse (also known as data mart) is used, these structures would be also too complex. As a consequence, acquiring the required information is more costly than expected and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker contributing to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.
Nakao, Takashi; Ohira, Hideki; Northoff, Georg
2012-01-01
Most experimental studies of decision-making have specifically examined situations in which a single less-predictable correct answer exists (externally guided decision-making under uncertainty). Along with such externally guided decision-making, there are instances of decision-making in which no correct answer based on external circumstances is available for the subject (internally guided decision-making). Such decisions are usually made in the context of moral decision-making as well as in preference judgment, where the answer depends on the subject’s own, i.e., internal, preferences rather than on external, i.e., circumstantial, criteria. The neuronal and psychological mechanisms that allow guidance of decisions based on more internally oriented criteria in the absence of external ones remain unclear. This study was undertaken to compare decision-making of these two kinds empirically and theoretically. First, we reviewed studies of decision-making to clarify experimental–operational differences between externally guided and internally guided decision-making. Second, using multi-level kernel density analysis, a whole-brain-based quantitative meta-analysis of neuroimaging studies was performed. Our meta-analysis revealed that the neural network used predominantly for internally guided decision-making differs from that for externally guided decision-making under uncertainty. This result suggests that studying only externally guided decision-making under uncertainty is insufficient to account for decision-making processes in the brain. Finally, based on the review and results of the meta-analysis, we discuss the differences and relations between decision-making of these two types in terms of their operational, neuronal, and theoretical characteristics. PMID:22403525
NASA Technical Reports Server (NTRS)
Murphy, M. R.; Awe, C. A.
1986-01-01
Six professionally active, retired captains rated the coordination and decision making performances of sixteen aircrews while viewing videotapes of a simulated commercial air transport operation. The scenario featured a required diversion and a probable minimum fuel situation. Seven-point Likert-type scales were used in rating variables on the basis of a model of crew coordination and decision making. The variables were based on concepts of, for example, decision difficulty, efficiency, and outcome quality; and leader-subordinate concepts such as person- and task-oriented leader behavior, and competency motivation of subordinate crewmembers. Five front-end variables of the model were in turn dependent variables for a hierarchical regression procedure. The variance in safety performance was explained 46% by decision efficiency, command reversal, and decision quality. The variance of decision quality, an alternative substantive dependent variable to safety performance, was explained 60% by decision efficiency and the captain's quality of within-crew communications. The variance of decision efficiency, crew coordination, and command reversal were in turn explained 78%, 80%, and 60% by small numbers of preceding independent variables. A principal-component, varimax factor analysis supported the model structure suggested by regression analyses.
Stamovlasis, Dimitrios; Vaiopoulou, Julie
2017-07-01
The present study examines the factors influencing a decision-making process, with specific focus on the role of dysfunctional myths (DM). DM are thoughts or beliefs that are rather irrational, however influential to people's decisions. In this paper a decision-making process regarding the career choice of university students majoring in natural sciences and education (N=496) is examined by analyzing survey data taken via Career Decision Making Difficulties Questionnaire (CDDQ). The difficulty of making the choice and the certainty about one's decision were the state variables, while the independent variables were factors related to the lack of information or knowledge needed, which actually reflect a bounded rationality. Cusp catastrophe analysis, based on both least squares and maximum likelihood procedures, showed that the nonlinear models predicting the two state variables were superior to linear alternatives. Factors related to lack of knowledge about the steps involved in the process of career decision-making, lack of information about the various occupations, lack of information about self and lack of motivation acted as asymmetry, while dysfunctional myths acted as bifurcation factor for both state variables. The catastrophe model, grounded in empirical data, revealed a unique role for DM and a better interpretation within the context of complexity and the notion of bounded rationality. The analysis opens the nonlinear dynamical systems (NDS) perspective in studying decision-making processes. Theoretical and practical implications are discussed.
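The cusp model's defining feature, a control plane split into regions with one versus three equilibria, follows from the canonical cusp dynamics dy/dt = a + b·y − y³, with a the asymmetry factor (here, lack of information) and b the bifurcation factor (dysfunctional myths). A minimal sketch of the root count via the standard depressed-cubic discriminant, not the study's fitted model:

```python
def cusp_equilibria_count(a, b):
    """Equilibria of the canonical cusp dy/dt = a + b*y - y**3 solve
    y**3 - b*y - a = 0; the depressed-cubic discriminant
    4*b**3 - 27*a**2 is positive exactly when three real roots exist
    (two stable states plus an unstable middle branch)."""
    return 3 if 4 * b ** 3 - 27 * a ** 2 > 0 else 1
```

Inside the bifurcation set (strong dysfunctional myths, near-balanced asymmetry factors) two stable states coexist, so small changes in the asymmetry variables can produce the sudden jumps in difficulty or certainty that the linear alternatives cannot capture.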
Benndorf, Matthias; Kotter, Elmar; Langer, Mathias; Herda, Christoph; Wu, Yirong; Burnside, Elizabeth S
2015-06-01
To develop and validate a decision support tool for mammographic mass lesions based on a standardized descriptor terminology (BI-RADS lexicon) to reduce variability of practice. We used separate training data (1,276 lesions, 138 malignant) and validation data (1,177 lesions, 175 malignant). We created naïve Bayes (NB) classifiers from the training data with tenfold cross-validation. Our "inclusive model" comprised BI-RADS categories, BI-RADS descriptors, and age as predictive variables; our "descriptor model" comprised BI-RADS descriptors and age. The resulting NB classifiers were applied to the validation data. We evaluated and compared classifier performance with ROC-analysis. In the training data, the inclusive model yields an AUC of 0.959; the descriptor model yields an AUC of 0.910 (P < 0.001). The inclusive model is superior to the clinical performance (BI-RADS categories alone, P < 0.001); the descriptor model performs similarly. When applied to the validation data, the inclusive model yields an AUC of 0.935; the descriptor model yields an AUC of 0.876 (P < 0.001). Again, the inclusive model is superior to the clinical performance (P < 0.001); the descriptor model performs similarly. We consider our classifier a step towards a more uniform interpretation of combinations of BI-RADS descriptors. We provide our classifier at www.ebm-radiology.com/nbmm/index.html . • We provide a decision support tool for mammographic masses at www.ebm-radiology.com/nbmm/index.html . • Our tool may reduce variability of practice in BI-RADS category assignment. • A formal analysis of BI-RADS descriptors may enhance radiologists' diagnostic performance.
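The naïve Bayes construction described here, class priors combined with per-descriptor conditional probabilities, can be sketched for categorical inputs with Laplace smoothing. The descriptor names and toy data below are invented for illustration and are not drawn from the study's training set.

```python
from collections import defaultdict

def train_nb(rows, labels, alpha=1.0):
    """Categorical naive Bayes with Laplace smoothing.
    rows: list of dicts mapping descriptor name -> observed value."""
    n = len(labels)
    classes = set(labels)
    prior = {c: labels.count(c) / n for c in classes}
    counts = defaultdict(lambda: defaultdict(float))
    totals = defaultdict(float)
    values = defaultdict(set)
    for row, y in zip(rows, labels):
        for feat, val in row.items():
            counts[(y, feat)][val] += 1
            totals[(y, feat)] += 1
            values[feat].add(val)

    def posterior(row):
        """Normalized class posteriors for one lesion description."""
        score = {}
        for c in classes:
            p = prior[c]
            for feat, val in row.items():
                k = len(values[feat])  # number of levels of this descriptor
                p *= (counts[(c, feat)][val] + alpha) / (totals[(c, feat)] + alpha * k)
            score[c] = p
        z = sum(score.values())
        return {c: p / z for c, p in score.items()}

    return posterior
```

Thresholding the resulting malignancy posterior at different operating points is what produces the ROC curves compared in the abstract.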
Application of effective discharge analysis to environmental flow decision-making
McKay, S. Kyle; Freeman, Mary C.; Covich, A.P.
2016-01-01
Well-informed river management decisions rely on an explicit statement of objectives, repeatable analyses, and a transparent system for assessing trade-offs. These components may then be applied to compare alternative operational regimes for water resource infrastructure (e.g., diversions, locks, and dams). Intra- and inter-annual hydrologic variability further complicates these already complex environmental flow decisions. Effective discharge analysis (developed in studies of geomorphology) is a powerful tool for integrating temporal variability of flow magnitude and associated ecological consequences. Here, we adapt the effectiveness framework to include multiple elements of the natural flow regime (i.e., timing, duration, and rate-of-change) as well as two flow variables. We demonstrate this analytical approach using a case study of environmental flow management based on long-term (60 years) daily discharge records in the Middle Oconee River near Athens, GA, USA. Specifically, we apply an existing model for estimating young-of-year fish recruitment based on flow-dependent metrics to an effective discharge analysis that incorporates hydrologic variability and multiple focal taxa. We then compare three alternative methods of environmental flow provision. Percentage-based withdrawal schemes outcompete other environmental flow methods across all levels of water withdrawal and ecological outcomes.
Decision science and cervical cancer.
Cantor, Scott B; Fahs, Marianne C; Mandelblatt, Jeanne S; Myers, Evan R; Sanders, Gillian D
2003-11-01
Mathematical modeling is an effective tool for guiding cervical cancer screening, diagnosis, and treatment decisions for patients and policymakers. This article describes the use of mathematical modeling as outlined in five presentations from the Decision Science and Cervical Cancer session of the Second International Conference on Cervical Cancer held at The University of Texas M. D. Anderson Cancer Center, April 11-14, 2002. The authors provide an overview of mathematical modeling, especially decision analysis and cost-effectiveness analysis, and examples of how it can be used for clinical decision making regarding the prevention, diagnosis, and treatment of cervical cancer. Included are applications as well as theory regarding decision science and cervical cancer. Mathematical modeling can answer such questions as the optimal frequency for screening, the optimal age to stop screening, and the optimal way to diagnose cervical cancer. Results from one mathematical model demonstrated that a vaccine against high-risk strains of human papillomavirus was a cost-effective use of resources, and discussion of another model demonstrated the importance of collecting direct non-health care costs and time costs for cost-effectiveness analysis. Research presented indicated that care must be taken when applying the results of population-wide, cost-effectiveness analyses to reduce health disparities. Mathematical modeling can encompass a variety of theoretical and applied issues regarding decision science and cervical cancer. The ultimate objective of using decision-analytic and cost-effectiveness models is to identify ways to improve women's health at an economically reasonable cost. Copyright 2003 American Cancer Society.
Advances in the Application of Decision Theory to Test-Based Decision Making.
ERIC Educational Resources Information Center
van der Linden, Wim J.
This paper reviews recent research in the Netherlands on the application of decision theory to test-based decision making about personnel selection and student placement. The review is based on an earlier model proposed for the classification of decision problems, and emphasizes an empirical Bayesian framework. Classification decisions with…
NASA Astrophysics Data System (ADS)
Meng, Fanyong
2018-02-01
Triangular fuzzy reciprocal preference relations (TFRPRs) are powerful tools for denoting decision-makers' fuzzy judgments, permitting the decision-makers to apply triangular fuzzy ratios rather than real numbers to express their judgements. Consistency analysis is one of the most crucial issues in preference relations, as it guarantees a reasonable ranking order. However, previous consistency concepts cannot well address this type of preference relation. Based on the operational laws on triangular fuzzy numbers, this paper introduces an additive consistency concept for TFRPRs by using quasi TFRPRs, which can be seen as a natural extension of the crisp case. Using this consistency concept, models for judging the additive consistency of TFRPRs and for estimating missing values in incomplete TFRPRs are constructed. Then, an algorithm for decision-making with TFRPRs is developed. Finally, two numerical examples are offered to illustrate the application of the proposed procedure, and a comparative analysis is performed.
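The operational laws on triangular fuzzy numbers that the paper builds on can be illustrated with the standard componentwise rules; this is textbook TFN arithmetic as a sketch, not the paper's specific consistency models.

```python
def tfn_add(a, b):
    """Sum of triangular fuzzy numbers (l, m, u): componentwise."""
    return tuple(x + y for x, y in zip(a, b))

def tfn_mul(a, b):
    """Common approximation for the product of positive TFNs."""
    return tuple(x * y for x, y in zip(a, b))

def tfn_reciprocal(a):
    """Reciprocal of a positive TFN, as used in reciprocal
    preference relations: (l, m, u)**-1 = (1/u, 1/m, 1/l)."""
    l, m, u = a
    return (1.0 / u, 1.0 / m, 1.0 / l)
```

In a TFRPR the entry for "alternative i over j" and its counterpart for "j over i" are reciprocals in this sense, which is the structural property the consistency models exploit.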
Does probability guided hysteroscopy reduce costs in women investigated for postmenopausal bleeding?
Breijer, M C; van Hanegem, N; Visser, N C M; Verheijen, R H M; Mol, B W J; Pijnenborg, J M A; Opmeer, B C; Timmermans, A
2015-01-01
To evaluate whether a model to predict a failed endometrial biopsy in women with postmenopausal bleeding (PMB) and a thickened endometrium can reduce costs without compromising diagnostic accuracy. Model based cost-minimization analysis. A decision analytic model was designed to compare two diagnostic strategies for women with PMB: (I) attempting office endometrial biopsy and performing outpatient hysteroscopy after failed biopsy and (II) predicted probability of a failed endometrial biopsy based on patient characteristics to guide the decision for endometrial biopsy or immediate hysteroscopy. Robustness of assumptions regarding costs was evaluated in sensitivity analyses. Costs for the different strategies. At different cut-offs for the predicted probability of failure of an endometrial biopsy, strategy I was generally less expensive than strategy II. The costs for strategy I were always € 460; the costs for strategy II varied between € 457 and € 475. At a 65% cut-off, a possible saving of € 3 per woman could be achieved. Individualizing the decision to perform an endometrial biopsy or immediate hysteroscopy in women presenting with postmenopausal bleeding based on patient characteristics does not increase the efficiency of the diagnostic work-up.
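The two strategies compared in this cost-minimization analysis can be sketched as expected costs per woman given a predicted biopsy-failure probability. The cost figures and cut-off handling below are illustrative assumptions, not the study's actual inputs.

```python
def expected_costs(p_fail, cutoff, cost_biopsy=60.0, cost_hysteroscopy=400.0):
    """Expected diagnostic cost per woman under the two strategies.
    p_fail: predicted probability that the office endometrial biopsy fails.
    Costs are hypothetical placeholders."""
    # Strategy I: always attempt biopsy; hysteroscopy only after failure.
    cost_i = cost_biopsy + p_fail * cost_hysteroscopy
    # Strategy II: go straight to outpatient hysteroscopy when the
    # predicted failure probability exceeds the cut-off.
    if p_fail > cutoff:
        cost_ii = cost_hysteroscopy
    else:
        cost_ii = cost_biopsy + p_fail * cost_hysteroscopy
    return cost_i, cost_ii
```

Averaging these per-woman costs over the population's distribution of predicted failure probabilities, at each candidate cut-off, gives the strategy-level comparison reported in the abstract.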
Reiter, Andrea M F; Heinze, Hans-Jochen; Schlagenhauf, Florian; Deserno, Lorenz
2017-02-01
Despite its clinical relevance and the recent recognition as a diagnostic category in the DSM-5, binge eating disorder (BED) has rarely been investigated from a cognitive neuroscientific perspective targeting a more precise neurocognitive profiling of the disorder. BED patients suffer from a lack of behavioral control during recurrent binge eating episodes and thus fail to adapt their behavior in the face of negative consequences, eg, high risk for obesity. To examine impairments in flexible reward-based decision-making, we exposed BED patients (n=22) and matched healthy individuals (n=22) to a reward-guided decision-making task during functional resonance imaging (fMRI). Performing fMRI analysis informed via computational modeling of choice behavior, we were able to identify specific signatures of altered decision-making in BED. On the behavioral level, we observed impaired behavioral adaptation in BED, which was due to enhanced switching behavior, a putative deficit in striking a balance between exploration and exploitation appropriately. This was accompanied by diminished activation related to exploratory decisions in the anterior insula/ventro-lateral prefrontal cortex. Moreover, although so-called model-free reward prediction errors remained intact, representation of ventro-medial prefrontal learning signatures, incorporating inference on unchosen options, was reduced in BED, which was associated with successful decision-making in the task. On the basis of a computational psychiatry account, the presented findings contribute to defining a neurocognitive phenotype of BED.
Welton, Nicky J; Madan, Jason; Ades, Anthony E
2011-09-01
Reimbursement decisions are typically based on cost-effectiveness analyses. While a cost-effectiveness analysis can identify the optimum strategy, there is usually some degree of uncertainty around this decision. Sources of uncertainty include statistical sampling error in treatment efficacy measures, underlying baseline risk, utility measures and costs, as well as uncertainty in the structure of the model. The optimal strategy is therefore only optimal on average, and a decision to adopt this strategy might still be the wrong decision if all uncertainty could be eliminated. This means that there is a quantifiable expected (average) loss attaching to decisions made under uncertainty, and hence a value in collecting information to reduce that uncertainty. Value of information (VOI) analyses can be used to provide guidance on whether more research would be cost-effective, which particular model inputs (parameters) have the most bearing on decision uncertainty, and can also help with the design and sample size of further research. Here, we introduce the key concepts in VOI analyses, and highlight the inputs required to calculate it. The adoption of the new biologic treatments for RA and PsA tends to be based on placebo-controlled trials. We discuss the possible role of VOI analyses in deciding whether head-to-head comparisons of the biologic therapies should be carried out, illustrating with examples from other fields. We emphasize the need for a model of the natural history of RA and PsA, which reflects a consensus view.
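The quantifiable expected loss from deciding under uncertainty can be sketched as the standard expected value of perfect information (EVPI): the gain from choosing the best option per realization of the uncertainty rather than the option that is best only on average. The net-benefit draws in the example are hypothetical, not from an RA/PsA model.

```python
def evpi(nb_samples):
    """Expected value of perfect information from simulated net benefits.
    nb_samples: one tuple per simulation draw, one net-benefit entry per
    decision option (e.g. per biologic therapy strategy)."""
    n = len(nb_samples)
    n_opts = len(nb_samples[0])
    # Best option chosen now, under current (average) uncertainty:
    enb = [sum(draw[d] for draw in nb_samples) / n for d in range(n_opts)]
    value_current = max(enb)
    # With perfect information we could pick the winner in each draw:
    value_perfect = sum(max(draw) for draw in nb_samples) / n
    return value_perfect - value_current
```

If the population-scaled EVPI exceeds the cost of a head-to-head comparison of the biologic therapies, further research is potentially worthwhile; per-parameter variants of the same calculation identify which inputs drive the decision uncertainty.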
Modular Architecture for Integrated Model-Based Decision Support.
Gaebel, Jan; Schreiber, Erik; Oeser, Alexander; Oeltze-Jafra, Steffen
2018-01-01
Model-based decision support systems promise to be a valuable addition to oncological treatments and the implementation of personalized therapies. For the integration and sharing of decision models, the involved systems must be able to communicate with each other. In this paper, we propose a modularized architecture of dedicated systems for the integration of probabilistic decision models into existing hospital environments. These systems interconnect via web services and provide model sharing and processing capabilities for clinical information systems. Along the lines of IHE integration profiles from other disciplines and the meaningful reuse of routinely recorded patient data, our approach aims for the seamless integration of decision models into hospital infrastructure and the physicians' daily work.
ERIC Educational Resources Information Center
Lee, Scott Weng Fai
2013-01-01
The assessment of young children's thinking competence in task performances has typically followed the novice-to-expert regimen involving models of strategies that adults use when engaged in cognitive tasks such as problem-solving and decision-making. Socio-constructivists argue for a balanced pedagogical approach between the adult and child that…
A Model for Institutional Policy Analysis: The Case of Student Financial Aid. AIR Forum 1981 Paper.
ERIC Educational Resources Information Center
Fenske, Robert H.; Parker, John D.
The development of an operational model that would enable a college institutional research unit to improve administrative decision-making by expanding its data base to include new activities not widely recognized throughout the institution is considered. Attention is directed to institutional research as a function within an institution,…
Laura Phillips-Mao; Susan M. Galatowitsch; Stephanie A. Snyder; Robert G. Haight
2016-01-01
Incorporating climate change into conservation decision-making at site and population scales is challenging due to uncertainties associated with localized climate change impacts and population responses to multiple interacting impacts and adaptation strategies. We explore the use of spatially explicit population models to facilitate scenario analysis, a conservation...
Error rate information in attention allocation pilot models
NASA Technical Reports Server (NTRS)
Faulkner, W. H.; Onstott, E. D.
1977-01-01
The Northrop urgency decision pilot model was used in a command tracking task to compare the optimized performance of multiaxis attention allocation pilot models whose urgency functions were (1) based on tracking error alone, and (2) based on both tracking error and error rate. A matrix of system dynamics and command inputs was employed to create both symmetric and asymmetric two-axis compensatory tracking tasks. All tasks were single loop on each axis. Analysis showed that a model that allocates control attention through nonlinear urgency functions using only error information could not achieve the performance of the full model whose attention shifting algorithm included both error and error rate terms. Subsequent to this analysis, tracking performance predictions for the full model were verified by piloted flight simulation. Complete model and simulation data are presented.
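The contrast between the two urgency formulations can be caricatured with a toy allocation rule; the linear form and weighting below are assumptions for illustration, not Northrop's actual urgency functions.

```python
def allocate_attention(errors, error_rates, k=0.5):
    """Return the index of the axis claiming control attention.
    k = 0 reproduces the error-only model; k > 0 also weights the
    error rate, as in the full model."""
    urgency = [abs(e) + k * abs(de) for e, de in zip(errors, error_rates)]
    return max(range(len(urgency)), key=urgency.__getitem__)
```

An axis whose error is still small but diverging quickly (large error rate) is attended earlier by the full model, which is one intuitive mechanism behind its better tracking performance.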
NASA Astrophysics Data System (ADS)
Wu, C. Z.; Huang, G. H.; Yan, X. P.; Cai, Y. P.; Li, Y. P.
2010-05-01
Large crowds are increasingly common at political, social, economic, cultural and sports events in urban areas. This has led to attention on the management of evacuations under such situations. In this study, we optimise an approximation method for vehicle allocation and route planning in case of an evacuation. This method, based on an interval-parameter multi-objective optimisation model, has potential for use in a flexible decision support system for evacuation management. The modeling solutions are obtained by sequentially solving two sub-models corresponding to lower- and upper-bounds for the desired objective function value. The interval solutions are feasible and stable in the given decision space, and this may reduce the negative effects of uncertainty, thereby improving decision makers' estimates under different conditions. The resulting model can be used for a systematic analysis of the complex relationships among evacuation time, cost and environmental considerations. The results of a case study used to validate the proposed model show that the model does generate useful solutions for planning evacuation management and practices. Furthermore, these results are useful for evacuation planners, not only in making vehicle allocation decisions but also for providing insight into the tradeoffs among evacuation time, environmental considerations and economic objectives.
NASA Astrophysics Data System (ADS)
Glaubius, J.; Maerker, M.
2016-12-01
Anthropogenic landforms, such as mines and agricultural terraces, are impacted by both geomorphic and social processes at varying intensities through time. In the case of agricultural terraces, decisions regarding terrace maintenance are intertwined with land use, such as when terraced fields are abandoned. Furthermore, terrace maintenance and land use decisions, either jointly or separately, may be in response to geomorphic processes, as well as geomorphic feedbacks. Previous studies of these complex geomorphic systems considered agricultural terraces as static features or analyzed only the geomorphic response to landowner decisions. Such research is appropriate for short-term or binary landscape scenarios (e.g. the impact of maintained vs. abandoned terraces), but the complexities inherent in these socio-natural systems require an approach that includes both social and geomorphic processes. This project analyzes feedbacks and emergent properties in terraced systems by implementing a coupled landscape evolution model (LEM) and agent-based model (ABM) using the Landlab and Mesa modeling libraries. In the ABM portion of the model, agricultural terraces are conceptualized using a life-cycle stages schema and implemented using Markov Decision Processes to simulate the changing geomorphic impact of terracing based on human decisions. This paper examines the applicability of this approach by comparing results from a LEM-only model against the coupled LEM-ABM model for a terraced region. Model results are compared by the quantity and spatial patterning of sediment transport. This approach fully captures long-term landscape evolution of terraced terrain that is otherwise lost when the life-cycle of terraces is not considered. The coupled LEM-ABM approach balances both environmental and social processes so that the socio-natural feedbacks in such anthropogenic systems can be disentangled.
A decision analysis approach for risk management of near-earth objects
NASA Astrophysics Data System (ADS)
Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.
2014-10-01
Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. 
Sixth, sensitivity analyses are performed in order to examine the impact of uncertainties. Finally, the need for further analysis, data collection, or refinement is determined. The first steps of defining the problem and the objectives are critical to constructing an informative decision analysis. Such steps must be undertaken with participation from experts, decision-makers, and stakeholders (defined here as "decision participants"). The basic problem here can be framed as: “What is the best strategy to manage risk associated with NEOs?” Some high-level objectives might be to minimize: mortality and injuries, damage to critical infrastructure (e.g., power, communications and food distribution), ecosystem damage, property damage, ungrounded media and public speculation, resources expended, and overall cost. Another valuable objective would be to maximize inter-agency/government coordination. Some of these objectives (e.g., “minimize mortality”) are readily quantified (e.g., deaths and injuries averted). Others are less so (e.g., “maximize inter-agency/government coordination”), but these can be scaled. Objectives may be inversely related: e.g., a strategy that minimizes mortality may cost more. They are also unlikely to be weighted equally. Defining objectives and assessing their relative weight and interactions requires early engagement with decision participants. High-level decisions include whether to deflect a NEO, when to deflect, what is the best alternative for deflection/destruction, and disaster management strategies if an impact occurs. 
Important influences include, for example: NEO characteristics (orbital characteristics, diameter, mass, spin and composition), impact probability and location, interval between discovery and projected impact date, interval between discovery and deflection target date, costs of information collection, costs and technological feasibility of deflection alternatives, risks of deflection campaigns, requirements for inter-agency and international cooperation, and timing of informing the public. The analytical aspects of decision analysis center on estimation of the expected value (i.e. utility) of different alternatives. The expected value of an alternative is a function of the probability-weighted consequences, estimated using Bayesian calculations in a decision tree or influence diagram model. The result is a set of expected-value estimates for all alternatives evaluated that enables a ranking; the higher the expected value, the more preferred the alternative. A common way to include resource limitations is by framing the decision analysis in the context of economics (e.g., cost-effectiveness analysis). An important aspect of decision analysis in the NEO risk management case is the ability, known as sensitivity analysis, to examine the effect of parameter uncertainty upon decisions. The simplest way to evaluate uncertainty associated with the information used in a decision analysis is to adjust the input values one at a time (or simultaneously) to examine how the results change. Monte Carlo simulations can be used to adjust the inputs over ranges or distributions of values; statistical means then are used to determine the most influential variables. These techniques yield a measure known as the expected value of imperfect information. This value is highly informative, because it allows the decision-maker with imperfect information to evaluate the impact of using experiments, tests, or data collection (e.g. Earth-based observations, space-based remote sensing, etc.) 
to refine judgments, and indeed to estimate how much should be spent to reduce uncertainty.
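The expected-value machinery described above can be sketched in a few lines: each alternative's value is its probability-weighted consequences, and a one-way sensitivity sweep shows where the preferred decision flips. The alternatives, probabilities, and consequence values below are hypothetical illustrations, not figures from the source.

```python
# Hypothetical decision-analysis sketch (all names and numbers are assumptions):
# expected value of two NEO-risk alternatives as probability-weighted
# consequences, plus a one-at-a-time sensitivity sweep on impact probability.

def expected_value(p_impact, consequence_if_impact, consequence_if_miss, cost):
    """Expected value (utility) = probability-weighted consequences minus cost."""
    return (p_impact * consequence_if_impact
            + (1.0 - p_impact) * consequence_if_miss
            - cost)

# Two stylized alternatives: do nothing vs. mount a deflection campaign
# (deflection lowers the impact probability but consumes resources).
def ev_do_nothing(p):
    return expected_value(p, consequence_if_impact=-1000.0,
                          consequence_if_miss=0.0, cost=0.0)

def ev_deflect(p, success=0.9, campaign_cost=50.0):
    residual_p = p * (1.0 - success)   # impact occurs only if deflection fails
    return expected_value(residual_p, consequence_if_impact=-1000.0,
                          consequence_if_miss=0.0, cost=campaign_cost)

def best_alternative(p):
    evs = {"do_nothing": ev_do_nothing(p), "deflect": ev_deflect(p)}
    return max(evs, key=evs.get)

# One-way sensitivity analysis: vary impact probability, watch the decision flip.
for p in (0.01, 0.05, 0.10, 0.50):
    print(p, best_alternative(p))
```

With these toy numbers the decision flips once the impact probability exceeds roughly 0.056; in a real analysis the same sweep would be run over every uncertain input, which is the basis of the value-of-information calculations the passage describes.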
A critical narrative analysis of shared decision-making in acute inpatient mental health care.
Stacey, Gemma; Felton, Anne; Morgan, Alastair; Stickley, Theo; Willis, Martin; Diamond, Bob; Houghton, Philip; Johnson, Beverley; Dumenya, John
2016-01-01
Shared decision-making (SDM) is a high priority in healthcare policy and is complementary to the recovery philosophy in mental health care. This agenda has been operationalised within the Values-Based Practice (VBP) framework, which offers a theoretical and practical model to promote democratic interprofessional approaches to decision-making. However, these are limited by a lack of recognition of the implications of power implicit within the mental health system. This study considers issues of power within the context of decision-making and examines to what extent decisions about patients' care on acute in-patient wards are perceived to be shared. Focus groups were conducted with 46 mental health professionals, service users, and carers. The data were analysed using the framework of critical narrative analysis (CNA). The findings of the study suggested each group constructed different identity positions, which placed them as inside or outside of the decision-making process. This reflected their view of themselves as best placed to influence a decision on behalf of the service user. In conclusion, the discourse of VBP and SDM needs to take account of how differentials of power and the positioning of speakers affect the context in which decisions take place.
Sebold, Miriam; Nebe, Stephan; Garbusow, Maria; Guggenmos, Matthias; Schad, Daniel J; Beck, Anne; Kuitunen-Paul, Soeren; Sommer, Christian; Frank, Robin; Neu, Peter; Zimmermann, Ulrich S; Rapp, Michael A; Smolka, Michael N; Huys, Quentin J M; Schlagenhauf, Florian; Heinz, Andreas
2017-12-01
Addiction is supposedly characterized by a shift from goal-directed to habitual decision making, thus facilitating automatic drug intake. The two-step task allows distinguishing between these mechanisms by computationally modeling goal-directed and habitual behavior as model-based and model-free control. In addicted patients, decision making may also strongly depend upon drug-associated expectations. Therefore, we investigated model-based versus model-free decision making and its neural correlates as well as alcohol expectancies in alcohol-dependent patients and healthy controls and assessed treatment outcome in patients. Ninety detoxified, medication-free, alcohol-dependent patients and 96 age- and gender-matched control subjects underwent functional magnetic resonance imaging during the two-step task. Alcohol expectancies were measured with the Alcohol Expectancy Questionnaire. Over a follow-up period of 48 weeks, 37 patients remained abstinent and 53 patients relapsed as indicated by the Alcohol Timeline Followback method. Patients who relapsed displayed reduced medial prefrontal cortex activation during model-based decision making. Furthermore, high alcohol expectancies were associated with low model-based control in relapsers, while the opposite was observed in abstainers and healthy control subjects. However, reduced model-based control per se was not associated with subsequent relapse. These findings suggest that poor treatment outcome in alcohol dependence does not simply result from a shift from model-based to model-free control but is instead dependent on the interaction between high drug expectancies and low model-based decision making. Reduced model-based medial prefrontal cortex signatures in those who relapse point to a neural correlate of relapse risk. These observations suggest that therapeutic interventions should target subjective alcohol expectancies. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. 
All rights reserved.
Kasthurirathne, Suranga N; Dixon, Brian E; Gichoya, Judy; Xu, Huiping; Xia, Yuni; Mamlin, Burke; Grannis, Shaun J
2017-05-01
Existing approaches to derive decision models from plaintext clinical data frequently depend on medical dictionaries as the sources of potential features. Prior research suggests that decision models developed using non-dictionary based feature sourcing approaches and "off the shelf" tools could predict cancer with performance metrics between 80% and 90%. We sought to compare non-dictionary based models to models built using features derived from medical dictionaries. We evaluated the detection of cancer cases from free text pathology reports using decision models built with combinations of dictionary or non-dictionary based feature sourcing approaches, 4 feature subset sizes, and 5 classification algorithms. Each decision model was evaluated using the following performance metrics: sensitivity, specificity, accuracy, positive predictive value, and area under the receiver operating characteristics (ROC) curve. Decision models parameterized using dictionary and non-dictionary feature sourcing approaches produced performance metrics between 70 and 90%. The source of features and feature subset size had no impact on the performance of a decision model. Our study suggests there is little value in leveraging medical dictionaries for extracting features for decision model building. Decision models built using features extracted from the plaintext reports themselves achieve comparable results to those built using medical dictionaries. Overall, this suggests that existing "off the shelf" approaches can be leveraged to perform accurate cancer detection using less complex Named Entity Recognition (NER) based feature extraction, automated feature selection and modeling approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan
2013-01-01
The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental-economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers' preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing optimal system returns against the corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative with relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental-economic optimization scheme in integrated watershed management.
Steingroever, Helen; Pachur, Thorsten; Šmíra, Martin; Lee, Michael D
2018-06-01
The Iowa Gambling Task (IGT) is one of the most popular experimental paradigms for comparing complex decision-making across groups. Most commonly, IGT behavior is analyzed using frequentist tests to compare performance across groups, and to compare inferred parameters of cognitive models developed for the IGT. Here, we present a Bayesian alternative based on Bayesian repeated-measures ANOVA for comparing performance, and a suite of three complementary model-based methods for assessing the cognitive processes underlying IGT performance. The three model-based methods involve Bayesian hierarchical parameter estimation, Bayes factor model comparison, and Bayesian latent-mixture modeling. We illustrate these Bayesian methods by applying them to test the extent to which differences in intuitive versus deliberate decision style are associated with differences in IGT performance. The results show that intuitive and deliberate decision-makers behave similarly on the IGT, and the modeling analyses consistently suggest that both groups of decision-makers rely on similar cognitive processes. Our results challenge the notion that individual differences in intuitive and deliberate decision styles have a broad impact on decision-making. They also highlight the advantages of Bayesian methods, especially their ability to quantify evidence in favor of the null hypothesis, and that they allow model-based analyses to incorporate hierarchical and latent-mixture structures.
Multi Criteria Evaluation Module for RiskChanges Spatial Decision Support System
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; Jaboyedoff, Michel; van Westen, Cees; Bakker, Wim
2015-04-01
Multi-Criteria Evaluation (MCE) is one of the five modules of the RiskChanges spatial decision support system. The RiskChanges web-based platform analyzes changes in hydro-meteorological risk and provides tools for selecting the best risk reduction alternative. It was developed under the CHANGES framework (changes-itn.eu) and the INCREO project (increo-fp7.eu). The MCE tool helps decision makers and spatial planners evaluate, sort, and rank decision alternatives. Users can choose among indicators defined within the system from risk and cost-benefit analysis results, and they can also add their own indicators. The system then standardizes and prioritizes the indicators, and the best decision alternative is selected using the weighted sum model (WSM). This work applies MCE to analyze changing risk over time under different scenarios and future years, bringing group decision making into practice and comparing results in numeric and graphical views within the system. We believe this study helps decision makers reach the best solution by expressing their preferences for strategies under future scenarios. Keywords: Multi-Criteria Evaluation, Spatial Decision Support System, Weighted Sum Model, Natural Hazard Risk Management
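The standardize-weight-rank pipeline behind the weighted sum model can be sketched as follows; the alternatives, indicator values, and weights below are hypothetical illustrations, not RiskChanges data.

```python
# A minimal sketch of the weighted sum model (WSM): min-max standardize each
# indicator, apply user weights, and rank alternatives by total score.
# All names and numbers are made up for illustration.

def standardize(values, benefit=True):
    """Min-max standardization to [0, 1]; invert for cost-type indicators."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return scaled if benefit else [1.0 - s for s in scaled]

def wsm_rank(alternatives, indicators, weights):
    """Score each alternative as the weighted sum of standardized indicators.

    indicators: dict name -> (raw values per alternative, benefit flag).
    """
    cols = {name: standardize(vals, benefit)
            for name, (vals, benefit) in indicators.items()}
    scores = [(alt, sum(weights[name] * cols[name][i] for name in indicators))
              for i, alt in enumerate(alternatives)]
    return sorted(scores, key=lambda t: t[1], reverse=True)

alternatives = ["dike", "relocation", "early_warning"]
indicators = {
    "risk_reduction": ([0.7, 0.9, 0.4], True),   # benefit: higher is better
    "cost":           ([5.0, 9.0, 1.0], False),  # cost: lower is better
}
weights = {"risk_reduction": 0.6, "cost": 0.4}

ranking = wsm_rank(alternatives, indicators, weights)
```

Because WSM scores are linear in the weights, re-running the ranking under different stakeholder weight sets is cheap, which is what makes it practical for the group decision making the abstract describes.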
2016-05-05
Skill Transfer and Virtual Training for IND Response Decision-Making: Models for Government–Industry Collaboration for the Development of Game-Based Training Tools. R.M. Seater. Game-based training tools, sometimes called "serious games," are becoming
Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D Gareth; Astley, Sue; Payne, Katherine
2017-09-01
To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1, risk 2, masking [supplemental screening for women with higher breast density], and masking and risk 1) compared with the current UK NBSP and no screening. The model assumed a lifetime horizon, the health service perspective to identify costs (£, 2015), and measured consequences in quality-adjusted life-years (QALYs). Multiple data sources were used: systematic reviews of effectiveness and utility, published studies reporting costs, and cohort studies embedded in existing NBSPs. Model parameter uncertainty was assessed using probabilistic sensitivity analysis and one-way sensitivity analysis. The base-case analysis, supported by probabilistic sensitivity analysis, suggested that the risk stratified NBSPs (risk 1 and risk-2) were relatively cost-effective when compared with the current UK NBSP, with incremental cost-effectiveness ratios of £16,689 per QALY and £23,924 per QALY, respectively. Stratified NBSP including masking approaches (supplemental screening for women with higher breast density) was not a cost-effective alternative, with incremental cost-effectiveness ratios of £212,947 per QALY (masking) and £75,254 per QALY (risk 1 and masking). When compared with no screening, all stratified NBSPs could be considered cost-effective. Key drivers of cost-effectiveness were discount rate, natural history model parameters, mammographic sensitivity, and biopsy rates for recalled cases. A key assumption was that the risk model used in the stratification process was perfectly calibrated to the population. This early model-based cost-effectiveness analysis provides indicative evidence for decision makers to understand the key drivers of costs and QALYs for exemplar stratified NBSP. 
Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Kuesten, Carla; Bi, Jian
2018-06-03
Conventional drivers-of-liking analysis was extended with a time dimension into temporal drivers of liking (TDOL), based on functional data analysis methodology and non-additive models for multiple-attribute time-intensity (MATI) data. The non-additive models, which consider both direct effects and interaction effects of attributes on consumer overall liking, include the Choquet integral with a fuzzy measure from multi-criteria decision-making, and linear regression based on variance decomposition. Dynamics of TDOL, i.e., the derivatives of the relative-importance functional curves, were also explored. The well-established R packages 'fda', 'kappalab' and 'relaimpo' were used for developing TDOL. Applied use of these methods shows that the relative importance of MATI curves offers insights for understanding the temporal aspects of consumer liking for fruit chews.
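The non-additive aggregation mentioned above can be illustrated with a discrete Choquet integral over a fuzzy measure; the attribute names, scores, and measure values below are made-up examples, not the paper's fitted model.

```python
# A minimal discrete Choquet integral with a fuzzy measure over criteria
# subsets. Unlike a weighted sum, the measure can encode synergy between
# attributes. All names and numbers are hypothetical.

def choquet(values, mu):
    """Choquet integral of {criterion: value} w.r.t. fuzzy measure mu.

    mu maps frozensets of criteria to [0, 1], with mu(empty set) = 0 and
    mu(all criteria) = 1.
    """
    items = sorted(values.items(), key=lambda kv: kv[1])  # ascending values
    total, prev = 0.0, 0.0
    remaining = set(values)
    for name, v in items:
        total += (v - prev) * mu[frozenset(remaining)]
        prev = v
        remaining.discard(name)
    return total

# Two attributes whose joint effect exceeds the sum of their parts (synergy).
mu = {
    frozenset(): 0.0,
    frozenset({"sweetness"}): 0.3,
    frozenset({"flavor"}): 0.4,
    frozenset({"sweetness", "flavor"}): 1.0,   # > 0.3 + 0.4: interaction
}
score = choquet({"sweetness": 0.6, "flavor": 0.8}, mu)
```

Setting the measure of the pair above the sum of the singleton measures is what captures "interaction effects of attributes" that an additive model cannot express.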
Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling
Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola
2017-01-01
Objective To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. Methods A state-transition Markov model was developed, in which patients may experience surgery, post-surgery complications, diabetes mellitus type 2, cardiovascular diseases or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgeries were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for the population, the characteristics of which were obtained from surgery candidates in Italy. Results In the base-case analysis, over 10 years, bariatric surgery led to cost increment of EUR 2,661 and generated additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, additional 0.5 life years and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. Conclusion In a comprehensive decision analytic model, a current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. PMID:28601866
Wu, Justine P; Damschroder, Laura J; Fetters, Michael D; Zikmund-Fisher, Brian J; Crabtree, Benjamin F; Hudson, Shawna V; Ruffin, Mack T; Fucinari, Juliana; Kang, Minji; Taichman, L Susan; Creswell, John W
2018-04-18
Women with chronic medical conditions, such as diabetes and hypertension, have a higher risk of pregnancy-related complications compared with women without medical conditions and should be offered contraception if desired. Although evidence based guidelines for contraceptive selection in the presence of medical conditions are available via the United States Medical Eligibility Criteria (US MEC), these guidelines are underutilized. Research also supports the use of decision tools to promote shared decision making between patients and providers during contraceptive counseling. The overall goal of the MiHealth, MiChoice project is to design and implement a theory-driven, Web-based tool that incorporates the US MEC (provider-level intervention) within the vehicle of a contraceptive decision tool for women with chronic medical conditions (patient-level intervention) in community-based primary care settings (practice-level intervention). This will be a 3-phase study that includes a predesign phase, a design phase, and a testing phase in a randomized controlled trial. This study protocol describes phase 1 and aim 1, which is to determine patient-, provider-, and practice-level factors that are relevant to the design and implementation of the contraceptive decision tool. This is a mixed methods implementation study. To customize the delivery of the US MEC in the decision tool, we selected high-priority constructs from the Consolidated Framework for Implementation Research and the Theoretical Domains Framework to drive data collection and analysis at the practice and provider level, respectively. A conceptual model that incorporates constructs from the transtheoretical model and the health beliefs model undergirds patient-level data collection and analysis and will inform customization of the decision tool for this population. 
We will recruit 6 community-based primary care practices and conduct quantitative surveys and semistructured qualitative interviews with women who have chronic medical conditions, their primary care providers (PCPs), and clinic staff, as well as field observations of practice activities. Quantitative survey data will be summarized with simple descriptive statistics and relationships between participant characteristics and contraceptive recommendations (for PCPs), and current contraceptive use (for patients) will be examined using Fisher exact test. We will conduct thematic analysis of qualitative data from interviews and field observations. The integration of data will occur by comparing, contrasting, and synthesizing qualitative and quantitative findings to inform the future development and implementation of the intervention. We are currently enrolling practices and anticipate study completion in 15 months. This protocol describes the first phase of a multiphase mixed methods study to develop and implement a Web-based decision tool that is customized to meet the needs of women with chronic medical conditions in primary care settings. Study findings will promote contraceptive counseling via shared decision making and reflect evidence-based guidelines for contraceptive selection. ClinicalTrials.gov NCT03153644; https://clinicaltrials.gov/ct2/show/NCT03153644 (Archived by WebCite at http://www.webcitation.org/6yUkA5lK8). ©Justine P Wu, Laura J Damschroder, Michael D Fetters, Brian J Zikmund-Fisher, Benjamin F Crabtree, Shawna V Hudson, Mack T Ruffin IV, Juliana Fucinari, Minji Kang, L Susan Taichman, John W Creswell. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 18.04.2018.
Hostettler, Isabel Charlotte; Muroi, Carl; Richter, Johannes Konstantin; Schmid, Josef; Neidert, Marian Christoph; Seule, Martin; Boss, Oliver; Pangalu, Athina; Germans, Menno Robbert; Keller, Emanuela
2018-01-19
OBJECTIVE The aim of this study was to create prediction models for outcome parameters by decision tree analysis based on clinical and laboratory data in patients with aneurysmal subarachnoid hemorrhage (aSAH). METHODS The database consisted of clinical and laboratory parameters of 548 patients with aSAH who were admitted to the Neurocritical Care Unit, University Hospital Zurich. To examine the model performance, the cohort was randomly divided into a derivation cohort (60% [n = 329]; training data set) and a validation cohort (40% [n = 219]; test data set). The classification and regression tree prediction algorithm was applied to predict death, functional outcome, and ventriculoperitoneal (VP) shunt dependency. Chi-square automatic interaction detection was applied to predict delayed cerebral infarction on days 1, 3, and 7. RESULTS The overall mortality was 18.4%. The accuracy of the decision tree models was good for survival on day 1 and favorable functional outcome at all time points, with a difference between the training and test data sets of < 5%. Prediction accuracy for survival on day 1 was 75.2%. The most important differentiating factor was the interleukin-6 (IL-6) level on day 1. Favorable functional outcome, defined as Glasgow Outcome Scale scores of 4 and 5, was observed in 68.6% of patients. Favorable functional outcome at all time points had a prediction accuracy of 71.1% in the training data set, with procalcitonin on day 1 being the most important differentiating factor at all time points. A total of 148 patients (27%) developed VP shunt dependency. The most important differentiating factor was hyperglycemia on admission. CONCLUSIONS The multiple variable analysis capability of decision trees enables exploration of dependent variables in the context of multiple changing influences over the course of an illness. 
The decision tree currently generated increases awareness of the early systemic stress response, which is seemingly pertinent for prognostication.
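The core of the classification-and-regression-tree approach can be illustrated with a single-split decision stump in pure Python; the marker values and outcomes below are synthetic, and this simplified stump is a stand-in for the full CART algorithm used in the study, not the study's model.

```python
# A minimal single-split decision stump, illustrating the core of CART-style
# prediction: choose the feature threshold that best separates the outcome.
# Data are synthetic; the feature stands in for a marker such as IL-6 on day 1.

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def fit_stump(xs, ys):
    """Find the threshold minimizing the weighted Gini impurity of the leaves."""
    best_score, threshold = float("inf"), None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_score, threshold = score, t
    left = [y for x, y in zip(xs, ys) if x <= threshold]
    right = [y for x, y in zip(xs, ys) if x > threshold]

    def majority(labels):
        return int(sum(labels) * 2 >= len(labels))

    return threshold, majority(left), majority(right)

def predict(stump, x):
    threshold, left_label, right_label = stump
    return left_label if x <= threshold else right_label

# Synthetic marker values and binary outcomes (1 = poor outcome).
xs = [2.0, 3.0, 3.5, 8.0, 9.0, 10.0]
ys = [0,   0,   0,   1,   1,   1]
stump = fit_stump(xs, ys)
```

A full tree recurses this split on each leaf; holding out a test set, as the study does with its 60/40 split, then measures how well the learned thresholds generalize.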
Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M
2017-10-01
Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.
Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries
NASA Astrophysics Data System (ADS)
Reeves, H. W.; Fienen, M. N.; Feinstein, D.
2015-12-01
Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding, numerical models. They may potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may be fulfilling a role often accomplished by application of analytical solutions. The major challenge to transferring a metamodel to a non-modeled area is how to quantify the spatial data in the new area of interest in such a way that it is consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that inputs must be scaled consistently with the spatial scale of the numerical model to represent different settings adequately. Careful GIS analysis of the numerical model, metamodel, and new area of interest is required for successful transfer of results.
Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.
2015-01-01
Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. 
Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
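The UA/GSA idea above can be sketched for a toy suitability index: propagate input uncertainty by Monte Carlo, then rank inputs by a crude one-at-a-time swing. The index form and parameter ranges below are hypothetical, not those of the Everglades SAV models.

```python
import random

# Toy uncertainty analysis (UA) and one-at-a-time sensitivity analysis for a
# hypothetical habitat suitability index (HSI). All functional forms and
# ranges are assumptions made for illustration.

def hsi(salinity, depth):
    """Toy HSI in [0, 1]: prefers low salinity and intermediate depth (~1 m)."""
    s_score = max(0.0, 1.0 - salinity / 40.0)
    d_score = max(0.0, 1.0 - abs(depth - 1.0))
    return s_score * d_score

def monte_carlo(n=10_000, seed=42):
    """Propagate uniform input uncertainty; return mean and variance of HSI."""
    rng = random.Random(seed)
    scores = [hsi(rng.uniform(0, 40), rng.uniform(0, 2)) for _ in range(n)]
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / n
    return mean, var

def one_at_a_time():
    """Output swing as each input sweeps its range, others held at nominal."""
    base_sal, base_depth = 20.0, 1.0
    sal_swing = abs(hsi(0.0, base_depth) - hsi(40.0, base_depth))
    depth_swing = abs(hsi(base_sal, 1.0) - hsi(base_sal, 0.0))
    return {"salinity": sal_swing, "depth": depth_swing}

mean, var = monte_carlo()
swings = one_at_a_time()
```

Variance-based global methods (e.g. Sobol indices) refine the one-at-a-time swing by accounting for interactions, which is what allows the spatially varying parameter rankings the abstract reports.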
Cost-effectiveness of orthoptic screening in kindergarten: a decision-analytic model.
König, H H; Barry, J C; Leidl, R; Zrenner, E
2000-06-01
The purpose of this study was to analyze the cost-effectiveness of orthoptic screening for amblyopia in kindergarten. A decision-analytic model was used. In this model all kindergarten children in Germany aged 3 years were examined by an orthoptist. Children with positive screening results were referred to an ophthalmologist for diagnosis. The number of newly diagnosed cases of amblyopia, amblyogenic non-obvious strabismus and amblyogenic refractive errors was used as the measure of effectiveness. Direct costs were measured from a third-party payer perspective. Data for model parameters were obtained from the literature and from our own measurements in kindergartens. A base analysis was performed using median parameter values. The influence of uncertain parameters was tested in sensitivity analyses. According to the base analysis, the cost of one orthoptic screening test was 7.87 euro. One ophthalmologic examination cost 36.40 euro. The total cost of the screening program in all kindergartens was 3.1 million euro. A total of 4,261 new cases would be detected. The cost-effectiveness ratio was 727 euro per case detected. Sensitivity analysis showed considerable influence of the prevalence rate of target conditions and of the specificity of the orthoptic examination on the cost-effectiveness ratio. This analysis provides information that is useful for discussion about the implementation of orthoptic screening and for planning a field study.
A pattern-based analysis of clinical computer-interpretable guideline modeling languages.
Mulyar, Nataliya; van der Aalst, Wil M P; Peleg, Mor
2007-01-01
Languages used to specify computer-interpretable guidelines (CIGs) differ in their approaches to addressing particular modeling challenges. The main goals of this article are: (1) to examine the expressive power of CIG modeling languages, and (2) to define the differences, from the control-flow perspective, between process languages in workflow management systems and modeling languages used to design clinical guidelines. The pattern-based analysis was applied to the guideline modeling languages Asbru, EON, GLIF, and PROforma. We focused on control-flow and left other perspectives out of consideration. We evaluated the selected CIG modeling languages and identified their degree of support of 43 control-flow patterns. We used a set of explicitly defined evaluation criteria to determine whether each pattern is supported directly, indirectly, or not at all. PROforma offers direct support for 22 of 43 patterns, Asbru 20, GLIF 17, and EON 11. All four directly support basic control-flow patterns, cancellation patterns, and some advanced branching and synchronization patterns. None support multiple instances patterns. They offer varying levels of support for synchronizing merge patterns and state-based patterns. Some support a few scenarios not covered by the 43 control-flow patterns. CIG modeling languages are remarkably close to traditional workflow languages from the control-flow perspective, but cover many fewer workflow patterns. CIG languages offer some flexibility that supports modeling of complex decisions and provide ways for modeling some decisions not covered by workflow management systems. Workflow management systems may be suitable for clinical guideline applications.
The value of a statistical life: a meta-analysis with a mixed effects regression model.
Bellavance, François; Dionne, Georges; Lebeau, Martin
2009-03-01
The value of a statistical life (VSL) is a very controversial topic, but one which is essential to the optimization of governmental decisions. We see a great variability in the values obtained from different studies. The source of this variability needs to be understood, in order to offer public decision-makers better guidance in choosing a value and to set clearer guidelines for future research on the topic. This article presents a meta-analysis based on 39 observations obtained from 37 studies (from nine different countries) which all use a hedonic wage method to calculate the VSL. Our meta-analysis is innovative in that it is the first to use the mixed effects regression model [Raudenbush, S.W., 1994. Random effects models. In: Cooper, H., Hedges, L.V. (Eds.), The Handbook of Research Synthesis. Russel Sage Foundation, New York] to analyze studies on the value of a statistical life. We conclude that the variability found in the values studied stems in large part from differences in methodologies.
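The pooling of study-level estimates can be illustrated with the DerSimonian-Laird random-effects estimator, a simpler relative of the mixed effects regression model the meta-analysis actually uses; the VSL estimates and standard errors below are made up for illustration.

```python
# DerSimonian-Laird random-effects pooling: estimate between-study variance
# (tau^2) from the fixed-effect heterogeneity statistic Q, then reweight.
# This is a simplified stand-in for the paper's mixed effects regression;
# the numbers are hypothetical.

def dersimonian_laird(estimates, std_errs):
    """Return (pooled estimate, between-study variance tau^2)."""
    w = [1.0 / se ** 2 for se in std_errs]          # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, estimates)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in std_errs]
    pooled = sum(wi * y for wi, y in zip(w_re, estimates)) / sum(w_re)
    return pooled, tau2

# Hypothetical VSL estimates (millions of dollars) and standard errors.
vsl = [4.5, 6.1, 9.8, 3.2, 7.4]
se = [0.8, 1.2, 2.0, 0.9, 1.5]
pooled, tau2 = dersimonian_laird(vsl, se)
```

A positive tau^2 quantifies exactly the between-study variability the abstract highlights; a mixed effects regression goes further by explaining part of it with study-level covariates such as methodology.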
Akam, Thomas; Costa, Rui; Dayan, Peter
2015-12-01
The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.
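The model-free signature the authors analyse, a stay probability that depends on reward but not on transition type, can be reproduced with a minimal simulation. This Python sketch is a deliberate simplification (fixed payout probabilities instead of the task's slowly drifting ones, epsilon-greedy choice, invented parameters), not the authors' simulation code:

```python
import random

def simulate_model_free(n_trials=5000, alpha=0.5, eps=0.1, seed=1):
    """A purely model-free TD agent on a simplified two-step task.

    First-stage actions 0/1 lead to second-stage states 0/1 via a common
    (p=0.7) or rare transition; each second-stage state pays out with a
    fixed probability here (the real task uses drifting probabilities).
    Returns stay probabilities keyed by (transition type, rewarded?).
    """
    rng = random.Random(seed)
    q = [0.0, 0.0]                    # model-free first-stage action values
    pay = [0.8, 0.2]                  # payout probability of states 0 and 1
    stays = {("common", True): [0, 0], ("common", False): [0, 0],
             ("rare", True): [0, 0], ("rare", False): [0, 0]}
    prev = None
    for _ in range(n_trials):
        a = rng.randrange(2) if rng.random() < eps else (0 if q[0] >= q[1] else 1)
        common = rng.random() < 0.7
        state = a if common else 1 - a
        r = rng.random() < pay[state]
        if prev is not None:          # did the agent repeat its last choice?
            key = ("common" if prev[1] else "rare", prev[2])
            stays[key][1] += 1
            stays[key][0] += int(a == prev[0])
        q[a] += alpha * (float(r) - q[a])   # TD update, blind to transition type
        prev = (a, common, r)
    return {k: s / n for k, (s, n) in stays.items() if n}
```

Because the TD update ignores the transition structure, reward raises the repeated action's value whether the transition was common or rare, so the agent stays after rewards regardless of transition type, the canonical model-free pattern.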
NASA Technical Reports Server (NTRS)
Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.
2007-01-01
This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite- and ground-based sensor data is needed to enhance the models/tools used by policy makers for the protection of national and global public health communities.
[Modeling in value-based medicine].
Neubauer, A S; Hirneiss, C; Kampik, A
2010-03-01
Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and, ideally, validation of a model's predictive properties. The existing uncertainty with any modeling should be clearly stated. This is true for economic modeling in VBM as well as when using disease risk models to support clinical decisions. In economic modeling uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than without this additional information.
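Of the model types listed above, a Markov cohort model is the easiest to sketch: a cohort moves between health states each cycle and accrues discounted utility. The states, transition probabilities, and utility weights below are hypothetical illustrations, not values from the article:

```python
# states: well, sick, dead; annual transition matrix (hypothetical numbers)
P = [[0.90, 0.08, 0.02],
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]]
utilities = [1.0, 0.6, 0.0]   # QALY weight accrued per state per cycle

def run_markov(p, util, cycles=10, discount=0.03):
    """Run a 3-state Markov cohort model and return total discounted QALYs."""
    cohort = [1.0, 0.0, 0.0]          # the whole cohort starts in 'well'
    qalys = 0.0
    for t in range(cycles):
        # accrue discounted utility for the current state occupancy
        qalys += sum(c * u for c, u in zip(cohort, util)) / (1 + discount) ** t
        # advance the cohort one cycle through the transition matrix
        cohort = [sum(cohort[i] * p[i][j] for i in range(3)) for j in range(3)]
    return qalys
```

Running two such models, one per treatment arm, and differencing costs and QALYs is the usual basis for the cost-effectiveness acceptability curves mentioned above; sensitivity analysis then perturbs P and utilities.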
UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
In 1959 IJ Good published the discussion "Kinds of Probability" in Science. Good identified (at least) five kinds. The need for (at least) a sixth kind of probability when quantifying uncertainty in the context of climate science is discussed. This discussion brings out the differences between weather-like forecasting tasks and climate-like tasks, with a focus on the effective use both of science and of modelling in support of decision making. Good also introduced the idea of a "Dynamic probability", a probability one expects to change without any additional empirical evidence; the probabilities assigned by a chess playing program when it is only halfway through its analysis being an example. This case is contrasted with the case of "Mature probabilities", where a forecast algorithm (or model) has converged on its asymptotic probabilities and the question hinges on whether or not those probabilities are expected to change significantly before the event in question occurs, even in the absence of new empirical evidence. If so, then how might one report and deploy such immature probabilities rationally in scientific support of decision making? Mature probability is suggested as a useful sixth kind; although Good would doubtless argue that we can get by with just one, effective communication with decision makers may be enhanced by speaking as if the others existed. This again highlights the distinction between weather-like contexts and climate-like contexts. In the former context one has access to a relevant climatology (a relevant, arguably informative distribution prior to any model simulations); in the latter context that information is not available, although one can fall back on the scientific basis upon which the model itself rests, and estimate the probability that the model output is in fact misinformative.
This subjective "probability of a big surprise" is one way to communicate the probability that model-based information holds in practice, that is, the probability that the information on which the model-based probability is conditioned actually holds. It is argued that no model-based climate-like probability forecast is complete without a quantitative estimate of its own irrelevance, and that the clear identification of model-based probability forecasts as mature or immature is a critical element for maintaining the credibility of science-based decision support, and can shape uncertainty quantification more widely.
He, Y J; Li, X T; Fan, Z Q; Li, Y L; Cao, K; Sun, Y S; Ouyang, T
2018-01-23
Objective: To construct a dynamic enhanced MR based predictive model for early assessment of pathological complete response (pCR) to neoadjuvant therapy in breast cancer, and to evaluate the clinical benefit of the model by using decision curve analysis. Methods: From December 2005 to December 2007, 170 patients with breast cancer treated with neoadjuvant therapy were identified and their MR images before neoadjuvant therapy and at the end of the first cycle of neoadjuvant therapy were collected. Logistic regression was used to detect independent factors for predicting pCR and construct the predictive model accordingly; then receiver operating characteristic (ROC) curve and decision curve analyses were used to evaluate the predictive model. Results: ΔArea(max) and Δslope(max) were independent predictive factors for pCR, with OR = 0.942 (95% CI: 0.918-0.967) and OR = 0.961 (95% CI: 0.940-0.987), respectively. The area under the ROC curve (AUC) for the constructed model was 0.886 (95% CI: 0.820-0.951). The decision curve showed that in the range of threshold probability above 0.4, the predictive model presented increased net benefit as the threshold probability increased. Conclusions: The constructed predictive model for pCR is of potential clinical value, with an AUC>0.85. Meanwhile, decision curve analysis indicates the constructed predictive model has a net benefit of 3 to 8 percent in the likely range of probability threshold from 80% to 90%.
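The decision-curve logic used above, net benefit as a function of the threshold probability at which one would act, can be sketched in a few lines. The formula is the standard one (TP/n - FP/n * pt/(1-pt)); the example data in the usage are invented, not the study's:

```python
def net_benefit(probs, outcomes, threshold):
    """Net benefit of a predictive model at one threshold probability.

    probs: predicted probabilities (e.g. of pCR); outcomes: 1 if the
    event occurred, else 0. Patients with prob >= threshold are 'treated'.
    """
    n = len(probs)
    tp = sum(1 for p, y in zip(probs, outcomes) if p >= threshold and y == 1)
    fp = sum(1 for p, y in zip(probs, outcomes) if p >= threshold and y == 0)
    # false positives are penalized by the odds of the threshold probability
    return tp / n - fp / n * threshold / (1 - threshold)
```

Sweeping `threshold` over a range and plotting `net_benefit` against it, alongside the treat-all and treat-none (zero) lines, reproduces a decision curve; a model is clinically useful where its curve lies above both references.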
Orthogonal search-based rule extraction for modelling the decision to transfuse.
Etchells, T A; Harrison, M J
2006-04-01
Data from an audit relating to transfusion decisions during intermediate or major surgery were analysed to determine the strengths of certain factors in the decision making process. The analysis, using orthogonal search-based rule extraction (OSRE) from a trained neural network, demonstrated that the risk of tissue hypoxia (ROTH) assessed using a 100-mm visual analogue scale, the haemoglobin value (Hb) and the presence or absence of on-going haemorrhage (OGH) were able to reproduce the transfusion decisions with a joint specificity of 0.96 and sensitivity of 0.93 and a positive predictive value of 0.9. The rules indicating transfusion were: 1. ROTH > 32 mm and Hb < 94 g x l(-1); 2. ROTH > 13 mm and Hb < 87 g x l(-1); 3. ROTH > 38 mm, Hb < 102 g x l(-1) and OGH; 4. Hb < 78 g x l(-1).
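The four extracted rules reported above translate directly into code; a patient is flagged for transfusion if any rule fires:

```python
def transfuse(roth_mm, hb_g_per_l, ongoing_haemorrhage):
    """Transfusion decision per the OSRE rules reported above.

    roth_mm: risk of tissue hypoxia on a 100-mm visual analogue scale
    hb_g_per_l: haemoglobin in g/l; ongoing_haemorrhage: bool (OGH)
    """
    return (
        (roth_mm > 32 and hb_g_per_l < 94)                            # rule 1
        or (roth_mm > 13 and hb_g_per_l < 87)                         # rule 2
        or (roth_mm > 38 and hb_g_per_l < 102 and ongoing_haemorrhage)  # rule 3
        or hb_g_per_l < 78                                            # rule 4
    )
```

This is the appeal of rule extraction noted in the abstract: the trained network's behaviour compresses into a handful of auditable threshold rules rather than opaque weights.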
Holt, S; Bertelli, G; Humphreys, I; Valentine, W; Durrani, S; Pudney, D; Rolles, M; Moe, M; Khawaja, S; Sharaiha, Y; Brinkworth, E; Whelan, S; Jones, S; Bennett, H; Phillips, C J
2013-06-11
Tumour gene expression analysis is useful in predicting adjuvant chemotherapy benefit in early breast cancer patients. This study aims to examine the implications of routine Oncotype DX testing in the U.K. Women with oestrogen receptor positive (ER+), pN0 or pN1mi breast cancer were assessed for adjuvant chemotherapy and subsequently offered Oncotype DX testing, with changes in chemotherapy decisions recorded. A subset of patients completed questionnaires about their uncertainties regarding chemotherapy decisions pre- and post-testing. All patients were asked to complete a diary of medical interactions over the next 6 months, from which economic data were extracted to model the cost-effectiveness of testing. Oncotype DX testing resulted in changes in chemotherapy decisions in 38 of 142 (26.8%) women, with 26 of 57 (45.6%) spared chemotherapy and 12 of 85 (14.1%) requiring chemotherapy when not initially recommended (9.9% reduction overall). Decision conflict analysis showed that Oncotype DX testing increased patients' confidence in treatment decision making. Economic analysis showed that routine Oncotype DX testing costs £6232 per quality-adjusted life year gained. Oncotype DX decreased chemotherapy use and increased confidence in treatment decision making in patients with ER+ early-stage breast cancer. Based on these findings, Oncotype DX is cost-effective in the UK setting.
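The reported £6232 per QALY is an incremental cost-effectiveness ratio (ICER): extra cost divided by QALYs gained versus the comparator. A minimal sketch of the arithmetic, with invented cost and QALY figures chosen only so the ratio comes out at £6232 (they are not the study's inputs):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# hypothetical per-patient figures, chosen only to illustrate the arithmetic
cost_per_qaly = icer(cost_new=11232.0, qaly_new=2.0, cost_old=5000.0, qaly_old=1.0)
```

An intervention is conventionally judged cost-effective in the UK when its ICER falls below NICE's roughly £20,000-30,000 per QALY threshold, which is why a figure of £6232 supports the abstract's conclusion.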
Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor
2011-07-01
The consequences of the Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant, in Japan, and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires a detailed geospatial analysis of information with the ability to extrapolate across different scales with applications to risk assessment models and decision making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the Chernobyl, Soviet Ukraine, accident in 1986. Copyright © 2011 SETAC.
NASA Astrophysics Data System (ADS)
El-Gafy, Mohamed Anwar
Transportation projects have an impact on the environment. The general environmental pollution and damage caused by roads is closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) are dependent on geo-spatial information in order to make an assessment, there are no rules per se for how to conduct an environmental assessment. Also, the particular objective of each assessment is dictated case-by-case, based on what information and analyses are required. The conventional approach to an Environmental Impact Assessment (EIA) study is time consuming because a large number of dependent and independent variables, each with different consequences, must be taken into account. With the emergence of satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the Environmental Impact Assessment (EIA) for transportation projects based on the integration of remote sensing technology, geographic information systems, and spatial modeling. By integrating the merits of the map overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around the road and its impact on the environment. This framework is expected to: (1) improve the quality of the decision making process, (2) be applicable both to urban and inter-urban projects, regardless of transport mode, and (3) present the data and make the appropriate analysis to support the decision-makers and allow them to present these data to public hearings in a simple manner. Case studies, transportation projects in the State of Florida, were analyzed to illustrate the use of the decision support framework and demonstrate its capabilities.
This cohesive and integrated system will facilitate rational decisions through cost effective coordination of environmental information and data management that can be tailored to specific projects. The framework would facilitate collecting, organizing, analyzing, archiving, and coordinating the information and data necessary to support technical and policy transportation decisions.
2015-10-28
techniques such as regression analysis, correlation, and multicollinearity assessment to identify the change and error on the input to the model ... between many of the independent or predictor variables, the issue of multicollinearity may arise [18]. VII. SUMMARY: Accurate decisions concerning ...
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges; which include, defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) opposed to being contained in a single UAV (monolithic). The case study based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
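The distributed-vs-monolithic trade study described above can be illustrated with a simple weighted-sum MCDM score; real trade studies use richer methods, and the criteria weights and normalized scores below are hypothetical, not taken from the case study:

```python
def weighted_score(alternative, weights):
    """Additive multi-criteria score. Criteria are assumed pre-normalized
    to [0, 1] with higher = better (cost and risk entered as 1 - value)."""
    return sum(weights[c] * alternative[c] for c in weights)

# hypothetical weights and normalized scores for the two architectures
weights = {"performance": 0.5, "risk": 0.3, "cost": 0.2}
distributed = {"performance": 0.9, "risk": 0.8, "cost": 0.4}
monolithic = {"performance": 0.6, "risk": 0.5, "cost": 0.9}
```

Under these invented numbers the distributed architecture scores higher; the design-trade-study variability modeling mentioned above amounts to sweeping such weights and scores and checking whether the ranking is stable.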
Decision Making and Behavioral Choice during Predator Avoidance
Herberholz, Jens; Marquart, Gregory D.
2012-01-01
One of the most important decisions animals have to make is how to respond to an attack from a potential predator. The response must be prompt and appropriate to ensure survival. Invertebrates have been important models in studying the underlying neurobiology of the escape response due to their accessible nervous systems and easily quantifiable behavioral output. Moreover, invertebrates provide opportunities for investigating these processes at a level of analysis not available in most other organisms. Recently, there has been a renewed focus in understanding how value-based calculations are made on the level of the nervous system, i.e., when decisions are made under conflicting circumstances, and the most desirable choice must be selected by weighing the costs and benefits for each behavioral choice. This article reviews samples from the current literature on anti-predator decision making in invertebrates, from single neurons to complex behaviors. Recent progress in understanding the mechanisms underlying value-based behavioral decisions is also discussed. PMID:22973187
NASA Astrophysics Data System (ADS)
Frey, Elaine F.
Even though environmental policy can greatly affect the path of technology diffusion, the economics literature contains limited empirical evidence of this relationship. My research will contribute to the available evidence by providing insight into the technology adoption decisions of electric generating firms. Since policies are often evaluated based on the incentives they provide to promote adoption of new technologies, it is important that policy makers understand the relationship between technological diffusion and regulation structure to make informed decisions. Lessons learned from this study can be used to guide future policies such as those directed to mitigate climate change. I first explore the diffusion of scrubbers, a sulfur dioxide (SO 2) abatement technology, in response to federal market-based regulations and state command-and-control regulations. I develop a simple theoretical model to describe the adoption decisions of scrubbers and use a survival model to empirically test the theoretical model. I find that power plants with strict command-and-control regulations have a high probability of installing a scrubber. These findings suggest that although market-based regulations have encouraged diffusion, many scrubbers have been installed because of state regulatory pressure. Although tradable permit systems are thought to give firms more flexibility in choosing abatement technologies, I show that interactions between a permit system and pre-existing command-and-control regulations can limit that flexibility. In a separate analysis, I explore the diffusion of combined cycle (CC) generating units, which are natural gas-fired generating units that are cleaner and more efficient than alternative generating units. I model the decision to consider adoption of a CC generating unit and the extent to which the technology is adopted in response to environmental regulations imposed on new sources of pollutants. 
To accomplish this, I use a zero-inflated Poisson model and focus on both the decision to adopt a CC unit at an existing power plant as well as the firm-level decision to adopt a CC unit in either a new or an existing power plant. Evidence from this empirical investigation shows that environmental regulation has a significant effect on both the decision to consider adoption as well as the extent of adoption.
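The zero-inflated Poisson model mentioned above mixes a point mass at zero (plants that never consider adopting a CC unit) with a Poisson count for those that do. A sketch of its probability mass function, with illustrative parameters rather than the study's estimates:

```python
import math

def zip_pmf(k, pi, lam):
    """P(Y = k) under a zero-inflated Poisson: with probability pi the
    count is a structural zero (adoption never considered); otherwise
    the count is Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * poisson
```

The two-part structure is what lets the authors separate the decision to consider adoption (the zero-inflation part, modeled with covariates such as regulation) from the extent of adoption (the count part).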
Richter Sundberg, Linda; Garvare, Rickard; Nyström, Monica Elisabeth
2017-05-11
The judgment and decision making process during guideline development is central for producing high-quality clinical practice guidelines, but the topic is relatively underexplored in the guideline research literature. We have studied the development process of national guidelines with a disease-prevention scope produced by the National board of Health and Welfare (NBHW) in Sweden. The NBHW formal guideline development model states that guideline recommendations should be based on five decision-criteria: research evidence; curative/preventive effect size, severity of the condition; cost-effectiveness; and ethical considerations. A group of health profession representatives (i.e. a prioritization group) was assigned the task of ranking condition-intervention pairs for guideline recommendations, taking into consideration the multiple decision criteria. The aim of this study was to investigate the decision making process during the two-year development of national guidelines for methods of preventing disease. A qualitative inductive longitudinal case study approach was used to investigate the decision making process. Questionnaires, non-participant observations of nine two-day group meetings, and documents provided data for the analysis. Conventional and summative qualitative content analysis was used to analyse data. The guideline development model was modified ad-hoc as the group encountered three main types of dilemmas: high quality evidence vs. low adoptability of recommendation; insufficient evidence vs. high urgency to act; and incoherence in assessment and prioritization within and between four different lifestyle areas. The formal guideline development model guided the decision-criteria used, but three new or revised criteria were added by the group: 'clinical knowledge and experience', 'potential guideline consequences' and 'needs of vulnerable groups'. The frequency of the use of various criteria in discussions varied over time. 
Gender, professional status, and interpersonal skills were perceived to affect individuals' relative influence on group discussions. The study shows that guideline development groups make compromises between rigour and pragmatism. The formal guideline development model incorporated multiple aspects, but offered few details on how the different criteria should be handled. The guideline development model devoted little attention to the role of the decision-model and group-related factors. Guideline development models could benefit from clarifying the role of the group-related factors and non-research evidence, such as clinical experience and ethical considerations, in decision-processes during guideline development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Still, C.M.
1996-12-01
The primary waste management alternatives are source reduction, recycling, composting, incineration, and landfilling. Often waste management policies are based entirely on technical considerations and ignore that actual disposal practices depend on individuals' attitudes and behaviors. This research formulated a decision analysis model that incorporates social value measures to determine the waste management strategy that maximizes the individuals' willingness to participate. The social values that are important and that were considered in the decision support model to assist with making decisions about solid waste management were convenience, feeling good about reducing waste, feeling good about leaving a good environment for future generations, and the value of recreation programs that can be provided with profit from a recycling program.
NASA Technical Reports Server (NTRS)
Hale, C.; Valentino, G. J.
1982-01-01
Supervisory decision making and control behavior within a C(3) oriented, ground based weapon system is being studied. The program involves empirical investigation of the sequence of control strategies used during engagement of aircraft targets. An engagement is conceptually divided into several stages which include initial information processing activity, tracking, and ongoing adaptive control decisions. Following a brief description of model parameters, two experiments which served as initial investigation into the accuracy of assumptions regarding the importance of situation assessment in procedure selection are outlined. Preliminary analysis of the results upheld the validity of the assumptions regarding strategic information processing and cue-criterion relationship learning. These results indicate that this model structure should be useful in studies of supervisory decision behavior.
Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.
Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L
2013-01-01
Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.
Evidence-based dentistry: a model for clinical practice.
Faggion, Clóvis M; Tu, Yu-Kang
2007-06-01
Making decisions in dentistry should be based on the best evidence available. The objective of this study was to demonstrate a practical procedure and model that clinicians can use to apply the results of well-conducted studies to patient care by critically appraising the evidence with checklists and letter grade scales. To demonstrate application of this model for critically appraising the quality of research evidence, a hypothetical case involving an adult male with chronic periodontitis is used as an example. To determine the best clinical approach for this patient, a four-step, evidence-based model is demonstrated, consisting of the following: definition of a research question using the PICO format, search and selection of relevant literature, critical appraisal of identified research reports using checklists, and the application of evidence. In this model, the quality of research evidence was assessed quantitatively based on different levels of quality that are assigned letter grades of A, B, and C by evaluating the studies against the QUOROM (Quality of Reporting Meta-Analyses) and CONSORT (Consolidated Standards of Reporting Trials) checklists in a tabular format. For this hypothetical periodontics case, application of the model identified the best available evidence for clinical decision making, i.e., one randomized controlled trial and one systematic review of randomized controlled trials. Both studies showed similar answers for the research question. The use of a letter grade scale allowed an objective analysis of the quality of evidence. A checklist-driven model that assesses and applies evidence to dental practice may substantially improve dentists' decision making skill.
IEEE 1982. Proceedings of the international conference on cybernetics and society
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-01-01
The following topics were dealt with: knowledge-based systems; risk analysis; man-machine interactions; human information processing; metaphor, analogy and problem-solving; manual control modelling; transportation systems; simulation; adaptive and learning systems; biocybernetics; cybernetics; mathematical programming; robotics; decision support systems; analysis, design and validation of models; computer vision; systems science; energy systems; environmental modelling and policy; pattern recognition; nuclear warfare; technological forecasting; artificial intelligence; the Turin shroud; optimisation; workloads. Abstracts of individual papers can be found under the relevant classification codes in this or future issues.
NASA Astrophysics Data System (ADS)
Lachaut, T.; Yoon, J.; Klassert, C. J. A.; Talozi, S.; Mustafa, D.; Knox, S.; Selby, P. D.; Haddad, Y.; Gorelick, S.; Tilmant, A.
2016-12-01
Probabilistic approaches to uncertainty in water systems management can face challenges of several types: non-stationary climate, sudden shocks such as conflict-driven migrations, or the internal complexity and dynamics of large systems. There has been a rising trend in the development of bottom-up methods that place focus on the decision side instead of probability distributions and climate scenarios. These approaches are based on defining acceptability thresholds for the decision makers and considering the entire range of possibilities over which such thresholds are crossed. We aim to improve knowledge of the applicability and relevance of this approach by enlarging its scope beyond climate uncertainty and single decision makers, thus including demographic shifts, internal system dynamics, and multiple stakeholders at different scales. This vulnerability analysis is part of the Jordan Water Project and makes use of an ambitious multi-agent model developed by its teams with the extensive cooperation of the Ministry of Water and Irrigation of Jordan. The case of Jordan is a relevant example for migration spikes, rapid social changes, resource depletion and climate change impacts. The multi-agent modeling framework used provides a consistent structure to assess the vulnerability of complex water resources systems with distributed acceptability thresholds and stakeholder interaction. A proof of concept and preliminary results are presented for a non-probabilistic vulnerability analysis that involves different types of stakeholders, uncertainties other than climatic and the integration of threshold-based indicators. For each stakeholder (agent) a vulnerability matrix is constructed over a multi-dimensional domain, which includes various hydrologic and/or demographic variables.
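The threshold-based, bottom-up idea described above can be sketched as a per-stakeholder vulnerability score: the fraction of simulated system states in which that agent's acceptability threshold is crossed. The data structure and numbers here are hypothetical illustrations, not from the Jordan model:

```python
def vulnerability(states, thresholds):
    """Fraction of simulated states in which each agent's acceptability
    threshold is crossed (indicator falls below its minimum).

    states: list of dicts mapping indicator name -> simulated value
    thresholds: per-agent dict of (indicator name, minimum acceptable value)
    """
    out = {}
    for agent, (indicator, minimum) in thresholds.items():
        failures = sum(1 for s in states if s[indicator] < minimum)
        out[agent] = failures / len(states)
    return out
```

Sweeping the simulated states over demographic and hydrologic scenario dimensions, rather than over a single probabilistic climate ensemble, yields the kind of multi-dimensional vulnerability matrix the abstract describes.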
Navigating the Decision Space: Shared Medical Decision Making as Distributed Cognition.
Lippa, Katherine D; Feufel, Markus A; Robinson, F Eric; Shalin, Valerie L
2017-06-01
Despite increasing prominence, little is known about the cognitive processes underlying shared decision making. To investigate these processes, we conceptualize shared decision making as a form of distributed cognition. We introduce a Decision Space Model to identify physical and social influences on decision making. Using field observations and interviews, we demonstrate that patients and physicians in both acute and chronic care consider these influences when identifying the need for a decision, searching for decision parameters, and making actionable decisions. Based on the distribution of access to information and actions, we then identify four related patterns: physician-dominated; physician-defined, patient-made; patient-defined, physician-made; and patient-dominated decisions. Results suggest that (a) decision making is necessarily distributed between physicians and patients, (b) differential access to information and action over time requires participants to transform a distributed task into a shared decision, and (c) adverse outcomes may result from failures to integrate physician and patient reasoning. Our analysis unifies disparate findings in the medical decision-making literature and has implications for improving care and medical training.
Khadam, Ibrahim; Kaluarachchi, Jagath J
2003-07-01
Decision analysis in subsurface contamination management is generally carried out from a traditional engineering economic viewpoint. However, new advances in human health risk assessment, namely probabilistic risk assessment, and the growing awareness of the importance of soft data in the decision-making process require decision analysis methodologies that are capable of accommodating non-technical and politically biased qualitative information. In this work, we discuss the major limitations of the currently practiced decision analysis framework, which revolves around the definition of risk and cost of risk, and its poor ability to communicate risk-related information. A demonstration using a numerical example was conducted to provide insight into these limitations of the current decision analysis framework. The results from this simple ground water contamination and remediation scenario were identical to those obtained from studies carried out on existing Superfund sites, which suggests serious flaws in the current risk management framework. In order to provide a perspective on how these limitations may be avoided in future formulations of the management framework, more mature and well-accepted approaches to decision analysis in dam safety and the utility industry, where public health and public investment are of great concern, are presented and their applicability in subsurface remediation management is discussed. Finally, in light of the success of the application of risk-based decision analysis in dam safety and the utility industry, potential options for decision analysis in subsurface contamination management are discussed.
Socio-climatic Exposure of an Afghan Poppy Farmer
NASA Astrophysics Data System (ADS)
Mankin, J. S.; Diffenbaugh, N. S.
2011-12-01
Many posit that climate impacts from anthropogenic greenhouse gas emissions will have consequences for the natural and agricultural systems on which humans rely for food, energy, and livelihoods, and therefore for stability and human security. However, many of the potential mechanisms of action in climate impacts and human systems response, as well as the differential vulnerabilities of such systems, remain underexplored and unquantified. Here I present two initial steps necessary to characterize and quantify the consequences of climate change for farmer livelihood in Afghanistan, given both climate impacts and farmer vulnerabilities. The first is a conceptual model mapping the potential relationships between Afghanistan's climate, the winter agricultural season, and the country's political economy of violence and instability. The second is a utility-based decision model for assessing farmer response sensitivity to various climate impacts based on crop sensitivities. A farmer's winter planting decision can be modeled roughly as a tradeoff between cultivating the two crops that dominate the winter growing season: opium poppy (a climate-tolerant cash crop) and wheat (a climatically vulnerable crop grown for household consumption). Early sensitivity analysis results suggest that wheat yield dominates variability in farmer decision making; however, such initial results may depend on the relative parameter ranges of wheat and poppy yields. Importantly, though, the variance in Afghanistan's winter harvest yields of poppy and wheat is tightly linked to household livelihood and thus is indirectly connected to the wider instability and insecurity within the country. This initial analysis motivates my focused research on the sensitivity of these crops to climate variability in order to project farmer well-being and decision sensitivity in a warmer world.
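A toy version of the poppy-versus-wheat tradeoff can make the decision structure concrete. All prices, yields, and the subsistence requirement below are invented placeholders, not values from the study; the household must cover a wheat subsistence need, buying any shortfall at a premium.

```python
# Invented placeholder parameters for an illustrative planting decision.
POPPY_CASH = 100.0      # cash income per unit of land in poppy
WHEAT_SALE = 40.0       # income per tonne of surplus wheat sold
WHEAT_BUY = 80.0        # cost per tonne of wheat bought to cover a shortfall
SUBSISTENCE = 1.0       # tonnes of wheat the household must consume

def net_income(poppy_share, wheat_yield):
    """Household income for a land split, given wheat yield (t per unit land)."""
    wheat = (1.0 - poppy_share) * wheat_yield
    cash = poppy_share * POPPY_CASH
    if wheat >= SUBSISTENCE:
        return cash + (wheat - SUBSISTENCE) * WHEAT_SALE
    return cash - (SUBSISTENCE - wheat) * WHEAT_BUY

shares = [i / 10 for i in range(11)]
best_normal = max(shares, key=lambda s: net_income(s, wheat_yield=2.0))
best_drought = max(shares, key=lambda s: net_income(s, wheat_yield=0.8))
print(best_normal, best_drought)
```

Under these placeholder numbers, a poor wheat year pushes the optimal allocation toward all-poppy, mirroring the livelihood tradeoff the abstract describes.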
User-centered design to improve clinical decision support in primary care.
Brunner, Julian; Chuang, Emmeline; Goldzweig, Caroline; Cain, Cindy L; Sugar, Catherine; Yano, Elizabeth M
2017-08-01
A growing literature has demonstrated the ability of user-centered design to make clinical decision support systems more effective and easier to use. However, studies of user-centered design have rarely examined more than a handful of sites at a time, and have frequently neglected the implementation climate and organizational resources that influence clinical decision support. The inclusion of such factors was identified by a systematic review as "the most important improvement that can be made in health IT evaluations." Our objectives were to (1) identify the prevalence of four user-centered design practices at United States Veterans Affairs (VA) primary care clinics and assess the perceived utility of clinical decision support at those clinics; and (2) evaluate the association between those user-centered design practices and the perceived utility of clinical decision support. We analyzed clinic-level survey data collected in 2006-2007 from 170 VA primary care clinics. We examined four user-centered design practices: 1) pilot testing, 2) provider satisfaction assessment, 3) formal usability assessment, and 4) analysis of impact on performance improvement. We used a regression model to evaluate the association between user-centered design practices and the perceived utility of clinical decision support, while accounting for other important factors at those clinics, including implementation climate, available resources, and structural characteristics. We also examined associations separately at community-based clinics and at hospital-based clinics. User-centered design practices for clinical decision support varied across clinics: 74% conducted pilot testing, 62% conducted provider satisfaction assessment, 36% conducted a formal usability assessment, and 79% conducted an analysis of impact on performance improvement. Overall perceived utility of clinical decision support was high, with a mean rating of 4.17 (±.67) out of 5 on a composite measure.
"Analysis of impact on performance improvement" was the only user-centered design practice significantly associated with perceived utility of clinical decision support, b=.47 (p<.001). This association was present in hospital-based clinics, b=.34 (p<.05), but was stronger at community-based clinics, b=.61 (p<.001). Our findings are highly supportive of the practice of analyzing the impact of clinical decision support on performance metrics. This was the most common user-centered design practice in our study, and was the practice associated with higher perceived utility of clinical decision support. This practice may be particularly helpful at community-based clinics, which are typically less connected to VA medical center resources. Published by Elsevier B.V.
Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.
Merrick, Jason R W; Leclerc, Philip
2016-04-01
Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek the rational decision, but we use prospect theory to solve for the attacker's decision, descriptively modeling the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. © 2014 Society for Risk Analysis.
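The attacker-side machinery can be sketched with the standard prospect-theory value and probability-weighting functions. The parameters below are the classic Tversky-Kahneman estimates, and the attack prospect is an invented illustration, not taken from the article's container-screening model.

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """S-shaped value function: risk-averse for gains, loss-averse for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def pt_weight(p, gamma=0.61):
    """Inverse-S probability weighting (likelihood insensitivity)."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Hypothetical attack option: tiny chance of a large gain, likely small loss.
attack = [(0.01, 1000.0), (0.99, -15.0)]  # (probability, payoff) pairs

ev = sum(p * x for p, x in attack)                      # expected value
pv = sum(pt_weight(p) * pt_value(x) for p, x in attack)  # prospect-theory value
print(round(ev, 2), round(pv, 2))
```

With these numbers the expected value is negative but the prospect-theory value is positive: overweighting the 1% chance makes the option attractive to the descriptive attacker, which is exactly the kind of divergence that can shift the defender's optimal screening decision.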
NASA Astrophysics Data System (ADS)
Falinski, K. A.; Oleson, K.; Htun, H.; Kappel, C.; Lecky, J.; Rowe, C.; Selkoe, K.; White, C.
2016-12-01
Faced with anthropogenic stressors and declining coral reef states, managers concerned with restoration and resilience of coral reefs are increasingly recognizing the need to take a ridge-to-reef, ecosystem-based approach. An ecosystem services framing can help managers move towards these goals by illustrating the trade-offs and opportunities of management actions in terms of their impacts on society. We describe a research program building a spatial ecosystem services-based decision-support tool, which is being applied to guide ridge-to-reef management in a NOAA priority site in West Maui. We use multiple modeling methods to link biophysical processes to ecosystem services and their spatial flows and social values in an integrating platform. Modeled services include water availability, sediment retention, nutrient retention and carbon sequestration on land. A coral reef ecosystem service model is under development to capture the linkages between terrestrial and coastal ecosystem services. Valuation studies are underway to quantify the implications for human well-being. The tool integrates techniques from decision science to facilitate decision making. We use the sediment retention model to illustrate the types of analyses the tool can support. The case study explores the trade-offs between road rehabilitation costs and sediment export avoided. We couple the sediment and cost models with trade-off analysis to identify optimal distributed solutions that are most cost-effective in reducing erosion, and then use those models to estimate sediment exposure to coral reefs. We find that cooperation between landowners reveals opportunities for maximizing the benefits of fixing roads while minimizing costs. This research forms the building blocks of an ecosystem service decision support tool that we intend to continue to test and apply in other Pacific Island settings.
LANL Institutional Decision Support By Process Modeling and Analysis Group (AET-2)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booth, Steven Richard
2016-04-04
AET-2 has expertise in process modeling, economics, business case analysis, risk assessment, Lean/Six Sigma tools, and decision analysis to provide timely decision support to LANS leading to continuous improvement. This capability is critical in the current tight budgetary environment as LANS pushes to identify potential areas of cost savings and efficiencies. An important arena is business systems and operations, where processes can impact most or all laboratory employees. Lab-wide efforts are needed to identify and eliminate inefficiencies to accomplish Director McMillan's charge of "doing more with less." LANS faces many critical and potentially expensive choices that require sound decision support to ensure success. AET-2 is available to provide this analysis support to expedite the decisions at hand.
Governance and decision making in complex socio-hydrological systems
NASA Astrophysics Data System (ADS)
Elshorbagy, Amin; Wheater, Howard; Gober, Patricia; Hassanzadeh, Elmira
2017-04-01
The transboundary Saskatchewan River, originating in the Canadian Rockies in Alberta, flows through Saskatchewan and Manitoba and discharges its water into Lake Winnipeg. It supports irrigated agriculture, hydropower generation, flood protection, municipal water supplies, mining, recreation, and environmental services across a large area and in multiple administrative jurisdictions. Managing the region's water-based economic activities and environmental services requires decisions at a variety of scales to incorporate competing values and priorities about water use. Current inter-provincial allocations are based on the 1969 Master Agreement of Water Apportionment, whereby upstream Alberta must release one-half of the annual natural flows of the Saskatchewan River to Saskatchewan, which in turn must pass one-half of the residual natural flow to the Province of Manitoba. This analysis uses a hydro-economic simulation model, SWAMP, to examine risk-based tradeoffs in Saskatchewan for various types of water use, including agriculture, energy, and flood protection, under various scenarios of water availability. The eco-hydrological effects of the scenarios on the largest inland delta in North America, the Saskatchewan River Delta, are also shown. Results enable decision makers to weigh the costs and benefits of implementing particular sector-based future development strategies. Assuming net provincial benefit as a single monetary indicator of economic value, the effects of various scenarios of environmental and policy changes are quantified. Results show that improving irrigation technology and expanding irrigated lands in Alberta will positively affect the province's economic development and have compound effects downstream on hydropower generation, environmental flows and the economies of Saskatchewan and Manitoba.
The implementation of similar policies in Saskatchewan will have different downstream impacts because of the large hydro-power capacity downstream in Manitoba. The model highlights the spatial tradeoffs across the three provinces and sectoral trade-offs among the differing water uses. These trade-offs represent challenging dilemmas for water management decisions in a complex system. The study reveals the need for a holistic framework of water resources analysis that can dynamically capture the feedback loops among hydrological, social, and administrative/political analysis units to support public discussion of critical water tradeoffs and a consensual water value framework to guide future development decisions.
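The apportionment rule in the 1969 Master Agreement can be written down directly. The sketch below is a deliberate simplification (real apportionment accounting reconstructs natural flows and includes further adjustments), and the flow volumes are invented.

```python
def apportion(natural_flow_ab, natural_flow_sk_gain=0.0):
    """Split Saskatchewan River flows under the 1969 Master Agreement
    of Apportionment (simplified sketch, not the full accounting).

    Alberta must pass half of the annual natural flow to Saskatchewan;
    Saskatchewan must pass half of the residual natural flow to Manitoba.
    """
    to_saskatchewan = 0.5 * natural_flow_ab
    residual = to_saskatchewan + natural_flow_sk_gain  # flow available in SK
    to_manitoba = 0.5 * residual
    sk_entitlement = residual - to_manitoba
    ab_entitlement = natural_flow_ab - to_saskatchewan
    return ab_entitlement, sk_entitlement, to_manitoba

# Hypothetical annual natural flow volumes (cubic km):
ab, sk, mb = apportion(9.0, natural_flow_sk_gain=1.0)
print(ab, sk, mb)
```

The rule makes clear why upstream irrigation expansion in Alberta propagates downstream: any reduction in Alberta's natural flow halves its delivery to Saskatchewan, which in turn halves what reaches Manitoba's hydropower system.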
Structured decision making for managing pneumonia epizootics in bighorn sheep
Sells, Sarah N.; Mitchell, Michael S.; Edwards, Victoria L.; Gude, Justin A.; Anderson, Neil J.
2016-01-01
Good decision-making is essential to conserving wildlife populations. Although there may be multiple ways to address a problem, perfect solutions rarely exist. Managers are therefore tasked with identifying decisions that will best achieve desired outcomes. Structured decision making (SDM) is a method of decision analysis used to identify the most effective, efficient, and realistic decisions while accounting for values and priorities of the decision maker. The stepwise process includes identifying the management problem, defining objectives for solving the problem, developing alternative approaches to achieve the objectives, and formally evaluating which alternative is most likely to accomplish the objectives. The SDM process can be more effective than informal decision-making because it provides a transparent way to quantitatively evaluate decisions for addressing multiple management objectives while incorporating science, uncertainty, and risk tolerance. To illustrate the application of this process to a management need, we present an SDM-based decision tool developed to identify optimal decisions for proactively managing risk of pneumonia epizootics in bighorn sheep (Ovis canadensis) in Montana. Pneumonia epizootics are a major challenge for managers due to long-term impacts to herds, epistemic uncertainty in timing and location of future epizootics, and consequent difficulty knowing how or when to manage risk. The decision tool facilitates analysis of alternative decisions for how to manage herds based on predictions from a risk model, herd-specific objectives, and predicted costs and benefits of each alternative. Decision analyses for 2 example herds revealed that meeting management objectives necessitates specific approaches unique to each herd. The analyses showed how and under what circumstances the alternatives are optimal compared to other approaches and current management. 
Managers can be confident that these decisions are effective, efficient, and realistic because they explicitly account for important considerations managers implicitly weigh when making decisions, including competing management objectives, uncertainty in potential outcomes, and risk tolerance.
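The evaluation step of SDM (scoring each alternative against weighted management objectives and selecting the best) can be sketched in a few lines. The objectives, weights, alternatives, and scores below are invented for illustration, not taken from the Montana bighorn analysis.

```python
# Decision maker's objective weights (must sum to 1); invented values.
weights = {"epizootic_risk_reduction": 0.5,
           "herd_size": 0.3,
           "cost": 0.2}

# Hypothetical alternatives, scored 0-1 on each objective (1 = best).
alternatives = {
    "status_quo":      {"epizootic_risk_reduction": 0.2, "herd_size": 0.9, "cost": 1.0},
    "reduce_density":  {"epizootic_risk_reduction": 0.7, "herd_size": 0.5, "cost": 0.6},
    "test_and_remove": {"epizootic_risk_reduction": 0.9, "herd_size": 0.6, "cost": 0.3},
}

def total_score(scores):
    """Weighted additive value of one alternative."""
    return sum(weights[obj] * s for obj, s in scores.items())

best = max(alternatives, key=lambda a: total_score(alternatives[a]))
print(best)
```

Changing the weight vector per herd is what makes the optimal alternative herd-specific, as the decision analyses for the two example herds showed.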
Kansal-Kalra, Suleena; Milad, Magdy P; Grobman, William A
2005-09-01
To compare the economic consequences of proceeding directly to IVF with those of proceeding with gonadotropins followed by IVF in patients <35 years of age with unexplained infertility. A decision-tree model. The model incorporated the cost and success of each infertility regimen as well as the pregnancy-associated costs of singleton or multiple gestations and the risk and cost of cerebral palsy. Cost per live birth. Both treatment arms resulted in a >80% chance of birth. The gonadotropin arm was over four times more likely to result in a high-order multiple pregnancy (HOMP). Despite this, when the base case estimates were utilized, immediate IVF emerged as more costly per live birth. In sensitivity analysis, immediate IVF became less costly per live birth when IVF was more likely to achieve birth (55.1%) or less expensive ($11,432) than our base case assumptions. After considering the risk and cost of HOMP, immediate IVF is more costly per live birth than a trial of gonadotropins prior to IVF.
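The outcome measure reduces to expected total cost divided by the probability of a live birth. A back-of-envelope sketch with invented probabilities and costs (not the study's values) shows the shape of the comparison:

```python
# All numbers below are illustrative placeholders, not study estimates.

def cost_per_live_birth(p_birth, cost_per_cycle, n_cycles, extra_costs):
    """Expected total treatment cost divided by probability of a live birth."""
    expected_cost = cost_per_cycle * n_cycles + extra_costs
    return expected_cost / p_birth

# Immediate IVF: higher per-cycle cost, lower multiple-gestation costs.
ivf = cost_per_live_birth(p_birth=0.82, cost_per_cycle=12000.0,
                          n_cycles=2, extra_costs=3000.0)
# Gonadotropins first: cheaper cycles, but higher expected costs from
# high-order multiple pregnancies (HOMP).
gonadotropin = cost_per_live_birth(p_birth=0.85, cost_per_cycle=2500.0,
                                   n_cycles=3, extra_costs=9000.0)
print(round(ivf), round(gonadotropin))
```

Even with HOMP-related costs loaded onto the gonadotropin arm, the much higher per-cycle IVF cost dominates, which is the base-case pattern the abstract reports; sensitivity analysis amounts to sweeping `p_birth` and `cost_per_cycle` until the ranking flips.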
Opportunities and pitfalls in clinical proof-of-concept: principles and examples.
Chen, Chao
2018-04-01
Clinical proof-of-concept trials crucially inform major resource deployment decisions. This paper discusses several mechanisms for enhancing their rigour and efficiency. The importance of careful consideration when using a surrogate endpoint is illustrated; situational effectiveness of run-in patient enrichment is explored; a versatile tool is introduced to ensure a strong pharmacological underpinning; the benefits of dose-titration are revealed by simulation; and the importance of adequately scheduled observations is shown. The general process of model-based trial design and analysis is described and several examples demonstrate the value in historical data, simulation-guided design, model-based analysis and trial adaptation informed by interim analysis. Copyright © 2018 Elsevier Ltd. All rights reserved.
Logit Estimation of a Gravity Model of the College Enrollment Decision.
ERIC Educational Resources Information Center
Leppel, Karen
1993-01-01
A study investigated the factors influencing students' decisions about attending a college to which they had been admitted. Logit analysis confirmed gravity model predictions that geographic distance and student ability would most influence the enrollment decision and found other variables, although affecting earlier stages of decision making, did…
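A logit gravity model of the enrollment decision can be sketched as a logistic probability that falls with distance and rises with ability. The functional form is standard; the coefficients and inputs below are invented, not estimated values from the study.

```python
import math

def enroll_probability(distance_km, ability_score,
                       b0=0.5, b_dist=-0.004, b_ability=0.02):
    """Logit probability of enrolling, given admission (invented coefficients)."""
    z = b0 + b_dist * distance_km + b_ability * ability_score
    return 1.0 / (1.0 + math.exp(-z))

near = enroll_probability(distance_km=50, ability_score=60)
far = enroll_probability(distance_km=800, ability_score=60)
print(round(near, 3), round(far, 3))
```

The negative distance coefficient is the "gravity" element: holding ability fixed, a more distant admitting college sees a lower enrollment probability.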
A Critical Analysis of HRD Evaluation Models from a Decision-Making Perspective
ERIC Educational Resources Information Center
Holton, Elwood F., III; Naquin, Sharon
2005-01-01
HRD evaluation models are recommended for use by organizations to improve decisions made about HRD interventions. However, the organizational decision-making literature has been virtually ignored by evaluation researchers. In this article, we review the organizational decision-making literature and critically review HRD evaluation research through…
A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.
Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet
2018-01-01
Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from a benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.
Jin, S W; Li, Y P; Nie, S
2018-05-15
In this study, an interval chance-constrained bi-level programming (ICBP) method is developed for air quality management of a municipal energy system under uncertainty. ICBP can deal with uncertainties presented as interval values and probability distributions as well as examine the risk of violating constraints. Besides, a leader-follower decision strategy is incorporated into the optimization process, where two decision makers with different goals and preferences are involved. To solve the proposed model, a bi-level interactive algorithm based on satisfactory degree is introduced into the decision-making processes. Then, an ICBP-based energy and environmental systems (ICBP-EES) model is formulated for Beijing, in which the air quality index (AQI) is used for evaluating the integrated air quality of multiple pollutants. Result analysis can help different stakeholders adjust their tolerances to achieve overall satisfaction of EES planning for the study city. Results reveal that natural gas is the main source for electricity generation and heating, which could lead to a potential increase in imported energy for Beijing in the future. Results also disclose that PM10 is the major contributor to AQI. These findings can help decision makers to identify desired alternatives for EES planning and provide useful information for regional air quality management under uncertainty. Copyright © 2018 Elsevier B.V. All rights reserved.
Solutions to pervasive environmental problems often are not amenable to a straightforward application of science-based actions. These problems encompass large-scale environmental policy questions where environmental concerns, economic constraints, and societal values conflict ca...
A Multiple Streams analysis of the decisions to fund gender-neutral HPV vaccination in Canada.
Shapiro, Gilla K; Guichon, Juliet; Prue, Gillian; Perez, Samara; Rosberger, Zeev
2017-07-01
In Canada, the human papillomavirus (HPV) vaccine is licensed and recommended for females and males. Although all Canadian jurisdictions fund school-based HPV vaccine programs for girls, only six jurisdictions fund school-based HPV vaccination for boys. The research aimed to analyze the factors that underpin government decisions to fund HPV vaccine for boys using a theoretical policy model, Kingdon's Multiple Streams framework. This approach assesses policy development by examining three concurrent, but independent, streams that guide analysis: Problem Stream, Policy Stream, and Politics Stream. Analysis from the Problem Stream highlights that males are affected by HPV-related diseases and are involved in transmitting HPV infection to their sexual partners. Policy Stream analysis makes clear that while the inclusion of males in HPV vaccine programs is suitable, equitable, and acceptable; there is debate regarding cost-effectiveness. Politics Stream analysis identifies the perspectives of six different stakeholder groups and highlights the contribution of government officials at the provincial and territorial level. Kingdon's Multiple Streams framework helps clarify the opportunities and barriers for HPV vaccine policy change. This analysis identified that the interpretation of cost-effectiveness models and advocacy of stakeholders such as citizen-advocates and HPV-affected politicians have been particularly important in galvanizing policy change. Copyright © 2017 Elsevier Inc. All rights reserved.
A control-theory model for human decision-making
NASA Technical Reports Server (NTRS)
Levison, W. H.; Tanner, R. B.
1971-01-01
A model for human decision making is developed as an adaptation of an optimal control model for pilot/vehicle systems. The models for decision and control both contain concepts of time delay, observation noise, optimal prediction, and optimal estimation. The decision-making model is intended for situations in which the human bases his decision on his estimate of the state of a linear plant. Experiments are described for the following task situations: (a) single decision tasks, (b) two-decision tasks, and (c) simultaneous manual control and decision making. Using fixed values for model parameters, single-task and two-task decision performance can be predicted to within an accuracy of 10 percent. Agreement is poorer for the simultaneous decision and control situation.
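The estimator-plus-decision structure can be illustrated with a scalar stand-in: a linear plant observed through noise, a Kalman filter playing the role of the optimal estimator, and a binary decision taken whenever the estimate crosses a threshold. All dynamics, noise levels, and the threshold below are invented; the original model also includes time delay and prediction, which this sketch omits.

```python
import random

random.seed(1)
a, q, r = 0.95, 0.01, 0.25   # plant dynamics, process and observation noise
drift = 0.05                 # constant input driving the plant upward
threshold = 1.0              # decide "yes" when the estimated state exceeds this

x, x_hat, p = 0.0, 0.0, 1.0  # true state, estimate, estimate variance
decisions = []
for t in range(100):
    x = a * x + drift + random.gauss(0, q ** 0.5)   # plant evolves
    y = x + random.gauss(0, r ** 0.5)               # noisy observation
    # Scalar Kalman predict/update (the "optimal estimation" element).
    x_pred = a * x_hat + drift
    p_pred = a * a * p + q
    k = p_pred / (p_pred + r)
    x_hat = x_pred + k * (y - x_pred)
    p = (1 - k) * p_pred
    # Decision is based on the estimate, not the raw observation.
    decisions.append(x_hat > threshold)

print(sum(decisions))
```

Because the decision is driven by the filtered estimate rather than the noisy observation, decision timing and error rates inherit the estimator's delay and variance, which is the mechanism the model uses to predict human decision performance.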
2013-10-21
depend on the quality of allocating resources. This work uses a reliability model of system and environmental covariates incorporating information at...state space. Further, the use of condition variables allows for the direct modeling of maintenance impact with the assumption that a nominal value ... value ), the model in the application of aviation maintenance can provide a useful estimation of reliability at multiple levels. Adjusted survival
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A
2016-03-05
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
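A heavily simplified toy of the pipeline's shape: four EEG channels are packed into a quaternion sequence (w, x, y, z), small norm-based features are extracted, and a 1-nearest-neighbour rule separates two synthetic "mental states". The data are synthetic and the real QSA feature set and classifiers (DT, SVM, KNN) are considerably richer than this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def quaternion_features(channels):
    """channels: array of shape (4, n_samples) -> small feature vector.

    Treat each time step as a quaternion and summarise the |q| sequence.
    """
    norms = np.sqrt((channels ** 2).sum(axis=0))   # |q| per time step
    return np.array([norms.mean(), norms.std(), np.abs(np.diff(norms)).mean()])

def make_trial(amplitude):
    """Synthetic 4-channel trial whose scale depends on the mental state."""
    return amplitude * rng.standard_normal((4, 128))

# Two invented classes: "left" (low amplitude) and "right" (high amplitude).
train = [(quaternion_features(make_trial(1.0)), "left") for _ in range(20)] + \
        [(quaternion_features(make_trial(3.0)), "right") for _ in range(20)]

def predict(trial):
    """1-nearest-neighbour classification in feature space."""
    f = quaternion_features(trial)
    return min(train, key=lambda fl: np.linalg.norm(fl[0] - f))[1]

print(predict(make_trial(3.0)))
```

The point of the quaternion packaging is that all four channels enter one algebraic object per time step, so features computed on the quaternion sequence capture joint channel behaviour rather than per-channel statistics.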
NASA Astrophysics Data System (ADS)
Tisza, Kata
Photovoltaic (PV) development shows significantly smaller growth in the Southeast U.S. than in the Southwest, mainly due to the low cost of fossil-fuel based energy production in the region and the lack of solar incentives. However, the Southeast has appropriate insolation conditions (4.0-6.0 kWh/m2/day) for photovoltaic deployment, and in the past decade the region has experienced the highest population growth in the entire country. These factors, combined with new renewable energy portfolio policies, could create an opportunity for PV to provide some of the energy that will be required to sustain this growth. The goal of the study was to investigate the potential for PV generation in the Southeast region by identifying suitable areas for utility-scale solar power plant deployment. Four states with currently low solar penetration were studied: Georgia, North Carolina, South Carolina and Tennessee. Feasible areas were assessed with Geographic Information Systems (GIS) software using solar, land use and population growth criteria combined with proximity to transmission lines and roads. After the GIS-based assessment of the areas, technological potential was calculated for each state. A multi-criteria decision analysis (MCDA) model was used to simulate the decision-making method for a strategic PV installation. The model accounted for all criteria necessary to consider in the case of a PV development and also included economic and policy criteria, which are thought to strongly influence the PV market. Three different scenarios were established, representing decision makers' theoretical preferences. Map layers created in the first part were used as the basis for the MCDA, and additional technical, economic and political/market criteria were added. A sensitivity analysis was conducted to test the model's robustness. Finally, weighted criteria were assigned to the GIS map layers, so that the different preference systems could be visualized.
As a result, lands suitable for potential industrial-scale PV deployment were assessed. Moreover, a precise calculation of technical potential was conducted, with a capacity factor determined by the actual insolation of each specific feasible area. The results of the study showed that significant feasible area is available for utility-scale PV deployment, with good electricity generation potential. Moreover, a stable MCDA model was established for supporting strategic decision making in PV deployment. Finally, changes in suitable lands for utility-scale PV installations were visualized in GIS for the state of Tennessee.
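The weighted-overlay step at the heart of a GIS-based MCDA can be sketched directly: each layer is a normalised raster of criterion scores, and each scenario is a weight vector over criteria. The 3x3 layers, criterion names, and scenario weights below are invented toys, not the study's data.

```python
import numpy as np

# Invented normalised criterion rasters (1.0 = most favourable cell).
layers = {
    "insolation":  np.array([[0.9, 0.8, 0.7], [0.6, 0.7, 0.8], [0.5, 0.6, 0.9]]),
    "grid_access": np.array([[0.2, 0.9, 0.4], [0.8, 0.9, 0.1], [0.3, 0.5, 0.7]]),
    "land_cost":   np.array([[0.5, 0.5, 0.9], [0.9, 0.2, 0.6], [0.8, 0.7, 0.3]]),
}

def suitability(weights):
    """Weighted sum of criterion layers (weights should sum to 1)."""
    return sum(w * layers[name] for name, w in weights.items())

# Two of the scenario weightings a decision maker might prefer:
solar_first = suitability({"insolation": 0.6, "grid_access": 0.2, "land_cost": 0.2})
cost_first  = suitability({"insolation": 0.2, "grid_access": 0.2, "land_cost": 0.6})

# Sensitivity check: does the top-ranked cell move when the scenario changes?
print(np.unravel_index(solar_first.argmax(), solar_first.shape),
      np.unravel_index(cost_first.argmax(), cost_first.shape))
```

Here the top-ranked cell shifts between scenarios, which is exactly the kind of weight sensitivity the study's robustness analysis probes.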
Markov chain decision model for urinary incontinence procedures.
Kumar, Sameer; Ghildayal, Nidhi; Ghildayal, Neha
2017-03-13
Purpose Urinary incontinence (UI) is a common chronic health condition, a problem specifically among elderly women, that impacts quality of life negatively. However, UI is usually viewed as a likely result of old age, and as such is generally not evaluated or even managed appropriately. Many treatments are available to manage incontinence, such as bladder training and numerous surgical procedures such as Burch colposuspension and Sling for UI, which have high success rates. The purpose of this paper is to analyze which of these popular surgical procedures for UI is more effective. Design/methodology/approach This research employs randomized, prospective studies to obtain robust cost and utility data used in the Markov chain decision model for examining which of these surgical interventions is more effective in treating women with stress UI, based on two measures: number of quality adjusted life years (QALY) and cost per QALY. TreeAge Pro Healthcare software was employed for the Markov decision analysis. Findings Results showed the Sling procedure is a more effective surgical intervention than the Burch. However, if a utility greater than a certain value, at which both procedures are equally effective, is assigned to persistent incontinence, the Burch procedure is more effective than the Sling procedure. Originality/value This paper demonstrates the efficacy of a Markov chain decision modeling approach to study the comparative effectiveness of available treatments for patients with UI, an important public health issue, widely prevalent among elderly women in developed and developing countries. This research also improves upon other analyses using a Markov chain decision modeling process to analyze various strategies for treating UI.
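A minimal Markov cohort sketch shows how QALYs and cost per QALY fall out of such a model. The states, transition probabilities, utilities, and costs below are invented placeholders, not the study's estimates, and real analyses would add discounting and richer transitions.

```python
def markov_cohort(p_success, cost_procedure, years=10,
                  u_continent=0.9, u_incontinent=0.7,
                  annual_ui_cost=500.0, p_die=0.02):
    """Return (total QALYs, total cost) for one surgical strategy.

    States: continent, incontinent (persistent UI), dead. Invented numbers.
    """
    probs = {"continent": p_success, "incontinent": 1 - p_success, "dead": 0.0}
    qalys, cost = 0.0, cost_procedure
    for _ in range(years):
        qalys += probs["continent"] * u_continent + probs["incontinent"] * u_incontinent
        cost += probs["incontinent"] * annual_ui_cost
        # Simple annual mortality, identical in both living states.
        for state in ("continent", "incontinent"):
            died = probs[state] * p_die
            probs[state] -= died
            probs["dead"] += died
    return qalys, cost

sling = markov_cohort(p_success=0.85, cost_procedure=12000.0)
burch = markov_cohort(p_success=0.75, cost_procedure=11000.0)
for name, (q, c) in (("sling", sling), ("burch", burch)):
    print(name, round(q, 2), round(c / q, 2))
```

The crossover the abstract describes corresponds to raising `u_incontinent`: as the utility of persistent incontinence approaches that of continence, the higher-success procedure loses its QALY advantage.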
Fews-Risk: A step towards risk-based flood forecasting
NASA Astrophysics Data System (ADS)
Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline
2015-04-01
Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values, decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to purely hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood-prone areas, and assets at risk in the protected areas is rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two-dimensional flood spreading simulation, and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as: • Location, timing and probability of failure of defined sections of the flood defence line; • Flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow; • Impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damages to critical infrastructure or economy. In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as: • Where should I reinforce my flood defence system? • What type of action can I take to mend a weak spot in my flood defences? • What are the consequences of a breach? • Which areas should I evacuate first?
This presentation outlines the additional required workflows towards risk-based flood forecasting systems. In a cooperation between HR Wallingford and Deltares, the extended workflows are being integrated into the Delft-FEWS software system. Delft-FEWS provides modules for managing the data handling and forecasting process. Results of a pilot study that demonstrates the new tools are presented. The value of the newly generated information for decision support during a flood event is discussed.
Bayesian Decision Support for Adaptive Lung Treatments
NASA Astrophysics Data System (ADS)
McShan, Daniel; Luo, Yi; Schipper, Matt; TenHaken, Randall
2014-03-01
Purpose: A Bayesian Decision Network will be demonstrated to provide clinical decision support for adaptive lung response-driven treatment management based on evidence that physiologic metrics may correlate better with individual patient response than traditional (population-based) dose- and volume-based metrics. Further, there is evidence that information obtained during the course of radiation therapy may further improve response predictions. Methods: Clinical factors were gathered for 58 patients, including planned mean lung dose and the biomarkers IL-8 and TGF-β1 obtained prior to treatment and two weeks into treatment, along with complication outcomes for these patients. A Bayesian Decision Network was constructed using Netica 5.0.2 from Norsys linking these clinical factors to obtain a prediction of radiation-induced lung disease (RILD) complication. A decision node was added to the network to provide a plan adaptation recommendation based on the trade-off between the RILD prediction and the complexity of replanning. A utility node provides the weighting cost between the competing factors. Results: The decision node predictions were optimized against the data for the 58 cases. With this decision network solution, one can consider the decision result for a new patient with specific findings to obtain a recommendation to adaptively modify the originally planned treatment course. Conclusions: A Bayesian approach allows handling and propagating probabilistic data in a logical and principled manner. Decision networks provide the further ability to provide utility-based trade-offs, reflecting non-medical but practical cost/benefit analysis. The network demonstrated illustrates the basic concept, but many other factors may affect these decisions, and better models are being designed and tested. Acknowledgement: Supported by NIH-P01-CA59827
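The update-then-decide logic of such a network can be illustrated with a toy two-node version: Bayes' rule turns a mid-treatment biomarker finding into a posterior complication probability, and a decision node picks the action with higher expected utility. The prior, likelihoods, utilities, and the assumed risk reduction from replanning are all invented for illustration, not taken from the study.

```python
# Toy sketch of a Bayesian decision node for adaptive treatment.
# All probabilities and utilities below are illustrative placeholders.

def posterior_rild(prior, p_marker_given_rild, p_marker_given_no_rild):
    """Bayes' rule for a single binary biomarker finding."""
    num = prior * p_marker_given_rild
    den = num + (1 - prior) * p_marker_given_no_rild
    return num / den

def choose_action(p_rild, u_complication=-100.0, cost_replan=-10.0,
                  risk_reduction=0.5):
    """Expected-utility choice: continue the plan, or adapt (replan)."""
    eu_continue = p_rild * u_complication
    # assumption: replanning halves the complication risk at a fixed cost
    eu_adapt = cost_replan + p_rild * risk_reduction * u_complication
    return "adapt" if eu_adapt > eu_continue else "continue"

# elevated biomarker roughly doubles a 15% prior risk here
p = posterior_rild(prior=0.15, p_marker_given_rild=0.7, p_marker_given_no_rild=0.2)
```

With these placeholder numbers the posterior rises to about 0.38 and the decision node recommends adapting; at a low posterior the replanning cost dominates and the recommendation flips.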
Pitcher, Brandon; Alaqla, Ali; Noujeim, Marcel; Wealleans, James A; Kotsakis, Georgios; Chrepa, Vanessa
2017-03-01
Cone-beam computed tomographic (CBCT) analysis allows for 3-dimensional assessment of periradicular lesions and may facilitate preoperative periapical cyst screening. The purpose of this study was to develop and assess the predictive validity of a cyst screening method based on CBCT volumetric analysis alone or combined with designated radiologic criteria. Three independent examiners evaluated 118 presurgical CBCT scans from cases that underwent apicoectomies and had an accompanying gold standard histopathological diagnosis of either a cyst or granuloma. Lesion volume, density, and specific radiologic characteristics were assessed using specialized software. Logistic regression models with histopathological diagnosis as the dependent variable were constructed for cyst prediction, and receiver operating characteristic curves were used to assess the predictive validity of the models. A conditional inference binary decision tree based on a recursive partitioning algorithm was constructed to facilitate preoperative screening. Interobserver agreement was excellent for volume and density, but it varied from poor to good for the radiologic criteria. Volume and root displacement were strong predictors for cyst screening in all analyses. The binary decision tree classifier determined that if the volume of the lesion was >247 mm³, there was 80% probability of a cyst. If volume was <247 mm³ and root displacement was present, cyst probability was 60% (78% accuracy). The good accuracy and high specificity of the decision tree classifier renders it a useful preoperative cyst screening tool that can aid in clinical decision making but not a substitute for definitive histopathological diagnosis after biopsy. Confirmatory studies are required to validate the present findings. Published by Elsevier Inc.
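The reported tree reduces to two checks, which can be written down directly. The threshold and the two probabilities come from the abstract; the remaining branch is not quantified there, so it returns None rather than a guessed value.

```python
# The abstract's decision tree as a two-check rule:
#   volume > 247 mm³            -> ~80% cyst probability
#   smaller + root displacement -> ~60% cyst probability
# The last branch's probability is not reported, so None is returned.

def cyst_probability(volume_mm3, root_displacement):
    if volume_mm3 > 247:
        return 0.80
    if root_displacement:
        return 0.60
    return None  # probability for this branch is not given in the abstract
```

As the abstract notes, such a rule is a screening aid, not a substitute for histopathological diagnosis.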
Maurer, Max; Lienert, Judit
2017-01-01
We compare the use of multi-criteria decision analysis (MCDA), or more precisely models used in multi-attribute value theory (MAVT), to integrated assessment (IA) models for supporting long-term water supply planning in a small town case study in Switzerland. They are used to evaluate thirteen system-scale water supply alternatives in four future scenarios regarding forty-four objectives, covering technical, social, environmental, and economic aspects. The alternatives encompass both conventional and unconventional solutions and differ regarding technical, spatial, and organizational characteristics. This paper focuses on the impact assessment and final evaluation steps of the structured MCDA decision support process. We analyze the performance of the alternatives for ten stakeholders. We demonstrate the implications of model assumptions by comparing two IA and three MAVT evaluation model layouts of different complexity. For this comparison, we focus on the validity (ranking stability), desirability (value), and distinguishability (value range) of the alternatives given the five model layouts. These layouts exclude or include stakeholder preferences and uncertainties. Even though all five led us to identify the same best alternatives, they did not produce identical rankings. We found that the MAVT-type models provide higher distinguishability and a more robust basis for discussion than the IA-type models. The needed complexity of the model, however, should be determined based on the intended use of the model within the decision support process. The best-performing alternatives had consistently strong performance for all stakeholders and future scenarios, whereas the current water supply system was outperformed in all evaluation layouts. The best-performing alternatives comprise proactive pipe rehabilitation, adapted firefighting provisions, and decentralized water storage and/or treatment.
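At its core, a MAVT evaluation of this kind scores each alternative with an additive value model: a weighted sum of single-attribute values. A minimal sketch, with made-up weights and scores rather than case-study data:

```python
# Additive multi-attribute value model (MAVT sketch).
# Weights and single-attribute scores are invented for illustration.

def mavt_score(weights, values):
    """Weighted sum of normalized (0..1) single-attribute values."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[k] * values[k] for k in weights)

weights = {"technical": 0.3, "social": 0.2, "environmental": 0.2, "economic": 0.3}
alt_a = {"technical": 0.9, "social": 0.6, "environmental": 0.7, "economic": 0.5}
alt_b = {"technical": 0.6, "social": 0.8, "environmental": 0.8, "economic": 0.7}
```

Stakeholder-specific weights slot in directly, which is how such a model can be re-run per stakeholder as the paper does.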
We present recommendations for possible ways of improving water supply planning in the case study and beyond. PMID:28481881
Case based reasoning in criminal intelligence using forensic case data.
Ribaux, O; Margot, P
2003-01-01
A model that is based on the knowledge of experienced investigators in the analysis of serial crime is suggested to bridge a gap between technology and methodology. Its purpose is to provide a solid methodology for the analysis of serial crimes that supports decision making in the deployment of resources, either by guiding proactive policing operations or helping the investigative process. Formalisation has helped to derive a computerised system that efficiently supports the reasoning processes in the analysis of serial crime. This novel approach fully integrates forensic science data.
Guikema, Seth
2012-07-01
Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.
Seror, Valerie
2008-05-01
Choices regarding prenatal diagnosis of Down syndrome - the most frequent chromosomal defect - are particularly relevant to decision analysis, since women's decisions are based on the assessment of their risk of carrying a child with Down syndrome, and involve tradeoffs (giving birth to an affected child vs procedure-related miscarriage). The aim of this study, based on face-to-face interviews with 78 women aged 25-35 with prior experience of pregnancy, was to compare the women's expressed choices towards prenatal diagnosis with those derived from theoretical models of choice (expected utility theory, rank-dependent theory, and cumulative prospect theory). The main finding obtained in this study was that the cumulative prospect model fitted the observed choices best: both subjective transformation of probabilities and loss aversion, which are basic features of the cumulative prospect model, have to be taken into account to make the observed choices consistent with the theoretical ones.
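The two ingredients the abstract credits for the fit, subjective probability weighting and loss aversion, have standard functional forms that can be sketched briefly. The parameter values below are the commonly cited 1992 Tversky-Kahneman estimates, used here purely as illustration, not the values fitted in this study.

```python
# Cumulative-prospect-theory building blocks (illustrative parameters).

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small risks are overweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def value(x, alpha=0.88, lam=2.25):
    """Value function: concave for gains, steeper (loss-averse) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha
```

Overweighting a small miscarriage risk while treating the loss of an affected-child outcome asymmetrically is exactly the combination that made observed and theoretical choices consistent.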
Wolf, Lisa
2013-02-01
To explore the relationship between multiple variables within a model of critical thinking and moral reasoning. A quantitative descriptive correlational design using a purposive sample of 200 emergency nurses. Measured variables were accuracy in clinical decision-making, moral reasoning, perceived care environment, and demographics. Analysis was by bivariate correlation using Pearson's product-moment correlation coefficients, chi-square, and multiple linear regression analysis. The elements as identified in the integrated ethically-driven environmental model of clinical decision-making (IEDEM-CD) correctly depict moral reasoning and environment of care as factors significantly affecting accuracy in decision-making. The integrated ethically-driven environmental model of clinical decision making is a framework useful for predicting clinical decision-making accuracy for emergency nurses in practice, with further implications in education, research, and policy. The model offers a diagnostic and therapeutic framework for identifying and remediating individual and environmental challenges to accurate clinical decision making. © 2012, The Author. International Journal of Nursing Knowledge © 2012, NANDA International.
Democratic parenting: paradoxical messages in democratic parent education theories
NASA Astrophysics Data System (ADS)
Oryan, Shlomit; Gastil, John
2013-06-01
Some prominent parent education theories in the United States and other Western countries base their educational viewpoint explicitly on democratic values, such as mutual respect, equality and personal freedom. These democratic parenting theories advocate sharing power with children and including them in family decision making. This study presents a textual analysis of two such theories, the Adlerian model of parent education and the Parent Effectiveness Training (PET) model, as they are embodied in two original bestselling textbooks. Through content and argumentation analysis of these influential texts, this study examines the paradoxes inherent in these two theories when they articulate how to implement fully democratic principles within the parent-child relationship. We discover that in spite of their democratic rationale, both books offer communication practices that guide the child to modify misbehaviour, enforce parental power, and manipulate the child to make decisions that follow parental judgment, and thus do not endorse the use of a truly democratic parenting style. We suggest, as an alternative to the democratic parenting style, that parents be introduced to a guardianship management style, in which they do not share authority with children, but seek opportunities for enabling children to make more autonomous decisions and participate in more family decision making.
Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A
2011-11-29
Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective to identify the most affected areas and their spatial impact.
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
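The density-map idea behind such a geoprocessing service can be sketched as simple grid binning of geocoded case points. The coordinates below are synthetic; a production service would work on projected residential addresses and typically apply kernel smoothing rather than raw counts.

```python
# Grid-binned density sketch of geocoded cases (synthetic coordinates).

def density_grid(points, cell_size, nx, ny):
    """Count points per grid cell; returns ny rows of nx cells."""
    grid = [[0] * nx for _ in range(ny)]
    for x, y in points:
        i, j = int(x // cell_size), int(y // cell_size)
        if 0 <= i < nx and 0 <= j < ny:
            grid[j][i] += 1
    return grid

cases = [(0.5, 0.5), (0.6, 0.4), (2.5, 2.5)]  # hypothetical case locations
grid = density_grid(cases, cell_size=1.0, nx=3, ny=3)
```

A REST endpoint would then render such a grid (or its smoothed equivalent) as a map layer for the client application.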
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
2015-10-01
capability to meet the task to the standard under the condition, nothing more or less, else the funding is wasted. Also, that funding for the...bin to segregate gaps qualitatively before the gap value model determined preference among gaps within the bins. Computation of a gap's...for communication, interpretation, or processing by humans or by automatic means (as it pertains to modeling and simulation). Delphi Method -- a
Evidence synthesis for decision making 7: a reviewer's checklist.
Ades, A E; Caldwell, Deborah M; Reken, Stefanie; Welton, Nicky J; Sutton, Alex J; Dias, Sofia
2013-07-01
This checklist is for the review of evidence syntheses for treatment efficacy used in decision making based on either efficacy or cost-effectiveness. It is intended to be used for pairwise meta-analysis, indirect comparisons, and network meta-analysis, without distinction. It does not generate a quality rating and is not prescriptive. Instead, it focuses on a series of questions aimed at revealing the assumptions that the authors of the synthesis are expecting readers to accept, the adequacy of the arguments authors advance in support of their position, and the need for further analyses or sensitivity analyses. The checklist is intended primarily for those who review evidence syntheses, including indirect comparisons and network meta-analyses, in the context of decision making but will also be of value to those submitting syntheses for review, whether to decision-making bodies or journals. The checklist has 4 main headings: A) definition of the decision problem, B) methods of analysis and presentation of results, C) issues specific to network synthesis, and D) embedding the synthesis in a probabilistic cost-effectiveness model. The headings and implicit advice follow directly from the other tutorials in this series. A simple table is provided that could serve as a pro forma checklist.
Costs of detection bias in index-based population monitoring
Moore, C.T.; Kendall, W.L.
2004-01-01
Managers of wildlife populations commonly rely on indirect, count-based measures of the population in making decisions regarding conservation, harvest, or control. The main appeal in the use of such counts is their low material expense compared to methods that directly measure the population. However, their correct use rests on the rarely-tested but often-assumed premise that they proportionately reflect population size, i.e., that they constitute a population index. This study investigates forest management for the endangered Red-cockaded Woodpecker (Picoides borealis) and the Wood Thrush (Hylocichla mustelina) at the Piedmont National Wildlife Refuge in central Georgia, U.S.A. Optimal decision policies for a joint species objective were derived for two alternative models of Wood Thrush population dynamics. Policies were simulated under scenarios of unbiasedness, consistent negative bias, and habitat-dependent negative bias in observed Wood Thrush densities. Differences in simulation outcomes between biased and unbiased detection scenarios indicated the expected loss in resource objectives (here, forest habitat and birds) through decision-making based on biased population counts. Given the models and objective function used in our analysis, expected losses were as great as 11%, a degree of loss perhaps not trivial for applications such as endangered species management. Our analysis demonstrates that costs of uncertainty about the relationship between the population and its observation can be measured in units of the resource, costs which may offset apparent savings achieved by collecting uncorrected population counts.
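The paper's premise, that a count is a valid index only while detection probability stays constant, is easy to illustrate. The abundance and detection numbers below are invented.

```python
# A count index C = p_detect * N tracks true abundance N only if p_detect
# is constant. Under habitat-dependent detection, equal populations yield
# unequal indices, so index-based decisions drift from the true state.
# All numbers are illustrative.

def index_count(true_abundance, p_detect):
    return true_abundance * p_detect

true_n = 50
open_habitat = index_count(true_n, p_detect=0.8)   # higher-detection habitat
dense_habitat = index_count(true_n, p_detect=0.5)  # lower-detection habitat

# a manager acting when the index drops below 30 would intervene in the
# dense habitat even though true abundance is identical in both
intervene = dense_habitat < 30 < open_habitat
```

The expected loss the paper quantifies (up to 11% of resource objectives) is the cumulative cost of decisions driven by exactly this kind of mismatch.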
NASA Astrophysics Data System (ADS)
Peng, M.; Zhang, L. M.
2013-02-01
Tangjiashan landslide dam, which was triggered by the Ms = 8.0 Wenchuan earthquake in 2008 in China, threatened 1.2 million people downstream of the dam. All people in Beichuan Town 3.5 km downstream of the dam and 197 thousand people in Mianyang City 85 km downstream of the dam were evacuated 10 days before the breaching of the dam. Making such an important decision under uncertainty was difficult. This paper applied a dynamic decision-making framework for dam-break emergency management (DYDEM) to support rational decision-making in the emergency management of the Tangjiashan landslide dam. Three stages are identified with different levels of hydrological, geological, and socio-economic information along the timeline of the landslide dam failure event. The probability of dam failure is taken as a time series. The dam breaching parameters are predicted with a set of empirical models in stage 1, when no soil property information is known, and a physical model in stages 2 and 3, when knowledge of soil properties has been obtained. The flood routing downstream of the dam in these three stages is analyzed to evaluate the population at risk (PAR). The flood consequences, including evacuation costs, flood damage, and monetized loss of life, are evaluated as functions of warning time using a human risk analysis model based on Bayesian networks. Finally, dynamic decision analysis is conducted to find the optimal time to evacuate the population at risk with minimum total loss in each of these three stages.
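The expected-loss comparison at the heart of such a dynamic decision can be reduced to a single inequality per stage: evacuate when the evacuation cost plus the residual loss of a warned population undercuts the expected loss of staying. All figures below are illustrative placeholders, not values from the Tangjiashan analysis.

```python
# Stage-wise evacuate/stay comparison (illustrative numbers only).

def should_evacuate(p_breach, loss_unwarned, loss_warned, evacuation_cost):
    """Evacuate iff expected loss with evacuation is lower."""
    expected_stay = p_breach * loss_unwarned
    expected_evacuate = evacuation_cost + p_breach * loss_warned
    return expected_evacuate < expected_stay

# high breach probability: paying the evacuation cost dominates
high_risk = should_evacuate(p_breach=0.6, loss_unwarned=1000.0,
                            loss_warned=100.0, evacuation_cost=50.0)
# low breach probability: the evacuation cost is not yet justified
low_risk = should_evacuate(p_breach=0.01, loss_unwarned=1000.0,
                           loss_warned=100.0, evacuation_cost=50.0)
```

Updating p_breach as a time series, as the paper does, turns this static check into the stage-by-stage search for the optimal evacuation time.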
The Risky Shift in Policy Decision Making: A Comparative Analysis
ERIC Educational Resources Information Center
Wilpert, B.; And Others
1976-01-01
Based on analysis of data on 432 decision-makers from around the world, this study examines the decision-making phenomenon that individuals tend to move toward riskier decisions after group discussion. Findings of the analysis contradicted earlier studies, showing a consistent shift toward greater risk avoidance. Available from Elsevier Scientific…
A P2P Botnet detection scheme based on decision tree and adaptive multilayer neural networks.
Alauthaman, Mohammad; Aslam, Nauman; Zhang, Li; Alasem, Rafe; Hossain, M A
2018-01-01
In recent years, Botnets have been adopted as a popular method to carry and spread many malicious codes on the Internet. These malicious codes pave the way to execute many fraudulent activities including spam mail, distributed denial-of-service attacks and click fraud. While many Botnets are set up using centralized communication architecture, the peer-to-peer (P2P) Botnets can adopt a decentralized architecture using an overlay network for exchanging command and control data making their detection even more difficult. This work presents a method of P2P Bot detection based on an adaptive multilayer feed-forward neural network in cooperation with decision trees. A classification and regression tree is applied as a feature selection technique to select relevant features. With these features, a multilayer feed-forward neural network training model is created using a resilient back-propagation learning algorithm. A comparison of feature set selection based on the decision tree, principal component analysis and the ReliefF algorithm indicated that the neural network model with features selection based on decision tree has a better identification accuracy along with lower rates of false positives. The usefulness of the proposed approach is demonstrated by conducting experiments on real network traffic datasets. In these experiments, an average detection rate of 99.08 % with false positive rate of 0.75 % was observed.
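The feature-selection step, ranking features by the impurity reduction a classification tree splits on, can be sketched in pure Python on a tiny synthetic dataset. This is a sketch of the general technique, not the paper's CART implementation or its traffic features.

```python
# Rank binary features by information gain, the criterion an entropy-based
# classification tree uses to choose splits. Dataset is synthetic.
import math

def entropy(labels):
    n = len(labels)
    out = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        out -= p * math.log2(p)
    return out

def info_gain(rows, labels, f):
    """Entropy reduction from splitting on binary feature f."""
    left = [l for r, l in zip(rows, labels) if r[f] == 0]
    right = [l for r, l in zip(rows, labels) if r[f] == 1]
    h = entropy(labels)
    for part in (left, right):
        if part:
            h -= len(part) / len(labels) * entropy(part)
    return h

# feature 0 predicts the label perfectly; feature 1 is pure noise
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 1, 1]
gains = [info_gain(rows, labels, f) for f in range(2)]
```

Keeping only high-gain features before training the neural network is what the paper credits for its improved accuracy and lower false-positive rate.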
Grim, Katarina; Rosenberg, David; Svedberg, Petra; Schön, Ulla-Karin
2016-01-01
Shared decision-making (SDM) is an emergent research topic in the field of mental health care and is considered to be a central component of a recovery-oriented system. Despite the evidence suggesting the benefits of this change in the power relationship between users and practitioners, the method has not been widely implemented in clinical practice. The objective of this study was to investigate decisional and information needs among users with mental illness as a prerequisite for the development of a decision support tool aimed at supporting SDM in community-based mental health services in Sweden. Three semi-structured focus group interviews were conducted with 22 adult users with mental illness. The transcribed interviews were analyzed using a directed content analysis. This method was used to develop an in-depth understanding of the decisional process as well as to validate and conceptually extend Elwyn et al.'s model of SDM. The model Elwyn et al. have created for SDM in somatic care fits well for mental health services, both in terms of process and content. However, the results also suggest an extension of the model because decisions related to mental illness are often complex and involve a number of life domains. Issues related to social context and individual recovery point to the need for a preparation phase focused on establishing cooperation and mutual understanding as well as a clear follow-up phase that allows for feedback and adjustments to the decision-making process. The current study contributes to a deeper understanding of decisional and information needs among users of community-based mental health services that may reduce barriers to participation in decision-making. The results also shed light on attitudinal, relationship-based, and cognitive factors that are important to consider in adapting SDM in the mental health system.
Martelli, Nicolas; Hansen, Paul; van den Brink, Hélène; Boudard, Aurélie; Cordonnier, Anne-Laure; Devaux, Capucine; Pineau, Judith; Prognon, Patrice; Borget, Isabelle
2016-02-01
At the hospital level, decisions about purchasing new and oftentimes expensive medical devices must take into account multiple criteria simultaneously. Multi-criteria decision analysis (MCDA) is increasingly used for health technology assessment (HTA). One of the most successful hospital-based HTA approaches is mini-HTA, of which a notable example is the Matrix4value model. To develop a funding decision-support tool combining MCDA and mini-HTA, based on Matrix4value, suitable for medical devices for individual patient use in French university hospitals - known as the IDA tool, short for 'innovative device assessment'. Criteria for assessing medical devices were identified from a literature review and a survey of 18 French university hospitals. Weights for the criteria, representing their relative importance, were derived from a survey of 25 members of a medical devices committee using an elicitation technique involving pairwise comparisons. As a test of its usefulness, the IDA tool was applied to two new drug-eluting beads (DEBs) for transcatheter arterial chemoembolization. The IDA tool comprises five criteria and weights for each of two over-arching categories: risk and value. The tool revealed that the two new DEBs conferred no additional value relative to DEBs currently available. Feedback from participating decision-makers about the IDA tool was very positive. The tool could help to promote a more structured and transparent approach to HTA decision-making in French university hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.
Building a maintenance policy through a multi-criterion decision-making model
NASA Astrophysics Data System (ADS)
Faghihinia, Elahe; Mollaverdi, Naser
2012-08-01
A major competitive advantage of production and service systems is establishing a proper maintenance policy. Therefore, maintenance managers should make maintenance decisions that best fit their systems. Multi-criterion decision-making methods can take into account a number of aspects associated with the competitiveness factors of a system. This paper presents a multi-criterion decision-aided maintenance model with three criteria that have more influence on decision making: reliability, maintenance cost, and maintenance downtime. The Bayesian approach has been applied to confront maintenance failure data shortage. Therefore, the model seeks to make the best compromise between these three criteria and establish replacement intervals using Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE II), integrating the Bayesian approach with regard to the preference of the decision maker to the problem. Finally, using a numerical application, the model has been illustrated, and for a visual realization and an illustrative sensitivity analysis, PROMETHEE GAIA (the visual interactive module) has been used. Use of PROMETHEE II and PROMETHEE GAIA has been made with Decision Lab software. A sensitivity analysis has been made to verify the robustness of certain parameters of the model.
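A bare-bones PROMETHEE II sketch helps make the ranking step concrete. It uses the "usual" preference function (preference 1 whenever one alternative strictly beats another on a criterion); the three criteria mirror the paper's (reliability to maximize, cost and downtime to minimize), but all scores and weights are invented.

```python
# PROMETHEE II net outranking flows with the "usual" preference function.
# Scores and weights are illustrative, not the paper's data.

def net_flows(scores, weights, maximize):
    n = len(scores)
    def pref(a, b):
        total = 0.0
        for j, w in enumerate(weights):
            diff = scores[a][j] - scores[b][j]
            if not maximize[j]:
                diff = -diff  # for minimized criteria, smaller is better
            total += w if diff > 0 else 0.0
        return total
    # net flow = (outgoing preference - incoming preference), averaged
    return [sum(pref(a, b) - pref(b, a) for b in range(n) if b != a) / (n - 1)
            for a in range(n)]

# rows: [reliability, cost, downtime] for maintenance policies A, B, C
scores = [[0.9, 120.0, 5.0], [0.8, 100.0, 4.0], [0.7, 90.0, 6.0]]
flows = net_flows(scores, weights=[0.4, 0.35, 0.25],
                  maximize=[True, False, False])
best = flows.index(max(flows))
```

With these placeholder numbers the middle policy, which balances all three criteria, ranks first; net flows always sum to zero across alternatives, which is a handy sanity check.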
Rudmik, Luke; Smith, Kristine A; Soler, Zachary M; Schlosser, Rodney J; Smith, Timothy L
2014-10-01
Idiopathic olfactory loss is a common clinical scenario encountered by otolaryngologists. While trying to allocate limited health care resources appropriately, the decision to obtain a magnetic resonance imaging (MRI) scan to investigate for a rare intracranial abnormality can be difficult. To evaluate the cost-effectiveness of ordering routine MRI in patients with idiopathic olfactory loss. We performed a modeling-based economic evaluation with a time horizon of less than 1 year. Patients included in the analysis had idiopathic olfactory loss defined by no preceding viral illness or head trauma and negative findings of a physical examination and nasal endoscopy. Routine MRI vs no-imaging strategies. We developed a decision tree economic model from the societal perspective. Effectiveness, probability, and cost data were obtained from the published literature. Litigation rates and costs related to a missed diagnosis were obtained from the Physicians Insurers Association of America. A univariate threshold analysis and multivariate probabilistic sensitivity analysis were performed to quantify the degree of certainty in the economic conclusion of the reference case. The comparative groups included those who underwent routine MRI of the brain with contrast alone and those who underwent no brain imaging. The primary outcome was the cost per correct diagnosis of idiopathic olfactory loss. The mean (SD) cost for the MRI strategy totaled $2400.00 ($1717.54) and was effective 100% of the time, whereas the mean (SD) cost for the no-imaging strategy totaled $86.61 ($107.40) and was effective 98% of the time. The incremental cost-effectiveness ratio for the MRI strategy compared with the no-imaging strategy was $115 669.50, which is higher than most acceptable willingness-to-pay thresholds. 
The threshold analysis demonstrated that when the probability of having a treatable intracranial disease process reached 7.9%, the incremental cost-effectiveness ratio for MRI vs no imaging was $24 654.38. The probabilistic sensitivity analysis demonstrated that the no-imaging strategy was the cost-effective decision with 81% certainty at a willingness-to-pay threshold of $50 000. This economic evaluation suggests that the most cost-effective decision is to not obtain a routine MRI scan of the brain in patients with idiopathic olfactory loss. Outcomes from this study may be used to counsel patients and aid in the decision-making process.
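The reported ratio follows directly from the abstract's means: the incremental cost-effectiveness ratio (ICER) is the cost difference divided by the effectiveness difference between the two strategies. A minimal check using the figures above:

```python
# Reproduce the abstract's incremental cost-effectiveness ratio:
# ICER = (cost_new - cost_comparator) / (effect_new - effect_comparator).
def icer(cost_a, eff_a, cost_b, eff_b):
    return (cost_a - cost_b) / (eff_a - eff_b)

# Mean cost and probability of a correct diagnosis, from the abstract.
mri = (2400.00, 1.00)        # routine MRI strategy
no_imaging = (86.61, 0.98)   # no-imaging strategy

ratio = icer(mri[0], mri[1], no_imaging[0], no_imaging[1])
print(round(ratio, 2))  # ~115669.5, i.e. ~$115,669.50 per additional correct diagnosis
```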
Combining MLC and SVM Classifiers for Learning Based Decision Making: Analysis and Evaluations
Zhang, Yi; Ren, Jinchang; Jiang, Jianmin
2015-01-01
Maximum likelihood classifier (MLC) and support vector machines (SVM) are two commonly used approaches in machine learning. MLC is based on Bayesian theory in estimating parameters of a probabilistic model, whilst SVM is an optimization based nonparametric method in this context. Recently, it has been found that SVM in some cases is equivalent to MLC in probabilistically modeling the learning process. In this paper, MLC and SVM are combined in learning and classification, which helps to yield probabilistic output for SVM and facilitates soft decision making. In total, four groups of data are used for evaluation, covering sonar, vehicle, breast cancer, and DNA sequences. The data samples are characterized as Gaussian/non-Gaussian distributed and balanced/unbalanced, and these characteristics are then used for performance assessment in comparing the SVM and the combined SVM-MLC classifier. Interesting results are reported to indicate how the combined classifier may work under various conditions. PMID:26089862
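The combination idea can be illustrated with a toy sketch: a Gaussian MLC yields class posteriors, an SVM margin is mapped to a probability with a Platt-style sigmoid, and the two are averaged for a soft decision. The data, the fixed sigmoid parameters, and the averaging rule are illustrative assumptions, not the paper's exact procedure:

```python
import math

# Toy fusion of a Gaussian maximum likelihood classifier (MLC) with an
# SVM-style margin converted to a probability (Platt-style sigmoid).
# All numbers below are illustrative assumptions.

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def mlc_posterior(x, params):
    # params maps class label -> (mean, variance, prior).
    joint = {c: p * gaussian_pdf(x, mu, var) for c, (mu, var, p) in params.items()}
    z = sum(joint.values())
    return {c: v / z for c, v in joint.items()}

def svm_posterior(margin, a=-2.0, b=0.0):
    # Platt scaling: P(y = +1 | margin) = 1 / (1 + exp(a * margin + b)).
    p_pos = 1.0 / (1.0 + math.exp(a * margin + b))
    return {+1: p_pos, -1: 1.0 - p_pos}

def combined_posterior(x, margin, params):
    # Average the two posteriors to get a probabilistic (soft) decision.
    mlc, svm = mlc_posterior(x, params), svm_posterior(margin)
    return {c: 0.5 * mlc[c] + 0.5 * svm[c] for c in (+1, -1)}

# Two 1-D Gaussian classes with equal priors; both the sample position
# and a positive SVM margin favor class +1.
params = {+1: (2.0, 1.0, 0.5), -1: (-2.0, 1.0, 0.5)}
post = combined_posterior(1.0, 0.8, params)
print(post)
```

The soft output keeps a calibrated probability for each class instead of a hard label, which is what makes the combined classifier useful for downstream decision making.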
Evidence Accumulator or Decision Threshold – Which Cortical Mechanism are We Observing?
Simen, Patrick
2012-01-01
Most psychological models of perceptual decision making are of the accumulation-to-threshold variety. The neural basis of accumulation in parietal and prefrontal cortex is therefore a topic of great interest in neuroscience. In contrast, threshold mechanisms have received less attention, and their neural basis has usually been sought in subcortical structures. Here I analyze a model of a decision threshold that can be implemented in the same cortical areas as evidence accumulators, and whose behavior bears on two open questions in decision neuroscience: (1) When ramping activity is observed in a brain region during decision making, does it reflect evidence accumulation? (2) Are changes in speed-accuracy tradeoffs and response biases more likely to be achieved by changes in thresholds, or in accumulation rates and starting points? The analysis suggests that task-modulated ramping activity, by itself, is weak evidence that a brain area mediates evidence accumulation as opposed to threshold readout; and that signs of modulated accumulation are as likely to indicate threshold adaptation as adaptation of starting points and accumulation rates. These conclusions imply that how thresholds are modeled can dramatically impact accumulator-based interpretations of this data. PMID:22737136
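A minimal accumulation-to-threshold (drift-diffusion) simulation makes the tradeoff at issue concrete: raising the threshold slows responses and raises accuracy, exactly the kind of behavioral change that ramping activity alone cannot attribute to thresholds versus accumulation rates. All parameters are illustrative:

```python
import random

# Drift-diffusion random walk to two symmetric bounds. With positive
# drift, absorption at the upper bound counts as a correct decision.
# Parameter values are illustrative, not from the paper.

def simulate(threshold, drift=0.3, noise=1.0, dt=0.01, trials=2000, seed=42):
    rng = random.Random(seed)
    correct, total_time = 0, 0.0
    for _ in range(trials):
        x, t = 0.0, 0.0
        while abs(x) < threshold:
            x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
            t += dt
        correct += x >= threshold
        total_time += t
    return correct / trials, total_time / trials

acc_low, rt_low = simulate(threshold=0.5)    # low threshold: fast, error-prone
acc_high, rt_high = simulate(threshold=1.5)  # high threshold: slow, accurate
print(acc_low, rt_low, acc_high, rt_high)
```

The same ramp-to-bound trajectory could equally be produced by a unit reading out a threshold crossing, which is the ambiguity the analysis above addresses.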
Modeling Opponents in Adversarial Risk Analysis.
Rios Insua, David; Banks, David; Rios, Jesus
2016-04-01
Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decision-makers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms: for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how, as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decisions by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. © 2015 Society for Risk Analysis.
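The Bayesian model averaging idea can be sketched directly: keep a posterior over candidate rationality models of the opponent and update it as the opponent's actions are observed. The two toy models and their action probabilities below are illustrative assumptions:

```python
# Each candidate rationality model assigns probabilities to the
# opponent's actions. The models and numbers are illustrative.
models = {
    "random":  {"attack": 0.5, "wait": 0.5},   # opponent behaves randomly
    "level_1": {"attack": 0.8, "wait": 0.2},   # e.g. a level-1 thinker
}
posterior = {"random": 0.5, "level_1": 0.5}    # uniform prior over models

def update(posterior, observed_action):
    # Bayes' rule: P(m | a) is proportional to P(a | m) * P(m).
    unnorm = {m: posterior[m] * models[m][observed_action] for m in posterior}
    z = sum(unnorm.values())
    return {m: v / z for m, v in unnorm.items()}

def predictive(posterior, action):
    # Model-averaged probability of the opponent's next action.
    return sum(posterior[m] * models[m][action] for m in posterior)

# Observing mostly aggressive play shifts belief toward the level-1 model.
for a in ["attack", "attack", "attack", "wait", "attack"]:
    posterior = update(posterior, a)
print(posterior)
print(round(predictive(posterior, "attack"), 3))
```

The posterior model probabilities serve as the validity measure mentioned in the abstract, and the predictive distribution is the model-averaged forecast used to support the decision-maker.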
Neuroanatomical basis for recognition primed decision making.
Hudson, Darren
2013-01-01
Effective decision making under time constraints is often overlooked in medical decision making. The recognition primed decision making (RPDM) model was developed by Gary Klein based on previously recognized situations that are used to develop a satisfactory solution to the current problem. Bayes Theorem is the most popular decision making model in medicine but is limited by the need for adequate time to consider all probabilities. Unlike other decision making models, there is a potential neurobiological basis for RPDM. This model has significant implications for health informatics and medical education.
NASA Astrophysics Data System (ADS)
Banerjee, Polash; Ghose, Mrinal Kanti; Pradhan, Ratika
2018-05-01
Spatial analysis of water quality impact assessment of highway projects in mountainous areas remains largely unexplored. A methodology is presented here for Spatial Water Quality Impact Assessment (SWQIA) due to highway-broadening-induced vehicular traffic change in the East district of Sikkim. The pollution load of the highway runoff was estimated using an Average Annual Daily Traffic-based empirical model in combination with a mass balance model to predict pollution in the rivers within the study area. Spatial interpolation and overlay analysis were used for impact mapping. An Analytic Hierarchy Process-based Water Quality Status Index was used to prepare a composite impact map. Model validation criteria, cross-validation criteria, and a spatially explicit sensitivity analysis show that the SWQIA model is robust. The study shows that vehicular traffic is a significant contributor to water pollution in the study area. The model caters specifically to impact analysis of the project concerned and can serve as a decision-support aid for the project stakeholders. The applicability of the SWQIA model needs to be explored and validated with a larger set of water quality parameters and project scenarios at a greater spatial scale.
Nonstationary decision model for flood risk decision scaling
NASA Astrophysics Data System (ADS)
Spence, Caitlin M.; Brown, Casey M.
2016-11-01
Hydroclimatic stationarity is increasingly questioned as a default assumption in flood risk management (FRM), but successor methods are not yet established. Some potential successors depend on estimates of future flood quantiles, but methods for estimating future design storms are subject to high levels of uncertainty. Here we apply a Nonstationary Decision Model (NDM) to flood risk planning within the decision scaling framework. The NDM combines a nonstationary probability distribution of annual peak flow with optimal selection of flood management alternatives using robustness measures. The NDM incorporates structural and nonstructural FRM interventions and valuation of flows supporting ecosystem services to calculate expected cost of a given FRM strategy. A search for the minimum-cost strategy under incrementally varied representative scenarios extending across the plausible range of flood trend and value of the natural flow regime discovers candidate FRM strategies that are evaluated and compared through a decision scaling analysis (DSA). The DSA selects a management strategy that is optimal or close to optimal across the broadest range of scenarios or across the set of scenarios deemed most likely to occur according to estimates of future flood hazard. We illustrate the decision framework using a stylized example flood management decision based on the Iowa City flood management system, which has experienced recent unprecedented high flow episodes. The DSA indicates a preference for combining infrastructural and nonstructural adaptation measures to manage flood risk and makes clear that options-based approaches cannot be assumed to be "no" or "low regret."
NASA Astrophysics Data System (ADS)
Flaming, Susan C.
2007-12-01
The continuing saga of satellite technology development is as much a story of successful risk management as of innovative engineering. How do program leaders on complex, technology projects manage high stakes risks that threaten business success and satellite performance? This grounded theory study of risk decision making portrays decision leadership practices at one communication satellite company. Integrated product team (IPT) leaders of multi-million dollar programs were interviewed and observed to develop an extensive description of the leadership skills required to navigate organizational influences and drive challenging risk decisions to closure. Based on the study's findings the researcher proposes a new decision making model, Deliberative Decision Making, to describe the program leaders' cognitive and organizational leadership practices. This Deliberative Model extends the insights of prominent decision making models including the rational (or classical) and the naturalistic and qualifies claims made by bounded rationality theory. The Deliberative Model describes how leaders proactively engage resources to play a variety of decision leadership roles. The Model incorporates six distinct types of leadership decision activities, undertaken in varying sequence based on the challenges posed by specific risks. Novel features of the Deliberative Decision Model include: an inventory of leadership methods for managing task challenges, potential stakeholder bias and debates; four types of leadership meta-decisions that guide decision processes, and aligned organizational culture. Both supporting and constraining organizational influences were observed as leaders managed major risks, requiring active leadership on the most difficult decisions. Although the company's engineering culture emphasized the importance of data-based decisions, the uncertainties intrinsic to satellite risks required expert engineering judgment to be exercised throughout. 
An investigation into the co-variation of decision methods with uncertainty suggests that perceived risk severity may serve as a robust indicator for choices about decision practices. The Deliberative Decision processes incorporate multiple organizational and cultural controls as cross-checks to mitigate potential parochial bias of individuals, stakeholder groups, or leaders. Overall the Deliberative Decision framework describes how expert leadership practices, supportive organizational systems along with aligned cultural values and behavioral norms help leaders drive high stakes risk decisions to closure in this complex, advanced-technology setting.
Decision Support Model for Optimal Management of Coastal Gate
NASA Astrophysics Data System (ADS)
Ditthakit, Pakorn; Chittaladakorn, Suwatana
2010-05-01
Coastal areas are intensely settled by human beings owing to their wealth of natural resources. At present, however, these areas face water scarcity problems: inadequate water and poor water quality as a result of saltwater intrusion and inappropriate land-use management. To solve these problems, several measures have been exploited. Coastal gate construction is a structural measure widely adopted in several countries, but it requires a plan for operating the gates appropriately. Coastal gate operation is a complicated task that usually involves managing multiple purposes which generally conflict with one another. This paper delineates the methodology and underlying theories for developing a decision support model for coastal gate operation scheduling. The developed model couples a simulation model with an optimization model. A weighting optimization technique based on Differential Evolution (DE) was selected for solving the multiple-objective problem. The hydrodynamic and water quality models were repeatedly invoked while searching for the optimal gate operations. In addition, two forecasting models, an autoregressive (AR) model and a harmonic analysis (HA) model, were applied to forecast water levels and tide levels, respectively. To demonstrate the applicability of the developed model, it was applied to plan operations for a hypothetical system based on the Pak Phanang coastal gate system, located in Nakhon Si Thammarat province in southern Thailand. The proposed model was found to satisfactorily assist decision-makers in operating coastal gates under various environmental, ecological, and hydraulic conditions.
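The weighting approach can be sketched as follows: multiple objectives are collapsed into one weighted cost and minimized with Differential Evolution (DE/rand/1/bin). The two quadratic toy objectives below stand in for the hydrodynamic and water-quality simulations, and all parameters are illustrative assumptions:

```python
import random

def weighted_cost(x, weights):
    # Illustrative conflicting objectives: deviation from two targets.
    f1 = (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2   # e.g. water-level target
    f2 = (x[0] + 1.0) ** 2 + (x[1] + 1.0) ** 2   # e.g. salinity target
    return weights[0] * f1 + weights[1] * f2

def differential_evolution(cost, dim=2, pop_size=20, f=0.8, cr=0.9,
                           generations=300, bounds=(-5.0, 5.0), seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [cost(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1 mutation plus binomial crossover.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = [pop[a][k] + f * (pop[b][k] - pop[c][k])
                     if (rng.random() < cr or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            trial = [min(max(v, lo), hi) for v in trial]
            tf = cost(trial)
            if tf <= fit[i]:        # greedy selection
                pop[i], fit[i] = trial, tf
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Equal weights: the optimum of this weighted sum is the midpoint (0, 0).
x_best, f_best = differential_evolution(lambda x: weighted_cost(x, (0.5, 0.5)))
print([round(v, 3) for v in x_best], round(f_best, 3))
```

In the actual model each cost evaluation would invoke the simulation models, which is why a derivative-free method such as DE is a natural fit.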
ERIC Educational Resources Information Center
Aldosari, Mubarak S.
2016-01-01
This study conducted an in-depth analysis of the efficacy of the Decision Model in the development of function-based treatments for disruptive behaviors in four toddlers with disabilities aged from 26 to 34 months in inclusive toddler classrooms. The research was conducted in three parts. In Part 1, a functional behavioral assessment was conducted…
Using the weighted area under the net benefit curve for decision curve analysis.
Talluri, Rajesh; Shete, Sanjay
2016-07-18
Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given or range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent for using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need of additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. 
The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
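The quantities involved can be sketched numerically: the net benefit at threshold probability pt is NB(pt) = TP/n - (FP/n) * pt/(1 - pt), and the proposed summary integrates NB weighted by a distribution over pt. The fixed 2x2 counts and the triangular weight below are illustrative assumptions (in practice TP and FP vary with pt and the weight distribution is estimated from the data):

```python
def net_benefit(tp, fp, n, pt):
    # Standard decision-curve net benefit at threshold probability pt.
    return tp / n - (fp / n) * pt / (1.0 - pt)

def weighted_area(nb_at, weight_at, thresholds):
    # Trapezoidal integration of weight(pt) * NB(pt) over the range.
    total = 0.0
    for lo, hi in zip(thresholds, thresholds[1:]):
        total += 0.5 * (weight_at(lo) * nb_at(lo)
                        + weight_at(hi) * nb_at(hi)) * (hi - lo)
    return total

# Hypothetical model performance (fixed counts, for illustration only).
nb = lambda pt: net_benefit(tp=80, fp=100, n=1000, pt=pt)

uniform = lambda pt: 1.0                                   # standard assumption
peaked = lambda pt: max(0.0, 2.0 - 20.0 * abs(pt - 0.1))   # mass near pt = 0.1

ts = [i / 100 for i in range(1, 31)]   # range of interest: pt in [0.01, 0.30]
area_uniform = weighted_area(nb, uniform, ts)
area_peaked = weighted_area(nb, peaked, ts)
print(round(area_uniform, 4), round(area_peaked, 4))
```

Replacing the uniform weight with an estimated distribution of patient threshold probabilities is the paper's key change: two models whose curves cross can then be compared by a single weighted area.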
Decision analysis in clinical cardiology: When is coronary angiography required in aortic stenosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georgeson, S.; Meyer, K.B.; Pauker, S.G.
1990-03-15
Decision analysis offers a reproducible, explicit approach to complex clinical decisions. It consists of developing a model, typically a decision tree, that separates choices from chances and that specifies and assigns relative values to outcomes. Sensitivity analysis allows exploration of alternative assumptions. Cost-effectiveness analysis shows the relation between dollars spent and improved health outcomes achieved. In a tutorial format, this approach is applied to the decision whether to perform coronary angiography in a patient who requires aortic valve replacement for critical aortic stenosis.
Robustness for slope stability modelling under deep uncertainty
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2015-04-01
Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high as is often the case of when managing natural hazard risks such as landslides. In our work a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.
A History of Sandia’s Water Decision Modeling and Analysis Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Thomas Stephen; Pate, Ronald C.
This document provides a brief narrative, and selected project descriptions, that represent Sandia’s history involving data, modeling, and analysis related to water, the energy-water nexus, and the energy-water-agriculture nexus within the context of climate change. Sandia National Laboratories has been engaged since the early 1990s with program development involving data, modeling, and analysis projects that address the interdependent issues, risks, and technology-based mitigations associated with increasing demands and stresses being placed on energy, water, and agricultural/food resources, and the related impacts on their security and sustainability in the face of both domestic and global population growth, expanding economic development, and climate change.
Finding shared decisions in stakeholder networks: An agent-based approach
NASA Astrophysics Data System (ADS)
Le Pira, Michela; Inturri, Giuseppe; Ignaccolo, Matteo; Pluchino, Alessandro; Rapisarda, Andrea
2017-01-01
We address the problem of a participatory decision-making process where a shared priority list of alternatives has to be obtained while avoiding inconsistent decisions. An agent-based model (ABM) is proposed to mimic this process in different social networks of stakeholders who interact according to an opinion dynamics model. Simulation results show the efficacy of interaction in finding a transitive and, above all, shared decision. These findings are in agreement with real participation experiences regarding transport planning decisions and can give useful suggestions on how to plan an effective participation process for sustainable policy-making based on opinion consensus.
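The setup can be sketched with a toy bounded-confidence opinion dynamics model: stakeholders hold priority scores over alternatives and repeatedly interact pairwise, averaging opinions that are close enough, after which a shared ranking is read off. The network (here, random pairing), scores, and parameters are illustrative assumptions, not the paper's model:

```python
import random

def interact(op_i, op_j, epsilon=0.5, mu=0.5):
    # Bounded confidence: opinions move toward each other only where
    # they differ by less than epsilon.
    new_i, new_j = op_i[:], op_j[:]
    for k in range(len(op_i)):
        if abs(op_i[k] - op_j[k]) < epsilon:
            new_i[k] = op_i[k] + mu * (op_j[k] - op_i[k])
            new_j[k] = op_j[k] + mu * (op_i[k] - op_j[k])
    return new_i, new_j

def simulate(opinions, steps=2000, seed=7):
    rng = random.Random(seed)
    ops = [o[:] for o in opinions]
    for _ in range(steps):
        i, j = rng.sample(range(len(ops)), 2)
        ops[i], ops[j] = interact(ops[i], ops[j])
    return ops

# Five stakeholders scoring three transport alternatives in [0, 1].
start = [[0.9, 0.5, 0.1], [0.8, 0.4, 0.2], [0.7, 0.6, 0.1],
         [0.6, 0.5, 0.3], [0.8, 0.3, 0.2]]
final = simulate(start)
mean = [sum(o[k] for o in final) / len(final) for k in range(3)]
shared_ranking = sorted(range(3), key=lambda k: -mean[k])
print(shared_ranking)
```

Because the symmetric averaging rule conserves the group mean, the consensus scores converge toward the initial averages, yielding a single transitive priority list shared by all agents.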
Holden, Richard J; Srinivas, Preethi; Campbell, Noll L; Clark, Daniel O; Bodke, Kunal S; Hong, Youngbok; Boustani, Malaz A; Ferguson, Denisha; Callahan, Christopher M
2018-03-06
Older adults purchase and use over-the-counter (OTC) medications with potentially significant adverse effects. Some OTC medications, such as those with anticholinergic effects, are relatively contraindicated for use by older adults due to evidence of impaired cognition and other adverse effects. To inform the design of future OTC medication safety interventions for older adults, this study investigated consumers' decision making and behavior related to OTC medication purchasing and use, with a focus on OTC anticholinergic medications. The study had a cross-sectional design with multiple methods. A total of 84 adults participated in qualitative research interviews (n = 24), in-store shopper observations (n = 39), and laboratory-based simulated OTC shopping tasks (n = 21). Simulated shopping participants also rank-ordered eight factors on their importance for OTC decision making. Findings revealed that many participants had concerns about medication adverse effects, generally, but were not aware of age-related risk associated with the use of anticholinergic medications. Analyses produced a map of the workflow of OTC-related behavior and decision making as well as related barriers such as difficulty locating medications or comparing them to an alternative. Participants reported effectiveness, adverse effects or health risks, and price as most important to their OTC medication purchase and use decisions. A persona analysis identified two types of consumers: the habit follower, who frequently purchased OTC medications and considered them safe; and the deliberator, who was more likely to weigh their options and consider alternatives to OTC medications. A conceptual model of OTC medication purchase and use is presented. Drawing on study findings and behavioral theories, the model depicts dual processes for OTC medication decision making - habit-based and deliberation-based - as well as the antecedents and consequences of decision making. 
This model suggests several design directions for consumer-oriented interventions to promote OTC medication safety. Copyright © 2018 Elsevier Inc. All rights reserved.
Decision-Making Models in a Tunisian University: Towards a Framework for Analysis
ERIC Educational Resources Information Center
Khefacha, I.; Belkacem, L.
2014-01-01
This study investigates how decisions are made in Tunisian public higher education establishments. Some factors are identified as having a potentially significant impact on the odds that the decision-making process follows the characteristics of one of the most well known decision-making models: collegial, political, bureaucratic or anarchical…
An approach to and web-based tool for infectious disease outbreak intervention analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daughton, Ashlynn R.; Generous, Nicholas; Priedhorsky, Reid
2017-04-18
Infectious diseases are a leading cause of death globally. Decisions surrounding how to control an infectious disease outbreak currently rely on a subjective process involving surveillance and expert opinion. However, there are many situations where neither may be available. Modeling can fill gaps in the decision making process by using available data to provide quantitative estimates of outbreak trajectories. Effective reduction of the spread of infectious diseases can be achieved through collaboration between the modeling community and public health policy community. However, such collaboration is rare, resulting in a lack of models that meet the needs of the public health community. Here we show a Susceptible-Infectious-Recovered (SIR) model modified to include control measures that allows parameter ranges, rather than parameter point estimates, and includes a web user interface for broad adoption. We apply the model to three diseases, measles, norovirus and influenza, to show the feasibility of its use and describe a research agenda to further promote interactions between decision makers and the modeling community.
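The modified SIR idea can be sketched directly: a control measure scales the transmission rate, and parameters are swept over ranges rather than fixed at point estimates. All values below are illustrative assumptions, not the tool's defaults:

```python
# SIR with a control measure that reduces transmission, integrated with
# a simple Euler scheme. `control` in [0, 1] scales down beta.

def sir_peak(beta, gamma, control=0.0, s0=0.99, i0=0.01, days=300, dt=0.1):
    """Return the peak infected fraction of the outbreak."""
    s, i, r = s0, i0, 0.0
    b = beta * (1.0 - control)
    peak = i
    for _ in range(int(days / dt)):
        ds = -b * s * i
        di = b * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
        peak = max(peak, i)
    return peak

# A range for the transmission rate, not a point estimate.
betas = [0.25, 0.30, 0.35]
gamma = 0.1
peaks_nc = [sir_peak(b, gamma, control=0.0) for b in betas]
peaks_c = [sir_peak(b, gamma, control=0.5) for b in betas]
print([round(p, 3) for p in peaks_nc])  # outbreak peaks without control
print([round(p, 3) for p in peaks_c])   # smaller peaks with 50% control
```

Reporting the span of outcomes across the parameter range, rather than a single trajectory, is what lets decision makers see how robust an intervention is to uncertainty in the disease parameters.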
ERIC Educational Resources Information Center
Iivari, Juhani; Hirschheim, Rudy
1996-01-01
Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…
Hoover, Kevin M.; Bubak, Andrew N.; Law, Isaac J.; Yaeger, Jazmine D. W.; Renner, Kenneth J.; Swallow, John G.; Greene, Michael J.
2016-01-01
Ant colonies self-organize to solve complex problems despite the simplicity of an individual ant’s brain. Pavement ant Tetramorium caespitum colonies must solve the problem of defending the territory that they patrol in search of energetically rich forage. When members of 2 colonies randomly interact at the territory boundary a decision to fight occurs when: 1) there is a mismatch in nestmate recognition cues and 2) each ant has a recent history of high interaction rates with nestmate ants. Instead of fighting, some ants will decide to recruit more workers from the nest to the fighting location, and in this way a positive feedback mediates the development of colony wide wars. In ants, the monoamines serotonin (5-HT) and octopamine (OA) modulate many behaviors associated with colony organization and in particular behaviors associated with nestmate recognition and aggression. In this article, we develop and explore an agent-based model that conceptualizes how individual changes in brain concentrations of 5-HT and OA, paired with a simple threshold-based decision rule, can lead to the development of colony wide warfare. Model simulations do lead to the development of warfare with 91% of ants fighting at the end of 1 h. When conducting a sensitivity analysis, we determined that uncertainty in monoamine concentration signal decay influences the behavior of the model more than uncertainty in the decision-making rule or density. We conclude that pavement ant behavior is consistent with the detection of interaction rate through a single timed interval rather than integration of multiple interactions. PMID:29491915
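The decision rule described above can be captured in a toy agent-based sketch: an ant fights when the encountered ant's recognition cue mismatches and its own recent nestmate interaction rate is high, tracked here as a decaying monoamine-like signal with a simple threshold. All parameter values are illustrative assumptions:

```python
import random

class Ant:
    def __init__(self, colony):
        self.colony = colony
        self.signal = 0.0               # stands in for 5-HT/OA level

    def meet(self, other, decay=0.9, bump=1.0, threshold=2.0):
        self.signal *= decay            # signal decays between encounters
        if other.colony == self.colony:
            self.signal += bump         # nestmate contact raises it
            return "pass"
        # Cue mismatch: fight only after frequent recent nestmate contact.
        return "fight" if self.signal > threshold else "avoid"

def simulate(n_per_colony=50, meetings=5000, seed=3):
    rng = random.Random(seed)
    ants = [Ant(0) for _ in range(n_per_colony)] + \
           [Ant(1) for _ in range(n_per_colony)]
    fights = 0
    for _ in range(meetings):
        a = rng.choice(ants)
        b = rng.choice([x for x in ants if x is not a])
        fights += a.meet(b) == "fight"
    return fights

fights = simulate()
print(fights)
```

Because nestmate contacts raise the signal and it decays between encounters, the threshold effectively measures recent interaction rate, mirroring the single-timed-interval detection the paper concludes is at work.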
A decision model for cost effective design of biomass based green energy supply chains.
Yılmaz Balaman, Şebnem; Selim, Hasan
2015-09-01
The core aim of this study is the cost-effective design of anaerobic digestion based biomass-to-energy supply chains. To this end, a decision model is developed. The model is based on fuzzy multi objective decision making in order to simultaneously optimize multiple economic objectives and tackle the inherent uncertainties in the parameters and in decision makers' aspiration levels for the goals. The viability of the decision model is explored with computational experiments on a real-world biomass-to-energy supply chain, and further analyses are performed to observe the effects of different conditions. In particular, scenario analyses are conducted to investigate the effects of energy crop utilization and operational costs on supply chain structure and performance measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
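The fuzzy multi-objective idea can be illustrated with a minimal max-min sketch (plan data, goal names, and bounds are invented for illustration; the paper's actual formulation is richer):

```python
def membership(value, worst, best):
    """Linear fuzzy satisfaction: 0 at the worst acceptable level, 1 at the
    aspiration level. Works for minimized goals too, since `worst` may be
    numerically larger than `best`."""
    if best == worst:
        return 1.0
    mu = (value - worst) / (best - worst)
    return max(0.0, min(1.0, mu))

def maxmin_select(plans, goals):
    """Max-min aggregation: choose the plan whose least-satisfied fuzzy goal
    is as satisfied as possible.
    plans: {name: {goal: value}}; goals: {goal: (worst, best)}."""
    scored = {name: min(membership(vals[g], *goals[g]) for g in goals)
              for name, vals in plans.items()}
    return max(scored, key=scored.get), scored
```

The max-min operator is the classic Zimmermann approach to fuzzy goal programming: it guards against a plan that excels on one objective while badly violating another.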
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; 
and Data-flow based Model Analysis.
E-DECIDER Decision Support Gateway For Earthquake Disaster Response
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.
2013-12-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). 
The E-DECIDER decision support gateway features a web interface that delivers map data products including deformation modeling results (slope change and strain magnitude) and aftershock forecasts, with remote sensing change detection results under development. These products are event triggered (from the USGS earthquake feed) and will be posted to event feeds on the E-DECIDER webpage and accessible via the mobile interface and UICDS. E-DECIDER also features a KML service that provides infrastructure information from the FEMA HAZUS database through UICDS and the mobile interface. The back-end GIS service architecture and front-end gateway components form a decision support system that is designed for ease-of-use and extensibility for end-users.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
NASA Astrophysics Data System (ADS)
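The AHP weighting step can be sketched with the row-geometric-mean approximation to the principal-eigenvector priorities (a common shortcut; the comparison matrix below is illustrative, not from the study):

```python
from math import prod

def ahp_weights(pairwise):
    """Approximate AHP criterion weights from a reciprocal pairwise
    comparison matrix: take the geometric mean of each row, then
    normalize the means to sum to 1."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]
```

Monte Carlo sensitivity analysis, as in the paper, then amounts to perturbing the comparison judgments, recomputing these weights, and observing how the resulting susceptibility map changes.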
Bayesian imperfect information analysis for clinical recurrent data
Chang, Chih-Kuang; Chang, Chi-Chang
2015-01-01
In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect-information value analysis to a clinical decision-making problem for recurrent events, producing likelihood functions and posterior distributions for realistic situations. Three kinds of failure models are considered, and our methods are illustrated with an analysis of imperfect information from a trial of immunotherapy in the treatment of chronic granulomatous disease. In addition, we present evidence toward a better understanding of the differing behaviors associated with concomitant variables. Based on the results of simulations, the imperfect-information value of the concomitant variables was evaluated, and different realistic situations were compared to see which could yield more accurate results for medical decision-making. PMID:25565853
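The abstract does not specify its failure models, but the flavor of a conjugate Bayesian update for a recurrence rate can be sketched as follows (Gamma prior with Poisson likelihood is one standard choice for recurrent-event rates; the numbers in the test are illustrative):

```python
def gamma_poisson_update(alpha, beta, events, exposure):
    """Posterior for a recurrence rate lambda: with a Gamma(alpha, beta)
    prior and `events` recurrences observed over `exposure` follow-up time,
    the Poisson likelihood gives a Gamma(alpha + events, beta + exposure)
    posterior. Returns (alpha_post, beta_post, posterior_mean_rate)."""
    a_post = alpha + events
    b_post = beta + exposure
    return a_post, b_post, a_post / b_post
```

The posterior mean (alpha + events) / (beta + exposure) shows how the prior is progressively overwhelmed by observed recurrences as follow-up accumulates.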
Testing the robustness of management decisions to uncertainty: Everglades restoration scenarios.
Fuller, Michael M; Gross, Louis J; Duke-Sylvester, Scott M; Palmer, Mark
2008-04-01
To effectively manage large natural reserves, resource managers must prepare for future contingencies while balancing the often conflicting priorities of different stakeholders. To deal with these issues, managers routinely employ models to project the response of ecosystems to different scenarios that represent alternative management plans or environmental forecasts. Scenario analysis is often used to rank such alternatives to aid the decision making process. However, model projections are subject to uncertainty in assumptions about model structure, parameter values, environmental inputs, and subcomponent interactions. We introduce an approach for testing the robustness of model-based management decisions to the uncertainty inherent in complex ecological models and their inputs. We use relative assessment to quantify the relative impacts of uncertainty on scenario ranking. To illustrate our approach we consider uncertainty in parameter values and uncertainty in input data, with specific examples drawn from the Florida Everglades restoration project. Our examples focus on two alternative 30-year hydrologic management plans that were ranked according to their overall impacts on wildlife habitat potential. We tested the assumption that varying the parameter settings and inputs of habitat index models does not change the rank order of the hydrologic plans. We compared the average projected index of habitat potential for four endemic species and two wading-bird guilds to rank the plans, accounting for variations in parameter settings and water level inputs associated with hypothetical future climates. Indices of habitat potential were based on projections from spatially explicit models that are closely tied to hydrology. For the American alligator, the rank order of the hydrologic plans was unaffected by substantial variation in model parameters. 
By contrast, simulated major shifts in water levels led to reversals in the ranks of the hydrologic plans in 24.1-30.6% of the projections for the wading bird guilds and several individual species. By exposing the differential effects of uncertainty, relative assessment can help resource managers assess the robustness of scenario choice in model-based policy decisions.
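The rank-reversal statistic reported above (24.1-30.6% of projections) can be computed, in schematic form, as the fraction of paired Monte Carlo draws in which the nominally preferred plan is overtaken (the data in the test are made up):

```python
def rank_reversal_fraction(scores_preferred, scores_other):
    """Given paired Monte Carlo projections of a habitat index for the
    nominally preferred plan and the alternative, return the fraction of
    draws in which the alternative overtakes the preferred plan."""
    pairs = list(zip(scores_preferred, scores_other))
    return sum(1 for p, o in pairs if o > p) / len(pairs)
```

A fraction near zero means the scenario ranking is robust to the sampled uncertainty; a large fraction, as found here for the wading-bird guilds under shifted water levels, flags a decision that hinges on uncertain inputs.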
Decision on risk-averse dual-channel supply chain under demand disruption
NASA Astrophysics Data System (ADS)
Yan, Bo; Jin, Zijie; Liu, Yanping; Yang, Jianbo
2018-02-01
We studied dual-channel supply chains using centralized and decentralized decision-making models. We also conducted a comparative analysis of the decisions before and after demand disruption. The study shows that the amount of change in decision-making is a linear function of the amount of demand disruption, and is independent of the risk-averse coefficient. The optimal sales volume decision of the disrupted supply chain is related to market share and demand disruption in the decentralized decision-making model, whereas the optimal decision is influenced only by demand disruption in the centralized decision-making model. The stability of the sales volume of the two models is related to market share and demand disruption. The optimal system production of the two models is robust, but their stability intervals differ.
Chronic Heart Failure Follow-up Management Based on Agent Technology.
Mohammadzadeh, Niloofar; Safdari, Reza
2015-10-01
Monitoring heart failure patients through continuous assessment of signs and symptoms with information technology tools leads to a large reduction in re-hospitalization. Agent technology is one of the strongest artificial intelligence areas; therefore, it can be expected to facilitate, accelerate, and improve health services, especially in home care and telemedicine. The aim of this article is to provide an agent-based model for chronic heart failure (CHF) follow-up management. This research was performed in 2013-2014 to determine appropriate scenarios and the data required to monitor and follow up CHF patients, and then an agent-based model was designed. Agents in the proposed model perform the following tasks: medical data access; communication with other agents of the framework; and intelligent data analysis, including medical data processing, reasoning, negotiation for decision-making, and learning capabilities. The proposed multi-agent system has the ability to learn and thus improve itself. Implementation of this model with more and varied interval times at a broader level could achieve better results. The proposed multi-agent system is no substitute for cardiologists, but it could assist them in decision-making.
Hartley, Matt; Roberts, Helen
2015-09-01
Disease control management relies on the development of policy supported by an evidence base. The evidence base for disease in zoo animals is often absent or incomplete. Resources for disease research in these species are limited, and so in order to develop effective policies, novel approaches to extrapolating knowledge and dealing with uncertainty need to be developed. This article demonstrates how qualitative risk analysis techniques can be used to aid decision-making in circumstances in which there is a lack of specific evidence using the import of rabies-susceptible zoo mammals into the United Kingdom as a model.
Presser, Theresa S.; Jenni, Karen E.; Nieman, Timothy; Coleman, James
2010-01-01
Constraints on drainage management in the western San Joaquin Valley and implications of proposed approaches to management were recently evaluated by the U.S. Geological Survey (USGS). The USGS found that a significant amount of data for relevant technical issues was available and that a structured, analytical decision support tool could help optimize combinations of specific in-valley drainage management strategies, address uncertainties, and document underlying data analysis for future use. To follow-up on USGS's technical analysis and to help define a scientific basis for decisionmaking in implementing in-valley drainage management strategies, this report describes the first step (that is, a framing study) in a Decision Analysis process. In general, a Decision Analysis process includes four steps: (1) problem framing to establish the scope of the decision problem(s) and a set of fundamental objectives to evaluate potential solutions, (2) generation of strategies to address identified decision problem(s), (3) identification of uncertainties and their relationships, and (4) construction of a decision support model. Participation in such a systematic approach can help to promote consensus and to build a record of qualified supporting data for planning and implementation. In December 2008, a Decision Analysis framing study was initiated with a series of meetings designed to obtain preliminary input from key stakeholder groups on the scope of decisions relevant to drainage management that were of interest to them, and on the fundamental objectives each group considered relevant to those decisions. Two key findings of this framing study are: (1) participating stakeholders have many drainage management objectives in common; and (2) understanding the links between drainage management and water management is necessary both for sound science-based decisionmaking and for resolving stakeholder differences about the value of proposed drainage management solutions. 
Citing ongoing legal processes associated with drainage management in the western San Joaquin Valley, the U.S. Bureau of Reclamation (USBR) withdrew from the Decision Analysis process early in the proceedings. Without the involvement of the USBR, the USGS discontinued further development of this study.
Analysis of strength-of-preference measures in dichotomous choice models
Donald F. Dennis; Peter Newman; Robert Manning
2008-01-01
Choice models are becoming increasingly useful for soliciting and analyzing multiple objective decisions faced by recreation managers and others interested in decisions involving natural resources. Choice models are used to estimate relative values for multiple aspects of natural resource management, not individually but within the context of other relevant decision...
Li, Xingang; Li, Jia; Sui, Hong; He, Lin; Cao, Xingtao; Li, Yonghong
2018-07-05
Soil remediation has been considered one of the most difficult pollution treatment tasks due to its high complexity in contaminants, geological conditions, usage, urgency, etc. The diversity of remediation technologies makes the quick selection of suitable remediation schemes even tougher once the site investigation has been done. Herein, a sustainable decision support hierarchical model has been developed to select, evaluate, and determine preferred soil remediation schemes comprehensively, based on a modified analytic hierarchy process (MAHP). The MAHP method combines a competence model and the Grubbs criteria with the conventional AHP. It not only considers the competence differences among experts in group decision making, but also adjusts for the large deviations caused by different experts' preferences through sample analysis. This makes the final remediation decision more reasonable. In this model, different evaluation criteria, including economic effect, environmental effect, and technological effect, are employed to evaluate the integrated performance of remediation schemes, followed by a strict computation using the above MAHP. To confirm the feasibility of the developed model, it has been tested on a benzene-workshop contaminated site at the Beijing coking plant. Beyond soil remediation, this MAHP model could also be applied in other fields involving multi-criteria group decision making. Copyright © 2018 Elsevier B.V. All rights reserved.
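A minimal sketch of Grubbs-based screening of expert judgments followed by competence-weighted aggregation (the critical value, scores, and competence weights are illustrative, not the paper's):

```python
from statistics import mean, stdev

def grubbs_screen(scores, g_crit=1.48):
    """One pass of a Grubbs-style outlier screen: drop the single judgment
    farthest from the mean if its standardized deviation exceeds g_crit.
    The proper critical value depends on n and the significance level; 1.48
    roughly corresponds to n=4 at the 5% level and is used here for
    illustration."""
    if len(scores) < 3:
        return list(scores)
    m, s = mean(scores), stdev(scores)
    if s == 0:
        return list(scores)
    worst = max(scores, key=lambda x: abs(x - m))
    if abs(worst - m) / s > g_crit:
        return [x for x in scores if x != worst]
    return list(scores)

def weighted_score(scores, competence):
    """Competence-weighted aggregate of the surviving expert judgments."""
    return sum(s * w for s, w in zip(scores, competence)) / sum(competence)
```

Screening first and weighting second mirrors the abstract's two adjustments: big deviations from preference differences are removed, and remaining judgments count in proportion to expert competence.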
Predicting explorative motor learning using decision-making and motor noise.
Chen, Xiuli; Mohr, Kieran; Galea, Joseph M
2017-04-01
A fundamental problem faced by humans is learning to select motor actions based on noisy sensory information and incomplete knowledge of the world. Recently, a number of authors have asked whether this type of motor learning problem might be very similar to a range of higher-level decision-making problems. If so, participant behaviour on a high-level decision-making task could be predictive of their performance during a motor learning task. To investigate this question, we studied performance during an explorative motor learning task and a decision-making task which had a similar underlying structure with the exception that it was not subject to motor (execution) noise. We also collected an independent measurement of each participant's level of motor noise. Our analysis showed that explorative motor learning and decision-making could be modelled as the (approximately) optimal solution to a Partially Observable Markov Decision Process bounded by noisy neural information processing. The model was able to predict participant performance in motor learning by using parameters estimated from the decision-making task and the separate motor noise measurement. This suggests that explorative motor learning can be formalised as a sequential decision-making process that is adjusted for motor noise, and raises interesting questions regarding the neural origin of explorative motor learning.
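The partially observable structure can be illustrated with the simplest possible belief update (binary hidden state and noisy binary observation; a toy stand-in for the paper's POMDP, with illustrative noise levels):

```python
def belief_update(belief, observation, hit_prob=0.8):
    """Bayes update of the belief that a hidden binary state is 1, given an
    observation that reports the true state with probability hit_prob
    (observation noise standing in for sensory and motor noise)."""
    like1 = hit_prob if observation == 1 else 1.0 - hit_prob
    like0 = 1.0 - hit_prob if observation == 1 else hit_prob
    num = like1 * belief
    return num / (num + like0 * (1.0 - belief))
```

An (approximately) optimal policy then maps the belief, not the raw observation, to an action; adjusting hit_prob per participant is the loose analogue of correcting for individual motor noise.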
Konijeti, Gauree G; Sauk, Jenny; Shrime, Mark G; Gupta, Meera; Ananthakrishnan, Ashwin N
2014-06-01
Clostridium difficile infection (CDI) is an important cause of morbidity and healthcare costs, and is characterized by high rates of disease recurrence. The cost-effectiveness of newer treatments for recurrent CDI has not been examined, yet would be important to inform clinical practice. The aim of this study was to analyze the cost effectiveness of competing strategies for recurrent CDI. We constructed a decision-analytic model comparing 4 treatment strategies for first-line treatment of recurrent CDI in a population with a median age of 65 years: metronidazole, vancomycin, fidaxomicin, and fecal microbiota transplant (FMT). We modeled up to 2 additional recurrences following the initial recurrence. We assumed FMT delivery via colonoscopy as our base case, but conducted sensitivity analyses based on different modes of delivery. Willingness-to-pay threshold was set at $50 000 per quality-adjusted life-year. At our base case estimates, initial treatment of recurrent CDI using FMT colonoscopy was the most cost-effective strategy, with an incremental cost-effectiveness ratio of $17 016 relative to oral vancomycin. Fidaxomicin and metronidazole were both dominated by FMT colonoscopy. On sensitivity analysis, FMT colonoscopy remained the most cost-effective strategy at cure rates >88.4% and CDI recurrence rates <14.9%. Fidaxomicin required a cost <$1359 to meet our cost-effectiveness threshold. In clinical settings where FMT is not available or applicable, the preferred strategy appears to be initial treatment with oral vancomycin. In this decision analysis examining treatment strategies for recurrent CDI, we demonstrate that FMT colonoscopy is the most cost-effective initial strategy for management of recurrent CDI.
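The incremental cost-effectiveness ratio driving the comparison can be sketched as follows (the costs and QALYs in the test are illustrative, not the study's figures):

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY of a
    new strategy over a reference. Returns None when the new strategy is no
    more effective, since the ratio is then not meaningful (and the strategy
    is dominated if it also costs more)."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_qaly <= 0:
        return None
    return d_cost / d_qaly

def cost_effective(cost_new, qaly_new, cost_ref, qaly_ref, wtp=50_000):
    """A strategy is preferred at a willingness-to-pay threshold if its ICER
    falls below the threshold, or if it is cheaper and no less effective."""
    r = icer(cost_new, qaly_new, cost_ref, qaly_ref)
    if r is None:
        return cost_new < cost_ref and qaly_new >= qaly_ref
    return r < wtp
```

This is the comparison behind the reported $17 016 per QALY for FMT colonoscopy relative to oral vancomycin, judged against the $50 000 willingness-to-pay threshold.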
Graeden, Ellie; Kerr, Justin; Sorrell, Erin M.; Katz, Rebecca
2018-01-01
Managing infectious disease requires rapid and effective response to support decision making. The decisions are complex and require understanding of the diseases, disease intervention and control measures, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions, the complexity of current models presents a significant barrier to community-level decision makers in using the outputs of the most scientifically robust methods to support pragmatic decisions about implementing a public health response effort, even for endemic diseases with which they are already familiar. Here, we describe the development of an application available on the internet, including from mobile devices, with a simple user interface, to support on-the-ground decision-making for integrating disease control programs, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap, and which result in significant morbidity and mortality in affected regions. Working with data from countries across sub-Saharan Africa and the Middle East, we present a proof-of-principle method and corresponding prototype tool to provide guidance on how to optimize integration of vertical disease control programs. This method and tool demonstrate significant progress in effectively translating the best available scientific models to support practical decision making on the ground with the potential to significantly increase the efficacy and cost-effectiveness of disease control. 
Author summary Designing and implementing effective programs for infectious disease control requires complex decision-making, informed by an understanding of the diseases, the types of disease interventions and control measures available, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions and support decision-making, the complexity of current models presents a significant barrier to on-the-ground end users. The picture is further complicated when considering approaches for integration of different disease control programs, where co-infection dynamics, treatment interactions, and other variables must also be taken into account. Here, we describe the development of an application available on the internet with a simple user interface, to support on-the-ground decision-making for integrating disease control, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap. This proof-of-concept method and tool demonstrate significant progress in effectively translating the best available scientific models to support pragmatic decision-making on the ground, with the potential to significantly increase the impact and cost-effectiveness of disease control. PMID:29649260
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
NASA Astrophysics Data System (ADS)
Ohdaira, Tetsushi
2014-07-01
Previous studies discussing cooperation employ the best decision, in which every player knows all information regarding the payoff matrix and selects the strategy with the highest payoff. Therefore, they do not discuss cooperation based on an altruistic decision made with limited information (the bounded rational altruistic decision). In addition, they do not cover the case where every player can submit his/her strategy several times in a match of the game. This paper is based on Ohdaira's reconsideration of the bounded rational altruistic decision, and also employs the framework of the prisoner's dilemma game (PDG) with sequential strategy. The distinction between this study and Ohdaira's reconsideration is that the former covers a model of multiple groups, whereas the latter deals with a model of only two groups. Ohdaira's reconsideration shows that the bounded rational altruistic decision facilitates much more cooperation in the PDG with sequential strategy than Ohdaira and Terano's bounded rational second-best decision does. However, the details of cooperation among multiple groups based on the bounded rational altruistic decision have not been resolved yet. This study therefore shows how randomness in a network composed of multiple groups affects the increase in the average frequency of mutual cooperation (cooperation between groups) based on the bounded rational altruistic decision of multiple groups. We also discuss the results of the model in comparison with related studies that employ the best decision.
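The contrast between the best decision and the altruistic decision can be made concrete with standard prisoner's dilemma payoffs (a toy one-shot version; the paper's sequential-strategy, multi-group model is considerably richer):

```python
# Standard prisoner's dilemma payoffs, (my_payoff, opponent_payoff), with T > R > P > S.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def best_move(believed_opponent_move):
    """Best decision: maximize one's own payoff given a belief about the
    opponent's move. In the PD, defection dominates."""
    return max("CD", key=lambda me: PAYOFF[(me, believed_opponent_move)][0])

def altruistic_move(believed_opponent_move):
    """Altruistic decision: maximize the opponent's payoff instead. This is
    the spirit, not the exact form, of the bounded rational altruistic rule."""
    return max("CD", key=lambda me: PAYOFF[(me, believed_opponent_move)][1])
```

The best decision always defects, while the altruistic decision cooperates regardless of belief, which is what allows mutual cooperation to spread between groups.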
NASA Technical Reports Server (NTRS)
Menke, M. M.; Judd, B. R.
1973-01-01
The development policy for thermionic reactors to provide electric propulsion and power for space exploration was analyzed to develop a logical procedure for selecting development alternatives that reflect the technical feasibility, JPL/NASA project objectives, and the economic environment of the project. The partial evolution of a decision model from the underlying philosophy of decision analysis to a deterministic pilot phase is presented, and the general manner in which this decision model can be employed to examine propulsion development alternatives is illustrated.
A Decision Support Framework for Science-Based, Multi-Stakeholder Deliberation: A Coral Reef Example
NASA Astrophysics Data System (ADS)
Rehr, Amanda P.; Small, Mitchell J.; Bradley, Patricia; Fisher, William S.; Vega, Ann; Black, Kelly; Stockton, Tom
2012-12-01
We present a decision support framework for science-based assessment and multi-stakeholder deliberation. The framework consists of two parts: a DPSIR (Drivers-Pressures-States-Impacts-Responses) analysis to identify the important causal relationships among anthropogenic environmental stressors, processes, and outcomes; and a Decision Landscape analysis to depict the legal, social, and institutional dimensions of environmental decisions. The Decision Landscape incorporates interactions among government agencies, regulated businesses, non-government organizations, and other stakeholders. It also identifies where scientific information regarding environmental processes is collected and transmitted to improve knowledge about elements of the DPSIR and to improve the scientific basis for decisions. Our application of the decision support framework to coral reef protection and restoration in the Florida Keys, focusing on anthropogenic stressors such as wastewater, proved successful and offered several insights. Using information from a management plan, it was possible to capture the current state of the science with the DPSIR analysis, as well as important decision options, decision makers, and applicable laws with the Decision Landscape analysis. A structured elicitation of values and beliefs conducted at a coral reef management workshop held in Key West, Florida provided a diversity of opinion and also indicated a prioritization of several environmental stressors affecting coral reef health. The integrated DPSIR/Decision Landscape framework for the Florida Keys, developed from the elicited opinion and the DPSIR analysis, can be used to inform management decisions, to reveal the role that further scientific information and research might play in populating the framework, and to facilitate better-informed agreement among participants.
Hanrahan, Kirsten; McCarthy, Ann Marie; Kleiber, Charmaine; Ataman, Kaan; Street, W Nick; Zimmerman, M Bridget; Ersig, Anne L
2012-10-01
This secondary data analysis used data mining methods to develop predictive models of a child's risk for distress during a healthcare procedure. The data came from a study that predicted factors associated with children's responses to an intravenous catheter insertion while parents provided distraction coaching. From the 255 items used in the primary study, 44 predictive items were identified through automatic feature selection and used to build support vector machine regression models. Models were validated using multiple cross-validation tests and by comparing variables identified as explanatory in traditional versus support vector machine regression. Rule-based approaches were applied to the model outputs to identify overall risk for distress. A decision tree was then applied to evidence-based instructions for tailoring distraction to the characteristics and preferences of the parent and child. The resulting decision support computer application, titled Children, Parents and Distraction, is being used in research. Future use will support practitioners in deciding the level and type of distraction intervention needed by a child undergoing a healthcare procedure.
[GIS and scenario analysis aid to water pollution control planning of river basin].
Wang, Shao-ping; Cheng, Sheng-tong; Jia, Hai-feng; Ou, Zhi-dan; Tan, Bin
2004-07-01
The forward and backward algorithms for watershed water pollution control planning are summarized in this paper, along with their advantages and shortcomings. Spatial databases of water environmental function regions, pollution sources, monitoring sections, and sewer outlets were built on the ArcGIS 8.1 platform in a case study of the Ganjiang valley, Jiangxi province. Based on the principles of the forward algorithm, four scenarios were designed for watershed pollution control. Under these scenarios, ten sets of planning schemes were generated to implement cascade pollution source control. The investment costs of sewage treatment for these schemes were estimated by means of a series of cost-effectiveness functions; with pollution source prediction, the water quality under each planning scheme was modeled with a CSTR model. The modeled results of the different planning schemes were visualized through GIS to aid decision-making. Taking investment cost and water quality attainment as decision criteria, and based on an analysis of the economically endurable capacity for water pollution control in the Ganjiang river basin, two optimized schemes were proposed. The research shows that GIS technology and scenario analysis can provide good guidance on the synthesis, integrity, and sustainability aspects of river basin water quality planning.
Urdahl, Hege; Manca, Andrea; Sculpher, Mark J
2008-01-01
Background To support decision making, many countries have now introduced formal assessment processes to evaluate whether health technologies represent good ‘value for money’. These often take the form of decision models, which can be used to explore elements of importance to the generalisability of study results across clinical settings and jurisdictions. The objectives of the present review were to assess: (i) whether the published studies clearly defined the decision-making audience for the model; (ii) the transparency of the reporting in terms of study question, structure and data inputs; (iii) the relevance of the data inputs used in the model to the stated decision-maker or jurisdiction; and (iv) how fully the robustness of the model's results to variation in data inputs between locations was assessed. Methods Articles reporting decision-analytic models in the area of osteoporosis were assessed to establish the extent to which the information provided enabled decision makers in different countries/jurisdictions to fully appreciate the variability of results according to location, and the relevance to their own. Results Of the 18 articles included in the review, only three explicitly stated the decision-making audience. It was not possible to infer a decision-making audience in eight studies. Target population was well reported, as were resource and cost data, and the clinical data used for estimates of relative risk reduction. However, baseline risk was rarely adapted to the relevant jurisdiction, and when no decision-maker was explicit it was difficult to assess whether the reported cost and resource use data were in fact relevant. A few studies used sensitivity analysis to explore elements of generalisability, such as compliance rates and baseline fracture risk rates, although such analyses were generally restricted to evaluating parameter uncertainty.
Conclusion This review found that variability in cost-effectiveness across locations is addressed to a varying extent in modelling studies in the field of osteoporosis, limiting their use for decision-makers across different locations. Transparency of reporting is expected to increase as methodology develops, and decision-makers publish “reference case” type guidance. PMID:17129074
Hong, Taehoon; Koo, Choongwan; Kim, Hyunjoong
2012-12-15
The number of deteriorated multi-family housing complexes in South Korea continues to rise, and consequently their electricity consumption is also increasing. This needs to be addressed as part of the nation's efforts to reduce energy consumption. The objective of this research was to develop a decision support model for determining the need to improve multi-family housing complexes. In this research, 1664 cases located in Seoul were selected for model development. The research team collected the characteristics and electricity consumption data of these projects in 2009-2010. The following were carried out in this research: (i) using the Decision Tree, multi-family housing complexes were clustered based on their electricity consumption; (ii) using Case-Based Reasoning, similar cases were retrieved from the same cluster; and (iii) using a combination of Multiple Regression Analysis, Artificial Neural Network, and Genetic Algorithm, the prediction performance of the developed model was improved. The results of this research can be used as follows: (i) as basic research data for continuously managing the energy consumption data of multi-family housing complexes; (ii) as advanced research data for predicting energy consumption based on project characteristics; (iii) as practical research data for selecting the multi-family housing complex with the most potential in terms of energy savings; and (iv) as consistent and objective criteria for incentives and penalties. Copyright © 2012 Elsevier Ltd. All rights reserved.
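The retrieval step of Case-Based Reasoning within a Decision-Tree cluster can be sketched as a weighted nearest-neighbour search. All attribute names, values, and weights below are hypothetical placeholders, not the study's actual Seoul data:

```python
# Case base: past housing complexes with hypothetical attributes; "cluster" is
# the Decision-Tree cluster label and "kwh" the known annual consumption.
cases = [
    {"id": "A", "cluster": 1, "area_m2": 60, "households": 400, "age_yr": 25, "kwh": 1.9e6},
    {"id": "B", "cluster": 1, "area_m2": 85, "households": 620, "age_yr": 18, "kwh": 2.8e6},
    {"id": "C", "cluster": 2, "area_m2": 55, "households": 300, "age_yr": 30, "kwh": 1.4e6},
]
FEATURES = ("area_m2", "households", "age_yr")

def similarity(query, case, weights):
    """Weighted inverse relative-distance similarity over the attributes."""
    d = sum(w * abs(query[f] - case[f]) / max(query[f], case[f])
            for f, w in zip(FEATURES, weights))
    return 1.0 / (1.0 + d)

def retrieve(query, k=1, weights=(1.0, 1.0, 1.0)):
    """Return the k most similar past cases within the query's cluster."""
    pool = [c for c in cases if c["cluster"] == query["cluster"]]
    return sorted(pool, key=lambda c: -similarity(query, c, weights))[:k]

query = {"cluster": 1, "area_m2": 62, "households": 420, "age_yr": 24}
best = retrieve(query)[0]  # nearest neighbour within cluster 1
```

The retrieved cases' known consumption values would then feed the prediction step, which the study refines with regression, neural network, and genetic-algorithm components.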
Operationalising uncertainty in data and models for integrated water resources management.
Blind, M W; Refsgaard, J C
2007-01-01
Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.
Ertefaie, Ashkan; Shortreed, Susan; Chakraborty, Bibhas
2016-06-15
Q-learning is a regression-based approach that uses longitudinal data to construct dynamic treatment regimes, which are sequences of decision rules that use patient information to inform future treatment decisions. An optimal dynamic treatment regime is composed of a sequence of decision rules that indicate how to optimally individualize treatment using the patients' baseline and time-varying characteristics to optimize the final outcome. Constructing optimal dynamic regimes using Q-learning depends heavily on the assumption that regression models at each decision point are correctly specified; yet model checking in the context of Q-learning has been largely overlooked in the current literature. In this article, we show that residual plots obtained from standard Q-learning models may fail to adequately check the quality of the model fit. We present a modified Q-learning procedure that accommodates residual analyses using standard tools. We present simulation studies showing the advantage of the proposed modification over standard Q-learning. We illustrate this new Q-learning approach using data collected from a sequential multiple assignment randomized trial of patients with schizophrenia. Copyright © 2016 John Wiley & Sons, Ltd.
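A minimal two-stage Q-learning fit by backward induction might look as follows, using simulated data. The generative model, covariates, and coefficients are hypothetical; the paper's application is a schizophrenia SMART study, and its modified residual-diagnostic procedure is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated two-stage trial: x baseline covariate, a1/a2 randomized
# stage-1/stage-2 treatments coded +/-1, y final outcome (hypothetical model).
x = rng.normal(size=n)
a1 = rng.choice([-1.0, 1.0], size=n)
a2 = rng.choice([-1.0, 1.0], size=n)
y = x + a1 * (0.5 + x) + a2 * (0.3 - x) + rng.normal(scale=0.1, size=n)

def ols(features, target):
    """Ordinary least squares fit, returning the coefficient vector."""
    coef, *_ = np.linalg.lstsq(features, target, rcond=None)
    return coef

# Stage 2: regress the outcome on the full history and stage-2 treatment terms.
F2 = np.column_stack([np.ones(n), x, a1, a1 * x, a2, a2 * x])
b2 = ols(F2, y)

# Optimal stage-2 rule is a2 = sign(b2[4] + b2[5]*x); plugging it in gives the
# stage-2 value (pseudo-outcome) used as the stage-1 regression target.
q2_max = F2[:, :4] @ b2[:4] + np.abs(b2[4] + b2[5] * x)

# Stage 1: regress the pseudo-outcome on stage-1 history (backward induction).
F1 = np.column_stack([np.ones(n), x, a1, a1 * x])
b1 = ols(F1, q2_max)
```

With this generative model the fitted rules recover the true optimum, a2 = sign(0.3 - x) and a1 = sign(0.5 + x); the paper's point is that checking these stage-wise regressions with residual plots requires modifying the standard procedure.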
Dowding, Dawn; Lichtner, Valentina; Allcock, Nick; Briggs, Michelle; James, Kirstin; Keady, John; Lasrado, Reena; Sampson, Elizabeth L; Swarbrick, Caroline; José Closs, S
2016-01-01
The recognition, assessment and management of pain in hospital settings is suboptimal, and is a particular challenge in patients with dementia. The existing process guiding pain assessment and management in clinical settings is based on the assumption that nurses follow a sequential linear approach to decision making. In this paper we re-evaluate this theoretical assumption drawing on findings from a study of pain recognition, assessment and management in patients with dementia. To provide a revised conceptual model of pain recognition, assessment and management based on sense-making theories of decision making. The research we refer to is an exploratory ethnographic study using nested case sites. Patients with dementia (n=31) were the unit of data collection, nested in 11 wards (vascular, continuing care, stroke rehabilitation, orthopaedic, acute medicine, care of the elderly, elective and emergency surgery), located in four NHS hospital organizations in the UK. Data consisted of observations of patients at bedside (170h in total); observations of the context of care; audits of patient hospital records; documentary analysis of artefacts; semi-structured interviews (n=56) and informal open conversations with staff and carers (family members). Existing conceptualizations of pain recognition, assessment and management do not fully explain how the decision process occurs in clinical practice. Our research indicates that pain recognition, assessment and management is not an individual cognitive activity; rather it is carried out by groups of individuals over time and within a specific organizational culture or climate, which influences both health care professional and patient behaviour. We propose a revised theoretical model of decision making related to pain assessment and management for patients with dementia based on theories of sense-making, which is reflective of the reality of clinical decision making in acute hospital wards. 
The revised model recognizes the salience of individual cognition as well as acknowledging that decisions are constructed through social interaction and organizational context. The model will be used in further research to develop decision support interventions to assist with the assessment and management of patients with dementia in acute hospital settings. Copyright © 2015. Published by Elsevier Ltd.
Gervais, Debra A.; Hartman, Rebecca I.; Harisinghani, Mukesh G.; Feldman, Adam S.; Mueller, Peter R.; Gazelle, G. Scott
2010-01-01
Purpose: To evaluate the effectiveness, cost, and cost-effectiveness of using renal mass biopsy to guide treatment decisions for small incidentally detected renal tumors. Materials and Methods: A decision-analytic Markov model was developed to estimate life expectancy and lifetime costs for patients with small (≤4-cm) renal tumors. Two strategies were compared: renal mass biopsy to triage patients to surgery or imaging surveillance and empiric nephron-sparing surgery. The model incorporated biopsy performance, the probability of track seeding with malignant cells, the prevalence and growth of benign and malignant tumors, treatment effectiveness and costs, and patient outcomes. An incremental cost-effectiveness analysis was performed to identify strategy preference under a willingness-to-pay threshold of $75 000 per quality-adjusted life-year (QALY). Effects of changes in key parameters on strategy preference were evaluated in sensitivity analysis. Results: Under base-case assumptions, the biopsy strategy yielded a minimally greater quality-adjusted life expectancy (4 days) than did empiric surgery at a lower lifetime cost ($3466), dominating surgery from a cost-effectiveness perspective. Over the majority of parameter ranges tested in one-way sensitivity analysis, the biopsy strategy dominated surgery or was cost-effective relative to surgery based on a $75 000-per-QALY willingness-to-pay threshold. In two-way sensitivity analysis, surgery yielded greater life expectancy when the prevalence of malignancy and propensity for biopsy-negative cancers to metastasize were both higher than expected or when the sensitivity and specificity of biopsy were both lower than expected. Conclusion: The use of biopsy to guide treatment decisions for small incidentally detected renal tumors is cost-effective and can prevent unnecessary surgery in many cases. © RSNA, 2010 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.10092013/-/DC1 PMID:20720070
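The dominance and willingness-to-pay logic of the incremental cost-effectiveness comparison can be sketched directly. The deltas in the usage line are the base-case values reported in the abstract (roughly 4 quality-adjusted days gained and $3466 saved by the biopsy strategy); the classifier itself is a generic sketch, not the paper's Markov model.

```python
WTP = 75_000  # willingness-to-pay threshold, $ per QALY (as in the abstract)

def compare(delta_cost, delta_qaly, wtp=WTP):
    """Classify strategy A vs. comparator B from A's incremental cost and QALYs."""
    if delta_cost <= 0 and delta_qaly >= 0:
        return "A dominates B"      # cheaper and at least as effective
    if delta_cost >= 0 and delta_qaly <= 0:
        return "B dominates A"
    icer = delta_cost / delta_qaly  # incremental cost-effectiveness ratio
    if delta_qaly > 0:              # A costs more but gains QALYs
        return "A cost-effective" if icer <= wtp else "A not cost-effective"
    # A saves money but loses QALYs: acceptable only if savings per QALY
    # forgone exceed the threshold.
    return "A cost-effective" if icer >= wtp else "A not cost-effective"

# Biopsy (A) vs. empiric surgery (B): ~$3466 cheaper, ~4 quality-adjusted days better.
verdict = compare(-3466, 4 / 365.25)
```

Because the biopsy strategy falls in the cheaper-and-more-effective quadrant, no ICER against the threshold is needed: it simply dominates, which is the base-case result the abstract reports.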
Holt, S; Bertelli, G; Humphreys, I; Valentine, W; Durrani, S; Pudney, D; Rolles, M; Moe, M; Khawaja, S; Sharaiha, Y; Brinkworth, E; Whelan, S; Jones, S; Bennett, H; Phillips, C J
2013-01-01
Background: Tumour gene expression analysis is useful in predicting adjuvant chemotherapy benefit in early breast cancer patients. This study aims to examine the implications of routine Oncotype DX testing in the UK. Methods: Women with oestrogen receptor positive (ER+), pN0 or pN1mi breast cancer were assessed for adjuvant chemotherapy and subsequently offered Oncotype DX testing, with changes in chemotherapy decisions recorded. A subset of patients completed questionnaires about their uncertainties regarding chemotherapy decisions pre- and post-testing. All patients were asked to complete a diary of medical interactions over the next 6 months, from which economic data were extracted to model the cost-effectiveness of testing. Results: Oncotype DX testing resulted in changes in chemotherapy decisions in 38 of 142 (26.8%) women, with 26 of 57 (45.6%) spared chemotherapy and 12 of 85 (14.1%) requiring chemotherapy when not initially recommended (9.9% reduction overall). Decision conflict analysis showed that Oncotype DX testing increased patients' confidence in treatment decision making. Economic analysis showed that routine Oncotype DX testing costs £6232 per quality-adjusted life year gained. Conclusion: Oncotype DX decreased chemotherapy use and increased confidence in treatment decision making in patients with ER+ early-stage breast cancer. Based on these findings, Oncotype DX is cost-effective in the UK setting. PMID:23695023
The use of decision analysis to examine ethical decision making by critical care nurses.
Hughes, K K; Dvorak, E M
1997-01-01
To examine the extent to which critical care staff nurses make ethical decisions that coincide with those recommended by a decision analytic model. Nonexperimental, ex post facto. Midwestern university-affiliated 500-bed tertiary care medical center. One hundred critical care staff nurses randomly selected from seven critical care units. Complete responses were obtained from 82 nurses (for a final response rate of 82%). The dependent variable, consistent decision making, was measured as staff nurses' ability to make ethical decisions that coincided with those prescribed by the decision model. Subjects completed two instruments: the Ethical Decision Analytic Model, a computer-administered instrument designed to measure staff nurses' ability to make consistent decisions about a chemically impaired colleague; and a Background Inventory. The results indicate marked consensus among nurses when informal methods were used. However, there was little consistency between the nurses' informal decisions and those recommended by the decision analytic model. Although 50% (n = 41) of all nurses chose a course of action that coincided with the model's least optimal alternative, few nurses agreed with the model as to the most optimal course of action. The findings also suggest that consistency was unrelated (p > 0.05) to the nurses' educational background or years of clinical experience; that most subjects reported receiving little or no education in decision making during their basic nursing education programs; but that exposure to decision-making strategies was related to years of nursing experience (p < 0.05). The findings differ from related studies that have found a moderate degree of consistency between nurses and decision analytic models for strictly clinical decision tasks, especially when those tasks were less complex.
However, the findings partially coincide with other findings that decision analysis may not be particularly well-suited to the critical care environment. Additional research is needed to determine whether critical care nurses use the same decision-making methods as do other nurses; and to clarify the effects of decision task (clinical versus ethical) on nurses' decision making. It should not be assumed that methods used to study nurses' clinical decision making are applicable for all nurses or all types of decisions, including ethical decisions.
Cost-Utility Analysis of Bariatric Surgery in Italy: Results of Decision-Analytic Modelling.
Lucchese, Marcello; Borisenko, Oleg; Mantovani, Lorenzo Giovanni; Cortesi, Paolo Angelo; Cesana, Giancarlo; Adam, Daniel; Burdukova, Elisabeth; Lukyanov, Vasily; Di Lorenzo, Nicola
2017-01-01
To evaluate the cost-effectiveness of bariatric surgery in Italy from a third-party payer perspective over a medium-term (10 years) and a long-term (lifetime) horizon. A state-transition Markov model was developed, in which patients may undergo surgery, experience post-surgery complications, develop type 2 diabetes mellitus or cardiovascular diseases, or die. Transition probabilities, costs, and utilities were obtained from the Italian and international literature. Three types of surgery were considered: gastric bypass, sleeve gastrectomy, and adjustable gastric banding. A base-case analysis was performed for a population whose characteristics were obtained from surgery candidates in Italy. In the base-case analysis, over 10 years, bariatric surgery led to a cost increment of EUR 2,661 and generated an additional 1.1 quality-adjusted life years (QALYs). Over a lifetime, surgery led to savings of EUR 8,649, an additional 0.5 life years, and 3.2 QALYs. Bariatric surgery was cost-effective at 10 years with an incremental cost-effectiveness ratio of EUR 2,412/QALY and dominant over conservative management over a lifetime. In a comprehensive decision analytic model, the current mix of surgical methods for bariatric surgery was cost-effective at 10 years and cost-saving over the lifetime of the Italian patient cohort considered in this analysis. © 2017 The Author(s) Published by S. Karger GmbH, Freiburg.
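A state-transition Markov cohort model of this general kind can be sketched as follows. The states, transition probabilities, per-cycle costs, utilities, and discount rate below are all hypothetical placeholders, not the paper's inputs:

```python
import numpy as np

# States: 0 = post-surgery well, 1 = diabetes, 2 = cardiovascular disease, 3 = dead.
# Annual transition probabilities (hypothetical; each row sums to 1).
P = np.array([
    [0.93, 0.03, 0.02, 0.02],
    [0.00, 0.92, 0.04, 0.04],
    [0.00, 0.00, 0.94, 0.06],
    [0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.85, 0.75, 0.65, 0.0])    # QALY weight per state-year
cost = np.array([500.0, 2000.0, 3500.0, 0.0])  # EUR per state-year
discount = 0.03                                # annual discount rate

dist = np.array([1.0, 0.0, 0.0, 0.0])          # cohort starts post-surgery
total_qalys = total_cost = 0.0
for year in range(10):                         # 10-year horizon
    d = 1.0 / (1.0 + discount) ** year
    total_qalys += d * (dist @ utility)        # discounted QALYs accrued this cycle
    total_cost += d * (dist @ cost)            # discounted costs accrued this cycle
    dist = dist @ P                            # advance the cohort one cycle
```

Running the same accumulation for a comparator strategy and differencing the totals yields the incremental cost and QALY figures from which an ICER such as the reported EUR 2,412/QALY is computed.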
Decision-Making in Agent-Based Models of Migration: State of the Art and Challenges.
Klabunde, Anna; Willekens, Frans
We review agent-based models (ABM) of human migration with respect to their decision-making rules. The most prominent behavioural theories used as decision rules are the random utility theory, as implemented in the discrete choice model, and the theory of planned behaviour. We identify the critical choices that must be made in developing an ABM, namely the modelling of decision processes and social networks. We also discuss two challenges that hamper the widespread use of ABM in the study of migration and, more broadly, demography and the social sciences: (a) the choice and the operationalisation of a behavioural theory (decision-making and social interaction) and (b) the selection of empirical evidence to validate the model. We offer advice on how these challenges might be overcome.
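Random utility theory enters an ABM as a discrete choice (multinomial logit) decision rule, which can be sketched as follows; the alternatives and utility values are hypothetical illustrations.

```python
import math

def logit_probabilities(utilities):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j), max-shifted for stability."""
    m = max(utilities.values())
    expv = {k: math.exp(v - m) for k, v in utilities.items()}
    z = sum(expv.values())
    return {k: e / z for k, e in expv.items()}

# Systematic utilities of staying vs. migrating to two destinations (hypothetical).
p = logit_probabilities({"stay": 1.2, "city_A": 0.8, "city_B": 0.3})
```

Each simulated period an agent would draw its choice from these probabilities, so alternatives with higher systematic utility are chosen proportionally more often; this is the operationalisation step the review identifies as a critical modelling choice.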
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty sources can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with different uncertainty components represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in the grouping strategies used for uncertainty components. The variance-based sensitivity analysis is thus improved to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the flow and reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources formed from different combinations of uncertainty components.
The new methodology can provide useful information to environmental managers and decision-makers in formulating policies and strategies.
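A conventional variance-based (Sobol') first-order sensitivity estimate, of the single-parameter kind the abstract contrasts with its extended framework, can be sketched with a pick-freeze Monte Carlo estimator on a toy additive model. The model and inputs are illustrative, not the groundwater reactive transport case:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Toy model y = 4*x1 + 2*x2 + x3 with independent standard-normal inputs.
# Analytic first-order indices: S1 = 16/21, S2 = 4/21, S3 = 1/21.
def model(x):
    return 4 * x[:, 0] + 2 * x[:, 1] + x[:, 2]

A = rng.normal(size=(n, 3))  # base sample
B = rng.normal(size=(n, 3))  # independent resample
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S = []
for i in range(3):
    ABi = A.copy()
    ABi[:, i] = B[:, i]      # "pick" input i from B, "freeze" the rest at A
    # First-order index estimator: S_i = E[yB * (y_ABi - yA)] / Var(y).
    S.append(np.mean(yB * (model(ABi) - yA)) / var_y)
```

Each S[i] estimates the fraction of output variance attributable to input i alone; the paper's contribution is to let such indices be computed for grouped uncertainty components (scenario, model, process-level groupings) rather than single parameters only.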