Transactions on Systems, Man, and Cybernetics, Vol. SMC-7, No. 5, May 1977. 7. Farquhar, P.H., "A Survey of Multiattribute Utility Theory"...Multi-Attribute Decision Analysis Model (MADAM). The theoretical underpinnings of MADAM involve portions of multi-attribute utility theory. This interactive...Attribute Utility Theory (MAUT) model is discussed in Section 2. The actual computer program modifications developed and then implemented in code
Bohanec, M; Zupan, B; Rajkovic, V
Hierarchical decision models are developed through decomposition of complex decision problems into smaller and less complex subproblems. They are aimed at the classification or evaluation of options and can be used for analysis, simulation and explanation. This paper presents a set of methods for the construction and application of qualitative hierarchical decision models in health care. We present the results of four ongoing projects in oncology, radiology, community nursing and diabetic foot treatment.
Staats, Raymond W.
This research uses multi-attribute utility theory (MAUT) to define a mathematical representation of a decision maker's utility associated with a satellite system. While developing the survey instrument, we focused on making it simpler to administer, primarily by eliminating the use of lottery questions. These simplifications enabled us to shorten our interview with the decision maker to under two hours for a rather complex model. The MAUT model gives National Air Intelligence Center (NAIC) analysts the ability to rank order satellite systems using the common measurement scale of 'utiles.' This tool allows a meaningful comparison of vastly different satellites. Properly prioritized launch of space assets will be key to maintaining our capabilities in the long term. The ordering methodology of this model was extended to a multi-criterion optimization (MCO) problem to demonstrate its potential use in prioritizing and scheduling limited launch resources. The results of these two case studies and the MCO application are combined to develop some characterizations of a theoretical group utility function. Most complex decisions are made by groups rather than by an individual. This research concludes with some insights on the impact of an individual's preferences on a decision that is ultimately made by the group.
Utility Theory (MAUT) will be the approach used in this research. It allows us to express a definitive multiattribute utility function that can be...Captain Raymond W. Staats, USAF. THESIS TITLE: A Multi-Attribute Utility Theory Model That Minimizes Interview Data Requirements: A...2.2.2 Interactive Approach ... 2-4; 2.2.3 Multiattribute Utility Theory
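The additive utility function this thesis describes can be sketched compactly. The attribute names, weights, and single-attribute scores below are illustrative placeholders, not values from the NAIC model:

```python
# Minimal sketch of an additive multi-attribute utility (MAUT) model.
# Attributes, weights, and scores are hypothetical, not from the thesis.

def additive_utility(weights, single_utils):
    """Overall utility in 'utiles' on a 0-1 scale: a weighted sum of
    single-attribute utilities, with weights summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(w * single_utils[a] for a, w in weights.items())

weights = {"coverage": 0.5, "resolution": 0.3, "lifetime": 0.2}
systems = {
    "sat_a": {"coverage": 0.9, "resolution": 0.4, "lifetime": 0.7},
    "sat_b": {"coverage": 0.6, "resolution": 0.8, "lifetime": 0.5},
}

# Rank-order the systems on the common 'utiles' scale
ranking = sorted(systems, key=lambda s: additive_utility(weights, systems[s]),
                 reverse=True)
```

Because every system is scored on the same 0-1 scale, vastly different satellites become directly comparable, which is the point of the common "utiles" measurement.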
Brennan, P F; Anthony, M K
Nursing Practice Models (NPMs) represent the structural and contextual features that exist within any group practice of nursing. Currently, measurement of NPMs relies on costly and nonreproducible global judgments by experts. Quantitative measurement techniques are needed to provide a useful evaluation of nursing practice. Guided by Multi-Attribute Utility theory (MAU theory), an expert panel identified 24 factors representative of NPMs. The factors became elements in a computational index that, when summed, assigns a score to a given nursing unit reflecting the extent to which that unit's nursing practice model achieves the nursing professional ideal. Initial validation of the index and its elements consisted of comparing assessments of 40 nursing units generated by the index with a global evaluation provided by each of the expert panelists who proposed the model factors. Pearson correlations between the index-generated scores and the globally assigned scores provided evidence supporting the preliminary validation of the index.
Wang, Jing; Li, Li; Tan, Feng; Zhu, Ying; Feng, Weisi
Microblogging as a kind of social network has become more and more important in our daily lives. Enormous amounts of information are produced and shared on a daily basis. Detecting hot topics in the mountains of information can help people get to the essential information more quickly. However, due to short and sparse features, a large number of meaningless tweets and other characteristics of microblogs, traditional topic detection methods are often ineffective in detecting hot topics. In this paper, we propose a new topic model named multi-attribute latent Dirichlet allocation (MA-LDA), in which the time and hashtag attributes of microblogs are incorporated into the LDA model. By introducing the time attribute, the MA-LDA model can decide whether a word should appear in hot topics or not. Meanwhile, compared with the traditional LDA model, applying the hashtag attribute in the MA-LDA model gives the core words an artificially high ranking in the results, improving the expressiveness of the outcomes. Empirical evaluations on real data sets demonstrate that our method is able to detect hot topics more accurately and efficiently compared with several baselines. Our method provides strong evidence of the importance of the temporal factor in extracting hot topics. PMID:26496635
Excellence (MSCoE), Fort Leonard Wood, MO. Approved for public release; distribution is unlimited. Multi-Attribute Decision Making (MADM) in...Battle Lab (MSBL), Capabilities Development and Integration Directorate (CDID), Maneuver Support Center of Excellence (MSCoE), Fort Leonard Wood, MO...Background: MSBL supports the EN, MP, and CM Schools and the Maneuver Support and Protection Warfighting Function (WfF). MSBL manages a
Chapman, G B; Elstein, A S; Kuzel, T M; Nadler, R B; Sharifi, R; Bennett, C L
Multi-attribute utility theory (MAUT) provides a way to model decisions involving trade-offs among different aspects or goals of a problem. We used MAUT to model prostate cancer patients' preferences for their own health state and we compared this model to patients' global judgments of health state utility. 57 patients with prostate cancer (mean age = 70) at two Chicago Veterans Administration health clinics were asked to evaluate health states described in terms of five health attributes affected by prostate cancer: pain, mood, sexual function, bladder and bowel function, and fatigue and energy. Each attribute had three levels that were used to form three clinically realistic health state descriptions (A = high, B = moderate, C = low). A fourth personalized health description (P) matched the patient's current health. We first measured patients' preferences using time trade-off (TTO) judgments for the three health states (A, B, and C) and for their own current health state (P). The TTO for the patient's own health state (P) was standardized by comparing it to TTO judgments for states A and C. We next constructed a multi-attribute model using the relative importance of the five attributes. The MAU scores were moderately correlated with the TTO preference judgments for the personalized state (Pearson r = 0.38, N = 57, p < 0.01). Thus, patients' preference judgments are moderately consistent and systematic. MAUT appears to be a potentially feasible method for evaluating preferences of prostate cancer patients and may prove helpful in assisting with patient decision making.
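The standardization step described above, rescaling the personalized time trade-off judgment against the anchor states A and C, can be sketched as a linear rescaling. This linear form is an assumption for illustration; the paper's exact procedure may differ, and the numeric values are invented:

```python
# Sketch of standardizing the TTO value for the personalized state P
# against the high (A) and low (C) described health states.
# Linear rescaling and all numeric values are illustrative assumptions.

def standardized_tto(tto_p, tto_a, tto_c):
    """Express the personalized TTO on the 0-1 scale anchored by C and A."""
    return (tto_p - tto_c) / (tto_a - tto_c)

p_score = standardized_tto(tto_p=0.70, tto_a=0.95, tto_c=0.40)
```

A patient whose current state is valued midway between C and A would score near 0.5 on this anchored scale, which is then compared against the multi-attribute (MAU) score.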
Ding, Shuai; Xia, Chen-Yi; Zhou, Kai-Le; Yang, Shan-Lin; Shang, Jennifer S.
Facing a customer market with rising demands for cloud service dependability and security, trustworthiness evaluation techniques are becoming essential to cloud service selection. But these methods are out of the reach to most customers as they require considerable expertise. Additionally, since the cloud service evaluation is often a costly and time-consuming process, it is not practical to measure trustworthy attributes of all candidates for each customer. Many existing models cannot easily deal with cloud services which have very few historical records. In this paper, we propose a novel service selection approach in which the missing value prediction and the multi-attribute trustworthiness evaluation are commonly taken into account. By simply collecting limited historical records, the current approach is able to support the personalized trustworthy service selection. The experimental results also show that our approach performs much better than other competing ones with respect to the customer preference and expectation in trustworthiness assessment. PMID:24972237
Nosofsky, Robert M.; Bergert, F. Bryan
Observers were presented with pairs of objects varying along binary-valued attributes and learned to predict which member of each pair had a greater value on a continuously varying criterion variable. The predictions from exemplar models of categorization were contrasted with classic alternative models, including generalized versions of a…
emphasize the judgment handling capability of MAUT by saying: Basing a capability measure on multiattribute utility theory capitalizes on the notion...NAVAL POSTGRADUATE SCHOOL, Monterey, California. THESIS: Multi-Attribute Utility Theory to Assist Top-Level Acquisition Decision-Making, by Ran Goreni...TITLE: Multi-Attribute Utility Theory to Assist Top-Level Acquisition (master's thesis)
Knobler, Stacey; Bok, Karin; Gellin, Bruce
SMART Vaccines 2.0 software is being developed to support decision-making among multiple stakeholders in the process of prioritizing investments to optimize the outcomes of vaccine development and deployment. Vaccines and associated vaccination programs are one of the most successful and effective public health interventions to prevent communicable diseases and vaccine researchers are continually working towards expanding targets for communicable and non-communicable diseases through preventive and therapeutic modes. A growing body of evidence on emerging vaccine technologies, trends in disease burden, costs associated with vaccine development and deployment, and benefits derived from disease prevention through vaccination and a range of other factors can inform decision-making and investment in new and improved vaccines and targeted utilization of already existing vaccines. Recognizing that an array of inputs influences these decisions, the strategic multi-attribute ranking method for vaccines (SMART Vaccines 2.0) is in development as a web-based tool-modified from a U.S. Institute of Medicine Committee effort (IOM, 2015)-to highlight data needs and create transparency to facilitate dialogue and information-sharing among decision-makers and to optimize the investment of resources leading to improved health outcomes. Current development efforts of the SMART Vaccines 2.0 framework seek to generate a weighted recommendation on vaccine development or vaccination priorities based on population, disease, economic, and vaccine-specific data in combination with individual preference and weights of user-selected attributes incorporating valuations of health, economics, demographics, public concern, scientific and business, programmatic, and political considerations. Further development of the design and utility of the tool is being carried out by the National Vaccine Program Office of the Department of Health and Human Services and the Fogarty International Center of the
Krabbe, Paul F. M.
After 40 years of deriving metric values for health status or health-related quality of life, the effective quantification of subjective health outcomes is still a challenge. Here, two of the best measurement tools, the discrete choice and the Rasch model, are combined to create a new model for deriving health values. First, existing techniques to value health states are briefly discussed followed by a reflection on the recent revival of interest in patients’ experience with regard to their possible role in health measurement. Subsequently, three basic principles for valid health measurement are reviewed, namely unidimensionality, interval level, and invariance. In the main section, the basic operation of measurement is then discussed in the framework of probabilistic discrete choice analysis (random utility model) and the psychometric Rasch model. It is then shown how combining the main features of these two models yields an integrated measurement model, called the multi-attribute preference response (MAPR) model, which is introduced here. This new model transforms subjective individual rank data into a metric scale using responses from patients who have experienced certain health states. Its measurement mechanism largely prevents biases such as adaptation and coping. Several extensions of the MAPR model are presented. The MAPR model can be applied to a wide range of research problems. If extended with the self-selection of relevant health domains for the individual patient, this model will be more valid than existing valuation techniques. PMID:24278141
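The random utility (probabilistic discrete choice) mechanism at the heart of a model like MAPR can be illustrated with its simplest case: the probability that one health state is ranked above another is a logistic function of the difference in their latent values. This is a generic logit sketch, not the MAPR model itself, and the values are illustrative:

```python
import math

# Logit random-utility choice rule: the probability of preferring state A
# over state B depends only on the difference of their latent values.
# This is a generic sketch; the MAPR model adds Rasch-style structure.

def p_prefer(v_a, v_b):
    """Probability that a respondent ranks A above B."""
    return 1.0 / (1.0 + math.exp(-(v_a - v_b)))
```

Equal latent values give indifference (probability 0.5), and larger value gaps push the choice probability toward certainty, which is how rank data from patients can be transformed into a metric scale.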
Liu, K. K.; Li, C. H.; Cai, Y. P.; Xu, M.; Xia, X. H.
In this paper, a fuzzy multi-attribute decision analysis approach (FMADAA) was developed for supporting the evaluation of water resources security in nine provinces within the Yellow River basin. A numerical approximation system and a modified left-right scoring approach were adopted to cope with the uncertainties in the acquired information. Also, four conventional multi-attribute decision analysis (MADA) methods were implemented in the evaluation model for impact evaluation, including simple weighted addition (SWA), weighted product (WP), cooperative game theory (CGT) and technique for order preference by similarity to ideal solution (TOPSIS). Moreover, several aggregation methods including average ranking procedure, Borda and Copeland methods were used to integrate the ranking results, helping rank the water resources security in those nine provinces as well as improving reliability of evaluation results. The ranking results showed that the water resources security of the entire basin was in critical condition, including the insecurity and absolute insecurity states, especially in Shanxi, Inner Mongolia and Ningxia provinces in which water resources were lower than the average quantity in China. Hence, the improvement of water eco-environment statuses in the above-mentioned provinces should be prioritized in the future planning of the Yellow River basin.
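Of the four MADA methods listed, TOPSIS is the most involved, so a compact sketch may help. The decision matrix below is illustrative (three alternatives by two benefit criteria), not the paper's data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    matrix: rows = alternatives, columns = criteria (positive values);
    benefit[j] is True if criterion j is to be maximized."""
    m = len(matrix[0])
    # Vector-normalize each column, then apply the criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((row[j] - ideal[j]) ** 2 for j in range(m)))
        d_neg = math.sqrt(sum((row[j] - anti[j]) ** 2 for j in range(m)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness in [0, 1]
    return scores

# Illustrative 3 alternatives x 2 criteria, both to be maximized
scores = topsis([[7, 9], [8, 7], [9, 6]], [0.5, 0.5], [True, True])
```

Each score lies between 0 (the anti-ideal) and 1 (the ideal), so the alternatives can be ranked directly, and different MADA methods' rankings can then be aggregated by Borda- or Copeland-style procedures as the paper does.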
Liu, K. K.; Li, C. H.; Cai, Y. P.; Xu, M.; Xia, X. H.
In this paper, a fuzzy multi-attribute decision analysis approach (FMADAA) was adopted for evaluating water resources security in the nine provinces of the Yellow River basin in 2006. A numerical approximation system and a modified left-right scoring approach were adopted to cope with the uncertainties in the acquired information. Four multi-attribute decision making methods were implemented in the evaluation model for impact evaluation, including simple weighted addition (SWA), weighted product (WP), cooperative game theory (CGT) and the technique for order preference by similarity to ideal solution (TOPSIS), which helped rank the water resources security of the nine provinces as well as the criteria alternatives. Moreover, several aggregation methods, including the average ranking procedure and the Borda and Copeland methods, were used to integrate the ranking results. The ranking results showed that the water resources security of the entire basin was in a critical, insecure or absolutely insecure state, especially in Shanxi, Inner Mongolia and Ningxia provinces, in which water resources were lower than the average quantity in China. Hence, future planning of the Yellow River basin should mainly focus on improving the water eco-environment status of the provinces above.
Henrion, Max; Bernstein, Brock; Swamy, Surya
The 27 oil and gas platforms off the coast of southern California are reaching the end of their economic lives. Because their decommissioning involves large costs and potential environmental impacts, this became an issue of public controversy. As part of a larger policy analysis conducted for the State of California, we implemented a decision analysis as a software tool (PLATFORM) to clarify and evaluate decision strategies against a comprehensive set of objectives. Key options selected for in-depth analysis are complete platform removal and partial removal to 85 feet below the water line, with the remaining structure converted in place to an artificial reef to preserve the rich ecosystems supported by the platform's support structure. PLATFORM was instrumental in structuring and performing key analyses of the impacts of each option (e.g., on costs, fishery production, air emissions) and dramatically improved the team's productivity. Sensitivity analysis found that disagreement about preferences, especially about the relative importance of strict compliance with lease agreements, has much greater effects on the preferred option than does uncertainty about specific outcomes, such as decommissioning costs. It found a near-consensus of stakeholders in support of partial removal and "rigs-to-reefs" program. The project's results played a role in the decision to pass legislation enabling an expanded California "rigs-to-reefs" program that includes a mechanism for sharing cost savings between operators and the state.
Fellows, Lesley K
Ventromedial frontal lobe (VMF) damage is associated with impaired decision making. Recent efforts to understand the functions of this brain region have focused on its role in tracking reward, punishment and risk. However, decision making is complex, and frontal lobe damage might be expected to affect it at other levels. This study used process-tracing techniques to explore the effect of VMF damage on multi-attribute decision making under certainty. Thirteen subjects with focal VMF damage were compared with 11 subjects with frontal damage that spared the VMF and 21 demographically matched healthy control subjects. Participants chose rental apartments in a standard information board task drawn from the literature on normal decision making. VMF subjects performed the decision making task in a way that differed markedly from all other groups, favouring an 'alternative-based' information acquisition strategy (i.e. they organized their information search around individual apartments). In contrast, both healthy control subjects and subjects with damage predominantly involving dorsal and/or lateral prefrontal cortex pursued primarily 'attribute-based' search strategies (in which information was acquired about categories such as rent and noise level across several apartments). This difference in the pattern of information acquisition argues for systematic differences in the underlying decision heuristics and strategies employed by subjects with VMF damage, which in turn may affect the quality of their choices. These findings suggest that the processes supported by ventral and medial prefrontal cortex need to be conceptualized more broadly, to account for changes in decision making under conditions of certainty, as well as uncertainty, following damage to these areas.
Burg, Cecile M.; Hill, Geoffrey A.; Brown, Sherilyn A.; Geiselhart, Karl A.
The Systems Analysis Branch at NASA Langley Research Center has investigated revolutionary Propulsion Airframe Aeroacoustics (PAA) technologies and configurations for a Blended-Wing-Body (BWB) type aircraft as part of its research for NASA's Quiet Aircraft Technology (QAT) Project. Within the context of the long-term NASA goal of reducing the perceived aircraft noise level by a factor of 4 relative to 1997 state of the art, major configuration changes in the propulsion airframe integration system were explored with noise as a primary design consideration. An initial down-select and assessment of candidate PAA technologies for the BWB was performed using a Multi-Attribute Decision Making (MADM) process consisting of organized brainstorming and decision-making tools. The assessments focused on what effect the PAA technologies had on both the overall noise level of the BWB and what effect they had on other major design considerations such as weight, performance and cost. A probabilistic systems analysis of the PAA configurations that presented the best noise reductions with the least negative impact on the system was then performed. Detailed results from the MADM study and the probabilistic systems analysis will be published in the near future.
of Defense (DOD) are examining ways to utilize multiattribute decision models in the project and source selection processes. Results of some of these...multiattribute decision techniques have been examined by others in DOD to see how they may be utilized in various stages of the procurement process. The...Utility Theory (MAUT) and the AHP to determine how they could be applied to source selection in an Air Force system program office. Additionally
elephants” – Harvard Business Review on Decision Making, 2001. By applying Multi-Attribute Utility Theory to the analysis of a modeled system...1976 8. US Army, “Guideline for Army Analysis”, 1999 9. Harvard Business Review on Decision Making, 2001 10. US Army, “Verification, Validation, and
Davis, F.; Kuzio, K.; Sorenson, K.; Weiner, R.; Wheeler, T.
A multi-attribute utility analysis is applied to the decision to select a treatment method for the management of aluminum-based spent nuclear fuel (Al-SNF) owned by the United States Department of Energy (DOE). DOE will receive, treat, and temporarily store Al-SNF, most of which is composed of highly enriched uranium, at its Savannah River Site in South Carolina. DOE intends ultimately to send the treated Al-SNF to a geologic repository for permanent disposal. DOE initially considered ten treatment alternatives for the management of Al-SNF, and has narrowed the choice to two of these: the direct disposal and melt-and-dilute alternatives. The decision analysis presented in this document focuses on a decision between these two remaining alternatives.
Method (Satisficing Method), Disjunctive Method, Standard Level, Elimination by Aspects, Lexicographic Semi-order, Lexicographic Method, Ordinal Weighted Sum...framework for sensitivity analysis of hierarchical additive value models and standardizes the sensitivity analysis notation and terminology. Finally
Ghoochani, Omid M; Ghanian, Mansour; Baradaran, Masoud; Azadi, Hossein
Organisms that have been genetically engineered are referred to as genetically modified organisms (GMOs). Bt crops are plants that have been genetically modified to produce certain proteins from the soil bacterium Bacillus thuringiensis (Bt), which makes these plants resistant to certain lepidopteran and coleopteran species. Genetically modified (GM) rice was produced in 2006 by Iranian researchers from Tarom Mowla'ii and has since been called 'Bt rice'. As rice is an important source of food for over 3 billion people, this study uses a correlational survey to shed light on the factors predicting the extent of stakeholders' behavioral intentions towards Bt rice. It is assumed, and the results confirm, that "attitudes toward GM crops" can be used as a bridge between the Attitude Model and the Behavioral Intention Model in order to establish an integrated model. To this end, a case study was made of the southwestern part of Iran in order to verify the research model. This study also revealed that, as part of the integrated research framework, both the attitude and subjective norm constructs of the Behavioral Intention Model serve as predictors of stakeholders' intentions of working with Bt rice. In addition, the Attitude Model, as the other part of the integrated research framework, showed that stakeholders' attitudes toward Bt rice can only be determined by the perceived benefits (e.g. positive outcomes) of Bt rice.
Wang, Peng; Fang, Weining; Guo, Beiyuan
This paper proposes a colored Petri net based workload evaluation model. A formal interpretation of workload is first introduced, based on the mapping of Petri net components to the task process. A Petri net based description of Multiple Resources theory is given by approaching it from a new angle. A new application of the VACP rating scales, named the V/A-C-P unit, and a definition of colored transitions are proposed to build a model of the task process. The calculation of workload has four main steps: determine each token's initial position and values; calculate the weights of the directed arcs on the basis of the proposed rules; calculate the workload from the different transitions; and correct for the influence of repetitive behaviors. Verification experiments were carried out based on the Multi-Attribute Task Battery-II software. Our results show a strong correlation between the model values and NASA Task Load Index scores (r = 0.9513). In addition, the method can distinguish behavioral characteristics between different people.
Postgraduate School, Code 64, 699 Dyer Rd, Monterey, CA 93943...Graduate School of Business and Public Policy, Naval Postgraduate School, 555 Dyer Road, Room 332, Monterey, CA 93943-5103...Jay Simon—Dr. Simon is an Assistant Professor of Decision Science at the
Shyyan, Vitaliy; Christensen, Laurene; Thurlow, Martha; Lazarus, Sheryl
The Multi-Attribute Consensus Building (MACB) method is a quantitative approach for determining a group's opinion about the importance of each item (strategy, decision, recommendation, policy, priority, etc.) on a list (Vanderwood & Erickson, 1994). This process enables a small or large group of participants to generate and discuss a set…
Stevens, Katherine; McCabe, Christopher; Brazier, John; Roberts, Jennifer
A key issue in health state valuation modelling is the choice of functional form. The two most frequently used preference based instruments adopt different approaches; one based on multi-attribute utility theory (MAUT), the other on statistical analysis. There has been no comparison of these alternative approaches in the context of health economics. We report a comparison of these approaches for the health utilities index mark 2. The statistical inference model predicts more accurately than the one based on MAUT. We discuss possible explanations for the differences in performance, the importance of the findings, and implications for future research.
Bearden, J. Neil; Connolly, Terry
This article describes empirical and theoretical results from two multi-attribute sequential search tasks. In both tasks, the DM sequentially encounters options described by two attributes and must pay to learn the values of the attributes. In the "continuous" version of the task the DM learns the precise numerical value of an attribute when she…
Alternatives (Alternative / Facing / Brand / Type):
- Fiberglass Batt (0.82 lb/ft^3) / Unfaced / Owens Corning / Thermal Batt
- Fiberglass Batt (0.82 lb/ft^3) / Foil / Owens Corning / Thermal Batt
- Fiberglass Batt (0.82 lb/ft^3) / Kraft / Owens Corning / Thermal Batt
- Fiberglass Blown In (1.68 lb/ft^3) / Unfaced / Owens Corning / ProPink
- Fiberglass Rigid Board (1.68 lb/ft^3) / Unfaced / Owens Corning / 701 Insulation
- Fiberglass Rigid Board (3.49 lb/ft^3) / Unfaced / Owens Corning / 703
Torrance, G W; Furlong, W; Feeny, D; Boyle, M
Multi-attribute utility theory, an extension of conventional utility theory, can be applied to model preference scores for health states defined by multi-attribute health status classification systems. The type of preference independence among the attributes determines the type of preference function required: additive, multiplicative or multilinear. In addition, the type of measurement instrument used determines the type of preference score obtained: value or utility. Multi-attribute utility theory has been applied to 2 recently developed multi-attribute health status classification systems, the Health Utilities Index (HUI) Mark II and Mark III systems. Results are presented for the Mark II system, and ongoing research is described for the Mark III system. The theory is also discussed in the context of other well known multi-attribute systems. The HUI system is an efficient method of determining a general public-based utility score for a specified health outcome or for the health status of an individual. In clinical populations, the scores can be used to provide a single summary measure of health-related quality of life. In cost-utility analyses, the scores can be used as quality weights for calculating quality-adjusted life years. In general populations, the measure can be used as quality weights for determining population health expectancy.
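The additive/multiplicative distinction mentioned above can be made concrete. In the multiplicative form, single-attribute utilities combine through a master scaling constant k that solves 1 + k = prod_j(1 + k*k_j); when the single-attribute constants k_j sum to more than 1, k is negative. The constants below are illustrative, not HUI coefficients:

```python
# Sketch of a multiplicative multi-attribute utility function.
# The k_j values are hypothetical, not from the HUI Mark II/III systems.

def solve_master_k(kj, iters=200):
    """Nonzero root k of prod(1 + k*k_j) = 1 + k, found by bisection.
    If sum(k_j) > 1 the root lies in (-1, 0); if sum(k_j) < 1 it is positive."""
    def f(k):
        p = 1.0
        for w in kj:
            p *= 1.0 + k * w
        return p - (1.0 + k)
    lo, hi = (-1.0 + 1e-9, -1e-9) if sum(kj) > 1 else (1e-9, 1e6)
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

def multiplicative_utility(kj, uj):
    """Multiplicative combination of single-attribute utilities uj in [0, 1]."""
    k = solve_master_k(kj)
    p = 1.0
    for w, u in zip(kj, uj):
        p *= 1.0 + k * w * u
    return (p - 1.0) / k

k_j = [0.5, 0.4, 0.3]  # sum > 1, so k < 0 (attributes act as substitutes)
```

By construction the function returns 1 when every attribute is at its best level and 0 when every attribute is at its worst, so the scores can serve directly as quality weights for QALY calculations.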
Ross, A. M.; Hastings, D. E.
The complexity inherent in space systems necessarily requires intense expenditures of resources, both human and monetary. The high level of ambiguity present in the early design phases of these systems causes long, highly iterative, and costly design cycles. This paper looks at incorporating decision theory methods into the early design processes to streamline communication of wants and needs among stakeholders and between levels of design. Communication channeled through formal utility interviews and analysis enables engineers to better understand the key drivers for the system and allows a more thorough exploration of the design tradespace. Multi-Attribute Tradespace Exploration (MATE), an evolving process incorporating decision theory into model and simulation-based design, has been applied to several space system case studies at MIT. Preliminary results indicate that this process can improve the quality of communication to more quickly resolve project ambiguity, and enable the engineer to discover better value designs for multiple stakeholders. MATE is also being integrated into a concurrent design environment to facilitate the transfer of knowledge about important drivers into higher fidelity design phases. Formal utility theory provides a mechanism to bridge the language barrier between experts of different backgrounds and differing needs (e.g., scientists, engineers, managers). MATE with concurrent design couples decision makers more closely to the design, and most importantly, maintains their presence between formal reviews.
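A core operation in tradespace exploration of this kind is identifying the Pareto-efficient designs in a cost-utility tradespace. The designs below are hypothetical, and this sketch is a generic dominance filter rather than the MATE process itself:

```python
# Sketch of extracting the Pareto front from a (cost, utility) tradespace.
# Design points are illustrative placeholders.

def pareto_front(designs):
    """Keep designs not dominated by another design that is at least as
    cheap and at least as useful, and strictly better in one of the two."""
    front = []
    for i, (c, u) in enumerate(designs):
        dominated = any(
            c2 <= c and u2 >= u and (c2 < c or u2 > u)
            for j, (c2, u2) in enumerate(designs) if j != i
        )
        if not dominated:
            front.append((c, u))
    return front

designs = [(10.0, 0.9), (8.0, 0.7), (12.0, 0.8), (8.0, 0.9)]
front = pareto_front(designs)
```

Designs off the front are unambiguously worse for every stakeholder who prefers higher utility at lower cost, so attention in early design reviews can be focused on the frontier alone.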
Kolar, Mladen; Liu, Han; Xing, Eric P.
Undirected graphical models are important in a number of modern applications that involve exploring or exploiting dependency structures underlying the data. For example, they are often used to explore complex systems where connections between entities are not well understood, such as in functional brain networks or genetic networks. Existing methods for estimating structure of undirected graphical models focus on scenarios where each node represents a scalar random variable, such as a binary neural activation state or a continuous mRNA abundance measurement, even though in many real world problems, nodes can represent multivariate variables with much richer meanings, such as whole images, text documents, or multi-view feature vectors. In this paper, we propose a new principled framework for estimating the structure of undirected graphical models from such multivariate (or multi-attribute) nodal data. The structure of a graph is inferred through estimation of non-zero partial canonical correlation between nodes. Under a Gaussian model, this strategy is equivalent to estimating conditional independencies between random vectors represented by the nodes and it generalizes the classical problem of covariance selection (Dempster, 1972). We relate the problem of estimating non-zero partial canonical correlations to maximizing a penalized Gaussian likelihood objective and develop a method that efficiently maximizes this objective. Extensive simulation studies demonstrate the effectiveness of the method under various conditions. We provide illustrative applications to uncovering gene regulatory networks from gene and protein profiles, and uncovering brain connectivity graph from positron emission tomography data. Finally, we provide sufficient conditions under which the true graphical structure can be recovered correctly. PMID:25620892
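In the scalar-node special case, the covariance-selection principle the paper generalizes reduces to zeros in the precision matrix (inverse covariance). The following self-contained sketch builds a synthetic Gaussian chain x → y → z, so x and z are conditionally independent given y; the data and sample size are illustrative:

```python
import random

# Scalar-node sketch of covariance selection: for Gaussian data, a zero
# entry in the precision matrix means conditional independence.
random.seed(0)
n = 20000
xs, ys, zs = [], [], []
for _ in range(n):
    x = random.gauss(0, 1)
    y = x + random.gauss(0, 1)   # y depends on x
    z = y + random.gauss(0, 1)   # z depends on y only (chain x -> y -> z)
    xs.append(x); ys.append(y); zs.append(z)

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)

cols = [xs, ys, zs]
S = [[cov(cols[i], cols[j]) for j in range(3)] for i in range(3)]

def inv3(m):
    """Inverse of a 3x3 matrix via the adjugate."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [[e * i - f * h, c * h - b * i, b * f - c * e],
           [f * g - d * i, a * i - c * g, c * d - a * f],
           [d * h - e * g, b * g - a * h, a * e - b * d]]
    return [[adj[r][s] / det for s in range(3)] for r in range(3)]

T = inv3(S)  # sample precision matrix
# Partial correlations read off the precision matrix:
pcorr_xz = -T[0][2] / (T[0][0] * T[2][2]) ** 0.5   # ~ 0: x independent of z given y
pcorr_xy = -T[0][1] / (T[0][0] * T[1][1]) ** 0.5   # clearly nonzero
```

The multi-attribute method in the paper replaces each scalar partial correlation with a partial canonical correlation between vector-valued nodes, but the recovered edge structure is read off in the same way.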
Li, Lian-hui; Mo, Rong
The production task queue is of great significance for manufacturing resource allocation and scheduling decisions. Manual, qualitative queue optimization methods perform poorly and are difficult to apply. A production task queue optimization method based on multi-attribute evaluation is therefore proposed. According to the task attributes, a hierarchical multi-attribute model is established and indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judge importance degrees, and a trapezoid fuzzy scale-rough AHP incorporating those degrees is put forward. A balanced weight, which integrates the objective and subjective weights, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing Euclidean distance with relative entropy distance, is used to sequence the tasks and optimize the queue by the weighted indicator values. A case study illustrates the method's correctness and feasibility. PMID:26414758
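The two building blocks named above, CRITIC objective weighting and TOPSIS with an entropy-based distance, can be sketched as follows. This is a simplified illustration assuming benefit-type attributes only and using a symmetric (Jeffreys) divergence as the "relative entropy distance"; the decision matrix is invented for the example.

```python
import numpy as np

def critic_weights(X):
    """CRITIC: weight_j proportional to contrast (std of column j) times
    conflict with the other columns (sum of 1 - correlation)."""
    Xn = (X - X.min(0)) / (X.max(0) - X.min(0))       # min-max normalization
    corr = np.corrcoef(Xn, rowvar=False)
    info = Xn.std(0, ddof=1) * (1.0 - corr).sum(0)    # C_j = sigma_j * sum_k (1 - r_jk)
    return info / info.sum()

def topsis_relative_entropy(X, w):
    """TOPSIS closeness with a symmetric entropy divergence to the ideal and
    anti-ideal points in place of Euclidean distance; higher is better."""
    eps = 1e-12
    V = X / X.sum(0) * w                               # weighted, column-normalized
    pos, neg = V.max(0), V.min(0)
    d_pos = ((pos - V) * np.log((pos + eps) / (V + eps))).sum(1)
    d_neg = ((V - neg) * np.log((V + eps) / (neg + eps))).sum(1)
    return d_neg / (d_pos + d_neg + eps)

# Hypothetical tasks scored on three benefit attributes.
X = np.array([[5., 7., 9.],
              [3., 4., 5.],
              [8., 9., 9.]])
w = critic_weights(X)
closeness = topsis_relative_entropy(X, w)
order = np.argsort(-closeness)   # task indices from best to worst
```

Task 2 dominates the others attribute-wise, so it ranks first regardless of the weights; task 1 is dominated and ranks last.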
Hoogenveen, Rudolf T; Boshuizen, Hendriek C; Engelfriet, Peter M; van Baal, Pieter Hm
Mortality rates in Markov models, as used in health economic studies, are often estimated from summary statistics that allow limited adjustment for confounders. If interventions are targeted at multiple diseases and/or risk factors, these mortality rates need to be combined in a single model. This requires them to be mutually adjusted to avoid 'double counting' of mortality. We present a mathematical modeling approach to describe the joint effect of mutually dependent risk factors and chronic diseases on mortality in a consistent manner. Most importantly, this approach explicitly allows the use of readily available external data sources. An additional advantage is that existing models can be smoothly expanded to encompass more diseases/risk factors. To illustrate the usefulness of this method and how it should be implemented, we present a health economic model that links risk factors for diseases to mortality from these diseases, and describe the causal chain running from these risk factors (e.g., obesity) through to the occurrence of disease (e.g., diabetes, CVD) and death. Our results suggest that these adjustment procedures may have a large impact on estimated mortality rates. An improper adjustment of the mortality rates could result in an underestimation of disease prevalence and, therefore, disease costs.
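The core adjustment, removing disease-attributable mortality from the background rate so it is not counted twice, can be sketched numerically. All rates and prevalences below are invented for illustration, and the additive, independence-style combination is a simplification of the paper's joint modeling.

```python
def other_cause_mortality(all_cause_rate, prevalences, excess_rates):
    """Background ('other cause') mortality adjusted so that the modeled
    diseases' excess mortality is not double counted: subtract the
    population-level mortality already attributable to those diseases."""
    attributable = sum(p * e for p, e in zip(prevalences, excess_rates))
    return all_cause_rate - attributable

def individual_rate(other_cause, has_disease, excess_rates):
    """Mortality rate for an individual, given which diseases they have."""
    return other_cause + sum(e for d, e in zip(has_disease, excess_rates) if d)

# Hypothetical: all-cause rate 0.02/yr; diabetes (prevalence 10%, excess 0.01)
# and CVD (prevalence 5%, excess 0.03).
other = other_cause_mortality(0.02, [0.10, 0.05], [0.01, 0.03])
# other = 0.02 - (0.001 + 0.0015) = 0.0175
diabetic = individual_rate(other, [True, False], [0.01, 0.03])
```

Skipping the subtraction would leave the diabetes and CVD deaths in both the background rate and the disease-specific rates, overstating total mortality, which is the "double counting" the paper warns against.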
Water resource management decisions often involve multiple parties engaged in contentious negotiations that try to navigate through complex combinations of legal, social, hydrologic, financial, and engineering considerations. The standard approach for resolving these issues is some form of multi-party negotiation, a formal court decision, or a combination of the two. In all these cases, the role of the decision maker(s) is to choose and implement the best option that fits the needs and wants of the community. However, each path to a decision carries the risk of technical and/or financial infeasibility as well as the possibility of unintended consequences. To help reduce this risk, decision makers often rely on some type of predictive analysis from which they can evaluate the projected consequences of their decisions. Typically, decision makers are supported in the analysis process by trusted advisors who engage in the analysis as well as the day-to-day tasks associated with multi-party negotiations. In the case of water resource management, the analysis is frequently a numerical model or set of models that can simulate various management decisions across multiple systems and output results that illustrate the impact on areas of concern. Thus, in order to convey scientific knowledge to the decision makers, communication between the analysts, the trusted advisor, and the decision maker must be clear and direct. To illustrate this concept, a multi-attribute decision analysis matrix will be used to outline the value of computer model-based collaborative negotiation approaches to guide water resources decision making and communication with decision makers. In addition, the critical role of the trusted advisor and other secondary participants in the decision process will be discussed using examples from recent water negotiations.
Zoning of agricultural fields is an important task for utilization of precision farming technology. This paper extends previously published work entitled “Zoning of an agricultural field using a fuzzy indicator model” to a general case where there is disagreement between groups of managers or expert...
Gratzl, Samuel; Lex, Alexander; Gehlenborg, Nils; Pfister, Hanspeter; Streit, Marc
Rankings are a popular and universal approach to structuring otherwise unorganized collections of items by computing a rank for each item based on the value of one or more of its attributes. This allows us, for example, to prioritize tasks or to evaluate the performance of products relative to each other. While the visualization of a ranking itself is straightforward, its interpretation is not, because the rank of an item represents only a summary of a potentially complicated relationship between its attributes and those of the other items. It is also common that alternative rankings exist which need to be compared and analyzed to gain insight into how multiple heterogeneous attributes affect the rankings. Advanced visual exploration tools are needed to make this process efficient. In this paper we present a comprehensive analysis of requirements for the visualization of multi-attribute rankings. Based on these considerations, we propose LineUp - a novel and scalable visualization technique that uses bar charts. This interactive technique supports the ranking of items based on multiple heterogeneous attributes with different scales and semantics. It enables users to interactively combine attributes and flexibly refine parameters to explore the effect of changes in the attribute combination. This process can be employed to derive actionable insights as to which attributes of an item need to be modified in order for its rank to change. Additionally, through integration of slope graphs, LineUp can also be used to compare multiple alternative rankings on the same set of items, for example, over time or across different attribute combinations. We evaluate the effectiveness of the proposed multi-attribute visualization technique in a qualitative study. The study shows that users are able to successfully solve complex ranking tasks in a short period of time. PMID:24051794
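The scoring scheme behind a LineUp-style ranking, a weighted sum of normalized attribute values that users re-weight interactively, can be sketched as below. The items, attribute names, and weights are invented for the example; the real tool adds visualization, heterogeneous scales, and slope-graph comparison on top of this core.

```python
import numpy as np

def rank_items(values, weights):
    """Rank items by a weighted sum of min-max normalized attributes,
    returning item indices from best to worst plus the scores."""
    V = np.asarray(values, float)
    span = np.ptp(V, axis=0)
    Vn = (V - V.min(axis=0)) / np.where(span == 0, 1, span)  # guard constant columns
    w = np.asarray(weights, float)
    scores = Vn @ (w / w.sum())
    return list(np.argsort(-scores)), scores

# Hypothetical items scored on two attributes (say, price value and quality).
items = [[1.0, 0.2], [0.5, 0.6], [0.0, 1.0]]
by_first, _ = rank_items(items, [1, 0])    # weight only the first attribute
by_second, _ = rank_items(items, [0, 1])   # weight only the second
```

Shifting all weight from one attribute to the other reverses the ranking here, which is exactly the kind of attribute-combination effect the technique is designed to let users explore.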
Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.
Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Business people often want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance, and need to detect outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: the color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in long time series data, and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications, mining an enterprise data warehouse and customer credit card fraud data, to illustrate the wide applicability and usefulness of these techniques.
Jumadinova, Janyl; Dasgupta, Prithviraj
In this paper, we consider the problem of dynamic pricing by a set of competing sellers in an information economy where buyers differentiate products along multiple attributes, and buyer preferences can change temporally. Previous research in this area has either focused on dynamic pricing along a limited number of (e.g. binary) attributes, or, assumes that each seller has access to private information such as preference distribution of buyers, and profit/price information of other sellers. However, in real information markets, private information about buyers and sellers cannot be assumed to be available a priori. Moreover, due to the competition between sellers, each seller faces a tradeoff between accuracy and rapidity of the pricing mechanism. In this paper, we describe a multi-attribute dynamic pricing algorithm based on minimax regret that can be used by a seller's agent called a pricebot, to maximize the seller's utility. Our simulation results show that the minimax regret based dynamic pricing algorithm performs significantly better than other algorithms for rapidly and dynamically tracking consumer attributes without using any private information from either buyers or sellers.
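The minimax-regret principle the pricebot uses can be sketched independently of the market simulation: for each candidate action, compute its regret (shortfall versus the best action) in every scenario, then pick the action whose worst-case regret is smallest. The profit table below is a made-up stand-in for the seller's estimated payoffs under uncertain buyer preferences.

```python
import numpy as np

def minimax_regret_choice(payoff):
    """payoff[i, s]: profit of action i under scenario s.
    Regret of i in s is the shortfall versus that scenario's best action;
    choose the action minimizing the maximum regret over scenarios."""
    regret = payoff.max(axis=0) - payoff     # column-wise best minus each entry
    return int(np.argmin(regret.max(axis=1)))

# Hypothetical table: rows = candidate prices, columns = buyer-preference scenarios.
payoff = np.array([[8., 2., 1.],
                   [5., 6., 4.],
                   [1., 3., 9.]])
best = minimax_regret_choice(payoff)  # the middle price: never far from optimal
```

The middle price is never the single best choice, but its worst-case shortfall (5) beats the aggressive prices (8 and 7), which is why regret-based pricing tracks shifting preferences without needing private buyer information.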
Charpentier, Caroline J.; De Neve, Jan-Emmanuel; Li, Xinyi; Roiser, Jonathan P.; Sharot, Tali
Intuitively, how you feel about potential outcomes will determine your decisions. Indeed, an implicit assumption in one of the most influential theories in psychology, prospect theory, is that feelings govern choice. Surprisingly, however, very little is known about the rules by which feelings are transformed into decisions. Here, we specified a computational model that used feelings to predict choices. We found that this model predicted choice better than existing value-based models, showing a unique contribution of feelings to decisions, over and above value. Similar to the value function in prospect theory, our feeling function showed diminished sensitivity to outcomes as value increased. However, loss aversion in choice was explained by an asymmetry in how feelings about losses and gains were weighted when making a decision, not by an asymmetry in the feelings themselves. The results provide new insights into how feelings are utilized to reach a decision. PMID:27071751
Edwards, W; Fasolo, B
This review is about decision technology: the rules and tools that help us make wiser decisions. First, we review the three rules that are at the heart of most traditional decision technology: multi-attribute utility, Bayes' theorem, and subjective expected utility maximization. Since the inception of decision research, these rules have prescribed how we should infer values and probabilities and how we should combine them to make better decisions. We suggest how to make best use of all three rules in a comprehensive 19-step model. The remainder of the review explores recently developed tools of decision technology. It examines the characteristics and problems of decision-facilitating sites on the World Wide Web. Such sites now provide anyone who can use a personal computer with access to very sophisticated decision-aiding tools structured mainly to facilitate consumer decision making. It seems likely that the Web will be the mode by means of which decision tools will be distributed to lay users. But methods for doing such apparently simple things as winnowing 3000 options down to a more reasonable number, like 10, contain traps for unwary decision technologists. The review briefly examines Bayes nets and influence diagrams, judgment and decision-making tools that are available as computer programs. It very briefly summarizes the state of the art of eliciting probabilities from experts. It concludes that decision tools will be as important in the 21st century as spreadsheets were in the 20th.
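The three rules named at the heart of traditional decision technology compose naturally: Bayes' theorem updates beliefs, a multi-attribute utility function scores outcomes, and subjective expected utility combines the two to rank acts. A minimal numerical sketch, with invented priors, likelihoods, and weights:

```python
def posterior(prior, likelihoods):
    """Bayes' theorem over discrete hypotheses: P(h|e) proportional to P(e|h) P(h)."""
    joint = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

def multiattribute_utility(weights, scores):
    """Additive multi-attribute utility: u(x) = sum_j w_j * u_j(x_j)."""
    return sum(w * s for w, s in zip(weights, scores))

def subjective_expected_utility(probs, utilities):
    """SEU of an act: probability-weighted utility over states of the world."""
    return sum(p * u for p, u in zip(probs, utilities))

# Two states with a 50/50 prior; evidence is twice as likely under state 0.
post = posterior([0.5, 0.5], [0.8, 0.4])            # -> [2/3, 1/3]
w = [0.7, 0.3]                                      # attribute weights
act_utils = [multiattribute_utility(w, [1.0, 0.0]), # act's outcome in state 0
             multiattribute_utility(w, [0.0, 1.0])] # act's outcome in state 1
seu = subjective_expected_utility(post, act_utils)  # 2/3 * 0.7 + 1/3 * 0.3
```

Repeating the last two steps for each candidate act and choosing the maximum SEU is the prescriptive core that the review's 19-step model elaborates.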
Wichary, Szymon; Smolen, Tomasz
In multi-attribute choice, decision makers use decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models of this process. We also present the Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neural signals. PMID:27877103
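The two strategies BUMSS unifies, Weighted Additive (WADD) and Take The Best (TTB), are easy to state as procedures, and a case where they disagree shows why strategy selection matters. The cue values, weights, and validity order below are invented for the contrast.

```python
def weighted_additive(cues_a, cues_b, weights):
    """WADD: integrate all cues; choose the option with the larger weighted sum."""
    score = lambda cues: sum(w * c for w, c in zip(weights, cues))
    return 'A' if score(cues_a) > score(cues_b) else 'B'

def take_the_best(cues_a, cues_b, validity_order):
    """TTB: inspect cues in order of validity; the first cue that
    discriminates decides, and all remaining cues are ignored."""
    for i in validity_order:
        if cues_a[i] != cues_b[i]:
            return 'A' if cues_a[i] > cues_b[i] else 'B'
    return 'B'  # no cue discriminates: guess (deterministic here)

# The single most valid cue favors A, but the remaining cues jointly favor B.
a, b = [1, 0, 0, 0], [0, 1, 1, 1]
ttb_choice = take_the_best(a, b, validity_order=[0, 1, 2, 3])
wadd_choice = weighted_additive(a, b, weights=[0.4, 0.3, 0.2, 0.1])
```

Here TTB picks A after reading one cue while WADD picks B after reading all four, the kind of divergence the model attributes to gain modulation of cue weights rather than to a discrete switch between strategies.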
Kang, Jian; Bowman, F DuBois; Mayberg, Helen; Liu, Han
To establish brain network properties associated with major depressive disorder (MDD) using resting-state functional magnetic resonance imaging (Rs-fMRI) data, we develop a multi-attribute graph model to construct a region-level functional connectivity network that uses all voxel-level information. For each region pair, we define the strength of the connectivity as the kernel canonical correlation coefficient between voxels in the two regions, and we develop a permutation test to assess its statistical significance. We also construct a network-based classifier for making predictions on the risk of MDD. We apply our method to Rs-fMRI data from 20 MDD patients and 20 healthy control subjects in the Predictors of Remission in Depression to Individual and Combined Treatments (PReDICT) study. Using this method, MDD patients can be distinguished from healthy control subjects based on significant differences in the strength of regional connectivity. We also demonstrate the performance of the proposed method using simulation studies.
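The permutation-test logic used to assess connectivity strength can be sketched in a scalar simplification (plain correlation between two synthetic signals rather than the paper's kernel canonical correlation between voxel sets): shuffle one signal to break any association, recompute the statistic, and report how often the shuffled statistic matches or exceeds the observed one. The signals below are simulated.

```python
import numpy as np

def permutation_pvalue(x, y, n_perm=2000, seed=0):
    """Permutation p-value for association between two signals: permute y,
    recompute |correlation|, count permutations at least as extreme as the
    observed statistic (with the +1 correction so p is never exactly 0)."""
    rng = np.random.default_rng(seed)
    observed = abs(np.corrcoef(x, y)[0, 1])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(y)
        if abs(np.corrcoef(x, perm)[0, 1]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
t = rng.normal(size=200)                       # "region 1" signal
coupled = 0.7 * t + 0.5 * rng.normal(size=200) # strongly coupled "region 2"
independent = rng.normal(size=200)             # unrelated signal
p_coupled = permutation_pvalue(t, coupled)     # tiny: an edge in the graph
p_indep = permutation_pvalue(t, independent)   # large: no edge
```

Replacing `np.corrcoef` with a kernel canonical correlation between two voxel matrices recovers the multi-attribute version, with the permutation scheme unchanged.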
Gouveia, Feliz R.; Barthes, Jean-Paul A.
This paper surveys spatial database organization and modelling as it is becoming a crucial issue for an ever increasing number of geometric data manipulation systems. We are here interested in efficient representation and storage structures for rapid processing of large sets of geometric data, as required by robotics applications, Very Large Scale Integration (VLSI) layout design, cartography, Computer Aided Design (CAD), or geographic information systems (GIS), where frequent operations involve spatial reasoning over that data. Existing database systems lack expressiveness to store some kinds of information which are inherently present in a geometric reasoning process, such as metric information, e.g. proximity, parallelism; or topological information, e.g. inclusion, intersection, contiguity, crossing. Geometric databases (GDB) alleviate this problem by providing an explicit representation for the spatial layout of the world in terms of empty and occupied space, together with a complete description of each object in it. Access to the data is done in an associative manner, that is, by specifying values over some usually small (sub)set of attributes, e.g. the coordinates of physical space. Manipulating data in GDB systems often involves spatially localized operations; locations, and consequently objects, which are accessed in the present are likely to be accessed again in the near future. This locality of reference, which Hegron calls temporal coherence, is due mainly to real-world physical constraints. Indeed, if accesses are caused for example by a sensor module which inspects its surroundings, then it is reasonable to suppose that successively scanned territories are not very far apart.
Kabassi, K.; Virvou, M.
This paper describes how Multi-Attribute Utility Theory can be combined with adaptive techniques to improve individualised teaching in an Intelligent Learning Environment (ILE). The ILE, called Web F-SMILE, operates over the Web and is meant to help novice users learn basic skills of computer use. Tutoring is dynamically adapted to the…
This project consists of three key interrelated Phases, each focusing on the central issue of imaging and quantifying fractured reservoirs through improved integration of the principles of rock physics, geology, and seismic wave propagation. This report summarizes the results of Phase I of the project. The key to successful development of low permeability reservoirs lies in reliably characterizing fractures. Fractures play a crucial role in controlling almost all of the fluid transport in tight reservoirs. Current seismic methods to characterize fractures depend on various anisotropic wave propagation signatures that can arise from aligned fractures. We are pursuing an integrated study that relates high-resolution seismic images of natural fractures to the rock parameters that control the storage and mobility of fluids. Our goal is to go beyond the current state-of-the-art to develop and demonstrate next generation methodologies for detecting and quantitatively characterizing fracture zones using seismic measurements. Our study incorporates three key elements: (1) Theoretical rock physics studies of the anisotropic viscoelastic signatures of fractured rocks, including upscaling analysis and rock-fluid interactions, to define the factors relating fractures in the lab and in the field. (2) Modeling of optimal seismic attributes, including offset and azimuth dependence of travel time, amplitude, impedance and spectral signatures of anisotropic fractured rocks. We will quantify the information content of combinations of seismic attributes, and the impact of multi-attribute analyses in reducing uncertainty in fracture interpretations. (3) Integration and interpretation of seismic, well log, and laboratory data, incorporating field geologic fracture characterization and the theoretical results of items 1 and 2 above. The focal point for this project is the demonstration of these methodologies in the Marathon Oil Company Yates Field in West Texas.
Recently, because of the rapid increase in the popularity of the Internet, the delivery of learning programmes has gradually shifted from local desktop to online-based applications. As more and more technological tools become available for online education, there is an increasing interest among educators and other professionals in the application…
businesses, HMOs, universities and other local governments can help reduce this gap in infrastructure by utilizing previously untapped resources. For...civil service agencies, large businesses, businesses that deal with critical infrastructure, hospitals, colleges and universities or HMOs it will be a...Dispensing, Pre-positioning, Business PODs, Special Needs Population, University PODs, Hotel PODs, Kaiser Permanente, Door to Door Dispensing, USPS, Drive
Büyükdamgaci-Alogan, G; Elele, T; Hayran, M; Erman, M; Kiliçkap, S
The purpose was to construct a decision model that incorporated patient preferences over differing health state prospects and to analyze the decision context of early stage breast cancer patients in relation to two main surgical treatment options. A Markov chain was constructed to project the clinical history of breast carcinoma following surgery. A Multi-Attribute Utility Model was developed for outcome evaluation. Transition probabilities were obtained by using subjective probability assessment. This study was performed on a sample population of female university students, and utilities were elicited from these healthy volunteers. The results were validated by using the Standard Gamble technique. Finally, Monte Carlo simulation was performed in the TreeAge Pro 2006 Suite software in order to calculate the expected utility generated by each treatment option. The results showed that, if the subject had a mastectomy, the mean value of quality-adjusted life years gained was 6.42; if the preference was lumpectomy, it was 7.00 out of a possible 10 years. Sensitivity analysis on transition probabilities to the local recurrence and salvaged states was performed and two threshold values were observed. Additionally, sensitivity analysis on utilities showed that the model was more sensitive to the no-evidence-of-disease state, but not sensitive to the utilities of the local recurrence and salvaged states. The decision model was developed with reasonable success for early stage breast cancer patients, and tested using general public data. The results obtained from these data showed that lumpectomy was more favourable for these participants.
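The Markov-chain-plus-Monte-Carlo machinery used here can be sketched in a few lines: simulate many patients through utility-weighted health states and average the accumulated quality-adjusted years. The states echo the abstract (no evidence of disease, local recurrence, salvaged, dead), but every transition probability and utility below is invented for illustration, not taken from the paper's calibrated model.

```python
import random

def simulate_qalys(transitions, utilities, start='NED', years=10, n=5000, seed=42):
    """Monte Carlo over a simple Markov chain of health states, crediting one
    utility-weighted year per annual cycle, averaged over n simulated patients."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        state = start
        for _ in range(years):
            total += utilities[state]
            r, acc = rng.random(), 0.0
            for nxt, p in transitions[state].items():
                acc += p
                if r < acc:          # sample the next state from the row
                    state = nxt
                    break
    return total / n

transitions = {  # illustrative annual transition probabilities
    'NED':      {'NED': 0.90, 'Recur': 0.05, 'Dead': 0.05},
    'Recur':    {'Recur': 0.30, 'Salvaged': 0.50, 'Dead': 0.20},
    'Salvaged': {'Salvaged': 0.90, 'Dead': 0.10},
    'Dead':     {'Dead': 1.0},
}
utilities = {'NED': 1.0, 'Recur': 0.6, 'Salvaged': 0.8, 'Dead': 0.0}
qalys = simulate_qalys(transitions, utilities)
```

Running the same simulation twice with treatment-specific transition probabilities and comparing the two QALY averages mirrors the mastectomy-versus-lumpectomy comparison in the study.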
Patrick, Nicholas J. M.
Our scientific goal is to understand the process of human decision-making. Specifically, a model of human decision-making in piloting modern commercial aircraft which prescribes optimal behavior, and against which we can measure human sub-optimality is sought. This model should help us understand such diverse aspects of piloting as strategic decision-making, and the implicit decisions involved in attention allocation. Our engineering goal is to provide design specifications for (1) better computer-based decision-aids, and (2) better training programs for the human pilot (or human decision-maker, DM).
Comstock, J. R.
MAT, a Multi-Attribute Task battery, gives the researcher the capability of performing multi-task workload and performance experiments. The battery provides a benchmark set of tasks for use in a wide range of laboratory studies of operator performance and workload. MAT incorporates tasks analogous to activities that aircraft crew members perform in flight, while providing a high degree of experiment control, performance data on each subtask, and freedom to use non-pilot test subjects. The MAT battery's primary display is composed of four separate task windows, as follows: a monitoring task window which includes gauges and warning lights, a tracking task window for the demands of manual control, a communication task window to simulate air traffic control communications, and a resource management task window which permits maintaining target levels on a fuel management task. In addition, a scheduling task window gives the researcher information about future task demands. The battery also provides the option of manual or automated control of tasks and generates performance data for each subtask. The task battery may be paused and onscreen workload rating scales presented to the subject. The MAT battery was designed to use a serially linked second computer to generate the voice messages for the communications task. The MATREMX program and support files, which are included in the MAT package, were designed to work with the Heath Voice Card (Model HV-2000, available through the Heath Company, Benton Harbor, Michigan 49022); however, they may easily be modified to work with other voice synthesizer or digitizer cards. The MAT battery task computer may also be used independently of the voice computer if no computer-synthesized voice messages are desired or if some other method of presenting auditory messages is devised. MAT is written in QuickBasic and assembly language for IBM PC series and compatible computers running MS-DOS. The
Kerr, Thomas A; Dakins, Maxine; Gibson, Patrick Lavern; Joe, Jeffrey Clark; Nitschke, Robert Leon; Piet, Steven James
The KONVERGENCE Model for Sustainable Decisions is a new way of viewing, developing, organizing, and evaluating alternatives for decisions that may affect a wide range of interests and that must factor in long timeframes, enduring hazards, and/or continuing responsibilities. It differs from other models in that it addresses the need for decisions to continue to "work" over long time periods in an ever-changing decision environment. The authors show that the model contains three major universes - knowledge, values, and resources (the K, V, and R in KONVERGENCE) - that interact and overlap throughout the effective lifetime of a decision. They discuss how decision-makers and decision participants can use the model to craft and analyze decisions and decision processes that stand the test of time. The authors use the U.S. moon-landing program as an example of a major decision process that was sustained over time. They use the model to explain why events unfolded in the way that they did - and why we are where we are today in that program. The authors believe that this model will be especially useful in long-term decision processes such as those that address contamination cleanup programs, long-term environmental stewardship, and the initial siting of facilities with long-term objectives. Companion papers describe the KONVERGENCE Model process steps and implications for intractable cleanup decisions.
Söllner, Anke; Bröder, Arndt; Glöckner, Andreas; Betsch, Tilmann
When decision makers are confronted with different problems and situations, do they use a uniform mechanism as assumed by single-process models (SPMs) or do they choose adaptively from a set of available decision strategies as multiple-strategy models (MSMs) imply? Both frameworks of decision making have gathered a lot of support, but only rarely have they been contrasted with each other. Employing an information intrusion paradigm for multi-attribute decisions from givens, SPM and MSM predictions on information search, decision outcomes, attention, and confidence judgments were derived and tested against each other in two experiments. The results consistently support the SPM view: Participants seemingly using a "take-the-best" (TTB) strategy do not ignore TTB-irrelevant information as MSMs would predict, but adapt the amount of information searched, choose alternative choice options, and show varying confidence judgments contingent on the quality of the "irrelevant" information. The uniformity of these findings underlines the adequacy of the novel information intrusion paradigm and comprehensively promotes the notion of a uniform decision making mechanism as assumed by single-process models.
Orasanu, Judith; Statler, Irving C. (Technical Monitor)
The importance of crew decision making to aviation safety has been well established through NTSB accident analyses: crew judgment and decision making have been cited as causes or contributing factors in over half of all accidents in commercial air transport, general aviation, and military aviation. Yet the bulk of research on decision making has not proven helpful in improving the quality of decisions in the cockpit. One reason is that traditional analytic decision models are inappropriate to the dynamic, complex nature of cockpit decision making and do not accurately describe what expert human decision makers do when they make decisions. A new model of dynamic naturalistic decision making is offered that may prove more useful for training or aiding cockpit decision making. Based on analyses of crew performance in full-mission simulation and National Transportation Safety Board accident reports, features that define effective decision strategies in abnormal or emergency situations have been identified. These include accurate situation assessment (including time and risk assessment), appreciation of the complexity of the problem, sensitivity to constraints on the decision, timeliness of the response, and use of adequate information. More effective crews also manage their workload to provide themselves with time and resources to make good decisions. In brief, good decisions are appropriate to the demands of the situation and reflect the crew's metacognitive skill. Effective crew decision making and overall performance are mediated by crew communication. Communication contributes to performance because it assures that all crew members have essential information, but it also regulates and coordinates crew actions and is the medium of collective thinking in response to a problem. This presentation will examine how communication serves to build performance. Implications of these findings for crew training will be discussed.
Spence, Caitlin M.; Brown, Casey M.
Hydroclimatic stationarity is increasingly questioned as a default assumption in flood risk management (FRM), but successor methods are not yet established. Some potential successors depend on estimates of future flood quantiles, but methods for estimating future design storms are subject to high levels of uncertainty. Here we apply a Nonstationary Decision Model (NDM) to flood risk planning within the decision scaling framework. The NDM combines a nonstationary probability distribution of annual peak flow with optimal selection of flood management alternatives using robustness measures. The NDM incorporates structural and nonstructural FRM interventions and valuation of flows supporting ecosystem services to calculate expected cost of a given FRM strategy. A search for the minimum-cost strategy under incrementally varied representative scenarios extending across the plausible range of flood trend and value of the natural flow regime discovers candidate FRM strategies that are evaluated and compared through a decision scaling analysis (DSA). The DSA selects a management strategy that is optimal or close to optimal across the broadest range of scenarios or across the set of scenarios deemed most likely to occur according to estimates of future flood hazard. We illustrate the decision framework using a stylized example flood management decision based on the Iowa City flood management system, which has experienced recent unprecedented high flow episodes. The DSA indicates a preference for combining infrastructural and nonstructural adaptation measures to manage flood risk and makes clear that options-based approaches cannot be assumed to be "no" or "low regret."
Teodorescu, Andrei R.; Usher, Marius
A multitude of models have been proposed to account for the neural mechanism of value integration and decision making in speeded decision tasks. While most of these models account for existing data, they largely disagree on a fundamental characteristic of the choice mechanism: independent versus different types of competitive processing. Five…
Xue-jun, Tang; Jia, Chen
According to the basic theory of grey relational analysis, this paper constructs a three-dimensional grey interval relation degree model over the three dimensions of time, index and scheme. On this basis, it formulates and solves a single-objective optimization model, obtains each scheme's degree of affiliation to the positive and negative ideal schemes, and ranks the schemes accordingly. The results show that the three-dimensional grey relation degree simplifies the traditional dynamic multi-attribute decision-making method and better handles dynamic multi-attribute decision-making problems with interval numbers. Finally, the paper demonstrates the practicality and efficiency of the model through a case study.
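The grey relational computation at the core of such rankings can be sketched as follows. This is a one-dimensional simplification of the paper's three-dimensional time/index/scheme construction: each scheme's normalised indicator vector is compared against a positive ideal scheme, with the customary resolution coefficient ρ = 0.5; the data are made up.

```python
def grey_relational_degrees(reference, schemes, rho=0.5):
    """Grey relational degree of each scheme's indicator vector against a
    reference (ideal) vector; delta_min/delta_max are taken globally over
    all schemes and indicators, as in classical grey relational analysis."""
    deltas = {s: [abs(r - x) for r, x in zip(reference, xs)]
              for s, xs in schemes.items()}
    flat = [d for ds in deltas.values() for d in ds]
    dmin, dmax = min(flat), max(flat)
    return {s: sum((dmin + rho * dmax) / (d + rho * dmax) for d in ds) / len(ds)
            for s, ds in deltas.items()}

ideal = [1.0, 1.0, 1.0]   # positive ideal scheme (indicators normalised to [0, 1])
schemes = {"A": [0.9, 0.8, 1.0], "B": [0.6, 0.7, 0.5]}
degrees = grey_relational_degrees(ideal, schemes)
print(max(degrees, key=degrees.get))  # "A" lies closest to the ideal
```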
This paper first constructs a classification framework for multi-attribute evaluation methods oriented to academic journals, and then theoretically discusses the comparability of the vast majority of non-linear evaluation methods and the majority of linear evaluation methods, taking the TOPSIS method as an example and using evaluation data on agricultural journals as a validation exercise. The analysis shows that enough importance should be attached to the comparability of evaluation methods for academic journals; that evaluation objectives are closely related both to the choice of evaluation methods and to their comparability; that specialized journal-evaluation organizations should release their evaluation data, methods and results to the best of their abilities; and that only purely subjective evaluation methods have broad comparability.
Torrance, G W; Boyle, M H; Horwood, S P
A four-attribute health state classification system designed to uniquely categorize the health status of all individuals two years of age and over is presented. A social preference function defined over the health state classification system is required. Standard multi-attribute utility theory is investigated for the task, problems are identified, and modifications to the standard method are proposed. The modified method was field-tested in a survey research project involving 112 home interviews. Results are presented and discussed in detail for both the social preference function and the performance of the modified method. A recommended social preference function is presented, complete with a range of uncertainty. The modified method is found to be applicable to the task--no insurmountable difficulties are encountered. Recommendations are presented, based on our experience, for other investigators who may be interested in reapplying the method in other studies.
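The "standard multi-attribute utility theory" investigated here typically means the Keeney-Raiffa multiplicative form, u = (prod_j(1 + k*k_j*u_j) - 1) / k, where the master constant k solves 1 + k = prod_j(1 + k*k_j). A minimal sketch with made-up attribute weights (not the paper's elicited ones):

```python
import math

def solve_master_k(ks, lo=-0.999999, hi=-1e-9):
    """Bisect for the master constant k of the multiplicative form.
    When the single-attribute weights sum to more than 1, k lies in (-1, 0)."""
    f = lambda k: math.prod(1 + k * kj for kj in ks) - 1 - k
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

def multiplicative_mau(k, ks, us):
    """Keeney-Raiffa multiplicative multi-attribute utility."""
    prod = math.prod(1 + k * kj * uj for kj, uj in zip(ks, us))
    return (prod - 1) / k

ks = [0.5, 0.4, 0.3]   # illustrative single-attribute weights (sum > 1)
k = solve_master_k(ks)
print(round(multiplicative_mau(k, ks, [1.0, 1.0, 1.0]), 6))  # 1.0 by construction
```

The two anchor points of the scale come out exactly: the best state on every attribute scores 1, the worst scores 0, and interactions between attributes are captured by the sign and size of k.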
Elangovan, Vinayak; Shirkhodaie, Amir
Improved situational awareness has been a vital ongoing research effort for U.S. homeland security in recent years. Many outdoor anomalous activities involve vehicles as their primary means of transportation to and from the scene where a plot is executed. Analysis of the dynamics of Human-Vehicle Interaction (HVI) helps to identify correlated patterns of activities representing potential threats. The objective of this paper is twofold. First, we discuss a method for temporal HVI event detection and verification for the generation of HVI hypotheses. To effectively recognize HVI events, a Multi-attribute Vehicle Detection and Identification (MVDI) technique for the detection and classification of stationary vehicles is presented. Second, we describe a method for identifying pertinent anomalous behaviors through analysis of state transitions between two successively detected events. Finally, we present a technique for generating HVI semantic messages and report experimental results that demonstrate the effectiveness of semantic messages for the discovery of HVI in group activities.
One of the main objectives of landscape ecology is to orient land-use planning by providing indications of optimal ecosystem patterning to support nature conservation. A frequent limitation to the practical use of the findings of landscape ecological studies is that they tend to focus on the identification and computation of indicators rather than on their interpretation and assessment. This paper presents and discusses the use of a methodology to formalise expert opinion through the elicitation of multi-attribute value functions. In particular, the value functions aim at assessing spatial indicators so as to provide an overall judgment of the viability of different ecosystem patches. The result consisted of the ranking of the ecosystems according to their degree of viability and therefore their suitability for nature conservation. The method of formalising expert opinion and knowledge complements traditional analyses based on the measurement of spatial ecological indicators.
An integrated GIS-based, multi-attribute decision model deployed in a web-based platform is presented, enabling an iterative, spatially explicit and collaborative analysis of relevant and available information for repurposing vacant land. The process incorporated traditional and ...
[Report front matter, partially recovered] AL-CR-1992-0004; AD-A256 947. Job Aiding/Training Decision Process Model. John P. Zenyuh; Phillip C... March 1990 - April 1990. Funding: contract F33615-86-C-0545; PE 62205F; PR 1121. Listed contents include: Components to Process Model Decision and Selection Points; Summary of Subject Recommendations for Aiding Approaches.
e. Compromise Programming (CP) ... f. Multi-Attribute Utility Theory (MAUT) ... their assigned weights. Multi-attribute utility theory is another popular method for decision-making ... and Roger Smith, "Multiattribute Utility Theory," (technical report, University of Wisconsin), http://www.r2d2.uwm.edu/atoms/archive
Stubelj Ars, Mojca; Bohanec, Marko
This paper studies mountain hut infrastructure in the Alps as an important element of ecotourism in the Alpine region. To improve the decision-making process regarding the implementation of future infrastructure and improvement of existing infrastructure in the vulnerable natural environment of mountain ecosystems, a new decision support model has been developed. The methodology is based on qualitative multi-attribute modelling supported by the DEXi software. The integrated rule-based model is hierarchical and consists of two submodels that cover the infrastructure of the mountain huts and that of the huts' surroundings. The final goal for the designed tool is to help minimize the ecological footprint of tourists in environmentally sensitive and undeveloped mountain areas and contribute to mountain ecotourism. The model has been tested in the case study of four mountain huts in Triglav National Park in Slovenia. Study findings provide a new empirical approach to evaluating existing mountain infrastructure and predicting improvements for the future. The assessment results are of particular interest for decision makers in protected areas, such as Alpine national parks managers and administrators. In a way, this model proposes an approach to the management assessment of mountain huts with the main aim of increasing the quality of life of mountain environment visitors as well as the satisfaction of tourists who may eventually become ecotourists.
van Ravenzwaaij, Don; van der Maas, Han L. J.; Wagenmakers, Eric-Jan
In their influential "Psychological Review" article, Bogacz, Brown, Moehlis, Holmes, and Cohen (2006) discussed optimal decision making as accomplished by the drift diffusion model (DDM). The authors showed that neural inhibition models, such as the leaky competing accumulator model (LCA) and the feedforward inhibition model (FFI), can mimic the…
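The drift diffusion model at the centre of that comparison is easy to simulate: noisy evidence accumulates with a constant drift until it crosses one of two symmetric bounds. The parameter values below are illustrative, not fitted to any dataset.

```python
import random

def ddm_trial(drift=0.3, threshold=1.0, dt=0.001, noise=1.0, rng=random):
    """One drift-diffusion trial; returns (hit upper bound?, decision time)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        # Euler step of the diffusion: mean drift plus Gaussian noise
        x += drift * dt + noise * rng.gauss(0.0, dt ** 0.5)
        t += dt
    return x > 0, t

rng = random.Random(7)
trials = [ddm_trial(rng=rng) for _ in range(500)]
accuracy = sum(correct for correct, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
print(f"accuracy ~ {accuracy:.2f}, mean decision time ~ {mean_rt:.2f}s")
```

For these parameters the closed-form accuracy is 1/(1 + exp(-2 * drift * threshold)) ≈ 0.65, a handy sanity check on the simulation.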
XCITE lifeform entities to detect and track moving or stationary objects is research and development work that should continue. The WRSTP team has ... target identification scenario. A system dynamics model was developed to predict those results. Research limitations/implications: While decision delays ... Many other decision models lack this time component and are therefore of limited use in time-critical situations. Take
Gevarter, William B.
Models of human decision making are reviewed. Models that treat just the cognitive aspects of human behavior are included, as well as models that include motivation. Both models that have associated computer programs and those that do not are considered. Since flow diagrams that assist in constructing computer simulations of such models were not generally available, such diagrams were constructed and are presented. The result provides a rich source of information that can aid in the construction of more realistic future simulations of human decision making.
Stiglic, Gregor; Kocbek, Simon; Pernek, Igor; Kokol, Peter
Purpose Classification is an important and widely used machine learning technique in bioinformatics. Researchers and other end-users of machine learning software often prefer to work with comprehensible models where knowledge extraction and explanation of reasoning behind the classification model are possible. Methods This paper presents an extension to an existing machine learning environment and a study on visual tuning of decision tree classifiers. The motivation for this research comes from the need to build effective and easily interpretable decision tree models by a so-called one-button data mining approach where no parameter tuning is needed. To avoid bias in classification, no classification performance measure is used during the tuning of the model, which is constrained exclusively by the dimensions of the produced decision tree. Results The proposed visual tuning of decision trees was evaluated on 40 datasets containing classical machine learning problems and 31 datasets from the field of bioinformatics. Although we did not expect significant differences in classification performance, the results demonstrate a significant increase of accuracy in less complex visually tuned decision trees. In contrast to classical machine learning benchmarking datasets, we observed higher accuracy gains in bioinformatics datasets. Additionally, a user study was carried out to confirm the assumption that the tree tuning times are significantly lower for the proposed method in comparison to manual tuning of the decision tree. Conclusions The empirical results demonstrate that by building simple models constrained by predefined visual boundaries, one not only achieves good comprehensibility, but also very good classification performance that does not differ from usually more complex models built using default settings of the classical decision tree algorithm. In addition, our study demonstrates the suitability of visually tuned decision trees for datasets with binary class
Zhang, Lin; Yin, Na; Fu, Xiong; Lin, Qiaomin; Wang, Ruchuan
With the development of wireless sensor networks, certain network problems have become more prominent, such as limited node resources, low data transmission security, and short network life cycles. To solve these problems effectively, it is important to design an efficient and trusted secure routing algorithm for wireless sensor networks. Traditional ant-colony optimization algorithms exhibit only local convergence, without considering the residual energy of the nodes and many other problems. This paper introduces a multi-attribute pheromone ant secure routing algorithm based on reputation value (MPASR). This algorithm can reduce the energy consumption of a network and improve the reliability of the nodes' reputations by filtering nodes with higher coincidence rates and improving the method used to update the nodes' communication behaviors. At the same time, the node reputation value, the residual node energy and the transmission delay are combined to formulate a synthetic pheromone that is used in the formula for calculating the random proportion rule in traditional ant-colony optimization to select the optimal data transmission path. Simulation results show that the improved algorithm can increase both the security of data transmission and the quality of routing service.
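The random-proportion rule with a synthetic pheromone can be sketched as follows. The way reputation, residual energy and delay are combined, and all field names and values, are illustrative assumptions, not the paper's exact MPASR formula.

```python
def next_hop_probabilities(candidates, alpha=1.0, beta=2.0):
    """Ant-colony random-proportion rule: each candidate next hop is chosen
    with probability proportional to pheromone^alpha * heuristic^beta, where
    the synthetic pheromone here folds in reputation, residual energy and
    transmission delay (an illustrative combination)."""
    def pheromone(c):
        return c["reputation"] * c["energy"] / (1.0 + c["delay"])

    weights = {n: (pheromone(c) ** alpha) * (c["heuristic"] ** beta)
               for n, c in candidates.items()}
    total = sum(weights.values())
    return {n: w / total for n, w in weights.items()}

candidates = {
    "node7": {"reputation": 0.9, "energy": 0.8, "delay": 0.2, "heuristic": 1.0},
    "node3": {"reputation": 0.4, "energy": 0.9, "delay": 0.1, "heuristic": 1.0},
}
probs = next_hop_probabilities(candidates)
print(max(probs, key=probs.get))  # favours the trusted, well-charged node7
```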
Arnegard, Ruth J.; Comstock, J. R., Jr.
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
Hazelrigg, G. A., Jr.; Brigadier, W. L.
Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission - science - the cost of flying the mission and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and costs as a function spacecraft and mission design is presented and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.
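The simulation step can be sketched directly: sample science value and cost, zero the science return when the mission fails, and average a simple net utility. The distributions, failure probability and the value-minus-cost utility are all illustrative assumptions, not the elicited model of the paper.

```python
import random

def mission_utility_mc(p_fail=0.15, n=20000, seed=0):
    """Monte Carlo sketch: sample science value and cost, zero the science
    return on mission failure, and average utility = value - cost.
    All numbers are illustrative."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        science = 0.0 if rng.random() < p_fail else rng.gauss(100.0, 20.0)
        cost = rng.gauss(60.0, 10.0)
        total += science - cost
    return total / n

u = mission_utility_mc()
print(round(u, 1))  # close to 0.85*100 - 60 = 25
```

In the paper's setting the same loop would be repeated per candidate spacecraft/mission design, with the resulting distributions (not just the means) feeding the decision methodology.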
The use of willingness-to-pay (WTP) survey techniques based on multi-attribute utility (MAU) approaches has been recommended by some authors as a way to deal simultaneously with two difficulties that increasingly plague environmental valuation. The first of th...
Jones, Andrew; Calvin, Katherine; Lamarque, Jean -Francois
The need for regional- and local-scale climate information is increasing rapidly as decision makers seek to anticipate and manage a variety of context-specific climate risks over the next several decades. However, global climate models are not developed with these user needs in mind, and they typically operate at resolutions that are too coarse to provide information that could be used to support regional and local decisions.
Read, L.; Madani, K.; Mokhtari, S.; Hanks, C. L.; Sheets, B.
Many competing projects have been proposed to address Interior Alaska's high cost of energy—both for electricity production and for heating. Public and private stakeholders are considering the costs associated with these competing projects which vary in fuel source, subsidy requirements, proximity, and other factors. As a result, the current projects under consideration involve a complex cost structure of potential subsidies and reliance on present and future market prices, introducing a significant amount of uncertainty associated with each selection. Multi-criteria multi-decision making (MCMDM) problems of this nature can benefit from game theory and systems engineering methods, which account for behavior and preferences of stakeholders in the analysis to produce feasible and relevant solutions. This work uses a stochastic MCMDM framework to evaluate the trade-offs of each proposed project based on a complete cost analysis, environmental impact, and long-term sustainability. Uncertainty in the model is quantified via a Monte Carlo analysis, which helps characterize the sensitivity and risk associated with each project. Based on performance measures and criteria outlined by the stakeholders, a decision matrix will inform policy on selecting a project that is both efficient and preferred by the constituents.
assessing human performance in a controlled multitask environment. The most recent release of AF-MATB contains numerous improvements and additions ... Keywords: Strategic Behavior, MATB, Multitasking, Task Battery, Simulator, Multi-Attribute Task Battery, Automation ... performance and multitasking strategy. As a result, a specific Information Throughput (IT) Mode was designed to customize the task to fit the Human
Ke, Yufeng; Qi, Hongzhi; He, Feng; Liu, Shuang; Zhao, Xin; Zhou, Peng; Zhang, Lixin; Ming, Dong
Mental workload (MW)-based adaptive systems have been found to be an effective approach to enhance the performance of human-machine interaction and to avoid human error caused by overload. However, MW estimated from the spontaneously generated electroencephalogram (EEG) has been found to be task-specific. In existing studies, an EEG-based MW classifier can work well under the task used to train the classifier (within-task) but fails completely when used to classify the MW of a task that is similar to but not included in the training data (cross-task). The possible causes have been considered to be the task-specific EEG patterns, the mismatched workload across tasks and the temporal effects. In this study, cross-task performance-based feature selection (FS) and a regression model were tried to cope with these challenges, in order to make an EEG-based MW estimator trained on working memory tasks work well under a complex simulated multi-attribute task (MAT). The results show that the performance of the regression model trained on the working memory task and tested on the multi-attribute task with the picked-out feature subset was significantly improved (correlation coefficient (COR): 0.740 ± 0.147 and 0.598 ± 0.161 for FS data and validation data respectively) compared to the performance in the same condition with all features (chance level). It can be inferred that there do exist some MW-related EEG features that can be picked out and that there is something in common between the MW of a relatively simple task and that of a complex task. This study provides a promising approach to measuring MW across tasks. PMID:25249967
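The idea of keeping only features whose workload relationship transfers across tasks can be illustrated with a toy correlation filter. The threshold, the all-tasks criterion and the data below are assumptions for illustration; the paper's actual selection is performance-based rather than correlation-based.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def cross_task_select(features_by_task, workload_by_task, thresh=0.5):
    """Keep feature indices whose |correlation| with workload exceeds
    `thresh` in every task, i.e. whose workload relationship transfers."""
    n_feat = len(next(iter(features_by_task.values()))[0])
    return [j for j in range(n_feat)
            if all(abs(pearson([row[j] for row in rows],
                               workload_by_task[t])) >= thresh
                   for t, rows in features_by_task.items())]

# Feature 0 tracks workload in both tasks; feature 1 only in the first:
features = {
    "n-back":     [[1, 1], [2, 2], [3, 3], [4, 4]],
    "multi-attr": [[1, 2], [2, 1], [3, 2], [4, 1]],
}
workload = {"n-back": [1, 2, 3, 4], "multi-attr": [1, 2, 3, 4]}
print(cross_task_select(features, workload))  # [0]
```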
This article develops a parsimonious descriptive model of individual choice and valuation in the kinds of experiments that constitute a substantial part of the literature relating to decision making under risk and uncertainty. It suggests that many of the best known "regularities" observed in those experiments may arise from a tendency for participants to perceive probabilities and payoffs in a particular way. This model organizes more of the data than any other extant model and generates a number of novel testable implications which are examined with new data.
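One concrete way to operationalise "perceiving probabilities in a particular way" is a probability-weighting function such as Prelec's w(p) = exp(-(-ln p)^alpha). The specific functional form and the alpha below are illustrative, not necessarily the model this article proposes.

```python
import math

def prelec_weight(p, alpha=0.65):
    """Prelec probability-weighting function w(p) = exp(-(-ln p)^alpha).
    alpha < 1 gives the familiar inverse-S shape; alpha = 0.65 is an
    illustrative value, not a fitted one."""
    if p <= 0.0:
        return 0.0
    if p >= 1.0:
        return 1.0
    return math.exp(-((-math.log(p)) ** alpha))

# Small probabilities are overweighted, large ones underweighted:
print(round(prelec_weight(0.01), 3), round(prelec_weight(0.99), 3))  # 0.067 0.951
```

This inverse-S pattern is one mechanism that can reproduce several of the classic "regularities" in risky-choice experiments, such as the simultaneous attraction of lotteries and insurance.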
Purcell, Braden A.; Heitz, Richard P.; Cohen, Jeremiah Y.; Schall, Jeffrey D.; Logan, Gordon D.; Palmeri, Thomas J.
Stochastic accumulator models account for response time in perceptual decision-making tasks by assuming that perceptual evidence accumulates to a threshold. The present investigation mapped the firing rate of frontal eye field (FEF) visual neurons onto perceptual evidence and the firing rate of FEF movement neurons onto evidence accumulation to test alternative models of how evidence is combined in the accumulation process. The models were evaluated on their ability to predict both response time distributions and movement neuron activity observed in monkeys performing a visual search task. Models that assume gating of perceptual evidence to the accumulating units provide the best account of both behavioral and neural data. These results identify discrete stages of processing with anatomically distinct neural populations and rule out several alternative architectures. The results also illustrate the use of neurophysiological data as a model selection tool and establish a novel framework to bridge computational and neural levels of explanation. PMID:20822291
Ito, Makoto; Doya, Kenji
Computational models of reinforcement learning have recently been applied to the analysis of brain imaging and neural recording data to identify neural correlates of specific processes of decision making, such as valuation of action candidates and parameters of value learning. However, for such model-based analysis paradigms, selecting an appropriate model is crucial. In this study we analyze the process of choice learning in rats using stochastic rewards. We show that "Q-learning," which is a standard reinforcement learning algorithm, does not adequately reflect the features of choice behaviors. Thus, we propose a generalized reinforcement learning (GRL) algorithm that incorporates the negative reward effect of reward loss and forgetting of values of actions not chosen. Using the Bayesian estimation method for time-varying parameters, we demonstrate that the GRL algorithm can predict an animal's choice behaviors as efficiently as the best Markov model. The results suggest the usefulness of the GRL for the model-based analysis of neural processes involved in decision making.
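The two extensions beyond plain Q-learning can be sketched in one update step: reward omission pulls the chosen action's value toward a negative target, and unchosen actions decay toward zero. The parameter names and values are illustrative, not the paper's exact GRL formulation.

```python
def grl_update(q, chosen, rewarded, alpha1=0.2, alpha2=0.1,
               kappa1=1.0, kappa2=0.3):
    """One GRL-style update: the chosen action's value moves toward
    +kappa1 on reward or -kappa2 on omission (the negative reward effect),
    while every unchosen action's value is forgotten toward zero."""
    new_q = {}
    for a, v in q.items():
        if a == chosen:
            target = kappa1 if rewarded else -kappa2
            new_q[a] = v + alpha1 * (target - v)
        else:
            new_q[a] = (1 - alpha2) * v   # forgetting of the unchosen action
    return new_q

q = {"left": 0.5, "right": 0.5}
q = grl_update(q, "left", rewarded=False)   # omission punishes the chosen side
print(q)  # left drops below 0.5; right decays toward 0
```

Setting kappa2 = 0 and alpha2 = 0 recovers a standard Q-learning update, which is what makes the comparison against plain Q-learning in the paper well posed.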
Levison, W. H.; Tanner, R. B.
A model for human decision making is an adaptation of an optimal control model for pilot/vehicle systems. The models for decision and control both contain concepts of time delay, observation noise, optimal prediction, and optimal estimation. The decision making model was intended for situations in which the human bases his decision on his estimate of the state of a linear plant. Experiments are described for the following task situations: (a) single decision tasks, (b) two-decision tasks, and (c) simultaneous manual control and decision making. Using fixed values for model parameters, single-task and two-task decision performance can be predicted to within an accuracy of 10 percent. Agreement is less good for the simultaneous decision and control situation.
Peacock, Stuart J; Richardson, Jeff R J; Carter, Rob; Edwards, Diana
Programme budgeting and marginal analysis (PBMA) is becoming an increasingly popular tool in setting health service priorities. This paper presents a novel multi-attribute utility (MAU) approach to setting health service priorities using PBMA. This approach includes identifying the attributes of the MAU function; describing and scaling attributes; quantifying trade-offs between attributes; and combining single conditional utility functions into the MAU function. We illustrate the MAU approach using a PBMA case study in mental health services from the Community Health Sector in metropolitan South Australia.
Tweddale, R. Bruce
Presented is a budgetary decision model developed to aid the executive officers in arriving at tentative decisions on enrollment, tuition rates, increased compensation, and level of staffing as they affect the total institutional budget. The model utilizes a desk-top programmable calculator (in this case, a Burroughs Model C 3660). The model…
Zhang, Wancheng; Xu, Yejun; Wang, Huimin
The aim of this paper is to put forward a consensus reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of the 2-tuple linguistic label are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes respectively. To eliminate conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus reaching process. After conflict elimination, the final group preference can be obtained, which gives the ranking of the alternatives. The model can effectively avoid the information distortion which occurs regularly in linguistic information processing. Finally, an illustrative example is given to demonstrate the application of the proposed method, and a comparative analysis with existing methods is offered to show its advantages.
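The 2-tuple linguistic representation behind the method is what keeps aggregation lossless: a numeric aggregate beta on a scale of g+1 labels is stored as the nearest label plus a symbolic translation in [-0.5, 0.5). A minimal sketch (the 5-label scale and values are illustrative):

```python
def to_two_tuple(beta):
    """Herrera-Martinez 2-tuple: beta in [0, g] becomes (label index,
    symbolic translation). Python's round() suffices away from exact
    .5 ties."""
    i = round(beta)
    return i, beta - i

def from_two_tuple(i, alpha):
    """Inverse transformation: recover the numeric value exactly."""
    return i + alpha

# Aggregating labels s1, s3, s4 on a 5-label scale by arithmetic mean:
beta = (1 + 3 + 4) / 3
label, alpha = to_two_tuple(beta)
print(label, round(alpha, 3))  # 3 -0.333: "s3, a third of the way toward s2"
```

Because the translation alpha is carried along with the label, no information is discarded when results are mapped back to words, which is exactly the distortion the abstract says the model avoids.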
Evers, Colin W.
Explores implications for understanding educational decision making from a cognitive science perspective. Examines three models of mind providing the methodological framework for decision-making studies. The "absent mind" embodies the behaviorist research tradition. The "functionalist mind" underwrites traditional cognitivism…
Siminoff, Laura A; Step, Mary M
The authors present a communication model of shared decision making (CMSDM) that explicitly identifies the communication process as the vehicle for decision making in cancer treatment. In this view, decision making is necessarily a sociocommunicative process whereby people enter into a relationship, exchange information, establish preferences, and choose a course of action. The model derives from contemporary notions of behavioral decision making and ethical conceptions of the doctor-patient relationship. This article briefly reviews the theoretical approaches to decision making, notes deficiencies, and embeds a more socially based process into the dynamics of the physician-patient relationship, focusing on cancer treatment decisions. In the CMSDM, decisions depend on (a) antecedent factors that have potential to influence communication, (b) jointly constructed communication climate, and (c) treatment preferences established by the physician and the patient.
Sam, Kabari; Coulon, Frédéric; Prpich, George
The Ogoniland region of the Niger Delta contains a vast number of sites contaminated with petroleum hydrocarbons that originated from Nigeria's active oil sector. The United Nations Environment Programme (UNEP) reported on this widespread contamination in 2011, however, wide-scale action to clean-up these sites has yet to be initiated. A challenge for decision makers responsible for the clean-up of these sites has been the prioritisation of sites to enable appropriate allocation of scarce resources. In this study, a risk-based multi-criteria decision analysis framework was used to prioritise high-risk sites contaminated with petroleum hydrocarbons in the Ogoniland region of Nigeria. The prioritisation method used a set of risk-based attributes that took into account chemical and ecological impacts, as well as socio-economic impacts, providing a holistic assessment of the risk. Data for the analysis was taken from the UNEP Environmental Assessment of Ogoniland, where over 110 communities were assessed for oil-contamination. Results from our prioritisation show that the highest-ranking sites were not necessarily the sites with the highest observed level of hydrocarbon contamination. This differentiation was due to our use of proximity as a surrogate measure for likelihood of exposure. Composite measures of risk provide a more robust assessment, and can enrich discussions about risk management and the allocation of resources for the clean-up of affected sites.
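The prioritisation logic, a composite weighted risk score in which exposure likelihood can outrank raw contamination, can be sketched as follows. The attributes, weights and scores are invented for illustration, not taken from the UNEP assessment data.

```python
def site_risk_score(attributes, weights):
    """Weighted-sum risk score over normalised attributes: the generic
    core of risk-based multi-criteria prioritisation."""
    return sum(weights[k] * attributes[k] for k in weights)

weights = {"contamination": 0.4, "proximity_to_people": 0.4,
           "ecological_sensitivity": 0.2}
sites = {
    "site A": {"contamination": 1.0, "proximity_to_people": 0.2,
               "ecological_sensitivity": 0.5},
    "site B": {"contamination": 0.6, "proximity_to_people": 0.9,
               "ecological_sensitivity": 0.6},
}
ranked = sorted(sites, key=lambda s: site_risk_score(sites[s], weights),
                reverse=True)
print(ranked[0])  # site B outranks the more contaminated site A
```

This reproduces in miniature the study's key observation: once proximity stands in for likelihood of exposure, the highest-ranking site need not be the most contaminated one.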
Apperl, B.; Pulido-Velazquez, M.; Andreu, J.; Karjalainen, T. P.
The implementation of the EU Water Framework Directive demands participatory water resource management approaches. Decision making in groundwater quantity and quality management is complex because of the existence of many independent actors, heterogeneous stakeholder interests, multiple objectives, different potential policies, and uncertain outcomes. Conflicting stakeholder interests have often been identified as an impediment to the realisation and success of water regulations and policies. The management of complex groundwater systems requires the clarification of stakeholders' positions (identifying stakeholder preferences and values), improving transparency with respect to outcomes of alternatives, and moving the discussion from the selection of alternatives towards the definition of fundamental objectives (value-thinking approach), which facilitates negotiation. The aims of the study are to analyse the potential of the multi-attribute value theory for conflict resolution in groundwater management and to evaluate the benefit of stakeholder incorporation into the different stages of the planning process, to find an overall satisfying solution for groundwater management. The research was conducted in the Mancha Oriental groundwater system (Spain), subject to intensive use of groundwater for irrigation. A complex set of objectives and attributes was defined, and the management alternatives were created by a combination of different fundamental actions, considering different implementation stages and future changes in water resource availability. Interviews were conducted with representative stakeholder groups using an interactive platform, showing simultaneously the consequences of changes in preferences to the alternative ranking. Results show that the approval of alternatives depends strongly on the combination of measures and the implementation stages. Uncertainties in the results were notable, but did not influence the alternative ranking heavily. The
Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.
Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.
Bracco, A.; Neelin, J. D.; Luo, H.; McWilliams, J. C.; Meyerson, J. E.
An important source of uncertainty in climate models is linked to the calibration of model parameters. Interest in systematic and automated parameter optimization procedures stems from the desire to improve the model climatology and to quantify the average sensitivity associated with potential changes in the climate system. Building on the smoothness of the response of an atmospheric circulation model (AGCM) to changes in four adjustable parameters, Neelin et al. (2010) used a quadratic metamodel to objectively calibrate the AGCM. The metamodel accurately estimates global spatial averages of common fields of climatic interest, from precipitation to low- and high-level winds, and from temperature at various levels to sea level pressure and geopotential height, while providing a computationally cheap strategy to explore the influence of parameter settings. Here, guided by the metamodel, the ambiguities or dilemmas related to the decision making process in relation to model sensitivity and optimization are examined. Simulations of current climate are subject to considerable regional-scale biases. Those biases may vary substantially depending on the climate variable considered, and/or on the performance metric adopted. Common dilemmas are associated with model revisions yielding improvement in one field, regional pattern or season, but degradation in another, or improvement in the model climatology but degradation in the representation of interannual variability. Challenges are posed to the modeler by the high dimensionality of the model output fields and by the large number of adjustable parameters. The use of the metamodel in the optimization strategy helps visualize trade-offs at a regional level, e.g., how mismatches between sensitivity and error spatial fields yield regional errors under minimization of global objective functions.
Markov models (multistate transition models) are mathematical tools that simulate a cohort of individuals followed over time to assess the prognosis resulting from different strategies. They are applied on the assumption that persons are in one of a finite number of health states (Markov states). Each state is assigned a transition probability as well as an incremental value. Probabilities may be held constant or may vary over time according to predefined rules. The time horizon is divided into equal increments (Markov cycles). The model calculates quality-adjusted life expectancy in real-life units and values by summing the length of time spent in each health state, adjusted for objective outcomes and subjective appraisal. Modelling prognosis for a given patient in this way is analogous to the utility in common decision trees. Markov models can be evaluated by matrix algebra, probabilistic cohort simulation and Monte Carlo simulation. They have been applied to assess the relative benefits and risks of a limited number of diagnostic and therapeutic procedures in radiology. More interventions should be submitted to Markov analyses in order to elucidate their cost-effectiveness.
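As a hedged illustration of the Markov cohort mechanics described in this abstract, the sketch below distributes a cohort over health states, redistributes it each cycle by a transition matrix, and accumulates quality-adjusted life expectancy. The state names, transition probabilities and utilities are invented for the example, not taken from any study.

```python
# Minimal Markov cohort simulation. Each cycle the cohort is redistributed
# by the transition matrix, and quality-adjusted life years accrue in
# proportion to state occupancy. All numbers are illustrative.
states = ["well", "sick", "dead"]
# transition[i][j] = P(move from state i to state j) in one cycle
transition = [
    [0.85, 0.10, 0.05],
    [0.00, 0.70, 0.30],
    [0.00, 0.00, 1.00],
]
utility = [1.0, 0.6, 0.0]  # quality adjustment per cycle spent in each state

def cohort_qale(start, n_cycles):
    """Quality-adjusted life expectancy for a cohort starting in `start`."""
    dist = [0.0] * len(states)
    dist[states.index(start)] = 1.0
    qale = 0.0
    for _ in range(n_cycles):
        qale += sum(p * u for p, u in zip(dist, utility))
        dist = [sum(dist[i] * transition[i][j] for i in range(len(states)))
                for j in range(len(states))]
    return qale

print(cohort_qale("well", 50))
```

The same recurrence can also be written as repeated multiplication of the occupancy vector by the transition matrix, which is the matrix-algebra evaluation the abstract mentions.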
Kuakunrittiwong, T.; Ratanakuakangwan, S.
Under the Power Development Plan 2015, Thailand has to diversify its heavily gas-fired electricity generation. The main owner of the electricity transmission grid is responsible for implementing several coal-fired power plants with clean coal technology. To handle and transport unprecedented quantities of sub-bituminous and bituminous coal in an environmentally sound and economical way, a coal center is required. The location of such a facility is an important strategic decision and paramount to the success of the energy plan. As site selection involves many criteria, the Fuzzy Analytic Hierarchy Process (Fuzzy-AHP) is applied to select the most suitable location among three candidates. Having analyzed the relevant criteria and the potential alternatives, the result reveals that engineering and socioeconomic factors are the most important criteria and that Map Ta Phut is the most suitable site for the coal center.
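A small sketch of the core AHP computation this abstract relies on: deriving priority weights from a pairwise comparison matrix. Fuzzy-AHP replaces the crisp judgements with fuzzy numbers before a similar synthesis step; here only the crisp case is shown, and the criteria and judgements are invented for illustration, not taken from the study.

```python
import math

criteria = ["engineering", "socioeconomic", "environmental"]
# pairwise[i][j] = how much more important criterion i is than j (1-9 scale);
# the matrix is reciprocal: pairwise[j][i] = 1 / pairwise[i][j].
pairwise = [
    [1.0,  2.0,   4.0],
    [0.5,  1.0,   3.0],
    [0.25, 1 / 3, 1.0],
]

def ahp_weights(matrix):
    """Geometric-mean approximation to the principal eigenvector."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

weights = ahp_weights(pairwise)
```

With these hypothetical judgements the engineering criterion receives the largest weight, mirroring the kind of result the abstract reports.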
In an age when decision making is becoming more and more significant for us human beings as we face dilemmas about whether or not to clone, to engineer behavior on a mass scale, or to expand or decrease nuclear power, we educators must assist students in increasing their decision-making skills. Many of our students will soon be decision makers for…
Bitzer, Sebastian; Bruineberg, Jelle; Kiebel, Stefan J.
Even for simple perceptual decisions, the mechanisms that the brain employs are still under debate. Although current consensus states that the brain accumulates evidence extracted from noisy sensory information, open questions remain about how this simple model relates to other perceptual phenomena such as flexibility in decisions, decision-dependent modulation of sensory gain, or confidence about a decision. We propose a novel approach of how perceptual decisions are made by combining two influential formalisms into a new model. Specifically, we embed an attractor model of decision making into a probabilistic framework that models decision making as Bayesian inference. We show that the new model can explain decision making behaviour by fitting it to experimental data. In addition, the new model combines for the first time three important features: First, the model can update decisions in response to switches in the underlying stimulus. Second, the probabilistic formulation accounts for top-down effects that may explain recent experimental findings of decision-related gain modulation of sensory neurons. Finally, the model computes an explicit measure of confidence which we relate to recent experimental evidence for confidence computations in perceptual decision tasks. PMID:26267143
Flaming, Susan C.
The continuing saga of satellite technology development is as much a story of successful risk management as of innovative engineering. How do program leaders on complex technology projects manage high-stakes risks that threaten business success and satellite performance? This grounded theory study of risk decision making portrays decision leadership practices at one communication satellite company. Integrated product team (IPT) leaders of multi-million dollar programs were interviewed and observed to develop an extensive description of the leadership skills required to navigate organizational influences and drive challenging risk decisions to closure. Based on the study's findings the researcher proposes a new decision making model, Deliberative Decision Making, to describe the program leaders' cognitive and organizational leadership practices. This Deliberative Model extends the insights of prominent decision making models, including the rational (or classical) and the naturalistic, and qualifies claims made by bounded rationality theory. The Deliberative Model describes how leaders proactively engage resources to play a variety of decision leadership roles. The Model incorporates six distinct types of leadership decision activities, undertaken in varying sequence based on the challenges posed by specific risks. Novel features of the Deliberative Decision Model include an inventory of leadership methods for managing task challenges, potential stakeholder bias and debates; four types of leadership meta-decisions that guide decision processes; and aligned organizational culture. Both supporting and constraining organizational influences were observed as leaders managed major risks, requiring active leadership on the most difficult decisions. Although the company's engineering culture emphasized the importance of data-based decisions, the uncertainties intrinsic to satellite risks required expert engineering judgment to be exercised throughout. An investigation into
Levison, W. H.
The optimal control model for pilot-vehicle systems has been extended to handle certain types of human decision tasks. The model for decision making incorporates the observation noise, optimal estimation, and prediction concepts that form the basis of the model for control behavior. Experiments are described for the following task situations: (1) single decision tasks; (2) two decision tasks; and (3) simultaneous manual control and decision tasks. Using fixed values for model parameters, single-task and two-task decision performance scores can be predicted to within 10 percent. The experiment on simultaneous control and decision indicates the presence of task interference in this situation, but the results are not adequate to allow a conclusive test of the predictive capability of the model.
Stefanopoulos, Kyriakos; Yang, Hong; Gemitzi, Alexandra; Tsagarakis, Konstantinos P
Multi-Attribute Value Theory (MAVT) was used to investigate stakeholders' preferences and beliefs in ameliorating a deteriorating ecosystem, i.e. Vosvozis River and Ismarida Lake in Northeastern Greece. Various monetary and environmental criteria were evaluated with scores and weights by different stakeholder groups and key individuals, such as farmers, fishermen, entrepreneurs, residents and ecologists, to elicit their preferences concerning alternative protection scenarios. The ultimate objective was to propose policy recommendations for sustainable water resources management in the case study area. The analysis revealed an overwhelming agreement among stakeholders regarding the dire need for immediate action to preserve and enhance the Vosvozis ecosystem. With a two-stage evaluation process, the MAVT analysis led to a high consensus among the stakeholders on the alternative that favors water recycling from the wastewater treatment plant combined with small dams for rainwater harvesting.
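The aggregation step behind an MAVT ranking like the one above is typically an additive value model: each alternative scores the weighted sum of its single-criterion values. A minimal sketch, in which the criteria, weights and elicited scores are entirely hypothetical:

```python
# Additive multi-attribute value model: overall value = sum_i w_i * v_i,
# with weights summing to 1 and scores on a common 0-100 value scale.
# All criteria, weights and scores below are invented for illustration.
weights = {"cost": 0.3, "water_quality": 0.4, "biodiversity": 0.3}

alternatives = {
    "water_recycling_plus_dams": {"cost": 60, "water_quality": 85, "biodiversity": 80},
    "do_nothing":                {"cost": 100, "water_quality": 10, "biodiversity": 15},
    "large_reservoir":           {"cost": 30, "water_quality": 70, "biodiversity": 50},
}

def overall_value(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must normalise
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(alternatives,
                 key=lambda a: overall_value(alternatives[a], weights),
                 reverse=True)
```

Re-running the ranking under each stakeholder group's own weights is what makes the consensus (or lack of it) across groups visible.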
Santiago-Espada, Yamira; Myer, Robert R.; Latorella, Kara A.; Comstock, James R., Jr.
The Multi-Attribute Task Battery (MAT Battery), a computer-based task designed to evaluate operator performance and workload, has been redeveloped to operate in the Windows XP Service Pack 3, Windows Vista and Windows 7 operating systems. MATB-II includes essentially the same tasks as the original MAT Battery, plus new configuration options, including a graphical user interface for controlling modes of operation. MATB-II can be executed in either training or testing mode, as defined by the MATB-II configuration file. The configuration file also allows set-up of the default timeouts for the tasks, and of the flow rates of the pumps and the tank levels in the Resource Management (RESMAN) task. MATB-II comes with a default event file that an experimenter can modify and adapt
Zhao, Tiesong; Kwong, Sam; Wang, Hanli; Wang, Zhou; Pan, Zhaoqing; Kuo, C-C Jay
In a generic decision process, optimal stopping theory aims to achieve a good tradeoff between decision performance and time consumed, with the advantages of theoretically grounded decision making and predictable decision performance. In this paper, optimal stopping theory is employed to develop an effective hybrid model for the mode decision problem, which aims to theoretically achieve a good tradeoff between the two interrelated measurements in mode decision, namely computational complexity reduction and rate-distortion degradation. The proposed hybrid model is implemented and examined with a multiview encoder. To support the model and further promote coding performance, the multiview coding mode characteristics, including predicted mode probability and estimated coding time, are jointly investigated with inter-view correlations. Exhaustive experimental results with a wide range of video resolutions reveal the efficiency and robustness of our method, with high decision accuracy, negligible computational overhead, and almost intact rate-distortion performance compared to the original encoder.
Nygren, Thomas E.
Research on human decision making has traditionally focused on how people actually make decisions, how good their decisions are, and how their decisions can be improved. Recent research suggests that this model is inadequate. Affective as well as cognitive components drive the way information about relevant outcomes and events is perceived, integrated, and used in the decision making process. The affective components include how the individual frames outcomes as good or bad, whether the individual anticipates regret in a decision situation, the affective mood state of the individual, and the psychological stress level anticipated or experienced in the decision situation. A focus of the current work has been to propose empirical studies that will attempt to examine in more detail the relationships between the latter two critical affective influences (mood state and stress) on decision making behavior.
Yanik, H. Bahadir; Memis, Yasin
Engaging students in studies about conservation and sustainability can support their understanding of making environmentally conscious decisions to conserve Earth. This article aims to contribute to these efforts and direct students' attention to how they can use mathematics to make environmental decisions. Contributors to iSTEM: Integrating…
looked at research on "niacin" indexed by Thomson Reuters Web of Knowledge (http://wokinfo.com). This topic was chosen due to its interest to a variety... Thomson Reuters Web of Knowledge. 5. CONCLUSION In this paper, we presented the MANCaLog language for modeling cascades in multi-agent systems organized...density lipoproteins (HDL). Using Thomson Reuters Web of Knowledge (http://wokinfo.com) we were able to extract information on 4,202 articles about
Wei, Huaqiang; Alves-Foss, James; Soule, Terry; Pforsich, Hugh; Zhang, Du; Frincke, Deborah A.
System security involves decisions in at least three areas: identification of well-defined security policies, selection of cost-effective defence strategies, and implementation of real-time defence tactics. Although choices made in each of these areas affect the others, existing decision models typically handle these three decision areas in isolation. There is no comprehensive tool that can integrate them to provide a single efficient model for safeguarding a network. In addition, there is no clear way to determine which particular combinations of defence decisions result in cost-effective solutions. To address these problems, this paper introduces a Layered Decision Model (LDM) for use in deciding how to address defence decisions based on their cost-effectiveness. To validate the LDM and illustrate how it is used, we used simulation to test model rationality and applied the LDM to the design of system security for an e-commerce business case.
is likely to be more intense near faults--sometimes referred to as the damaged zone. Yet another constraint, based on world-wide observations, is that the maximum likely fracture density increases with depth in a well-defined way. Defining these prior constraints has several benefits: they lead to a priori probability distributions of fractures that are important for objective statistical integration; they limit the number of geologic hypotheses that need to be theoretically modeled; and they provide plausible models for fracture distributions below the seismic resolution. The second element was theoretical rock physics modeling of optimal seismic attributes, including offset and azimuth dependence of traveltime, amplitude, and impedance signatures of anisotropic fractured rocks. The suggested workflow is to begin with an elastic earth model based on well logs, theoretically add fractures to the likely facies as defined by the geologic prior information, and then compute synthetic seismic traces and attributes, including variations in P- and S-wave velocities, Poisson's ratio, reflectivity, travel time, attenuation, and anisotropies of these parameters. This workflow is done in a Monte Carlo fashion, yielding ranges of expected fracture signatures and allowing realistic assessments of uncertainty to be honored. The third element was statistical integration of the geophysical data and prior constraints to map fracture intensity and orientations, along with uncertainties. A Bayesian framework was developed that allowed systematic integration of the prior constraints, the theoretical relations between fractures and their seismic signatures, and the various observed seismic observations. The integration scheme was successfully applied on an East Texas field site. The primary benefit from the study was the optimization and refinement of practical workflows for improved geophysical characterization of natural fractures and for quantifying the uncertainty of these
The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so-called 'quantum-like' models is to model decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe the decision making processes more accurately. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.; Zuffranieri, Jason V.; Paananen, Orman Henrie; Jones, Scott A.; Ortner, Juergen G.; Brewer, Jeffrey D.; Valdez, Maximo M.
This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories' multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that allows comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.
Merrick, Jason R W; Leclerc, Philip
Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek the rational decision, but we use prospect theory to solve for the attacker's decision, descriptively modeling the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision: whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions.
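The two prospect-theory ingredients the abstract invokes, loss aversion and likelihood insensitivity, can be sketched as a value function and an inverse-S probability weighting function. The parameter values below are common illustrative defaults from the prospect theory literature, not the ones used in the article, and the attack gamble at the end is hypothetical.

```python
import math

def pt_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value: concave in gains, convex in losses,
    with losses amplified by the loss-aversion coefficient lam."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def prelec_weight(p, gamma=0.65):
    """Prelec probability weighting: overweights small probabilities,
    underweights large ones (likelihood insensitivity)."""
    if p in (0.0, 1.0):
        return p
    return math.exp(-((-math.log(p)) ** gamma))

# Prospect value of a hypothetical attack gamble:
# succeed with probability 0.1 (gain 100), else fail (lose 20).
v = prelec_weight(0.1) * pt_value(100) + prelec_weight(0.9) * pt_value(-20)
```

In the article's setup, the defender would optimize expected utility while the attacker's choice among targets is evaluated with functions of this descriptive form.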
Lundin, Edward.; Welty, Gordon
The rational model of classical economic theory assumes that the decision maker has complete information on alternatives and consequences, and that he chooses the alternative that maximizes expected utility. This model does not allow for constraints placed on the decision maker resulting from lack of information, organizational pressures,…
Ratcliff, Roger; Gomez, Pablo; McKoon, Gail
The diffusion model for 2-choice decisions (R. Ratcliff, 1978) was applied to data from lexical decision experiments in which word frequency, proportion of high- versus low-frequency words, and type of nonword were manipulated. The model gave a good account of all of the dependent variables--accuracy, correct and error response times, and their…
Singer, Alexander; Salman, Mo; Thulke, Hans-Hermann
Animal health is of societal importance as it affects human welfare, and anthropogenic interests shape decision making to assure animal health. Scientific advice to support decision making is manifold. Modelling, as one piece of the scientific toolbox, is appreciated for its ability to describe and structure data, to give insight into complex processes and to predict future outcomes. In this paper we study the application of scientific modelling to support practical animal health decisions. We reviewed the 35 animal health related scientific opinions adopted by the Animal Health and Animal Welfare Panel of the European Food Safety Authority (EFSA). Thirteen of these documents were based on the application of models. The review took two viewpoints, the decision maker's need and the modeller's approach. In the reviewed material three types of modelling questions were addressed by four specific model types. The correspondence between tasks and models underpinned the importance of the modelling question in triggering the modelling approach. End point quantifications were the dominating request from decision makers, implying that prediction of risk is a major need. However, due to knowledge gaps, corresponding modelling studies often shied away from providing exact numbers. Instead, comparative scenario analyses were performed, furthering the understanding of the decision problem and the effects of alternative management options. In conclusion, the most adequate scientific support for decision making - including available modelling capacity - might be expected if the required advice is clearly stated.
Piet, Steven James; Dakins, Maxine Ellen; Gibson, Patrick Lavern; Joe, Jeffrey Clark; Kerr, Thomas A; Nitschke, Robert Leon
Some cleanup decisions, such as cleanup of intractable contaminated sites or disposal of spent nuclear fuel, have proven difficult to make. Such decisions face high resistance to agreement from stakeholders, possibly because they do not trust the decision makers, view the consequences of being wrong as too high, etc. Our project's goal is to improve science-based cleanup decision making. This includes diagnosing intractable situations as a step towards identifying a path to sustainable solutions. Companion papers describe the underlying philosophy of the KONVERGENCE Model for Sustainable Decisions [1] and the overall framework and process steps [2]. Where knowledge, values, and resources converge (the K, V, and R in KONVERGENCE), you will find a sustainable decision - a decision that works over time. For intractable cases, serious consideration of the adaptable class of alternatives is warranted - if properly implemented and packaged.
Rogers, Richard S; Nightlinger, Nancy S; Livingston, Brittney; Campbell, Phil; Bailey, Robert; Balland, Alain
Regulatory agencies have recently recommended a Quality by Design (QbD) approach for the manufacturing of therapeutic molecules. A QbD strategy requires deep understanding at the molecular level of the attributes that are crucial for safety and efficacy, and for ensuring that the desired quality of the purified protein drug product is met at the end of the manufacturing process. A mass spectrometry (MS)-based approach to simultaneously monitor the extensive array of product quality attributes (PQAs) present on therapeutic molecules has been developed. This multi-attribute method (MAM) uses a combination of high mass accuracy / high resolution MS data generated by Orbitrap technology and automated identification and relative quantification of PQAs with dedicated software (Pinpoint). The MAM has the potential to replace several conventional electrophoretic and chromatographic methods currently used in Quality Control to release therapeutic molecules. The MAM represents an optimized analytical solution to focus on the attributes of the therapeutic molecule essential for function and to implement QbD principles across process development, manufacturing and drug disposition. PMID:26186204
Fuss, Ian G; Navarro, Daniel J
In recent years quantum probability models have been used to explain many aspects of human decision making, and as such quantum models have been considered a viable alternative to Bayesian models based on classical probability. One criticism that is often leveled at both kinds of models is that they lack a clear interpretation in terms of psychological mechanisms. In this paper we discuss the mechanistic underpinnings of a quantum walk model of human decision making and response time. The quantum walk model is compared to standard sequential sampling models, and the architectural assumptions of both are considered. In particular, we show that the quantum model has a natural interpretation in terms of a cognitive architecture that is both massively parallel and involves both co-operative (excitatory) and competitive (inhibitory) interactions between units. Additionally, we introduce a family of models that includes aspects of the classical and quantum walk models.
Zeiss, Ragna; van Egmond, Stans
This article studies the roles three science-based models play in Dutch policy and decision making processes. Key is the interaction between model construction and environment. Their political and scientific environments form contexts that shape the roles of models in policy decision making. Attention is paid to three aspects of the wider context of the models: a) the history of the construction process; b) (changes in) the political and scientific environments; and c) the use in policy processes over longer periods of time. Models are more successfully used when they are constructed in a stable political and scientific environment. Stability and certainty within a scientific field seems to be a key predictor for the usefulness of models for policy making. The economic model is more disputed than the ecology-based model and the model that has its theoretical foundation in physics and chemistry. The roles models play in policy processes are too complex to be considered as straightforward technocratic powers.
Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.
One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, some initial work to support the elicitation of uncertain requirements, and approaches to combining such information from multiple stakeholders.
Wright, Adam; Sittig, Dean F.
In this paper, we develop a four-phase model for evaluating architectures for clinical decision support that focuses on: defining a set of desirable features for a decision support architecture; building a proof-of-concept prototype; demonstrating that the architecture is useful by showing that it can be integrated with existing decision support systems; and comparing its coverage to that of other architectures. We apply this framework to several well-known decision support architectures, including Arden Syntax, GLIF, SEBASTIAN, and SAGE. PMID:18462999
Leist, James C.; Konen, Joseph C.
Four factors of clinical decision making identified by medical students include quality of care, cost, ethics, and legal concerns. This paper argues that physicians have two responsibilities in the clinical decision-making model: to be the primary advocate for quality health care and to ensure balance among the four factors, working in partnership…
Gibson, Denise D.; Borges, Nicole J.
Objectives: The purpose of this study was to develop a working model to explain medical specialty decision-making. Using Social Cognitive Career Theory, we examined personality, medical specialty preferences, job satisfaction, and expectations about specialty choice to create a conceptual framework to guide specialty choice decision-making.…
Feather, M. S.; Cornford, S. L.; Dunphy, J.; Hicks, K.
Decisions made in the earliest phases of system development have the most leverage to influence the success of the entire development effort, and yet must be made when information is incomplete and uncertain. We have developed a scalable cost-benefit model to support this critical phase of early-lifecycle decision-making.
Bangert, Daniel; Schubert, Emery; Fabian, Dorottya
This paper describes a model of how musicians make decisions about performing notated music. The model builds on psychological theories of decision-making and was developed from empirical studies of Western art music performance that aimed to identify intuitive and deliberate processes of decision-making, a distinction consistent with dual-process theories of cognition. The model proposes that the proportion of intuitive (Type 1) and deliberate (Type 2) decision-making processes changes with increasing expertise and conceptualizes this change as movement along a continually narrowing upward spiral, where the primary axis signifies principal decision-making type and the vertical axis marks level of expertise. The model is intended to have implications for the development of expertise, as described in two main phases. The first is movement from a primarily intuitive approach in the early stages of learning toward greater deliberation as analytical techniques are applied during practice. The second phase occurs as deliberate decisions gradually become automatic (procedural), increasing the role of intuitive processes. As a performer examines more issues or reconsiders decisions, the spiral motion toward the deliberate side and back to the intuitive is repeated indefinitely. With increasing expertise, the spiral tightens to signify greater control over decision type selection. The model draws on existing theories, particularly Evans’ (2011) Intervention Model of dual-process theories, Cognitive Continuum Theory (Hammond et al., 1987; Hammond, 2007), and Baylor’s (2001) U-shaped model for the development of intuition by level of expertise. By theorizing how musical decision-making operates over time and with increasing expertise, this model could be used as a framework for future research in music performance studies and performance science more generally. PMID:24795673
Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive
In this paper, a model and tool is proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…
Wei, Huaqiang; Alves-Foss, James; Zhang, Du; Frincke, Deb
We propose a cost-effective network defense strategy built on three decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among interconnected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, the rationality of the model must first be validated before it is applied to real-world business cases. This paper describes our efforts to validate the LDM's rationality through simulation.
Rao, Rajesh P. N.
A fundamental problem faced by animals is learning to select actions based on noisy sensory information and incomplete knowledge of the world. It has been suggested that the brain engages in Bayesian inference during perception, but how such probabilistic representations are used to select actions has remained unclear. Here we propose a neural model of action selection and decision making based on the theory of partially observable Markov decision processes (POMDPs). Actions are selected based not on a single “optimal” estimate of state but on the posterior distribution over states (the “belief” state). We show how such a model provides a unified framework for explaining experimental results in decision making that involve both information gathering and overt actions. The model utilizes temporal difference (TD) learning for maximizing expected reward. The resulting neural architecture posits an active role for the neocortex in belief computation while ascribing a role to the basal ganglia in belief representation, value computation, and action selection. When applied to the random dots motion discrimination task, model neurons representing belief exhibit responses similar to those of LIP neurons in primate neocortex. The appropriate threshold for switching from information gathering to overt actions emerges naturally during reward maximization. Additionally, the time course of reward prediction error in the model shares similarities with dopaminergic responses in the basal ganglia during the random dots task. For tasks with a deadline, the model learns a decision making strategy that changes with elapsed time, predicting a collapsing decision threshold consistent with some experimental studies. The model provides a new framework for understanding neural decision making and suggests an important role for interactions between the neocortex and the basal ganglia in learning the mapping between probabilistic sensory representations and actions that maximize reward.
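The core mechanism the abstract describes, acting on a belief state rather than a point estimate of state, can be sketched as a Bayes filter over a toy two-state problem. The transition matrix, observation model, and threshold below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Hypothetical two-state world (e.g., "motion left" vs "motion right").
T = np.array([[0.9, 0.1],   # T[s, s']: transition probabilities under "sample"
              [0.1, 0.9]])
O = np.array([[0.7, 0.3],   # O[s', o]: probability of observation o in state s'
              [0.3, 0.7]])

def belief_update(b, obs):
    """Bayes filter: b'(s') is proportional to O(o|s') * sum_s T(s,s') b(s)."""
    predicted = T.T @ b               # predict step over the transition model
    posterior = O[:, obs] * predicted  # weight by observation likelihood
    return posterior / posterior.sum()

b = np.array([0.5, 0.5])              # uniform prior belief
for obs in [0, 0, 1, 0]:              # a noisy observation sequence favouring state 0
    b = belief_update(b, obs)

# Commit to an overt action only once the belief crosses a (made-up) threshold;
# otherwise keep gathering information, as in the collapsing-bound discussion.
if b[0] > 0.8:
    action = "choose state 0"
elif b[1] > 0.8:
    action = "choose state 1"
else:
    action = "sample again"
```

In the full model this threshold is not hand-set; it emerges from TD learning on expected reward.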
Lee, George; Romo Bucheli, David Edmundo; Madabhushi, Anant
Medical diagnostics is often a multi-attribute problem, necessitating sophisticated tools for analyzing high-dimensional biomedical data. Mining this data often results in two crucial bottlenecks: 1) high dimensionality of features used to represent rich biological data and 2) small amounts of labeled training data due to the expense of consulting highly specific medical expertise necessary to assess each study. Currently, no approach that we are aware of has attempted to use active learning (AL) in the context of dimensionality reduction approaches for improving the construction of low dimensional representations. We present our novel methodology, AdDReSS (Adaptive Dimensionality Reduction with Semi-Supervision), to demonstrate that fewer labeled instances identified via AL in embedding space are needed for creating a more discriminative embedding representation compared to randomly selected instances. We tested our methodology on a wide variety of domains ranging from prostate gene expression, ovarian proteomic spectra, brain magnetic resonance imaging, and breast histopathology. Across these various high dimensional biomedical datasets with 100+ observations each and all parameters considered, the median classification accuracy across all experiments showed AdDReSS (88.7%) to outperform SSAGE, a SSDR method using random sampling (85.5%), and Graph Embedding (81.5%). Furthermore, we found that embeddings generated via AdDReSS achieved a mean 35.95% improvement in Raghavan efficiency, a measure of learning rate, over SSAGE. Our results demonstrate the value of AdDReSS in providing low dimensional representations of high dimensional biomedical data while achieving higher classification rates with fewer labeled examples than without active learning.
The study reviewed 20 currently available structured ethical decision-making models and developed an integrated model consisting of six steps, with useful questions and tools that support better performance of each step: (1) identification of an ethical problem; (2) collection of additional information to characterize the problem and develop solutions; (3) development of alternatives for analysis and comparison; (4) selection of the best alternatives and their justification; (5) development of diverse, practical ways to implement ethical decisions and actions; and (6) evaluation of effects and development of strategies to prevent a similar occurrence. In a pilot test of the model, nursing students reported positive experiences, including satisfaction with having access to a comprehensive review process for the ethical aspects of decision making and becoming more confident in their decisions. The model needs to be further tested and refined in both educational and practical environments.
Rodríguez, Rosa M.; Martínez, Luis
It is common for experts involved in complex real-world decision problems to use natural language to express their knowledge in uncertain frameworks. Natural language is inherently vague, so probabilistic decision models are not well suited to such cases. Other tools, such as fuzzy logic and fuzzy linguistic approaches, have therefore been used successfully to model and manage this vagueness. The use of linguistic information implies operating with it, i.e., carrying out processes of computing with words (CWW). Different schemes have been proposed to deal with those processes, and diverse symbolic linguistic computing models have been introduced to accomplish the linguistic computations. In this paper, we overview the relationship between decision making and CWW, and focus on the symbolic linguistic computing models that have been widely used in linguistic decision making, analysing whether all of them can be considered part of the CWW paradigm.
CareScience, Inc. is a public company (NASDAQ: CARE) that originated ten years ago to commercialize risk adjustment and complication predictions developed by the Wharton School of Business and the University of Pennsylvania School of Medicine. Over the past decade, the company has grown to approximately 200 clients and 150 employees. Among the "firsts" recorded by the company, CareScience was the first to offer a clinical decision support system as an Application Service Provider (ASP), the first to offer peer-to-peer clinical data sharing among health care provider organizations and practitioners (Santa Barbara Care Data Exchange), and the first to provide a care management outsourcing arrangement.
Jimison, Holly B.
For many medical domains, uncertainty and patient preferences are important components of decision making. Decision theory is useful as a representation for such medical models in computer decision aids, but the methodology has typically performed poorly in the areas of explanation and user interface. The additional representation of probabilities and utilities as random variables provides a framework for graphical and textual insight into complicated decision models. The approach allows for efficient customization of a generic model that describes the general patient population of interest into a patient-specific model. Monte Carlo simulation is used to calculate the expected value of information and the sensitivity of each model variable, thus providing a metric for deciding what to emphasize in the graphics and text summary. The computer-generated explanation includes variables that are sensitive with respect to the decision or that deviate significantly from what is typically observed. These techniques serve to keep the assessment and explanation of the patient's decision model concise, allowing the user to focus on the most important aspects for that patient.
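The Monte Carlo sensitivity idea in the abstract can be illustrated with a toy model: treat an uncertain model probability as a random variable, sample it, and count how often the preferred option flips. The two-option model, utilities, and Beta parameters below are assumptions for illustration only.

```python
import random

random.seed(0)

def option_values(p_success):
    """Expected utilities of a made-up treat/watch decision given P(success)."""
    treat = p_success * 0.9 + (1 - p_success) * 0.2  # utility if treating
    watch = 0.6                                       # utility of watchful waiting
    return treat, watch

# Baseline decision at the point estimate P(success) = 8/12.
base_treat, base_watch = option_values(8 / 12)
base_prefers_treat = base_treat > base_watch

# Sample the uncertain probability and count decision flips.
N = 10_000
flips = 0
for _ in range(N):
    p = random.betavariate(8, 4)          # assumed uncertainty about P(success)
    treat, watch = option_values(p)
    if (treat > watch) != base_prefers_treat:
        flips += 1

sensitivity = flips / N   # fraction of samples where the decision changes
```

A variable with a high flip fraction is one worth emphasizing in the explanation, which is the role the abstract assigns to this metric.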
Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M
Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia.
Cher, D J; Miyamoto, J; Lenert, L A
Most decision models published in the medical literature take a risk-neutral perspective. Under risk neutrality, the utility of a gamble is equivalent to its expected value and the marginal utility of living a given unit of time is the same regardless of when it occurs. Most patients, however, are not risk-neutral. Not only does risk aversion affect decision analyses when tradeoffs between short- and long-term survival are involved, it also affects the interpretation of time-tradeoff measures of health-state utility. The proportional time tradeoff under- or overestimates the disutility of an inferior health state, depending on whether the patient is risk-seeking or risk-averse (it is unbiased if the patient is risk-neutral). The authors review how risk attitude with respect to gambles for survival duration can be incorporated into decision models using the framework of risk-adjusted quality-adjusted life years (RA-QALYs). They present a simple extension of this framework that allows RA-QALYs to be calculated for Markov-process decision models. Using a previously published Markov-process model of surgical vs expectant treatment for benign prostatic hypertrophy (BPH), they show how attitude towards risk affects the expected number of QALYs calculated by the model. In this model, under risk neutrality, surgery was the preferred option. Under mild risk aversion, expectant treatment was the preferred option. Risk attitude is an important aspect of preferences that should be incorporated into decision models where one treatment option has upfront risks of morbidity or mortality.
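One common way to express risk attitude over survival duration, in the spirit of the risk-adjusted QALY framework the abstract reviews, is a constant-risk-aversion (exponential) utility over life years. The parameter value and the gamble below are made-up numbers for illustration, not the authors' model.

```python
import math

def utility(years, gamma):
    """u(t) = (1 - exp(-gamma * t)) / gamma.

    gamma > 0 is risk-averse; u(t) -> t as gamma -> 0, recovering
    risk neutrality (utility equals expected survival time)."""
    if abs(gamma) < 1e-12:
        return years
    return (1.0 - math.exp(-gamma * years)) / gamma

def expected_utility(gamma):
    """Gamble: 50% chance of 20 years, 50% chance of 0 (expected value 10)."""
    return 0.5 * utility(20, gamma) + 0.5 * utility(0, gamma)

# Risk-neutral: the gamble is worth exactly its expected value of 10 years.
# Risk-averse (gamma = 0.1): the gamble is worth less than 10 sure years,
# which is how risk aversion can flip a decision with upfront mortality risk.
neutral_gap = expected_utility(0.0) - utility(10, 0.0)
averse_gap = expected_utility(0.1) - utility(10, 0.1)
```

This mirrors the abstract's BPH example, where surgery (an upfront gamble) is preferred under risk neutrality but not under mild risk aversion.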
Dowlatabadi, H.; Kandlikar, M.; Linville, C.
A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic, and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcomes of policy initiatives, and helps set priorities for research so that the outcome ambiguities faced by decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, policies for emissions mitigation, and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is most important, followed by uncertainties in damage calculations, economic uncertainties, and direct aerosol forcing uncertainties. When considering model structure uncertainties, we find that the choice of policy is often dominated by the choice of model structure rather than by parameter uncertainties.
Goldhaber-Fiebert, Jeremy D.; Bailey, Stephanie L.; Hurlburt, Michael S.; Zhang, Jinjin; Snowden, Lonnie R.; Wulczyn, Fred; Landsverk, John; Horwitz, Sarah M.
The objective was to demonstrate decision-analytic modeling in support of Child Welfare policymakers considering implementing evidence-based interventions. Outcomes included permanency (e.g., adoptions) and stability (e.g., foster placement changes). Analyses of a randomized trial of KEEP, a foster parenting intervention, and NSCAW-1 estimated placement change rates and KEEP's effects. A microsimulation model generalized these findings to other Child Welfare systems. The model projected that KEEP could increase permanency and stability, identifying strategies targeting higher-risk children and geographical regions that achieve benefits efficiently. Decision-analytic models enable planners to gauge the value of potential implementations. PMID:21861204
Qi, Xue-Mei; Zhang, Shao-Cong
D-S evidence theory provides a good approach to fuse uncertain information. In this article, we introduce seismic multi-attribute fusion based on D-S evidence theory to predict the coalbed methane (CBM) concentrated areas. First, we choose seismic attributes that are most sensitive to CBM content changes with the guidance of CBM content measured at well sites. Then the selected seismic attributes are fused using D-S evidence theory and the fusion results are used to predict CBM-enriched area. The application shows that the predicted CBM content and the measured values are basically consistent. The results indicate that using D-S evidence theory in seismic multi-attribute fusion to predict CBM-enriched areas is feasible.
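Dempster's rule of combination, the fusion step this abstract relies on, can be sketched over a toy frame of discernment {"rich", "poor"} (CBM-rich vs CBM-poor), with the full frame representing ignorance. The mass values below are illustrative, not calibrated to any well data.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: m12(A) = sum over B∩C=A of m1(B)*m2(C),
    renormalised by 1-K, where K is the mass on empty intersections."""
    out, conflict = {}, 0.0
    for (B, b), (C, c) in product(m1.items(), m2.items()):
        A = B & C
        if A:
            out[A] = out.get(A, 0.0) + b * c
        else:
            conflict += b * c          # conflicting evidence
    return {A: v / (1.0 - conflict) for A, v in out.items()}

# Focal elements: {rich}, {poor}, and the full frame (ignorance).
R, P, F = frozenset({"rich"}), frozenset({"poor"}), frozenset({"rich", "poor"})
attr1 = {R: 0.6, P: 0.1, F: 0.3}   # evidence from one seismic attribute
attr2 = {R: 0.5, P: 0.2, F: 0.3}   # evidence from a second attribute
fused = combine(attr1, attr2)       # combined belief in a CBM-rich area
```

Two attributes that each mildly favour "rich" combine into a stronger belief in "rich", which is the behaviour multi-attribute fusion exploits.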
Ferris, Michael; Brennan, Patricia Flatley; Tang, Lisa; Marquard, Jenna; Robinson, Stephen; Wright, Stephen
Eight years of progress towards the creation of a national health information network has resulted in a plethora of health data exchange relationships, most commonly called regional health information organizations (RHIOs). Various network types reflect both governance decisions and practical aspects, such as the need for a variety of information sharing pathways between and among organizations. Applying systematic business planning approaches will help ensure that decisions about structure, governance, pricing, and incentives lead to RHIO arrangements that meet both the RHIOs' and the participants' business goals. This paper describes the model formulation stage of an ongoing project that applies operations research methods to RHIO participation decisions. PMID:18693834
Prochaska, James O
Decision making is an integral part of the transtheoretical model of behavior change. Stage of change represents a temporal dimension for behavior change and has been the key dimension for integrating principles and processes of change from across leading theories of psychotherapy and behavior change. The decision-making variables representing the pros and cons of changing have been found to have systematic relationships across the stages of change for 50 health-related behaviors. Implications of these patterns of relationships are discussed in the context of helping patients make more effective decisions to decrease health risk behaviors and increase health-enhancing behaviors.
Hauck, Jessica; Ling, Thomson
Although art therapists have discussed the importance of taking a positive stance in terms of ethical decision making (Hinz, 2011), an ethical decision-making model applicable for the field of art therapy has yet to emerge. As the field of art therapy continues to grow, an accessible, theoretically grounded, and logical decision-making model is…
Samsa, M.; Van Kuiken, J.; Jusko, M.; Decision and Information Sciences
The Critical Infrastructure Protection Decision Support System Decision Model (CIPDSS-DM) is a useful tool for comparing the effectiveness of alternative risk-mitigation strategies on the basis of CIPDSS consequence scenarios. The model is designed to assist analysts and policy makers in evaluating and selecting the most effective risk-mitigation strategies, as affected by the importance assigned to various impact measures and the likelihood of an incident. A typical CIPDSS-DM decision map plots the relative preference of alternative risk-mitigation options versus the annual probability of an undesired incident occurring once during the protective life of the investment, assumed to be 20 years. The model also enables other types of comparisons, including a decision map that isolates a selected impact variable and displays the relative preference for the options of interest, parameterized on the basis of the contribution of the isolated variable to total impact, as well as the likelihood of the incident. Satisfaction/regret analysis further assists the analyst or policy maker in evaluating the confidence with which one option can be selected over another.
Wewerinke, P. H.
The decision process is described in terms of classical sequential decision theory by testing the hypothesis that an abnormal condition has occurred by means of a generalized likelihood ratio test. For this, a sufficient statistic is provided by the innovation sequence, which is the output of the perception and information-processing submodel of the human observer. On the basis of only two model parameters, the model predicts the decision speed/accuracy trade-off and various attentional characteristics. A preliminary test of the model on single-variable failure detection tasks resulted in a very good fit to the experimental data. In a formal validation program, a variety of multivariable failure detection tasks was investigated and the predictive capability of the model was demonstrated.
Ozatalay, Savas; Golin, Myron
In order to provide flexibility in mid-range financial planning at higher education institutions, a predictive modular decision-making model is presented. A simple computational procedure and a small external information requirement will enable small institutions to utilize the model. (Author/MLW)
Stephens, Ginny Lee; Reynolds, JoLynne
Discusses using Gerald Egan's model for creative decision making as a career counseling tool. Explains why to use this model and how it was adapted to meet career counseling issues. Describes its successful use in three case studies with a college sophomore in search of a major, a new graduate in search of a first job, and a homemaker. (Author/ABL)
Li, Aiping; Jin, Songchang; Zhang, Lumin; Jia, Yan
Although diagnostic expert systems whose knowledge bases model the decision-making of traditional experts can provide important information to non-experts, they tend to duplicate the errors made by those experts. Decision-theoretic models (DTMs) are therefore very useful in expert systems, since they guard against incorrect reasoning under uncertainty. For diagnostic expert systems, a corresponding DTM and its arithmetic are studied, and a sequential diagnostic decision-theoretic model based on a Bayesian network is given. In the model, the alternative features are categorized into two classes (disease features and test features), and an arithmetic for the prior of a test is provided. How different features affect the weights of other features is also discussed. A Bayesian network is adopted to handle uncertainty representation and propagation. The model can help knowledge engineers model the knowledge involved in sequential diagnosis and decide the priority of alternative evidence. A practical example of the model is also presented: at any point in the diagnostic process, the expert is provided with a dynamically updated list of suggested tests to support the decision about which test to execute next. The results show it performs better than the traditional, experience-based diagnostic model.
Hare, A.P.; Scheiblechner, Hartmann
In a test of three computer models to simulate group decisions, data were used from 31 American and Austrian groups on a total of 307 trials. The task for each group was to predict a series of answers of an unknown subject on a value-orientation questionnaire after being given a sample of his typical responses. The first model used the mean of…
Hess, Leonardo Emanuel; Haimovici, Ariel; Muñoz, Miguel Angel; Montoya, Pedro
Risky decision-making seems to be markedly disrupted in patients with chronic pain, probably due to the high cost that pain and negative mood impose on executive control functions. Patients’ behavioral performance on decision-making tasks such as the Iowa Gambling Task (IGT) is characterized by selecting cards more frequently from disadvantageous than from advantageous decks, and by switching often between competing responses in comparison with healthy controls (HCs). In the present study, we developed a simple heuristic model to simulate individuals’ choice behavior by varying the level of decision randomness and the importance given to gains and losses. The findings revealed that the model was able to differentiate the behavioral performance of patients with chronic pain and HCs at the group as well as at the individual level. The best fit of the model in patients with chronic pain was yielded when decisions were not based on previous choices and when gains were considered more relevant than losses. By contrast, the best account of the available data in HCs was obtained when decisions were based on previous experiences and losses loomed larger than gains. In conclusion, our model seems to provide a useful way to characterize each individual participant extensively and to analyze the data on a participant-by-participant basis. PMID:25136301
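The two ingredients the abstract varies, decision randomness and the relative weight of gains versus losses, can be sketched with a softmax choice rule over weighted outcome values. This is a generic sketch in the spirit of the paper's heuristic model; the learning rate, weights, and temperature are made-up parameters.

```python
import math
import random

random.seed(1)

def choose(values, temperature):
    """Softmax choice over deck values; high temperature means
    near-random decisions (the 'decision randomness' knob)."""
    exps = [math.exp(v / temperature) for v in values]
    total = sum(exps)
    r, acc = random.random() * total, 0.0
    for i, e in enumerate(exps):
        acc += e
        if r <= acc:
            return i
    return len(exps) - 1

def update(value, gain, loss, w_gain, w_loss, lr=0.2):
    """Move a deck's value toward the weighted outcome of the last draw.

    w_gain > w_loss mimics the patient-like fit (gains loom larger);
    w_loss > w_gain mimics the control-like fit (losses loom larger)."""
    outcome = w_gain * gain - w_loss * loss
    return value + lr * (outcome - value)
```

With equal weights, a draw of gain 10 and loss 5 pulls a deck's value from 0 toward 5; a low temperature makes the higher-valued deck almost certain to be chosen.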
Yan, Xiangbin; Dai, Shiliang
Previous research on online consumer behavior has mostly been confined to perceived risk, which is used to explain barriers to purchasing online. However, perceived benefit is another important factor that influences consumers’ decisions when shopping online. As a result, an integrated consumer online shopping decision-making model is developed which contains three elements—Consumer, Product, and Web Site. The model proposes the factors that influence consumers’ intentions during the online shopping process and divides them into two dimensions: the mental level and the material level. We tested those factors with surveys of more than 200 respondents, collected both from online volunteers and through offline paper surveys. With the help of SEM, the experimental results show that the proposed model and method can be used to analyze consumers’ online shopping decision-making process effectively.
Fienen, Michael N.; Masterson, John P.; Plant, Nathaniel G.; Gutierrez, Benjamin T.; Thieler, E. Robert
Resource managers need to make decisions to plan for future environmental conditions, particularly sea level rise, in the face of substantial uncertainty, and many interacting processes factor into the decisions they face. Advances in process models and the quantification of uncertainty have made models a valuable tool for this purpose. However, long simulation runtimes and, often, numerical instability make linking process models impractical in many cases. A method for emulating the important connections between model input and forecasts, while propagating uncertainty, has the potential to provide a bridge between complicated numerical process models and the efficiency and stability needed for decision making. We explore this using a Bayesian network (BN) to emulate a groundwater flow model. We expand on previous approaches to validating a BN by calculating forecasting skill using cross validation of a groundwater model of Assateague Island in Virginia and Maryland, USA. This BN emulation was shown to capture the important groundwater-flow characteristics and the uncertainty of the groundwater system arising from its connection to island morphology and sea level. Forecast power metrics associated with the validation of multiple alternative BN designs guided the selection of an optimal level of BN complexity. Assateague Island is an ideal test case for exploring a forecasting tool based on current conditions because the unique hydrogeomorphological variability of the island includes a range of settings indicative of past, current, and future conditions. The resulting BN is a valuable tool for exploring the response of groundwater conditions to sea level rise in decision support.
In this dissertation we study emergent collective decision-making in social groups with time-varying interactions and heterogeneously informed individuals. First we analyze a nonlinear dynamical systems model motivated by animal collective motion with heterogeneously informed subpopulations, to examine the role of uninformed individuals. We find through formal analysis that adding uninformed individuals in a group increases the likelihood of a collective decision. Secondly, we propose a model for human shared decision-making with continuous-time feedback and where individuals have little information about the true preferences of other group members. We study model equilibria using bifurcation analysis to understand how the model predicts decisions based on the critical threshold parameters that represent an individual's tradeoff between social and environmental influences. Thirdly, we analyze continuous-time data of pairs of human subjects performing an experimental shared tracking task using our second proposed model in order to understand transient behavior and the decision-making process. We fit the model to data and show that it reproduces a wide range of human behaviors surprisingly well, suggesting that the model may have captured the mechanisms of observed behaviors. Finally, we study human behavior from a game-theoretic perspective by modeling the aforementioned tracking task as a repeated game with incomplete information. We show that the majority of the players are able to converge to playing Nash equilibrium strategies. We then suggest with simulations that the mean field evolution of strategies in the population resemble replicator dynamics, indicating that the individual strategies may be myopic. Decisions form the basis of control and problems involving deciding collectively between alternatives are ubiquitous in nature and in engineering. Understanding how multi-agent systems make decisions among alternatives also provides insight for designing
…by Presutti and Trepp in their paper "Much Ado About EOQ." [2] The constraints used in the stock fund model are total stock fund dollars and limits on… 2. Presutti, Victor J., Jr., and Trepp, Richard C., More Ado About Economic Order Quantities (EOQ), Operations Analysis Office.
Rustichini, Aldo; Padoa-Schioppa, Camillo
Neuronal recordings and lesion studies indicate that key aspects of economic decisions take place in the orbitofrontal cortex (OFC). Previous work identified in this area three groups of neurons encoding the offer value, the chosen value, and the identity of the chosen good. An important and open question is whether and how decisions could emerge from a neural circuit formed by these three populations. Here we adapted a biophysically realistic neural network previously proposed for perceptual decisions (Wang XJ. Neuron 36: 955-968, 2002; Wong KF, Wang XJ. J Neurosci 26: 1314-1328, 2006). The domain of economic decisions is significantly broader than that for which the model was originally designed, yet the model performed remarkably well. The input and output nodes of the network were naturally mapped onto two groups of cells in OFC. Surprisingly, the activity of interneurons in the network closely resembled that of the third group of cells, namely, chosen value cells. The model reproduced several phenomena related to the neuronal origins of choice variability. It also generated testable predictions on the excitatory/inhibitory nature of different neuronal populations and on their connectivity. Some aspects of the empirical data were not reproduced, but simple extensions of the model could overcome these limitations. These results render a biologically credible model for the neuronal mechanisms of economic decisions. They demonstrate that choices could emerge from the activity of cells in the OFC, suggesting that chosen value cells directly participate in the decision process. Importantly, Wang's model provides a platform to investigate the implications of neuroscience results for economic theory.
Kim, Lois G; Thompson, Simon G
Health economic decision models are based on specific assumptions relating to model structure and parameter estimation. Validation of these models is recommended as an indicator of reliability, but is not commonly reported. Furthermore, models derived from different data and employing different assumptions may produce a variety of results. A Markov model for evaluating the long-term cost-effectiveness of screening for abdominal aortic aneurysm is described. Internal, prospective and external validations are carried out using individual participant data from two randomised trials. Validation is assessed in terms of total numbers and timings of key events, and total costs and life-years. Since the initial model validates well only internally, two further models are developed that better fit the prospective and external validation data. All three models are then extrapolated to a life-time horizon, producing cost-effectiveness estimates ranging from £1600 to £4200 per life-year gained. Parameter uncertainty is now commonly addressed in health economic decision modelling. However, the derivation of models from different data sources adds another level of uncertainty. This extra uncertainty should be recognised in practical decision-making and, where possible, specifically investigated through independent model validation.
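A Markov cost-effectiveness model of this general kind can be sketched as a simple cohort simulation. The states, transition probabilities, and costs below are illustrative placeholders, not the screening model's actual parameters.

```python
def run_markov(p_trans, costs, n_cycles, start=0):
    """Markov cohort model. p_trans[i][j] is the per-cycle probability of
    moving from state i to state j; the last state is absorbing death.
    Returns (expected total cost, expected life-years) per person."""
    n = len(p_trans)
    dist = [0.0] * n
    dist[start] = 1.0
    total_cost = total_ly = 0.0
    for _ in range(n_cycles):
        total_ly += sum(dist[:-1])                   # fraction still alive
        total_cost += sum(d * c for d, c in zip(dist, costs))
        dist = [sum(dist[i] * p_trans[i][j] for i in range(n))
                for j in range(n)]
    return total_cost, total_ly
```

Running the model once per strategy (e.g. screening versus no screening) and dividing the incremental cost by the incremental life-years yields the cost per life-year gained quoted in such analyses.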
A Dynamic Model for Decision Making During Memory Retrieval. Richard Shiffrin, Trustees of Indiana University; final report AFRL-AFOSR-VA-TR-2015-0354, grant FA9550-12-1-0255. …changing as different features are extracted from the test item and knowledge memory. The results, based both on free-response tasks and time-limited…
MADAM: Multiple-Attribute Decision Analysis Model. Wayne A. Stimpson, 2Lt, USAFR; AFIT thesis AFIT/GOR/AA/81D-1. …objectives to be satisfied. The program is MADAM: Multiple-Attribute Decision Analysis Model, and it is written in FORTRAN V and is implemented on the…
Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio
This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
Kulkarni, Vrushali Y.; Sinha, Pradeep K.; Petare, Manisha C.
Random Forest is an ensemble, supervised machine learning algorithm. An ensemble generates many classifiers and combines their results by majority voting. Random forest uses the decision tree as its base classifier. In decision tree induction, an attribute split/evaluation measure is used to decide the best split at each node of the decision tree. The generalization error of a forest of tree classifiers depends on the strength of the individual trees in the forest and the correlation among them. The work presented in this paper relates to attribute split measures and is a two-step process: first, a theoretical study of five selected split measures is done and a comparison matrix is generated to understand the pros and cons of each measure. These theoretical results are then verified by empirical analysis, in which a random forest is generated using each of the five selected split measures, chosen one at a time (i.e., a random forest using information gain, a random forest using gain ratio, etc.). Next, based on this theoretical and empirical analysis, a new hybrid decision tree model for the random forest classifier is proposed. In this model, individual decision trees in the random forest are generated using different split measures, and the model is augmented by weighted voting based on the strength of the individual trees. The new approach has shown a notable increase in the accuracy of the random forest.
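The role of a split/evaluation measure can be illustrated with two of the common measures, information gain (via entropy) and Gini impurity: each scores a candidate binary split by the impurity reduction it achieves. This is a generic sketch of the technique, not the paper's implementation.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a class-label list, in bits.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    # Gini impurity: probability of misclassifying a randomly drawn item.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gain(parent, left, right, impurity=entropy):
    # Impurity reduction of a candidate binary split; the node's best split
    # is the one maximizing this quantity under the chosen measure.
    n = len(parent)
    return (impurity(parent)
            - (len(left) / n) * impurity(left)
            - (len(right) / n) * impurity(right))
```

Swapping the `impurity` argument is exactly the degree of freedom the hybrid-forest approach exploits: different trees in the forest can rank candidate splits with different measures.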
Glynn, Pierre D.
Integrated Environmental Modelling (IEM) is an invaluable tool for understanding the complex, dynamic ecosystems that house our natural resources and control our environments. Human behaviour affects the ways in which the science of IEM is assembled and used for meaningful societal applications. In particular, human biases and heuristics reflect adaptation and experiential learning to issues with frequent, sharply distinguished, feedbacks. Unfortunately, human behaviour is not adapted to the more diffusely experienced problems that IEM typically seeks to address. Twelve biases are identified that affect IEM (and science in general). These biases are supported by personal observations and by the findings of behavioural scientists. A process for critical analysis is proposed that addresses some human challenges of IEM and solicits explicit description of (1) represented processes and information, (2) unrepresented processes and information, and (3) accounting for, and cognizance of, potential human biases. Several other suggestions are also made that generally complement maintaining attitudes of watchful humility, open-mindedness, honesty and transparent accountability. These suggestions include (1) creating a new area of study in the behavioural biogeosciences, (2) using structured processes for engaging the modelling and stakeholder communities in IEM, and (3) using ‘red teams’ to increase resilience of IEM constructs and use.
Lee, Ching Hua; Lucas, Andrew
We describe a simple model of heterogeneous, interacting agents making decisions between n ≥2 discrete choices. For a special class of interactions, our model is the mean field description of random field Potts-like models and is effectively solved by finding the extrema of the average energy E per agent. In these cases, by studying the propagation of decision changes via avalanches, we argue that macroscopic dynamics is well captured by a gradient flow along E . We focus on the permutation symmetric case, where all n choices are (on average) the same, and spontaneous symmetry breaking (SSB) arises purely from cooperative social interactions. As examples, we show that bimodal heterogeneity naturally provides a mechanism for the spontaneous formation of hierarchies between decisions and that SSB is a preferred instability to discontinuous phase transitions between two symmetric points. Beyond the mean field limit, exponentially many stable equilibria emerge when we place this model on a graph of finite mean degree. We conclude with speculation on decision making with persistent collective oscillations. Throughout the paper, we emphasize analogies between methods of solution to our model and common intuition from diverse areas of physics, including statistical physics and electromagnetism.
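The mean-field self-consistency underlying such a model can be sketched as a softmax fixed-point iteration: each agent's choice probabilities depend on the population fractions x_i through a social coupling J, and strong enough coupling spontaneously breaks the n-fold symmetry. Parameter names and values below are illustrative assumptions, not the paper's.

```python
import math

def mean_field_update(x, J, beta, h):
    # Softmax self-consistency: the utility of choice i is J*x_i + h_i, so
    # popular choices become more attractive (cooperative social interaction).
    u = [math.exp(beta * (J * xi + hi)) for xi, hi in zip(x, h)]
    z = sum(u)
    return [ui / z for ui in u]

def equilibrate(n=3, J=1.0, beta=4.0, h=None, x0=None, iters=500):
    # Iterate the update map to a fixed point of the mean-field dynamics.
    x = list(x0) if x0 else [1.0 / n] * n
    h = list(h) if h else [0.0] * n
    for _ in range(iters):
        x = mean_field_update(x, J, beta, h)
    return x
```

With h = 0 all choices are (on average) identical, yet at strong coupling an arbitrarily small initial bias collapses the population onto one choice, while the exactly uniform state remains a (symmetric) fixed point.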
Fienen, M. N.; Masterson, J.; Plant, N. G.; Gutierrez, B. T.; Thieler, E. R.
Bayesian decision networks (BDN) have long been used to provide decision support in systems that require explicit consideration of uncertainty; applications range from ecology to medical diagnostics and terrorism threat assessments. Until recently, however, few studies have applied BDNs to the study of groundwater systems. BDNs are particularly useful for representing real-world system variability by synthesizing a range of hydrogeologic situations within a single simulation. Because BDN output is cast in terms of probability—an output desired by decision makers—they explicitly incorporate the uncertainty of a system. BDNs can thus serve as a more efficient alternative to other uncertainty characterization methods, such as computationally demanding Monte Carlo analyses and methods restricted to linear model analyses. We present a unique application of a BDN to a groundwater modeling analysis of the hydrologic response of Assateague Island, Maryland to sea-level rise. Using both input and output variables of the modeled groundwater response to different sea-level rise (SLR) scenarios, the BDN predicts the probability of changes in the depth to fresh water, which exerts an important influence on physical and biological island evolution. Input variables included barrier-island width, maximum island elevation, and aquifer recharge. The variability of these inputs and their corresponding outputs is sampled along cross sections in a single model run to form an ensemble of input/output pairs. The BDN outputs, which are the posterior distributions of water table conditions for the sea-level rise scenarios, are evaluated through error analysis and cross-validation to assess both fit to training data and predictive power. The key benefit of using BDNs in groundwater modeling analyses is that they provide a method for distilling complex model results into predictions with associated uncertainty, which is useful to decision makers. Future efforts incorporate
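The core idea of casting an ensemble of model input/output pairs as conditional probabilities can be sketched with a naive counting estimator over discretized records. The variable names and bin values are hypothetical, not those of the Assateague model.

```python
from collections import Counter

def bn_posterior(samples, evidence, target):
    """Estimate P(target | evidence) from an ensemble of discretized
    input/output records (dicts), by counting records matching the evidence."""
    counts = Counter()
    for s in samples:
        if all(s.get(k) == v for k, v in evidence.items()):
            counts[s[target]] += 1
    total = sum(counts.values())
    return {v: c / total for v, c in counts.items()} if total else {}
```

A real BDN additionally factorizes the joint distribution over a graph so that sparse evidence generalizes across variables, but the output has the same shape: a posterior distribution over the forecast variable given observed inputs.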
Visschedijk, Gillian C.; Lazonder, Ard W.; van der Hulst, Anja; Vink, Nathalie; Leemkuil, Henny
The training of tactical decision making increasingly occurs through serious computer games. A challenging aspect of designing such games is the modelling of human emotions. Two studies were performed to investigate the relation between fidelity and human emotion recognition in virtual human characters. Study 1 compared five versions of a virtual…
The Integrated Decision Modeling System (IDMS) User's Manual. Jonathan C. Fast and John N. Taylor, Metrica, Incorporated; report AL-TP-1991-0009 (AD-A236 033).
Forsberg, Helena Hvitfeldt; Aronsson, Håkan; Keller, Christina; Lindblad, Staffan
Simulation modeling is a way to test changes in a computerized environment to give ideas for improvements before implementation. This article reviews research literature on simulation modeling as support for health care decision making. The aim is to investigate the experience and potential value of such decision support and quality of articles retrieved. A literature search was conducted, and the selection criteria yielded 59 articles derived from diverse applications and methods. Most met the stated research-quality criteria. This review identified how simulation can facilitate decision making and that it may induce learning. Furthermore, simulation offers immediate feedback about proposed changes, allows analysis of scenarios, and promotes communication on building a shared system view and understanding of how a complex system works. However, only 14 of the 59 articles reported on implementation experiences, including how decision making was supported. On the basis of these articles, we proposed steps essential for the success of simulation projects, not just in the computer, but also in clinical reality. We also presented a novel concept combining simulation modeling with the established plan-do-study-act cycle for improvement. Future scientific inquiries concerning implementation, impact, and the value for health care management are needed to realize the full potential of simulation modeling.
Olson, Christine; And Others
A four-component model for career decision-making counseling relates each component to assessment questions and appropriate intervention strategies. The components are (1) conceptualization (definition of the problem); (2) enlargement of response repertoire (generation of alternatives); (3) identification of discriminative stimuli (consequences of…
Small, Ruth V.; Venkatesh, Murali
Introduces the Cognitive-Motivational Model of Decision Satisfaction that extends work on closure and the motivational aspects of instruction and learning. Recognizes the importance of information processing in judgmental tasks and specifies confidence as a major contributing factor to learning satisfaction. Suggests potential applications to…
Dufau, Stephane; Grainger, Jonathan; Ziegler, Johannes C.
We describe a leaky competing accumulator (LCA) model of the lexical decision task that can be used as a response/decision module for any computational model of word recognition. The LCA model uses evidence for a word, operationalized as some measure of lexical activity, as input to the "YES" decision node. Input to the "NO" decision node is…
Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg
Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly choose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions.
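The sequential-sampling mechanism behind models such as MDFT can be sketched as attention-weighted preference accumulation with leak and noise. This is a toy illustration only: MDFT proper also includes lateral inhibition between options and attribute attention weights, which are omitted here.

```python
import random

def mdft_choice(options, n_steps=1000, decay=0.05, noise=0.1, rng=random):
    """Each step, attention lands on one attribute; every option's preference
    leaks toward zero, gains that option's advantage over the mean on the
    attended attribute, and picks up Gaussian noise. Highest preference wins."""
    n_attr = len(options[0])
    pref = [0.0] * len(options)
    for _ in range(n_steps):
        a = rng.randrange(n_attr)                    # stochastic attention
        mean_v = sum(o[a] for o in options) / len(options)
        pref = [(1 - decay) * p + (o[a] - mean_v) + rng.gauss(0.0, noise)
                for p, o in zip(pref, options)]
    return max(range(len(options)), key=lambda i: pref[i])
```

Because preferences evolve over time and depend on the whole choice set, such accumulator models can produce the context effects described above, which fixed-utility models like probit and logit cannot.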
…of the effect of diversion on recidivism among Los Angeles area juvenile delinquents, and evaluation of the effects of decriminalization of status… sponsor interest. At present (Spring, 1975), SSRI has four programs. Criminal justice and juvenile delinquency: typical projects include studies… offenders. Decision analysis and social program evaluation: typical projects include study of elicitation methods for continuous probability…
Gao, Yongli; Alexander, E. Calvin
An understanding of what influences sinkhole formation and the ability to accurately predict sinkhole hazards are critical to environmental management efforts in the karst lands of southeastern Minnesota. Based on the distribution of distances to the nearest sinkhole, sinkhole density, bedrock geology and depth to bedrock in southeastern Minnesota and northwestern Iowa, a decision tree model has been developed to construct maps of sinkhole probability in Minnesota. The decision tree model was converted into cartographic models and implemented in ArcGIS to create a preliminary sinkhole probability map in Goodhue, Wabasha, Olmsted, Fillmore, and Mower Counties. This model quantifies bedrock geology, depth to bedrock, sinkhole density, and neighborhood effects in southeastern Minnesota, but excludes potential controlling factors such as structural control, topographic setting, human activities and land use. The sinkhole probability map needs to be verified and updated as more sinkholes are mapped and more information about sinkhole formation is obtained.
Lin, Chin-Feng; Wang, Hui-Fang
Based on the concepts of brand equity, means-end chain, and Web site trust, this study proposes a novel model called the consumption decision-making process of adolescents (CDMPA) to understand adolescents' Internet consumption habits and behavioral intention toward particular sporting goods. The findings of the CDMPA model can help marketers understand adolescents' consumption preferences and habits for developing effective Internet marketing strategies.
Hemez, Francois M.
The scientific and engineering communities are relying more and more on numerical models to simulate increasingly complex phenomena. Selecting a model, from among a family of models that meet the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework anchored in info-gap decision theory is adopted. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: which of several numerical models best approximates the behavior of a structure when the parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.
Brownhill, Suzanne; Chang, Esther; Bidewell, John; Johnson, Amanda
Community (district) nurses play a significant role in assisting and supporting bereaved informal carers (family members and friends) of recently deceased clients of palliative care. Bereavement care demands a wide range of competencies, including clinical decision-making. To date, little has been known about the decision-making role of community nurses in Australia. The aim of this study was to conduct an in-depth examination of an existing data set generated from semi-structured interviews of 10 community nurses providing follow-up bereavement care home visits within an area health service of a metropolitan region of Sydney, Australia. A grounded theory approach to data analysis generated a model which highlights an interaction between 'the relationship', 'the circumstances' (surrounding the bereavement), 'the psychosocial variant', 'the mix of nurses', 'the workload', and 'the support' available for the bereaved and for community nurses, and elements of 'the visit' (central to bereavement care). The role of community nurses in bereavement care is complex, particularly where decision-making is discretionary and contingent on multiple variables that affect the course of the family's grief. The decision model has the potential to inform community nurses in their support of informal carers and to promote reflective practice and professional accountability, ensuring continuing competence in bereavement care.
Avanaki, Ali R. N.; Espig, Kathryn S.; Kimpe, Tom R. L.; Maidment, Andrew D. A.
By analyzing human readers' performance in detecting small round lesions in simulated digital breast tomosynthesis backgrounds in a location-known-exactly scenario, we have developed a model observer that is a better predictor of human performance with different levels of background complexity (i.e., anatomical and quantum noise). Our analysis indicates that human observers perform a lesion detection task by combining a number of sub-decisions, each an indicator of the presence of a lesion in the image stack. This is in contrast to a channelized Hotelling observer (CHO), where the detection task is conducted holistically by thresholding a single decision variable made from an optimally weighted linear combination of channels. However, it seems that the sub-par performance of human readers compared to the CHO cannot be fully explained by their reliance on sub-decisions, or perhaps we do not consider a sufficient number of sub-decisions. To bridge the gap between the performance of human readers and that of the model observer based upon sub-decisions, we use an additive noise model, the power of which is modulated with the level of background complexity. The proposed model observer better predicts the fast drop in human detection performance with background complexity.
EPA’s Sustainable and Healthy Communities Research Program (SHC) is conducting transdisciplinary research to inform and empower decision-makers. EPA tools and approaches are being developed to enable communities to effectively weigh and integrate human health, socioeconomic, environmental, and ecological factors into their decisions to promote community sustainability. To help achieve this goal, EPA researchers have developed systems approaches to account for the linkages among resources, assets, and outcomes managed by a community. System dynamics (SD) is a member of the family of systems approaches and provides a framework for dynamic modeling that can assist with assessing and understanding complex issues across multiple dimensions. To test the utility of such tools when applied to a real-world situation, the EPA has developed a prototype SD model for community sustainability using the proposed Durham-Orange Light Rail Project (D-O LRP) as a case study. The EPA D-O LRP SD modeling team chose the proposed D-O LRP to demonstrate that an integrated modeling approach could represent the multitude of related cross-sectoral decisions that would be made and the cascading impacts that could result from a light rail transit system connecting Durham and Chapel Hill, NC. In keeping with the SHC vision described above, the proposal for the light rail is a starting point solution for the more intractable problems of population growth, unsustainable land use, environmenta
A methodology has been conceived for efficient synthesis of dynamical models that simulate common-sense decision-making processes. This methodology is intended to contribute to the design of artificial-intelligence systems that could imitate human common-sense decision making or assist humans in making correct decisions in unanticipated circumstances. This methodology is a product of continuing research on mathematical models of the behaviors of single- and multi-agent systems known in biology, economics, and sociology, ranging from a single-cell organism at one extreme to the whole of human society at the other extreme. Earlier results of this research were reported in several prior NASA Tech Briefs articles, the three most recent and relevant being Characteristics of Dynamics of Intelligent Systems (NPO-21037), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 48; Self-Supervised Dynamical Systems (NPO-30634), NASA Tech Briefs, Vol. 27, No. 3 (March 2003), page 72; and Complexity for Survival of Living Systems (NPO-43302), NASA Tech Briefs, Vol. 33, No. 7 (July 2009), page 62. The methodology involves the concepts reported previously, albeit viewed from a different perspective. One of the main underlying ideas is to extend the application of physical first principles to the behaviors of living systems. Models of motor dynamics are used to simulate the observable behaviors of systems or objects of interest, and models of mental dynamics are used to represent the evolution of the corresponding knowledge bases. For a given system, the knowledge base is modeled in the form of probability distributions, and the mental dynamics is represented by models of the evolution of the probability densities or, equivalently, models of flows of information. Autonomy is imparted to the decision-making process by feedback from mental to motor dynamics. This feedback replaces unavailable external information with information stored in the internal knowledge base. Representation
Alpert, J. C.
Weather forecasts and warnings must be prepared and then delivered so as to reach their intended audience in good time to enable effective decision-making. An effort to mitigate these difficulties was studied at a workshop, 'Sustaining National Meteorological Services - Strengthening WMO Regional and Global Centers', convened in June 2013 by the World Bank, WMO and the US National Weather Service (NWS). The skill and accuracy of atmospheric forecasts from deterministic models have increased, and there are now ensembles of such models that improve decisions to protect life, property and commerce. The NWS production of numerical weather prediction products results in model output from global and high-resolution regional ensemble forecasts. Ensembles are constructed by changing the initial conditions to make a 'cloud' of forecasts that attempts to span the space of possible atmospheric realizations, which can quantify not only the most likely forecast but also the uncertainty. This has led to an unprecedented increase in data production and information content from higher resolution, multi-model output and secondary calculations. One difficulty is to obtain the needed subset of data required to estimate the probability of events, and to report the information. The calibration required to reliably estimate the probability of events, and the honing of threshold adjustments to reduce false alarms for decision makers, are also needed. To meet the future needs of the ever-broadening user community and address these issues on a national and international basis, the weather service implemented the NOAA Operational Model Archive and Distribution System (NOMADS). NOMADS provides real-time and retrospective format-independent access to climate, ocean and weather model data and delivers high-availability content services as part of NOAA's official real-time data dissemination at its new NCWCP web operations center. An important aspect of the server's abilities is to aggregate the matrix of
Chew, Gilbert; Pelaccio, Dennis G.; Jacobs, Mark; Stancati, Michael; Cataldo, Robert
NASA continues to evaluate power systems to support human exploration of the Moon and Mars. The system(s) would address all power needs of surface bases and on-board power for space transfer vehicles. Prior studies have examined both solar and nuclear-based alternatives with respect to individual issues such as sizing or cost. What has not been addressed is a comprehensive look at the risks and benefits of the options that could serve as the analytical framework to support a system choice that best serves the needs of the exploration program. This paper describes the SAIC-developed Space Power System Decision Model, which uses a formal Two-step Analytical Hierarchy Process (TAHP) methodology to clearly distinguish candidate power systems in terms of benefits, safety, and risk. TAHP is a decision-making process based on the Analytical Hierarchy Process, which employs a hierarchic approach of structuring decision factors by weights and relatively ranks system design options on a consistent basis. This decision process also includes a level of data gathering and organization that produces a consistent, well-documented assessment, from which the capability of each power system option to meet top-level goals can be prioritized. The model defined in this effort focuses on the comparative assessment of candidate power system options for Mars surface application(s). This paper describes the principles of this approach, the assessment criteria and weighting procedures, and the tools to capture and assess the expert knowledge associated with space power system evaluation.
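The weighting step common to AHP-style methods such as the one above can be sketched as deriving criterion weights from a pairwise-comparison matrix via its principal eigenvector. This is a generic illustration of that standard AHP step, not the TAHP model itself; the 3x3 matrix and its judgments are invented.

```python
def ahp_weights(matrix, iterations=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; the normalised eigenvector gives the
    criterion weights."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        # Multiply the matrix by the current weight vector.
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]  # renormalise so weights sum to 1
    return w

# Hypothetical judgments: benefits 3x as important as safety,
# 5x as important as risk.
comparisons = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(comparisons)
```

The resulting weights reflect the stated judgments: benefits receive the largest weight, risk the smallest.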
Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin; Johnson, Edward J., Jr. (Technical Monitor)
There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.
Schwarick, Martin; Heiner, Monika
This paper presents an Interval Decision Diagram based approach to symbolic CSL model checking of Continuous Time Markov Chains which are derived from stochastic Petri nets. Matrix-vector and vector-matrix multiplication are the major tasks of exact analysis. We introduce a simple, but powerful algorithm which uses explicitly the Petri net structure and allows for parallelisation. We present results demonstrating the efficiency of our first prototype implementation when applied to biochemical network models, specifically with increasing token numbers. Our tool currently supports CSL model checking of time-bounded operators and the Next operator for ordinary stochastic Petri nets.
Roux, Pierre; Siminiceanu, Radu I.
We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library. We provide efficient algorithms for manipulating EVMDDs and review the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools. Compared to the CUDD package, our tool is several orders of magnitude faster.
Li, Ru; Shen, Jiao; Chen, Jun; Liu, Qiuhuan
With the development of computer technology and the increasing demand for mobile communications, next generation wireless networks will be composed of various wireless networks (e.g., WiMAX and WiFi). Vertical handoff is a key technology of next generation wireless networks, and during the vertical handoff procedure the handoff decision is a crucial issue for efficient mobility. Based on an autoregressive moving average (ARMA) prediction model, we propose a vertical handoff decision algorithm that aims to improve the performance of vertical handoff and avoid unnecessary handoffs. The proposed approach adopts an ARMA model to predict the next received signal strength (RSS) from the current and previous RSS values; the predicted RSS then determines whether to trigger the link-layer triggering event and complete the vertical handoff. The simulation results indicate that the proposed algorithm outperforms the threshold-based RSS scheme in both handoff performance and the number of handoffs.
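The decision rule the abstract describes can be sketched as: fit a time-series model to recent RSS samples, predict the next RSS, and trigger the handoff only if the predicted value falls below the threshold. This sketch uses a least-squares AR(1) fit (a special case of ARMA) rather than the authors' full ARMA model; the RSS trace and threshold are invented.

```python
def ar1_predict(rss_history):
    """Least-squares AR(1) fit on mean-removed RSS samples;
    returns the one-step-ahead predicted RSS."""
    mean = sum(rss_history) / len(rss_history)
    x = [r - mean for r in rss_history]
    num = sum(x[t] * x[t + 1] for t in range(len(x) - 1))
    den = sum(x[t] * x[t] for t in range(len(x) - 1))
    phi = num / den if den else 0.0  # AR(1) coefficient
    return mean + phi * x[-1]

def should_handoff(rss_history, threshold_dbm=-85.0):
    """Compare the *predicted* RSS, not the raw sample, to the threshold."""
    return ar1_predict(rss_history) < threshold_dbm

# A steadily weakening signal (dBm); the predicted next value is used
# in place of the current sample in the threshold comparison.
fading = [-70, -72, -75, -78, -81, -84]
```

For this trace the predicted next RSS is about -82.7 dBm, so a -80 dBm threshold would trigger the handoff while a -85 dBm threshold would not.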
Sordo, M; Fox, J; Blum, C; Taylor, P; Lee, R; Alberdi, E
This paper addresses two important problems in medical image interpretation:(1) integration of numeric and symbolic information, (2) access to external sources of medical knowledge. We have developed a prototype in which image processing algorithms are combined with symbolic representations for reasoning, decision making and task management in an integrated, platform-independent system for the differential diagnosis of abnormalities in mammograms. The prototype is based on PROforma, a generic technology for building decision support systems based on clinical guidelines. The PROforma language defines a set of tasks, one of which, the enquiry, is used as means of interaction with the outside world. However, the current enquiry model has proved to be too limited for our purposes. In this paper we outline a more general model, which can be used as an interface between symbolic functions and image or other signal data.
Fraser, Hannah; Rumpff, Libby; Yen, Jian D L; Robinson, Doug; Wintle, Brendan A
Many objectives motivate ecological restoration including improving vegetation condition, increasing the range and abundance of threatened species, and improving aggregate measures of biodiversity such as richness and diversity. While ecological models have been used to examine the outcomes of ecological restoration, there are few attempts to develop models to account for multiple, potentially competing objectives. We develop the first predictive model that integrates a vegetation-focused state-and-transition model with species distribution models for birds. We demonstrate how this integrated model can be used to identify effective restoration options for vegetation and bird species under a constrained budget. For example, using a typical agricultural land management scenario from south-eastern Australia, we demonstrate how the optimal management actions for promoting the occurrence of the Brown Treecreeper, an iconic threatened species, may be suboptimal for meeting vegetation condition objectives. This highlights that any 'preferred' management decision depends on the value assigned to the different objectives. An exploration of sensitivity to value weightings highlighted that 'no management' or 'weed control' were most likely to be the best management options to meet multiple objectives in the scenario we explored. We thus illustrate an approach to using the model outputs to explore trade-offs between bird and vegetation objectives. Our approach to exploring management outcomes and trade-offs using integrated modelling and structured decision support approaches has wide application for conservation management problems in which trade-offs exist between competing objectives.
Costi, P.; Minciardi, R.; Robba, M.; Rovatti, M.; Sacile, R
The aim of this work is to present the structure and the application of a decision support system (DSS) designed to help decision makers of a municipality in the development of incineration, disposal, treatment and recycling integrated programs. Specifically, within a MSW management system, several treatment plants and facilities can generally be found: separators, plants for production of refuse derived fuel (RDF), incinerators with energy recovery, plants for treatment of organic material, and sanitary landfills. The main goal of the DSS is to plan the MSW management, defining the refuse flows that have to be sent to recycling or to different treatment or disposal plants, and suggesting the optimal number, the kinds, and the localization of the plants that have to be active. The DSS is based on a decision model that requires the solution of a constrained non-linear optimization problem, where some decision variables are binary and other ones are continuous. The objective function takes into account all possible economic costs, whereas constraints arise from technical, normative, and environmental issues. Specifically, pollution and impacts, induced by the overall solid waste management system, are considered through the formalization of constraints on incineration emissions and on negative effects produced by disposal or other particular treatments.
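The decision structure described above (binary plant-activation variables plus continuous flow variables under capacity constraints) can be illustrated with a toy model. All plant names, costs, and capacities below are invented; a brute-force enumeration over the binary choices stands in for the paper's constrained non-linear solver.

```python
from itertools import product

def best_plan(plants, waste):
    """plants: name -> (fixed_cost, unit_cost, capacity). Enumerate the
    binary open/close choices; for each feasible choice, fill the
    cheapest open plants first (optimal here because unit costs are
    linear). Returns (total_cost, list_of_opened_plants)."""
    best = (float("inf"), None)
    names = sorted(plants)
    for opened in product([0, 1], repeat=len(names)):
        open_names = [n for n, o in zip(names, opened) if o]
        if sum(plants[n][2] for n in open_names) < waste:
            continue  # infeasible: opened plants cannot absorb all waste
        cost = sum(plants[n][0] for n in open_names)  # fixed costs
        remaining = waste
        for n in sorted(open_names, key=lambda n: plants[n][1]):
            flow = min(remaining, plants[n][2])
            cost += flow * plants[n][1]  # per-tonne treatment cost
            remaining -= flow
        if cost < best[0]:
            best = (cost, open_names)
    return best

# Invented example: (fixed cost, cost per tonne, capacity in tonnes).
plants = {"incinerator": (500.0, 2.0, 80.0),
          "landfill": (100.0, 5.0, 120.0),
          "recycling": (200.0, 1.0, 40.0)}
cost, chosen = best_plan(plants, waste=100.0)
```

With these numbers the landfill alone is cheapest despite its high unit cost, because the other plants' fixed costs dominate at this waste volume.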
The methodology in this report improves on some of the limitations of many conventional safety assessment and decision analysis methods. A top-down mathematical approach is developed for decomposing systems and for expressing imprecise individual metrics as possibilistic or fuzzy numbers. A "Markov-like" model is developed that facilitates combining (aggregating) inputs into overall metrics and decision aids, also portraying the inherent uncertainty. A major goal of Markov modeling is to help convey the top-down system perspective. One of the constituent methodologies allows metrics to be weighted according to significance of the attribute and aggregated nonlinearly as to contribution. This aggregation is performed using exponential combination of the metrics, since the accumulating effect of such factors responds less and less to additional factors. This is termed "soft" mathematical aggregation. Dependence among the contributing factors is accounted for by incorporating subjective metrics on "overlap" of the factors as well as by correspondingly reducing the overall contribution of these combinations to the overall aggregation. Decisions corresponding to the meaningfulness of the results are facilitated in several ways. First, the results are compared to a soft threshold provided by a sigmoid function. Second, information is provided on input "Importance" and "Sensitivity," in order to know where to place emphasis on considering new controls that may be necessary. Third, trends in inputs and outputs are tracked in order to obtain significant information, including cyclic information, for the decision process. A practical example from the air transportation industry is used to demonstrate application of the methodology. Illustrations are given for developing a structure (along with recommended inputs and weights) for air transportation oversight at three different levels, for developing and using cycle information, for developing Importance and
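The two ingredients named above, exponential "soft" aggregation with diminishing returns and a sigmoid soft threshold, can be sketched as follows. The functional forms and constants are assumptions for illustration, not the report's exact equations.

```python
import math

def soft_aggregate(metrics, weights):
    """Diminishing-returns aggregation: 1 - exp(-sum of weighted metrics).
    Each additional contributing factor adds less and less to the total."""
    s = sum(w * m for w, m in zip(weights, metrics))
    return 1.0 - math.exp(-s)

def soft_threshold(score, center=0.5, steepness=10.0):
    """Sigmoid degree-of-concern in [0, 1], replacing a hard pass/fail
    cutoff with a smooth transition around `center`."""
    return 1.0 / (1.0 + math.exp(-steepness * (score - center)))

# Invented metrics in [0, 1] and weights summing to 1.
concern = soft_threshold(soft_aggregate([0.6, 0.3, 0.8], [0.5, 0.2, 0.3]))
```

The aggregate stays bounded below 1 no matter how many factors contribute, which is the "soft" saturation property the report describes.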
Duncan, Matthew. A Signal Detection Model of Compound Decision Tasks. Defence R&D Canada – Toronto, Technical Report DRDC Toronto TR 2006-256, December 2006. ...decision making requires a formal method to distinguish (clarify) the effects of the various factors, and to simplify the evaluation of
functions for possible outcomes. Multi-Attribute Utility Theory (MAUT): MAUT involves a conjoint worth measurement in an uncertain environment. Von Winterfeldt, D., and G. W. Fischer, "Multiattribute Utility Theory: Models and Assessment Procedures," pp. 460-81.
Liu, Kuang-Yu; Lin, Jennifer; Zhou, Xiaobo; Wong, Stephen T C
We applied the alternating decision trees (ADTrees) method to the last 3 replicates from the Aipotu, Danacca, Karangar, and NYC populations in the Problem 2 simulated Genetic Analysis Workshop dataset. Using information from the 12 binary phenotypes and sex as input and Kofendrerd Personality Disorder disease status as the outcome of ADTrees-based classifiers, we obtained a new quantitative trait based on average prediction scores, which was then used for genome-wide quantitative trait linkage (QTL) analysis. ADTrees are machine learning methods that combine boosting and decision tree algorithms to generate smaller and easier-to-interpret classification rules. In this application, we compared four modeling strategies from the combinations of two boosting iterations (log or exponential loss functions) coupled with two choices of tree generation types (a full alternating decision tree or a classic boosting decision tree). These four different strategies were applied to the founders in each population to construct four classifiers, which were then applied to each study participant. To compute an average prediction score for each subject with a specific trait profile, this process was repeated with 10 runs of 10-fold cross validation, and standardized prediction scores obtained from the 10 runs were averaged and used in subsequent expectation-maximization Haseman-Elston QTL analyses (implemented in GENEHUNTER) with the approximately 900 SNPs in Hardy-Weinberg equilibrium provided for each population. Our QTL analyses on the basis of four models (a full alternating decision tree and a classic boosting decision tree paired with either log or exponential loss function) detected evidence for linkage (Z >= 1.96, p < 0.01) on chromosomes 1, 3, 5, and 9. Moreover, using average iteration and abundance scores for the 12 phenotypes and sex as their relevancy measurements, we found all relevant phenotypes for all four populations except phenotype b for the Karangar population.
Leite, Fábio P.; Ratcliff, Roger
Several sequential sampling models using racing diffusion processes for multiple-alternative decisions were evaluated using data from two perceptual discrimination experiments. The structures of the models differed on a number of dimensions, including whether there was lateral inhibition between accumulators, whether there was decay in evidence, whether evidence could be negative, and whether there was variability in starting points. Data were collected from a letter discrimination task in which stimulus difficulty and probability of the response alternatives were varied along with number of response alternatives. Model fitting results ruled out a large number of model classes in favor of a smaller number of specific models, most of which showed a moderate to high degree of mimicking. The best-fitting models had zero to moderate values of decay, no inhibition, and assumed that the addition of alternatives either affected the subprocesses contributing to the nondecisional time, the degree of caution, or the quality of evidence extracted from stimuli. PMID:20045893
Cimorelli, A.J.; Stahl, C.H.; Chow, A.H.; Fernandez, C.
A critical evaluation of the many environmental issues facing EPA Region 3 has established five major priorities: (1) ozone pollution (and its precursors); (2) impacts of acidification (acid deposition and acid mine drainage); (3) eutrophication of the Chesapeake Bay from atmospheric nitrogen deposition; (4) Cities/Urban Environment (ozone, particulate matter (PM), air toxics are some of the air components); and (5) Climate Change. Recognizing the complex nature of the systems controlling these issues, Region III's Air Protection Division (APD) is developing a decision support tool, i.e., the Decision Consequence Model (DCM), that will integrate and automate the analysis of environmental impacts in a manner that allows them to holistically address these regional priorities. Using this tool the authors intend to consider the interdependency of pollutants and their environmental impacts in order to support real-time decision making. The purpose of this paper is to outline the basic concept of the DCM and to present an example set of environmental indicators to illustrate how the DCM will be used to evaluate environmental impacts. The authors will discuss their process of indicator development, and present an example suite of indicators to provide a concrete example of the concepts presented above and, to illustrate the utility of the DCM to simultaneously evaluate multiple effects of a single pollutant. They will discuss the type of indicators chosen for this example as well as the general criteria the DCM indicators must satisfy. The framework that was developed to construct the indicators is discussed and used to calculate the example indicators. The yearly magnitudes of these example indicators are calculated for various multi-year periods to show their behavior over time.
Kumar, Sameer; Ghildayal, Nidhi; Ghildayal, Neha
Purpose: Urinary incontinence (UI) is a common chronic health condition, a problem specifically among elderly women that impacts quality of life negatively. However, UI is usually viewed as a likely result of old age, and as such is generally not evaluated or even managed appropriately. Many treatments are available to manage incontinence, such as bladder training and numerous surgical procedures such as Burch colposuspension and the Sling procedure, which have high success rates. The purpose of this paper is to analyze which of these popular surgical procedures for UI is more effective. Design/methodology/approach: This research employs randomized, prospective studies to obtain robust cost and utility data used in the Markov chain decision model for examining which of these surgical interventions is more effective in treating women with stress UI, based on two measures: number of quality-adjusted life years (QALY) and cost per QALY. TreeAge Pro Healthcare software was employed in the Markov decision analysis. Findings: Results showed the Sling procedure is a more effective surgical intervention than the Burch. However, if a utility greater than a certain value, at which both procedures are equally effective, is assigned to persistent incontinence, the Burch procedure is more effective than the Sling procedure. Originality/value: This paper demonstrates the efficacy of a Markov chain decision modeling approach to study the comparative effectiveness of available treatments for patients with UI, an important public health issue widely prevalent among elderly women in developed and developing countries. This research also improves upon other analyses using a Markov chain decision modeling process to analyze various strategies for treating UI.
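The mechanics of a Markov cohort model of the kind described above can be sketched briefly. The states, transition probabilities, utilities, and costs below are invented placeholders (the paper's calibrated values are not reproduced here); the point is only how cost and QALYs accumulate over yearly cycles.

```python
def markov_qaly(trans, utility, cost_per_cycle, cycles=10):
    """Run a cohort through a Markov chain for `cycles` yearly cycles;
    return (total_cost, total_qaly) per patient."""
    states = list(utility)
    dist = {s: 0.0 for s in states}
    dist["continent"] = 1.0  # assume everyone starts continent post-surgery
    total_cost = total_qaly = 0.0
    for _ in range(cycles):
        # Accumulate expected utility and cost for the current cycle.
        total_qaly += sum(dist[s] * utility[s] for s in states)
        total_cost += sum(dist[s] * cost_per_cycle[s] for s in states)
        # Advance the cohort distribution one cycle.
        dist = {t: sum(dist[s] * trans[s][t] for s in states) for t in states}
    return total_cost, total_qaly

# Placeholder two-state model: continent vs. persistent incontinence.
trans = {"continent": {"continent": 0.95, "incontinent": 0.05},
         "incontinent": {"continent": 0.10, "incontinent": 0.90}}
utility = {"continent": 0.95, "incontinent": 0.70}
cost = {"continent": 100.0, "incontinent": 800.0}
total_cost, total_qaly = markov_qaly(trans, utility, cost)
```

Comparing two procedures then amounts to running this with each procedure's own transition and cost tables and comparing total QALYs and cost per QALY (`total_cost / total_qaly`).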
Foster, T.; Brozovic, N.; Butler, A. P.
Providing effective policy solutions to aquifer depletion caused by abstraction for irrigation is a key challenge for socio-hydrology. However, most crop production functions used in hydrological models do not capture the intraseasonal nature of irrigation planning, or the importance of well yield in land and water use decisions. Here we develop a method for determining stochastic intraseasonal water use that is based on observed farmer behaviour but is also theoretically consistent with dynamically optimal decision making. We use the model to (i) analyse the joint land and water use decision by farmers; (ii) to assess changes in behaviour and production risk in response to water scarcity; and (iii) to understand the limits of applicability of current methods in policy design. We develop a biophysical model of water-limited crop yield building on the AquaCrop model. The model is calibrated and applied to case studies of irrigated corn production in Nebraska and Texas. We run the model iteratively, using long-term climate records, to define two formulations of the crop-water production function: (i) the aggregate relationship between total seasonal irrigation and yield (typical of current approaches); and (ii) the stochastic response of yield and total seasonal irrigation to the choice of an intraseasonal soil moisture target and irrigated area. Irrigated area (the extensive margin decision) and per-area irrigation intensity (the intensive margin decision) are then calculated for different seasonal water restrictions (corresponding to regulatory policies) and well yield constraints on intraseasonal abstraction rates (corresponding to aquifer system limits). Profit- and utility-maximising decisions are determined assuming risk neutrality and varying degrees of risk aversion, respectively. Our results demonstrate that the formulation of the production function has a significant impact on the response to water scarcity. For low well yields, which are the major concern
Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.
Sustained increases in energy prices have focused attention on gas resources in low permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are large. Planning and development decisions for extraction of such resources must be area-wide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm the decision to enter such plays depends on reconnaissance level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional scale cost functions. The context of the worked example is the Devonian Antrim shale gas play, Michigan Basin. One finding relates to selection of the resource prediction model to be used with economic models. Models which can best predict aggregate volume over larger areas (many hundreds of sites) may lose granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined by extraneous factors. The paper also shows that when these simple prediction models are used to strategically order drilling prospects, the gain in gas volume over volumes associated with simple random site selection amounts to 15 to 20 percent. It also discusses why the observed benefit of updating predictions from results of new drilling, as opposed to following static predictions, is somewhat smaller. Copyright 2007, Society of Petroleum Engineers.
Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.
Sustained increases in energy prices have focused attention on gas resources in low-permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are often large. Planning and development decisions for extraction of such resources must be areawide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm, the decision to enter such plays depends on reconnaissance-level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional-scale cost functions. The context of the worked example is the Devonian Antrim-shale gas play in the Michigan basin. One finding relates to selection of the resource prediction model to be used with economic models. Models chosen because they can best predict aggregate volume over larger areas (many hundreds of sites) smooth out granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined arbitrarily by extraneous factors. The analysis shows a 15-20% gain in gas volume when these simple models are applied to order drilling prospects strategically rather than to choose drilling locations randomly. Copyright ?? 2008 Society of Petroleum Engineers.
Tatler, Benjamin W; Brockmole, James R; Carpenter, R H S
Many of our actions require visual information, and for this it is important to direct the eyes to the right place at the right time. Two or three times every second, we must decide both when and where to direct our gaze. Understanding these decisions can reveal the moment-to-moment information priorities of the visual system and the strategies for information sampling employed by the brain to serve ongoing behavior. Most theoretical frameworks and models of gaze control assume that the spatial and temporal aspects of fixation point selection depend on different mechanisms. We present a single model that can simultaneously account for both when and where we look. Underpinning this model is the theoretical assertion that each decision to move the eyes is an evaluation of the relative benefit expected from moving the eyes to a new location compared with that expected by continuing to fixate the current target. The eyes move when the evidence that favors moving to a new location outweighs that favoring staying at the present location. Our model provides not only an account of when the eyes move, but also what will be fixated. That is, an analysis of saccade timing alone enables us to predict where people look in a scene. Indeed our model accounts for fixation selection as well as (and often better than) current computational models of fixation selection in scene viewing.
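The central decision rule above, move the eyes when evidence for moving outweighs evidence for staying, can be sketched as a race between two evidence streams. This is a deterministic toy, not the paper's fitted stochastic model; the rates and threshold are invented.

```python
def saccade_time(move_rate, stay_rate, threshold, max_steps=1000):
    """Return the first time step at which accumulated move-vs-stay
    evidence reaches `threshold`, or None if it never does."""
    diff = 0.0
    for t in range(1, max_steps + 1):
        diff += move_rate - stay_rate  # net evidence favouring a saccade
        if diff >= threshold:
            return t
    return None

# A more informative peripheral target (higher move_rate) triggers an
# earlier saccade than a less informative one.
early = saccade_time(move_rate=0.9, stay_rate=0.4, threshold=5.0)
late = saccade_time(move_rate=0.6, stay_rate=0.4, threshold=5.0)
```

In this sketch the saccade fires at step 10 for the strong target and step 25 for the weak one, illustrating how the same rule yields both when and (via which target wins) where the eyes move.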
Tulga, M. K.; Sheridan, T. B.
An optimal decision control model was developed, which is based primarily on a dynamic programming algorithm which looks at all the available task possibilities, charts an optimal trajectory, and commits itself to do the first step (i.e., follow the optimal trajectory during the next time period), and then iterates the calculation. A Bayesian estimator was included which estimates the tasks which might occur in the immediate future and provides this information to the dynamic programming routine. Preliminary trials comparing the human subject's performance to that of the optimal model show a great similarity, but indicate that the human skips certain movements which require quick change in strategy.
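The receding-horizon idea in the model above, plan an optimal trajectory by dynamic programming, commit only to the first step, then re-plan, can be sketched on a toy task set. The task names, durations, deadlines, and values are invented.

```python
from functools import lru_cache

def plan_first_task(tasks, now=0):
    """tasks: name -> (duration, deadline, value). A dynamic program over
    subsets of remaining tasks finds the value-maximising feasible order;
    returns (best_value, first_task). Only the first task would be
    executed before the plan is recomputed."""
    names = sorted(tasks)

    @lru_cache(maxsize=None)
    def dp(t, remaining):
        best = (0.0, None)
        for i, name in enumerate(remaining):
            dur, deadline, value = tasks[name]
            if t + dur <= deadline:  # task can still finish by its deadline
                rest = remaining[:i] + remaining[i + 1:]
                v = value + dp(t + dur, rest)[0]
                if v > best[0]:
                    best = (v, name)
        return best

    return dp(now, tuple(names))

# Invented task set: A is urgent (infeasible unless started first).
tasks = {"A": (2, 3, 5.0),
         "B": (2, 10, 4.0),
         "C": (3, 10, 6.0)}
value, first = plan_first_task(tasks)
```

The optimal plan starts with the urgent task A (total value 15.0); a greedy choice by value alone would start with C and forfeit A.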
Skorheim, Steven; Lonjers, Peter; Bazhenov, Maxim
Reward-modulated spike timing dependent plasticity (STDP) combines unsupervised STDP with a reinforcement signal that modulates synaptic changes. It was proposed as a learning rule capable of solving the distal reward problem in reinforcement learning. Nonetheless, the performance and limitations of this learning mechanism had yet to be tested on biologically motivated problems. In our work, rewarded STDP was implemented to model foraging behavior in a simulated environment. Over the course of training, the network of spiking neurons developed the capability of producing highly successful decision-making. The network performance remained stable even after significant perturbations of synaptic structure. Rewarded STDP alone, however, was insufficient to learn effective decision making, due to the difficulty of maintaining homeostatic equilibrium of synaptic weights and the development of local performance maxima. Our study predicts that successful learning requires stabilizing mechanisms that allow neurons to balance their input and output synapses, as well as synaptic noise.
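The core of reward-modulated STDP can be sketched with an eligibility trace: a pre-before-post spike pairing tags the synapse, the tag decays, and a later reward converts whatever tag remains into a weight change. The decay constant and learning rate below are arbitrary illustrative choices, not values from the study.

```python
def run_trial(pairings, reward_time, decay=0.9, lr=0.1, steps=20):
    """pairings: set of time steps with a pre-then-post spike pairing.
    Returns the weight change accrued when the reward arrives at
    reward_time, solving the distal reward problem via the trace."""
    trace = 0.0
    dw = 0.0
    for t in range(steps):
        trace *= decay              # eligibility trace decays each step
        if t in pairings:
            trace += 1.0            # STDP event tags the synapse
        if t == reward_time:
            dw += lr * trace        # reward converts the tag into change
    return dw
```

A pairing closer in time to the reward produces a larger weight change, so causally relevant synapses are credited more than incidental ones.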
The National Academy of Sciences report "Assessing the TMDL Approach to Water Quality Management" endorsed the watershed-based, ambient-water-quality-focused approach to water quality management called for in the TMDL program. The committee felt that available data and models were adequate to move such a program forward, if the EPA and all stakeholders better understood the nature of the scientific enterprise and its application to the TMDL program. Specifically, the report called for a greater acknowledgement of model prediction uncertainty in making and implementing TMDL plans. To assure that such uncertainty was addressed in water quality decision making, the committee called for a commitment to "adaptive implementation" of water quality management plans. The committee found that the number and complexity of the interactions of multiple stressors, combined with model prediction uncertainty, mean that we need to avoid the temptation to make assurances that specific actions will result in attainment of particular water quality standards. Until the work on solving a water quality problem begins, analysts and decision makers cannot be sure what the correct solutions are, or even what water quality goals a community should be seeking. In complex systems we need to act in order to learn; adaptive implementation is a concurrent process of action and learning. Learning requires (1) continued monitoring of the waterbody to determine how it responds to the actions taken and (2) carefully designed experiments in the watershed. If we do not design learning into what we attempt, we are not doing adaptive implementation. Therefore, there needs to be an increased commitment to monitoring and experiments in watersheds that will lead to learning. This presentation will (1) explain the logic for adaptive implementation; (2) discuss the ways that water quality modelers could characterize and explain model uncertainty to decision makers; and (3) speculate on the implications
Moreira, Catarina; Wichert, Andreas
In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences that accommodates the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which replaces classical probabilities with quantum probability amplitudes. However, since this approach suffers from exponential growth in the number of quantum parameters, we also propose a similarity heuristic that automatically fits the quantum parameters through vector similarities. This makes the proposed model general and predictive, in contrast to the current state-of-the-art models, which cannot be generalized to more complex decision scenarios and offer only an explanatory account of the observed paradoxes. In the end, the model that we propose is a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamical and quantum-like models proposed in the literature. We tested the proposed network against several empirical datasets from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results show that the proposed quantum Bayesian Network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios. PMID:26858669
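The core move, replacing probabilities with probability amplitudes, can be illustrated for a single two-valued latent variable (e.g. "the opponent defected" in the Prisoner's Dilemma). The interference angle `theta` is the free quantum parameter that the authors' similarity heuristic would set automatically; the numbers and variable names below are illustrative only.

```python
import math

def classical_total(p_given_d, p_given_nd, p_d=0.5):
    """Law of total probability: P(act) = P(act|D)P(D) + P(act|~D)P(~D)."""
    return p_given_d * p_d + p_given_nd * (1 - p_d)

def quantum_total(p_given_d, p_given_nd, theta, p_d=0.5):
    """Same computation with amplitudes: |psi1 + psi2|^2 picks up an
    interference term 2*|psi1||psi2|*cos(theta) that can violate the
    classical law and reproduce Sure-Thing-Principle paradoxes.
    (In the full model the result is renormalized to stay in [0, 1].)"""
    a1 = math.sqrt(p_d * p_given_d)
    a2 = math.sqrt((1 - p_d) * p_given_nd)
    return a1**2 + a2**2 + 2 * a1 * a2 * math.cos(theta)
```

With `theta = pi/2` the interference term vanishes and the quantum prediction collapses to the classical one, which is the sense in which the model generalizes rather than replaces classical inference.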
In the real world, decision making processes must be able to integrate non-stationary information that changes systematically while the decision is in progress. Although theories of decision making have traditionally been applied to paradigms with stationary information, non-stationary stimuli are now of increasing theoretical interest. We use a random-dot motion paradigm along with cognitive modeling to investigate how the decision process is updated when a stimulus changes. Participants viewed a cloud of moving dots, where the motion switched directions midway through some trials, and were asked to determine the direction of motion. Behavioral results revealed a strong delay effect: after presentation of the initial motion direction there is a substantial time delay before the changed motion information is integrated into the decision process. To further investigate the underlying changes in the decision process, we developed a Piecewise Linear Ballistic Accumulator model (PLBA). The PLBA is efficient to simulate, enabling it to be fit to participant choice and response-time distribution data in a hierarchical modeling framework using a non-parametric approximate Bayesian algorithm. Consistent with behavioral results, PLBA fits confirmed the presence of a long delay between presentation and integration of new stimulus information, but did not support increased response caution in reaction to the change. We also found the decision process was not veridical, as symmetric stimulus change had an asymmetric effect on the rate of evidence accumulation. Thus, the perceptual decision process was slow to react to, and underestimated, new contrary motion information. PMID:26760448
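The PLBA can be sketched as a standard two-accumulator LBA whose drift rates switch once at a (delayed) change point: each accumulator's path is piecewise linear, so hitting times have a closed form and simulation is cheap, which is what makes the approximate Bayesian fitting feasible. Threshold, start-point range, drift values, and non-decision time below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def plba_trial(v_pre, v_post, t_switch, b=1.0, A=0.5, s=0.3, t0=0.2):
    """One trial of a two-accumulator Piecewise LBA (sketch).
    v_pre/v_post: mean drift for each accumulator before/after the switch;
    t_switch: time at which drifts change (stimulus change plus delay)."""
    k = rng.uniform(0, A, 2)          # start points
    d1 = rng.normal(v_pre, s)         # pre-switch trial drifts
    d2 = rng.normal(v_post, s)        # post-switch trial drifts
    t_hit = np.full(2, np.inf)
    for i in range(2):
        if d1[i] > 0 and (b - k[i]) / d1[i] <= t_switch:
            t_hit[i] = (b - k[i]) / d1[i]          # threshold reached pre-switch
        elif d2[i] > 0:
            rem = b - k[i] - d1[i] * t_switch       # evidence still needed
            t_hit[i] = t_switch + rem / d2[i]       # finish at the new rate
    choice = int(np.argmin(t_hit))
    return choice, t0 + t_hit[choice]

# Example: evidence initially favors accumulator 0, then switches to favor 1.
choice, rt = plba_trial(np.array([3.0, 0.5]), np.array([0.5, 3.0]), t_switch=0.15)
```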
Keeney, Ralph L; von Winterfeldt, Detlof
One of the most challenging tasks of homeland security policymakers is to allocate their limited resources to reduce terrorism risks cost effectively. To accomplish this task, it is useful to develop a comprehensive set of homeland security objectives, metrics to measure each objective, a utility function, and value tradeoffs relevant for making homeland security investments. Together, these elements form a homeland security value model. This article develops a homeland security value model based on literature reviews, a survey, and experience with building value models. The purposes of the article are to motivate the use of a value model for homeland security decision making and to illustrate its use to assess terrorism risks, assess the benefits of countermeasures, and develop a severity index for terrorism attacks.
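The additive form such a value model typically takes (assuming the usual preferential-independence conditions) is U(x) = Σ_i w_i · u_i(x_i), where the weights encode value trade-offs and each u_i maps a raw metric onto a 0–1 utility scale. The objectives, weights, and utility curves below are purely hypothetical placeholders, not the article's elicited model.

```python
def security_value(option, weights, utils):
    """Additive multi-attribute utility: U(x) = sum_i w_i * u_i(x_i)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9   # weights must sum to 1
    return sum(w * utils[a](option[a]) for a, w in weights.items())

# Hypothetical objectives and single-attribute utility functions:
weights = {"lives_protected": 0.5, "economic_loss": 0.3, "public_confidence": 0.2}
utils = {
    "lives_protected": lambda x: x / 1000,       # 0..1000 lives saved
    "economic_loss": lambda x: 1 - x / 1e9,      # $0..$1B averted loss; less is better
    "public_confidence": lambda x: x,            # already on a 0-1 scale
}

# The benefit of a countermeasure is the utility difference it produces:
baseline = {"lives_protected": 200, "economic_loss": 4e8, "public_confidence": 0.5}
with_cm = {"lives_protected": 600, "economic_loss": 3e8, "public_confidence": 0.7}
benefit = security_value(with_cm, weights, utils) - security_value(baseline, weights, utils)
```

The same scalar U can rank attack scenarios by severity, which is how a single value model supports risk assessment, countermeasure evaluation, and a severity index at once.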
Sordo, Margarita; Rocha, Beatriz H; Morales, Alfredo A; Maviglia, Saverio M; Oglio, Elisa Dell'Oglio; Fairbanks, Amanda; Aroy, Teal; Dubois, David; Bouyer-Ferullo, Sharon; Rocha, Roberto A
Traditionally, rule interactions are handled at implementation time through rule task properties that control the order in which rules are executed. By doing so, knowledge about the behavior and interactions of decision rules is not captured at modeling time. We argue that this is important knowledge that should be integrated in the modeling phase. In this project, we build upon current work on a conceptual schema to represent clinical knowledge for decision support in the form of if-then rules.
Eppinger, Ben; Walter, Maik; Li, Shu-Chen
In this study, we investigated the interplay of habitual (model-free) and goal-directed (model-based) decision processes by using a two-stage Markov decision task in combination with event-related potentials (ERPs) and computational modeling. To manipulate the demands on model-based decision making, we applied two experimental conditions with different probabilities of transitioning from the first to the second stage of the task. As we expected, when the stage transitions were more predictable, participants showed greater model-based (planning) behavior. Consistent with this result, we found that stimulus-evoked parietal (P300) activity at the second stage of the task increased with the predictability of the state transitions. However, the parietal activity also reflected model-free information about the expected values of the stimuli, indicating that at this stage of the task both types of information are integrated to guide decision making. Outcome-related ERP components only reflected reward-related processes: Specifically, a medial prefrontal ERP component (the feedback-related negativity) was sensitive to negative outcomes, whereas a component that is elicited by reward (the feedback-related positivity) increased as a function of positive prediction errors. Taken together, our data indicate that stimulus-locked parietal activity reflects the integration of model-based and model-free information during decision making, whereas feedback-related medial prefrontal signals primarily reflect reward-related decision processes.
Bales, J. D.; Cline, D. W.; Pietrowsky, R.
The National Weather Service (NWS), the U.S. Army Corps of Engineers (USACE), and the U.S. Geological Survey (USGS), all Federal agencies with complementary water-resources activities, entered into an Interagency Memorandum of Understanding (MOU), "Collaborative Science Services and Tools to Support Integrated and Adaptive Water Resources Management," to collaborate in activities that are supportive of their respective missions. One of the interagency activities is the development of a highly integrated national water modeling framework and information services framework. Together these frameworks establish a common operating picture, improve modeling and synthesis, support the sharing of data and products among agencies, and provide a platform for incorporation of new scientific understanding. Each of the agencies has existing operational systems to assist in carrying out their respective missions. The systems generally are designed, developed, tested, fielded, and supported by specialized teams. A broader, shared approach is envisioned and would include community modeling, wherein multiple independent investigators or teams develop and contribute new modeling capabilities based on science advances; modern technology in coupling model components and visualizing results; and a coupled atmospheric-hydrologic model construct such that the framework could be used in real-time water-resources decision making or for long-term management decisions. The framework also is being developed to account for organizational structures of the three partners such that, for example, national data sets can move down to the regional scale, and vice versa. We envision the national water modeling framework to be an important element of the North American Water Program, to contribute to goals of the Program, and to be informed by the science and approaches developed as a part of the Program.
Reichenau, T. G.; Krimly, T.; Schneider, K.
Due to various interdependencies between the cycles of water, carbon, nitrogen, and energy, the impacts of climate change on ecohydrological systems can only be investigated in an integrative way. Furthermore, human intervention in environmental processes makes the system even more complex. On the one hand, human activity affects natural systems; on the other hand, the changing natural systems feed back on human decision making. One of the most important examples of this kind of interaction can be found in the agricultural sector. Management dates (planting, fertilization, harvesting) are chosen based on meteorological conditions and yield expectations. Faster development of crops under a warmer climate causes shorter cropping seasons. The choice of crops depends on their profitability, which is mainly determined by market prices, the agro-political framework, and the (climate-dependent) crop yield. This study investigates these relations for the district of Günzburg, located in the Upper Danube catchment in southern Germany. The modeling system DANUBIA was used to perform dynamically coupled simulations of plant growth, surface and soil hydrological processes, soil nitrogen transformations, and agricultural decision making. The agro-economic model simulates decisions on management dates (based on meteorological conditions and the crops' development state), on fertilization intensities (based on yield expectations), and on the choice of crops (based on profitability). The environmental models included in DANUBIA are to a great extent process based, to enable their use in a climate change scenario context. Scenario model runs until 2058 were performed using an IPCC A1B forcing. In consecutive runs, dynamic crop management, dynamic crop selection, and a changing agro-political framework were activated. The effects of these model features on hydrological and ecological variables were analyzed separately by comparing the results to a model run with constant crop
Roux, Pierre; Siminiceanu, Radu I.
We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions, and its implementation in a model checking library along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds on the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention of representing in one formalism the best techniques available at the moment across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
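A heavily simplified illustration of why edge values pay off: for an additively separable function, an edge-valued diagram needs only one node per variable because partial sums ride on the edges, whereas a multi-terminal diagram must materialize one terminal per distinct function value. Real EVMDDs handle arbitrary functions via canonical reduction rules; this sketch covers only the separable case and is not the paper's data structure.

```python
def evmdd_build(coeffs, domains):
    """Chain diagram for f(x) = sum_i coeffs[i]*x_i: one node per variable,
    where edge k out of level i carries the value coeffs[i]*k."""
    return [[c * k for k in range(d)] for c, d in zip(coeffs, domains)]

def evmdd_eval(dd, assignment):
    """Evaluate by summing edge values along the single path: O(n) per lookup."""
    return sum(level[x] for level, x in zip(dd, assignment))

# f(x1, x2, x3) = 3*x1 + 5*x2 + 7*x3 over domains {0..3}^3:
dd = evmdd_build([3, 5, 7], [4, 4, 4])
```

The diagram has 3 levels regardless of how many distinct values f takes, which is the size advantage over a multi-terminal representation.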
Ito, Makoto; Doya, Kenji
Reinforcement learning theory plays a key role in understanding the behavioral and neural mechanisms of choice behavior in animals and humans. Especially, intermediate variables of learning models estimated from behavioral data, such as the expectation of reward for each candidate choice (action value), have been used in searches for the neural correlates of computational elements in learning and decision making. The aims of the present study are as follows: (1) to test which computational model best captures the choice learning process in animals and (2) to elucidate how action values are represented in different parts of the corticobasal ganglia circuit. We compared different behavioral learning algorithms to predict the choice sequences generated by rats during a free-choice task and analyzed associated neural activity in the nucleus accumbens (NAc) and ventral pallidum (VP). The major findings of this study were as follows: (1) modified versions of an action-value learning model captured a variety of choice strategies of rats, including win-stay-lose-switch and persevering behavior, and predicted rats' choice sequences better than the best multistep Markov model; and (2) information about action values and future actions was coded in both the NAc and VP, but was less dominant than information about trial types, selected actions, and reward outcome. The results of our model-based analysis suggest that the primary role of the NAc and VP is to monitor information important for updating choice behaviors. Information represented in the NAc and VP might contribute to a choice mechanism that is situated elsewhere.
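The class of model at the center of this analysis, action-value ("Q") learning with a softmax choice rule, can be sketched in a few lines; the trial-by-trial action values it produces are the intermediate variables correlated against NAc and VP activity. Learning rate, inverse temperature, and reward probabilities below are illustrative, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_q_agent(p_reward, n_trials=1000, alpha=0.2, beta=3.0):
    """Two-choice free-choice task: action values updated by a reward
    prediction error, choices drawn from a softmax over the values.
    p_reward: reward probability for each of the two actions."""
    q = np.zeros(2)
    choices = []
    for _ in range(n_trials):
        p0 = 1.0 / (1.0 + np.exp(-beta * (q[0] - q[1])))  # softmax, 2 actions
        a = 0 if rng.random() < p0 else 1
        r = float(rng.random() < p_reward[a])
        q[a] += alpha * (r - q[a])   # prediction-error update of the action value
        choices.append(a)
    return np.array(choices), q

choices, q = simulate_q_agent([0.8, 0.2])
```

Fitting such a model to observed choice sequences (rather than simulating them) is what lets the authors compare it against multistep Markov models and read out action values trial by trial.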
Dolan, James G.
Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients and health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine “hard data” with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP). PMID:21394218
Zehetleitner, Michael; Ratko-Dehnert, Emil; Müller, Hermann J.
The redundant-signals paradigm (RSP) is designed to investigate response behavior in perceptual tasks in which response-relevant targets are defined by either one or two features, or modalities. The common finding is that responses are speeded for redundantly compared to singly defined targets. This redundant-signals effect (RSE) can be accounted for by race models if the response times do not violate the race model inequality (RMI). When there are violations of the RMI, race models are effectively excluded as a viable account of the RSE. The common alternative is provided by co-activation accounts, which assume that redundant target signals are integrated at some processing stage. However, “co-activation” has mostly been only indirectly inferred and the accounts have only rarely been explicitly modeled; if they were modeled, the RSE has typically been assumed to have a decisional locus. Yet, there are also indications in the literature that the RSE might originate, at least in part, at a non-decisional or motor stage. In the present study, using a distribution analysis of sequential-sampling models (ex-Wald and Ratcliff Diffusion model), the locus of the RSE was investigated for two bimodal (audio-visual) detection tasks that strongly violated the RMI, indicative of substantial co-activation. Three model variants assuming different loci of the RSE were fitted to the quantile reaction time proportions: a decision, a non-decision, and a combined variant both to vincentized group as well as individual data. The results suggest that for the two bimodal detection tasks, co-activation has a shared decisional and non-decisional locus. These findings point to the possibility that the mechanisms underlying the RSE depend on the specifics (task, stimulus, conditions, etc.) of the experimental paradigm. PMID:25805987
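The RMI test underlying this abstract (Miller's bound: the redundant-target RT distribution may not exceed the sum of the two single-target distributions at any time point) can be checked directly on empirical CDFs. A positive maximum difference at any tested time point rules out race models, which is the precondition for the co-activation modeling the study performs.

```python
import numpy as np

def rmi_violation(rt_red, rt_a, rt_b, ts):
    """Check the race model inequality  F_AB(t) <= F_A(t) + F_B(t)
    at the time points `ts`, using empirical CDFs.
    Returns the maximum violation (positive => race models excluded)."""
    ecdf = lambda rts, t: np.mean(np.asarray(rts)[:, None] <= t, axis=0)
    f_red = ecdf(rt_red, ts)
    bound = np.minimum(ecdf(rt_a, ts) + ecdf(rt_b, ts), 1.0)  # probabilities cap at 1
    return float(np.max(f_red - bound))
```

In practice the test is applied at RT quantiles per participant; this sketch omits the usual kill-the-twin and multiple-comparison corrections.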
Fuzzy Cognitive Mapping; MAUT: Multi-Attribute Utility Theory; MCDM: Multi-Criteria Decision Making; MOE: Measure of Effectiveness; OWA...these decision theoretics can be found in texts such as [1,31,33,40,42,45,62,87,94,100]. 6.4.1 Multi-Attribute Utility Theory (MAUT) This is a...Neumann and Morgenstern axiomatised expected utility theory and thus laid the foundations of MAUT, as applied to econometrics. Accordingly, the
Three separate studies in decision analysis were conducted in the context of air pollution control wherein the preferences of informed subjects were individually assessed. The first study was designed to develop a decision model for the control of sulfur dioxide emissions by incorporating multi-media effects of pollution control using both fundamental and proxy attributes. The second study specifically compared fundamental and proxy attributes and tested the hypothesis that proxy attributes lead to biased decisions. The third study validated the results of the previous one and was extended to examine the hypothesis that proxy bias could be reduced by appropriate elicitation techniques. The findings of this study indicated that subjects behaved according to the norms of expected utility theory when the unidimensional utility function for the proxy attribute was assessed. However, subjects exhibited a near universal bias to overweight the proxy attribute, relative to prescriptions of expected utility theory, in a multi-attribute scenario.
International Mobile Telecommunications-Advanced (IMT-Advanced), better known as 4G, is the next level of evolution in the field of wireless communications. 4G wireless networks enable users to access information anywhere, anytime, with a seamless connection to a wide range of information and services, and to receive large volumes of information, data, pictures, and video, thus increasing the demand for high bandwidth and signal strength. Mobility among the various networks is achieved through vertical handoff. Vertical handoffs refer to the automatic failover from one technology to another in order to maintain communication. The heterogeneous co-existence of access technologies with largely different characteristics creates the decision problem of determining the "best" available network at the "best" time for handoff. In this paper, we implemented the proposed Dynamic and Smart Decision model to decide the "best" network interface and the "best" moment to handoff. The implementation of the proposed model not only addresses individual user needs but also improves whole-system performance, i.e., quality of service, by reducing unnecessary handoffs and maintaining mobility.
Murray, Richard F
Most of the theory supporting our understanding of classification images relies on standard signal detection models and the use of normally distributed stimulus noise. Here I show that the most common methods of calculating classification images by averaging stimulus noise samples within stimulus-response classes of trials are much more general than has previously been demonstrated, and that they give unbiased estimates of an observer's template for a wide range of decision rules and non-Gaussian stimulus noise distributions. These results are similar to findings on reverse correlation and related methods in the neurophysiology literature, but here I formulate them in terms that are tailored to signal detection analyses of visual tasks, in order to make them more accessible and useful to visual psychophysicists. I examine 2AFC and yes-no designs. These findings make it possible to use and interpret classification images in tasks where observers' decision strategies may not conform to classic signal detection models such as the difference rule, and in tasks where the stimulus noise is non-Gaussian.
Walden, R. S.; Rouse, W. B.
Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.
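A minimal stand-in for the kind of queueing model the abstract describes: the pilot as a single server processing control and monitoring tasks. The steady-state M/M/1 formulas below give workload (utilization), queue length, and task latency; the actual model in the paper is richer (multi-task, with parameters estimated from experiment), so treat this as an illustrative baseline only.

```python
def mm1_stats(lam, mu):
    """Steady-state M/M/1 queue.
    lam: task arrival rate (tasks/s); mu: pilot's service rate (tasks/s)."""
    assert lam < mu, "workload must be stable (utilization < 1)"
    rho = lam / mu            # fraction of time the pilot is busy
    n_sys = rho / (1 - rho)   # mean number of tasks in the system
    t_sys = 1 / (mu - lam)    # mean time a task spends in the system
    return rho, n_sys, t_sys
```

Offloading tasks to the computer corresponds to reducing `lam` for the pilot-server; the formulas show latency growing without bound as utilization approaches 1, which is the quantitative case for computer-aided allocation.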
Zirk, Deborah A.; And Others
Findings of previous scientific decision-making literature are reviewed in an effort to specify a model depicting the many facets of the individual military enlistment decision. Theories and/or models reviewed include decision theory, social judgment theory, information integration theory, conjoint measurement/unfolding theory, cognitive decision…
This research focuses on the development of a decision support model to identify the preferred strategy for managing municipal solid waste using the...principles of decision analysis theory. The model provides an effective decision making tool to evaluate and compare different municipal solid waste management
Brémault-Phillips, Suzette C.; Parmar, Jasneet; Friesen, Steven; Rogers, Laura G.; Pike, Ashley; Sluggett, Bryan
Background The Decision-Making Capacity Assessment (DMCA) Model includes a best-practice process and tools to assess DMCA, and implementation strategies at the organizational and assessor levels to support provision of DMCAs across the care continuum. A Developmental Evaluation of the DMCA Model was conducted. Methods A mixed methods approach was used. Survey (N = 126) and focus group (N = 49) data were collected from practitioners utilizing the Model. Results Strengths of the Model include its best-practice and implementation approach, applicability to independent practitioners and inter-professional teams, focus on training/mentoring to enhance knowledge/skills, and provision of tools/processes. Post-training, participants agreed that they followed the Model’s guiding principles (90%), used problem-solving (92%), understood discipline-specific roles (87%), were confident in their knowledge of DMCAs (75%) and pertinent legislation (72%), accessed consultative services (88%), and received management support (64%). Model implementation is impeded when role clarity, physician engagement, inter-professional buy-in, accountability, dedicated resources, information sharing systems, and remuneration are lacking. Dedicated resources, job descriptions inclusive of DMCAs, ongoing education/mentoring supports, access to consultative services, and appropriate remuneration would support implementation. Conclusions The DMCA Model offers practitioners, inter-professional teams, and organizations a best-practice and implementation approach to DMCAs. Addressing barriers and further contextualizing the Model would be warranted. PMID:27729947
This viewgraph presentation reviews various methods of decision making and the impact that they have on space economics and systems engineering. Some of the methods discussed are: Present Value and Internal Rate of Return (IRR); Cost-Benefit Analysis; Real Options; Cost-Effectiveness Analysis; Cost-Utility Analysis; Multi-Attribute Utility Theory (MAUT); and Analytic Hierarchy Process (AHP).
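The first two methods listed can be made concrete in a few lines: net present value discounts each year's cash flow, and the internal rate of return is the discount rate at which NPV crosses zero (found here by bisection). The cash flows in the example are hypothetical.

```python
def npv(rate, cashflows):
    """Present value of cashflows[t] received at end of year t (t=0 is today)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return: the rate where NPV = 0, found by bisection."""
    assert npv(lo, cashflows) * npv(hi, cashflows) < 0, "no sign change in bracket"
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid   # root lies in [lo, mid]
        else:
            lo = mid   # root lies in [mid, hi]
    return (lo + hi) / 2

# Hypothetical project: spend 100 today, receive 60 in each of two years.
r = irr([-100.0, 60.0, 60.0])
```

Bisection is used here because NPV can cross zero more than once for sign-alternating cash flows, in which case "the" IRR is ill-defined and NPV at a chosen discount rate is the safer criterion.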
Collocated cokriging and neural-network multi-attribute transform in the prediction of effective porosity: A comparative case study for the Second Wall Creek Sand of the Teapot Dome field, Wyoming, USA
Moon, Seonghoon; Lee, Gwang H.; Kim, Hyeonju; Choi, Yosoon; Kim, Han-Joon
Collocated cokriging (CCK) and neural-network multi-attribute transform (NN-MAT) are widely used in the prediction of reservoir properties because they can integrate sparsely distributed, high-resolution well-log data and densely sampled, low-resolution seismic data. CCK is a linear-weighted averaging method based on a spatial covariance model. NN-MAT, based on a nonlinear relationship between seismic attributes and log values, treats data as spatially independent observations. In this study, we analyzed 3-D seismic and well-log data from the Second Wall Creek Sand of the Teapot Dome field, Wyoming, USA to investigate: (1) how CCK and NN-MAT perform in the prediction of porosity and (2) how the number of wells affects the results. Among a total of 64 wells, 25 wells were selected for CCK and NN-MAT and 39 wells were withheld for validation. We examined four cases: 25, 20, 15, and 10 wells. CCK overpredicted the porosity in the validation wells for all cases, likely because of the strong influence of high values, but failed to predict very large porosities. Overprediction of CCK porosity becomes more pronounced with decreasing number of wells. NN-MAT largely underpredicted the porosity for all cases, probably because of the band-limited nature of seismic data. The performance of CCK appears not to be significantly affected by the number of wells. Overall, NN-MAT performed better than CCK, although its performance decreases continuously with decreasing number of wells.
Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.
Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in the scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent with this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks and it
Jackson, Christopher H; Thompson, Simon G; Sharples, Linda D
Health economic decision models are subject to considerable uncertainty, much of which arises from choices between several plausible model structures, e.g. choices of covariates in a regression model. Such structural uncertainty is rarely accounted for formally in decision models but can be addressed by model averaging. We discuss the most common methods of averaging models and the principles underlying them. We apply them to a comparison of two surgical techniques for repairing abdominal aortic aneurysms. In model averaging, competing models are usually either weighted by using an asymptotically consistent model assessment criterion, such as the Bayesian information criterion, or a measure of predictive ability, such as Akaike's information criterion. We argue that the predictive approach is more suitable when modelling the complex underlying processes of interest in health economics, such as individual disease progression and response to treatment.
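The weighting schemes discussed here have a one-line core: each candidate model structure gets weight proportional to exp(-Δ/2), where Δ is its AIC (or BIC) minus the best model's, and the decision model's output is the weighted average across structures. The numbers below are illustrative.

```python
import math

def ic_weights(ics):
    """Model-averaging weights from information-criterion values
    (AIC for the predictive approach, BIC for the asymptotically
    consistent one): w_m proportional to exp(-delta_m / 2)."""
    best = min(ics)
    raw = [math.exp(-(ic - best) / 2) for ic in ics]
    total = sum(raw)
    return [r / total for r in raw]

def averaged_prediction(weights, predictions):
    """Weighted average of each structure's output (e.g. incremental cost)."""
    return sum(w * p for w, p in zip(weights, predictions))
```

Averaging propagates structural uncertainty into the final estimate instead of conditioning on a single chosen model, which is the paper's argument against picking one covariate set and ignoring the rest.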
The fundamental problem with the health care delivery system remains too little health delivered for too great a cost. Information essential to sound clinical and administrative decision making is too frequently missing at the time and place of decision. Automated systems offer opportunities both to improve health and to reduce cost through effective and efficient information management. Information systems are the enabling technology for those business practice changes which improve the benefit-cost profile of a re-engineered delivery system. The Computer-based Patient Record (CPR) is the organizing framework of an enterprise-wide health information system. Since information management is a core function of the health care enterprise, evaluation of the CPR should consider its impact on the value of health outcomes and its contribution to the organizational mission, rather than solely the benefits that accrue within the delivery system. This paper proposes a model to measure the impact of information technology, and specifically a CPR, on a re-engineered health care delivery system. PMID:8563375
Marandi, Ramtin Zargari; Sabzpoushan, S H
A novel method based on electrooculography (EOG) has been introduced in this work to study the decision-making process. An experiment was designed and implemented wherein subjects were asked to choose between two items from the same category that were presented within a limited time. The EOG and voice signals of the subjects were recorded during the experiment. A calibration task was performed to map the EOG signals to their corresponding gaze positions on the screen by using an artificial neural network. To analyze the data, 16 parameters were extracted from the response time and EOG signals of the subjects. Evaluation and comparison of the parameters, together with subjects' choices, revealed functional information. On the basis of this information, subjects switched their eye gazes between items about three times on average. We also found, according to statistical hypothesis testing (a t test: t(10) = 71.62, SE = 1.25, p < .0001), that the correspondence rate of a subject's gaze at the moment of selection with the selected item was significant. Ultimately, on the basis of these results, we propose a qualitative choice model for the decision-making task.
Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Mzoughi, Olfa; Toumi, Mondher
Background and objective With constant incentives for healthcare payers to contain their pharmaceutical budgets, modelling the impact of policy decisions has become critical. The objective of this project was to test the impact of various policy decisions on the pharmaceutical budget (developed for the European Commission project ‘European Union (EU) Pharmaceutical expenditure forecast’ – http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). Methods A model was built to assess the impact of policy scenarios on the pharmaceutical budgets of seven member states of the EU, namely France, Germany, Greece, Hungary, Poland, Portugal, and the United Kingdom. The following scenarios were tested: expanding the UK policies to the EU, changing time to market access, modifying generic price and penetration, and shifting the distribution chain of biosimilars (retail/hospital). Results Applying the UK policy resulted in dramatic savings for Germany (10 times the base case forecast) and substantial additional savings for France and Portugal (2 and 4 times the base case forecast, respectively). Delaying time to market was found to be a very powerful tool to reduce pharmaceutical expenditure. Applying the EU transparency directive (6-month process for pricing and reimbursement) increased pharmaceutical expenditure for all countries (from 1.1 to 4 times the base case forecast), except Germany (additional savings). Decreasing the price of generics and boosting the penetration rate, as well as shifting the distribution of biosimilars through the hospital chain, were also key methods to reduce pharmaceutical expenditure. Changing the reimbursement rate to 100% in all countries led to an important increase in the pharmaceutical budget. Conclusions Forecasting pharmaceutical expenditure is a critical exercise to inform policy decision makers. The most important leverages identified by the model on the pharmaceutical budget were driven by generic and biosimilar prices, penetration rate
Bitzer, Sebastian; Park, Hame; Blankenburg, Felix; Kiebel, Stefan J.
Behavioral data obtained with perceptual decision making experiments are typically analyzed with the drift-diffusion model. This parsimonious model accumulates noisy pieces of evidence toward a decision bound to explain the accuracy and reaction times of subjects. Recently, Bayesian models have been proposed to explain how the brain extracts information from noisy input as typically presented in perceptual decision making tasks. It has long been known that the drift-diffusion model is tightly linked with such functional Bayesian models but the precise relationship of the two mechanisms was never made explicit. Using a Bayesian model, we derived the equations which relate parameter values between these models. In practice we show that this equivalence is useful when fitting multi-subject data. We further show that the Bayesian model suggests different decision variables which all predict equal responses and discuss how these may be discriminated based on neural correlates of accumulated evidence. In addition, we discuss extensions to the Bayesian model which would be difficult to derive for the drift-diffusion model. We suggest that these and other extensions may be highly useful for deriving new experiments which test novel hypotheses. PMID:24616689
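The drift-diffusion mechanism invoked above can be sketched as a simple Euler simulation: noisy evidence accumulates until it hits one of two bounds, jointly producing a choice and a reaction time. This is a minimal illustration with hypothetical parameter values, not the authors' fitting code:

```python
import random

def ddm_trial(drift, bound=1.0, noise=1.0, dt=0.001, seed=None):
    """Simulate one drift-diffusion trial: evidence x starts at 0 and
    gains drift*dt plus Gaussian noise each time step until it crosses
    +bound (upper choice, 1) or -bound (lower choice, 0).
    Returns (choice, reaction_time)."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    step_sd = noise * dt ** 0.5  # noise scales with sqrt(dt)
    while abs(x) < bound:
        x += drift * dt + rng.gauss(0.0, step_sd)
        t += dt
    return (1 if x > 0 else 0), t

# With positive drift the upper choice dominates; accuracy and the
# reaction-time distribution emerge from repeated trials.
trials = [ddm_trial(drift=1.0, seed=i) for i in range(200)]
accuracy = sum(c for c, _ in trials) / len(trials)
```

The Bayesian models discussed in the abstract replace this descriptive accumulator with an explicit inference process whose decision variable can be mapped onto the same diffusion parameters.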
Roth, Philip M; Reynolds, Steven D; Tesche, Thomas W
Despite the widespread application of photochemical air quality models (AQMs) in U.S. state implementation planning (SIP) for attainment of the ambient ozone standard, documentation of the reliability of projections has remained highly subjective. An "idealized" evaluation framework is proposed that provides a means for assessing reliability. The framework was applied to 18 cases of regulatory modeling in North America in the early 1990s, and a comparative review of these applications is reported. The intercomparisons suggest that more than two thirds of these AQM applications suffered from inadequate air quality and meteorological databases. Emissions representations often were unreliable; uncertainties were too high. More than two thirds of the performance evaluation efforts were judged to be substandard compared with idealized goals. Meteorological conditions chosen according to regulatory guidelines were limited to one or two cases and tended to be similar, thus limiting the extent to which public policy makers could be confident that the emission controls adopted would yield attainment for a broad range of adverse atmospheric conditions. More than half of the studies reviewed did not give sufficient attention to addressing the potential for compensating errors. Corroborative analyses were conducted in only one of the 18 studies reviewed. Insufficient attention was given to the estimation of model and/or input database errors, uncertainties, or variability in all of the cases examined. However, recent SIP and policy-related regional modeling provides evidence of substantial improvements in the underlying science and available modeling systems used for regulatory decision making. Nevertheless, the availability of suitable databases to support increasingly sophisticated modeling continues to be a concern for many locations. Thus, AQM results may still be subject to significant uncertainties. The evaluative process used here provides a framework for modelers and public policy
Elgin, Peter D.; Thomas, Rickey P.
The National Airspace System's capacity will experience considerable growth in the next few decades. Weather adversely affects safe air travel. The FAA and NASA are working to develop new technologies that display weather information to support situation awareness and optimize pilot decision-making in avoiding hazardous weather. Understanding situation awareness and naturalistic decision-making is an important step in achieving this goal. Information representation and situation time stress greatly influence attentional resource allocation and working memory capacity, potentially obstructing accurate situation awareness assessments. Three naturalistic decision-making theories were integrated to provide an understanding of the levels of decision making incorporated in three operational situations and two conditions. The task characteristics associated with each phase of flight govern the level of situation awareness attained and the decision-making processes utilized. Weather products' attributes and situation task characteristics combine to classify weather products according to the decision-making processes best supported. In addition, a graphical interface is described that affords intuitive selection of the appropriate weather product relative to the pilot's current flight situation.
Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.
In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the records needed for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and analyse how well they improved their predictions in three steps, with additional information provided prior to each step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models, and their modelling experience differed largely. Hence, they encountered two problems: (i) simulating discharge for an ungauged catchment and (ii) using models that were developed for catchments that are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of
Raskob, W; Heling, R; Zheleznyak, M
This paper discusses the role of hydrological modelling in decision support systems for nuclear emergencies. In particular, the most recent developments, such as the radionuclide transport models integrated into the decision support system RODOS, are explored. Recent progress in the implementation of physically-based distributed hydrological models for operational forecasting in national and supranational centres may support closer cooperation between national hydrological services and therefore strengthen the use of hydrological and radiological models implemented in decision support systems.
Managing an efficient outpatient clinic can often be complicated by significant no-show rates and escalating appointment lead times. One method that has been proposed for avoiding the wasted capacity due to no-shows is called open or advanced access. The essence of open access is "do today's demand today". We develop a Markov Decision Process (MDP) model that demonstrates that a short booking window does significantly better than open access. We analyze a number of scenarios that explore the trade-off between patient-related measures (lead times) and physician- or system-related measures (revenue, overtime and idle time). Through simulation, we demonstrate that, over a wide variety of potential scenarios and clinics, the MDP policy does as well or better than open access in terms of minimizing costs (or maximizing profits) as well as providing more consistent throughput.
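The booking policies compared above come from solving a Markov Decision Process. As a generic illustration of how such a policy is derived (a toy model, not the authors'), here is value iteration on a hypothetical two-state clinic where tomorrow's demand is "light" or "heavy" regardless of today's action, and all rewards are invented to stand in for revenue, overtime, and idle-time costs:

```python
def value_iteration(states, actions, P, R, gamma=0.95, tol=1e-8):
    """Solve a small infinite-horizon MDP. P[s][a] is a list of
    (next_state, probability) pairs; R[s][a] is the immediate reward.
    Returns the optimal values V and a greedy policy."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(R[s][a] + gamma * sum(p * V[n] for n, p in P[s][a])
                       for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            break
    policy = {s: max(actions, key=lambda a: R[s][a] +
                     gamma * sum(p * V[n] for n, p in P[s][a]))
              for s in states}
    return V, policy

# Toy clinic: hypothetical rewards trading off revenue against overtime.
states, actions = ["light", "heavy"], ["accept", "defer"]
P = {s: {a: [("light", 0.6), ("heavy", 0.4)] for a in actions}
     for s in states}
R = {"light": {"accept": 1.0, "defer": 0.0},
     "heavy": {"accept": 1.5, "defer": 0.8}}
V, policy = value_iteration(states, actions, P, R)
```

The actual model in the paper has a much richer state space (booking windows, no-show probabilities), but the solution machinery is of this form.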
Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.
We apply systems analysis to estimate household water use in an intermittent supply system considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing. They also suggest potential market penetration for conservation actions, associated water savings, and subsidies to entice further adoption. We discuss new insights to size, target, and finance conservation.
Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.
The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, a mathematical function was first developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
van der Bolt, Frank; Seid, Abdulkarim
To analyze options for increasing food supply in the Nile basin, the Nile Agricultural Model (AM) was developed. The AM includes state-of-the-art descriptions of biophysical, hydrological and economic processes and realizes a coherent and consistent integration of hydrology, agronomy and economics. The AM covers both the agro-ecological domain (water, crop productivity) and the economic domain (food supply, demand, and trade) and allows evaluation of the macro-economic and hydrological impacts of scenarios for agricultural development. Starting with the hydrological information from the NileBasin-DSS, the AM calculates the water available for agriculture, crop production and irrigation requirements with the FAO model AquaCrop. With the global commodity trade model MAGNET, scenarios for land development and conversion are evaluated. The AM predicts consequences for trade, food security and development based on soil and water availability, crop allocation, food demand and food policy. The model will be used as a decision support tool to contribute to more productive and sustainable agriculture in individual Nile countries and the whole region.
Thomson, A. M.; Izaurralde, R. C.; Beach, R.; Zhang, X.; Zhao, K.; Monier, E.
A range of approaches can be used in the application of climate change projections to agricultural impacts assessment. Climate projections can be used directly to drive crop models, which in turn can be used to provide inputs for agricultural economic or integrated assessment models. These model applications, and the transfer of information between models, must be guided by the state of the science. But the methodology must also account for the specific needs of stakeholders and the intended use of model results beyond pure scientific inquiry, including meeting the requirements of agencies responsible for designing and assessing policies, programs, and regulations. Here we present the methodology and results of two climate impacts studies that applied climate model projections from CMIP3 and from the EPA Climate Impacts and Risk Analysis (CIRA) project in a crop model (EPIC - Environmental Policy Indicator Climate) in order to generate estimates of changes in crop productivity for use in an agricultural economic model for the United States (FASOM - Forest and Agricultural Sector Optimization Model). The FASOM model is a forward-looking dynamic model of the US forest and agricultural sector used to assess market responses to changing productivity of alternative land uses. The first study, focused on climate change impacts on the USDA crop insurance program, was designed to use available daily climate projections from the CMIP3 archive. The decision to focus on daily data for this application limited the climate model and time period selection significantly; however, for the intended purpose of assessing impacts on crop insurance payments, consideration of extreme event frequency was critical for assessing periodic crop failures. In a second, coordinated impacts study designed to assess the relative difference in climate impacts under a no-mitigation policy and different future climate mitigation scenarios, the stakeholder specifically requested an assessment of a
Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.
The purpose of this paper is to stimulate a re-thinking of how we, the catchment hydrologists, could become reliable forecasters. A group of catchment modellers predicted the hydrological response of a man-made 6 ha catchment in its initial phase (Chicken Creek) without having access to the observed records. They used conceptually different model families. Their modelling experience differed largely. The prediction exercise was organized in three steps: (1) for the 1st prediction modellers received a basic data set describing the internal structure of the catchment (somewhat more complete than usually available to a priori predictions in ungauged catchments). They did not obtain time series of stream flow, soil moisture or groundwater response. (2) Before the 2nd improved prediction they inspected the catchment on-site and attended a workshop where the modellers presented and discussed their first attempts. (3) For their improved 3rd prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step 1. Here, we detail the modeller's decisions in accounting for the various processes based on what they learned during the field visit (step 2) and add the final outcome of step 3 when the modellers made use of additional data. We document the prediction progress as well as the learning process resulting from the availability of added information. For the 2nd and 3rd step, the progress in prediction quality could be evaluated in relation to individual modelling experience and costs of added information. We learned (i) that soft information such as the modeller's system understanding is as important as the model itself (hard information), (ii) that the sequence of modelling steps matters (field visit, interactions between differently experienced experts, choice of model, selection of available data, and methods for parameter guessing
Webb, Colleen T; Ferrari, Matthew; Lindström, Tom; Carpenter, Tim; Dürr, Salome; Garner, Graeme; Jewell, Chris; Stevenson, Mark; Ward, Michael P; Werkman, Marleen; Backer, Jantien; Tildesley, Michael
Epidemiological models in animal health are commonly used as decision-support tools to understand the impact of various control actions on infection spread in susceptible populations. Different models contain different assumptions and parameterizations, and policy decisions might be improved by considering outputs from multiple models. However, a transparent decision-support framework to integrate outputs from multiple models is nascent in epidemiology. Ensemble modelling and structured decision-making integrate the outputs of multiple models, compare policy actions and support policy decision-making. We briefly review the epidemiological application of ensemble modelling and structured decision-making and illustrate the potential of these methods using foot and mouth disease (FMD) models. In case study one, we apply structured decision-making to compare five possible control actions across three FMD models and show which control actions and outbreak costs are robustly supported and which are impacted by model uncertainty. In case study two, we develop a methodology for weighting the outputs of different models and show how different weighting schemes may impact the choice of control action. Using these case studies, we broadly illustrate the potential of ensemble modelling and structured decision-making in epidemiology to provide better information for decision-making and outline necessary development of these methods for their further application.
Khefacha, I.; Belkacem, L.
This study investigates how decisions are made in Tunisian public higher education establishments. Some factors are identified as having a potentially significant impact on the odds that the decision-making process follows the characteristics of one of the best-known decision-making models: collegial, political, bureaucratic or anarchical…
A study investigated the factors influencing students' decisions about attending a college to which they had been admitted. Logit analysis confirmed gravity model predictions that geographic distance and student ability would most influence the enrollment decision and found other variables, although affecting earlier stages of decision making, did…
Wang, Jane X; Voss, Joel L
Exploration permits acquisition of the most relevant information during learning. However, the specific information needed, the influences of this information on decision making, and the relevant neural mechanisms remain poorly understood. We modeled distinct information types available during contextual association learning and used model-based fMRI in conjunction with manipulation of exploratory decision making to identify neural activity associated with information-based decisions. We identified hippocampal-prefrontal contributions to advantageous decisions based on immediately available novel information, distinct from striatal contributions to advantageous decisions based on the sum total available (accumulated) information. Furthermore, network-level interactions among these regions during exploratory decision making were related to learning success. These findings link strategic exploration decisions during learning to quantifiable information and advance understanding of adaptive behavior by identifying the distinct and interactive nature of brain-network contributions to decisions based on distinct information types.
Huang, Wei; Cao, Xiaoyi; Biase, Fernando H.; Yu, Pengfei; Zhong, Sheng
Both spatial characteristics and temporal features are often the subjects of concern in physical, social, and biological studies. This work tackles the clustering problems for time course data in which the cluster number and clustering structure change with respect to time, dubbed time-variant clustering. We developed a hierarchical model that simultaneously clusters the objects at every time point and describes the relationships of the clusters between time points. The hidden layer of this model is a generalized form of branching processes. A reversible-jump Markov Chain Monte Carlo method was implemented for model inference, and a feature selection procedure was developed. We applied this method to explore an open question in preimplantation embryonic development. Our analyses using single-cell gene expression data suggested that the earliest cell fate decision could start at the 4-cell stage in mice, earlier than the commonly thought 8- to 16-cell stage. These results together with independent experimental data from single-cell RNA-seq provided support against a prevailing hypothesis in mammalian development. PMID:25339442
Kurth-Nelson, Zeb; Redish, A. David
Addiction and many other disorders are linked to impulsivity, where a suboptimal choice is preferred when it is immediately available. One solution to impulsivity is precommitment: constraining one's future to avoid being offered a suboptimal choice. A form of impulsivity can be measured experimentally by offering a choice between a smaller reward delivered sooner and a larger reward delivered later. Impulsive subjects are more likely to select the smaller-sooner choice; however, when offered an option to precommit, even impulsive subjects can precommit to the larger-later choice. To precommit or not is a decision between two conditions: (A) the original choice (smaller-sooner vs. larger-later), and (B) a new condition with only larger-later available. It has been observed that precommitment appears as a consequence of the preference reversal inherent in non-exponential delay-discounting. Here we show that most models of hyperbolic discounting cannot precommit, but a distributed model of hyperbolic discounting does precommit. Using this model, we find (1) faster discounters may be more or less likely than slow discounters to precommit, depending on the precommitment delay, (2) for a constant smaller-sooner vs. larger-later preference, a higher ratio of larger reward to smaller reward increases the probability of precommitment, and (3) precommitment is highly sensitive to the shape of the discount curve. These predictions imply that manipulations that alter the discount curve, such as diet or context, may qualitatively affect precommitment. PMID:21179584
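The preference reversal that makes precommitment worthwhile falls directly out of the hyperbolic discount function V = A / (1 + kD). A minimal sketch with hypothetical reward amounts, delays, and discount rate (illustrating the reversal in plain hyperbolic discounting; the paper's distributed-discounting model is needed to produce the precommitment choice itself):

```python
def hyperbolic_value(amount, delay, k=0.1):
    """Subjective value under hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

# Hypothetical offers: smaller-sooner (4 at t=5) vs larger-later (10 at t=25).
# Evaluated far in advance (at t=0), the larger-later reward is preferred...
ll_wins_far = hyperbolic_value(10, 25) > hyperbolic_value(4, 5)
# ...but at t=5, when the smaller reward is immediately available, the
# preference reverses -- the window that precommitment is designed to close.
ss_wins_near = hyperbolic_value(4, 0) > hyperbolic_value(10, 20)
```

Under exponential discounting the ranking of the two offers would be the same at every point in time, so no reversal, and no motive to precommit, would arise.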
Fuller, Angela K.; Linden, Daniel W.; Royle, J. Andrew
Harvest data are often used by wildlife managers when setting harvest regulations for species because the data are regularly collected and do not require implementation of logistically and financially challenging studies to obtain the data. However, when harvest data are not available because an area had not previously supported a harvest season, alternative approaches are required to help inform management decision making. When distribution or density data are required across large areas, occupancy modeling is a useful approach, and under certain conditions, can be used as a surrogate for density. We collaborated with the New York State Department of Environmental Conservation (NYSDEC) to conduct a camera trapping study across a 70,096-km2 region of southern New York in areas that were currently open to fisher (Pekania [Martes] pennanti) harvest and those that had been closed to harvest for approximately 65 years. We used detection–nondetection data at 826 sites to model occupancy as a function of site-level landscape characteristics while accounting for sampling variation. Fisher occupancy was influenced positively by the proportion of conifer and mixed-wood forest within a 15-km2 grid cell and negatively associated with road density and the proportion of agriculture. Model-averaged predictions indicated high occupancy probabilities (>0.90) when road densities were low (<1 km/km2) and coniferous and mixed forest proportions were high (>0.50). Predicted occupancy ranged 0.41–0.67 in wildlife management units (WMUs) currently open to trapping, which could be used to guide a minimum occupancy threshold for opening new areas to trapping seasons. There were 5 WMUs that had been closed to trapping but had an average predicted occupancy of 0.52 (0.07 SE), and above the threshold of 0.41. These areas are currently under consideration by NYSDEC for opening a conservative harvest season. We demonstrate the use of occupancy modeling as an aid to management
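The covariate effects reported above act on the logit scale, as in any single-season occupancy model. A sketch of the prediction step only, with hypothetical coefficients chosen to mirror the reported signs (conifer/mixed forest positive, road density and agriculture negative) rather than the paper's fitted values:

```python
import math

def inv_logit(x):
    """Map a logit-scale value to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_occupancy(conifer_prop, road_density, agri_prop,
                      beta=(0.0, 4.0, -1.5, -2.0)):
    """Occupancy probability psi from site covariates:
    logit(psi) = b0 + b1*conifer + b2*road_density + b3*agriculture.
    The beta values are hypothetical; only their signs follow the
    effects reported in the study."""
    b0, b1, b2, b3 = beta
    return inv_logit(b0 + b1 * conifer_prop + b2 * road_density
                     + b3 * agri_prop)

# Holding habitat constant, predicted occupancy falls as road density rises.
low_roads = predict_occupancy(0.6, 0.5, 0.1)
high_roads = predict_occupancy(0.6, 2.0, 0.1)
```

In the full model the detection process is estimated jointly with psi from the repeated detection–nondetection visits, which this prediction-only sketch omits.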
Hung, M.-L.; Ma, Hwongwen; Yang, W.-F.
This paper reviews several models developed to support decision making in municipal solid waste management (MSWM). The concepts underlying sustainable MSWM models can be divided into two categories: one incorporates social factors into decision making methods, and the other includes public participation in the decision-making process. The public is only apprised or takes part in discussion, and has little effect on decision making in most research efforts. Few studies have considered public participation in the decision-making process, and the methods have sought to strike a compromise between concerned criteria, not between stakeholders. However, the source of the conflict arises from the stakeholders' complex web of value. Such conflict affects the feasibility of implementing any decision. The purpose of this study is to develop a sustainable decision making model for MSWM to overcome these shortcomings. The proposed model combines multicriteria decision making (MCDM) and a consensus analysis model (CAM). The CAM is built up to aid in decision-making when MCDM methods are utilized and, subsequently, a novel sustainable decision making model for MSWM is developed. The main feature of CAM is the assessment of the degree of consensus between stakeholders for particular alternatives. A case study for food waste management in Taiwan is presented to demonstrate the practicality of this model.
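The combination described above pairs a conventional MCDM aggregation with a measure of stakeholder agreement. A toy sketch of both ingredients; the consensus function is a stand-in for the idea of the CAM, not the paper's actual formula, and all ratings are hypothetical:

```python
from statistics import pstdev

def weighted_score(scores, weights):
    """Additive MCDM aggregation: sum of weight * criterion score."""
    return sum(w * s for w, s in zip(weights, scores))

def consensus_degree(ratings):
    """Stand-in consensus measure for one alternative: 1 minus twice the
    population standard deviation of stakeholder ratings on [0, 1]
    (pstdev is at most 0.5 on that range, so the result lies in [0, 1]).
    Identical ratings give 1; maximally split ratings give 0."""
    return 1.0 - 2.0 * pstdev(ratings)

# An alternative can score well on the criteria yet have weak consensus
# among stakeholders -- the conflict the paper's model is built to expose.
score = weighted_score([0.9, 0.6, 0.7], [0.5, 0.3, 0.2])
agreement = consensus_degree([0.9, 0.2, 0.9, 0.3])
```

A sustainable decision would then require both a high aggregate score and an acceptable degree of consensus, rather than compromising only between criteria.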
McCullough, Laurence B
The professional medical ethics model of decision making may be applied to decisions clinicians and patients make under the conditions of clinical uncertainty that exist when evidence is low or very low. This model uses the ethical concepts of medicine as a profession, the professional virtues of integrity and candor, the patient's virtue of prudence, the moral management of medical uncertainty, and trial of intervention. These features combine to justifiably constrain clinicians' and patients' autonomy with the goal of preventing nondeliberative decisions by patients and clinicians. To prevent biased recommendations by the clinician that promote such nondeliberative decisions, medically reasonable alternatives supported by low or very low evidence should be offered but not recommended. The professional medical ethics model of decision making aims to improve the quality of decisions by reducing the unacceptable variation that can result from nondeliberative decision making by patients and clinicians when evidence is low or very low.
Given the importance of education and the growing public demand for improving education quality under tight budget constraints, there has been an emerging movement to call for research-informed decisions in educational resource allocation. Despite the abundance of rigorous studies on the effectiveness, cost, and implementation of educational…
Cowardin, L.M.; Johnson, D.H.; Shaffer, T.L.; Sparling, D.W.
A system comprising simulation models and data bases for habitat availability and nest success rates was used to predict results from a mallard (Anas platyrhynchos) management plan and to compare six management methods with a control. Individual treatments in the applications included land purchase for waterfowl production, wetland easement purchase, lease of uplands for waterfowl management, cropland retirement, use of no-till winter wheat, delayed cutting of alfalfa, installation of nest baskets, nesting island construction, and use of predator-resistant fencing. The simulations predicted that implementation of the management plan would increase recruits by 24%. Nest baskets were the most effective treatment, accounting for 20.4% of the recruits. No-till winter wheat was the second most effective, accounting for 5.9% of the recruits. Wetland loss due to drainage would cause an 11% loss of breeding population in 10 years. The models were modified to account for migrational homing. The modification indicated that migrational homing would enhance the effects of management. Nest success rates were critical contributions to individual management methods. The most effective treatments, such as nest baskets, had high success rates and affected a large portion of the breeding population. Economic analyses indicated that nest baskets would be the most economical of the three techniques tested. The applications indicated that the system is a useful tool to aid management decisions, but data are scarce for several important variables. Basic research will be required to adequately model the effect of migrational homing and density dependence on production. The comprehensive nature of predictions desired by managers will also require that production models like the one described here be extended to encompass the entire annual cycle of waterfowl.
Gong, P. (Dept. of Forest Economics)
Different decision models can be constructed and used to analyze a regeneration decision in even-aged stand management. However, the optimal decision and management outcomes determined in an analysis may depend on the decision model used in the analysis. This paper examines the proper choice of decision model for determining the optimal planting density and land expectation value (LEV) for a Scots pine (Pinus sylvestris L.) plantation in northern Sweden. First, a general adaptive decision model for determining the regeneration alternative that maximizes the LEV is presented. This model recognizes future stand state and timber price uncertainties by including multiple stand state and timber price scenarios, and assumes that the harvest decision in each future period will be made conditional on the observed stand state and timber prices. Alternative assumptions about future stand states, timber prices, and harvest decisions can be incorporated into this general decision model, resulting in several different decision models that can be used to analyze a specific regeneration problem. Next, the consequences of choosing different modeling assumptions are determined using the example Scots pine plantation problem. Numerical results show that the most important sources of uncertainty that affect the optimal planting density and LEV are variations of the optimal clearcut time due to short-term fluctuations of timber prices. It is appropriate to determine the optimal planting density and harvest policy using an adaptive decision model that recognizes uncertainty only in future timber prices. After the optimal decisions have been found, however, the LEV should be re-estimated by incorporating both future stand state and timber price uncertainties.
Pohl, Jens; Myers, Leonard
A cooperative decision-making model is described that comprises six concurrently executing domain experts coordinated by a blackboard control expert. The focus application field is architectural design, and the domain experts represent consultants in the areas of daylighting, noise control, structural support, cost estimating, space planning, and climate responsiveness. Both the domain experts and the blackboard were implemented as production systems, using an enhanced version of the basic CLIPS package. Acting in unison as an Expert Design Advisor, the domain and control experts react to the evolving design solution progressively developed by the user in a 2-D CAD drawing environment. A Geometry Interpreter maps each drawing action taken by the user to real-world objects, such as spaces, walls, windows, and doors. These objects, endowed with geometric and nongeometric attributes, are stored as frames in a semantic network. Object descriptions are derived partly from the geometry of the drawing environment and partly from knowledge bases containing prototypical, generalized information about the building type and site conditions under consideration.
Paulson, Patrick R.; Coles, Garill A.; Shoemaker, Steven V.
Modernization of nuclear power operations control systems, in particular the move to digital control systems, creates an opportunity to modernize existing legacy infrastructure and extend plant life. We describe here decision support tools that allow the assessment of different facets of risk and support the optimization of available resources to reduce risk as plants are upgraded and maintained. This methodology could become an integrated part of the design review process and a part of the operations management systems. The methodology can be applied to the design of new reactors such as small modular reactors (SMRs), and can help in assessing the risks of different reactor configurations. Our tool provides a low-cost evaluation of alternative configurations and an expanded safety analysis by considering scenarios early in the implementation cycle, when cost impacts can be minimized. The effects of failures can be modeled and thoroughly vetted to understand their potential impact on risk. The process and tools presented here allow for an integrated assessment of risk by supporting traditional defense-in-depth approaches while taking into consideration the insertion of new digital instrumentation and control systems.
Terando, A. J.; Wootten, A.; Eaton, M. J.; Runge, M. C.; Littell, J. S.; Bryan, A. M.; Carter, S. L.
Two types of decisions face society with respect to anthropogenic climate change: (1) whether to enact a global greenhouse gas abatement policy, and (2) how to adapt to the local consequences of current and future climatic changes. The practice of downscaling global climate models (GCMs) is often used to address (2) because GCMs do not resolve key features that will mediate global climate change at the local scale. In response, the development of downscaling techniques and models has accelerated to aid decision makers seeking adaptation guidance. However, quantifiable estimates of the value of information are difficult to obtain, particularly in decision contexts characterized by deep uncertainty and low system-controllability. Here we demonstrate a method to quantify the additional value that decision makers could expect if research investments are directed towards developing new downscaled climate projections. As a proof of concept we focus on a real-world management problem: whether to undertake assisted migration for an endangered tropical avian species. We also take advantage of recently published multivariate methods that account for three vexing issues in climate impacts modeling: maximizing climate model quality information, accounting for model dependence in ensembles of opportunity, and deriving probabilistic projections. We expand on these global methods by including regional (Caribbean Basin) and local (Puerto Rico) domains. In the local domain, we test whether a high resolution (2km) dynamically downscaled GCM reduces the multivariate error estimate compared to the original coarse-scale GCM. Initial tests show little difference between the downscaled and original GCM multivariate error. When propagated through to a species population model, the Value of Information analysis indicates that the expected utility that would accrue to the manager (and species) if this downscaling were completed may not justify the cost compared to alternative actions.
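The Value of Information logic in the abstract above can be sketched with the standard expected-value-of-perfect-information (EVPI) calculation: compare the best action under current uncertainty with the expected payoff if the climate outcome were known before acting. All states, probabilities, and utilities below are illustrative numbers of ours, not the paper's results.

```python
# Prior beliefs over a simplified climate outcome for the species' habitat.
priors = {"favorable": 0.6, "unfavorable": 0.4}

# Illustrative management utilities: assisted migration hedges against a
# bad outcome but forgoes some upside if conditions stay favorable.
utility = {
    ("migrate", "favorable"): 40, ("migrate", "unfavorable"): 70,
    ("stay", "favorable"): 100,   ("stay", "unfavorable"): 10,
}
actions = ["migrate", "stay"]

def expected_utility(a):
    return sum(priors[s] * utility[(a, s)] for s in priors)

# Best achievable expected utility without any new information.
eu_no_info = max(expected_utility(a) for a in actions)

# With perfect information, the best action is chosen per state first.
eu_perfect = sum(priors[s] * max(utility[(a, s)] for a in actions)
                 for s in priors)

# EVPI is an upper bound on what any downscaling effort could be worth.
evpi = eu_perfect - eu_no_info
```

If the cost of producing the downscaled projections exceeds `evpi`, the analysis above would favor spending the resources on alternative actions instead, which mirrors the paper's conclusion.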
Khodadadi, Arash; Fakhari, Pegah; Busemeyer, Jerome R.
When animals have to make a number of decisions during a limited time interval, they face a fundamental problem: how much time they should spend on each decision in order to achieve the maximum possible total outcome. Deliberating more on one decision usually leads to more outcome, but less time will remain for other decisions. In the framework of sequential sampling models, the question is how animals learn to set their decision threshold such that the total expected outcome achieved during a limited time is maximized. The aim of this paper is to provide a theoretical framework for answering this question. To this end, we consider an experimental design in which each trial can come from one of several possible “conditions.” A condition specifies the difficulty of the trial, the reward, the penalty and so on. We show that to maximize the expected reward during a limited time, the subject should set a separate value of decision threshold for each condition. We propose a model of learning the optimal value of decision thresholds based on the theory of semi-Markov decision processes (SMDP). In our model, the experimental environment is modeled as an SMDP with each “condition” being a “state” and the value of decision thresholds being the “actions” taken in those states. The problem of finding the optimal decision thresholds then is cast as the stochastic optimal control problem of taking actions in each state in the corresponding SMDP such that the average reward rate is maximized. Our model utilizes a biologically plausible learning algorithm to solve this problem. The simulation results show that at the beginning of learning the model chooses high decision-threshold values, which lead to sub-optimal performance. With experience, however, the model learns to lower the decision thresholds until it finally finds the optimal values. PMID:24904252
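The core trade-off above can be sketched with a random-walk decision model: a higher threshold yields more accuracy per trial but fewer trials per unit time, so the reward-maximizing threshold differs by condition. The sketch below finds a per-condition threshold by brute-force search, a crude stand-in for the paper's SMDP-based learning rule; all parameter values are illustrative assumptions of ours.

```python
import random

def trial(drift, threshold, noise=1.0):
    """One random-walk decision: accumulate noisy evidence until a
    bound is crossed; returns (correct, decision_time)."""
    x, t = 0.0, 0
    while abs(x) < threshold:
        x += drift + random.gauss(0.0, noise)
        t += 1
    return x > 0, t

def reward_rate(drift, threshold, reward=1.0, iti=5.0, n=500):
    """Average reward per unit time over n simulated trials,
    including a fixed inter-trial interval."""
    total_r = total_t = 0.0
    for _ in range(n):
        correct, t = trial(drift, threshold)
        total_r += reward if correct else 0.0
        total_t += t + iti
    return total_r / total_t

random.seed(1)
# A separate threshold per condition (easy vs hard), chosen to
# maximize the empirical reward rate over candidate values.
thresholds = [1, 2, 4, 8, 16]
best = {cond: max(thresholds, key=lambda b: reward_rate(drift, b))
        for cond, drift in [("easy", 0.5), ("hard", 0.1)]}
```

The paper's contribution is to replace this exhaustive search with an online, biologically plausible algorithm that converges to the same per-condition optima.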
Disney, W Terry; Peters, Mark A
Simulation modeling can aid decision-makers in deciding when to invest in additional research and when a risky animal disease-import decision should go forward. Simulation modeling to evaluate value-of-information (VOI) techniques provides a robust, objective and transparent framework for assisting decision-makers in making risky animal and animal product decisions. In this analysis, the hypothetical risk from poultry disease in chicken-meat imports was modeled. Economic criteria were used to quantify alternative confidence-increasing decisions regarding potential import testing and additional research requirements. In our hypothetical example, additional information about poultry disease in the exporting country (either by requiring additional export-flock surveillance that results in no sign of disease, or by conducting additional research into lack of disease transmittal through chicken-meat ingestion) captured >75% of the value-of-information attainable regarding the chicken-meat-import decision.
Lafond, Daniel; Lacouture, Yves; Cohen, Andrew L.
The authors present 3 decision-tree models of categorization adapted from T. Trabasso, H. Rollins, and E. Shaughnessy (1971) and use them to provide a quantitative account of categorization response times, choice proportions, and typicality judgments at the individual-participant level. In Experiment 1, the decision-tree models were fit to…
Francis, Perry C.
Mental health professionals are faced with increasingly complex ethical decisions that are impacted by culture, personal and professional values, and the contexts in which they and their clients inhabit. This article presents the reasons for developing and implementing multiple ethical decision making models and reviews four models that address…
A prototype decision support system (DSS) called Apollo was developed to assist researchers in using the Decision Support System for Agrotechnology Transfer (DSSAT) crop growth models to analyze precision farming datasets. Because the DSSAT models are written to simulate crop growth and development...
Gamification means the use of various elements of game design in nongame contexts including workplace collaboration, marketing, education, military, and medical services. Gamification is effective for both improving workplace productivity and motivating employees. However, introducing gamification is not easy, because the planning and implementation processes are very complicated and require interdisciplinary knowledge of information systems, organizational behavior, and human psychology. The purpose of this paper is to provide a systematic decision-making method for the gamification process. This paper suggests decision criteria for the selection of a gamification platform to support a systematic decision-making process for management. The criteria are derived from previous works on gamification, the introduction of information systems, and the analytic hierarchy process. The weights of the decision criteria are calculated through a survey of professionals in games, information systems, and business administration, using the analytic hierarchy process. The decision criteria and weights provided in this paper could support management in making a systematic decision on the selection of a gamification platform. PMID:24892075
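The analytic hierarchy process step described above can be sketched as follows: criterion weights come from the principal eigenvector of a pairwise comparison matrix, here approximated by the standard row geometric-mean method, with a consistency check. The criteria names and judgment values are illustrative assumptions of ours, not the paper's survey data.

```python
import math

def ahp_weights(M):
    """AHP priority weights via the row geometric-mean approximation
    to the principal eigenvector of a pairwise comparison matrix."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical Saaty-scale (1-9) judgments for three platform criteria:
# usability vs cost vs extensibility.
M = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_weights(M)

# Consistency ratio: lambda_max from (M w) / w, then CI / RI.
n = len(M)
Mw = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Mw[i] / w[i] for i in range(n)) / n
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58  # Saaty's random index for a 3x3 matrix
```

A consistency ratio below 0.1 is the conventional cutoff for accepting the judgments; otherwise the survey respondent would be asked to revisit the comparisons.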
Besedeš, Tibor; Deck, Cary; Sarangi, Sudipta; Shor, Mikhael
Using paper and pencil experiments administered in senior centers, we examine decision-making performance in multi-attribute decision problems. We differentiate the effects of declining cognitive performance and changing cognitive process on decision-making performance of seniors as they age. We find a significant decline in performance with age due to reduced reliance on common heuristics and increased decision-making randomness among our oldest subjects. However, we find that increasing the number of options in a decision problem increases the number of heuristics brought to the task. This challenges the choice overload view that people give up when confronted with too much choice. PMID:22408282
Wright, Adam; Sittig, Dean F.
Background: A large body of evidence over many years suggests that clinical decision support systems can be helpful in improving both clinical outcomes and adherence to evidence-based guidelines. However, to this day, clinical decision support systems are not widely used outside of a small number of sites. One reason why decision support systems are not widely used is the relative difficulty of integrating such systems into clinical workflows and computer systems. Purpose: To review and synthesize the history of clinical decision support systems, and to propose a model of various architectures for integrating clinical decision support systems with clinical systems. Methods: The authors conducted an extensive review of the clinical decision support literature since 1959, sequenced the systems and developed a model. Results: The model developed consists of four phases: standalone decision support systems, decision support integrated into clinical systems, standards for sharing clinical decision support content and service models for decision support. These four phases have not heretofore been identified, but they track remarkably well with the chronological history of clinical decision support, and show evolving and increasingly sophisticated attempts to ease integrating decision support systems into clinical workflows and other clinical systems. Conclusions: Each of the four evolutionary approaches to decision support architecture has unique advantages and disadvantages. A key lesson was that there were common limitations that almost all the approaches faced, and no single approach has been able to entirely surmount: 1) fixed knowledge representation systems inherently circumscribe the type of knowledge that can be represented in them, 2) there are serious terminological issues, 3) patient data may be spread across several sources with no single source having a complete view of the patient, and 4) major difficulties exist in transferring successful interventions from one
Gill, Wanda E.
Three decision-making models that have applications for college presidents and administrators are reviewed. While both individual and group decision-making are addressed, emphasis is placed on the importance of group decisions on institutional policy planning. The model of Edmund M. Burke (1979) presents specific decision-making strategies in…
Watershed simulation models are used extensively to investigate hydrologic processes, landuse and climate change impacts, pollutant load assessments and best management practices (BMPs). Developing, calibrating and validating these models require a number of critical decisions that will influence t...
Héricé, Charlotte; Khalil, Radwa; Moftah, Marie; Boraud, Thomas; Guthrie, Martin; Garenne, André
The mechanisms of decision-making and action selection are generally thought to be under the control of parallel cortico-subcortical loops connecting back to distinct areas of cortex through the basal ganglia and processing motor, cognitive and limbic modalities of decision-making. We have used these properties to develop and extend a connectionist model at the spiking-neuron level, based on a previous rate-model approach. This model is demonstrated on decision-making tasks that have been studied in primates, where the electrophysiology has been interpreted to show that the decision is made in two steps. To model this, we have used two parallel loops, each of which performs decision-making based on interactions between positive and negative feedback pathways. This model is able to perform two-level decision-making as in primates. We show here that, before learning, synaptic noise is sufficient to drive the decision-making process and that, after learning, the decision is based on the choice that has proven most likely to be rewarded. The model is then submitted to lesion tests, reversal learning and extinction protocols. We show that, under these conditions, it behaves in a consistent manner and provides predictions in accordance with observed experimental data.
Karmperis, Athanasios C.; Aravossis, Konstantinos; Tatsiopoulos, Ilias P.; Sotirchos, Anastasios
Highlights: ► The mainly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are mainly developed within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.
Rigoux, Lionel; Guigon, Emmanuel
Costs (e.g. energetic expenditure) and benefits (e.g. food) are central determinants of behavior. In ecology and economics, they are combined to form a utility function which is maximized to guide choices. This principle is widely used in neuroscience as a normative model of decision and action, but current versions of this model fail to consider how decisions are actually converted into actions (i.e. the formation of trajectories). Here, we describe an approach where decision making and motor control are optimal, iterative processes derived from the maximization of the discounted, weighted difference between expected rewards and foreseeable motor efforts. The model accounts for decision making in cost/benefit situations, and detailed characteristics of control and goal tracking in realistic motor tasks. As a normative construction, the model is relevant to address the neural bases and pathological aspects of decision making and motor control. PMID:23055916
Rational Choice Model, Organizational Process Model, Bureaucratic Politics Model, Cognitive... The Organizational Process Model is relatively new compared to the Rational Choice Model. It uses processes and predetermined procedures
Hanagud, S.; Uppaluri, B.
This paper describes a methodology for making cost effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to develop the procedure for making fatigue design decisions on the basis of minimum expected cost or risk function and reliability bounds. Selections of initial flaw size distribution, NDT, repair threshold crack lengths, and inspection intervals are discussed.
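The continuous-time, discrete-crack-length model above can be sketched as a Markov chain over crack-length states, with an expected-cost criterion for choosing an inspection interval. The transition probabilities, costs, and candidate intervals below are illustrative assumptions of ours, not the paper's calibrated values.

```python
# Discrete crack-length states 0..3; state 3 is failure (absorbing).
# Per-period transition probabilities (illustrative only).
P = [
    [0.95, 0.05, 0.00, 0.00],
    [0.00, 0.90, 0.10, 0.00],
    [0.00, 0.00, 0.85, 0.15],
    [0.00, 0.00, 0.00, 1.00],
]

def step(dist):
    """Propagate a crack-length state distribution one period."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def failure_prob(periods, start=(1.0, 0.0, 0.0, 0.0)):
    """Probability of reaching the failure state within `periods`."""
    dist = list(start)
    for _ in range(periods):
        dist = step(dist)
    return dist[-1]

def expected_cost(interval, horizon=60, c_inspect=1.0, c_failure=500.0):
    """Crude risk trade-off: frequent inspection costs more up front,
    but a longer interval leaves more time for undetected growth."""
    inspections = horizon // interval
    return inspections * c_inspect + c_failure * failure_prob(interval)

best_interval = min([5, 10, 20, 30], key=expected_cost)
```

The paper's decision-theoretic treatment additionally optimizes over initial flaw-size distributions, NDT choices, and repair thresholds; this sketch isolates only the inspection-interval decision.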
Larsen, Janet D.
Describes two classroom demonstrations, based on the prisoner's dilemma, which illustrate some elements of decision making. Examines how students either cooperate or take advantage of one another, and discusses the use of this activity as an introduction to various concepts in psychology and other social sciences. (GEA)
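The prisoner's dilemma structure behind such demonstrations can be written down directly: defection strictly dominates cooperation for each individual, yet mutual cooperation is better for the pair. The payoff values below are the conventional textbook numbers, not figures from the article.

```python
# Standard prisoner's dilemma payoffs, indexed by (my_move, other_move);
# each entry is (my_payoff, other_payoff). C = cooperate, D = defect.
PAYOFF = {
    ("C", "C"): (3, 3),   # mutual cooperation
    ("C", "D"): (0, 5),   # sucker's payoff vs temptation
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),   # mutual defection
}

def my_payoff(me, other):
    return PAYOFF[(me, other)][0]

# Defection dominates cooperation whatever the other player does...
assert my_payoff("D", "C") > my_payoff("C", "C")
assert my_payoff("D", "D") > my_payoff("C", "D")
# ...yet the pair as a whole does better under mutual cooperation.
assert sum(PAYOFF[("C", "C")]) > sum(PAYOFF[("D", "D")])
```

Tallying students' C/D choices against this table makes the tension between individual incentive and collective outcome concrete.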
Feldman, Daniel C.; Beehr, Terry A.
The present article organizes prominent theories about retirement decision making around three different types of thinking about retirement: imagining the possibility of retirement, assessing when it is time to let go of long-held jobs, and putting concrete plans for retirement into action at present. It also highlights important directions for…
...identical except for orientation, they are mentally rotated (adapted from Shepard & Metzler, 1971). Mental simulation in the psychological domain...own minds to simulate the psychological causes of others' behavior, typically by making decisions within a pretended context (Gordon, 2001).
Kraft, Donald H.
A threshold rule is analyzed and compared to the Neyman-Pearson procedure, indicating that the threshold rule provides a necessary but not sufficient measure of the minimal performance of a retrieval system, whereas the Neyman-Pearson procedure yields a better a priori decision for retrieval. (Author/MBR)
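The comparison can be sketched for a simple case: with two equal-variance Gaussian hypotheses about a document's relevance score, the Neyman-Pearson likelihood-ratio test reduces to a score threshold, which makes the relationship between the two rules explicit. The distributions are an illustrative assumption of ours, not the paper's retrieval model.

```python
import math

# Hypotheses about a document's score: H1 (relevant) ~ N(1, 1),
# H0 (non-relevant) ~ N(0, 1). Illustrative only.
def pdf(x, mu):
    return math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

def likelihood_ratio(x):
    return pdf(x, 1.0) / pdf(x, 0.0)

# Neyman-Pearson: retrieve when the likelihood ratio exceeds k, with k
# chosen so the false-alarm rate stays within a tolerated level.
def np_decision(x, k):
    return likelihood_ratio(x) > k

# For these equal-variance Gaussians the LR simplifies to exp(x - 0.5),
# so the NP test is equivalent to the score threshold x > ln(k) + 0.5.
```

In this special case the two rules coincide; in general, a threshold fixed without reference to the error-rate constraint bounds minimal performance but does not guarantee the NP optimum.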
Wang, Hei-Chia; Chou, Ya-lin; Guo, Jiunn-Liang
Purpose: The paper's aim is to propose a core journal decision method, called the local impact factor (LIF), which can evaluate the requirements of the local user community by combining both the access rate and the weighted impact factor, and by tracking citation information on the local users' articles. Design/methodology/approach: Many…
Tolson, Bryan; Craig, James
Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have noted that the Klemeš Crash Tests (KCTs), the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed as KCTs, have yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions, and ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or not upgrade the existing flood control structure) under two different sets of model building
Williams, B.K.; Nichols, J.D.; Conroy, M.J.
This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. KEY FEATURES: * Integrates population modeling, parameter estimation and decision-theoretic approaches to management in a single, cohesive framework * Provides authoritative, state-of-the-art descriptions of quantitative approaches to modeling, estimation and decision-making * Emphasizes the role of mathematical modeling in the conduct of science and management * Utilizes a unifying biological context, consistent mathematical notation, and numerous biological examples
Day, W; Audsley, E; Frost, A R
Engineering research and development contributes to the advance of sustainable agriculture both through innovative methods to manage and control processes, and through quantitative understanding of the operation of practical agricultural systems using decision models. This paper describes how an engineering approach, drawing on mathematical models of systems and processes, contributes new methods that support decision making at all levels from strategy and planning to tactics and real-time control. The ability to describe the system or process by a simple and robust mathematical model is critical, and the outputs range from guidance to policy makers on strategic decisions relating to land use, through intelligent decision support to farmers, and on to real-time engineering control of specific processes. Precision in decision making leads to decreased use of inputs, fewer environmental emissions and enhanced profitability, all essential to sustainable systems.
Ge, Lan; Mourits, Monique C M; Kristensen, Anders R; Huirne, Ruud B M
Most studies on control strategies for contagious diseases such as foot-and-mouth disease (FMD) evaluate pre-defined control strategies and imply static decision-making during epidemic control. Such a static approach contradicts the dynamic nature of the decision-making process during epidemic control. This paper presents an integrated epidemic-economic modelling approach to support dynamic decision-making in controlling FMD epidemics. This new modelling approach reflects ongoing uncertainty about epidemic growth during epidemic control and provides information required by a dynamic decision process. As demonstrated for a Dutch FMD-case, the modelling approach outperforms static evaluation of pre-fixed control strategies by: (1) providing guidance to decision-making during the entire control process; and (2) generating more realistic estimation of the costs of overreacting or underreacting in choosing control options.
Park, Guihyun; DeShon, Richard P.
The consideration of minority opinions when making team decisions is an important factor that contributes to team effectiveness. A multilevel model of minority opinion influence in decision-making teams is developed to address the conditions that relate to adequate consideration of minority opinions. Using a sample of 57 teams working on a simulated airport security-screening task, we demonstrate that team learning goal orientation influences the confidence of minority opinion holders and team discussion. Team discussion, in turn, relates to minority influence, greater decision quality, and team satisfaction. Implications for managing decision-making teams in organizations are discussed.
Ioannis, Seimenis; Damianos, Sakas P.; Nikolaos, Konstantopoulos
This article examines the factors that affect the decision making of the training managers responsible for the business communication field, as these factors emerged from a study of decisions taken in the commercial sector of this specific Greek market. Previous research has indicated that a number of variables participate in this kind of decision. The aim of this article is to identify the main factors which determine, in the commercial sector, the decision to train employees in the field of business communication. On the basis of qualitative research, dynamic simulation models have been created for some of these main factors.
Piet, Steven James; Dettmers, Dana Lee; Dakins, Maxine Ellen; Eide, Steven Arvid; Gibson, Patrick Lavern; Joe, Jeffrey Clark; Kerr, Thomas A; Nitschke, Robert Leon; Oswald, Kyle Blaine; Reisenauer, John Phillip
The effects of closure decisions for used nuclear facilities can extend centuries into the future. Yet, the longevity of decisions made over the past half century has been poor. Our goal is an improved decision framework for decommissioning, stewardship, and waste management. This paper describes our overall framework. Companion papers describe the underlying philosophy of the KONVERGENCE Model for Sustainable Decisions [1] and implications for a class of intractable decision problems [2]. Where knowledge, values, and resources converge (the K, V, and R in KONVERGENCE), you will find a sustainable decision – a decision that works over time. Our approach clarifies what is needed to make and keep decisions over relevant time periods. The process guides participants through establishing the real problem, understanding the universes of knowledge, values, and resources, and generating alternatives. We explore three classes of alternatives – reusable (e.g. greenfield), closed (e.g. entombed structures), and adaptable. After testing alternatives for konvergence among knowledge, values, and resources, we offer suggestions to diagnose divergence, to reduce divergence by refining alternatives to address identified weaknesses, and to plan to keep konvergence over the life of the decision. We believe that decisions made via this method will better stand the test of time – because it will be either acceptable to keep them unchanged or possible to adapt them as knowledge, values, and resources change.
This presentation includes a combination of modeling and measurement results to characterize near-source air quality in Newark, New Jersey with consideration of how this information could be used to inform decision making to reduce risk of health impacts. Decisions could include ...
Huang, Yanping; Rao, Rajesh P N
A key problem in neuroscience is understanding how the brain makes decisions under uncertainty. Important insights have been gained using tasks such as the random dots motion discrimination task in which the subject makes decisions based on noisy stimuli. A descriptive model known as the drift diffusion model has previously been used to explain psychometric and reaction time data from such tasks but to fully explain the data, one is forced to make ad hoc assumptions such as a time-dependent collapsing decision boundary. We show that such assumptions are unnecessary when decision making is viewed within the framework of partially observable Markov decision processes (POMDPs). We propose an alternative model for decision making based on POMDPs. We show that the motion discrimination task reduces to the problems of (1) computing beliefs (posterior distributions) over the unknown direction and motion strength from noisy observations in a Bayesian manner, and (2) selecting actions based on these beliefs to maximize the expected sum of future rewards. The resulting optimal policy (belief-to-action mapping) is shown to be equivalent to a collapsing decision threshold that governs the switch from evidence accumulation to a discrimination decision. We show that the model accounts for both accuracy and reaction time as a function of stimulus strength as well as different speed-accuracy conditions in the random dots task.
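The belief computation the abstract describes can be sketched minimally in code. The two-direction task, the Gaussian evidence likelihoods, the `drift` and `noise_sd` parameters, and the fixed decision threshold are all illustrative assumptions (the paper's optimal POMDP policy corresponds to a collapsing, time-dependent threshold), not the authors' implementation:

```python
import math

def update_belief(prior_right, observation, drift=1.0, noise_sd=1.0):
    """One Bayesian belief update over two motion directions.

    observation: noisy momentary evidence; positive values favor 'right'.
    Likelihoods are Gaussian with means +drift ('right') and -drift ('left').
    """
    def gauss(x, mu):
        return math.exp(-((x - mu) ** 2) / (2 * noise_sd ** 2))

    post = prior_right * gauss(observation, +drift)
    return post / (post + (1 - prior_right) * gauss(observation, -drift))

def decide(observations, threshold=0.9):
    """Accumulate evidence until the posterior crosses a decision threshold."""
    belief = 0.5  # uninformative prior over the two directions
    for t, obs in enumerate(observations, 1):
        belief = update_belief(belief, obs)
        if belief >= threshold:
            return "right", t
        if belief <= 1 - threshold:
            return "left", t
    return "undecided", len(observations)
```

A stream of consistently positive evidence, e.g. `decide([0.8, 1.2, 0.9])`, drives the posterior past the threshold after a few samples, illustrating the switch from evidence accumulation to a discrimination decision.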
Rowell, P. Clay; Mobley, A. Keith; Kemer, Gulsah; Giordano, Amanda
The authors examined the effectiveness of a group career counseling model (Pyle, K. R., 2007) on college students' career decision-making abilities. They used a Solomon 4-group design and found that students who participated in the career counseling groups had significantly greater increases in career decision-making abilities than those who…
Anders, Mary C.; Christopher, F. Scott
The purpose of our study was to identify factors underlying rape survivors' post-assault prosecution decisions by testing a decision model that included the complex relations between the multiple social ecological systems within which rape survivors are embedded. We coded 440 police rape cases for characteristics of the assault and characteristics…
Hammer, Joseph H.; Vogel, David L.
Prior research on professional psychological help-seeking behavior has operated on the assumption that the decision to seek help is based on intentional and reasoned processes. However, research on the dual-process prototype/willingness model (PWM; Gerrard, Gibbons, Houlihan, Stock, & Pomery, 2008) suggests health-related decisions may also…
He, Chunyan; Lei, Yalin; Ge, Jianping
Chinese rare earth export policies currently accelerate the depletion of the resource. Adopting an optimal export trade selection strategy is therefore crucial to determining and ultimately identifying the ideal trading partners. This paper introduces a multi-attribute decision-making methodology which is then used to select the optimal trading partner. In the method, an evaluation criteria system is established to assess the seven top trading partners along three dimensions: political relationships, economic benefits and industrial security. Specifically, a simple additive weighting model derived from an additive utility function is utilized to calculate, rank and select alternatives. Results show that Japan would be the optimal trading partner for Chinese rare earths. The criteria evaluation method of trading partners for China's rare earth exports provides the Chinese government with a tool to enhance rare earth industrial policies. PMID:25051534
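The simple additive weighting step this abstract relies on can be sketched as follows. The partner names, criterion scores, and weights below are invented for illustration, and max-normalization of benefit criteria is one common convention, not necessarily the paper's:

```python
def saw_rank(alternatives, weights):
    """Simple additive weighting: normalize each (benefit) criterion to
    [0, 1] by dividing by its maximum, then take the weighted sum and
    rank alternatives by descending score."""
    criteria = list(weights)
    maxima = {c: max(a[c] for a in alternatives.values()) for c in criteria}
    scores = {
        name: sum(weights[c] * attrs[c] / maxima[c] for c in criteria)
        for name, attrs in alternatives.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical partners scored on the abstract's three dimensions.
partners = {
    "A": {"political": 8, "economic": 9, "security": 7},
    "B": {"political": 6, "economic": 7, "security": 9},
}
weights = {"political": 0.3, "economic": 0.4, "security": 0.3}
```

Calling `saw_rank(partners, weights)` returns the alternatives ordered by their aggregate utility, which is exactly the "calculate, rank and select" step the abstract mentions.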
Kim, Henry M.
An enterprise model, a computational model of knowledge about an enterprise, is a useful tool for integrated decision-making by e-commerce suppliers and customers. Sharable knowledge, once represented in an enterprise model, can be integrated by the modeled enterprise's e-commerce partners. Presents background on enterprise modeling, followed by…
…neuroeconomic choice model combining the normalization model and a general theoretical framework for the neural decision process; this model describes the… Mel Win Khaw, Paul W. Glimcher. Normalization is a general neural mechanism for context-dependent decision making, Proceedings of the National…
Fernández, Ariel; Scott, L Ridgway
Lead optimization (LO) is essential to fulfill the efficacy and safety requirements of drug-based targeted therapy. The ease with which water may be locally removed from around the target protein crucially influences LO decisions. However, inferred binding sites often defy intuition and the resulting LO decisions are often counterintuitive, with nonpolar groups in the drug placed next to polar groups in the target. We first introduce biophysical advances to reconcile these apparent mismatches. We incorporate three-body energy terms that account for the net stabilization of preformed target structures upon removal of interfacial water concurrent with drug binding. These unexplored drug-induced environmental changes enhancing the target electrostatics are validated against drug-target affinity data, yielding superior computational accuracy required to improve drug design.
Lee, Sara; Riley-Behringer, Maureen; Rose, Jeanmarie C; Meropol, Sharon B; Lazebnik, Rina
This study explores how parents' intentions regarding vaccination prior to their children's visit were associated with actual vaccine acceptance. A convenience sample of parents accompanying 6-week-old to 17-year-old children completed a written survey at 2 pediatric practices. Using hierarchical logistic regression, for hospital-based participants (n = 216), vaccine refusal history (P < .01) and vaccine decision made before the visit (P < .05) explained 87% of vaccine refusals. In community-based participants (n = 100), vaccine refusal history (P < .01) explained 81% of refusals. Over 1 in 5 parents changed their minds about vaccination during the visit. Thirty parents who were previous vaccine refusers accepted current vaccines, and 37 who had intended not to vaccinate chose vaccination. Twenty-nine parents without a refusal history declined vaccines, and 32 who did not intend to refuse before the visit declined vaccination. Future research should identify key factors to nudge parent decision making in favor of vaccination.
Sohl, Terry L.; Claggett, Peter R.
The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
Collective decision-making plays a central part in the lives of many social animals. Two important factors that influence collective decision-making are information uncertainty and conflicting preferences. Here, I bring together, and briefly review, basic models relating to animal collective decision-making in situations with information uncertainty and in situations with conflicting preferences between group members. The intention is to give an overview about the different types of modelling approaches that have been employed and the questions that they address and raise. Despite the use of a wide range of different modelling techniques, results show a coherent picture, as follows. Relatively simple cognitive mechanisms can lead to effective information pooling. Groups often face a trade-off between decision accuracy and speed, but appropriate fine-tuning of behavioural parameters could achieve high accuracy while maintaining reasonable speed. The right balance of interdependence and independence between animals is crucial for maintaining group cohesion and achieving high decision accuracy. In conflict situations, a high degree of decision-sharing between individuals is predicted, as well as transient leadership and leadership according to needs and physiological status. Animals often face crucial trade-offs between maintaining group cohesion and influencing the decision outcome in their own favour. Despite the great progress that has been made, there remains one big gap in our knowledge: how do animals make collective decisions in situations when information uncertainty and conflict of interest operate simultaneously?
Golman, Russell; Hagmann, David; Miller, John H.
How do social systems make decisions with no single individual in control? We observe that a variety of natural systems, including colonies of ants and bees and perhaps even neurons in the human brain, make decentralized decisions using common processes involving information search with positive feedback and consensus choice through quorum sensing. We model this process with an urn scheme that runs until hitting a threshold, and we characterize an inherent tradeoff between the speed and the accuracy of a decision. The proposed common mechanism provides a robust and effective means by which a decentralized system can navigate the speed-accuracy tradeoff and make reasonably good, quick decisions in a variety of environments. Additionally, consensus choice exhibits systemic risk aversion even while individuals are idiosyncratically risk-neutral. This too is adaptive. The model illustrates how natural systems make decentralized decisions, illuminating a mechanism that engineers of social and artificial systems could imitate. PMID:26601255
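The urn scheme with positive feedback and a quorum threshold can be sketched as below; the two-option urn, the initial counts, and the threshold value are illustrative assumptions rather than the authors' parameterization:

```python
import random

def urn_decision(threshold=50, seed=7):
    """Draw-and-reinforce urn: each draw adds one more ball of the drawn
    color (positive feedback), and the process stops when either color
    reaches the quorum threshold (consensus choice)."""
    rng = random.Random(seed)
    counts = {"A": 1, "B": 1}
    steps = 0
    while max(counts.values()) < threshold:
        total = sum(counts.values())
        pick = "A" if rng.random() < counts["A"] / total else "B"
        counts[pick] += 1
        steps += 1
    return max(counts, key=counts.get), steps
```

Raising the threshold slows the decision but bases it on more accumulated draws, which is the speed-accuracy tradeoff the paper characterizes; lowering it gives quick but noisier outcomes.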
Nong, Bao Anh; Ertsen, Maurits; Schoups, Gerrit
Complexity and uncertainty in natural resources management have been focus themes in recent years. Within these debates, with the aim of defining an approach feasible for water management practice, we are developing an integrated conceptual modeling framework for simulating decision-making processes of citizens, in our case in the Day river area, Vietnam. The model combines Bayesian Networks (BNs) and Agent-Based Modeling (ABM). BNs are able to combine both qualitative data from consultants / experts / stakeholders, and quantitative data from observations on different phenomena or outcomes from other models. Further strengths of BNs are that the relationships between variables in the system are presented in a graphical interface, and that components of uncertainty are explicitly related to their probabilistic dependencies. A disadvantage is that BNs cannot easily identify the feedback of agents in the system once changes appear. Hence, ABM was adopted to represent the reactions of stakeholders to such changes. The modeling framework is developed as an attempt to gain a better understanding of citizens' behavior and the factors influencing their decisions in order to reduce uncertainty in the implementation of water management policy.
Lin, W C; Ball, C
Compliance with Hepatitis B vaccination for nurses has been reported to be low in Taiwan. Therefore, a study of nursing students' views was conducted in Taiwan to discover possible reasons. As complex decision-making was involved in taking the vaccine, a four-level utility decision model underpinned by Multi-Attribute Utility theory was proposed to ascertain the relative contribution of the specific components of attitude and beliefs to the final decision and experience of being vaccinated against Hepatitis B infection. Results indicated that the 'personal value of Hepatitis B vaccination', in particular 'concern about the efficacy of the Hepatitis B vaccine', 'fear of pain from repeated injections', 'time' and 'money', were the main determinants in relation to the uptake of the Hepatitis B vaccination. Such results were consistent with earlier findings based on the Health Belief Model. It appears that the greater the experience gained in nursing care, the lower the rate of vaccination; the important items under the concept of 'personal value of Hepatitis B vaccination' varied by 'experience in nursing care'. The overall predictive validity was 67%, based on the utility decision model. When stratified by 'experience in nursing care', the prediction improved, ranging from 89% to 100%. Based on these findings, a specific intervention programme should be provided to change behaviour and improve the vaccination rate.
Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O
Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated, with minimal input and low technical maintenance requirements. Models perform best when tailored to answering specific compound-optimization questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles, performance deteriorates within weeks. Frequent automated updates of predictive models ensure the best predictions. Consensus between multiple modeling approaches increases prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
and Canadian indices differ. The U.S. uses the Heat Index (HI) based on Steadman’s model. Canada uses Humidex (HD). Our comparison used the USARIEM Heat Strain Decision Aid (HSDA) to evaluate both indices.
This viewgraph presentation reviews the development of an Integrated Medical Model (IMM) decision support tool for in-flight crew health care safety. Clinical methods, resources, and case scenarios are also addressed.
The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...
Verdejo-García, A; Pérez-García, M; Bechara, A
Similar to patients with orbitofrontal cortex lesions, substance dependent individuals (SDI) show signs of impairments in decision-making, characterised by a tendency to choose the immediate reward at the expense of severe negative future consequences. The somatic-marker hypothesis proposes that decision-making depends in many important ways on neural substrates that regulate homeostasis, emotion and feeling. According to this model, there should be a link between abnormalities in experiencing emotions in SDI, and their severe impairments in decision-making in real-life. Growing evidence from neuroscientific studies suggests that core aspects of substance addiction may be explained in terms of abnormal emotional guidance of decision-making. Behavioural studies have revealed emotional processing and decision-making deficits in SDI. Combined neuropsychological and physiological assessment has demonstrated that the poorer decision-making of SDI is associated with altered reactions to reward and punishing events. Imaging studies have shown that impaired decision-making in addiction is associated with abnormal functioning of a distributed neural network critical for the processing of emotional information, including the ventromedial cortex, the amygdala, the striatum, the anterior cingulate cortex, and the insular/somato-sensory cortices, as well as non-specific neurotransmitter systems that modulate activities of neural processes involved in decision-making. The aim of this paper is to review this growing evidence, and to examine the extent of which these studies support a somatic-marker model of addiction. PMID:18615136
The decision to add a new vaccine to the immunization schedule is a complex and multidisciplinary process based on the risk-benefit balance and, increasingly, on the cost-effectiveness ratio. Such decisions now use mathematical models that can predict the indirect, and potentially detrimental, effects of mass vaccination on the epidemiology of the target disease. The adjunction of an economic component to the modeling process ensures that vaccination represents an efficient allocation of available financial resources in an increasingly constrained environment.
Orsini, Caitlin A.; Willis, Markie L.; Gilbert, Ryan J.; Bizon, Jennifer L.; Setlow, Barry
Many debilitating psychiatric conditions, including drug addiction, are characterized by poor decision making and maladaptive risk-taking. Recent research has begun to probe this relationship to determine how brain mechanisms mediating risk-taking become compromised after chronic drug use. Currently, however, the majority of work in this field has used male subjects. Given the well-established sex differences in drug addiction, it is conceivable that such differences are also evident in risk-based decision making. To test this possibility, male and female adult rats were trained in a “Risky Decision making Task” (RDT), in which they chose between a small, “safe” food reward and a large, “risky” food reward accompanied by an increasing probability of mild footshock punishment. Consistent with findings in human subjects, females were more risk averse, choosing the large, risky reward significantly less than males. This effect was not due to differences in shock reactivity or body weight, and risk-taking in females was not modulated by estrous phase. Systemic amphetamine administration decreased risk-taking in both males and females; however, females exhibited greater sensitivity to amphetamine, suggesting that dopaminergic signaling may partially account for sex differences in risk-taking. Finally, although males displayed greater instrumental responding for food reward, reward choice in the RDT was not affected by satiation, indicating that differences in motivation to obtain food reward cannot fully account for sex differences in risk-taking. These results should prove useful for developing targeted treatments for psychiatric conditions in which risk-taking is altered and that are known to differentially affect males and females. PMID:26653713
Chang, Ting-Cheng; Wang, Hui
This paper proposes a cloud multi-criteria group decision-making model for teacher evaluation in higher education, which involves subjectivity, imprecision and fuzziness. First, the appropriate evaluation index is selected depending on the evaluation objectives, indicating a clear structural relationship between the evaluation index and…
Chen, Duan; Leon, Arturo S.; Gibson, Nathan L.; Hosseini, Parnian
Optimizing the operation of a multireservoir system is challenging due to the high dimension of the decision variables that lead to a large and complex search space. A spectral optimization model (SOM), which transforms the decision variables from the time domain to the frequency domain, is proposed to reduce the dimensionality. The SOM couples a spectral dimensionality-reduction method called Karhunen-Loeve (KL) expansion within the routine of the Nondominated Sorting Genetic Algorithm (NSGA-II). The KL expansion is used to represent the decision variables as a series of terms that are deterministic orthogonal functions with undetermined coefficients. The KL expansion can be truncated into fewer significant terms, and consequently fewer coefficients, by a predetermined number. During optimization, operators of the NSGA-II (e.g., crossover) are conducted only on the coefficients of the KL expansion rather than the large number of decision variables, significantly reducing the search space. The SOM is applied to the short-term operation of a 10-reservoir system in the Columbia River of the United States. Two scenarios are considered herein, the first with 140 decision variables and the second with 3360 decision variables. The hypervolume index is used to evaluate the optimization performance in terms of convergence and diversity. The evaluation of optimization performance is conducted for both the conventional optimization model (i.e., NSGA-II without KL) and the SOM with different numbers of KL terms. The results show that the number of decision variables can be greatly reduced in the SOM to achieve a similar or better performance compared to the conventional optimization model. For the scenario with 140 decision variables, the optimal performance of the SOM model is found with six KL terms. For the scenario with 3360 decision variables, the optimal performance of the SOM model is obtained with 11 KL terms.
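The dimensionality-reduction idea, representing a long decision trajectory by a few coefficients of deterministic orthogonal basis functions, can be illustrated with a discrete cosine basis standing in for the KL expansion. The KL basis proper is derived from the data's covariance structure, so this fixed basis is purely an illustrative assumption:

```python
import math

def truncated_basis_reconstruct(series, n_terms):
    """Project a time series onto the first n_terms of a discrete cosine
    (DCT-II) basis and reconstruct it. Optimizing over the few retained
    coefficients instead of every time step is the search-space reduction
    the SOM exploits (with a KL basis rather than this fixed one)."""
    n = len(series)

    def basis(k, t):
        return math.cos(math.pi * k * (t + 0.5) / n)

    coeffs = []
    for k in range(n_terms):
        norm = n if k == 0 else n / 2  # DCT-II orthogonality norms
        coeffs.append(sum(series[t] * basis(k, t) for t in range(n)) / norm)
    return [sum(coeffs[k] * basis(k, t) for k in range(n_terms))
            for t in range(n)]
```

With all `n` terms retained the reconstruction is exact; for a smooth release schedule a handful of terms already approximates it closely, which is why truncation can shrink a 3360-variable search to a few dozen coefficients.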
Andriyas, S.; McKee, M.
Anticipating farmers' irrigation decisions can provide the possibility of improving the efficiency of canal operations in on-demand irrigation systems. Although multiple factors are considered during irrigation decision making, for any given farmer there might be one factor playing a major role. Identification of the biophysical factor which led to a farmer deciding to irrigate is difficult because of the high variability of those factors during the growing season. Analysis of the irrigation decisions of a group of farmers for a single crop can help to simplify the problem. We developed a hidden Markov model (HMM) to analyze irrigation decisions and explore the factor, and the level of that factor, at which the majority of farmers decide to irrigate. The model requires observed variables as inputs and the hidden states. The chosen model inputs were relatively easily measured, or estimated, biophysical data, including such factors (i.e., those variables which are believed to affect irrigation decision-making) as cumulative evapotranspiration, soil moisture depletion, soil stress coefficient, and canal flows. Irrigation decision series were the hidden states for the model. The data for the work come from the Canal B region of the Lower Sevier River Basin, near Delta, Utah. The main crops of the region are alfalfa, barley, and corn. A portion of the data was used to build and test the model's capability to explore the factor and the level at which the farmer decides to irrigate for future irrigation events. Both group and individual level behavior can be studied using HMMs. The study showed that the farmers cannot be classified into certain classes based on their irrigation decisions, but vary in their behavior from irrigation-to-irrigation across all years and crops. HMMs can be used to analyze which factor, and subsequently which level of that factor, the farmer most likely based the irrigation decision on. The study shows that the HMM is a capable tool to study a process
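The core HMM computation, inferring the most likely hidden decision sequence from observed variables, can be sketched with the Viterbi algorithm. The two hypothetical states ('wait', 'irrigate'), the discretized soil observations ('wet', 'dry'), and every probability below are invented for illustration, not taken from the paper:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for a discrete-observation HMM."""
    # trellis[t][s] = (best probability of a path ending in s, predecessor)
    trellis = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for o in obs[1:]:
        prev = trellis[-1]
        trellis.append({
            s: max((prev[p][0] * trans_p[p][s] * emit_p[s][o], p)
                   for p in states)
            for s in states
        })
    # Backtrack from the best final state.
    best = max(states, key=lambda s: trellis[-1][s][0])
    path = [best]
    for layer in reversed(trellis[1:]):
        path.append(layer[path[-1]][1])
    return list(reversed(path))
```

Given a run of dry observations, the decoded path flips from 'wait' to 'irrigate', which is the kind of inference the paper uses to link biophysical factor levels to irrigation decisions.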
Bornstein, Aaron M.; Daw, Nathaniel D.
How do we use our memories of the past to guide decisions we've never had to make before? Although extensive work describes how the brain learns to repeat rewarded actions, decisions can also be influenced by associations between stimuli or events not directly involving reward — such as when planning routes using a cognitive map or chess moves using predicted countermoves — and these sorts of associations are critical when deciding among novel options. This process is known as model-based decision making. While the learning of environmental relations that might support model-based decisions is well studied, and separately this sort of information has been inferred to impact decisions, there is little evidence concerning the full cycle by which such associations are acquired and drive choices. Of particular interest is whether decisions are directly supported by the same mnemonic systems characterized for relational learning more generally, or instead rely on other, specialized representations. Here, building on our previous work, which isolated dual representations underlying sequential predictive learning, we directly demonstrate that one such representation, encoded by the hippocampal memory system and adjacent cortical structures, supports goal-directed decisions. Using interleaved learning and decision tasks, we monitor predictive learning directly and also trace its influence on decisions for reward. We quantitatively compare the learning processes underlying multiple behavioral and fMRI observables using computational model fits. Across both tasks, a quantitatively consistent learning process explains reaction times, choices, and both expectation- and surprise-related neural activity. The same hippocampal and ventral stream regions engaged in anticipating stimuli during learning are also engaged in proportion to the difficulty of decisions. These results support a role for predictive associations learned by the hippocampal memory system to be recalled
The omnipresent need for optimisation requires constant improvements of companies’ business processes (BPs). Minimising the risk of inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and “what-if” scenarios. An effectual business process simulations software (BPSS) is a prerequisite for accurate analysis of an BP. Characterisation of an BPSS tool is a challenging task due to the complex selection criteria that includes quality of visual aspects, simulation capabilities, statistical facilities, quality reporting etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed aiding the BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking of BPSS based on their technical characteristics by employing DEX and qualitative to quantitative (QQ) methodology. Consequently, the decision expert feeds the required information in a systematic and user friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS with respect to currently available results. PMID:26871694
Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana
The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually achieved by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effective business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterising a BPSS tool is a challenging task due to the complex selection criteria, which include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extensible, allowing new criteria to be added to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools against currently available results.
He, Xiaoxing Z; Lyons, John S; Heinemann, Allen W
We studied 1492 children in state custody over a 6-month period to investigate the relationship between children's hospital admissions and the crisis workers' clinical assessment. A 27-item standardized decision-support tool [the Childhood Severity of Psychiatric Illness (CSPI)] was used to evaluate the symptoms, risk factors, functioning, comorbidity, and system characteristics. The CSPI has been shown to have a reliability range from 0.70 to 0.80 using intraclass correlations. Logistic regression was used to calculate age-adjusted odds ratios (AOR) of hospitalization, their 95% confidence intervals, and corresponding P values. The results showed that risk factors, symptoms, functioning, comorbidities, and system characteristics were all associated with hospital admissions. Children with a recent suicide attempt, severe danger to others, or history of running away from home/treatment settings were more likely to be hospitalized (respective AOR=12.7, P<.0001; AOR=32.3, P<.0001; AOR=3.0, P=.001). In addition, hospitalization was inversely associated with caregiver knowledge of children (AOR=0.2, P=.01) and multisystem needs (AOR=0.3, P=.04). The decision to hospitalize children psychiatrically appears to be complex. As predicted, risk behaviors and severe symptoms were independent predictors of children's hospital admissions. Interestingly, the capacity of the caregiver and the children's involvement in multiple systems also predict children's hospital admissions.
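The age-adjusted odds ratios in the abstract come from logistic regression; as a minimal, hypothetical illustration of the underlying quantity, the sketch below computes a crude (unadjusted) odds ratio and Wald 95% confidence interval from a 2x2 table. The counts are invented for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed & hospitalized,   b = exposed & not hospitalized,
    c = unexposed & hospitalized, d = unexposed & not hospitalized."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(40, 60, 25, 375)
```

A full replication of the paper's analysis would fit a logistic regression with age as a covariate and exponentiate the exposure coefficient.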
Carbone, Giuseppe; Giannoccaro, Ilaria
A continuous-time Markov process is proposed to analyze how a group of humans solves a complex task, consisting of the search for the optimal set of decisions on a fitness landscape. Individuals change their opinions driven by two different forces: (i) self-interest, which pushes them to increase their own fitness values, and (ii) social interactions, which push individuals to reduce the diversity of their opinions in order to reach consensus. Results show that the performance of the group is strongly affected by the strength of social interactions and by the level of knowledge of the individuals. Increasing the strength of social interactions improves the performance of the team. However, overly strong social interactions slow down the search for the optimal solution and worsen the performance of the group. In particular, we find that the threshold value of the social interaction strength, which leads to the emergence of a superior intelligence of the group, is just the critical threshold at which consensus among the members sets in. We also prove that a moderate level of knowledge is already enough to guarantee high performance of the group in making decisions.
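A minimal discrete-time analogue of the dynamics described above (not the paper's continuous-time formulation): agents hold binary opinion vectors on an assumed additive fitness landscape, and each update is either a self-interested hill-climbing flip or a social copy from a random peer. All parameters and the landscape are assumptions for illustration.

```python
import random

random.seed(0)
D, N, STEPS = 8, 10, 300
SOCIAL = 0.3                                      # strength of social interactions (assumed)
weights = [random.random() for _ in range(D)]     # simple additive "fitness landscape"

def fitness(bits):
    return sum(w for w, b in zip(weights, bits) if b)

agents = [[random.randint(0, 1) for _ in range(D)] for _ in range(N)]

for _ in range(STEPS):
    i = random.randrange(N)
    if random.random() < SOCIAL:
        # social force: copy one bit from a random peer, reducing opinion diversity
        j, k = random.randrange(N), random.randrange(D)
        agents[i][k] = agents[j][k]
    else:
        # self-interest: flip a bit only if it improves the agent's own fitness
        k = random.randrange(D)
        trial = agents[i][:]
        trial[k] ^= 1
        if fitness(trial) > fitness(agents[i]):
            agents[i] = trial

best = max(fitness(a) for a in agents)
optimum = sum(weights)
```

Sweeping `SOCIAL` in such a toy model lets one observe the trade-off the paper quantifies: too little coupling and the group never converges, too much and it locks in prematurely.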
Loo, Chu Kiong
A novel quantum-inspired decision-making approach for intelligent agents is proposed. A formal, generalized solution to the problem is given. Mathematically, the proposed model is capable of modeling higher-dimensional decision problems than previous research. Four experiments are conducted, and for each experiment both the empirical results and the proposed model's results are given. The experiments showed that the results of the proposed model agree with the empirical results perfectly. The proposed model provides a new direction for researchers addressing the cognitive basis of intelligent agent design.
Doherty, John; Simmons, Craig T.
Groundwater models are commonly used as a basis for environmental decision-making. There has been discussion and debate in recent times regarding the issue of model simplicity and complexity. This paper contributes to this ongoing discourse. The selection of an appropriate level of model structural and parameterization complexity is not a simple matter. Although the metrics on which such selection should be based are simple, there are many competing, and often unquantifiable, considerations which must be taken into account as these metrics are applied. A unified conceptual framework is introduced and described, intended to underpin groundwater modelling in decision support with a direct focus on matters of model simplicity and complexity.
Rajavel, Rajkumar; Thangarathinam, Mala
Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the pre-request context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision-making model for optimizing negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, using the stochastic decision tree scenario, can maximize the revenue among the participants in the cloud service negotiation framework.
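The multistage Markov decision formulation can be sketched as finite-horizon backward induction over a small, hypothetical negotiation tree (the states, actions, probabilities and rewards below are invented; the paper's actual model is richer).

```python
def backward_induction(stages):
    """stages: list of dicts, one per stage, mapping state -> {action: [(p, r, next_state), ...]};
    the last stage holds terminal states (empty action dicts).
    Returns stage-0 state values and a per-stage optimal policy."""
    V = {s: 0.0 for s in stages[-1]}          # terminal values
    policy = []
    for t in range(len(stages) - 2, -1, -1):
        Vt, pol = {}, {}
        for s, acts in stages[t].items():
            best_a, best_v = None, float("-inf")
            for a, outcomes in acts.items():
                v = sum(p * (r + V[ns]) for p, r, ns in outcomes)
                if v > best_v:
                    best_a, best_v = a, v
            Vt[s], pol[s] = best_v, best_a
        V = Vt
        policy.insert(0, pol)
    return V, policy

# Hypothetical two-stage negotiation: concede now for a sure small payoff,
# or hold out with a 50% risk of break-off.
stages = [
    {"open": {"concede": [(1.0, 2.0, "deal")],
              "hold":    [(0.5, 5.0, "deal"), (0.5, 0.0, "breakoff")]}},
    {"deal": {}, "breakoff": {}},
]
V0, policy = backward_induction(stages)
```

With these numbers, holding out has expected value 2.5 versus 2.0 for conceding, so the induced policy holds.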
Yılmaz Balaman, Şebnem; Selim, Hasan
The core driver of this study is the design of anaerobic digestion based biomass-to-energy supply chains in a cost-effective manner. To this end, a decision model is developed. The model is based on fuzzy multi-objective decision making, in order to simultaneously optimize multiple economic objectives and tackle the inherent uncertainties in the parameters and in the decision makers' aspiration levels for the goals. The viability of the decision model is explored with computational experiments on a real-world biomass-to-energy supply chain, and further analyses are performed to observe the effects of different conditions. To this aim, scenario analyses are conducted to investigate the effects of energy crop utilization and operational costs on supply chain structure and performance measures.
Muralidharan, R.; Baron, S.
A report is given on ongoing efforts to model the human operator in the context of the task during the enroute/return phases of the ground-based control of multiple flights of remotely piloted vehicles (RPVs). The approach employed here uses models that have their analytical bases in control theory and in statistical estimation and decision theory. In particular, it draws heavily on the models and concepts of the optimal control model (OCM) of the human operator. The OCM is being extended into a combined monitoring, decision, and control model (DEMON) of the human operator by infusing decision-theoretic notions that make it suitable for application to problems in which human control actions are infrequent and in which monitoring and decision-making are the operator's main activities. Some results obtained with a specialized version of DEMON for the RPV control problem are included.
Rubrichi, Stefania; Rognoni, Carla; Sacchi, Lucia; Parimbelli, Enea; Napolitano, Carlo; Mazzanti, Andrea; Quaglini, Silvana
The inclusion of patients' perspectives in clinical practice has become an important matter for health professionals, in view of the increasing attention to patient-centered care. In this regard, this report illustrates a method for developing a visual aid that supports the physician in the process of informing patients about a critical decisional problem. In particular, we focused on interpretation of the results of decision trees embedding Markov models implemented with the commercial tool TreeAge Pro. Starting from patient-level simulations and exploiting some advanced functionalities of TreeAge Pro, we combined results to produce a novel graphical output that represents the distributions of outcomes over the lifetime for the different decision options, thus becoming a more informative decision support in a context of shared decision making. The training example used to illustrate the method is a decision tree for thromboembolism risk prevention in patients with nonvalvular atrial fibrillation.
The basic elements of social decision scheme (SDS) theory are individual preferences, group preference compositions (distinguishable distributions), patterns of group influence (decision schemes, social combination rules), and collective responses (group decisions, judgments, solutions, and the like). The theory provides a framework for addressing two fundamental questions in the study of group performance: How are individual resources combined to yield a group response (the individual-into-group problem)? What are the implications of empirical observations under one set of circumstances for other conditions where data do not exist (the sparse data problem)? Several prescriptions for how to conduct fruitful group research are contained in the SDS tradition: make precise theoretical statements, provide strong and competitive tests of theories, and interpret empirical findings in the context of robust process models.
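The SDS machinery, a distribution over distinguishable member compositions combined with a decision-scheme matrix, can be illustrated for two options and a simple majority scheme (a standard textbook scheme, not necessarily one analyzed in this paper).

```python
from math import comb

def sds_group_distribution(p, r, scheme):
    """p: probability an individual prefers option A (two-option case);
    r: group size; scheme(nA, nB): probability the group picks A given
    a composition of nA supporters of A and nB of B.
    Returns the probability that the group chooses A."""
    total = 0.0
    for nA in range(r + 1):
        pi = comb(r, nA) * p**nA * (1 - p)**(r - nA)   # P(this composition arises)
        total += pi * scheme(nA, r - nA)
    return total

# "Majority wins" decision scheme (ties split evenly; no ties with odd r)
majority = lambda nA, nB: 1.0 if nA > nB else (0.5 if nA == nB else 0.0)
pA = sds_group_distribution(0.6, 5, majority)
```

For five-person groups with individual preference 0.6, the majority scheme amplifies the group-level probability of choosing A to about 0.683, illustrating the individual-into-group mapping.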
Zhang, Zhen; Guo, Chonghui
Due to the uncertainty of the decision environment and the lack of knowledge, decision-makers may use uncertain linguistic preference relations to express their preferences over alternatives and criteria. For group decision-making problems with preference relations, it is important to consider the individual consistency and the group consensus before aggregating the preference information. In this paper, consistency and consensus models for group decision-making with uncertain 2-tuple linguistic preference relations (U2TLPRs) are investigated. First of all, a formula which can construct a consistent U2TLPR from the original preference relation is presented. Based on the consistent preference relation, the individual consistency index for a U2TLPR is defined. An iterative algorithm is then developed to improve the individual consistency of a U2TLPR. To help decision-makers reach consensus in group decision-making under uncertain linguistic environment, the individual consensus and group consensus indices for group decision-making with U2TLPRs are defined. Based on the two indices, an algorithm for consensus reaching in group decision-making with U2TLPRs is also developed. Finally, two examples are provided to illustrate the effectiveness of the proposed algorithms.
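As a simplified crisp analogue of a group consensus index (the paper works with uncertain 2-tuple linguistic preference relations, which this sketch does not reproduce), one can measure how far each individual's additive preference matrix deviates from the group average:

```python
def consensus_index(prefs):
    """Group consensus for crisp additive preference matrices in [0, 1]:
    1 minus the mean absolute deviation of each individual matrix
    from the element-wise group average."""
    m, n = len(prefs), len(prefs[0])
    group = [[sum(P[i][j] for P in prefs) / m for j in range(n)] for i in range(n)]
    dev = sum(abs(P[i][j] - group[i][j])
              for P in prefs for i in range(n) for j in range(n))
    return 1.0 - dev / (m * n * n), group

# Two hypothetical experts comparing two alternatives
P1 = [[0.5, 0.7], [0.3, 0.5]]
P2 = [[0.5, 0.6], [0.4, 0.5]]
ci, group = consensus_index([P1, P2])
```

A consensus-reaching algorithm of the kind described above would iteratively move the most deviant entries toward the group matrix until `ci` exceeds a preset threshold.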
Murphy, J.; Lammers, R. B.; Proussevitch, A. A.; Ozik, J.; Altaweel, M.; Collier, N. T.; Alessa, L.; Kliskey, A. D.
The global hydrological cycle intersects with human decision making at multiple scales, from dams and irrigation works to the taps in individuals' homes. Residential water consumers are commonly encouraged to conserve; these messages are heard against a background of individual values and conceptions about water quality, uses, and availability. The degree to which these values impact the larger hydrological dynamics, the way changes in those values impact the hydrological cycle through time, and the feedbacks by which water availability and quality in turn shape those values, are not well explored. To investigate this domain we employ a global-scale water balance model (WBM) coupled with a social-science-grounded agent-based model (ABM). The integration of a hydrological model with an agent-based model allows us to explore driving factors in the dynamics of coupled human-natural systems. From the perspective of the physical hydrologist, the ABM offers a richer means of incorporating the human decisions that drive the hydrological system; from the view of the social scientist, a physically-based hydrological model allows the decisions of the agents to play out against constraints faithful to the real world. We apply the interconnected models to a study of Tucson, Arizona, USA, and its role in the larger Colorado River system. Our core concept is Technology-Induced Environmental Distancing (TIED), which posits that layers of technology can insulate consumers from direct knowledge of a resource. In Tucson, multiple infrastructure and institutional layers have arguably increased the conceptual distance between individuals and their water supply, offering a test case of the TIED framework. Our coupled simulation allows us to show how the larger system transforms a resource with high temporal and spatial variability into a consumer constant, and the effects of this transformation on the regional system. We use this to explore how pricing, messaging, and
Lund, J. R.; Rosenberg, D.
We present an economic-engineering method to estimate urban water use demands with intermittent water supplies. A two-stage, probabilistic optimization formulation includes a wide variety of water supply enhancement and conservation actions that individual households can adopt to meet multiple water quality uses with uncertain water availability. We embed the optimization in Monte-Carlo simulations to show aggregate effects at a utility (citywide) scale for a population of user conditions and decisions. Parametric analysis provides derivations of supply curves to subsidize conservation, demand responses to alternative pricing, and customer willingness-to-pay to avoid shortages. Results show a good empirical fit for the average and distribution of billed residential water use in Amman, Jordan. Additional outputs give likely market penetration rates for household conservation actions, associated water savings, and subsidies required to entice further adoption. We discuss new insights to size, target, market, and finance conservation programs and interpret a demand curve with block pricing.
Capelo, Carlos; Dias, Joao Ferreira
This study aims to contribute to a theoretical model that explains the effectiveness of learning and decision-making processes by means of a feedback and mental-models perspective. With appropriate mental models, managers should be able to improve their capacity to deal with dynamically complex contexts, in order to achieve long-term…
Muhlheim, Michael David; Belles, Randy; Denning, Richard S.
Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. Generating decisions requires a structured, coherent decision-making process. The overall objective of this work is for the generalized framework to be adopted into an autonomous decision-making framework and tailored to the specific requirements of various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment of plant operating parameters against reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory. These allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute
Sojda, Richard S.; Chen, Serena H.; El Sawah, Sondoss; Guillaume, Joseph H.A.; Jakeman, A.J.; Lautenbach, Sven; McIntosh, Brian S.; Rizzoli, A.E.; Seppelt, Ralf; Struss, Peter; Voinov, Alexey; Volk, Martin
Two of the basic tenets of decision support system efforts are to help identify and structure the decisions to be supported, and then to provide analysis of how those decisions might best be made. One example from wetland management is that wildlife biologists must decide when to draw down water levels to optimise aquatic invertebrates as food for breeding ducks. Once such a decision is identified, a system or tool could be developed to help them make that decision in the face of current and projected climate conditions. We examined a random sample of 100 papers published from 2001-2011 in Environmental Modelling and Software that used the phrase “decision support system” or “decision support tool”, and which are characteristic of different sectors. In our review, 41% of the systems and tools related to the water resources sector, 34% to agriculture, and 22% to the conservation of fish, wildlife, and protected area management. Only 60% of the papers were deemed to be reporting on a DSS; the remainder had not directly identified a specific decision to be supported. We also report on the techniques that were used to identify the decisions, such as formal survey, focus group, expert opinion, or sole judgment of the author(s). The primary underlying modelling system, e.g., expert system, agent-based model, Bayesian belief network, geographical information system (GIS), and the like, was categorised next. Finally, since decision support should typically target some aspect of unstructured decisions, we subjectively determined to what degree this was the case. In only 23% of the papers reviewed did the system appear to tackle unstructured decisions. This knowledge should be useful in helping workers in the field develop more effective systems and tools, especially by being exposed to the approaches in different, but related, disciplines. We propose that a standard blueprint for reporting on DSS be developed for
Tsalatsanis, Athanasios; Hozo, Iztok; Kumar, Ambuj; Djulbegovic, Benjamin
Dual Processing Theories (DPT) assume that human cognition is governed by two distinct types of processes typically referred to as type 1 (intuitive) and type 2 (deliberative). Based on DPT we have derived a Dual Processing Model (DPM) to describe and explain therapeutic medical decision-making. The DPM model indicates that doctors decide to treat when treatment benefits outweigh its harms, which occurs when the probability of the disease is greater than the so called “threshold probability” at which treatment benefits are equal to treatment harms. Here we extend our work to include a wider class of decision problems that involve diagnostic testing. We illustrate applicability of the proposed model in a typical clinical scenario considering the management of a patient with prostate cancer. To that end, we calculate and compare two types of decision-thresholds: one that adheres to expected utility theory (EUT) and the second according to DPM. Our results showed that the decisions to administer a diagnostic test could be better explained using the DPM threshold. This is because such decisions depend on objective evidence of test/treatment benefits and harms as well as type 1 cognition of benefits and harms, which are not considered under EUT. Given that type 1 processes are unique to each decision-maker, this means that the DPM threshold will vary among different individuals. We also showed that when type 1 processes exclusively dominate decisions, ordering a diagnostic test does not affect a decision; the decision is based on the assessment of benefits and harms of treatment.
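The EUT threshold mentioned above is the classic treatment threshold at which expected benefit equals expected harm, p* = H/(B + H). A minimal sketch follows (the benefit/harm numbers are hypothetical; the paper's DPM threshold additionally incorporates type 1 terms not modeled here).

```python
def treatment_threshold(benefit, harm):
    """EUT threshold probability of disease at which the expected
    benefit of treating, p*B, equals the expected harm, (1-p)*H:
    p* = H / (B + H)."""
    return harm / (benefit + harm)

def decide(p_disease, benefit, harm):
    """Treat only above the threshold probability."""
    return "treat" if p_disease > treatment_threshold(benefit, harm) else "do not treat"

# Hypothetical utilities: treating a true case gains 9, treating a healthy patient costs 1
p_star = treatment_threshold(9.0, 1.0)   # threshold = 0.1
```

The same algebra extends to testing: a test is worth ordering only when its result can move the probability of disease across this threshold.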
A solid understanding of human decision-making is essential to analyze the complexity of coupled human and natural systems (CHANS) and inform policies to promote resilience in the face of environmental change. Human decisions drive and/or mediate the interactions and feedbacks, and contribute to the heterogeneity and non-linearity that characterize CHANS. However, human decision-making is usually over-simplistically modeled, whereby human agents are represented deterministically either as dumb or clairvoyant decision-makers. Decision-making models fall short in integrating both environmental and human behavioral drivers and, concerning the latter, tend to focus on only one category, e.g. economic, cultural, or psychological. Furthermore, these models render a linear decision-making process and therefore fail to account for the recursive co-evolutionary dynamics in CHANS. As a result, these models constitute only a weak basis for policy-making. There is therefore scope and an urgent need for better approaches to human decision-making, to produce the knowledge that can inform vulnerability reduction policies in the face of environmental change. This presentation synthesizes the current state of the art of modelling human decision-making in CHANS, with particular reference to agricultural systems, and delineates how the above-mentioned shortcomings can be overcome. Through examples from research on pesticide use and adaptation to climate change, both based on the integrative agent-centered framework (Feola and Binder, 2010), the approach to an improved understanding of human agents in CHANS is illustrated. This entails an integrative approach, a focus on behavioral dynamics more than states, feedbacks between individual and system levels, and openness to heterogeneity.
Duschl, Richard A.; Wright, Emmett
The focus of this study was to investigate the manner and the degree to which science teachers consider the nature of the subject matter in their decision making addressing the planning and the delivery of instructional tasks. An assumption of the study is that considerations for the nature of the subject matter should be a factor in a teacher's decision making about what to teach and how to teach. Relevant research literature reviewed includes (1) human decision making and the development of cognitive models of reality, (2) modern philosophies of science, and (3) philosophy of science and science education. Methods of data collection and of data analysis followed Spradley's Developmental Research Sequence guidelines for conducting ethnographic research. Validity of research findings was established from the triangulation of observations, interviews, and documents and surveys. The goal of the research was the development of grounded hypotheses about science teachers' pedagogical decision making. Based on the results of this study it is hypothesized that science teachers' decision-making models of reality for the selection, implementation, and development of instructional tasks are dominated by considerations for (a) student development, (b) curriculum guide objectives, and (c) pressures of accountability. Little, if any, consideration is given to the nature of the subject matter by the science teachers in decision making. Implications exist for the disenfranchisement of teachers from the task of making decisions concerning what to teach.
Kadiyala, M D M; Nedumaran, S; Singh, Piara; S, Chukka; Irshad, Mohammad A; Bantilan, M C S
The semi-arid tropical (SAT) regions of India are suffering from low productivity, which may be further aggravated by anticipated climate change. The present study analyzes the spatial variability of climate change impacts on groundnut yields in the Anantapur district of India and examines the relative contribution of adaptation strategies. For this purpose, a web-based decision support tool that integrates a crop simulation model and a Geographical Information System (GIS) was developed to assist agronomic decision making; this tool is scalable to any location and crop. The climate change projections of five global climate models (GCMs) relative to the 1980-2010 baseline for Anantapur district indicate an increase in rainfall activity of 10.6 to 25% during the mid-century period (2040-69) with RCP 8.5. The GCMs also predict warming of 1.4 to 2.4°C by 2069 in the study region. The spatial crop responses to the projected climate indicate a decrease in groundnut yields with four GCMs (MPI-ESM-MR, MIROC5, CCSM4 and HadGEM2-ES) and a contrasting 6.3% increase with the GCM GFDL-ESM2M. Simulation studies using the CROPGRO-Peanut model reveal that groundnut yields can be increased on average by 1.0%, 5.0%, 14.4%, and 20.2% by adopting the adaptation options of heat tolerance, drought-tolerant cultivars, supplemental irrigation, and a combination of a drought-tolerant cultivar and supplemental irrigation, respectively. The spatial patterns of relative benefits of adaptation options were geographically different, and the greatest benefits can be achieved by adopting new cultivars having drought tolerance together with the application of one supplemental irrigation at 60 days after sowing.
The evaluation and prioritization of Engineering Support Requests (ESRs) is a particularly difficult task at the Kennedy Space Center (KSC) -- Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed in the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESRs. The purpose of this project is to build on the existing methodologies and develop a multiple-criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESRs.
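The AHP step can be sketched with the common geometric-mean approximation of the priority vector and Saaty's consistency ratio (the example comparison matrix is hypothetical, not from the KSC model):

```python
import math

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random consistency indices

def ahp_priorities(A):
    """Priority weights from a pairwise comparison matrix via the
    geometric-mean (row) approximation, plus Saaty's consistency ratio."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    s = sum(gm)
    w = [g / s for g in gm]
    # approximate principal eigenvalue as the mean of (A w)_i / w_i
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    cr = ci / RI[n] if RI[n] else 0.0
    return w, cr

# Hypothetical 3-criterion comparison (e.g. Safety vs. Cost vs. Reliability)
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, cr = ahp_priorities(A)
```

A consistency ratio below 0.1 is conventionally taken to mean the judgments are acceptably consistent; the resulting weights would then score each ESR across criteria.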
Adriaenssens, Veronique; De Baets, Bernard; Goethals, Peter L M; De Pauw, Niels
To facilitate decision support in ecosystem management, ecological expertise and site-specific data need to be integrated. Fuzzy logic can deal with highly variable, linguistic, vague and uncertain data or knowledge and, therefore, has the ability to allow for a logical, reliable and transparent information stream from data collection down to data usage in decision-making. Several environmental applications already involve the use of fuzzy logic. Most of these applications have been set up by trial and error and are mainly limited to the domain of environmental assessment. In this article, applications of fuzzy logic for decision support in ecosystem management are reviewed and assessed, with an emphasis on rule-based models. In particular, the identification, optimisation, validation, interpretability and uncertainty aspects of fuzzy rule-based models for decision support in ecosystem management are discussed.
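A minimal Mamdani-style fuzzy rule-based sketch of the kind reviewed here, with triangular memberships and discrete centroid defuzzification; the variable, rule shapes and ranges are all invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def assess(oxygen):
    """Toy Mamdani rules mapping dissolved oxygen (mg/L) to a 0-100
    habitat-suitability score (all shapes and ranges are assumptions):
      R1: IF oxygen is LOW  THEN suitability is POOR
      R2: IF oxygen is HIGH THEN suitability is GOOD"""
    low  = tri(oxygen, -1, 0, 6)       # membership of "low oxygen"
    high = tri(oxygen, 4, 10, 16)      # membership of "high oxygen"
    num = den = 0.0
    for s in range(101):               # discrete centroid defuzzification
        mu = max(min(low,  tri(s, -1, 0, 60)),     # clipped POOR output set
                 min(high, tri(s, 40, 100, 161)))  # clipped GOOD output set
        num += s * mu
        den += mu
    return num / den if den else 50.0
```

Each rule's conclusion is clipped by its antecedent's membership (min), the rules are combined by max, and the centroid of the combined set gives the crisp score, which is the transparent inference chain the review emphasizes.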
Lamontagne, Julie; Beaulieu, Marie; Arcand, Marcel
The elderly in palliative care are confronted with difficult decisions relating to treatments. The philosophy of palliative care, namely, including the patient and his/her family right away, leads the doctor to consult with the two parties involved when choosing a treatment. As no theoretical model allows us to understand how the decision-making process hinges on the trio (a capable elderly person, a family caregiver, and the doctor) in a context of palliative care, we propose one which was developed from three strategies of document analysis: theoretical synthesis, theoretical analysis, and theoretical derivation. According to our model, the decision-making process depends on individual factors influencing the decision of the participant, expectations and attitudes as to the role, the level of confidence amongst the parties involved, the manner in which they communicate with each other, their mutual understanding of the clinical and ethical issues, and, finally, their ability to cooperate.
Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent
This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km2 mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) on 6 hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony. The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the
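The split-sample evaluation described here typically relies on streamflow error metrics such as the Nash-Sutcliffe efficiency; a minimal sketch (the metric choice is an assumption, since the abstract does not name its exact criteria):

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, <= 0 is no better than
    predicting the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

obs = [1.0, 2.0, 3.0, 4.0, 5.0]             # observed streamflow (arbitrary units)
perfect = nash_sutcliffe(obs, obs)           # perfect simulation
mean_model = nash_sutcliffe(obs, [3.0] * 5)  # constant-mean "model"
```

In a split-sample test the metric is computed on a period (or catchment) not used for calibration, which is what makes the transferability claim meaningful.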
in this field is Bayesian inference. Pearl [Ref 8] explains the foundations of Bayesian methods and develops these techniques for use in decision making...identify the appropriate utility functions and probability vectors over the outcomes. C. PROBABILISTIC MODELING Pearl [Ref 8] develops the use of Bayesian ...inferences to drive decision making by developing a Bayesian network and applying facts until an effect is observed. The effect is a probability
Background During a mass casualty incident (MCI), evacuation of patients to the appropriate health care facility is critical to survival. Despite this, no existing system provides the evidence required to make informed evacuation decisions from the scene of the incident. To mitigate this absence and enable more informed decision making, a web-based spatial decision support system (SDSS) was developed. This system supports decision making by providing data regarding hospital proximity, capacity, and treatment specializations to decision makers at the scene of the incident. Methods This web-based SDSS utilizes pre-calculated driving times to estimate the actual driving time to each hospital within the inclusive trauma system of the large metropolitan region within which it is situated. In calculating and displaying its results, the model incorporates both road network and hospital data (e.g. capacity, treatment specialties, etc.), and produces results in a matter of seconds, as is required in an MCI situation. In addition, its application interface allows the user to map the incident location and assists in the execution of triage decisions. Results Upon running the model, driving time from the MCI location to the surrounding hospitals is quickly displayed alongside information regarding hospital capacity and capability, thereby assisting the user in the decision-making process. Conclusions The use of SDSS in the prioritization of MCI evacuation decision making is potentially valuable in cases of mass casualty. The key to this model is the utilization of pre-calculated driving times from each hospital in the region to each point on the road network. The incorporation of real-time traffic and hospital capacity data would further improve this model. PMID:21663636
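The core ranking step of such an SDSS, selecting hospitals by pre-calculated driving time subject to capacity and capability constraints, can be sketched as follows (the hospital records and field names are invented for illustration):

```python
# Each record carries a pre-computed driving time (minutes) from the
# incident's road-network node, plus free beds and treatment specialties.
hospitals = [
    {"name": "A", "minutes": 12, "beds": 0, "specialties": {"trauma"}},
    {"name": "B", "minutes": 18, "beds": 5, "specialties": {"trauma", "burn"}},
    {"name": "C", "minutes": 9,  "beds": 3, "specialties": {"cardiac"}},
]

def rank_hospitals(hospitals, needed_specialty):
    """Keep hospitals with free beds and the needed specialty, nearest first."""
    eligible = [h for h in hospitals
                if h["beds"] > 0 and needed_specialty in h["specialties"]]
    return sorted(eligible, key=lambda h: h["minutes"])

best = rank_hospitals(hospitals, "trauma")[0]["name"]
```

Because the driving times are looked up rather than computed, the ranking is returned in the seconds-scale response time the abstract emphasizes.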
Lee, Saro; Park, Inhye
Subsidence of ground caused by underground mines poses hazards to human life and property. This study analyzed ground-subsidence hazard using factors that can affect ground subsidence and a decision tree approach in a geographic information system (GIS). The study area was Taebaek, Gangwon-do, Korea, where many abandoned underground coal mines exist. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 50/50 for training and validation of the models. A data-mining classification technique was applied to the GSH mapping, and decision trees were constructed using the chi-squared automatic interaction detector (CHAID) and the quick, unbiased, and efficient statistical tree (QUEST) algorithms. The frequency ratio model, a probabilistic model, was also applied to the GSH mapping for comparison. The resulting GSH maps were validated using area-under-the-curve (AUC) analysis with the subsidence area data that had not been used for training the model. The highest accuracy was achieved by the decision tree model using the CHAID algorithm (94.01%), compared with the QUEST algorithm (90.37%) and the frequency ratio model (86.70%). These accuracies are higher than previously reported results for decision trees. Decision tree methods can therefore be used efficiently for GSH analysis and might be widely used for the prediction of various spatial events.
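The frequency ratio model and the AUC validation mentioned here are simple enough to sketch directly (the toy raster below is invented; the CHAID/QUEST trees themselves are not reproduced):

```python
def frequency_ratio(cells, factor):
    """FR per class = (share of subsidence cells in the class) /
    (share of all cells in the class); FR > 1 marks hazard-prone classes."""
    n_all = len(cells)
    n_sub = sum(c["subsidence"] for c in cells)
    fr = {}
    for k in {c[factor] for c in cells}:
        in_k = [c for c in cells if c[factor] == k]
        sub_k = sum(c["subsidence"] for c in in_k)
        fr[k] = (sub_k / n_sub) / (len(in_k) / n_all)
    return fr

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney rank formulation."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy raster: geology class vs. observed subsidence (1 = subsided cell).
cells = [{"geology": g, "subsidence": s} for g, s in
         [("coal", 1), ("coal", 1), ("coal", 1), ("coal", 0),
          ("granite", 1), ("granite", 0), ("granite", 0), ("granite", 0)]]
fr = frequency_ratio(cells, "geology")
scores = [fr[c["geology"]] for c in cells]
hazard_auc = auc([c["subsidence"] for c in cells], scores)
```

In the study the AUC is computed on the held-out half of the subsidence cells, so it measures predictive rather than descriptive accuracy.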
This report reviews social and behavioral science models and techniques for their possible use in understanding and predicting consumer energy decision making and behaviors. A number of models and techniques have been developed that address different aspects of the decision process, use different theoretical bases and approaches, and have been aimed at different audiences. Three major areas of discussion were selected: (1) models of adaptation to social change, (2) decision making and choice, and (3) diffusion of innovation. Within these three areas, the contributions of psychologists, sociologists, economists, marketing researchers, and others were reviewed. Five primary components of the models were identified and compared. The components are: (1) situational characteristics, (2) product characteristics, (3) individual characteristics, (4) social influences, and (5) the interaction or decision rules. The explicit use of behavioral and social science models in energy decision-making and behavior studies has been limited. Examples are given of a small number of energy studies which applied and tested existing models in studying the adoption of energy conservation behaviors and technologies, and solar technology.
Simen, Patrick; Vlasov, Ksenia; Papadakis, Samantha
Weber's law is the canonical scale-invariance law in psychology: when the intensities of 2 stimuli are scaled by any value k, the just-noticeable-difference between them also scales by k. A diffusion model that approximates a spike-counting process accounts for Weber's law (Link, 1992), but there exist surprising corollaries of this account that have not yet been described or tested. We show that (a) this spike-counting diffusion model predicts time-scale invariant decision time distributions in perceptual decision making, and time-scale invariant response time (RT) distributions in interval timing; (b) for 2-choice perceptual decisions, the model predicts equal accuracy but faster responding for stimulus pairs with equally scaled-up intensities; (c) the coefficient of variation (CV) of decision times should remain constant across average intensity scales, but should otherwise decrease as a specific function of stimulus discriminability and speed-accuracy trade-off; and (d) for timing tasks, RT CVs should be constant for all durations, and RT skewness should always equal 3 times the CV. We tested these predictions using visual, auditory and vibrotactile decision tasks and visual interval timing tasks in humans. The data conformed closely to the predictions in all modalities. These results support a unified theory of decision making and timing in terms of a common, underlying spike-counting process, compactly represented as a diffusion process.
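Corollaries (b) and (c) follow from standard closed-form results for the two-boundary diffusion model; a sketch under the spike-counting assumption that drift scales with the intensity difference and diffusion variance with the intensity sum (the bound and intensity values are arbitrary):

```python
import math

def ddm_accuracy(a, mu, sigma2):
    """Two-boundary diffusion: probability of absorbing at the correct bound."""
    return 1.0 / (1.0 + math.exp(-2.0 * a * mu / sigma2))

def ddm_mean_dt(a, mu, sigma2):
    """Mean decision time for symmetric bounds +/- a and drift mu."""
    return (a / mu) * math.tanh(a * mu / sigma2)

# Spike-counting assumption: drift ~ (I1 - I2), variance ~ (I1 + I2),
# so scaling both intensities by k leaves a*mu/sigma2 unchanged.
def predictions(i1, i2, a=1.0):
    mu, sigma2 = i1 - i2, i1 + i2
    return ddm_accuracy(a, mu, sigma2), ddm_mean_dt(a, mu, sigma2)

acc1, dt1 = predictions(10.0, 8.0)
acc2, dt2 = predictions(30.0, 24.0)   # both intensities scaled by k = 3
```

Scaling the pair by k leaves accuracy untouched while mean decision time shrinks by the factor k, which is exactly prediction (b).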
Hassan, Ahmed E
Many sites of ground water contamination rely heavily on complex numerical models of flow and transport to develop closure plans. This complexity has created a need for tools and approaches that can build confidence in model predictions and provide evidence that these predictions are sufficient for decision making. Confidence building is a long-term, iterative process and the author believes that this process should be termed model validation. Model validation is a process, not an end result. That is, the process of model validation cannot ensure acceptable prediction or quality of the model. Rather, it provides an important safeguard against faulty models or inadequately developed and tested models. If model results become the basis for decision making, then the validation process provides evidence that the model is valid for making decisions (not necessarily a true representation of reality). Validation, verification, and confirmation are concepts associated with ground water numerical models that not only do not represent established and generally accepted practices, but there is not even widespread agreement on the meaning of the terms as applied to models. This paper presents a review of model validation studies that pertain to ground water flow and transport modeling. Definitions, literature debates, previously proposed validation strategies, and conferences and symposia that focused on subsurface model validation are reviewed and discussed. The review is general and focuses on site-specific, predictive ground water models used for making decisions regarding remediation activities and site closure. The aim is to provide a reasonable starting point for hydrogeologists facing model validation for ground water systems, thus saving a significant amount of time, effort, and cost. This review is also aimed at reviving the issue of model validation in the hydrogeologic community and stimulating the thinking of researchers and practitioners to develop practical and
Cabello, María Eugenia; Ramos, Isidro
In this chapter, we present software variability management using conceptual models for diagnostic decision support information systems (DSS) development. We use a software product line (SPL) approach. In the construction of the SPL, two orthogonal variabilities are used to capture domain (i.e., diagnosis) and application domain (i.e., medical diagnosis) particularities. In this context, we describe how variability is managed by using our BOM (baseline-oriented modeling) approach. BOM is a framework that automatically generates applications as PRISMA software architectural models using model transformations and SPL techniques. We use model-driven architecture (MDA) to build domain models (i.e., computational-independent models, CIMs), which are automatically transformed into platform-independent models (PIMs) and then compiled to an executable application (i.e., a platform-specific model, PSM). In order to illustrate BOM, we focus on one type of information system, the decision support system, specifically in the diagnostic domain.
Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.
The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.
Lee, Seung Yup
As a key technology for enhancing the smart grid system, Phasor Measurement Unit (PMU) provides synchronized phasor measurements of voltages and currents of wide-area electric power grid. With various benefits from its application, one of the critical issues in utilizing PMUs is the optimal site selection of units. The main aim of this research is to develop a decision support system, which can be used in resource allocation task for smart grid system analysis. As an effort to suggest a robust decision model and standardize the decision modeling process, a harmonized modeling framework, which considers operational circumstances of component, is proposed in connection with a deterministic approach utilizing integer programming. With the results obtained from the optimal PMU placement problem, the advantages and potential that the harmonized modeling process possesses are assessed and discussed.
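The paper solves optimal PMU placement with integer programming; a greedy set-cover heuristic for the same observability problem (a common stand-in, not the authors' method) illustrates the structure of the decision:

```python
def greedy_pmu_placement(adjacency):
    """Greedy set cover: a PMU at a bus observes that bus and its neighbours.
    Returns an (approximately minimal) set of buses covering the whole grid."""
    cover = {b: {b} | set(nbrs) for b, nbrs in adjacency.items()}
    uncovered, placed = set(adjacency), []
    while uncovered:
        # Place the next PMU where it newly observes the most buses.
        best = max(adjacency, key=lambda b: len(cover[b] & uncovered))
        placed.append(best)
        uncovered -= cover[best]
    return placed

# 7-bus toy grid: PMUs at buses 2 and 6 together observe every bus.
grid = {1: [2], 2: [1, 3, 4], 3: [2], 4: [2, 5], 5: [4, 6], 6: [5, 7], 7: [6]}
pmus = greedy_pmu_placement(grid)
```

An integer program would certify optimality; the greedy pass only guarantees a logarithmic approximation, which is the trade-off the deterministic formulation in the paper avoids.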
Serfaty, D.; Kleinman, D. L.
Most research in modelling human information processing and decision making has been devoted to the case of the single human operator. In the present effort, concepts from the fields of organizational behavior, engineering psychology, team theory and mathematical modelling are merged in an attempt to consider first the case of two cooperating decisionmakers (the Dyad) in a multi-task environment. Rooted in the well-known Dynamic Decision Model (DDM), the normative descriptive approach brings basic cognitive and psychophysical characteristics inherent to human behavior into a team theoretic analytic framework. An experimental paradigm, involving teams in dynamic decision making tasks, is designed to produce the data with which to build the theoretical model.
Alamino, Roberto C.
This work introduces a model in which agents of a network act upon one another according to three different kinds of moral decisions. These decisions are based on an increasing level of sophistication in the empathy capacity of the agent, a hierarchy which we name Piaget’s ladder. The decision strategy of the agents is non-rational, in the sense that the strategies are arbitrarily fixed, and the model presents quenched disorder given by the distribution of its defining parameters. An analytical solution for this model is obtained in the large-system limit, as well as a leading-order correction for finite-size systems, which shows that typical realisations of the model develop a phase structure with both continuous and discontinuous non-thermal transitions.
Amor, J. P.; Dyer, J. S.
A statistical model designed to assist elementary school principals in the process of selecting educational areas which should receive additional emphasis is presented. For each educational area, the model produces an index number which represents the expected "value" per dollar spent on an instructional program appropriate for strengthening that…
Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-liv...
Sun, Zhaohao; Sun, Junqing; Meredith, Grant
Customer decision making (CDM) is an indispensable factor for web services. This article examines CDM in web services with a novel P6 model, which consists of the 6 Ps: privacy, perception, propensity, preference, personalization and promised experience. This model integrates the existing 6 P elements of the marketing mix as the system environment of CDM in web services. The new integrated P6 model deals with the inner world of the customer and incorporates what the customer thinks during the decision-making process. The proposed approach will facilitate the research and development of web services and decision support systems.
Orsini, Caitlin A; Moorman, David E; Young, Jared W; Setlow, Barry; Floresco, Stan B
Over the past 20 years there has been a growing interest in the neural underpinnings of cost/benefit decision-making. Recent studies with animal models have made considerable advances in our understanding of how different prefrontal, striatal, limbic and monoaminergic circuits interact to promote efficient risk/reward decision-making, and how dysfunction in these circuits underlies aberrant decision-making observed in numerous psychiatric disorders. This review will highlight recent findings from studies exploring these questions using a variety of behavioral assays, as well as molecular, pharmacological, neurophysiological, and translational approaches. We begin with a discussion of how neural systems related to decision subcomponents may interact to generate more complex decisions involving risk and uncertainty. This is followed by an overview of interactions between prefrontal-amygdala-dopamine and habenular circuits in regulating choice between certain and uncertain rewards and how different modes of dopamine transmission may contribute to these processes. These data will be compared with results from other studies investigating the contribution of some of these systems to guiding decision-making related to rewards vs. punishment. Lastly, we provide a brief summary of impairments in risk-related decision-making associated with psychiatric disorders, highlighting recent translational studies in laboratory animals.
Link, R; Kallel, S
Soft-decision-feedback MAP decoders are developed for joint source/channel decoding (JSCD) which uses the residual redundancy in two-dimensional sources. The source redundancy is described by a second order Markov model which is made available to the receiver for row-by-row decoding, wherein the output for one row is used to aid the decoding of the next row. Performance can be improved by generalizing so as to increase the vertical depth of the decoder. This is called sheet decoding, and entails generalizing trellis decoding of one-dimensional data to trellis decoding of two-dimensional data (2-D). The proposed soft-decision-feedback sheet decoder is based on the Bahl algorithm, and it is compared to a hard-decision-feedback sheet decoder which is based on the Viterbi algorithm. The method is applied to 3-bit DPCM picture transmission over a binary symmetric channel, and it is found that the soft-decision-feedback decoder with vertical depth V performs approximately as well as the hard-decision-feedback decoder with vertical depth V+1. Because the computational requirement of the decoders depends exponentially on the vertical depth, the soft-decision-feedbark decoder offers significant reduction in complexity. For standard monochrome Lena, at a channel bit error rate of 0.05, the V=1 and V=2 soft-decision-feedback decoder JSCD gains in RSNR are 5.0 and 6.3 dB, respectively.
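The hard-decision (Viterbi) side of the comparison can be illustrated in one dimension; the sketch below decodes a binary first-order Markov source over a BSC, a deliberate simplification of the paper's second-order, two-dimensional sheet decoder:

```python
import math

def viterbi_bsc(received, trans, p_err, prior=(0.5, 0.5)):
    """Hard-decision MAP sequence estimate of a binary first-order Markov
    source observed through a binary symmetric channel (crossover p_err)."""
    def llik(sym, obs):
        return math.log(1.0 - p_err) if sym == obs else math.log(p_err)

    scores = [math.log(prior[s]) + llik(s, received[0]) for s in (0, 1)]
    back = []                              # back-pointers, one list per step
    for obs in received[1:]:
        ptrs, nxt = [], []
        for s in (0, 1):
            cand = [scores[r] + math.log(trans[r][s]) for r in (0, 1)]
            best = 0 if cand[0] >= cand[1] else 1
            ptrs.append(best)
            nxt.append(cand[best] + llik(s, obs))
        back.append(ptrs)
        scores = nxt

    state = 0 if scores[0] >= scores[1] else 1
    path = [state]
    for ptrs in reversed(back):            # trace the best path backwards
        state = ptrs[state]
        path.append(state)
    return path[::-1]

# "Sticky" source (repeats its last symbol with probability 0.9): the lone 1
# in the received row is more plausibly channel noise than a real transition.
trans = [[0.9, 0.1], [0.1, 0.9]]
decoded = viterbi_bsc([0, 0, 1, 0, 0], trans, p_err=0.2)
```

The soft-decision (Bahl/BCJR) variant would instead propagate per-symbol posteriors to the next row, which is what buys the one-level reduction in vertical depth reported in the abstract.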
2008). Yasser Arafat, now relocated to Tunis, faced deep divisions of his own. 8 The Phalanges ...models adds a contextual dimension to the uncertainty. The Israeli population at a given time has either a favorable or unfavorable view of its
Schmolke, Amelie; Thorbek, Pernille; DeAngelis, Donald L.; Grimm, Volker
Ecological models are important for environmental decision support because they allow the consequences of alternative policies and management scenarios to be explored. However, current modeling practice is unsatisfactory. A literature review shows that the elements of good modeling practice have long been identified but are widely ignored. The reasons for this might include lack of involvement of decision makers, lack of incentives for modelers to follow good practice, and the use of inconsistent terminologies. As a strategy for the future, we propose a standard format for documenting models and their analyses: transparent and comprehensive ecological modeling (TRACE) documentation. This standard format will disclose all parts of the modeling process to scrutiny and make modeling itself more efficient and coherent.
Models have become an integral part of decision-making for many LUST and Brownfields sites if only because they form the basis of RCBA tiered assessments. Models, though, are based on a series of assumptions concerning how chemicals behave in the environment, how water flows thr...
Callanan, Gerard A.; Zimmerman, Monica
Reflecting the need for a better and broader understanding of the factors influencing the choices to enter into or exit an entrepreneurial career, this article applies a structured, normative model of career management to the career decision-making of entrepreneurs. The application of a structured model can assist career counselors, college career…
Christensen, Darren R.; Grace, Randolph C.
Grace and McLean (2006) proposed a decision model for acquisition of choice in concurrent chains which assumes that after reinforcement in a terminal link, subjects make a discrimination whether the preceding reinforcer delay was short or long relative to a criterion. Their model was subsequently extended by Christensen and Grace (2008, 2009a,…
Gerrard, Meg; Gibbons, Frederick X.; Houlihan, Amy E.; Stock, Michelle L.; Pomery, Elizabeth A.
Although dual-process models in cognitive, personality, and social psychology have stimulated a large body of research about analytic and heuristic modes of decision making, these models have seldom been applied to the study of adolescent risk behaviors. In addition, the developmental course of these two kinds of information processing, and their…
This comparison of two approaches to the development of computerized supports for decision making--expert systems and multivariate models--focuses on computerized systems that assist professionals with tasks related to diagnosis or classification in human services. Validation of both expert systems and statistical models is emphasized. (39…
in some applications (Kaelbling, Littman, & Cassandra, 1998; Neumann & Morgenstern, 1944; Russell & Norvig, 2003). However, research into people’s...scientists often model people’s decisions through machine learning techniques (Russell & Norvig, 2003). These models are based on statistical methods such as...A., & Kraus, S. (2011). Using aspiration adaptation theory to improve learning. In AAMAS (p. 423-430). Russell, S. J., & Norvig, P. (2003
Jung, Jae Yup
This study developed and empirically tested two related models of the occupational/career decision-making processes of gifted adolescents using a competing models strategy. The two models that guided the study, which acknowledged cultural orientations, social influences from the family, occupational/career values, and characteristics of…
Fernandez, Miguel A.; Korutcheva, Elka; de la Rubia, F. Javier
We study a diluted Blume-Capel model of 3-states sites as an attempt to understand how some social processes as cooperation or organization happen. For this aim, we study the effect of the complex network topology on the equilibrium properties of the model, by focusing on three different substrates: random graph, Watts-Strogatz and Newman substrates. Our computer simulations are in good agreement with the corresponding analytical results.
Beck, Nicole G; Conley, Gary; Kanner, Lisa; Mathias, Margaret
We present an urban runoff model designed for stormwater managers to quantify runoff reduction benefits of mitigation actions that has lower input data and user expertise requirements than most commonly used models. The stormwater tool to estimate load reductions (TELR) employs a semi-distributed approach, where landscape characteristics and process representation are spatially-lumped within urban catchments on the order of 100 acres (40 ha). Hydrologic computations use a set of metrics that describe a 30-year rainfall distribution, combined with well-tested algorithms for rainfall-runoff transformation and routing to generate average annual runoff estimates for each catchment. User inputs include the locations and specifications for a range of structural best management practice (BMP) types. The model was tested in a set of urban catchments within the Lake Tahoe Basin of California, USA, where modeled annual flows matched the observed flows to within 18% relative error for 5 of the 6 catchments and showed good regional performance for a suite of performance metrics. Comparisons with continuous simulation models showed an average of 3% difference from TELR predicted runoff for a range of hypothetical urban catchments. The model usually identified the dominant BMP outflow components within 5% relative error of event-based measured flow data and simulated the correct proportionality between outflow components. TELR has been implemented as a web-based platform for use by municipal stormwater managers to inform prioritization, report program benefits and meet regulatory reporting requirements (www.swtelr.com).
This paper advances an explanation for decision fiascoes that reflects recent theoretical trends and was developed in response to a growing body of research that has failed to substantiate the groupthink model (Janis, 1982). In this new framework, the lack of vigilance and preference for risk that characterizes groups contaminated by groupthink are attributed in large part to perceptions of collective efficacy that unduly exceed capability. High collective efficacy may also contribute to the negative framing of decisions and to certain administrative and structural organizational faults. In the making of critical decisions, these factors induce a preference for risk and a powerful concurrence seeking tendency that, facilitated by group polarization, crystallize around a decision option that is likely to fail. Implications for research and some evidence in support of this approach to the groupthink phenomenon are also discussed. Copyright 1998 Academic Press.
Pachur, Thorsten; Marinello, Gianmarco
How does expertise impact the selection of decision strategies? We asked airport customs officers and a novice control group to decide which passengers (described on several cue dimensions) they would submit to a search. Additionally, participants estimated the validities of the different cues. Then we modeled the decisions using compensatory strategies, which integrate many cues, and a noncompensatory heuristic, which relies on one-reason decision making. The majority of the customs officers were best described by the noncompensatory heuristic, whereas the majority of the novices were best described by a compensatory strategy. We also found that the experts' subjective cue validity estimates showed a higher dispersion across the cues and that differences in cue dispersion partially mediated differences in strategy use between experts and novices. Our results suggest that experts often rely on one-reason decision making and that expert-novice differences in strategy selection may reflect a response to the internal representation of the environment.
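The two strategy classes being compared, one-reason heuristics versus compensatory integration, can be sketched as follows (the cue values and validities are invented; the study's actual stimuli are passenger profiles):

```python
def take_the_best(cues_a, cues_b, validities):
    """One-reason decision making: consult cues in order of validity and
    stop at the first cue that discriminates between the two options."""
    order = sorted(range(len(validities)), key=lambda i: -validities[i])
    for i in order:
        if cues_a[i] != cues_b[i]:
            return "A" if cues_a[i] else "B"
    return "tie"

def weighted_additive(cues_a, cues_b, validities):
    """Compensatory strategy: integrate all cues, weighted by validity."""
    score_a = sum(v * c for v, c in zip(validities, cues_a))
    score_b = sum(v * c for v, c in zip(validities, cues_b))
    return "A" if score_a > score_b else "B" if score_b > score_a else "tie"

validities = [0.9, 0.6, 0.6, 0.6]           # one highly dispersed top cue
a, b = [1, 0, 0, 0], [0, 1, 1, 1]
ttb = take_the_best(a, b, validities)        # top cue decides alone
wadd = weighted_additive(a, b, validities)   # three weaker cues outvote it
```

The case where the two strategies disagree, as here, is exactly the kind of item that lets the modeling classify a participant as compensatory or noncompensatory.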
Jana, Biswajit; Mohanty, Sachi Nandan
The purpose of this paper is to enhance the applicability of fuzzy sets for developing mathematical models for decision making under uncertainty. In general, a decision-making process consists of four stages: collecting information from various sources, compiling the information, executing it, and finally taking the decision/action. Only fuzzy set theory is capable of quantifying linguistic expressions in mathematical form in complex situations. Intuitionistic fuzzy sets (IFSs) reflect the fact that the degree of non-membership is not always equal to one minus the degree of membership; there may be some degree of hesitation. Thus, there are situations where IFS theory provides a more meaningful and applicable way to cope with the imprecise information present in multiple-criteria decision-making problems. This paper emphasizes IFSs, which help in solving real-world problems under uncertainty.
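The defining property of an IFS, that membership and non-membership need not sum to one, is easy to make concrete (the Chen-Tan score used for ranking below is one common choice, not necessarily the paper's):

```python
class IFS:
    """Intuitionistic fuzzy value: membership mu, non-membership nu,
    and hesitation pi = 1 - mu - nu (Atanassov's formulation)."""
    def __init__(self, mu, nu):
        assert 0.0 <= mu + nu <= 1.0       # mu + nu may fall short of 1
        self.mu, self.nu = mu, nu

    @property
    def hesitation(self):
        return 1.0 - self.mu - self.nu

    def score(self):
        """Chen-Tan score S = mu - nu, a common ranking index in IFS MCDM."""
        return self.mu - self.nu

# Rank two alternatives on a single criterion:
a, b = IFS(0.6, 0.2), IFS(0.5, 0.4)        # a leaves 0.2 hesitation
best = "a" if a.score() > b.score() else "b"
```

The hesitation margin is what an ordinary fuzzy set cannot express, and it is the quantity the abstract appeals to when arguing for IFSs under uncertainty.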
Booth, Steven Richard
AET-2 has expertise in process modeling, economics, business case analysis, risk assessment, Lean/Six Sigma tools, and decision analysis to provide timely decision support to LANS leading to continuous improvement. This capability is critical during the current tight budgetary environment as LANS pushes to identify potential areas of cost savings and efficiencies. An important arena is business systems and operations, where processes can impact most or all laboratory employees. Lab-wide efforts are needed to identify and eliminate inefficiencies to accomplish Director McMillan’s charge of “doing more with less.” LANS faces many critical and potentially expensive choices that require sound decision support to ensure success. AET-2 is available to provide this analysis support to expedite the decisions at hand.
Boos, S.; Hornung, S.; Müller, H.
Most archaeological predictive models lack significance because fuzziness of data and uncertainty in knowledge about human behaviour and natural processes are hardly ever considered. One possibility to cope with such uncertainties is the utilization of probability-based approaches like Bayes' theorem or Dempster-Shafer theory. We analyzed an area of 50 km2 in Rhineland-Palatinate (Germany) near a Celtic oppidum by use of Dempster-Shafer's theory of evidence for predicting the spatial probability distribution of archaeological sites. This technique incorporates uncertainty by assigning various weights of evidence to defined variables, in that way estimating the probability of supporting a specific hypothesis (in our case, the presence or absence of a site). Selection of variables for our model relied both on assumptions about settlement patterns and on statistically tested relationships between known archaeological sites and environmental factors. The modelling process was conducted in a Geographic Information System (GIS) by generating raster-based likelihood surfaces. The corresponding likelihood surfaces were aggregated to a final weight-of-evidence surface, which resulted in a likelihood value for every single cell of being a site or a non-site. Finally, the result was tested against a database of known archaeological sites to evaluate the gain of the model. For the purpose of enhancing the gain of our model and sharpening our criteria we used a two-step approach to improve the modelling of former settlement strategies in our study area. Applying the developed model finally yielded a 100 percent success rate of known archaeological sites located in predicted high-potential areas.
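Dempster's rule of combination, the aggregation step described here, can be sketched directly (the frame, variables and mass values below are invented, not the paper's calibrated weights):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination over frozenset focal elements:
    multiply masses, keep non-empty intersections, renormalise away
    the conflicting (empty-intersection) mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

SITE, NO_SITE = frozenset({"site"}), frozenset({"no_site"})
THETA = SITE | NO_SITE                  # whole frame = total ignorance
slope = {SITE: 0.6, THETA: 0.4}         # gentle slope: some evidence for a site
water = {SITE: 0.5, THETA: 0.5}         # water proximity: weaker evidence
belief = dempster_combine(slope, water)
```

Mass left on the whole frame is the explicit representation of ignorance that distinguishes this approach from a plain Bayesian prior.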
Vesselinov, V. V.; Harp, D. R.; Mishra, P. K.; Katzman, D.
A crucial aspect of any decision-making process for environmental management of contaminated sites and protection of groundwater resources is the identification of scientifically defensible remediation scenarios. The selected scenarios are ranked based on both their protective and cost effectiveness. The decision-making process is facilitated by implementation of site-specific data- and model-driven analyses for decision support (DS), taking into account existing uncertainties to evaluate alternative characterization and remedial activities. However, due to lack of data and/or complex interdependent uncertainties (conceptual elements, model parameters, measurement/computational errors, etc.), the DS optimization problem is ill-posed (non-unique) and the model-prediction uncertainties are difficult to quantify. Recently, we have developed and implemented several novel theoretical approaches and computational algorithms for model-driven decision support. New and existing DS tools have been employed for model analyses of the fate and extent of a chromium plume in the regional aquifer at Sandia Canyon Site, LANL. Since 2007, we have performed three iterations of DS analyses implementing different models, decision-making tools, and data sets, providing guidance on the design of a subsurface monitoring network for (1) characterization of the flow and transport processes, and (2) protection of the water users. The monitoring network is augmented by new wells at locations where acquired new data can effectively reduce uncertainty in model-predicted contaminant concentrations. A key component of the DS analyses is contaminant source identification. Due to data and conceptual uncertainties, subsurface processes controlling the contaminant arrival at the top of the regional aquifer are not well defined. Nevertheless, the model-based analyses of the existing data and conceptual knowledge, including respective uncertainties, provide constrained probabilistic estimates of the
Schwartz, S. H.; Allen, R. W.
A decision model including perceptual noise or inconsistency is developed from expected value theory to explain driver stop and go decisions at signaled intersections. The model is applied to behavior in a car simulation and instrumented vehicle. Objective and subjective changes in driver decision making were measured with changes in blood alcohol concentration (BAC). Treatment levels averaged 0.00, 0.10 and 0.14 BAC for a total of 26 male subjects. Data were taken for drivers approaching signal lights at three timing configurations. The correlation between model predictions and behavior was highly significant. In contrast to previous research, analysis indicates that increased BAC results in increased perceptual inconsistency, which is the primary cause of increased risk taking at low probability of success signal lights.
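The mechanism described above can be sketched as a probit-style stop/go rule; the critical value and noise level below are invented for illustration, not taken from the study:

```python
import math

# Hedged sketch: a driver stops when the perceived time to the stop line
# exceeds a critical value; Gaussian perceptual noise makes the choice
# probabilistic. `critical` and `sigma` are illustrative numbers only.
def p_stop(time_to_light, critical=2.0, sigma=0.4):
    # P(noisy percept of time_to_light exceeds critical) under N(0, sigma) noise
    z = (time_to_light - critical) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Larger sigma (more perceptual inconsistency, as reported at higher BAC)
# pushes every probability toward 0.5, i.e. more risk taking at signal
# lights with a low objective probability of success.
sober = p_stop(3.0, sigma=0.4)
impaired = p_stop(3.0, sigma=1.5)
```

With these numbers the impaired probability lies closer to chance than the sober one, mirroring the paper's conclusion that increased inconsistency, rather than a shifted criterion, drives the increased risk taking.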
Hauskrecht, M; Fraser, H
Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead, they are very often dependent and interleaved over time, mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment, and the varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs), developed and used in the operations research, control theory and artificial intelligence communities, is particularly suitable for modeling such a complex decision process. In the paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease, and point out modeling advantages of the framework over standard decision formalisms.
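As a concrete illustration of why POMDPs fit this setting, the core operation is a Bayesian belief update over hidden disease states; the two-state transition and observation numbers below are invented, not from the paper:

```python
import numpy as np

# Two hidden states: 0 = stable disease, 1 = active ischemia (illustrative).
T = np.array([[0.9, 0.1],    # P(next state | current state) under one action
              [0.2, 0.8]])
O = np.array([[0.7, 0.3],    # P(test result | state); columns: 0=neg, 1=pos
              [0.2, 0.8]])

def belief_update(b, obs):
    # Bayes filter: push the belief through the dynamics, then reweight by
    # the likelihood of the observed test result and renormalize.
    predicted = b @ T
    weighted = predicted * O[:, obs]
    return weighted / weighted.sum()

b = belief_update(np.array([0.5, 0.5]), obs=1)   # after one positive test
```

A POMDP policy maps such beliefs, rather than raw observations, to investigative or treatment actions, which is what lets diagnosis and treatment interleave over time.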
Zhang, Shunan; Lee, Michael D.; Vandekerckhove, Joachim; Maris, Gunter; Wagenmakers, Eric-Jan
Diffusion models are widely used and successful accounts of the time course of two-choice decision making. Most diffusion models assume constant boundaries, which are the threshold levels of evidence that must be sampled from a stimulus to reach a decision. We summarize theoretical results from statistics that relate distributions of decisions and response times to diffusion models with time-varying boundaries. We then develop a computational method for finding time-varying boundaries from empirical data, and apply our new method to two problems. The first problem involves finding the time-varying boundaries that make diffusion models equivalent to the alternative sequential sampling class of accumulator models. The second problem involves finding the time-varying boundaries, at the individual level, that best fit empirical data for perceptual stimuli that provide equal evidence for both decision alternatives. We discuss the theoretical and modeling implications of using time-varying boundaries in diffusion models, as well as the limitations and potential of our approach to their inference. PMID:25538642
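A minimal simulation of the kind of process the paper studies might look as follows; the linear collapse and every parameter value are assumptions for illustration, not the authors' fitted boundaries:

```python
import numpy as np

def simulate_trial(drift=0.0, b0=1.0, collapse=0.5, floor=0.05,
                   sigma=1.0, dt=0.001, t_max=5.0, rng=None):
    """One two-choice diffusion trial with a linearly collapsing boundary."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0.0
    while t < t_max:
        bound = max(b0 - collapse * t, floor)   # time-varying boundary b(t)
        if x >= bound:
            return +1, t                        # upper-boundary choice
        if x <= -bound:
            return -1, t                        # lower-boundary choice
        # Euler step of the diffusion process
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return 0, t_max                             # no decision reached

rng = np.random.default_rng(1)
# Equal-evidence stimuli (drift = 0): collapsing bounds still force a choice.
results = [simulate_trial(drift=0.0, rng=rng) for _ in range(200)]
```

With constant boundaries a zero-drift process can wander indefinitely; the collapsing boundary is what guarantees timely responses for equal-evidence stimuli, the second problem treated in the paper.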
Kamienkowski, Juan E.; Pashler, Harold; Dehaene, Stanislas; Sigman, Mariano
Does extensive practice reduce or eliminate central interference in dual-task processing? We explored the reorganization of task architecture with practice by combining interference analysis (delays in dual-task experiment) and random-walk models of decision making (measuring the decision and non-decision contributions to RT). The main delay…
Within the frame of the US-India bilateral agreement on environmental cooperation, a team of US scientists have been helping India in designing emission control policies to address urban air quality problems. This presentation discusses how air quality models need to be used for ...
A recognition of paradigmatic diversity and the contribution of multiple paradigms to planning greatly enriches both theory and practice. This analysis groups planning definitions into two general models, associates both with hard and soft systems thinking, and examines conceptual roots. The interpretive/humanist paradigm often highlights…
Spatially realistic population models (SRPMs) address a fundamental problem commonly confronted by wildlife managers - predicting the effects of landscape-scale habitat management on an animal population. SRPMs typically consist of three submodels: (1) a habitat submodel...
Cutello, Vincenzo; Montero, Javier
In this paper we present a generalization of the model proposed by Montero, by allowing non-complete fuzzy binary relations for individuals. A degree of unsatisfaction can be defined in this case, suggesting that any democratic aggregation rule should take into account not only ethical conditions or some degree of rationality in the amalgamating procedure, but also a minimum support for the set of alternatives subject to the group analysis.
Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter
In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques such as discrete event simulation are used to evaluate systems that include queuing or waiting. Including queuing in decision-analytic models requires basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers gain an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities and to incorporate waiting lines and queues in the decision-analytic modeling example.
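As a flavour of the closed-formula side of queuing theory such a tutorial introduces, the classic M/M/1 performance measures can be computed directly; the arrival and service rates below are invented examples, not the tutorial's stent-placement data:

```python
# Classic closed-form M/M/1 measures: Poisson arrivals at rate lam, one
# server with exponential service times at rate mu.
def mm1_measures(lam, mu):
    assert lam < mu, "stable only if arrival rate < service rate"
    rho = lam / mu                     # server utilization
    L = rho / (1.0 - rho)              # mean number in system
    Lq = rho ** 2 / (1.0 - rho)        # mean number waiting in queue
    W = 1.0 / (mu - lam)               # mean time in system
    Wq = rho / (mu - lam)              # mean waiting time
    return {"rho": rho, "L": L, "Lq": Lq, "W": W, "Wq": Wq}

m = mm1_measures(lam=3.0, mu=4.0)      # e.g. 3 arrivals/h vs 4 services/h
```

Little's law (L = lam * W) ties these measures together; when such closed formulas stop being available, e.g. with priority rules or finite capacity, discrete event simulation takes over, as the tutorial describes.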
Selby, Philip; Medellin-Azuara, Josue; Harou, Julien; Klassert, Christian; Yoon, Jim
We describe an agent-based hydro-economic model of groundwater-irrigated agriculture in the Jordan Highlands. The model employs a Multi-Agent-Simulation (MAS) framework and is designed to evaluate direct and indirect outcomes of climate change scenarios and policy interventions on farmer decision making, including annual land use, groundwater use for irrigation, and water sales to a water tanker market. Land use and water use decisions are simulated for groups of farms, grouped by location and by their behavioural and economic similarities. Decreasing groundwater levels, and the associated increase in pumping costs, are important drivers for change within Jordan's agricultural sector. We describe how this is considered by coupling the agricultural and groundwater models. The agricultural production model employs Positive Mathematical Programming (PMP), a method for calibrating agricultural production functions to observed planted areas. PMP has successfully been used with disaggregate models for policy analysis. We adapt the PMP approach to allow explicit evaluation of the impact of pumping costs, groundwater purchase fees and a water tanker market. The work demonstrates the applicability of agent-based agricultural decision-making assessment in the Jordan Highlands and its integration with agricultural model calibration methods. The proposed approach is designed and implemented with software such that it could be used to evaluate a variety of physical and human influences on decision making in agricultural water management.
Convective weather and other constraints create uncertainty in air transportation, leading to costly delays. A Ground Delay Program (GDP) is a strategy to mitigate these effects. Systematic decision support can increase GDP efficacy, reduce delays, and minimize direct operating costs. In this study, a decision analysis (DA) model is constructed by combining a decision tree and a Bayesian belief network. Through a study of three New York region airports, the DA model demonstrates that larger GDP scopes that include more flights in the program, along with longer lead times that provide stakeholders greater notice of a pending program, trigger the fewest average arrival delays. These findings are demonstrated to result in savings of up to $1,850 per flight. Furthermore, when convective weather is predicted, forecast weather confidence remains at the same level or greater at least 70% of the time, supporting more strategic decision making. The DA model thus enables quantification of uncertainties and insights into causal relationships, providing support for future GDP decisions.
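The decision-tree half of such a DA model reduces to an expected-cost comparison across GDP options; every probability and per-flight cost below is an invented placeholder, not a figure from the study:

```python
# All probabilities and per-flight delay costs are invented placeholders.
p_storm = 0.6                          # forecast probability of convection

options = {                            # (cost if storm, cost if clear), $/flight
    "no_gdp":      (3000.0, 0.0),
    "small_scope": (1800.0, 400.0),
    "large_scope": (1200.0, 700.0),
}

def expected_cost(storm_cost, clear_cost, p=p_storm):
    # Expected value of one branch of the decision tree
    return p * storm_cost + (1.0 - p) * clear_cost

best = min(options, key=lambda name: expected_cost(*options[name]))
```

With these placeholder numbers the larger scope wins, consistent with the study's direction of effect; in the full model the Bayesian belief network would supply the weather probabilities feeding the tree.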
Lenert, Leslie; Dunlea, Robert; Del Fiol, Guilherme; Kelly Hall, Leslie
Shared Decision Making (SDM) is an approach to medical care based on collaboration between provider and patient, with both sharing in medical decisions. When patients’ values and preferences are incorporated in decision-making, care is more appropriate, ethically sound, and often lower in cost. However, SDM is difficult to implement in routine practice because of the time required for SDM methods, the lack of integration of SDM approaches into electronic health record systems (EHRs), and the absence of explanatory mechanisms for providers on the results of patients’ use of decision aids. This paper discusses potential solutions, including the concept of a “Personalize Button” for EHRs. Leveraging a four-phased clinical model for SDM, this article describes how computer decision support (CDS) technologies integrated into EHRs can help ensure that healthcare is delivered in a way that is respectful of patients’ preferences. The architecture described herein, called CDS for SDM, is built upon recognized standards that are currently integrated into certification requirements for EHRs as part of Meaningful Use regulations. While additional work is needed on modeling of preferences and on techniques for rapidly communicating models of preferences to clinicians, unless EHRs are re-designed to support SDM around and during clinical encounters, they are likely to continue to be an unintended barrier to SDM. With appropriate development, EHRs could be a powerful tool to promote SDM by reminding providers of situations for SDM and monitoring ongoing care to ensure treatments are consistent with patients’ preferences. PMID:25224366
Probst, Marc A; Kanzaria, Hemal K; Schriger, David L
The use of computed tomographic scanning in blunt head trauma has increased dramatically in recent years without an accompanying rise in the prevalence of injury or hospital admission for serious conditions. Because computed tomography is neither harmless nor inexpensive, researchers have attempted to optimize utilization, largely through research that describes which clinical variables predict intracranial injury, and use this information to develop clinical decision instruments. Although such techniques may be useful when the benefits and harms of each strategy (neuroimaging vs observation) are quantifiable and amenable to comparison, the exact magnitude of these benefits and harms remains unknown in this clinical scenario. We believe that most clinical decision instrument development efforts are misguided insofar as they ignore critical, nonclinical factors influencing the decision to image. In this article, we propose a conceptual model to illustrate how clinical and nonclinical factors influence emergency physicians making this decision. We posit that elements unrelated to standard clinical factors, such as personality of the physician, fear of litigation and of missed diagnoses, patient expectations, and compensation method, may have equal or greater impact on actual decision making than traditional clinical factors. We believe that 3 particular factors deserve special consideration for further research: fear of error/malpractice, financial incentives, and patient engagement. Acknowledgement and study of these factors will be essential if we are to understand how emergency physicians truly make these decisions and how test-ordering behavior can be modified.
Lewis, Krystina B; Stacey, Dawn; Squires, Janet E; Carroll, Sandra
Patient engagement in collaboration with health professionals is essential to deliver quality health care. A shared decision-making (SDM) approach requires that patients are involved in decisions regarding their health. SDM is expanding from the patient-physician dyad to incorporate an interprofessional perspective. Conceptual models can be used to better understand theoretical underpinnings for application in clinical practice. The aim of this article was to conduct a theory analysis of conceptual models using an interprofessional approach to SDM and discuss each model's relevance to nursing practice. Walker and Avant's theory analysis approach was used. Three conceptual models were eligible. For all models, the decision-making process was considered iterative. The development process was described for 1 model. All models were logical, parsimonious, and generalizable. One was supported by empirical testing. No model described how partnerships are enacted to achieve interprofessional SDM. Also, there was limited articulation as to how nurses' roles and contributions differ from other team members. This theory analysis highlights the need for a model that explains how partnerships among interprofessional team members are enacted to better understand the operationalization of interprofessional SDM. Implications for nursing practice at all system levels are offered and supported by the 3 models.
no requirement for the supplier to provide a coordination incentive and V = 0. Total costs in this mode are denoted by TCc = TCb(Qc, Rc, 0) + TCs(Qc, Nc, 0...If the parties are not centralized but can coordinate their policies, the potential exists to divide cost savings of TC+ = TCd - TCc. An interval...comparison of the solutions in the decentralized and centralized models shows that the costs in the entire supply chain can be reduced by TC+ = TCd - TCc = 30
Mccoy, Michael S.; Boys, Randy M.
Manned space operations require that the many automated subsystems of a space platform be controllable by a limited number of personnel. To minimize the interaction required of these operators, artificial intelligence techniques may be applied to embed a human performance model within the automated, or semi-automated, systems, thereby allowing the derivation of operator intent. A similar application has previously been proposed in the domain of fighter piloting, where the demand for pilot intent derivation is primarily a function of limited time and high workload rather than limited operators. The derivation and propagation of pilot intent is presented as it might be applied to some programs.
MULTIATTRIBUTE UTILITY THEORY ... B. ASSESSMENT OF THE STRENGTHS, LIMITATIONS, AND WEAKNESSES OF THE MAUT MODEL ... VIII. THE GOAL PROGRAMMING MODEL...several sources. We will then move to a development of the Analytic Hierarchy Process (AHP). Next, Multiattribute Utility Theory (MAUT) will...decision model which has proven useful for supporting the evaluation of projects for R&D selection is multiattribute utility theory or MAUT.
AbuKhousa, Eman; Al-Jaroodi, Jameela; Lazarova-Molnar, Sanja; Mohamed, Nader
Recently, most healthcare organizations have focused their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the pertinent decision-making processes. The availability of products through healthcare SCM is often a matter of life or death to the patient; therefore, trial-and-error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision-making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges.
Pareschi, Lorenzo; Vellucci, Pierluigi; Zanella, Mattia
We introduce and discuss kinetic models describing the influence of competence on the evolution of decisions in a multi-agent system. The original exchange mechanism, which is based on the human tendency to compromise and to change opinion through self-thinking, is here modified to include the role of the agents' competence. In particular, we take into account the agents' tendency to behave in the same way as if they were as good, or as bad, as their partner: the so-called equality bias. This effect is most pronounced in situations where a wide gap separates the competences of the group members. We discuss the main properties of the kinetic models and numerically investigate some examples of collective decision making under the influence of the equality bias. The results confirm that the equality bias leads the group to suboptimal decisions.
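The compromise ingredient of such kinetic models can be sketched as random binary interactions; the rule below is a toy caricature in which the equality bias appears as the interaction ignoring the true competence gap, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
w = rng.uniform(-1.0, 1.0, N)      # opinions in [-1, 1]
c = rng.uniform(0.0, 1.0, N)       # competences in [0, 1] (unused by the
                                   # biased rule below - that is the point)
m0, v0 = w.mean(), w.var()

def interact(wi, wj, gamma=0.2):
    # Equality-bias caricature: each agent compromises as if the partner
    # were exactly as competent, so the gap in c plays no role at all.
    return wi + gamma * (wj - wi), wj + gamma * (wi - wj)

for _ in range(20000):             # random binary interactions
    i, j = rng.integers(0, N, 2)
    if i != j:
        w[i], w[j] = interact(w[i], w[j])
```

The compromise rule conserves the mean opinion while shrinking its spread, so the group converges toward a consensus that, under the biased rule, is no better informed than the average agent.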
Puskaric, Marin; von Helversen, Bettina; Rieskamp, Jörg
Social information such as observing others can improve performance in decision making. In particular, social information has been shown to be useful when finding the best solution on one's own is difficult, costly, or dangerous. However, past research suggests that when making decisions people do not always consider other people's behaviour when it is at odds with their own experiences. Furthermore, the cognitive processes guiding the integration of social information with individual experiences are still under debate. Here, we conducted two experiments to test whether information about other persons' behaviour influenced people's decisions in a classification task. Furthermore, we examined how social information is integrated with individual learning experiences by testing different computational models. Our results show that social information had a small but reliable influence on people's classifications. The best computational model suggests that in categorization people first make up their own mind based on the non-social information, which is then updated by the social information.
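The winning model class (individual evidence first, then a social update) could be sketched as log-odds pooling; the weighting scheme and numbers are an illustrative guess at the mechanism, not the authors' fitted model:

```python
import math

# Hypothetical integration rule: start from the individually formed
# probability of category A, then nudge it by others' observed choices.
def integrate(p_individual, n_others_a, n_others_b, social_weight=0.3):
    logit = math.log(p_individual / (1.0 - p_individual))
    logit += social_weight * (n_others_a - n_others_b)  # small social nudge
    return 1.0 / (1.0 + math.exp(-logit))
```

A small `social_weight` reproduces the paper's qualitative finding: observing others shifts classifications reliably, but only modestly relative to one's own learning experience.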
Collura, Thomas F.; Zalaquett, Carlos P.; Bonnstetter, Ronald J.; Chatters, Seria J.
Current brain research increasingly reveals the underlying mechanisms and processes of human behavior, cognition, and emotion. In addition to being of interest to a wide range of scientists, educators, and professionals, as well as laypeople, brain-based models are of particular value in a clinical setting. Psychiatrists, psychologists, counselors, and other mental health professionals are in need of operational models that integrate recent findings in the physical, cognitive, and emotional domains, and offer a common language for interdisciplinary understanding and communication. Based on individual traits, predispositions, and responses to stimuli, we can begin to identify emotional and behavioral pathways and mental processing patterns. The purpose of this article is to present a brain-path activation model to understand individual differences in decision making and psychopathology. The first section discusses the role of frontal lobe electroencephalography (EEG) asymmetry, summarizes state- and trait-based models of decision making, and provides a more complex analysis that supplements the traditional simple left-right brain model. Key components of the new model are the introduction of right hemisphere parallel and left hemisphere serial scanning in rendering decisions, and the proposition of pathways that incorporate both past experiences as well as future implications into the decision process. Main attributes of each decision-making mechanism are provided. The second section applies the model within the realm of clinical mental health as a tool to understand specific human behavior and pathology. Applications include general and chronic anxiety, depression, paranoia, risk taking, and the pathways employed when well-functioning operational integration is observed. Finally, specific applications such as meditation and mindfulness are offered to facilitate positive functioning.
Scopel Simoes, Marcos A.; Barretto, Marcos R. P.
This work proposes Timed Colored Petri nets as a formal basis for a decision-making tool for planning and scheduling industrial production processes. The Timed Colored Petri net drives the state transitions of the decision process, scheduling the use of resources and of heuristics that correspond to the most important managerial and operational actions in the planning and scheduling of an industrial plant's production processes. To handle the uncertainties involved in the decision process, which in general draw on the knowledge of the specialist responsible for the routines of the production system, we use fuzzy set theory to suggest logically consistent decisions that reach a viable solution by following only the viable states of the decision tree, which in this case coincides with the occurrence graph of the Petri net. As an application example of the proposed model, we use a production system based on a port facility, whose model and simulation results are described at the end of this work.
Klaes, Christian; Schneegans, Sebastian; Schöner, Gregor; Gail, Alexander
According to a prominent view of sensorimotor processing in primates, selection and specification of possible actions are not sequential operations. Rather, a decision for an action emerges from competition between different movement plans, which are specified and selected in parallel. For action choices which are based on ambiguous sensory input, the frontoparietal sensorimotor areas are considered part of the common underlying neural substrate for selection and specification of action. These areas have been shown capable of encoding alternative spatial motor goals in parallel during movement planning, and show signatures of competitive value-based selection among these goals. Since the same network is also involved in learning sensorimotor associations, competitive action selection (decision making) should not only be driven by the sensory evidence and expected reward in favor of either action, but also by the subject's learning history of different sensorimotor associations. Previous computational models of competitive neural decision making used predefined associations between sensory input and corresponding motor output. Such hard-wiring does not allow modeling of how decisions are influenced by sensorimotor learning or by changing reward contingencies. We present a dynamic neural field model which learns arbitrary sensorimotor associations with a reward-driven Hebbian learning algorithm. We show that the model accurately simulates the dynamics of action selection with different reward contingencies, as observed in monkey cortical recordings, and that it correctly predicted the pattern of choice errors in a control experiment. With our adaptive model we demonstrate how network plasticity, which is required for association learning and adaptation to new reward contingencies, can influence choice behavior. The field model provides an integrated and dynamic account for the operations of sensorimotor integration, working memory and action selection required for
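The reward-driven Hebbian ingredient can be illustrated outside the full dynamic field model with a tabular sketch; the task, learning rate, and reward contingencies below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
W = np.zeros((2, 2))                   # W[stimulus, action] association weights
reward = {(0, 0): 1.0, (0, 1): 0.0,    # arbitrary rule: s0 -> a0, s1 -> a1
          (1, 0): 0.0, (1, 1): 1.0}
eta = 0.1                              # learning rate (illustrative)

def choose(s, temp=0.2):
    p = np.exp(W[s] / temp)
    p /= p.sum()                       # softmax action selection
    return rng.choice(2, p=p)

for _ in range(500):
    s = int(rng.integers(2))
    a = int(choose(s))
    # Reward-modulated Hebbian update: strengthen the taken stimulus-action
    # link toward the obtained reward.
    W[s, a] += eta * (reward[(s, a)] - W[s, a])
```

Because the associations are learned rather than hard-wired, changing the `reward` contingencies mid-run would let the network re-adapt, which is the behavior the hard-wired models in earlier work could not capture.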
Tidwell, Vincent Carroll; Correa, Alberto; Maxwell, Paul; Malczynski, Leonard A.
Recently, Sandia National Laboratories and General Motors cooperated on the development of the Biofuels Deployment Model (BDM) to assess the feasibility, implications, limitations, and enablers of producing 90 billion gallons of ethanol per year by 2030. Leveraging the past investment, a decision support model based on the BDM is being developed to assist investors, entrepreneurs, and decision makers in evaluating the costs and benefits associated with biofuels development in the U.S.-Mexico border region. Specifically, the model is designed to assist investors and entrepreneurs in assessing the risks and opportunities associated with alternative biofuels development strategies along the U.S.-Mexico border, as well as assist local and regional decision makers in understanding the tradeoffs such development poses to their communities. The decision support model is developed in a system dynamics framework utilizing a modular architecture that integrates the key systems of feedstock production, transportation, and conversion. The model adopts a 30-year planning horizon, operating on an annual time step. Spatially, the model is disaggregated at the county level on the U.S. side of the border and at the municipio level on the Mexican side. The model extent includes Luna, Hidalgo, Doña Ana, and Otero counties in New Mexico, El Paso and Hudspeth counties in Texas, and the four municipios along the U.S. border in Chihuahua. The model considers a variety of feedstocks; specifically, algae, jatropha, castor oil, and agricultural waste products from chili and pecans - identifying suitable lands for these feedstocks, possible yields, and required water use. The model also evaluates the carbon balance for each crop and provides insight into production costs including labor demands. Finally, the model is fitted with an interactive user interface comprising a variety of controls (e.g., slider bars, radio buttons), descriptive text, and output graphics allowing stakeholders to
Ruiz-Casares, Monica; Heymann, Jody
Objective: This paper examines different child care arrangements utilized by working families in countries undergoing major socio-economic transitions, with a focus on modeling parental decisions to leave children home alone. Method: The study interviewed 537 working caregivers attending government health clinics in Botswana, Mexico, and Vietnam.…
Huang, Yu-Chen; Tu, Jui-Che; Hung, So-Jeng
In response to the global trend of low carbon and the concept of sustainable development, enterprises need to develop R&D for the manufacturing of energy-saving and sustainable products and low carbon products. Therefore, the purpose of this study was to construct a decision model for sustainable product design and development from product…
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less w...
Terpstra, Teun; Lindell, Michael K.
Although research indicates that adoption of flood preparations among Europeans is low, only a few studies have attempted to explain citizens' preparedness behavior. This article applies the Protective Action Decision Model (PADM) to explain flood preparedness intentions in the Netherlands. Survey data (N = 1,115) showed that…
Calmes, Stephanie A.; Piazza, Nick J.; Laux, John M.
Although some counselors have advocated for the limited use of touch in counseling, others have argued that touch has no place within the counseling relationship. Despite the controversy, the use of touch has been shown to have a number of therapeutic benefits; however, there are few ethical decision-making models that are appropriate for…
Cottone, R. Rocco
A social constructivism model of ethical decision-making is summarized and related to the Canadian Counseling Association Code of Ethics. Social constructivism is described as an intellectual movement that allows for a biological and social conception of human understanding, thereby superseding or displacing psychological theory. The theoretical…
Marston, Doug; Muyskens, Paul; Lau, Matthew; Canter, Andrea
This article describes the problem-solving model (PSM) used in the Minneapolis Public Schools to guide decisions regarding intervention in general education, special education referral, and evaluation for special education eligibility for high-incidence disabilities. Program evaluation indicates students received special education services earlier…
Stringer, W. C.; And Others
Describes the development of a computer simulation model of forage-beef production systems, which is intended to incorporate soil, forage, and animal decisions into an enterprise scenario. Produces a summary of forage production and livestock needs. Cites positive assessment of the program's value by participants in inservice training workshops.…
Lajubutu, Oyebanjo A.
This paper shows how three critical enrollment indicators drawn from a relationship database were used to guide planning and management decisions. The paper discusses the guidelines for the development of the model, attributes needed, variables to be calculated, and other issues that may improve the effectiveness and efficiency of daily enrollment…
Kerstman, Eric; Minard, Charles G.; Saile, Lynn; Freire de Carvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma
The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission planners and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight.
Kaufmann, Esther; Wittmann, Werner W.
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
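Bootstrapping in Meehl's sense is easy to demonstrate on synthetic data: fit a linear model to the judge's own ratings and compare both against the criterion. All data below are simulated, with cue weights and noise levels chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
cues = rng.normal(size=(n, 3))                       # case cues
true_w = np.array([0.6, 0.3, 0.1])
criterion = cues @ true_w + rng.normal(0.0, 0.5, n)  # objective outcome
# The judge uses the right cues but adds unreliable noise of her own:
judgments = cues @ true_w + rng.normal(0.0, 1.0, n)

# "Bootstrap" the judge: ordinary least squares on the judge's own ratings.
X = np.column_stack([np.ones(n), cues])
beta, *_ = np.linalg.lstsq(X, judgments, rcond=None)
model_predictions = X @ beta

r_judge = np.corrcoef(judgments, criterion)[0, 1]
r_model = np.corrcoef(model_predictions, criterion)[0, 1]
# Stripping the judge's inconsistency typically lets the model of the judge
# outpredict the judge herself.
```

The model's advantage here comes entirely from removing the judge's trial-to-trial noise, which is the classic lens-model explanation for why bootstrapping beats human judgment.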
Three new software technologies were applied to develop an efficient and easy to use decision support system for ground-water contaminant modeling. Graphical interfaces create a more intuitive and effective form of communication with the computer compared to text-based interfaces...
Atran, Scott; Medin, Douglas L.; Ross, Norbert O.
This article describes cross-cultural research on the relation between how people conceptualize nature and how they act in it. Mental models of nature differ dramatically among populations living in the same area and engaged in similar activities. This has novel implications for environmental decision making and management, including commons…
Makuch, Gary; And Others
Intended for local school district personnel, the document suggests a model for assisting decision makers in placing handicapped students in the least restrictive environment (LRE). Basic considerations of a multidisciplinary team in determining the appropriate placement for the handicapped students are listed (including the nature and degree of…
Barrett, Jeffrey S
Optimal paediatric pharmacotherapy is reliant on a detailed understanding of the individual patient including their developmental status and disease state as well as the pharmaceutical agents he/she is receiving for treatment or management of side effects. Our appreciation for size and maturation effects on the pharmacokinetic/pharmacodynamic (PK/PD) phenomenon has improved to the point that we can develop predictive models that permit us to individualize therapy, especially in the situation where we are monitoring drug effects or therapeutic concentrations. The growth of efforts to guide paediatric pharmacotherapy via model-based decision support necessitates a coordinated and systematic approach to ensuring reliable and robust output to caregivers that represents the current standard of care and adheres to governance imposed by the host institution or coalition responsible. Model-based systems which guide caregivers on dosing paediatric patients in a more comprehensive manner are in development at several institutions. Care must be taken that these systems provide robust guidance with the current best practice. These systems must evolve as new information becomes available and ultimately are best constructed from diverse data representing global input on demographics, ethnic / racial diversity, diet and other lifestyle factors. Multidisciplinary involvement at the project team level is key to the ultimate clinical valuation. Likewise, early engagement of clinical champions is also critical for the success of model-based tools. Adherence to regulatory requirements as well as best practices with respect to software development and testing are essential if these tools are to be used as part of the routine standard of care. PMID:24251868
Sorvari, Jaana; Seppälä, Jyri
The decisions on risk management (RM) of contaminated sites in Finland have typically been driven by practical factors such as time and money. However, RM is a multifaceted task that generally involves several additional determinants, e.g. performance and environmental effects of remediation methods, psychological and social factors. Therefore, we adopted a multi-criteria decision analysis approach and developed a decision support tool (DST) that is viable in decision-making in such a complex situation. The basic components of the DST are based on the Dutch REC system. However, our DST is more case-specific and allows the consideration of the type, magnitude and scale of contamination, land use, environmental conditions and socio-cultural aspects (e.g. loss of cultural heritage, image aspects). The construction of the DST was started by structuring the decision problem using a value tree. Based on this work we adopted the Multi-Attribute Value Theory (MAVT) for data aggregation. The final DST was demonstrated by two model sites for which the RM alternatives and site-specific data were created on the basis of factual remediation projects and by interviewing experts. The demonstration of the DST was carried out in a workshop where representatives of different stakeholders were requested to rank and weight the decision criteria involved. To get information on the consistency of the ranking of the RM alternatives, we used different weighting techniques (ratio estimation and pair-wise weighting) and alternative ways to treat individual respondents' weights in calculating the preference scores for each RM alternative. These dissimilar approaches resulted in some differences in the preference order of the RM alternatives. The demonstration showed that attention has to be paid to the proper description of the site, the principles of the procedure and the decision criteria. Nevertheless, the procedure proved to enable efficient communication between different stakeholders
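The Multi-Attribute Value Theory aggregation at the heart of such a DST can be sketched as a weighted additive value function: each remediation alternative receives a 0–1 value per criterion, and criterion weights (summing to 1) combine them into one preference score. The criteria, weights, and values below are invented placeholders, not the Finnish case data.

```python
# Minimal MAVT sketch: preference score = sum of weight * single-criterion value.
def mavt_score(values, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * values[c] for c in weights)

weights = {"risk_reduction": 0.4, "cost": 0.3, "social": 0.3}   # illustrative
alternatives = {
    "excavation":  {"risk_reduction": 0.9, "cost": 0.2, "social": 0.6},
    "containment": {"risk_reduction": 0.6, "cost": 0.6, "social": 0.5},
    "no_action":   {"risk_reduction": 0.1, "cost": 1.0, "social": 0.3},
}
ranking = sorted(alternatives,
                 key=lambda a: mavt_score(alternatives[a], weights),
                 reverse=True)
print(ranking)  # highest preference score first
```

Different weighting techniques (ratio estimation vs. pair-wise comparison) change only the `weights` dictionary, which is why, as the abstract notes, they can reorder the alternatives without altering the aggregation machinery.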
Hydromodification is defined as changes in runoff characteristics and in-stream processes caused by altered land use. The impact of hydromodification can manifest itself through adjustment of stream morphology via channel incision, widening, planform alteration, or coarsening of the bed material. The state of the practice for hydromodification management in California and Western Washington for new and re-development has been to mimic pre-development site hydrology. The theory is that if the pre-development distribution of in-stream flows is maintained, then the baseline capacity to transport sediment, a proxy for the geomorphic condition, will be maintained as well. A popular method of mimicking the pre-development flow regime is by maintaining the pre-development frequency distribution of runoff, known as flow duration control. This can be done by routing post-development runoff through structural stormwater facilities (BMPs) such that runoff is stored and slowly released to match pre-development flow duration characteristics. As it turns out, storage requirements for hydromodification control tend to be much larger than that for surface water treatment requirements (see nomograph). As regulatory requirements for hydromodification evolve and begin to spread to other parts of the country, it is necessary that scientists, water resources professionals, and policy makers understand the practical challenges of implementing hydromodification controls, including the sizing and cost constraints, and know about innovations which could make hydromodification controls more feasible to implement. In an effort to provide the audience with this better understanding, this presentation will share a step-by-step approach for predicting long-term hydromodification impacts; demonstrate options for mitigating these impacts within the context of the modeling approach; and discuss sizing sensitivities of LID-type hydromodification control structural BMPs as a function of performance
Mikesell, Lisa; Bromley, Elizabeth; Young, Alexander S; Vona, Pamela; Zima, Bonnie
Shared decision making (SDM) interventions aim to improve client autonomy, information sharing, and collaborative decision making, yet implementation of these interventions has been variably perceived. Using interviews and focus groups with clients and clinicians from mental health clinics, we explored experiences with and perceptions about decision support strategies aimed to promote SDM around psychotropic medication treatment. Using thematic analysis, we identified themes regarding beliefs about participant involvement, information management, and participants' broader understanding of their epistemic expertise. Clients and clinicians highly valued client-centered priorities such as autonomy and empowerment when making decisions. However, two frequently discussed themes revealed complex beliefs about what that involvement should look like in practice: (a) the role of communication and information exchange and (b) the value and stability of clinician and client epistemic expertise. Complex beliefs regarding these two themes suggested a dynamic and reflexive approach to information management. Situating these findings within the Theory of Motivated Information Management, we discuss implications for conceptualizing SDM in mental health services and adapt Siminoff and Step's Communication Model of Shared Decision Making (CMSDM) to propose a Communication-centered Epistemic Model of Shared Decision Making (CEM-SDM).
Peterson, James T; Freeman, Mary C
Stream ecosystems provide multiple, valued services to society, including water supply, waste assimilation, recreation, and habitat for diverse and productive biological communities. Managers striving to sustain these services in the face of changing climate, land uses, and water demands need tools to assess the potential effectiveness of alternative management actions, and often, the resulting tradeoffs between competing objectives. Integrating predictive modeling with monitoring data in an adaptive management framework provides a process by which managers can reduce model uncertainties and thus improve the scientific bases for subsequent decisions. We demonstrate an integration of monitoring data with a dynamic, metapopulation model developed to assess effects of streamflow alteration on fish occupancy in a southeastern US stream system. Although not extensive (collected over three years at nine sites), the monitoring data allowed us to assess and update support for alternative population dynamic models using model probabilities and Bayes rule. We then use the updated model weights to estimate the effects of water withdrawal on stream fish communities and demonstrate how feedback in the form of monitoring data can be used to improve water resource decision making. We conclude that investment in more strategic monitoring, guided by a priori model predictions under alternative hypotheses and an adaptive sampling design, could substantially improve the information available to guide decision-making and management for ecosystem services from lotic systems.
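The model-weight update via Bayes' rule described above reduces to a one-line posterior calculation: multiply each model's prior probability by the likelihood it assigns to the new monitoring data, then renormalize. The numbers below are invented for illustration.

```python
# Sketch: three alternative population-dynamics models with equal prior
# support, updated by Bayes' rule given each model's likelihood of the data.
def update_weights(priors, likelihoods):
    unnorm = [p * L for p, L in zip(priors, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

priors = [1/3, 1/3, 1/3]          # equal initial support
likelihoods = [0.02, 0.10, 0.04]  # P(monitoring data | model), illustrative
posteriors = update_weights(priors, likelihoods)
print([round(w, 3) for w in posteriors])
```

Repeating this update each monitoring season is the adaptive-management feedback loop: model weights shift toward whichever population-dynamics hypothesis best predicts the incoming occupancy data.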
Papamichael, K.; Pal, V.; Bourassa, N.; Loffeld, J.; Capeluto, G.
Decisions throughout the life cycle of a building, from design through construction and commissioning to operation and demolition, require the involvement of multiple interested parties (e.g., architects, engineers, owners, occupants and facility managers). The performance of alternative designs and courses of action must be assessed with respect to multiple performance criteria, such as comfort, aesthetics, energy, cost and environmental impact. Several stand-alone computer tools are currently available that address specific performance issues during various stages of a building's life cycle. Some of these tools support collaboration by providing means for synchronous and asynchronous communications, performance simulations, and monitoring of a variety of performance parameters involved in decisions about a building during building operation. However, these tools are not linked in any way, so significant work is required to maintain and distribute information to all parties. In this paper we describe a software model that provides the data management and process control required for collaborative decision making throughout a building's life cycle. The requirements for the model are delineated addressing data and process needs for decision making at different stages of a building's life cycle. The software model meets these requirements and allows addition of any number of processes and support databases over time. What makes the model infinitely expandable is that it is a very generic conceptualization (or abstraction) of processes as relations among data. The software model supports multiple concurrent users, and facilitates discussion and debate leading to decision making. The software allows users to define rules and functions for automating tasks and alerting all participants to issues that need attention. It supports management of simulated as well as real data and continuously generates information useful for improving performance prediction and
Hunt, R. J.
Models are by definition simplifications of reality; the degree and nature of simplification, however, is debated. One view is "the world is 3D, heterogeneous, and transient, thus good models are too" - the more a model directly simulates the complexity of the real world the better it is considered to be. An alternative view is to only use simple models up front because real-world complexity can never be truly known. A third view is construct and calibrate as many models as predictions. A fourth is to build highly parameterized models and either look at an ensemble of results, or use mathematical regularization to identify an optimal most reasonable parameter set and fit. Although each view may have utility for a given decision-making process, there are common threads that perhaps run through all views. First, the model-construction process itself can help the decision-making process because it raises the discussion of opposing parties from one of contrasting professional opinions to discussion of reasonable types and ranges of model inputs and processes. Secondly, no matter what view is used to guide the model building, model predictions for the future might be expected to perform poorly in the future due to unanticipated future changes and stressors to the underlying system simulated. Although this does not reduce the obligation of the modeler to build representative tools for the system, it should serve to temper expectations of model performance. Finally, perhaps the most under-appreciated utility of models is for calculating the reduction in prediction uncertainty resulting from different data collection strategies - an attractive feature separate from the calculation and minimization of absolute prediction uncertainty itself. This type of model output facilitates focusing on efficient use of current and future monitoring resources - something valued by many decision-makers regardless of background, system managed, and societal context.
Basieva, Irina; Khrennikov, Andrei; Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu
We present a quantum-like model of decision making in games of the Prisoner's Dilemma type. In this model, the brain processes information using a representation of mental states in a complex Hilbert space. Driven by the master equation, the mental state of a player, say Alice, approaches an equilibrium point in the space of density matrices. From this equilibrium point Alice determines her mixed (i.e., probabilistic) strategy with respect to Bob. Thus our model is a model of thinking through decoherence of an initially pure mental state. Decoherence is induced by interaction with memory and the external environment. In this paper we study (numerically) the dynamics of the quantum entropy of Alice's state in the process of decision making. Our analysis demonstrates that these dynamics depend nontrivially on the initial state of Alice's mind about her own actions and on her prediction state (for the possible actions of Bob).
This essay argues that individual-oriented informed consent is inadequate to protect human research subjects in mainland China. The practice of family-oriented decision-making is better suited to guide moral research conduct. The family's role in medical decision-making originates from the mutual benevolence that exists among family members, and is in accordance with family harmony, which is the aim of Confucian society. I argue that the practice of informed consent for medical research on human subjects ought to remain family-oriented in mainland China. This essay explores the main features of this model of informed consent and demonstrates the proper authority of the family. The family's participation in decision-making as a whole does not negate or deny the importance of the individual who is the subject of the choice, but rather acts more fully to protect research subjects.
Cheng, Zhichao; Xiong, Yang; Xu, Yiwen
An opinion dynamic model with decision-making groups was proposed to study the process of adopting new opinions or ideas by individuals. The opinion's acceptability is introduced to distinguish the general character of different opinions. The simulation results on a free-scale network demonstrate that when two opinions have similar acceptability, the opinion supported by more decision-making groups in the beginning will eventually win the support of more agents, whereas an opinion supported by fewer decision-making groups in the beginning may be supported by the majority at the end only if it has better acceptability, and if the tolerance threshold of the society is higher than a specific value.
Friedel, Eva; Koch, Stefan P; Wendt, Jean; Heinz, Andreas; Deserno, Lorenz; Schlagenhauf, Florian
In experimental psychology different experiments have been developed to assess goal-directed as compared to habitual control over instrumental decisions. Similar to animal studies selective devaluation procedures have been used. More recently sequential decision-making tasks have been designed to assess the degree of goal-directed vs. habitual choice behavior in terms of an influential computational theory of model-based compared to model-free behavioral control. As recently suggested, different measurements are thought to reflect the same construct. Yet, there has been no attempt to directly assess the construct validity of these different measurements. In the present study, we used a devaluation paradigm and a sequential decision-making task to address this question of construct validity in a sample of 18 healthy male human participants. Correlational analysis revealed a positive association between model-based choices during sequential decisions and goal-directed behavior after devaluation suggesting a single framework underlying both operationalizations and speaking in favor of construct validity of both measurement approaches. Up to now, this has been merely assumed but never been directly tested in humans.
Sauer, John R.; Blank, Peter J.; Zipkin, Elise F.; Fallon, Jane E.; Fallon, Frederick W.
Land managers must balance the needs of a variety of species when manipulating habitats. Structured decision making provides a systematic means of defining choices and choosing among alternative management options; implementation of a structured decision requires quantitative approaches to predicting consequences of management on the relevant species. Multi-species occupancy models provide a convenient framework for making structured decisions when the management objective is focused on a collection of species. These models use replicate survey data that are often collected on managed lands. Occupancy can be modeled for each species as a function of habitat and other environmental features, and Bayesian methods allow for estimation and prediction of collective responses of groups of species to alternative scenarios of habitat management. We provide an example of this approach using data from breeding bird surveys conducted in 2008 at the Patuxent Research Refuge in Laurel, Maryland, evaluating the effects of eliminating meadow and wetland habitats on scrub-successional and woodland-breeding bird species using summed total occupancy of species as an objective function. Removal of meadows and wetlands decreased value of an objective function based on scrub-successional species by 23.3% (95% CI: 20.3–26.5), but caused only a 2% (0.5, 3.5) increase in value of an objective function based on woodland species, documenting differential effects of elimination of meadows and wetlands on these groups of breeding birds. This approach provides a useful quantitative tool for managers interested in structured decision making.
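The objective function used above, summed total occupancy of a species group, is simple to compute from per-species occupancy probabilities under each management scenario. The species names and probabilities below are invented for illustration; the actual values in the study come from Bayesian multi-species occupancy models.

```python
# Sketch: objective function = sum of per-species occupancy probabilities
# under a management scenario; compare scenarios by percent change.
def summed_occupancy(psi):
    """psi: dict mapping species -> occupancy probability under a scenario."""
    return sum(psi.values())

scrub_baseline = {"field_sparrow": 0.8, "yellowthroat": 0.7, "towhee": 0.6}
scrub_no_meadow = {"field_sparrow": 0.6, "yellowthroat": 0.5, "towhee": 0.5}

baseline = summed_occupancy(scrub_baseline)
change = (summed_occupancy(scrub_no_meadow) - baseline) / baseline * 100
print(f"objective change: {change:.1f}%")
```

With posterior draws of the occupancy probabilities instead of point values, the same calculation yields the credible intervals on the objective change that the abstract reports.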
King, J-R; Dehaene, S
Subliminal perception studies have shown that one can objectively discriminate a stimulus without subjectively perceiving it. We show how a minimalist framework based on Signal Detection Theory and Bayesian inference can account for this dissociation, by describing subjective and objective tasks with similar decision-theoretic mechanisms. Each of these tasks relies on distinct response classes, and therefore distinct priors and decision boundaries. As a result, they may reach different conclusions. By formalizing, within the same framework, forced-choice discrimination responses, subjective visibility reports and confidence ratings, we show that this decision model suffices to account for several classical characteristics of conscious and unconscious perception. Furthermore, the model provides a set of original predictions on the nonlinear profiles of discrimination performance obtained at various levels of visibility. We successfully test one such prediction in a novel experiment: when varying continuously the degree of perceptual ambiguity between two visual symbols presented at perceptual threshold, identification performance varies quasi-linearly when the stimulus is unseen and in an 'all-or-none' manner when it is seen. The present model highlights how conscious and non-conscious decisions may correspond to distinct categorizations of the same stimulus encoded by a high-dimensional neuronal population vector.
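The dissociation described above can be sketched in signal-detection terms: the same internal evidence sample is compared against two different decision boundaries, a low (near-unbiased) criterion for the forced-choice discrimination and a higher, conservative criterion for the subjective "seen" report. All numbers below (d′, criteria) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
d_prime = 1.0
evidence = rng.normal(loc=d_prime, scale=1.0, size=100_000)  # signal trials

discriminate = evidence > 0.5   # objective criterion (midway between d' and 0)
seen = evidence > 2.0           # conservative subjective-visibility criterion

unseen = ~seen
acc_unseen = discriminate[unseen].mean()  # discrimination accuracy when "unseen"
print(f"seen rate: {seen.mean():.2f}, accuracy when unseen: {acc_unseen:.2f}")
```

Because most signal trials fall between the two boundaries, the stimulus is discriminated well above chance on trials that are nevertheless reported as unseen, which is the classic subliminal-perception dissociation the model accounts for.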
Montgomery, Robert A; Rubeck-Schurtz, C Nichole; Millenbah, Kelly F; Roloff, Gary J; Whalon, Mark E; Olsen, Larry G
In the United States, environmental regulatory agencies are required to use "best available" scientific information when making decisions on a variety of issues. However, agencies are often hindered by coarse or incomplete data, particularly as it pertains to threatened and endangered species protection. Stakeholders often agree that more resolute and integrated processes for decision-making are desirable. We demonstrate a process that uses species occurrence data for a federally endangered insect (Karner blue butterfly), a readily available habitat modeling tool, and spatially explicit information about an important Michigan commodity (tart cherries). This case study has characteristics of many protected species regulatory decisions in that species occurrence data were sparse and unequally distributed; regulatory decisions (on pesticide use) were required with potentially significant impacts on a viable agricultural industry; and stakeholder relations were diverse, misinformed, and, in some situations, unjustly contentious. Results from our process include a large-scale, empirically derived habitat suitability map for the focal species and a risk ranking of tart cherry orchards with risk based on the likelihood that pesticide applications will influence the focal protected species. Although the majority (77%) of pesticide-influence zones overlapped Karner blue butterfly habitat, risk scores associated with each orchard were low. Through our process we demonstrated that spatially explicit models can help stakeholders visualize and quantify potential protected species effects. In addition, model outputs can serve to guide field activities (e.g., species surveys and implementation of pesticide buffer zones) that help minimize future effects.
The report documents the computer programs written to implement the ECON optical decision model. The programs were written in APL, an extremely compact and powerful language particularly well suited to this model, which makes extensive use of matrix manipulations. The algorithms used are presented and listings of and descriptive information on the APL programs used are given. Possible changes in input data are also given.
Van Belle, Vanya M. C. A.; Van Calster, Ben; Timmerman, Dirk; Bourne, Tom; Bottomley, Cecilia; Valentin, Lil; Neven, Patrick; Van Huffel, Sabine; Suykens, Johan A. K.; Boyd, Stephen
Background Over time, methods for the development of clinical decision support (CDS) systems have evolved from interpretable and easy-to-use scoring systems to very complex and non-interpretable mathematical models. In order to accomplish effective decision support, CDS systems should provide information on how the model arrives at a certain decision. To address the issue of incompatibility between performance, interpretability and applicability of CDS systems, this paper proposes an innovative model structure, automatically leading to interpretable and easily applicable models. The resulting models can be used to guide clinicians when deciding upon the appropriate treatment, estimating patient-specific risks and to improve communication with patients. Methods and Findings We propose the interval coded scoring (ICS) system, which imposes that the effect of each variable on the estimated risk is constant within consecutive intervals. The number and position of the intervals are automatically obtained by solving an optimization problem, which additionally performs variable selection. The resulting model can be visualised by means of appealing scoring tables and color bars. ICS models can be used within software packages, in smartphone applications, or on paper, which is particularly useful for bedside medicine and home-monitoring. The ICS approach is illustrated on two gynecological problems: diagnosis of malignancy of ovarian tumors using a dataset containing 3,511 patients, and prediction of first trimester viability of pregnancies using a dataset of 1,435 women. Comparison of the performance of the ICS approach with a range of prediction models proposed in the literature illustrates the ability of ICS to combine optimal performance with the interpretability of simple scoring systems. Conclusions The ICS approach can improve patient-clinician communication and will provide additional insights in the importance and influence of available variables. Future challenges
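The core ICS idea, that each variable's effect on the estimated risk is constant within consecutive intervals, amounts to a lookup-and-sum over scoring tables. The variable names, cut-points, and point values below are invented for illustration; in the actual method they are produced by the optimization problem the abstract describes.

```python
import bisect

# Hypothetical ICS-style scoring table: for each variable, `cuts` are the
# interval boundaries and `points` the constant score within each interval.
score_table = {
    "age":       ([40, 60], [0, 2, 4]),   # <40 -> 0, 40-59 -> 2, >=60 -> 4
    "lesion_cm": ([2, 5],   [0, 3, 6]),   # <2  -> 0, 2-4   -> 3, >=5  -> 6
}

def ics_score(patient):
    total = 0
    for var, (cuts, points) in score_table.items():
        total += points[bisect.bisect_right(cuts, patient[var])]
    return total

print(ics_score({"age": 55, "lesion_cm": 6}))  # 2 + 6 = 8
```

Because the model is just a sum of table lookups, it can be printed as a scoring card or color bar and applied on paper at the bedside, which is the interpretability and applicability argument the paper makes.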
Xue, Jie; Gui, Dongwei; Zhao, Ying; Lei, Jiaqiang; Zeng, Fanjiang; Feng, Xinlong; Mao, Donglei; Shareef, Muhammad
The competition for water resources between agricultural and natural oasis ecosystems has become an increasingly serious problem in oasis areas worldwide. Recently, the intensive extension of oasis farmland has led to excessive exploitation of water discharge, and consequently has resulted in a lack of water supply in natural oasis. To coordinate the conflicts, this paper provides a decision-making framework for modeling environmental flows in oasis areas using Bayesian networks (BNs). Three components are included in the framework: (1) assessment of agricultural economic loss due to meeting environmental flow requirements; (2) decision-making analysis using BNs; and (3) environmental flow decision-making under different water management scenarios. The decision-making criterion is determined based on intersection point analysis between the probability of large-level total agro-economic loss and the ratio of total to maximum agro-economic output by satisfying environmental flows. An application in the Qira oasis area of the Tarim Basin, Northwest China indicates that BNs can model environmental flow decision-making associated with agricultural economic loss effectively, as a powerful tool to coordinate water-use conflicts. In the case study, the environmental flow requirement is determined as 50.24%, 49.71% and 48.73% of the natural river flow in wet, normal and dry years, respectively. Without further agricultural economic loss, 1.93%, 0.66% and 0.43% of more river discharge can be allocated to eco-environmental water demands under the combined strategy in wet, normal and dry years, respectively. This work provides a valuable reference for environmental flow decision-making in any oasis area worldwide.
Block, J. L.; Arrowsmith, R.
…The scientists, especially those with hydrological modeling experience, experienced a smooth transition from 2-D contour map outputs to viewing multiple 3-D surfaces, and were comfortable understanding the modeling outputs shown in the Theater. In contrast, the decision makers, who have little experience in hydrology, required a visualization that shows less detail in the modeling outputs and more information about the coupled processes (e.g. pumping, recharge, and surface water delivery) in order to understand the hydrologic concepts being shown.
Ranger, N.; Millner, A.; Niehoerster, F.
Traditionally, climate change risk assessments have taken a roughly four-stage linear ‘chain’ of moving from socioeconomic projections, to climate projections, to primary impacts and then finally onto economic and social impact assessment. Adaptation decisions are then made on the basis of these outputs. The escalation of uncertainty through this chain is well known; resulting in an ‘explosion’ of uncertainties in the final risk and adaptation assessment. The space of plausible future risk scenarios is growing ever wider with the application of new techniques which aim to explore uncertainty ever more deeply; such as those used in the recent ‘probabilistic’ UK Climate Projections 2009, and the stochastic integrated assessment models, for example PAGE2002. This explosion of uncertainty can make decision-making problematic, particularly given that the uncertainty information communicated can not be treated as strictly probabilistic and therefore, is not an easy fit with standard decision-making under uncertainty approaches. Additional problems can arise from the fact that the uncertainty estimated for different components of the ‘chain’ is rarely directly comparable or combinable. Here, we explore the challenges and limitations of using current projections for adaptation decision-making. We report the findings of a recent report completed for the UK Adaptation Sub-Committee on approaches to deal with these challenges and make robust adaptation decisions today. To illustrate these approaches, we take a number of illustrative case studies, including a case of adaptation to hurricane risk on the US Gulf Coast. This is a particularly interesting case as it involves urgent adaptation of long-lived infrastructure but requires interpreting highly uncertain climate change science and modelling; i.e. projections of Atlantic basin hurricane activity. An approach we outline is reversing the linear chain of assessments to put the economics and decision
Duintjer Tebbens, Radboud J; Pallansch, Mark A; Kew, Olen M; Sutter, Roland W; Bruce Aylward, R; Watkins, Margaret; Gary, Howard; Alexander, James; Jafari, Hamid; Cochi, Stephen L; Thompson, Kimberly M
Decision analytic modeling of polio risk management policies after eradication may help inform decisionmakers about the quantitative tradeoffs implied by various options. Given the significant dynamic complexity and uncertainty involving posteradication decisions, this article aims to clarify the structure of a decision analytic model developed to help characterize the risks, costs, and benefits of various options for polio risk management after eradication of wild polioviruses and analyze the implications of different sources of uncertainty. We provide an influence diagram of the model with a description of each component, explore the impact of different assumptions about model inputs, and present probability distributions of model outputs. The results show that choices made about surveillance, response, and containment for different income groups and immunization policies play a major role in the expected final costs and polio cases. While the overall policy implications of the model remain robust to the variations of assumptions and input uncertainty we considered, the analyses suggest the need for policymakers to carefully consider tradeoffs and for further studies to address the most important knowledge gaps.
Barrett, Jeffrey S; Mondick, John T; Narayan, Mahesh; Vijayakumar, Kalpana; Vijayakumar, Sundararajan
Background Decision analysis in hospital-based settings is becoming more commonplace. The application of modeling and simulation approaches has likewise become more prevalent in order to support decision analytics. With respect to clinical decision making at the level of the patient, modeling and simulation approaches have been used to study and forecast treatment options, examine and rate caregiver performance, and assign resources (staffing, beds, patient throughput). There is a great need to facilitate pharmacotherapeutic decision making in pediatrics given the often limited data available to guide dosing and manage patient response. We have employed nonlinear mixed-effect models and Bayesian forecasting algorithms coupled with data summary and visualization tools to create drug-specific decision support systems that utilize individualized patient data from our electronic medical records systems. Methods Pharmacokinetic and pharmacodynamic nonlinear mixed-effect models of specific drugs are generated based on historical data in relevant pediatric populations, or from adults when no pediatric data are available. These models are re-executed with individual patient data, allowing for patient-specific guidance via a Bayesian forecasting approach. The models are called and executed in an interactive manner through our web-based dashboard environment, which interfaces to the hospital's electronic medical records system. Results The methotrexate dashboard utilizes a two-compartment, population-based PK mixed-effect model to project patient response to specific dosing events. Projected plasma concentrations are viewable against protocol-specific nomograms to provide dosing guidance for potential rescue therapy with leucovorin. These data are also viewable against common biomarkers used to assess patient safety (e.g., vital signs and plasma creatinine levels). As additional data become available via therapeutic drug monitoring, the model is re-executed and projections are
Pasqualini, D.; Witkowski, M.
The Critical Infrastructure Protection / Decision Support System (CIP/DSS) project, supported by the Science and Technology Office, has been developing a risk-informed Decision Support System that provides insights for making critical infrastructure protection decisions. The system considers seventeen different Department of Homeland Security defined Critical Infrastructures (potable water system, telecommunications, public health, economics, etc.) and their primary interdependencies. These infrastructures have been modeled in a single model called the CIP/DSS Metropolitan Model. The modeling approach used is system dynamics. System dynamics modeling combines control theory and nonlinear dynamics theory: a system is described by a set of coupled differential equations that seek to explain how the structure of a given system determines its behavior. In this poster we present a system dynamics model for one of the seventeen critical infrastructures, a generic metropolitan potable water system (MPWS). The goals are threefold: 1) to gain a better understanding of the MPWS infrastructure; 2) to identify improvements that would help protect the MPWS; and 3) to understand the consequences, interdependencies, and impacts when perturbations occur to the system. The model represents raw water sources, the metropolitan water treatment process, storage of treated water, damage and repair to the MPWS, distribution of water, and end-user demand, but does not explicitly represent the detailed network topology of an actual MPWS. The MPWS model depends upon inputs from the metropolitan population, energy, telecommunication, public health, and transportation models as well as the national water and transportation models. We present modeling results and sensitivity analysis indicating critical choke points and negative and positive feedback loops in the system. A general scenario is also analyzed in which the potable water system responds to a generic disruption.
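The stock-and-flow formulation described above can be illustrated with a toy Euler-integrated sketch of a potable water system. All stocks, rates, and the disruption scenario below are hypothetical and are not taken from the CIP/DSS Metropolitan Model.

```python
# Toy system-dynamics sketch of a metropolitan potable water system:
# one stock (treated-water storage) fed by treatment and drained by demand,
# with a generic disruption to treatment capacity followed by gradual repair.
# All numbers are hypothetical.

def simulate(days=30, dt=1.0):
    storage = 500.0          # treated-water storage (ML), hypothetical stock
    capacity_frac = 1.0      # fraction of treatment capacity available
    history = []
    for step in range(int(days / dt)):
        if step == 10:       # generic disruption: lose 40% of capacity
            capacity_frac = 0.6
        treatment = 100.0 * capacity_frac           # inflow (ML/day)
        demand = 90.0                               # end-user demand (ML/day)
        storage = max(storage + (treatment - demand) * dt, 0.0)
        # repair loop: capacity recovers toward 1.0 at a fixed fractional rate
        capacity_frac = min(1.0, capacity_frac + 0.05 * (1.0 - capacity_frac) * dt)
        history.append(storage)
    return history

h = simulate()
```

Running the sketch shows the storage stock rising while treatment exceeds demand, dropping once the disruption cuts treatment below demand, and recovering as the repair feedback loop restores capacity.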
240 finance departments of county, city and state governments’ promotion decisions, Halaby (1976) obtained evidence that the analysis of the decision...oper- ations research techniques and practicing finance managers avoid complex mathematical models in favor of a few simple rules in investment decision...likely its managers to spend time with outside organizations. Similarly, organizations that depend on outside financing select more outside members
Gann, Candace J.; Kunnavatana, S. Shanun
This preliminary study investigated the use of the Function-Based Intervention Decision Model (Decision Model; Umbreit, Ferro, Liaupsin, & Lane, 2007) to improve teacher treatment integrity for a function-based classroom management plan. The participants were a special education teacher and three elementary-age students receiving special…
Hinman, Martha M.; Kallio, Ruth E.
The technical, social, and procedural phenomena that facilitate the effective utilization of analytic models in decision-making are examined. Attention is focused on the theoretical issues associated with: (1) selection and fit of the model to the needs of the decision setting; (2) human factors such as cognitive style and the political climate…
Constantinou, Anthony Costa; Fenton, Norman; Marsh, William; Radlinski, Lukasz
Objectives 1) To develop a rigorous and repeatable method for building effective Bayesian network (BN) models for medical decision support from complex, unstructured and incomplete patient questionnaires and interviews that inevitably contain examples of repetitive, redundant and contradictory responses; 2) To exploit expert knowledge in the BN development since further data acquisition is usually not possible; 3) To ensure the BN model can be used for interventional analysis; 4) To demonstrate why using data alone to learn the model structure and parameters is often unsatisfactory even when extensive data are available. Method The method is based on applying a range of recent BN developments targeted at helping experts build BNs given limited data. While most of the components of the method are based on established work, its novelty is that it provides a rigorous consolidated and generalised framework that addresses the whole life-cycle of BN model development. The method is based on two original and recently validated BN models in forensic psychiatry, known as DSVM-MSS and DSVM-P. Results When employed with the same datasets, the DSVM-MSS demonstrated competitive-to-superior predictive performance (AUC scores 0.708 and 0.797) against the state-of-the-art (AUC scores ranging from 0.527 to 0.705), and the DSVM-P demonstrated superior predictive performance (cross-validated AUC score of 0.78) against the state-of-the-art (AUC scores ranging from 0.665 to 0.717). More importantly, the resulting models go beyond improving predictive accuracy and into usefulness for risk management purposes through intervention, and enhanced decision support in terms of answering complex clinical questions that are based on unobserved evidence. Conclusions This development process is applicable to any application domain which involves large-scale decision analysis based on such complex information, rather than based on data with hard facts, and in conjunction with the incorporation of
Louz, Derrick; Bergmans, Hans E; Loos, Birgit P; Hoeben, Rob C
Mathematical modeling can be used for the development and implementation of infection control policy to combat outbreaks and epidemics of communicable viral diseases. Here an outline is provided of basic concepts and approaches used in mathematical modeling and parameterization of disease transmission. The use of mathematical models is illustrated using the 2001 UK foot-and-mouth disease (FMD) epidemic, the 2003 global severe acute respiratory syndrome (SARS) epidemic, and human influenza pandemics as examples. This provides insights into the strengths, limitations, and weaknesses of the various models, and demonstrates their potential for supporting policy and decision making.
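The transmission models surveyed above typically build on compartmental formulations such as the SIR (susceptible-infectious-recovered) model. A minimal Euler-integrated sketch, with illustrative parameters rather than values fitted to FMD, SARS, or influenza data:

```python
# Minimal SIR transmission model with frequency-dependent transmission.
# beta: transmission rate, gamma: recovery rate, so R0 = beta / gamma.
# Parameters below are illustrative, not fitted to any real epidemic.

def sir(beta, gamma, s0, i0, days=200, dt=0.1):
    s, i, r = float(s0), float(i0), 0.0
    n = s + i + r
    peak = i
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt   # new infections this step
        new_rec = gamma * i * dt          # new recoveries this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return s, i, r, peak

# R0 = 0.5 / 0.2 = 2.5: a large outbreak in a population of 10,000
final_s, final_i, final_r, peak = sir(beta=0.5, gamma=0.2, s0=9999, i0=1)
```

Policy analyses of the kind described above then compare such runs under interventions, e.g. lowering beta (contact reduction, culling, vaccination) until R0 drops toward 1 and the epidemic no longer takes off.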
Naito, Masashi; Watanabe, Shun; Matsumoto, Ryutaroh; Uyematsu, Tomohiko
We consider the problem of secret key agreement in Gaussian Maurer's model. In this model, the legitimate receivers, Alice and Bob, and a wire-tapper, Eve, receive signals randomly generated by a satellite through three independent memoryless Gaussian channels. Alice and Bob then generate a common secret key from their received signals. We propose a protocol for generating a common secret key that uses soft-decision values of Alice's and Bob's received signals, and we calculate a lower bound on the secret key rate of the proposed protocol. Compared with a protocol that uses only hard decisions, our protocol achieves a higher secret key rate.
Jollans, Lee; Whelan, Robert; Venables, Louise; Turnbull, Oliver H; Cella, Matteo; Dymond, Simon
Complex human cognition, such as decision-making under ambiguity, is reflected in dynamic spatio-temporal activity in the brain. Here, we combined event-related potentials with computational modelling of the time course of decision-making and outcome evaluation during the Iowa Gambling Task. Measures of choice probability generated using the Prospect Valence Learning Delta (PVL-Delta) model, in addition to objective trial outcomes (outcome magnitude and valence), were applied as regressors in a general linear model of the EEG signal. The resulting three-dimensional spatio-temporal characterization of task-related neural dynamics demonstrated that outcome valence, outcome magnitude, and PVL-Delta choice probability were expressed in distinctly separate event related potentials. Our findings showed that the P3 component was associated with an experience-based measure of outcome expectancy.
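The PVL-Delta model named above combines a prospect-style utility function, a delta learning rule, and a softmax choice rule. A sketch with illustrative parameter values (in practice the parameters are estimated per participant):

```python
import math

# Sketch of the Prospect Valence Learning model with the Delta learning rule
# (PVL-Delta) for the Iowa Gambling Task. Parameter values are illustrative.

def pvl_delta_probs(trials, shape=0.5, loss_aversion=1.5, lr=0.2, c=1.0):
    """trials: list of (chosen_deck, net_payoff) over 4 decks.
    Returns the model's probability of each observed choice."""
    theta = 3 ** c - 1                        # choice sensitivity
    ev = [0.0] * 4                            # expected valence per deck
    probs = []
    for deck, x in trials:
        expo = [math.exp(theta * e) for e in ev]
        probs.append(expo[deck] / sum(expo))  # softmax choice rule
        # prospect-style utility: diminishing sensitivity, loss aversion
        u = x ** shape if x >= 0 else -loss_aversion * abs(x) ** shape
        ev[deck] += lr * (u - ev[deck])       # delta learning rule
    return probs

p = pvl_delta_probs([(0, 100)] * 10)          # repeated wins on deck 0
```

These trial-by-trial choice probabilities are the kind of model-derived quantity that can be entered as a regressor in a general linear model of the EEG signal, as the study describes.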
Niswonger, Richard; Allander, Kip K.; Jeton, Anne E.
A terminal lake basin in west-central Nevada, Walker Lake, has undergone drastic change over the past 90 years due to upstream water use for agriculture. Decreased inflows to the lake have resulted in a 100 km2 decrease in lake surface area and a total loss of fisheries due to salinization. The ecologic health of Walker Lake is of great concern as the lake is a stopover point on the Pacific route for migratory birds from within and outside the United States. Stakeholders, water institutions, and scientists have engaged in collaborative modeling and the development of a decision support system that is being used to develop and analyze management change options to restore the lake. Here we use an integrated management and hydrologic model that relies on state-of-the-art simulation capabilities to evaluate the benefits of using integrated hydrologic models as components of a decision support system. Nonlinear feedbacks among climate, surface-water and groundwater exchanges, and water use present challenges for simulating realistic outcomes associated with management change. Integrated management and hydrologic modeling provides a means of simulating benefits associated with management change in the Walker River basin, where drastic changes in the hydrologic landscape have taken place over the last century. Through the collaborative modeling process, stakeholder support is increasing and possibly leading to management change options that result in reductions in Walker Lake salt concentrations, as simulated by the decision support system.
Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Basieva, Irina; Khrennikov, Andrei
We present a quantum-like model of decision making in games of the Prisoner's Dilemma type. By this model the brain processes information by using representation of mental states in a complex Hilbert space. Driven by the master equation, the mental state of a player, say Alice, approaches an equilibrium point in the space of density matrices (representing mental states). This equilibrium state determines Alice's mixed (i.e., probabilistic) strategy. We use a master equation in which quantum physics describes the process of decoherence as the result of interaction with the environment. Thus our model is a model of thinking through decoherence of the initially pure mental state. Decoherence is induced by the interaction with memory and the external mental environment. We study (numerically) the dynamics of quantum entropy of Alice's mental state in the process of decision making. We also consider classical entropy corresponding to Alice's choices. We introduce a measure of Alice's diffidence as the difference between classical and quantum entropies of Alice's mental state. We see that (at least in our model example) diffidence decreases (approaching zero) in the process of decision making. Finally, we discuss the problem of neuronal realization of quantum-like dynamics in the brain, especially the roles played by the lateral prefrontal cortex and/or orbitofrontal cortex.
Apperl, B.; Pulido-Velazquez, M.; Andreu, J.; Llopis-Albert, C.
observed reactions: acceptance of more rigorous measures, on the one hand, and a tendency toward softer measures with the same cost, as a reaction to the decreased effectiveness of the alternatives. The implementation of the method in a very complex case study, with many conflicting objectives and alternatives and uncertain outcomes, including future scenarios (climate change), illustrates the potential of the method for supporting management decisions.
Geller, G.; Nativi, S.
Rapid climate and socioeconomic changes may be outrunning society's ability to understand, predict, and respond to change effectively. Decision makers want better information about what these changes will be and how various resources will be affected, while researchers want better understanding of the components and processes of ecological systems, how they interact, and how they respond to change. Although there are many excellent models in ecology and related disciplines, there is only limited coordination among them, and accessible, openly shared models or model systems that can be consulted to gain insight on important ecological questions or assist with decision-making are rare. A "consultative infrastructure" that increased access to and sharing of models and model outputs would benefit decision makers, researchers, as well as modelers. Of course, envisioning such an ambitious system is much easier than building it, but several complementary approaches exist that could contribute. The one discussed here is called the Model Web. This is a concept for an open-ended system of interoperable computer models and databases based on making models and their outputs available as services ("model as a service"). Initially, it might consist of a core of several models from which it could grow gradually as new models or databases were added. However, a model web would not be a monolithic, rigidly planned and built system--instead, like the World Wide Web, it would grow largely organically, with limited central control, within a framework of broad goals and data exchange standards. One difference from the WWW is that a model web is much harder to create, and has more pitfalls, and thus is a long term vision. However, technology, science, observations, and models have advanced enough so that parts of an ecological model web can be built and utilized now, forming a framework for gradual growth as well as a broadly accessible infrastructure. Ultimately, the value of a model
Lingga, Marwan Mossa
A strong trend of returning to nuclear power is evident in different places in the world. Forty-five countries are planning to add nuclear power to their grids, and more than 66 nuclear power plants are under construction. Nuclear power plants that generate electricity and steam need to improve safety to become more acceptable to governments and the public. One novel practical solution for increasing nuclear power plants' safety factor is to build them away from urban areas, such as offshore or underground. To date, Land-Based siting is the dominant option for all commercial operational nuclear power plants. However, the literature reveals several options for building nuclear power plants in safer sitings than Land-Based ones. There are several alternatives, each with advantages and disadvantages, and it is difficult to distinguish among them and choose the best for a specific project. In this research, we revisit the old idea of using offshore and underground sitings for new nuclear power plants and propose a tool to help in choosing the best siting technology. This research involved the development of a decision model for evaluating several potential nuclear power plant siting technologies, both those that are currently available and future ones. The decision model was developed based on the Hierarchical Decision Modeling (HDM) methodology. The model considers five major dimensions, social, technical, economic, environmental, and political (STEEP), and their related criteria and sub-criteria. The model was designed and developed by the author, and its elements were validated and evaluated by a large number of experts in the field of nuclear energy. The decision model was applied to evaluate five potential siting technologies and ranked the Natural Island as the best in comparison to Land-Based, Floating Plant, Artificial Island, and Semi-Embedded plants.
This research paper addresses problems connected with controlling the operation (exploitation) process implemented in complex systems of technical objects. The article describes a method for controlling the availability of technical objects (means of transport) on the basis of a mathematical model of the exploitation process, with the decision processes implemented as semi-Markov processes. The method focuses on preparing a decision model of the exploitation process of technical objects (a semi-Markov model) and then determining the best control strategy (the optimal strategy) from among the possible decision variants, in accordance with the adopted criterion (or criteria) for evaluating the operation of the system. In this method, determining the optimal strategy for controlling the availability of technical objects means choosing a sequence of control decisions, made in the individual states of the modelled exploitation process, for which the evaluation criterion function reaches its extreme value. A genetic algorithm was chosen to find the optimal control strategy. The method is illustrated with the example of the exploitation process of means of transport implemented in a real municipal bus transport system. The model of the exploitation process for the means of transport was prepared on the basis of results collected in this real transport system. The mathematical model of the exploitation process was built under the assumption that the process is a homogeneous semi-Markov process.
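The search described above, choosing one decision per process state so that an evaluation criterion is extremized, can be sketched as a genetic algorithm over strategies for a toy semi-Markov process. All transition, reward, and sojourn-time data below are hypothetical, not taken from the bus transport system.

```python
import random

# Toy GA over control strategies for a semi-Markov operation process:
# a strategy assigns one decision to each state; fitness is long-run
# reward per unit time, estimated by simulation. Data are hypothetical.

random.seed(1)
N_STATES, N_DECISIONS = 4, 3

def rand_dist(n):
    xs = [random.random() for _ in range(n)]
    z = sum(xs)
    return [x / z for x in xs]

# P[s][d]: next-state distribution; R[s][d]: reward; T[s][d]: mean sojourn time
P = [[rand_dist(N_STATES) for _ in range(N_DECISIONS)] for _ in range(N_STATES)]
R = [[random.uniform(1, 10) for _ in range(N_DECISIONS)] for _ in range(N_STATES)]
T = [[random.uniform(1, 5) for _ in range(N_DECISIONS)] for _ in range(N_STATES)]

def fitness(strategy, steps=500):
    """Simulated long-run reward per unit time under a strategy."""
    s, reward, time = 0, 0.0, 0.0
    for _ in range(steps):
        d = strategy[s]
        reward += R[s][d]
        time += T[s][d]
        s = random.choices(range(N_STATES), weights=P[s][d])[0]
    return reward / time

def evolve(pop_size=16, gens=15, p_mut=0.2):
    pop = [[random.randrange(N_DECISIONS) for _ in range(N_STATES)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_STATES)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:               # point mutation
                child[random.randrange(N_STATES)] = random.randrange(N_DECISIONS)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

The chromosome here is simply the decision sequence indexed by state, which matches the paper's notion of a strategy; a real application would replace the simulated fitness with the system's adopted evaluation criterion.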
Aenishaenslin, Cécile; Gern, Lise; Michel, Pascal; Ravel, André; Hongoh, Valérie; Waaub, Jean-Philippe; Milord, François; Bélanger, Denise
Designing preventive programs relevant to vector-borne diseases such as Lyme disease (LD) can be complex given the need to include multiple issues and perspectives into prioritizing public health actions. A multi-criteria decision aid (MCDA) model was previously used to rank interventions for LD prevention in Quebec, Canada, where the disease is emerging. The aim of the current study was to adapt and evaluate the decision model constructed in Quebec under a different epidemiological context, in Switzerland, where LD has been endemic for the last thirty years. The model adaptation was undertaken with a group of Swiss stakeholders using a participatory approach. The PROMETHEE method was used for multi-criteria analysis. Key elements and results of the MCDA model are described and contrasted with the Quebec model. All criteria and most interventions of the MCDA model developed for LD prevention in Quebec were directly transferable to the Swiss context. Four new decision criteria were added, and the list of proposed interventions was modified. Based on the overall group ranking, interventions targeting human populations were prioritized in the Swiss model, with the top ranked action being the implementation of a large communication campaign. The addition of criteria did not significantly alter the intervention rankings, but increased the capacity of the model to discriminate between highest and lowest ranked interventions. The current study suggests that beyond the specificity of the MCDA models developed for Quebec and Switzerland, their general structure captures the fundamental and common issues that characterize the complexity of vector-borne disease prevention. These results should encourage public health organizations to adapt, use and share MCDA models as an effective and functional approach to enable the integration of multiple perspectives and considerations in the prevention and control of complex public health issues such as Lyme disease or other vector
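The PROMETHEE method used for the multi-criteria analysis can be sketched as PROMETHEE II net outranking flows with the "usual" preference function. The interventions, criteria scores, and weights below are hypothetical, not those elicited from the Quebec or Swiss stakeholders.

```python
# Minimal PROMETHEE II sketch: pairwise preference indices, positive and
# negative outranking flows, and ranking by net flow. Data are hypothetical.

def promethee_ii(scores, weights):
    """scores[a][k]: performance of alternative a on criterion k (higher is
    better); weights sum to 1. Returns the net flow of each alternative."""
    n = len(scores)

    def pref(a, b):  # "usual" preference function on each criterion
        return sum(w * (1.0 if scores[a][k] > scores[b][k] else 0.0)
                   for k, w in enumerate(weights))

    net = []
    for a in range(n):
        phi_plus = sum(pref(a, b) for b in range(n) if b != a) / (n - 1)
        phi_minus = sum(pref(b, a) for b in range(n) if b != a) / (n - 1)
        net.append(phi_plus - phi_minus)
    return net

# three hypothetical interventions scored on cost, efficacy, acceptability
scores = [[0.8, 0.4, 0.6],
          [0.5, 0.9, 0.7],
          [0.3, 0.6, 0.2]]
weights = [0.2, 0.5, 0.3]
net = promethee_ii(scores, weights)   # rank by descending net flow
```

Group rankings of the kind reported in the study are typically obtained by aggregating such flows across stakeholders' individual weight sets.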
Brennan, Alan; Meier, Petra; Purshouse, Robin; Rafia, Rachid; Meng, Yang; Hill-Macmanus, Daniel
This paper sets out the development of a methodological framework for detailed evaluation of public health strategies for alcohol harm reduction to meet UK policy-makers' needs. Alcohol is known to cause substantial harms, and controlling its affordability and availability are effective policy options, so a sound evaluation of impact is important; analysis and synthesis of a variety of public and commercial data sources is needed to evaluate impact on consumers, health services, crime, employers and industry. We discuss the iterative process to engage with stakeholders, identify evidence/data, develop analytic approaches and produce a final model structure. We set out a series of steps in modelling impact including: classification and definition of population subgroups of interest, identification and definition of harms and outcomes for inclusion, classification of modifiable components of risk and their baseline values, specification of the baseline position on policy variables especially prices, estimating effects of changing policy variables on risk factors including price elasticities, quantifying risk functions relating risk factors to harms including 47 health conditions, crimes, absenteeism and unemployment, and monetary valuation. The most difficult model structuring decisions are described, as well as the final results framework used to provide decision support to national level policymakers in the UK. In the discussion we explore issues around the relationship between modelling and policy debates, valuation and scope, limitations of evidence/data, and how the framework can be adapted to other countries and decisions. We reflect on the approach taken and outline ongoing plans for further development.
Humphries Choptiany, John Michael; Pelot, Ronald
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decision-makers (DMs) due to its technical complexity; range of environmental, social, and economic impacts; variety of stakeholders; and long time spans. The authors have developed a risk assessment model using an MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions.
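The combination of uncertainty measures, utility curves, and Monte Carlo simulation described above can be sketched as follows. The options, criteria, weights, utility curve, and uncertainty ranges are illustrative placeholders, not data from the article.

```python
import random

# Sketch of MCDA with risk analysis: uncertain criterion scores are sampled,
# mapped through a utility curve, and aggregated by weighted sum over Monte
# Carlo draws. All numbers are hypothetical.

random.seed(0)

def utility(x):                 # concave (risk-averse) utility curve on [0, 1]
    return x ** 0.5

# each option: criterion -> (midpoint, half-width) of a uniform uncertainty range
options = {
    "site_A": {"cost": (0.6, 0.2), "leak_risk": (0.7, 0.3), "social": (0.5, 0.1)},
    "site_B": {"cost": (0.5, 0.1), "leak_risk": (0.8, 0.1), "social": (0.6, 0.2)},
}
weights = {"cost": 0.3, "leak_risk": 0.5, "social": 0.2}

def expected_utility(option, n=5000):
    total = 0.0
    for _ in range(n):
        score = 0.0
        for crit, (mid, half) in option.items():
            x = min(1.0, max(0.0, random.uniform(mid - half, mid + half)))
            score += weights[crit] * utility(x)
        total += score
    return total / n

eu = {name: expected_utility(opt) for name, opt in options.items()}
```

Sensitivity analysis of the kind the article reports would repeat this aggregation while perturbing the weights and uncertainty ranges to see whether the preferred option changes.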
Morlock, L L; Alexander, J A
This study utilizes data from a national survey of 159 multihospital systems in order to describe the types of governance structures currently being utilized, and to compare the policy making process for various types of decisions in systems with different approaches to governance. Survey results indicate that multihospital systems most often use one of three governance models. Forty-one percent of the systems (including 33% of system hospitals) use a parent holding company model in which there is a system-wide corporate governing board and separate governing boards for each member hospital. Twenty-two percent of systems in the sample (but 47% of all system hospitals) utilize what we have termed a modified parent holding company model in which there is one system-wide governing board, but advisory boards are substituted for governing boards at the local hospital level. Twenty-three percent of the sampled systems (including 11% of system hospitals) use a corporate model in which there is one system-wide governing board but no other governing or advisory boards at either the divisional, regional or local hospital levels. A comparison of systems using these three governance approaches found significant variation in terms of system size, ownership and the geographic proximity of member hospitals. In order to examine the relationship between alternative approaches to governance and patterns of decision-making, the three model types were compared with respect to the percentages of systems reporting that local boards, corporate management and/or system-wide corporate boards have responsibility for decision-making in a number of specific issue areas. Study results indicate that, regardless of model type, corporate boards are most likely to have responsibility for decisions regarding the transfer, pledging and sale of assets; the formation of new companies; purchase of assets greater than $100,000; changes in hospital bylaws; and the appointment of local board members. In
Wickens, Christopher; Vieanne, Alex; Clegg, Benjamin; Sebok, Angelia; Janes, Jessica
Fifty-six participants time-shared a spacecraft environmental control system task with a realistic space robotic arm control task in either a manual or highly automated version. The former could suffer minor failures, whose diagnosis and repair were supported by a decision aid. At the end of the experiment this decision aid unexpectedly failed. We measured visual attention allocation and switching between the two tasks in each of the eight conditions formed by manual-automated arm X expected-unexpected failure X monitoring-failure management. We also used our multi-attribute task switching model, based on task attributes of priority, interest, difficulty and salience that were self-rated by participants, to predict allocation. An un-weighted model based on attributes of difficulty, interest and salience accounted for 96 percent of the task allocation variance across the 8 different conditions. Task difficulty served as an attractor, with more difficult tasks increasing the tendency to stay on task.
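An un-weighted multi-attribute model of this kind amounts to predicting each task's share of attention from the sum of its self-rated attributes. A sketch with hypothetical ratings (the attribute names follow the abstract; the values are invented):

```python
# Sketch of an un-weighted multi-attribute allocation model: sum each task's
# self-rated attributes and predict attention allocation as that task's share
# of the grand total. Ratings below are hypothetical.

def predicted_allocation(ratings):
    """ratings: {task: {attribute: self-rated value}}"""
    totals = {task: sum(attrs.values()) for task, attrs in ratings.items()}
    grand = sum(totals.values())
    return {task: total / grand for task, total in totals.items()}

alloc = predicted_allocation({
    "robotic_arm": {"difficulty": 4, "interest": 5, "salience": 3},
    "env_control": {"difficulty": 2, "interest": 3, "salience": 3},
})
```

The predicted shares would then be compared against the measured visual attention allocation in each condition.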
Ranger, N. A.; Smith, L. A.; Stainforth, D.; Millner, A.; Niehoerster, F.
Traditionally, climate change risk assessments have taken a roughly four-stage linear ‘chain’ approach of moving from socioeconomic projections, to climate projections, to primary impacts and then finally onto economic and social impact assessment. Adaptation decisions are then made on the basis of these outputs. The escalation of uncertainty through this chain is well known, resulting in an ‘explosion’ of uncertainties in the final risk assessment. Today, the space of plausible future risk scenarios is growing ever wider with the application of new techniques which aim to explore uncertainty ever more deeply, such as the perturbed physics ensembles used to generate the recent ‘probabilistic’ UK Climate Projections 2009 and the stochastic integrated assessment models, for example PAGE2002. While this uncertainty information is important, the explosion of uncertainty can make decision-making problematic, particularly given that the uncertainty information communicated cannot be treated as strictly probabilistic and therefore is not an easy fit with standard approaches to decision-making under uncertainty. Additional problems can arise from the fact that the uncertainty estimated for different components of the ‘chain’ is rarely directly comparable or combinable. Here we demonstrate how ‘putting the economics first’ in risk and adaptation assessments, that is, almost reversing the traditional linear chain of a risk assessment, can collapse the uncertainties. Such an approach forces one to focus on the information requirements for the specific decision, simplifying the assessment. We demonstrate this approach for two case studies: an illustrative coastal village susceptible to storm surge and coastal flooding, and a Caribbean island exposed to wind-related tropical cyclone risks. We reflect on what such a change in approach might mean for the design of climate model experiments for adaptation decision-making.
Multicriteria decision analysis (MCDA) is rightly receiving increasing attention in health technology assessment. Nevertheless, a distinguishing feature of the health domain is that technologies must actually improve health, and good performance on other criteria cannot compensate for failure to do so. We argue for two reasonable tests for MCDA models: the treacle test (can a winning intervention be completely ineffective?) and the smallpox test (can a winning intervention be for a disease that no one suffers from?). We explore why models might fail such tests (as the models of some existing published studies would do) and offer some suggestions as to how practice should be improved.
Liu, Xiaoqin; Wang, Fuzhang; Wang, Pu
To address the randomness and fuzziness of railway emergencies, this paper introduces a decision-making method for railway emergencies based on combination weighting and the cloud model. First, in order to enhance the consistency of the subjective and objective components of the combined weights, adjustment equations for the weight coefficients are established using the Euclidean distance; combined weights are then calculated by means of an improved analytic hierarchy process (IAHP) and the entropy weight method. Second, the decision-making information of experts is converted into cloud parameters for each index using the cloud model, and the cloud parameters of the alternatives are obtained by integrating the combined weights with the index cloud parameters. Third, the best alternative is identified by analyzing and comparing the cloud parameters or cloud images of the alternatives. Finally, the effectiveness and feasibility of the method are verified by a case study.
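The entropy weight method mentioned in this abstract is a standard objective-weighting step; a minimal sketch with a toy decision matrix (not the paper's data; the IAHP and Euclidean-distance adjustment steps are omitted):

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights via the entropy weight method.
    X: m alternatives x n benefit-type criteria (all entries positive)."""
    m, n = X.shape
    P = X / X.sum(axis=0)                      # each alternative's share per criterion
    k = 1.0 / np.log(m)
    # treat 0 * log(0) as 0
    E = -k * np.where(P > 0, P * np.log(P), 0.0).sum(axis=0)  # entropy per criterion
    d = 1.0 - E                                # degree of divergence
    return d / d.sum()                         # normalized weights

# hypothetical scores of four emergency-response alternatives on three criteria
X = np.array([[7.0, 5.0, 9.0],
              [6.0, 5.0, 2.0],
              [8.0, 5.0, 7.0],
              [7.0, 5.0, 4.0]])
w = entropy_weights(X)
```

Criteria on which all alternatives score identically (column 2 here) carry zero entropy weight; criteria with larger spread receive larger weight.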
Zhang, Dezhi; Li, Shuangyan
This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for a regional logistics network with environmental impact consideration. The proposed model addresses the interaction among the three logistics players in a completely competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economies of the logistics park and the logistics users' demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand of green logistics development by two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete on logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given the logistics operators' service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level. PMID:24977209
Mandali, Alekhya; Rengaswamy, Maithreye; Chakravarthy, V. Srinivasa; Moustafa, Ahmed A.
To make an optimal decision we need to weigh all the available options, compare them with the current goal, and choose the most rewarding one. Depending on the situation, an optimal decision could be to “explore,” “exploit,” or “not take any action,” for which the Basal Ganglia (BG) is considered to be a key neural substrate. In an attempt to expand this classical picture of BG function, we had earlier hypothesized that the Indirect Pathway (IP) of the BG could be the subcortical substrate for exploration. In this study we build a spiking network model to relate exploration to synchrony levels in the BG (which are a neural marker for tremor in Parkinson's disease). Key BG nuclei such as the Subthalamic Nucleus (STN), Globus Pallidus externus (GPe) and Globus Pallidus internus (GPi) were modeled as Izhikevich spiking neurons, whereas the Striatal output was modeled as Poisson spikes. The model is cast in a reinforcement learning framework, with the dopamine signal representing reward prediction error. We apply the model to two decision-making tasks: a binary action selection task (similar to one used by Humphries et al., 2006) and an n-armed bandit task (Bourdaud et al., 2008). The model shows that exploration levels could be controlled by the STN's lateral connection strength, which also influenced the synchrony levels in the STN-GPe circuit. An increase in the STN's lateral strength led to a decrease in exploration, which can be thought of as a possible explanation for the reduced exploratory levels in Parkinson's patients. Our simulations also show that on complete removal of the IP, the model exhibits only Go and No-Go behaviors, thereby demonstrating the crucial role of the IP in exploration. Our model provides a unified account of synchronization, action selection, and explorative behavior. PMID:26074761
Nobrega, José N.; Hedayatmofidi, Parisa S.; Lobo, Daniela S.
Risky decision-making is characteristic of depression and of addictive disorders, including pathological gambling. However, it is not clear whether a propensity for risky choices predisposes to depressive symptoms or whether the converse is the case. Here we tested the hypothesis that rats showing risky decision-making in a rat gambling task (rGT) would be more prone to depressive-like behaviour in the learned helplessness (LH) model. Results showed that baseline rGT choice behaviour did not predict escape deficits in the LH protocol. In contrast, exposure to the LH protocol resulted in a significant increase in risky rGT choices on retest. Unexpectedly, control rats subjected only to escapable stress in the LH protocol showed a subsequent decrease in riskier rGT choices. Further analyses indicated that the LH protocol affected primarily rats with high baseline levels of risky choices, and that among these it had opposite effects in rats exposed to LH-inducing stress compared to rats exposed only to the escape trials. Together, these findings suggest that while baseline risky decision-making may not predict LH behaviour, it interacts strongly with LH conditions in modulating subsequent decision-making behaviour. The suggested possibility that stress controllability may be a key factor should be further investigated. PMID:27857171
Hales, Claire A; Robinson, Emma S J; Houghton, Conor J
Human decision making is modified by emotional state. Rodents exhibit similar biases during interpretation of ambiguous cues that can be altered by affective state manipulations. In this study, the impact of negative affective state on judgement bias in rats was measured using an ambiguous-cue interpretation task. Acute treatment with an anxiogenic drug (FG7142), and chronic restraint stress and social isolation both induced a bias towards more negative interpretation of the ambiguous cue. The diffusion model was fit to behavioural data to allow further analysis of the underlying decision making processes. To uncover the way in which parameters vary together in relation to affective state manipulations, independent component analysis was conducted on rate of information accumulation and distances to decision threshold parameters for control data. Results from this analysis were applied to parameters from negative affective state manipulations. These projected components were compared to control components to reveal the changes in decision making processes that are due to affective state manipulations. Negative affective bias in rodents induced by either FG7142 or chronic stress is due to a combination of more negative interpretation of the ambiguous cue, reduced anticipation of the high reward and increased anticipation of the low reward.
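The diffusion model fitted in this study can be illustrated with a simple first-passage simulation; a hedged sketch with hypothetical parameter values (not those fitted to the rodent data), where a more negative drift rate plays the role of a negative interpretation bias:

```python
import numpy as np

rng = np.random.default_rng(0)

def ddm_trial(drift, threshold=1.0, dt=1e-3, sigma=1.0, max_t=5.0):
    """Simulate one diffusion-model trial: returns (choice, reaction time)."""
    x, t = 0.0, 0.0
    while t < max_t:
        # evidence accumulates with drift plus Gaussian noise
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= threshold:
            return 1, t      # 'positive' interpretation of the ambiguous cue
        if x <= -threshold:
            return 0, t      # 'negative' interpretation
    return None, max_t       # no decision within the trial

# a negative drift (illustrating an affective-state manipulation) biases choices toward 0
choices = [ddm_trial(drift=-0.8)[0] for _ in range(500)]
```

With these illustrative parameters the large majority of trials terminate at the lower ('negative') boundary, mimicking the negative judgement bias described above.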
Schaefer, M. K.
A Markovian decision model was developed to calculate the optimal inventory of repairable spare parts for an avionics control system for commercial aircraft. Total expected shortage costs, repair costs, and holding costs are minimized for a machine containing a single system of redundant parts. Transition probabilities are calculated for each repair state and repair rate, and optimal spare parts inventory and repair strategies are determined through linear programming. The linear programming solutions are given in a table.
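The abstract describes solving a Markovian decision model by linear programming; a minimal sketch of that general technique with SciPy, using hypothetical costs and transition probabilities rather than the avionics data:

```python
import numpy as np
from scipy.optimize import linprog

gamma = 0.9
# states: 0 = spare available, 1 = stock-out; actions: 0 = hold, 1 = expedite repair
# hypothetical per-period costs c[s, a] and transition probabilities P[s, a, s']
c = np.array([[1.0, 4.0],
              [10.0, 6.0]])
P = np.array([[[0.9, 0.1], [0.99, 0.01]],
              [[0.2, 0.8], [0.8, 0.2]]])

nS, nA = c.shape
# LP for discounted cost minimization: maximize sum_s V(s)
# subject to V(s) - gamma * sum_s' P[s,a,s'] V(s') <= c[s,a] for every (s, a)
A_ub, b_ub = [], []
for s in range(nS):
    for a in range(nA):
        row = -gamma * P[s, a]
        row[s] += 1.0
        A_ub.append(row)
        b_ub.append(c[s, a])
res = linprog(c=-np.ones(nS), A_ub=np.array(A_ub), b_ub=b_ub,
              bounds=[(None, None)] * nS)
V = res.x
# greedy policy with respect to the optimal value function
policy = [int(np.argmin([c[s, a] + gamma * P[s, a] @ V for a in range(nA)]))
          for s in range(nS)]
```

Here the optimal policy holds in the spare-available state and expedites repair in the stock-out state, and the stock-out state carries the higher expected discounted cost.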
Abram, Samantha V.; Breton, Yannick-André; Schmidt, Brandy; Redish, A. David; MacDonald, Angus W.
Animal models of decision-making are some of the most highly regarded psychological process models; however, there remains a disconnection between how these models are used for pre-clinical applications and the resulting treatment outcomes. This may be due to untested assumptions that different species recruit the same neural or psychological mechanisms. We propose a novel human foraging paradigm (Web-Surf Task) that we translated from a rat foraging paradigm (Restaurant Row) to evaluate cross-species decision-making similarities. We examined behavioral parallels in human and nonhuman animals using the respective tasks. We also compared two variants of the human task, one using videos and the other using photos as rewards, by correlating revealed and stated preferences. We demonstrate similarities in choice behaviors and decision reaction times in human and rat subjects. Findings also indicate that videos yielded more reliable and valid results. The joint use of the Web-Surf Task and Restaurant Row is therefore a promising approach for functional translational research, aiming to bridge pre-clinical and clinical lines of research using analogous tasks. PMID:26377334
Rodrigues, Leonor; Calheiros, Manuela; Pereira, Cícero
Out-of-home placement decisions in residential care are complex, ambiguous and full of uncertainty, especially in cases of parental neglect. The literature on this topic has so far been unable to identify the sources of error involved in those decisions and still fails to focus on professionals' decision-making processes. Therefore, this work tests a socio-psychological model of decision-making that is a more integrated, dualistic and ecological version of the Theory of Planned Behavior model. It describes the process through which the decision maker takes personal, contextual and social factors of the Decision-Making Ecology into account when defining his or her decision threshold. One hundred and ninety-five professionals from different Children and Youth Protection Units throughout the Portuguese territory participated in this online study. After reading a vignette of a (psychological and physical) neglect case involving a one-year-old child, participants were presented with a set of questions that measured workers' assessment of risk, intention, attitude, subjective norm, behavior control and beliefs toward the residential care placement decision, as well as workers' behavioral experience, emotions and family/child-related values involved in that decision. A set of structural equation modeling analyses confirmed the good fit of the proposed model. The intention to propose a residential care placement was determined by cognitive, social, affective, value-laden and experience variables and by the perceived risk. Altogether, our model explained 61% of the variance in professionals' decisions in a parental neglect case. The theoretical and practical implications of these results are discussed, namely the importance of raising awareness of these biased psychosocial determinants.
Quinodoz, H. A.
About five percent of the US population depends on the waters from the Delaware River Basin for its water supply, including New York City and Philadelphia. Water management in the basin is governed by a compact signed in 1961 by the four basin states and the federal government. The compact created the Delaware River Basin Commission (DRBC) and gave it broad powers to plan, regulate, and manage the development of the basin water resources. The compact also recognized a pre-existing (1954) U.S. Supreme Court Decree that grants the City of New York the right to export up to 800 million gallons per day out of the basin, provided that a prescribed minimum flow is met at Montague, New Jersey for the use of the lower-basin states. The Delaware River Basin Compact also allows the DRBC to adjust the releases and diversions under the Decree, subject to the unanimous consent of the decree parties. This mechanism has been used several times over the last 30 years to implement and modify rules governing drought operations, instream flows, minimum flow targets, and control of salinity intrusion. In every case, decision makers have relied upon extensive modeling of alternative proposals, using a basin-wide daily flow model. Often, stakeholders have modified and used the same model to test and refine their proposals prior to consideration by the decision makers. The flow model has been modified over the years to simulate new features and processes in a river system partially controlled by more than ten reservoirs. The flow model has proved to be an adaptable tool, able to simulate the dynamics of a complex system driven by conflicting objectives. This presentation reviews the characteristics of the daily flow model in its current form, discusses how model simulations are used to inform the decision-making process, and provides a case study of a recent modification of the system-wide drought operating plan.
Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.
Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced definitive guidance on which models must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.
Habib, Shahid; Pickering, Ken; Tzortziou, Maria; Maninio, Antonio; Policelli, Fritz; Stehr, Jeff
The Gulf of Mexico Modeling Framework is a suite of coupled models linking the deposition and transport of sediment and nutrients to subsequent biogeochemical processes and the resulting effect on concentrations of dissolved oxygen in the coastal waters of Louisiana and Texas. Here, we examine the potential benefits of using multiple NASA remote sensing data products within this Modeling Framework for increasing the accuracy of the models and their utility for nutrient control decisions in the Gulf of Mexico. Our approach is divided into three components: evaluation and improvement of (a) the precipitation input data, (b) atmospheric constituent concentrations in EPA's air quality/deposition model, and (c) the calculation of algal biomass, organic carbon and suspended solids within the water quality/eutrophication models of the framework.
Wu, Zhihua; Guo, Aike
Previous elegant experiments in a flight simulator showed that conditioned Drosophila are able to make a clear-cut decision to avoid potential danger. When confronted with conflicting visual cues, the relative saliency of the two competing cues serves as a sensory ruler for flies to judge which cue should be used for decision-making. Further genetic manipulations and immunohistological analysis revealed that the dopamine system and the mushroom bodies are indispensable for such a clear-cut, or nonlinear, decision. The neural circuit mechanism, however, is far from clear. In this paper, we adopt a computational modeling approach to investigate how different brain areas and the dopamine system work together to drive a fly to make a decision. By developing a systems-level neural network, a two-pathway circuit is proposed. Besides a direct pathway from a feature-binding area to the motor center, another pathway connects the two areas via the mushroom body, a target of dopamine release. A raised dopamine level is hypothesized to be induced by complex choice tasks and to enhance lateral inhibition and steepen the units' response gain in the mushroom body. Simulations show that training helps to assign values to formerly neutral features. For a circuit model with a blocked mushroom body, the direct pathway passes all alternatives to the motor center without changing their original values, giving rise to a simple choice characterized by a linear choice curve. For an intact circuit, enhanced lateral inhibition dependent on dopamine critically promotes competition between alternatives, turning the linear choice behavior into nonlinear choice behavior. The results account well for the experimental data, supporting the reasonableness of the model's working hypotheses. Several testable predictions are made for future studies.
Keating, Elizabeth; Bacon, Diana; Carroll, Susan; Mansoor, Kayyum; Sun, Yunwei; Zheng, Liange; Harp, Dylan; Dai, Zhenxue
The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites (www.netl.doe.gov/nrap). This capability includes polynomial or look-up-table-based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014; Dai et al., 2014; Keating et al., 2015). The ROMs reproduce the ensemble behavior of large numbers of simulations and are well suited to applications that consider a large number of scenarios to understand parameter sensitivity and uncertainty in the risk of CO2 leakage to groundwater quality. In this paper, we seek to demonstrate the applicability of ROM-based ensemble analysis by considering which types of decisions and aquifer types would benefit from ROM analysis. We present four hypothetical examples where applying ROMs, in ensemble mode, could support decisions in the early stages of a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that the derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so may have broader applicability. Based on analysis of the nine water quality metrics (pH, TDS, 4 trace metals, 3 organic compounds), we conclude that the pH and TDS predictions are the most transferable to other aquifers. Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
Wolfslehner, Bernhard; Seidl, Rupert
The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.
Parker, Andrew J
The study of sensory signaling in the visual cortex has been greatly advanced by the recording of neural activity simultaneously with the performance of a specific psychophysical task. Individual nerve cells may also increase their firing leading up to the particular choice or decision made on a single psychophysical trial. Understanding these signals is important because they have been taken as evidence that a particular nerve cell or group of nerve cells in the cortex is involved in the formation of the perceptual decision ultimately signaled by the organism. However, recent analyses show that the size of a decision-related change in firing in a particular neuron is not a secure basis for concluding anything about the contribution of a single neuron to the formation of a decision: rather, the size of the decision-related firing is expected to be dominated by the extent to which the activation of a single neuron is correlated with the firing of the pool of neurons. The critical question becomes what defines membership of a population of neurons. This article presents the proposal that groups of neurons are naturally linked together by their connectivity, which in turn reflects the previous history of sensory stimulations. When a new psychophysical task is performed, a group of neurons relevant to the judgment becomes involved because the firing of some neurons in that group is strongly relevant to the task. This group of neurons is called a micro-pool. This article examines the consequences of such a proposal within the visual nervous system. The main focus is on the signals available from single neurons, but it is argued that models of choice-related signals must scale up to larger numbers of neurons because MRI and MEG studies also show evidence of similar choice signals.
Hunink, J.; Hoogewoud, J. C.; Prinsen, G.; Veldhuizen, A.
The Netherlands Hydrological modeling Instrument (NHI) is the center point of a framework of models used to coherently model the hydrological system and the multitude of functions it supports, providing decision support for Dutch drought management and climate change. The Dutch hydrological institutes Deltares, Alterra, the Netherlands Environmental Assessment Agency, RWS Waterdienst, STOWA and Vewin are cooperating in enhancing the NHI for adequate decision support. The instrument is used by three different ministries involved in national water policy matters, for instance drought management, manure policy and climate change issues. The basis of the modeling instrument is a state-of-the-art online coupling of the groundwater system (MODFLOW), the unsaturated zone (metaSWAP) and the surface water system (MOZART-DM). It brings together hydro(geo)logical processes from the column to the basin scale, ranging from 250x250 m plots to the river Rhine, and includes salt water flow. The NHI is validated with an eight-year run (1998-2006) covering dry and wet periods and is updated every year. During periods of water scarcity the NHI is used as an operational forecasting and decision support system for the National Board of Water Distribution. It provides data on nationwide calculated water demands, the development of water levels in reservoirs and the possible loss of yield in agricultural areas. For the exploration of the future of fresh water supply in the Netherlands, an extensive study has been set up using the NHI. In this study different climate scenarios are being evaluated. In the first phase the focus is on describing the range of possible effects; the second phase focuses on adaptive measures and preparing for decisions on how to alter the hydrological system. Results from the first phase show that in future scenarios fresh water may not be available to current water users. Important decisions about the
Akselrod, Hana; Mercon, Monica; Kirkeby Risoe, Petter; Schlegelmilch, Jeffrey; McGovern, Joanne; Bogucki, Sandy
Modern computational models of infectious diseases greatly enhance our ability to understand new infectious threats and assess the effects of different interventions. The recently released CDC Framework for Preventing Infectious Diseases calls for increased use of predictive modelling of epidemic emergence for public health preparedness. Currently, the utility of these technologies in preparedness and response to outbreaks is limited by gaps between modelling output and information requirements for incident management. The authors propose an operational structure that will facilitate integration of modelling capabilities into action planning for outbreak management, using the Incident Command System (ICS) and Synchronization Matrix framework. It is designed to be adaptable and scalable for use by state and local planners under the National Response Framework (NRF) and Emergency Support Function #8 (ESF-8). Specific epidemiological modelling requirements are described, and integrated with the core processes for public health emergency decision support. These methods can be used in checklist format to align prospective or real-time modelling output with anticipated decision points, and guide strategic situational assessments at the community level. It is anticipated that formalising these processes will facilitate translation of the CDC's policy guidance from theory to practice during public health emergencies involving infectious outbreaks.
Wild, J. Christian; Dong, Jinghuan; Maly, Kurt
Design is a complex activity whose purpose is to construct an artifact that satisfies a set of constraints and requirements. However, the design process itself is not well understood. The software design and evolution process is the focus of interest here, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial, partially implemented instantiation of this model, called 3DPM_p, is presented. The use of this model in software reuse and process management is discussed.
Goodrich, D. C.; Brookshire, D.; Broadbent, C.; Dixon, M. D.; Brand, L. A.; Thacher, J.; Benedict, K. K.; Lansey, K. E.; Stromberg, J. C.; Stewart, S.; McIntosh, M.
Water is a critical component for sustaining both natural and human systems. Yet the value of water for sustaining ecosystem services is not well quantified in monetary terms. Ideally decisions involving water resource management would include an apples-to-apples comparison of the costs and benefits in dollars of both market and non-market goods and services - human and ecosystem. To quantify the value of non-market ecosystem services, scientifically defensible relationships must be developed that link the effect of a decision (e.g. human growth) to the change in ecosystem attributes from current conditions. It is this linkage that requires the "poly-disciplinary" coupling of knowledge and models from the behavioral, physical, and ecological sciences. In our experience another key component of making this successful linkage is development of a strong poly-disciplinary scientific team that can readily communicate complex disciplinary knowledge to non-specialists outside their own discipline. The time to build such a team that communicates well and has a strong sense of trust should not be underestimated. The research described in the presentation incorporated hydrologic, vegetation, avian, economic, and decision models into an integrated framework to determine the value of changes in ecological systems that result from changes in human water use. We developed a hydro-bio-economic framework for the San Pedro River Region in Arizona that considers groundwater, stream flow, and riparian vegetation, as well as abundance, diversity, and distribution of birds. In addition, we developed a similar framework for the Middle Rio Grande of New Mexico. There are six research components for this project: (1) decision support and scenario specification, (2) regional groundwater model, (3) the riparian vegetation model, (4) the avian model, (5) methods for displaying the information gradients in the valuation survey instruments (Choice Modeling and Contingent Valuation), and (6
Jani, Ashesh B.; Hellman, Samuel
Purpose: To determine the relative influence of treatment features and treatment availabilities on final treatment decisions in early prostate cancer. Methods and Materials: We describe and apply a model, based on hedonic prices, to understand provider-patient interactions in prostate cancer. This model included four treatments (observation, external beam radiotherapy, brachytherapy, and prostatectomy) and five treatment features (one efficacy and four treatment complication features). We performed a literature search to estimate (1) the intersections of the 'bid' functions and 'offer' functions with the price function along different treatment feature axes, and (2) the treatments actually rendered in different patient subgroups based on age. We performed regressions to determine the relative weight of each feature in the overall interaction and the relative availability of each treatment modality to explain differences between observed vs. predicted use of different modalities in different patient subpopulations. Results: Treatment efficacy and potency preservation are the major factors influencing decisions for young patients, whereas preservation of urinary and rectal function is much more important for very elderly patients. Referral patterns seem to be responsible for most of the deviations of observed use of different treatments from those predicted by idealized provider-patient interactions. Specifically, prostatectomy is used far more commonly in young patients and radiotherapy and observation used far more commonly in elderly patients than predicted by a uniform referral pattern. Conclusions: The hedonic prices approach facilitated identifying the relative importance of treatment features and quantification of the impact of the prevailing referral pattern on prostate cancer treatment decisions.
Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey
We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.
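The continuous-time Markov treatment of a swarm's progress toward a decision can be sketched with a Gillespie-style simulation of a toy two-option swarm. The rate expressions and parameters here are illustrative assumptions, not the paper's model.

```python
import random

def simulate_symmetry(n_robots=50, t_max=200.0, seed=1):
    """Gillespie simulation of a toy two-choice swarm: each robot switches
    to the other option at a rate that grows with that option's popularity,
    so the symmetry parameter s = (nA - nB) / N drifts toward a decision.
    Rate constants are hypothetical."""
    rng = random.Random(seed)
    nA = n_robots // 2       # start in the fully symmetric state
    t = 0.0
    while t < t_max:
        nB = n_robots - nA
        # positive-feedback switch rates (assumed functional form)
        rate_to_A = nB * (0.1 + nA / n_robots)
        rate_to_B = nA * (0.1 + nB / n_robots)
        total = rate_to_A + rate_to_B
        if total == 0:
            break
        t += rng.expovariate(total)           # exponential waiting time
        if rng.random() < rate_to_A / total:  # pick which event fired
            nA += 1
        else:
            nA -= 1
    return (2 * nA - n_robots) / n_robots     # symmetry parameter in [-1, 1]
```

Averaging many such runs is one way to compare the microscopic description against a macroscopic model of the symmetry parameter.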
Attaluri, Pavan K.; Chen, Zhengxin; Weerakoon, Aruna M.; Lu, Guoqing
Multiple criteria decision making (MCDM) has a significant impact in bioinformatics. In the research reported here, we explore the integration of decision trees (DT) and Hidden Markov Models (HMM) for subtype prediction of human influenza A virus. Infection with influenza viruses continues to be an important public health problem. Viral strains of subtypes H3N2 and H1N1 circulate in humans at least twice annually. Subtype detection depends mainly on antigenic assays, which are time-consuming and not fully accurate. We have developed a Web system for accurate subtype detection of human influenza virus sequences. A preliminary experiment showed that this system is easy to use and powerful in identifying human influenza subtypes. Our next step is to examine the informative positions at the protein level and extend the current functionality to detect more subtypes. The web functions can be accessed at http://glee.ist.unomaha.edu/.
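The likelihood-scoring idea behind HMM-based subtype calling can be sketched with plain first-order Markov chains, far simpler than the paper's DT+HMM pipeline; the training sequences below are made up and much shorter than real influenza segments.

```python
import math
from collections import defaultdict

def train_markov_chain(sequences):
    """Estimate first-order transition probabilities from training sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    probs = {}
    for a, nxt in counts.items():
        total = sum(nxt.values())
        probs[a] = {b: c / total for b, c in nxt.items()}
    return probs

def log_likelihood(seq, probs, floor=1e-6):
    """Score a sequence under a chain; unseen transitions get a small floor."""
    ll = 0.0
    for a, b in zip(seq, seq[1:]):
        ll += math.log(probs.get(a, {}).get(b, floor))
    return ll

def classify(seq, models):
    """Assign the subtype whose model gives the highest log-likelihood."""
    return max(models, key=lambda name: log_likelihood(seq, models[name]))
```

A real HMM adds hidden states and emission distributions, but the decision rule, maximum likelihood across per-subtype models, is the same.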
Sarif, Siti Mahfuzah; Ibrahim, Norfiza; Shiratuddin, Norshuhada
This paper provides a structured review of the design model of a computerized personal decision aid intended for youth, named the YouthPDA Design Model. The proposed design model was examined by experts in related areas to ensure the appropriateness of the proposed components and elements, the relevancy of the terminologies used, the logic of the flow, and the usability and practicality of the design model toward the development of the YouthPDA application. Seven experts from related areas were involved in the evaluation. Discussions of the findings obtained from the expert review are included in this paper. Finally, a revised design model of YouthPDA is proposed as the main guidance for developing the YouthPDA application.
Marco-Ruiz, Luis; Maldonado, J Alberto; Karlsen, Randi; Bellika, Johan G
Clinical Decision Support Systems (CDSS) help to improve health care and reduce costs. However, the lack of knowledge management and modelling hampers their maintenance and reuse. Current EHR standards and terminologies allow the semantic representation of the data and knowledge of CDSS, boosting their interoperability, reuse and maintenance. This paper presents the modelling process of respiratory conditions' symptoms and signs by a multidisciplinary team of clinicians and information architects, with the help of openEHR, SNOMED CT and clinical information modelling tools, for a CDSS. The information model of the CDSS was defined by means of an archetype, and the knowledge model was implemented by means of a SNOMED CT-based ontology.
Minsker, B. S.; Zimmer, A. L.; Ostfeld, A.; Schmidt, A.
Enabling real-time decision support, particularly under conditions of uncertainty, requires computationally efficient algorithms that can rapidly generate recommendations. In this paper, a suite of model predictive control (MPC) genetic algorithms are developed and tested offline to explore their value for reducing combined sewer overflows (CSOs) during real-time use in a deep-tunnel sewer system. MPC approaches include the micro-GA, the probability-based compact GA, and domain-specific GA methods that reduce the number of decision variable values analyzed within the sewer hydraulic model, thus reducing algorithm search space. Minimum fitness and constraint values achieved by all GA approaches, as well as the computational times required to reach the minimum values, are compared to those obtained with large population sizes and long convergence times. Optimization results for a subset of the Chicago combined sewer system indicate that genetic algorithm variations with coarse decision variable representation, eventually transitioning to the entire range of decision variable values, are most efficient at addressing the CSO control problem. Although diversity-enhancing micro-GAs evaluate a larger search space and exhibit shorter convergence times, these representations do not reach minimum fitness and constraint values. The domain-specific GAs prove to be the most efficient and are used to test CSO sensitivity to energy costs, CSO penalties, and pressurization constraint values. The results show that CSO volumes are highly dependent on the tunnel pressurization constraint, with reductions of 13% to 77% possible with less conservative operational strategies. Because current management practices may not account for varying costs at CSO locations and electricity rate changes in the summer and winter, the sensitivity of the results is evaluated for variable seasonal and diurnal CSO penalty costs and electricity-related system maintenance costs, as well as different sluice gate constraint levels. These findings indicate
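The micro-GA mentioned above works with a tiny population that is evolved to near-convergence and then restarted around the elite individual, trading diversity for speed. A generic sketch on a bit-string minimization problem (not the sewer model; the restart/crossover scheme is a simplified stand-in):

```python
import random

def micro_ga(fitness, n_bits, pop_size=5, restarts=20, gens=30, seed=0):
    """Minimal micro-GA sketch: a 5-member population is evolved briefly,
    then restarted with fresh random individuals plus the elite. Classic
    micro-GAs use no mutation; diversity comes from the restarts."""
    rng = random.Random(seed)
    best = [rng.randint(0, 1) for _ in range(n_bits)]
    for _ in range(restarts):
        # restart: keep the elite, randomize the rest of the tiny population
        pop = [best] + [[rng.randint(0, 1) for _ in range(n_bits)]
                        for _ in range(pop_size - 1)]
        for _ in range(gens):
            pop.sort(key=fitness)
            elite = pop[0]
            children = [elite]            # elitism
            while len(children) < pop_size:
                mate = rng.choice(pop[1:])
                cut = rng.randrange(1, n_bits)
                children.append(elite[:cut] + mate[cut:])  # one-point crossover
            pop = children
        pop.sort(key=fitness)
        if fitness(pop[0]) < fitness(best):
            best = pop[0]
    return best
```

For an MPC setting, `fitness` would wrap a hydraulic-model evaluation of a candidate gate-control schedule over the prediction horizon.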
Krishnan, N.; Sudheer, K. P.; Raj, C.; Chaubey, I.
The diminishing quantities of non-renewable forms of energy have caused increasing interest in renewable sources of energy, such as biofuel, in recent years. However, the demand for biofuel has created a concern about allocating grain between the fuel and food industries. Consequently, appropriate regulations that limit grain-based ethanol production have been developed and put into practice, which has resulted in the cultivation of perennial grasses such as switchgrass and Miscanthus to meet the additional cellulose demand. A change in cropping and management practice, therefore, is essential to cater to the conflicting requirements for food and biofuel, which has a long-term impact on downstream water quality. It is therefore essential to implement optimal cropping practices to reduce pollutant loadings. Simulation models in conjunction with optimization procedures are useful for developing efficient cropping practices in such situations. One such model is the Soil and Water Assessment Tool (SWAT), which can simulate both the water and nutrient cycles, as well as quantify the long-term impacts of changes in management practice in the watershed. It is envisaged that the SWAT model, along with an optimization algorithm, can be used to identify the optimal cropping pattern that achieves the minimum guaranteed grain production with less downstream pollution, while maximizing biomass production for biofuel generation. However, SWAT simulations do have a certain level of uncertainty that needs to be accounted for before making decisions. Therefore, the objectives of this study are twofold: (i) to understand how model uncertainties influence decision-making, and (ii) to develop appropriate management scenarios that account for the uncertainty. The simulation uncertainty of the SWAT model is assessed using the Shuffled Complex Evolutionary Metropolis Algorithm (SCEM). With the data collected from the St. Joseph basin, IN, USA, the preliminary results indicate that model
Kaufmann, Esther; Reips, Ulf-Dietrich; Wittmann, Werner W
Achieving accurate judgment ('judgmental achievement') is of utmost importance in daily life across multiple domains. The lens model and the lens model equation provide useful frameworks for modeling components of judgmental achievement and for creating tools to help decision makers (e.g., physicians, teachers) reach better judgments (e.g., a correct diagnosis, an accurate estimation of intelligence). Previous meta-analyses of judgment and decision-making studies have attempted to evaluate overall judgmental achievement and have provided the basis for evaluating the success of bootstrapping (i.e., replacing judges by linear models that guide decision making). However, previous meta-analyses have failed to appropriately correct for a number of study design artifacts (e.g., measurement error, dichotomization), which may have potentially biased estimations (e.g., of the variability between studies) and led to erroneous interpretations (e.g., with regard to moderator variables). In the current study we therefore conduct the first psychometric meta-analysis of judgmental achievement studies that corrects for a number of study design artifacts. We identified 31 lens model studies (N = 1,151, k = 49) that met our inclusion criteria. We evaluated overall judgmental achievement as well as whether judgmental achievement depended on decision domain (e.g., medicine, education) and/or the level of expertise (expert vs. novice). We also evaluated whether using corrected estimates affected conclusions with regard to the success of bootstrapping with psychometrically-corrected models. Further, we introduce a new psychometric trim-and-fill method to estimate the effect sizes of potentially missing studies and to correct psychometric meta-analyses for the effects of publication bias. Comparison of the results of the psychometric meta-analysis with the results of a traditional meta-analysis (which only corrected for sampling error) indicated that artifact correction leads to a) an
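The lens model equation referenced above has a standard closed form: achievement r_a = G·R_e·R_s + C·√(1−R_e²)·√(1−R_s²), where G is the judge's knowledge (matching of model-captured policies), R_e is environmental predictability, R_s is judgmental consistency, and C is the correlation of the unmodeled components. A direct transcription:

```python
import math

def lens_model_achievement(G, Re, Rs, C=0.0):
    """Lens model equation: judgmental achievement r_a from knowledge G,
    environmental predictability Re, judgmental consistency Rs, and the
    correlation C of the residual (unmodeled) components."""
    return G * Re * Rs + C * math.sqrt(1 - Re**2) * math.sqrt(1 - Rs**2)
```

The equation makes the bootstrapping argument concrete: replacing an inconsistent judge with their own linear model pushes R_s toward 1, which raises r_a whenever G and R_e are positive.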
Ibáñez-Gijón, Jorge; Jacobs, David M.
Inhibition of Return (IOR) is one of the most consistent and widely studied effects in experimental psychology. The effect refers to a delayed response to visual stimuli in a cued location after initial priming at that location. This article presents a dynamic field model for IOR. The model describes the evolution of three coupled activation fields. The decision field, inspired by the intermediate layer of the superior colliculus, receives endogenous input and input from a sensory field. The sensory field, inspired by earlier sensory processing, receives exogenous input. Habituation of the sensory field is implemented by a reciprocal coupling with a third field, the habituation field. The model generates IOR because, due to the habituation of the sensory field, the decision field receives a reduced target-induced input in cue-target-compatible situations. The model is consistent with single-unit recordings of neurons of monkeys that perform IOR tasks. Such recordings have revealed that IOR phenomena parallel the activity of neurons in the intermediate layer of the superior colliculus and that neurons in this layer receive reduced input in cue-target-compatible situations. The model is also consistent with behavioral data concerning temporal expectancy effects. In a discussion, the multi-layer dynamic field account of IOR is used to illustrate the broader view that behavior consists of a tuning of the organism to the environment that continuously and concurrently takes place at different spatiotemporal scales. PMID:22427980
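The three-field mechanism described above can be caricatured with single nodes per field and Euler integration; a target at the habituated (cued) location drives the decision node more weakly, delaying the response. All time constants, thresholds, and input shapes below are hypothetical, and this node-level toy omits the spatial fields of the actual model.

```python
def simulate_ior(target_at_cued_location, steps=250, dt=1.0):
    """Toy node-level sketch of the dynamic field account of IOR.
    Two sensory nodes (cued / uncued location) habituate through coupled
    habituation nodes; a decision node integrates the strongest sensory
    signal. Returns time from target onset to the decision threshold."""
    tau_s, tau_h, tau_d = 10.0, 200.0, 20.0   # hypothetical time constants
    s = [0.0, 0.0]   # sensory activation at [cued, uncued] location
    h = [0.0, 0.0]   # habituation state per location
    d = 0.0          # decision-node activation
    target_onset, threshold = 150, 0.3
    rt = None
    for t in range(steps):
        cue = 1.0 if t < 100 else 0.0               # exogenous cue at location 0
        target = 1.0 if t >= target_onset else 0.0  # later target input
        inp = [cue, 0.0]
        inp[0 if target_at_cued_location else 1] += target
        # sensory nodes are suppressed by their habituation state
        s = [s[i] + dt / tau_s * (-s[i] + inp[i] - h[i]) for i in range(2)]
        h = [h[i] + dt / tau_h * (-h[i] + s[i]) for i in range(2)]
        d += dt / tau_d * (-d + max(s + [0.0]))
        if rt is None and t >= target_onset and d > threshold:
            rt = t - target_onset
    return rt
```

Because the cued channel is still habituated at target onset, `simulate_ior(True)` yields a longer latency than `simulate_ior(False)`, the IOR signature.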
Morss, Rebecca E; Demuth, Julie L; Bostrom, Ann; Lazo, Jeffrey K; Lazrus, Heather
Timely warning communication and decision making are critical for reducing harm from flash flooding. To help understand and improve extreme weather risk communication and management, this study uses a mental models research approach to investigate the flash flood warning system and its risk decision context. Data were collected in the Boulder, Colorado area from mental models interviews with forecasters, public officials, and media broadcasters, who each make important interacting decisions in the warning system, and from a group modeling session with forecasters. Analysis of the data informed development of a decision-focused model of the flash flood warning system that integrates the professionals' perspectives. Comparative analysis of individual and group data with this model characterizes how these professionals conceptualize flash flood risks and associated uncertainty; create and disseminate flash flood warning information; and perceive how warning information is (and should be) used in their own and others' decisions. The analysis indicates that warning system functioning would benefit from professionals developing a clearer, shared understanding of flash flood risks and the warning system, across their areas of expertise and job roles. Given the challenges in risk communication and decision making for complex, rapidly evolving hazards such as flash floods, another priority is development of improved warning content to help members of the public protect themselves when needed. Also important is professional communication with members of the public about allocation of responsibilities for managing flash flood risks, as well as improved system-wide management of uncertainty in decisions.
Habib, Shahid; Pickering, Ken; Tzortziou, Maria; Mannino, Antonio; Policelli, Fritz
As required by the Harmful Algal Bloom and Hypoxia Research and Control Act of 1998, the Mississippi River/Gulf of Mexico Watershed Nutrient Task Force issued the 2001 Gulf Hypoxia Action Plan (updated in 2008). In response, the EPA Gulf of Mexico Hypoxia Modeling and Monitoring Project has established a detailed model for the Mississippi-Atchafalaya River Basin which provides a capability to forecast the multi-source nutrient loading to the Gulf and the subsequent bio-geochemical processes leading to hypoxic conditions and subsequent effects on Gulf habitats and fisheries. The primary purpose of the EPA model is to characterize the impacts of nutrient management actions, or proposed actions, on the spatial and temporal characteristics of the Gulf hypoxic zone. The model is expected to play a significant role in determining best practices and improved strategies for incentivizing nutrient reduction, including installation of on-farm structures to reduce sediment and nutrient runoff, use of cover crops and other agricultural practices, restoration of wetlands and riparian buffers, improved waste water treatment, and decreased industrial nitrogen emissions. These decisions are currently made in a fragmented way by federal, state, and local agencies, using a variety of small-scale models and limited data. During the past three years, the EPA has collected an enormous amount of in-situ data to be used in the model. We believe that the use of NASA satellite data products in the model, and for long-term validation of the model, has the potential to significantly increase the accuracy and therefore the utility of the model for the decision making described above. This proposal addresses the Gulf of Mexico Alliance (GOMA) priority issue of reductions in nutrient inputs to coastal ecosystems. It further directly relates to water quality for healthy beaches and shellfish beds and wetland and coastal conservation
Grunow, Martin; Günther, Hans-Otto; Yang, Gang
Clinical studies for the development of new drugs in the pharmaceutical industry consist of a number of individual tasks which have to be carried out in a pre-defined chronological order. Each task requires certain types of medical personnel. This paper investigates the scheduling of clinical studies to be performed during a short-term planning horizon, the allocation of workforce between the studies, and the assignment of individual employees to tasks. Instead of developing a complex monolithic decision model, a hierarchical modelling approach is suggested. In the first stage, a compact integer optimization model is solved in order to determine the start-off times of the studies and the required staffing while taking the limited availability of personnel into account. The objective is to minimize total staffing costs. The assignment of individual employees to tasks is then made in the second stage of the procedure using a binary optimization model.
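The two-stage hierarchy described above can be sketched on a toy instance: stage 1 chooses start-off times to minimize the staffing level, stage 2 assigns named employees to the resulting slots. The data are invented, and exhaustive enumeration stands in for the paper's compact integer optimization model.

```python
from itertools import product

# Invented toy instance: per-period staff demand for each study once started.
studies = {"A": [2, 1], "B": [1, 2]}
horizon, cap = 4, 3           # planning periods and available personnel
cost_per_person = 100         # cost per staff member at the peak level

def stage1_start_times():
    """Stage 1: enumerate start-off times, keep the cheapest feasible
    staffing level (a brute-force stand-in for an integer program)."""
    best = None
    for starts in product(range(horizon), repeat=len(studies)):
        load = [0] * horizon
        feasible = True
        for (name, need), start in zip(studies.items(), starts):
            if start + len(need) > horizon:
                feasible = False
                break
            for i, n in enumerate(need):
                load[start + i] += n
        if not feasible or max(load) > cap:
            continue
        cost = cost_per_person * max(load)   # pay for peak staffing
        if best is None or cost < best[0]:
            best = (cost, dict(zip(studies, starts)))
    return best

def stage2_assign(starts, employees):
    """Stage 2: naive assignment of named employees to each period's slots."""
    plan = {}
    for name, start in starts.items():
        for i, n in enumerate(studies[name]):
            plan[(name, start + i)] = employees[:n]
    return plan
```

Staggering the two studies lets the peak demand drop from 3 to 2, which is exactly the kind of trade-off the first-stage model resolves before individuals are scheduled.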
A great deal of attention is given to the importance of communication in environmental remediation and radioactive waste management. However, very little attention is given to eliciting multiple perspectives so as to formulate high-quality decisions. Plans that are based on a limited number of perspectives tend to be narrowly focused, whereas those that are based on a wide variety of perspectives tend to be comprehensive, of higher quality, and more apt to be put into application. In addition, existing methods of dialogue have built-in limitations in that they typically draw from the predominant thinking patterns, which focus on some areas but ignore others. This can result in clarity but a lack of comprehensiveness. This paper presents a Perspective Awareness Model which helps groups such as partnering teams, interagency teams, steering committees, and working groups elicit a wide range of perspectives and viewpoints. The paper begins by describing five factors that make cooperation among such groups challenging. Next, a Perspective Awareness Model that makes it possible to manage these five factors is presented. The two primary components of this model, the eight 'Thinking Directions' and the 'Shared Documentation', are described in detail. Several examples are given to illustrate how the Perspective Awareness Model can be used to elicit multiple perspectives to formulate high-quality decisions in the area of environmental remediation and radioactive waste management. (authors)
Wu, Yirong; Abbey, Craig K.; Chen, Xianqiao; Liu, Jie; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.
Abstract. Combining imaging and genetic information to predict disease presence and progression is being codified into an emerging discipline called “radiogenomics.” Optimal evaluation methodologies for radiogenomics have not been well established. We aim to develop a decision framework based on utility analysis to assess predictive models for breast cancer diagnosis. We garnered Gail risk factors, single nucleotide polymorphisms (SNPs), and mammographic features from a retrospective case-control study. We constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail + Mammo, and (3) Gail + Mammo + SNP. Then we generated receiver operating characteristic (ROC) curves for three models. After we assigned utility values for each category of outcomes (true negatives, false positives, false negatives, and true positives), we pursued optimal operating points on ROC curves to achieve maximum expected utility of breast cancer diagnosis. We performed McNemar’s test based on threshold levels at optimal operating points, and found that SNPs and mammographic features played a significant role in breast cancer risk estimation. Our study comprising utility analysis and McNemar’s test provides a decision framework to evaluate predictive models in breast cancer risk estimation. PMID:26835489
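Choosing the utility-maximizing operating point on a ROC curve, as in the abstract above, amounts to weighting each outcome category by its utility and the class prevalence. A minimal sketch; the ROC points, prevalence, and utility values below are hypothetical, not the study's.

```python
def optimal_operating_point(roc_points, prevalence, u_tp, u_fn, u_fp, u_tn):
    """Return the (FPR, TPR) point on an empirical ROC curve that
    maximizes expected utility at the given disease prevalence."""
    def expected_utility(fpr, tpr):
        p = prevalence
        # diseased cases split into TP/FN, healthy cases into FP/TN
        return (p * (tpr * u_tp + (1 - tpr) * u_fn)
                + (1 - p) * (fpr * u_fp + (1 - fpr) * u_tn))
    return max(roc_points, key=lambda pt: expected_utility(*pt))
```

With a low prevalence and a high penalty for missed cancers, the optimum shifts toward higher-sensitivity (higher-FPR) points, which is why utility analysis, rather than raw accuracy, is used to compare the predictive models.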
Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...